Blood tests were originally envisaged as a way of helping to diagnose diseases. Since their inception, the number of ailments tested for has risen dramatically, along with the volume of ‘routine’ tests we’re exposed to.
The latter are typically carried out when there is nothing actually wrong with the patient. While these may be slightly unpleasant, they’re viewed as a necessary evil because they’re only minimally invasive and can yield supposedly valuable information about our health.
A phrase that’s often used to justify these blood tests is ‘better safe than sorry’. However, while they may seem like only a minor inconvenience, it’s easy to lose sight of the enormous amount these tests cost the National Health Service each year.
The way we justify this expense is by treating blood tests as a preventative measure: the only way of ensuring that we are, in fact, as healthy as we appear. However, we’re not avoiding any sort of disastrous outcome if we were healthy to begin with!
Even where these results say we’re healthy, though, how can we be certain we actually are? We rarely stop to question the validity of these findings, but can it be dangerous to accept what we’re told, and by extension the standard way of doing things, without raising any wider questions?
That’s what we’ll explore in this discussion piece.
How are blood test results interpreted?
When you have a blood test, your results are compared against the accepted ‘normal’ range for healthy individuals. These figures mark the lower and upper limits, and if your results fall outside them, you’re considered to have an issue.
However, there is no ‘manual of life’ to tell us where these figures should fall. Medical professionals have invented these ranges by taking the results of large numbers of seemingly healthy individuals and treating the middle of that distribution, typically the central 95 per cent, as ‘normal’.
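To make that concrete, here is a minimal sketch, in Python with invented figures, of how such a range might be derived and applied. The ‘roughly two standard deviations either side of the mean’ rule shown here is one common convention; exact methods vary between laboratories, and the numbers below are purely illustrative.

```python
import statistics

# Results (in arbitrary units) from a group of seemingly healthy people.
# These figures are invented purely for illustration.
healthy_results = [4.1, 4.6, 5.0, 4.8, 5.3, 4.4, 4.9, 5.1, 4.7, 4.5]

# One common convention: take the mean plus/minus roughly two standard
# deviations, so that about 95 per cent of the healthy group falls inside
# the range. Laboratories differ in the exact method they use.
mean = statistics.mean(healthy_results)
sd = statistics.stdev(healthy_results)
lower, upper = mean - 2 * sd, mean + 2 * sd

def interpret(result: float) -> str:
    """Flag a result against the derived reference range."""
    return "within range" if lower <= result <= upper else "outside range"

print(f"Reference range: {lower:.2f} to {upper:.2f}")
print(interpret(4.9))   # within range
print(interpret(6.2))   # outside range
```

The point of the sketch is simply that ‘normal’ is defined statistically, from the spread of results in a sample of apparently healthy people, rather than from any absolute standard.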
So how did we manage before blood tests were invented? If we look back through the history of medicine, taking naturopathy and other early forms of practice into account, it becomes clear that practitioners were able to assess the health of individuals long before these methods were developed.
This raises obvious questions. Firstly, if we can evaluate someone’s health without a blood test, why is it necessary to carry them out so frequently? Secondly, if a person looks and feels healthy but their blood tests disagree, are they unhealthy or not? Thirdly, if a person feels unwell but their blood tests come back as healthy, should we trust the test or their lived experience?
This is a major issue with modern medicine. We have decided that the blood test is more important than any other diagnostic tool and that it overrides how we actually feel. This becomes problematic once we realise that an industry which benefits from us being ill is the one deciding whether or not we’re healthy.
This can therefore cause us to question ourselves. We may feel perfectly well, but if a test tells us otherwise, we’re expected to take action. The medical establishment suggests we must act immediately, undergoing costly treatments to make ourselves ‘well’ again, i.e. to bring the numbers on paper back into line.
These blood tests are meant to provide a clear picture of how our bodies and their internal systems are operating, flagging up any organs that are overworked or under strain. For example, if high levels of a particular hormone are detected, the assumption is that the gland producing it has become overactive. Similarly, if low levels of a nutrient are found, medical professionals suggest we should be supplementing it.
We therefore view these results as an accurate indicator of how our bodies are working, using them to infer which organs or glands are performing at a less than optimal level, which activity we ought to suppress, and which we should support. However, there are certain issues with such an approach.
Veins versus arteries
When medical professionals take blood, it comes from a vein rather than an artery. The two perform different functions: arteries carry oxygen and nutrients to the cells, while veins return ‘used’ blood, which is low in oxygen and high in waste products.
This has three important consequences.
1. Blood tests only show us the nutrients in our blood after our cells have already taken what they require. This means that where high levels are detected, the cells have taken relatively little, and where low levels are found, the cells have taken relatively more.
As a result, it’s impossible to say whether low levels point to a genuine shortfall or simply reflect the cells having taken what they need, in the way a plate is more likely to be empty when the diner was hungry than when they were full.
2. When high levels of waste products are found, an assumption is made that a related organ is not performing its role properly. However, veins are designed to carry waste products away from cells, and when organs are working hard, they naturally produce more waste.
This doesn’t necessarily mean, therefore, that the organ isn’t working properly; it could simply be doing a good job. Assuming otherwise is like suggesting that a factory is inefficient because it produces a large amount of waste.
3. The level of hormones found in a single blood sample is not the same as the amount being produced by a gland and carried to the cells for use. Rather, it only shows what a person’s blood looks like once the cells have taken what they need from it.
To put that another way, the hormones in a blood sample are nothing more than an indication of what the cells themselves are producing; they come directly from the cells, because that is where the blood is travelling back from when the test is performed.
This casts the validity of blood tests in a very different light to the one we’re used to.