Everyone in fitness is evidence based now. Every coach. Every influencer. Every person selling a training plan or a supplement or a six week transformation. They are all evidence based. They all cite studies. They all have the science on their side.
The phrase has become so overused that it has lost all meaning. It is now just marketing. A badge people stick on their content to signal credibility without actually understanding what it means to interpret research properly.
There is a paper that proves this better than anything I could write.
CIGARETTE SMOKING: AN UNDERUSED TOOL IN HIGH-PERFORMANCE ENDURANCE TRAINING
That is the actual title. Published in the Canadian Medical Association Journal in 2010. Peer reviewed. Properly cited. Available on PubMed right now.
The paper argues that cigarette smoking should be incorporated into high performance training programmes. It makes three main claims, all supported by references to published research:
First, smoking increases serum haemoglobin and haematocrit levels. Higher haemoglobin means greater oxygen carrying capacity. This is the same mechanism behind altitude training and blood doping. The citations check out.
Second, smoking increases lung volume. Smokers develop larger total lung capacity as an adaptation. Larger lungs, more air, better performance. Again, the research exists.
Third, smoking causes weight loss through appetite suppression and increased metabolic demand. Lower body weight improves power to weight ratio. In weight bearing endurance sports like running, this matters. The studies are real.
The paper even addresses implementation. It suggests that age restrictions on tobacco sales may be limiting athletic development, and that countries should consider exemptions for young athletes with endurance potential. It notes that developing countries with fewer tobacco restrictions have been more successful in endurance events recently.
Reading it, you would think this is a legitimate scientific argument. The structure is correct. The citations are real. The logic flows. It looks exactly like every other review paper you have ever seen.
The whole thing is satire.
The authors wrote it deliberately to demonstrate this exact problem. Their opening line states that review papers, when done poorly, "have the potential to create a convincing argument for a faulty hypothesis" through selective citation and improper extrapolation. They then spent the rest of the paper proving their own point.
The fitness industry does this constantly. Not with cigarettes, but with supplements, training methods, recovery protocols, nutrition strategies. Find a study that supports your position. Cite it. Ignore the ten studies that contradict it. Call yourself evidence based.
THIS IS NOT AN ISOLATED EXAMPLE
The BMJ publishes satirical papers every Christmas to make similar points. In 2003, they ran a systematic review of randomised controlled trials on parachute use. The finding: no RCTs existed, so parachutes had not been proven effective. In 2018, researchers actually did the trial. Participants jumped with either a parachute or an empty backpack. No significant difference in death or injury. The catch: every jump was from an aircraft parked on the ground, average height 0.6 metres.
In 2012, the New England Journal of Medicine published a study showing a significant correlation between national chocolate consumption and Nobel Prize winners per capita. The p value was under 0.0001. Statistically bulletproof. The media reported it straight. The author had written it as a joke about correlation and causation. Other researchers pointed out you could find equally strong correlations between Nobel prizes and IKEA stores.
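The chocolate correlation is trivially easy to manufacture. Here is a minimal sketch using entirely made-up numbers: a hidden confounder (call it national wealth) drives two variables that never touch each other, and the correlation between them comes out strong anyway. The country data is synthetic; nothing here is from the actual paper.

```python
import random
import math

random.seed(1)

def pearson(x, y):
    # Standard Pearson correlation coefficient, computed by hand.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                           * sum((b - my) ** 2 for b in y))

# 40 hypothetical countries. A hidden confounder (wealth) drives BOTH
# chocolate consumption and Nobel counts; neither causes the other.
wealth = [random.gauss(0, 1) for _ in range(40)]
chocolate = [w + random.gauss(0, 0.3) for w in wealth]
nobels = [w + random.gauss(0, 0.3) for w in wealth]

r = pearson(chocolate, nobels)
print(f"r(chocolate, nobels) = {r:.2f}")  # strong, and causally meaningless
```

Two independent noise terms hanging off one shared driver: that is all it takes to get a correlation a headline writer would call a link.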
These papers exist because serious researchers wanted to demonstrate how easily scientific methodology can be abused. The fitness industry watched this happen and took notes on the wrong lesson.
WHAT EVIDENCE BASED ACTUALLY MEANS
Real evidence based practice is not about citing a study. It is about understanding where that study sits in the hierarchy of evidence, what the limitations are, whether the findings have been replicated, and how they apply to the specific person in front of you.
When someone says "a study showed," ask yourself: what kind of study? How many participants? Has it been replicated? Who funded it? What were the limitations the authors themselves acknowledged? What does the overall body of evidence say?
A single study showing something is not proof. It is a data point. Sometimes it is a data point that contradicts twenty other data points. Sometimes it is a data point from a study with twelve participants over three weeks funded by a company that sells the thing being studied.
THE FITNESS INDUSTRY'S EVIDENCE PROBLEM
Scroll through fitness social media and you will see the same patterns everywhere. Someone posts a claim. They cite a study. The study either does not say what they claim it says, or it is one study that contradicts the broader literature, or it is methodologically weak, or it is taken completely out of context.
But it looks scientific. It has a citation. It sounds evidence based.
The irony is that the people who shout loudest about being evidence based are often the worst offenders. They use science as a weapon rather than a tool. They cite studies to win arguments rather than to understand reality. They treat research as a collection of ammunition rather than a body of knowledge with nuance, limitations, and context.
WHAT I ACTUALLY DO
When I make a recommendation to an athlete, I try to base it on the best available evidence while acknowledging uncertainty. Sometimes that means saying "the research suggests" rather than "science proves." Sometimes it means saying "we do not know for sure but here is what makes sense given what we do know." Sometimes it means running an experiment of one and seeing what works for that specific person.
Real evidence based practice involves:
- Understanding what type of study design can actually answer the question being asked.
- Knowing that a correlational study cannot prove causation, no matter how strong the p value.
- Recognising that effect size matters more than statistical significance.
- Accepting that most nutrition and training research has serious methodological limitations.
- Being willing to say "I don't know" when the evidence is genuinely unclear.
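The significance-versus-effect-size point is worth seeing with numbers. A quick simulation on synthetic data (Python standard library only, all values invented for illustration): with a large enough sample, a practically meaningless difference between two training methods still produces a vanishingly small p value.

```python
import random
import statistics
import math

random.seed(0)
n = 50_000
# Two hypothetical "training methods". Group B is better by a trivial
# 0.05 units on a scale with standard deviation 1 (Cohen's d of about 0.05).
a = [random.gauss(100.0, 1.0) for _ in range(n)]
b = [random.gauss(100.05, 1.0) for _ in range(n)]

mean_a, mean_b = statistics.mean(a), statistics.mean(b)
sd_pooled = math.sqrt((statistics.variance(a) + statistics.variance(b)) / 2)

cohens_d = (mean_b - mean_a) / sd_pooled            # effect size: tiny
z = (mean_b - mean_a) / (sd_pooled * math.sqrt(2 / n))  # two-sample z statistic
p = math.erfc(abs(z) / math.sqrt(2))                # two-sided p value

print(f"effect size d = {cohens_d:.3f}")  # practically meaningless difference
print(f"p value       = {p:.2e}")         # "significant" purely through sample size
```

The headline version would read "statistically significant improvement, p < 0.001". The honest version is that the effect is too small for any athlete to notice. Sample size buys you significance; it does not buy you relevance.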
It also involves recognising that practical experience matters. That coaching intuition developed over years of working with athletes is not worthless just because it is not published in a journal. That the plural of anecdote is not data, but that patterns observed across many athletes can inform hypotheses worth testing.
My coach is currently working on his PhD. I ask him the stupidest questions. Even he says there is never a definitive answer, only the current best interpretation of incomplete data. If someone doing a doctorate in this stuff is comfortable with that level of uncertainty, maybe the guy on Instagram claiming his method is "proven by science" should be too.
The goal is not to be right. The goal is to be less wrong over time. To update beliefs when new evidence emerges. To hold conclusions loosely and methods tightly.
THE BOTTOM LINE
If everyone is evidence based, nobody is. The phrase has been diluted to meaninglessness by people who use it as a marketing strategy rather than a methodology.
Real evidence based practice is harder than citing studies. It requires understanding research design, statistical concepts, the hierarchy of evidence, and the difference between what a study actually showed and what headlines claim it showed. It requires intellectual humility and comfort with uncertainty.
The next time someone tells you they are evidence based, ask them about the limitations of the studies they cite. Ask them what would change their mind. Ask them about the studies that contradict their position and how they reconcile the conflicting findings.
If they cannot answer those questions, they are not evidence based. They are just citing studies.
And as the cigarette paper showed, you can cite studies to prove almost anything.