This was today’s Daily Mail headline... like many more across Europe, I’m sure. It detailed new research from the Netherlands Centre for Human Drug Research, just published in the British Journal of Clinical Pharmacology - another example of ‘bad science’... just like steroids don’t work (see reviews in Wilson, 1988; Elashoff et al., 1991; O’Connor and Cicero, 1993; Friedl, 2000). And eggs will kill you (again, The Daily Mail).
And - along with the following sequence of events - it was the inspiration for today’s post.
...is there such a thing as evidence? Even über-scientist Dr. Stuart Phillips admits the limitations of scientific research, comparing the results of reams of scientific inquiry into intensity, effort, intent, volume, and the sequencing of loading to a hopeful guess.
In perhaps the first of its kind, Twitter yesterday was kind enough to host a ‘Stutrifecta’ discussion on the proof - or lack thereof - offered by the research that optimizing load, intensity and sequencing is an important part of the performance process.
Professor Phillips has a point. To a point. Do any of the reams of literature really prove anything? No - they are NOT evidence... but are they supposed to be?
Besides suffering from various personal and professional biases and preferences, statistical limitations, population specificity, straight-up errors, and misinterpretations, the linear-causal, mechanistic, reductionist simplicity of the typical scientific investigation - double-blind, placebo-controlled, etc. - frequently just does not fit the complex, ever-changing, interdependent, chaotic sports world, where multiple, poorly understood cause-effect interactions are the order of the day.
The athletic world works through complex mechanisms operating simultaneously - mechanisms that cannot be reduced to simple, single-condition research.
Dr. Richard Horton, editor-in-chief of The Lancet, addressed this issue in an editorial titled The Precautionary Principle: “We must act on facts, and on the most accurate interpretation of them, using the best scientific information. That does not mean we must sit back until we have 100% evidence about everything...”.
Discussing the current condition of medical science - not sports science per se - Horton continues: “Application, synthesis, and reflection—these are my personal wishes for a renaissance in clinical medicine. It is not concerned with hierarchies of evidence; it is not dependent on up-to-date literature alone as the arbiter of clinical decision making; it does not proselytize a bottom-line approach to the reading of new research. Rather, it is about preferring interpretations to conclusions, external validity to internal validity, context to the highly controlled—and artificial—experimental environment.”
Dr David S. Jones, in the Textbook of Functional Medicine, opines that at the very best, research makes us uncertain. And - in the face of uncertainty - we must take a broader view. Step back from the canvas of individual studies, and view the landscape of scientific inquiry as a whole.
“The paradox of the clinical trial is that it is the best way to assess whether an intervention works, but is arguably the worst way to assess who will benefit from it.”
So until medicine can transcend Newtonian mechanics and become truly biological, incorporating evolutionary and organismic biology into its molecular scheme; until science and researchers better understand the organization, interrelationships, and interconnections of self-organizing, self-regulating complex systems; until we can believe and confidently apply the research we read...
...good coaches will use an epistemological method in developing a training philosophy - constantly asking themselves “what do I know?” and “why am I doing this?”. Within this, we can choose to take a rationalist or an empiricist approach; most often, we combine the two (i.e., we use both reason - deduction - and experience - induction - to guide us).
Our rational selves use the ‘narratives’ provided to us by ‘scientific research’ to first deduce a ‘thought experiment’ - eventually playing itself out practically as part or whole of our program. We then use this practical experience to inductively support or alter any subsequent gedankenexperiment, and thus the program.
Relying less on top-down planning - instead adopting a rational Bayesian decision framework - we continue in this manner, combining research narratives and the personal experiences of both coach and athlete(s) - both feedforward and feedback processes - all the while being careful not to stifle our inner bricoleur, remaining open to the individuality and daily fluctuations of the athlete as a dynamical system. As Nassim Nicholas Taleb would put it: “...exposing ourselves to the envelope of serendipity”.
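The Bayesian framing above can be sketched, very loosely, in code: treat the research literature as a prior belief about whether a given intervention ‘works’ for an athlete, then update that belief with what we actually observe in training. This is a purely illustrative toy - all numbers and names here are hypothetical, not drawn from any study.

```python
# Illustrative only: a conjugate Beta-Binomial update, the simplest
# possible model of "research prior + practical feedback".

def update_beta(prior_alpha, prior_beta, successes, failures):
    """Update a Beta(alpha, beta) belief with observed session outcomes."""
    return prior_alpha + successes, prior_beta + failures

# Hypothetical prior from the literature: modest belief the intervention
# helps (Beta(6, 4), i.e. an expected response rate of about 60%).
alpha, beta = 6, 4

# Hypothetical feedback from practice: 8 sessions, 7 positive responses.
alpha, beta = update_beta(alpha, beta, successes=7, failures=1)

posterior_mean = alpha / (alpha + beta)  # updated response-rate estimate
print(f"Updated belief the athlete responds: {posterior_mean:.2f}")
# → Updated belief the athlete responds: 0.72
```

The point of the sketch is not the arithmetic but the shape of the loop: the prior (the narrative) never stands alone, and neither does the observation - each monitoring cycle feeds the next plan.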
We will monitor, assess, and adjust constantly. By monitoring the correct variables, patterns will eventually form. We can then begin to more accurately predict adaptation. We will adapt our training methods to the athletes, not vice versa. We will question ‘knowledge’, and respect introspection and a willingness to admit to uncertainty. We will not be emotionally tied to our plans. We will learn to adapt on the fly. To improvise.
And we will continue to read. And to learn. And even to respect the scientific works of researchers such as Professor Phillips.
"Que sais je?" - Michel de Montaigne