Journalists and bloggers, teachers and everybody on Facebook love to use the phrase “studies show.”
I love it, too.
“Studies show” tickles the part of us that asserts a superior sort of rationality and an up-to-date command of the facts. It makes us feel smart, particularly when the study we cite is surprising or new, but especially when it reinforces what we already believe.
Yet we all know that studies show a lot of things, many of them contradictory. A study may come out linking golf to brain damage, and then a few years later another will come out saying that golf will help you live long and avoid dementia.
There’s nothing shocking about this. Research, particularly in biology and the social sciences, is often about probabilities. Results can be skewed by sample size and selection, by study design, and even by the connotations of the words on a questionnaire.
Rather than acknowledging these limits and being circumspect, though, we keep using “studies show” to end arguments, not to invite more debate.
According to the historian of religion Karen Armstrong, we lost the ability to accept uncertainty when, during the Age of Reason in the late 17th century, we left behind a world split between “logos,” or rational thought, and “mythos,” spiritual and associative ways of thinking.
While this shift has been great for technology, it has also turned phrases like “studies show” into their own kind of magical incantation, able to transform the messiness and uncertainty of the real world into easily understood data points plotted on convenient charts.
What the phrase “studies show” really demonstrates, then, is not how rational we have become, but how unbalanced. With no way to exercise our unknowing in open debate, we fall back on misplaced certainty.
And perhaps this explains our declining engagement in literature and the arts: they show a side of us we fear, one that can't be backed up by the studies we hold so dear.