Rather shockingly, it seems that much of what medical researchers conclude in their studies is misleading, exaggerated, or flat-out wrong and that doctors are still drawing upon misinformation in their everyday practice.
Dr. Ioannidis has spent his career challenging his peers by exposing their bad science. He analyzed 49 of the most highly regarded research findings in medicine from the previous 13 years. Of those 49 articles, 45 claimed to have uncovered effective interventions. Thirty-four of these claims had been retested, and 14 of them, or 41 percent, had been convincingly shown to be wrong or significantly exaggerated.
A couple of highlights from the article:
Simply put, if you're attracted to ideas that have a good chance of being wrong, and if you're motivated to prove them right, and if you have a little wiggle room in how you assemble the evidence, you'll probably succeed in proving wrong theories right.
This array suggested a bigger, underlying dysfunction, and Ioannidis thought he knew what it was.
"The studies were biased," he says. "Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there."
Researchers headed into their studies wanting certain results -- and, lo and behold, they were getting them.
We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it's easy to manipulate results, even unintentionally or unconsciously.
"At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded," says Ioannidis. "There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded."
This problem is not unique to the medical or scientific world; I suspect it is far more prevalent in the business world. We make up our minds and then select the evidence to support them!
To me, this is the sort of issue that Knowledge Management should be addressing: how do we avoid, or at the very least minimise, such cognitive biases? And it's not the only one; the list of our cognitive biases is endless.
Ted in the Dilbert comic strip sees the problem in relation to strategy!
Dilbert on Cognitive Bias
Footnote: In searching the web for information on how to overcome cognitive bias I came across this gem of a website:
This is a group blog on why we believe and do what we do, why we pretend otherwise, how we might do better and what our descendants might do, if they don't all die.
Credit: Overcoming Bias
The "if they don't all die" bit caught my attention, because if we don't get better at making decisions, that's surely what is going to happen.