Sep 10 • Curtis Tait

Ignoring the Evidence: Why Does It Happen? (Part 1)

Let me tell you a story: “Once upon a time there was a young physical therapist struggling to make sense of persistent pain presentations while working in a pain management program. Every day he would try to apply what he had learned in school about pathoanatomy and biomechanics to this patient population, but despite high ratings of pain, these patients' imaging studies had few findings. One day he stumbled onto pain neuroscience, which made his world make sense, and he learned as much as he could about this wonderfully revolutionary idea.

“Because of that, he spread the word of pain neuroscience to all his patients. Because of that, he felt much more confident that he was making a difference in the lives of his patients. Until one day he worked with a patient who did not respond to his wonderful pain metaphors and citations of the latest evidence. In fact, it seemed that this patient became even more sure that something was very wrong with their body that no one had been able to find. The more he tried to educate this patient about the wonders of pain neuroscience, the more this patient was sure that it couldn’t possibly apply to them.”

While this is a story from the beginning of my career, I am sure that I am not alone in this experience. You could also imagine a similar scenario involving an EBP-based physio discussing the latest evidence with a staunchly biomechanically focused colleague, citing study after study refuting the biomechanical model of pain, only to have the colleague become more sure that biomechanical faults explain pain (I may have been guilty of this one too).

So why is it that someone can be presented with the weight of evidence contrary to their view and seemingly “ignore the evidence?” Well, in large part, it involves something called the “backfire effect.” I was recently introduced to this psychological phenomenon by a colleague, and it made me think about how this effect can apply to patient and fellow clinician education regarding best evidence.
So what is the backfire effect? Nyhan & Reifler coined the term “backfire effect” in a 2010 study in which they observed that those who strongly believed in the G.W. Bush-era war in Iraq actually strengthened their belief that Iraq had weapons of mass destruction after receiving evidence to the contrary (data was gathered during 2005 & 2006). Similar findings have been reported in health care: evidence countering the belief that MMR vaccinations cause autism led to a decrease in the number of respondents who held that belief; however, those who held the belief strongly actually became less likely to say they would vaccinate their children in the future (ref). These studies suggest that for people who strongly hold a belief, counter-evidence can lead to the strengthening of that belief.
However, this does not hold true for every belief. It seems the belief must be core to that individual’s identity, such as political beliefs, which have strong ties to our individual and group ideologies. This idea of core beliefs is not too dissimilar to how physiotherapists may view how our assessment and treatment approaches work (e.g. pain science, MDT, SFMA, corrective exercise, manipulation, dry needling, etc.). In a similar vein, some patients will strongly adopt a view of their body, and why they have pain, as a core belief, making it primed for a backfire effect.

Kaplan and colleagues (ref) conducted an fMRI study of 40 individuals who each received 5 pieces of counter-evidence to previously identified strongly held beliefs. They observed that when beliefs that were non-political in nature (e.g. “Thomas Edison invented the light bulb”) were challenged, they were more likely to change than those that were political in nature (e.g. “Abortion should be legal”) in both the short and medium term (~48 days). Moreover, these authors suggested that the areas active when the political beliefs were challenged (the insular cortex and the amygdala) were likely “signaling threats to deeply held beliefs in the same way they might signal threats to physical safety.”
For those of us who have seen a Twitter or Facebook battle over physiotherapy ideologies, we have witnessed people defend their viewpoints with such vigor and intensity that you would think they were being physically threatened. In a similar way, we may have tried to educate a patient on pain neuroscience only to have them dig in their heels and firmly reject the idea that their pain is caused by something other than their hip being “out,” fascia being restricted, skull being rotated, or … well, just insert your favorite patient misunderstanding here.

So why might this happen? 

Why do we not simply update our core beliefs when new and accurate evidence is provided? Here is where we see the effect of “motivated skepticism.” Taber and Lodge demonstrated motivated skepticism as a combination of the strength of prior beliefs, disconfirmation bias (denigrating arguments that do not agree with your own), and confirmation bias (actively seeking out evidence that confirms your beliefs) (ref).

They showed that when a group of political science students was exposed to evenly balanced pro and con arguments for gun control and abortion, the participants consistently rated arguments in line with their prior beliefs as stronger, generated more thoughts about arguments challenging their prior beliefs (mostly aimed at the weaknesses of those arguments), and, when allowed time to freely read arguments, chose to read arguments confirming their position (up to 3:1 in the group with the strongest prior beliefs).
What is the take home message here?

When presented with new evidence on a topic, we choose to evaluate information contrary to our beliefs with greater scrutiny.

We more readily accept information that confirms our biases, and the stronger we believe something, the more rigorously we do this.

In the case of the backfire effect, it is this active process of motivated skepticism that may actually allow us to become more sure of our core belief. We poke holes in the evidence that does not confirm our bias and actively seek out evidence that does. As a result, we strengthen our resolve in our protected beliefs. In the “You Are Not So Smart” podcast on the backfire effect, Sarah Gimbel (an author of the fMRI study) puts it beautifully. She suggests that it is not in our best interest to alter these core beliefs drastically, as doing so puts us “at odds” with our “in-group” and those we most identify with; it is just “easier not to” (ref).

If we are going to move our patients and profession forward, we must learn to combat the perception that it is just “easier not to.” In Part 2 we will discuss what we can do about “ignoring the evidence.” Stay tuned!

In the meantime: Educate. Encourage. Empower.

Want to learn more? Check out these resources:


“You Are Not So Smart” podcasts on the backfire effect – part 1; part 2; part 3

Curtis Tait, BSc, MPT, IMS