How often have you presented to an audience (or person) who just didn't seem to grasp the logic of your presentation, and thought to yourself, "If they only understood the data ..." or "If they'd just stop for a moment and look at the numbers ..." or "Why can't they look at this objectively instead of emotionally"?
I've written elsewhere about confirmation bias, most people's tendency to ignore evidence that conflicts with their opinion while embracing evidence that agrees with it (see Confirmation bias: why it's so difficult to change some people's minds). And if you're a left-brained person, you probably thought when you read that article that you could get round this by using facts.
After all, "people lie, facts don't", right? Put enough numbers in front of someone and they have to see the logic of what you're saying. But a recent study by Yale Law professor Dan Kahan (read the complete study here, or the rest of this article for the layman's version) has shown that this isn't necessarily true.
Because when someone already has a strong view about something, it actually interferes with their ability to do the math. Believe it or not.
Over 1,100 people were asked about their political views and also answered a number of questions designed to gauge their mathematical reasoning ability. They were then split into two groups and given a mathematical problem to solve. The data they were asked to assess remained the same, but for half of them the study was described as dealing with the effectiveness of a new skin cream, while for the other half it was described as involving the effectiveness of "a law banning private citizens from carrying concealed handguns in public."
The results were astounding, in that the participants' ability to do the math fluctuated wildly depending on whether they thought they were dealing with face cream or a subject on which they had a strong opinion (i.e. gun control). And the more numerate a person was, the higher their susceptibility to letting their opinions influence the math.
The problem they were asked to solve is shown below. The correct answer is that patients who used the skin cream were “more likely to get worse than those who didn’t” because the ratio of those who saw their rash improve to those whose rash got worse is roughly 3 to 1 in the “skin cream” group, but roughly 5 to 1 in the control group — which means that if you want your rash to get better, you are better off not using the skin cream at all.
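The ratio comparison above can be sketched in a few lines of Python. The counts used here (223 improved vs 75 got worse with the cream; 107 vs 21 without) are illustrative figures chosen to match the roughly 3 to 1 and 5 to 1 ratios just described, not a claim about the exact numbers in Kahan's materials:

```python
# Illustrative counts consistent with the ratios described above
# (roughly 3:1 for the skin cream group, roughly 5:1 for the control).
outcomes = {
    "used skin cream": {"improved": 223, "got worse": 75},
    "no skin cream":   {"improved": 107, "got worse": 21},
}

for group, counts in outcomes.items():
    ratio = counts["improved"] / counts["got worse"]
    print(f"{group}: {ratio:.1f} improved for every 1 who got worse")

# Fast thinking compares the raw counts (223 > 107) and wrongly favours
# the cream; slow thinking compares the ratios (~3.0 vs ~5.1) and shows
# patients were better off without it.
```

The point of the exercise: the bigger raw number sits in the "skin cream" row, so intuition pulls you the wrong way unless you stop and compute the ratios.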
If you got it wrong, don't panic: 59% of participants in the study did as well. At first glance, your intuition will lead you astray; you have to take the time to compute the ratios to get it right (in other words, jettison 'fast thinking' and use 'slow thinking'; see my article Fast- vs slow-thinking). The more numerate a person was, the more likely they were to get this "skin cream" version of the problem right.
But if you take the same basic study question and simply label it differently, you get a different answer. Rather than reading about a skin cream study, half of Kahan's guinea pigs were asked to determine the effectiveness of laws "banning private citizens from carrying concealed handguns in public", and were given data not about rashes but about cities that had or hadn't passed concealed carry bans, and whether crime in those cities had or hadn't decreased.
Now, highly numerate Democrats did almost perfectly when the data suggested that a ban decreased crime, but did much worse when the correct answer suggested it increased it. And the opposite was true for highly numerate Republicans.
In fact, on average, highly numerate people were 45% more likely to get the answer right when their opinions were confirmed by the data.
So what's happening here?
Kahan thinks that our first instinct is to leap to the wrong conclusion, because if you just compare which number is bigger in the first column you'll be led astray. But when more numerate people sense an apparently wrong answer, they're motivated and equipped to take a second look, dig deeper, and perform some calculations, which normally leads to a more accurate response.
But if the 'wrong' answer conforms to their ideological position, there's far less motivation to do this, and people are much more likely to accept it. They come up with the answer that they think should be correct based on their convictions, and then just pick that option without doing any supporting analysis.
So the idea (beloved of many politicians and political commentators) that if people just had more knowledge or reasoning ability they'd be better able to understand complex issues like climate change, evolution, the economy or the safety of vaccines appears to be wrong. We view data, information and 'facts' through the prism of the opinions we already hold.
So next time you're presenting to an audience that is 'hostile' or might need a bit of persuasion to accept your proposal, don't assume that the facts will speak for themselves. The data they see on screen might be different from the data you see.
If you've found this interesting, you might like:
Confirmation bias: why it's so difficult to change some people's minds
Fast- vs slow-thinking