# Statistics are complicated, right? Well, thinking so is part of why we tend to get misled.

Ars Technica keeps it simple and smart with a new study that shows why statistics can be the third kind of lie* – sometimes the hard way to think about a problem really is the wrong way, yet it can seem so much more credible than the simple way… which makes it a potent way to confuse us:

A new study in Frontiers in Psychology examined why people struggle so much to solve statistical problems, particularly why we show a marked preference for complicated solutions over simpler, more intuitive ones. Chalk it up to our resistance to change. The study concluded that fixed mindsets are to blame: we tend to stick with the familiar methods we learned in school, blinding us to the existence of a simpler solution.

“As soon as you pick up a newspaper, you’re confronted with so many numbers and statistics that you need to interpret correctly,” says co-author Patrick Weber, a graduate student in math education at the University of Regensburg in Germany. Most of us fall far short of the mark.

Part of the problem is the counterintuitive way in which such problems are typically presented. [Prosecution expert witness Sir Roy] Meadow presented his evidence in the so-called “natural frequency format” (for example, 1 in 10 people), rather than in terms of a percentage (10 percent of the population). That was a smart decision, since 1-in-10 is a more intuitive, jury-friendly approach. Recent studies have shown that performance rates on many statistical tasks increased from four percent to 24 percent when the problems were presented using the natural frequency format.

That makes sense, since calculating a probability is complicated, requiring three multiplications and one addition, according to Weber, before dividing the resulting two terms. By contrast, just one addition and one division are needed with the natural frequency format. “With natural frequencies, you have one reference set that you can vividly imagine,” says Weber. The probability format is more abstract and less intuitive.

But what about the remaining 76 percent who still can’t solve these kinds of problems? Weber and his colleagues wanted to figure out why. They recruited 180 students from the university and presented them with two sample problems in so-called Bayesian reasoning, framed in either a probability format or a natural frequency format.

For instance, the probability of a randomly picked person from a given population being addicted to heroin is 0.01 percent (the base rate). If the person selected is a heroin addict, there is a 100 percent probability that the person will have fresh needle marks on their arm (the sensitivity element). However, there is also a 0.19 percent chance that the randomly picked person will have fresh needle marks on their arm even if they are not a heroin addict (the false-alarm rate). So what is the probability that a randomly picked person with fresh needle marks is addicted to heroin (the posterior probability)?

Here is the same problem in the natural frequencies format: 10 out of 100,000 people will be addicted to heroin. And 10 out of 10 heroin addicts will have fresh needle marks on their arms. Meanwhile, 190 out of 99,990 people who are not addicted to heroin will nonetheless have fresh needle marks. So what percentage of the people with fresh needle marks is addicted to heroin?

In both cases, the answer is five percent, but the process by which one arrives at that answer is far simpler in the natural frequency format. The set of people with fresh needle marks on their arms is the sum of all the heroin addicts (10) plus the 190 non-addicts. And 10/200 gives you the correct answer.
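To make the difference concrete, here is a small sketch of the heroin-addict problem worked both ways (the variable names are my own, not from the study); both routes land on roughly five percent, but the probability format needs several multiplications before the division, while the natural frequency format is a single addition and a single division:

```python
# --- Probability format: Bayes' theorem ---
base_rate = 0.0001    # 0.01 percent of the population are addicts
sensitivity = 1.0     # every addict has fresh needle marks
false_alarm = 0.0019  # 0.19 percent of non-addicts also have marks

# Multiply out both branches, add them, then divide:
posterior = (base_rate * sensitivity) / (
    base_rate * sensitivity + (1 - base_rate) * false_alarm
)
print(f"Probability format: {posterior:.1%}")  # ~5.0%

# --- Natural frequency format ---
addicts_with_marks = 10       # 10 of 100,000 people, all with marks
non_addicts_with_marks = 190  # 190 of the remaining 99,990

# One addition and one division:
posterior_nf = addicts_with_marks / (addicts_with_marks + non_addicts_with_marks)
print(f"Natural frequency format: {posterior_nf:.1%}")  # 5.0%
```

Note that the two answers differ only in the fourth decimal place: the natural frequency version rounds the false-alarm rate to a whole count of people (190), which is exactly why it is easier to picture.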

Weber and his colleagues were surprised to find that even when presented with problems in the natural frequency format, half the participants didn’t use the simpler method to solve them. Rather, they “translated” the problem into the more challenging probability format with all the extra steps, because it was the more familiar approach.

That is the essence of a fixed mindset, also known as the Einstellung effect. “We have previous knowledge that we incorporate into our decisions,” says Weber. That can be a good thing, enabling us to make decisions faster. But it can also blind us to new, simpler solutions to problems. Even expert chess players are prone to this. They ponder an opponent’s move and choose the tried-and-true counter-strategy they know so well, when there might be an easier way to checkmate their opponent.

Weber proposes that one reason this happens is that students are simply overexposed to the probability format in their math classes.

* “There are three kinds of lies: lies, damned lies, and statistics.” – credited to Benjamin Disraeli.