How to Lose an Argument: Biased Assimilation in Rational Thinking

CJ Green / 1.2.17

I think we owe ourselves a congratulations. We made it through the holidays! It isn’t always easy. Spending time with family, occupying the same dinner table, digging into the same refrigerator, watching the same movies with a group of people we never chose our relation to. Perhaps, at some point, a belligerent relative staggered through the doorway with all sorts of opinions about the president-elect, the state of affairs, the millennials. And perhaps it felt right to set him straight: “Let’s talk through this,” it’s tempting to say. “The science says this… The facts say this…”

When it comes to hot topics like politics or religion, rarely will our most fact-like facts actually persuade our opponents. Interestingly, no fewer than three Dans have weighed in on this: Dan Kopf, over at Quartz, recently covered Dan Kahan’s decade-long study of how even the most rational thinkers prioritize their own biases, research that draws inspiration from Nobel Prize winner Dan Kahneman’s two-system model of the mind. “System 1” thinking is faster, more heuristic, and emotion-driven; “System 2” is the rational system, the slower thinking that relies on empirical evidence and calculation.

Generally speaking, System 1 is considered more biased and less constructive, while System 2 is more detached and less partisan. “If only we would all just use our rational, scientific minds,” Kopf writes. “Then we could get past our disagreements. It’s a nice thought. Unfortunately, it’s wrong.” In his ten-year study, Kahan concluded that not only does System 2 prove unhelpful in creating common ground, but it can actually reinforce pre-existing biases.

Rather than use our best thinking to reach the truth, we use it to find ways to agree with others in our communities.

“The process is called biased assimilation,” says Kahan. “People will selectively credit and discredit information in patterns that reflect their commitment to certain values.”

In one illustrative study, Kahan asked over 1,500 respondents whether they agreed or disagreed with the following statement: “There is solid evidence of recent global warming due mostly to human activity such as burning fossil fuels.” For these same respondents, Kahan also collected information on their political beliefs, and measured their “science intelligence”—a metric based on answers to questions developed by the National Science Foundation, Pew Research Center, and others. These questions are intended to gauge a combination of scientific knowledge and quantitative reasoning proficiency.

Quick interjection: Kahan’s “science intelligence” metric isn’t a ranking of general intelligence; rather, it is designed to tease apart System 1 thinking from System 2 thinking.

When Kahan analyzed the data, he found that those with the least science intelligence actually hold less partisan positions than those with the most. A conservative Republican with strong science intelligence will use those skills to find evidence against human-caused global warming, while a liberal Democrat will use them to find evidence for it. The same holds for issues like fracking, evolution, and the risks associated with gun possession: whatever your preconceived political belief on an issue, you’ll use your scientific intelligence to try to prove you’re right.

Kahan explains that “individuals can be expected to form identity-protective beliefs and to use all of the cognitive resources at their disposal to do so.” Regardless of “science intelligence” levels, those who rely more on System 2 thinking demonstrate the greatest attachment to preconceived beliefs, using their predisposition to reason to defend those beliefs. Kahan goes on to explain:

Individuals highest in the critical reasoning dispositions associated with System 2 information processing were using their cognitive proficiencies to ferret out evidence consistent with their cultural or ideological predispositions and to rationalize the peremptory dismissal of evidence inconsistent with the same.

When we say “science proves this,” we are really saying “your beliefs are invalid, and here’s why.” We’re ultimately trying to control our opponent, or at the very least to impose a metric, a law that determines who is right(eous) and who is wrong.

Kahan’s research suggests, however, that using System 2 thinking as a means of controlling another person rarely works. When God told Adam and Eve not to eat from the tree, what did they do? Eat from it. The more our “rational” argument serves to accuse, the more intensely the accused will defend his beliefs. This may explain why, in my apologetics phase, I succeeded in converting exactly zero people. That said, in my non-apologetics phase, I’m still batting .000…

What’s more, System 2 thinkers tend to defend stances they haven’t scientifically researched for themselves:

Perhaps Kahan’s most disconcerting finding is that people with more scientific intelligence are the quickest to align themselves politically on subjects they don’t know anything about. In one experiment, Kahan analyzed how people’s opinions on an unfamiliar subject are affected when they are given some basic scientific information, along with details about what people in their self-identified political group tend to believe about that subject. It turned out that those with the strongest scientific reasoning skills were the ones most likely to use the information to develop partisan opinions.

Kahan argues that this is actually a very rational way of using our best thinking. “A person who forms a position out of line with her cultural peers risks estrangement from the people on whom she depends for emotional and material support,” writes Kahan. Better to use your intellectual faculties to stick to the company line.

In other words, rationality doesn’t work to serve the truth but to serve itself, out of (rational) self-preservation. When we defend our political or religious positions, it’s less about defending those positions and more about defending ourselves as people, fragile as we are.

And I probably would have believed all of this before Kahan had researched it.