For most of the 20th century, political science, economics and philosophy relied on the premise that people base their opinions and choices on facts and logical reasoning. More recently, though, thousands of studies have shown that people actually rely on emotion and ingrained beliefs far more than they employ objective facts or logic.
In his book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman summarized this field of research, describing dozens of ways that cognitive biases skew human reasoning. Many other scientific books and articles confirm that human minds are predisposed to believe falsehoods and exaggerations because of biases, heuristics and fallacies. But there is one cognitive bias that is particularly important to understand if we are to be successful in politics.
Confirmation bias
It is confirmation bias. This is the tendency of people to seek out information that conforms to what they already believe or want to believe, while ignoring or refuting, in their own minds, information that disproves those assumptions.[1] It is a selective use of evidence through which people reinforce to themselves whatever they want to believe.
Confirmation bias is one of the oldest and best-documented cognitive biases. Sir Francis Bacon described it 400 years ago; in the 21st century, it is accepted science.
If people believe that violent crime keeps increasing, they will retain information about recent crimes and disbelieve or ignore the fact that crime rates have declined for decades. If individuals think the Earth is thousands, instead of billions, of years old, they will not believe the truth even when shown fossils in a museum. For that matter, if people are convinced that Friday the 13th is unlucky, they will notice and remember the times bad things happened on that date but will fail to remember all the Friday the 13ths when no misfortune occurred.
In short, when faced with facts that contradict strongly felt beliefs, people will almost always reject the facts and hold on to their beliefs.
Confirmation bias is crucial because, when it comes to politics, all of us carry in our heads a long list of preexisting beliefs, stereotypes and biases. So, if you present evidence or use language that seems to challenge your listeners’ key beliefs, they will stop listening. If they think you are saying “you’re wrong,” a switch clicks in their brains, turning off rational consideration and turning on negative emotions.
Why do people’s brains work that way?
Bias inside the brain
Psychologists widely use the labels System 1 and System 2 to describe two main cognitive systems in the human brain. System 1 is the “fast” system, which reacts instantaneously, reflexively and emotionally. This part of the brain is automatic, intuitive and subconscious. System 2 is the “slow” system, which is deliberate, controls abstract thinking, and stores memories such as facts and events. The System 2 part of the brain is more rational and reflective.
Because System 1 operates in milliseconds, its reactions can override or redirect System 2’s slower reasoning. If your listener’s reflexive system determines that you are attacking an important belief, it will divert thinking away from the rational mechanisms in the brain to emotional ones. Simultaneously, the listener’s mind will cherry-pick memories to reinforce the preexisting belief that seems to be under attack. In other words, System 1 will engage the “fight or flight” reflexes that protected evolving Homo sapiens in order to protect our modern-day beliefs.
Let us imagine you are discussing voter fraud with an irascible neighbor who believes it is a serious problem, and you say, “There is no evidence of massive voter fraud,” which is unquestionably true. His brain will perceive your words as an attack, he will feel a strongly negative emotional reaction, and he will then remember and focus on the fake news, very real to him, that supports his belief in voter fraud. You will have no chance to persuade him of anything; your effort at persuasion has failed.
As political activists, we wish we could reason with people and have calm, cool, dispassionate discussions about public policy. Instead, we tend to trigger a negative emotional response in our listeners, calling up memories that reinforce those negative emotions. We are arguing with ghosts from our listeners’ pasts, and losing.
Clinical psychologist Drew Westen of Emory University used functional magnetic resonance imaging (fMRI) to examine what was going on in the brains of partisans who supported either George W. Bush or John Kerry during the 2004 presidential contest. He gave test subjects a series of openly contradictory statements from each candidate. Based on confirmation bias, he expected that each partisan would overlook the contradictions of his or her own candidate while indignantly protesting the contradictions of the other guy. And just as Westen (and Sir Francis Bacon) would have expected, the test subjects did precisely that.
When Westen looked at the fMRI scans, the subjects, not too surprisingly, had not engaged the logical parts of their brains. They had engaged their emotions instead. And once the subjects had rationalized away legitimate attacks on their favored candidates, the pleasure centers of their brains released the neurotransmitter dopamine. As Westen explained in his book The Political Brain:
Once partisans had found a way to reason to false conclusions, not only did neural circuits involved in negative emotions turn off, but circuits involved in positive emotions turned on. The partisan brain didn’t seem satisfied in just feeling better. It worked overtime to feel good, activating reward circuits that give partisans a jolt of positive reinforcement for their biased reasoning. These reward circuits overlap substantially with those activated when drug addicts get their “fix,” giving new meaning to the term political junkie.
This means that when you attack preexisting beliefs, not only are your arguments rejected, but you are also helping to emotionally reward partisans for their stubbornness, deepening their attachment to false ideas.
The leaders of the radical right seem to understand all of this. They know that conservative voters are not searching for truth. They are, instead, consciously or unconsciously, seeking out information that conforms to their preexisting beliefs. That’s why those voters watch Fox News, listen to Rush Limbaugh, and read Breitbart. That’s also why conservatives are so susceptible to “fake news” on the Internet. They believe the lies because they want to—it quite literally feels bad to admit one is wrong and feels good to assert one is right.
In sum, there are tremendous barriers in the path of persuasion. How do we work around those obstacles?
[1] We use this term generically, as others do, to encompass related labels that describe how people irrationally confirm and defend their beliefs and desires, such as motivated reasoning, desirability bias, and disconfirmation bias.