The Stickiness of Misinformation

Even the most ridiculous rumors can stick in our minds, despite all the evidence against them. Sharon Begley tells us why.

Illustration by Sébastien Thibault

Isn’t it scandalous that Barack Obama, whose health-care reform law established death panels, is a Muslim who was born in Kenya? And isn’t it scary that all those scientific studies have shown that childhood vaccines can cause autism?

You might not believe these falsehoods, but if so, you’re in the minority. In a 2015 study, political scientist Adam Berinsky of MIT asked thousands of US voters to rate the truth or falsity of seven myths, such as that Obama is a Muslim or that vote fraud in Ohio swung the 2004 presidential election to George W. Bush. On average, people believed about two of them, he found. “Some people believe a lot of crazy things,” Berinsky said, “but mostly it’s a lot of people believing a few crazy things.”

Such credulity is bad enough for personal decision-making, as when it leads parents to opt their children out of vaccinations. The notion that a democracy’s electoral decisions are partly shaped by outright lies and slanted rumors must have George Orwell chortling smugly in his grave. Even worse is that misinformation can be “sticky,” or impervious to correction. But the reasons we believe misinformation and resist efforts to debunk it shed some not-very-flattering light on the workings of the human mind.

Start at the beginning, when we first hear a claim or rumor. People “proceed on the assumption that speakers try to be truthful,” psychologist Stephan Lewandowsky of England’s University of Bristol and colleagues explained in Psychological Science in the Public Interest. “Some research has even suggested that to comprehend a statement, people must at least temporarily accept it as true.”

That’s because assessing a claim’s plausibility is cognitively more demanding than simply assuming it is true. It requires paying careful attention, marshaling remembered facts, and comparing what we just heard to what we (think we) know and remember. Unless an assertion comes from a messenger we reflexively mistrust (as in, “I won’t believe anything Fox News says”) or concerns something we know as well as our own name, our cognitive reflex is to treat what we’re hearing as likely true. The mental deck is stacked in favor of belief, not skepticism.

In addition, people are generally more likely to accept claims that are consistent with what they already believe. In what’s called “motivated reasoning,” we process new information through the filter of our preexisting worldview. Think of the process as akin to filing papers: if a new document fits the contents of an existing folder, it’s much easier to file, and thus to remember, than if it doesn’t. Similarly, if many Americans had not already been primed with the idea that Obama is an outsider and a threat to “people like them,” the birther and death-panel assertions would not have gained the traction they did.

So now we have widely believed falsehoods. Let’s debunk them.

MIT’s Berinsky tried. In a 2015 study, he asked nearly 2,000 US voters whether the 2010 Affordable Care Act (“Obamacare”) established death panels that would decide whether treatment should be withdrawn from elderly patients. Among voters who said they follow political news, 57% said the death-panel claim was untrue, Berinsky reported in the British Journal of Political Science.

Fifty-seven percent might seem like cause to despair (“only 57% knew the truth?!”). But wait, it got worse. When Berinsky showed people information from nonpartisan sources such as the American Medical Association correcting the death-panel claim, it made little difference in the ranks of believers. “Rumors acquire their power through familiarity,” he said. “Merely repeating a rumor”—including to debunk it—“increases its strength” because our fallible brains conflate familiarity (“I’ve heard that before”) with veracity (“…so it must be true”). As a result, “confronting citizens with the truth can sometimes backfire and reinforce existing misperceptions.”

His findings reinforced something scientists had seen before: the “fluency effect.” The term refers to the fact that people judge the accuracy of information by how easy it is to recall or process. The more we hear something, the more familiar we are with it, so the more likely we are to accept it as true. That’s why a “myths vs. facts” approach to correcting beliefs about, say, vaccinations often fails. Right after reading such correctives, many people accept that something they believed to be true (that the flu vaccine can cause the flu, to take an example from one recent study) isn’t. But the effect fades.

Just hours later, people believe the myth as strongly as ever, studies find. Repeating false information, even in a context of “this is wrong,” makes it more familiar. Familiarity = fluency, and fluency = veracity. The Internet, of course, has exponentially increased the amount of misinformation available to us all, which means that we are “fluent” in ever more fallacious rumors and claims.


Debunking faces another hurdle: if misinformation fits with our worldview, then the debunking, by definition, clashes with that view. Earlier studies have shown that when self-described political conservatives were shown evidence that Iraq did not possess weapons of mass destruction (WMDs) at the time of the 2003 invasion, they became more likely to believe Iraq had those weapons, not less. Challenging a core conservative belief (that the invasion was justified on those grounds, and that the George W. Bush administration was correct in claiming the weapons existed) caused them to double down. It is harder to accept that the reports of WMDs in Iraq were false if one supported the 2003 invasion and the president who ordered it. The WMD debunking worked, correcting erroneous beliefs, only among opponents of the invasion and others whose political beliefs meshed with the retraction, a 2010 study found.

Now, to switch presidents: relinquishing belief in Obamacare’s death panels challenges the mental model of the president as a nefarious schemer who hates People Like Me. If that’s my cognitive model, then removing the “fact” of death panels weakens it. Challenging my mental model forces me to pause and ask: wait, which negative rumors about Obama are true and which are myths? It’s easier to believe they’re all true.

Misinformation is sticky because evicting it from our belief system requires cognitive effort. Remember the situation: our mind holds an assertion that likely went down easy, cognitively speaking; we assumed the veracity of the source and fluently slotted the claim into our mental worldview. Now here comes contrary information. It makes us feel cognitively uneasy and requires more mental processing power to absorb. That is the very definition of non-fluent: the information does not flow easily into our consciousness or memory.

All is not lost, however. In Berinsky’s death-panels study, he followed the AMA debunking with something quite different: quotes from a Republican senator slamming the rumors as a pack of lies. Now 69% agreed the claim was a fabrication, a significant uptick, with more disbelievers among both Democrats and Republicans. When an “unlikely source” refutes a rumor, and the debunking runs contrary to that source’s own interests (a Republican defending Obamacare?!), Berinsky explained, “it can increase citizens’ willingness to reject rumors.”

If the most effective way to debunk false rumors is to get a politician to speak against his or her own interests…well, I leave it to you, reader, to decide if, in our hyperpartisan world, this is more likely to happen than pigs flying.