Why Smart People Believe Stupid Things

missmentelle:

If you’ve been paying attention for the last couple of years, you might have noticed that the world has a bit of a misinformation problem. 

The problem isn’t just with the recent election conspiracies, either. The last couple of years have brought us the rise (and occasional fall) of misinformation-based movements like:

  • Sandy Hook conspiracies
  • Gamergate
  • Pizzagate
  • The MRA/incel/MGTOW movements
  • Anti-vaxxers
  • Flat-earthers
  • The birther movement
  • The Illuminati
  • Climate change denial
  • Spygate
  • Holocaust denial
  • COVID-19 denial
  • 5G panic
  • QAnon

But why do people believe this stuff?

It would be easy - too easy - to say that people fall for this stuff because they’re stupid. We all want to believe that smart people like us are immune from being taken in by deranged conspiracies. But it’s just not that simple. People from all walks of life are going down these rabbit holes - people with degrees and professional careers and rich lives have fallen for these theories, leaving their loved ones baffled. Decades-long relationships have splintered this year, as the number of people flocking to these conspiracies out of nowhere reaches a fever pitch. 

So why do smart people start believing some incredibly stupid things? It’s because:

Our brains are built to identify patterns. 

Our brains fucking love puzzles and patterns. This is a well-known phenomenon called apophenia, and at one point, it was probably helpful for our survival - the prehistoric human who noticed patterns in things like animal migration, plant life cycles and the movement of the stars was probably a lot more likely to survive than the human who couldn’t figure out how to use natural clues to navigate or find food. 

The problem, though, is that we can’t really turn this off. Even when we’re presented with completely random data, we’ll see patterns. We see patterns in everything, even when there’s no pattern there. This is why people see Jesus in a burnt piece of toast or get superstitious about hockey playoffs or insist on always playing at a certain slot machine - our brains look for patterns in the constant barrage of random information in our daily lives, and insist that those patterns are really there, even when they’re completely imagined. 

A lot of conspiracy theories have their roots in people making connections between things that aren’t really connected. The belief that “vaccines cause autism” was bolstered by the fact that the first recognizable symptoms of autism happen to appear at roughly the same time that children receive one of their rounds of childhood immunizations - the two things are completely unconnected, but our brains have a hard time letting go of the pattern they see there. Likewise, many people were quick to latch on to the fact that early maps of COVID infections were extremely similar to maps of 5G coverage - the fact that there’s a reasonable explanation for this (major cities are more likely to have both high COVID cases AND 5G networks) doesn’t change the fact that our brains just really, really want to see a connection there.

Our brains love proportionality. 

Specifically, our brains like effects to be directly proportional to their causes - in other words, we like it when big events have big causes, and small causes only lead to small events. It’s uncomfortable for us when the reverse is true. And so anytime we feel like a “big” event (celebrity death, global pandemic, your precious child is diagnosed with autism) has a small or unsatisfying cause (car accident, pandemics just sort of happen every few decades, people just get autism sometimes), we sometimes feel the need to start looking around for the bigger, more sinister, “true” cause of that event. 

Consider, for instance, the attempted assassination of Pope John Paul II. In 1981, Pope John Paul II was shot four times by a Turkish member of a known Italian paramilitary secret society who’d recently escaped from prison - on the surface, it seems like the sort of thing conspiracy theorists salivate over, seeing how it was an actual multinational conspiracy. But they never had much interest in the assassination attempt. Why? Because the Pope didn’t die. He recovered from his injuries and went right back to Pope-ing. The event didn’t have a serious outcome, and so people are content with the idea that one extremist carried it out. The death of Princess Diana, however, has been fertile ground for conspiracy theories; even though a woman dying in a car accident is less weird than a man being shot four times by a paid political assassin, her death has attracted more conspiracy theories because it had a bigger outcome. A princess dying in a car accident doesn’t feel big enough. It’s unsatisfying. We want such a momentous moment in history to have a bigger, more interesting cause.

These theories prey on pre-existing fear and anger. 

Are you a terrified new parent who wants the best for their child and feels anxious about having them injected with a substance you don’t totally understand? Congrats, you’re a prime target for the anti-vaccine movement. Are you a young white male who doesn’t like seeing more and more games aimed at women and minorities, and is worried that “your” gaming culture is being stolen from you? You might have been very interested in something called Gamergate. Are you a right-wing white person who worries that “your” country and way of life is being stolen by immigrants, non-Christians and coastal liberals? You’re going to love the “all left-wingers are Satanic pedo baby-eaters” messaging of QAnon.

Misinformation and conspiracy theories are often aimed strategically at the anxieties and fears that people are already experiencing. No one likes being told that their fears are insane or irrational; it’s not hard to see why people gravitate towards communities that say “yes, you were right all along, and everyone who told you that you were nuts to be worried about this is just a dumb sheep. We believe you, and we have evidence that you were right all along, right here.” Fear is a powerful motivator, and you can make people believe and do some pretty extreme things if you just keep telling them “yes, that thing you’re afraid of is true, but also it’s way worse than you could have ever imagined.”

Real information is often complicated, hard to understand, and inherently unsatisfying. 

The information that comes from the scientific community is often very frustrating for a layperson; we want science to have hard-and-fast answers, but it doesn’t. The closest you get to a straight answer is often “it depends” or “we don’t know, but we think X might be likely”. Understanding the results of a scientific study with any confidence requires knowing about sampling practices, error types, effect sizes, confidence intervals and publishing biases. Even asking a simple question like “is X bad for my child” will usually get you a complicated, uncertain answer - in most cases, it really just depends. Not understanding complex topics makes people afraid - it makes it hard to trust that they’re being given the right information, and that they’re making the right choices. 

Conspiracy theories and misinformation, on the other hand, are often simple, and they are certain. Vaccines bad. Natural things good. 5G bad. Organic food good. The reason girls won’t date you isn’t a complex combination of your social skills, hygiene, appearance, projected values, personal circumstances, degree of extroversion, luck and life phase - girls won’t date you because feminism is bad, and if we got rid of feminism you’d have a girlfriend. The reason Donald Trump was an unpopular president wasn’t a complex combination of his public bigotry, lack of decorum, lack of qualifications, open incompetence, nepotism, corruption, loss of soft power, refusal to uphold the basic responsibilities of his position or his constant lying - they hated him because he was fighting a secret sex cult and they’re all in it. 

Instead of making you feel stupid because you’re overwhelmed with complex information, expert opinions and uncertain advice, conspiracy theories make you feel smart - smarter, in fact, than everyone who doesn’t believe in them. And that’s a powerful thing for people living in a credential-heavy world. 

Many conspiracy theories are unfalsifiable. 

It is very difficult to prove a negative. If I tell you, for instance, that there’s no such thing as a purple swan, it would be very difficult for me to actually prove that to you - I could spend the rest of my life photographing swans and looking for swans and talking to people who know a lot about swans, and yet the slim possibility would still exist that there was a purple swan out there somewhere that I just hadn’t found yet. That’s why, in most circumstances, the burden of proof lies with the person making the extraordinary claim - if you tell me that purple swans exist, we should continue to assume that they don’t until you actually produce a purple swan. 

Conspiracy theories, however, are built so that it’s nearly impossible to “prove” them wrong. Is there any proof that the world’s top-ranking politicians and celebrities are all in a giant child sex-trafficking cult? No. But can you prove that they aren’t in a child sex-trafficking cult? No, not really. Even if I, again, spent the rest of my life investigating celebrities and following celebrities and talking to people who know celebrities, I still couldn’t definitively prove that this cult doesn’t exist - there’s always a chance that the specific celebrities I’ve investigated just aren’t in the cult (but other ones are!) or that they’re hiding evidence of the cult even better than we think. Lack of evidence for a conspiracy theory is always treated as more evidence for the theory - we can’t find anything because this goes even higher up than we think! They’re even more sophisticated at hiding this than we thought! People deeply entrenched in these theories don’t even realize that they are stuck in a circular loop where everything seems to prove their theory right - they just see a mountain of “evidence” for their side.

Our brains are very attached to information that we “learned” by ourselves.

Learning accurate information is not a particularly interactive or exciting experience. An expert or reliable source just presents the information to you in its entirety, you read or watch the information, and that’s the end of it. You can look for more information or look for clarification of something, but it’s a one-way street - the information is just laid out for you, you take what you need, end of story. 

Conspiracy theories, on the other hand, almost never show their hand all at once. They drop little breadcrumbs of information that slowly lead you where they want you to go. This is why conspiracy theorists are forever telling you to “do your research” - they know that if they tell you everything at once, you won’t believe them. Instead, they want you to indoctrinate yourself slowly over time, by taking the little hints they give you and running off to find or invent evidence that matches that clue. If I tell you that celebrities often wear symbols that identify them as part of a cult and that you should “do your research” about it, you can absolutely find evidence that substantiates my claim - there are literally millions of photos of celebrities out there, and anyone who looks hard enough is guaranteed to find common shapes, poses and themes that might just mean something (they don’t - eyes and triangles are incredibly common design elements, and if I took enough pictures of you, I could also “prove” that you also clearly display symbols that signal you’re in the cult). 

The fact that you “found” the evidence on your own, however, makes it more meaningful to you. We trust ourselves, and we trust that the patterns we uncover by ourselves are true. It doesn’t feel like you’re being fed misinformation - it feels like you’ve discovered an important truth that “they” didn’t want you to find, and you’ll hang onto that for dear life. 

Older people have not learned to be media-literate in a digital world. 

Fifty years ago, not just anyone could access popular media. All of this stuff had a huge barrier to entry - if you wanted to be on TV or be in the papers or have a radio show, you had to be a professional affiliated with a major media brand. Consumers didn’t have easy access to niche communities or alternative information - your sources of information were basically your local paper, the nightly news, and your morning radio show, and they all more or less agreed on the same set of facts. For decades, if it looked official and it appeared in print, you could probably trust that it was true. 

Of course, we live in a very different world today - today, any asshole can accumulate an audience of millions, even if they have no credentials and nothing they say is actually true (like “The Food Babe”, a blogger with no credentials in medicine, nutrition, health sciences, biology or chemistry who peddles health misinformation to the 3 million people who visit her blog every month). It’s very tough for older people (and some younger people) to get their heads around the fact that it’s very easy to create an “official-looking” news source, and that they can’t necessarily trust everything they find on the internet. When you combine that with a tendency toward “clickbait headlines” that often misrepresent the information in the article, you have a generation struggling to determine who they can trust in a media landscape that doesn’t at all resemble the media landscape they once knew. 

These beliefs become a part of someone’s identity. 

A person doesn’t tell you that they believe in anti-vaxx information - they tell you that they ARE an anti-vaxxer. Likewise, people will tell you that they ARE a flat-earther, a birther, or a Gamergater. By design, these beliefs are not meant to be something you have a casual relationship with, like your opinion of pizza toppings or how much you trust local weather forecasts - they are meant to form a core part of your identity.

And once something becomes a core part of your identity, trying to make you stop believing it becomes almost impossible. Once we’ve formed an initial impression of something, facts just don’t change our minds. If you identify as an anti-vaxxer and I present evidence that disproves your beliefs, in your mind, I’m not correcting inaccurate information - I am launching a very personal attack against a core part of who you are. In fact, the more evidence I present, the more you will burrow down into your anti-vaxx beliefs, more confident than ever that you are right. Admitting that you are wrong about something that is important to you is painful, and your brain would prefer to simply deflect conflicting information rather than subject you to that pain.

We can see this at work with something called confirmation bias. Simply put, once we believe something, our brains hold on to all evidence that that belief is true, and ignore evidence that it’s false. If I show you 100 articles that disprove your pet theory and 3 articles that confirm it, you’ll cling to those 3 articles and forget about the rest. Even if I show you nothing but articles that disprove your theory, you’ll likely go through them and pick out any ambiguous or conflicting information as evidence for “your side”, even if the conclusion of the article shows that you are wrong - our brains simply care about feeling right more than they care about what is actually true.

There is a strong community aspect to these theories. 

There is no one quite as supportive or as understanding as a conspiracy theorist - provided, of course, that you believe in the same conspiracy theories that they do. People who start looking into these conspiracy theories are told that they aren’t crazy, and that their fears are totally valid. They’re told that the people in their lives who doubted them were just brainwashed sheep, but that they’ve finally found a community of people who get where they’re coming from. Whenever they report back to the group with the “evidence” they’ve found or the new elaborations on the conspiracy theory that they’ve been thinking of (“what if it’s even worse than we thought??”), they are given praise for their valuable contributions. These conspiracy groups often become important parts of people’s social networks - they can spend hours every day talking with like-minded people from these communities and sharing their ideas. 

Of course, the flipside of this is that anyone who starts to doubt or move away from the conspiracy immediately loses that community and social support. People who have broken away from antivaxx and QAnon often say that the hardest part of leaving was losing the community and friendships they’d built - not necessarily giving up on the theory itself. Many people are rejected by their real-life friends and family once they start to get entrenched in conspiracy theories; the friendships they build online in the course of researching these theories often become the only social supports they have left, and losing those supports means having no one to turn to at all. This is by design - the threat of losing your community has kept people trapped in abusive religious sects and cults for as long as those things have existed.