“People are pouring across our borders.” “Immigrants are taking our jobs.” “Unemployment is the worst it’s ever been.” “Refugees are coming in and we have no idea who they are.” “The second amendment is absolutely under siege.” “Islamists want to conquer this country and impose sharia law.”
If there is a single narrative at stake for America, this is it—“they’re coming for you.”
And if you’re thinking “that’s the other guys,” flip the script—don’t just look at the words of one particularly unfiltered and untruthful demagogue, look at the narrative overall.
“Money is pouring into politics and controlling our elections.” “Corporations are destroying our jobs and our health.” “Chemicals are ending up in our food and we have no idea what they do.” “Christians want to take over the government and impose their restrictive beliefs.”
Whether you frame it as a story of fear or a story of heroic resistance, the core is the same: we’re under attack by dangerous, insidious people who have come to take what we have, and if we don’t fight back, we’ll watch our way of life disappear. So stand up and fight, or be prepared to lose your freedom.
Except… every single one of those statements is a lie. Every. Single. One. Some of them are motivated lies, and some of them are ignorant lies, and some of them are exaggerated lies, but they are all lies.
Conformity is one of those tricky things: we like to give it the side-eye, but we also like to practice it, often without even knowing we’re doing it. We enjoy the feeling of being “right” with everyone else. The trouble is, it’s really hard to think differently than the rest of a group—so the feeling of being “right” isn’t really a feeling of being right at all. It’s just a feeling of being the same.
There is a series of psychological experiments that speak to the question of conformity. Collectively, these are known as the Asch Paradigm, and the most oft-repeated result of these studies is that, given enough peer pressure, a large number of people will give obviously wrong answers to questions. For example, when asked a simple question like “which of these three lines is the same length as this fourth line?” people were much more likely to pick one that was obviously longer or shorter if a group of other people confidently chose the wrong line first. In other words, seeing other people give the wrong answer with confidence made them change their own answer—and even doubt their own judgment.
You can tell this as a story about how we succumb to the pressure of the group and espouse ideas that are wrong. But I think it is more interesting as a story about how we impose conformity on others—about how confident we are in our views, especially in groups, and how viciously we ostracize people who propose something different.
It’s easy to say members of the fringe aren’t part of the group. We’d prefer that they not be, at least when it comes to public perception. The fringe is an uncomfortable reminder of the flaws in our beliefs: as the Westboro Baptist Church is to Christians, as PETA is to environmentalists, as racist Trump supporters are to Republicans, as GamerGate trolls are to gamers, and so on. We want to say these people are not really Christians, or environmentalists, or whatever group they claim to be part of.
But that’s rarely true—more frequently, these are the members we uncomfortably ignore, espousing views we have left carefully unstated inside our communities. They are bad actors we tolerate in our midst because, somewhere, we decided that solidarity trumps civility. When they finally become the loud voices, we suddenly want to distance ourselves from them, but it’s too late. Our complicity is already established.
People lie. People lie pretty much all the time—but most of those lies aren’t the sort of lies that matter. They are untruths that we expect and reinforce socially. They are lies that are, in a sense, required.
“How are you today?”
“Fine, how are you?”
I have trouble with things like that because I always want to answer truthfully. It took me a while to accept that it’s not a real question so much as a script, and that the answer is part of the script, and that because the answer doesn’t convey real information, it isn’t really untrue. It’s not really a lie. I may not be fine, but if I say that I am, that’s fine.
That’s a lie that isn’t really a lie, repeated for the benefit of a social script. We like social scripts, and they make us feel better. They make us feel like we understand the world. But there are lies we tell ourselves, too. There are social scripts we repeat to ourselves, and others, that are deeply, fundamentally, untrue. And while most of us know that “fine” doesn’t really cover it when the lie is about ourselves, it’s easy to forget that the scripts don’t really cover it about anyone else, either.
Especially if they have a different experience. And especially if the script is a script for those of us with social privileges. Like, say, if you’re white.
There is a point in believing an idea where, regardless of where we began, we lose the habit of refining that idea. Instead of seeking to improve our positions, we begin to defend them. Instead of searching for nuance, we begin to strip it away.
It isn’t every idea—but certain ideas seem to burrow into our politics, our religion, and our activism, and once they are firmly in place, we refuse to let them go. And we begin to vilify anyone who suggests otherwise. I cannot tell whether it is due to external elements, like deep social division, or internal elements, like an uncritical approach to one’s own beliefs. Perhaps it is both, or perhaps it is something else entirely. But I think it not coincidental that these are tribal ideas: they are ideas that mark our membership as much as they define our position.
The qualities that lead to an idea going viral and the qualities that make an idea credible, apparently, do not overlap. Personally, I think this is because many of us have bad ideological immune systems: we accept ideas based on whether they fit what we already agree with, not based on whether they are well-supported by the evidence. That’s what the research seems to show so far, anyway.
The latest wrinkle in the puzzle of how bad ideas spread as easily as good ones comes from a study recently published in the Proceedings of the National Academy of Sciences. You can read the abstract or the full study online, but the shorthand version is that Del Vicario et al. looked at conspiracy theories and science news to see how they spread on Facebook and to try to learn something about the dynamics of that spread. Like a number of studies in the past, the results showed that both kinds of ideas tend to spread in “homogeneous clusters” of users (essentially echo-chambers), slowly diffusing through a cluster over a period of days or weeks.
What I find interesting about this study is that it also shows that assessment of information is lost along the way; science news and conspiracy theories both rapidly become background information as they diffuse through an echo-chamber. By a few weeks out, users in a group will consider the new information fact and resist attempts to change it.
The echo-chambers are echoing loudly of late. Crisis and fear always seem to pick at the scabs of history. In our media and our minds, a slurry of racist, sexist, xenophobic, and Islamophobic ideas oozes back to the surface and spills out into the world around us.
I want to write people off when they say such things, and certainly it becomes harder to believe that people can change. I want to write them off because enemies are simple. But people are complicated; we can change, and we do. We just tend to forget that we have, and so we assume that other people can’t. Simplifying ourselves encourages us to simplify others, reducing them slowly and surely to enemies.
I think a part of the way forward is to look back: to remember our own changes. To talk about them. To wear changing our minds as a badge of honor rather than shame.
I used to be anti-abortion.