It’s not that I didn’t know Alternate America existed. I knew it did. I knew people believed a whole host of things that, to me, didn’t reconcile with the evidence. Yet, I make a point of being willing to change my mind when presented with solid evidence for a different position, so I assumed, wrongly, that most people would reasonably do the same. Outside of a few hot-button issues where emotions override facts, I figured truth was inherently stronger than fiction, however convenient.
Now that idea seems naïve. Of course the truth is not stronger. Of course the evidence is not convincing to those who don’t want to be convinced. Why did I think it was? The clash between America and Alternate America has been seething beneath the surface, erupting in localized ways, for decades. And yes, Alternate America has been losing a lot of battles, but in response they’ve also been tightening their boundaries and reinforcing their narratives.
That was a smart choice for people who care more about protecting their beliefs than they care about correcting them. Ideology is stronger than truth. I thought it was stronger by a little bit; but it seems to be stronger by a great deal. Mix a potent ideology with a well-chosen narrative, and people will happily ignore their lying eyes.
I’ve been trying to understand how people could possibly believe that host of things that doesn’t match the evidence. But that was the wrong question; the question I should have been asking was “what are the narratives?”
I do not support Donald Trump. But what if I did? He legitimately won the election under our democratic system; only a quarter of the country voted for him, but that is the system we have. His rhetoric is divisive and untethered from evidence, but that is the rhetoric we decided was acceptable. The choices he makes, whether we like it or not, will shape our country and possibly the world for many years to come.
One thing I am sure of is that being politically divided and unwilling to change our views is a self-reinforcing feedback loop. It’s easy to use division to justify more of it. But I don’t want to do that. I want to have solidly-evidenced political positions.
I don’t plan to say “oh, give him a chance,” because our country already decided to give him that on November 8th, and because I do not personally expect him to become any more respectful or honest as president than he was in the year preceding the election. Nor do I intend to shut up about what I disagree with, because critiquing the government is patriotic and quashing dissent is undemocratic.
So he’d have my critique even if he already had my support. But what would he have to do to get my support? Under what conditions would I say “Well, I didn’t expect it, but he’s doing a good job”? If my opposition to Trump is partisan, there will be no such conditions. But if my opposition to Trump is based on his policies and actions, I should be able to say under what conditions I would change my mind.
It is conventional to give people the benefit of the doubt—to err, when possible, on the side of uncertainty and not to presume the unlikely is untrue. But it is one thing to give the benefit of the doubt in uncertain circumstances, and it is quite another to give an outsized benefit with very little doubt indeed. That, in essence, is the origin of false balance.
Worse, of late the media has taken to determining what subjects are in doubt not by what evidence is available, but instead by how forcefully people argue for one side or another. A forceful but untrue statement often triggers a confused and muddled response from journalists, who, by dint of their profession, know both that the statement is painfully untrue and that to contradict it outright is painfully taboo.
Journalistic conventions, intended to ensure fair treatment regardless of personal inclination, fail abysmally when public figures refuse to play by the rules.
Politics has always mobilized the most intuitive kind of lies—the kind that we don’t bother to look at very deeply because they confirm our existing prejudices. Politicians are masters of the lie that feels true, even when all the facts run counter. And we buy those lies, and repeat them, and believe them, not because they have any isolated value, but because they bolster our view of the world.
Yet even knowing that, this election seems to me to be built on uniquely straightforward misinformation.
So I have been paying more attention to this election than some in the past, but not because I am disillusioned or disgusted with the choices, or frustrated by my vote not counting the way I’d like. Instead, it is because I think this election is historic, and I very much want to see how we deal with it as a society.
There are a bunch of people out there right now who want to tell you “support the police” and “blue lives matter.” Many of these people also say slightly more nuanced things like “there is more crime in black neighborhoods so police are needed there” or “there are a few bad apples, but the police need the freedom to act” or “black people would be safe in encounters with the police if they just do what the police tell them.”
But all of those phrases have a missing word, right there at the end where it matters the most.
Conformity is one of those tricky things: we like to give it the side-eye, but we also like to practice it, often without even knowing we’re doing it. We enjoy the feeling of being “right” with everyone else. The trouble is, it’s really hard to think differently than the rest of a group—so the feeling of being “right” isn’t really a feeling of being right at all. It’s just a feeling of being the same.
There is a series of psychological experiments that speak to the question of conformity. Collectively, these are known as the Asch Paradigm, and the most oft-repeated result of these studies is that, given enough peer pressure, a large number of people will give obviously wrong answers to questions. For example, when asked a simple question like “which of these three lines is the same length as this fourth line?” people were much more likely to pick one that was obviously longer or shorter if a group of other people confidently chose the wrong line first. In other words, seeing other people give the wrong answer with confidence made them change their own answer—and even doubt their own judgment.
You can tell this as a story about how we succumb to the pressure of the group and espouse ideas that are wrong. But I think it is more interesting as a story about how we impose conformity on others—about how confident we are in our views, especially in groups, and how viciously we ostracize people who propose something different.
The precautionary principle is a critical and useful tool for addressing risk. Put simply, it encourages us to resolve uncertainty judiciously and carefully, with an awareness of possible risks. It gives us a check on unbridled enthusiasm, and an important check at that. In fact, many of the regulations we have in place in society are built around precaution rather than simply assuming something is worthwhile.
So the precautionary principle has value in many uncertain circumstances, but we shouldn’t assume that it has value in any uncertain circumstance because, to be perfectly frank, all circumstances are uncertain. The question is one of degree. And improperly applied, the precautionary principle can be unbridled and dangerous—the exact attitude it is intended to keep in check.