In popular discussion, uncertainty serves as a wedge—a point of weakness with which you can destroy an idea you don’t like. So it isn’t that surprising that the selfish and self-serving use scientific uncertainty as a wedge as well; it doesn’t work in the scientific literature, but it does work in the minds of the public. We hear “uncertainty around climate change” and, for many of us, it means “we don’t know.”
The simplicity is appealing—after all, we know something or we don’t. Can you really half-know? Well… yes. Even that simple question shows us the difference between our gut feeling about knowing, and how it actually works. There is a great range of nuance in the idea of uncertainty, and when scientists say that something has uncertainty, they mean something much more specific than what most people think of as uncertain.
The language of science requires us to embrace uncertainty in order to understand it. Science is all about shrinking uncertainty, not to zero, but to the smallest reasonable range that evidence and method can support. The whole endeavor of science is to presume we don’t know, and then eliminate things we can be sure are wrong. Not to be certain about what is true, but to arrive at an approximation we can work with.
That means in order for us to have a discussion about so nuanced and evidence-heavy a topic as climate change, we need to go out of our way to understand uncertainty. The good news? We already do—we just need to think about it in different terms.
I do not support Donald Trump. But what if I did? He legitimately won the election under our democratic system; only a quarter of the country voted for him, but that is the system we have. His rhetoric is divisive and untethered from evidence, but that is the rhetoric we decided was acceptable. The choices he makes, whether we like it or not, will shape our country and possibly the world for many years to come.
One thing I am sure of is that being politically divided and unwilling to change our views is a self-reinforcing feedback loop. It’s easy to use division to justify more division. But I don’t want to do that. I want to have solidly evidenced political positions.
I don’t plan to say “oh, give him a chance,” because our country already decided to give him that on November 8th, and because I do not personally expect him to become any more respectful or honest as president than he was in the year preceding the election. Nor do I intend to shut up about what I disagree with, because critiquing the government is patriotic and quashing dissent is undemocratic.
So he’d have my critique even if he already had my support. But what would he have to do to get my support? Under what conditions would I say “Well, I didn’t expect it, but he’s doing a good job”? If my opposition to Trump is partisan, there will be no such conditions. But if my opposition to Trump is based on his policies and actions, I should be able to say under what conditions I would change my mind.
Over the past year the Black Lives Matter movement has called attention to police violence that falls disproportionately on black Americans. For many black Americans, this was a breakthrough into the mainstream for a challenge they have lived with their entire lives. For many white Americans, this is a new and surprising piece of information about the world.
If you are a white American, it’s understandable that you would find it novel to think black Americans have more to fear from police officers. After all, you may have lived your entire life without worrying much about the police, and certainly without feeling like you have no control over whether you live or die at a traffic stop. You might wonder what we, a free and just society, should do about this new problem.
But of course it isn’t a new problem—just novel to you. It’s an old problem, and what’s new is that white people, by and large, now know about it.
Conformity is one of those tricky things: we like to give it the side-eye, but we also like to practice it, often without even knowing we’re doing it. We enjoy the feeling of being “right” with everyone else. The trouble is, it’s really hard to think differently than the rest of a group—so the feeling of being “right” isn’t really a feeling of being right at all. It’s just a feeling of being the same.
There is a series of psychological experiments that speak to the question of conformity. Collectively, these are known as the Asch Paradigm, and the most oft-repeated result of these studies is that, given enough peer pressure, a large number of people will give obviously wrong answers to questions. For example, when asked a simple question like “which of these three lines is the same length as this fourth line?” people were much more likely to pick one that was obviously longer or shorter if a group of other people confidently chose the wrong line first. In other words, seeing other people give the wrong answer with confidence made them change their own answer—and even doubt their own judgment.
You can tell this as a story about how we succumb to the pressure of the group and espouse ideas that are wrong. But I think it is more interesting as a story about how we impose conformity on others—about how confident we are in our views, especially in groups, and how viciously we ostracize people who propose something different.
There is a point in believing an idea where, regardless of where we began, we lose the habit of refining that idea. Instead of seeking to improve our positions, we begin to defend them. Instead of searching for the nuance, we begin to strip it away.
It isn’t every idea—but certain ideas seem to burrow into our politics, our religion, and our activism, and once they are firmly in place, we refuse to let them go. And we begin to vilify anyone who suggests otherwise. I cannot tell whether it is due to external elements, like deep social division, or internal elements, like an uncritical approach to one’s own beliefs. Perhaps it is both, or perhaps it is something else entirely. But I think it not coincidental that these are tribal ideas: they are ideas that mark our membership as much as they define our position.
“Allegedly” is one of those words that people stick in front of disputed things, and it serves the useful purpose of signaling that the dispute exists. But there is another way people use it as well, and that is less about signaling dispute and more about introducing it. And it works! For me, as a reader, when I see the word “alleged” tied to something, it makes me more critical, more doubtful, and more aware that some other people don’t think the thing in question is true.
So, I find it rather disturbing when people use the word “alleged” for things like sexual assault, abuse, and online harassment. In this context, the word is used as a rhetorical trick, even (especially?) when the event itself is not really in doubt, to create that doubt. People use this word, in short, to minimize the experiences of women.
Certainty is a funny thing. You might think the idea of certainty naturally admits that things are subjective, that absolute proof is difficult, and that beliefs must be updated to reflect changing evidence. But that isn’t how we practice certainty—instead of signaling a spectrum of probable truth, it seems to have become an arbiter of validity.
When someone is certain, that should be a commentary on the evidence they have for a position. Somehow, though, certainty has been divorced from that spectrum of evidence. Instead of certainty being the extreme end, it has become the correct end; the rest of the spectrum is collapsed and we are left with the binary of certainty and uncertainty. In that strange dichotomous world, anything uncertain isn’t worth considering—as though lack of absolutism frees us from any tether to the real world.
Honesty is the most important thing—at least, that was a value I learned growing up. No criticism was left unspoken, nor was there any thought that it should be. I learned to value blunt, direct language. I learned to say what I thought. I learned to be brutally honest, and to believe it was the right thing to do.
What I learned was not unique. I see a lot of people who prefer to be direct and who find honesty refreshing. I know a lot of people who find subterfuge and subtext exhausting, and who are actively annoyed by people who weave and bob and refuse to say what they think. And I, like a lot of other people, am actively annoyed by the fact that public figures say whatever they think people want to hear with no regard for truth. Honesty, I think, is objectively valuable.
What I didn’t learn, at least for a while, is that brutality and honesty need not go together.
Richard Feynman famously described science, and curiosity broadly, as “the pleasure of finding things out.” There are certainly few things I enjoy more than to turn over ideas and work them through to some new place, even more so in quick and intelligent company. I consider it a life philosophy to avoid stopping at the obvious conclusions, and instead to see what more may be learned with a few judicious questions. It isn’t science per se, but it shares with science a reliance on method. In learning, as in science, one must start with the assumption that one is wrong.
The qualities that lead to an idea going viral and the qualities that make an idea credible, apparently, do not overlap. Personally, I think this is because many of us have bad ideological immune systems: we accept ideas based on whether they fit what we already agree with, not based on whether they are well-supported by the evidence. That’s what the research seems to show so far, anyway.
The latest wrinkle in the puzzle of how bad ideas spread as easily as good ones comes from a study recently published in the Proceedings of the National Academy of Sciences. You can read the abstract or the full study online, but the shorthand version is that Vicario et al. looked at conspiracy theories and science news to see how they spread on Facebook and to try to learn something about the dynamics of that spread. Like a number of studies in the past, the results showed that both kinds of ideas tend to spread in “homogeneous clusters” of users (essentially echo-chambers), slowly diffusing through a cluster over a period of days or weeks.
What I find interesting about this study is that it also shows that assessment of information is lost along the way; science news and conspiracy theories both rapidly become background information as they diffuse through an echo-chamber. By a few weeks out, users in a group will consider the new information fact and resist attempts to change it.