Climate of Uncertainty

In popular discussion, uncertainty serves as a wedge—a point of weakness with which you can destroy an idea you don’t like. So it isn’t that surprising that the selfish and self-serving use scientific uncertainty as a wedge as well; it doesn’t work in the scientific literature, but it does work in the minds of the public. We hear “uncertainty around climate change” and, for many of us, it means “we don’t know.”

The simplicity is appealing—after all, we know something or we don’t. Can you really half-know? Well… yes. Even that simple question shows us the difference between our gut feeling about knowing, and how it actually works. There is a great range of nuance in the idea of uncertainty, and when scientists say that something has uncertainty, they mean something much more specific than what most people think of as uncertain.

The language of science requires us to embrace uncertainty in order to understand it. Science is all about shrinking uncertainty, not to zero, but to the smallest reasonable range that evidence and method can support. The whole endeavor of science is to presume we don’t know, and then eliminate things we can be sure are wrong. Not to be certain about what is true, but to arrive at an approximation we can work with.
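
As a rough illustration of what "shrinking uncertainty" means in practice, here is a toy sketch in Python (the measured quantity, noise level, and sample sizes are all invented): a value is reported not as a bare number but as an estimate plus an interval, and the interval narrows as evidence accumulates.

```python
import math
import random
import statistics

# Invented scenario: repeatedly measuring some quantity with noisy instruments.
# Scientific "uncertainty" is the width of the interval, not ignorance.
random.seed(42)
TRUE_VALUE = 20.0  # the quantity we are trying to pin down

def estimate(n_measurements):
    """Return the sample mean and an approximate 95% interval half-width."""
    data = [random.gauss(TRUE_VALUE, 2.0) for _ in range(n_measurements)]
    mean = statistics.fmean(data)
    half_width = 1.96 * statistics.stdev(data) / math.sqrt(len(data))
    return mean, half_width

for n in (10, 100, 1000):
    mean, hw = estimate(n)
    print(f"n={n:5d}: estimate = {mean:.2f} +/- {hw:.2f}")
```

More measurements never produce certainty; they produce a smaller reasonable range, which is the point.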

That means in order for us to have a discussion about so nuanced and evidence-heavy a topic as climate change, we need to go out of our way to understand uncertainty. The good news? We already do—we just need to think about it in different terms.

The Collective Endeavor

We know very well that science is not truth—indeed, central to science is the idea that we must acknowledge both the subjectivity of human perception and the limits of individual knowledge. So science does not aim to replace those perceptions; it aims instead to refine them. Science strips away the biases of the individual, the biases of the community, the biases of particularity, and the biases of variable conditions. In so doing it aims not for a pure and incontrovertible view of the world, but merely a verifiable view.

Science is thus not truth, merely the lens through which we view truth. Yes, its role is to bring truth as nearly into focus as it may, but greater than that is its role of ensuring that we are all using the same lens. By prescribing the method, science drives us all to view the world with the limits of our collective understanding rather than the limits of our personal understanding.

Quantum Ex Machina

We all tell ourselves stories about the world—stories to help us reduce the component parts into things we can understand. Sometimes those stories describe the world, and sometimes they describe what we wish the world could be. Usually, I think, they are a little of both.

The edges are always fuzzy, and the connections can be tenuous, and sometimes there are gaps in the stories we want to tell ourselves. Sometimes we just leave those gaps there, unanswered and honest. But sometimes we flail in the fuzzy gaps, and sometimes we try to fill them in.

It’s almost a meme, outside of scientific circles, to use quantum physics for this; after all, quantum physics is pretty cool, pretty attention-grabbing, and pretty unintuitive. Can’t quantum effects be that little bit of magic we secretly hope for?

Climate Denihilism

The overwhelming scientific consensus is that human-caused climate change is real, ongoing, and extremely dangerous. For those who missed the most recent data point, February 2016 showed the largest temperature anomaly in recorded history. That, on top of us having racked up most of the hottest overall years on record during the past decade. And yet, somehow, there are still intellectually dishonest people who stand up and argue that climate change isn’t happening, or maybe isn’t so bad, or maybe will just not be a problem because we’ll adapt (or something, and who needs those ecosystems anyway).

It’s enough to make you want to give up. What’s the point in trying to stop climate change when we keep electing people who are happy to disbelieve it? Isn’t it basically inevitable at this point? Realistically, we’ll be lucky if we can all agree that it’s real before we pass the point of no return, let alone do anything about it.

But I think that’s a pretty dangerous point of view.

A Few Bad Apples

Reliably, whenever issues of sexism, racism, and prejudice appear, so too does the phrase “a few bad apples.” University professors are harassing their students, but universities and media hasten to remind us that they are just a few bad apples. Police officers are abusing the people they are supposed to protect and serve, but mostly when those people are black—still, it’s a few bad apples.

“A few bad apples” is in-group language. It’s what you say when you identify with the group in question, and you just can’t believe anything bad about that group because it would also mean something bad about yourself. It is, in essence, group-level denial: that person did something I can’t be associated with, so that must mean they don’t really represent my group.

Infectious Ideas


The qualities that lead to an idea going viral and the qualities that make an idea credible, apparently, do not overlap. Personally, I think this is because many of us have bad ideological immune systems: we accept ideas based on whether they fit what we already agree with, not based on whether they are well-supported by the evidence. That’s what the research seems to show so far, anyway.

The latest wrinkle in the puzzle of how bad ideas spread as easily as good ones comes from a study recently published in the Proceedings of the National Academy of Sciences. You can read the abstract or the full study online, but the shorthand version is that Del Vicario et al. looked at conspiracy theories and science news to see how they spread on Facebook and to try to learn something about the dynamics of that spread. In line with a number of past studies, the results showed that both kinds of ideas tend to spread in “homogeneous clusters” of users (essentially echo-chambers), slowly diffusing through a cluster over a period of days or weeks.

What I find interesting about this study is that it also shows that assessment of information is lost along the way; science news and conspiracy theories both rapidly become background information as they diffuse through an echo-chamber. By a few weeks out, users in a group will consider the new information fact and resist attempts to change it.

“Causes Cancer”

One can and should simplify scientific research to make it intelligible, but there is a level of imprecision beyond which simplification becomes mere fiction. I think at this point, in most cases, saying something “causes cancer” is effectively fiction. It wasn’t always, but that phrase has been so abused that it now creates a one-to-one link in the popular imagination between the item of the week and our most potent medical boogeyman.

The recent announcement by the IARC (International Agency for Research on Cancer) and the associated statements by the WHO (World Health Organization) have created a hullabaloo over red meat, and processed red meat in particular. If you want a good summary of that issue, please read this one, and not any of the more sensationalized pieces exploding into your news feeds.

Because those sensationalized pieces dominate. Most of the media are busily reporting, nuance-free, how red meat and processed meat “causes cancer.” Most are using the most inflated statistic—an increase in risk of 17%. Most are not mentioning baseline risk. Most are not discussing potency. Most are not mentioning that this information is not new, but instead a result of slowly progressing scientific research.

In my view, reporting that something “causes cancer” gives you all the panic with none of the information. What is the baseline risk? In this case, it is 6%, or 6 people out of 100 will get bowel cancer in their lifetimes. What is the increased rate if you eat a lot of processed meat daily? 7%, or 7 people out of 100. So, if everyone ate processed meats only in moderation (about 50 grams is suggested, or two slices of bacon per day, or one bacon cheeseburger per week), we could avoid one additional case of bowel cancer for every 100 people who decreased their consumption.
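
The arithmetic above is worth seeing laid out. Here is a minimal sketch, using only the figures quoted in this paragraph, of the gap between the headline "17%" (a relative increase) and the roughly one-in-a-hundred change it actually amounts to (an absolute increase):

```python
# The figures quoted above for bowel cancer and heavy processed-meat consumption.
baseline_risk = 0.06          # ~6 in 100 develop bowel cancer over a lifetime
relative_increase = 0.17      # the widely reported "17%" is a *relative* increase

elevated_risk = baseline_risk * (1 + relative_increase)
absolute_increase = elevated_risk - baseline_risk

print(f"Elevated lifetime risk: {elevated_risk:.3f}")      # ~0.070, i.e. ~7 in 100
print(f"Absolute increase:      {absolute_increase:.3f}")  # ~0.010, i.e. ~1 in 100
```

Same data, two framings: "17% more likely" and "one additional case per hundred people" are both true, but only one of them writes a scary headline.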

That matters. That’s significant. But it also isn’t a one-to-one relationship. Processed meat does not “cause cancer” so much as it contributes to a slight increase in your risk of one type of cancer over your lifetime. And that isn’t even that much harder to say. Headline writers, please take note: your hyperbole is helping no one.

A More Perfect Truth

The scientific method, at its heart, is a set of steps to keep us from fooling ourselves, and one another, and thus to arrive at our best approximation of truth. Each step in the traditional scientific method is a way we reduce bias, eliminate confusion, and further our collective knowledge. But recent high profile research has highlighted some of the ways it can break down along the way, especially in preliminary research.

The idea that preliminary research is mostly inconclusive or incorrect isn’t surprising—preliminary studies are the way a scientific community investigates new ideas. Contrary to public perception, publication in a scientific journal is not so much verification of truth as it is the beginning of a debate. Collective knowledge, in theory, builds from that point onward.

So, when I read recently that more than two-thirds of a group of psychological studies could not be replicated, I wasn’t too surprised. Whatever the media might make of a single small study, and however much they might tout it as a breakthrough (and they do, for everything), the chances are that the results are flawed somehow. Scientists, of course, are still human, and they still get pulled toward positive data. There are a number of habits, like abusing the P value (try it yourself to see how it works) or choosing what measures to focus on after the fact, that can lead to a researcher misrepresenting results, even unintentionally. And, of course, there are a few bad actors who inflate the results of their studies on purpose.
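
To see how easy it is to manufacture a "significant" result, here is a toy simulation (not modeled on any real study) in which every effect is pure noise. Checking twenty outcomes and keeping whichever one crosses p < 0.05 still produces a "finding" most of the time, since 1 - 0.95**20 is about 0.64.

```python
import math
import random
import statistics

# Toy sketch: both groups are drawn from the SAME distribution, so any
# "significant" difference found below is spurious by construction.
random.seed(0)

def null_p_value(n=30):
    """Two-sample z-test p-value when there is genuinely no difference."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.fmean(a) - statistics.fmean(b)
    se = math.sqrt(statistics.variance(a) / n + statistics.variance(b) / n)
    z = abs(diff) / se
    # two-sided p-value from the normal approximation
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

studies, outcomes_per_study = 500, 20
lucky = sum(
    any(null_p_value() < 0.05 for _ in range(outcomes_per_study))
    for _ in range(studies)
)
print(f"{lucky / studies:.0%} of null studies found 'significance'")
```

Each individual test behaves honestly; it is the freedom to choose what to report after the fact that does the damage.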

There is a secondary problem in science as well, which is that journals tend to publish positive studies, and researchers tend not to even submit negative studies, leading to publication bias. If you’re a drug company, you might abuse publication bias on purpose to make your products look more effective than they actually are. To make things worse, journals have their own bias towards new research and often don’t want to publish negative studies or failed replications of previous research. Combined with the set of problems I mentioned above, which lead to iffy research, publication bias effectively hobbles scientific debate by letting lots of ideas in, but silencing the voices that would weed out the bad ones.
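
Publication bias can be sketched the same way. In this toy simulation (all numbers invented), the true effect is exactly zero, yet if only the studies that cross a significance threshold get published, the published literature reports a solidly positive effect:

```python
import random
import statistics

# Toy sketch of publication bias: 1000 small studies of an effect that is
# truly zero. Only estimates crossing a one-sided p < 0.05 cutoff "get published".
random.seed(1)

TRUE_EFFECT = 0.0
all_results, published = [], []
for _ in range(1000):
    # each study estimates the effect from 25 noisy observations
    estimate = statistics.fmean(random.gauss(TRUE_EFFECT, 1.0) for _ in range(25))
    all_results.append(estimate)
    if estimate > 0.33:  # roughly 1.645 standard errors for n=25, sd=1
        published.append(estimate)

print(f"Mean of all studies:       {statistics.fmean(all_results):+.3f}")
print(f"Mean of published studies: {statistics.fmean(published):+.3f}")
```

The full set of studies averages out to roughly zero, as it should; the published subset does not, because the filter only ever passes noise pointing one way.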

You might have noticed that the first set of problems arises from individual biases, while the second set arises from systemic biases. In the first case, researchers are accidentally or intentionally allowing bias into their studies and tainting their results. The scientific method is still subject to human error, or intentional gaming of the system. In the second case, the scientific method has few tools for eliminating systemic biases, so a slightly more developed solution is needed. The only current tool is peer review, but that has its own host of limitations and problems.

I think, however, there is a solution that would reduce problems at both levels simultaneously, and it’s one we already know works: pre-registering research.

Pre-registration of clinical trials is a tool recently employed to deal with publication bias in medicine, and especially to prevent bad actors (such as drug companies with a financial stake in the matter) from gaming the system by hiding negative research. It also eliminates researcher biases because they have to register their methodology before conducting the study, and thus cannot modify that methodology during or after the fact to generate a positive result. The effect has been a dramatic decline in false-positive results.

Some people have rightly pointed out the problems with pre-registering all research, and how difficult it would be to figure out who to register with and how to keep track. This is where the second part of the solution comes in: journals already admit that there is value in publishing negative results, so register prospective research methodologies with scientific journals, which in turn must commit to publishing the end result. Even if that commitment came with some caveats, this would simultaneously prevent researchers from modifying their methodology, thus reducing biased results, and force journals to accept research based on the methodological merits when they are still blind to the outcomes, thus reducing biased publication.

Of course this wouldn’t solve every potential problem in science, but, as I said, science is not a perfect enterprise—it is a collective endeavor to arrive at the best approximation of truth. We can always do better, and we can always learn more, and it’s time to take the next step in that direction. We know we need to eliminate our individual biases, and now we know that we need to address collective biases as well. We also know that we can—it only remains to do so.

Image credit: Flickr user shawncalhoun

Ecological Jenga

One quality of human societies is that we shape our environment to fit our needs. Sometimes we do this intentionally, such as when we clear land for agriculture or human occupation. Sometimes we do it as a byproduct of our actions, such as greenhouse gas emissions leading to runaway warming of the global climate. In the past, human societies have endured, or collapsed, or adapted, depending on how much their environment could withstand change.

I think this raises the question: how resilient is our global ecosystem? How much can it handle? What are the limits?

Those with a narrower view tend to dismiss environmental concerns as frivolous, uneconomical, or overblown. The earth has always carried on, they sometimes argue, and even if it doesn’t, we can invent technologies to replace and improve on anything an ecosystem can do. Yet I wonder.

Ecosystems around the world have evolved to be both diverse and redundant, with animals and plants and insects and microbes all functioning together to support the system. Most of that diversity and redundancy is structural—the evolution of an ecosystem, like the evolution of any given species, does not tend to generate and maintain traits with no purpose. I don’t mean that an ecosystem is designed with a place for everything and everything in its place, but rather that diversity and redundancy in a natural system are present because regular stress on the system requires them. They are buffers that protect the system from failure.

From the human perspective, redundancy is usually perceived as an abundance of parts—a river full of salmon, a forest full of old growth trees, a sky full of passenger pigeons. This leaves us with the comforting sense that however many we take away, there are more than enough; the system will not falter.

That can be true on a small scale, but global human society does not act on the small scale. We have an economic engine dedicated to mobilizing resources, and it is very good at it. If a resource is found, there is an effectively unending line of people ready to use it and transform it into human economic capital. But that engine is very bad at asking questions of stability; if a resource is abundant, we use it rapidly and heavily without concern for the broader system. That old individual view, that taking a few doesn’t matter, seems to have evolved into the idea that natural systems can be processed and repurposed by humans without consequences.

Unfortunately, the data says otherwise. The declining biodiversity of forests and the strangled flow of major rivers are examples of what happens to natural systems when their natural buffers are carted off for human purposes. Current complex systems science shows us that the natural systems we rely on are being driven to the edge of catastrophic failure. Ecologists and complex system scientists call this “overshoot,” a state in which the key ecological foundations of a system are exploited much faster than they can regenerate.
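
A toy model makes the idea of overshoot concrete. Here a renewable stock regrows logistically and is harvested at a constant rate; all parameters are invented for illustration. Below the system's maximum regeneration rate the stock settles at a stable level, while above it the stock is driven to collapse:

```python
# Toy overshoot model: logistic regrowth minus a constant harvest.
# With r=0.5 and capacity=100, regeneration peaks at r*capacity/4 = 12.5/year;
# any sustained harvest above that drives the stock to zero.
def simulate(harvest, r=0.5, capacity=100.0, stock=80.0, years=200):
    for _ in range(years):
        regrowth = r * stock * (1 - stock / capacity)
        stock = max(stock + regrowth - harvest, 0.0)
    return stock

print(simulate(harvest=10.0))   # sustainable: settles at a positive equilibrium
print(simulate(harvest=15.0))   # overshoot: the stock collapses to zero
```

Nothing dramatic distinguishes the two harvest rates year to year; the difference only shows up in where the system ends.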

Put more simply, we are playing ecological Jenga. Globally, systematically, we are stealing away the foundations of critical natural systems to build a human superstructure on top. Yet questions about that same foundation receive more derision than consideration; with a curious bootstrapping logic, we convince ourselves that the titanic edifice of human society is unsinkable.

That ideological position is so much stranger in the face of the evidence. We have known for some time that we are drawing down natural capital much faster than the rate of replenishment. In the U.S., California is a poster child for depletion of water. In Canada, Alberta is scraping off its largest intact natural forest to dig up tar sands. In the tropics, slash-and-burn agriculture is depleting nutrient-rich topsoil that took thousands of years to form.

As we busily remove the redundancy of natural systems that sustain us, the growing specter of climate change looms large. We are carefully pulling bricks from the base of our tower, scarcely noticing the wind of change ruffling our shirtsleeves. Systems evolved redundancy to cope with stressors, and the biggest stressor for an ecosystem is a changing climate.

Some say human ingenuity will avert any catastrophe. I think they’re right that it could, if we would just look honestly at the implications of our choices. If we could bring ourselves to take them seriously. If we could bring ourselves to alter those choices.

But the tower is getting taller, and the wind is getting stronger.

The science shows us that we can’t continue the game into perpetuity. Natural systems will reach points of change; many already are. Many already have. Some have collapsed.

So let’s hear it for human ingenuity, and let’s fix it. But I have a sneaking suspicion that ingenuity isn’t our problem now. We’re plenty ingenious. What we’re not, is honest.

The Spirit of Inquiry

via Flickr user Massimo Variolo

In conversation a few weeks ago I guessed that there were some thirty Republican presidential candidates at this point. It turns out I was wrong—as I write this, the actual (and only slightly less absurd) number is seventeen. Being wrong about that didn’t bother me all that much; thirty felt like a true number, but I have now revised my knowledge because I encountered new information.

Revising based on new information is something (I hope) I do quite often. When I want to know something, I try to reason out the answer first, but then go look up the truth. Both parts of that are important—if I look something up without chewing on it first, I tend to forget it easily. If I guess but don’t bother to check my guess, then the distinction between estimate and reality is easily lost to memory.

Guessing and revision is a somewhat Bayesian way of encountering the world, but I think it reflects a spirit of inquiry and exploration. In one sense, it is a personal application of the scientific method. In the broadest sense I can envision, it is a fundamental part of human nature to experiment and discover. We all build predictive stories for ourselves about the world to explain what has happened before and help us expect what will happen next.
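
That guess-and-revise loop can be written down directly. Here is a minimal Bayesian sketch (the coin-flip numbers are invented for illustration): start with a weak prior belief, then fold in new observations and read off the revised estimate.

```python
# A minimal Bayesian update: beliefs as Beta-distribution pseudo-counts.
# The prior says "probably near 50/50, but I'm not very sure".
prior_heads, prior_tails = 2.0, 2.0

def update(heads, tails, observed_heads, observed_tails):
    """Beta-binomial update: add observed counts to the prior pseudo-counts."""
    return heads + observed_heads, tails + observed_tails

# initial estimate of the probability of heads
print(prior_heads / (prior_heads + prior_tails))   # 0.5

# new information arrives: 8 heads, 2 tails
h, t = update(prior_heads, prior_tails, 8, 2)
print(h / (h + t))                                 # revised estimate, ~0.71
```

The guess is never discarded; it is weighed against the evidence, and the more evidence arrives, the less the original guess matters.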

Sometimes, though, the link between guessing and checking gets lost. Maybe I guess something and forget to check it later, or maybe I hear someone else’s guess and don’t realize that they didn’t check it first. The provisional story starts to lose its hesitancy and become Real, and True, and Important, and other similarly calcifying adjectives. The story developed to model the world starts to become a world in itself. Ideas drift into ideologies.

When I listen to people pushing an ideology, I sometimes hear the ghost of inquiry in the background. They say with certainty the things I want to ask as questions.

“Nuclear power is not a viable option for mitigating climate change.” But I want to ask, “Is nuclear power a viable option for mitigating climate change?”

“GMOs are harmful and can’t help with worldwide hunger and nutrition.” And I think, “Are GMOs harmful, and can they help with worldwide hunger and nutrition?”

“Cutting Social Security, Medicare, and other entitlements is the only way to balance the federal budget.” And I reply, “Is cutting Social Security, Medicare, and other entitlements the only way to balance the federal budget?”

“Environmental concerns have to be economically profitable to be effective.” Do they really, I wonder?

What are these ideas? Guesses we received from others, but didn’t really check? If you ask someone who fervently believes one of these positions to support it, they will, and vigorously. Motivated reasoning is easy, and unfortunately common. But did they ever think to doubt it? Did they look beyond the favored “evidence” swirling around them from people who agree with the idea, and instead seek out some more dispassionate analysis of the facts?

And if I disagree, did I?

I don’t know. I think much less often than I would like. In the words of the old Russian proverb, appropriated by a certain person who largely ignored it in his domestic policies, “trust, but verify.”

So I keep guessing, and I keep checking. My greatest worry is for those ideas that seem immediately true. Such ideas slip easily past our defenses and set up shop in our stories without scrutiny, bending and distorting our subsequent knowledge of the world. There is no way to investigate all of these—we hear them everywhere, and verifying takes effort. We even create them unknowingly.

The only course left to us, I think, is to doubt our own stories along with the stories of others. To breathe that spirit of inquiry back into our ideas, especially when they have died into ideologies. We may always be chasing the truth, but I think that better, on the whole, than embracing fictions.