The Collective Endeavor

Philae on Comet 67P Churyumov-Gerasimenko

We know very well that science is not truth—indeed, central to science is the idea that we must acknowledge both the subjectivity of human perception and the limits of individual knowledge. So science does not aim to replace those perceptions; it aims instead to refine them. Science strips away the biases of the individual, the biases of the community, the biases of particularity, and the biases of variable conditions. In so doing it aims not for a pure and incontrovertible view of the world, but merely for a verifiable one.

Science is thus not truth, merely the lens through which we view truth. Yes, its role is to bring truth as nearly into focus as it may, but greater than that is its role of ensuring that we are all using the same lens. By prescribing the method, science drives us all to view the world with the limits of our collective understanding rather than the limits of our personal understanding.

No Reason to Lie

Pinocchio, via Jean-Etienne

Sometimes, in the course of a debate or discussion, a secondhand statement comes under consideration. The actors in the debate must then evaluate how relevant that statement is to their discussion. This happens in media during interviews, in class discussions, on the internet, with friends and family, and beyond. Wherever it happens, you are as likely as not to hear a particular phrase—“no reason to lie.”

“Look, he has no reason to lie.”

“Why would he lie?”

“She doesn’t get anything out of lying about this—she has no reason to.”

However it arises, the implication of the argument that someone “has no reason to lie” is that having no reason to lie is, itself, evidence for truth.

And our understanding of logic and evidence is so bad that we often accept that.

On the Pleasure of Finding Things Out

Puzzle, via LetIdeasCompete

Richard Feynman famously described science, and curiosity broadly, as “the pleasure of finding things out.” There are certainly few things I enjoy more than to turn over ideas and work them through to some new place, even more so in quick and intelligent company. I consider it a life philosophy to avoid stopping at the obvious conclusions, and instead to see what more may be learned with a few judicious questions. It isn’t science per se, but it shares with science a reliance on method. In learning, as in science, one must start with the assumption that one is wrong.

A More Perfect Truth

credit: Flickr user shawncalhoun

The scientific method, at its heart, is a set of steps to keep us from fooling ourselves, and one another, and thus to arrive at our best approximation of truth. Each step in the traditional scientific method is a way we reduce bias, eliminate confusion, and further our collective knowledge. But recent high-profile research has highlighted some of the ways it can break down, especially in preliminary research.

The idea that preliminary research is mostly inconclusive or incorrect isn’t surprising—preliminary studies are the way a scientific community investigates new ideas. Contrary to public perception, publication in a scientific journal is not so much verification of truth as it is the beginning of a debate. Collective knowledge, in theory, builds from that point onward.

So, when I read recently that more than two-thirds of a group of psychological studies could not be replicated, I wasn’t too surprised. Whatever the media might make of a single small study, and however much they might tout it as a breakthrough (and they do, for everything), the chances are that the results are flawed somehow. Scientists, of course, are still human, and they still get pulled toward positive data. There are a number of habits, like abusing the P value (try it yourself to see how it works) or choosing what measures to focus on after the fact, that can lead to a researcher misrepresenting results, even unintentionally. And, of course, there are a few bad actors who inflate the results of their studies on purpose.
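
To see how little it takes, consider a toy simulation (a sketch with made-up numbers, not a reconstruction of any particular study): compare two identical groups on twenty unrelated measures, and chance alone will usually push at least one comparison past the p < 0.05 line.

```python
# A toy simulation of multiple comparisons: two groups drawn from the same
# distribution (no real effect), tested on many unrelated measures. The
# group size and number of measures are arbitrary, illustrative choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_measures = 20   # unrelated outcomes measured in one hypothetical study
n_per_group = 30  # participants per group

false_positives = 0
for _ in range(n_measures):
    group_a = rng.normal(size=n_per_group)
    group_b = rng.normal(size=n_per_group)
    _, p_value = stats.ttest_ind(group_a, group_b)
    if p_value < 0.05:
        false_positives += 1

print(f"{false_positives} of {n_measures} null comparisons reached p < 0.05")
```

Report only the comparisons that crossed the line, and a study of nothing looks like a discovery.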

There is a secondary problem in science as well, which is that journals tend to publish positive studies, and researchers tend not to even submit negative studies, leading to publication bias. If you’re a drug company, you might abuse publication bias on purpose to make your products look more effective than they actually are. To make things worse, journals have their own bias towards new research and often don’t want to publish negative studies or failed replications of previous research. Combined with the set of problems I mentioned above, which lead to iffy research, publication bias effectively hobbles scientific debate by letting lots of ideas in, but silencing the voices that would weed out the bad ones.
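
The same toy-simulation approach (again with arbitrary numbers, purely for illustration) shows what publication bias does to the record: simulate many small studies of a treatment with no real effect, “publish” only the positive, significant ones, and the published literature reports an effect that does not exist.

```python
# A toy model of publication bias: 1000 small studies of a treatment with
# zero true effect, where only positive results with p < 0.05 get "published."
# Sample sizes and thresholds are arbitrary, illustrative choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

all_effects = []
published_effects = []
for _ in range(1000):
    treated = rng.normal(0.0, 1.0, size=25)   # true effect is zero
    control = rng.normal(0.0, 1.0, size=25)
    effect = treated.mean() - control.mean()
    _, p_value = stats.ttest_ind(treated, control)
    all_effects.append(effect)
    if effect > 0 and p_value < 0.05:         # only flattering results get submitted
        published_effects.append(effect)

print(f"Mean effect across all studies:      {np.mean(all_effects):+.3f}")
print(f"Mean effect in the 'published' ones: {np.mean(published_effects):+.3f}")
```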

You might have noticed that the first set of problems arises from individual biases, while the second set arises from systemic biases. In the first case, researchers are accidentally or intentionally allowing bias into their studies and tainting their results. The scientific method is still subject to human error, or intentional gaming of the system. In the second case, the scientific method has few tools for eliminating systemic biases, so a slightly more developed solution is needed. The only current tool is peer review, but that has its own host of limitations and problems.

I think, however, there is a solution that would reduce problems at both levels simultaneously, and it’s one we already know works: pre-registering research.

Pre-registration of clinical trials is a tool recently employed to deal with publication bias in medicine, and especially to prevent bad actors (such as drug companies with a financial stake in the matter) from gaming the system by hiding negative research. It also reduces researcher bias, because researchers have to register their methodology before conducting the study, and thus cannot modify that methodology during or after the fact to generate a positive result. The effect has been a dramatic decline in false-positive results.

Some people have rightly pointed out the problems with pre-registering all research, and how difficult it would be to figure out who to register with and how to keep track. This is where the second part of the solution comes in: journals already admit that there is value in publishing negative results, so register prospective research methodologies with scientific journals, which in turn must commit to publishing the end result. Even if that commitment came with some caveats, this would simultaneously prevent researchers from modifying their methodology, thus reducing biased results, and force journals to accept research based on the methodological merits when they are still blind to the outcomes, thus reducing biased publication.

Of course this wouldn’t solve every potential problem in science, but, as I said, science is not a perfect enterprise—it is a collective endeavor to arrive at the best approximation of truth. We can always do better, and we can always learn more, and it’s time to take the next step in that direction. We know we need to eliminate our individual biases, and now we know that we need to address collective biases as well. We also know that we can—it only remains to do so.

Image credit: Flickr user shawncalhoun

In the Public Interest

Carter and Ford in a debate, September 23, 1976

As someone who hasn’t yet watched the Republican or Democratic debates and hasn’t attended any campaign events, I rely primarily on reporting to keep abreast of the candidates and their positions. Or perhaps I should say “would rely on,” since the things reported in the media and the things I want to know have essentially no overlap.

When we, the public, granted private companies the right to broadcast throughout the United States, we also asked for one thing in return: that they spend some time each day serving the public interest. Thus was born “the news.” Yet the news, in its current incarnation, seems to have shuffled off the public interest in favor of popular demand.

The things in the public interest to know, in my opinion, would be what positions candidates have taken, what policies they advocate, and what those policies would actually mean for the public. Are these policies feasible? Are they soundly supported by evidence? What are the upsides and downsides? Yet I am hard-pressed to find any mention of policies, let alone reporting that substantively analyzes those policies and discusses the evidence for and against them.

Hillary Clinton has been the Democratic “front-runner” (and already we fall into the horse race) for more than a year, since long before announcing her candidacy. But what do we hear about her policy choices? The media mostly describe them in broad strokes, and when they pass judgment it is out of partisan bias, not evidentiary analysis.

And what about the “leading” candidate on the Republican side, Donald Trump? Reportedly the media are so interested in interviewing him that he can make the absence of policy questions a condition of his participation. Some of his policies do hold the media’s attention, but only those that are so patently absurd (like the proposal of a giant concrete border wall) as to provide gawker value.

Even within the bounds of the horse race, the media can’t seem to base their reporting on evidence; instead they suffer from the worst form of confirmation bias: choosing a narrative early on in their coverage and defaulting to that narrative repeatedly, regardless of actual events.

Consider, for example, the coverage of Bernie Sanders (or lack thereof). If Hillary draws a crowd of 20,000 people, this is proof of her “front-runner” status, yet if Bernie draws a crowd of 25,000, it barely registers. Not that I think either of those should define a candidate’s viability, because position in the race has nothing whatsoever to do with value as a leader. At this stage of things, none of the general public has weighed in; the positioning in the race is mainly determined by punditry and biased polling—by candidates, of their supporters, and by media, of their viewers.

Nor is the bias skewed left or right; Donald Trump is, one would think, the undeniable “front-runner” on the Republican side, and yet the media generally treat him as an enjoyable sideshow. In their minds, he is unelectable, which is just the word used by pundits to make their personal biases sound like unassailable facts.

So what am I, a member of the public, to make of this? The things that are in my interest to know are not reported. The things that are reported are irrelevancies plagued with bias. The question of who would lead and serve this country best, and what their positions would mean for us, goes unanswered.

As a member of the public, the message I receive is that the collective governance of our country, and the democratic ideals on which it was founded, and the choice of who will define our policies—these are nothing more than sport.

That is not in the public interest.

The Spirit of Inquiry

via Flickr user Massimo Variolo

In conversation a few weeks ago I guessed that there were some thirty Republican presidential candidates at this point. It turns out I was wrong—as I write this, the actual (and only slightly less absurd) number is seventeen. Being wrong about that didn’t bother me all that much; thirty felt like a true number, but I have now revised my knowledge because I encountered new information.

Revising based on new information is something (I hope) I do quite often. When I want to know something, I try to reason out the answer first, but then go look up the truth. Both parts of that are important—if I look something up without chewing on it first, I tend to forget it easily. If I guess but don’t bother to check my guess, then the distinction between estimate and reality is easily lost to memory.

Guessing and revision is a somewhat Bayesian way of encountering the world, but I think it reflects a spirit of inquiry and exploration. In one sense, it is a personal application of the scientific method. In the broadest sense I can envision, it is a fundamental part of human nature to experiment and discover. We all build predictive stories for ourselves about the world to explain what has happened before and help us expect what will happen next.
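
As a rough sketch of what a single guess-and-revise step looks like (the numbers here are invented purely for illustration), Bayes’ rule just rebalances confidence in a guess against how likely the new information would be if the guess were right versus wrong:

```python
# One guess-and-revise step, written as a Bayes update.
# All numbers below are made up for illustration.
def revise(prior: float, p_if_right: float, p_if_wrong: float) -> float:
    """Return the updated confidence in a guess after seeing new information.

    prior      -- confidence in the guess before the new information
    p_if_right -- chance of seeing that information if the guess is right
    p_if_wrong -- chance of seeing that information if the guess is wrong
    """
    evidence_for = p_if_right * prior
    evidence_against = p_if_wrong * (1.0 - prior)
    return evidence_for / (evidence_for + evidence_against)

# I start out 70% confident in a guess, then run into information that is
# three times more likely if the guess is wrong than if it is right.
confidence = revise(prior=0.70, p_if_right=0.20, p_if_wrong=0.60)
print(f"Revised confidence: {confidence:.2f}")  # about 0.44: time to rethink
```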

Sometimes, though, the link between guessing and checking gets lost. Maybe I guess something and forget to check it later, or maybe I hear someone else’s guess and don’t realize that they didn’t check it first. The provisional story starts to lose its hesitancy and become Real, and True, and Important, and other similarly calcifying adjectives. The story developed to model the world starts to become a world in itself. Ideas drift into ideologies.

When I listen to people pushing an ideology, I sometimes hear the ghost of inquiry in the background. They say with certainty the things I want to ask as questions.

“Nuclear power is not a viable option for mitigating climate change.” But I want to ask, “Is nuclear power a viable option for mitigating climate change?”

“GMOs are harmful and can’t help with worldwide hunger and nutrition.” And I think, “Are GMOs harmful, and can they help with worldwide hunger and nutrition?”

“Cutting Social Security, Medicare, and other entitlements is the only way to balance the federal budget.” And I reply, “Is cutting Social Security, Medicare, and other entitlements the only way to balance the federal budget?”

“Environmental concerns have to be economically profitable to be effective.” Do they really, I wonder?

What are these ideas? Guesses we received from others, but didn’t really check? If you ask someone who fervently believes one of these positions to support it, they will, and vigorously. Motivated reasoning is easy, and unfortunately common. But did they ever think to doubt it? Did they look beyond the favored “evidence” swirling around them from people who agree with the idea, and instead seek out some more dispassionate analysis of the facts?

And if I disagree, did I?

I don’t know. I think much less often than I would like. In the words of the old Russian proverb, appropriated by a certain person who largely ignored it in his domestic policies, “trust, but verify.”

So I keep guessing, and I keep checking. My greatest worry is for those ideas that seem immediately true. Such ideas slip easily past our defenses and set up shop in our stories without scrutiny, bending and distorting our subsequent knowledge of the world. There is no way to investigate all of these—we hear them everywhere, and verifying takes effort. We even create them unknowingly.

The only course left to us, I think, is to doubt our own stories along with the stories of others. To breathe that spirit of inquiry back into our ideas, especially when they have died into ideologies. We may always be chasing the truth, but I think that better, on the whole, than embracing fictions.

Activism and Evidence

To advocate for anything requires a certain amount of determination, tenacity, and passion. One must be willing to fight for an idea against some other current of belief. Sometimes the beliefs one must fight are deeply entrenched, so activists must expect to hear dissent and, to some extent, expect to reject that dissent. To do so is a necessary strength that maintains a steady course through the winds of change.

Yet there are different sorts of ideas we fight for, with different relationships to evidence.

Take the idea that non-heterosexual or non-exclusive romantic partnerships are inherently immoral. In examples like LGBTQ rights, the conflict is between two social beliefs: one side arguing that their religious proscriptions should apply to all of society, and the other arguing that everyone should have the freedom to live as they are without discrimination. In such cases advocates have support from the underlying American ideals, and there is no conflicting evidence. Opponents have tried to manufacture such evidence without success, so the conflict remains a social one, and one that LGBTQ advocates are rapidly winning.

In a second category of idea, the evidence for one position is clear, but there are social and economic reasons for pretending otherwise. Climate change falls into this category, and activists can fight to mitigate global warming with a clear conscience. After all, the scientific consensus supports that position. But because the opposition includes powerful businessmen and an entire wing of one major political party, climate advocates need to be able to quickly evaluate and dismiss opposing arguments. This isn’t too difficult, because for anyone with scientific literacy and an inquiring mind, the evidence mounted by opponents is clearly cherry-picked, muddled, or fraudulent.

Toxic Vaccines, via JenniferP

Yet there is also a third, thornier category of idea: that wherein an activist position runs counter to the majority of scientific evidence. For example, there is a vocal minority that fights against vaccines, ignoring the fact that vaccines have been repeatedly proven safe and effective. That minority invents claims at the drop of a hat, seizes on the slightest mention of something the public can recognize as “bad” (like mercury or formaldehyde), and relies on anecdotes and lone retracted papers to counter the overwhelming conclusion supported by literally all the other scientific data.

I find this last category of activist endeavor endlessly fascinating, and I also deeply want to know what it is that leads these activists to reject the majority of evidence and embrace a position so deeply contrary to the ideal of social change.

I have begun to suspect that what I am seeing is not activism perverted so much as activism taken to an illogical extreme. Advocates for anything need a certain amount of ideological armor to navigate the slings and arrows of outrageous claims, and yet in this last case the fetters of logic have been cast away and the activists themselves have become purveyors of the outrageous. They have become impervious, not just to motivated dissenters, but to whole bodies of objective dissenting evidence.

So too, activists must be able to recognize and publicize harm that occurs as a result of the opposing view. In the cases of LGBTQ rights and climate change, there are real personal harms that occur from the opposing position. Gay couples are suffering discrimination, and poor coastal countries are suffering unprecedented flooding. Effective activists find these things, drag them into the light, and make society take notice.

In the case of anti-vaccine advocates, though, they rely on made-up harms: the sort of harm one illogically infers rather than the sort with a demonstrable cause. They make not just unsupported but disproven claims, such as suggesting that vaccines cause autism (they absolutely don’t) or that young immune systems can’t “handle” vaccines (vaccines are less of an immune challenge than almost anything else a child encounters).

Finally, activists need to be able to find and mobilize people who agree with them, and to discredit people who fight against them. When done with reliable evidence or with generally accepted parts of the social contract, as in the cases of climate change and LGBTQ rights, this is a good and necessary part of creating social change. When done with anecdotes, innuendo, and lies, though, it becomes little more than an ideological cancer. A community of activists can be a center of social innovation, where challenge drives us all to be better, or a hyperbolic chamber of amplified nonsense, where no challenge is ever allowed.

I think, in the end, all advocates and activists walk the knife edge of societal belief, trying to drag that belief farther to one side or the other. This is an absolutely necessary role in society, which might otherwise stay mired in the inertia of bad ideas and the motivated reasoning of the powerful. When activists do this well, they are both a check and a balance on the stagnation of social beliefs. They are nimble, creative, and skeptical of the opposition, but they are also open to new evidence and they embody the ideal of social change.

When activists do this badly, though, they are as brittle and unyielding as the bad ideas and motivated reasoning they so often fight against. The fervor of activism is a part of the solution and a part of the problem both.

I think that strong scientific evidence is the tether that holds us on that edge as we look over and survey the places we might fall or climb. It lets us reach the edge and innovate, but we must always be careful to keep close hold of the tether. And, should we find ourselves advocating a position that goes against the majority of scientific evidence, we ought to ask ourselves some very hard questions. If the anti-vaccine advocates are any indication, activists who rush to an extreme relying on bad evidence may fall a long way from the truth, and many never find their way home.