The qualities that lead an idea to go viral and the qualities that make an idea credible apparently do not overlap. Personally, I think this is because many of us have bad ideological immune systems: we accept ideas based on whether they fit what we already agree with, not based on whether they are well-supported by the evidence. That’s what the research seems to show so far, anyway.
The latest wrinkle in the puzzle of how bad ideas spread as easily as good ones comes from a study recently published in the Proceedings of the National Academy of Sciences. You can read the abstract or the full study online, but the shorthand version is that Vicario et al. looked at conspiracy theories and science news to see how they spread on Facebook and to try to learn something about the dynamics of that spread. Like a number of studies in the past, the results showed that both kinds of ideas tend to spread in “homogeneous clusters” of users (essentially echo-chambers), slowly diffusing through a cluster over a period of days or weeks.
What I find interesting about this study is that it also shows that assessment of information is lost along the way; science news and conspiracy theories both rapidly become background information as they diffuse through an echo-chamber. By a few weeks out, users in a group will consider the new information fact and resist attempts to change it.
Going beyond the study into some of the implications, nowhere is there any indication that users are checking the veracity of information before they share and assimilate it. Rather than evaluating information, it seems that we often only evaluate the source. If that source is a member of an echo-chamber we already mostly agree with, the idea gains credence regardless of whether it is true or even believable. As Vicario et al. observed, “users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.”
The authors of the study use the example of the Jade Helm conspiracy theory from earlier in 2015, which was basically the bat-shit nutty idea that the U.S. Military was about to invade Texas under cover of a training exercise, for reasons unknown. The plausibility of such a story is close to zero, and yet it rapidly spread through a minority group of conspiracy-minded right-wing extremists. Why? Maybe because it fit nicely alongside the equally unfounded idea of government as a Machiavellian adversary bent on depriving people of guns and liberty; maybe for some other reason. It certainly wasn’t prior plausibility, weight of evidence, or trustworthy sources.
So what does all this mean about ideas overall? Again, projecting a bit beyond the current evidence, I think it means the metaphor of an idea going viral is more literally accurate than it seems.
As with a disease, ideas tend to spread first and foremost to the people in immediate contact with us. Those people infect more people, and the idea spreads outward exponentially, rapidly reaching a point where everyone in immediate contact has been exposed. As with a virus, repeated exposure increases our chances of contracting the idea. And, as with a virus, reservoirs of receptive individuals can help a bad idea persist in society long after its moment has passed. Maybe these reservoirs can even re-infect the broader population when polarization and bias are pumped up by media and political narratives (like, say, during a Presidential election).
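For readers who like to see the analogy run, the epidemic dynamic above can be sketched as a toy simulation. This is purely illustrative: the function name, population size, contact rate, and transmission probability are all hypothetical numbers I made up, not figures from the study.

```python
import random

def simulate_contagion(population=1000, contacts_per_step=5,
                       p_transmit=0.3, steps=20, seed=0):
    """Toy epidemic-style sketch: an 'idea' starts with one person and
    spreads through random contacts. Returns the cumulative number of
    'infected' individuals after each step."""
    rng = random.Random(seed)
    infected = {0}  # patient zero
    history = []
    for _ in range(steps):
        newly = set()
        for _person in infected:
            for _ in range(contacts_per_step):
                contact = rng.randrange(population)
                # A susceptible contact catches the idea with some probability.
                if contact not in infected and rng.random() < p_transmit:
                    newly.add(contact)
        infected |= newly
        history.append(len(infected))
    return history

counts = simulate_contagion()
```

Run it and the early steps grow roughly exponentially, then the curve saturates once most of the population has already been exposed, which is exactly the spread-then-plateau pattern the viral metaphor implies.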
That leaves me with the question of effective vaccination. Is there anything to be done ahead of time to prevent the spread of bad ideas? The only thing I have found so far is to think carefully about what I see; the more I agree with it on an emotional level, the more closely I should check the actual data.
Unfortunately, I don’t see that becoming a habit in American society anytime soon.
Photo Credit: GDC