One consequence of holding strong views is the desire to change the views of others, especially when faced with equally strong opposing views. I try to pick these moments carefully, but it is all too easy for me to be drawn into a Facebook debate or an impassioned argument on the issues I care about. I find these moments instructive, and they have inspired in me a deep curiosity about how we form, hold, and change our beliefs.
After all, not all views are equal—some are supported by data and evidence, some are refuted by data and evidence, some suffer from contradictory evidence, and some remain untested. And yet, there seems to be little relationship between the objective support for an idea and the strength of beliefs about it. As I mentioned in the first part of this discussion, we often hold ideas that are part and parcel of our worldview or self-worth immune from criticism. Changing our own minds on these issues is difficult enough, so how can we expect to change the minds of others?
Last December, a study was published in the journal Science detailing support for the contact hypothesis—the idea that people may change entrenched ideas if faced with people who are directly affected by those ideas. In this case, the study seemed to show that LGBTQ canvassers in California had lasting impacts on people’s opinions about gay marriage. Unfortunately, the lead author seems to have fabricated all the follow-up data, rendering the results useless, and the second author asked Science to retract the study. Had the results been genuine, they would have been the first real data showing dramatic change in controversial views.
The current body of scientific literature on changing minds is, sadly, pessimistic. People generally are not open to alternative views, and the more entrenched their own position, the less willing they are to consider a change. Repeated studies have shown that the more an idea challenges their fundamental views, the more likely people are to reject it. People are even willing to use information they know is wrong as support for their preconceived ideas. And as if that weren’t enough, correcting an entrenched idea with undeniable evidence often leads to the Backfire Effect, wherein people strengthen their incorrect beliefs in response to the challenge.
This doesn’t bode well for change. Exigent controversies such as climate change threaten the fundamental ecosystems of our planet, but data and evidence don’t seem to change people’s opinions. The pursuit of social change in the face of intractable viewpoints seems, to some extent, futile.
But I don’t believe that can be true. It’s not that people don’t change their minds—I have done so myself on a number of controversial issues; rather, it is that people do not change their minds for the reasons we think they do. Whatever value I place on data and evidence, those are not the things that are most convincing for most people, and those are not the things that stick in our memories.
If we want to change minds, we need to meet entrenched ideas where they live, in the murky realm of worldviews, self-worth, and fundamental values. And, despite the overall pessimism of the literature, there are some strategies that do have an effect. They are not silver bullets, but they are the best we have for now.
Some Evidence-Based Strategies for Changing Minds:
-Provide an alternate narrative. If you undercut a belief that has implications for sense of self or worldview, the person may be more likely to accept the correction if you help them construct a new narrative that includes the corrected information.
-Make people laugh. People are more open to change and ideas that contradict their own if the mood is light and friendly instead of confrontational.
-Make people feel good about themselves. By shoring up their sense of self-worth, you provide additional capacity for them to change their views. This is especially useful when people have invested their self-worth into the belief in question.
-Include the information in a story, fictional or otherwise. Information that is built into stories is stickier than information that is free-floating, regardless of truth. Misinformation can be spread this way as well, so be careful.
-Use evidence to reinforce beliefs, not to challenge them. Evidentiary challenges work with beliefs that are low-stakes, but only serve to make someone mistrust the source when the belief in question is tied up in their worldview or self-worth.
The items above are all based on the scientific literature as I understand it. I must also add one more critical strategy for any advocate: doubt yourself. For my part, at least, there are few things as off-putting as someone who ignores your side of the discussion and just keeps repeating their own as if you hadn’t spoken. Being willing to reconsider your own views is a key part of the give and take of a frank discussion. If you want to change someone’s mind, then, make sure you are listening to their contributions; otherwise, they will certainly have no incentive to listen to you.