How to change other people’s minds.

Scientists are involved in finding out how nature works by looking at and generating new evidence. The idea is to build up a set of well-founded, testable beliefs about the natural world. This system seems to work pretty well – our everyday existence, quality of life and indeed lifespan have benefited hugely from scientific advances and the technologies which flow from them: antibiotics, brain scans, the internet, television, transport, contraception. It’s easy to think of less positive examples, like nuclear weapons, but none of this stuff would work unless the underlying scientific beliefs had some truth in them.


Scientists need to be able to argue effectively with non-scientists – so what’s the best approach? Picture from NewsBiscuit.

Science is incredibly useful, but as long as the ideas remain inside the heads of individual scientists, they can’t change anything. So every scientist has to be concerned about how best to communicate their ideas, and very often this will involve getting other people to change their minds. Scientists need to be able to argue effectively not just with one another, but with non-scientists too – so the question is, how do we persuade people to look at the evidence?

Many social psychologists believe that there are two distinct ways in which the views of others can influence our own beliefs, and that these reflect possibly conflicting reasons for agreement and disagreement. On the one hand, we want to be right. On the other, we want to get along with other people and to ensure that our personal beliefs don’t conflict with one another. In order to be right, we need to systematically evaluate the evidence presented by others and where necessary modify our view of reality – a process which is closely aligned to the scientific method. But social psychological studies show that the second set of motives is very important. We can use reasoning shortcuts (heuristics) to arrive at attitudes that align with other members of the group we identify with, or that are consistent with our existing beliefs. The result is that when we want to find out the truth we will typically choose different evidence to rely on than when we are defending our position or trying to get along with other people.

In an interesting study carried out by Sharon Lundgren (Texas A & M University) and Radmila Prislin (San Diego State University) in 1998, 63 students were divided into four groups, and prepared for a discussion with a partner on a contentious and personally relevant topic (tuition fees). The discussion itself never took place, but the situation was staged to determine how people’s opinions and arguments might be influenced by the social setting. One group were told their discussions would be monitored and analysed for logic and reasoning abilities, another group were told that the purpose of the experiment was to measure their social skills and rapport, a third group were told that their discussion would be used to inform the University authorities about student opinions concerning a proposed 30% fee hike, and a control group expected to hear their partner give a speech on the topic. The participants were given an indication of their partner’s opinion (always strongly in favour of the fee increase) and while preparing for the discussion, they had an opportunity to read arguments about the proposed tuition fee changes or a humour column. They were then told the discussion was cancelled, but were asked to fill out a questionnaire on their own views about the topic. The results showed that people’s engagement with the evidence varied depending on the situation.

Participants in the first group, who thought their logic and reasoning were to be tested, read more of the arguments both for and against the proposal – they weighed up the evidence, and came to balanced conclusions which they perceived as being “the best” positions and most widely shared by others.

Participants in the second group were motivated to get along with a partner with whom they likely disagreed. They spent less time reading and focused on arguments in favour of the tuition fee increase which they thought their partner favoured.

Participants in the third group, who had been put on the defensive by the prospect of a huge tuition fee rise, also read widely, but they focused on arguments that would bolster their position. The authors concluded that “people avoid hostile information in defense of their attitudes when, as in our experiment, they have control over their exposure to information”.

The implication for scientists seeking to persuade the public to change their minds in the light of evidence is that attacking someone’s existing view is only likely to entrench their existing opinions, and will drive them to take a selective view of the evidence. As scientists we need to beware of shifting our own views to accommodate less well-founded beliefs, but we cannot afford to make enemies of those we wish to persuade.

In order to change someone’s mind, they have to be motivated and capable of looking at the evidence in a systematic and balanced way. If they are unmotivated or defensive they are unlikely to draw valid conclusions from the evidence, and will instead be influenced by social and emotional factors. And we may have to accept that sometimes people lack the motivation or ability to make sense of the evidence available to them – the goal then would be to provide motivation and to give them the tools they need to weigh it up. In either case, adopting a confrontational or negative approach is unlikely to be helpful.

When you read about these social influences on attitudes and beliefs, it is all too easy to think that these are flawed patterns that affect only others. This type of heuristic thinking is (like the biblical mote in the eye) easier to see in other people than in oneself. But I don’t think scientists in general are immune to heuristic thinking, or social influences – far from it. For example, a biologist unfamiliar with all the evidence on climate change might be inclined to believe in the idea of man-made global warming because that seems to be the prevailing view amongst better informed scientists. Hopefully this example shows that the tendency to look for consensus is not necessarily a bad thing. It is not always practical to examine all the relevant evidence for an idea yourself, and then the opinions of the group and of particular authority figures** will exert a greater influence. But these mental shortcuts come with risks, which we have to acknowledge*. The group, the world-leading authority, is not always right. In many ways I see the scientific method as a way of defending ourselves against these dangers – as Feynman put it, “Science is the belief in the ignorance of experts”. Knowing about our very human weaknesses will only make us stronger, and by making testable predictions we can check and double-check the validity of our ideas with experiments. And that is what makes science special.

Backstory

The discussion above was prompted by a blog post from Stephen Curry asking rhetorically whether being angry and dismissive was the best way of persuading people to give up flawed pseudoscientific beliefs. Jon Simons (in the comments) wondered whether there was any evidence that the angry approach was less effective than calm reasoning.

Persuasion and attitude change have been well studied by social psychologists. Because of its central importance to politics and advertising, this is a big field and I would estimate that hundreds or thousands of papers are published each year. It is a tough topic to study scientifically because the basic phenomena under investigation (attitudes, motives, and so on) are hard to pin down and measure. Because of this, social psychological theories are hotly debated and hard to resolve. I work in a very different area – looking at the way information is represented in the brain and how this affects our memories – so I am not all that familiar with the ins and outs of the psychology of persuasion. In this post, I highlighted one study that seemed most relevant to the way scientists and non-scientists debate. In preparing this, I read one authoritative, well-cited review article and a couple of key studies.

Wood (2000) Attitude Change: Persuasion and Social Influence. Annu Rev Psychol 51:539–570

Chen, Shechter & Chaiken (1996) Getting at the Truth or Getting Along: Accuracy- Versus Impression-Motivated Heuristic and Systematic Processing. J Pers Soc Psych 71(2):262-275

Lundgren & Prislin (1998) Motivated Cognitive Processing and Attitude Change. Pers Soc Psychol Bull 24:715

The Wood review is probably the best paper to read if you are interested in learning more.

I am not able to give a complete overview of the evidence, because that would involve several months or years of study, and *I am aware of the irony that this lack of the complete picture puts me in a similar position to the biologist talking about climate change in the post above. However, I also discussed the topic with my colleague and friend Julian Oldmeadow, who is a proper social psychologist**, much more familiar with the field. He sent me this summary of the main points from the vast literature (I’m quoting verbatim from his email, with one tiny edit):

“1. If, through your tone or content or whatever, you position the persuadee as an ‘other’, an outgroup member, you’re finished already. They will be motivated NOT to agree with you as a matter of pride, identity, etc. Facts will not persuade, they will only constrain their ability to hold out against you. The problem, really, is that being seen as an ‘other’ provides a (somewhat rational) reason to see their arguments a priori as invalid. This all comes from loads of work in the self-categorization theory of social influence (J. C. Turner, 1991).

2. There are two ways to persuade people – through information and through heuristic cues (being a good speaker etc.). But people will only be persuaded by information if they have both the motivation and ability to process that information. Assuming the people relevant to this blog thread have the ability to process information, it comes down to motivation. If they’re not motivated to listen to and think about your arguments you’ve got no chance. This comes from Petty and Cacioppo’s copious work on the Elaboration Likelihood Model. According to this theory, an assertive angry style will be most effective for those with no motivation or ability to think about the arguments. For those who are motivated and able, it won’t make a difference or will be counterproductive (see above).

3. The literature on minority influence shows that, at least for minority positions (pseudoscience?), the most (only) effective strategy is a calm, unanimous and unwavering commitment to their position. It requires patience and often only works latently, after a period of explicit rejection and denial. Theoretically, this applies to minorities, but I would guess the same strategy would work best for majorities too.”


9 thoughts on “How to change other people’s minds.”

  1. Nice thoughtful points here Tom, as always. But what I think you’ve omitted is the consideration that *we’re* not always right. People are most likely to change their minds if they are having a rancour-free discussion about the topic where both sides are listening. For example, some of the opposition to GM came not from a knee-jerk yuck factor, but from some very valid socio-economic concerns about terminator seeds, loss of biodiversity and the stranglehold of international agribusiness. People didn’t need a patient explanation of how DNA in your food isn’t bad for you. They needed scientists and policy-makers to listen to the issues they were raising. Brian Wynne’s sheep farmers needed the scientists to stop thinking that they knew best. They needed to take on board the expertise of the sheep farmers (in the matter of, y’know, sheep). http://blogs.ucl.ac.uk/sts-observatory/2009/05/12/when-will-the-sheep-safely-… Sometimes science does actually need to listen. And not just talk more persuasively.

  2. No, I think I tried to touch on that point in the last paragraph, and I definitely spelled it out in the previous post Resisting the Dark Side, which I wrote when I first read SC’s article. Science is rarely wrong when the method is followed properly, but all too prone to the kinds of groupthink and cognitive bias I began to highlight in this post. So it should be vigorously challenged, and yes, science needs to listen. In fact I wrote in to the GM consultation raising related issues, and felt very frustrated by the way the evidence was put across. I also have some concerns over the presentation of the case for climate change, which I’ve aired in previous posts. I always feel uncomfortable when the scientific consensus seems to bludgeon us into submission. It’s not scientific. Show me the evidence.

  3. My own experience with persuading universities that it is a bad idea to give BSc (Hons) degrees in quack medicine is that a polite rational approach is useless. All that works is public humiliation by revealing the nonsense which is taught to students. In this case, though, the problem is a bit different from the one you seem to have in mind. There is no approach in the world that will convert the quacks themselves. Their views are quasi-religious and unshakable. But to stop universities from prostituting themselves in this way, the people you have to persuade are not the quacks but the vice-chancellors. They, presumably, do not usually themselves believe that “amethysts emit high Yin energy” ( http://www.dcscience.net/?p=227 ) but are nonetheless prepared to endorse it if it makes money. In such cases, reason is useless. What you have to do is let the public know that they are bringing their university into disrepute, for the sake of the cash (maybe there is some analogy with recent events at the LSE and Libyan money).

  4. Another method of approaching such arguments is for both parties to accept that their positions are flawed to some degree, and to agree up front that the truth lies somewhere in between. Then the argument becomes a joint exercise in reaching the truth, with any movement towards the other’s position an achievement for improving the value of one’s own position. Cool huh? (And, hi Soph!)

  5. (Hi Liam!) I know you did, Tom, and I like your dark side post. But I still felt the thrust of this article was ‘How to persuade other people (that we are right)’ and I just wanted to draw attention to the fact that that’s not the only mode of discourse science can have with the rest of the world….

  6. @DC thanks for the comment. The situation you have in mind is a very different kind of problem to the one I had in mind, and I think different tactics might be appropriate where the person you are trying to persuade doesn’t actually believe in their stated position, but is adopting it for material gain (snake oil salesmen come to mind). I think a bit of righteous anger is quite justified in that situation, but it could be counterproductive if one has misread the situation, and they do in fact believe what they’re saying. @liam thanks too. I don’t think I would necessarily want to agree that the truth lies somewhere between two positions. I don’t myself like to adopt a given view with certainty, but rather to entertain different explanations in parallel, with an idea of how probable I think each one is (often almost completely certain or almost completely impossible). I like to try to think of all possible explanations including something like “none of the above”. Then when I get some new evidence I might change the mental probabilities a little. Sometimes opposing positions are incompatible, other times less so. But I think that thinking this way provides a common reference frame so I can take anyone’s views on board. Of course getting other people to do the same will always be tricky, and in any event this type of reasoning is probably just a post-hoc rationalisation of a cruder, more flawed process that actually relies too much on emotion, heuristics, gut feelings and so on. As I said in an earlier post, if it’s a choice between being right and persuading someone else, I’d rather be right.
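[Ed.: the belief-updating process described in this comment – holding several explanations in parallel, each with a probability, and nudging those probabilities as evidence arrives – is essentially Bayes’ rule. A minimal sketch, in which the hypotheses, prior probabilities and likelihoods are invented purely for illustration:]

```python
def update_beliefs(priors, likelihoods):
    """Bayes' rule: posterior ∝ prior × likelihood, renormalised to sum to 1."""
    unnormalised = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalised.values())
    return {h: p / total for h, p in unnormalised.items()}

# Illustrative hypotheses held in parallel, including "none of the above".
beliefs = {"explanation A": 0.6, "explanation B": 0.3, "none of the above": 0.1}

# Illustrative evidence: twice as likely under B as under A.
evidence = {"explanation A": 0.2, "explanation B": 0.4, "none of the above": 0.2}

beliefs = update_beliefs(beliefs, evidence)
```

After the update, belief in B rises and belief in A falls, but neither collapses to certainty – which matches the incremental, probabilities-in-parallel attitude the comment describes.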

  7. @Sophia I understand, and I thank you for an important point, well made. Focused prose is not my forte and I have to resist the urge to deal with all aspects of scientific discourse in every post. Listening probably deserves a whole one all to itself!
