Scientists are in the business of finding out how nature works by examining and generating new evidence. The idea is to build up a set of well-founded, testable beliefs about the natural world. This system seems to work pretty well – our everyday existence, quality of life and indeed lifespan have benefited hugely from scientific advances and the technologies that flow from them: antibiotics, brain scans, the internet, television, transport, contraception. It's easy to think of less positive examples, like nuclear weapons, but none of this would work unless the underlying scientific beliefs had some truth in them.
Scientists need to be able to argue effectively with non-scientists – so what’s the best approach? Picture from NewsBiscuit.
Science is incredibly useful, but as long as the ideas remain inside the heads of individual scientists, they can't change anything. So every scientist has to be concerned with how best to communicate their ideas, and very often this will involve getting other people to change their minds. Scientists need to be able to argue effectively not just with one another, but with non-scientists too – so the question is, how do we persuade people to look at the evidence?

Many social psychologists believe that there are two distinct ways in which the views of others can influence our own beliefs, and that these reflect potentially conflicting motives for agreement and disagreement. On the one hand, we want to be right. On the other, we want to get along with other people and to ensure that our personal beliefs don't conflict with one another. In order to be right, we need to systematically evaluate the evidence presented by others and, where necessary, modify our view of reality – a process closely aligned with the scientific method. But social psychological studies show that the second set of motives is very important. We can use reasoning shortcuts (heuristics) to arrive at attitudes that align with other members of the group we identify with, or that are consistent with our existing beliefs. The result is that when we want to find out the truth we will typically choose different evidence to rely on than when we are defending our position or trying to get along with other people.

In an interesting study carried out in 1998 by Sharon Lundgren (Texas A&M University) and Radmila Prislin (San Diego State University), 63 students were divided into four groups and prepared for a discussion with a partner on a contentious and personally relevant topic (tuition fees). The discussion itself never took place, but the situation was staged to determine how people's opinions and arguments might be influenced by the social setting.
One group were told their discussions would be monitored and analysed for logic and reasoning ability; another group were told that the purpose of the experiment was to measure their social skills and rapport; a third group were told that their discussion would be used to inform the University authorities about student opinions on a proposed 30% fee hike; and a control group simply expected to hear their partner give a speech on the topic. The participants were given an indication of their partner's opinion (always strongly in favour of the fee increase) and, while preparing for the discussion, had an opportunity to read either arguments about the proposed tuition fee changes or a humour column. They were then told the discussion was cancelled, but were asked to fill out a questionnaire on their own views about the topic.

The results showed that people's engagement with the evidence varied with the situation. Participants in the first group, who thought their logic and reasoning were to be tested, read more of the arguments both for and against the proposal – they weighed up the evidence and came to balanced conclusions, which they perceived as being "the best" positions and the most widely shared by others. Participants in the second group were motivated to get along with a partner with whom they probably disagreed. They spent less time reading and focused on the arguments in favour of the tuition fee increase which they thought their partner favoured. Participants in the third group, who had been put on the defensive by the prospect of a huge tuition fee rise, also read widely, but they focused on arguments that would bolster their position. The authors concluded that "people avoid hostile information in defense of their attitudes when, as in our experiment, they have control over their exposure to information".
The implication for scientists seeking to persuade the public to change their minds in the light of evidence is that attacking someone's existing view is only likely to entrench their opinions, and will drive them to take a selective view of the evidence. Scientists need to beware of shifting our own views to accommodate less well-founded beliefs, but we cannot afford to make enemies of those we wish to persuade. In order to change someone's mind, they have to be motivated and able to look at the evidence in a systematic and balanced way. If they are unmotivated or defensive they are unlikely to draw valid conclusions from the evidence, and will instead be influenced by social and emotional factors. And we may have to accept that sometimes people lack the motivation or ability to make sense of the evidence available to them – the goal then would be to provide motivation and to give them the tools they need to weigh it up. In either case, adopting a confrontational or negative approach is unlikely to be helpful.

When you read about these social influences on attitudes and beliefs, it is all too easy to think that these are flawed patterns that only affect others. This type of heuristic thinking is (like the biblical mote in the eye) easier to see in other people than in oneself. But I don't think scientists in general are immune to heuristic thinking, or to social influences – far from it. For example, a biologist unfamiliar with all the evidence on climate change might be inclined to believe in the idea of man-made global warming because that seems to be the prevailing view amongst better informed scientists. Hopefully this example shows that the tendency to look for consensus is not necessarily a bad thing. It is not always practical to examine all the relevant evidence for an idea yourself, and then the opinions of the group and of particular authority figures will exert a greater influence.
But these mental shortcuts come with risks, which we have to acknowledge. The group, or the world-leading authority, is not always right. In many ways I see the scientific method as a way of defending ourselves against these dangers – as Feynman put it, "Science is the belief in the ignorance of experts". Knowing about our very human weaknesses will only make us stronger, and by making testable predictions we can check and double-check the validity of our ideas with experiments. And that is what makes science special.

Backstory
The discussion above was prompted by a blog post from Stephen Curry asking rhetorically whether being angry and dismissive was the best way of persuading people to give up flawed pseudoscientific beliefs. Jon Simons (in the comments) wondered whether there was any evidence that the angry approach was less effective than calm reasoning.
Persuasion and attitude change have been well studied by social psychologists. Because of its central importance to politics and advertising, this is a big field and I would estimate that hundreds or thousands of papers are published each year. It is a tough topic to study scientifically because the basic phenomena under investigation – attitudes, motives, and so on – are hard to pin down and measure. Because of this, social psychological theories are hotly debated and hard to resolve. I work in a very different area – looking at the way information is represented in the brain and how this affects our memories – so I am not all that familiar with the ins and outs of the psychology of persuasion. In this post I highlighted one study that seemed most relevant to the way scientists and non-scientists debate. In preparing it, I read one authoritative, well-cited review article and a couple of key studies.
Wood (2000) Attitude Change: Persuasion and Social Influence. Annu Rev Psychol 51:539–570
Chen, Shechter & Chaiken (1996) Getting at the Truth or Getting Along: Accuracy- Versus Impression-Motivated Heuristic and Systematic Processing. J Pers Soc Psychol 71(2):262–275
Lundgren & Prislin (1998) Motivated Cognitive Processing and Attitude Change. Pers Soc Psychol Bull 24:715
The Wood review is probably the best paper to read if you are interested in learning more.
I am not able to give a complete overview of the evidence, because that would involve several months or years of study (and I am aware of the irony that this lack of the complete picture puts me in a similar position to the biologist talking about climate change in the post above). However, I also discussed the topic with my colleague and friend Julian Oldmeadow, a proper social psychologist who is much more familiar with the field. He sent me this summary of the main points from the vast literature (I'm quoting verbatim from his email, with one tiny edit):

"1. If, through your tone or content or whatever, you position the persuadee as an 'other', an outgroup member, you're finished already. They will be motivated NOT to agree with you as a matter of pride, identity, etc. Facts will not persuade, they will only constrain their ability to hold out against you. The problem, really, is that being seen as an 'other' provides a (somewhat rational) reason to see your arguments a priori as invalid. This all comes from loads of work in the self-categorization theory of social influence (J. C. Turner, 1991).

2. There are two ways to persuade people – through information and through heuristic cues (being a good speaker etc.). But people will only be persuaded by information if they have both the motivation and ability to process that information. Assuming the people relevant to this blog thread have the ability to process information, it comes down to motivation. If they're not motivated to listen to and think about your arguments you've got no chance. This comes from Petty and Cacioppo's copious work on the Elaboration Likelihood Model. According to this theory, an assertive, angry style will be most effective for those with no motivation or ability to think about the arguments. For those who are motivated and able, it won't make a difference or will be counterproductive (see above).

3. The literature on minority influence shows that, at least for minority positions (pseudoscience?), the most (only) effective strategy is a calm, unanimous and unwavering commitment to the position. It requires patience and often only works latently, after a period of explicit rejection and denial. Theoretically this applies to minorities, but I would guess the same strategy would work best for majorities too."