Today saw the launch of the Hay Festival online, and I was able to join one of the first sessions of the Festival this afternoon. This was an interview by Nick Stern (Professor of Economics and Government at the London School of Economics) of Naomi Oreskes (Professor of the History of Science at Harvard). Oreskes is the author of “Why Trust Science?”, her most recent book, published in 2019. She initially trained as a geologist before extending her research interests to earth science and climate change. Her 2010 book “Merchants of Doubt” is an exploration of the orchestrated campaigns of denial and obfuscation that have attempted to deflect from the scientific consensus that had emerged around topics including the health implications of smoking, the environmental impact of acid rain, the human contribution to the depletion of atmospheric ozone, and the wider changes to the global climate.
Interestingly, Oreskes was pretty dismissive of arguments to the effect that it is ‘the scientific method’ that makes science trustworthy. Her argument is that there is no single ‘scientific method’. Rather, there are elements common to most fields of scientific enquiry that tend to make them inherently more trustworthy. These include the critical interrogation of scientific claims through rigorous testing and review during the discovery phase, robust peer review and evaluation (including at scientific meetings) immediately prior to publication, and an acceptance that even after publication, findings and the conclusions drawn from them will be subject to continuing scrutiny and challenge. It is this process of discourse, challenge and testing that is key to the trustworthiness of scientific claims. It’s important, too, that we do not fall into the trap of trusting the scientist rather than the science. It’s important always to ask to see the evidence, and also to consider whether there is other evidence that conflicts with, or offers an alternative perspective on, the conclusions that have been drawn.
Oreskes offers some fascinating insights into the motivations of people who deny the scientific evidence almost despite the facts, rather than because of them. She suggests that there is a form of implicatory denial at play here. People deny the science not because they disagree with the findings, but because they object to the implications of those findings. Her research into campaigns designed to undermine the consensus on human-caused climate change suggests that the objection is not to the scientific consensus on whether it is happening, but rather to the implication that government intervention to address the problem is an unwarranted interference in the market, and therefore the thin end of a wedge that leads to socialism. In a similar vein, the objections of those who contest the science around evolution are motivated by a fear that evolutionary theory is the thin end of a wedge that leads to a scientific rejection of the existence of a Creator God.
Recognising implicatory denial is critical to effective engagement, because it calls for an approach that is more subtle than simply repeating the scientific facts. Behavioural science studies have shown that working with those who deny evolution to identify ways in which other religious believers have reconciled their faith with evolutionary science is much more effective than adopting a binary “I’m right and you’re wrong” position.
Oreskes refers to a cartoon in her new book that is a variation on Pascal’s Wager. The cartoon depicts a lecture during which a slide on the screen lists all the benefits of addressing man-made climate change: a cleaner environment for people to enjoy; improved air quality reducing respiratory disease; urban environments designed around people rather than vehicles; reductions in extreme weather events and the resulting famine and disease; and so on. At the back of the lecture, a man raises his hand and says: “But what if climate change is a hoax? We’ll have made the world a better place for nothing!”
Later in the interview, Oreskes refers to the importance of diversity in research groups and the wider research community as a precaution against the groupthink that is an inevitable risk where researchers are predominantly white, male and middle class. Diversity is not just about race, gender and class, though. People tend to trust scientists who are authentic in the way that they present their own motivations for conducting the research that they do. All of us carry a set of internal values and beliefs that inevitably constrain the extent to which we can behave truly objectively. Acknowledging those beliefs, motivations and values openly both humanises the scientist and acts as a further check and balance against inadvertent bias and ‘blind spots’ in the way that research is conducted and conclusions are reached.
The final section of the interview included some thoughts on the role of scientists in actively addressing attempts to misrepresent their findings. Social media, the internet and mobile technology now enable the spreading of disinformation at an unprecedented speed and scale. In this environment there is an imperative for scientists to be prepared to counteract disinformation where it arises, and to call out the protagonists in a calm but evidence-based manner. The evidence suggests that most people react well to these sorts of responses – nobody wants to be taken for a sucker, and calling out the spreaders of disinformation and lies clearly and forensically can be highly effective (even while it may also be exhausting and carries the risk of social media pile-ons and online abuse in the form of ad hominem attacks).
You can find the full interview here for the next 24 hours.
Next up for me is Stephen Fry talking about his new book “Troy”, but you’ll have to wait for tomorrow’s blog to find out more about that one!