Science Communication: A Risky Business

Nuclear power plant in Biblis, Germany. Photograph: Flickr/Bigod

Following recent events at the Fukushima power plant in Japan, the UK government elected to postpone the publication of Sir David King’s report outlining the benefits of nuclear power. Across Europe, reviews of nuclear safety have been launched in several countries, with Angela Merkel declaring a three-month moratorium on plans to extend the lifetime of nuclear plants in Germany. Now, if you’re concerned about the use of nuclear power, you may be tempted to celebrate this spreading wave of atomic cold feet. Yet, regardless of one’s personal stance on nuclear energy, perhaps it is these knee-jerk reactions from governments across Europe which should actually be our main cause for concern.

These reactionary tendencies demonstrate an inherent short-termism on the part of government and lay bare the populist motives underlying decisions which really ought to be made on a rational, disinterested basis. At the root of this problem is the issue of communicating risk. In light of the events at Fukushima, the perceived risk of nuclear power has vastly increased in the minds of the general public. Of course, in reality, no causal link exists between events in Japan and the likelihood of accidents occurring at European nuclear power plants. Events at nuclear power plants geographically isolated from one another are statistically independent: an incident at one does not alter the likelihood of an incident occurring at another.

Nevertheless, the decisions made by European governments on this issue are somewhat understandable. Communicating risk is a notoriously tricky business, and governments are not generally held to have the best track record in this department. A prime example here in the UK is the government’s reaction to the 2009 swine flu pandemic. Ultimately, the pandemic proved milder than feared, and questions have since been raised about the government’s handling of the issue. MP Paul Flynn compiled a report last year for the Council of Europe, criticising the amount of money ‘wasted’ on unused vaccines. He said: “They [the WHO] frightened the whole world with the possibility that a major plague was on the way. The result of that was that the world spent billions and billions of pounds on vaccines and anti-virals that will never be used. It is a huge waste of money.”

However, during the BBC’s 2010 Reith Lectures, former president of the Royal Society, Martin Rees, rightly defended the government’s record on swine flu. He argued: “If we apply to pandemics the same prudent analysis whereby we calculate an insurance premium – multiplying probability by consequences – we’d surely conclude that measures to alleviate this kind of extreme event actually need scaling up.”

Now this raises an interesting point. If we consider the occurrence of an event to be a simple binary possibility, then we can calculate risk as follows:

Risk = (probability of the event occurring) × (consequences of the event)

Thus, events with severe consequences, even if they are unlikely to occur, can still be described as high risk. As such, government spending on vaccines during the 2009 swine flu pandemic is completely defensible. In fact, one could even argue that government efforts did not go far enough. To take an extreme example: the probability of a large asteroid hitting the Earth is low, but the consequences would be so catastrophic that the risk remains tremendously high. Consequently, we need to invest more money in mitigating the risk of these ‘high-consequence, low-probability’ events.
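
To make this arithmetic concrete, here is a minimal sketch of the calculation in Python. The probabilities and consequence values below are invented purely for illustration; they are not estimates for any of the events discussed.

def risk(probability, consequence):
    # Risk as probability multiplied by consequence (an expected-loss calculation).
    return probability * consequence

# Hypothetical figures, chosen only to make the comparison vivid:
common_but_mild = risk(probability=0.5, consequence=10)              # 5.0
rare_but_catastrophic = risk(probability=0.001, consequence=100000)  # 100.0

# The rare, catastrophic event carries the greater risk, despite its low probability.
print(common_but_mild, rare_but_catastrophic)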

But perhaps science communicators can turn the question of risk into an advantage when it comes to tackling the great issue of our time: climate change. Well-funded climate change deniers would have the public believe that the underlying science is unreliable and that climate change is unlikely to occur. Of course, anthropogenic climate change is a fact, and it is important that scientists and science communicators alike emphasise this at every opportunity. Yet, even if we were hypothetically to accept the deniers’ claim that climate change is unlikely to occur, could we not still make the case for investing in renewable energy sources on the grounds that climate change is a high-consequence event? Even on the deniers’ erroneous premise that the probability of climate change is minimal, its enormous potential consequences would still make it a risk worth taking action against.

So, by embracing risk, rather than shying away from it, science communicators can inform people and help them to make considered decisions about the chances we choose to take. Albert Einstein is famously quoted as saying that “God does not play dice”. But we certainly do. As such, it would seem to be the job of science communicators to ensure that we at least enter the casino with our eyes wide open.

Andrew Purcell is studying for an MSc in Science Communication at Imperial College London. He is also the Editor of Imperial’s I,Science magazine.

Comments

  1. Andrew,

    Great post – thank you!

    I share your concern at the loss of clarity and objectivity regarding what is a vital and greatly beneficial technology. But a few aspects of the discourse in the wake of Fukushima worry me:

    Firstly, I disagree that “the root of this problem is the issue of communicating risk”. I think it’s the issue of establishing *trust*. For instance, what use is a perfectly presented statement of the risks in the mouth of a politician or an industry nobody believes? The best scientific work will be for nothing. And this trust hasn’t disappeared by accident. Right now, TEPCO and members of the Japanese government stand accused of a lack of clarity over the unfolding events in Fukushima. This works against any effort to gain an objective view of the risks – but it does not arise out of public ignorance or press mischief. There are lessons for everybody in this.

    Secondly, I worry that a focus on “communicating risk” to the public runs the risk of assuming a “risk deficit model” of science communication in nuclear power – an assumption that citizens merely require the relevant risk percentages and then support for nuclear power will naturally follow. I suspect that the truth may prove to be more complex.

    That said, I agree that a transparent view of risk is an absolutely vital element in the debate. Perhaps the most positive legacy of Fukushima will be a truly open conversation about what this technology can and can’t do. It’s very long overdue.

  2. Dear Steve,

    Thanks for your comments. I would, however, like to take you up on a couple of issues.

    Firstly, on the issue of ‘trust’, perhaps you would be able to define your usage of the word in this context? It seems to me that ‘trust’ is essentially built up over time through induction. However, what use is this with a new technology? An organisation may be trustworthy in one area, but not in another. Also, an individual’s expertise in a particular field may make them a good person to trust in that area, but this does not necessarily mean that one should trust their judgements across all fields. On the issue of nuclear power, I may, for example, distrust the safety of traditional, first- or second-generation reactors. Yet, to transfer this sense of distrust to third- or fourth-generation reactors would, in my opinion, be misguided.

    However, I do not want to talk too much about nuclear power here. In fact, this wasn’t the main focus of the article at all (only its opening paragraph – although I know the choice of image for the post is, perhaps, slightly misleading). I simply tried to use the events at Fukushima as a way to frame my argument and make it relevant to events which are currently at the forefront of people’s minds.

    Also, you seem to have tacitly assumed that I am pro-nuclear power. Yet, personally, I am neither for nor against nuclear power. While I do agree with George Monbiot that the claims about nuclear power being unsafe are largely over-hyped and tend to have little scientific grounding, I am extremely cautious about nuclear power for other reasons.

    Just briefly: I am worried about nuclear power because of the finite amount of uranium available. Yes, I know there hasn’t yet been a thorough geological exploration to ascertain how much uranium we actually have on Earth (primarily because the market drivers aren’t in place, since current demand for uranium isn’t actually that high), but I’m worried that transferring to a nuclear-dependent energy economy before we know exactly how much uranium we do have would be rather imprudent, to say the least. Of course, I know that there is the possibility of thorium reactors, but this then presents the problem of producing significant quantities of plutonium as a by-product. Also, I fear that by switching to a nuclear economy, we are failing to realise what fundamental changes to our lives, in terms of resource consumption, we will inevitably have to make if we are to adequately support our global population. In short, I am extremely sceptical about the potential of nuclear power, although this isn’t primarily based on its safety record.

    Also, being for or against nuclear power is, I believe, a gross oversimplification. For example: we probably shouldn’t be pursuing nuclear power in the long term here in the UK, but in places like China and India, where coal is cheap to access, one could make the pragmatic argument that not allowing them to pursue nuclear on a large scale is consigning the world to a massive increase in carbon dioxide emissions. Sadly, basic market economics means that countries like these will inevitably increase their reliance on coal if they don’t ‘go nuclear’ in a big way.

    Now, returning to the issue of ‘trust’, I am concerned that using this term implies a binary distinction, between trust on the one hand and distrust on the other. In reality, when we say we trust someone, we actually mean that we think the probability of them letting us down is low. Equally, when we say we distrust someone, we mean that we think the probability of them letting us down is high. So, while ‘trust’ might be a much more trendy, ‘humanities’ way of saying things, might it not, in reality, actually be quite similar to a judgement based on probability?

    I am also concerned that this binary distinction of ‘trust’ versus ‘distrust’ is an oversimplification of reality, which is easily open to abuse and prejudice, particularly in terms of the people whom we choose to trust. As such, I feel that the more nuanced gradation of probabilities, based on evidence, is far superior to the concept of ‘trust’, which (sadly), for many people, tends to be based on things like race, stereotype, gender, relationship, etc.

    In fact, I suspect that our positions on this matter aren’t all that far apart. I am simply making the more general, idealistic argument that we ought to make rational decisions based on evidence, whereas you are making the pragmatic argument that, due to temporal and financial constraints, this is not realistically possible and, as such, we have to base our decisions on trust. In this respect, I think you’re right: we do have to base a lot of our decisions on trust. Yet, just because we do this does not mean it is necessarily desirable. One could, perhaps, point to a failure of science communication as the reason why we so often have to rely on trust.

    Finally, on the danger of assuming a ‘risk deficit model’, I agree with you that citizens do not “merely require the relevant risk percentages and then support for nuclear power will naturally follow”. (As I have briefly outlined above, there are good reasons for opposing the expansion of nuclear power other than safety concerns.) However, I think it’s important to realise that we, as a species, have evolved in such a way that we worry disproportionately about some things rather than others. For example, lots of people worry about flying, yet, statistically, every time you step onto a plane – no matter how many times you fly – you are nineteen times less likely to die than you are in your car. Of course, you could argue that the reason people worry about flying rather than driving is because, unlike the pilot, they probably know the person driving the car. The point I am trying to make is simply this: that may be true, but it doesn’t necessarily make it right.
    Thanks.

    Best wishes,
    Andrew.

    PS – Apologies about spelling, grammar, etc. I’m afraid that, because of other commitments, I had to write this comment in only around 25 minutes. However, I thought it better to post today (and thus keep the discussion going), rather than waiting until tomorrow or even Thursday to post a thoroughly spell-checked comment.

    PPS – I had italicised a number of words or points to make them clearer when I was writing this comment in MS Word, but these seem to have disappeared now. Also, when I pasted my comment into the box at the bottom of the blog, I seemed to lose all of the hyperlinks I had added. Perhaps in future there could be a way for people to add hyperlinks or use bold/italic text in their comments? Thanks.

  3. Hi Andrew,

    Thanks for the reply. First – no assumptions made at all! I loved the post. I suspect – like you – we may share very similar views. It’s just rather good to have the chance to unpick some of these complex ideas in discussion.

    Trust – yes, it’s a slippery word. In this context I mean the perceptual filter used by citizens to assess the truth claims of information conveyed by those in positions of control or management. As you rightly point out, this is a subtle (non-binary) and extra-rational form of probability assessment (‘common sense’, gut instinct, etc). You’re right too, I think, to point out its ‘humanities’ provenance. It’s a hugely social force – emotional even. And imperfect. As any rich bookmaker knows, we don’t do probabilities very well!

    But for me this wider social dimension is the significant thing. It can’t be contained within a narrow scientific or technological frame. I feel that many media discussions around Fukushima have adopted a narrow science/tech rhetoric that neither addresses the root of the social issues in nuclear power, nor fully acknowledges the managerial and engineering failures apparent in the crisis.

    Citizen trust has a negative as well as a positive inductive force. It can degrade over the long term in response to negative conditioning (multiple nuclear crises, government secrecy, security and disposal issues, etc.). Once gone, it can be hard to re-establish. Yet I don’t believe that people are naturally disposed to mistrust what they hear. Most of us practise what Giddens called ‘civil inattention’. We elect and promote experts to handle long-term complex issues, and only intervene when events force us to be ‘attentive’ and apply our perceptual filters. We generally support science and technology – and, judging by recent surveys (OECD), are pretty supportive of nuclear power. So I feel that any loss of confidence by the citizenry – whilst lacking a detailed assessment of the risks – is nevertheless a rational inductive response to clear failings in the implementation of nuclear power by those who asserted its safety and efficiency.

    I wholly agree with you that objective risk assessment is the way forward. It is the best means of empowering people to make informed and unprejudiced choices with the maximum amount of information at their disposal. In this, science can do so much good. But unless this information is accompanied by a sea-change in transparency, an acknowledgement of the wider social dimensions of nuclear power, and an acceptance of past mistakes by the nuclear industry, I fear that perceptual filters will remain firmly in place.

    Regards,

    Steve

  4. Hi Steve,

    “As any rich bookmaker knows, we don’t do probabilities very well!” – made me laugh.

    I agree with your specific points about the nuclear industry: need for more transparency, acceptance of past mistakes, etc.

    But, on the issue of trust, I do feel we should at least strive to deal with problems rationally, rather than relying on “‘common sense’, gut instinct, etc.”

    It reminds me of the famous Carl Sagan quote:
    “I try not to think with my gut. If I’m serious about understanding the world, thinking with anything besides my brain, as tempting as that might be, is likely to get me into trouble.”

    Best wishes,
    Andrew.
