A new report released today from the UK Department for Business, Innovation and Skills (BIS) Expert Group on Science and Trust emphasizes the need to address risk and uncertainty in developing and using science and technology within society. “Acknowledging risk and uncertainty” is the second of eight broad aspirations from the independent group, established to develop a UK action plan to “enhance society’s capabilities to make better-informed judgements about the sciences and their uses in order to ensure that the “license to operate” is socially robust.”
The report “Starting a National Conversation about Good Science” [PDF, 478 KB] is a rich, informative and insightful document that demands careful consideration. It comes out of a group assembled to consider new mechanisms to increase public trust in science and engineering; review the impact of the existing science-related ethical code of practice; examine how movement of knowledge and people across the different sectors can be facilitated in order to maximize the benefits and impacts of science and society activities; and think about better ways to evaluate the impacts of science and society initiatives. Despite this being a purely British affair, many of the recommendations are relevant far beyond the confines of a UK-centered “national conversation,” and will hopefully stimulate a global dialogue on what is a global challenge.
Amidst the eight “broad aspirations” of the group, which span from public judgment about science and awareness of the scientific process to underpinning science-informed decision-making and good science governance, I was particularly struck by an emphasis on risk and uncertainty. This may be because in a few weeks I will become increasingly involved in risk, uncertainty and science-informed decision-making, as I take over as Director of the Risk Science Center at the University of Michigan. But beyond this, I was struck by the group’s recognition that, from the publics’ various perspectives, uncertainties surrounding science and technology – their implications in particular – are often more important than the science and technology themselves.
The overarching aim of the Science and Trust Expert Group – and of this report – was
“To enhance society’s capabilities to make better-informed judgements about the sciences and their uses in order to ensure that the “licence to operate” is socially robust.”
In this context, the group recommended that
“Expert advice to Government should identify and characterize uncertainties; policy makers should communicate clearly actions that take account of inevitable uncertainties; efforts should be made to support public judgements about risks and uncertainties.”
In particular, the report emphasizes the need to address uncertainties surrounding the potential impacts and benefits of emerging technologies “in the wider context of science and society relations.”
This emphasis on uncertainty is particularly welcome, and closely aligns with where I hope to be taking the University of Michigan Risk Science Center over the next few years. New technologies – or innovative ways of using existing technologies for that matter – lead to inherently uncertain futures. There is a great danger of mistaking this uncertainty for risk (risk is a reasonably well-understood chance of something bad happening; uncertainty is a poor understanding of whether good or bad will come out of a course of action) – with the result that there is a tendency to shy away from potentially beneficial technologies, simply because we don’t know how they are going to unfold. On the other hand, uncertainty means that we do need to move forward carefully, in case there are very real and relevant risks lurking in the shadows. The trick is to develop better ways of handling uncertainty so that the best possible choices are made.
Being up-front about uncertainty and potential risks associated with science and technology is a critical step toward developing conversations and actions that underpin a science-informed approach to minimizing and otherwise handling uncertainty and risk. One particularly good resource that the report recommends is A Worrier’s Guide to Risk [PDF, 222 KB] – a one-pager intended to help everyone make more sense of the seemingly unending series of stories on risk.
In its specific recommendations and actions, the Science and Trust Expert Group includes:
- Support Government to take better account of risks and uncertainties in policy making;
- Support public judgements about risks and uncertainties inherent in the scientific advisory process;
- Support policy makers to take better account of public attitudes and values to the risks, benefits and uncertainties in the governance of emerging technologies;
- Enable wider discussions in the media and elsewhere on uncertainty inherent in the scientific process; and
- Enable greater discussion of risk.
Although these are aimed fair and square at the UK, they provide a valuable template for a global conversation about good science and its role within society. Hopefully, now that the UK has set the pace, we will see this develop into an international conversation about good science.
Had meetings all day so haven’t read beyond page 10 yet. (Did I mention that we have a name check on page 6 (8 of the pdf)? Oh yes I did, sorry!!)
I only have one problem with your post Andrew – I don’t see much evidence of people ‘shying away from potentially beneficial technologies simply because we don’t know how they are going to unfold’! Certainly governments, companies, research councils and scientists aren’t doing much shying away and seeing how things unfold – quite the opposite! Perhaps you meant the public, but again not much shying away and lots of embracing and getting stuck in to my mind. Especially given science and industry’s poor track record on unintended consequences of innovations in many areas.
But what’s interesting me, and one of the aspects I will be thinking about when I read it, is what they are going to do when the concerted efforts to involve the public indicate that the public doesn’t want something. GM is a case in point. The public said they didn’t want it because they didn’t think what it did was really necessary, questioned the motives of those who profited from it, and were unconvinced by assertions that it was safe for people and the environment. Given the information they had to go on at the time, this was actually a pretty reasonable perspective if you get rid of the hysteria on all sides.
So what will happen in this nirvana of public involvement if that happens again? And how many people disagreeing is enough to make a change? For example, 1 million people in the UK were said to have physically marched against the Iraq war and many more disagreed with the policy, but it still went ahead. How many people have to say no for it to count?
Very very tricky thing to do. Looking forward to reading it to see if they have the answer. I certainly haven’t!
Thanks Hilary,
As usual, I have become caught in the trap of expressing complex ideas in too limiting a forum! What I was trying to get at is how a poor understanding of uncertainty, and how to handle it, can undermine decision-making within a number of sectors – including government, industry and different publics.
Two of the issues here are confusing uncertainty with risk, and rampant speculation in the face of uncertainty – both of which interfere with evidence-informed decision-making.
Some industries will take a conservative approach to uncertainty, effectively treating uncertainty as a risk, and this can inhibit the uptake of potentially beneficial technologies – look at how slowly nanotechnology is being taken up in the food sector, for instance. Government regulators who are used to working with more certain risks are left high and dry in the face of uncertainty, and struggle with making oversight decisions on new technologies where the risks are far from certain. This can cut both ways, but there are certainly some industries that would complain this can lead to overly conservative approaches to oversight, which potentially stifle innovation. Then there is the question of how individuals and groups throughout society handle uncertainty. Here there are strong indications that a risk-averse populace will err on the side of caution in the face of uncertainty – issues surrounding GM foods in Europe were primarily about transparency and control, but were fueled by a fear of the unknown.
Of course, the factors here in determining how things unfold are complex and multi-faceted. But I would still argue that responsible and socially-responsive technology innovation faces a huge hurdle when it comes to handling uncertainty.
Also we can’t get away from the risk benefit calculation which we all do – we approach uncertainty in a more conservative way when it’s associated with food for example than we do with medicine or with technology where the uncertainties are more at arms length.
Though interestingly, it appears to me that negative ‘certainties’ – e.g. trans fats are very very bad for us – don’t galvanise us into action to boycott as much as uncertainties do?
I had a good discussion with someone yesterday where we explored whether, if GM were invented today, we would have any different an outcome – or in fact whether it would be so widely taken up in the US at all. We decided we might not have too different an outcome.
This post was actually about how ‘erring on the side of caution’ wasn’t such an unwise strategy, but I must have deleted that bit! I am seeing the author of Sci Trust today, so was using your blog to ponder my ideas!
But surely if the public is asked to listen to scientists in order to inform their opinion, and if scientists across the world are saying that uncertainties associated with nano may present real risks, it seems to me that ‘erring on the side of caution’ is a pretty sensible way forward and a sign of a science-literate society!
One of the big challenges, though, is differentiating between uncertainties and risks, and distinguishing between plausible and implausible potential risks (sorry, that was two challenges!). I’m not sure the science community have cracked this one yet, which isn’t going to help a broader informed discussion!