I consider myself to be pretty self-aware. It’s an illusion of course, but one I am usually blissfully ignorant of. Until some insightful reporter shatters it!
This was me a few days ago. I was talking with a journalist about science communication and the perils and pitfalls faced by young scientists. As I got into my groove talking about scientists and communication, she interrupted me and asked, “do you think there are many scientists who hold such unusual views?” (or words to that effect).
That caught me off-guard. Okay, I may not always agree with my peers when it comes to communicating about science, but unusual? Me?
It didn’t stop there. As we continued to talk, she commented that, “it’s odd to find a scientist with such libertarian views” (again, I paraphrase).
I’m not a libertarian … am I?
I must confess that, of all the words I would use to describe myself, libertarian is not one of them. So much so that I spent the rest of the interview rather strenuously trying to correct what I was sure was a misunderstanding.
Of course, all that this interviewer had done was reflect my own thoughts back to me. It was a sobering exercise in self-awareness – or lack of it. It also got me thinking more critically about what my philosophy on science communication is as a scientist who spends some time communicating with others.
To start at the beginning …
I developed an interest in science communication relatively late in my career. As a young person, I was an awful communicator – I’m still sure my first essay at university nearly got me kicked out of the course, it was that bad. I didn’t really care though – I was there to do science, not to write about it.
It was only when I hit my mid-thirties that I began to get seriously interested in where all that science goes.
As global investment in nanotechnology accelerated through the early 2000s, I was closely engaged in helping researchers, policy makers and others understand and address the health and societal impacts of this and other emerging technology trends. As a result, I found myself exposed to new disciplines, new ideas, new organizations and new people.
That exposure had a lasting impact.
An eclectic education
Over time, I began to appreciate the importance of science and evidence in decision-making. I learnt more about what motivates people and communities, and how this affects how they respond to information. I hung out with social scientists and philosophers – not a good idea if you don’t want your worldview challenged! I started to work with businesses and non-government organizations as well as academics. I talked with journalists. And I discovered a passion for helping others understand and use science and evidence in their lives and work.
This passion continues to be central to everything I do as a scientist and an academic. It has its roots in the rather eclectic informal education I’ve received from consumers, policy makers, journalists, and a whole host of others. It draws on evolving but nevertheless strong convictions about the role of science and communication in society. But until recently, I hadn’t thought too much about the nature of these convictions, or my overarching philosophy of science communication.
That is, until they were reflected back to me in that recent interview.
So now I’ve had a few days to think about it, what – if anything – is my philosophy of science communication?
Respect
While working as science advisor to the Project on Emerging Nanotechnologies, I had the privilege of observing a number of focus groups as they explored the risks and benefits of new technologies. The participants typically represented a cross section of American society. Not many of them had higher degrees, or well-paid jobs, or knew much about science and technology.
I have two dominant memories from those sessions: First, these people asked intelligent, insightful questions – they were smart. And second, they contextualized the conversation around new technologies in terms of what was important to them – their health, their families, what excited them and worried them; their passions and convictions.
None of the participants had an in-depth knowledge of the technologies they were exposed to. But all of them were capable of articulating what was important to them.
Watching these groups, I developed a tremendous respect for the participants. These weren’t the scientifically illiterate “public” I’d been led to believe made up society, but intelligent individuals with their own interests, concerns and insights.
Listening and learning
Observing these focus groups, I began to appreciate the importance of listening to others and learning from them. Sure I had expertise in one particular area. But I began to discover how ignorant I was in so many others – including understanding how people think and respond when faced with new information and complex decisions. This informal education was continued through listening to and learning from many others who had expertise and perspectives outside of my own, including academics, business leaders, policy makers, activists, and, of course, journalists.
The more I worked with different people, the more I began to realize that I was interested in communication as a means to get stuff out of my head and into the heads of others who had some use for it. But for this to happen, it became increasingly clear that I had to put the needs and interests of the person or group I was communicating with first. And this meant listening to them, getting to know them, and understanding where my expertise ended and theirs began.
Empowerment, education, and enrichment
My early engagements with science communication hinged around helping others make evidence-informed decisions. As I’ve matured as a teacher, lecturer and informal educator, I’ve tried to delve a little deeper into my motivations, and get a better handle on why I do this. More often than not, the answer comes down to three things: empowerment, education and enrichment.
Empowerment is what got me interested in science communication in the first place – providing others with the ability to make evidence-informed decisions. But I’ve come to realize that the worth of science communication is far richer than this.
Taking up my position at the University of Michigan four years ago, I was thrust into the world of graduate education. In one sense, what we do as professors is empower our students to develop their careers and make an impact on the world. But in doing so we are educating them – providing them with the knowledge, understanding, tools and skills that enhance their abilities to achieve their aims and to serve society to the best of their ability.
Education isn’t confined to the classroom however. Some years ago, I became engaged with a network of science museums across the US. I was fascinated by this world of informal education, and the role that science communication has in drawing individuals into learning new things. This fascination stimulated my interest in informal online education, where individuals will actively seek out educational material that feeds their curiosity. This, more than any other environment, is one where science communication is about understanding and connecting with an audience.
Informal education is an important mechanism for helping people understand the world they live in and make evidence-informed decisions. But it also highlights a critical aspect of science communication that is often overlooked – enrichment.
Beyond developing new skills and making informed decisions, science enriches our lives. Scientific facts and insights have intrinsic value. They can wow us, move us, even bring us to tears. They reveal hidden truths about ourselves and those around us. They create connections with the universe we live in. Like art, they help define what it is to be human and to be alive.
Of course, there are a multitude of other ways to look at science communication and its importance within society. But I keep coming back to these three – empowerment, education, and enrichment.
Ditching the deficit model
Something I did learn about early on from people who study communication and decision analysis is the deficit model of science communication.
The deficit model is based on the assumption that people make poor decisions because they aren’t sufficiently knowledgeable or educated to make better ones. It’s a way of thinking that tends to be ingrained into how scientists are taught: we make smart decisions because we know how to make sense of data; others make dumb decisions because they don’t.
This worldview leads somewhat rapidly to the assumption that bad decisions can be avoided through better science education, and that where this fails, people must lack the intelligence to grasp the knowledge they need to be responsible members of society.
It’s a worldview that fails in almost every respect.
Experts in decision analysis have realized for years that poor decisions don’t necessarily correlate with lack of knowledge, and that increasing someone’s knowledge doesn’t necessarily improve the quality of their decision-making – at least from the perspective of those who believe they know the right answers. The trouble is, personal and community decisions depend on more than scientific evidence alone. And as a result, assuming that science is the be-all and end-all of every decision demonstrates a supreme level of ignorance of personal and social dynamics.
More than this, implicit in the deficit model is the assumption that there is a small, privileged group of people who know what is right and wrong, and it is their responsibility to impose this on others who don’t have this privileged insight.
I do not buy into this assumption. Where there are complex decisions to be made that depend on a tangled mass of personal, social, economic, environmental and other factors, about the only certainty is that no one group has the monopoly on what is right or wrong. This is especially true where there are disparities between those making – or imposing – decisions, and those who end up living with the consequences.
So what are the alternatives?
Community-centric communication
Sitting here in one of the country’s leading Public Health Schools, this is where things get real for me. I’ve been trained to value evidence and analysis in decision-making. I can appreciate how they lead to a sophisticated understanding of cause and effect. I’m sensitive to situations where decisions that fly in the face of evidence cause tremendous harm to people. And I’m personally and professionally committed to reducing and if possible avoiding this harm.
So why shouldn’t I impose my “superior” knowledge on those who are unaware of how much their lack of understanding is harming themselves and others? Surely that’s the ethical thing to do, right?
Unfortunately, knowledge alone does not confer the right to decide what’s best for others, nor the right to impose your will on them. On the other hand, it does come with great responsibility to empower others.
This is where I veer toward the idea of community-centric science communication. I believe that if I know something that can help someone else, I have a responsibility to help them, to the best of my ability, make use of this knowledge. But the ultimate responsibility for how they use this knowledge lies in their hands, not mine.
I find this to be a powerful way of understanding communication, as it both empowers individuals and communities, and encourages two-way communication and engagement. This is how I tend to approach science communication. I accept that I have a responsibility to make what I know and understand as accessible as possible to others. I also have a responsibility to advocate for what I see as important factors in making decisions that could affect the health and wellbeing of others. But I don’t ultimately have the right to decide on my own what is right for someone else.
When people get stupid
There is of course a glaring problem with this approach to science communication: What do you do when communities are clearly making important decisions based on misconceptions and misleading assumptions?
Take for instance childhood vaccination. Or controversies over genetically modified foods. Or climate change. Where personal and community decisions fly in the face of evidence, and cause harm as a result, surely scientists have a responsibility to not only communicate the science, but make sure people change their behavior as well.
This challenge is at the heart of evidence-based risk communication, where the ultimate aim is to reduce health and environmental impacts by changing beliefs and behaviors. It’s a challenge that is driven by the knowledge that people don’t always make decisions that are good for themselves and those around them, and that intervention is sometimes needed.
Without a doubt, changing harmful behaviors is critical to the health and wellbeing of communities. But who has the authority to bring about change, or even to give advice? Simply understanding the science behind a risk, or even being able to estimate its likelihood and severity, does not confer a right to instruct others to behave differently, or to force them to do so. And thankfully so, as individuals working on their own – no matter how well-meaning – seldom have the insights necessary to dictate what is right and wrong for others.
In reality there are rarely cut and dried distinctions between “right” and “wrong” when it comes to decision-making. At best, evidence can elucidate the probable consequences of a certain course of action. But there are often personal, societal, moral and ethical values attached to decisions. Ultimately, what is considered “right” and “wrong” (or “better” or “worse”) are governed by these values – not the science.
Collaboration, not coercion
Of course, the hope is that science and evidence underpin values-based decisions, and that important decisions are not built on misunderstandings and falsehoods. Here, scientists may not have the monopoly on what is right or wrong, but they certainly hold important pieces of the puzzle when it comes to understanding the consequences of decisions. The challenge is: how do they make those pieces count?
Again, this comes back to community-centric communication in my mind. Making science and the insights it leads to accessible and understandable is critical to building a foundation on which informed decisions can be made. This is important at the level of connecting with individuals within society. But it’s also important for empowering communities that do have the legitimacy and authority to bring about change, including professional associations, scientific societies, government agencies, and others.
A radical libertarian?
So am I a radical libertarian when it comes to science communication? I don’t honestly know. I do believe that science should be accessible and understandable to anyone who wants to learn more. I believe strongly in the importance of making decisions based on facts rather than fantasy (especially when it comes to health and wellbeing), and that the rights of communities are sometimes more important than those of individuals. I believe that community-centric science communication is important, and that respect for your audience as a communicator is paramount. And I believe that scientists have a responsibility to work with others in ensuring that the communities they are a part of benefit as much as they can from science and the insights and opportunities it provides.
Is this a philosophy of science communication? You tell me. All I know is that it’s what I do.
I believe ALL sides in a debate need to have as full a slate of information (not just data) as possible. Then they have the best chance of reaching decisions that benefit all of those sides. That, of course, requires all sides to have a tendency toward compromise – not easy these days, when ideology is often assumed to be the equal of informed opinion and knowledge. But that’s nothing a communicator (in science or other domains) can do much about.
I think your approach is the only one an honest communicator can take. It can often seem frustrating when you do as good a job as you think you are capable of, but the knowledge you think you are transmitting gets subsumed by other concerns that seem tangential (or irrelevant) but, again, that’s not something you can control. Climate change is one of the best examples, currently. Despite all of the information out there, the cudgel crew often seems to be winning hearts.
Still, the NEED for the best communication is there, and always will be. Even though it can sometimes seem a losing fight, it’s still one worth waging.
Thanks – I couldn’t agree more. One thing I didn’t mention, as I was already over any halfway decent word limit, is that, as a communicator, you sometimes have to accept that people will hear or read your words, then go and do something different. And frustrating as this is at times, it’s also a good reminder that you probably don’t always have all the answers!
Amen!
“Take for instance childhood vaccination. Or controversies over genetically modified foods. Or climate change.”
This is an interesting statement. Either you assume all your readers will share your opinion or you are being coy about your positions (or both). These three examples are not in any way equivalent at a societal level; and hence, any advice you might give as an honest scientific broker would have a different value in each case.
Irrespective of where you live, childhood vaccination benefits the group over some of the individuals in the group (there will always be some adverse reactions). The advice of an honest scientific broker would be just that: a large majority of people will benefit, but a small minority will be disadvantaged and possibly killed. It really isn’t up to you to take this any further: that should be the task of elected governments. (Also, I would note that if you believe that vaccinations must be enforced, this would falsify any hypothesis of you being a pure libertarian.)
GMOs – although the science doesn’t vary, the importance of a recommendation would vary with where you live. In the Western World, GMOs are more about profit than survival (e.g. if you support GMOs you will disadvantage organic farmers, but benefit agribusiness and probably consumers). This isn’t necessarily true in the Third World where it may make the difference between survival and starvation. In either case, your advice should be the same and based on the data.
Climate Change – this is entering the world of religion and politics and best avoided by anyone who cares about science and science communication. If you actually were a ‘radical libertarian’, then my understanding is that you wouldn’t care about other people’s religions unless they imposed on your beliefs (NB – it has messed up my area of research, so I have an excuse). If your area of expertise is emerging technologies, then I can see why you might want to make recommendations about new battery systems or alternative energy generation, etc., but why would you want to come out for or against a doomsday cult? That doesn’t seem very libertarian in any sense that I’ve seen it defined.
Thanks Dave for addressing my question – and I’m rather glad that I don’t come across as libertarian in your view (showing a thread of my true colors there!)
I was of course using all three as a rhetorical device to open up the conversation around issues where there is a strong sense of what the “right decision” is within sectors of the scientific community. I was neither promoting my opinion, nor assuming readers shared it. Yes, I was being coy – because the narrative here is about enforcing decisions based on an assumption of right, not about these issues specifically. In this respect, what each issue has in common is the tension between advice based on evidence (colored in some cases by a science-centric worldview), and actions based on a wider range of values and factors, some of them seemingly antithetical to the evidence.
The point you make that the advice an “honest broker” might give in each case is an important one I think – it underlines the reality that the science-based perspective is one factor out of many feeding into decisions that individuals and communities make.
I am still grappling with the distinction between giving advice and elucidating options and factors. Maybe it’s too subtle a distinction, but the former seems to assume some degree of authority on the part of the advice giver in determining appropriate courses of action, while the latter is more about helping the decision-maker see the value and consequences of options more clearly. Although in both cases, the values of the advisor/elucidator will be tied up with what is conveyed.
Hi Andrew,
I don’t think the distinction between presenting an analysis and giving advice is too subtle. In general, scientists are trained to gather data, look for patterns, develop hypotheses, test them and analyze the results objectively. If a scientist has any authority, it should come from their ability to do science successfully, not from an illusion that scientists are oracles or understand the world better than others. This seems to work pretty well within scientific establishments where the decision-makers have a basic understanding of the science and the limits of scientists.
Presumably your main interest is in communicating science to non-scientists. If so, then the most important point you can make is that it is the data that is important, not the opinion of the scientist. Scientists are just as likely as anyone else to hold unreliable beliefs about society and our place in nature. As I’m sure you know, there are several historical examples of disciplines being overcome by beliefs that made them blind to their data: the Eugenics movement before WWII is one example; climate science may be another.
From the studies that I have read, and from personal experience, while members of the general population may have some respect for science, they tend not to give claims by scientists more credence than their other sources of information – and truthfully, there is no reason why they should. A politician or a manager should have a better grasp of the implications of a decision than a scientist tasked with testing a hypothesis. If a scientist is asked for advice and feels they must give it, then they should do so as an individual with an opinion, not as someone who speaks from authority.
Hi Andrew – I read your posting and accompanying comments on “Is using nano silver to treat Ebola misguided?” with interest. I suppose it may be confirmation bias, but it does seem to support my belief that scientific advice needs to be presented as objectively as possible. Your title suggests you have already made a conclusion and many of your commentators seemed even more ready to jump to conclusions about your motivations.
I think I would have used a title like ‘Nano Silver and Haemorrhagic Fever Viruses: What do we know?’ and then tried to present what studies were available. I do think your conclusions are justified, but I don’t think your presentation came across as completely objective.
The WHO ‘ethical’ position, however, is both patronizing and no doubt completely ineffective, not to mention bizarrely tone-deaf considering the administration of ZMapp to the two American missionaries. The outrage of some of your commentators is easy to understand.