1 Hilary Sutcliffe April 29, 2010 at 2:56 am

Thanks for this Andrew. I haven’t read this paper yet, but obviously support its thinking in principle.

But I have also read Robert Winston’s Bad Ideas and heard him speak last night. He was very persuasive about the need to create a culture of interest in science, not just science outreach, and he challenges scientists and everyone involved to think differently about their engagement and communication.

I initially resisted his view that we all have a duty to engage with science, but when he spoke last night of the randomness of science, the randomness of successful innovation and the negative repercussions of all our endeavours, I started to think he may be right.

But that will take a lot more than better outreach in science museums, and even more than these Centres. It will take a dramatic shift in communication from everyone in the science ‘supply chain’, a change in the accountability of these actors (including ourselves), and a shift in our attitudes to certainty and uncertainty. This is not easy, and perhaps not even doable.

Lots of food for thought!

2 Andrew Maynard April 29, 2010 at 10:35 am

Thanks Hilary,

I’m coming to the end of Bad Ideas (eventually – the long reading period being due to work overload rather than the book itself), and will be interested to see how his concluding ideas jibe with emerging thinking on the dynamic between science and society.

3 Ivan Amato April 29, 2010 at 7:58 am

As usual, Andrew, you have written a thought-provoking entry. The issue of whether we become enslaved by our own innovations is a fascinating one. One thing that comes to my mind here is scale. One grain of silica is a speck of mineral, but 100 billion grains (more?) become a beach. That’s a nice consequence of scaling up, but what happens when you scale technology up? That too can be good, but not always. A test tube of chlorofluorocarbon on a chemist’s bench top in the 1920s looked like an answer to the deadly refrigerants of the day, but scale production up year after year for the many uses CFCs ended up having, and we find ourselves confronted with unexpected and threatening consequences such as damage to the planet’s protective ozone layer.

Now think of a computer, then millions of them, then all of them connected, then all of them connected to computer chips embedded in our electrical grid, transportation system, and other pillars of the infrastructure. It’s not that we become enslaved to the individual parts, but rather to the system these components grow into, without anyone really deciding that such growth should happen. The system builds up partly through planning and partly because the technologically literate often operate in do-it-because-we-can mode. In time, the system can become too big to fail, too big to fully understand, too big to consider dismantling or reconfiguring.

As someone who has been in the world of journalism, it is amazing to see how the technology and practices of the Internet, coupled with catastrophic but expected cyclic events like major economic downturns, can foist changes and choices upon us. Journalism is deconstructing on its way to some kind of reconstruction (I hope!), but this seismic moment in journalism happened rather the way an earthquake does – it was not brought on by choice.

Perhaps this is not evidence of enslavement to technological innovation, but it is evidence that our technologies are susceptible to large macroforces that can inadvertently turn them into our masters.

4 Andrew Maynard April 29, 2010 at 10:39 am

Looking at technology innovation over the centuries, it sometimes seems that the idea that we are in charge of our destiny is merely an illusion – we are programmed to be inventive, it seems, and it’s very hard to temper that programming with forethought. But that’s no excuse not to try!

5 Maria Powell April 29, 2010 at 4:38 pm

Well, I’ll be brief and therefore somewhat cryptic.

I think this depends on what “we” we are talking about ;-).

Also, related to that, I think there are two ignored large “elephants in the room,” so to speak, in these conversations–culture and power. What role does culture play in all of this? (e.g., some cultures have built ecological forethought into the very basis of their cultures, others have not).

Who has power? Who doesn’t? Who is really “in charge”?

Sorry, that’s cryptic food for thought…

Maria

6 Richard Sclove April 29, 2010 at 8:15 am

Thoughtful comments, Andrew.

Early on you mention that technology assessment (TA) “is based on the assumption that, if only we can get some insight into where a particular technology innovation is going . . . we should be able to tweak the system to increase the benefits and decrease the downsides.” As written, that is exactly right. Although if you read my report carefully, you’ll see that I’m interested in seeing whether we can push the capability of TA (both participatory and not) beyond studying one “particular technology” at a time, to also consider the synergistic interactions among complexes of (seemingly unrelated) technologies. It sounds impossible, but it’s an issue I’ve directed attention to for years (including via one NSF-funded project). I don’t think we’ll ever be able to do it with anything approaching perfection, but I have empirically grounded reasons for thinking we can make some headway on it.

As to how participatory technology assessment (pTA) might conceivably expedite decisions: In my report, I address only one aspect of that: the time from initiating a TA project to concluding it. Empirically, pTA projects are often organized and concluded in less time than the U.S. Office of Technology Assessment took to conclude one of its projects.

But in your blog you’re raising the issue of the time it takes for the wider society (or a government body) to reach and implement a decision. My report only alludes to that issue indirectly. But if you think about it, pTA can obviously lead to swifter decisions on that score, at least in some instances. Sure, sometimes including more people will mobilize more interests, raise more questions, and slow innovation (which can be a good thing, if it leads to a better, more thoughtful decision and social outcome). But, conversely, think about cases where a decision to implement a technology is made relatively swiftly and implemented with no popular input . . . and then encounters years or decades of popular resistance (aspects of nuclear power and radioactive waste disposal are the poster child for this kind of thing). In such cases, expanded popular involvement at much earlier stages might lead to different upstream R&D or innovation decisions that could ultimately be implemented more expeditiously and with better social results. (Although proving that in the real world might be impossible, since you won’t have the empirical counterfactual showing how things would have gone without citizen engagement.)

7 Andrew Maynard April 29, 2010 at 10:46 am

Thanks for the clarification Richard. You’re right, of course: when you see pTA in the broader context of sustainable development (in terms of development that doesn’t hit catastrophic or severely limiting barriers early on), up-front participation of laypersons certainly has the potential to make things go faster (where appropriate) and more smoothly in the long run.

It’s particularly interesting that you raise the possibility of slowing innovation as a good thing in some cases. I think this is an idea we shy away from in the US, where the dogma of innovation is so deeply entrenched. Yet if citizens are to be empowered as partners in the process of technology innovation, there has to be this possibility of slowing down “progress” at times – otherwise there is no real empowerment.
