On Monday, the National Institute for Occupational Safety and Health (NIOSH) released new data on the potential role multi-walled carbon nanotubes play as a cancer promoter – a substance that promotes the development of cancer in the presence of a carcinogen. In the study, mice were injected with methylcholanthrene – a cancer-initiating agent – and subsequently exposed to airborne multi-walled carbon nanotubes. Compared to a control group, the methylcholanthrene and carbon nanotube-exposed mice were significantly more likely to develop tumors, developed more tumors, and developed larger tumors. The study provides a strong indication that this particular form of carbon nanotube material can synergistically increase the likelihood and severity of cancer in the presence of a carcinogen. Continue reading Carbon nanotubes as a potent cancer promoter – new data from NIOSH
The World Economic Forum Global Agenda Council on Emerging Technologies has just published its annual list of the top ten emerging technology trends. Based on expert assessment from council members and others, the list provides insight into technologies that have the potential to have a significant economic and social impact in the near to mid-term.
This year’s list includes: Continue reading Top 10 Most Promising Technology Trends 2013, from the World Economic Forum
Cross-posted from Risk Sense
This week’s Risk Bites video takes a roller-coaster ride through some of the hottest topics in risk science.
Admittedly this is a somewhat personal list, and rather constrained by being compressed into a two and a half minute video for a broad audience. But it does touch on some of the more exciting frontier areas in reducing health risk and improving well-being through research and its application.
Here are the five topics that ended up being highlighted:
BIG DATA

Despite pockets of cynicism over the hype surrounding “big data”, the generation and innovative use of massive amounts of data are transforming how health risks are identified and addressed. With new approaches to data curation, correlation, manipulation and visualization, seemingly disconnected and impenetrable datasets are becoming increasingly valuable tools for yielding new insights into what might cause harm, and how to avoid or reduce it. This is a trend that has been growing for some years, but is now rapidly gaining momentum.
Here are just four examples of how “big data” is already pushing the boundaries of risk science:
- High throughput toxicity screening, where rapid, multiple toxicity assays are changing how the potential hazards of new and existing substances are evaluated;
- “Omics”, where genomics, proteomics, metabolomics, exposomics and similar fields are shedding new light on the complex biology at the human-environment interface and how this impacts on health and well-being;
- Risk prediction through the integrated analysis of related datasets; and
- Designing new chemicals, materials and products to be as safe as possible, by using sophisticated risk data analysis to push risk management up the innovation pipeline.
CLOUD HEALTH, or C-HEALTH
Hot on the heels of mobile health, the convergence of small inexpensive sensors, widespread use of smart phones and cloud computing is poised to revolutionize how risk-relevant data are collected, processed and used to make decisions. Sensors already built into smart phones are being used to collect basic information on environmental factors that could impact on health – and increasingly sophisticated add-on sensors are becoming more and more available. On their own, these data aren’t that valuable. But with cloud computing it is becoming possible to process and analyze risk-related data from thousands or millions of users – and then provide contributors with personal, near real-time information on potential risks and avoidance strategies. We’re not there yet – but C-Health is on the way!
RESPONSIBLE INNOVATION

The idea of responsible innovation – reducing the potential for future adverse health and environmental impacts by integrating risk management and avoidance strategies into the technology innovation process – has been around for some time. And with new technologies emerging at an increasing rate, the social and economic importance of responsible innovation has never been greater. In fields ranging from advanced manufacturing, sophisticated materials and synthetic biology to 3D printing and remote charging, there is an increasing push to ensure that technological development is informed by the science of risk. And it isn’t only actual risks that need to be avoided – societal and economic success through responsible innovation also depends on addressing perceived risks.
HEADOLOGY

The psychology and sociology of how individuals and groups make risk-relevant decisions, and the subsequent consequences of those decisions, are a critical component of the science of risk. That this is social science rather than natural science does not diminish its importance. In fact, without a sophisticated understanding of how empirical data on hazard, exposure and risk translate into human understanding and action, risk assessment and the science behind it are pretty worthless. But why call this frontier “headology” – a made-up word from satirical author Terry Pratchett? Apart from being a little tongue in cheek, I wanted to get away from some of the baggage associated with terms like “risk communication” and “social science”. But whatever you call it, in today’s increasingly connected world, understanding the human element linking data and action on risk is becoming increasingly important.
COMPLEXITY

This is a bit of a catch-all, but as the “simpler” challenges associated with health risks are resolved (and I use the word “simpler” with caution) we are faced with an ever-growing array of more complex challenges. These include:
- Exploring and understanding the importance of non-linearity in dose-response relationships – especially at low doses;
- Getting a better handle on the health-relevance of low level exposures to some substances – especially over long time periods;
- Better understanding the science behind exposure to synthetic chemicals with hormone-like properties; and
- Understanding the nature and significance of epigenetic interactions – both within a generation and across generations.
These and similar areas arise from complex interactions between our bodies and the environment we live in – and create for ourselves. The list could be a lot longer, but the bottom line is that some of the knottiest and most significant challenges in risk science involve understanding the positive and adverse impacts of interactions that are not yet well understood.
There are other areas that could have easily made this list – and in all cases these are areas that will continue to remain important well beyond 2013. So feel free to expand on the list in the comments below. And have a great 2013!
Sometimes you read a science article and it sends a tingle down your spine. That was my reaction this afternoon reading Ed Yong’s piece on a paper just published in Nature Biotechnology by Janna Nawroth, Kevin Kit Parker and colleagues.
The gist of the work is that Parker’s team have created a hybrid biological machine that “swims” like a jellyfish by growing rat heart muscle cells on a patterned sheet of polydimethylsiloxane. The researchers are using the technique to explore muscular pumps, but the result opens the door to new technologies built around biological-non biological hybrids.
To get a sense of what Parker et al. have achieved, it’s worth watching this video of the “medusoid” in action – the movement comes about by a single layer of heart muscles grown on the substrate contracting synchronously as an electric field is applied to the liquid.
What particularly intrigues me here is the fusion between the biological and the non-biological. While synthetic biology has typically focused on manipulating organisms through designer-DNA, this more practical approach to engineering biology could go a long way very fast – even before genetically engineered components are added.
In the case of the machine above, the result is a relatively functionless entity that moves when an external voltage is applied. But it wouldn’t take much to engineer in a self-contained voltage source and pulse regulator, and maybe some control elements – fueled by further hybrid biological components. What you end up with is an engineering construction kit for biological machines that could be as attractive to the DIY bio community as to mainstream technologists. With the addition of genetically designed components, this is likely to be a technology to watch.
Of course, the other reason why this story sent a tingle down my spine is the quote that I used for the title of this piece – which must be one of the coolest biotech quotes ever!
Nawroth, J. C. et al. (2012) A tissue-engineered jellyfish with biomimetic propulsion. Nature Biotechnol. http://dx.doi.org/10.1038/nbt.2269
A few weeks ago I was asked to give a “TED style talk” on nanotechnology for the University of Michigan Environmental Health Sciences department 125th anniversary. What they got was a short talk on “thinking small”:
The other talks in the series are also worth checking out – covering topics as diverse as epigenetics, cancer, exposure science, obesity, endocrine disruptors, global health and mercury in the environment. Watch them here: http://www.youtube.com/playlist?list=PLF87730C0E0C26FEA
Nanotechnology leads to novel materials, new exposures and potentially unique health and environmental risks – or so the argument goes. But an increasing body of research is showing that relatively uniformly sized nanometer-scale particles are part and parcel of the environment we live in. For instance, a number of simple organisms such as bacteria and diatoms have the capability to produce nanoparticles, either as part of their natural behavior or under specific conditions. Nanoscale minerals, it seems, play an important role in shaping the world we live in. Everyday silver objects wantonly shed silver nanoparticles into our food and water, according to research published last year. And now a group of researchers have shown that food containing caramelized sugar contains uniformly sized amorphous carbon particles.
This latest paper was published in the journal Science Progress a few weeks ago, and analyzes the carbon nanoparticle content of such everyday foods as bread, caramelized sugar, corn flakes and biscuits. The authors found that products containing caramelized sugar – including baked goods such as bread – contained spherical carbon nanoparticles in the range 4 – 30 nm (with size being associated with the temperature of caramelization). This isn’t that surprising as nanoparticle formation is closely associated with hot processes. Continue reading Carbon nanoparticles could be ubiquitous to many foods
It’s been hard to avoid the buzz surrounding nano quadrotors this week, following the posting of Vijay Kumar’s jaw-dropping TED talk – and the associated viral video of the semi-autonomous machines playing the James Bond theme.
The quadrotors are impressive – incredibly impressive. But I’m sure I am not the only person watching these videos who felt a shiver of apprehension about where the technology might lead.
When people talk about emerging technologies – especially when the focus is on potential risks and unintended consequences – it doesn’t take long for the usual suspects to emerge: with nanotechnology, synthetic biology and geoengineering usually appearing toward the top of the list. But I wonder whether focusing on big, well-publicized technology trends sometimes masks some of the less discussed but more important technology innovations that are already impacting on people’s lives.
Tim Harper and I underscored this concern in a report from the World Economic Forum last year where we suggested we should be focusing just as much on the innovations that build on synergistic connections between technology platforms (see below), because this is where many of the more significant disruptive and game-changing technologies will emerge.
It’s partly because of this that I have been so intrigued by the nano quadrotor work coming out of the GRASP lab at the University of Pennsylvania.
The nano quadrotors that Vijay Kumar’s team are developing are a prime example of synergistic innovation leading to a game-changing technology. The quadrotors combine components from multiple technology platforms – sensors, materials, information processing and others – and as a result they present opportunities and risks that depend on the synergism between these platforms. In other words, the potential disruption comes not from the platforms, but how they are combined into products.
Just thinking briefly about the potential impacts of the nano quadrotors, it’s not hard to see how they could shake things up. In fact Chris Anderson, the curator of TED, tweeted after Vijay’s talk:
On the plus side, the nano quadrotor technology clearly opens new avenues into the areas of search and rescue, exploration and surveillance. But it’s also frighteningly easy to see how it could lead down darker paths. I’m sure I am not the first to have the sensation of dystopic Sci-Fi movies playing out before my eyes as I watch the video above.
Applications in military intelligence are a no-brainer – as well as in tracking terrorist activities, or any other activities that governments and others want to monitor for that matter. The swarming ability of the nano quadrotors also opens up intriguing new options for semi-autonomous offensive systems that are able to outsmart defensive screens. And it’s not hard to imagine the devices being deployed on search and destroy missions, equipped with advanced face recognition capabilities and some suitably nasty toxin. And that’s just after giving the possibilities a cursory thought.
Of course, the technology is almost certainly not as mature as the videos suggest – just yet. The most impressive videos – including the nano quadrotors playing the James Bond theme – downplay the complexity of the external feedback and control systems needed and the limited range of the devices. But this is where synergistic technology innovation that builds on advanced technology platforms comes into its own.
For instance, take these four possible limitations of the technology, and the likely availability of technology-based solutions (and I’m speculating a little here, not being a nano quadrotor insider):
Sensors: To work effectively, the nano quadrotors need feedback – and lots of it. In the lab, this is provided through a combination of on-board and remote sensors. Although out-of-lab use is possible, it seems to be limited in part by the size, range, speed and sensitivity of on-board sensors at present. This will change. With advances in sensor technology that are already on the horizon, it will be easier to equip the devices with small, lightweight sensors that will allow increasingly autonomous operation.
Materials: The nano quadrotors depend on lightweight, high performance materials to ensure minimum power requirements and maximum maneuverability. Nanoscale science and engineering are already leading to a new generation of lightweight high performance materials that will further improve performance, as well as enabling further miniaturization.
Data processing: The current generation of nano quadrotors depend on incredibly powerful and sophisticated data processing capabilities. The next generation will demand even more. My guess is that there is still a shortfall between what can be achieved and what is needed for strong out-of-lab performance. But we’re getting there. There is still no end in sight to the exponential growth in processing power, or in smart new ways of using this power to process complex datasets on the fly.
Power: Vijay Kumar estimates that the current crop of nano quadrotors consume 15 watts of power – giving them, by my estimate, a maximum of 10 – 20 minutes operating time between charges using current battery technologies. Not a lot if you are on an extended search and rescue mission! But battery technology is still advancing rapidly, and over the next few years it is entirely conceivable that this range will be doubled or more. Perhaps more intriguingly, it’s not too hard to imagine extending the range of a nano quadrotor to tens of miles by combining its semi-autonomous behavior with hundreds of well-placed recharging stations. And if those stations used wireless power-transmission technologies currently under development – and thousands of them were air-dropped over a region – the effective range of nano quadrotor swarms could be extended to hundreds of miles or more.
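The 10 – 20 minute figure above is just battery capacity divided by power draw. Here's a minimal sketch of that back-of-the-envelope calculation – the 15 W draw is Vijay Kumar's estimate, while the 2.5 – 5 Wh pack capacities are my own illustrative assumption for a small lithium-polymer battery, not published specs:

```python
# Rough flight-time estimate for a small quadrotor.
# Assumptions: ~15 W power draw (Kumar's estimate); battery capacities
# below are illustrative guesses for a small LiPo pack, not real specs.

def flight_time_minutes(battery_wh: float, power_w: float = 15.0) -> float:
    """Ideal operating time in minutes: stored energy / power draw."""
    return battery_wh / power_w * 60.0

# A small lithium-polymer pack in the 2.5-5 Wh range gives roughly:
for capacity_wh in (2.5, 5.0):
    print(f"{capacity_wh} Wh -> {flight_time_minutes(capacity_wh):.0f} min")
```

This ignores take-off surges, battery ageing and reserve margins, so real endurance would be somewhat lower – but it lands squarely in the 10 – 20 minute range quoted above.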
Even looking at these four potentially limiting factors on nano quadrotor performance and use, it becomes apparent that current technology platforms are close to providing solutions that will make this a viable, powerful, and probably highly disruptive technology. Whether this will lead to a net gain or a net loss for society is by no-means clear yet. What I think is clear is that focusing on the responsible development of technology platforms, to the exclusion of the innovations that arise at the intersections between them, runs the risk of us missing what is most likely to change the world we live in.
A few weeks ago I spent some time chatting with Howard Lovy for an article for the Nanobusiness Commercialization Association. That interview was posted by Vincent Caprio on his blog a few days ago, and raised a few eyebrows – was I showing signs of becoming a nano-risk skeptic?
I hope not, as I still feel emerging evidence and trends indicate that major perceived and real risk-related barriers lie in the path of developing nanoscale science and engineering successfully, if we aren’t smart. But I have always adhered to the idea that successful and responsible technology development depends on taking an evidence-based approach – even if that evidence is sometimes uncomfortable. And so these days I sometimes worry that too much is made of artificial constructs surrounding “nanotechnology”, and not enough is made of the underlying science.
Reading through Howard’s piece, I felt it was a pretty accurate reflection of our conversation. There are a couple of places where it possibly indicates less concern on my part than is warranted. Toward the end of the piece for instance I am quoted as saying “there is no need [for the nanobusiness community] to respond to individual challenges such as this lawsuit against the FDA”, referring to a recent lawsuit by consumer advocates against the U.S. Food and Drug Administration, which claims the FDA is failing to regulate nanomaterials in products.
I’m pretty sure I did say something along these lines. But the context was that lawsuits like these are a relatively widely used mechanism for holding federal agencies to account and prodding them into action. And while they are often important, the nanobusiness community needs to understand this context and be aware of the bigger picture when it comes to responsible and sustainable development.
Overall though, the piece captures my increasing interest in getting to the bottom of what can go wrong as new technologies are developed, and how we need to start exploring better ways of ensuring responsible innovation.
Here’s the piece that Howard wrote – the original can be read on Vincent Caprio’s blog Evolving Innovations.
When Andrew Maynard, director of the Risk Science Center at the University of Michigan, read the text of a recent lawsuit by consumer advocates against the U.S. Food and Drug Administration, which claims the FDA is failing to regulate nanomaterials in products, one phrase jumped out at him. The groups used the words “fundamentally unique properties” when referring to nanoscale ingredients.
The phrase, in fact, comes directly from marketing material of the National Nanotechnology Initiative. So, in one sense, the nanotech industry is a victim of its own public relations, Maynard believes. A phrase used to promote nanotech commercialization is being thrown back at nanotech advocates by those who would use the same logic to demand strict regulations.
“There is an assumption that you can have everything your own way,” Maynard says. “You can say something was unique and important and world-changing, selling the hype, and yet not really understanding what the long-term consequences of that hype are.”
This is what Maynard does for a living. He tries to reach beyond hype and beyond gloom to assess and communicate the real risks associated with emerging technologies, including nanotechnology. But he approaches these assessments from a starting point that seems increasingly difficult to achieve in these polarized political times – one based on scientific principles rather than political agenda.
The problem with that “unique properties” phrase that has been so misused over the years is that the science does not necessarily back it up. Material at the nanoscale is not necessarily any different from its macroscale cousin.
“Now, with the research that’s been generated in the last few years, it’s become increasingly clear that there’s no well-defined set of materials that raise red flags when it comes to size,” Maynard says. “About the best you can do is say that the smaller and more sophisticated you make things the more you have to think about a wide range of questions when you’re evaluating safety.”
So, when Maynard now discusses nanotechnology and potential risk, he’s not likely to even use the “n” word. He’s talking about advanced materials, or “sophisticated materials.”
For example, he says, what questions do you ask when trying to determine whether quantum dots are safe? Well, you talk about the composition of the quantum dot, how its physical and chemical structure determines how it interacts with biological systems, and how its size affects where it goes in the body and how it interacts within it.
“But those are not nano-specific questions,” he says. “They’re the questions associated with a specifically designed material.”
The same goes for the titanium dioxide found in sunscreens. Shrink it down to the nanoscale and you get concerns raised by advocacy groups such as Friends of the Earth and others involved in the lawsuit against the FDA – but the research says titanium dioxide, even at that size, is still pretty benign.
It has taken Maynard a few years to reach this point in his thinking about nanotech. Many in the nanotech business community might remember Maynard when he was scientific adviser for the Wilson Center’s Project on Emerging Nanotechnologies (PEN) between 2005 and 2008. The PEN raised many questions about the potential risks of nanomaterials. Has he changed since his Wilson Center days?
“I have, which is I think inevitable. If you take a young field, our knowledge is going to change over time,” Maynard says. “And if we don’t change our opinions based on that knowledge there’s something wrong.”
But one thing that has not changed is his belief that if nanotech is going to develop into a sustainable industry that is economically robust, it needs to also be “socially robust” and develop with an eye toward social implications.
“It makes a lot of business sense, if you’re developing any new technology – not just nanotech or whatever – to be aware of the possibilities of what might go wrong with that technology and those products and shore things up as early as possible,” he says.
The problem, though, is that roughly 10 years after these questions were first asked, after the U.S. government has invested millions in looking at the environmental and health implications of nanotechnology, we still are not much wiser.
“We know a lot more now,” Maynard says. “The question is do we know a lot more that’s useful now. That’s what I would debate.” The problem, he says, is that the wrong questions are being asked.
Take, for example, carbon nanotubes. There is an assumption by many researchers, Maynard said, that the material is similar to asbestos. But nanotubes are not straight, long, rigid fibers, yet this assumption is driving the research.
“I am quite often concerned that you talk to toxicology groups doing research on carbon nanotubes, I don’t think many of them could actually accurately describe to you the physical form or nature of a carbon nanotube. And yet they’re doing research under various assumptions of what these things are like.”
So, this is the mission of Maynard’s Risk Science Center – to start discussions about the risks of technology with a grounding in real science and not in speculation, taking an “evidence-based approach.”
He’s come a long way since the early 1990s, when Maynard, now 46, worked on his Ph.D. at Cambridge in the UK, using advanced microscopy techniques to analyze airborne particles. At the time, many of his colleagues told him he was wasting his time – there would be no future in tiny materials. They were wrong, of course, and Maynard became increasingly involved in studying emerging technologies. Eventually, he made the jump from doing science to studying the proper ways of communicating it to the public.
Next on his agenda is looking at issues involved in advanced manufacturing, which overlaps with nanotech. Again, he said he is asking questions having to do with how businesses using new manufacturing technologies, producing new materials, can predict where economic and social barriers are going to be and have a plan to get over them. That includes codes of conduct, standards and best practices. It is up to the industry, itself, to make sure these are in place. The alternative is unwanted regulation.
The most-important advice Maynard gives to the nanotech business community is to simply be aware of the possible implications of the technology they’re developing and make sure regulatory agencies are properly informed of what is being done. But there is no need to respond to individual challenges such as this lawsuit against the FDA.
“It’s worthwhile playing the long game and not being too reactionary to what happens,” Maynard says. “What’s happened over the last 10 years is that concerns over nanotechnology really haven’t gained that much traction.”
In fact, it’s just the opposite. People, in general, remain excited about the prospects of nanotechnology.
“I think the bottom line is to be as honest as possible, and talk to people,” Maynard says. “One of the biggest problems is if you come across as trying to hide things or trying to obscure things. Generally, people are really excited about this technology. They just want to know what’s going on. They want to know what it’s about.”
For more on where my thinking is going on sophisticated materials, check out:
Maynard, A. D., Philbert, M. A. and Warheit, D. B. (2011) The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond. Toxicol. Sci. 120 (suppl 1): S109-S129. [Free download]
Maynard, A. D. (2011) Don’t Define Nanomaterials. Nature 475, 31 [Accessible here]
Maynard, A. D., Bowman, D., Hodge, G. (2011) The problem of regulating sophisticated materials. Nature Materials 10, 554–557 [Accessible here]
Here’s an introduction to the “wonders and worries of nanotechnology” that I think is rather brilliant:
It’s part of a series being produced by the Science Museum of Minnesota for the Nanoscale Informal Science Education network (NISE Net). The series is designed to stimulate discussions addressing the societal and ethical implication of nanotechnology – but in an accessible and non-threatening way.
Keep your eyes peeled for further episodes with Mindy and Denny – having read through some of the draft scripts, I think you will enjoy them!
For the past few months, the World Economic Forum Global Agenda Council on Emerging Technologies has been working on identifying some of the most significant trends in technology innovation. Published yesterday by WEF, these represent ten areas that we as a council felt are likely to shake things up over the next few years in terms of their economic and social impact.
The plan is to update this assessment on an annual basis.
Here’s the list:
Informatics for adding value to information
The quantity of information now available to individuals and organizations is unprecedented in human history, and the rate of information generation continues to grow exponentially. Yet, the sheer volume of information is in danger of creating more noise than value, and as a result limiting its effective use. Innovations in how information is organized, mined and processed hold the key to filtering out the noise and using the growing wealth of global information to address emerging challenges.
Synthetic biology and metabolic engineering
The natural world is a testament to the vast potential inherent in the genetic code at the core of all living organisms. Rapid advances in synthetic biology and metabolic engineering are allowing biologists and engineers to tap into this potential in unprecedented ways, enabling the development of new biological processes and organisms that are designed to serve specific purposes – whether converting biomass to chemicals, fuels and materials, producing new therapeutic drugs or protecting the body against harm.
Green Revolution 2.0 – technologies for increased food and biomass
Artificial fertilizers are one of the main achievements of modern chemistry, enabling unprecedented increases in crop production yield. Yet, the growing global demand for healthy and nutritious food is threatening to outstrip energy, water and land resources. By integrating advances across the biological and physical sciences, the new green revolution holds the promise of further increasing crop production yields, minimizing environmental impact, reducing energy and water dependence, and decreasing the carbon footprint.
Nanoscale design of materials
The increasing demand on natural resources requires unprecedented gains in efficiency. Nanostructured materials with tailored properties, designed and engineered at the molecular scale, are already showing novel and unique features that will usher in the next clean energy revolution, reduce our dependence on depleting natural resources, and enable more atom-efficient manufacturing and processing.
Systems biology and computational modelling/simulation of chemical and biological systems
For improved healthcare and bio-based manufacturing, it is essential to understand how biology and chemistry work together. Systems biology and computational modelling and simulation are playing increasingly important roles in designing therapeutics, materials and processes that are highly efficient in achieving their design goals, while minimally impacting on human health and the environment.
Utilization of carbon dioxide as a resource
Carbon is at the heart of all life on earth. Yet, managing carbon dioxide releases is one of the greatest social, political and economic challenges of our time. An emerging innovative approach to carbon dioxide management involves transforming it from a liability to a resource. Novel catalysts, based on nanostructured materials, can potentially transform carbon dioxide to high value hydrocarbons and other carbon-containing molecules, which could be used as new building blocks for the chemical industry as cleaner and more sustainable alternatives to petrochemicals.
Wireless power

Society is deeply reliant on electrically powered devices. Yet, a significant limitation in their continued development and utility is the need to be attached to the electricity grid by wire – either permanently or through frequent battery recharging. Emerging approaches to wireless power transmission will free electrical devices from having to be physically plugged in, and are poised to have as significant an impact on personal electronics as Wi-Fi had on Internet use.
High energy density power systems
Better batteries are essential if the next generation of clean energy technologies is to be realized. A number of emerging technologies are coming together to lay the foundation for advanced electrical energy storage and use, including the development of nanostructured electrodes, solid electrolytes and rapid-power delivery from novel supercapacitors based on carbon-based nanomaterials. These technologies will provide the energy density and power needed to supercharge the next generation of clean energy technologies.
Personalized medicine, nutrition and disease prevention
As the global population exceeds 7 billion people – all hoping for a long and healthy life – conventional approaches to ensuring good health are becoming less and less tenable, strained by growing demands, dwindling resources and increasing costs. Advances in areas such as genomics, proteomics and metabolomics are now opening up the possibility of tailoring medicine, nutrition and disease prevention to the individual. Together with emerging technologies like synthetic biology and nanotechnology, they are laying the foundation for a revolution in healthcare and well-being that will be less resource intensive and more targeted to individual needs.
Enhanced education technology
New approaches are needed to meet the challenge of educating a growing young population and providing the skills that are essential to the knowledge economy. This is especially the case in today’s rapidly evolving and hyperconnected globalized society. Personalized IT-based approaches to education are emerging that enable learner-centred education and the development of critical thinking and creativity. Rapid developments in social media, open courseware and ubiquitous access to the Internet are facilitating learning outside the classroom and continuing education.
Another product of the A World Of Surprises project with James King and a bunch of extremely talented public health and science students. This video, from Gracie Trinidad, explores the tension between superstition and science through medieval paintings – with a contemporary twist at the end [make sure you watch to the very end of the video for the final quote].
A product of the A World Of Surprises project with James King and a bunch of extremely talented public health and science students.
The task was to explore the confluence between mundane and catastrophic risk, which the team does beautifully. Love the technique, and the subtle touches (note the progressive effect of Rhino Bananas on their creator). And the news/web mockups are priceless. Brilliant!
[Make sure you watch to the quote at the end]
Many thanks to:
- Chad Warhola
- Janae Adams
- Anirudha Rathnam
- Sarah Kang
- Alejandro Mendoza
(Needless to say, this is a bit of speculative fiction!)
Last semester, speculative designer James King worked with me and a small group of science and public health students at the University of Michigan to explore how a fusion of science and creative art can lead to new insights and modes of communication. The exercise was part of the A World of Surprises project – a project James is working on as the Witt Artist in residence at the UM School of Art and Design.
Part of the aim was to take these science-grounded students out of their comfort zone, expose them to some radical new ideas and perspectives, and see what happens.
The results were impressive! Once the students realized that they weren’t bound by the rigid limitations of their science education, they became enthusiastic about using creative techniques to tell science-grounded stories that connected with people on a far deeper level than the facts alone would allow.
Today the group presented the fruits of their final assignment: to produce a piece of creative work that captures the tension – in narrative form – between imagined catastrophic risks and experienced mundane risks. As a group, we were interested in the tension between the catastrophic consequences often imagined to arise from human endeavors, and the mundane reality that often develops.
I’ll try to showcase all of the projects over the next few weeks. They were all, in their own way, quite brilliant. Coming up in future posts there will be:
- The Tale of Rhino Banana (a brilliant story of a technological breakthrough that runs up against public resistance);
- Salutary lessons from the struggle between evil and the divine in the middle ages;
- A visual juxtaposition of comparative risks related to Fukushima; and
- A new-future story of technological sophistication and mundane consequences.
(I’ll add the links as they are posted – The Tale of Rhino Banana will be up first)
James will be back in Ann Arbor for the culmination of the A World Of Surprises project in March – stay tuned on that.
The US National Academy of Sciences today published its long-awaited Research Strategy for Environmental, Health, and Safety Aspects of Engineered Nanomaterials. I won’t comment extensively on the report as I was a member of the committee that wrote it. But I did want to highlight a number of aspects of it that I think are particularly noteworthy:
Great progress so far, but it’s time to change gears. Something we grappled with as a committee was what the value of yet another research strategy was going to be. After all, it wasn’t so long ago that the US federal government published a well-received strategy of its own. A key driver behind our strategy was a sense that the past decade has been one of defining the challenges we face as the field of nanotechnology develops, while the next decade will require more focus as an ever greater number of nanotechnology-enabled products hit the market. In other words, from a research perspective it’s time to change gears, building on past work but focusing on rapidly emerging challenges.
Combining life cycle and value chain in a single framework for approaching nanomaterial risk research. As a committee, we spent considerable time developing a conceptual framework for approaching research addressing the health and environmental impacts of engineered nanomaterials. What we ended up using was a combination of value chain – ranging from raw materials to intermediate products to final products – and material/product life cycle at each stage of the value chain. This effectively allows risk hot spots to be identified at each point of a material and product’s development, use and disposal cycle.
Principles, not definitions. Rather than rely on a single definition of engineered nanomaterial to guide risk-related research, we incorporated a set of principles into our conceptual framework to help identify materials of concern from an environment, health and safety impact perspective. These build on the principles proposed by myself, Martin Philbert and David Warheit in a toxicology review published last year. From the National Academies report:
…the present committee focuses on a set of principles in lieu of definitions to help identify nanomaterials and associated processes on which research is needed to ensure the responsible development and use of the materials. The principles were adopted in part because of concern about the use of rigid definitions of ENMs that drive EHS research and risk-based decisions … The principles are technology-independent and can therefore be used as a long-term driver of nanomaterial risk research. They help in identifying materials that require closer scrutiny regarding risk irrespective of whether they are established, emerging, or experimental ENMs. The principles are built on three concepts: emergent risk, plausibility, and severity; …
Emergent risk, as described here, refers to the likelihood that a new material will cause harm in ways that are not apparent, assessable, or manageable with current risk-assessment and risk-management approaches. Examples of emergent risk include the ability of some nanoscale particles to penetrate to biologically relevant areas that are inaccessible to larger particles, the failure of some established toxicity assays to indicate accurately the hazard posed by some nanomaterials, scalable behavior that is not captured by conventional hazard assessments (such as behavior that scales with surface area, not mass), and the possibility of abrupt changes in the nature of material-biologic interactions associated with specific length scales. Identifying emergent risk depends on new research that assesses a novel material’s behavior and potential to cause harm.
Emergent risk is defined in terms of the potential of a material to cause harm in unanticipated or poorly understood ways rather than being based solely on its physical structure or physicochemical properties. Thus, it is not bound by rigid definitions of nanotechnology or nanomaterials. Instead, the principle of emergence enables ENMs that present unanticipated risks to human health and the environment to be distinguished from materials that probably do not. It also removes considerable confusion over how nanoscale atoms, molecules, and internal material structures should be considered from a risk perspective, by focusing on behavior rather than size.
Many of the ENMs of concern in recent years have shown a potential to lead to emergent risks and would be tagged under this principle and thus require further investigation. But the concept also allows more complex nanomaterials to be considered—those in the early stages of development or yet to be developed. These include active and self-assembling nanomaterials. The principle does raise the question of how “emergence” is identified, being by definition something that did not exist previously. However the committee recognized that in many cases it is possible to combine and to interpret existing data in ways that indicate the possible emergence of new risks. For example, some research has suggested that surface area is an important factor that affects the toxic potency of some ENMs; ENMs that have high specific surface area and are poorly soluble might pose an emergent risk.
Plausibility refers in qualitative terms to the science-based likelihood that a new material, product, or process will present a risk to humans or the environment. It combines the possible hazard associated with a material and the potential for exposure or release to occur. Plausibility also refers to the likelihood that a particular technology will be developed and commercialized and thus lead to emergent risks. For example, the self-replicating nanobots envisaged by some writers in the field of nanotechnology might legitimately be considered an emergent risk; if it occurs, the risk would lie outside the bounds of conventional risk assessment. But this scenario is not plausible, clearly lying more appropriately in the realm of science fiction than in science. The principle of plausibility can act as a crude but important filter to distinguish between speculative risks and credible risks.
The principle of severity refers to the extent and magnitude of harm that might result from a poorly managed nanomaterial. It also helps to capture the reduction in harm that may result from research on the identification, assessment, and management of emergent risk. The principle offers a qualitative reality check that helps to guard against extensive research efforts that are unlikely to have a substantial effect on human health or environmental protection. It also helps to ensure that research that has the potential to make an important difference is identified and supported.
Together, those three broad principles provide a basis for developing an informed strategy for selecting materials that have the greatest potential to present risks. They can be used to separate new materials that raise safety concerns from materials that, although they may be novel from an application perspective, do not present undetected, unexpected, or enhanced risks. They contribute to providing a framework for guiding a prioritized risk-research agenda. In this respect, the principles were used by the committee as it considered the pressing risk challenges presented by ENMs.
Maintaining current research and development funding levels. As a committee, we felt that the current US federal government investment of ~$120 million in environment, health and safety-specific nanotechnology research was reasonable, especially given the current economic climate. However, we did recommend that, as knowledge develops and commercialization of products using nanomaterials increases, funded research be aligned with areas and priorities identified within the strategy.
Developing cross-cutting activities. There were five areas where the committee felt that further funding was needed to ensure the value of nano-risk research was fully realized. Each of these cuts across areas of research, and provides the means to maximize the benefit of the science being supported. From the report:
Informatics: $5 million per year in new funding for the next 5 years should be used to support the development of robust informatics systems and tools for managing and using information on the EHS effects of ENMs. The committee concluded that developing robust and responsive informatics systems for ENM EHS information was critical to guiding future strategic research, and translating research into actionable intelligence. This includes maximizing the value of research that is EHS-relevant but not necessarily EHS-specific, such as studies conducted during the development of new therapeutics. Based on experiences from other areas of research, investment in informatics of the order of $15 million is needed to make substantial progress in a complex and data rich field. However, within the constraints of nanotechnology R&D, the committee concluded that the modest investment proposed would at least allow initial informatics systems to be developed and facilitate planning for the long-term.
Instrumentation: $10 million per year in new funding for the next 5 years should be invested in translating existing measurement and characterization techniques into platforms that are accessible and relevant to EHS research and in developing new EHS-specific measurement and characterization techniques for assessing ENMs under a variety of conditions. The committee recognized that the proposed budget is insufficient for substantial research into developing new nanoscale characterization techniques—especially considering the cost of high-end instruments such as analytic electron microscopes—in excess of $2 million per instrument. However, the proposed budget was considered adequate to support the translation of techniques developed or deployed in other fields for the EHS characterization of ENMs.
Materials: Investment is needed in developing benchmark ENMs over the next 5 years, a long-standing need that has attracted little funding to date. The scope of funding needed depends in part on the development of public-private partnerships. However, to assure that funding is available to address this critical gap, the committee recommends that $3-5 million per year be invested initially in developing and distributing benchmark ENMs. While more funds could be expended on developing a library of materials, this amount will assure that the most critically needed materials are developed. These materials will enable systematic investigation of their behavior and mechanisms of action in environmental and biologic systems. The availability of such materials will allow benchmarking of studies among research groups and research activities. The committee further recommends that activities around materials development be supported by public-private partnerships. Such partnerships would also help to assure that relevant materials are being assessed.
Sources: $2 million per year in new funding for the next 5 years should be invested in characterizing sources of ENM release and exposure throughout the value chain and life cycle of products. The committee considered that this was both an adequate and reasonable budget to support a comprehensive inventory of ENM sources.
Networks: $2 million per year in new funding for the next 5 years should be invested in developing integrated researcher and stakeholder networks that facilitate the sharing of information and the translation of knowledge to effective use. The networks should allow participation of representatives of industry and international research programs and are a needed complement to the informatics infrastructure. They would also facilitate dialogue around the development of a dynamic library of materials. The committee concluded that research and stakeholder networks are critical to realizing the value of federally funded ENM EHS research and considered this to be an area where a relatively small amount of additional funding would have a high impact—both in the development of research strategies and in the translation and use of research findings. Given the current absence of such networks, the proposed budget was considered adequate.
Authority and accountability. In our report, we talk quite a bit about the need for an entity within the federal government to take the lead in implementing a risk research strategy. While the US National Nanotechnology Initiative has done a great job coordinating interagency activities, we felt that coordination without authority can only go so far if socially and economically important research is to be conducted in a timely and relevant manner. What this “entity” might look like, we left to the federal government to chew over.
There’s a lot more to the report – including (as you would expect) a broad assessment of research areas that need attention if the science of nanomaterial human health and environmental impacts is to continue to develop effectively.
This is the first of two reports – the second is due in around 18 months, and will look at progress toward implementing a relevant and effective research strategy.
The National Academies report “A Research Strategy for Environmental, Health, and Safety Aspects of Engineered Nanomaterials” can be downloaded here.
Cross-posted from the Risk Science Blog
The World Economic Forum Global Risks Report is one of the most authoritative annual assessments of emerging risk issues currently produced. Now in its seventh edition, the 2012 report launched today draws on over 460 experts* from industry, government, academia and civil society to provide insight into 50 global risks across five categories, within a ten-year forward-looking window.
As you would expect from such a major undertaking, the report has its limitations. There are some risk trends that maybe aren’t captured as well as they could be – chronic disease and pandemics are further down the list this year than I would have expected. And there are others that capture the headlining concerns of the moment – severe income disparity is the top-listed global risk in terms of likelihood. But taken as a whole, the trends highlighted capture key concerns and the analysis provides timely and relevant insight.
Risks are addressed in five broad categories, covering economic, environmental, geopolitical, societal and technological risks. And cutting across these, the report considers three top-level issues under the headings Seeds of Dystopia (action or inaction that leads to fragility in states); How Safe are our Safeguards? (unintended consequences of over, under and unresponsive regulation); and The Dark Side of Connectivity (connectivity-induced vulnerability). These provide a strong framework for approaching the identified risks systemically, and teasing apart complex interactions that could lead to adverse consequences.
But how does the report relate to public health more specifically?
The short answer is that many of the issues raised have a direct or indirect impact on public health nationally and globally. Many of the issues are complex and intertwined, and are deserving of much more attention than I’ve been able to give the report so far. I did however want to pull out some of the points that struck me on a first read-through:
Unintended consequences of nanotechnology. Following a trend seen in previous Global Risks reports, the unintended consequences of nanotechnology – while still flagged up – are toward the bottom of the risk spectrum. The potential toxicity of engineered nanomaterials is still mentioned as a concern. But most of the 50 risks addressed are rated as having a higher likelihood and/or impact.
Unintended consequences of new life science technologies. These are also relatively low on the list, but higher up the scale of concern than nanotechnologies. Specifically called out are the possibilities of genetic manipulation through synthetic biology leading to unintended consequences or biological weapons.
Unforeseen consequences of regulation. These are ranked relatively low in terms of likelihood and impact. But the broad significance of unintended consequences is highlighted in the report. These are also linked in with the potential impact and likelihood of global governance failure. Specifically, the report calls for
“A shift in mentality … so that policies, regulations or institutions can offer vital protection in a more agile and cohesive way.”
The report’s authors also ask how leaders can develop anticipatory and holistic approaches to system safeguards; how businesses and governments can prevent a breakdown of trust following the emergence of new risks; and how governments, business and civil society can work together to improve resilience against unforeseen risks.
Vulnerability to pandemics. Pandemic-associated risks are in the middle of the pack when it comes to potential impact, but not as high as might be expected on the likelihood scale. In 2007 and 2008 pandemics were listed in the top five global risks in terms of impact in the Global Risks Report, but have not appeared this high since 2009. With increasing talk about flu strains like H5N1, I wonder whether the relegation of pandemics from the top-tier risks is an oversight.
Antibiotic-resistant bacteria. These are flagged up right in the middle of the risk-pack as an emerging risk, and are one of the highest-ranked risks directly related to public health. The report provides little additional information beyond this though.
Food and water shortage crises. These are the highest-ranked risks in terms of impact below major systemic financial failure. And while they are both addressed as systemic risks, failure in each area has clear public health implications.
Rising rates of chronic disease. While overshadowed by higher profile risks, this remains an area of significant anticipated adverse impact and likelihood in the report.
Dystopic trends. The chapter addressing potential drivers of a dystopic future does not directly address public health issues. But trends that have an indirect impact on health thread through it. The impact of the current global financial crisis on jobs, working hours and benefits is highlighted, and it is noted that young people have been especially hard hit recently by a lack of career opportunities. The challenges of an aging population are also flagged. Both areas impact indirectly (and sometimes not so indirectly) on health and well-being. One of the questions for stakeholders posed here is “What measures should be taken today to deal with the changing socio-economic dynamics of an ageing population and a bulging young population?” One could equally well ask what measures should be taken to ensure the health of these two populations.
Regulatory risks. In the section asking “How Safe are our Safeguards?”, the report’s authors conclude that:
“far-reaching weaknesses in regulations [suggest] that we may be falling behind in our capacity to protect the systems that underpin growth and prosperity”
This report considers regulation extremely broadly, and spans everything from financial regulation to safety regulation. Yet it also stresses the need for integrated approaches to systemic challenges. The highlighted questions to stakeholders at the end of this section are particularly pertinent to health risk-related regulation and governance:
- How can leaders break the pattern of crisis followed by reactionary regulation and develop anticipatory and holistic approaches to system safeguards?
- How can appropriate regulations be developed so that firms will undertake effective safeguards?
- How can businesses and governments prevent a rapid breakdown of trust following the emergence of a new widespread risk?
- How can businesses, government and civil society work together to improve resilience against unforeseen risks?
Emerging technologies and emerging risks: In examining information on technologies and risks, the report concludes
“globally, the latest technologies are increasingly accessible to local industries, but indicators relating to confidence in the institutions responsible for developing safeguards, including those that manage the risks of emerging technologies, have not shown proportional increases.”
Special report on the 2011 Japan earthquake. The March 11 earthquake that hit Japan last year, and the tsunami that followed, resulted in widespread social, economic and health impacts. In a special report, the 2012 Global Risks Report takes a holistic look at the factors, events and impacts. This is a case review that is well worth reading from a systemic risk perspective.
Risk centers of gravity. The report concludes with a fascinating analysis of risk “Centers of Gravity” within the five sectors it focuses on – these are described as the risks perceived to be of greatest systemic importance, or the most influential and consequential in relation to others, within each sector. The risk centers of gravity that emerged in each sector were:
- Economic: Chronic fiscal imbalances
- Environmental: Rising greenhouse gas emissions
- Geopolitical: Global governance failure
- Societal: Unsustainable population growth
- Technological: Critical systems failure
The bottom line? The report concludes that
Decision-makers need to improve understanding of incentives that will improve collaboration in response to global risks;
Trust, or lack of trust, is perceived to be a crucial factor in how risks may manifest themselves. In particular, this refers to confidence, or lack thereof, in leaders, in systems which ensure public safety and in the tools of communication that are revolutionizing how we share and digest information; and
Communication and information sharing on risks must be improved by introducing greater transparency about uncertainty and conveying it to the public in a meaningful way.
The Global Risks 2012 Seventh Edition is available at http://reports.weforum.org/global-risks-2012/
*I was marginally involved in the report as a member of the World Economic Forum Global Agenda Council on Emerging Technologies
Note to self: When being swept up in the inevitable innovation frenzies* that 2012 will bring, don’t forget to:
- Be aware of where change is needed, and where it is not;
- Focus on inventiveness that will foster new solutions to pressing challenges;
- Develop the foresight to explore and respond to the consequences of actions arising from new ideas;
- Have the humility to ask others for help in areas where expertise runs thin; and
- Not discount simple solutions to seemingly complex problems.
Oh, and go easy on the chocolate and booze.
Hope you all have a happy, fulfilled and productively innovative new year!
*As well as working on and writing about technology innovation as usual, I’m expecting 2012 to be a big year for innovation in the “day job”, including exploring some new approaches to teaching and knowledge translation.
I picked up a new toy this weekend. (If you want to cut to the chase and see what I’ve been doing with it, please head straight to the end of the post).
I’m fascinated by the combination of old tech (essentially “chalk and talk”) and new media that Sal Khan has been successfully using to teach mathematics and science online. The basic approach he uses of writing and drawing while talking is as old as the hills. But he successfully enhances this through “debundling” topics (breaking things down into small digestible chunks) and making his digitized chalk and talk lessons freely available as short online videos.
Chalk and talk is a way of teaching I still find effective, as it forces me to develop ideas at a measured pace, while allowing my students to follow the thought process and take notes. But it’s an approach that is increasingly out of vogue as educators feel they have to pander to today’s tech-savvy and social media-immersed students. So, inspired by Sal Khan, I’ve been looking at ways of combining this approach with new online tools to provide teaching resources that extend what can be achieved in the classroom.
My first approach was to look at Khan’s setup – essentially using a drawing tablet and software as a digital blackboard, and recording short videos to teach specific concepts and skills. But after just a few minutes, I realized that the learning curve was too steep for me (put it down to age!) – tablets have a remarkable ability to make everything look like it was drawn by a 3-year-old, until you get the hang of it!
Then I came across pencasts. Continue reading Pencasts – a useful educational tool?
The past few years have seen an explosion of interest in silver nanoparticles. Along with a plethora of products using the particles to confer antimicrobial properties on everything from socks to toothpaste, nanometer-scale silver particles have been under intense scrutiny from researchers and policy makers concerned that they present an emerging health and environmental risk. But a paper published last month in the journal ACS Nano suggests that, contrary to popular understanding, we’ve been exposed to silver nanoparticles for as long as we have been using the metal. Continue reading Exposure to silver nanoparticles may be more common than we thought
I’ve been up to my eyeballs in stuff these past few weeks, and haven’t had as much time as usual to post here. So this weekend I thought I would take the easy route and post a couple of videos from the recent Symposium on Risk, Uncertainty and Sustainable Innovation.
These were back to back panel discussions that were designed to set the scene for the symposium by helping to distinguish technology reality from technology hype. They make interesting viewing, as well as providing what I thought was a rather interesting take on significant areas of technology innovation – especially the second panel.
The full set of symposium videos can be viewed here.
Techno-hype or techno-reality?
Mark Banaszak Holl, UM Associate VP, Office of Vice President for Research. Thomas Zurbuchen, Associate Dean for Entrepreneurial Programs, UM College of Engineering. Paula Olsiewski, Program Director, Alfred P. Sloan Foundation. James Bagian, Director of the UM Center for Healthcare Engineering and Patient Safety.
How are new technologies changing the world?
Gil Omenn, Director of the UM Center for Computational Medicine & Bioinformatics. James Baker, Director of the Michigan Nanotechnology Institute for Medicine. Ann Marie Sastry, CEO and Co-Founder of Sakti3. Jörg Lahann, Associate Professor of Chemical Engineering, University of Michigan.
The latest iteration of the US National Nanotechnology Initiative’s Environmental, Health and Safety Research Strategy was released today – downloadable from nano.gov. A draft of the document has been on the streets since last December – this version was compiled after a public comment period on that draft that closed earlier this year (the key comments received are listed here).
Given the comments received, I was interested to see how much they had influenced the final strategy. If you take the time to comment on a federal document, it’s always nice to know that someone has paid attention. Unfortunately, it isn’t usual practice for the federal government to respond directly to public comments, so I had the arduous task of carrying out a side-by-side comparison of the draft and today’s document. Continue reading New US federal strategy for nanotechnology safety research released