Enabling evidence-based and socially responsive action on human health risks.
Transcript of Nanotechnology Unplugged Webcast, February 8, 2011
Andrew Maynard: Well, welcome everybody to the first in what I hope to be a series of discussions from the Risk Science Center. My name is Andrew Maynard, I'm the Director of the Risk Science Center here at the University of Michigan School of Public Health, and I will be the moderator for the next 50 minutes or so as we go through what I hope is gonna be a very entertaining and enlightening conversation on nanotechnology. Before I get started, just a couple of things to go through. We're webcasting this, so there are… in fact there are quite a few more people on the web than there are in here, so just be aware of that. It also means there's gonna be a time or a chance for questions and answers. The microphones are in front of you. They should pick you up. Avoid the temptation to fiddle with them, and avoid the temptation to--as this paper says here--avoid the temptation to unwrap Christmas presents or candy bars.
[ Laughter ]
Andrew Maynard: If you have to do that, do it under the table. Also, for the people that are watching on the web, we are hoping to be able to field questions from people looking on the web. You can submit your questions either by posting them on the nanotech unplugged website, on Twitter, on our Facebook page. Or if you logged in to the system, if you post them in the comments section here. If we are really lucky, I'll pick them up and I'll be able to field them and fold them into the conversation.
Okay, to get us kicked off: The topic of the conversation this afternoon is nanotechnology. We're gonna be hearing a little bit about what exactly that is. But whether you know what it is or not, it's hard to avoid the fact that nanotech has been heralded as potentially the next industrial revolution. It's been heralded as that for the last 10 years or so. We've had pundits that have claimed it will create new jobs, it will stimulate economic growth, and it will vastly improve people's quality of life across the board, whether we're looking at things like treating diseases, generating new renewable energy sources, ensuring a ready supply of nutritious food, or just developing the latest electronic gadgetry. And if you look at all these, nanotechnology is being promoted as the technology of the future.
But there is a flip side to this. We're also seeing that as people promote this new and unique technology, other people are also raising fears about what the potential downsides could be. And the question there is are we going to create a technology which creates more problems than it solves? How are we gonna deal with this?
So this is where we stand with nanotechnology. But beyond the hype, where does the reality lie? Is nanotechnology gonna change our lives forever? Or is it gonna be making them pretty awful? Is it too dangerous to invest in? Or in fact, is nanotechnology just a little bit of marketing hype, designed to get more research dollars into the pockets of scientists and engineers. Well, to discuss these questions specifically in the context of occupational health, I have three leading experts here. So going down the line, we have Mark Banaszak Holl, to my right. Mark is Professor of Chemistry here who is working with nanoscale materials. But he's also Associate Vice President for Research in the Physical and Natural Sciences of the University of Michigan. Next to Mark, we have Martin Philbert who is Dean of the School of Public Health here. He is also a distinguished nanotoxicologist. I think you've probably called yourself, sometimes, a nanotoxicologist? But he's also bridging the divide between toxicology and applications, working on new applications of nanoscale devices. And then, at the far right… we have feedback here. Somebody is watching the video as well.
[ Laughter ]
Andrew Maynard: At the far right, we have Shobita Parthasarathy. Shobita co-directs the Science, Technology and Public Policy program at the Ford School of Public Policy. And she is an expert on the sociopolitics of emerging technologies.
And what I'm hoping we're gonna do is have a lively discussion on what nanotechnology really means in the context of public health risk. To kick things off though, I want to start with you Mark. And I wondered if you could just give us a potted idea of what nanotechnology is really about. So, where does the hype finish and where does the science and technology start?
Mark Banaszak Holl: Sure. Thanks. Well, the science and technology starts at this idea of what a nanometer is. A nanometer is about 3 to 5 times bigger than an atom, or about 10,000 times smaller than a human hair. To give you an idea what that sort of scale means: you could each take a stride of about 2 feet. Imagine instead you took a stride 10,000 times larger. That would mean you could walk to Detroit in 10 steps from here in Ann Arbor. So that would completely change your idea of time and scale, wouldn't it, and what you could do. So that, I think, is actually not a bad way of thinking about the dramatic change that this sort of length scale brings upon us.
To give you a couple examples of how big changes come from the ability to look at new scales like this: think about the fact that 400 years ago the light microscope was invented, which allowed us to see red blood cells, allowed us to see bacteria, and completely changed what we thought was in the world. 80 years ago, the first nanoscopes were invented. Those were electron microscopes. All of a sudden we could see that viruses existed. Again, that completely changed our view of the world. We now have machines, invented about 25 years ago, that allow us to see in the same quick way at the nanoscale, and once again, it's completely changing what we can know about the world around us. But nanotechnology isn't just about measurement and seeing. Nanotechnology is about the ability to design, make and use things at this nanoscale. And I'd like to give you three examples of nanotechnology. One I hope all of you used earlier today, 'cause we're in the School of Public Health after all: soap. Soap was invented about 2500 B.C. and it fundamentally works off of a nanoscale mechanism. It is a nanotechnology. Another one of my favorite nanotechnologies is the rose window in the Notre-Dame Cathedral. Those windows used metal nanoparticles to generate the color, and that was constructed in 1200 A.D. And of course, looking around, I suspect almost everybody here has their cell phone with them today. In microelectronics, in fact, many of the active components are indeed nanoelectronics; they're nanoscale designs, fabrications and implementations. And so when Andrew asked earlier, is nanotechnology gonna revolutionize things? I would say the revolution is here. You know, it's here, it's already happened, and--
Andrew Maynard: Sorry, I've got to ask actually. Sorry, you make an eloquent point that stuff has been happening at the nanoscale forever.
Mark Banaszak Holl: Thousands of years, yes.
Andrew Maynard: So what is different now? Is there anything fundamentally different with what's happening now?
Mark Banaszak Holl: Well, as I said we have far better tools particularly starting at about 25 years ago. Some of the nanoscale tools, the electron microscope was very, very expensive. You might have one or two of them at a university, one or two of them at a company. Some of the new nanoscale tools are actually considerably cheaper than one of your cars. That might be the price of an expensive stereo system. And so, many, many more people can have access and use these things. So it makes for a lot more progress in terms of being able to make and characterize the nanoscale materials.
Andrew Maynard: Do you also find that these tools that because people can see what's happening at this very fine scale, they can modify it. So I guess that the soap, it worked but nobody quite knew why it worked. But presumably, we can now make a better soap because we know why it works.
Mark Banaszak Holl: That's right. We can dramatically change both old inventions and we can do a lot of interesting new ones.
Andrew Maynard: Right, right. So where are some of the places besides the soap and the iPads and iPods that this is appearing in our lives?
Mark Banaszak Holl: Well, to me one of the real big excitements is our ability to understand ourselves in biology. Biology is a field that has defined itself by length scales. There are departments of molecular biology. There are departments of microbiology. And I don't know, I don't think it makes any sense to make a department of nanobiology.
Andrew Maynard: Right. Because it's there already.
Mark Banaszak Holl: Well, because actually I'd rather we integrated our biological knowledge than divided it up. But the fact is that many, many new interesting things will happen by studying biology explicitly at those nanoscales. We didn't have the tools to do it well for a long time; now we do. And we'll learn exciting new things.
Andrew Maynard: So from what you're saying it sounds like we're on a very exciting path, but it's an evolutionary path. Things have been happening at the nanoscale, and it's just that now we have increasing control over that, so we can control more finely our destiny in terms of where the technology is going. Are we seeing any abrupt changes? Are we suddenly discovering that there are things we can do now, that might be disruptive, that we just couldn't do 10, 20 years ago?
Mark Banaszak Holl: I think the electronics are the most obvious example there. I mean that's been an incredibly disruptive technology in the world, and that's probably the easiest one for everybody to get a grasp of everything that's happened with communications, with ability to manipulate data. That's all fundamentally based on nanotechnology.
Andrew Maynard: Right. So nanotechnology has already changed our lives.
Mark Banaszak Holl: Incredibly disruptive technology, that's right.
Andrew Maynard: Right, right.
But--so something interests me here. From the way you describe it, this is an incredibly exciting trend as far as the science and technology goes, in terms of what we can do. But then you've got the risk side. And over the last 10 years, there's been this mounting conversation about risks associated with nanotechnology. And the fear that if you do something that's new and something that's different, there's also a chance of new and different risks arising. So I wanted to bring Martin into the conversation.
So Martin, hearing what Mark has said, is this something of a fallacy that we're creating brand new ways in which harm can occur with this new technology. Or is this something we need to be aware of?
Martin Philbert: So there are two ways that you can look at this. You can look at it from the material-centric way, which nanotechnology has tended to do over the last decade or so, precisely because of what Mark said: we are able to design new materials. And so we take either existing materials and modify them, or we ab initio create new materials with new and exciting structures that have functionality or can be used in interesting ways. On the other side, you've got the biology, which has evolved in particular ways. And you are talking about structures that are frequently slightly smaller than a virus; some of them are larger than a virus. And a biology that has grown up in an environment in which viruses have attacked it for a long time. So we have a system, the immune system, and particular parts of that immune system, the reticuloendothelial system, that tend to recognize these particles and respond to them depending on their size and structure and chemical identity, or completely ignore them depending on their size, chemical identity and so on. And the real issue here is not that we've created something that responds in new and unexpected ways. It's simply that we don't understand the rules by which the biology responds to these engineered materials.
Andrew Maynard: So presumably, we're moving into an area where we're getting incredibly excited about what we can do. But because we don't fully understand these rules, is there, would you say, a possibility that we could do something which could have bad consequences just because we don't know exactly what we're doing?
Martin Philbert: The same is true for chemicals.
Andrew Maynard: Right, okay.
Martin Philbert: So we make all sorts of molecules and large molecules and even bigger molecules made up of smaller molecules, all the time, not anticipating the damage that it might cause.
Andrew Maynard: So, have we learned enough from our experiences with chemicals and chemistry to deal with these new nanomaterials or is there something else that we need there?
Martin Philbert: So you just made a transition from a scientific analysis to a value-based analysis.
Andrew Maynard: Right.
Martin Philbert: And a philosophical question.
[ Laughter ]
Martin Philbert: How much is enough? Can you ever prove that something is safe? As a toxicologist, the mantra is it's the dose that makes the poison. So everything is in and of itself both safe and unsafe. There is a level at which you can be exposed to cyanide and actually have a beneficial therapeutic effect. If you have hemoglobinemia, small amounts of cyanide will help you. We don't use that therapy anymore, frankly.
[ Laughter ]
Martin Philbert: Because a small miscalculation in dose ends up with where you are heading anyway.
[ Laughter ]
Martin Philbert: But arsenic was used as a therapy in its time, mercury until relatively recently, and we still use platinum to treat cancer. So that becomes much more of a philosophical and a policy-based question and it's something, frankly, that regulatory agencies struggle with all the time.
Andrew Maynard: So we're clearly on that knife edge, if you like, making sure that the actions we take are beneficial rather than detrimental. And I want to come back to that and bring Shobita in in a minute to look at some of the policy implications of that. But just keep--getting back to the science, I should say. Looking at the range of new materials that people are developing, and of course we do have chemistry, and we understand reasonably well if not perfectly the rules of chemistry in terms of biological actions. Are there things coming out of the nanotechnology field, or the nanoscience and engineering field, that really seem to raise more questions than there are answers to in terms of how they behave within a biological system?
Martin Philbert: I would actually reject the premise of your question.
Andrew Maynard: Okay.
Martin Philbert: I think we understand for many chemicals the acute issues of toxicity.
Andrew Maynard: Right.
Martin Philbert: But we're still coming to grips with what, for instance, bisphenol-A may do chronically entering development, entering developmental windows of susceptibility. We don't know much about phthalates and other endocrine disruptors in the environment over a long period of time.
Andrew Maynard: So hold on to that thought.
Mark Banaszak Holl: I very much agree with Martin on that, and I would take it one step further towards some of the new nanomaterials. What I'd say is I have a fairly high degree of confidence in people like Martin when it comes to the acute toxicity problems of our nanomaterials. I think our folks will do a pretty good job on that. We'll have some problems, but we'll correct them quickly. But it's the chronic impacts, where we don't understand our current materials very well regardless of their size scale, that I'm most concerned about.
Andrew Maynard: So what I'm hearing now is actually, we have a problem which is much bigger than nanomaterials.
Mark Banaszak Holl: Absolutely.
Andrew Maynard: We potentially run a danger focusing so much on nanomaterials that we miss the bigger issue of what we don't know and how to deal with it, especially in terms of chronic impacts.
Mark Banaszak Holl: And potentially harm our ability to pursue what could be very powerful new materials that could help us.
Andrew Maynard: So let's--
Mark Banaszak Holl: Because of over-concern about a nano name.
Andrew Maynard: So let's just look at that, and I am going to bring Shobita in in a second. But I'm also intrigued with this idea that the new technologies, no matter what you call them, can actually solve problems that we haven't been able to solve so far. You look at the world and it's, well, full of problems. There are still diseases we're struggling with. We're still struggling to get people enough water, enough food, enough energy in some places. And this seems to be part of a suite of technologies that gives us potentially the tools to do that. So how do you--how do you make that balance between pushing the technology forward and making sure we do it in a responsible way, so you end up with a net benefit rather than a net loss?
Mark Banaszak Holl: That's a really tough problem, right, I mean.
[ Laughter ]
Andrew Maynard: Maybe this is the time to bring Shobita in.
[ Laughter ]
Andrew Maynard: So we'll come back to this question. So Shobita, talk a little bit about the policy issue. So we've talked about the science and technology, the fact that you have great promise here. There clearly are things we need to think about in terms of public health impacts. And a lot of those questions are gonna revolve around policy issues. But not only policy issues, how individuals make decisions and choices on the technology that pervades their life. How do we sort this out? How do we parse it out so we end up making sensible, responsible decisions rather than ill-informed decisions?
Shobita Parthasarathy: Well, I think in the previous discussion, Martin and you captured an issue which I think is important here, which is that the challenge of governance here is not restricted to a challenge of nanotechnology. That is, we have a challenge with dealing with emerging technologies and how we go about regulating them. Thus far, what we generally say and do is, you know, sort of, oh, we have this nanotechnology problem, let's see the extent to which our existing regulatory frameworks can cover the new challenges that we're identifying. And what we've found, I think, in the history of dealing with emerging technologies that way, is that that's an unsuitable way to deal with these problems in at least a few ways. The first is that it doesn't deal with the new challenges that arise. The second is that it creates burdensome regulation, in ways that the old paradigms don't necessarily fit with the new paradigms. And invariably, those are also inflexible approaches that can't deal with new forms of knowledge, for example the issues around acute versus chronic problems that we were just talking about. And finally, they can become politically controversial; that is, people feel like you're not addressing these new problems, we're really concerned about them, we need to deal with them in a better way. And so, you know, when I've been thinking about nanotechnology, I've been thinking a lot about the lessons we can learn from our similar approach to biotechnology. That's a place where we, I think, took a similar approach to the way we're dealing with nanotechnology now, and it has created a few problems. So, for example, in addition to thinking about a governance structure early on, anticipating what we might need, and anticipating not just in a regulatory sense but in a flexible sense, what kinds of expertise we need to bring to the issue is, I think, really important.
So that, for example, goes back to this issue of acute versus chronic, and what are the new kinds of expertise that we need to bring in there: social expertise, moral expertise. Those are the kinds of broader--
Andrew Maynard: I was gonna ask exactly that because the models we tend to have within society are that as the science and technology develop, it's the scientists, the technologists, the engineers that are at the table dictating what is right and what is appropriate. Or at least setting the scene for what is right and what is appropriate in terms of policy and decision-making. But it's your sense that we need to be much broader than that. And in fact, let me reverse the question around. Who should be at the table in terms of deciding how we make decisions as a society on these new technologies?
Shobita Parthasarathy: Well, what you find is that depending on the folks that you bring into the room, it narrows the question in a particular way. So when we bring scientists and engineers into the room, we assume that they know the most about an issue with regards to science policy. But you know, just recall the exchange that you and Martin had when you asked a question that you thought was a science question, and Martin said, "No, that's a values question." Right? And you know, credit goes to Martin, who is able to identify those differences. But that's a rare thing, right? The ability to identify the distinction between a science-based problem and a value-based problem.
Andrew Maynard: Yes.
Shobita Parthasarathy: And invariably when it comes to public policy, we're actually dealing with value-based problems. And when you talk about policy problems being value-based problems, then the question of who is an expert in dealing with value-based issues becomes quite different, right?
Andrew Maynard: Right.
Shobita Parthasarathy: So those might include social scientists, philosophers, members of the public, stakeholder groups, that's a different kind of expert question--
Andrew Maynard: But this sounds like a really difficult intersection between the people that believe they have a science perspective and those that believe they bring a values-based perspective. How do you make that intersection work?
Shobita Parthasarathy: Well, I think it requires reflection on both--
Andrew Maynard: Right.
Shobita Parthasarathy: Both ends. So the way I usually think about this is I say, "You know, in the production of knowledge, there are often values embedded somewhere in that process. And in the identification and assessment of values, there can also be knowledge involved in that process." And so, once you start to realize that the whole process of policy making is sort of this mix of knowledge and values, then we can start to say, okay, we can bring in a variety of different perspectives. Or at least that's the start.
Mark Banaszak Holl: Yeah. I would just like to point out that scientists and engineers are not in any position to dictate this. I like to point out that only 8 percent of the people in the US Congress have a science, engineering or mathematics degree. And only 0.4 percent have a PhD in one of those areas. So the scientists and engineers are in fact not dictating this policy at the national level, and certainly not at the congressional level.
Shobita Parthasarathy: But I would just add to that, that you know, that--while that's true, the infrastructure of the way we make regulatory decisions is one that's based heavily on what we call scientific advisory committees.
Martin Philbert: Let me disagree. I hate to break up this happy hug fest.
Martin Philbert: It is immediately apparent to anyone in Washington that even though the development of a new strategy, technology, or whatever is highly scientific, engineering- and mathematically based and so on, and rooted in the evidence, there is frequently almost no connection between the science and the ultimate decision-making, because the decision makers have the direct imperative to get reelected. And so it is frequently driven by public perception, not by the science. And frequently, the decisions are made despite the science. And it's the rare occasion, I would say, that there is an accord between the scientific evidence and the output. Moreover, the value really depends on your own perspective. So we went through, in recent history, the banning of cyclo-oxygenase inhibitors that managed pain for a huge number of people. But, and this is arguable and controversial admittedly, there is some evidence that some people went off label in taking the drug and damaged their hearts. And so, if you are benefiting from the pain medication, your acceptable risk, and therefore your pressure on the legislator, is of a very different kind from those who perceive that their grandparent or whoever was harmed.
Andrew Maynard: So this is beginning to veer away from the subject of nanotechnology but it seems to confirm to me that when you really begin to dig deep, the issues here are not technology-based issues. They are much broader issues with how society deals with new science, new technology, new engineering. Does that make sense to you as a panel? So do you think the conversation should be more about how society deals with new technologies, new innovations, new possibilities and risks in general rather than focusing on very specific terms such as nanotechnology?
Shobita Parthasarathy: Well, I just wanna say a couple of things: one to address your point, but also to address the second point that Martin made. While it's true that there are personal values that come into play, I think it's also important to think about the fact that, when it comes to thinking about this problem broadly, there are in fact experts who think about the relationship between values, science and technology, the relationship between nanotechnology and society, for example, who can do a better job of dealing with values beyond personal opinions and personal interests. So I think that's an important thing to remember when we talk about a broader discussion of how we go about thinking about emerging technology policy or nanotechnology policy. It's not: hear science, and then hear a bunch of people who have opinions. There are also people who can help us wade through the variety of values, and help us understand what the values out there are, how we should weigh them, et cetera. So that, I just wanna put out there. The second point is I do think that we do a very poor job of dealing in a sober way with emerging technologies. And I think you see that time and time again. Nanotechnology is just the most recent instantiation of it, and I've long thought that, you know, it's the same story over and over again depending on the technology. And so, I would say yes. I mean, I think our problem is that, on the one hand, we have folks who think this is, you know, all of the things you said at the outset. This is, you know, the second coming, the next industrial revolution. It's gonna revolutionize our lives in a positive way. And on the other hand, you have the folks who say, you know, great, dude, this is gonna kill us all.
And we, despite the fact that I think we're developing increasingly sophisticated ways to think about emerging technologies, that hasn't really been reflected in the way we deal with it in the policy arena.
Andrew Maynard: Right. I want to come back to some of those points specifically looking at the public perception side of things. I also at some point want to get back to the science of what's happening here, what's positive and what we need to be aware of in terms of potential pitfalls. But I did want to give the people here a chance to chip in if anybody has any questions. This has been an argumentative bunch so now is your chance to hold them up to that. Sure. Just say who you are.
[ Laughter ]
Paula Lantz: I'm Paula Lantz. I'm faculty here at the School of Public Health, in Health Management and Policy. I guess I'd like--you talked, Martin, about how, you know, Congress doesn't always take a full view of the evidence. But I think what we're talking about here is gonna play out more in the regulatory environment than in the legislature. And I'm wondering what the panel thinks about the capability of the FDA and the EPA and the other major regulatory agencies that are really gonna be where the policy plays out. Are they ready? They're dealing with it now, but are they ready for it, and are they having these conversations about values as well as the science and the engineering?
Martin Philbert: So I would say that we've had that conversation by virtue of practice. And that is that we have successively defunded the regulatory agencies and expected more of them with less. Now some of that is actually reasonable, because we've got greater access to technology, informatics and so on, and one might argue we can make better decisions with less in some cases. But I think because we're in a school of public health, and in schools of public policy, and at a University of Michigan that cares about these things, we're always engineering towards the perfect, and there is no perfect environment. And my point, as heated and controversial as it was, was simply to say that we as scientists tend to think we can come up with a perfect scientific answer and it will translate into policy. And my point is--
Paula Lantz: That's silly.
[ Laughter ]
Martin Philbert: But that's where I started. It's absolutely where I started, because I was under the rules of controlled experiments: proving or disproving the hypothesis and moving on, moving the science on. Whereas in the real world, it's about taking the available imperfect information and making the best decision.
Mark Banaszak Holl: I think the biology is complex enough that we've really struggled with our regulatory agencies to give the best advice and the best regulation. One of the ways you can look at this: you can go back to a book that was published 50 years ago now, Silent Spring. That book focused on concerns about a whole series of molecules, not nanomaterials, molecules, and the big focus of Rachel Carson's concern was their cancer-causing effects; she didn't bring up endocrine disruption early in the book. And in fact, we're still struggling 50 years later with how to regulate that aspect of these materials, very much struggling with that. And so, you know, that's why, I think, both Martin and I said we're quite concerned with that aspect of things moving forward, because on existing materials we're still struggling on the regulatory side with those problems.
Andrew Maynard: Just a followup on the regulatory question: somebody has posted a question online about the effect of regulation on international competition. And I would extend that and ask a question about whether regulation, as it stands or as it could be developed, has the potential to stifle innovation in such a way that people's lives will be adversely affected because we don't develop the technologies that they need.
Shobita Parthasarathy: So, I want to address that point. This is something that invariably comes up when it comes to dealing with emerging technologies. And the bottom line is this: the world is a really complex place. There is a complex web of governance out there that governs everything from taxes to the environment and human health. And so, you know, one suite of regulations that has to do with the toxicology, for example, may or may not make a difference. This is, for example, something that I have seen in discussions about stem cell research, when they said, "Oh, if it's too onerous here, they'll go over there." Well, you know what, in stem cell research it turns out that the intellectual property environment is more conducive here than it is in the European context. So you know, that's not something that I worry about, because I really think that it's more of a charge than it is something that I've seen really substantiated.
Andrew Maynard: But having said that, do you think there is a danger, in some places, of a knee-jerk reaction to a certain technology, where regulation is rapidly put into place which does put a barrier on particular technologies being developed as fast, as efficiently and as effectively as they could be? Or do you think that the system is just so complex that that just gets lost in the noise?
Shobita Parthasarathy: Yeah. I think that's a really rare occasion. I've been studying this for a while and I don't see--there aren't many episodes of that.
Andrew Maynard: I'll bounce this back to Martin and Mark, because both of you in some parts of your life are involved with actually developing new technologies effectively. So do you see regulation as something which potentially creates headaches for you? Or maybe it's the opposite of that. Maybe you see regulation as something that can help you get something that's good on to the market.
Mark Banaszak Holl: You know, most of the safety regulation that's come into place, for example around industrial production and academic handling, is a really good thing.
Andrew Maynard: Right.
Mark Banaszak Holl: And frankly I'm glad it's there.
Martin Philbert: Okay. I think that we have sort of reached a really nice steady-state equilibrium at the moment. For instance, the remit of the FDA is that anything be safe and effective. Safe being first, and then you get into these arcane discussions about what safe means and therapeutic balance. So what's safe as a cancer chemotherapeutic is different from an over-the-counter medication. But I think what the standard does is set the bar below which no technology really should be ethically developed.
Mark Banaszak Holl: It comes back to the point that was made earlier, the "dose makes the poison" idea. The really key thing is that regardless of the materials we develop, we are careful about exposures to them. And if a material is developed for a specific application, that's the exposure we make sure is allowed, and we try to avoid uncontrolled exposures. And that's true again for any class of materials that we make.
Andrew Maynard: So this is a great love fest for federal regulators and I'm sure anybody that's tuning in--
[ Laughter ]
Andrew Maynard: You are preempting what I'm just about to say.
Martin Philbert: Well, no, I think your question was specifically towards the technology development.
Andrew Maynard: Sure, yes.
Martin Philbert: But then you get the deployment of the material, right. So you've made a beautiful car with lots of nice nanotech in the body and in the paint and the films and so on. And there's nanomaterials coming out of the tail pipe or the battery is made of nanomaterials. And it goes through a life cycle and there's the intended life cycle and then there is the stuff that we really do.
Andrew Maynard: Yes, yes.
[ Laughter ]
Martin Philbert: Right. And so, you almost have to get to this even more complex analysis of all sources, because it's not like we live in a zero background for many things on the chemical side. Arsenic, if you live in the southwest quadrant of this nation, there is arsenic in the water. It's natural. If you breathe air, there are nanomaterials. So then you start getting into these really complex issues of source apportionment, and what's the permitted level of this sort of nanomaterial that is deliberately released or accidentally released or generated through some natural process.
Andrew Maynard: So that's actually very helpful. So it's a call for more sophisticated approaches to how we ensure stuff is safe. Again, in that conversation, do you think there is any value in having that regulatory conversation about nanotechnology or nanomaterials, or do we have to be more sophisticated about the agents that are likely to cause harm in ways that we're not capturing at the moment?
Martin Philbert: So the sad reality is that, for many classes of nanomaterials, we have no reliable tool that will help us determine where the nanomaterial comes from.
Andrew Maynard: Right.
Martin Philbert: So from a regulatory standpoint, actually saying you've been exposed to X milligrams of this engineered nanomaterial, and regulating it, and enforcing that regulation, that is impossible.
Andrew Maynard: But even a step back there, because there are so many different types of nanomaterial, as Mark pointed out. You look at soap, it's got a nanostructure. You look at medieval stained glass windows, they've got nanomaterials in them. If we even begin to ask the question about how you regulate nanomaterials, are we creating a false question that has no answer?
Martin Philbert: Well, you're not exposed to the gold in the glass unless you suck on it. For a millennium.
Andrew Maynard: Right. So having established that, let me find the best way to put this: forget about the term nanotechnology. Just look at engineering stuff at a very fine scale, the nanoscale, doing neat stuff at a very small scale. Are there technologies coming along that you think particularly challenge the way we regulate at the moment, and that are gonna take some tough thinking to ensure the safety of those products?
Mark Banaszak Holl: I think that will be true in some of the biologics that can be made.
Andrew Maynard: Right.
Mark Banaszak Holl: Because we're working on some things which actually function in a much more sophisticated way with our own biological machinery than our current class of drugs. I mean, we want that to happen, because then they'll work better. But everything that you can do really well could have bad implications as well. And so yeah, there will be new challenges and new hazards that arise. And the biological framework is certainly one of them, right.
Andrew Maynard: Right.
Martin Philbert: I'd say yes on the drug development FDA side and on the occupational side, because you know what the exposures are. And you can attribute them very, very precisely; you can align them with what the person is exposed to in the workplace or deliberately dosed with. The environmental arena is frankly where there's gonna be, I guess, a lot of difficulty.
Andrew Maynard: Right, and let's talk a little bit more about potential risks. Over the last 10 years there has been a lot of concern about potential new mechanisms of action with nanomaterials. People getting very excited about materials previously assumed to be benign, like titanium dioxide, and cerium oxide to a certain extent. People getting very excited every time the word quantum is mentioned. And yet something you said, Martin, is that the body has only got so many ways of responding to a new agent. Are we over-hyping the possibility of these new materials causing harm, or are there specific areas where we need to be vigilant to make sure that there aren't things on the market that could cause harm in unanticipated ways?
Martin Philbert: So again, on the acute side, the biological response is pretty well set. It's gonna be inflammatory, or it's gonna be a necrotic or apoptotic type lesion, to get into jargon. And you'll see it very quickly. You'll interrupt the excitability of the heart, you'll induce some sort of inflammation in the lung and you won't be able to breathe properly, or you'll hemorrhage. And it will be very easy for the physician to see that something is wrong. Now whether or not they'll be able to reverse the trajectory of the process is gonna be an interesting one. But we'll know that from testing the materials in the lab. And we'll understand that it's chemically reactive or physically reactive and so on. To my knowledge, there is no connection yet between the quantum mechanical properties of a material and toxicity.
Andrew Maynard: So that in itself I think is important, because that seems to have driven a lot of the conversations in recent years, and a lot of the fears, and it actually brings me back to the point that I do want to get back to, which is public perception. But again, before we get there, has anybody else got any specific questions they want to throw in?
Peter Jacobson: Hi. Peter Jacobson, School of Public Health. Going, you know, back to the regulatory issue, I want to ask whether what we're seeing is the erosion of the precautionary principle, which is essentially a default option that says unless you can prove that there is no harm, you don't move forward with the technology. Martin, you spoke about an equilibrium; Shobita and Andrew, about whether or not there's a constraint on innovation. And it seems to me, if I'm hearing this correctly, that one of the constraints, the precautionary principle, is eroding. Is that accurate from your perspective?
Shobita Parthasarathy: I'm not--the word eroding is interesting to me in that question 'cause I'm not sure--
Peter Jacobson: We ever had it?
Shobita Parthasarathy: It's there, right.
[ Laughter ]
Shobita Parthasarathy: Then perhaps I'm too much of a cynic. But one of the things that I think is interesting, and I've noticed it a few times in the exchange, is that perhaps, as a result of our scientific understandings, our perception of what the problems are has changed, right. So we listed a few: the acute, the chronic, endocrine disruption, and one that you didn't mention, which is, you know, sort of what are the implications for the environment of the things that we ingest, right. Whether it's drugs or foods or whatever, right. And what those three examples, and I can come up with more, suggest is that the current regulatory frameworks we have are built around different understandings of the problems. And part of our problem now is that, in part because of our evolving scientific understandings, we don't have a regulatory place to think about that. And that complicates our ability to really engage in any kind of precaution, because the circumstances that we wanna bring into the precautionary principle, or into that weighing process, are different than the way we've traditionally thought about risks and benefits.
Andrew Maynard: So that is a nice segue into talking a little bit more about public perception, because if you look at the various different ways the precautionary principle is applied, at some point there's gotta be a value judgment in terms of working out whether something is important enough and potentially dangerous enough to take action or to result in inaction. And part of that comes down to how people within society, I hate to use the word public but look at civil society, respond to a new challenge and react to it. Especially as we're seeing a society which is governed more by individual responses and self-organized group responses rather than hierarchical responses. So, feeding that in turn into public perception: how do we deal with how people within civil society perceive both the benefits and the risks of something like nanotechnology? And how do we feed those perceptions into the decision-making process so we end up with something positive coming out of the other end rather than something destructive?
Mark Banaszak Holl: I think that takes a great deal of patience.
[ Laughter ]
Mark Banaszak Holl: You know, I mean patience on the part of the scientists and engineers, patience on the part of the policy makers, and patience on the part of your broader public. Keep in mind that in 1900 only 10 percent of the people in this country even graduated from high school. By 1950, that was about 50 percent. Around the '60s it peaked at 70 percent, and it's stayed there ever since; we're still at about 70 percent. So at best, 70 percent of our population has a high school understanding of science and technology. Only 30 percent of our population has a college-level understanding, meaning that they have a degree in a science area. And so it's a small part.
Andrew Maynard: But, but--
Shobita Parthasarathy: So, I guess I would just add to that. I mean, I think that there are at least two dimensions to this. So there's the public education part that I think Mark was alluding to. But I think another part is the public engagement dimension, which is that, you know, you don't have to understand science to have instincts about a particular emerging technology that are valuable in at least a few ways, right. First, they're valuable to policy makers or to producers and innovators because they can provide information about the kinds of technologies that are going to be seen as socially or politically problematic. Second, those instincts can also be useful in helping innovators and policy makers identify potentially new markets. So, you know, you can actually enhance innovation. And third, they can often help us anticipate what the technology will do when it actually goes out into society: what kinds of systems, what kinds of structures, and what kinds of processes. I just think it's important to keep in mind that while the public may not have scientific knowledge, and there are ways in which that's problematic and that we should deal with that, the public invariably has other kinds of knowledge that are relevant to the process of thinking about emerging technology and emerging technology--
Mark Banaszak Holl: This is why I started with the analogy I started with: our instincts fail us here. And so this is exactly the problem. If we're working on instincts, we've got a huge problem. Because as I said, if you took a 4-mile stride instead of a 2-mile stride, you would be in Detroit in 10 steps, and that would completely change how you interacted with the area around you.
Andrew Maynard: But are there different…
Mark Banaszak Holl: And so your instinct about time and space is wrong for something which is a million to a billion times different in size scale from your perception.
Andrew Maynard: But are there different contexts here? I mean, there's obviously the instinctual approach and understanding of the science and technology which as you said can be completely wrong. But then there's also that the instinctive understanding of how a new tool can be used by a person.
Mark Banaszak Holl: I don't think we have instincts about cell phones. I think our instincts were put in long before those things came along. I think our instincts fail us mightily in these--
Shobita Parthasarathy: Well, it depends on what you're talking about when you're talking about instincts. I mean, you know, I'm not talking about a survey instrument where you find out what's your immediate reaction. What I think is important to keep in mind is that there has been now, I'd say, over 20 years of pretty robust research into how to do public engagement in science and technology in pretty sophisticated ways. And those are not based on, you know, let's find out what your instincts are and let's judge whether or not they're right or wrong. They're about actually engaging in what is ultimately a longer-term engagement of different lengths. It's true, you know, your initial interaction with a cell phone might give you one set of information. But perhaps a longer interaction with a cell phone might give you a greater understanding, a greater perspective on how it might be used.
Andrew Maynard: Of course, the interesting thing about cell phones is that the engineers that created them created them for one purpose, and you're now seeing innovation within society, with people using them for very different purposes.
Martin Philbert: But I think that part of this is the Heisenberg principle, right. So by just talking about it we perturb the system and raise the question to a higher level of salience than people would ordinarily give it. I'm sure that my niece, who texts at the rate of, you know, probably a satellite, is not thinking, oh my God, there are nanomaterials in here and they may be hazardous to my health. But she chooses very carefully the shampoos that she uses, because that has been elevated, you know, water quality and sustainability and so on. And so I think a great deal of what we do is based on instinct and is raised to a level of importance because we know about it. Meanwhile, we're doing things that are far more destructive to our health and that of the planet, because nobody has ever raised the question.
Andrew Maynard: So that seems to be an incredibly important point, and unfortunately you've raised it 2 minutes before we're supposed to finish.
[ Laughter ]
Martin Philbert: That's what I do.
[ Laughter ]
Andrew Maynard: You did it beautifully. But let's see if I can tie this in. So I wanted a closing couple of thoughts from all of you. And I'm intrigued by this idea that the questions we ask dictate the course of our thoughts, so we end up focusing on minutiae which may not be important while ignoring the big stuff that may be. But if we can try and pull this back to nanoscale science and engineering, and public health in particular, I'd love just a couple of sentences to finish from each of you about what excites you most about what we can do with these new areas of innovation as a society, and what worries you most about how they might impact society, specifically in the context of public health: how we can improve or degrade people's health. Shobita, I'm gonna start with you.
Shobita Parthasarathy: I'm gonna--I need a minute to think about it.
Andrew Maynard: Right, okay. Okay, well, Martin, you're usually not lost for words.
Martin Philbert: Thank you, I think.
[ Laughter ]
Martin Philbert: My biggest fear and cause for optimism is the fact that there's no such thing as nanotechnology. There are nanotechnologies, and they are as disparate as the regular chemical technologies. There are things that are films, there are things that are particles, there are smart targeted polymers, there are all kinds of things. There's circuitry and all sorts of cool stuff. My fear is that by lumping all of these technologies together and considering them as one entity, when something bad happens, and I don't know when it will be or what it will be, but when something bad happens, all of these nanotechnologies will be tarred with the same brush, and we will lose out on a large swath of benefit to society.
Andrew Maynard: Great. Mark.
Mark Banaszak Holl: You know, we are all wonderful nanomachines, and I hope we don't ban ourselves.
[ Laughter ]
Mark Banaszak Holl: Or maybe we should. I have to think about that. But you know, we have an incredible opportunity to learn so much more about ourselves, so much more about the world around us, and to me that's the fundamental and most exciting part of all this. We also have a neat chance to make new human inventions, and that's new and exciting. And as always when we make new human inventions, we'll figure out good and bad ways to use them. And some of them will have consequences we didn't intend. And that's the concern.
Andrew Maynard: Okay, great.
Shobita Parthasarathy: So I think, from my perspective, I'm perhaps most concerned that we'll continue as we are, which is that we'll engage in an approach to governance that's, for better or worse, based on different interests, both those hyping it and those, you know, who are extremely concerned about it. And that we won't develop, won't take advantage of, what we could, which is a more reasoned approach to thinking about how we should go about engaging with and governing emerging technologies more broadly. And, you know, that it'll continue to be as it has been in the past with other emerging technologies: a sort of adversarial butting of heads, with the most powerful the victor.
Andrew Maynard: Right, thank you very much. Well, we have to close there. If I dare to sum things up, and I can't really sum up such a broad-ranging conversation, I'm left with a sense that, looking forward, there are incredible opportunities to improve the quality of people's lives through what you might call engineering and science and technology at a very fine scale. But society also has a number of challenges: to make sure that the technology doesn't get ahead of our ability to manage it in ways that don't lead to adverse effects on public health. If we can find that balance, we can really do major things in pushing the technology forward. So with that, thank you very much, all three of you.