After two sessions devoted to providing background and context on the topic of chemical risks and their assessment, the workshop’s third session examined some of the seminal work in the field that has been done or is in progress. In particular, the third session’s speakers described two recent major reports on the subject, Science and Decisions: Advancing Risk Assessment (NRC, 2009) and Exposure Science in the 21st Century: A Vision and a Strategy (NRC, 2012); a conference held by the U.S. Environmental Protection Agency (EPA) in 2011 and a subsequent report on the same topic titled Next Generation of Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (EPA, 2013); and a current National Research Council (NRC) project, the Design and Evaluation of Safer Chemical Substitutions: A Framework to Inform Government and Industry Decisions, which is intended to produce a consensus report to be released in the fall of 2014.
In the first presentation John Balbus, Senior Advisor for Public Health at the National Institute for Environmental Health Sciences, described the 2009 NRC report Science and Decisions: Advancing Risk Assessment. He served as a member of the committee that wrote the report.
The subject of that report—risk assessment—is just one part of the topic of the current workshop, he noted. “The regulation of chemicals involves a lot of different processes, some of which are steps in risk
assessment,” he said. “A lot of what goes on under TSCA [the Toxic Substances Control Act of 1976] is really just hazard identification because we don’t have the kind of robust information that is required in a risk assessment.”
So one question that must be asked is, How does one gather exposure assessment information that can help in the prioritization of regulatory decisions when there is not enough information for a full risk assessment? “I think that this report has some thoughts on that that are relevant here,” he said. Of course, he added, there are also situations in which there is plenty of information for making such regulatory decisions, and it is important to recognize the difference in approach when dealing with an information-poor environment versus an information-rich environment.
The charge to the committee that produced the 2009 report was to
- develop scientific and technical recommendations for improving risk analysis approaches used by the EPA, including practical improvements that the EPA could make in the near term (2–5 years) and in the longer term (10–20 years) (NRC, 2009) and
- focus primarily on human health risk assessment, but also consider broad implications of findings and recommendations for ecologic risk analysis (NRC, 2009).
Balbus emphasized that the committee was looking not just at short-term, very practical recommendations but also at long-term, aspirational recommendations. He noted that the committee asked, if we could really do this the way we wanted to, what would be the goal to set and how would we advance the science to get there?
The members of the committee also decided early on that they would look at two different elements of risk assessment. “One was the technical side, the nuts and bolts, the science,” he said. “How do you conduct the stages of the risk assessment, and how can we improve that technical conduct?” The second element was the decision-making side, with a focus on how to increase the utility of risk assessment. “How can we alter the framework and the ways that we think about risk assessment, and how do we do risk assessment in a way that improves its utility in the kind of decisions that have to be made?”
Balbus presented a list of the key messages from the report that he would further explain: (1) enhanced framework, (2) formative focus, (3) four steps still core, (4) matching analysis to decisions, (5) clearer estimates of population risk, (6) advancing cumulative assessments, and (7) people and capacity building. These were the official take-home messages determined by the committee, Balbus noted. “A lot of the discussion was about the framework for decision making and how risk assessment plays a role in it,” he said, referring to the first bullet point. “‘Formative focus’ means there was a lot of focus on the setup, the questions asked at the beginning.” In particular, “formative” refers to the process of forming the questions for the risk assessment. The third bullet point refers to the fact that the four steps from Risk Assessment in the Federal Government: Managing the Process (NRC, 1983), known as the Red Book, are still accepted as the key to the risk-assessment process.

1 Toxic Substances Control Act of 1976, Public Law 94-469, 94th Congress.
“There was a lot of discussion about ‘right-sizing’ risk assessment because risk assessment can get very complex, very involved, very expensive, and very lengthy,” he said. “How do we do that at the right times and the right places but at the same time figure out ways to do good, but less involved, risk assessments where the decisions are better served by that kind of an analysis?” The final bullet point referred to the committee’s belief that there is a need for awareness raising and training, both among risk assessors and risk managers, as well as a need for capacity building.
Balbus then returned to the fifth and sixth bullet points to deal with them in more detail, as he said he felt they were the most relevant to the topics of the workshop.
To obtain clearer estimates of population risk—the fifth bullet point—it will be necessary to deal more effectively with uncertainty and variability in those estimates. One of the report’s recommendations was that the EPA “should encourage risk assessments to characterize and communicate uncertainty and variability in all key computational steps—for example, exposure assessment and dose–response assessment.” In particular, Balbus said, the report recommended that “uncertainty and variability analysis should be planned and managed to reflect the needs for comparative evaluation of the risk management options.”
There was a great deal of discussion in the committee concerning how to determine the right amount of uncertainty analysis for the particular kind of assessment being undertaken, Balbus said. The recommendation was that, in the short term, the EPA “should adopt a ‘tiered’ approach for selecting the level of detail to be used in the uncertainty and variability assessments, and this should be made explicit in the planning stage.”
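The tiered idea can be illustrated with a short sketch: a screening tier might carry single point estimates through the calculation, while a higher tier propagates full distributions for both population variability in exposure and scientific uncertainty in potency. The following is a minimal Monte Carlo sketch; the lognormal parameters are illustrative assumptions, not values from the report:

```python
import random

random.seed(42)  # reproducible illustration

def simulate_risk(n=100_000):
    """Propagate variability (exposure) and uncertainty (potency)
    through a linear low-dose risk model via Monte Carlo sampling."""
    risks = []
    for _ in range(n):
        # Population variability: daily intake, mg/kg-day (illustrative lognormal)
        exposure = random.lognormvariate(mu=-4.0, sigma=0.8)
        # Scientific uncertainty: potency, risk per mg/kg-day (illustrative lognormal)
        potency = random.lognormvariate(mu=-2.3, sigma=0.5)
        risks.append(exposure * potency)
    return sorted(risks)

risks = simulate_risk()
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
print(f"median risk ~{median:.1e}; 95th percentile ~{p95:.1e}")
```

Reporting the spread between the median and an upper percentile, rather than a single number, is the kind of clearer characterization of uncertainty and variability the recommendation calls for; a lower-tier screening analysis would simply multiply two point estimates.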
One of the notable points to come out of the committee’s discussions, Balbus said, was that although there is generally a great deal of attention paid to the uncertainty and variability in the toxicology and dose response of a particular substance, much less attention is paid to the uncertainty and variability on the exposure side.
A second area that received a lot of discussion was the selection and use of defaults, he said. “The overarching theme here was to call for much more transparency and much more explicit discussion of defaults and their basis.” But a secondary theme that emerged was all the implicit defaults that are used, sometimes unconsciously, in risk assessment. “There are a lot of assumptions that are inherent in the way we do risk assessment that are never called out as being defaults,” he said. “A lot of the discussion in the committee was about recognizing these and identifying these and maybe thinking about them a little bit differently.”
A key default appears in site risk assessments where there are multiple different chemicals to which people are exposed. “If a particular chemical doesn’t have sufficient information, it is by default assumed to have zero risk,” Balbus said. “This isn’t considered in the risk assessment. You do the risk assessment for the chemicals that you know about, and anything else that is in there [is assumed to have] zero risk. It may be a good default, or maybe there should be a default that if you don’t know anything, it has some kind of average risk [for that particular class of chemicals].”
A second implicit default is that carcinogens have a linear dose response and, furthermore, that they do not have any human individual variability. “These are the kinds of things that were brought up by the committee,” he said. “I think this has some relevance to the way we frame risk assessments and even do some of the hazard identification.”
Another issue that is relevant to risk assessment, Balbus said, is the criteria used to decide when not to use a default assumption. “The committee determined that EPA, for the most part, has not yet published clear, general guidance on what level of evidence is needed to justify use of agent-specific data and not resort to a default.” Among those who carry out risk assessments, “there may be different criteria used for when you depart from a default assumption,” he said. “So there was a call to provide guidance and have transparency and clarity about this.” The committee noted that there are also a number of defaults that are ingrained in the EPA risk-assessment practice but that are absent from its risk-assessment guidelines, Balbus said. With respect to the selection and use of defaults, the committee made three recommendations in the report:
- “EPA should continue and expand use of the best, most current science to support and revise default assumptions.
- EPA should work toward the development of explicitly stated defaults to take the place of implicit defaults.
- EPA should develop clear, general standards for the level of evidence needed to justify the use of alternative assumptions in place of defaults” (NRC, 2009).
One of the most controversial parts of the committee’s work was its recommendations for how the EPA should unify its approach to dose-response assessment for carcinogens and noncarcinogenic substances. The committee thought that the EPA’s treatment of noncancer and low-dose, nonlinear cancer end points is a major step in an overall strategy to harmonize cancer and noncancer approaches, Balbus said, but the committee also found that there are scientific and operational limitations to this approach.
In particular, the committee focused on the issue of thresholds. While there may be a “clear red-line threshold” for a given effect and a given substance when considering a particular individual, “that threshold disappears when you consider it at a population level,” Balbus said. “The committee did a lot of thinking and a lot of deliberation on what might be the approach by which you would unify dose response for carcinogens and noncarcinogens to take into account this population-scale lack of a clear, defined threshold.” The committee’s approach to unification involved “investigating this interindividual variability in susceptibility, looking at population susceptibility and at the influence of underlying disease on vulnerability and looking at other kinds of genetic susceptibility factors, quantifying that, and then taking a look at mode-of-action information and other toxicological information.” The approach that the committee developed is laid out in Figure 4-1.
The first step, Balbus explained, is to look at the toxicological data and at the end points and the nature of those end points and to understand, to the extent possible, the biological mechanisms of those end points. “The second tier is where you start looking at mode of action, but at the same time you bring in consideration of vulnerability factors and the distribution of those vulnerability factors and the importance of them, as well as the variability in background exposure. Then you can use informed expert judgment to decide upon the right conceptual model. In many cases the committee believed that this would lead to a more linear approach to most impacts without clear bright-line thresholds.”
FIGURE 4-1 New unified process for selecting approach and methods for dose–response assessment for cancer and noncancer end points.
NOTE: MOA = mode of action.
SOURCE: NRC, 2009.
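The population-level point can be made concrete with a toy calculation: even if every individual has a sharp exposure threshold, variation in those thresholds across a population yields a smooth dose–response curve with no bright line below which the response is exactly zero. A minimal sketch, with purely illustrative lognormal thresholds:

```python
import random

random.seed(0)

# Each individual responds only above a sharp personal threshold, but the
# thresholds themselves vary across the population (illustrative lognormal).
N = 200_000
thresholds = [random.lognormvariate(0.0, 1.0) for _ in range(N)]

def fraction_responding(dose):
    """Fraction of the population whose personal threshold the dose exceeds."""
    return sum(t < dose for t in thresholds) / N

# The response shrinks smoothly as dose falls but never hits a clean zero
for dose in (0.05, 0.1, 0.2, 0.5, 1.0):
    print(f"dose {dose:4}: fraction responding {fraction_responding(dose):.4f}")
```

At the median threshold (dose 1.0 here) half the population responds; at one-twentieth of it a small but typically nonzero fraction still does, which is the population-scale lack of a defined threshold that the committee’s unified approach is meant to capture.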
Another important issue that the committee focused on in the report is cumulative risk assessment. There is a growing realization among environmental health professionals, Balbus said, that it is not sufficient to examine single-agent exposures in a vacuum. “You have to look at exposures in context—not only in the context of co-exposures with other chemicals, but also in a context of multiple nonchemical stressors, whether that is psychological stress, nutritional stress, or socioeconomic stress.” The committee recommended that the EPA should “draw on other approaches, such as those from ecological risk assessment and social epidemiology, to incorporate interactions between chemical and nonchemical stressors in assessments.” In the short term, the EPA should “develop databases and default approaches to allow for incorporation of key nonchemical stressors in cumulative risk assessments in the absence of population-specific data, considering exposure patterns, contributions to relevant background processes, and interactions with chemical stressors.”
In the longer term, the agency should “invest in research programs related to interactions between chemical and nonchemical stressors, including epidemiologic investigations and physiologically based pharmacokinetic modeling.”
In summary, Balbus reiterated that there were two different aspects to the committee’s recommendations. The first was the technical side—various ways to improve the validity and usefulness of the assessments. “The other piece was embedding risk assessment in a decision framework that considers the question that has to be answered and the choices that have to be made,” he said. “For example, in green chemistry if you have a multitude of different kinds of chemicals that you could be using other than your chemical of concern, that would be a very different risk assessment than if you only have that one chemical to serve a particular function.” Thus, the committee recommended that the EPA “adopt a framework for risk-based decision making that embeds the Red Book risk-assessment paradigm into a process with initial problem formulation and scoping, up-front identification of risk-management options, and use of risk assessment to discriminate among these options.”
The second speaker was Paul Gilman, senior vice president and chief sustainability officer at Covanta. He served on the National Academy of Sciences (NAS) committee that produced the 2012 report Exposure Science in the 21st Century: A Vision and a Strategy (NRC, 2012).
Gilman began his presentation with a question: “If you are a toxicologist or a risk assessor or an epidemiologist, you have an appreciation for exposure science. In fact you might even say some of your best friends are exposure scientists. But do you ever really invite them to the buffet?” Gilman said that the purpose of his talk would be to argue that “they should be at the buffet table and be a critical component.”
Before discussing the 2012 report, Gilman offered an anecdote to emphasize the importance of exposure science. In the year following the September 11, 2001, attacks, a group at the EPA put together a screening tool for examining some of the potential hazards the United States faced. The group took a unique approach in that a significant component of the screening tool considered exposure and exposure scenarios. “On the basis of that work they ran tens of thousands of scenarios,” he said. They
examined a number of scenarios on a classified list that was circulating in government and came up with a modified list that was dramatically different. “Using exposure they could strike lines through a number of the scenarios that were on that list and introduce new ones that probably merited higher consideration and preparation,” Gilman said. “It was sufficiently novel that all of a sudden the nascent Department of Homeland Security wanted to understand it. People at the White House wanted to understand it. For many of them who were from the toxicology side of things, it was eye opening. That was in a way my own grounding in the importance of exposure to the consideration of significant real-world problems.”
The goal of the study, Gilman said, was to provide guidance on how to use exposure science in the regulatory arena. As part of that the committee developed a conceptual framework showing the core elements in exposure science and how they are linked (see Figure 4-2).
One of the main focuses of the study was the various scientific and technological advances that have emerged in recent years that can be applied to exposure science. These include geographic information technology, ubiquitous sensing techniques, the use of biomonitoring for assessing internal exposures, and modeling and information-management tools. Another tool that is still emerging is the use of crowd-sourced information on the exposure side. To illustrate the roles of these new tools in exposure science, the committee created a modified conceptual framework (see Figure 4-3).
FIGURE 4-2 Core elements of exposure science.
SOURCE: NRC, 2012.
FIGURE 4-3 Expanded view of the core elements of exposure science.
SOURCE: Adapted from NRC, 2012.
These emerging technologies, combined with a growing appreciation for the power of exposure analysis, make exposure science “a place that we might think of as an emerging frontier and one that should be focused on,” Gilman said. The committee laid out a vision of exposure science that is moving from the historical focus on discrete exposures to a new, broader focus that considers exposures
- “from source to dose;
- on multiple levels of integration (including time, space, and biologic scale);
- to multiple stressors; and
- scaled from molecular systems to individuals, populations, and ecosystems” (NRC, 2012).
To capture that new, broader conception of exposure, the committee relied on the notion of an “eco-exposome.” The key idea behind this notion, Gilman said, is that exposure can no longer be thought of in terms of a single stressor occurring at a single point in time. Instead exposure should be conceived of in developmental terms. “It changes through time. It is influenced by all of the other stressors affecting the organism.”
It is a powerful, all-encompassing vision of exposure, Gilman said. “The good news is in thinking about it you have to know everything about everything all of the time.” On the other hand, it is a vision that seems to require quite a lot to fulfill. “The bad news is you have to know everything about everything all of the time.” And so in an era of increasingly tighter budgets for research, which are forcing agencies to be increasingly careful in prioritizing the research that they fund, this conception of exposure science can make the field difficult to “sell” to funders, Gilman said. “People would say that it is just too hard. It is too much. It costs too much. It is a 20-year program of trying to know everything about everything for an organism and all of the organisms that affected it because certainly we were stressing putting people in an ecological context as well.”
With those difficulties in mind, the committee identified two overarching research needs in the area of exposure science:
- “Characterizing exposures quickly and cost effectively at multiple levels of integration—including time, space, and biologic scales—and for multiple and cumulative stressors, and
- Scaling up methods and techniques to detect exposure in large human and ecologic populations of concern” (NRC, 2012).
“This is again the notion of needing to know everything about everything for all time when considering a stressor and the stressed organism or organ or cell,” Gilman said. “But while the advent of new sensing technologies and approaches to looking at information can allow this, you have to try to come to grips with it in a way that you can make progress and not just wait until you know everything about everything all of the time.”
The committee’s strategy for meeting these research needs was to focus on the urgent needs of the day and to use those urgent needs to develop the tools and the infrastructure for carrying out the research. That infrastructure includes the educational infrastructure that will be needed to train researchers in new approaches and to teach them how to integrate these different tools to answer specific questions. “Then using this infrastructure and using these tools, you can begin to look to those environmental health–related hypotheses that are the more general questions,” Gilman said. “They are the questions that will lead us to a point when we can look at exposure science as a predictive science.”
One thing that could help push that strategy along much faster would be improved collaboration among the various institutions that have
knowledge and capabilities relevant to exposure science, he said. “There are so many places in our federal research agencies, in our private research institutions, and in our public research institutions that have information that is … in one place and not shared. It is certainly not integrated.” The 2012 report “is rich with ideas about experiments that could be done, monitoring programs that could be done on very large scales and across very diverse places, looking at all organisms through all stages of their life with all of the stressors,” he said.
The next speaker, Ila Cote, a senior science advisor for the EPA’s National Center for Environmental Assessment, described a conference, Advancing the Next Generation of Risk Assessment (EPA, 2011a), and a subsequent report on the same topic (EPA, 2013).
The Next Generation of Risk Assessment, or NexGen for short, is an effort that has been under way at the EPA for almost 4 years. NexGen is examining whether, and how, recent advances in molecular, computational, and systems biology should be used to better inform risk assessment, Cote said. The EPA is joined in the effort by a number of partners, including the National Institute of Environmental Health Sciences, the National Toxicology Program, the Department of Defense’s Army Corps of Engineers, the Food and Drug Administration’s National Center for Toxicological Research, the National Institute for Occupational Safety and Health, Health Canada, the California Environmental Protection Agency, the European Chemicals Agency, and the European Commission’s Joint Research Centre.
The NexGen effort started with a review of the recommendations in several earlier reports, including Toxicity Testing in the 21st Century (NRC, 2007), Science and Decisions (NRC, 2009), and Strategic Plan for the Future of Toxicity Testing and Risk Assessment at EPA (EPA, 2009), as well as information presented in workshops of the NRC’s Standing Committee on Emerging Sciences for Environmental Health Decisions. One of the recommendations common to the reports was to
develop case studies or prototypes that provide concrete examples of new types of risk assessments and “engender movement from strategy to practical application of new assessment approaches,” Cote said. So the development of prototype risk assessments was an important focus of the project. Seven different prototypes illustrating the use of different types of molecular, computational, and systems biology data were developed.
Several other activities were undertaken in preparation for prototype development:
- Meetings were held with decision makers to learn about their information needs that might be met by new risk-assessment approaches. Matching analysis to decisions is one of the key messages from the NAS, as noted earlier by Balbus. The risk-assessment process can be more efficient if “fit-for-purpose” or “right-sized” assessments are developed, she said. Hence, from discussions with decision makers and reviews of the available data, prototype concepts were developed that could support different decision contexts.
- A draft strategic framework articulating guiding principles for NexGen prototype development was also prepared.
- A meeting of experts was held in November 2010 to discuss the framework and help refine the prototype concepts (EPA, 2011b).
- A public meeting was held in February 2011 to communicate the intended process and to take comment on the plan from a diverse group of stakeholders, as well as to gather advice on how to communicate during the process and the results of the effort (EPA, 2011a). “To some extent the final report is a fulfillment of promises that we made at this public dialogue conference,” she said.
One of the things that came out of the public dialogue conference was a better understanding of the kinds of information that stakeholders want, Cote said. “[EPA] decided that there was a need for a series of technical papers that were primarily targeted to the scientific community. There were approximately 40 papers that either have already been published or will soon be published that are products of NexGen.” The project also produced a summary report of the technical papers and an executive summary targeted at risk managers and the lay public. The
draft report has completed external peer review and public comment and will be final in spring 2014.4
For illustrative purposes, Cote said, the EPA tried to look broadly at three kinds of decision-making situations and to develop three categories of assessments that could address those needs. The three categories are major-scope assessments, limited-scope assessments, and prioritization and screening. As one moves from the first category to the last, Cote said, the number of chemicals that a decision maker has to consider grows sharply, from a few hundred to thousands to tens of thousands. Concomitantly, the amount of traditional data available to support decisions declines as the number of chemicals increases.
An example of major-scope assessments, she explained, is something the agency is familiar with—the EPA’s Integrated Risk Information System (IRIS) assessments or the Integrated Science Assessments—“where one is dealing with relatively few chemicals with lots and lots of data.” The EPA uses this type of situation primarily to develop proof-of-concept prototypes and secondarily to explore how already robust traditional risk assessments might be better informed by new types of data. An example of a limited-scope assessment might be something like dealing with a Superfund site cleanup or an emergency response, where a risk manager might have to consider a few thousand chemicals. Finally, an example of prioritization and screening would be to rank “potentially tens of thousands of chemicals” in the environment for additional research, testing, or assessment. A second example of prioritization and screening is choosing safer and more sustainable chemicals for use in society.
Cote then focused the remainder of her talk on the first of the three decision-context/assessment categories and described the major-scope assessment prototypes that have been completed. These prototypes are the proof-of-concept assessments that focused on “reverse engineering” from known public health risks to NexGen-type risk assessments, thus verifying the use of new approaches by comparison with robust traditional data. The first three prototypes examined the connections among benzene, other leukemogens, and leukemia; between ozone and lung inflammation; and between polycyclic aromatic hydrocarbons and lung cancer. These are all areas in which a great deal is already known, Cote noted, and that was exactly the point—to use robust traditional datasets to verify how to best use new data types. These proof-of-concept prototypes focused on the evaluation of in vivo human exposures (molecular epidemiology or clinical) at environmental concentrations where traditional and molecular data were collected concomitantly and exposure–dose relationships were measured. “There are not many datasets that are like that,” she said.

4 The public comment deadline was extended to January 13, 2014, and the final report will likely be available in fall 2014.
Cote noted that what the NexGen project intended to do was iterate back and forth between the new types of data, such as omic data and cell biology data, and the traditional data to understand what could and could not be done, sort out what information was most valuable, and begin to develop decision rules that would help the EPA use new types of data consistently and appropriately to get the “right answer.” Several important points were illustrated by these major assessment prototypes, said Cote.
- The molecular epidemiology and molecular clinical studies demonstrated that it is possible to identify molecular patterns that are predictive of specific hazards (i.e., disorder and disease) and exposure–dose responses, Cote said.
- Chemicals that induce the same health outcomes appeared to share mechanistic commonalities. This is important, she said, because identifying underlying molecular patterns of disease could help characterize chemicals for which few traditional data are available but for which molecular mechanism or adverse outcome pathways are characterized. If, for example, a chemical is known, through in vitro data, to similarly affect the same genes and pathways affected by chemicals known to cause leukemia, it would be reasonable to assume that the relatively unstudied chemical might increase the risks for leukemia as well. Thus, understanding disease mechanisms or key steps in mechanisms could help the EPA evaluate large numbers of chemicals without traditional data.
- Chemically induced mechanisms of disease appear to have many commonalities with naturally occurring disease or diseases of unknown origins. This will allow the EPA to utilize the large amount of mechanistic information on disease that has been developed for clinical reasons by the National Institutes of Health and others to help understand environmentally induced alterations in health.
- Disease mechanisms do not parse cleanly into cancer and noncancer mechanisms. For example, pathways involved in altered
immune responses, inflammation, cell repair, and apoptosis contribute to cancer and noncancer health outcomes. Thus, new methodologies will require harmonized approaches to cancer and noncancer end points.
- With new higher-throughput methodologies, it is possible to collect experimental data over a wide variety of potential exposure concentrations and hence refine or replace inferences about low exposure–response relationships with experimental data.
- It is also easy to see how information on the impacts of various chemicals on the same mechanistic pathways could be used to evaluate the cumulative risk of mixtures of chemicals, Cote said. “Obviously, chemicals that interact with the same pathways are more likely to interact in terms of cumulative risk than chemicals that don’t interact in the same pathways.” Chemicals and nonchemical stressors could be evaluated via their pathway interactions.
- Lastly, in the prototypes it was clear that variations in human genes can alter responses in subpopulations. New approaches can help the EPA better characterize variability in overall population responses to chemicals, as well as less sensitive and more sensitive subpopulations.
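The read-across reasoning described above—inferring hazard for a data-poor chemical from mechanistic similarity to well-studied chemicals—can be sketched as a simple comparison of perturbed-pathway sets. The chemical labels and pathway annotations below are hypothetical placeholders, not data from the NexGen prototypes:

```python
# Hypothetical pathway annotations for well-studied chemicals (illustrative only)
known_leukemogens = {
    "benzene-like reference": {"apoptosis", "myeloid differentiation",
                               "oxidative stress", "AHR signaling"},
    "reference chemical B":   {"apoptosis", "myeloid differentiation",
                               "DNA damage response"},
}
# In vitro pathway profile of a chemical with few traditional data
candidate = {"apoptosis", "myeloid differentiation", "oxidative stress"}

def jaccard(a, b):
    """Overlap of two pathway sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

for name, pathways in known_leukemogens.items():
    print(f"{name}: pathway overlap {jaccard(candidate, pathways):.2f}")
```

A high overlap with several known leukemogens would flag the candidate for the follow-up research, testing, or assessment described under prioritization and screening; it is a screening signal, not a risk estimate.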
Cote closed by discussing how what was learned from the NexGen prototypes will inform the EPA’s efforts to improve risk assessment. Of the key issues raised in Toxicity Testing in the 21st Century and Science and Decisions, the NexGen report (EPA, 2013) discusses how the agency might proceed on a number of them, including matching assessments to decision context, harmonization of cancer and noncancer approaches, better characterization of population variability, cumulative risks from mixtures and other environmental stressors, and improved assessment of responses at environmental exposure levels.
She also discussed what has been learned from the NexGen project with respect to the significant challenges facing those who wish to improve risk assessment. First, a great deal of unusable data exists. “If we had known then what we know now, a lot of the studies we have would not have been done in the same way,” she said. Consequently, “systematic review and adherence to best practices for data used in risk assessment is going to be critical.” A second challenge will be consideration of variability in the data. “The signal-to-noise ratio can eat you alive in these studies,” Cote said. Consequently, characterization of variability is very important. Third, a whole new set of uncertainties will need to be described—new uncertainties “that we haven’t spent the last 30 years discussing,” she said. Fourth, it will be critical that the molecular changes under study be embedded in a mechanistic network or context. “To be able to separate out what is just normal biology from disease biology requires that you put these things in more of a network context.” Lastly, she noted that concomitant improvements in exposure science are needed. In particular, easily measured biomarkers of exposure and effect are becoming available, making it possible to directly measure altered biology and potential adverse health outcomes in exposed populations.
Looking to the future, Cote said, it will be important to develop an integrated understanding of cell biology. “One of the things you don’t often see is studies that look at multiple biologic processes measured by various omic and cell biology techniques. For example, people tend to look at proteomics or genomics or transcriptomics but not at the integrated activity of these processes. You don’t see these kinds of tools brought together in single sets of experiments.” Enough is known to understand that many things are going on with chemically induced alterations in biology. “I would like to advocate for studies that take a more integrated approach using a variety of new methodologies.”
Finally, she said, it will be necessary to start developing a dynamic—as opposed to a static—understanding of what happens in response to chemical exposures. “The studies we tend to have are snapshots in time of evolving biologic events. My colleague Lyle Burgoon says science currently gives you a roadmap of disease processes, but what you are really interested in is traffic flow.” In other words, “we are really interested in information flow in the organism,” rather than events at a point in time. However, she added, “we are not there yet” in terms of our analytic tools.
In conclusion, she suggested that the most promising approach to risk assessment will be to collaborate across different fields of study. “I think that integrating what we know about human disease and human genetics that comes out of the study of disease in the absence of chemical exposure with data on the effects of chemical exposures … is going to give us the best information that will allow us to screen new chemicals more rapidly using new molecular methodologies.”
The session’s final speaker was Marilee Shelton-Davenport, a senior program officer with the Board on Life Sciences of the NRC of the NAS. She is the study director for a current NRC study, the Design and Evaluation of Safer Chemical Substitutions: A Framework to Inform Government and Industry Decisions.5 That study will result in a consensus report, which is scheduled to be released in the fall of 2014.
Shelton-Davenport began by discussing a workshop that preceded her study, Applying 21st Century Toxicology to Green Chemical and Material Design, which was held in Washington, DC, in September 2011. It was hosted by the Committee for Emerging Science for Environmental Health Decisions at the NAS and sponsored by the National Institute of Environmental Health Sciences (NIEHS).
“This particular meeting was about using toxicology and new toxicology approaches … early on in the chemistry design process,” she said. “To advance green chemistry we need to have a lot more interaction between the toxicologists and the chemists, not unlike what happens in the pharmaceutical industry.”
Shelton-Davenport repeated several interesting comments that had been made at the workshop. “Richard Denison talked about how the new high-throughput, high-content data … should allow more assessment near the beginning of the chemical design process,” although she noted that “high throughput” might be a bit of a misnomer because “high-throughput, high-content data isn’t always rapid to analyze.”
The idea underlying the meeting was that “chemicals can be designed to be inherently safer, which is the mantra of the green chemistry world,” she said, and the meeting “was trying to get at what toxicologists know that could inform that.” However, she added that it is just as important to understand the limitations of what toxicologists know.
“We had Thaddeus Schug from NIEHS talk about tiered endocrine disruption processes and testing,” Shelton-Davenport continued. Robert Tanguay and Jim Hutchison spoke about the importance of using simple organisms, such as zebrafish, to complement in vitro studies and tests in rodents. The two also described an interesting example of chemical designers working quite closely with toxicologists on nanomaterials. Alex Tropsha discussed the importance of new approaches for combining short-term biologic assays to inform structure–activity relationships; in particular, the goal was not just to look at structure and modeling but also to inform that with in vitro assays.
5 Further information on the Design and Evaluation of Safer Chemical Substitutions: A Framework to Inform Government and Industry Decisions is available at http://www8.nationalacademies.org/cp/projectview.aspx?key=49569 (accessed April 2, 2014).
Then Shelton-Davenport turned to the study she is currently directing. The study is being funded by the EPA’s Office of Research and Development, and it has its roots in the existence of many different approaches for comparing chemical substitutes. Shelton-Davenport listed just a few of these approaches: GreenScreen, CleanGredients, GreenList from SC Johnson, IC2 from the Interstate Chemicals Clearinghouse, the EPA’s Design for the Environment, California’s Green Chemistry Initiative, GreenBlue, Cradle to Cradle, SubsPort, and so on. “These are just a few of the different approaches,” she said.
The various approaches differ in many ways. They differ, for instance, in how they consider health and safety effects versus ecological risks, such as aquatic toxicity or the environmental impacts of chemicals, when they are comparing alternatives. “To give an example,” she said, “Cradle to Cradle is one that includes everything from environmental impact, as in greenhouse gases and water use, to social fairness. I think that is a pretty broad number of things to include. Some of the others are more focused on hazard or safety.” They also differ in how they handle uncertainty and what they do when there are no data. And some of them are not very transparent, making it difficult to know what goes into the alternatives analysis.
In the new study that she is directing, the goal is to put together a more universally accepted approach to evaluating substitutions, Shelton-Davenport said. “I think most people would agree that if there was some harmonization of—or at least understanding about—the different kinds of approaches and the appropriateness of them for different uses, that would allow wider use of this comparison of chemical options.” Furthermore, a more universally accepted approach should make it easier to plan for developing the scientific information and the tools that will be required for such an approach, and it should also help increase the dialogue among different stakeholders by having them all on the same page concerning which approach to use.
The committee’s statement of task calls for it to develop a decision framework for the evaluation of potentially safer substitute chemicals. That framework should
- support the consideration of potential impacts early in chemical design;
- consider both human health and ecological risks;
- integrate multiple and diverse data streams;
- include details on how to consider trade-offs between risks and factors such as product functionality, product efficacy, process safety, and resource use; and
- identify the scientific information and tools required for the approach.
The committee is also charged with developing at least two examples that “demonstrate how the framework can be applied by different users in contrasting decision contexts with diverse priorities.” According to the statement of task, these examples “shall include demonstration of how high-throughput and high-content data streams could inform assessment of potentially safer substitutes early in the chemical development process.”
Lynn Goldman noted that there have been many recent scientific advances with implications for risk assessment and exposure assessment. Will these scientific advances translate into faster, more efficient assessments?
“I think that there is tremendous opportunity,” Ila Cote said. Thanks to advances in personalized medicine and pharmacology, the field of risk assessment is moving forward quickly, she said. “My concern is that the toxicology community will be left behind. I think there is going to be progress that is going to be made whether the conventional community does anything or not.”
Paul Gilman added, “I think the real challenge is in bringing along the different customers, everybody within the agency, the regulated community, and folks who want to be engaged.” Understanding toxicology at the molecular systems level is a very rapidly moving field, he said. The researchers who are working in more predictive exposure analysis and who are feeding back and forth between structure and likely exposure scenarios “are talking in a language that has always been difficult to engage the community with.” And now the community is also being asked to engage in informatics and computing and biology at the molecular level. “That is a real challenge, I think.”
“I agree with that,” John Balbus said. “The scientific advancements and technology sides tend to draw us toward the more involved and more complex. It is the political and social side that would be moving us toward a more streamlined process for a lot of decisions.”
Jerry Paulson of George Washington University suggested that it would be valuable to think beyond substitution when trying to minimize risk from chemicals. “Do we really always need alternative chemicals? Sometimes maybe we don’t need chemicals at all…. Do we really need to spend a lot of time looking for alternative chemicals to be flame retardants in furniture when it is not clear that we need flame retardants in furniture? … Do we need fragrances for consumer products at all? Why do we spend the time looking for safer chemicals when the safest might be none at all?”
Balbus agreed that these are the sorts of questions that should be asked. “Do we need to be exposing people to this? What is the societal end we are trying to get, and what are the different ways to get there, and does it need to be something that involves chemical exposure?” These questions should be part of the decision-making framework, he said.
Marilee Shelton-Davenport said she believes this issue will come up in the committee study she is directing. “The title is Evaluation of Safer Chemical Substitutions, but my thinking is that they are likely going to be having a broader discussion.”
Gina Solomon of the California Environmental Protection Agency also weighed in on the issue. “In the California Safer Consumer Products Regulations, we have devised an off-ramp where if the state identifies a chemical of concern in a product and lists it, the manufacturer of that product may either perform an alternatives analysis or may simply take the chemical out of the product, and then that would save them the trouble of having to go through the entire alternatives analysis. We will see what comes of that and whether that does incentivize removal of some of these chemicals altogether.”
EPA (U.S. Environmental Protection Agency). 2009. The U.S. Environmental Protection Agency’s strategic plan for evaluating the toxicity of chemicals. Washington, DC: Office of the Science Advisor, Science Policy Council. Available at http://www.epa.gov/spc/toxicitytesting/docs/toxtest_strategy_032309.pdf (accessed March 31, 2014).
EPA. 2011a. Advancing the next generation (NexGen) of risk assessment: public dialogue conference. Washington, DC: EPA. Available at http://www.epa.gov/risk/nexgen/docs/NexGen-Public-Conf-Summary.pdf (accessed March 31, 2014).
EPA. 2011b. Advancing the next generation (NexGen) of risk assessment: The prototypes workshop. Washington, DC: EPA. Available at http://www.epa.gov/risk/nexgen/docs/NexGen-Prototypes-Workshop-Summary.pdf (accessed March 31, 2014).
EPA. 2013. Next generation risk assessment: Incorporation of recent advances in molecular, computational, and systems biology (external review draft). Available at http://cfpub.epa.gov/ncea/risk/recordisplay.cfm?deid=259936 (accessed April 2, 2014).
NRC (National Research Council). 1983. Risk assessment in the federal government: Managing the process. Washington, DC: National Academy Press.
NRC. 2007. Toxicity testing in the 21st century: A vision and a strategy. Washington, DC: The National Academies Press.
NRC. 2009. Science and decisions: Advancing risk assessment. Washington, DC: The National Academies Press.
NRC. 2012. Exposure science in the 21st century: A vision and a strategy. Washington, DC: The National Academies Press.