Obtaining reliable estimates of exposures of large populations on multiple scales of space and time requires detailed information on emissions or transformation products from a source, on the locations of receptors (personal or ecosystem) in time and space, and on the activity levels of the receptors (as proxies for inhalation rate, ingestion potential, or dermal interaction) at the time when they are affected by a source. Additional information may be needed on individual heterogeneity—including such susceptibility characteristics as genetics, pre-existing health conditions, and psychosocial stress—because these factors may also influence exposures. Information on body burden, obtained by collecting exposure biomarkers, is also essential for understanding the dose from a specific source and the influence of environmental exposures on health risks.
Efforts to characterize exposure have focused on ambient conditions, and an individual is typically assigned to a home address in an epidemiologic health study or a species is assigned to a region it inhabits in an ecosystem. Although those exposure assignments have revealed important health risks, reliance on proxy methods may impart large exposure-measurement error—that is, a modeled exposure may be an inaccurate and potentially biased estimate of the true exposure. Depending on the exposure-error type, health-effect estimates may be attenuated and biased toward a null result, and the true benefits of control measures may be obscured. Obtaining more accurate estimates of internal exposure reduces exposure-measurement error and provides a more realistic understanding of potential health effects of environmental and occupational exposures (Carroll et al. 2006).
Efforts to gather information on personal exposures have relied on specialized equipment that is expensive and cumbersome and thus limits the wear time or the number of subjects that can be monitored. Because of those limitations, many studies have used questionnaires (Wacholder et al. 1992) or simple information on location, such as home address, that is related to exposures. Those techniques have well-known limitations, but they are often the only methods available, particularly for reconstructing historical exposures.
Innovations in science and technology provide opportunities to overcome limitations and guide exposure science in the 21st century to deliver knowledge that is effective, timely, and relevant to current and emerging environmental-health challenges. Personalized medicine1 and telemedicine will increase the pace of innovation in scientific and technologic methods that will benefit the field of exposure science. For example, many new genomic methods for monitoring individual metabolic and exposure phenotypes will be critical for future individualized medicine. In telemedicine, cellular-telephone technologies increasingly contribute to improving diagnostics and patient care and hence to improving our ability to anticipate the effects of exposures (Wootton and Bonnardot 2010). Similarly, new developments in geographic information science and technologies are leading to rapid adoption of new information obtained from satellites via remote sensing, which provides immediate access to data on potential environmental threats. Improved information on physical activity and locations of humans and other species obtained with global positioning systems (GPS) and related geolocation technologies is increasingly being combined with cellular-telephone technologies. Many of these advances are integrated through powerful geographic information systems (GIS)2 that operate either through stand-alone computing platforms or through the World Wide Web. Biologic monitoring and sensing increasingly offer the potential to assess internal exposures. The convergence of these scientific methods and technologies raises the possibility that in the near future embedded, ubiquitous, and participatory sensing systems will facilitate individual-level exposure assessments on large populations of humans or other species.
The new technologies and methods also may help to operationalize the concept of the exposome (see discussion in Chapter 1). Establishing a more complete record of exposures based on internal biomarkers, as theorized in the exposome (Wild 2005), requires tools that can also assess external environmental exposures. Many important exposures (for example, noise, heat, and electromagnetic fields) produce no internal biomarkers but can nonetheless be associated with environmental-health risks (Peters et al. 2012). There is also a need to continue to link sources to exposures; such links are the basis of mitigation efforts to protect public health. The committee envisions that many of the new technologies discussed in this chapter will help to broaden the exposome to the “eco-exposome” concept discussed in Chapter 2 and to quantify exposure indicators that address those concerns.
1Personalized medicine is an emerging practice of medicine that uses information about a person’s genetic profile and environmental exposures to prevent, diagnose, and treat disease (Offit 2011).
2GIS is defined as a system for performing numerous operations involving the acquisition, editing, analysis, storage, and visualization of geographic data (Longley et al. 2005).
In this chapter, the committee reviews some of the newest technologies for exposure science and, in considering their strengths and limitations, identifies near-term and long-term innovations that will guide exposure science in the 21st century. The review is organized according to the framework in Chapter 1 that describes the scope of exposure science from characterizing external concentrations to personal exposures and finally to understanding how internal exposures affect dose. Figure 5-1 expands on the framework by identifying the technologies that will be presented. The discussion begins with a review of geographic information technologies, which help in characterizing sources and concentrations and also can improve understanding of stressors and receptors when used in concert with other methods and information. Ubiquitous sensing systems, ecologic momentary assessment (participatory methods that are used to query subjects about their perceptions and experiences while in the exposure field using cell phones or other real-time devices), and nanosensors are addressed next; these can help in characterizing personal exposures. We then discuss biomonitoring, which can improve our understanding of internal exposures and, when combined with other technologies, can help to identify sources. Finally, models and information-management tools are addressed in the context of their ability to help in interpreting and managing the massive and often complex interactions among receptors and environmental stressors. Many of the technologies in this chapter are illustrated in connection with air pollution, inasmuch as this is one of the most developed sectors of exposure science. As shown in Figure 5-1, however, the committee’s framework and vision are intended to be broadly applicable and relevant to all media to reflect the expected needs for the technologies, and many other illustrative examples are presented.
Three major technologic advances in geographic information technologies—remote sensing, global positioning and related locational technologies, and GIS—have dramatically affected exposure science. As outlined by Goodchild (2007), they are inspiring a new emphasis on spatial information in relation to social and scientific inquiry. Over the last 10 years, the technologies have contributed to improvements in exposure science, and they will probably continue to move the field toward more refined exposure assessments that are more comprehensive, more accurate, and more relevant to and valuable in policy-making and in the everyday lives of large populations.
Remote Sensing for Exposure Assessment
Remote sensing (RS) has emerged as a key innovation in exposure science. RS has been defined as “the acquisition and measurement of data/information on some property(ies) of a phenomenon, object, or material by a recording device not in physical, intimate contact with the feature(s) under surveillance” (Short 2011). The field encompasses the capture, retrieval, analysis, and display of information on subsurface, surface, and atmospheric conditions that is collected with satellite, aircraft, or other technologies designed to sense energy, light, or optical properties at a distance (Jerrett et al. 2009). RS is an important tool for enhancing the capacity to assess human and ecologic exposures because it provides global information on the earth’s surface, water, and atmosphere. It is also widely used for subsurface investigations (for example, electromagnetic imaging of karst in water-resource investigations), and it provides exposure estimates in regions where ground observation networks are sparse.
With respect to air pollution, the most common aerosol characteristic measured with a satellite is the aerosol optical depth (AOD), which quantifies the extinction of electromagnetic radiation by aerosols in an atmospheric column at a given wavelength (Emilli et al. 2010). Six primary satellite sensors provide information on particulate pollution (MODIS, Landsat, IKONOS, Orbview, SPOT, and GOES). Box 5-1 discusses evaluation of the reliability of AOD compared with PM2.53 mass concentrations measured on the ground. Box 5-2 and Figure 5-2 demonstrate how a 1-km retrieval of the MODIS AOD substantially improves the resolution, and thus the utility, of remote sensing for health and ecologic studies compared with the current 10-km retrieval grid.
3PM2.5 are fine particles in the ambient air that are 2.5 microns or less in diameter.
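The column AOD described above has a standard, sensor-independent definition: it is the vertically integrated aerosol extinction, and for a vertical path the Beer–Lambert law gives the resulting attenuation of the direct beam. The notation below is conventional rather than taken from any of the cited papers:

```latex
\tau(\lambda) = \int_{0}^{z_{\mathrm{TOA}}} \sigma_{\mathrm{ext}}(\lambda, z)\, dz,
\qquad
I(\lambda) = I_{0}(\lambda)\, e^{-\tau(\lambda)}
```

where \(\sigma_{\mathrm{ext}}\) is the aerosol extinction coefficient at wavelength \(\lambda\) and altitude \(z\), \(z_{\mathrm{TOA}}\) is the top of the atmosphere, and \(I_{0}\) and \(I\) are the incident and transmitted intensities. Because the integral spans the whole column, a large AOD can reflect particles in layers aloft rather than at the surface, which is one reason that the AOD–PM2.5 relationship varies from site to site.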
Hoff and Christopher (2009) reviewed more than 30 papers that examined the relationship between total-column AOD and surface PM2.5 measurements on a station-by-station basis. Their results underscored “the range of measurements from across the globe and the range of correlations between AOD and mass,” showing wide uncertainty in the relationship between the two quantities. The studies used simple linear regressions and correlations between the AOD values and the PM2.5 mass concentrations measured on the ground. In some cases, the correlations were strong, and the AOD served as a predictor of pollution on the ground. In other cases—either because the satellite product itself was not sufficiently accurate or because the particles observed in the total column were in layers aloft—the satellite-derived AOD was a poor predictor of pollution at the earth’s surface. The authors suggested conducting a study of the controlling extrinsic factors for each region to aid in understanding the PM2.5–AOD relationship. The literature continues to grow with efforts to combine information from multiple satellite sensors and models (van Donkelaar et al. 2010) or to introduce auxiliary information, such as meteorologic data (Pelletier et al. 2007) or boundary-layer height (Engel-Cox et al. 2006). Lee et al. (2011) have hypothesized that the inherent variability in the PM–AOD relationship is due to changes in particle size and composition, earth-surface properties, vertical distribution of particle concentrations, and other factors. To account for the variability of those factors, they proposed a daily calibration technique that is based on the spatial variability of ground PM measurements and would make it possible to obtain quantitative estimates of PM concentrations from AOD measurements.
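The core of the calibration approach described above is a per-day linear fit of ground PM2.5 against collocated AOD, which is then applied to grid cells that lack monitors. A minimal sketch follows; the observations are hypothetical, not data from Lee et al. (2011), and a real implementation would also handle cloud screening, spatial matching, and quality flags:

```python
import numpy as np

def calibrate_pm_from_aod(aod, pm25):
    """Fit a simple linear model PM2.5 = a + b * AOD for one day of
    collocated satellite AOD retrievals and ground-monitor data.
    Returns (intercept a, slope b)."""
    b, a = np.polyfit(aod, pm25, 1)  # polyfit returns highest degree first
    return a, b

# Hypothetical collocated observations for a single day
aod = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
pm25 = np.array([6.0, 14.5, 22.0, 31.0, 38.5])  # ug/m^3 at ground monitors

a, b = calibrate_pm_from_aod(aod, pm25)

# Predict ground-level PM2.5 in grid cells that have AOD but no monitor
pm_est = a + b * np.array([0.30, 0.50])
```

Refitting the coefficients daily is what distinguishes this scheme from a single global regression: the slope absorbs day-specific factors such as boundary-layer height and aerosol composition.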
The development of the 3-km and 1-km products provides an opportunity to test whether satellite data can supply the resolution needed for exposure assessment and health-related studies. For example, the 10-km aerosol product offered by MODIS is sufficient for climate applications but insufficient for detailed exposure assessment from sources, such as traffic emissions, that vary over small areas. In that regard, Hoff and Christopher (2009) stressed the importance of a finer-resolution product on a local urban scale. A 3-km product is expected to become publicly available in 2013.
To attain 1-km-resolution AOD from MODIS, the Multi-Angle Implementation of Atmospheric Correction algorithm was applied (Lyapustin et al. 2011a,b). The 1-km product was generated for the New England area for 2003. Figure 5-2 compares the 10-km and 1-km retrievals and clearly shows that considerably more detail is obtained with the 1-km product.
FIGURE 5-2 Aerosol optical depth (AOD) derived from MODIS data for the New England region with the standard 10-km algorithm (left) and the experimental 1-km algorithm (right) for June 25, 2003.
Remote Sensing for Health, Exposure, and Ecologic Studies
Several studies and reviews (for example, Maxwell et al. 2010) have suggested that higher-resolution data enhance efforts to identify time–space patterns that are the basis of many risk assessments for diseases (Wilson 2002). In many studies, remote-sensing data were used to derive three variables: vegetation cover, landscape structure, and water bodies. The ability to sense vegetation remotely from space is important in that nearly all vectorborne diseases are linked to the vegetative environment during their transmission cycle. Furthermore, crop-type information may be important for studying the effects of pesticides (for example, vector resistance and illnesses caused by exposure to toxins) (Beck et al. 2000). Ward et al. (2000, 2006) and Maxwell et al. (2010) used crop location to identify where pesticides were applied in relation to residential locations. Remote sensing of vegetation cover combined with GIS has also been used to develop management strategies to reduce herbicide application (Gómez-Casero et al. 2010) and to assess potential exposure of fish and wildlife to pesticides and metals (Focardi et al. 2006).
Green cover is also associated with higher levels of physical activity, and RS has been used with geolocation technologies to show associations between physical activity of children and their exposure to green cover (Almanza et al. 2012).
Hyperspectral imaging (HI) collects and processes information from a wide portion of the electromagnetic spectrum. It has been used to assess human health risks associated with infectious diseases or environmental hazards. Ong et al. (2003) used hyperspectral airborne techniques to quantify dust loadings on mangroves originating from mining; the authors found that they could detect and quantitatively map the distribution of iron oxide. Ferrier (1999) showed that mine tailings (which contain potentially toxic materials) had been dispersed from the mine workings down the Rambla del Playazo to within 600 m of the beach at El Playazo, Spain.
Other researchers have used hyperspectral data collected over “Ground Zero” for rapid assessment of the potential asbestos hazards associated with the dust that settled over lower Manhattan after the collapse of the World Trade Center towers (Clark et al. 2001; Swayze et al. 2006). Malley et al. (1999), Winkelmann (2005), and van der Meer et al. (2002) have reported detection of soil contamination by hydrocarbons. Wu et al. (2005) studied mercury contamination in suburban agricultural soils in the Nanjing region of China. Finally, Chudnovsky et al. (2009, 2011) used Hyperion satellite data to separate the spectral features of a Saharan dust storm from those of the underlying surface.
HI sensing has been used to examine exposures of coral reefs to stressors such as sea-surface temperature, ultraviolet radiation, wind, sediment load, chlorophyll, acidification, salinity, and coastal development (Maina et al. 2008; Eakin et al. 2010). Sediment load and water clarity (Doran et al. 2011), stressor exposures in benthic ecosystems (Goetz et al. 2008), and other water-quality parameters (Bagheri and Yu 2008; Odermatt et al. 2012) have also been analyzed with HI techniques. More recently, HI techniques were used to assess the extent of the Deepwater Horizon oil spill and possible exposures to oil in pelagic and nearshore ecosystems (Bradley et al. 2011; Lavrova and Kostianoy 2011; Bulgarelli and Djavidnia 2012; Mishra et al. 2012).
In 2015, two new hyperspectral sensors will be launched: the National Aeronautics and Space Administration HyspIRI (NASA 2011) and the European EnMAP (EnMAP 2011) missions. With their improved hyperspectral and multispectral capabilities, these sensors will increase the ability to monitor the effects of urbanization on the environment and to assess land-cover characteristics that could indicate the presence of or risks posed by vectorborne and animal-borne diseases on a global scale.
To improve data quality for RS and increase its utility for exposure studies, technologic improvements are needed, including
• Breakthroughs in electro-optics technologies.
• Improvement of the current AOD retrievals (to achieve near-laboratory air-quality data) by obtaining accurate and reliable atmospheric vertical profile information.
• Retrieval of high-resolution AOD to discern spatial patterns of pollution in urban environments through frequent daily temporal coverage based on orbital sensors.
Global Positioning System and Geolocation Technologies
Launched in the 1980s for defense applications, the GPS offers exposure scientists a simple means of tracking the geographic position of a person or another species. GPS receivers are now embedded in many cellular telephones, vehicle navigation systems, and other instruments (Goodchild 2007). The GPS is a utility owned by the US government, and it consists of three components: a space segment of at least 24 satellites that transmit one-way signals to the earth; a control segment of ground stations that track the satellites, reset their clocks, and maintain their positions; and a user segment of individual devices that receive the signals and calculate three-dimensional positions and times (GPS 2011). GPS signals can be augmented or complemented by land-based navigation systems that use cellular-telephone triangulation to provide positions when satellite signals are unavailable because of, for example, topographic obstruction or weather conditions (Shoval and Isaacson 2006). Radiofrequency identification can also be used for local tracking of goods, animals, or people (Goodchild 2007). Collectively, these systems are referred to here as geolocation technologies.
Hundreds of studies have used geolocation technologies to improve assessment of environmental exposures, including exposure to infectious-disease vectors (Vazquez-Prokopec et al. 2009) and air pollution (Paulos et al. 2007); to analyze how physical activity is related to different built environments (Jones et al. 2009); and to inform simulation models of potential pesticide exposures (Leyk et al. 2009). Many other applications are found in the literature.
The main contribution of geolocation technologies is to reduce exposure-measurement error and to move closer to a “time–geography of exposure” (Hagerstrand 1970; Briggs 2005). That is, geolocation technologies make it possible to know, with a high degree of accuracy, an individual’s location in time and space and thus provide a window into the moment of contact between a source (that is, an environmental intensity) and a receptor. When data on environmental intensities (for example, air or water quality) are combined with geolocation information and physical-activity measurements (obtained with accelerometers), more detailed estimates of potential chemical, biologic, or physical exposures can be made by using data on inhalation rate, ingestion potential, or dermal contact.
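The combination described above amounts to a time-integrated dose sum: for each interval, a concentration matched to the person's location is multiplied by an activity-derived inhalation rate and the interval's duration. A minimal sketch under that assumption follows; the intervals, concentrations, and ventilation rates are illustrative values, not measurements from any cited study:

```python
# Each tuple: (duration in hours,
#              PM2.5 concentration at the person's location in ug/m^3,
#              inhalation rate in m^3/h inferred from accelerometer data)
intervals = [
    (8.0, 10.0, 0.45),  # sleeping at home (resting ventilation)
    (1.0, 35.0, 1.50),  # cycling near traffic (elevated ventilation)
    (8.0, 15.0, 0.60),  # office work
]

# Potential inhaled dose in micrograms: sum of time x concentration x rate
dose_ug = sum(t * c * v for t, c, v in intervals)
```

Note how the single hour of cycling contributes roughly as much inhaled mass as the full night of sleep, which is why activity data matter as much as concentration data.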
There are many examples of how geolocation technologies have improved our understanding of exposures by defining a person’s location in time and space. They have revealed important limitations of survey-based assessments of location. For example, Elgethun et al. (2007) compared time–activity diaries with actual GPS measurements and found severe underreporting in the diaries of the amount of time spent outdoors at home. Such errors may result in substantial misclassification of exposure to pollutants, such as ozone, that have low penetration ratios from outdoors to indoors and for which time outdoors is therefore a key determinant of exposure. The technologies provide more accurate information on geocoded locations of subjects and a better understanding of likely sources of error when points are used to represent large structures, such as schools and day-care facilities (Houston et al. 2006). They are also used in studies in which exposures are measured as study subjects walk, ride bicycles, or drive with pollution monitors. A study by McCreanor et al. (2007) in London, England, demonstrated the effect of walking through polluted areas on asthmatic symptoms and biomarkers, such as exhaled nitric oxide, a marker of lung inflammation. The study provided increased support for the hypothesis that ambient air-pollution concentrations can elicit changes in asthmatic symptoms. Geolocation technologies have already made important contributions to the understanding of exposures at the point of contact between source and receptor, and they appear poised to play an increasingly integral role in widespread population-based individual sensing (discussed below).
Geographic Information Systems
GIS combines topologic geometry, capable of manipulating geographic information, with automated cartography and enables users to compile digital or hard-copy maps. GIS plays a central role in integrating data into coherent databases that connect different attribute data (for example, exposure and health attributes) by geographic location. Input data used to derive exposure surfaces, such as road locations and industrial land uses, also are stored and manipulated in GIS. GIS increasingly serves as the storage and integrative backbone of remote sensing, geolocation technologies, and sophisticated modeling outputs, such as for collecting measurements on the fate and transport of contaminants through ecosystems (Gallagher et al. 2010).
Another important role of GIS in exposure assessment is the quantification of topologic relationships. For example, buffer functions that measure the distance between a source, such as a roadway, and a receptor, such as a house, enable analysts to relate the geographic position of a study subject in space and time to the subject’s likely exposure on the basis of an overlay of location information (Jerrett et al. 2005). That type of buffering, which provides the distance between a source and a receptor, is used to characterize proximity to roadways, factories, water bodies, and other land uses or modifications that entail either potentially adverse exposures (for example, pesticide transport from agricultural fields) (Gunier et al. 2011) or potentially favorable exposures (for example, parks and health-food stores in cities) (Morland and Evenson 2009). GIS can provide information that is stored by the user both before and after a major change (for example, in land use) or a catastrophe (for example, a tsunami). Figure 5-3 demonstrates a road buffer that was used to characterize human exposures to traffic-related air pollution in Hamilton, Canada (Jerrett et al. 2005). Ecologic studies have combined modeling results with overlay techniques to examine potential exposure exceedances for threatened and endangered species (for example, see Figure 5-4 for cadmium exposures of the Little Owl) (Lahr and Kooistra 2010).
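A binary buffer overlay of the kind shown in Figure 5-3 can be sketched in a few lines. The coordinates and the 50-m buffer below are hypothetical, and the sketch simplifies by measuring distance to densely sampled road vertices, whereas a real GIS computes distance to the road polyline itself:

```python
import math

def within_buffer(point, road_points, buffer_m):
    """Assign 1 if the receptor lies within buffer_m meters of any
    sampled road vertex, else 0 (simplified binary buffer overlay)."""
    px, py = point
    return int(any(math.hypot(px - rx, py - ry) <= buffer_m
                   for rx, ry in road_points))

# Road vertices sampled every 20 m along a straight east-west road
road = [(x, 0) for x in range(0, 201, 20)]

# Hypothetical home locations (meters in a local projected grid)
homes = [(50, 30), (150, 500)]

# 1 = inside the 50-m buffer (exposed), 0 = outside
flags = [within_buffer(h, road, 50.0) for h in homes]
```

The resulting 0/1 flags are exactly the exposure indicator assigned to each person in a binary buffer analysis; more refined studies replace the indicator with the continuous distance or with modeled concentration surfaces.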
Web-Based Geographic Information Systems for Exposure Assessment
Web-based GIS is becoming more common (Maclachlan et al. 2007) and can serve as a tool in policy-making and in educating and empowering communities to understand and manage their environmental exposures better. (See Chapter 6 for additional discussion of community engagement.) For example, to promote active commuting, Metro Vancouver has collaborated with the University of British Columbia to develop a cycling-route planner (Cycling Metro Vancouver 2007), which allows cyclists to select routes that have the most green vegetation, the least traffic pollution, and the least or greatest elevation, all specified by the user. That empowers cyclists to choose the routes that best suit their fitness levels, minimize exposure to traffic pollution, and reduce their carbon dioxide output (Su et al. 2010). The Web site runs on the backdrop of Google Maps—an illustration of the potential synergies between new private-sector technologies and public-health protection.
FIGURE 5-3 Example of a binary buffer overlay showing people likely to experience traffic-related air-pollution exposure. The circles represent people. People assigned a “0” are outside a prespecified distance, while people assigned a “1” are within a given distance. Adapted from Jerrett et al. 2005. Reprinted with permission; copyright 2005, Journal of Exposure Science and Environmental Epidemiology.
FIGURE 5-4 Map of a flood plain in the Netherlands showing secondary risk of poisoning by cadmium in Little Owls developed using a combination of measured cadmium concentrations, food web modeling, knowledge of foraging in different habitats, and probabilistic risk assessment. Source: Lahr and Kooistra 2010. Reprinted with permission; copyright 2010, Science of the Total Environment.
In addition to addressing human-health concerns, web-based GIS has been used to monitor ecologic exposures. For example, Google Earth and Google Fusion Tables with Airborne/Visible Infrared Imaging Spectrometer data (AVIRIS 2012) were used to provide public, real-time mapping of the Deepwater Horizon oil spill (Bradley et al. 2011).
With the new technologies—such as cellular telephones, GPS, and computers that apply complex data-mining techniques—private companies are increasingly collecting data that are potentially useful for exposure science, such as location and mobility information, and in some cases direct measurements of exposure through sensing networks. Issues of data ownership, use, informed consent, and data-sharing remain to be addressed. Increased cooperation with private-sector entities offers great potential for enhancing the data available for exposure science.
GPS and GIS have already contributed in important ways to exposure assessment, enabling researchers to refine assessments and to understand how to move from assessing ambient concentrations to understanding the likely exposures received by people. The committee suggests the following measures to capitalize further on these advances:
• Continued support for the GIS “infostructure” to ensure that public-health agencies and researchers have access to the wealth of geographic exposure data.
• Expanded access, for researchers conducting scientific research, to existing exposure data that are collected with public funding. Many existing data sources permit only partial release of critical information, or the information is released only in highly restricted data enclaves, often limiting its full use by scientists. Agencies responsible for collecting and maintaining these data should make every effort to expand access, both in the geographic precision of the data and in their availability to researchers.
• Efforts to promote data-sharing that include mechanisms to validate data collected with federal funding. This would include, at a minimum, adherence to established requirements for interagency sharing of spatial metadata (FGDC 2011). Shared metadata should include factors that allow a potential user to evaluate the applicability of the data to particular research projects.
• Efforts by government agencies and universities involved in exposure-science research to foster cooperation with the private sector, encouraging data collection, sharing of geographic and exposure information, and the formation of partnerships with exposure scientists with the goal of improving public-health protection.
• Efforts to support and enhance Web-based exposure mapping (for example, Cycling Metro Vancouver 2007) to improve access to data on and understanding of potential exposures.
Limitations of Current Environmental and Personal Monitors and the Potential of Ubiquitous Sensing
Lack of personal and individual exposure data is one of the greatest limitations in exposure science. A challenge in addressing the paucity of data is imputing, interpolating, or modeling likely exposures that are not directly measured on a person or on a species in its ecosystem. The void in fine-scale exposure data on an individual or a species means that considerable error may be introduced in assessing dose–response relationships. Assessing the influence of that measurement error on dose–response functions is challenging because researchers typically lack sufficient information on the “gold standard”—measured exposure information on the individual or in key microenvironments. As a result, exposure assessment has been called the Achilles heel of environmental epidemiology (Steenland and Savitz 1997).
The last 20 years have seen substantial technologic advances in personal environmental monitoring. Despite the advances, however, personal sensors still lack the capacity to obtain highly selective, multistressor measurements outside the laboratory. (Measurements of multiple stressors provide an understanding of exposure to mixtures that may improve our understanding of health responses.) Many personal monitors rely on pumps to collect air samples on filters or other sorbent materials, which are then analyzed to measure integrated average concentrations over time (usually days or weeks). However, the size, weight, noise level, and appearance of the monitors are unacceptable to some users, and this shortens monitoring times, lowers compliance, and introduces bias. Valuable information on the locations and conditions (for example, sources) where and when exposures occur and on the frequency and magnitude of transient high-level exposures is not captured in longer-term integrated samples. In general, current techniques do not consider the physical activity of the subject during exposure. Passive samplers (for example, badges or sampling tubes) make it possible to collect samples without pumps but are limited in the analytes that can be measured at environmental concentrations and often require longer sampling periods (for example, 1-2 weeks) to accumulate sufficient mass in nonoccupational environments. The analytic costs of each sample may also be high.
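The loss of information about transient peaks in integrated samples can be illustrated numerically. The minute-by-minute concentrations below are invented for illustration: a filter sample reports only the time-weighted average over the sampling period, while a continuous sensor would also capture the short high-level episode:

```python
# 24 h of minute-resolution PM concentrations (ug/m^3):
# 23 h of low background plus a 1-h high-level transient (e.g., cooking)
minute_conc = [8.0] * 1380 + [300.0] * 60

# What a 24-h integrated filter sample reports: the time-weighted average
twa = sum(minute_conc) / len(minute_conc)

# What a continuous personal sensor would additionally capture
peak = max(minute_conc)
```

Here the integrated sample reports an unremarkable average near 20 ug/m^3, even though the subject spent an hour at 300 ug/m^3; if short-term peaks drive the health response, the integrated measure misses the relevant exposure entirely.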
Innovations in personalized monitoring and in analysis of a person’s mobility can ameliorate those shortcomings—in particular, through sensing of the environment through modification of cellular telephones, which are carried routinely by billions of people around the world. Many cellular telephones come equipped with motion, audio, visual, and location sensors, and several software applications have been written to exploit the on-board sensors through cellular or wireless networks (see, for example, Seto et al. 2010). Other devices, such as pollution monitors, can be either built into the telephones or connected through a body-sensing network via Bluetooth radio.
Cellular telephones, supporting software, and the expanding networks (cellular and WiFi) can potentially be used to form “ubiquitous” sensing networks that collect personal exposure information on millions of individuals and on large ecosystems through citizen scientists (Crall et al. 2010). Such networks can also take advantage of “embedded” sensors installed in existing infrastructure—for example, weigh-in-motion sensors installed on roadways or sensors installed in public vehicles, such as buses, that provide anonymous, continuous data collection. In addition, there is participatory sensing, which differs from other embedded networks in that the people in the network knowingly and voluntarily collect information on environmental conditions (or their own mobility) in exchange for some individual benefit, such as increased knowledge about their own environmental exposures or information on their level of physical activity (Burke et al. 2006). All these research approaches, however, pose challenges of bias and compliance. Boxes 5-3, 5-4, and 5-5 provide examples of embedded, ubiquitous, and participatory sensing, respectively; the categories, however, are not mutually exclusive, in that they may include elements of one another.
Embedded sensors are now being piloted in the city of Rome, Italy, to help track the mobility patterns of pedestrians, bicyclists, and vehicles and to manage traffic flows. Given the role of traffic in air pollution, noise, and accident risk, better information on traffic and other modes of transportation is needed. Rome’s embedded sensor system relies on Telecom Italia’s Localizing and Handling Network Event Systems (LocHNESs) software platform, which uses anonymous location information from cell-phone users in combination with embedded location tracking of public-transit vehicles. The system is being tested to supply near real-time traffic monitoring and management information (Calabrese et al. 2011). Such information on traffic could be combined with models to estimate noise or air-pollution levels throughout the city. The system can output data in a variety of formats, including a 40 × 40-m grid showing levels of traffic congestion.
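The grid-based aggregation that a LocHNESs-style system performs can be illustrated as a simple binning of anonymous location fixes. The sketch below is a minimal illustration, not the actual platform’s algorithm; only the 40-m cell size is taken from the text, and the coordinates and density metric are assumptions.

```python
# Minimal sketch: aggregating anonymous location fixes into a 40 x 40 m grid
# to estimate relative traffic density. Illustrative only; not the LocHNESs
# platform's actual method.
from collections import Counter

CELL_M = 40.0  # grid resolution in meters, as described in the text

def to_cell(x_m, y_m, cell=CELL_M):
    """Map a projected (x, y) position in meters to a grid-cell index."""
    return (int(x_m // cell), int(y_m // cell))

def congestion_grid(fixes):
    """Count location fixes per cell; higher counts suggest heavier traffic."""
    return Counter(to_cell(x, y) for x, y in fixes)

# Four hypothetical fixes in projected meters
fixes = [(12.0, 15.0), (18.0, 39.0), (55.0, 10.0), (14.0, 22.0)]
grid = congestion_grid(fixes)
# three fixes fall in cell (0, 0); one falls in cell (1, 0)
```

In a real deployment the counts would be normalized by time window and cell area before being mapped to congestion levels.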
LocHNESs illustrates how ubiquitous mobile phones can supply anonymous information on location that can be combined with embedded tracking networks on public infrastructure to deliver real-time data on environmental exposures.
Increasing availability of ubiquitous mobile devices—particularly smart telephones with motion sensing, GPS, and wireless capabilities—has created opportunities to develop new tools and methods for studying, and intervening to address, sedentary lifestyles, obesity, and ambient risk factors, such as air pollution, noise, and ultraviolet radiation. One example of the potential is CalFit software. CalFit is an application that runs on mobile telephones that use the Android operating system. The software uses the accelerometry and GPS sensors typically built into smart telephones to record activity counts and energy expenditure and the time and location in which an activity occurs. The device consists of a single telephone that can be carried and used as a normal telephone by participants in a study or by the general public. The software has a single on-off switch for data logging and, once turned on, will continuously collect data as a background service until turned off (Seto et al. 2010). Pilot studies testing CalFit with 35
free-living human volunteers in Barcelona, Spain, indicate that the software collects data on location and physical activity that compare well with data from commercially available stand-alone triaxial accelerometers (de Nazelle et al. 2011; Seto et al. 2011) (see Figure 5-5). When combined with dispersion or other models of ambient exposure, CalFit offers the potential to adjust, for example, the estimated dose of air pollution received by a person to account for physical activity, rather than relying on the exposure concentration in the home or workplace. The system is also capable of serving as a base station for other sensors that operate via Bluetooth radio to collect such data as air pollution, light, and noise.
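The activity-adjusted dose calculation implied above can be sketched as a time-weighted sum in which activity level serves as a proxy for ventilation rate. The function and all numbers below are illustrative assumptions, not CalFit’s actual processing.

```python
# Minimal sketch: combining modeled ambient concentration with an
# activity-derived ventilation rate to estimate inhaled dose.
# All parameters are illustrative assumptions.

def inhaled_dose(samples):
    """samples: list of (concentration_ug_m3, ventilation_m3_per_h, hours).

    Returns total inhaled dose in micrograms: sum of C * VE * t.
    """
    return sum(c * ve * t for c, ve, t in samples)

# A hypothetical day split into rest, walking, and cycling,
# with assumed ventilation rates in m3/h
day = [(20.0, 0.5, 8.0),   # sleeping at home
       (50.0, 1.5, 1.0),   # walking along a busy street
       (80.0, 3.0, 0.5)]   # cycling in traffic
dose = inhaled_dose(day)   # 20*0.5*8 + 50*1.5*1 + 80*3*0.5 = 275 ug
```

The point of the adjustment is visible in the example: the half-hour of cycling contributes more dose (120 µg) than the full hour of walking (75 µg) despite the shorter duration, because ventilation is higher.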
CalFit and other cellular-telephone-based systems can also be used to implement context-specific ecologic momentary assessment (CS-EMA). CS-EMA measures real-time exposures and outcomes with sensors inside and outside the telephone. The system can prompt a person to respond to a survey when particular events are observed, such as a period of physical activity, exposure to air pollution, use of steroid inhalants, or consumption of particular foods. Responses to these surveys provide opportunities to obtain important information about an exposure or outcome, such as mood, stress, and behaviors (Intille 2007; Dunton et al. 2010).
FIGURE 5-5 Output from a CalFit telephone showing the location and activity level of volunteers in kilocalories per 10-second period in a pilot study in Barcelona, Spain. Activity traces are overlaid onto a map of nitrogen dioxide concentrations. Source: Map supplied courtesy of de Nazelle et al. 2011. Reprinted with permission from the author; copyright 2011.
Participatory sensing refers to systems of distributed data collection and analysis in which participants decide what, where, and when to monitor in their environments (Mun et al. 2009). Such systems operate on various scales—for example, individual, group, urban, and global—depending on what is being measured (Burke et al. 2006). Participatory sensing systems often combine embedded and ubiquitous systems with Web-based applications that allow participants to share information on their exposures and to understand the exposures of others in the participatory system.
One example is the Personal Environmental Impact Report (PEIR) system that operates in Los Angeles, California (Mun et al. 2009). The system measures four main outcomes: exposure to fine particulate matter (PM2.5), exposure to fast food outlets, output of transportation-related greenhouse-gas emissions, and output of transportation-related PM2.5 emissions near sensitive receptors. PEIR relies on cellular-telephone locational and speed information and on a sophisticated activity-classification system that uses information from GPS, cellular-telephone towers, and data on land use and traffic. Activities are classified as walking or driving with a Markov chain algorithm.
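A Markov-chain activity classifier of the kind PEIR uses can be sketched as a minimal two-state hidden-Markov (Viterbi) decoder over GPS speeds. The transition and emission parameters below are illustrative assumptions, not PEIR’s actual values, and PEIR additionally uses cell-tower, land-use, and traffic data that this sketch omits.

```python
# Minimal sketch of Markov-chain smoothing of speed-based walk/drive labels.
# Parameters are illustrative assumptions, not PEIR's.
import math

STATES = ("walk", "drive")
TRANS = {("walk", "walk"): 0.9, ("walk", "drive"): 0.1,
         ("drive", "drive"): 0.9, ("drive", "walk"): 0.1}

def emission(state, speed_mps):
    """Relative likelihood of an observed speed given a state (crude Gaussian)."""
    mu, sigma = (1.4, 1.0) if state == "walk" else (12.0, 6.0)
    return math.exp(-0.5 * ((speed_mps - mu) / sigma) ** 2) / sigma

def viterbi(speeds):
    """Most likely walk/drive state sequence for a series of GPS speeds."""
    logp = {s: math.log(emission(s, speeds[0])) for s in STATES}
    back = []
    for v in speeds[1:]:
        step, choice = {}, {}
        for s in STATES:
            prev = max(STATES, key=lambda p: logp[p] + math.log(TRANS[(p, s)]))
            choice[s] = prev
            step[s] = (logp[prev] + math.log(TRANS[(prev, s)])
                       + math.log(emission(s, v)))
        logp, back = step, back + [choice]
    state = max(STATES, key=lambda s: logp[s])
    path = [state]
    for choice in reversed(back):
        state = choice[state]
        path.append(state)
    return list(reversed(path))

labels = viterbi([1.2, 1.5, 9.0, 13.0, 12.5])
# the low speeds decode as walking, the high speeds as driving
```

The Markov transition probabilities suppress spurious single-sample flips that a per-sample speed threshold would produce when GPS speed is noisy.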
Once activities are classified, exposures can be assessed through a near real-time dispersion model for PM2.5 that combines the likely exposure levels and a person’s location to assign a likely concentration. People also can examine their exposure to fast food, generation of transportation-related PM2.5 emissions near sensitive receptors, or their impact on transportation-related greenhouse gas emissions.
The PEIR system has been piloted by 30 volunteers. Respondents, using a Facebook application, can join a social network to review their exposures and emissions in comparison with those of others in the social network. The movement of the processed information into a continuing report that can generate information on exposures and impacts constitutes an innovative fusion of new media, such as Facebook, with mobile sensing platforms. For example, users have access to a weekly impact report and a locational trace of where they have been.
Personal Samplers for Particles, Volatile Organic Chemicals, and Time-Activity Information: Current and Future Technologies
This section discusses personal exposure measurement devices that can be used to obtain information on exposures to particles and volatile organic chemicals (VOCs) and time-activity information. Some of these devices have been fully tested; others are new and yet to be implemented.
Current Microsensor Technologies
This section addresses state-of-the-art microsensor systems. Some systems are currently in the prototype stage, while others are commercially available.
Miniature microsensor systems incorporating preconcentrators, microfabricated gas chromatography (micro GC) systems, and microsensor arrays have been developed for detecting VOCs with detection limits below 1 part per billion (Kim et al. 2011). The sensor systems include a miniature pump for sample collection. The VOCs are first separated by a high-performance GC system before detection by sensor arrays (Kim et al. 2011; Kim et al. 2012a,b). Sandia National Laboratories has also developed a microsensor system based on micro GC and surface acoustic-wave sensors (Lewis et al. 2006).
Volatile-Organic-Chemical Monitor (Arizona State University HYBRID)
A portable (“wearable”) VOC monitor was recently developed (Iglesias et al. 2009; Chen 2011). It uses a “hybrid chemical sensor” for nearly real-time measurement and wireless transmission of data via a cellular telephone. The sensor combines a miniature gas-chromatographic system for preconcentration and separation of chemical species with a novel detector that consists of a tuning fork coated with molecularly imprinted polymers. The operation is controlled by an internal microprocessor. Benzene, toluene, ethylbenzene, and xylene components can be individually measured at concentrations as low as 1 ppb. The components and operating characteristics can be varied to measure other types of VOCs.
Pretoddler Inhalable Particulate Environmental Robotic Bioaerosol Sampler
A Pretoddler Inhalable Particulate Environmental Robotic (PIPER) personal sampler to measure indoor particulate matter (PM) and bioaerosols was recently developed (Shalat et al. 2011; Wang et al. 2012). It serves as a surrogate for direct personal monitoring of very young children, for whom wearing conventional personal samplers is not practical. The PIPER sampler can hold up to two personal air-sampling devices and can mimic the speed and pattern of motion and the breathing height of boys and girls in three age groups: 6 months–1 year, 1–2 years, and 2–3 years.
Dual-Chamber Particle Monitor
A small, portable, data-logging particle monitor was recently developed (Litton et al. 2004; Edwards et al. 2006; Chowdhury et al. 2007). The device combines an ionization detector that is sensitive to submicrometer particles with a photoelectric detector that is sensitive to micrometer particles. Because the detection limit is 50 μg/m3, it is not sensitive enough for typical ambient concentrations of particles found in many developed countries. However, the monitor was designed for use in locations with much higher concentrations of particles,
such as households in many developing countries, including kitchens with wood stoves. Simplicity of operation, low cost, battery operation, low weight and small size, and quiet operation are potentially useful features.
University of California, Berkeley Time-Activity Monitoring System
A time-activity monitoring system was recently developed by Allen-Piccolo et al. (2009). It uses small, lightweight ultrasound transmitters worn by participants and ultrasound receivers (locators), attached to data loggers and fixed at indoor locations, to record accurately when and for how long each participant is in each location. The method provides more reliable time-activity information than is typically available in diaries maintained by participants. The time-activity information is used to estimate overall personal exposures on the basis of exposure measurements made with fixed monitors in each microenvironment.
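The microenvironment exposure model described above is a time-weighted average: time logged in each location is combined with the fixed monitor’s concentration there. The sketch below illustrates the calculation with assumed locations and concentrations.

```python
# Minimal sketch of the time-weighted microenvironment exposure model.
# Locations, hours, and concentrations are illustrative assumptions.

def personal_exposure(hours_by_micro, conc_ug_m3):
    """Time-weighted average exposure across microenvironments."""
    total_h = sum(hours_by_micro.values())
    weighted = sum(hours_by_micro[m] * conc_ug_m3[m] for m in hours_by_micro)
    return weighted / total_h

hours = {"kitchen": 3.0, "living room": 5.0, "outdoors": 2.0}
conc = {"kitchen": 120.0, "living room": 40.0, "outdoors": 25.0}

avg = personal_exposure(hours, conc)  # (3*120 + 5*40 + 2*25) / 10 = 61.0 ug/m3
```

The model’s accuracy depends on the time-activity data being reliable, which is exactly what ultrasound logging improves over self-reported diaries.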
Black-Carbon Monitor (microAeth)
This instrument is small, lightweight, portable, and battery-operated and is commercially available (AethLabs 2012). It measures black carbon in real time by using light transmittance through particles collected on a Teflon-coated glass-fiber filter. Its dimensions are 4.6 × 2.6 × 1.5 in., and it weighs 0.62 lb. Flow may be 50, 100, or 150 mL/min. Battery power will operate the instrument for a minimum of 24 hours with a flow of 100 mL/min and a 5-minute time base.
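The attenuation calculation behind aethalometer-type black-carbon monitors can be sketched as follows. The filter-spot area and the mass attenuation cross section (sigma) below are illustrative assumptions; a real instrument applies its own calibration constants and loading corrections.

```python
# Minimal sketch of the filter-attenuation method for black carbon.
# Spot area and sigma are illustrative assumptions, not instrument specs.
import math

def attenuation(i0, i):
    """ATN = 100 * ln(I0 / I) for light transmitted through the filter spot."""
    return 100.0 * math.log(i0 / i)

def bc_concentration(atn_start, atn_end, spot_area_m2, sigma_m2_g,
                     flow_m3_min, minutes):
    """Black carbon (g/m3) from the attenuation increment over one time base.

    dATN/100 = sigma * (mass / area)  =>  mass = area * dATN / (100 * sigma);
    concentration = mass / sampled volume.
    """
    mass_g = spot_area_m2 * (atn_end - atn_start) / (100.0 * sigma_m2_g)
    volume_m3 = flow_m3_min * minutes
    return mass_g / volume_m3

# One 5-minute time base at 100 mL/min (1e-4 m3/min), as cited in the text.
atn0 = attenuation(1.0, 1.0)     # 0.0 at the start of the interval
atn1 = attenuation(1.0, 0.999)   # slight darkening of the spot
bc = bc_concentration(atn0, atn1, 7.1e-6, 12.5, 1e-4, 5.0)
# on the order of 1 ug/m3 for this assumed attenuation change
```

The example shows why short time bases at low flows are noisy: a 0.1% change in transmitted light over 5 minutes corresponds to only about a microgram per cubic meter.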
Personal Multipollutant Sampler
This compact personal and microenvironmental monitoring system, developed by Chang et al. (1999) and Demokritou et al. (2001), allows simultaneous active collection of particles on filters and passive sampling of pollutant gases. Different configurations allow use of selected combinations of Teflon membrane filters to measure PM10 and PM2.5 mass, PM2.5 trace elements with x-ray fluorescence and inductively coupled plasma mass spectrometry, black carbon with reflectance, elemental and organic carbon (on quartz fiber filters) with thermal optical methods, and passive ozone, nitrogen dioxide, and sulfur dioxide with diffusion badges. The sampler can be mounted on a backpack strap on the subject’s chest, and the battery-operated pump is carried in the backpack (it allows 24 hours of continuous sample collection).
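The gravimetric calculation underlying filter-based samplers such as this one is concentration equals filter mass gain divided by sampled air volume. The flow rate and masses below are illustrative assumptions.

```python
# Minimal sketch of the gravimetric mass-concentration calculation for a
# filter-based sampler. Flow and masses are illustrative assumptions.

def mass_concentration_ug_m3(pre_mg, post_mg, flow_l_min, minutes):
    """PM concentration from filter weights and the pump's sampled volume."""
    mass_ug = (post_mg - pre_mg) * 1000.0      # mg -> ug
    volume_m3 = flow_l_min * minutes / 1000.0  # L -> m3
    return mass_ug / volume_m3

# 24 h of continuous sampling at an assumed 4 L/min; filter gains 0.058 mg.
c = mass_concentration_ug_m3(12.000, 12.058, 4.0, 24 * 60)
# sampled volume = 5.76 m3, so c is roughly 10 ug/m3
```

The calculation makes clear why 24-hour collection matters: at personal-sampler flow rates, shorter runs collect too little mass to weigh reliably.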
RTI Micro-miniature Personal Exposure Monitoring
A personal particle monitor recently developed by RTI International, the microPEM Model v3.2 (RTI 2008), is a small, lightweight nephelometer (using light scattering) that provides nearly real-time particle-concentration data (PM10 or PM2.5) with simultaneous collection of particles on a gravimetric filter. With careful filter measurements and 24-hour sampling, the monitor is suitable for particle concentrations as low as 3.6 μg/m3.
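One common use of the co-located gravimetric filter is to rescale the nephelometer’s real-time trace so that its average matches the filter-based mass. The sketch below assumes a single multiplicative bias; the readings and filter value are illustrative, not microPEM data.

```python
# Minimal sketch: gravimetric correction of a light-scattering time series.
# Assumes one multiplicative bias factor; values are illustrative.

def gravimetric_correction(neph_series_ug_m3, filter_avg_ug_m3):
    """Scale the nephelometer series so its mean matches the filter average."""
    neph_mean = sum(neph_series_ug_m3) / len(neph_series_ug_m3)
    k = filter_avg_ug_m3 / neph_mean
    return [k * v for v in neph_series_ug_m3]

neph = [8.0, 12.0, 20.0, 8.0]                    # raw real-time readings
corrected = gravimetric_correction(neph, 15.0)   # filter says 15 ug/m3 average
# mean(neph) = 12.0, so k = 1.25 and the series is scaled up accordingly
```

The correction preserves the temporal pattern, which is the nephelometer’s contribution, while anchoring the absolute level to the filter, which is the gravimetric method’s contribution.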
Future Developments with Nanosensors
Developing ubiquitous monitoring networks for personal exposure assessment will depend on rapid advances in pollution-sensor technologies. Although advances have been made with the technologies described, most of them are still not capable of measuring multiple pollutants continuously. A technology needs assessment conducted by the Oak Ridge National Laboratory determined that there was a need for a rugged, lightweight, low-cost, wearable, real-time sensor capable of multianalyte detection with minimal burden on the person using the monitor (Sanchez et al. 2010). Such a sensor would need to be able to detect acute and subacute chemical agents simultaneously with the same sensing system used in the field and then link the data to a specific biologic event. The device would need to be capable of remote data acquisition, location recording, and measurement of both the level and frequency of the environmental exposure (Sanchez et al. 2010).
There are no miniature wearable sensors that can monitor multiple chemicals in real time. Real-time miniature devices—such as multianalyte single-fiber optical sensors, sol-gel indicator composites, portable acoustic-wave sensors, thin-film resonators, and surface acoustic-wave array detectors—have sufficient sensitivity but poor chemical selectivity in the presence of interfering chemical vapors. Current sensors provide only one indirect signal, for example, frequency variation due to mass loading or a change in refractive index due to molecular adsorption. Selectivity in detection is typically accomplished by immobilizing chemically specific interfaces or biologic receptors on the sensor surface; consequently, a sensor is only as selective as its interface, and selectivity fails when complex mixtures are present (Walt 2005).
Small devices (for example, having roughly the size and appearance of cellular telephones) that could provide some temporal information (such as 5-minute to 1-hour averages) and be downloaded remotely could be attractive alternatives to personal and microenvironmental monitors. The ability to measure multiple agents in a given medium would allow fewer measurements to be taken, minimizing the time for setting up and collecting samples in homes and the number of samplers worn or carried by a single person. A universal platform for measuring compounds of interest, or at least for screening samples for those more likely to have high concentrations, would have a major influence on the cost of assessing exposures and would allow more rapid identification of “highly exposed” people to help identify sources, factors influencing exposures, and means of reducing exposures.
Recent advances in microlithographic technologies and in microfabrication and nanofabrication enable the development of smart sensors and devices that can be mass-manufactured in a cost-effective and modular fashion (Cheng et al. 2006). These advances could be coupled with advances in electronics and computing to develop convenient, sensitive, and cost-effective physical, chemical, and biologic detection devices. Advances in nanoscience and nanotechnology offer an unprecedented opportunity for developing very small integrated sensors. Examples of nanosensor platforms include nanowires, nanoelectromechanical and microelectromechanical systems (NEMS and MEMS), optical resonators, nanoparticles, graphene, and doped quantum dots (Khanna 2012). All of those have unprecedented sensitivity (Shelley 2008). One advantage of nanosensors is that multiple sensor elements can be fabricated or incorporated on the same chip to monitor multiple analytes simultaneously. Nanosensors also allow monitoring of multiple signals—for example, frequency variation due to mass loading and variation in electric, mechanical, thermal, optical, and magnetic properties due to molecular adsorption—and so can provide increased chemical selectivity. These analyte-induced signals can be orthogonal, allowing pattern-recognition algorithms to identify a chemical selectively.
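The array-based identification described above can be sketched as matching a measured response vector, one component per orthogonal signal channel, against a library of fingerprints. The fingerprints and channels below are illustrative assumptions, not real sensor data.

```python
# Minimal sketch of array-based chemical identification from orthogonal
# signal channels (e.g., mass loading, optical shift, thermal response).
# The fingerprint library is an illustrative assumption.
import math

LIBRARY = {
    "benzene": (0.9, 0.2, 0.1),
    "toluene": (0.7, 0.6, 0.2),
    "xylene":  (0.3, 0.8, 0.7),
}

def identify(response):
    """Return the library chemical whose fingerprint is nearest (Euclidean)."""
    def dist(fp):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(response, fp)))
    return min(LIBRARY, key=lambda name: dist(LIBRARY[name]))

chem = identify((0.68, 0.55, 0.25))  # nearest to the toluene fingerprint
```

The value of orthogonal channels is visible here: with a single channel the three fingerprints could overlap, but in three dimensions they separate, which is what makes pattern recognition reliable for single analytes (and why, as discussed below, it degrades for mixtures, whose responses are sums of fingerprints).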
The most important and desirable characteristics of a sensor include high selectivity, high sensitivity, real-time detection, broad dynamic range, ability to regenerate in a short time, miniature size, ability to detect multiple analytes simultaneously, low power consumption, and low cost. For nanosensors that are required for field deployment, wireless transmission of signals to another location is important.
Although nanosensors have demonstrated extremely high sensitivity, their selectivity poses a challenge. Most of the chemical sensing of small molecules in the vapor phase is carried out indirectly with physical sensors modified with chemical interfaces or biologic receptors. The receptors or chemical interfaces, immobilized on physical sensors (nanosensors included), adsorb particular analyte molecules with higher affinity than other molecules that impinge on the receptor surface. Examples of the receptors or chemical interfaces include self-assembled monolayers, polymer films, and biomolecules.
Research using microsensors has demonstrated that small molecules can be detected with high sensitivity and specificity in the absence of other interfering molecules. The microsensor surface is immobilized with chemically selective interfaces for molecular recognition, and a pattern-recognition algorithm using responses from an array can identify the chemical with certainty. However, the confidence level of pattern recognition decreases with binary mixtures and fails for mixtures of three or more constituents (Hsieh and Zellers 2004; Jin and Zellers 2008; Senesac and Thundat 2008). The failure is related to the weak molecular interactions between analyte and receptor. Selectivity between analyte and receptor will remain a formidable challenge as long as highly selective coatings that can provide a unique signal in an array format are lacking. Development of chemical interfaces or receptors that can provide unique signals is important for the advancement of personal monitors.
However, the ability of microsensors and nanosensors to produce signals that are independent of adsorption chemistry potentially leads to enhanced chemical selectivity. For example, a chemical sensor that uses a bimaterial cantilever to detect thermal changes due to infrared excitation of adsorbed molecules (photothermal deflection spectroscopy) has been shown to detect a monolayer of adsorbed molecules (Krause et al. 2008). Infrared absorption peaks are unique to specific molecules, so multiple analytes can be detected with pattern-recognition algorithms. Nanocantilever sensors with unprecedented sensitivity have been demonstrated (Li et al. 2007). Combining the nanocantilever with photothermal deflection spectroscopy could provide high selectivity and sensitivity. However, miniature tunable mid-infrared light sources, such as quantum cascade lasers, that can be incorporated into nanocantilever systems are still under development (Capasso 2010).
Incorporating nanocantilevers with high-performance GC also has the potential to increase chemical selectivity. For example, combining a cantilever’s measurement of adsorption energy due to molecular binding (cantilever bending) with its measurement of adsorbed mass (resonance frequency) and with high-performance GC can enhance chemical selectivity.
The committee envisions that future nanosensors, in which each sensor element provides an orthogonal signal, will be able to detect multiple analytes with an array-based concept. In addition, different modes of operation of nanosensors (for example, variations in mass, stress, and optical, electric, and thermal properties induced by the analytes) could be incorporated to improve selectivity without increasing the power demand. The signals would be analyzed with pattern-recognition algorithms for chemical identification. The nanosensor platform also could be equipped with wireless telemetry for data transmission.
Many challenges need to be addressed to produce field-deployable nanosensors that can detect multiple chemicals, with selectivity, sensitivity, and site location, in a continuous fashion for weeks. Selectivity may be the most challenging task. Current nanosensors have wide variability in their response because of uncertainties in the fabrication process and variability in the immobilization of functional groups (Patel et al. 2003; Senesac and Thundat 2008). In addition, data-quality issues, including maintenance, calibration, and reference checks, will be important.
Because nanosensors are point sensors, they require the analyte molecules to adsorb onto the sensor surface to produce a signal. Therefore, collecting the analyte molecules through exposure of the sensor element is important. In general, adsorption by diffusion, which eliminates the need for a pump, will reduce the size and power requirements of the sensors.
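Why diffusion can replace a pump is worth making concrete. For a conventional diffusive sampler, Fick’s first law gives a fixed effective sampling rate SR = D·A/L, so a time-averaged concentration follows from the collected mass alone. The geometry and diffusivity below are illustrative assumptions for a benzene-like analyte, not specifications of any device in this chapter.

```python
# Minimal sketch of diffusion-controlled (pump-free) sampling via Fick's law.
# Geometry and diffusivity are illustrative assumptions.

def sampling_rate_m3_s(diffusivity_m2_s, area_m2, path_m):
    """Effective volumetric uptake rate SR = D * A / L of a diffusive sampler."""
    return diffusivity_m2_s * area_m2 / path_m

def concentration_ug_m3(mass_ug, sr_m3_s, seconds):
    """Time-averaged concentration from mass collected over the period."""
    return mass_ug / (sr_m3_s * seconds)

# Assumed benzene-like analyte: D ~ 8.8e-6 m2/s, 10-mm spot, 1-cm path
sr = sampling_rate_m3_s(8.8e-6, 7.9e-5, 0.01)  # roughly 7e-8 m3/s (~4 mL/min)
week = 7 * 24 * 3600
c = concentration_ug_m3(0.84, sr, week)        # ~20 ug/m3 over one week
```

The tiny effective flow (milliliters per minute) is exactly why the text notes that passive samplers need long sampling periods at nonoccupational concentrations.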
In sum, advances in nanotechnology and nanosensors offer sensor and sample collector platforms that could be used for developing miniaturized, low-power personal monitors for quantitative measurement of particles and other pollutants in real time with specificity. Sample collection and separation could be accomplished with electrostatic methods, and detection could be achieved with NEMS and MEMS mass sensors. It will be possible to incorporate nanosensors that can detect the activity level of the wearer (using breathing rate and heart rate). Present technology allows inclusion of GPS for instantaneous location determination.
Sensors for Ecosystem Exposure Assessment
Scientific and technologic advances in exposure assessment in nonhuman species have been driven largely by environmental laws and policies administered by various state and federal agencies, but these advances have typically been enabled by advances in exposure and health assessments for humans.
Three recommendations from the Environmental Protection Agency (EPA) Science Advisory Board Workshop on Ecological Risk Assessment related to exposure assessment were to increase resolution and decrease uncertainties in spatial, temporal, and individual-level exposure assessments (Dale et al. 2008). The challenge for all three recommendations is that the size, scale, and duration of traditional approaches to exposure assessment needed to accomplish the improvements will be cost-prohibitive. However, recent advances in electronic miniaturization and data management will allow the development of environmental sensor networks that can provide long-term, real-time exposure monitoring data on many scales. The development of both fixed and mobile sensor arrays has become more common, and many systems are beginning to be deployed. For example, the National Ecological Observatory Network (NEON) will provide long-term ecosystem monitoring on a continental scale (Keller et al. 2008), and the Global Lakes Ecological Observatory Network (GLEON) aims to use lakes as sentinels of global climate change with existing and newly developed arrays of sensors deployed in lakes worldwide (Kratz et al. 2006; Williamson et al. 2009).
Multiscale systems could be deployed to conduct low-resolution monitoring to detect areas of interest, where higher-resolution or higher-density sensor arrays would then be deployed to increase the resolution and frequency of measurements (for example, Rundel et al. 2009). Driven largely by the needs of national security (for example, monitoring drinking water or air quality) and of programs like NEON and GLEON, the development of new sensors is rapid. That is especially true of sensors based on molecular or biochemical reactions (biosensors), which show great promise for monitoring the exposure of ecologic resources to stressors. In addition, the European Global Monitoring for Environment and Security program will soon begin to launch missions to deploy high-resolution, multispectral sensors that will be used to monitor environmental stressors from the local to the global scale (Aschbacher and Milagro-Pérez
2012). As improvements are made in data management and storage infrastructure on the Internet, sensor networks will play a key role in advancing the science of exposure assessment in the environment.
Although these approaches are promising, validation of ubiquitous and participatory sensing networks will remain a key issue. If exposure assessments are to be conducted with sensing networks and nanosensors, greater assurance of measurement accuracy and precision will be needed. Such efforts invariably will include laboratory and field testing against gold standard instruments.
Ubiquitous networks, including embedded and participatory networks for humans and ecosystems, present challenges and research needs. The field is in its infancy, and much work remains to be done, including the following:
• Develop sensor technologies that can be scaled up to large mass markets with little additional cost, and develop the software to process the information on the sensor platform (for example, cellular telephones or radio sensors) or to send it wirelessly.
• Validate ubiquitous sensors against gold standard instruments in human populations.
• Assess measurement error in cellular telephones or radio sensors in comparison with high-caliber instruments by using either laboratory or field analyses.
• Develop robust sensors and sensor platforms that function with little maintenance and can tolerate harsh environmental conditions over long periods in remote locations.
• Develop the database infrastructure to store, maintain, and protect information.
• Create the analytic methods needed to understand patterns, detect outliers, and ultimately address the potentially enormous quantities of observations that arrive in streams or in real time, all of which will require considerable investment in methodologic research.
• Continue to assess the optimal way of obtaining user participation.
• Address concerns about individual privacy protection so that users understand any potential risks to their privacy and identity.
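One ingredient of the streaming analytics called for above can be sketched with an online mean-and-variance recursion (Welford’s algorithm) that flags observations far from the running mean without storing the stream. The threshold and data are illustrative assumptions.

```python
# Minimal sketch of streaming outlier detection using Welford's online
# mean/variance algorithm. Threshold k and the data are illustrative.
import math

class StreamingOutlierDetector:
    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def update(self, x):
        """Flag x if it lies more than k running standard deviations from the
        running mean, then fold x into the statistics."""
        outlier = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            outlier = std > 0 and abs(x - self.mean) > self.k * std
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return outlier

det = StreamingOutlierDetector(k=3.0)
flags = [det.update(v) for v in [10, 11, 9, 10, 12, 11, 95]]
# only the final spike is flagged
```

Constant memory per sensor stream is the relevant property: the same recursion runs unchanged whether a network delivers hundreds or millions of observations.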
Exposure science is poised to move from collection of external exposure information on a small number of stressors, locations, times, and individuals to a
more systematic assemblage of internal exposures of individuals in entire populations and multiple elements of the ecosystem to multiple stressors (Wild 2005; Dale et al. 2008). By providing detailed exposure information to complement the rapidly evolving individualized biomedical profile based on gene and gene-expression data (Cohen et al. 2011; Auffray et al. 2012; Chen et al. 2012), global-exposure surveillance has the potential to become a valuable component of routine health care (see illustration in Box 5-6). Advances in ecologic genomic techniques and bioinformatics provide a basis for conducting routine exposure assessments of broad arrays of stressors in wildlife. The use of those techniques in regulatory ecotoxicology and ecologic risk assessment has limitations (Ankley et al. 2008), but Ankley et al. (2008) state that “Ongoing research will, in the long term, serve to obviate limitations related to the global identification of gene products, proteins, and metabolites in test species relevant to ecological risk assessments.” Global approaches, the use of genomewide assessments of indicators of exposure to a wide variety of stressors, will continue to complement traditional targeted measures of internal exposure.
Measures of Internal Exposure
For both ecologic and human health risk assessment, internal measures of exposures to stressors are closer to the target site of action for biologic effects than external measures, and this potentially reduces confounding and improves the correlation of exposures with biologic effects. However, use of internal measures of exposure comes with a cost: the variability in the relationship between sources of stressors and effects is greater than that with the use of external measures of exposure. Nevertheless, the committee considers it important to advance measurement of internal exposures as an element of the vision for exposure science.
Analytic methods now enable both detection of much lower internal concentrations of stressors and measurement of multiple stressors in single samples. The global measurement of thousands of small organic molecules (Nicholson and Lindon 2008) in biologic samples—metabolomics—applied initially in biomedical fields, is now being applied to biomonitoring of chemicals in human and wildlife populations (Ankley et al. 2008; Stahl et al. 2010; Villeneuve and Garcia-Reyero 2011; Soltow et al. in press). Such global approaches have the distinct advantage of not being limited to a chemical or class of chemicals selected in advance and provide a broader, agnostic assessment that can identify exposures, potentially improving surveillance and elucidating emerging contaminants. Approaches specific to chemical classes are becoming equally powerful; for example, the simultaneous determination of 50–77 polychlorinated biphenyls (PCBs) (Korrick et al. 2000; Bloom et al. 2009) and of 28 polychlorinated dibenzodioxin, polychlorinated dibenzofuran, and dioxin-like PCB congeners (Todaka et al. 2010) is now routine, as is measurement of the steroidome—69 steroid hormones in human blood (Hill et al. 2010). Proteomics and adductomics expand
the types of internal measures of exposure that can be analyzed to include compounds with short half-lives in the blood, including the critical class of reactive electrophiles, for example, oxidants in cigarette smoke (Tang et al. 2010; Jin et al. 2011) and acrylamide, glycidamide, and styrene oxide (Fustinoni et al. 2008; Feng and Lu 2011). Whether the biologic fluid is blood, urine, or saliva, rapidly evolving sensor platforms linked to physiologically based pharmacokinetic (PBPK) models are expected to enable field measurements in humans and in ecosystems and rapid interpretation of concentrations of chemicals in these biofluids in the context of internal exposure (Timchalk et al. 2007). However, inferring the sources and routes of these internal exposures remains a research challenge (Tan et al. 2007).
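The link between a measured biofluid concentration and internal exposure can be illustrated with the simplest possible pharmacokinetic model, a one-compartment model with first-order elimination, which is far simpler than the PBPK models cited above. All parameter values below are illustrative assumptions.

```python
# Minimal sketch: back-calculating an absorbed dose from a biomarker
# concentration with a one-compartment, first-order model.
# C(t) = (D / Vd) * exp(-k * t), so D = C * Vd * exp(k * t).
# Vd, half-life, and the measurement are illustrative assumptions.
import math

def dose_from_blood(conc_mg_l, vd_l, half_life_h, hours_since_exposure):
    """Absorbed dose (mg) implied by a blood concentration measured at t."""
    k = math.log(2) / half_life_h          # first-order elimination rate
    return conc_mg_l * vd_l * math.exp(k * hours_since_exposure)

# 0.02 mg/L measured 6 h after exposure; assumed Vd = 40 L, half-life = 3 h.
d = dose_from_blood(0.02, 40.0, 3.0, 6.0)  # 0.02 * 40 * 4 = 3.2 mg
```

The example also shows why inferring sources and routes remains hard, as noted above: the back-calculated dose is exponentially sensitive to the assumed time since exposure, which field measurements rarely pin down.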
Biosignatures of Exposure
An alternative to global surveillance of internal exposure to specific stressors is the use of biosignatures that reflect the net biologic effect of internal exposure to stressors that act on a specific biologic pathway. For example, oxidative modifications of DNA or protein (Zhang et al. 2010) can be used to represent the net internal exposure to oxidants and antioxidants, the presence of liver enzymes in blood may reflect internal exposure to liver toxicants (Shi et al. 2010), induction of the cytochrome P4501A gene pathway reflects exposure to planar aromatic hydrocarbons in humans and wildlife (Nebert et al. 2004), and the presence of the egg-yolk protein precursor vitellogenin in juvenile or male fish can be used to assess exposure to estrogenic compounds in the laboratory (for example, Scholz and Mayer 2008) or in the field (for example, Kidd et al. 2007; Sanchez et al. 2011).
Applications of combined exposure-disease -omics methods in personalized medicine provide a conceptual framework for the use of exposure surveillance in health management. For example, if a cost-effective global analysis were used in routine health care, children exposed to second-hand smoke could be readily identified by detection of cotinine. An important transition might occur if such a test were extended to detect a broad array of exposures simultaneously, such as contaminated well water or household pesticides, by using laboratory-on-a-chip or chemical-profiling methods. Increasing use of -omics methods in personalized medicine means that integration of exposure science with personalized medicine could allow the cost of a top-down approach to exposure surveillance to be borne largely by the health-care system, with systematic acquisition of information on exposures in routine health care at little additional cost. Such analyses would require that physicians have considerable expertise to interpret and to communicate the health significance of the exposure information to patients.
Using biosignatures has several advantages. Biosignatures can overcome the analytic and informational challenges of identifying all stressors of a common biologic pathway and of understanding their individual and summed potencies. The identities of the stressors do not need to be known in advance; rather, their presence is inferred from disruption of a specific biologic pathway. The close connection between the exposure measure and the adverse effect or disease process allows better exposure-disease correlations. A major challenge of these approaches is that using complicated, dynamic biologic systems to measure internal exposure will require overcoming substantial variability in response and over time, deconvolving biologic processes into influences from external and internal stressor-related processes, and accepting the relative inability to target reduction of any specific compound or source on the basis of biosignatures.
Biochemical Modifiers of Internal Exposure
Absorption, distribution, metabolism, and elimination, which together determine the relationship between external and internal exposures, are themselves the results of biochemical and physical processes. The state of those processes, as they relate to a stressor of concern, has a substantial effect on the internal exposure that receptors experience. Knowledge of the processes is needed to characterize or predict internal exposures in humans and environmental systems. The committee envisions the extension of traditional biomonitoring to include monitoring of the processes at the individual level and the population level through emerging technologies.
Transcriptomics, proteomics, and to a smaller extent metabolomics offer the ability to measure the status of key biologic processes that affect the pharmacokinetics of chemical stressors across time, species, and populations. That information can be used qualitatively to identify populations expected to have greater internal exposures for a given external exposure (for example, because of differences in metabolism or higher absorption) or quantitatively through inclusion in PBPK-pharmacodynamic (PBPK-PD) models that are used for exposure assessment and prediction of doses. PBPK models have been widely used in risk assessment to predict the dose to a target tissue from external exposures (Clewell and Andersen 1987; Teeguarden et al. 2005; Clewell and Clewell 2008) and to address the effects of individual and population variability. For example, PBPK-PD models describe the induction of Phase I and Phase II metabolism in the liver and other tissues to account for the effects of increased metabolism from chronic exposures on internal exposure to one or more compounds (Sarangapani et al. 2002; Emond et al. 2006). Although the Bayesian, Monte Carlo, and other computational approaches for applying PBPK models to population-level exposure assessments are well developed, the limited availability of population-level data on variability in external exposures and on individual genetic variation hinders consistent application to populations. The committee’s vision for exposure science is anticipated to help motivate the generation of the data needed to support wider application of these computational approaches.
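The kind of calculation that PBPK-PD models embed can be illustrated with a deliberately minimal sketch: a hypothetical one-compartment model with first-order absorption and elimination, followed by Monte Carlo sampling of clearance to represent population variability in metabolism. All parameter values here are invented for illustration and do not come from any published model.

```python
import numpy as np

# Hypothetical one-compartment model with first-order absorption/elimination:
#   dA_gut/dt = -ka * A_gut ;   V * dC/dt = ka * A_gut - CL * C
ka = 1.0      # absorption rate constant (1/h), illustrative
CL = 5.0      # clearance (L/h), illustrative
V = 40.0      # volume of distribution (L), illustrative
dose = 100.0  # single oral dose (mg), complete absorption assumed

ke = CL / V   # elimination rate constant (1/h)
t = np.linspace(0.0, 24.0, 241)

# Closed-form plasma-concentration curve after a single oral dose
C = dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))
cmax = C.max()          # peak internal exposure (mg/L)
tmax = t[C.argmax()]    # time of peak (h)

# Population variability: sampling clearance (e.g., metabolic differences)
# spreads the internal-dose metric (AUC = dose / CL) across individuals
CLs = np.random.default_rng(1).lognormal(np.log(5.0), 0.3, 10_000)
auc = dose / CLs
```

Bayesian or Monte Carlo application of full PBPK models follows the same pattern, but with many coupled tissue compartments and with parameter distributions informed by population data.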
Ecologic Exposure Assessment
Advances in molecular biology and a desire to provide high-throughput assessment of exposures of organisms in the natural environment have led to the development of numerous tools that provide surrogate or direct measures of exposure to contaminants (Table 5-1). The expansion of biochemical and molecular measures has been particularly rapid. However, the use of molecular techniques as biomarkers to assess ecologic exposure to contaminants is limited in that most of these techniques cannot be linked quantitatively to the level of exposure and are not highly selective. Most biomarkers can provide information on whether exposure to general classes or groups of stressors has occurred but, given the current state of the science, cannot be related causally to susceptibility or to risk of developing disease in natural populations (Stahl et al. 2010). Although that qualitative information has been useful in directing the need for higher-resolution studies and focused analytic-chemistry measures (for example, Roberts et al. 2005; Smith et al. 2007), it does not in isolation provide exposure-assessment information that can be useful in ecologic risk assessment.
Monitoring of ecologic resources for exposure to stressors can play many roles, including determination of stressor presence, stressor type, and stressor level. With few exceptions (for example, analytic chemistry), however, monitoring techniques do not provide the quantitative information needed for use in ecologic risk assessments.
• There is a need to develop rapid-response, quantitative exposure-assessment tools that can provide information useful for exposure assessment in ecologic risk assessments.
• The use of molecular techniques as biomarkers to assess ecologic exposures to contaminants has been the subject of much attention and development in recent years. However, the use of these techniques is limited in that most cannot be linked quantitatively to the level of exposure or to the level producing an adverse outcome. Efforts need to be made to make biomarkers more quantitative.
• Linking quantitative analytic chemistry and biochemical adverse-outcome pathways with biomonitoring tools will be essential to move ecologic exposure science into the 21st century.
| Method | Stressor Specificity: (H)igh, (M)edium, (L)ow | (Quant)itative, (Semiquant)itative, (Qual)itative | (D)irect or (I)ndirect Measure of Stressor | Level of Development, Acceptability: (H)igh, (M)edium, (L)ow | Relative Speed of Analysis | Clear Link to Ecologic Risk-Assessment Goals | Example |
|---|---|---|---|---|---|---|---|
| Analytic chemistry | H | Quant | D | H | Slow | Yes | Genualdi et al. 2011 |
| Biomimetic devices | H | Semiquant | D | M–H | Slow | Yes | Esteve-Turrillas et al. 2007 |
| Whole-organism biomonitoring | H | Quant | D | H | Slow | Yes | Finkelstein et al. 2007 |
| Integrated stressor assessments | M | Qual | I | M–H | Slow | Yes | Cvetkovic and Chow-Fraser 2011 |
| Sensor networks | H | Quant | D | L | Fast | No | Pastorello et al. 2011 |
| Biomarkers | M | Qual | I | M | Fast | No | Roberts et al. 2005 |
| -Omics techniques | M | Qual | I | M | Fast | No | Klaper et al. 2010 |
| Remote sensing | L | Qual | I | M | Medium | No | Yuan et al. 2010; Whitehead et al. 2011 |
| Participatory assessment | L | Qual | I | L | Medium | No | Kolok et al. 2011 |

ᵃThese methods range from direct analytic-chemistry measurements of specific chemicals to the use of remote-sensing technologies (for example, satellite imagery). Each method has advantages and disadvantages related to stressor specificity, quantitation of exposure, connectivity of a measurement to a specific stressor, level of scientific development and acceptability, speed (and hence cost) of a measurement, and the ability to link a measurement to scientific or regulatory goals of ecologic risk assessment.
Models of the processes, dynamics, and distribution of exposures to chemical, physical, and biologic agents in association with other stressors are an essential element of exposure science. The ability of models to provide a repository for exposure knowledge, to aid in interpreting data and observations, and to provide tools for predicting trends has been and will continue to be a cornerstone of exposure science. Here we consider the role that exposure models need to play in supporting exposure science in the 21st century.
Types of Models in Exposure Science
The types of models used in exposure science vary widely. Activity-based models track the history of individuals and populations through multiple environments using activity information, whereas process-based models track the movement of chemicals and other stressors from a source to a target (receptor).
The Stochastic Human Exposure and Dose Simulation (SHEDS) model is a widely used activity-based model (EPA 2012a). It simulates individual exposure patterns by using the Consolidated Human Activity Database (CHAD), simulating a person’s contact with environmental concentrations probabilistically, estimating the person’s exposure-time profile for multiple pathways, and applying Monte Carlo sampling to simulate population exposure (Zartarian et al. 2000). SHEDS estimates exposures of specific subpopulations (for example, children) on a national scale. SHEDS is among a class of models developed to carry out rapid exposure characterization of numerous stressors, often in the context of policy analyses in which alternative controls are being examined. In such counterfactual situations, where screening assessments are needed, exposures must be modeled rather than measured.
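The probabilistic, activity-based logic of SHEDS-type models can be caricatured in a few lines. The microenvironments, concentration distributions, and time budgets below are invented for illustration; they are not drawn from CHAD or from SHEDS itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n_people = 10_000  # simulated population

# Illustrative pollutant concentrations (µg/m³) in three microenvironments,
# varying from person to person
c_home = rng.lognormal(np.log(8.0), 0.5, n_people)
c_work = rng.lognormal(np.log(15.0), 0.6, n_people)
c_transit = rng.lognormal(np.log(30.0), 0.7, n_people)

# Illustrative daily time budgets (hours); the remainder of the day is "work"
t_home = rng.uniform(12.0, 18.0, n_people)
t_transit = rng.uniform(0.5, 2.5, n_people)
t_work = 24.0 - t_home - t_transit

# Time-weighted-average exposure for each simulated person
twa = (c_home * t_home + c_work * t_work + c_transit * t_transit) / 24.0

# Population summaries of the simulated exposure distribution
median, p95 = np.percentile(twa, [50, 95])
```

The real model draws activity diaries, concentration inputs, and behavior from measured data rather than these invented distributions, but the Monte Carlo structure is the same.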
There are different types of process-based models. For example, source-to-dose models (that is, models that link environmental fate and transport, exposures, and pharmacokinetics) relate exposures, both conceptually and mathematically, at any biologic level to exposures at any other level or to dose (McKone et al. 2007; Georgopoulos et al. 2009). The challenge is to be able to model from dose to source or source to dose, by using newly developed internal and external markers of exposures (see Figure 5-1).
In addition, there are process-based models (or mass-balance models) that have been developed for screening potential exposures of ecosystems and humans at local and global scales. The development of these models has been motivated in part by the need for more accurate characterization of chemicals transported over regional, continental, and global scales, with a focus on the effects on both humans and ecologic receptors (MacLeod et al. 2010). Such motivations have stemmed from widespread observations of organic pollutant compounds in vegetation, soil, animals, and human tissues. These process-based models use the principles of mass balance and chemical thermodynamics to track the fate of chemical, physical, and biologic stressors in indoor environments and on regional and global scales. Box 5-7 examines the use of large-scale process models to assess human and ecologic exposure potential with regard to long-range transport and persistence.
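The mass-balance principle behind such models can be sketched with a hypothetical two-compartment "unit world" (air and water); the rate constants and emission rates are invented for illustration.

```python
import numpy as np

# Illustrative first-order rate constants (1/day)
k_loss_air = 0.05    # degradation + advection out of the air compartment
k_loss_water = 0.01  # degradation + advection out of the water compartment
k_aw = 0.02          # air -> water transfer (e.g., deposition)
k_wa = 0.005         # water -> air transfer (e.g., volatilization)
E_air, E_water = 100.0, 20.0  # emission rates (kg/day)

# At steady state, (losses + outgoing transfer) - incoming transfer = emissions
A = np.array([[k_loss_air + k_aw, -k_wa],
              [-k_aw, k_loss_water + k_wa]])
m_air, m_water = np.linalg.solve(A, [E_air, E_water])  # inventories (kg)

# Overall persistence: total steady-state inventory / total emission rate
persistence_days = (m_air + m_water) / (E_air + E_water)
```

Published multimedia models add soil, sediment, and biota compartments and derive the transfer coefficients from chemical properties, but the linear-algebra core is the same.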
Some process-based models have high spatial resolution; others focus on regional mass-balance methods. Some are deterministic and attempt to capture a small number of representative scenarios; others are stochastic and probabilistic and attempt to capture the uncertainty and variability of model inputs. There is a growing need for structure-activity models that can classify chemicals with regard to exposure and health-effects potential.
The use of computational exposure models within the regulatory decision process at the EPA and other regulatory agencies continues to grow (NRC 2007). According to NRC (2007):
“This growth is in response to greater demands for quantitative assessment of regulatory activities, including analysis of how well environmental regulatory activities fulfill their objectives and at what cost. Models are essential for estimating a variety of relevant characteristics—including pollutant emissions, ambient conditions, and dose—when direct observation would be inaccessible, infeasible, or unethical.”
Predictive exposure models have become particularly important in risk, life-cycle, and sustainability assessments, where there is a need for rapid exposure assessment (NRC 2007, 2009). One concern that arises in using exposure models is the reliability of the exposure estimates. This has resulted in a demand for a more formal treatment of uncertainty in exposure models using qualitative methods and quantitative methods such as Monte Carlo analysis (IPCS 2008; NRC 2009). Although widely endorsed, the use of Monte Carlo methods in exposure modeling remains constrained because of insufficient information on many parameters affecting exposure.
The goals of exposure models include explaining observations, guiding data collection, identifying new questions, bounding outcomes within plausible ranges, predicting exposures of individuals for epidemiologic studies, and illuminating key uncertainties and sensitivities. As discussed earlier, models may also describe internal exposures. Models will continue to provide training, educate the public, and link exposure results more closely to risk assessment and risk-management decisions.
Future of Modeling Methods and Technologies
To support the emerging technologies, the committee focused on several modeling approaches that will be needed to extract useful information from the masses of data that will be generated.
Global-scale mass-balance and exposure models have supported development of the concepts of persistence and long-range transport, key hazard indicators used in chemical assessment (Fenner et al. 2005). The multimedia mass-balance models initially focused on a “unit” or “evaluative” world approach in which mass balance is applied to an archetypal world made up of soil, water, sediments, and biota (MacLeod et al. 2010). Those types of evaluative models were instrumental in elucidating the mechanisms of bioconcentration and biomagnification (McKone and MacLeod 2003). Global-scale multimedia models illustrate the potential for some persistent pollutants to migrate to and accumulate in polar regions (Wania and Mackay 1993; Scheringer et al. 2000; Wania and Su 2004). Multimedia mass-balance models provide a basis for quantifying cumulative, multipathway exposure to pollutants that originate in contaminated air, water, and soil (McKone and MacLeod 2003). For example, they have been used to characterize the relative importance of local and distant sources of contaminants for human and ecosystem exposures (MacLeod et al. 2002), to identify environmental and chemical processes that control concentrations of pollutants (Scheringer et al. 2004), and to motivate empirical and theoretical studies to improve knowledge of the physicochemical properties and degradability of commercially important chemicals and byproducts of industry and energy production (Jaworska et al. 2003; Schenker et al. 2005).
More detailed global mass-balance models have been introduced in which detailed spatial differentiation is obtained by linking unit world models with flows of air and water estimated on the basis of long-term averages (MacLeod et al. 2010). Wania and Mackay (1999) argued that the unit world structure and generally low spatial and temporal resolution of multimedia models are appropriate for describing the behavior of persistent contaminants in the environment. However, limitations of that approach have been identified by developers of contaminant fate and transport models. Those models describe the temporal and spatial variability of the circulation of the atmosphere and oceans with high resolution (Lammel 2004). Such detailed chemistry-transport models are increasingly being developed or adapted to describe concentrations and transport pathways of trace environmental contaminants. They have been used to address aspects of chemical pollution that cannot be readily addressed by models that are based on the unit world approach, such as how persistence and long-range transport of substances depend on the specific time and location of release, and how episodic transport events can deliver pollutants to remote ecosystems (MacLeod et al. 2010).
Model Performance Evaluation
Key to the future of exposure models is how they incorporate the ever-increasing amounts of observations of natural and human processes and environmental effects. Vast new measurement programs in fields as diverse as genomics and earth-observation systems, ranging from the nanoscale to global dimensions, present important opportunities and challenges for modeling. Although observations alone can influence policy, it is their analysis with models that will allow the full realization of their importance (NRC 2007).
The interdependence of models and measurements is complex and iterative (NRC 2007; EPA 2009). Historically, the cycle of models and observations tended to begin with observations used to build a model, then a second set of observations to calibrate the model, and a third set to “validate” or evaluate the model performance. But in place of validation or even calibration with observations, an alternative approach is to use models and observations (such as biomarker data and environmental samples) as independent tools to evaluate hypotheses about source—receptor relationships (see McKone et al. 2007).
Spatiotemporal models play a central role in exposure assessment for epidemiologic studies and risk analysis by combining information on sources of exposure with measurements to arrive at predictions of exposures. In some cases, cost and feasibility may limit direct measurements to a modest number of locations for specified periods; in other cases, remote-sensing technologies may yield an enormous quantity of data that need to be summarized.
An important element of any exposure assessment is quantifying the degree of uncertainty about the exposure estimates. It is important to distinguish two fundamentally different sources of uncertainty: statistical variability in the finite set of measurements available on which to build the model and their inherent measurement errors, and misspecification of the form of the statistical model used—both in prediction of exposure and in the sampling and measurement error distributions.
The first of those relies on statistical inference. Well-established techniques are available to estimate the standard errors of parameter estimates and model predictions. The measurement-error literature distinguishes two principal classes of measurement-error models on the basis of the relationship between a “true exposure” and a “measured exposure”. In both, the health outcome is assumed to depend only on the true exposure, not on its measurement; the relationship between the outcome and the true exposure is commonly known as the health-effects model, and the population distribution of true exposure (possibly in relation to other predictors, such as traffic density, land use, and other exposure sources) is known as the exposure model (Richardson and Ciampi 2003). The models differ in the form of the “measurement-error model”. In the “classical error model”, the measured value is assumed to be distributed randomly around the true value, with deviations caused by instrument or model error. In the “Berkson error model”, individual exposures are assumed to be distributed around some applied exposure of a group (for example, ambient pollution). The classical error model is appropriate for situations in which personal measurements that are subject to some error are used; it tends to induce an attenuation bias toward the null in the “naïve” regression of outcome on measured exposure. The Berkson error model is appropriate for situations in which individuals within a group differ because of unmeasured factors (for example, time-activity patterns or household characteristics); it might not produce any bias but will tend to inflate the variance of the estimated regression coefficient. In many instances, a combination of components with characteristics of classical and Berkson error may affect the estimated coefficients.
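The contrasting consequences of the two error structures can be verified with a small simulation; all distributions and variances here are illustrative. Under classical error, the expected attenuation factor is var(x) / [var(x) + var(error)], which equals 0.5 with the values below.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta = 1.0  # true slope of the health-effects model

# Classical error: the measurement scatters around the true exposure
x = rng.normal(10.0, 2.0, n)                # true personal exposure
y = beta * x + rng.normal(0.0, 1.0, n)      # health outcome
w_classical = x + rng.normal(0.0, 2.0, n)   # noisy personal measurement

# Berkson error: true exposure scatters around an assigned group value
w_berkson = rng.normal(10.0, 2.0, n)        # assigned (e.g., ambient) exposure
x_b = w_berkson + rng.normal(0.0, 2.0, n)   # individual true exposures
y_b = beta * x_b + rng.normal(0.0, 1.0, n)

# Naive OLS slope of outcome on the measured/assigned exposure
s_classical = np.polyfit(w_classical, y, 1)[0]  # attenuated toward 0 (~0.5)
s_berkson = np.polyfit(w_berkson, y_b, 1)[0]    # essentially unbiased (~1.0)
```

The Berkson slope is unbiased here, but the residual scatter around the assigned exposure inflates the variance of the estimate, as the text notes.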
The second type of uncertainty, model misspecification, is more difficult to address (Leamer 1978) because the true form of the relationships being modeled can never be known; the most one can hope to do is to investigate the robustness of the predictions over a reasonable set of plausible alternative models. This is generally known as sensitivity analysis. More formal alternatives, such as model averaging (Hoeting et al. 1999; Hjort and Claeskens 2003), are also available. These essentially average a wide variety of models, each weighted according to its goodness of fit, and combine the statistical variability within each model with the differences between models to provide a more “honest” assessment of overall uncertainty than simply quoting the standard error of some “best” model.
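One common realization of model averaging uses Akaike weights. The AIC values, predictions, and standard errors below are hypothetical, and the combined-variance formula shown is one standard variant that adds the weighted between-model spread to the within-model variances.

```python
import numpy as np

def aic_weights(aics):
    """Akaike weights: relative support for each model, summing to 1."""
    d = np.asarray(aics, dtype=float)
    d = d - d.min()
    w = np.exp(-0.5 * d)
    return w / w.sum()

# Hypothetical fits of three candidate exposure models
aics = [210.3, 212.1, 215.8]          # goodness of fit (lower is better)
preds = np.array([12.0, 14.5, 9.8])   # each model's exposure prediction
sds = np.array([1.2, 1.5, 2.0])       # within-model standard errors

w = aic_weights(aics)
avg = float(np.sum(w * preds))        # model-averaged prediction

# Combined uncertainty: within-model variance plus between-model spread
var = float(np.sum(w * (sds**2 + (preds - avg) ** 2)))
```

Because the between-model term is included, the combined variance exceeds what any single model's standard error alone would suggest whenever the models disagree.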
Models that rely heavily on expert judgment, such as dose-reconstruction models used in environmental radiation-exposure studies—for example, the Hanford Environmental Dose Reconstruction Project (Shipler et al. 1996; Kopecky et al. 2004) and the Nevada Test Site down-winders studies (Simon et al. 1995)—have tended to use Monte Carlo methods for uncertainty assessment (see discussion in Box 3-2). In this case, one typically samples random values for each of the unknown parameters from their uncertainty distributions, computes exposure predictions by using these values, and repeats this process many times to generate a distribution of possible exposure estimates.
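The Monte Carlo procedure just described can be sketched as follows; the parameter distributions are wholly hypothetical stand-ins for the expert-elicited distributions used in actual dose-reconstruction projects.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000  # Monte Carlo iterations

# Hypothetical uncertainty distributions for the unknown parameters
release = rng.lognormal(np.log(1000.0), 0.4, n)   # source term (arbitrary units)
dispersion = rng.triangular(1e-6, 5e-6, 2e-5, n)  # source-to-air transfer factor
intake = rng.normal(20.0, 4.0, n).clip(min=5.0)   # breathing rate (m³/day)

# One dose estimate per draw; together the draws form an uncertainty
# distribution rather than a single point estimate
dose = release * dispersion * intake

lo, med, hi = np.percentile(dose, [2.5, 50, 97.5])  # 95% uncertainty interval
```

Reporting the interval (lo, hi) alongside the median is what allows downstream exposure-response analyses to propagate dose uncertainty rather than treat each reconstructed dose as exact.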
Uncertainty estimates in exposure have little utility by themselves unless they are exploited in the analysis of exposure—response relationships. Many statistical methods have been developed for dealing with this problem (Thomas et al. 1993; Carroll et al. 2006), but their use in epidemiologic and ecologic analyses has been fairly narrow. For example, in radiation epidemiology, use of the Monte Carlo uncertainty estimates in analyzing health effects has become somewhat standard, as in the analysis of the Hanford Thyroid Disease Study (Stram and Kopecky 2003). In air pollution, Bayesian methods that entail fitting the exposure and health-outcome models jointly have been discussed (Molitor et al. 2007).
In addition to models as tools to interpret, predict, and evaluate source, concentration, and receptor relationships, there are informatics models—the emerging tools for managing and exploring massive amounts of information from diverse sources and in widely different formats.
Substantial investment and progress have occurred in recent years to collect and improve access to genomic, toxicology, and health data. For example,
EPA’s ACToR database (EPA 2012b) “is an on-line searchable warehouse of all publicly available chemical-toxicity data. It aggregates data from over 500 public sources on over 500,000 environmental chemicals; the data are searchable by chemical name, other identifiers, and chemical structure” (Judson et al. 2012). The Comparative Toxicogenomics Database (CTD 2012) entails literature curation and integration of data that describe chemical interactions with genes and proteins and chemical-disease and gene-disease relationships (Mattingly 2009; Davis et al. 2011). Those databases and efforts to link and integrate the information that they contain have revealed the landscape of chemical-toxicity data and helped to inform research needs. However, the data sources have historically lacked the extensive and reliable exposure data that are required to examine the environmental contributions to diseases and to assess health risks. EPA initiated the ExpoCast™ program to fill that knowledge gap (Cohen Hubal et al. 2010). Although the program was developed to advance characterization of exposure to meet challenges posed by new toxicity-testing paradigms, in the long run it will foster novel exposure-science research to link exposures to real-world health outcomes.
ExpoCastDB, developed as a component of ExpoCast™ (Gangwal 2011), has begun the effort to consolidate observational human exposure data, improve access, and provide links to health-related data. It is designed to house measurements from human exposure studies and encourage standardized reporting of observational exposure information (EPA 2012c). This database will facilitate linkages with many other data sources, including sources of toxicity and environmental-fate data, and with manufacturers’ production and use data.
In the fields of biology and medicine, the need to manage exponentially growing information on systems and interactions has given rise to the discipline of bioinformatics. Bioinformatics brings together model algorithms, databases and information systems, Web technologies, artificial intelligence, and soft computing to generate new knowledge of biology and medicine. One example is genome-wide association studies (GWAS), which explore how noncommunicable diseases arise from a complex combination of genetic processes and the environment (Schwartz and Collins 2007). The application of informatics to exposure science is at a very early stage and offers a substantial opportunity to gain new knowledge of stressor-receptor-effect patterns in connection with humans and the environment.
The fields of systems biology and exposure biology have been a staging area for some of this effort. One example is the work of Patel et al. (2010), who conducted a pilot Environment-Wide Association Study (EWAS) in which exposure-biomarker and disease-status data were systematically interpreted in a manner analogous to that in GWAS (see Box 3-3).
Fields that make use of informatics technologies often develop an ontology to guide the systematic mining of databases and scientific literature that contain millions of observations and findings. An ontology is an explicit formal specification of the terms used in a knowledge domain (for example, biology, chemistry, or toxicology) and the relations among them. Because “an ontology
defines a common vocabulary for researchers who need to share information in a domain” (Noy and McGuinness 2001), it can be adapted to machine-interpretable definitions of basic concepts to facilitate the use of computers for mining of large amounts of written material. Mattingly et al. (2012) recently designed, developed, and demonstrated an exposure ontology (ExO) to facilitate centralization and integration of exposure data. The ExO is being used to bridge the gap between exposure science and other environmental health disciplines, including toxicology, epidemiology, disease surveillance, and epigenetics (OBO Foundry 2012). The committee sees these types of efforts as important for defining and expanding exposure science in the 21st century.
The data and analytic challenges posed by the emerging exposure-assessment technologies are growing at an exponential rate in scale and complexity.
• Statistical and computational research must continue to expand to keep pace with these developments. For example, satellite imaging and personal-monitoring techniques are generating enormous quantities of data on spatiotemporal exposures and on people’s movements and activities. Meanwhile, biologic assays are capable of monitoring millions of genetic variants, metabolites, gene-expression levels, and epigenetic changes in thousands of subjects at affordable cost.
• Without comparable investments in the development of new statistical analytic techniques to address correlated data on many more variables than subjects and without advances in computation, such as parallel-processing techniques, the analysis of the mountains of data threatens to become the new limiting factor in further progress.
• Exposure models will continue to support diverse efforts, such as risk analysis, impact assessments, life-cycle and sustainability assessments, epidemiology, and energy analysis. As is the case with all models, exposure models must balance the need for transparency with the need for fidelity and credibility. That requires concurrent development of model performance-evaluation efforts in any model-development program.
References
AethLabs. 2012. microAeth Model AE51. AethLabs, San Francisco, CA [online]. Available: http://www.aethlabs.com/ [accessed Jan. 11, 2012].
Allen-Piccolo, G., J.V. Rogers, R. Edwards, M.C. Clark, T.T. Allen, I. Ruiz-Mercado, K.N. Shields, E. Canuz, and K.R. Smith. 2009. An ultrasound personal locator for time-activity assessment. Int. J. Occup. Environ. Health 15(2):122-132.
Almanza, E., M. Jerrett, G. Dunton, E. Seto, and M.A. Pentz. 2012. A study of community design, greenness and physical activity in children using satellite, GPS and accelerometer data. Health Place 18(1):46-54.
Ankley, G.T., A.L. Miracle, E.J. Perkins, and G.P. Daston. 2008. Genomics in Regulatory Ecotoxicology: Applications and Challenges. Pensacola, FL: SETAC Press.
Aschbacher, J., and M.P. Milagro-Pérez. 2012. The European Earth monitoring (GMES) programme: Status and perspectives. Remote Sens. Environ. 120:3-8.
Auffray, C., T. Caulfield, M. J. Khoury, J.R. Lupski, M. Schwab, and T. Veenstra. 2012. Looking back at genomic medicine in 2011. Genome Med. 4(1):9.
AVIRIS (Airborne Visible/Infrared Imaging Spectrometer). 2012. AVIRIS. Jet Propulsion Laboratory, California Institute of Technology [online]. Available: http://aviris.jpl.nasa.gov/ [accessed May 17, 2012].
Bagheri, S., and T. Yu. 2008. Hyperspectral sensing for assessing nearshore water quality conditions of Hudson/Raritan estuary. J. Environ. Inform. 11(2):123-130.
Beck, L.R., B.M. Lobitz, and B.L. Wood. 2000. Remote sensing and human health: New sensors and new opportunities. Emerg. Infect. Dis. 6(3):217-227.
Bloom, M.S., J.E. Vena, J.R. Olson, and P.J. Kostyniak. 2009. Assessment of polychlorinated biphenyl congeners, thyroid stimulating hormone, and free thyroxine among New York state anglers. Int. J. Hyg. Environ. Health 212(6):599-611.
Bradley, E.S., D.A. Roberts, P.E. Dennison, R.O. Green, M. Eastwood, S.R. Lundeen, I.B. McCubbin, and I. Leifer. 2011. Google Earth and Google Fusion tables in support of time-critical collaboration: Mapping the deepwater horizon oil spill with the AVIRIS airborne spectrometer. Earth Sci. Inform. 4(4):169-179.
Briggs, D. 2005. The role of GIS: Coping with space (and time) in air pollution exposure assessment. J. Toxicol. Environ. Health A 68(13-14):1243-1261.
Bulgarelli, B., and S. Djavidnia. 2012. On MODIS retrieval of oil spill spectral properties in the marine environment. IEEE Geosci. Remote Sens. Lett. 9(3):398-402.
Burke, J., D. Estrin, M. Hansen, A. Parker, N. Ramanathan, S. Reddy, and M.B. Srivastava. 2006. Participatory sensing. Proceedings of ACM SenSys, the 4th ACM Conference on Embedded Networked Sensor Systems, October 31, 2006, Boulder, CO [online]. Available: http://escholarship.org/uc/item/19h777qd#page-1 [accessed Jan. 17, 2012].
Calabrese, F., M. Colonna, P. Lovisolo, D. Parata, and C. Ratti. 2011. Real-time urban monitoring using cell phones: A case study in Rome. IEEE T. Intell. Transp. Syst. 12(1):141-151.
Capasso, F. 2010. High-performance midinfrared quantum cascade lasers. Opt. Eng. 49(11):111102.
Carroll, R.J., D. Ruppert, L.A. Stefanski, and C.M. Crainiceanu. 2006. Measurement Error in Nonlinear Models: A Modern Perspective, 2nd Ed. Boca Raton, FL: Chapman and Hall/CRC Press.
Chang, L.T., J. Sarnat, J.M. Wolfson, L. Rojas-Bracho, H.H. Suh, and P. Koutrakis. 1999. Development of a personal multi-pollutant exposure sampler for particulate matter and criteria gases. Pollut. Atmos. 40:31-39.
Chen, C. 2011. A Wireless Hybrid Chemical Sensor for Detection of Environmental Volatile Organic Compounds. M.S. Thesis, Arizona State University.
Chen, R., G.I. Mias, J. Li-Pook-Than, L. Jiang, H.Y. Lam, R. Chen, E. Miriami, K.J. Karczewski, M. Hariharan, F.E. Dewey, Y. Cheng, M.J. Clark, H. Im, L. Habegger, S. Balasubramanian, M. O’Huallachain, J.T. Dudley, S. Hillenmeyer, R. Haraksingh, D. Sharon, G. Euskirchen, P. Lacroute, K. Bettinger, A.P. Boyle, M. Kasowski, F. Grubert, S. Seki, M. Garcia, M. Whirl-Carrillo, M. Gallardo, M.A. Blasco, P.L. Greenberg, P. Snyder, T.E. Klein, R.B. Altman, A.J. Butte, E.A. Ashley, M. Gerstein, K.C. Nadeau, H. Tang, and M. Snyder. 2012. Personal omics
profiling reveals dynamic molecular and medical phenotypes. Cell. 148(6):1293-1307.
Cheng, M.M.C., G. Cuda, Y.L. Bunimovich, M. Gaspari, J.R. Heath, H.D. Hill, C.A. Mirkin, A.J. Nijdam, R. Terracciano, T. Thundat, and M. Ferrari. 2006. Nanotechnologies for biomolecular detection and medical diagnostics. Curr. Opin. Chem. Biol. 10(1):11-19.
Chowdhury, Z., R.D. Edwards, M. Johnson, K.N. Shields, T. Allen, E. Canuz, and K.R. Smith. 2007. An inexpensive light-scattering particle monitor: Field evaluation. J. Environ. Monitor. 9(10):1099-1106.
Chudnovsky, A., E. Ben-Dor, A.B. Kostinski, and I. Koren. 2009. Mineral content analysis of atmospheric dust using hyperspectral information from space. Geophys. Res. Lett. 36: L15811, doi:10.1029/2009GL037922.
Chudnovsky, A., A. Kostinski, L. Herrmann, I. Koren, G. Nutesku, and E. Ben-Dor. 2011. Hyperspectral spaceborne imaging of dust-laden flows: Anatomy of Saharan storm from the Bodele Depression. Remote Sens. Environ. 115(4):1013-1024.
Clark, R.N., R.O. Green, G.A. Swayze, G. Meeker, S. Sutley, T.M. Hoefen, K.E. Livo, G. Plumlee, B. Pavri, C. Sarture, S. Wilson, P. Hageman, P. Lamothe, J.S. Vance, J. Boardman, I. Brownfield, C. Gent, L.S. Morath, J. Taggart, P.M. Theodorakos, and M. Adams. 2001. Environmental Studies of the World Trade Center Area After the September 11, 2001 Attack. US Geological Survey Open File Report 01-0429-2001. U.S. Geological Survey [online]. Available: http://pubs.usgs.gov/of/2001/ofr-01-0429/ [accessed Jan. 17, 2012].
Clewell, H.J., III, and M.E. Andersen. 1987. Dose, species and route extrapolation using physiologically based pharmacokinetic models. Pp. 159-182 in Drinking Water and Health, Vol. 8. Pharmacokinetics in Risk Assessment. Washington, DC: National Academy Press.
Clewell, R.A., and H.J. Clewell, III. 2008. Development and specification of physiologically based pharmacokinetic models for use in risk assessment. Regul. Toxicol. Pharmacol. 50(1):129-143.
Cohen, A.L., R. Soldi, H. Zhang, A.M. Gustafson, R. Wilcox, B.W. Welm, J.T. Chang, E. Johnson, A. Spira, S.S. Jeffrey, and A.H. Bild. 2011. A pharmacogenomic method for individualized prediction of drug sensitivity. Mol. Syst. Biol. 7:513.
Cohen-Hubal, E.A., A.M. Richard, L. Aylward, S.W. Edwards, J. Gallagher, M. Goldsmith, S. Isukapalli, R. Tornero-Velez, E.J. Weber, and D.R. Kavlock. 2010. Advancing exposure characterization for chemical evaluation and risk assessment. J. Toxicol. Environ. Health B Crit. Rev. 13(2-4):299-313.
Crall, A.W., G.J. Newman, C.S. Jarnevich, T.J. Stohlgren, D.M. Waller, and J. Graham. 2010. Improving and integrating data on invasive species collected by citizen scientists. Biol. Invasions 12(10):3419-3428.
CTD (Comparative Toxicogenomics Database). 2012. CTD [online]. Available: http://ctdbase.org/ [accessed Jan. 11, 2012].
Cvetkovic, M., and P. Chow-Fraser. 2011. Use of ecological indicators to assess the quality of Great Lakes coastal wetlands. Ecol. Indic. 11(6):1609-1622.
Cycling Metro Vancouver. 2007. Cycling metro Vancouver [online]. Available: http://www.cyclevancouver.ubc.ca/cv.aspx [accessed Oct. 28, 2011].
Dale, V., G.R. Biddinger, M.C. Newman, J.T. Oris, G.W. Suter, T. Thompson, T.M. Armitage, J.L. Meyer, R.M. Allen-King, G.A. Burton, P.M. Chapman, L.L. Conquest, I.J. Fernandez, W.G. Landis, L.L. Master, W.J. Mitsch, T.C. Mueller, C.F. Rabeni, A.D. Rodewald, J.G. Sanders, and I.L. van Heerden. 2008. Enhancing the ecological risk assessment process. Integr. Environ. Assess. Manag. 4(3):306-313.
Davis, A.P., B.L. King, S. Mockus, C.G. Murphy, C. Saraceni-Richards, M. Rosenstein, T. Wiegers, and C.J. Mattingly. 2011. The Comparative Toxicogenomics Database: Update 2011. Nucleic Acids Res. 39(suppl.1):D1067-D1072.
de Nazelle, A., E. Seto, D. Donaire, M. Mendez, J. Matamala, M. Portella, D. Rodriguez, M. Nieuwenhuijsen, and M. Jerrett. 2011. Improving Estimates of Travel Activity and Air Pollution Exposure through Ubiquitous Sensing Technologies. Abstract S-0035 in Abstracts of the 23rd Annual Conference of the International Society of Environmental Epidemiology (ISEE), September 13-16, 2011, Barcelona, Spain [online]. Available: http://ehp03.niehs.nih.gov/article/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.isee2011 [accessed Sept. 4, 2012].
Demokritou, P., I.G. Kavouras, S.T. Ferguson, and P. Koutrakis. 2001. Development and laboratory performance evaluation of a personal multipollutant sampler for simultaneous measurements of particulate and gaseous pollutants. Aerosol Sci. Technol. 35(3):741-752.
Doran, M., M. Babin, O. Hembise, A. Mangin, and P. Garnesson. 2011. Ocean transparency from space: Validation of algorithms estimating Secchi depth using MERIS, MODIS, and SeaWiFS data. Remote Sens. Environ. 115(12):2986-3001.
Dunton, G.F., Y. Liao, S. Intille, J. Wolch, and M. Pentz. 2011. Social and physical contextual influences on children’s leisure-time physical activity: An ecological momentary assessment study. J. Phys. Act. Health 8(suppl. 1):S103-S108.
Eakin, C.M., C.J. Nim, R.E. Brainard, C. Aubrecht, C. Elvidge, D.K. Gledhill, F. Muller-Karger, P.J. Mumby, W.J. Skirving, A.E. Strong, M. Wang, S. Weeks, F. Wentz, and D. Ziskin. 2010. Monitoring coral reefs from space. Oceanography 23(4):118-133.
Edwards, R., K.R. Smith, B. Kirby, T. Allen, C.D. Litton, and S. Hering. 2006. An inexpensive dual-chamber particle monitor: Laboratory characterization. J. Air Waste Manage. Assoc. 56(6):789-799.
Elgethun, K., M.G. Yost, C.T. Fitzpatrick, T.L. Nyerges, and R.A. Fenske. 2007. Comparison of global positioning system (GPS) tracking and parent-report diaries to characterize children’s time-location patterns. J. Expo. Sci. Environ. Epidemiol. 17(2):196-206.
Emilli, E., C. Popp, M. Petitta, M. Riffler, S. Wunderle, and M. Zebisch. 2010. PM10 remote sensing from geostationary SEVIRI and polar-orbiting MODIS sensors over the complex terrain of the European Alpine region. Remote Sens. Environ. 114(11):2485-2499.
Emond, C., L.S. Birnbaum, and M.J. Devito. 2006. Use of a physiologically based pharmacokinetic model for rats to study the influence of body fat mass and induction of CYP1A2 on the pharmacokinetics of TCDD. Environ. Health Perspect. 114(9):1394-1400.
Engel-Cox, J.A., R.M. Hoff, R. Rogers, F. Dimmick, A.C. Rush, J.J. Szykman, J. Al-Saadi, D.A. Chu, and E.R. Zell. 2006. Integrating lidar and satellite optical depth with ambient monitoring for 3-dimensional particulate characterization. Atmos. Environ. 40(40):8056-8067.
EnMAP (Environmental Mapping and Analysis Program). 2011. EnMAP Hyperspectral Imager [online]. Available: http://www.enmap.org/ [accessed Oct. 28, 2011].
EPA (U.S. Environmental Protection Agency). 2009. Guidance on the Development, Evaluation, and Application of Environmental Models. EPA/100/K-09/003. Office of Science Advisor, Council for Regulatory Environmental Modeling, U.S. Environmental Protection Agency [online]. Available: http://www.epa.gov/crem/library/cred_guidance_0309.pdf [accessed Jan. 25, 2012].
EPA (U.S. Environmental Protection Agency). 2012a. SHEDS-Multimedia: Stochastic Human Exposure and Dose Model for Multimedia, Multipathway Chemicals. Human Exposure and Atmospheric Sciences, National Exposure Research Laboratory, U.S. Environmental Protection Agency [online]. Available: http://www.epa.gov/heasd/products/sheds_multimedia/sheds_mm.html [accessed May 1, 2012].
EPA (U.S. Environmental Protection Agency). 2012b. ACToR. National Center for Computational Toxicology, U.S. Environmental Protection Agency [online]. Available: http://actor.epa.gov/actor/faces/ACToRHome.jsp [accessed Jan. 12, 2012].
EPA (U.S. Environmental Protection Agency). 2012c. ExpoCastDB [online]. Available: http://actor.epa.gov/actor/faces/ExpoCastDB/Home.jsp [accessed June 1, 2012].
Esteve-Turrillas, F.A., A. Pastor, V. Yusa, and M. de la Guardia. 2007. Using semi-permeable membrane devices as passive samplers. Trends Anal. Chem. 26(7):703-712.
Feng, C.H., and C.Y. Lu. 2011. Modification of major plasma proteins by acrylamide and glycidamide: Preliminary screening by nano liquid chromatography with tandem mass spectrometry. Anal. Chim. Acta 684(1-2):80-86.
Fenner, K., M. Scheringer, M. Macleod, M. Matthies, T. McKone, M. Stroebe, A. Beyer, M. Bonnell, A.C. Le Gall, J. Klasmeier, D. Mackay, D. Van De Meent, D. Pennington, B. Scharenberg, N. Suzuki, and F. Wania. 2005. Comparing estimates of persistence and long-range transport potential among multimedia models. Environ. Sci. Technol. 39(7):1932-1942.
Ferrier, G. 1999. Application of imaging spectrometer in identifying environmental pollution caused by mining at Rodaquilar, Spain. Remote Sens. Environ. 68(2):125-137.
FGDC (Federal Geographic Data Committee). 2011. Standards [online]. Available: http://www.fgdc.gov/standards [accessed May 11, 2012].
Finkelstein, M.E., K.A. Grasman, D.A. Croll, B.R. Tershy, B.S. Keitt, W.M. Jarman, and D.R. Smith. 2007. Contaminant-associated alteration of immune function in black-footed albatross (Phoebastria nigripes), a North Pacific predator. Environ. Toxicol. Chem. 26(9):1896-1903.
Focardi, S., I. Corsi, S. Mazzuoli, L. Vignoli, and S.A. Loiselle. 2006. Integrating remote sensing approach with pollution monitoring tools for aquatic ecosystem risk assessment and management: A case study of Lake Victoria (Uganda). Environ. Monit. Assess. 122(1-3):275-287.
Fustinoni, S., L. Campo, P. Manini, M. Buratti, S. Waidyanatha, G. De Palma, A. Mutti, V. Foa, A. Colombi, and S.M. Rappaport. 2008. An integrated approach to biomonitoring exposure to styrene and styrene-(7,8)-oxide using a repeated measurements sampling design. Biomarkers 13(6):560-578.
Gallagher, L.G., T.F. Webster, A. Aschengrau, and V.M. Vieira. 2010. Using residential history and groundwater modeling to examine drinking water exposure and breast cancer. Environ. Health Perspect. 118(6):749-755.
Gangwal, S. 2011. ExpoCastDB: A Publicly Accessible Database for Observational Exposure Data. Presented at Computational Toxicology Community of Practice, September 22, 2011 [online]. Available: http://www.epa.gov/ncct/download_files/chemical_prioritization/ExpoCastDB_CommPractice_09-22-2011-Share.pdf [accessed Dec. 7, 2011].
Genualdi, S.A., K.J. Hageman, L.K. Ackerman, S. Usenko, and S.L.M. Simonich. 2011. Sources and fate of chiral organochlorine pesticides in western U.S. National Park ecosystems. Environ. Toxicol. Chem. 30(7):1533-1538.
Georgopoulos, P.G., A.F. Sasso, S.S. Isukapalli, P.J. Lioy, D.A. Vallero, M. Okino, and L. Reiter. 2009. Reconstructing population exposures to environmental chemicals from biomarkers: Challenges and opportunities. J. Expo. Sci. Environ. Epidemiol. 19(2):149-171.
Goetz, S.J., N. Gardiner, and J.H. Viers. 2008. Monitoring freshwater, estuarine and nearshore benthic ecosystems with multi-sensor remote sensing: An introduction to the special issue. Remote Sens. Environ. 112(11):3993-3995.
Gomez-Casero, M.T., I.L. Castillejo-Gonzalez, A. Garcia-Ferrer, J.M. Pena-Barragan, M. Jurado-Exposito, L. Garcia-Torres, and F. Lopez-Granados. 2010. Spectral discrimination of wild oat and canary grass in wheat fields for less herbicide application. Agron. Sustain. Dev. 30(3):689-699.
Goodchild, M.F. 2007. The Morris Hansen Lecture 2006: Statistical perspectives on spatial social science. J. Off. Stat. 23(3):269-283.
GPS (Global Positioning System). 2011. The Global Positioning System [online]. Available: http://www.gps.gov/systems/gps/ [accessed Oct. 28, 2011].
Gunier, R.B., M.H. Ward, M. Airola, E.M. Bell, J. Colt, M. Nishioka, P.A. Buffler, P. Reynolds, R.P. Rull, A. Hertz, C. Metayer, and J.R. Nuckols. 2011. Determinants of agricultural pesticide concentrations in carpet dust. Environ. Health Perspect. 119(7):970-976.
Hagerstrand, T. 1970. What about people in regional science? Pap. Reg. Sci. 24:1-21 (as cited in Briggs 2005).
Hill, M., A. Parizek, R. Kancheva, M. Duskova, M. Velikova, L. Kriz, M. Klimkova, A. Paskova, Z. Zizka, P. Matucha, M. Meloun, and L. Starka. 2010. Steroid metabolome in plasma from the umbilical artery, umbilical vein, maternal cubital vein and in amniotic fluid in normal and preterm labor. J. Steroid Biochem. Mol. Biol. 121(3-5):594-610.
Hjort, N.L., and G. Claeskens. 2003. Frequentist model average estimators. J. Am. Stat. Assoc. 98(464):879-899.
Hoeting, J.A., D. Madigan, A.E. Raftery, and C.T. Volinsky. 1999. Bayesian model averaging: A tutorial. Statist. Sci. 14(4):382-417.
Hoff, R.M., and S.A. Christopher. 2009. Remote sensing of particulate pollution from space: Have we reached the promised land? J. Air Waste Manage. Assoc. 59(6): 645-675.
Houston, D., P. Ong, J. Wu, and A. Winer. 2006. Proximity of licensed child care facilities to near-roadway vehicle pollution. Am. J. Public Health 96(9):1611-1617.
Hsieh, M.D., and E.T. Zellers. 2004. Limits of recognition for simple vapor mixtures determined with a microsensor array. Anal. Chem. 76(7):1885-1895.
Iglesias, R.A., F. Tsow, R. Wang, E.S. Forzani, and N. Tao. 2009. Hybrid separation and detection device for analysis of benzene, toluene, ethylbenzene, and xylenes in complex samples. Anal. Chem. 81(21):8930-8935.
Intille, S.S. 2007. Technological innovations enabling automatic, context-sensitive ecological momentary assessment. Pp. 308-337 in The Science of Real-Time Data Capture: Self-Report in Health Research, A.A. Stone, S. Shiffman, A. Atienza, and L. Nebeling, eds. Oxford: Oxford University Press [online]. Available: http://www.ccs.neu.edu/home/intille/teaching/AMB/papers/Stone_Chapter16.pdf [accessed Jan. 20, 2012].
IPCS (International Programme on Chemical Safety). 2008. Uncertainty and Data Quality in Exposure Assessment, Part 1. Guidance Document on Characterizing and Communicating Uncertainty of Exposure Assessment. IPCS Project on the Harmonization of Approaches to the Assessment of Risk from Exposure to Chemicals. Geneva: World Health Organization [online]. Available: http://www.who.int/ipcs/publications/methods/harmonization/exposure_assessment.pdf [accessed May 14, 2012].
Jaworska, J.S., R.S. Boethling, and P.H. Howard. 2003. Recent developments in broadly applicable structure-biodegradability relationships. Environ. Toxicol. Chem. 22 (8):1710-1723.
Jerrett, M., A. Arain, P. Kanaroglou, B. Beckerman, D. Potoglou, T. Sahsuvaroglu, J. Morrison, and C. Giovis. 2005. A review and evaluation of intraurban air pollution exposure models. J. Expo. Anal. Environ. Epidemiol. 15(2):185-204.
Jerrett, M., S. Gale, and C. Kontgis. 2009. An environmental health geography of risk. Pp. 418-445 in A Companion to Health and Medical Geography, T. Brown, S. McLafferty, and G. Moon, eds. Oxford, UK: Wiley-Blackwell.
Jin, C., and E.T. Zellers. 2008. Limits of recognition for binary and ternary vapor mixtures determined with multi-transducer arrays. Anal. Chem. 80(19):7283-7293.
Jin, H., B.J. Webb-Robertson, E.S. Peterson, R. Tan, D.J. Bigelow, M.B. Scholand, J.R. Hoidal, J.G. Pounds, and R.C. Zangar. 2011. Smoking, COPD and 3-nitrotyrosine levels of plasma proteins. Environ. Health Perspect. 119(9):1314-1320.
Jones, A.P., E.G. Coombes, S.J. Griffin, and E.M. van Sluijs. 2009. Environmental supportiveness for physical activity in English school children: A study using Global Positioning Systems. Int. J. Behav. Nutr. Phys. Act. 6(1):42.
Judson, R.S., M.T. Martin, P.P. Egeghy, S. Gangwal, D.M. Reif, P. Kothiya, M.A. Wolf, T. Cathey, T.R. Transue, D. Smith, J. Vail, A. Frame, S. Mosher, E.A. Cohen-Hubal, and A.M. Richard. 2012. Aggregating data for computational toxicology applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) system. Int. J. Mol. Sci. 13(2):1805-1831.
Keller, M., D.S. Schimel, W.W. Hargrove, and F.M. Hoffman. 2008. A continental strategy for the National Ecological Observatory Network. Front. Ecol. Environ. 6(5):282-284.
Khanna, V.K. 2012. Nanosensors: Physical, Chemical, and Biological. Boca Raton, FL: CRC Press.
Kidd, K.A., P.J. Blanchfield, K.H. Mills, V.P. Palace, R.E. Evans, J.M. Lazorchak, and R.W. Flick. 2007. Collapse of a fish population after exposure to a synthetic estrogen. Proc. Natl. Acad. Sci. U.S.A. 104(21):8897-8901.
Kim, S.K., H. Chang, and E.T. Zellers. 2011. Microfabricated gas chromatograph for the selective determination of trichloroethylene vapor at sub-parts-per-billion concentrations in complex mixtures. Anal. Chem. 83(18):7198-7206.
Kim, S.K., D.R. Burris, H. Chang, J. Bryant-Genevier, and E.T. Zellers. 2012a. Microfabricated gas chromatograph for on-site determinations of trichloroethylene in indoor air arising from vapor intrusion. 1. Field evaluation. Environ. Sci. Technol. 46(11):6065-6072.
Kim, S.K., D.R. Burris, J. Bryant-Genevier, K.A. Gorder, E.M. Dettenmair, and E.T. Zellers. 2012b. Microfabricated gas chromatograph for on-site determinations of TCE in indoor air arising from vapor intrusion. 2. Spatial/temporal monitoring. Environ. Sci. Technol. 46(11):6073-6080.
Klaper, R., B.J. Carter, C.A. Richter, P.E. Drevnick, M.B. Sandheinrich, and D.E. Tillitt. 2010. Corrigendum: Use of a 15 k gene microarray to determine gene expression changes in response to acute and chronic methylmercury exposure in the fathead minnow Pimephales promelas Rafinesque (72(9):2207- 2008). J. Fish Biol. 77(1): 310.
Kolok, A.S., H.L. Schoenfuss, C.R. Propper, and T.L. Vail. 2011. Empowering citizen scientists: The strength of many in monitoring biologically active environmental contaminants. BioScience 61(8):626-630.
Kopecky, K.J., S. Davis, T.E. Hamilton, M.S. Saporito, and L.E. Onstad. 2004. Estimation of thyroid radiation doses for the Hanford thyroid disease study: Results and implications for statistical power of the epidemiological analyses. Health Phys. 87(1):15-32.
Korrick, S.A., L.M. Altshul, P.E. Tolbert, V.W. Burse, L.L. Needham, and R.R. Monson. 2000. Measurement of PCBs, DDE, and hexachlorobenzene in cord blood from infants born in towns adjacent to a PCB-contaminated waste site. J. Expo. Anal. Environ. Epidemiol. 10(6 Pt 2):743-754.
Kratz, T.K., P. Arzberger, B.J. Benson, C.Y. Chiu, K. Chiu, L. Ding, T. Fountain, D. Hamilton, P.C. Hanson, Y.H. Hu, F.P. Lin, D.F. McMullen, S. Tilak, and C. Wu. 2006. Towards a Global Lake Ecological Observatory Network. Publications of the Karelian Institute 145:51-63 [online]. Available: http://www.gleon.org/Gleon_Kratz_etal_2006.pdf [accessed Jan. 17, 2012].
Krause, A.R., C. Van Neste, L.R. Senesac, T. Thundat, and E. Finot. 2008. Trace explosive detection using photothermal deflection spectroscopy. J. Appl. Phys. 103(9):094906.
Lahr, J., and L. Kooistra. 2010. Environmental risk mapping of pollutants: State of the art and communication aspects. Sci. Total Environ. 408(18):3899-3907.
Lammel, G. 2004. Effects of time-averaging climate parameters on predicted multicompartmental fate of pesticides and POPs. Environ. Pollut. 128(1-2):291-302.
Lavrova, O.Y., and A.G. Kostianoy. 2011. Catastrophic oil spill in the Gulf of Mexico in April-May 2010. Izv. Atmos. Ocean. Phys. 47(9):1114-1118.
Leamer, E.E. 1978. Specification Searches: Ad Hoc Inference with Nonexperimental Data. New York: Wiley.
Lee, H.J., Y. Liu, B.A. Coull, J. Schwartz, and P. Koutrakis. 2011. A novel calibration approach of MODIS AOD data to predict PM2.5 concentrations. Atmos. Chem. Phys. 11(5):7991-8002.
Lewis, P.R., P. Manginell, D.R. Adkins, R.J. Kottenstette, D.R. Wheeler, S.S. Sokolowski, D.E. Trudell, J.E. Byrnes, M. Okandan, J.M. Bauer, R.G. Manley, and C. Frye-Mason. 2006. Recent advancements in the gas-phase MicroChemLab. IEEE Sensors Journal 6(3):784-795.
Leyk, S., C.R. Binder, and J.R. Nuckols. 2009. Spatial modeling of personalized exposure dynamics: The case of pesticide use in small-scale agricultural production landscapes of the developing world. Int. J. Health Geogr. 8:17.
Li, M., H.X. Tang, and M.L. Roukes. 2007. Ultra-sensitive NEMS-based cantilever array for sensing, scanned probes, and very high-frequency applications. Nature Nanotech. 2(2):114-120.
Litton, C.D., K.D. Smith, R. Edwards, and T. Allen. 2004. Combined optical and ionization measurement techniques for inexpensive characterization of micrometer and submicrometer aerosols. Aerosol Sci. Technol. 38(11):1054-1062.
Longley, P., M. Goodchild, D. Maguire, and D. Rhind. 2005. Geographic Information Systems and Science. New York: Wiley.
Lyapustin, A., J. Martonchik, Y. Wang, I. Laszlo, and S. Korkin. 2011a. Multi-angle implementation of atmospheric correction (MAIAC): Part 1. Radiative transfer basis and look-up tables. J. Geophys. Res. 116: D03210, doi:10.1029/2010JD014985.
Lyapustin, A., Y. Wang, I. Laszlo, R. Kahn, S. Korkin, L. Remer, R. Levy, and J.S. Reid. 2011b. Multi-angle implementation of atmospheric correction (MAIAC): Part 2. Aerosol algorithm. J. Geophys. Res. 116: D03211, doi:10.1029/2010JD014986.
Maclachlan, J.C., M. Jerrett, T. Abernathy, M. Sears, and M.J. Bunch. 2007. Mapping health on the internet: A new tool for environmental justice and public health research. Health Place 13(1):72-86.
MacLeod, M., D. Woodfine, J. Brimacombe, L. Toose, and D. Mackay. 2002. A dynamic mass budget for toxaphene in North America. Environ. Toxicol. Chem. 21(8):1628-1637.
MacLeod, M., M. Scheringer, T.E. McKone, and K. Hungerbühler. 2010. The state of multimedia mass-balance modeling in environmental science and decision making. Environ. Sci. Technol. 44(22):8360-8364.
Maina, J., V. Venus, M.R. McClanahan, and M. Ateweberhan. 2008. Modeling susceptibility of coral reefs to environmental stress using remote sensing data and GIS models. Ecol. Model. 212(3-4):180-199.
Malley, D.F., K.N. Hunter, and G.R. Webster. 1999. Analysis of diesel fuel contamination in soils by near-infrared reflectance spectrometry and solid phase microextraction-gas chromatography. Soil Sediment Contam. 8(4):481-489.
Mattingly, C.J. 2009. Chemical databases for environmental health and clinical research. Toxicol. Lett. 186(1):62-65.
Mattingly, C.J., T.E. McKone, M.A. Callahan, J.A. Blake, and E.A. Cohen-Hubal. 2012. Providing the missing link: The exposure science ontology ExO. Environ. Sci. Technol. 46(6):3046-3053.
Maxwell, S.K., J.R. Meliker, and P. Goovaerts. 2010. Use of land surface remotely sensed satellite and airborne data for environmental exposure assessment in cancer research. J. Expo. Sci. Environ. Epidemiol. 20(2):176-185.
McCreanor, J., P. Cullinan, M.J. Nieuwenhuijsen, J. Stewart-Evans, E. Malliarou, L. Jarup, R. Harrington, I.K. Svartengren, P. Ohman-Strickland, K.F. Chang, and J. Zhang. 2007. Respiratory effects of exposure to diesel traffic in persons with asthma. N. Engl. J. Med. 357(23):2348-2358.
McKone, T.E., and M. MacLeod. 2003. Tracking multiple pathways of human exposure to persistent multimedia pollutants: Regional, continental and global-scale models. Annu. Rev. Environ. Resour. 28:463-492.
McKone, T.E., R. Castorina, M.E. Harnly, Y. Kuwabara, B. Eskenazi, and A. Bradman. 2007. Merging models and biomonitoring data to characterize sources and pathways of human exposure to organophosphorous pesticides in the Salinas Valley of California. Environ. Sci. Technol. 41(9):3233-3240.
Mishra, D., C. Cho, S. Ghosh, A. Fox, C. Downs, P. Merani, P. Kirui, N. Jackson, and S. Mishra. 2012. Post-spill state of the marsh: Remote estimation of the ecological impact of the Gulf of Mexico oil spill on Louisiana Salt Marshes. Remote Sens. Environ. 118:176-185.
Molitor, J., M. Jerrett, C.C. Chang, N.T. Molitor, J. Gauderman, K. Berhane, R. McConnell, F. Lurmann, J. Wu, A. Winer, and D. Thomas. 2007. Assessing uncertainty in spatial exposure models for air pollution health effects assessment. Environ. Health Perspect. 115(8):1147-1153.
Morland, K.B., and K.R. Evenson. 2009. Obesity prevalence and the local food environment. Health Place 15(2):491-495.
Mun, M., S. Reddy, K. Shilton, N. Yau, J. Burke, D. Estrin, M. Hansen, E. Howard, R. West, and P. Boda. 2009. PEIR, the personal environmental impact report, as a platform for participatory sensing system research. Pp. 55-68 in Proceedings of the 7th Annual International Conference on Mobile Systems, Applications and Services-MobiSys ’09, June 22-25, 2009, Krakow, Poland. New York: ACM.
NASA (National Aeronautics and Space Administration). 2011. HyspIRI Mission Study. Jet Propulsion Laboratory. California Institute of Technology [online]. Available: http://hyspiri.jpl.nasa.gov/ [accessed Oct. 28, 2011].
Nebert, D.W., T.P. Dalton, A.B. Okey, and F.J. Gonzalez. 2004. Role of aryl hydrocarbon receptor-mediated induction of the CYP1 enzymes in environmental toxicity and cancer. J. Biol. Chem. 279(23):23847-23850.
Nicholson, J.K., and J.C. Lindon. 2008. Systems biology: Metabonomics. Nature 455 (7216):1054-1056.
Noy, N.F., and D.L. McGuinness. 2001. Ontology Development 101: A Guide to Creating Your First Ontology. KSL-01-05. Knowledge System Laboratory, Stanford University, CA [online]. Available: http://www.ksl.stanford.edu/KSL_Abstracts/KSL-01-05.html [accessed May 31, 2012].
NRC (National Research Council). 2007. Models in Environmental Regulatory Decision Making. Washington, DC: National Academies Press.
NRC (National Research Council). 2009. Science and Decisions: Advancing Risk Assessment. Washington, DC: National Academies Press.
OBO Foundry (The Open Biological and Biomedical Ontologies Foundry). 2012. Exposure Ontology [online]. Available: http://obolibrary.org/cgi-bin/detail.cgi?id=exo [accessed June 4, 2012].
Odermatt, D., A. Gitelson, V.E. Brando, and M. Schaepman. 2012. Review of constituent retrieval in optically deep and complex waters from satellite imagery. Remote Sens. Environ. 118:116-126.
Offit, K. 2011. Personalized medicine: New genomics, old lessons. Hum. Genet. 130(1):3-14.
Ong, C.H., T.J. Cudahy, M.S. Caccetta, and M.S. Piggott. 2003. Deriving quantitative dust measurements related to iron ore handling from airborne hyperspectral data. Min. Technol. IMM Trans. Sect. A 112(3):158-163.
Pastorello, G.Z., G.A. Sanchez-Azofeifa, and M.A. Nascimento. 2011. Enviro-net: From networks of ground-based sensor systems to a web platform for sensor data management. Sensors 11(6):6454-6479.
Patel, C.J., J. Bhattacharya, and A.J. Butte. 2010. An Environment-Wide Association Study (EWAS) on Type 2 Diabetes Mellitus. PLoS One 5(5):e10746.
Patel, S.V., T.E. Mlsna, B. Fruhberger, E. Klaassen, S. Cemalovic, and D.R. Baselt. 2003. Chemicapacitive microsensors for volatile organic compound detection. Sensor. Actuat. B Chem. 96(3):541-553.
Paulos, E., R.J. Honicky, and E. Goodman. 2007. Sensing atmosphere. The 5th ACM Conference on Embedded Network Sensor Systems-ACM SenSys, November 6-9, 2007, Sydney, Australia [online]. Available: http://www.paulos.net/papers/2007/Sensing%20Atmosphere%20(Sensys%202007%20Workshop).pdf [accessed Jan. 18, 2012].
Pelletier, B., R. Santer, and J. Vidot. 2007. Retrieving of particulate matter from optical measurements: A semiparametric approach. J. Geophys. Res. Atmos. 112:D06208, doi:10.1029/2005JD006737.
Peters, A., G. Hoek, and K. Katsouyanni. 2012. Understanding the link between environmental exposures and health: Does the exposome promise too much? J. Epidemiol. Community Health 66(2):103-105.
Richardson, D.B., and A. Ciampi. 2003. Effects of exposure measurement error when an exposure variable is constrained by a lower limit. Am. J. Epidemiol. 157(4):355-363.
Roberts, A.P., J.T. Oris, G.A. Burton, and W.H. Clements. 2005. Gene expression in caged fish as a first-tier indicator of contaminant exposure in streams. Environ. Toxicol. Chem. 24(12):3092-3098.
RTI International. 2008. New Technology Used to Increase Accuracy, Ease Measurement of Harmful Environmental Exposure. RTI International News: September 16, 2008 [online]. Available: http://www.rti.org/page.cfm?objectid=4749BFB4-CCC0-2F2C-9D8D787BCE30A49D [accessed Jan. 11, 2012].
Rundel, P.W., E.A. Graham, M.F. Allen, J.C. Fisher, and T.C. Harmon. 2009. Environmental sensor networks in ecological research. New Phytol. 182(3):589-607.
Sanchez, Y.A., K. Deener, E. Cohen Hubal, K. Knowlton, D. Reif, and D. Segal. 2010. Research needs for community-based risk assessment: Findings from multi-disciplinary workshop. J. Expo. Sci. Environ. Epidemiol. 20(2):186-195.
Sanchez, W., W. Sremski, B. Piccini, O. Palluel, E. Maillot-Marechal, S. Betoulle, A. Jaffal, S. Ait-Aissa, F. Brion, E. Thybaud, N. Hinfray, and J.M. Porcher. 2011. Adverse effects in wild fish living downstream from pharmaceutical manufacture discharges. Environ. Int. 37(8):1342-1348.
Sarangapani, R., J. Teeguarden, K.P. Plotzke, J.M. McKim, Jr., and M.E. Andersen. 2002. Dose-response modeling of cytochrome p450 induction in rats by octamethylcyclotetrasiloxane. Toxicol. Sci. 67(2):159-172.
Schenker, U., M. MacLeod, M. Scheringer, and K. Hungerbühler. 2005. Improving data quality for environmental fate models: A least-squares adjustment procedure for harmonizing physicochemical properties of organic compounds. Environ. Sci. Technol. 39(21):8434-8441.
Scheringer, M., F. Wegmann, K. Fenner, and K. Hungerbühler. 2000. Investigation of the cold condensation of persistent organic pollutants with a global multimedia fate model. Environ. Sci. Technol. 34(9):1842-1850.
Scheringer, M., M. Stroebe, F. Wania, F. Wegmann, and K. Hungerbühler. 2004. The effect of export to the deep sea on the long-range transport potential of persistent organic pollutants. Environ. Sci. Pollut. Res. Int. 11(1):41-48.
Scholz, S., and I. Mayer. 2008. Molecular biomarkers of endocrine disruption in small model fish. Mol. Cell. Endocrinol. 293(1-2):57-70.
Schwartz, D., and F. Collins. 2007. Environmental biology and human disease. Science 316(5825):695-696.
Senesac, L., and T.G. Thundat. 2008. Nanosensors for trace explosive detection. Materials Today 11(3):28-36.
Seto, E., E. Martin, A. Yang, P. Yan, R. Gravina, I. Lin, C. Wang, M. Roy, V. Shia, and R. Bajcsy. 2010. Opportunistic Strategies for Lightweight Signal Processing for Body Sensor Networks. Proceedings of the 3rd International Conference on Pervasive Technology Related to Assistive Environments-PETRA, June 23-25, 2010, Samos, Greece [online]. Available: http://www.eecs.berkeley.edu/~yang/paper/SetoPETRAE2010.pdf [accessed Jan. 12, 2012].
Seto, E., P. Yan, P. Kuryloski, R. Bajcsy, T. Abresch, E. Henricson, and J. Han. 2011. Mobile Phones as Personal Environmental Sensing Platforms: Development of the CalFit Systems. Abstract S-0034 in Abstracts of the 23rd Annual Conference of the International Society of Environmental Epidemiology (ISEE), September 13-16, 2011, Barcelona, Spain [online]. Available: http://ehp03.niehs.nih.gov/article/fetchArticle.action?articleURI=info%3Adoi%2F10.1289%2Fehp.isee2011 [accessed Sept. 4, 2012].
Shalat, S.L., A.A. Stambler, Z. Wang, G. Mainelis, O.H. Emoekpere, M. Hernandez, P.J. Lioy, and K. Black. 2011. Development and in-home testing of the Pretoddler Inhalable Particulate Environmental Robotic (PIPER Mk IV) sampler. Environ. Sci. Technol. 45(7):2945-2950.
Shelley, S. 2008. Update. Nanosensors: Evolution, not revolution…yet. CEP 104(6):8-12.
Shi, Q., H. Hong, J. Senior, and W. Tong. 2010. Biomarkers for drug-induced liver injury. Expert Rev. Gastroenterol. Hepatol. 4(2):225-234.
Shipler, D.B., B.A. Napier, W.T. Farris, and M.D. Freshley. 1996. Hanford environmental dose reconstruction project—an overview. Health Phys. 71(4):532-544.
Short, N.M., Sr. 2011. Introduction: Technical and Historical Perspectives of Remote Sensing. Remote Sensing Tutorial [online]. Available: http://rst.gsfc.nasa.gov/Intro/Part2_1.html [accessed Jan. 18, 2012].
Shoval, N., and M. Isaacson. 2006. Application of tracking technologies to the study of pedestrian spatial behavior. Prof. Geog. 58(2):172-183.
Simon, S.L., J.E. Till, R.D. Lloyd, R.L. Kerber, D.C. Thomas, S. Preston-Martin, J.L. Lyon, and W. Stevens. 1995. The Utah leukemia case-control study: Dosimetry methodology and results. Health Phys. 68(4):460-471.
Smith, P.N., G.P. Cobb, C. Godard-Codding, D. Hoff, S.T. McMurry, T.R. Rainwater, and K.D. Reynolds. 2007. Contaminant exposure in terrestrial vertebrates. Environ. Pollut. 150(1):41-64.
Soltow, Q.A., F.H. Strobel, K.G. Mansfield, L. Wachtman, Y. Park, and D.P. Jones. In press. High-performance metabolic profiling with dual chromatography-Fourier-transform mass spectrometry (DC-FTMS) for study of the exposome. Metabolomics.
Stahl, R.G., T.S. Bingman, A. Guiseppi-Elie, and R.A. Hoke. 2010. What biomonitoring can and cannot tell us about causality in human health and ecological risk assessments. Hum. Ecol. Risk Assess. 16(1):74-86.
Steenland, K., and D. Savitz. 1997. Topics in Environmental Epidemiology. New York: Oxford University Press.
Stram, D.O., and K.J. Kopecky. 2003. Power and uncertainty analysis of epidemiological studies of radiation-related disease risk in which dose estimates are based on a complex dosimetry system: Some observations. Radiat. Res. 160(4):408-417.
Su, J.G., M. Winters, M. Nunes, and M. Brauer. 2010. Designing a route planner to facilitate and promote cycling in Metro Vancouver, Canada. Trans. Res. Part A 44(7):495-505.
Swayze, G.A., R.N. Clark, S.J. Sutley, T.M. Hoefen, G.S. Plumlee, G.P. Meeker, I.K. Brownfield, K.E. Livo, and L.C. Morath. 2006. Spectroscopic and x-ray diffraction analyses of asbestos in the World Trade Center dust: Asbestos content of the settled dust. Pp. 40-65 in Urban Aerosols and Their Impact: Lessons Learned from the World Trade Center Tragedy, J.S. Gaffney, and N.A. Marley, eds. American Chemical Society Symposium Series 919. Oxford: Oxford University Press.
Tan, Y.M., K.H. Liao, and H.J. Clewell, III. 2007. Reverse dosimetry: Interpreting trihalomethanes biomonitoring data using physiologically based pharmacokinetic modeling. J. Expo. Sci. Environ. Epidemiol. 17(7):591-603.
Tang, Z., H. Wu, D. Du, J. Wang, H. Wang, W.J. Qian, D.J. Bigelow, J.G. Pounds, R.D. Smith, and Y. Lin. 2010. Sensitive immunoassays of nitrated fibrinogen in human biofluids. Talanta 81(4-5):1662-1669.
Teeguarden, J.G., P.J. Deisinger, T.S. Poet, J.C. English, W.D. Faber, H.A. Barton, R.A. Corley, and H.J. Clewell, III. 2005. Derivation of a human equivalent concentration for n-butanol using a physiologically based pharmacokinetic model for n-butyl acetate and metabolites n-butanol and n-butyric acid. Toxicol. Sci. 85(1):429-446.
Thomas, D.C., D. Stram, and J. Dwyer. 1993. Exposure measurement error: Influence on exposure-disease relationships and methods of correction. Annu. Rev. Public Health 14:69-93.
Timchalk, C., J.A. Campbell, G. Liu, Y. Lin, and A.A. Kousba. 2007. Development of a non-invasive biomonitoring approach to determine exposure to the organophosphorus insecticide chlorpyrifos in rat saliva. Toxicol. Appl. Pharmacol. 219(2-3):217-225.
Todaka, T., H. Hirakawa, J. Kajiwara, T. Hori, K. Tobiishi, D. Yasutake, D. Onozuka, S. Sasaki, C. Miyashita, E. Yoshioka, M. Yuasa, R. Kishi, T. Iida, and M. Furue. 2010. Relationship between the concentrations of polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans, and polychlorinated biphenyls in maternal blood and those in breast milk. Chemosphere 78(2):185-192.
van der Meer, F.D., P.M. van Dijk, H. van der Werff, and H. Yang. 2002. Remote sensing and petroleum seepage: A review and case study. Terra Nova 14(1):1-17.
van Donkelaar, A., R.V. Martin, M. Brauer, R. Kahn, R. Levy, C. Verduzco, and P.J. Villeneuve. 2010. Global estimates of ambient fine particulate matter concentrations from satellite-based aerosol optical depth: Development and application. Environ. Health Perspect. 118(6):847-855.
Vazquez-Prokopec, G.M., S.T. Stoddard, V. Paz-Soldan, A.C. Morrison, J.P. Elder, T.J. Kochel, T.W. Scott, and U. Kitron. 2009. Usefulness of commercially available GPS data-loggers for tracking human movement and exposure to dengue virus. Int. J. Health Geogr. 8:68.
Villenueuve, D.L., and N. Garcia-Reyero. 2011. Vision and strategy: Predictive ecotoxicology in the 21st century. Environ. Toxicol. Chem. 30(1):1-8.
Wacholder, S., D.T. Silverman, J.K. McLaughlin, and J.S. Mandel. 1992. Selection of controls in case-control studies. III. Design options. Am. J. Epidemiol. 135(9):1042-1050.
Walt, D.R. 2005. Electronic noses: Wake up and smell the coffee. Anal. Chem. 77(3):45A.
Wang, Z., S.L. Shalat, K. Black, P.J. Lioy, A.A. Stambler, O.H. Emoekpere, M. Hernandez, T. Han, M. Ramagopal, and G. Mainelis. 2012. Use of a robotic sampling platform to assess young children’s exposure to indoor bioaerosols. Indoor Air 22(2):159-169.
Wania, F., and D. Mackay. 1993. Global fractionation and cold condensation of low volatility organochlorine compounds in polar regions. Ambio 22(1):10-18.
Wania, F., and D. Mackay. 1999. The evolution of mass balance models of persistent pollutant fate in the environment. Environ. Pollut. 100(1-3):223-240.
Wania, F., and Y. Su. 2004. Quantifying the global fractionation of polychlorinated biphenyls. Ambio 33(3):161-168.
Ward, M.H., J.R. Nuckols, S.J. Weigel, S.K. Maxwell, K.P. Cantor, and R.S. Miller. 2000. Identifying populations potentially exposed to agricultural pesticides using remote sensing and a geographic information system. Environ. Health Perspect. 108(1):5-12.
Ward, M.H., J. Lubin, J. Giglierano, J.S. Colt, C. Wolter, N. Bekiroglu, D. Camann, P. Hartge, and J.R. Nuckols. 2006. Proximity to crops and residential exposure to agricultural herbicides in Iowa. Environ. Health Perspect. 114(6):893-897.
Whitehead, A., B. Dubansky, C. Bodinier, T.I. Garcia, S. Miles, C. Pilley, V. Raghunathan, J.L. Roach, N. Walker, R.B. Walter, C.D. Rice, and F. Galvez. 2011. Genomic and physiological footprint of the Deepwater Horizon oil spill on resident marsh fishes. Proc. Natl. Acad. Sci. USA [online]. Available: http://www.pnas.org/content/early/2011/09/21/1109545108.full.pdf [accessed Feb. 16, 2012].
Wild, C.P. 2005. Complementing the genome with an “exposome”: The outstanding challenge of environmental exposure measurement in molecular epidemiology. Cancer Epidemiol. Biomarkers Prev. 14(8):1847-1850.
Williamson, C.E., J.E. Saros, and D.W. Schindler. 2009. Sentinels of change. Science 323(5916):887-888.
Wilson, M.L. 2002. Emerging and vector-borne diseases: Role of high spatial resolution and hyperspectral images in analyses and forecasts. J. Geograph. Syst. 4(1):31-42.
Winkelmann, K.H. 2005. On the Applicability of Imaging Spectroscopy for the Detection and Investigation of Contaminated Sites with Particular Consideration Given to the Detection of Fuel Hydrocarbon Contamination in Soil. Ph.D. Dissertation, Brandenburgische Technische Universität, Cottbus.
Wootton, R., and L. Bonnardot. 2010. In what circumstances is telemedicine appropriate in the developing world? JRSM Short Rep. 1(5):37.
Wu, Y.Z., J. Chen, J.F. Ji, Q.J. Tian, and X.M. Wu. 2005. Feasibility of reflectance spectroscopy for the assessment of soil mercury contamination. Environ. Sci. Technol. 39(3):873-878.
Yuan, Y.M., W. Xiong, Y.H. Fang, T.G. Lan, and D.C. Li. 2010. Detection of oil spills on water by differential polarization FTIR spectrometry [in Chinese]. Guang Pu Xue Yu Guang Pu Fen Xi 30(8):2129-2132.
Zartarian, V.G., H. Ozkaynak, J.M. Burke, M.J. Zufall, M.L. Rigas, and E.J. Furtaw, Jr. 2000. A modeling framework for estimating children’s residential exposure and dose to chlorpyrifos via dermal residue contact and nondietary ingestion. Environ Health Perspect. 108(6):505-514.
Zhang, X., M.E. Monroe, B. Chen, M.H. Chin, T.H. Heibeck, A.A. Schepmoes, F. Yang, B.O. Petritis, D.G. Camp II, J.G. Pounds, J.M. Jacobs, D.J. Smith, D.J. Bigelow, R.D. Smith, and W. Qian. 2010. Endogenous 3,4-dihydroxyphenylalanine and dopaquinone modifications on protein tyrosine: Links to mitochondrially derived oxidative stress via hydroxyl radical. Mol. Cell Proteomics 9(6):1199-1208.