Feasibility of Addressing Environmental Exposure Questions Using Department of Defense Biorepositories
Proceedings of a Workshop—in Brief
The past decade has seen advancements in methods for measuring environmental exposures in biological specimens, such as blood or tissue. Chemicals can now be measured more accurately and in smaller specimen volumes. The Department of Defense (DoD) maintains biorepositories that store many biospecimens for medical purposes. To help determine the feasibility of using these biorepositories to conduct research on environmental and occupational exposures experienced by servicemembers, DoD posed the following questions:
- How could biorepositories be used to reconstruct past exposure, either incident related or longitudinally based on repeated deployments and potential exposures over time?
- Is there biological monitoring technology that can be used for detecting chemicals to support longitudinal studies (such as pre-deployment and post-deployment)? If so, how could the specimens from biorepositories be used to validate exposures to such chemicals?
- How can genomic technologies be used to identify troops who had an exposure of interest?
- Are there additional biospecimens the DoD should consider collecting to answer exposure questions?
To discuss these questions and others, the National Academies of Sciences, Engineering, and Medicine’s Committee on the Feasibility of Addressing DoD’s Environmental Exposure Questions with Biorepositories convened a two-day workshop on June 14 and 15, 2018, sponsored by the DoD. This Proceedings of a Workshop—in Brief summarizes the discussions that took place at the workshop.
DOD’S ENVIRONMENTAL AND OCCUPATIONAL EXPOSURE CHALLENGES
According to Steven Jones, the Director of Force Readiness and Health Assurance Policy in the Office of the Deputy Assistant Secretary of Defense for Health Readiness Policy and Oversight, DoD servicemembers have many different exposures in many different environments. In a non-deployed environment, occupational exposures occur in shipyards, machine shops, and firing ranges. Deployed environments can range from deserts or jungles to dense urban settings, and exposures in these settings are less predictable. Servicemembers have been exposed to herbicides, radiation, dust storms, lead, and complex mixtures of chemicals and particulate matter, such as emissions from oil well fires or burn pits. Mr. Jones stressed that deployment exposures apply not just to troops but also to staff from other federal agencies, such as the United States Agency for International Development (USAID), the Department of State, and the Federal Bureau of Investigation. These federal employees and their contractors work alongside DoD servicemembers to accomplish a mission, and they look to DoD for information on their potential exposures.
According to Jones, the DoD is piloting an individual longitudinal exposure record (ILER). ILER is a web-based application that provides the DoD and the US Department of Veterans Affairs (VA) the ability to link an individual to exposures, with the goal of improving the efficiency, effectiveness, and quality of care. Unfortunately, though, the DoD cannot monitor troops for all potential exposures. “We don’t have a device today that can monitor for 40 chemicals,” said Jones. It is important, though, to try to understand exposures and their potential impacts in order to create interventions that can protect public health.
Dr. Mark Rubertone, of the Defense Medical Surveillance System, provided an overview of the DoD serum repository. Dr. Robert Ursano, of the Uniformed Services University of the Health Sciences, discussed the repository for the Army Study to Assess Risk and Resilience in Servicemembers (STARRS) and the Study to Assess Risk and Resilience in Servicemembers Longitudinal Study (STARRS-LS). Colonel Craig Shriver, of the John P. Murtha Cancer Center, provided an overview of the Murtha Cancer Center repository.
DOD SERUM REPOSITORY, RUBERTONE
The primary purpose of the DoD serum repository is medical surveillance to support disease prevention and health program policy. This repository is the central archive of US military serum samples that are collected as part of routine, mandatory medical screenings, as well as sera collected before and after deployment, described Rubertone. The serum repository was originally established in 1989 as the Army/Navy serum repository with the purpose of storing serum remaining from routine, mandatory HIV testing. Its mission expanded in 1996 to include Air Force servicemembers and the collection and storage of deployment specimens.
The serum repository has over 65 million serial specimens from more than 11 million people. The specimens are stored at -30°C and barcoded with unique specimen identification numbers. The identification numbers can be linked with the demographic, deployment, occupational, and clinical data maintained within the Defense Medical Surveillance System (DMSS). The DMSS is a database maintained since the 1990s that contains roughly 2.5 billion ambulatory and inpatient records.
The serum repository is used to support research. On average, 15,000-20,000 aliquots of specimens are requested per year for about 30 research studies. In 2012, Rubertone and colleagues published an analysis of the DoD serum repository’s support of research and found that about 76 published studies had used DoD serum repository specimens. The published studies generally focused on infectious diseases and autoimmune disorders; however, a few studies have examined biological regulatory molecules such as chemokines, cytokines, and hormone-like molecules; genetics; or chemical exposures from the environment (Perdue et al. 2015). In one recent study, investigators measured microRNAs (small, non-coding RNAs that can decrease gene expression) and observed whether these were expressed differently in samples collected before and after deployment (Woeller et al. 2016). Another study detected free benzo[a]pyrene and observed associations with lipids, fatty acids, and sulfur amino acid metabolic pathways (Walker et al. 2016).
THE ARMY STARRS REPOSITORY, URSANO
The Study to Assess Risk and Resilience in Servicemembers (STARRS) and the STARRS Longitudinal Study (STARRS-LS) repositories store specimens from these epidemiological studies. The STARRS studies are a comprehensive set of eight separate but integrated studies focused on suicide risk and resilience, statistically weighted to be representative of the U.S. Army. These are the largest studies ever conducted on mental health and resilience, explained Ursano. A total of 45,398 soldiers have enrolled in the study and given biological samples: 34,453 were new soldiers; 10,654 were soldiers from three brigade combat teams who provided blood pre-deployment and/or post-deployment (6,070 soldiers have both pre- and post-deployment samples); and 291 were soldiers in a suicide attempt case-control study (1:2 ratio of cases to controls). At this time, the STARRS biorepository, housed within the Rutgers University Cell & DNA Repository (Piscataway, NJ), holds 158,458 biological samples in frozen storage (-80°C) collected from these 45,398 active duty Army soldiers. The STARRS biological samples include whole blood in 10 ml EDTA tubes (n=40,061), whole blood in 2.5 ml PAXgene tubes (n=16,464), plasma aliquots (n=54,929), buffy coat aliquots (n=27,978), and DNA aliquots (n=19,026).
The STARRS samples can be linked to the applicable study’s data from questionnaires, neurological and mental health assessments, and, in many cases, medical records. Despite the ample data that the samples can potentially be linked to, the Army STARRS samples have not been used for research other than the STARRS studies. Any future use of STARRS biological samples requires an application, approval by the NIMH/Army Scientific Oversight Committee, and approval by the Secretary of the Army, according to Ursano. The STARRS biorepository at Rutgers charges for retrieval of biological samples and for their packaging and shipping.
MURTHA CANCER CENTER REPOSITORY, SHRIVER
The John P. Murtha Cancer Center is a joint effort of the DoD, the Uniformed Services University of the Health Sciences, and Walter Reed National Military Medical Center that aims to prospectively collect, process, and store tissue, blood, urine, saliva, and other body fluids from patients diagnosed with, or suspected to have, a malignancy. All specimens are collected at military hospitals under standardized procedures for molecular and other types of research. The Murtha Cancer Center biorepositories are accredited by the College of American Pathologists (CAP) Biorepository Accreditation Program. The Murtha Cancer Center began over 25 years ago with the prostate program, which collects more than 10,000 samples a year from over 200 consented patients. The Murtha Cancer Center biorepository has accrued over 500,000 samples from nearly 30,000 unique patients. Military hospitals treat about two-thirds of cancer patients from the DoD healthcare system, which cares for over 1,000 active duty personnel a year who are diagnosed with cancer. The Murtha Cancer Center has hundreds of associated data fields for each patient, and cancer treatment outcome data are collected over time using standard protocols.
The Murtha Cancer Center repository samples have been used in studies with partners, such as the National Cancer Institute, universities, and corporations, through cooperative research and development agreements. Sample distribution must be approved by a committee and then by Colonel Shriver. The Murtha Cancer Center is meticulous about distributing samples, but it is willing to exhaust a specimen if the proposed research is good. These collaborations have resulted in many peer-reviewed publications and presentations at national and international meetings. “We’ve given out over 28,000 samples,” Shriver said. “Through these collaborations with academia and internal DoD, Murtha Cancer Center samples contribute to about 100 peer-reviewed publications a year, in very high impact journals.” A few studies have used tissue specimens to examine environmental exposures, such as polychlorinated biphenyls and pesticides in breast tissue. The Murtha Cancer Center also collects information about environmental exposures through questionnaires, and it has been working with the Environmental Protection Agency to link samples with the Environmental Quality Index for all active servicemembers.
USING BIOMONITORING INFORMATION TO RECONSTRUCT PAST EXPOSURES
Retrospective Exposure Estimation for Perfluorooctanoic Acid (PFOA) for Participants in the C8 Health Project
Dr. Hyeong-Moo Shin from the University of Texas at Arlington, and Dr. Scott Bartell from the University of California at Irvine gave presentations on the use of retrospective exposure modeling for participants in the C8 Science Panel Study. The study involved people living or working in eastern Ohio and western West Virginia who were exposed to perfluorooctanoic acid (PFOA), or C8, released by the DuPont Washington Works facilities from the 1950s until the early 2000s. According to Shin, the facilities released approximately 600 tons of PFOA to the local air and waterways, which contaminated the groundwater and drinking water. The C8 Science Panel Study was a cross-sectional study funded by a legal settlement with DuPont. A single biological sample taken between 2005 and 2006 was available on each study participant, as well as residential, occupational, and medical histories. To conduct an epidemiologic analysis of the health effects of PFOA, Shin reconstructed year-to-year PFOA exposure estimates to match the medical histories for each study participant. To calculate yearly PFOA exposure estimates, Dr. Shin used predicted water and air concentrations from an environmental fate and transport model, individual residential histories and water sources, and default exposure assumptions, such as average inhalation rate from the US Environmental Protection Agency (EPA) Exposure Factors Handbook (U.S. EPA 2011). Dr. Shin then coupled individual exposure estimates with a one-compartment absorption, distribution, metabolism, and excretion model to estimate time-dependent serum PFOA concentrations.
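The serum prediction step Shin described can be illustrated with a minimal one-compartment model sketch. The half-life, volume of distribution, and absorption values below are illustrative placeholders, not the calibrated parameters of the C8 study:

```python
import math

def serum_concentration(yearly_intakes_ug_per_day, half_life_years=3.5,
                        vd_l_per_kg=0.17, body_weight_kg=70.0,
                        absorbed_fraction=0.9):
    """One-compartment model: constant intake within each year, first-order
    elimination. All parameter values are illustrative placeholders, not the
    calibrated values used in the C8 study."""
    k = math.log(2) / half_life_years            # elimination rate (1/year)
    vd = vd_l_per_kg * body_weight_kg            # volume of distribution (L)
    c, series = 0.0, []
    for intake in yearly_intakes_ug_per_day:
        r = intake * 365.0 * absorbed_fraction   # absorbed dose per year (ug/yr)
        # exact one-year solution for dC/dt = r/Vd - k*C
        c = c * math.exp(-k) + (r / vd) * (1.0 - math.exp(-k)) / k
        series.append(c)                         # serum conc. at year end (ug/L)
    return series
```

With constant intake, the modeled serum level rises toward a steady state, which is one reason a single serum measurement mainly reflects recent, sustained exposure rather than exposures in any specific past year.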
Using these models, Shin was able to estimate serum PFOA concentrations at different time points. According to Shin, for all 43,449 participants, he found a Spearman’s correlation coefficient of 0.68, indicating the observed serum concentrations were reasonably well correlated with the predicted serum concentrations. When he restricted his analysis to participants who provided daily water intake rates and who had the same residence and workplace within one of the six qualifying municipal water districts for the five years before the serum sample was taken (n = 1,074), the Spearman’s correlation coefficient increased to 0.82 (Shin et al. 2011).
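The agreement statistic Shin reported is a Spearman rank correlation between predicted and observed serum values; a stdlib-only sketch of that comparison might look like:

```python
def ranks(values):
    """Assign ranks (average rank for ties) to a list of values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1                               # extend over a run of ties
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

Because Spearman's coefficient works on ranks, it rewards predictions that order people correctly by exposure even when absolute concentrations are off, which is often what matters for epidemiologic exposure classification.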
Using Biomarkers to Improve Contact-Based Exposure Assignments: From Validation to Time-Dependent Calibration
Bartell then highlighted the advantages of this retrospective modeling approach by demonstrating how improvements to the exposure assessment methods and study design resulted in increased relative risk estimates for the relationship between PFOA exposure and preeclampsia. He made this point by presenting epidemiological analyses of C8 Science Panel Study data that used different exposure measurements. In an analysis that retrospectively associated the serum PFOA measurements from 2005 to 2006 with preeclampsia in pregnancies, the risk estimates were weak for exposures above the median (adjusted odds ratio [OR]: 1.1 per interquartile range [IQR] of log serum PFOA; 95% confidence interval [CI]: 0.9, 1.4) (Stein 2009). When using log serum PFOA exposure predicted with the models developed by Shin, they also observed a modest association, but with a much more precise estimate (adjusted OR: 1.13 per IQR of log serum PFOA; 95% CI: 1.00, 1.28) (Savitz et al. 2012). Another approach that further improves the dose reconstruction, described by Bartell, is Bayesian pharmacokinetic calibration. This method uses the dose reconstruction as the prior distribution and the single serum PFOA measurement as the observed data, with the pharmacokinetic model incorporated into the likelihood equation. The risk estimates with this approach were also modest, but the uncertainty was decreased (adjusted OR: 1.16 per IQR of log serum PFOA; 95% CI: 1.03, 1.30) (Avanasi et al. 2016). The consistency among models using different exposure measures, each with different threats to validity, provides more confidence in the association. In a separate prospective analysis, which associated the 2005 to 2006 PFOA measurements with preeclampsia in pregnancies occurring after the blood collection, the risk estimates were modest (adjusted OR: 1.27 per unit log serum PFOA; 95% CI: 1.05, 1.55) (Darrow et al. 2013).
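The Bayesian calibration idea Bartell described (reconstructed dose as the prior, a single serum measurement as the data, and a pharmacokinetic model inside the likelihood) can be illustrated with a minimal grid approximation. The lognormal error term and all values are assumptions for illustration, not the Avanasi et al. implementation:

```python
import math

def calibrate_dose(prior_doses, pk_predict, observed_serum, sigma_log=0.3):
    """Grid-based Bayesian calibration sketch: the reconstructed dose acts as
    the prior, the single serum measurement is the observed data, and a
    lognormal measurement-error model links the two through the
    pharmacokinetic prediction. All numbers are illustrative assumptions."""
    posterior = []
    for dose, prior_p in prior_doses:
        predicted = pk_predict(dose)
        # lognormal likelihood of the observation given the PK prediction
        z = (math.log(observed_serum) - math.log(predicted)) / sigma_log
        posterior.append((dose, prior_p * math.exp(-0.5 * z * z)))
    total = sum(p for _, p in posterior)
    return [(d, p / total) for d, p in posterior]
```

The posterior concentrates on dose candidates whose pharmacokinetic prediction best matches the measured serum level, which is how the single biomarker measurement tightens the reconstructed dose estimate.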
Bartell discussed some advantages of dose reconstruction. For example, reconstructed dose estimates are less restricted to specific time periods than biomarkers. It may also be possible to increase the sample size by reconstructing exposures for individuals who did not provide blood samples. Bartell also noted that epidemiologic studies often use biomarker concentrations as the sole measure of exposure and compare the measured biomarker concentrations with the prevalence or incidence of disease. The problem with this approach is that biomarker measurements are “just a snapshot of body burden at the time of sample collection, which is influenced by your past exposures, but more strongly influenced by the most recent exposures,” he said. He added that another advantage of dose reconstruction is that the estimates are less susceptible to confounding by unmeasured physiologic differences, because the estimates are not based on physiologic parameters. External estimates of exposure do not have the same threats to study validity, he said. There are, however, threats to validity from non-differential measurement error in dose estimates; non-differential measurement error biases the observed epidemiologic effect toward the null. Therefore, “it’s likely if we do observe an association, the true association is larger,” explained Bartell. Reverse causation may also be less likely with external exposure estimates, unless the time-activity information used to develop the dose estimates is itself dependent on disease status. Bartell reminded attendees to view with caution studies in which exposure biomarker measurement did not precede outcome measurement, because it is unclear whether the exposure preceded the disease. Moreover, some validated exposure biomarkers have poor within-subject correlation over time, resulting in exposure misclassification.
According to Bartell, researchers should be most cautious when looking at chemical exposures that have a short biological half-life relative to the disease induction period, because pharmacokinetic models have shown that as the biological half-life decreases, the temporal variability in the exposure biomarkers increases (Bartell, Griffith, and Faustman 2004).
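Bartell's half-life point can be checked with a toy simulation: under the same variable daily exposure, a biomarker with a short half-life fluctuates far more than one with a long half-life. The lognormal intake distribution and all parameters below are arbitrary illustrations, not the Bartell et al. model:

```python
import math
import random

def biomarker_cv(half_life_days, n_days=365, seed=0):
    """Simulate a biomarker under variable daily intake with first-order
    elimination and return its coefficient of variation. The lognormal intake
    distribution is an arbitrary illustration, not real exposure data."""
    rng = random.Random(seed)
    k = math.log(2) / half_life_days
    c, series = 0.0, []
    for _ in range(n_days):
        intake = rng.lognormvariate(0.0, 1.0)  # variable daily exposure
        c = c * math.exp(-k) + intake          # decay, then today's intake
        series.append(c)
    series = series[n_days // 2:]              # discard burn-in period
    mean = sum(series) / len(series)
    sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
    return sd / mean
```

A high coefficient of variation means a single specimen is a noisy snapshot of long-term exposure, so short half-life chemicals are the ones most prone to exposure misclassification from one-time sampling.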
Dr. Bartell acknowledged that the exposure prediction used in the C8 Science Panel Study was aided by extensive occupational and residential histories and information regarding PFOA releases, a level of detail that might not be available to DoD. However, the Defense Occupational and Environmental Health Readiness System (DOEHRS) data or other data maintained by DoD could be used to predict exposures. When asked how DoD could develop exposure predictions for a burn pit scenario, Dr. Bartell explained that crude exposure estimation methods are available. For example, if DoD knew which soldiers were potentially exposed and what the potential chemical exposures might be, then DoD could measure biomarkers in samples from the serum repository and confirm chemical signals that might be expected from the burn pits. Bartell suggested DoD might look to the crude exposure predictions that were completed to determine exposures after the World Trade Center collapse.
NEW METHODS TO ANSWER EXPOSURE QUESTIONS
New tools and methods that can help answer questions about environmental and occupational exposures range from new biomarkers, laboratory techniques, or computational methods, to alternative sample types that may be more easily collected in austere environments than blood or urine.
Protein Adducts as Exposure Biomarkers: The State of the Science
According to Dr. William Funk of Northwestern University, one biomarker type that researchers should consider is serum albumin adducts. Serum albumin adducts are bound products of a reactive electrophile and serum albumin. Reactive electrophiles are difficult to measure directly in blood due to their reactivity, so adducts are often measured instead. Although reactive electrophiles can also bind to DNA, glutathione, and other proteins, such as hemoglobin, serum albumin adducts are preferred in many instances. Albumin is abundant in serum (about 60% of serum protein is albumin), and the protein contains one free thiol, HSA-Cys34, available for binding. Albumin adducts also capture a reasonably long exposure window, approximately one month. According to Funk, after forming, “the adducts stay on the protein for the lifespan of the protein.” This longer half-life allows measurement of many exposures that would otherwise go undetected.
Approaches to albumin adduct measurement, sometimes referred to as adductomics, can be either targeted, where the laboratory method is designed to find and quantify an adduct of interest, or untargeted, which provides a more qualitative exposure measure. Advantages of targeted over untargeted adduct measurements include increased analytical sensitivity, improved quantification, the use of internal standards, higher sample throughput, and lower cost per sample. From Funk’s perspective, targeted analysis is preferred in a large epidemiologic study. An example of a targeted approach is the measurement of benzene albumin adducts. After inhalation, benzene is metabolized to benzene oxide and then to reactive quinones such as benzoquinone. Benzene oxide and benzoquinone both bind with albumin. In a study of 184 subjects, both the benzene oxide and the benzoquinone adducts showed a positive linear relationship with benzene exposure estimates in air (Rappaport et al. 2002).
A disadvantage of targeted approaches is that many exposures do not have well-understood metabolic pathways. According to Funk, untargeted detection approaches may provide a method for biomarker discovery when the metabolic pathways are less well understood. For example, researchers have observed that the serum adductomic profiles of Chinese women who cooked with solid fuel differed from those of women who cooked with electricity or gas (Lu et al. 2017). Another study found differences in serum adductomic profiles between patients with chronic lung and heart disease and healthy controls in central London; in that same study, the researchers were able to measure ethylene oxide and acrylonitrile adducts and use these markers to distinguish smokers from non-smokers with 100% specificity (Liu et al. 2018).
These targeted and untargeted approaches can work together, Funk explained. His lab is now using untargeted analysis to discover potential new biomarkers, then applying targeted, high-throughput assays to panels of those biomarkers in larger numbers of samples and looking for statistical differences between exposure groups. Many of the new biomarkers reflect oxidative stress and response, but the goal is to discover more exposure biomarkers. These approaches can also be applied to less invasive biological sampling techniques, such as dried blood spots.
Non-targeted Analysis of Archived Specimens for Discovering Past Exposures
Dr. Jon Sobus from the US Environmental Protection Agency provided an in-depth overview of the strengths and limitations of non-targeted analysis methods. According to Sobus, non-targeted analysis may be useful to DoD because exposure scenarios such as burn pits can involve both chemicals that have been characterized, such as products of incomplete combustion, and chemicals that have not. In targeted analysis, the instrumentation is chosen based on the properties of the target chemical of interest, and standards and calibration curves are used to make precise, accurate, and sensitive measurements. According to Sobus, “non-targeted is a completely different animal. We do not start with the chemical. We start with the sample, and we do not know what we are looking for.” Another major difference between the two approaches is the confidence in chemical identification. With targeted methods, analysts are certain of their identifications; with non-targeted methods, many times they are not. In some cases, only the mass can be identified; in others, the chemical formula can be determined, but not the structure. In every non-targeted experiment, all levels of confidence in chemical identification can occur.
Sobus then provided a high-level overview of a typical workflow for non-targeted analysis. First, there is often a sample extraction or clean-up step. Then, the extract(s) are analyzed using a high-resolution mass spectrometer (often coupled with gas chromatography [GC] or liquid chromatography [LC]), which identifies unknown molecular features and assigns a precise mass estimate (to several decimal places) to each feature, limiting the number of candidate compounds of similar mass. Often, molecular features of greatest interest are prioritized before assigning candidate formulas and structures. There are many methods for prioritization, such as picking the largest or most frequent peaks. Statistical techniques can also be used to identify molecular features with higher or lower relative abundance compared with a control. Once the features are prioritized, steps are taken to determine the chemical formula and structure using a variety of information, including the accurate mass, retention time, and isotope distribution. Some laboratories also use tandem mass spectrometry to examine fragmentation spectra, thus improving compound identification. While not yet common practice, new methods are emerging for generating semi-quantitative estimates of chemical concentration from non-targeted analysis results.
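The step in which an accurate mass narrows the candidate list can be sketched as a simple parts-per-million (ppm) window search against a mass library. The library entries and tolerance below are illustrative assumptions, not an actual EPA workflow:

```python
def match_candidates(observed_mz, library, ppm_tol=5.0):
    """Return (name, ppm error) for library entries whose monoisotopic mass
    falls within a ppm window of the observed feature mass. The library and
    tolerance are illustrative, not an actual EPA workflow."""
    hits = []
    for name, mass in library:
        ppm = abs(observed_mz - mass) / mass * 1e6   # relative mass error
        if ppm <= ppm_tol:
            hits.append((name, round(ppm, 2)))
    return sorted(hits, key=lambda h: h[1])          # best match first
```

At 5 ppm, a feature observed near m/z 78.047 keeps C6H6 (monoisotopic mass 78.04695) as a candidate but rejects a nominal-mass neighbor at 78.050, which differs by roughly 38 ppm; this is why high mass resolution sharply limits candidate formulas.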
To address basic science questions regarding the utility of non-targeted analysis techniques, EPA’s Non-Targeted Analysis Collaborative Trial (ENTACT) was launched in 2016. Sobus described how ENTACT contains multiple parts. In one part, EPA created 10 different synthetic mixtures using ~1,200 chemicals that had been screened for bioactivity with in vitro assays in EPA’s ToxCast program. Each of the 10 mixtures contained ~100 to ~400 different chemicals. EPA sent the mixtures to ~25 different laboratories that agreed to be in the trial. The laboratories then reported their results to EPA. In turn, EPA disclosed what chemicals were in the samples. The laboratories then performed an unblinded evaluation and provided the data to EPA so it could do an assessment across all the sample groups. Another arm of ENTACT aims to evaluate what happens in real samples of interest and uses reference extracts of house dust, human serum, and silicone wristbands. EPA enriched one extract from each of these media with one of the mixtures prepared for the first arm and provided each of the 25 laboratories an enriched and an unenriched extract for analysis (Sobus 2017). EPA has not received all of the results, but preliminary observations show that, for the ~1,200 spiked chemicals, many thousands of different features have been observed. Using improved data processing methods, EPA believes that a large portion of these features are noise or artifacts, yet it also believes that the majority of spiked compounds can be identified in the mixtures as true positives. In the results received so far, EPA has observed a tremendous amount of variability, with some laboratories overreporting and some underreporting, described Sobus. EPA has also observed variability in the types of chemicals detected across non-targeted analysis methods, with subsets of compounds being detected only by GC (with electron ionization) or LC (with electrospray ionization in positive or negative ion modes).
Based on ENTACT, and his other experiences with non-targeted analysis, Sobus provided the following considerations for future applications of non-targeted analysis:
- Pollutant concentrations are generally low in blood or serum, which can limit researchers’ abilities to acquire quality tandem mass spectrometry data and confidently assign chemical structures to observed features.
- The primary goal of exposure-based biomonitoring is defining sensitive and specific biomarkers of unique exposure sources, and then using biomarker data to infer previous exposures and, if possible, attendant health risks.
- When using non-targeted analysis techniques as a source of exposure biomarker data, one must consider the potential impact of measurement error on exposure misclassification.
Non-targeted analysis methods are certainly tools that can inform a retrospective exposure assessment, but great care must be taken to avoid misinterpretation of data that are meant for discovery more than routine monitoring.
Challenges and Opportunities in Developing a Search Engine for Exposures Associated with Disease Risk
Dr. Chirag Patel of Harvard University presented on building a “search engine,” based on machine learning methods, to find environmental factors associated with health and disease phenotypes. Patel and his collaborators aim to understand the variation in phenotype that is due to environmental factors. Phenotype is a combination of the genes we inherit and the environment in which we live, Patel reminded the audience. Heritability is the amount of variation in phenotype that can be explained by genetics. Traits such as eye color and hair curliness have high heritability, but most diseases have modest to moderate heritability. In the past, many experts thought variation in disease states could be explained by lifestyle factors, but it is doubtful that lifestyle can explain all the variation. Environmental epidemiology studies have traditionally had small sample sizes due to cost and complexity. However, many efforts are underway to create large datasets that can be mined to answer these questions, including the United Kingdom Biobank, exposome studies in Europe, NHANES, and the new National Institutes of Health All of Us cohort.
The datasets produced by these research initiatives can be used to explore the relationships between exposures and phenotypes. Patel described an approach to using these datasets to understand the linkages between the genome and the environment. According to Patel, “a data-driven way to explore the links between exposures and phenotypes is to look at all of them simultaneously,” a method often referred to as an exposome-wide association study (EWAS). Patel explained that these large data-driven analyses can generate novel hypotheses and describe variation at a larger scale than is possible with a more prescriptive, targeted analysis. However, to begin to establish plausibility, it is important to link chemical exposure biomarkers with the exogenous environment. To evaluate biological plausibility, it is also important to link exposure with changes in the functional genome and gene expression. In summary, to make inferences from associations observed in these studies, large sample sizes are required for robust measures. To further explore the observations, Patel believes longitudinal measures are also necessary, so that time dependence and windows of vulnerability can be explored.
Patel then provided examples of how he and his collaborators have followed this approach, developing streams of evidence using robust study designs and analytical methods. Several years ago, Patel and his collaborators conducted an exposome-wide association study using data collected from the National Health and Nutrition Examination Survey (NHANES). Using 253 quantitative biomarkers of environmental exposure, they used regression to test for associations between environmental and behavioral factors and all-cause mortality (Patel et al. 2013). They found that low household income was associated with early death, while physical activity and young age were associated with longevity.
According to Patel, he and his colleagues aimed to replicate these findings in another study, in which they observed 452 associations with telomere length (shorter telomeres are associated with aging). Consistent with the previous study, they found that physical fitness was strongly correlated with increased telomere length. However, because they were again testing many associations at once, they also observed something completely unexpected: exposure to polychlorinated biphenyls (PCBs) was also associated with increased telomere length. To explore the biological plausibility of the relationship between PCBs and aging, Patel and his colleagues completed a meta-analysis of seven gene expression datasets and found modest evidence that PCBs may modify the expression of genes implicated in telomere length. The meta-analysis also consistently replicated the association between telomere length and physical activity (Patel et al. 2016). As a follow-on to the physical activity findings, Patel and colleagues are now conducting an intervention study examining gene expression arrays from participants before and after physical activity and how gene expression differs between people of different activity levels, as measured by the number of steps taken per day. Preliminary results are supportive of their earlier findings and suggest that exercise induces protective changes in gene expression.
In Patel’s opinion, developing robust study designs for inference requires understanding the dense correlations observed in large studies. Patel discussed an effort to address these issues by Raj Manrai and colleagues, who developed a phenotype-exposure map by merging eight years of NHANES data, including 158 phenotypic levels and 510 quantitative exposure levels. They found over 67,000 associations that remained significant after false discovery rate correction and adjustment for covariates including age, sex, race, and income. Examples include observed positive associations between PCBs and triglycerides and between nutrients and BMI. This is promising, according to Patel, because the map shows how different phenotypes change in response to behaviors, biomarkers of exposure, or nutrients. In conclusion, Patel again advocated using large biorepositories to answer these questions because the large datasets can reveal many correlations, and patterns may emerge that can be followed up with future longitudinal studies.
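The screening step behind such maps, in which many exposure-phenotype associations are tested at once and only those surviving false discovery rate control are kept, can be sketched with the Benjamini-Hochberg procedure. The p-values below are illustrative, not NHANES results:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of discoveries, controlling the false discovery rate at q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    # Benjamini-Hochberg: find the largest k with p_(k) <= (k / m) * q
    below = p[order] <= (np.arange(1, m + 1) / m) * q
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        mask[order[: k + 1]] = True  # mark the k smallest p-values as discoveries
    return mask

# Illustrative p-values, as if from regressions of one phenotype on many exposures
pvals = [0.0004, 0.009, 0.013, 0.041, 0.2, 0.35, 0.6, 0.81, 0.9, 0.95]
significant = benjamini_hochberg(pvals, q=0.05)
```

Here the three smallest p-values survive correction; the fourth (0.041) would pass an unadjusted 0.05 cutoff but not the FDR threshold, which is the kind of filtering that reduces 510 exposures times 158 phenotypes to a defensible association map.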
Panel and Audience Discussion: How Might These Techniques Be Useful to DoD?
In a panel discussion following the first three talks in this session, Dr. Benjamin Blount from the Centers for Disease Control and Prevention (CDC) commented on the importance of patterns in complex datasets. Over several decades of NHANES, CDC has amassed tens of thousands of targeted biomonitoring measurements of volatile organic compounds (VOCs). Recently, Blount and his colleagues have asked whether different patterns of VOC concentrations could be associated with a potential source of exposure. To investigate this, they used an artificial neural network (ANN), a computational tool that aims to mimic the brains of humans and animals. ANNs are used by internet search engines to identify images, such as distinguishing a dog from a cat. ANNs use a supervised learning phase to process data inputs into something humans can understand. Recently, the ANN used in Blount’s lab has distinguished people with VOC exposure due to fuel use from people with VOC exposure due to smoking (Chambers et al. 2017). This is an example of how these patterns can be important sources of information.
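As a rough illustration of the supervised learning phase Blount described, the sketch below trains a small neural network (plain NumPy, no ML library) to separate two synthetic VOC exposure profiles. The class means, noise levels, and network architecture are all invented for illustration and bear no relation to the CDC data or the Chambers et al. model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, illustrative profiles only: columns stand for three VOC biomarkers;
# the "fuel" class is assumed high in the first two, the "smoking" class in the third.
n = 200
fuel = rng.normal([3.0, 2.5, 0.3], 0.5, size=(n, 3))
smoking = rng.normal([1.0, 0.8, 2.5], 0.5, size=(n, 3))
X = np.vstack([fuel, smoking])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = fuel, 1 = smoking

# One hidden layer trained by gradient descent on the logistic loss:
# a bare-bones stand-in for an ANN's supervised learning phase.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(1000):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()      # predicted P(smoking source)
    g_out = (p - y)[:, None] / len(y)     # gradient of mean logistic loss
    g_h = (g_out @ W2.T) * (1.0 - h**2)   # backpropagate through tanh
    W2 -= 0.5 * (h.T @ g_out); b2 -= 0.5 * g_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ g_h);   b1 -= 0.5 * g_h.sum(axis=0)

accuracy = float(((p > 0.5) == y).mean())  # training accuracy on the synthetic data
```

Because the two synthetic classes are well separated, the network quickly learns a near-perfect boundary; the real analytical difficulty lies in curating labeled exposure-source data, not in the network mechanics.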
Deployment-associated Exposure Surveillance with High-Resolution Metabolomics
Opening the second half of the session, Dr. Dean Jones of Emory University also discussed the utility of non-targeted metabolomic exposure monitoring technology and its application to observing differences in patterns of exposures associated with deployment. The biological monitoring technology used has been liquid chromatography or gas chromatography with ultra-high-resolution mass spectrometry. These techniques were inspired by former NIH director Elias Zerhouni’s “Data of the Future,” in which Zerhouni called for cost-effective approaches to chemical analysis. In 2010, Jones and colleagues described an approach using high-resolution mass spectrometry with computational methods for personalized medicine (Johnson et al. 2010). He and his colleagues have since improved computational tools to extract information from the non-targeted data. One computational tool commonly used by Jones and colleagues is “Mummichog,” which processes the preliminary analysis steps, such as metabolome-wide associations and partial least squares discriminant analysis, to determine the metabolites of interest through a network and pathway analysis (Li et al. 2013). Jones and colleagues have also used high-resolution metabolomics to measure multiple types of exposures and connect those exposures to biologic response. From this research they have associated chemical signatures with dietary exposures, supplements and pharmaceuticals, the microbial communities that live in the human body, and commercial products and environmental chemicals (Park et al. 2012).
From Dean Jones’ perspective, the advantage of high-resolution metabolomics is that it connects the internal chemical dose to human chemical exposures. The multitude of human exposures may initiate metabolic responses, which are more closely linked with biomarkers of disease risk or health outcomes (Jones et al. 2016). For about 100 US dollars (USD), more than 10,000 chemicals can be measured in 100 microliters of blood. For a somewhat higher cost (about 250 USD), 100,000 chemical signals can be measured in 250 microliters of blood. Jones also noted that although 10,000 chemicals are detectable, only about 1,000 are currently identified. Nonetheless, based on these advancements in metabolomic approaches, Jones said that he believes the “data of the future” is available today. The high-resolution metabolomics workflow is simple, requiring only a one-step protein removal and ion spectrometry. The methods are high-throughput and automated, taking only about five minutes per sample. Coverage is extensive, with over 20,000 ions detectable. The measurements are also reproducible and relatively quantitative, and they can be standardized for a cumulative chemical library (Uppal et al. 2016). Recently, according to Jones, he and his collaborators have been able to use several computational tools that they developed to integrate the chemical measurement data with other types of phenotypic and outcome data (Uppal et al. 2017).
Dean Jones then described the application of these methods in the study mentioned by Rubertone that used samples from the DoD Serum Repository. In this study, they observed differences in exposure patterns between troops deployed near burn pits and those deployed to a control area. They observed correlations with PAHs for every metabolite in the naphthalene pathway. Additionally, they observed that concentrations of insecticides, miticides, fungicides, flame retardants, plasticizers, and other environmental chemicals had measurable differences associated with deployment. Other evidence indicates that metabolic products of the adductome are measurable, suggesting that they may have identified biomarkers that could be useful in surveillance of deployment-associated exposures occurring weeks or months earlier. They also observed that targeted environmental chemical and metabolic pathway effects differed between pre- and post-deployment. Preliminary data showed possible associations between International Statistical Classification of Diseases and Related Health Problems (ICD-9) codes for respiratory illness and some of the measured chemicals (Walker et al. 2016). According to Dean Jones, the main point is that substantially more chemicals differ between pre- and post-deployment among the soldiers deployed near the burn pits than among those deployed to the control environment.
Integrative Mass Spectrometry to Measure the Human Exposome
Dr. Douglas Walker, also of Emory University, discussed integrative mass spectrometry to measure the human exposome, work he conducted in collaboration with Dean Jones and others. Walker began by noting that although the environment is recognized to play a critical role in the development of human diseases, the ability to measure many exposures of interest lags far behind the ability to measure genetics. There is a critical need to develop analytical frameworks for universal screening to provide measures of 10,000-100,000 exposures. For him, the metabolome, or the complete set of small molecules in the human body, provides a critical link between exposure, biological response, and disease outcomes (Walker et al. 2016). Walker then explained how chromatography, high-resolution mass spectrometry, and chemometrics work together to provide a framework for high-resolution metabolomics: the gas or high-performance liquid chromatography separates the chemicals, the high-resolution mass spectrometry measures an accurate mass, and the chemometrics allow for computation, data extraction, alignment, and characterization.
Walker then provided an example study using high-resolution metabolomics to assess occupational exposures to trichloroethylene (TCE). In this study, researchers evaluated TCE exposures among workers in Guangdong, China, using commercial vapor diffusion monitors. They then applied high-resolution metabolomics to post-shift plasma collected from 95 workers unexposed to TCE and 80 workers who used TCE in a manufacturing environment. The high-resolution metabolomics detected many chemical features associated with TCE exposure and identified metabolites of TCE that are consistent with its toxicological effects. They then regressed disease risk biomarkers against exposure biomarkers to find correlations and determine which metabolic perturbations may be associated with TCE exposure. In this metabolome-wide association study of occupational exposure to TCE, they detected dose-associated metabolic variations consistent with signals from halogenated compounds and endogenous metabolites. Walker et al. observed correlations between uric acid and trichloroethanol. They also observed that the unidentified compounds showed an association with disease risk biomarkers, which he felt demonstrated a strength of the untargeted approach (Walker et al. 2016).
In another study, Walker and colleagues completed a metabolomic assessment of individuals with yearlong exposures to near-highway ultrafine particulate matter. They used high-resolution metabolomics to quantify central metabolic intermediates among 28 individuals with low exposure to ultrafine particles (<19,000 particles/cm3) and 31 individuals with high exposure (>19,000 particles/cm3), then calculated correlations between the metabolites and markers of systemic response, endothelial function, and inflammation. From this analysis, Walker observed that the exposure biomarkers representing internal dose were consistent with those observed from exposure to vehicle emissions and other respiratory pollutants. He also observed that expression of discriminatory metabolites, such as linolenic acid, was consistent with increased oxidative stress and endothelial dysfunction. These observations indicate to Walker that long-term traffic pollution exposure is associated with distinct metabolic variations (Liang et al. 2018). Moreover, Walker reported that the annotated data show changes in chemicals consistent with exhaust constituents and metabolic changes suggesting increased risk for cardiopulmonary diseases.
Walker also discussed a new platform he calls “high-resolution exposomics.” Using high-resolution exposomics, Walker has been able to improve capabilities for detecting low-level pollutant profiles in human samples using National Institute of Standards and Technology (NIST) serum. He has observed these improvements with PCBs, polybrominated diphenyl ether flame retardants (PBDEs), and organochlorine (OC) pesticides. Walker believes high-resolution exposomics combined with high-resolution metabolomics provides the functional measures and sensitivity required for exposome-wide association studies of human health and disease. According to Walker, exposome-wide association studies and metabolome-wide association profiling of DoD Serum Repository samples could be integrated into current protocols, which could greatly improve DoD’s chemical surveillance and monitoring of early biological effects (Walker et al. 2016). Additionally, Walker reported that the application of high-resolution metabolomics to alternative matrices, such as passive silicone badges or interstitial fluid, has shown that the laboratory method is sensitive enough to detect microenvironment exposures and internal dose biomarkers (Niedzwiecki et al. 2018).
Walker closed by explaining that improved measures of exposure and effect are available and ready for application to DoD repository samples. The combination of environmental chemical surveillance and biological effect monitoring could provide a unified platform. Additionally, Walker explained that new personal exposure monitoring tools, such as silicone wristbands and interstitial fluid patches,2 have the potential to allow DoD to collect more measures of chemical exposures during deployment.
State of the Science in Dried Blood Spots
Dr. Jeffrey Freeman of the Johns Hopkins Applied Physics Laboratory provided an overview of dried blood spots, recent advancements, and their application. Freeman described dried blood spots as “micro-samples in which blood, usually from a pricked finger, is spotted on specially prepared filter paper and dried in open air under ambient conditions.” Dried blood spots have many advantages: they are minimally invasive, do not require a phlebotomist to collect, carry reduced biohazard risks, do not require refrigeration or freezing, and have a wide analytic range. Freeman noted that dried blood spots do have some current constraints, such as small sample quantity, issues around potential contamination, and sample stability given the ambient drying and storage conditions. In spite of these limitations, interest in microsamples, and dried blood spots in particular, is increasing, and the science is advancing quickly. Improvements in the quality and availability of highly sensitive analytic instrumentation, such as liquid chromatography tandem mass spectrometry (LC-MS/MS), as well as new methodological approaches for collection and storage and improvements in measurement interpretation, are driving the increasing interest in dried blood spots. According to Freeman, although there is a lot of interest in dried blood spots, there are practical issues regarding their use.
Freeman discussed a scoping review of reviews he and his colleagues recently completed to characterize the current state of the science in dried blood spots and the considerations required before using them. To complete the scoping review of reviews, each review paper was analyzed with a SWOT analysis (strengths, weaknesses, opportunities, and threats). If a review paper identified a paper that was not in their literature database, they added it to build a comprehensive analyte database for dried blood spot evaluation studies. Freeman found that almost 2,000 different analytes have been measured in dried blood spots using over 150 different analytic methods. Example categories of biomarkers include: genetic, epigenetic, and transcriptomic biomarkers; protein, metabolite, and lipid biomarkers; biomarkers of bacterial, viral, parasitic, and protozoan infection; hormones, cytokines, and antibodies; and environmental chemicals such as persistent organic pollutants, polycyclic aromatic hydrocarbons, and heavy metals. Among the key findings, Freeman noted that dried blood spots are a non-standard measurement, so laboratories need to use different sample preparation and workflows for analysis than those used for serum or plasma. The type of fluid collected (capillary blood, versus venous blood for traditional samples) is also different, as capillary blood contains interstitial fluid and venous blood does not. Therefore, investigators will need to consider the potential for bias in their measurements due to differences from traditional samples. In the review, they found that drying the spot cards in ambient conditions for 2-4 hours was optimal. Open-air drying is not conducive to sampling in non-clinic settings, however, because it presents an opportunity for sample contamination, and different temperatures and levels of humidity in the environment can significantly alter analyte measurements. Therefore, analyte measurements may need to be adjusted, which could introduce additional variability into the result, making some analytes challenging to quantify accurately (Freeman et al. 2017).
Freeman then discussed experiments conducted to determine how some of these field sampling limitations could be overcome. First, Freeman and colleagues aimed to determine whether they could develop an alternative to open-air drying that was more conducive to field collection and that standardized drying conditions regardless of the surrounding environment. They developed a method for immediate storage of the dried blood spots in an opaque airtight container holding an experimentally optimized amount of a molecular sieve desiccant to remove water from the air. The desiccant rapidly dried the blood spot and created controlled humidity conditions in the container. To test analytical method performance following this collection method, Freeman and colleagues applied molar ratios of a three-gene tuberculosis (TB) signature and analyzed ribonucleic acid (RNA) from the blood spots consistent with the TB signature. Samples were exposed to different environments while drying: a simulated rainforest (National Aquarium), an actual rainforest (Madagascar), and drone transport (Virginia Tech). From this experiment, they found that their novel method achieved rapid drying in all environments. A moisture-free environment was achieved inside the container and sustained under the different conditions tested, including extreme cold and heat. They also achieved a sensitivity and specificity of 80% and 92%, respectively, with the three-gene TB signature on the dried blood spots in a validation study conducted in Madagascar. Based on this study, Freeman believes dried blood spots collected with this method may be used effectively in any environment. Freeman believes that under the current constraints, dried blood spots are suitable for DoD and could be used for most force-wide surveillance of servicemembers (e.g., HIV screening, drug of abuse screening, and forensic identification).
Dried blood spots provide a cheaper, simpler method of specimen collection and can be used to characterize many occupational scenarios, both known and unknown (Freeman et al. 2017).
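The 80% sensitivity and 92% specificity Freeman reported are standard confusion-matrix quantities; the sketch below computes them from hypothetical counts chosen only to reproduce those figures (the actual Madagascar counts were not reported here):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 50 true positives-plus-misses, 50 true negatives-plus-false-alarms
sens, spec = sensitivity_specificity(tp=40, fn=10, tn=46, fp=4)  # 0.80, 0.92
```

With these numbers, 10 of every 50 infected individuals would be missed and 4 of every 50 uninfected individuals would be flagged, the kind of trade-off that determines whether a dried-blood-spot assay is acceptable for force-wide screening.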
Using Pooled Samples to Refine Questions of Interest
Dr. Lesa Aylward, of Summit Toxicology, discussed a different approach that might be useful to DoD: sample pooling. According to Aylward, sample pooling is useful “in an attempt to refine and conduct some initial testing of exposure hypotheses.” To describe the importance of testing hypotheses she explained, “There’s sort of a paradox inherent in a biorepository, we bank these biological
2A patch that is adhered to the skin that uses an array of microneedles to create micropores on the skin that then draws a sample of tissue interstitial fluid (ISF), the fluid in spaces between the tissue cells.
samples and they have an incredible potential, there is so much information in these samples, but they’re so valuable that making a decision to use samples is actually difficult.” It is difficult to decide which studies are worth expending samples on because analysis is destructive, and once the samples are gone, they are gone. Aylward believes pooling can help with making these decisions.
When using pooled samples, the measured value can be interpreted as the arithmetic mean of all the individuals in the pool, assuming they all contributed the same volume, explained Aylward. The advantage is that only a small volume of sample is exhausted from each individual in the pool, while a large enough total sample is obtained to address hypotheses. Statistical approaches also exist that allow inference of population variation based on pooled sample analysis (Caudill et al. 2010). NHANES has been using pooled samples to monitor exposures to persistent organic pollutants since 2005 because measuring these exposures requires a large sample volume. For these pooled samples, NHANES reports trends by age, time, and the general population to provide a picture of population-level exposures and changes (Sjodin et al. 2013). According to Aylward, case-control studies can also be conducted using sample pools. In one study, investigators conducted a full individual analysis and then created artificial sample pools, and they were able to recreate the original effect estimates and confidence intervals (Lyles et al. 2016).
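The equal-volume pooling interpretation Aylward described can be verified in a few lines; the concentrations below are simulated, not from any repository:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical serum concentrations (ng/mL) for the 10 individuals in one pool
conc = rng.lognormal(mean=1.0, sigma=0.5, size=10)
vol = np.full(10, 50.0)  # equal 50-microliter aliquots from each individual

# The pool's concentration is total analyte mass divided by total volume, which
# with equal aliquots reduces to the arithmetic mean of individual concentrations
pooled_conc = (conc * vol).sum() / vol.sum()
```

With unequal aliquot volumes the same formula gives a volume-weighted mean instead, which is why equal-volume contribution is the key assumption when interpreting a pooled result as a population average.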
According to Aylward, pooling could be used by DoD repositories to test and refine initial hypotheses in a resource-efficient manner. Scenarios where pooling might be useful to DoD include: an initial test of hypotheses regarding potential exposures associated with deployment or occupation, suspect chemical screening or non-targeted analysis, initial hypothesis tests of disease-exposure associations, and validation or refinement of external exposure indices. In a study of 150 airport firefighters who trained extensively with firefighting foam containing perfluorinated compounds, Aylward and her colleagues analyzed individual samples for perfluorinated compounds. They then created 15 synthetic sample pools of 10 people each, grouped by years of occupational exposure. The analysis showed that the pooled results perfectly tracked the individuals’ exposure groups. As a follow-on to this analysis, Aylward and colleagues conducted a non-targeted analysis for suspect chemical screening and found increased perfluorinated compounds in the firefighters compared with controls (Rotander et al. 2015). Aylward closed by stating that the major advantage of a pooling approach is the reduced resources required to test initial hypotheses. In later phases of research, where the goal may be to determine maximum concentrations or estimate population variance, pooling may not be a good approach.
BIOMARKER INTERPRETATION AND INTERACTION WITH DISEASE STATES
Biomonitoring Considerations Critical to Generate and Interpret Exposure Data Using Archived Samples
Dr. Antonia Calafat of the Centers for Disease Control and Prevention, Dr. Judy LaKind of LaKind Associates, and Dr. Joyce Tsuji of Exponent discussed considerations in interpreting biomarker measurements. According to Calafat, “the concentrations measured by those biospecimens tend to be trace concentrations, much lower than concentrations of the same chemicals in the environment.” So, when archived biospecimens are used for exposure biomonitoring, researchers should ensure that the samples were collected using methods that reduce the risk of contamination and that the selected biomarkers can accurately assess exposure. The analytical methods for biomonitoring must be highly sensitive, selective, and specific, and follow rigorous quality control and quality assurance protocols to ensure reliable measures of target chemicals, Calafat continued. She also stressed that a biomonitoring result is a measurement of the chemical concentration in the sample plus any contamination that may have occurred during collection. A few questions should be asked in order to interpret a biomonitoring result: 1) Could the sample collection procedure have introduced any contamination? 2) Is the chemical measured in the correct matrix? (Persistent compounds are generally measured in blood and non-persistent compounds in urine.)
Another piece of information that is important in interpreting biomonitoring results is knowledge of the chemical’s toxicokinetics, explained Calafat. For example, for non-persistent chemicals, the free (or bioactive parent chemical) species is typically excreted much faster than the conjugated chemical species. The ratio of free to conjugated chemical can therefore help identify contamination, as the free species should generally be lower in concentration.
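The free-to-conjugated ratio check Calafat described might be sketched as a simple screening function; the 0.5 cutoff below is an illustrative placeholder, not a published criterion, and an appropriate value would depend on the chemical’s toxicokinetics:

```python
def flag_possible_contamination(free_conc, conjugated_conc, max_free_fraction=0.5):
    """Flag a non-persistent chemical measurement whose free (unconjugated)
    fraction is unusually high relative to the conjugated species, a pattern
    that can indicate contamination introduced during sample collection.

    max_free_fraction is a hypothetical cutoff used only for illustration."""
    free_fraction = free_conc / (free_conc + conjugated_conc)
    return free_fraction > max_free_fraction

flag_possible_contamination(0.2, 4.8)  # low free fraction: consistent with metabolism
flag_possible_contamination(3.0, 1.0)  # high free fraction: flag for review
```

A flagged sample is not proof of contamination, only a prompt to review the collection procedure before interpreting the result as exposure.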
How to Assess and Interpret the Biomonitoring Data Once You Have It
According to Dr. LaKind, it may also be important to understand exposure factors that could influence the measurements. She said, “You must be confident that the exposure preceded the outcome, so you should understand the chemical stability in the body (LaKind et al. 2014).” For example, with bisphenol-A (BPA), the main route of exposure is through diet. Some studies have observed that children have a higher intake than adults. However, children also have a shorter fasting time before the sample is collected. So, according to LaKind, “the question is do adults have a lower exposure to BPA than kids, or is that an artifact of sampling methodology?”
Effect of Individual-specific Conditions on Biomonitoring Results: Metals as Case Examples
From Dr. Joyce Tsuji’s perspective, it is also important to consider individual differences that might affect exposure measures. For example, methylation efficiency, which varies with nutritional status, the influence of other substances such as smoking, and genetics may all affect measures of arsenic in blood or urine. Interactions between two metals or minerals may also affect absorption; examples include cobalt and iron, manganese and iron, zinc and cadmium, and lead and calcium. Iron status may also affect measured levels of cobalt. Protein status and levels of albumin and other metal carrier proteins influence the pharmacokinetics of many metals. Similarly, disease states such as diabetes and cardiovascular disease lower protein status and functional albumin levels. Adjustment of metal concentration in urine using creatinine to account for the sample’s hydration state may, therefore, inflate metal concentrations for those with low creatinine, such as diabetics and those with lower lean body mass, including women. Since urinary excretion is a major elimination route for most metals, reduced kidney function will increase blood levels of metals. Lastly, for metals with bone storage such as lead or tungsten, a fracture or osteopenia may result in increased levels in blood and urine from historical exposures.
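Tsuji’s point about creatinine adjustment can be made concrete with a small worked example; all numbers are illustrative:

```python
# Illustrative numbers only: the same urinary metal concentration (ug/L)
# measured in two people with different urinary creatinine levels (g/L).
metal_ug_per_L = 5.0
creatinine_typical = 1.2  # g/L
creatinine_low = 0.4      # g/L, e.g. lower lean body mass or diabetes

# Creatinine adjustment divides by creatinine to correct for hydration state,
# yielding ug of metal per g of creatinine.
adj_typical = metal_ug_per_L / creatinine_typical
adj_low = metal_ug_per_L / creatinine_low

# The low-creatinine individual's adjusted value is three times higher despite
# an identical measured concentration, the inflation Tsuji described.
```

This is why creatinine-adjusted results should be interpreted in light of the individual’s creatinine level rather than compared directly across groups with systematically different creatinine excretion.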
WHAT APPROACHES MIGHT BE USEFUL TO DOD’S NEEDS?
The workshop concluded with a discussion, moderated by Funk, about how biorepository samples could help DoD answer questions about environmental and occupational exposures. Many audience members who participated in the discussion felt the techniques and tools discussed at the meeting were useful for DoD. One participant said, “we need to move toward a more interactive process between DoD staff who have detailed knowledge of the datasets and the people who are using cutting-edge techniques.” Another speaker said, “there’s a community here, in this room that wants to help and I think has information that can be very helpful.” Steve Jones replied to these comments by saying that is something to consider going forward, and added that there is also a concern about keeping the repositories relevant, as there are specific complexities to providing services or information to protect the health of the forces in a distributed operations context.
REFERENCES

Avanasi, R., H. M. Shin, V. M. Vieira, and S. M. Bartell. 2016. “Variability and epistemic uncertainty in water ingestion rates and pharmacokinetic parameters, and impact on the association between perfluorooctanoate and preeclampsia in the C8 Health Project population.” Environ Res 146:299-307. doi: 10.1016/j.envres.2016.01.011.
Bartell, S. M., W. C. Griffith, and E. M. Faustman. 2004. “Temporal error in biomarker-based mean exposure estimates for individuals.” Journal of Exposure Science and Environmental Epidemiology 14, no. 2 (2004):173.
Caudill, S. P. 2010. “Characterizing populations of individuals using pooled samples.” Journal of Exposure Science and Environmental Epidemiology 20, no. 1 (2010):29.
Chambers, D. M., C. M. Reese, L. G. Thornburg, E. Sanchez, J. P. Rafson, B. C. Blount, J. R. E. Ruhl III, and V. R. De Jesús. 2017. “Distinguishing Petroleum (Crude Oil and Fuel) From Smoke Exposure within Populations Based on the Relative Blood Levels of Benzene, Toluene, Ethylbenzene, and Xylenes (BTEX), Styrene and 2, 5-Dimethylfuran by Pattern Recognition Using Artificial Neural Networks.” Environmental Science & Technology 52, no. 1 (2017):308-316.
Darrow, L. A., C. R. Stein, and K. Steenland. 2013. “Serum perfluorooctanoic acid and perfluorooctane sulfonate concentrations in relation to birth outcomes in the Mid-Ohio Valley, 2005–2010.” Environmental Health Perspectives 121, no. 10 (2013):1207.
Freeman, J. D., L. M. Rosman, J. D. Ratcliff, P. T. Strickland, D. R. Graham, and E. K. Silbergeld. 2017. “State of the Science in Dried Blood Spots.” Clinical Chemistry (2017): clinchem-2017.
Johnson, J. M., T. Yu, F. H. Strobel, and D. P. Jones. 2010. “A practical approach to detect unique metabolic patterns for personalized medicine.” Analyst 135, no. 11 (2010):2864-2870.
Jones, D. P. 2016. “Sequencing the exposome: a call to action.” Toxicology Reports 3 (2016):29-45.
LaKind, J. S., J. R. Sobus, M. Goodman, D. B. Barr, P. Furst, R. J. Albertini, T. E. Arbuckle, G. Schoeters, Y. M. Tan, J. Teeguarden, R. Tornero-Velez, and C. P. Weisel. 2014. “A proposal for assessing study quality: Biomonitoring, Environmental Epidemiology, and Short-lived Chemicals (BEES-C) instrument.” Environ Int 73:195-207. doi: 10.1016/j.envint.2014.07.011.
Li, S., Y. Park, S. Duraisingham, F. H. Strobel, N. Khan, Q. A. Soltow, D. P. Jones, and B. Pulendran. 2013. “Predicting network activity from high throughput metabolomics.” PLoS Computational Biology 9, no. 7 (2013):e1003123.
Liang, D., J. L. Moutinho, R. Golan, T. Yu, C. N. Ladva, M. Niedzwiecki, D. I. Walker, S. E. Sarnat, H. H. Chang, R. Greenwald, D. P. Jones. 2018. Use of high-resolution metabolomics for the identification of metabolic signals associated with traffic-related air pollution. Environment International. 2018 Nov 1;120:145-54.
Liu, S., H. Grigoryan, W. M. B. Edmands, S. Dagnino, R. Sinharay, P. Cullinan, P. Collins, K. F. Chung, B. Barratt, F. J. Kelly, P. Vineis, and S. M. Rappaport. 2018. “Cys34 Adductomes Differ between Patients with Chronic Lung or Heart Disease and Healthy Controls in Central London.” Environmental Science & Technology 52 (4):2307-2313. doi: 10.1021/acs.est.7b05554.
Lu, S. S., H. Grigoryan, W. M. Edmands, W. Hu, A. T. Iavarone, A. Hubbard, N. Rothman, R. Vermeulen, Q. Lan, and S. M. Rappaport. 2017. “Profiling the Serum Albumin Cys34 Adductome of Solid Fuel Users in Xuanwei and Fuyuan, China.” Environmental Science & Technology 51(1):46-57. doi: 10.1021/acs.est.6b03955.
Lyles, R. H., E. M. Mitchell, C. R. Weinberg, D. M. Umbach, and E. F. Schisterman. 2016. “An efficient design strategy for logistic regression using outcome- and covariate-dependent pooling of biospecimens prior to assay.” Biometrics 72(3):965-975.
Niedzwiecki, M. M., P. Samant, D. I. Walker, V. Tran, D. P. Jones, M. R. Prausnitz, and G. W. Miller. 2018. “Human Suction Blister Fluid Composition Determined Using High-Resolution Metabolomics.” Analytical Chemistry 90(6):3786-3792.
Park, Y. H., K. Lee, Q. A. Soltow, F. H. Strobel, K. L. Brigham, R. E. Parker, and M. E. Wilson. 2012. “High-performance metabolic profiling of plasma from seven mammalian species for simultaneous environmental chemical surveillance and bioeffect monitoring.” Toxicology 295(1-3):47-55.
Patel, C. J., A. K. Manrai, E. Corona, and I. S. Kohane. 2016. “Systematic correlation of environmental exposure and physiological and self-reported behaviour factors with leukocyte telomere length.” International Journal of Epidemiology 46(1):44-56.
Patel, C. J., D. H. Rehkopf, J. T. Leppert, W. M. Bortz, M. R. Cullen, G. M. Chertow, and J. P. A. Ioannidis. 2013. “Systematic evaluation of environmental and behavioural factors associated with all-cause mortality in the United States National Health and Nutrition Examination Survey.” International Journal of Epidemiology 42(6):1795-1810.
Perdue, C. L., A. A. Eick Cost, M. V. Rubertone, L. E. Lindler, and S. L. Ludwig. 2015. “Description and utilization of the United States Department of Defense Serum Repository: A review of published studies, 1985-2012.” PLoS ONE 10(2):e0114857.
Rappaport, S. M., S. Waidyanatha, Q. Qu, R. Shore, X. Jin, B. Cohen, L. C. Chen, A. A. Melikian, G. Li, S. Yin, H. Yan, B. Xu, R. Mu, Y. Li, X. Zhang, and K. Li. 2002. “Albumin adducts of benzene oxide and 1,4-benzoquinone as measures of human benzene metabolism.” Cancer Research 62(5):1330-1337.
Rotander, A., L. M. L. Toms, L. Aylward, M. Kay, and J. F. Mueller. 2015. “Elevated levels of PFOS and PFHxS in firefighters exposed to aqueous film forming foam (AFFF).” Environment International 82:28-34.
Savitz, D. A., C. R. Stein, S. M. Bartell, B. Elston, J. Gong, H. M. Shin, and G. A. Wellenius. 2012. “Perfluorooctanoic acid exposure and pregnancy outcome in a highly exposed community.” Epidemiology 23(3):386-392. doi: 10.1097/EDE.0b013e31824cb93b.
Shin, H. M., V. M. Vieira, P. B. Ryan, K. Steenland, and S. M. Bartell. 2011. “Retrospective exposure estimation and predicted versus observed serum perfluorooctanoic acid concentrations for participants in the C8 Health Project.” Environmental Health Perspectives 119(12):1760-1765. doi: 10.1289/ehp.1103729.
Sjödin, A., R. S. Jones, S. P. Caudill, L. Y. Wong, W. E. Turner, and A. M. Calafat. 2013. “Polybrominated diphenyl ethers, polychlorinated biphenyls, and persistent pesticides in serum from the National Health and Nutrition Examination Survey: 2003–2008.” Environmental Science & Technology 48(1):753-760.
Sobus, J. R., J. F. Wambaugh, K. K. Isaacs, A. J. Williams, A. D. McEachran, A. M. Richard, and C. M. Grulke. 2017. “Integrating tools for non-targeted analysis research and chemical safety evaluations at the US EPA.” Journal of Exposure Science & Environmental Epidemiology.
Stein, C. R., D. A. Savitz, and M. Dougan. 2009. “Serum levels of perfluorooctanoic acid and perfluorooctane sulfonate and pregnancy outcome.” American Journal of Epidemiology 170(7):837-846. doi: 10.1093/aje/kwp212.
U.S. EPA (United States Environmental Protection Agency). 2011. Exposure Factors Handbook: 2011 Edition (Final Report). Washington, DC: U.S. Environmental Protection Agency.
Uppal, K., D. I. Walker, K. Liu, S. Li, Y. M. Go, and D. P. Jones. 2016. “Computational metabolomics: A framework for the million metabolome.” Chemical Research in Toxicology 29(12):1956-1975.
Uppal, K., C. Ma, Y. M. Go, and D. P. Jones. 2017. “xMWAS: A data-driven integration and differential network analysis tool.” Bioinformatics 34(4):701-702.
Walker, D. I., T. Mallon, P. K. Hopke, K. Uppal, Y. M. Go, P. Rohrbeck, K. D. Pennell, and D. P. Jones. 2016. “Deployment-associated exposure surveillance with high-resolution metabolomics.” Journal of Occupational and Environmental Medicine 58(8 Suppl 1):S12.
Woeller, C. F., T. H. Thatcher, D. Van Twisk, S. J. Pollock, A. Croasdell, N. Kim, and P. K. Hopke. 2016. “Detection of serum microRNAs from Department of Defense Serum Repository: Correlation with cotinine, cytokine, and polycyclic aromatic hydrocarbon levels.” Journal of Occupational and Environmental Medicine 58(8 Suppl 1):S62.
DISCLAIMER: This Proceedings of a Workshop—in Brief was prepared by Elizabeth Boyle as a factual summary of what occurred at the workshop. The planning committee’s role was limited to planning the workshop. The statements made are those of the rapporteur or individual meeting participants and do not necessarily represent the views of all meeting participants, the planning committee, or the National Academies of Sciences, Engineering, and Medicine.
Reviewers: To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Lesa Aylward, Summit Toxicology; Marike Kolossa-Gehring, German Environment Agency; Megan Latshaw, Johns Hopkins University; and Hyeong Moo Shin, University of Texas at Arlington.
Planning Committee for Feasibility of Addressing Environmental Exposure Questions Using Department of Defense Biorepositories: A Workshop: Joyce Tsuji (Chair), Exponent; Lesa Aylward, Summit Toxicology; Benjamin Blount, Centers for Disease Control and Prevention; James Cerhan, Mayo Clinic; William Funk, Northwestern University.
Sponsor: This workshop was sponsored by the Department of Defense.
Suggested citation: National Academies of Sciences, Engineering, and Medicine. 2018. Feasibility of Addressing Environmental Exposure Questions Using Department of Defense Biorepositories: Proceedings of a Workshop—in Brief. Washington, DC: The National Academies Press. doi: https://doi.org/10.17226/25287.
Division on Earth and Life Studies
Copyright 2018 by the National Academy of Sciences. All rights reserved.