4

Evidence Identification

This chapter addresses the identification of evidence pertaining to questions that are candidates for systematic reviews as described in Chapter 3 (see Figure 4-1). Systematic reviews for US Environmental Protection Agency (EPA) Integrated Risk Information System (IRIS) assessments, as for any topic, should be based on comprehensive, transparent literature searches and screening to enable the formulation of reliable assessments that are based on all relevant evidence. EPA has substantially improved and documented its approach for identifying evidence in response to the report Review of the Environmental Protection Agency’s Draft IRIS Assessment of Formaldehyde (NRC 2011) and other criticisms and advice. As a way to encourage further progress, the present committee compares recent EPA materials and assessments with the guidelines developed for systematic review by the Institute of Medicine (IOM 2011) and offers specific suggestions for improving evidence identification in the IRIS process. The committee makes this comparison with the IOM guidelines because they are derived from several decades of experience, are considered a standard in the clinical domain, and are thought to be applicable to the IRIS process.

CONSIDERATION OF BIAS IN EVIDENCE IDENTIFICATION

Systematic reviews of scientific evidence are preferable to traditional literature reviews partly because of their transparency and adherence to standards. In addition, the systematic-review process gathers all the evidence without relying on the judgment of particular people to select studies. Nonetheless, systematic reviews are prone to two types of bias: bias present in the individual studies included in a review and bias resulting from how the review itself was conducted (meta-bias). Meta-bias cannot be identified by examining the methods of an individual study because it stems from how a systematic review is conducted and from factors that broadly affect a body of research.


FIGURE 4-1 The IRIS process; the evidence-identification step is highlighted. The committee views public input and peer review as integral parts of the IRIS process, although they are not specifically noted in the figure.




It might be argued that the most important form of meta-bias that threatens the validity of findings of a systematic review results from the differential reporting of study findings on the basis of their strength and direction. Since the early focus on publication bias, or the failure to publish at all because of potential implications of study findings, investigators have come to recognize that reporting biases encompass a wide array of behaviors, including selective outcome reporting. Reporting biases have been repeatedly documented in studies that show that research with statistically significant results is published more often (sometimes more often in English-language journals) and more quickly, and in journals that have higher citation frequencies than research whose results are not statistically significant (Dickersin and Chalmers 2011). The reporting bias related to the publication of research with statistically significant results might also be exacerbated by increased publication pressures (Fanelli 2010). Reporting biases have been shown to be associated with all sorts of sponsors and investigators; for example, industry-supported studies in the health sciences have been shown to be particularly vulnerable to distortion of findings or to not being reported at all (Lundh et al. 2012). Moreover, an investigator's failure to submit (as opposed to selectivity on the part of the editorial process) appears to be the main reason for failure to publish (Chalmers and Dickersin 2013). There is evidence that reporting biases might also be a subject of concern in laboratory and animal research (Sena et al. 2010; Korevaar et al. 2011; ter Riet et al. 2012).

The potential for reporting biases is one reason to search the gray (unpublished) literature. Specifically, the gray literature might be less likely to support specific hypotheses than literature sources that might be biased toward publication of "positive" results.

Systematic review does not itself identify the presence of reporting biases. However, a comprehensive search will include the types of studies particularly prone to reporting biases, such as industry-supported studies in the health sciences. A failure to find studies in such categories should raise concern that reporting bias is present. In addition, a systematic review provides the opportunity to compare findings among different groups of funders and investigators and to identify any indication of meta-bias.

A second type of meta-bias is information bias, which occurs when data on the groups being compared (for example, animals exposed at different doses or control vs exposed animals) are collected differentially (nonrandom misclassification). Such bias can affect a whole body of literature. Incorrect information can also be collected in error (random misclassification) without a direction of the bias. Random misclassification is understandably undesirable in toxicity assessments.

This chapter specifically addresses two steps that are critical for minimizing meta-bias: performing a comprehensive search for all the evidence, including unpublished findings, and screening and selecting reports that address the systematic-review question and meet eligibility criteria specified in the protocol. Error can also arise when data are abstracted from studies during the review process. Systematic-review methods should be structured to maximize the accuracy of the data extracted from the identified studies in the systematic review. Therefore, the committee addresses an additional step in the systematic-review process in this chapter: extracting the data from studies included in the IRIS review (see Table 4-1, section on IOM Standard 3.5).
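The comparison of findings across funder groups described above can be made concrete with a simple tabulation of how often each group's studies report statistically significant results. The sketch below, in Python, is purely illustrative: the record fields and the example data are hypothetical, and such a tabulation is a screening signal for possible reporting bias rather than a test of it.

```python
from collections import defaultdict

# Hypothetical screening records: one entry per included study.
studies = [
    {"id": "s1", "funder": "industry", "significant": True},
    {"id": "s2", "funder": "industry", "significant": True},
    {"id": "s3", "funder": "government", "significant": True},
    {"id": "s4", "funder": "government", "significant": False},
    {"id": "s5", "funder": "government", "significant": False},
]

# Tally statistically significant findings by funder group.
counts = defaultdict(lambda: [0, 0])  # funder -> [significant, total]
for s in studies:
    counts[s["funder"]][1] += 1
    if s["significant"]:
        counts[s["funder"]][0] += 1

for funder, (sig, total) in counts.items():
    print(f"{funder}: {sig}/{total} studies report significant "
          f"results ({sig / total:.0%})")
# A large imbalance between groups is a signal to look more closely
# for reporting bias, not proof that it is present.
```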
RECOMMENDATIONS ON EVIDENCE IDENTIFICATION IN THE NATIONAL RESEARCH COUNCIL FORMALDEHYDE REPORT

The National Research Council (NRC) formaldehyde report (NRC 2011) recommended that EPA adopt standardized, documented, quality-controlled processes and provided specific recommendations related to evidence identification (see Box 4-1). Implementation of the recommendations is addressed in the following section of this chapter. Detailed findings and recommendations are provided in Table 4-1, and general findings and recommendations are provided at the conclusion of the chapter.

BOX 4-1
Recommendations on Evidence Identification in the 2011 National Research Council Formaldehyde Report

• Establish standard protocols for evidence identification.
• Develop a template for description of the search approach.
• Use a database, such as the Health and Environmental Research Online (HERO) database, to capture study information and relevant quantitative data.

Source: NRC 2011, p. 164.

EVALUATION OF ENVIRONMENTAL PROTECTION AGENCY RESPONSE TO THE NATIONAL RESEARCH COUNCIL FORMALDEHYDE REPORT

Relatively little research has been conducted specifically on the issue of evidence identification related to hazard identification and dose-response assessments for IRIS assessments. To capitalize on recent efforts in this regard, the committee used the standards established by IOM to assess the effectiveness of health interventions as the foundation of its evaluation of the IRIS process. The standards are presented in Finding What Works in Health Care: Standards for Systematic Reviews (IOM 2011). The general approaches and concepts underlying systematic reviews for evidence-based medicine should be relevant to the review of animal studies and epidemiologic studies (Mignini and Khan 2006). In animal studies, work on testing various methods of evidence identification is in the early stages (Leenaars et al. 2012; Briel et al. 2013; Hooijmans and Ritskes-Hoitinga 2013).

The IOM standards relied on three main sources: the published methods of the Cochrane Collaboration (Higgins and Green 2008), the Centre for Reviews and Dissemination of the University of York in the United Kingdom (CRD 2009), and the Effective Health Care Program of the Agency for Healthcare Research and Quality in the United States (AHRQ 2011). Those standards reflect the input of experts who were consulted during their development. Although the IOM standards for conducting systematic reviews focus on assessing the comparative effectiveness of medical or surgical interventions and the evidence supporting the standards is based on clinical research, the committee considers the approach useful for a number of aspects of IRIS assessments because the underlying principles are inherent to the scientific process (see Hoffman and Hartung 2006; Woodruff and Sutton 2011; Silbergeld and Scherer 2013). Some analysts, however, have noted challenges associated with implementing the IOM standards (Chang et al. 2013; IOM 2013).

Table 4-1 summarizes elements of the IOM standards for identifying information in systematic reviews in evidence-based medicine, presents the rationale or principle behind each element, and indicates the status of the element as reflected in materials submitted to the committee that document changes in the IRIS program (EPA 2013a,b) as described in Chapter 1 (see Table 1-1). Two chemical-specific examples (EPA 2013c,d,e) are included in Table 4-1 to assess the intent and application of the new EPA strategies as reflected in the draft preamble (EPA 2013a, Appendix B) and the draft handbook (EPA 2013a, Appendix F). The committee interprets those portions of the draft preamble and handbook that address the literature search and screening as constituting draft standard approaches for evidence identification. It assumes that the preamble summarizes the methods used in IRIS assessments and that the handbook is a detailed record of methods that are intended to be applied in evidence assessments. In other words, the committee assumes that people who are responsible for performing systematic reviews to support upcoming IRIS assessments will rely on the handbook as it continues to evolve.

TABLE 4-1 Comparison of EPA Draft Materials with IOM Systematic-Review Standards for Evidence Identification. Each standard is followed by labeled entries: Rationale; Preamble (draft IRIS preamble, EPA 2013a, Appendix B); Handbook (draft IRIS handbook, EPA 2013a, Appendix F); Benzo[a]pyrene assessment (EPA 2013c); Ammonia assessment (EPA 2013d,e); and Considerations for further development.

3.1 Conduct a comprehensive systematic search for evidence

3.1.1 Work with a librarian or other information specialist trained in performing systematic reviews to plan the search strategy (p. 266).
Rationale: As with other aspects of research, specific skills and training are required to navigate a wide range of bibliographic databases and electronic information sources.
Preamble: Not mentioned.
Handbook: The initial steps of the systematic-review process involve formulating specific strategies to identify and select studies related to each key question (p. F-2). EPA refers to tapping HERO resources (which include librarian expertise) and advises consulting a librarian early (to develop search terms) and often. Nevertheless, the outline suggests that the search process begins with literature collection. EPA acknowledges that the process developed for evidence-based medicine is generally applied to narrower, more focused questions and nonetheless provides a strong foundation for IRIS assessments; EPA notes that IRIS addresses assessment-specific questions. However, the materials do not describe information specialists trained in systematic reviews.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Begin by referencing the key role played by information specialists who have expertise in systematic reviews in planning the search strategy and their role as members of the IRIS team throughout the evidence-identification process.

3.1.2 Design the search strategy to address each key research question (p. 266).
Rationale: The goal of the search strategy is to maximize both sensitivity (the proportion of all eligible articles that are correctly identified) and precision (the proportion of all articles identified by the search that are eligible); a worked example of these two quantities follows the table. With multiple research questions, a single search strategy is unlikely to cover all questions posed with any precision.
Preamble: Not mentioned.
Handbook: No mention of key research questions. Step 1 of the proposed process sets the goal of identifying primary studies. Step 1b (p. F-6) on selecting search terms specifies "the appropriate forms of the chemical name, CAS number, and if relevant, major metabolite(s)." EPA also describes the possible addition of secondary search strategies that include key words for end points, the possibility of other more targeted end points, and the use of filters and analysis of small samples of review results to assess relevance (pp. F-6–F-7).
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: No mention of key research questions. The section titles imply the general search questions (for example, developmental effects, reproductive effects, immunotoxicity, other toxicity, and carcinogenicity), but they are not listed anywhere explicitly.
Considerations: In the protocol, describe the role of key research questions and their relationship to the search strategies. Do not omit any helpful information related to this standard; rather, include this information in the same section as appropriate.

3.1.3 Use an independent librarian or other information specialist to peer review the search strategy (p. 267).
Rationale: This part of the evidence review requires peer review like any other part. Given the specialized skills required, a person with similar skills would be expected to serve as peer reviewer.
Preamble: Not mentioned.
Handbook: Not mentioned.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Add a review of the search strategy by an independent information specialist (that is, one who did not design the protocol) who is trained in evidence identification for systematic reviews to strengthen the search process.

3.1.4 Search bibliographic databases (p. 267).
Rationale: A single database is typically not sufficient to cover all publications (journals, books, monographs, government reports, and others) for clinical research. Databases for reports published in languages other than English and for the gray literature could also be searched.
Preamble: The literature search follows standard practices and includes the PubMed and ToxNet databases of the National Library of Medicine, Web of Science, and other databases listed in EPA's HERO system. Searches for information on mechanisms of toxicity are inherently specialized and might include studies of other agents that act through related mechanisms (p. B-2).
Handbook: Step 1A describes specific databases for IRIS reviews (Table F-1), including PubMed, Web of Science, Toxline, TSCATS, PRISM, and IHAD, several of which are accessible through the EPA HERO interface. EPA identifies the HERO interface, directly searching the named databases, or supervising the search process conducted by contractors.
Benzo[a]pyrene assessment: Table LS-1 and Table C-1 (Appendix C) outline the online databases searched. There is not 100% agreement between the tables. Table LS-1 provides keywords used for searching bibliographic databases.
Ammonia assessment: Appendix D in the supplement provides search strings for some but not all of the databases listed in Table LS-1 as searched.
Considerations: Systematically and regularly assess the relevance and usefulness of the identified databases (PubMed, Web of Science, Toxline, TSCATS, PRISM, IHAD, and others) for finding primary studies. Ensure that the search process conducted by contractors follows specific (detailed) guidelines for systematic literature reviews established by EPA (considering the elements outlined here) and that the contractor searches regularly undergo peer review or outside assessment.

3.1.5 Search citation indexes (p. 267).
Rationale: Citation indexes are a good way to ensure that eligible reports were not missed.
Preamble: See 3.1.4—the literature search includes Web of Science.
Handbook: EPA mentions citation indexes, such as Web of Science, but a suggestion to search them, and how, is not specified (p. F-7).
Benzo[a]pyrene assessment: The preamble mentions that Web of Science is searched; not mentioned otherwise.
Ammonia assessment: The preamble mentions that searching the Web of Science is standard practice, but it is not mentioned in the text otherwise.
Considerations: Document specific guidance or protocols for searching citation databases (for example, to ensure that searches look for citations to the identified literature).

3.1.6 Search literature cited by eligible studies (p. 268).
Rationale: The literature cited by eligible studies (for example, references provided in a journal article or thesis) is a good way to ensure eligible reports were not missed.
Preamble: Not mentioned.
Handbook: EPA appropriately discusses this as a strategy.
Benzo[a]pyrene assessment: References from previous EPA assessments and others were also examined.
Ammonia assessment: References from other EPA assessments were also examined.

3.1.7 Update the search at intervals appropriate to the pace of generation of new information for the research question being addressed (p. 268).
Rationale: Given that new articles and reports are being generated in an ongoing manner, searches would be updated regularly to reflect new information relevant to the topic.
Preamble: Not mentioned.
Handbook: EPA appropriately discusses this step.
Benzo[a]pyrene assessment: A comprehensive literature search was last conducted in February 2012. Appendix C gives dates of all searches as February 14, 2012.
Ammonia assessment: Search was first conducted through March 2012 and updated in March 2013. Search strings in Appendix D-1 should provide exact dates included in the search in addition to the date when the search was performed.
Considerations: Develop standardized processes for updating the literature searches to enable efficient updates on a regular basis, for example, during key stages of development for IRIS assessments.

3.1.8 Search subject-specific databases if other databases are unlikely to provide all relevant evidence (p. 268).
Rationale: If other databases are unlikely to be comprehensive, search a variety of other sources to cover the missing areas.
Preamble: See entry 3.1.4—the literature search includes ToxNet of the National Library of Medicine and other databases listed in EPA's HERO.
Handbook: EPA recommends searching "regulatory resources and other websites" for additional resources.
Benzo[a]pyrene assessment: Table LS-1 lists PubMed, Toxline, Toxcenter, TSCATS, ChemID, Chemfinder, CCRIS, HSDB, GENETOX, and RTECS as searched. Table C-1 lists PubMed, Toxline, Toxcenter, TSCATS, TSCATS2, and TSCA recent notices; it does not mention ChemID, Chemfinder, CCRIS, HSDB, GENETOX, and RTECS.
Ammonia assessment: Appendix D provides search strings for four subject-specific databases. Although the appendix provides additional database names, search strings are not provided.
Considerations: Consider and specify other databases beyond those listed in the handbook (Appendix F, Figure F-1) and EPA (2013b, Figure 1-1). For example, consider additional resources from the set identified on the HERO website.

3.1.9 Search regional bibliographic databases if other databases are unlikely to provide all relevant evidence (p. 269).
Rationale: Many countries have their own databases, and because of language or other regional factors the reports are not necessarily also present in US-based databases.
Preamble: Not mentioned.
Handbook: Currently, non-English language is considered a criterion for excluding studies, and foreign-language databases are not included in the discussion of search strategies.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Assess (conduct research to determine) whether studies in non-English-language countries are examining topics relevant to IRIS assessments. Revisit findings periodically to assess the effects of including or excluding non-English-language studies.

3.2 Take action to address potentially biased reporting of research results

3.2.1 Search gray-literature databases, clinical-trial registries, and other sources of unpublished information about studies (p. 269).
Rationale: Negative, null, or undesirable results might be published in difficult-to-access sources.
Preamble: Not mentioned.
Handbook: EPA recommends searching "regulatory resources and other websites" for additional resources.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Consider searching other gray-literature databases beyond those listed in the handbook (Appendix F, Table F-1) and other sources of unpublished information about studies (also see IOM Standard 3.1.8 above).

3.2.2 Invite researchers to clarify information about study eligibility, study characteristics, and risk of bias (p. 269).
Rationale: Rather than classify identified studies as missing critical information, it is preferable to ask the investigators directly for the information.
Preamble: Not mentioned.
Handbook: Not mentioned.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: As needed, request additional study information from investigators to determine eligibility, study characteristics, and other information.

3.2.3 Invite all study sponsors and researchers to submit unpublished data, including unreported outcomes, for possible inclusion in the systematic review (p. 270).
Rationale: So as to include all relevant studies and data in the review, ask sponsors and researchers for information about unpublished studies or data.
Preamble: EPA posts the results of the literature search on the IRIS Web site and requests information from the public on additional studies and current research. EPA also considers studies received through the IRIS Submission Desk and studies (typically unpublished) submitted under the Toxic Substances Control Act or the Federal Insecticide, Fungicide, and Rodenticide Act. Material submitted as Confidential Business Information is considered only if it includes health and safety data that can be publicly released. If a study that might be critical for the conclusions of the assessment has not been peer-reviewed, EPA will have it peer-reviewed.
Handbook: EPA endorses requesting public scrutiny of the list of identified studies from the initial literature search and requests reviews of the list by independent scientists active in research on the topic to ensure that all relevant studies are identified (pp. F-3, F-7). It is also noteworthy that EPA duly identifies the importance of tracking why studies later identified were missed in the initial literature search.
Benzo[a]pyrene assessment: Section 3.1 of the preamble to the benzo[a]pyrene report states that unpublished health and safety data submitted to the EPA are also considered as long as the data can be publicly released. Per Figure LS-1, the American Petroleum Institute submitted 30 references, but it is not clear whether all study sponsors and researchers were invited to submit unpublished data.
Ammonia assessment: Section 3.1 of the preamble to the ammonia report states that EPA considers studies submitted to the IRIS Submission Desk and through other means. Many of them are unpublished. Section 3.1 of the preamble describes inviting the public to comment on the literature search and suggest additional or current studies that might have been missed in the search.
Considerations: Create a structured process for inviting study sponsors and researchers to submit unpublished data.

3.2.4 Hand search selected journals and conference abstracts (p. 270).
Rationale: Hand searching of sources most likely provides relevant up-to-date information and contributes to the likelihood of comprehensive identification of eligible studies.
Preamble: Not mentioned.
Handbook: Not mentioned.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Assess (conduct research to determine) whether the IOM standard suggesting hand searching of journals and conference abstracts is applicable and useful to the EPA task.

3.2.5 Conduct a web search (p. 271).
Rationale: Web searches, even when broad and relatively untargeted, can contribute to the likelihood that all eligible studies have been identified.
Preamble: Not mentioned.
Handbook: As noted for IOM Standard 3.2.1, EPA recommends searching regulatory and other Web sites.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Assess (conduct research to determine) whether Web searches are likely to turn up additional useful information and, if so, determine which Web sites would be appropriate.

3.2.6 Search for studies reported in languages other than English if appropriate (p. 271).
Rationale: There is limited evidence that negative, null, or undesirable findings might be published in languages other than English.
Preamble: Not mentioned.
Handbook: As noted for IOM Standard 3.1.9, studies published in languages other than English are currently excluded from review, and non-English-language databases are not included in the discussion of search strategies.
Benzo[a]pyrene assessment: Non-English-language articles were excluded, per Figure LS-1.
Ammonia assessment: Not mentioned.
Considerations: Assess (conduct research to determine) whether to search for studies reported in languages other than English for IRIS assessments, and revisit the question periodically.

3.3 Screen and select studies

3.3.1 Include or exclude studies based on the protocol's prespecified criteria (p. 272).
Rationale: On the basis of the study question, inclusion and exclusion criteria for the review would be set a priori, before reviewing the search results (see 3.3.5), so as to avoid results-based decisions.
Preamble: Exposure route is a key design consideration for selecting pertinent experimental animal studies or human clinical studies. Exposure duration is also a key design consideration for selecting pertinent experimental animal studies. Short-duration studies involving animals or humans might provide toxicokinetic or mechanistic information. Specialized study designs are used for developmental and reproductive toxicity (p. B-3).
Handbook: EPA specifically mentions that "casting a wide net" is a goal of the search process and that results might not address the question of interest. A two- or three-stage process is suggested (review title and abstract, then full text, or screen title and abstract in separate steps) for relevance. Table F-5 specifies excluding duplicates and studies for which only abstracts are available and gives examples of criteria that might be defined for excluding studies.
Benzo[a]pyrene assessment: Protocol not provided, so unable to judge whether criteria are prespecified. Sections 3.1, 3.2, and 3.3 of the preamble to the benzo[a]pyrene report provide information on types of studies included, and Figure LS-1 provides reasons for report exclusions.
Ammonia assessment: Protocol not provided, so unable to judge whether criteria are prespecified. Sections 3.1, 3.2, and 3.3 of the preamble to the ammonia report provide information on types of studies included, and Figure LS-1 provides reasons for report exclusions.
Considerations: Provide inclusion and exclusion criteria in the IRIS assessment protocol, and use these criteria in the figure describing "study selection" flow.

3.3.2 Use observational studies in addition to randomized controlled trials to evaluate harms of interventions (p. 272).
Rationale: Predetermine study designs that will be eligible for each study question.
Preamble: Cohort studies, case-control studies, and some population-based surveys provide the strongest epidemiologic evidence; ecologic studies (geographic correlation studies) relate exposures and effects by geographic area; case reports of high or accidental exposure provide information on rare effects or relevance of results from animal testing (p. B-3).
Handbook: In Step 1, literature search, it is recommended that articles be sorted into categories (for example, experimental studies of animals and observational studies of humans). Later, in 2B, the appendix says that studies could include acute-exposure animal experiments, 2-year bioassays, experimental-chamber studies of humans, observational epidemiologic studies, in vitro studies, and many other types of designs. No restriction by study design is intended.
Benzo[a]pyrene assessment: Described in Section 3.2 of the preamble to the benzo[a]pyrene report.
Ammonia assessment: Described in Section 3.2 of the preamble to the ammonia report.
Considerations: Not applicable.

[...]

3.3.5 [...] full text of articles identified in initial screening (p. 273).
Rationale: Data are not clear, even for clinical intervention questions, regarding which method is best, although (2) appears to be more common.
Handbook: [...] may be more efficient, with an initial screen based on title, followed by screening based on abstract, followed by full-text screening. "There is not a 'right' or 'wrong' choice; however, whichever you choose, be sure to document the process you use" (pp. F-10, F-11).
Benzo[a]pyrene assessment: [...] review of the reports identified was conducted by a person not described.
Considerations: [...] that screeners follow a process that reflects the concepts underlying the IOM standards.

3.3.6 Taking account of the risk of bias, consider using observational studies to address gaps in the evidence from randomized clinical trials on the benefits of interventions (p. 274).
Rationale: Rather than exclude evidence where it is sparse, it might be necessary to use data from studies whose designs are more susceptible to bias than a preferred design.
Preamble: See entry 3.2.2.
Handbook: Not applicable because all types of study designs are potentially eligible (and randomized clinical trials are not conducted for IRIS assessments).
Benzo[a]pyrene assessment: Not applicable because all types of study designs are potentially eligible (and randomized clinical trials are not conducted for IRIS assessments).
Ammonia assessment: Not applicable because all types of study designs are potentially eligible (and randomized clinical trials are not conducted for IRIS assessments).
Considerations: Not applicable.

3.4 Document the search

3.4.1 Provide a line-by-line description of the search strategy, including the date of search for each database, web browser, etc. (p. 274).
Rationale: Appropriate documentation of the search processes ensures transparency of the methods used in the review and appropriate peer review by information specialists.
Preamble: Each assessment specifies the search strategies, keywords, and cutoff dates of its literature searches.
Handbook: The handbook supports careful documentation of the search strategy and provides Tables F-3 and F-4 as examples of the types of information that would be retained. No specific statement is made about documenting a line-by-line search strategy.
Benzo[a]pyrene assessment: Table LS-1 provides more database names than Appendix C but does not provide search strings for them. Appendix C: Table C-1 provides search strategies for more than four databases searched with date of search, but exact dates for what was included in the search are not provided—for example, for PubMed, "Date range: 1950's to 2/14/2012."
Ammonia assessment: Table LS-1 provides more database names than Appendix D but does not provide search strings for them. Appendix D: Table D-1 provides search strings for four subject-specific databases, but exact dates for what was included in the search are not provided—for example, for PubMed, Appendix D states "Date range: 1950's to present."
Considerations: Document, line by line, a description of the search strategy, including the dates included in the search of each database and the date of the search for each database and any Web searches.

3.4.2 Document the disposition of each report identified, including reasons for their exclusion if appropriate (p. 275).
Rationale: The standard supports creation of a flow chart that describes the sequence of events leading to identification of included studies, and it also supports assessment of the sensitivity and precision of the searches a posteriori.
Preamble: Not mentioned.
Handbook: Some support is given to documenting the reasons for excluding each citation at the full-text review stage: "In these situations, the citation should be 'tagged' into the appropriate exclusion category" (p. F-16).
Benzo[a]pyrene assessment: Summary data are provided in Figure LS-1.
Ammonia assessment: The disposition of identified citations is summarized in the study-selection figure but is otherwise not mentioned. The disposition of articles identified in the search is documented in HERO.
Considerations: Consider a more explicit statement in the handbook regarding documenting the disposition of each report identified by the search. Flowcharts can also be used to illustrate dispositions by category, similar to the LitFlow diagram in HERO.

3.5 Manage data collection

3.5.1 At a minimum, use two or more researchers, working independently, to extract quantitative or other critical data from each study. For other types of data, one individual could extract the data while the second individual independently checks for accuracy and completeness. Establish a fair procedure for resolving discrepancies—do not simply give final decision-making power to the senior reviewer (p. 275).
Rationale: Because reporting is often not clear or logically placed, having two independent reviewers is a quality-assurance approach. The evidence supporting two independent data extractors is limited, and so some reviewers prefer that one person extract and the other verify, a time-saving approach. Discrepancies would be decided by discussion so that each person's viewpoint is heard.
Preamble: Not mentioned.
Handbook: This item is not fully described in the process of data collection: "Ideally, two independent reviewers would independently identify the relevant methodological details, and then compare their results and interpretations and resolve any differences" (p. F-21).
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Ensure the quality of the data collected. For example, at a minimum, use two or more researchers working independently to extract quantitative and other critical data from each study document. For other types of data, one person could extract the data while a second independently checks for accuracy and completeness. Establish a fair procedure for resolving discrepancies—do not simply give final decision-making power to the senior reviewer (per the IOM wording, p. 275).

3.5.2 Link publications from the same study to avoid including data from the same study more than once (p. 276).
Rationale: There are numerous examples in the literature where two articles reporting the same study are thought to represent two separate studies.
Preamble: Not mentioned.
Handbook: It is acknowledged implicitly that there can be more than one publication per study, but there are no specific instructions about linking the publications from a single study together.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Create an explicit mechanism for linking multiple publications from the same study to avoid including duplicate data.

3.5.3 Use standard data-extraction forms developed for the specific systematic review (p. 276).
Rationale: Standardized data forms are broadly applied quality-assurance approaches.
Preamble: Not mentioned.
Handbook: An example worksheet is provided for observational epidemiologic studies, as are items to be extracted from the articles for animal toxicologic studies. A structured form may be useful for recording the key features needed to evaluate a study. An example form is shown in Figure F-3; details of such a form will need to be modified based on the specifics of the chemical, exposure scenarios, and effect measures under study.
Benzo[a]pyrene assessment: Data-extraction forms are not described, and it is not known whether forms were used; evidence tables and exposure-response arrays provide a structured format for data reporting.
Ammonia assessment: Data-extraction forms are not described, and it is not known whether forms were used; evidence tables and exposure-response arrays provide a structured format for data reporting.
Considerations: Create, pilot-test, and use standard data-extraction forms (see also 3.5.4 below).

3.5.4 Pilot-test the data-extraction forms and process (p. 276).
Rationale: Pretesting of data-collection forms and processes is a broadly applied quality-assurance approach.
Preamble: Not mentioned.
Handbook: Not mentioned.
Benzo[a]pyrene assessment: Not mentioned.
Ammonia assessment: Not mentioned.
Considerations: Create, pilot-test, and use standard data-extraction forms (see 3.5.3 above).
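The sensitivity and precision definitions cited in the rationale for IOM Standard 3.1.2 can be stated as simple proportions. The following sketch illustrates the arithmetic with hypothetical counts; it is not an EPA tool, and the numbers are invented for illustration.

```python
def search_performance(eligible_found: int, eligible_total: int,
                       retrieved_total: int) -> tuple[float, float]:
    """Sensitivity: share of all eligible articles that the search found.
    Precision: share of all retrieved records that were eligible."""
    return eligible_found / eligible_total, eligible_found / retrieved_total

# Hypothetical counts: 95 of 100 eligible articles were retrieved by a
# search that returned 4,000 records in total.
sensitivity, precision = search_performance(95, 100, 4000)
print(f"sensitivity = {sensitivity:.2f}")  # 0.95
print(f"precision   = {precision:.4f}")    # 0.0237
# A broad, IRIS-style search deliberately trades precision for
# sensitivity: "casting a wide net" retrieves many ineligible records
# that the screening step must then remove.
```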

In general, EPA has been responsive to the recommendations from the NRC formaldehyde report. As discussed in Chapter 1, the timing of the publication of the IOM standards was such that EPA could not have been expected to have incorporated the standards into its assessments to date. Nevertheless, comparison of statements made in the draft preamble (EPA 2013a, Appendix B) and draft handbook (EPA 2013a, Appendix F) with the 2011 IOM standards demonstrates that EPA has not only responded to the recommendations made in the NRC formaldehyde report but is well on the way to meeting the general systematic-review standards for identifying and assessing evidence. Thus, the table is useful primarily for pointing out where further standardization might be helpful, not as a test of whether the IOM standards have been met. Sometimes the information that the committee sought is not mentioned in the sources examined but is present in other sources, for example, in explanatory materials provided on the EPA IRIS Web site and in chemical-specific links on the EPA Health and Environmental Research Online (HERO) Web site. After discussion, the committee elected to retain "not mentioned" in the table because the information sought was not mentioned in the documents reviewed even though it might have been noted elsewhere. A key goal was to see whether the information appeared where the average reader might expect to find it, notably in documents describing the methods used in developing IRIS assessments. For transparency, there should be no difficulty in accessing all aspects of review methods.

In addition, the subset of documents reflected in the table does not represent all the materials available. Because EPA's transition to a systematic process for reviewing the evidence is evolving, the committee expects that more recent documents will reflect an increasingly standardized and comprehensive response. The committee had to halt its examination of recent example documents in September 2013 so that the present report could be drafted, with the understanding that some elements that appear undeveloped in Table 4-1 have been addressed in materials released more recently.

Establish Standard Strategies for Evidence Identification

The IOM standards for finding and assessing individual studies include five main elements: searching for evidence, addressing possible reporting biases, screening and selecting studies, documenting the search, and managing data. As Table 4-1 shows, in most instances the draft preamble (EPA 2013a, Appendix B) focuses on principles and does not address specific elements of the IOM standards. Because identifying evidence for IRIS involves all five elements reflected in the IOM standards, a concise preamble would not be expected to serve as a stand-alone roadmap for evidence-identification methods in IRIS assessments. The draft handbook (EPA 2013a, Appendix F), however, should include that level of detail; it does cover the IOM standards more completely, although some gaps remain. To address the gaps, the committee recommends expanding the handbook as itemized in Table 4-1. In general, EPA might find it helpful to include a table of standards in the handbook (perhaps repeated in the preamble) and to adopt the wording in Table 4-1 for each standard (for example, from IOM) or to modify the wording to be specific to the IRIS case, as appropriate.

As an overarching recommendation, the committee encourages EPA to include standard approaches for evidence identification in IRIS materials and to incorporate them consistently in the various materials. For components that are intentionally less detailed, such as the preamble, the committee encourages EPA to refer the reader elsewhere, notably to relevant parts of the handbook, for additional detail. The handbook serves as a valuable complement to the preamble, but without pointers to more detailed resources the average reader might not understand the relationship between the two documents or be aware that detailed strategies or standards exist.

Develop a Template for Description of the Search Strategies

EPA has provided the committee with a substantial set of tables, figures, and examples that demonstrate marked progress in implementing the recommendations from the NRC formaldehyde report. In reviewing the materials provided (EPA 2013a,b), the committee did not see evidence that a consistent search template was being used. The preamble (EPA 2013a, Appendix B) and the handbook (EPA 2013a, Appendix F) are helpful in illustrating the overall structure and flow of the evidence-identification process. For example, Figures F-1 and F-2 in EPA (2013a, Appendix F) and Figure 1-1 in EPA (2013b) illustrate the literature-search documentation for ethyl tert-butyl ether. The committee recognizes that the process of developing and refining materials for the IRIS process is still going on and that representations of the search approach have probably continued to evolve. However, the materials provided show that the approach is not yet specified consistently and in equivalent detail among the various documents. For example, Figure 1-1 in EPA (2013b) includes Proquest and a step that involves reviewing references cited in papers identified by the search, whereas the preamble (EPA 2013a, Appendix B) and Figure F-1 in the handbook (EPA 2013a, Appendix F) do not. It is also unclear whether inconsistencies are deliberate (and thus desirable) and related to the specific IRIS assessment being undertaken or are unintentional (and perhaps undesirable). For example, the preamble specifies searching "other databases listed in EPA's HERO system" (p. B-2) and a number of other, mostly unpublished sources, whereas Figure F-1 specifies "OPP databases" and also refers to searching other sources.

The draft materials provided to the committee do not yet appear to include some quality-control and procedural guidelines identified in the IOM standards (see Table 4-1) that are relevant to identifying the evidence. In particular, the materials do not consider whether prespecified research questions were used to guide the evidence identification (see Chapter 3 for a discussion of the development of the research question). The committee encourages EPA to consider prespecifying research questions when establishing the standard template for evidence identification to ensure that a search reflects the research goals appropriately. The committee commends EPA's collaboration with the National Toxicology Program of the National Institute of Environmental Health Sciences in this regard and encourages incorporation of the insights gained into the IRIS process.

Use a Database to Capture Study Information and Relevant Quantitative Data

The NRC formaldehyde report recommended that EPA use a database, such as HERO, to serve as a repository for documents supporting its toxicity assessments. The HERO database was developed to support scientific assessments for the national ambient air quality standards, notably the integrated science assessments for the six criteria pollutants. EPA responded to the recommendation with a substantial expansion of HERO to support IRIS (EPA 2013f,g). The extensive effort has involved incorporating more than 160,000 references relevant to IRIS since 2011, and updating has continued. For example, from August to September 2013, nearly 2,400 references were added to the IRIS set in HERO.
The committee encourages EPA to adapt HERO or create a related database to contain data extracted from the individual documents. Although such an adaptation is not yet evident in the draft preamble or handbook (EPA 2013a), the HERO Web site (EPA 2013f) suggests it. In describing what data HERO provides, the Web site (EPA 2013g) states, "for 'key' studies: objective, quantitative extracted study data [future enhancement]." It further states that "HERO revisions are planned to broaden both the features and scope of information included. Future directions include additional data sets, environmental models, and services that connect data and models."
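One way to picture the kind of repository the committee envisions is a structured record that pairs a HERO-style bibliographic entry with the quantitative results extracted from it. The sketch below is a minimal illustration; the field names are assumptions for the example and are not drawn from HERO's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class ExtractedResult:
    """A single quantitative finding extracted from a study."""
    endpoint: str           # e.g., "body weight"
    species: str            # e.g., "rat"
    dose: float             # administered dose or concentration
    dose_units: str         # e.g., "mg/m3"
    effect_estimate: float  # extracted numeric result
    notes: str = ""

@dataclass
class StudyRecord:
    """Bibliographic data plus extracted results for one study."""
    hero_id: int            # hypothetical HERO reference identifier
    citation: str
    chemical: str
    results: list[ExtractedResult] = field(default_factory=list)

# Hypothetical usage: extracted data accumulate alongside the reference,
# so evidence can be compared across IRIS assessments.
record = StudyRecord(hero_id=123456, citation="Doe et al. 2012",
                     chemical="ammonia")
record.results.append(
    ExtractedResult("body weight", "rat", 10.0, "mg/m3", -4.2,
                    notes="percent change vs control"))
```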

The committee recognizes that EPA has expanded the HERO database to capture information about documents relevant to IRIS assessments. Searching HERO, in addition to other databases, will be increasingly useful for identifying relevant studies for IRIS assessments. As noted, the committee encourages EPA to expand HERO or build a complementary database into which data extracted from the documents in the HERO database can be entered. By creating a data repository for study information and relevant quantitative data, EPA will be able to accumulate and evaluate evidence among IRIS assessments. Such a repository of identified data (see Goldman and Silbergeld 2013) would further enhance the process and its consistent application for IRIS, as well as enhancing data-sharing (for example, see the related discussion in IOM 2013).

COMMENTS ON BEST PRACTICES FOR EVIDENCE IDENTIFICATION

The IOM (2011) standards highlighted in Table 4-1 capture recent best practices. Searching for and identifying the evidence is arguably the most important step in a systematic review. Accordingly, a standardized search strategy and reporting format are essential for evidence identification. As discussed in Chapter 3, the protocol frames an answerable question or questions that will be addressed by the assessment, states the eligibility criteria for inclusion in the assessment, and describes in detail how the relevant evidence will be identified. Searches should always be well documented in an expected format, as described in the articles on search filters for Embase (de Vries et al. 2011) and for PubMed (Hooijmans et al. 2010); a minimal illustration of such documentation appears at the end of this section. As noted above, the committee could not always find some of the critical information in the draft materials that it reviewed, although it has found that more recent IRIS assessments and preliminary materials for upcoming assessments reflect increasing standardization (for example, see EPA 2013h,i), which is commendable. Standardizing the search strategy and reporting format would aid the reader of IRIS assessments and would facilitate an evaluation of how well the standards and concepts set forth in the preamble and handbook are being applied. In addition, standardization would help to minimize unnecessary duplication, overlaps, and inconsistencies among various IRIS assessments. An example of format and documentation issues related to searching for animal studies can be found in Leenaars et al. (2012).

The IOM standards also emphasize the role of various specialists in the review process, including information specialists (also referred to as informationists) and topic-specific experts. Those screening the studies and abstracting the data also need explicit training, and typically topic-specific experts are involved at this step. The roles of all team members should be identified in the protocol.

The evidence supporting the IOM standards is likely to be useful in the IRIS domain, but it would be appropriate for EPA to perform research that examines evidence specifically applicable to the epidemiologic and toxicity evaluations underlying IRIS assessments. For example, a targeted research effort could address the question of whether it is useful and necessary to search the gray literature—research literature that has not been formally published in journal articles, such as conference abstracts, book chapters, and theses—and the non-English-language literature in systematic reviews for IRIS assessments. Given how quickly methods for systematic reviews are evolving, including databases and indexing terms, methodologic research related to systematic reviews for IRIS assessments should be kept current to ensure that standards are up to date and relevant.
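As one illustration of the documentation format discussed above, a search record can capture, for each database query, the elements the committee asks for: the source, the full query string, the date the search was run, and the publication dates covered. The sketch below is hypothetical; the query and field names are invented for the example and do not reproduce an EPA template or a published search filter.

```python
# A minimal, hypothetical record of one database search. Keeping one
# such record per query makes the search replicable and reviewable.
search_log_entry = {
    "database": "PubMed",
    "interface": "NLM web interface",
    "query": '("ammonia"[MeSH Terms] OR ammonia[tiab]) AND toxicity[tiab]',
    "date_run": "2013-03-15",
    "publication_dates_covered": "1950-01-01 to 2013-03-15",
    "records_retrieved": 2481,
    "searcher": "information specialist (initials)",
}

def format_entry(entry: dict) -> str:
    """Render the record line by line, one element per line."""
    return "\n".join(f"{key}: {value}" for key, value in entry.items())

print(format_entry(search_log_entry))
```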
FINDINGS AND RECOMMENDATIONS

The findings and recommendations that follow are broad recommendations on evidence identification; specific suggestions and considerations for each step in the process are provided in Table 4-1.

Finding: EPA has been responsive to recommendations in the NRC formaldehyde report regarding evidence identification and is well on the way to adopting a more rigorous approach to evidence identification that would meet standards for systematic reviews. This finding is based on a comparison of the draft EPA materials provided to the committee with the IOM standards.

Recommendation: The trajectory of change needs to be maintained.

Finding: Current descriptions of search strategies appear inconsistently comprehensive, particularly regarding (a) the roles of trained information specialists; (b) the requirements for contractors; (c) the descriptions of search strategies for each database and source searched; (d) critical details concerning the search, such as the specific dates of each search and the specific publication dates included; and (e) the periodic need to consider modifying the databases and languages to be searched in updated and new reviews. The committee acknowledges that recent assessments other than the ones that it reviewed might already address some of the indicated concerns.

Recommendation: The current process can be enhanced with more explicit documentation of methods. Protocols for IRIS assessments should include a section on evidence identification that is written in collaboration with information specialists trained in systematic reviews and that includes a search strategy for each systematic-review question being addressed in the assessment. Specifically, the protocols should provide a line-by-line description of the search strategy, the date of the search, and the publication dates searched and, as noted in Chapter 3, explicitly state the inclusion and exclusion criteria for studies.

Recommendation: Evidence identification should involve a predetermined search of key sources, follow a search strategy based on empirical research, and be reported in a standardized way that allows replication by others. The search strategies and sources should be modified as needed on the basis of new evidence on best practices. Contractors who perform the evidence identification for the systematic review should adhere to the same standards and provide evidence of experience and expertise in the field.

Finding: One problem for systematic reviews in toxicology is identifying and retrieving toxicologic information outside the peer-reviewed published literature.

Recommendation: EPA should consider developing specific resources, such as registries, that could be used to identify and retrieve information about toxicology studies reported outside the literature accessible by electronic searching. In the medical field, clinical-trial registries and the US legislation that requires studies to register in ClinicalTrials.gov have been an important step in ensuring that the total number of studies undertaken is known.

Finding: Replicability and quality control are critical in scientific undertakings, including data management. Although that general principle is evident in the IRIS assessments that were reviewed, tasks appear to be assigned to a single information specialist or review author. There was no evidence of the information specialist's or reviewer's training or of review of the work by others who have similar expertise. As discussed in Chapter 2, an evaluation of validity and reliability through inter-rater comparisons is important and helps to determine whether multiple reviewers are needed. This aspect is missing from the IOM standards.
Recommendation: EPA is encouraged to use at least two reviewers who work independently to screen and select studies, pending an evaluation of validity and reliability that might indicate that multiple reviewers are not warranted. It is important that the reviewers use standardized procedures and forms.
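The inter-rater comparison mentioned in the preceding finding can be quantified with a standard agreement statistic such as Cohen's kappa. The sketch below computes kappa for two screeners' include/exclude decisions on hypothetical abstracts; high chance-corrected agreement would support an extract-and-verify workflow, whereas low agreement would argue for full dual screening.

```python
def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions for ten abstracts.
screener_1 = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]
screener_2 = ["include", "exclude", "include", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]
print(f"kappa = {cohens_kappa(screener_1, screener_2):.2f}")  # 0.78
```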

Finding: Another important aspect of quality control in systematic reviews is ensuring that information is not double-counted. Explicit recognition of and mechanisms for dealing with multiple publications that include overlapping data from the same study are important components of data management that are not yet evident in the draft handbook.

Recommendation: EPA should engage information specialists trained in systematic reviews in the process of evidence identification, for example, by having an information specialist peer review the proposed evidence-identification strategy in the protocol for the systematic review.
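With respect to the double-counting concern noted above, one simple mechanism is to link every retrieved publication to an identifier for the underlying study and to flag identifiers with more than one linked report. The Python sketch below illustrates the idea; the identifier scheme and all record data are hypothetical, and a real implementation would require agreed rules for establishing the links.

from collections import defaultdict

def flag_overlapping_reports(records):
    """Group retrieved records that may report the same underlying study.

    Records sharing a study identifier (here, an invented accession number;
    a registry ID or author/cohort key could serve the same purpose) are
    grouped so that reviewers can treat them as a single study and avoid
    double-counting its data.
    """
    groups = defaultdict(list)
    for rec in records:
        groups[rec["study_id"]].append(rec["citation"])
    return {sid: cites for sid, cites in groups.items() if len(cites) > 1}

# Hypothetical retrieval results: two publications share study S-101.
records = [
    {"study_id": "S-101", "citation": "Smith et al. 2009 (90-day rat study)"},
    {"study_id": "S-101", "citation": "Smith et al. 2011 (follow-up analysis)"},
    {"study_id": "S-205", "citation": "Jones et al. 2010"},
]
for study, citations in flag_overlapping_reports(records).items():
    print(f"{study}: {len(citations)} linked reports -> {citations}")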
Finding: The committee did not find enough empirical evidence pertaining to the systematic-review process in toxicological studies to permit it to comment specifically on reporting biases and other methodologic issues, except by analogy to other, related fields of scientific inquiry. It is not clear, for example, whether a reporting bias is associated with the language of publication for toxicological studies and the other types of research publications that support IRIS assessments or whether any such bias (if it exists) might be restricted to specific countries or periods.

Recommendation: EPA should encourage and support research on reporting biases and other methodologic topics relevant to the systematic-review process in toxicology.

Finding: The draft preamble and handbook provide a good start for developing a systematic, quality-controlled process for identifying evidence for IRIS assessments.

Recommendation: EPA should continue to document and standardize its evidence-identification process by adopting (or adapting, where appropriate) the relevant IOM standards described in Table 4-1. It is anticipated that its efforts will further strengthen the overall consistency, reliability, and transparency of the evidence-identification process.

REFERENCES

AHRQ (Agency for Healthcare Research and Quality). 2011. The Effective Health Care Program Stakeholder Guide. Publication No. 11-EHC069-EF [online]. Available: http://www.ahrq.gov/research/findings/evidence-based-reports/stakeholderguide/stakeholdr.pdf [accessed December 4, 2013].

Briel, M., K.F. Muller, J.J. Meerpohl, E. von Elm, B. Lang, E. Motshall, V. Gloy, F. Lamontagne, G. Schwarzer, and D. Bassler. 2013. Publication bias in animal research: A systematic review protocol. Syst. Rev. 2:23.

Chalmers, I., and K. Dickersin. 2013. Biased under-reporting of research reflects biased under-submission more than biased editorial rejection. F1000 Research 2(1).

Chang, S.M., E.B. Bass, N. Berkman, T.S. Carey, R.L. Kane, J. Lau, and S. Ratichek. 2013. Challenges in implementing The Institute of Medicine systematic review standards. Syst. Rev. 2(1):69.

CRD (Centre for Reviews and Dissemination). 2009. Systematic Reviews: CRD's Guidance for Undertaking Reviews in Health Care. York, UK: York Publishing Services, Ltd [online]. Available: http://www.york.ac.uk/inst/crd/pdf/Systematic_Reviews.pdf [accessed December 4, 2013].

de Vries, R.B., C.R. Hooijmans, A. Tillema, M. Leenaars, and M. Ritskes-Hoitinga. 2011. A search filter for increasing the retrieval of animal studies in Embase. Lab Anim. 45(4):268-270.

Dickersin, K., and I. Chalmers. 2011. Recognizing, investigating and dealing with incomplete and biased reporting of clinical research: From Francis Bacon to the World Health Organisation. J. R. Soc. Med. 104(12):532-538.

EPA (U.S. Environmental Protection Agency). 2013a. Part 1. Status of Implementation of Recommendations. Materials Submitted to the National Research Council, by Integrated Risk Information System Program, U.S. Environmental Protection Agency, January 30, 2013 [online]. Available: http://www.epa.gov/IRIS/iris-nrc.htm [accessed October 22, 2013].

EPA (U.S. Environmental Protection Agency). 2013b. Part 2. Chemical-Specific Examples. Materials Submitted to the National Research Council, by Integrated Risk Information System Program, U.S. Environmental Protection Agency, January 30, 2013 [online]. Available: http://www.epa.gov/iris/pdfs/IRIS%20Program%20Materials%20to%20NRC_Part%202.pdf [accessed October 22, 2013].

EPA (U.S. Environmental Protection Agency). 2013c. Toxicological Review of Benzo[a]pyrene (CAS No. 50-32-8) in Support of Summary Information on the Integrated Risk Information System (IRIS), Public Comment Draft. EPA/635/R13/138a. National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC. August 2013 [online]. Available: http://cfpub.epa.gov/ncea/iris_drafts/recordisplay.cfm?deid=66193 [accessed November 13, 2013].

EPA (U.S. Environmental Protection Agency). 2013d. Toxicological Review of Ammonia (CAS No. 7664-41-7), In Support of Summary Information on the Integrated Risk Information System (IRIS). Revised External Review Draft. EPA/635/R-13/139a. National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC. August 2013 [online]. Available: http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=254524 [accessed October 14, 2013].

EPA (U.S. Environmental Protection Agency). 2013e. Toxicological Review of Ammonia (CAS No. 7664-41-7), In Support of Summary Information on the Integrated Risk Information System (IRIS), Supplemental Information. Revised External Review Draft. EPA/635/R-13/139b. National Center for Environmental Assessment, Office of Research and Development, Washington, DC [online]. Available: http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=254524 [accessed October 14, 2013].

EPA (U.S. Environmental Protection Agency). 2013f. Health and Environmental Research Online (HERO): The Assessment Process [online]. Available: http://hero.epa.gov/index.cfm?action=content.assessment [accessed October 14, 2013].

EPA (U.S. Environmental Protection Agency). 2013g. Health and Environmental Research Online (HERO): Basic Information [online]. Available: http://hero.epa.gov/index.cfm?action=content.basic [accessed October 14, 2013].

EPA (U.S. Environmental Protection Agency). 2013h. Preliminary Materials for the Integrated Risk Information System (IRIS) Toxicological Review of tert-Butyl Alcohol (tert-Butanol) [CASRN 75-65-0]. EPA/635/R-13/107. National Center for Environmental Assessment, Office of Research and Development, Washington, DC. July 2013 [online]. Available: http://www.epa.gov/iris/publicmeeting/iris_bimonthly-oct2013/t-butanol-litsearch_evidence-tables.pdf [accessed October 14, 2013].

EPA (U.S. Environmental Protection Agency). 2013i. Systematic Review of the tert-Butanol Literature (generated by HERO) [online]. Available: http://www.epa.gov/iris/publicmeeting/iris_bimonthly-oct2013/mtg_docs.htm [accessed October 14, 2013].

Fanelli, D. 2010. Do pressures to publish increase scientists' bias? An empirical support from U.S. States data. PLoS One 5(4):e10271.

Goldman, L.R., and E.K. Silbergeld. 2013. Assuring access to data for chemical evaluations. Environ. Health Perspect. 121(2):149-152.

Higgins, J.P.T., and S. Green, eds. 2008. Cochrane Handbook for Systematic Reviews of Interventions. Chichester, UK: John Wiley & Sons.

Hoffman, S., and T. Hartung. 2006. Toward an evidence-based toxicology. Hum. Exp. Toxicol. 25(9):497-513.

Hooijmans, C.R., and M. Ritskes-Hoitinga. 2013. Progress in using systematic reviews of animal studies to improve translational research. PLOS Med. 10(7):e1001482.
Hooijmans, C.R., A. Tillema, M. Leenaars, and M. Ritskes-Hoitinga. 2010. Enhancing search efficiency by means of a search filter for finding all studies on animal experimentation in PubMed. Lab Anim. 44(3):170-175.

IOM (Institute of Medicine). 2011. Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: National Academies Press.

IOM (Institute of Medicine). 2013. Sharing Clinical Research Data: Workshop Summary. Washington, DC: National Academies Press.

Korevaar, D.A., L. Hooft, and G. ter Riet. 2011. Systematic reviews and meta-analyses of preclinical studies: Publication bias in laboratory animal experiments. Lab Anim. 45(4):225-230.

Leenaars, M., C.R. Hooijmans, N. van Veggel, G. ter Riet, M. Leeflang, L. Hooft, G.J. van der Wilt, A. Tillema, and M. Ritskes-Hoitinga. 2012. A step-by-step guide to systematically identify all relevant animal studies. Lab Anim. 46(1):24-31.

Lundh, A., S. Sismondo, J. Lexchin, O.A. Busuioc, and L. Bero. 2012. Industry sponsorship and research outcome. Cochrane Database Syst. Rev. (12):Art. MR000033. doi: 10.1002/14651858.MR000033.pub2.

Mignini, L.E., and K.S. Khan. 2006. Methodological quality of systematic reviews of animal studies: A survey of reviews of basic research. BMC Med. Res. Methodol. 6:10. doi:10.1186/1471-2288-6-10.

NRC (National Research Council). 2011. Review of the Environmental Protection Agency's Draft IRIS Assessment of Formaldehyde. Washington, DC: National Academies Press.

Sena, E.S., H.B. van der Worp, P.M. Bath, D.W. Howells, and M.R. Macleod. 2010. Publication bias in reports of animal stroke studies leads to major overstatement of efficacy. PLoS Biol. 8(3):e1000344.

Silbergeld, E., and R.W. Scherer. 2013. Evidence-based toxicology: Strait is the gate, but the road is worth taking. ALTEX 30(1):67-73.

ter Riet, G., D.A. Korevaar, M. Leenaars, P.J. Sterk, C.J.F. Van Noorden, L.M. Bouter, R. Lutter, R.P.O. Elferink, and L. Hooft. 2012. Publication bias in laboratory animal research: A survey on magnitude, drivers, consequences and potential solutions. PLOS ONE 7(9):e43404.

Woodruff, T.J., and P. Sutton. 2011. An evidence-based medicine methodology to bridge the gap between clinical and environmental health sciences. Health Aff. (Millwood) 30(5):931-937.