3 Evaluation Framework

As discussed throughout this report, an evaluation framework was developed to guide the NIOSH program evaluations and provide a set of criteria to be used in scoring the relevance and impact of each program's efforts in reducing work-related hazardous exposures, illnesses, and injuries. The National Academies Committee for the Review of NIOSH Research Programs (referred to as the framework committee) developed an initial evaluation framework that was honed and refined to improve its utility, clarity, and emphasis as the eight evaluations proceeded. The framework presented in this chapter differs slightly from the versions used by the evaluation committees. Insights gained in the evaluation of the eight NIOSH programs (see Chapter 4) are included in this version of the framework, which is provided for consideration in future program evaluations. In addition to evaluating NIOSH programs, this analytic framework and approach may be applicable to the evaluation of other federal agency research programs or research programs in other organizations. (The evaluation framework document used by each individual evaluation committee is provided as an appendix in the evaluation committee reports [IOM and NRC, 2006, 2008, 2009; NRC and IOM, 2007, 2008a,b, 2009a,b].)

In conducting its evaluation, each committee was asked to determine whether the NIOSH program was undertaking high-priority, relevant research and transfer activities (relevance) and whether these efforts are improving health and safety in the workplace (impact). The evaluation committee was also tasked with (1) rating both the relevance and the impact of the NIOSH program using 1–5 integer scales, and (2) providing input about emerging areas of research and recommendations for program improvement.

OVERVIEW OF THE EVALUATION FRAMEWORK

After examining different approaches to program evaluation (see Chapter 2), the framework committee decided to define the scope and stages of the evaluation process based on the logic model (Williams et al., 2009). The resulting evaluation framework described in this chapter breaks the logic models developed by NIOSH (Figure 3-1) into discrete program components to be assessed by each evaluation committee. Criteria for evaluation of each component of the framework are detailed below. In the evaluation framework (overview provided in Figure 3-2), the assessment of strategic goals and objectives, inputs, activities, and outputs (B to E) largely defines the relevance of the program, while the assessment of intermediate and end outcomes (F and G) largely defines the program impact.

The following major components of each NIOSH program were assessed by the evaluation committees:

• Major occupational safety and health challenges in the program area.
• Goals and objectives as defined by NIOSH.
• Inputs (e.g., budget; staff; facilities; and input from the program's research management, the NIOSH Board of Scientific Counselors, and stakeholders).
• Activities (efforts by NIOSH staff, contractors, and grantees; e.g., surveillance of injury, illness, and hazards; exposure assessment research; health-effects research; injury-risk factor research; intervention research; health services research; and technology transfer activities).
• Outputs (NIOSH products; e.g., publications, reports, conferences, databases, tools, methods, guidelines, recommendations, education and training, and patents).
• Intermediate outcomes (actions by external stakeholders in response to NIOSH products; e.g., policy change, training and education, self-reported use or repackaging of NIOSH data by stakeholders, adoption of NIOSH-developed technologies, implemented guidelines, and licenses).
• End outcomes (e.g., reduction in work-related injuries, illnesses, or hazardous exposures in the workplace).
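Purely as an illustration, and not part of the committee's framework, the relationship between the framework components and the two scored dimensions could be recorded in a simple structure such as the minimal sketch below. The component labels A–G follow Figure 3-2, and the 1–5 integer scales are those described above; all class, variable, and program names are hypothetical.

```python
from dataclasses import dataclass, field

# Components A-G of the evaluation framework (Figure 3-2).
# Components B-E largely inform the relevance score; F and G inform the impact score.
COMPONENTS = {
    "A": "Major occupational health and safety challenges",
    "B": "Goals and objectives",
    "C": "Inputs",
    "D": "Activities",
    "E": "Outputs",
    "F": "Intermediate outcomes",
    "G": "End outcomes",
}
RELEVANCE_COMPONENTS = ("B", "C", "D", "E")
IMPACT_COMPONENTS = ("F", "G")


@dataclass
class ProgramScores:
    """One committee's summary ratings on the 1-5 integer scales (hypothetical record)."""
    program: str
    relevance: int   # 1 (lowest) to 5 (highest)
    impact: int      # 1 (lowest) to 5 (highest)
    rationale: dict = field(default_factory=dict)  # free-text notes keyed by component letter

    def __post_init__(self) -> None:
        for name, score in (("relevance", self.relevance), ("impact", self.impact)):
            if score not in (1, 2, 3, 4, 5):
                raise ValueError(f"{name} must be an integer from 1 to 5")


# Hypothetical usage with placeholder values:
scores = ProgramScores(
    program="Example NIOSH research program",
    relevance=4,
    impact=3,
    rationale={"E": "strong guidance outputs", "G": "end-outcome data limited"},
)
```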

[Figure 3-1 is not legible in this machine-read text. It depicts the NIOSH logic model: planning and production inputs feed NIOSH research and transfer activities (including translation of research into practice, products, and technologies; information dissemination; and capacity building through technical assistance, training, and education), which generate outputs such as recommendations, reports, publications, workshops, conferences, databases, training and education materials, demonstration programs, trained professionals, tools and methods, best practices, developmental technologies, health hazard evaluations, websites, licenses, and patents. These outputs are taken up by external stakeholders (OSHA, MSHA, and other federal agencies; employers, employees, and Congress; state and local agencies; standards bodies; labor, trade, and professional associations; major media; technology developers and manufacturers; other researchers; and safety and health practitioners), who reduce or prevent hazardous exposures or conditions. Surveillance and evaluation of intervention effectiveness run throughout the model.]

FIGURE 3-1  The National Institute for Occupational Safety and Health logic model.

[Figure 3-2 is not legible in this machine-read text. It shows the evaluation process as boxes A through G, bounded by external factors: (A) identify major occupational health and safety challenges in the program area (independent assessment by the committee, compared with NIOSH program-area goals); (B) analysis of goals and objectives driving the current program (assessment of the NIOSH process for selecting program goals, evaluation of the goals selected by NIOSH, and comparison with the assessment of challenges in A); (C) review and assessment of inputs (planning inputs, e.g., surveillance and intervention data and stakeholder inputs; production inputs, e.g., intra- and extramural funding, staffing, physical facilities, and management structure); (D) review and assessment of activities (e.g., surveillance, health-effects research, intervention research, technology transfer activities, and health services and other research); (E) review and assessment of outputs (e.g., publications, reports, databases, tools, methods, guidelines, recommendations, licenses, and patents); (F) review and assessment of intermediate outcomes (e.g., public policy impact, training and education, self-reported use or repackaging by stakeholders, and implemented guidelines); and (G) review and assessment of end outcomes (reduced injuries, illnesses, and hazardous exposures in the workplace).]

FIGURE 3-2  Overview of the evaluation process.

The framework committee understood that the efforts of any research program, and the evaluation of that program, will not be as linear as presented in either Figure 3-2 or Box 3-1; rather, they are iterative processes. Overlap necessarily occurs between the assessment of relevance and impact, particularly in the assessment of information transfer. Furthermore, components of any program may not fit perfectly into any one category. For example, training and development programs are appropriately defined as outputs by NIOSH in the logic model (Figure 3-1), but the framework committee found more value in focusing on the responses to these outputs as intermediate outcomes in the evaluation. Some NIOSH programs are organized using a matrix management approach because they span several NIOSH divisions or laboratories. Because resources within NIOSH are allocated in large part at the division level rather than the program level, a matrix-managed program may have little control over the input portion of the logic model and therefore fewer resources within its direct control on which to base decisions.

Following the suggested evaluation process ensured a level of consistency and comparability among all the evaluation committees. For future program evaluations, training on logic models and criteria for differentiating the various components of the logic model would be beneficial at the inception of the evaluation process, both for NIOSH staff as they assemble the evidence packages and for evaluation committee members as they begin their assessment of the program.

BOX 3-1
Steps in the Evaluation Process

 1. Gather appropriate information.
 2. Assess external factors.
 3. Identify time frame to be evaluated.
 4. Identify major occupational health and safety challenges in program area.
 5. Analyze program goals and objectives.
 6. Identify major program components.
 7. Evaluate program inputs, activities, outputs, and outcomes.
 8. Determine scores for relevance and impact and provide the rationale.
 9. Assess the program's process for targeting priority research needs and provide the committee's assessment of emerging issues.
10. Prepare report by using the template provided as a guide.

Drawing on the program logic model, the evaluation framework, and the evaluation committee members' expertise, the evaluation committees began by examining important inputs and external factors affecting the NIOSH research program's agenda. Examples of external factors included research activities of industry and other federal agencies as well as the political and regulatory environment. The evaluation then focused on the program's research activities, outputs, associated transfer activities, and resulting intermediate and end outcomes. Box 3-1 provides a summary of the evaluation process as suggested by the framework committee. Detailed guidance on each step is provided in later sections of this chapter.

The following key factors were considered in assessing the relevance of NIOSH research programs:

• The severity and/or frequency of the health and safety hazards addressed and the number of people at risk (magnitude) for these hazards.
• The extent to which NIOSH research programs identified and addressed gender-related issues and issues of vulnerable populations, and the extent to which NIOSH research programs addressed the health and safety needs of small businesses. (Vulnerable populations are defined as groups of workers who have biological, social, or economic characteristics that place them at increased risk for work-related conditions or on whom inadequate data have been collected. These populations include low-wage workers, disadvantaged minorities, disabled persons, and non-English speakers for whom language or other barriers present health or safety risks.)

• The stage of research on the problems being addressed. As the health effects are understood, research efforts should shift from etiologic research to intervention research and then to intervention-effectiveness research. Gaps in the spectrum of prevention need to be addressed; for example, research on exposure assessment may be necessary before the next intervention steps can be taken.
• The structure, in addition to the content, of the research program. A relevant research program is more than a set of unrelated research projects; it is an integrated program involving interrelated surveillance, research, and transfer activities.
• Appropriate NIOSH consideration of external stakeholder input.

The evaluation committees had the option to consider these and other relevant factors as they progressed through each stage of the evaluation.

Data documenting end outcomes are often quite limited or are not available to quantify reductions in illness, injury, and hazardous exposures. Data documenting intermediate outcomes, although likely also limited, could serve as an appropriate proxy for end-outcome data if the relationship between occupational exposures and health outcomes is well understood. For example, changes in regulations or procedures likely to result in reduced exposures are important measures of intermediate outcomes.
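To make the proxy logic above concrete, the following hypothetical sketch (not taken from the report) encodes one way this evidence-selection rule could be expressed; the function name, parameters, and return strings are assumptions for illustration only.

```python
from typing import Optional


def select_outcome_evidence(end_outcome_data: Optional[dict],
                            intermediate_outcome_data: Optional[dict],
                            exposure_outcome_link_established: bool) -> str:
    """Illustrative decision rule for choosing outcome evidence.

    Prefer direct end-outcome data (reductions in injuries, illnesses, or
    hazardous exposures). If those data are limited or unavailable, fall back
    on intermediate outcomes (e.g., regulatory or procedural changes) only
    when the exposure-health relationship is well understood.
    """
    if end_outcome_data:
        return "Use end-outcome data (injury, illness, and exposure trends)."
    if intermediate_outcome_data and exposure_outcome_link_established:
        return "Use intermediate outcomes as a proxy for end outcomes."
    return "Outcome impact cannot be quantified; rely on qualitative judgment."


# Hypothetical usage: no end-outcome data, but a well-understood exposure-health link.
print(select_outcome_evidence(None, {"regulation_changes": 2}, True))
```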

Useful program evaluation requires specific questions and criteria for assessing each component of the program; a disciplined focus on a small number of questions or hypotheses, typically related to program goals, performance criteria, and performance standards; a rigorous method of answering the questions or testing the hypotheses; and a credible procedure for developing qualitative and quantitative assessments. Because of the uniqueness of each NIOSH program, each evaluation committee determined the most reasonable way to apply the evaluation criteria.

EVALUATION COMMITTEES

The individual evaluation committees were formed in accordance with the rules of the National Academies, which focus on ensuring a balanced committee. Each evaluation committee included persons with expertise appropriate for the specific NIOSH research program under review (including researchers and representatives of stakeholder groups, e.g., worker organizations and industry), experts in technology and knowledge transfer, and experts in program evaluation.

The evaluation committee gathered appropriate information from the NIOSH research program under review, from external stakeholders affected directly by the NIOSH program research, and from relevant independent parties. The original contracts between NIOSH and the National Academies specified that each evaluation committee would consist of about 10 members, meet three times, and prepare a report due to NIOSH within 9 months of the first meeting of the evaluation committee. As noted in Chapter 4, future evaluations may consider extending the time frame to 12–14 months, depending on the size and complexity of the program being evaluated.

STEPS IN THE EVALUATION PROCESS

The evaluation process consists of 10 steps, described in the following sections and summarized in Box 3-1. A description is provided of how NIOSH programs were evaluated by the National Academies; this model can be applied to other program evaluations as well.

1.  Gather Appropriate Information

Each NIOSH program under review provided information to the relevant evaluation committee, including that outlined in Box 3-2. Some of the evaluation committees requested additional information from the program. Organizing the information listed in Box 3-2 by goal (or by subprogram if organizing by goal was not feasible) was helpful to the committees.

In addition to the information provided by the NIOSH program, the evaluation committees independently collected additional information as deemed necessary for the evaluation, such as the perspectives of external stakeholders, including the Occupational Safety and Health Administration (OSHA), the Mine Safety and Health Administration (MSHA), workforces and their unions, and industry. In conducting the review, the evaluation committees examined how the inputs, activities, outputs, and intermediate outcomes contribute to the impact and relevance of the program as a whole.

Many NIOSH programs have been evaluated by internal and other external bodies as part of an overall assessment of NIOSH, such as the Program Assessment Rating Tool (PART) review, or through evaluations of specific research program elements. (PART focuses on assessing program-level performance and is one of the measures of success of the budget and performance integration initiative of the President's management agenda; http://www.whitehouse.gov/omb/expectmore/summary/10002160.2004.html [accessed January 30, 2009].)

BOX 3-2
Evaluation Committee Information Needs

• Program background and overview:
  o Program history
  o Program management structure
  o Major program challenges
  o Program goals and objectives, past (for period under review) and current
  o Process for developing and updating program strategic plans
  o Enabling or authorizing legislation
  o Major subprograms (if appropriate)
  o Results of previous program reviews (e.g., annual review by NIOSH leadership team or external scientific program reviews)
  o External factors affecting the program
• Interactions with external stakeholders and with other NIOSH programs:
  o The role of program research staff in NIOSH policy setting, OSHA and MSHA standard setting, voluntary standard setting, and other government policy functions
  o Interactions and working relationships with other NIOSH programs
  o Identification of other institutions and research programs with similar portfolios and an explanation of the relationship between NIOSH activities and those of other institutions
  o Key partnerships with other government agencies, employers, labor, academic institutions, nonprofit organizations, and international organizations
• Program inputs:
  o Production inputs (program resources):
    - Funding by year for period under review
    - Funding by objective or subprogram
    - Program staffing, full-time equivalents, and laboratory facilities, by subprogram (if indicated)
    - Percentage of program budget that is discretionary (beyond salaries)
    - Percentage of program budget that is earmarked
    - Significant contributions to the program from other sources (in kind or funds)
  o Planning inputs:
    - Surveillance data, inputs from the Health Hazard Evaluation and the Fatality Assessment and Control Evaluation programs, and intramural and extramural research findings that influenced program goals and objectives
    - Planning inputs from stakeholders such as advisory groups; National Occupational Research Agenda (NORA) teams; and professional, industry, and labor groups (specify if any input comes from groups representing small business or vulnerable populations)
    - Related OSHA and MSHA strategic plans or other input
    - Process for soliciting and approving intramural research ideas
    - Process for soliciting and approving program-supported extramural research activities

• Program activities (more details provided in Box 3-3):
  o Intramural:
    - Surveillance activities
    - Research activities
    - Transfer activities to encourage implementation of research results for improved occupational safety and health (e.g., information dissemination, technical assistance, and technology and knowledge transfer)
    - Key collaborations in intramural activities (e.g., with other government agencies, academia, industry, and unions)
  o Extramural funded by NIOSH:
    - Requests for applications developed by program
    - Funded projects: grants, cooperative agreements, and contracts, such as surveillance activities, research activities, transfer activities, and capacity-building activities
• Outputs (products of the research program; more details provided in Box 3-4):
  o Intramural:
    - Peer-reviewed publications, agency reports, alerts, and recommendations
    - Databases, websites, tools, and methods (including education and training materials)
    - Technologies developed and patents
    - Sponsored conferences and workshops
  o Extramural:
    - Program announcements
    - Requests for applications
• Intermediate outcomes:
  o Standards or guidelines issued by other agencies or organizations based in whole or in part on NIOSH research
  o Adoption and use of control or personal protective technologies developed by NIOSH
  o Evidence of industry, employer, or worker behavioral changes in response to research outputs
  o Use of NIOSH products by workers, industry, occupational health and safety professionals, healthcare providers, and others (including internationally)
  o NIOSH website hits and document requests
  o Unique staff or laboratory capabilities that serve as a national resource
  o Other intermediate outcomes, including those from extramural activities
• End outcomes:
  o Data on program impact on rates and numbers of injuries, illnesses, and hazardous exposures in the workplace (including trend data, if available)
  o Documentation of workplace risk reduction (quantitative, qualitative, or both)
• Description of current processes for setting research priorities and identifying emerging issues in the workplace

The evaluation committees were asked to review all prior evaluations of the program as an aid to understanding the evolution of the program and its elements. The National Academies committee evaluations, however, were independent of prior reviews and evaluations.

2.  Assess External Factors

As depicted in the logic model (Figure 3-1), reductions in work-related injury and illness or in hazardous exposures (end outcomes) depend on stakeholder activities (external factors). Actions beyond NIOSH's control by those in labor, industry, regulatory entities, and elsewhere are necessary for NIOSH program activities to produce changes in end outcomes. Implementation of research findings may depend on existing or future policy considerations, economic conditions, and the public agenda.

External factors were considered as forces beyond the control of the NIOSH program that may affect the evolution of the program. External factors influence progress through all phases of the logic model, from inputs to end outcomes (see Figure 3-1). Identification of external factors by an evaluation committee is essential because it provides the context for evaluation of the program. External factors may be best assessed on the basis of the expert judgment of evaluation committee members who have knowledge of the field of research. NIOSH program staff provided their ideas on external factors early in the evaluation process. Information regarding external factors was also sought from other NIOSH program and management staff, from OSHA and MSHA staff, and from other external stakeholders. Additionally, each evaluation committee chose other approaches to assess external factors.

Factors external to a program might help or hinder the achievement of certain outcomes or might present formidable obstacles. The evaluation committees addressed both possibilities. Some external factors may constrain research activities related to specific target populations, methodological issues, or resource availability. Evaluation committees examined whether:

• Projects addressing a critical health need are technologically feasible. A workforce of appropriate size and with appropriate duration and distribution of exposure for measuring a health effect may not exist; for example, no population of workers has been exposed for 30 years to formaldehyde at the current OSHA permissible exposure limit (PEL), so the related cancer mortality cannot yet be directly assessed.
• Research is inhibited because NIOSH investigators are unable to access an adequate study population. Under current policy, NIOSH must either obtain an invitation from management to study a workplace or seek a judicial order providing authority to enter a worksite (cooperation under court order may well be insufficient for effective research).

• Research is inhibited because the work environment, materials, and historical records cannot be accessed even with management and workforce cooperation.
• Adequate or established methods exist for assessing the environment.
• The NIOSH contribution to a particular field of research is reduced or difficult to estimate because other institutions are working in the same field.
• NIOSH resources are inadequate to tackle key questions.

Evaluation of the impact of NIOSH research outputs on worker health and safety also required consideration of external factors that might impede or aid implementation, measurement, and so on. Evaluation committees considered whether the following conditions exist and, if so, how they influence the research that NIOSH undertakes:

• Regulatory changes and implementation are unachievable because of obstacles to regulation or because of differing priorities of the regulatory agencies. For example, there may be no implementation of recommendations for improved respiratory protection programs for healthcare workers because of the lack or weakness of enforcement policies.
• A feasible control for a known risk factor or exposure has not been implemented because the cost of implementation is too high or because current economic incentives do not favor such actions.
• End outcomes are unobservable because baseline and continuing surveillance data are not available. For example, the current incidence of occupational noise-induced hearing loss is not known, although surveillance for a substantial threshold shift is feasible. (NIOSH conducts surveillance of some types of work-related illnesses, injuries, and hazards, but comprehensive surveillance is not possible with existing resources.)
• Reductions in adverse effects of chronic exposure cannot be measured. For example, 90 percent of identified work-related mortality is from diseases, such as cancer, that arise only after decades of latency following first exposure to a carcinogen. Effects of reducing exposure to a carcinogen therefore cannot be observed in the time frame of most interventions.
• A promulgated regulation requires a technology that has been developed but is not widely used.

3.  Identify Time Frame to Be Evaluated

The NIOSH research program and other sources provided each evaluation committee with the history of the research program being evaluated and information on its major subprograms, goals, objectives, resources, and other pertinent information. Having that information allowed the committee to choose the time period most appropriate for the evaluation, with a focus on evaluating the program during the most recent appropriate period. For purposes of the eight reviews already completed, the evaluation committees considered three general time frames: 1970–1995 (pre-NORA period), the period from the founding of NIOSH to the initiation of the National Occupational Research Agenda (NORA); 1996–2005 (NORA 1 period); and after 2005 (NORA 2 period). The period chosen for review took into consideration suggestions from the NIOSH research program under review. It was recognized that many of the intermediate and end outcomes documented in the selected time frame are consequences of research outputs completed before that time period.

4.  Identify Major Occupational Health and Safety Challenges in Program Area

Early in the assessment process, each evaluation committee identified, independently of NIOSH, the major occupational health and safety challenges for the research area being examined (Box A in Figure 3-2). In arriving at a list of challenges, the evaluation committees relied on surveillance findings, NIOSH investigations of sentinel events (through health-hazard or fatality-assessment programs), external advisory inputs, and their own expert judgment. (An occupational sentinel event is a disease, disability, or untimely death that is occupationally related and whose occurrence may provide the impetus for further study or for the need to intervene [Rutstein et al., 1983].) The evaluation committee then compared its own assessment of the challenges with the program's goals and objectives, as outlined in the next step. The congruence between the two was useful during the assessment of relevance. In identifying and discussing the challenges, the evaluation committee included examples of best practices or described the components of the committee's vision of an ideal program.

5.  Analyze Program Goals and Objectives

The research program's goals and objectives were evaluated with a focus on how each program goal is related to agency-wide strategic goals and to the program challenges (Box B in Figure 3-2).

NIOSH research programs should be designed to be responsive to present or future workplace safety and health issues, and the evaluation committee was asked to provide an assessment of whether the program's goals and objectives are consistent with those issues. The evaluation committees recognized that NIOSH research priorities are sometimes circumstantial (e.g., congressionally mandated) rather than based on NIOSH's assessment of the state of knowledge.

Questions Considered in the Evaluation of Program Goals and Objectives

1. Are the goals and objectives of the program well defined and clearly described?
2. How were the goals and objectives derived (or updated) through strategic planning processes?
3. How well aligned were program goals and objectives with NORA 1 priorities during the past decade?
4. How are current program goals and objectives related to current NIOSH goals?
5. Are the program's goals and objectives relevant to the major challenges for the research program and likely to address emerging issues in that specific research area (as determined by the evaluation committee)?
   • Did past program goals and objectives (as reflected in prior research and dissemination and transfer activities) focus on the most relevant problems and anticipate emerging issues?
   • Do the current program goals and objectives target the most relevant problems?

Assessment of Program Goals and Objectives

The evaluation committee was asked to provide a qualitative assessment discussing the relevance of the program's goals and objectives in relation to its major challenges.

6.  Identify Major Program Components

Each evaluation committee determined how to disaggregate a program to achieve a manageable and meaningful evaluation of its components and of the overall program. Usually the disaggregation followed the strategic goals that the program identified. Although the research programs are built around intramural efforts, all relevant extramural efforts must be considered.

7.  Evaluate Program Inputs, Activities, Outputs, and Outcomes

7a. Assess Inputs (Box C in Figure 3-2)

Planning inputs include input from stakeholders, surveillance and intervention data, and risk assessments. Production inputs include intramural and extramural funding, staffing, management structure, and physical facilities.

The evaluation committee examined existing intramural and extramural resources and, in some cases, prior surveys or case studies developed specifically to assess progress in reducing workplace illnesses and injuries and to provide information relevant to the targeting of research to future needs. The NIOSH research program provided the evaluation committee with relevant planning and production inputs (see below and Box 3-2 for examples).

Planning inputs.  Planning inputs can be qualitative or quantitative. Sources of qualitative inputs for NIOSH included the following:

• Other NIOSH programs;
• Federal advisory committees, such as the Board of Scientific Counselors, the Mine Safety and Health Research Advisory Committee, and the National Advisory Committee on Occupational Safety and Health;
• NORA research partners and stakeholders, NORA strategic research plans, and the NORA Liaison Committee and federal liaison committee recommendations;
• Industry, labor, academe, professional associations, industry associations, and the Council of State and Territorial Epidemiologists; and
• OSHA and MSHA strategic plans and other federal research agendas.

Attention was given to how comprehensive the inputs have been and to what extent gaps in input have been identified and considered by the program being evaluated.

Sources of quantitative inputs for NIOSH included the following:

• Intramural surveillance information, such as descriptive data on exposures and outcomes (appropriate data may be available from a number of NIOSH divisions and laboratories);
• Reports from the NIOSH Health Hazard Evaluation (HHE) program;
• Reports from the Fatality Assessment and Control Evaluation (FACE) program;

• Extramural health-outcome and exposure-assessment data from OSHA, MSHA (both safety and health inspection data), the Bureau of Labor Statistics, the U.S. Department of Defense (DoD), and the U.S. Department of Agriculture (USDA) (fatality, injury, and illness surveillance data); state government partners, including NIOSH-funded state surveillance programs, such as the Sentinel Event Notification System for Occupational Risks, Adult Blood Lead Epidemiology and Surveillance, and state-based FACE programs; and nongovernmental organizations, such as the National Safety Council, the Association of Occupational and Environmental Clinics (AOEC), the American Society of Safety Engineers, and the American College of Occupational and Environmental Medicine; and
• Appropriate data from investigator-initiated extramural research funded by NIOSH.

Production inputs.  For the research program under review, NIOSH program staff identified portions of the NIOSH intramural budget, staff, facilities, and management that played major roles in the research program. Production inputs were described primarily in terms of support for intramural research projects, relevant extramural projects (particularly cooperative agreements and contracts), HHEs that supported program goals, and related staffing levels. Consideration was also given to leveraged funds provided by partners such as the National Institutes of Health (NIH) and the Environmental Protection Agency (EPA) for joint requests for applications or program announcements, to OSHA, MSHA, and DoD contracts, and to collaborations with other NIOSH programs.

Using this evaluation model, assessment of inputs included the evaluation committee's consideration of the degree to which the allocation of funding and personnel was commensurate with the resources needed to conduct the research, and the extent to which funding for the relevant intramural research activity has been limited by lack of discretionary spending beyond salaries, such as travel, supplies, and external laboratory services.

Questions considered in the evaluation of inputs

1. Are planning and production inputs consistent with program goals?
2. How well are major planning and production inputs used to support the major activities?
3. Is input obtained from external stakeholders, including those representing vulnerable working populations and small businesses?
4. Are production inputs (intramural and extramural funding, staffing, management, and physical infrastructure resources) consistent with program goals and objectives?

Assessment of inputs.  The evaluation committee was asked to provide a qualitative assessment that discussed the quality, adequacy, and use of inputs.

7b. Assess Activities (Box D in Figure 3-2)

Activities are defined as the efforts of program staff, grantees, and contractors. Activities of the NIOSH program under review were divided into research and transfer activities. Box 3-3 suggests the types and organization of information that can be useful in evaluating program activities. Some types of research activity may not be applicable to a given NIOSH program. Research activities include surveillance, health-effects research, exposure assessment research, safety-design and safety-systems research, intervention research, and diffusion and dissemination research. Transfer activities include marketing analysis, information dissemination, training, technical assistance, and technology transfer. Depending on the scope of the program under review, activities may also be grouped by research program objectives or by subprograms.

Conventional occupational safety and health research efforts appropriately focus on injury, illness, or death; on biomarkers of exposure; and on health effects of new technology, personal protective equipment, and regulations. Consideration was also given to the types of surveillance data needed. The program's activities relevant to socioeconomic and policy research and to diffusion research were also considered because these research endeavors can provide information needed to effect important outcomes farther out on the causal chain that influences health and safety in the workplace. Examples of other types of research that could have been useful to the evaluation committees in examining activities relevant to the program's mission included the following:

• Surveillance research to assess the degree of significant or systematic underreporting of relevant injuries, illnesses, and biomarkers;
• Socioeconomic research on cost shifting between workers' compensation and private insurance;
• Research on methods to build the health and safety capacity of primary care clinicians in community health centers and other healthcare settings to improve the recognition and treatment of work-related conditions;
• Transfer research on how to change the health and safety knowledge and behavior of adolescents to improve the likelihood of reduced injuries as they enter the workforce; and
• Community-based participatory research to explore how recent immigrants and those employed for a longer time in the United States understand acceptable health and safety risks, with the purpose of better targeting the workforce training needs of immigrant workers.

BOX 3-3
Examples of NIOSH Program Research and Transfer Activities

Surveillance (including surveillance of injuries, illnesses, and hazards)

Health-effects research (illnesses, injuries, and biomarkers):
  Epidemiology
  Toxicology
  Physical and safety risk factors (laboratory based)
  Development of clinical screening methods and tools

Exposure assessment research:
  Chemical hazards
  Physical hazards
  Biologic hazards
  Ergonomic hazards
  Safety (traumatic injury) hazards

Safety design and safety systems research

Intervention research:
  Control technologies
  Engineering controls and alternatives
  Administrative controls
  Personal protective equipment
  Work organization
  Community participation
  Policy (e.g., alternative approaches to targeting inspections)
  Design for safety
  Emergency preparedness and disaster response

Diffusion and dissemination research:
  Training effectiveness
  Information dissemination effectiveness
  Diffusion of technology

Health services and other research:
  Access to occupational health care
  Infrastructure: delivery of occupational health services
  Socioeconomic consequences of work-related injuries and illnesses
  Workers' compensation

Technology transfer and other transfer activities:
  Information dissemination
  Training programs
  Marketing analysis
  Technical assistance

Transfer activities were assessed to determine whether the program appropriately targets its outputs in a manner that will have the greatest impact. Ideally, information dissemination should be proactive, and strategic dissemination should be informed by research on the diffusion of new technologies, processes, and practices. Highly relevant information and technology transfer activities include plans for transfer to all appropriate worker populations, including those considered vulnerable. Training should be incorporated into the strategic goals of all research fields where appropriate.

The evaluation committee reviewed project-level research and transfer activities (including surveillance activities) that have been completed, are in progress, or are planned by the program under review. Programs were asked to provide a list of activities and specify whether the activities were intramural or extramural. The evaluation committee assessed each research activity outlined in Box 3-3 that is or should be an important element of the specific program being evaluated. In the case of a sector-based research program (e.g., mining or construction) for which health-effects research is not being evaluated, each committee determined what research activities were consistent with the program's goals and objectives and then assessed the value of the activities.

Questions considered in assessing research activities

 1. What are the major subprograms or groupings of activities within the program?
 2. Are activities consistent with program goals and objectives?
 3. Are research activities relevant to the major challenges of the research program?
    • Do they address the most serious outcomes?
    • Do they address the most common outcomes?
    • Do they address the needs of both genders, vulnerable working populations, and small businesses?
 4. Are NIOSH research activities pioneering in opening new and useful fields of research to be further explored by NIOSH and others?
 5. Are research activities appropriately responsive to the input of stakeholders?
 6. To what extent do research activities involve external partnerships?
 7. Are partners involved early in the research process to allow them to participate in determining research objectives and research design?
 8. Were original resource allocations appropriate for the research activities, and do they remain appropriate?
 9. To what extent do peer reviews (internal, external, and midcourse) affect the activities?

10. Is there adequate monitoring of quality assurance procedures to ensure credible research data, analyses, and conclusions?

Questions considered in assessing transfer activities

1. Is a coherent program of transfer activities planned?
2. Have staff thought through issues of compatibility, cost, and simplicity in designing information and transfer products?
3. Are the program's publications and information dissemination, training, education, and technical assistance efforts successful in reaching the workplace or relevant stakeholders in other settings? How widespread is the response?
4. To what degree have stakeholders responded to program information and training products?
5. Is there evidence that the formats for information products were selected in response to stakeholder preferences?
6. To what extent do program personnel rely on assessment of stakeholder needs and reactions to prototype information and training projects (formative evaluation techniques)?
7. To what extent does the program build research and education capacity internally and among stakeholders?

Assessment of activities.  Each evaluation committee was asked to provide a qualitative assessment of the relevance of these efforts. This assessment included consideration of the external factors that constrained choices of research projects and of the relevance and effectiveness of transfer activities. The evaluation committee considered the appropriateness of resource allocations. A highly relevant program would address high-priority needs, produce high-quality results, be appropriately collaborative, be of value to stakeholders, and be substantially engaged in transfer activities. A program might be less relevant to the extent that those key elements were not up to the mark or were missing. The committee's discussions covered those aspects in sufficient detail to arrive at a qualitative assessment of the activities. Assessment of the transfer activities included considerations of program planning, coherence, and impact. The evaluation committee also considered the incorporation of international research results into knowledge-transfer activities conducted by the NIOSH program for U.S. industry sectors.

7c. Assess Outputs (Box E in Figure 3-2)

For the NIOSH evaluations, an output is a direct product of a NIOSH research program. Outputs may be designed for researchers, practitioners, intermediaries, and end users, such as employers and employees. Outputs can include publications in peer-reviewed journals, recommendations, reports, website content, workshops and presentations, databases, educational materials, scales and methods, new technologies, patents, and technical assistance. Outputs of the research program's extramurally funded activities were also considered. Box 3-4 lists examples of major outputs considered by the evaluation committees. Each NIOSH research program was asked to make every effort to include all pertinent data of the types listed in Box 3-4 in the materials submitted to the committee.

BOX 3-4
Examples of Research Program Outputs

Peer-reviewed publications by NIOSH staff:
• Number of original research articles by NIOSH staff and citations
• Number of review articles by NIOSH staff (including best-practices articles) and citations
• Publications in the field of interest with other support by investigators also funded by NIOSH (e.g., ergonomic studies with other support by an investigator funded by NIOSH to do ergonomics work, in which case NIOSH should get some credit for seeding interest or drawing people into the field)

Peer-reviewed publications by external researchers funded by NIOSH:
• Number of NIOSH-funded original research articles by external researchers and citations
• Number of NIOSH-funded review articles by external researchers (including best-practices articles) and citations
• Collaboration with other government or academic researchers

NIOSH reports:
• Number of written reports and citations

Sponsored conferences and workshops:
• Number of sponsored conferences
• Number of sponsored workshops
• Description of conferences and workshops (title, date, sponsors, target audience, number of participants, and resulting products)

Databases:
• Number of major databases created by NIOSH staff
• Number of major databases created by external researchers funded by NIOSH grants
• Description of databases:
  o Title, objective (in one to four sentences), and start and stop dates
  o Partial versus complete sponsorship (if partial, who were cosponsors?)
  o Study or surveillance system design, study population, and sample size
  o Primary "products" of the database (e.g., number of peer-reviewed articles and reports)

Recommendations:
• Number of major recommendations
• Description of recommendations:
  o Complete citation (article, report, or conference where the recommendation was made)
  o Summary in one to four sentences
  o Percentages of target audiences and decision makers that have adopted the recommendation (up to 10 years after release)
  o Examples of implementation in the field

Tools, methods, and technologies:
• Number of major tools, methods, and technologies (includes training and education materials)
• Descriptions:
  o Title and objective (in one to four sentences)
  o Complete citation (if applicable)
  o Percentage of target audience that has used the tools, methods, or technologies (up to 10 years after release)
  o Up to three examples of implementation in the field

Patents:
• Total number of patents
• For each:
  o Title and objective (in one to four sentences)
  o Complete citation
  o Percentage of target audience that has used the product (up to 10 years after release)
  o Up to three examples of implementation in the field

Miscellaneous:
• Any other important program outputs
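As an illustration only, and not part of the framework itself, output descriptions of the kind listed in Box 3-4 could be captured in a simple record such as the minimal sketch below; all field and variable names, and the example values, are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class OutputRecord:
    """Hypothetical record for one program output of the kind listed in Box 3-4."""
    category: str                  # e.g., "recommendation", "tool", "patent", "database"
    title: str
    objective: str                 # one to four sentences
    citation: str = ""             # complete citation, if applicable
    adoption_pct: float = 0.0      # percent of target audience using it (up to 10 years after release)
    field_examples: List[str] = field(default_factory=list)  # up to three examples of implementation


# Hypothetical usage with placeholder values:
record = OutputRecord(
    category="recommendation",
    title="Example exposure-control recommendation",
    objective="Summarize the recommended control and the hazard it addresses.",
    adoption_pct=40.0,
    field_examples=["Adopted in an example employer safety program"],
)
```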

attitude formation, and behavioral intent. The extent of use of formative evaluation data and the extent of user feedback in the design of the output can be considered indicators of appropriate quality assessment.

Activities such as collaborations can also legitimately be conceptualized as outputs, because the collaboration itself is a result of NIOSH efforts. Cooperation, coordination, more intensive collaboration, and eventual formal partnering can be considered important outputs leading to desirable intermediate outcomes. Technology transfer and knowledge transfer are facilitated significantly through such relationships. The extent of collaboration with other organizations in the determination of research agendas, the conduct of research, the dissemination of research results, and interorganizational involvement in the production of outputs may be measures of output quality and quantity. The evaluation committees considered coauthorship while determining the importance of research by the NIOSH program to the broader research community. The NIOSH program was asked to provide information on all relevant outputs of the program under review that were produced during the chosen period.

Questions considered in the evaluation of outputs
1. What are the major outputs of the research program?
2. Are output levels consistent with resources allocated (were resources allocated and used efficiently to produce outputs)?
3. Does the research program produce outputs that address high-priority areas?
4. To what extent does the program generate important new knowledge or technologies?
5. Do any widely cited, peer-reviewed publications report “breakthrough” results?
6. What, if any, internal or external capacity-building outputs are documented?
7. Are outputs relevant to both genders and vulnerable populations, and do they address the needs of small businesses?
8. Are products user-friendly with respect to readability, simplicity, and design?
9. To what extent does the program help to build the internal or extramural institutional knowledge base?
10. Does the research produce effective cross-agency, cross-institute, or internal–external collaborations?
11. To what extent does the program build research and education capacity (internal or external)?
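Taken together, Box 3-4 and the questions above amount to a structured inventory of program outputs. The sketch below is only an illustration of how such an inventory might be captured so that counts and citation tallies can be produced consistently across goal areas; the class and field names are hypothetical assumptions and are not part of the NIOSH review materials.

```python
# Illustrative only: hypothetical structures for tallying the output types
# listed in Box 3-4. Names and fields are assumptions, not a NIOSH format.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Publication:
    title: str
    authors: List[str]
    year: int
    niosh_funded: bool          # intramural work or NIOSH-funded extramural work
    article_type: str           # e.g., "original research" or "review"
    citation_count: int = 0


@dataclass
class Recommendation:
    citation: str               # article, report, or conference where it was made
    summary: str                # one to four sentences
    adoption_pct: Optional[float] = None      # % of target audience adopting (<= 10 yr)
    field_examples: List[str] = field(default_factory=list)


@dataclass
class OutputInventory:
    publications: List[Publication] = field(default_factory=list)
    recommendations: List[Recommendation] = field(default_factory=list)
    reports: int = 0
    conferences: int = 0
    workshops: int = 0
    databases: int = 0
    patents: int = 0

    def peer_reviewed_count(self) -> int:
        """Simple tally that speaks to question 1 above."""
        return len(self.publications)
```

A program office could populate a structure like this from its publication and project records and derive the counts requested in Box 3-4; in the actual reviews, however, the evidence was submitted as narrative packages rather than as data files.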

E v a l u a t i o n F r a m e w o r k 59 Assessment of outputs.  The evaluation committees were asked to provide a quali- tative assessment, including discussion of relevance and utility. The outputs of a highly ranked program address needs in high-priority areas, contain new knowl- edge or technology that is effectively communicated, contribute to capacity build- ing inside and outside the program, and are relevant to pertinent populations. The committees were asked to provide a discussion that covered those aspects in sufficient detail to support the qualitative assessment of the outputs. 7d. Assess Outcomes (Boxes F and G in Figure 3-2) Intermediate outcomes.  Intermediate outcomes are external stakeholder actions to which the program contributed. They reflect the impact of program activities and may lead to the desired end outcome of improved worker safety and health. Intermediate outcomes in the NIOSH evaluations included the production of guidelines or regulations based wholly or partly on NIOSH research by those outside of NIOSH (products adopted as public policy or as policy or guidelines by private organizations or industry); contributions to training and education programs sponsored by other organizations; use of publications or other materi- als by workers, industry, and occupational safety and health professionals in the field; secondary dissemination of program activities and outputs through trade and mass media coverage; and citations of NIOSH research by industrial and academic scientists. Intermediate outcomes allow inference that a program’s outputs are associated with observed changes in the workplace. Thus, an intermediate outcome reflects an assessment of worth by NIOSH program stakeholders (e.g., managers in industrial firms) about NIOSH research or its products (e.g., NIOSH training workshops). Intermediate outcomes that are difficult to monitor, but may be valid indicators of relevance or utility, include self-report measures by users of NIOSH outputs. Self-reported indicators include the extent to which key intermediaries find value in NIOSH products or databases for the repackaging of health and safety informa- tion, the extent to which NIOSH recommendations are in place and attended to in workplaces, and employee or employer knowledge of and adherence to NIOSH- recommended practices. Questions considered in the evaluation of intermediate outcomes 1. Do program outputs result in or contribute to stakeholder training or edu- cation activities used in the workplace or in school or apprentice programs? If so, how?

60 E v a l u a t i n g O c c u p a t i o n a l H e a l t h a n d S a f e t y Research Programs 2. Do program activities and outputs result in regulations, public policy, or voluntary standards or guidelines that are transferred to or created by the workplace? 3. Have the program’s activities and outputs resulted in changes in employer or worker practices associated with the reduction of risk—for example, in the adoption of new feasible control or personal protective technologies or administrative control concepts? 4. Does the program contribute to changes in healthcare practices in- tended to improve recognition and management of occupational health conditions? 5. Do program activities and outputs result in research partnerships with stakeholders that lead to changes in the workplace? 6. To what extent do stakeholders find value in the NIOSH program’s prod- ucts, as shown by document requests, website hits, conference attendance, and similar evidence of stakeholder interest? 7. Does the program or a subprogram provide unique staff or laboratory capability that is a necessary national resource? If so, is it adequate, or does it need to be enhanced or reduced? 8. Have program activities and outputs resulted in interventions that pro- tect both genders and vulnerable workers or address the needs of small businesses? 9. To what extent did the program contribute to increased capacity at work- sites to identify or respond to safety and health threats? Assessment of intermediate outcomes.  The evaluation committees were asked to provide a qualitative assessment of product development, usefulness, and impact with consideration given to the relative value of intermediate outcomes (the frame- work committee recommended applying the well-accepted hierarchy-of-controls model). Discussions could include comments on how widely products have been used or programs implemented. The qualitative discussion should be specific about the various products developed by the program and the extent of their use by certain entities (e.g., industry, labor, government) for specific purposes. Discus- sions included whether the products have resulted in changes in the workplace or in the reduction of risk; the recognition accorded to the program or the facilities by its peers (e.g., recognition as a “center of excellence” by national and interna- tional communities) was also considered in the assessment. To be highly ranked, a program should have high performance in most of the relevant questions in this section. One aspect of the evaluation was considering whether the same changes in stakeholder activities and behaviors probably would have occurred without efforts by the NIOSH program.

E v a l u a t i o n F r a m e w o r k 61 End outcomes. Each evaluation committee was asked to assess to the greatest extent possible the program’s contribution to end outcomes—improvements in workplace safety and health (impact). For purposes of this evaluation, end out- comes are safety- and health-related changes that are a result of program activities, specifically, decreases in injuries, illnesses, and hazardous exposures or risks. Data on reductions in work-related injuries, illnesses, and hazardous exposures were available for only a few of the programs, and in some cases they were quantifi- able. When there was no direct evidence of improvements in health and safety, intermediate outcomes were used as proxies for end outcomes in assessing impact, and the evaluation committee qualified their findings. The evaluation committees described the realized or potential benefits of the NIOSH program. Assessing the causal relationship between NIOSH research and specific occu- pational safety and health outcomes was seen as a major challenge because NIOSH does not have direct responsibility or authority for implementing its research findings in the workplace. Furthermore, the benefits of NIOSH research program outputs can be realized, potential, or limited to the knowledge gained. Studies that conclude with negative results may nevertheless have incorporated excellent science, close off unproductive areas of research, and contribute to the knowledge base. The generation of important knowledge is a recognized form of outcome in the absence of measurable impacts. The impact of research, particularly applied research as conducted by NIOSH, depends on the existence of a “receptor” for research results, such as a regulatory agency, a professional organization, an employer, an employee organization, or in some cases, employees themselves. The evaluation committee was asked to consider issues related to the various stages that lead to outcomes, including the following: • Did research by the program identify a gap in protection or a means of reducing risk? • Did the program convey this information to potential users in a usable form? • Were research results (e.g., recommendations, technologies) applied? • Did the applied results lead to desired outcomes? Quantitative data were preferable to qualitative, but qualitative analysis was necessary in some cases. Sources of quantitative data relevant to NIOSH included the following: • Bureau of Labor Statistics (BLS) data on fatal occupational injuries (the Census of Fatal Occupational Injuries) and nonfatal occupational injuries and illnesses (the annual Survey of Occupational Injuries and Illnesses);

62 E v a l u a t i n g O c c u p a t i o n a l H e a l t h a n d S a f e t y Research Programs • NIOSH intramural surveillance systems, such as the National Electronic Injury Surveillance System, the coal worker X-ray surveillance program, and agricultural worker surveys conducted by NIOSH in collaboration with the USDA; • State-based surveillance systems, such as the NIOSH-funded Adult Blood Lead Epidemiology and Surveillance Program and the Sentinel Event No- tification System for Occupational Risks programs for asthma, pesticides, silicosis, noise-induced hearing loss, dermatitis, and burns; state-level vital statistics systems and other health data systems such as cancer registries and hospital discharge and emergency department datasets; • Selected state worker compensation programs; and • Exposure data collected in the OSHA Integrated Management Information System. The framework committee was unaware of mechanisms for the surveillance of many occupation-related chronic illnesses, such as cancers that arise from long exposure to chemicals and other stressors. The incidence and prevalence of many such outcomes may best be evaluated by investigator-initiated research. Research that leads to new effective surveillance concepts or programs warrants special recognition. The evaluation committees were asked to consider the strengths and weak- nesses of outcome data sources. Quantitative accident, injury, illness, and employ- ment data and databases are subject to error and bias and should be used by the evaluation committees only for drawing inferences after critical evaluation and ex- amination of available corroborating data. For example, occupational illnesses are widely recognized as being poorly documented in the BLS Survey of Occupational Injuries and Illnesses, which captures only incident cases among active workers. Health practitioners often have difficulty in diagnosing the component of illnesses that may be related to work; furthermore, few practitioners are adequately trained to make such an assessment. Many of these illnesses have long latencies and do not appear until years after individuals have left the employment in question. Ad- ditionally, surveillance programs may systematically undercount some categories of workers, such as contingent workers. In addition to measures of illness and injury, measures of exposure to chemical and physical agents and to safety and ergonomic hazards can be useful. Measures of exposure or probability of exposure can serve as an appropriate proxy for disease or injury when a well-described association exists between occupational exposure and health. In such instances, a decrease in exposure can be accepted as evidence that the end outcome of reduced illness or injury is being achieved. Such assumptions

E v a l u a t i o n F r a m e w o r k 63 are particularly necessary when the latent period between exposure and disease outcome makes effective evaluation of the relevant end outcome infeasible, as in the case of asbestos exposure and lung cancer. The reduction in the number of worksites that exceed an OSHA PEL or an American Conference of Governmental Industrial Hygienists threshold limit value is a quantitative measure of improvement of occupational health awareness and reduction of risk. In addition to exposure level, the number of people exposed and the distribution of exposure levels can provide quantitative data. Other evidence includes air monitoring data, reduction in requirements for use of personal protec- tive equipment, and reduction in ergonomic risks. Challenges posed by inadequate or inaccurate measurement systems should not drive programs out of difficult fields of study, and the evaluation commit- tees should be aware of such possibilities. In particular, contingent and informal working arrangements that place workers at greatest risk are also those on which surveillance information is almost totally lacking, so novel methods for measuring impact may be required. The commitments of industry, labor, and government to health and safety are critical external factors. Several measures of this commitment can be useful for the evaluation committee: monetary commitments, attitude, staffing, and surveys of relative importance. To the extent that resources allocated to safety and health are limiting factors, the evaluation committee also explicitly assessed the performance of the NIOSH program in the context of constraints. Questions considered in the evaluation of end outcomes 1. What are the amounts and qualities of relevant end outcome data, includ- ing data documenting injuries, illness, exposure, and productivity affected by health? 2. What are the temporal trends in the data? 3. Is there objective evidence of improvement in occupational safety or health? 4. To what degree is the NIOSH program or subprogram responsible for improvement in occupational safety or health? 5. How do findings compare with data from comparable groups in the United States or the corresponding populations in other countries? 6. What is the evidence that external factors have affected outcomes or out- come measures? 7. Has the program been responsible for changes in outcomes outside the United States?
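As a concrete illustration of the exceedance measure discussed above (the number or share of worksites exceeding a permissible exposure limit or threshold limit value), the short sketch below computes the fraction of sampled worksites above a limit in two surveillance periods and the change between them. The exposure values, the limit, and the function name are invented for illustration and do not come from any NIOSH dataset.

```python
# Hypothetical example: share of sampled worksites whose measured exposure
# exceeds a limit value, before and after an intervention period.

def exceedance_fraction(site_exposures, limit):
    """Fraction of worksites whose measured exposure exceeds the limit."""
    over = sum(1 for x in site_exposures if x > limit)
    return over / len(site_exposures)

# Invented air-sampling results (same units as the limit, e.g., mg/m^3).
baseline = [0.12, 0.30, 0.08, 0.45, 0.22, 0.51, 0.09, 0.33]
follow_up = [0.10, 0.21, 0.07, 0.28, 0.18, 0.26, 0.09, 0.19]
LIMIT = 0.25  # hypothetical exposure limit

before = exceedance_fraction(baseline, LIMIT)
after = exceedance_fraction(follow_up, LIMIT)
print(f"Exceedance: {before:.0%} -> {after:.0%} ({before - after:.0%} absolute reduction)")
```

In practice the committees would also consider the number of workers exposed and the full distribution of exposure levels, not just the exceedance share, and would weigh the quality of the underlying measurement system before treating such a decline as evidence of an end outcome.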

Assessment of end outcomes.  The evaluation committee was asked to provide a qualitative assessment of program impact, discussing the evidence of reductions in injuries and illnesses or their appropriate proxies.

Other outcomes.  Regarding the NIOSH study, there may be as-yet unappreciated health and safety impacts or other beneficial social, economic, and environmental outcomes as a result of NIOSH activities. NIOSH study results may be influential outside the United States, and there may be evidence of implementation of NIOSH recommendations and training programs abroad.

Questions considered in the evaluation of other outcomes
1. Is the program likely to produce a favorable change that has not yet occurred or not been appreciated?
2. Has the program been responsible for social, economic, security, or environmental outcomes?
3. Have program activities and outputs impacted occupational health and safety in other countries?

Assessment of other outcomes.  The evaluation committees were asked to consider other outcomes, including beneficial changes that are expected to occur; social, economic, security, or environmental outcomes; and the impact that the program has had on international occupational safety and health.

8.  Determine Scores for Relevance and Impact and Provide the Rationale

The evaluation committees assigned an integer score for the relevance of the research program to the improvement of occupational safety and health and another integer score for the impact of the program on such improvements. Using their expert judgment, the committees rated the relevance and impact of the overall research program by first summarizing their assessments of the major goals or subprograms and then appropriately weighting the goal areas (or subprograms) to determine the overall program ratings. (In light of substantial differences among the types of research programs, the framework committee chose not to construct a single algorithm to use in weighting the goals or subprograms.)

Relevance and impact scores were based on 5-point categorical scales established by the framework committee (described below), in which 1 is the lowest and 5 the highest rating. The framework committee made an effort to establish mutu-

E v a l u a t i o n F r a m e w o r k 65 ally exclusive rating categories in the scales. The evaluation committee determined how individual goal areas (or subprograms) influenced final scores. Final program ratings consisted of integer scores for relevance and impact and prose justification of the scores. Box 3-5 provides an overview of the issues to be considered in determining ratings of relevance and impact. Evaluation committees were asked to consider items 1 through 4 in Box 3-5 for the overall program and to assess the relevance of the program by reviewing the committee’s responses to the questions evaluat- ing the program’s challenges, goals and objectives, inputs, activities, and outputs (Section 7). The evaluation committee evaluated separately the extent to which the program’s research efforts are in high-priority areas and the extent to which the program is involved in transfer activities. Transfer activities occur in two contexts: (1) efforts by the NIOSH program to translate intellectual products into practice and (2) stakeholder efforts to integrate NIOSH results into the workplace. To assess impact, each evaluation committee first needed to consider the avail- able evidence of changes in work-related risks and the adverse effects and external factors related to the changes. The evaluation committee reviewed the responses to the questions on the reviews of outputs, intermediate outcomes, and end out- comes and systematically assessed the impact of the research program. Items 2 to 7 in Box 3-5 address these areas. The evaluation committee needed to judge, for BOX 3-5 Overview of the Issues Assess the following for each program: 1.  Relevance of current and recently completed research and transfer activities to objec- tive improvements in workplace safety and health. 2.  Contributions of the NIOSH program’s research and transfer activities to changes in work-related practices and reduction in workplace exposures, illnesses, or injuries. 3.  Contributions of the NIOSH program’s research and transfer activities to improve- ments in work-related practices. 4.  Contributions of the NIOSH program’s research to productivity, security, or environ- mental quality (beneficial side effects). 5.  Evidence of policy, technological, behavioral, and other changes that would reduce risk in the workplace (intermediate outcomes). 6.  Evidence of reduction in workplace exposures, illnesses, or injuries (end outcomes). 7.  Evidence of external factors that prevented translation of NIOSH research results into intermediate or end outcomes.

example, whether outcomes occurred earlier than they would have or are better than they would have been in the absence of the research program, or whether outcomes would have occurred were it not for external factors beyond the control of the NIOSH program.

Scoring of Relevance

As discussed in previous sections, numerous factors may be considered in assessing relevance. The scoring criteria focus on the evaluation committee’s assessment of whether the program appropriately set priorities among research needs as well as how engaged the program was in appropriate transfer activities to move research findings into the workplace. Since the evaluation of NIOSH programs included assessment of research activities and knowledge transfer activities, both are considered in the final relevance score. With respect to research, the key indicator is the extent to which the program’s research is in priority subject areas (high priority, priority, lesser priority, or not focused on priorities); with respect to transfer, the key indicator is the level of engagement in appropriate transfer activities (in this case, significantly engaged, engaged, or not engaged). This approach resulted in a complex scoring system that tries to address the best and worst cases and any variations in between. Box 3-6 lists the criteria for scoring the overall relevance of the NIOSH research program.

BOX 3-6
Scoring Criteria for Relevance

5 = Research is in high-priority subject areas and the NIOSH program is significantly engaged in appropriate transfer activities for completed research projects/reported research results.
4 = Research is in high-priority subject areas and the NIOSH program is engaged in appropriate transfer activities for completed research projects/reported research results; or research is in priority subject areas and the NIOSH program is significantly engaged in appropriate transfer activities for completed research projects/reported research results.
3 = Research is in high-priority subject areas, but the NIOSH program is not engaged in appropriate transfer activities; or research is in priority subject areas but the NIOSH program is not significantly engaged in appropriate transfer activities; or research focuses on lesser priorities but the NIOSH program is significantly engaged in appropriate transfer activities.
2 = Research program is focused on lesser priorities and the NIOSH program is not significantly engaged in appropriate transfer activities.
1 = Research program is not focused on priorities.

Table 3-1 provides guidance regarding how the committee may weight research priorities and transfer levels when determining relevance scores.

TABLE 3-1  Guidance for Weighting Research Priority and Transfer Activities

Research Priority           | Assessment of Engagement in Appropriate Transfer Activities | Applicable Score
----------------------------|--------------------------------------------------------------|-----------------
High priority               | Significantly engaged                                        | 5
High priority               | Engaged                                                      | 4
Priority                    | Significantly engaged                                        | 4
High priority               | Not engaged                                                  | 3
Priority                    | Engaged or not engaged                                       | 3
Lesser priority             | Significantly engaged                                        | 3
Lesser priority             | Engaged or not engaged                                       | 2
Not focused on priorities   | Any level of engagement                                      | 1

The evaluation committee considered both completed research and research that is in progress in its assessment of relevance. The committee kept in mind how well the program has considered the frequency and severity of the problems being addressed; whether appropriate attention has been directed to issues regarding both sexes, vulnerable populations, or hard-to-reach workplaces; and whether the different needs of large and small businesses have been considered. Each committee determined how to consider external factors in assigning program scores.
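Because Box 3-6 and Table 3-1 define an explicit mapping from the two key indicators (research priority and engagement in appropriate transfer activities) to a relevance score, that mapping can be written down compactly. The sketch below is purely illustrative of the Table 3-1 lookup for a single goal area or subprogram; the enumerated strings and the function name are our own assumptions, and the overall program score still rests on the committee’s judgment-based weighting of goal areas, which the framework committee deliberately did not reduce to a single algorithm.

```python
# Illustrative sketch of the Table 3-1 lookup; the enumerations and function
# name are hypothetical and not part of the NIOSH framework itself.
RELEVANCE_SCORES = {
    ("high priority", "significantly engaged"): 5,
    ("high priority", "engaged"): 4,
    ("priority", "significantly engaged"): 4,
    ("high priority", "not engaged"): 3,
    ("priority", "engaged"): 3,            # "Engaged or not engaged" row
    ("priority", "not engaged"): 3,
    ("lesser priority", "significantly engaged"): 3,
    ("lesser priority", "engaged"): 2,      # "Engaged or not engaged" row
    ("lesser priority", "not engaged"): 2,
}


def relevance_score(research_priority: str, transfer_engagement: str) -> int:
    """Return the Table 3-1 relevance score for one goal area or subprogram."""
    if research_priority == "not focused on priorities":
        return 1  # any level of engagement
    return RELEVANCE_SCORES[(research_priority, transfer_engagement)]


# Example: a subprogram doing priority research with significant transfer work.
assert relevance_score("priority", "significantly engaged") == 4
```

The impact score, discussed in the next subsection, was not reduced to a comparable lookup; its criteria instead call for qualitative judgment about contributions to end outcomes or well-accepted intermediate outcomes.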

Scoring of Impact

Box 3-7 provides the criteria established for the rating of impact. The evaluation committee primarily considered completed research outputs. In assigning a score for impact, it is important to recognize that a “major contribution” (required for a score of 5) does not imply that the NIOSH program was solely responsible for observed improvements in worker safety and health. Many factors may be required to effect improvements. The committee could say that the NIOSH program made “major contributions” if the improvements would not have occurred when they did without the program’s efforts.

BOX 3-7
Scoring Criteria for Impact

5 = Research program has made major contribution(s) to worker health and safety on the basis of end outcomes or well-accepted intermediate outcomes.
4 = Research program has made some contributions to end outcomes or well-accepted intermediate outcomes.
3 = Research program activities are ongoing and outputs are produced that are likely to result in improvements in worker health and safety. Well-accepted outcomes have not been recorded.
2 = Research program activities are ongoing and outputs are produced that may result in new knowledge or technology, but only limited application is expected. Well-accepted outcomes have not been recorded.
1 = Research activities and outputs do not result in or are not likely to have any application.
NA = Impact cannot be assessed; program is not mature enough.

The framework committee had some concern that the imposed scoring criteria for impact might be considered a promotion of the conventional occupational-health research paradigm that focuses on health-effects and technology research without much emphasis on the socioeconomic, policy, surveillance, and diffusion research (as opposed to diffusion activities) needed to effect change. The evaluation committees were asked to remember that not all intermediate outcomes occur in the workplace. Important outcomes that NIOSH can affect also occur much farther out on the causal chain. NIOSH, for example, has an important role in generating knowledge that may contribute to changing norms in the insurance industry, in healthcare practice, in public health practice, and in the community at large. The evaluation committees considered whether some of those issues need to be addressed and considered as external factors that facilitate or limit application of more traditional research findings. Given the rapidly changing nature of work and the workforce and the intractable problems in manufacturing, mining, and other fields, the evaluation committees were encouraged to think beyond the conventional paradigm.

9.  Assess the Program’s Process for Targeting Priority Research Needs and Provide the Committee’s Assessment of Emerging Issues

Among the most challenging aspects of research in illness and injury prevention are the identification of new or emerging needs and trends and the formula-

E v a l u a t i o n F r a m e w o r k 69 tion of a research response that uses scarce resources to best effect in anticipation of them. The second charge to the evaluation committee was assessment of the re- search program’s effectiveness in targeting new research and identifying emerging issues in occupational safety and health most relevant to future improvements in workplace protection. The evaluation committee was asked to provide a quali- tative narrative assessment of the program’s process for determining priorities for research and emerging workplace issues. The committee also independently identified emerging workplace issues that the NIOSH program should be pre- pared to address. The evaluation committees reviewed the procedures that the NIOSH program has in place to identify needed research relevant to the NIOSH mission and re- viewed the success that the NIOSH program has had in identifying and addressing research related to emerging issues. For example, the program should be involved in examining leading indicators from other federal agencies (e.g., EPA, Department of Labor, National Institute of Standards and Technology, NIH, DoD, and Depart- ment of Commerce) that track or provide data on new technologies, new products, new processes, and disease or injury trends. The NIOSH HHE program offers a potential source for the identification of emerging research needs. The evaluation committee needed to determine whether the program under review appropriately considered pertinent HHE investigation findings. Additional emerging issues may have been revealed through consideration of NIOSH and NIOSH-funded FACE reports, AOEC reports, U.S. Chemical Safety Board investigations, and the Sentinel Event Notification System for Occupational Risks and other state-based surveillance programs. Appropriate federal advisory committees and other stakeholder groups were also consulted to provide qualita- tive information. The evaluation committee systematically assessed how the research program targets new research by evaluating each goal area or subprogram for the items listed in Box 3-8. Questions Considered in Identifying Emerging Issues 1. What information does the NIOSH program review to identify emerging research needs? • What is the process for review? • How often does the process take place? • How are NIOSH staff scientists and leadership engaged? • What is the process for moving from ideas to formal planning and resource allocation?

70 E v a l u a t i n g O c c u p a t i o n a l H e a l t h a n d S a f e t y Research Programs BOX 3-8 Targeting of New Research and Identification of Emerging Issues Assess the following: 1. Past and present effectiveness in targeting most relevant research needs. 2. Effectiveness in targeting research in fields most relevant to future improvements in occupational safety and health. 2. How are external stakeholders involved? • What advisory or stakeholder groups are asked to identify emerging research targets? • How often are such groups consulted, and how are suggestions fol- lowed up? 3. What new research targets have been identified for future development in the program under evaluation? • How were they identified? • Were lessons learned that could help to identify other emerging issues? • Does the evaluation committee agree with the issues identified and selected as important and with the NIOSH program’s response, or were important issues overlooked? • Is there evidence of unwise expenditure of resources on unimportant issues? The evaluation committee members used their expert judgment both to evalu- ate the emerging research targets identified by the NIOSH program and to provide recommendations on improvements to the program or additional research that NIOSH had not yet identified. Recommendations included a brief statement of their rationale. 10.  Prepare Report by Using the Template Provided as a Guide Consistency and comparability among evaluation committee report formats was desirable, but the framework committee recognized that each NIOSH research program is different and that each evaluation committee was independent. The outline provided in Box 3-9 flows from the framework committee’s review of the generalized NIOSH logic model (Figure 3-1) and the overviews of the evaluation process (Box 3-1, Figure 3-2). The evaluation committees were free to use or adapt

E v a l u a t i o n F r a m e w o r k 71 BOX 3-9 Suggested Outline for Evaluation Committee Reports I. Introduction This section should be a brief descriptive summary of the history of the program being evaluated with respect to pre-NORA, NORA 1, and current and future plans of the research program presented by the NIOSH program. It should present the context for the research on safety and health; goals, objectives, and resources; groupings of goal areas or subprograms; and any other important pertinent information (a list of the NIOSH materials reviewed should be provided in Appendix C). II. Evaluation of the Program (Charge 1) A. Evaluation summary: should include a brief summary of the evaluation with respect to impact and relevance, scores for impact and relevance, and summary statements B. Strategic goals and objectives: should describe assessment of the extent to which program strategic plans reflect program relevance C. Assessment of inputs: should describe adequacy of inputs to achieve goals D. Assessment of activities: should describe assessment of the relevance of the activities E. Assessment of research program outputs: should describe assessment of rel- evance and potential usefulness of the research program’s outputs F. Assessment of intermediate outcomes and causal impact: should describe as- sessment of the intermediate outcomes and the program’s contribution to them; should include the likely impacts and recent outcomes in the assessment G. Assessment of end outcomes: should describe the end outcomes related to health and safety and provide an assessment of the type and degree of attribution to the NIOSH program H. Assessment of other outcomes: should discuss health and safety impacts that are expected to occur; beneficial social, economic, and environmental outcomes; and international dimensions and outcomes I. Summary of ratings and rationale III. NIOSH Targeting of New Research and Identification of Emerging Issues (Charge 2) The evaluation committee should assess the progress that the NIOSH program has made in targeting new research in occupational safety and health. The evaluation committee should assess whether the NIOSH program has identified important emerging issues that appear especially important in terms of relevance to the mission of NIOSH. The evaluation commit- tee should respond to NIOSH’s perspective and add its own recommendations. IV. Recommendations for Program Improvement On the basis of the review and evaluation of the program, the evaluation committee may provide recommendations for improving the relevance of the NIOSH research program to (continued)

BOX 3-9 Continued

safety and health conditions in the workplace and the impact of the research program on safety and health in the workplace.

Appendix A  Framework Document
Appendix B  Methods and Information Gathering
Appendix C  List of NIOSH and Related Materials Collected in the Process of the Evaluation

this outline as necessary when organizing their final reports. The framework committee encouraged each evaluation committee to look at prior evaluation committee reports for organizational ideas.

REFERENCES

IOM and NRC (Institute of Medicine and National Research Council). 2006. Hearing loss research at NIOSH. Committee to Review the NIOSH Hearing Loss Research Program. Rpt. No. 1, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

IOM and NRC. 2008. The personal protective technology program at NIOSH. Committee to Review the NIOSH Personal Protective Technology Program. Rpt. No. 5, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

IOM and NRC. 2009. Traumatic injury research at NIOSH. Committee to Review the NIOSH Traumatic Injury Research Program. Rpt. No. 6, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

NRC and IOM. 2007. Mining safety and health research at NIOSH. Committee to Review the NIOSH Mining Safety and Health Research Program. Rpt. No. 2, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

NRC and IOM. 2008a. Agriculture, forestry, and fishing research at NIOSH. Committee to Review the NIOSH Agriculture, Forestry, and Fishing Research Program. Rpt. No. 3, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

NRC and IOM. 2008b. Respiratory diseases research at NIOSH. Committee to Review the NIOSH Respiratory Diseases Research Program. Rpt. No. 4, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

NRC and IOM. 2009a. The health hazard evaluation program at NIOSH. Committee to Review the NIOSH Health Hazard Evaluation Program. Rpt. No. 7, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

NRC and IOM. 2009b. Construction research at NIOSH. Committee to Review the NIOSH Construction Research Program. Rpt. No. 8, Reviews of Research Programs of the National Institute for Occupational Safety and Health. Washington, DC: The National Academies Press.

Rutstein, D. D., R. J. Mullan, T. M. Frazier, W. E. Halperin, J. M. Melius, and J. P. Sestito. 1983. Sentinel health events (occupational): A basis for physician recognition and public health surveillance. American Journal of Public Health 73(9):1054–1062.

Williams, V. L., E. Eiseman, E. Landree, and D. M. Adamson. 2009. Demonstrating and communicating research impact: Preparing NIOSH programs for external review. Santa Monica, CA: RAND.
