4

Measurement, Evaluation, and Reporting for Improved Readiness and Resilience

In God we trust, all others bring data.
 —The Elements of Statistical Learning (Hastie et al., 2009)

Because of its critical mission, the Department of Homeland Security (DHS) requires a committed workforce that is ready and resilient (healthy, safe, engaged, and productive). However, DHS has a paucity of data and information available to inform the Department on issues related to health, safety, productivity, and quality of life, all of which lead to workforce readiness and resilience (WRR). To be successful in achieving its mission, DHS needs to establish a robust and valid measurement and evaluation infrastructure for diagnosing individual and organizational problems, tracking and monitoring program impacts, and evaluating overall success. DHS needs to abide by the maxim that data drive change and that what is measured is managed.

A state-of-the-art measurement and evaluation system is integral to the development and monitoring of any workplace program (IOM, 2005; Lester, 2013; NIOSH, 2008). That system will coordinate data collection, aggregation, analysis, and reporting. The data emerging from the system will help DHS to understand its needs, choose proper interventions, and examine whether the interventions produce the desired outcomes, including whether and where improvements can be made. DHS needs to support the development and maintenance of high-quality data aggregation and reporting systems and the content expertise necessary to extract intelligence from stand-alone and integrated databases. Effective measurement of programs will produce important information on whether the programs are yielding improvements in health, morale, safety, and resilience. This feedback mechanism will, in turn, allow DHS to calculate the potential cost savings for the department in the form of improved worker productivity, reduced turnover, and possibly reduced health care use (Baicker et al., 2010; CDC, 2011).

This chapter reviews gaps in measurement, evaluation, and reporting related to workforce readiness and resilience in DHS and provides recommendations for improvement. This is a critical centerpiece of any program and will be crucial in helping DHS to determine whether it is meeting the goals laid out in its WRR strategic plan.

ORGANIZING FRAMEWORK

The measurement and evaluation strategy that DHS puts into place needs to be organized into three broad categories: structure, process, and outcomes (Goetzel and Ozminkowski, 2001). Working backward, DHS needs first to consider the intended outcomes of its human capital programs and how the outcomes can be measured. An effective measurement and evaluation strategy needs to consider the intervention programs' framework or structure and how they are delivered to the target population. If, for example, increased workforce readiness and resilience are the goals, DHS needs to ask whether an intervention is evidence-based, adequately resourced, and supported by energetic leadership—that is, structured appropriately to deliver the "dose" necessary to achieve the highlighted outcome (Goetzel and Ozminkowski, 2001). Further, DHS needs to measure whether the program is well communicated, achieves high engagement rates, and is viewed favorably by its target audience—the various process measures necessary for eventual program success.
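Where intended outcomes include the cost savings noted above, return on investment is the usual summary statistic. A minimal sketch, with invented dollar figures rather than DHS data:

```python
def roi(program_cost: float, savings: float) -> float:
    """Return on investment: net savings per dollar spent on the program."""
    return (savings - program_cost) / program_cost

# Illustrative figures only (not DHS data): a $200,000 program credited
# with $320,000 in savings from productivity, turnover, and health care use.
print(roi(200_000, 320_000))  # 0.6 -> 60 cents returned per dollar invested
```

A ratio above zero indicates the program returned more than it cost; the hard part in practice is attributing the savings, not the arithmetic.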
Specifically, program structure needs to be assessed in terms of the basic architecture or blueprint of each DHS program—its "inputs." The evaluation of structure needs to focus on whether the program's critical components are in place according to plan; that is, an "audit" of program design compliance is needed (Goetzel and Ozminkowski, 2001). Questions asked in a structure assessment would include (1) What is the intervention?, (2) What is the expected outcome of the intervention?, (3) How is the program delivered to participants?, (4) What is the intensity of the program?, (5) Are core evidence-based program elements included in the design?, and (6) Are key measures in place to evaluate the intervention? (Goetzel and Ozminkowski, 2001). Those questions and others relevant to the specific program in question need to be asked and answered routinely. The answers can be delivered in the form of qualitative assessments complemented by administrative reports that are produced regularly to keep program managers current on the operation of the enterprise.

Process evaluation involves asking how well the program is being implemented—whether the execution of the program is progressing according to plan and whether operations and delivery systems are handled smoothly. Questions addressed by process evaluations include (1) How many people participate?, (2) How many are fully engaged in behavior change efforts?, (3) How satisfied are they with how the program is run?, (4) How do they feel about the people in charge?, (5) Which programs are best attended?, and (6) How many individuals successfully complete the programs? (Goetzel and Ozminkowski, 2001).

Outcome evaluation rounds out the measurement and evaluation paradigm. It is focused on the extent to which program objectives are achieved within a given period (Goetzel and Ozminkowski, 2001). Essentially, the purpose of an outcomes study is to determine whether the program's goals and objectives have been satisfied according to plan. Program outcomes generally fall into three broad categories: (1) improvements in the health and well-being of workers, (2) cost savings, and (3) performance enhancements. Therefore, key outcomes for DHS would include improved quality of life, cost-effectiveness and cost-benefit of interventions (return on investment, or ROI), a more ready and resilient workforce, lower turnover, and higher morale and engagement.

Figure 4-1 presents a simple model of a workplace program measurement schema.
The model shows various components of structure, process, and outcome measures that are important to capture in evaluating a comprehensive workplace intervention.

FIGURE 4-1 Model of a workplace program measurement schema. SOURCE: Goetzel, 2013.

MEASUREMENT, EVALUATION, AND REPORTING IN THE DEPARTMENT OF HOMELAND SECURITY

DHS offers many program activities that are loosely tied to one another but no standardized mechanism for measuring and evaluating whether those programs work either individually or in combination with one another. There has been a tendency for DHS and its components to respond impulsively to perceived workforce health or resilience issues. New programs are often created without regard for measurement on the front or back end. Thus, there is no diagnostic analysis to justify specific interventions and no consistent method for evaluating program effects.

For example, DHSTogether was created in 2009 in response to low employee morale, as reported by the Federal Employee Viewpoint Survey (FEVS) (Green and Perkins, 2012; IOM, 2012). DHS and its components have taken steps to understand the root causes of low morale, but the Government Accountability Office (GAO) (2012a, p. 19) found that "DHS and the selected components conducted limited analysis in several areas that is not consistent with OPM [Office of Personnel Management] and Partnership guidance that lays out useful factors for evaluating root causes of morale problems through FEVS analysis."[1] The component agencies also developed action plans to address morale, which are reviewed by the Office of the Chief Human Capital Officer (OCHCO).

[1] The committee and the Government Accountability Office are referring to root cause analysis at the organizational level because root cause analysis at the individual level would be time and resource intensive to identify and address.

DHS was not able to share the action plans, so the committee cannot comment on whether the plans are actionable and measurable. However, on reviewing the action plans, GAO (2012a) found that

    despite having broad performance metrics in place to track and assess DHS employee morale on an agency-wide level, DHS does not have specific metrics within the action plans that are consistently clear and measurable. As a result, DHS's ability to assess its efforts to address employee morale problems and determine if changes should be made to ensure progress toward achieving its goals is limited.

Another GAO (2012b) report reviewed the DHS Workforce Strategy (DHS, 2011) and found that "although DHS began taking positive steps for managing strategic workforce planning in 2011, DHS officials have not yet taken steps to implement an effective oversight approach for monitoring and evaluating components' progress in implementing strategic workforce planning." The analysis identified performance measures that reported on only 2 of the 15 elements in DHS's strategic workforce planning model (GAO, 2012b). OCHCO, which oversees the data collection relevant to the workforce strategy, informed the committee that the office has faced challenges in collecting the data outlined in the performance measures and therefore has incomplete data at this time.

The committee frequently heard in discussions with and presentations from DHS staff that new programs are often implemented without built-in metrics for accountability and evaluation (Green, 2013; Green and Perkins, 2012). No agency or organization, including DHS, can afford to create programs without regard for baseline measurement or built-in metrics for program evaluation. In the current fiscal climate, programs need to be created to address critical organizational needs and evaluated to ensure that they are meeting those needs. The measurement strategy that DHS uses needs to provide visibility at the highest level of the organization through data-driven reports that draw a straight line to program resource allocations. In short, the metrics that DHS chooses must be linked to its primary mission—protecting the homeland.

During the committee's information-gathering process, OCHCO was asked to present available data on the DHS workforce. After a panel discussion with representatives of several offices in OCHCO,[2] it became clear that DHS does not have a good understanding of what data are available, how the existing data are currently being used around the department, how the data could potentially be used, and how many of the data that are collected are not readily available to them because of rules and silos.

DHS uses few tools and data points to assess the readiness and resilience of its workforce. The department relies heavily on the FEVS to identify workforce health and resilience issues that need to be addressed. The FEVS can be a powerful management tool to identify strengths and weaknesses, but it is a blunt instrument—it informs leadership about what employees do and do not like but not about why or what the root causes of the problems might be. The FEVS also does not provide information on other important measures crucial to the success of DHS operations, such as employee health and well-being (physical, mental, social, financial, intellectual, and spiritual), health care use, absenteeism, presenteeism, disability, safety, or retention. DHS does collect data on some of those areas, including absenteeism, disability, occupational injury, and retention (see Box 4-1 for a list of DHS data sources available to OCHCO). However, the data come from a variety of sources (headquarters offices, directorates, and operational components) and are not actively integrated or analyzed for the purpose of measuring workforce readiness or resilience. That makes a current-state analysis difficult.
In addition, DHS informed the committee that it does not have, or has not pursued the ability to have, access to data related to

- Health risk assessments (HRAs)[3]
- Medical costs (health care claims information from OPM is not available to federal agencies)
- Disease incidence and prevalence
- Health care utilization
- Biometric or laboratory values
- Performance and presenteeism
- Psychosocial risk factors, such as stress and depression
- Long- and short-term disability
- Worker compensation and safety (minimal data available to DHS, but the data are collected by the federal government) (Green, 2013)

[2] The panel in the February 2013 committee meeting included the human resources specialist for workforce engagement, the department safety and health manager, the program manager of human capital policy and programs, the program manager and policy advisor for the workers compensation program, the personnel research psychologist for human capital policy and programs, and the human capital business systems reporting team lead for Human Resources Information Technology (HRIT) operations, all of whom are housed in OCHCO.
[3] The US Coast Guard collects such information on active duty personnel.

BOX 4-1
Summary of DHS-Wide Data Sources Available to the Office of the Chief Human Capital Officer (OCHCO)[a]

- Workforce demographics on age, gender, geographic location, time with agency, time in service, veteran status, job type, and pay level (from the National Finance Center [NFC])
- Turnover from NFC, including intent to leave (from FEVS); reasons for leaving (exit survey—response rates vary and participation is voluntary; the Transportation Security Administration and US Secret Service administer their own)
- Sick and annual leave, although detailed information (e.g., types of sick leave) needs to be requested from component agencies and may be subject to privacy considerations
- Federal Employee Viewpoint Survey results (from OPM)
- Equal Employment Opportunity complaints (icomplaints, 462 report[s] to the Equal Employment Opportunity Commission)
- Workers' compensation from the Department of Labor (DOL) Office of Workers' Compensation, cross-referenced with DHS component systems
- Accidents and injuries, including line-of-duty deaths (from DOL and component systems)
- Health and safety program quality and implementation (onsite assessments)

OCHCO relies on data calls to DHS component agencies for

- employee assistance program reports
- suicide numbers
- employee relations cases
- cross-referencing for other measures

[a] As presented to the committee.
SOURCE: Green, 2013.

The Office of Health Affairs (OHA), which has responsibility for overseeing the DHSTogether program, does not conduct systematic measurement activities related to readiness and resilience. A few of the pilot programs have some basic evaluation built in (such as the new Strong Bonds program), but it focuses on employee satisfaction. The US Coast Guard (USCG) has been funded by DHSTogether to run a pilot program that will use the Navy Stress Assessment tool,[4] which aims to identify rapidly the stress levels of command personnel before stress affects job performance (Green and Perkins, 2012). It is a preventive workforce health tool and is administered to help in early detection of employees' stress and stress-related problems. Pilot program managers plan to work with the tool's Navy creators to modify the tool to meet USCG's needs (Green and Perkins, 2012). If the program is successful, OHA plans to test the tool in law enforcement component agencies (Green and Perkins, 2012). Although that interagency agreement has been in the works for almost a year, little progress in modifying the tool has been made.

[4] The Navy Stress Assessment tool is a joint effort of the Navy's Operational Stress Control team; Navy Personnel Research, Studies and Technology; and the Defense Equal Opportunity Management Institute.

Component agencies independently conduct some data-gathering activities—such as focus groups, town halls, and exit surveys—and compile suicide rates, but the efforts vary widely among components, and each component has a different level of interest in and capability for data analysis.

Throughout the agency, there are probably some best practices for data gathering and analysis. For example, during its information gathering the committee heard from David Tumblin, the Director of Human Capital Strategy and Technology in the Immigration and Customs Enforcement (ICE) Human Capital Office, about an evaluation of the 2012 FEVS data to identify the top five drivers of overall satisfaction and dissatisfaction in its component (see Box 4-2). The work that his group is undertaking is impressive and is a good example of processes and practices that other component agencies could learn from.

BOX 4-2
Use of Federal Employee Viewpoint Survey Data to Identify Top Drivers of Workforce Satisfaction in Immigration and Customs Enforcement

The Office of Human Capital, Strategy and Technology (HCT) in Immigration and Customs Enforcement (ICE) now reviews and deciphers ICE's responses to the FEVS. Six in 10 ICE employees completed the survey, a total of 11,400 individuals, which enabled ICE to analyze the data in a statistically valid manner. Because the Office of Personnel Management did not release the raw FEVS data, ICE used statistical methods to derive the top five drivers of overall satisfaction and dissatisfaction for the three organizations that comprise 90 percent of ICE's workforce. From these top drivers, and from information gathered at agencywide town hall meetings, HCT formulated questions around the drivers. Subsequently, HCT held approximately six focus groups in each of the three large organizations within ICE, including departments that had high satisfaction ratings and departments that had low satisfaction ratings. Following these focus groups, HCT administered pulse surveys to measure whether employee feedback and data collection are making an impact on overall employee satisfaction.

SOURCE: Tumblin, 2013.

How the data available to DHS inform workforce readiness and resilience is unclear. DHS and its component agencies have created many programs focused on improving employee resilience and morale with nothing to connect them, no sense of what is working, and limited rationale for the existence of such programs. Altering root causes to create a ready and resilient workforce is challenging, because the factors are often woven into the fabric of the institution and causal pathways can be complex. Regardless, this type of assessment is essential for ensuring that the programs implemented can make a difference. If evaluation is not built into initial program development, there can be no continuous improvement.

DHS has not identified commonalities among the various staff levels, job types, or locations in DHS that can be measured across components. Without a common set of measures, the agency cannot compare these groups.
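The committee was not told which statistical methods ICE used for its driver analysis (Box 4-2). One common approach, sketched below with invented survey responses, is to rank items by the strength of their correlation with an overall-satisfaction item; the item labels, scores, and choice of Pearson correlation are all illustrative assumptions, not ICE's disclosed procedure.

```python
from math import sqrt
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length lists of scores."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def top_drivers(item_scores, overall, n=5):
    """Rank survey items by |correlation| with overall satisfaction.

    item_scores maps an item label to per-respondent scores; overall is
    the matching list of overall-satisfaction scores.
    """
    ranked = sorted(item_scores.items(),
                    key=lambda kv: abs(pearson(kv[1], overall)),
                    reverse=True)
    return [label for label, _ in ranked[:n]]

# Toy data: six respondents on a 1-5 scale; labels are hypothetical.
overall = [2, 3, 4, 4, 5, 5]
items = {
    "supervisor support":  [2, 3, 4, 5, 5, 5],
    "physical conditions": [3, 3, 3, 4, 3, 4],
    "pay satisfaction":    [4, 2, 5, 1, 3, 2],
}
print(top_drivers(items, overall, n=2))
# ['supervisor support', 'physical conditions']
```

With respondent-level data of this shape, the same routine scales directly to the 98 FEVS items and to separate runs per component or worksite.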
In addition, because of the lack of a vision statement and well-defined objectives, it is not clear what the goal of the DHS resilience program should be; without that, there is no measurement and thus no clear path.

MEASUREMENT AND EVALUATION—ESSENTIAL ELEMENTS OF SUCCESSFUL WORKPLACE PROGRAMS

Measurement and evaluation need to be "built in" to any organizational efforts that aim to improve the health, well-being, resilience, morale, and safety of the workforce. The National Institute for Occupational Safety and Health (NIOSH) Essential Elements of Effective Workplace Programs (2008) framework underscores the need for evaluation of comprehensive worksite programs, practices, and policies and the need to "hard code" measurement into environmental and individual programs aimed at preventing disease and promoting health and safety. The NIOSH framework identifies four actions related to measurement and evaluation: (1) find and use the right tools, (2) adjust the program as needed, (3) measure and analyze, and (4) learn from experience (see Box 4-3 for a description; NIOSH, 2008). Those actions imply the need to collect baseline data before a program is conceived, to align measurement systems with program objectives, to leverage integrated data to gain a full picture of program effects, to report findings in ways that are easily understood and actionable, and to make changes in response to the data analyses.

BOX 4-3
NIOSH Total Worker Health, Measurement, and Evaluation Elements

Find and use the right tools. Measure risk from the work environment and baseline health in order to track progress. For example, a Health Risk Appraisal instrument that assesses both individual and work-environment health risk factors can help establish baseline workforce health information, direct environmental and individual interventions, and measure progress over time. Optimal assessment of a program's effectiveness is achieved through the use of relevant, validated measurement instruments.

Adjust the program as needed. Successful programs reflect an understanding that the interrelationships between work and health are complex. New workplace programs and policies modify complex systems. Uncertainty is inevitable; consequences of change may be unforeseen. Interventions in one part of a complex system are likely to have predictable and unpredictable effects elsewhere. Programs must be evaluated to detect unanticipated effects and adjusted based on analysis of experience.

Measure and analyze. Develop objectives and a selective menu of relevant measurements, recognizing that the total value of a program, particularly one designed to abate chronic diseases, may not be determinable in the short run. Integrate data systems across programs and among vendors. Integrated systems simplify the evaluation system and enable both tracking of results and continual program improvement.

Learn from experience. Adjust or modify programs based on established milestones and on results you have measured and analyzed.

SOURCE: NIOSH, 2008.

The key factors are to determine the data needed for measurement and evaluation, where the data reside, whether new data need to be collected to address information gaps, how the data should be managed, and the frequency of reporting to decision makers.

A structure self-assessment tool adapted from the NIOSH Essential Elements guidelines can be found in Appendix E. The tool can be used by WRR and component agencies to identify needed improvements to include in strategic plans. Ideally, this type of evaluation would be conducted through group discussions, which the committee believes would encourage more realistic evaluations than having a single individual complete the survey. Such groups could consist of representatives with responsibility for relevant subjects (such as health, wellness, and human capital) who are assembled by the administrator of WRR and component agency heads.

Basic Questions of Evaluation

A distinction can be made between measurement and evaluation. Measurement implies tracking of data that are essential for running a program effectively, usually derived from administrative systems and presented in the form of "dashboard" reports. Evaluation involves periodically stepping back and asking some fundamental questions: Is the program working, and is it worth the time, effort, and expense? Is DHS getting sufficient value from the program? If yes, maintain or even enhance the program. If no, fine-tune or eliminate it.

In developing an evaluation framework, DHS needs to develop a clear and cogent research agenda that addresses specific questions one at a time. DHS needs to answer the broad question, Is the program successful in meeting its intended goals? That broad question can be broken down into components that can be tested empirically, that is, verifiable by observation or experiment.
For example, a more specific question can be phrased as, Was the readiness and resilience program successful in lowering absenteeism rates of employees at intervention worksites? To help DHS approach the task of measurement in simple language, the following nine questions are offered as ones that could be asked and answered before an evaluation project is undertaken (Goetzel and Ozminkowski, 2001):
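A question of that kind can be tested directly once absence data are in hand. The sketch below, using invented per-employee sick-leave counts and a simple two-group design, compares intervention and comparison worksites by the difference in means and a standardized effect size (Cohen's d); a real evaluation would also adjust for baseline differences and test statistical significance.

```python
from math import sqrt
from statistics import mean, stdev

def effect_of_program(intervention_days, comparison_days):
    """Difference in mean absence days and Cohen's d (pooled SD)."""
    diff = mean(intervention_days) - mean(comparison_days)
    n1, n2 = len(intervention_days), len(comparison_days)
    s1, s2 = stdev(intervention_days), stdev(comparison_days)
    pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return diff, diff / pooled

# Invented annual sick-leave counts, eight employees per group.
treated = [4, 5, 3, 6, 4, 5, 3, 4]   # intervention worksites
control = [6, 7, 5, 8, 6, 7, 6, 5]   # comparison worksites
diff, d = effect_of_program(treated, control)
print(round(diff, 2), round(d, 2))   # -2.0 -1.93
```

A negative difference means fewer absence days at the intervention worksites; Cohen's d expresses that gap in standard-deviation units so results can be compared across programs.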

[…]
148 A READY AND RESILIENT WORKFORCE FOR DHS Organizational measures can be constructed by using climate scales that are widely available and tested, including8 the Health Enhancement Research Organization (HERO) Employee Health Management Best Practice Scorecard (HERO Scorecard), the CDC Worksite Health ScoreCard, the National Business Group on Health WISCORE (Wellness Impact Scorecard), OPM WellCheck, and the Samueli Institute Optimal Healthy Environments in the Workplace Assessment (CDC, 2012; Goetzel and Terry, 2013; HERO, 2012; NBGH, 2013). The Campbell Occupa- tional Survey (COS) could also be useful. The COS addresses the essential factors that define employees’ perceptions of their work and workplace; it measures factors that are valid throughout the hierarchy of positions and across occupational groups. Another useful example is a tool developed for Johnson & Johnson, the Health and Performance Index (HaPI), which taps into individual and organizational health issues. The HaPI combines measures of health, productivity, well-being, leadership effectiveness, and culture of health and program sustainability (it also includes two questions from the Connor-Davidson Resilience Scale). It can provide a scorecard and be used to benchmark against other companies (Isaac, 2013). (See Box 4-6 for excerpts from HaPI.) DHS could use one of those tools and add or subtract specific measures, depending on the goals adopted for WRR or individual needs of components. For example, DHS could add questions from individual or organizational resilience assessment tools (for example, Connor and Davidson, 2003, and Wagnild, 2009).9 The National Prevention, Health Promotion, and Public Health Council (2013) annual report states that OHA, in conjunction with OCHCO, will pilot an HRA tool for OHA employees (about 90 full-time employees); OHA could consider using one of the above tools in the pilot. 
Organizational resilience can be thought of as resulting from three overarching factors: growth, competence, and efficacy (Vogus and Sutcliffe, 2007). The committee reviewed the questions in the FEVS and grouped the ones most relevant to WRR into those three categories (see Box 4-7). Because the FEVS is the main source of data available to all component agencies, in the first few years of the WRR strategic plan 8 None of these is a “resilience” scales or scorecard, but they do address health, wellness, and organizational factors, which, as discussed in Chapter 2, are protective factors that fall under the umbrella of readiness and resilience. 9 These resilience scales are for use at the individual level. There are no widely accepted resilience-specific assessment tools for use at the organizational level.

OCR for page 129
MEASUREMENT, EVALUATION, AND REPORTING 149 BOX 4-6 Excerpts from the Johnson & Johnson Health and Performance Index (Excerpts from Section to Be Filled Out by Employees) About Your Health  Would you say that in general your health is. . .?  In a typical week, what is the average amount of time you spend doing moderate-intensity physical activity?  How many hours of sleep do you usually get in a 24-hour period? About Your Job  My organization cares about my health and well-being.  My organization’s leaders view the level of employee health and well- being as one important indicator of the worksite’s success.  My organization offers incentives for employees to be healthy. About Your Work  During the past 4 weeks, how much did your health problems affect your productivity while you were working?  At my work, I feel bursting with energy (select on a scale of 0–6). Excerpts from section to be filled out by the organization (completed by one person in charge of health and human performance at the organization— individuals are encouraged to gather information from others in the organiza- tion knowledgeable about each section that follows): Program Measurement and Evaluation  Has your health and human performance program been evaluated?  What issues were evaluated?  About how many employees were included in the evaluation/s? Program Components  Indicate whether or not in the past 12 months your organization had in place each of the following programs, policies, and environmental sup- ports for a healthy lifestyle. o Have written goals and objectives for the health and human per- formance program. o Provide subsidies for gym or health club memberships. o Provide coverage for smoking/cessation counseling, medications, or nicotine replacement therapy.

OCR for page 129
150 A READY AND RESILIENT WORKFORCE FOR DHS Leading by Example (Indicate the extent to which you Disagree or Agree with the following statements. Please note: consider “leadership” as those in position to influence health and human performance related activities and policies at your organization.)  Our organization’s health and human performance programs are aligned with our organizational goals  Our leadership shares information with employees about the effect of employee health on overall organizational success. SOURCE: Johnson & Johnson, 2013. BOX 4-7 Federal Employee Viewpoint Survey (FEVS) Questions on Readiness and Resilience Numbers below are question numbers in the 2012 FEVS Growth items 1. I am given a real opportunity to improve my skills in my organization 8. I am constantly looking for ways to do my job better 27. The skill level in my work unit has improved in the past year 43. My supervisor/team leader provides me with opportunities to demonstrate my leadership skills 46. My supervisor/team leader provides me with constructive sugges- tions to improve my job performance 47. Supervisors/team leaders in my work unit support employee development 48. My supervisor/team leader listens to what I have to say Competence items 3. I feel encouraged to come up with new and better ways of doing things 21. My work unit is able to recruit people with the right skills 29. The workforce has the job-relevant knowledge and skills necessary to accomplish organizational goals 36. My organization has prepared employees for potential security threats 39. My agency is successful at accomplishing its mission Efficacy items 2. I have enough information to do my job well

7. When needed I am willing to put in the extra effort to get a job done.
9. I have sufficient resources (for example, people, materials, budget) to get my job done.
14. Physical conditions (for example, noise level, temperature, lighting, cleanliness in the workplace) allow employees to perform their jobs well.
20. The people I work with cooperate to get the job done.
26. Employees in my work unit share job knowledge with each other.
30. Employees have a feeling of personal empowerment with respect to work processes.
58. Managers promote communication among different work units (for example, about projects, goals, needed resources).
59. Managers support collaboration across work units to accomplish work objectives.
61. I have a high level of respect for my organization's senior leaders.
62. Senior leaders demonstrate support for Work/Life programs.

SOURCE: Questions excerpted from OPM, 2013a.

Data from the FEVS can be analyzed at the lowest level available throughout the department (generally by physical location within a component). This more manageable core set of questions (the 2012 FEVS included 98 questions) can be used by all DHS component agencies to provide a benchmark for improvement. Some of the other FEVS questions (for example, questions on trust and respect) may also contribute in important ways; DHS will have to align the questions that it decides to use with identified goals and objectives.

The committee suggests a mixed-methods approach to data collection, which is needed for multifaceted change. Qualitative research can play a role in this approach, with the goal of providing timely and useful information that can inform adjustments as needed. Underlying (and hidden) concepts and worker ideas and reactions can be captured from meeting notes and group discussions.
These data can be collected at the component agency level and added to the quantitative data collected, to obtain a more complete picture of what is working best, under what conditions, and what might need to be changed to arrive at the desired outcomes. Workers can contribute to critical thinking about the overall goals and processes of the change strategies, and this will increase their investment in the change. Such engagement would also provide information on the overall organizational climate and on how opportunities align with the intrinsic career motivations and capacities of the workforce.

Continuous Improvement

Measurement and evaluation are critical for ensuring continuous improvement, which is one of the NIOSH essential elements ("learn from experience"). Continuous improvement is the adjustment or modification of programs on the basis of established milestones and results that have been measured and analyzed. It includes a focused effort to make improvement through additional tests of change. One approach is a rapid-cycle improvement strategy, such as Plan-Do-Study-Act (PDSA) (Langley et al., 1996, 2009). In that strategy, objectives and predictions about outcomes are identified, and a plan to carry out the remainder of the cycle (who, what, where, and when) is created. As the plan is carried out, data are collected and used to identify potential problems. Time is then set aside to analyze the data and look at outcomes to track progress and identify needed improvements. Finally, the analyses are acted on with needed adjustments of the initial plan that are based on the outcomes. The cycle then begins anew to monitor and improve continuously. Incorporating a method for ongoing surveillance is critical for gaining a comprehensive understanding of the underlying drivers of workforce readiness and resilience.

Special Considerations Related to Law Enforcement

A substantial number of DHS components (about 50 percent) are involved in law enforcement. Law enforcement personnel are frequently exposed to traumatic events that may lead to pathogenic outcomes, such as acute or posttraumatic stress disorder, which erode individual resilience and readiness. In some departments, traumatic events (often termed critical incidents) are recorded in a database. That allows departments to rank such events in terms of severity and to be better informed about what sorts of action to take if an employee is so exposed.
Organizational trust is essential to resilience in law enforcement, so it is important for personnel involved in traumatic situations to be informed of the reasons for recording a particular event. An additional method of preventing pathogenic outcomes is to train supervisors to recognize behaviors and symptoms of undue stress or posttraumatic stress disorder. Often, behavioral changes include changes in personality, decreased performance, work absence, increased alcohol use, and poor hygiene. Supervisors trained in this area

can assess the degree of difficulty that an officer is having and take appropriate action to remedy the situation.

ADDRESSING THE GAPS

Measuring health-improvement processes, understanding their relationship to the well-being of an organization, and designing effective interventions all rely on collecting understandable, valid, timely, accurate, and integrated data. Collecting appropriate data to understand critical issues can also help to bring top leadership support, something that the current resilience program needs.

DHS is not using data to diagnose the needs of its workforce and is not measuring progress in addressing those needs. DHS does not collect the necessary data, so it is impossible to tell where resources should be invested (in which specific interventions). A key role that DHS leadership can play is to put forward a measurement and evaluation framework to inform the long-term strategy.

The committee finds that DHS lacks a strategy, a framework, and a common set of metrics that promote, sustain, and monitor employee readiness and resilience and, ultimately, program effectiveness. DHS and its components thus lack a comprehensive, consistent, coherent, and meaningful top-to-bottom view of the readiness and resilience of the DHS workforce.

Recommendation 6: Develop and implement a measurement and evaluation strategy for continuous improvement of workforce readiness and resilience in the Department of Homeland Security. The Department of Homeland Security should design and implement an ongoing measurement and evaluation process to inform and improve employee and organizational readiness and resilience initiatives. This will support planning, assessment, execution, evaluation, and continuous quality improvement of the strategic plan. Before the introduction of any new measures or the collection of any new data, DHS should access and analyze existing workforce data.

Characteristics of the evaluation strategy should include

a. A focus on structure, process, and outcome measures.
b. Implementation of a standardized core set of measures to be used DHS-wide.
c. Establishment of a baseline database for diagnostic and prescriptive purposes.
d. Establishment of clear program goals, with associated timelines, that can be tracked and monitored using DHS's measurement and evaluation system.
e. Ongoing assessment of program implementation, with regular quarterly reports on progress.
f. Use of evidence to inform resource allocation and reallocation.
g. Regular communication and dissemination of findings among components.
h. Submission of an annual measurement and evaluation report to the Secretary (see Recommendation 2).

The committee recognizes that this will require senior executive intervention to make data that have already been collected available for this purpose, that new data collection will be needed, and that all data elements should be integrated. The annual report to the Secretary should include data on the structural, process, and outcome changes outlined in this chapter, for example:

• Structural changes—a composite of organizational culture scores that uses a subset of FEVS items or an external tool, such as the CDC Worksite Health Scorecard (CDC, 2012).
• Process changes—a score that indicates employee engagement, morale, leadership support, and satisfaction with the workplace (a subset of FEVS items or a separate survey).
• Outcome changes—with a focus on self-reported health improvements (stress, depression, obesity, smoking, blood pressure, cholesterol, glucose, body mass index, alcohol use, and physical activity), productivity changes (absenteeism, disability, safety, and presenteeism), and medical costs (expenditure per employee per year).

This chapter has provided a measurement framework.
DHS needs to work with components to identify the core measures to be used throughout DHS on the basis of goals adopted for WRR and in alignment with the mission. Component agencies will need to identify other measures or data points, on the basis of their unique needs and goals, to complement the core measures identified for use DHS-wide.

The committee recognizes that members of the DHS workforce may view the collection and analysis of some types of data outlined in this chapter in an unfavorable light, and this could potentially further diminish morale. As discussed in Chapter 3, it is imperative that DHS maintain an open conversation with the workforce, especially regarding these efforts. It will be important for DHS to demonstrate genuine concern for the workforce, communicate how building WRR directly benefits employees, and openly discuss and address any concerns that may arise. The committee believes that such open communication will garner workforce support for these efforts and help to embed a culture of resilience within DHS.

As with anything that needs improvement (and measurement and evaluation are no exceptions), leadership needs to be the role model and set the pace for the endeavor to be successful. As noted, programs anchored in measuring problems and achievements know where they are going and are more likely to reach identified goals. Data must be collected and updated in a timely manner, vetted for quality, and made accessible to users. A strategic measurement and evaluation framework needs to be developed that addresses structure, process, and outcome measures; review of data sources; setting of goals; establishment of a timetable; allocation of budget dollars; and requirement of regular reports with recommendations for program adjustments (or elimination if indicated).
DHS is not unique among federal agencies in facing challenges in climate, culture, and performance, and it is certainly not alone in lacking the kind of systematic and data-driven improvement strategies that are discussed in this chapter. Although what the committee recommends will initially be a large undertaking, once in place it will yield important returns to the workforce and can stand as a model system for other federal agencies.

REFERENCES

Acquisition Solutions. 2009. Integrated IT governance: Providing agency transparency and visibility. https://www.asigovernment.com/documents/Insights_Integrated%20IT%20Governance.pdf (accessed August 21, 2013).
Baicker, K., D. Cutler, and Z. Song. 2010. Workplace wellness programs can generate savings. Health Affairs (Millwood) 29(2):304–311.

OCR for page 129
Care Continuum Alliance. 2010. Outcomes guidelines report, vol. 5. Washington, DC: Care Continuum Alliance.
CDC (Centers for Disease Control and Prevention). 2011. Investing in prevention improves productivity and reduces employer costs. http://www.cdc.gov/policy/resources/Investingin_ReducesEmployerCosts.pdf (accessed August 21, 2013).
CDC. 2012. The CDC worksite health scorecard: An assessment tool for employers to prevent heart disease, stroke, and related health conditions. Atlanta, GA: Department of Health and Human Services.
Connor, K. M., and J. R. T. Davidson. 2003. Development of a new resilience scale: The Connor-Davidson Resilience Scale (CD-RISC). Depression and Anxiety 18(2):76–82.
Cornum, R., T. D. Vail, and P. B. Lester. 2012. Special feature—resilience: The result of a totally fit force. Joint Force Quarterly (66):28.
DHS (Department of Homeland Security). 2011. Department of Homeland Security workforce strategy: Fiscal years 2011–2016. Washington, DC: DHS.
Frankel, N., and A. Gage. 2007. M&E fundamentals: A self-guided minicourse. Washington, DC: USAID/MEASURE Evaluation.
GAO (Government Accountability Office). 2012a. Department of Homeland Security: Taking further action to better determine causes of morale problems would assist in targeting action plans. GAO-12-940. Washington, DC: GAO.
GAO. 2012b. DHS strategic workforce planning: Oversight of departmentwide efforts should be strengthened. Washington, DC: GAO.
Goetzel, R. 2013. Intensive training program: How to evaluate health promotion programs. Presentation at the 23rd Annual Art & Science of Health Promotion Conference, March 18–22, 2013.
Goetzel, R. Z., and R. J. Ozminkowski. 2001. Program evaluation. In Health promotion in the workplace, edited by M. P. O'Donnell. Albany, NY: Delmar. Pp. 116–165.
Goetzel, R. Z., and P. E. Terry. 2013. Current state and future directions for organizational health scorecards.
American Journal of Health Promotion 27(5):11.
Green, A. 2013. Data resources for resilience. Presentation to the IOM Committee on Department of Homeland Security Workforce Resilience, February 4–5, Washington, DC.
Green, A., and L. Perkins. 2012. DHS workforce resilience: Past, current and future. Presentation to the IOM Committee on Department of Homeland Security Workforce Resilience, Washington, DC.
Hastie, T., R. Tibshirani, and J. Friedman. 2009. The elements of statistical learning. 2nd ed. New York: Springer.

HERO (Health Enhancement Research Organization). 2012. HERO employee health management best practices scorecard in collaboration with Mercer, annual report 2012. New York: HERO and Mercer LLC.
International Federation of Red Cross and Red Crescent Societies. 2011. Project/programme monitoring and evaluation (M&E) guide. Geneva, Switzerland: International Federation of Red Cross and Red Crescent Societies.
IOM (Institute of Medicine). 2005. Integrating employee health: A model program for NASA. Washington, DC: The National Academies Press.
IOM. 2012. Building a resilient workforce: Opportunities for the Department of Homeland Security: Workshop summary. Washington, DC: The National Academies Press.
Isaac, F. 2013. Work, health, and productivity: The Johnson & Johnson story. Presentation to the IOM Committee on Department of Homeland Security Occupational Health and Operational Medicine Infrastructure, Washington, DC.
Johnson & Johnson. 2013. Health and Performance Index (HAPI). New Brunswick, NJ: Johnson & Johnson.
Langley, G. J., K. M. Nolan, T. W. Nolan, C. L. Norman, and L. P. Provost. 1996. The improvement guide: A practical approach to enhancing organizational performance. San Francisco, CA: Jossey-Bass.
Langley, G. J., R. Moen, K. M. Nolan, T. W. Nolan, C. L. Norman, and L. P. Provost. 2009. The improvement guide: A practical approach to enhancing organizational performance. 2nd ed. San Francisco, CA: Jossey-Bass.
Lester, P. B. 2013. Data integration. Presentation to the IOM Committee on Department of Homeland Security Workforce Resilience, February 4–5, Washington, DC.
National Prevention, Health Promotion, and Public Health Council. 2013. 2013 annual status report. http://www.surgeongeneral.gov/initiatives/prevention/2013-npc-status-report.pdf (accessed August 21, 2013).
NBGH (National Business Group on Health). 2013. WISCORE, the Wellness Impact Scorecard.
https://www.businessgrouphealth.org/scorecard_v4/index.cfm?event=using (accessed July 9, 2013).
NIOSH (National Institute for Occupational Safety and Health). 2008. Essential elements of effective workplace programs and policies for improving worker health and wellbeing. Atlanta, GA: CDC.
NRC (National Research Council). 2010. Steps toward large-scale data integration in the sciences: Summary of a workshop. Washington, DC: The National Academies Press.
OPM (Office of Personnel Management). 2010. The health data warehouse. Letter No. 2010-08. Washington, DC: OPM.
OPM. 2012. Agency rankings. http://www.fedview.opm.gov/2012/Reports/Ranking.asp?HCAFF=KM&VW=FULL (accessed June 7, 2013).

OPM. 2013a. Federal Employee Viewpoint Survey. http://www.fedview.opm.gov (accessed July 9, 2013).
OPM. 2013b. Work/life health and wellness: Well check. http://www.opm.gov/policy-data-oversight/worklife/health-wellness/#url=Well-Check (accessed July 26, 2013).
Partnership for Public Service. 2005. Case study, NASA: Overcoming mission challenges. Washington, DC: Partnership for Public Service.
Pronk, N. 2005. The four faces of measurement. ACSM's Health & Fitness Journal 9(5):3.
Soler, R. E., K. D. Leeks, S. Razi, D. P. Hopkins, M. Griffith, A. Aten, S. K. Chattopadhyay, S. C. Smith, N. Habarta, R. Z. Goetzel, N. P. Pronk, D. E. Richling, D. R. Bauer, L. R. Buchanan, C. S. Florence, L. Koonin, D. MacLean, A. Rosenthal, D. Matson Koffman, J. V. Grizzell, A. M. Walker, and Task Force on Community Preventive Services. 2010. A systematic review of selected interventions for worksite health promotion: The assessment of health risks with feedback. American Journal of Preventive Medicine 38(2 Suppl):S237–S262.
Tumblin, D. 2013. DHS best practices. Presentation to the IOM Committee on Department of Homeland Security Workforce Resilience, Washington, DC.
Vogus, T. J., and K. M. Sutcliffe. 2007. Organizational resilience: Towards a theory and research agenda. Montreal, QC.
Wagnild, G. 2009. A review of the resilience scale. Journal of Nursing Measurement 17(2):105–113.
WHO (World Health Organization). 2009. A guide to monitoring and evaluation for collaborative TB/HIV activities. Geneva, Switzerland: WHO.
W.K. Kellogg Foundation. 2004. W.K. Kellogg Foundation evaluation handbook. Battle Creek, MI: W.K. Kellogg Foundation.