
1

Introduction

Science and technology are engines of U.S. economic growth and international competitiveness in the global 21st century economy. Leading economists (e.g., Solow, 1957; Mankiw, 2003; Romer, 1990), policy makers, and the public all agree that technological innovation fueled by scientific research is the primary mechanism for sustained economic growth (Xie and Killewald, 2012). As the nation continues to recover from the 2008 economic recession, the science, technology, engineering, and mathematics (STEM) fields are critical drivers for the health of the economy. Hence, a robust, skilled STEM workforce is important for the nation (National Academy of Sciences, National Academy of Engineering, and Institute of Medicine, 2007, 2010). Because undergraduate STEM education plays a central role in developing the STEM workforce, and also contributes to a strong general education for all students, improving the quality of undergraduate STEM education is a national imperative.

Some recent trends raise concerns about the health of the nation’s STEM workforce (see Xie and Killewald, 2012). First, scientists’ earnings (adjusted for inflation) have stagnated since the 1960s and have declined relative to those of other high-status, high-education professions, such as law and business, which could discourage individuals from entering or staying in science careers. Second, it has become more difficult for recent science doctorates to obtain any academic position, and the available academic positions are weighted toward more postdoctoral appointments and fewer faculty positions, which could discourage young people from pursuing academic research. Third, U.S. science faces increasing foreign competition as the share of global research conducted in other countries grows. These trends could lead to gradual erosion of U.S. dominance in science and a slowdown in the economic growth fueled by technological innovation.

To strengthen the nation’s research and technology enterprise in the face of these trends, the President’s Council of Advisors on Science and Technology (PCAST) (2012) recommended producing 1 million additional college graduates with degrees in STEM over the following decade. Recognizing that many students with an interest in and aptitude for STEM, especially females and underrepresented minorities, are not completing degrees in these fields (see National Research Council, 2011; National Academies of Sciences, Engineering, and Medicine, 2016a), the PCAST report called for widespread implementation of strategies to engage, motivate, and retain diverse students in STEM. Such strategies are beginning to emerge from a growing body of relevant research, but they have not yet been widely implemented (see National Research Council, 2012; National Academies of Sciences, Engineering, and Medicine, 2016a).

Many initiatives to improve the quality of undergraduate STEM education are now under way. Some focus on the national level, others involve multi-institution collaborations, and others take place on individual campuses. For example, the interagency Committee on STEM Education of the National Science and Technology Council (2013) developed a STEM education 5-year strategic plan that identified improving the experience of undergraduate students as a priority goal for federal investment. Within this broad goal, the strategic plan identified four priority areas: (1) promoting evidence-based instructional practices; (2) improving STEM experiences in community colleges; (3) expanding undergraduate research experiences; and (4) advancing success in the key gateway of introductory mathematics. Other initiatives include the undergraduate STEM initiative of the Association of American Universities;1 a workshop and sourcebook on undergraduate STEM reform of the Coalition for Reform in Undergraduate STEM Education;2 and the Partnership for Undergraduate Life Sciences Education, or PULSE.3

At present, policy makers and the public do not know whether these various federal, state, and local initiatives are accomplishing their stated goals and achieving nationwide improvement in undergraduate STEM education. This is partly due to a lack of high-quality national data on undergraduate STEM teaching and learning. A recent study of barriers and opportunities for 2-year and 4-year STEM degrees (National Academies of Sciences, Engineering, and Medicine, 2016a) highlighted the mismatch between currently available datasets and the realities of student trajectories.

___________________

1 See https://stemedhub.org/groups/aau/about [July 2017].

2 See https://www.aacu.org/pkal/sourcebook [July 2017].

3 See http://www.pulsecommunity.org [July 2017].

Although students today often transfer across institutions, stop in and out of STEM programs, and attend classes part time, existing surveys rarely capture these trends. That study committee concluded that existing data collection systems (national, state, and institutional) were often not structured to gather the information needed to understand the quality of undergraduate education.

Anticipating these challenges, PCAST (2012) recommended that the National Academies of Sciences, Engineering, and Medicine develop metrics to evaluate undergraduate STEM education.

In response, the National Science Foundation (NSF) charged the National Academies to conduct a consensus study to identify objectives for improving undergraduate STEM education and to outline a framework and set of indicators to document the status and quality of undergraduate STEM education at the national level over multiple years: see Box 1-1 for the full study charge.

INTERPRETING THE STUDY CHARGE

As it began its work, the committee identified a number of important issues in the study charge and within the broader social, political, and historical context that led to this study. In constructing a shared understanding of these issues, the committee was able to calibrate its interpretations of key terms and phrases in the charge as well as other terms and phrases that emerged over the course of its deliberations. Throughout this chapter, the committee identifies the issues emerging in its interpretation of the study charge. For each one, it shares definitions of terms and, as appropriate, discusses the context in order to help ground the committee’s proposed indicator system. Underlying this work is the committee’s vision for undergraduate STEM education.

Vision

In developing a conceptual framework and indicators to monitor improvement in undergraduate STEM education, the committee envisioned what such improvement would look like. In this vision, students—from all walks of life and with all types of experiences and backgrounds—would be well prepared to help address global, societal, economic, and technological challenges. Students would have the STEM background to become successful in the careers of today as well as those of tomorrow as U.S. society continues to become increasingly diverse, global, and interconnected. Among these well-prepared graduates, some would become professional scientists and engineers, conducting research and developing new technologies to support sustained economic growth.

At the same time, the committee envisions that all students, not merely those who pursue STEM degrees and careers, would have both access and exposure to high-quality STEM education to support the development of STEM literacy, a relatively new concept. The committee’s adoption of this concept was informed by a recent report (National Academies of Sciences, Engineering, and Medicine, 2016b) that identified several aspects of science literacy: (1) the understanding of scientific practices, such as the formulation and testing of hypotheses; (2) content knowledge, including concepts and vocabulary; and (3) understanding of science as a social process, for example, the role of peer review. In keeping with these expanding definitions of science literacy, the committee recognized that “engaging in science—whether using knowledge or creating it—necessitates some level of familiarity with the enterprise and practice of science” (National Academies of Sciences, Engineering, and Medicine, 2016b, p. 11). Students’ ability to use STEM knowledge (for example, as citizens) or create STEM knowledge, then, will require an analogous familiarity with the enterprises and practices of science, technology, engineering, and mathematics. The application of the term “literacy” to these disciplines “signifies something like ‘knowledge, skills, and fluency’ within [these] particular domain[s]” (National Academies of Sciences, Engineering, and Medicine, 2016b, p. 17).

In the committee’s vision, STEM literacy, along with lifelong interest in STEM, is important for all graduates, regardless of their field of study, both in the workplace and outside of work. In the workplace, STEM knowledge and skills, such as those encompassed within the concept of STEM literacy, are useful across a range of occupations beyond those of scientists and engineers (Carnevale, Smith, and Melton, 2011; National Science Foundation, 2014). Outside of work, people can draw on their science literacy for making decisions that could involve science (e.g., decisions about personal health or voting on environmental issues), although such knowledge is only one of many factors contributing to their decision making (National Academies of Sciences, Engineering, and Medicine, 2016b). Given that access to high-quality science learning experiences can facilitate and support the development of science knowledge bases, the committee expects that analogous exposure to high-quality STEM learning experiences in undergraduate education will facilitate and support the development of STEM knowledge bases that graduates may choose to draw on when making important life decisions. However, the committee did not reach consensus on exactly what level of exposure (e.g., completion of a certain number or type of courses or learning experiences outside the classroom) would support STEM literacy. For some students, STEM literacy might require exposure to adult basic education, adult literacy, or vocational programs offered in the community. However, measures of such programs would fall outside of the committee’s charge to develop indicators of the status and quality of undergraduate STEM education.

The committee envisions that exposure to STEM concepts and processes can help individuals to make sense of the world around them, enabling the skills and dispositions needed to participate actively in a democracy. Because of its commitment to these values, the committee sees the goals for undergraduate education presented later in this chapter as working in service of ensuring STEM literacy for all undergraduates, not just those majoring in STEM fields.

Given the increasingly diverse and global nature of U.S. society, the committee envisions that STEM education will embrace approaches that increase representation of diverse populations in STEM careers. All undergraduate institutions will provide equitable STEM educational practices, both inside and outside the classroom (curricular and co-curricular), ensuring that all students have the opportunities and support they need to reach their potential. In addition, instructors, staff, and administrators will have the knowledge, skills, and understanding of evidence-based teaching and learning methods to deliver a 21st century, inclusive STEM curriculum and co-curriculum, and students will have clear pathways into and through STEM programs of learning.

A Focus on the National Level

Given its charge, the committee’s work focused on national-level indicators. As noted above, many initiatives are currently under way to monitor and improve undergraduate STEM education. However, these initiatives tend to gather detailed, local data that are appropriate for local feedback and improvement; they are not appropriate for representing STEM education phenomena broadly and nationally across 2-year and 4-year institutions.

For example, the PULSE vision and change rubrics are designed to evaluate life science departments’ progress toward specified reform principles (American Association for the Advancement of Science, 2011). Department-level leaders (current or former department chairs or deans) voluntarily use the rubrics for self-study and improvement in terms of 66 different criteria across five areas: curriculum alignment, assessment, faculty practice/faculty support, infrastructure, and climate for change. Analyzing rubric data from a sample of life sciences departments,4 Brancaccio-Taras and colleagues (2016) concluded that the rubrics constitute a valid and reliable instrument for evaluating departmental change across different institution types. The authors also concluded that their analysis of rubric data from the 26 institutions that responded to all five groups of rubrics provided “baseline knowledge and insights about the state of the adoption of the recommendations of the Vision and Change report” (Brancaccio-Taras et al., 2016, p. 11).

___________________

4 The authors invited all members of the PULSE community (which includes 2-year and 4-year colleges, regional comprehensive universities, and research universities) to submit their rubric data. The respondents provided varying amounts of data: 26 institutions provided rubric data across all five areas, 57 provided data on curriculum alignment, 35 on assessment, 49 on faculty practice/faculty support, 28 on infrastructure, and 32 on climate for change.

(The authors are currently working to gather data from a larger number of departments.) Although the committee views the currently available baseline data as a valuable snapshot of quality improvement in some life sciences departments, these data are not nationally representative, do not include other disciplines, and do not constitute national indicators of improvement in undergraduate STEM.

The national-level data needed for the committee’s proposed indicators differ from the PULSE rubrics or other fine-grained data designed to guide local improvement efforts. In K–12 education, for example, teacher observation instruments can be designed to gather detailed data that provide very specific feedback to teachers for improving their practices. However, because schools and districts seek instruments that can provide a picture of teachers’ work across multiple content areas and grade levels, they often design more “coarse-grained” observation instruments to provide more global data (Hill and Grossman, 2013). One result is that three-fourths of the 15,000 teachers responding to a recent survey indicated that their most recent evaluations failed to identify areas for improvement (Weisberg et al., 2009, cited in Hill and Grossman, 2013). The granularity of data matters, and the global data needed for national indicators may not be useful for informing improvement by individual instructors, STEM departments, or STEM programs (Wilson and Anagnostopolous, in press).

Equity, Diversity, and Inclusion

Given the national need for a robust supply of STEM professionals for technological innovation and sustained economic growth, another important dimension of this study is the underrepresentation of certain groups in STEM, involving issues of equity, diversity, and inclusion. Equity refers to the fair distribution of opportunities to participate and succeed in education for all students. Diversity focuses on the differences among individuals, including demographic differences such as gender, race, ethnicity, and country of origin. Inclusion refers to the processes through which all students are made to feel welcome and are treated as motivated learners.

The committee views equity as a central element of quality in undergraduate STEM education and considers measurement of equity as essential to measuring and improving quality. From an accountability perspective, equity in undergraduate STEM education is the achievement of proportional representation of all demographic groups in terms of access, retention, degree completion, and participation in enriching STEM educational programs, experiences, and activities that prepare students to enter the STEM workforce. The committee uses the word demographics broadly to capture the full spectrum of diversity in the population. Such diversity includes, but is not limited to, diversity in socioeconomic level, gender identity, race and ethnicity, religion, first-generation college status, marital and parental status, veteran status, disability status, and age. The National Science Foundation’s National Center for Science and Engineering Statistics (2017) has identified certain demographic groups as “underrepresented” because their representation in STEM education and employment is not proportional to their representation in the national population. These underrepresented groups include persons with disabilities, women in some STEM disciplines, and three racial and ethnic minority groups: Hispanics, Blacks, and American Indians.
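To make the idea of proportional representation concrete, the brief sketch below uses invented numbers (not data from this report) to compute a simple parity ratio: a group’s share of STEM degree earners divided by its share of the national population, with values near 1.0 indicating proportional representation.

```python
# Minimal sketch with invented numbers (not data from this report):
# a parity ratio compares a group's share of STEM degree earners with
# its share of the national population. A ratio near 1.0 indicates
# proportional representation; well below 1.0, underrepresentation.

def parity_ratio(stem_share: float, population_share: float) -> float:
    """Group's STEM share divided by its population share."""
    return stem_share / population_share

# Hypothetical group: 8% of STEM bachelor's degrees, 15% of the population.
print(round(parity_ratio(0.08, 0.15), 2))  # 0.53 -> underrepresented
```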

Goals and Objectives

The committee defines a goal as an intended outcome representing improvement in undergraduate STEM education. It is stated in general terms and covers a long time frame. Because achieving long-term goals involves the work of multiple stakeholders at different levels in the undergraduate education system (e.g., classroom, institution, state higher education system), the committee’s goals are stated with action verbs but do not identify a specific actor as the subject. The committee defines objectives as more specific and measurable steps toward achieving goals.

Measures and Indicators

The committee defines a measure as a value that is quantified against a standard at a specified time. An indicator is a specific type of measure that provides evidence that a certain condition exists or that specifies how well certain results or objectives have or have not been achieved (Brizius and Campbell, 1991). It goes beyond raw statistics or data to provide easily understandable information that can be used to guide educational policy and practice (National Research Council, 2014).

After considering various definitions of an educational indicator (e.g., Oakes, 1986), the committee adopted Planty and Carlson’s (2010) conceptualization of an educational indicator and an indicator system. In this definition, an educational indicator has three key characteristics. First, it attempts to represent the status of a specific condition or phenomenon. For example, it may measure student achievement, dropout rates, crime in schools, or another aspect of education. Second, it typically is quantitative (Planty and Carlson, 2010, p. 4):

Indicators are created from data (i.e., observations collected in numerical form) and presented in the form of numbers and statistics. Statistics are numerical facts, but by themselves are not indicators. . . . Indicators combine statistics with purpose, meaning, and context to provide useful information about a condition or phenomenon of interest.

Third, an indicator has a temporal component, meaning that it might not only indicate the status of a condition or phenomenon at a given time, but also can represent change in the condition over time.

An educational indicator may be either a single statistic or a composite statistic. Single-statistic indicators measure a specific condition of an education system (e.g., an institution’s student enrollment and number of Pell grant recipients). Composite indicators combine single statistics to depict a relationship between two or more aspects of the education system; examples include student-faculty ratio and student readiness (Planty and Carlson, 2010).
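To make the distinction concrete, the sketch below uses invented figures for a hypothetical institution (not data from this report) to derive a composite indicator, the student-faculty ratio, from two single statistics, and to track it across years, illustrating the temporal component described above.

```python
# Illustrative sketch; all figures are invented, not drawn from any dataset.
# Single statistics: raw counts for one institution, by year.
enrollment = {2014: 12_000, 2015: 12_400, 2016: 12_900}
faculty = {2014: 600, 2015: 610, 2016: 615}

def student_faculty_ratio(year: int) -> float:
    """Composite indicator: combines two single statistics for one year."""
    return enrollment[year] / faculty[year]

# Temporal component: the same indicator computed across years can
# represent change in a condition, not just its status at one time.
for year in sorted(enrollment):
    print(year, round(student_faculty_ratio(year), 1))
# 2014 20.0
# 2015 20.3
# 2016 21.0
```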

Because a lone indicator rarely provides useful information about complex conditions or phenomena, indicator systems are designed to generate more comprehensive and, therefore, more useful information about conditions. More than just a collection of indicator statistics, an educational indicator system measures the system’s inputs, processes, and outputs. In education, such inputs may include fiscal and material resources, instructor quality, and student background. Processes may include instructional quality and institutional context and structure, while outputs may include student achievement, participation, and attitudes and aspirations (Shavelson, McDonnell, and Oakes, 1989; Odden, 1990, pp. 24–25). A high-quality indicator system not only measures these individual components, but also suggests how they work together to produce an overall effect.

In the following chapters of this report, the committee proposes an educational indicator system reflecting this definition. The system is based on a conceptual framework that views undergraduate STEM education as a system and considers how the components (inputs, processes, the environment) work together to produce an effect on the desired student outcomes. The committee emphasizes that its proposed indicators should be viewed in concert with one another to provide insight into the overall quality of undergraduate STEM education (see National Research Council, 2014). Though individual indicators can provide discrete markers of progress toward improvement in undergraduate STEM, the committee also recognizes that telling an “end-to-end story” requires the creation of a coherent framework demonstrating how relationships across indicators facilitate achieving the identified objectives. This report provides such a framework, identifying overarching and mutually reinforcing goals that are aligned with more specific objectives for improving undergraduate STEM education, along with a set of indicators to monitor progress toward the objectives.

Undergraduate STEM Education

For the purposes of this report, the committee defines undergraduate STEM education as undergraduate education in the natural and social sciences, technology, engineering, and mathematics. This definition follows the National Science Foundation (2016) definition of the STEM fields, which includes the social sciences. In keeping with the committee’s charge to develop indicators for 2-year and 4-year STEM programs, the definition encompasses programs of study leading to bachelor’s degrees, associate’s degrees, and certificates at all types of public, private, for-profit, and nonprofit institutions of higher education. This definition also includes workforce development programs that prepare students for “middle-skill” jobs, defined as those jobs that require less educational preparation than a bachelor’s degree but some education or training beyond a high school diploma (Holzer and Lerman, 2007; Rothwell, 2013). Finally, the committee’s definition includes introductory STEM courses that students may take as part of general education requirements, regardless of their major field of study.

Evidence-Based STEM Educational Practices and Programs

In the committee’s view, improving the quality of undergraduate STEM education will require wider use of “evidence-based STEM educational practices and programs.” A growing body of research (see National Research Council, 2012, and National Academies of Sciences, Engineering, and Medicine, 2016a) has begun to identify effective teaching practices and co-curricular programs that support students’ mastery of STEM concepts and skills and their retention in STEM programs. Based on this research, the committee defines evidence-based STEM educational practices and programs as those meeting at least one of the following criteria:

  • the preponderance of published literature suggests that the practice will be effective across settings or in the specific local setting, or
  • the practice is built explicitly from accepted theories of teaching and learning and is faithful to best practices of implementation, or
  • locally collected, valid, and reliable evidence, based on a sound methodological research approach, suggests that the practice is effective.

MEASURING COLLEGE QUALITY IN AN ERA OF ACCOUNTABILITY

The committee defines improvement as progress toward its vision, goals, and objectives for undergraduate STEM education (see Chapter 2). To define quality, the committee adapted a definition of higher education quality from Matsudaira (2015), which, in turn, reflects earlier work on quality improvement in health care delivery (Institute of Medicine, 2001). Specifically, the committee defines quality in higher education as the degree to which exposure to STEM educational offerings increases the likelihood of desired educational outcomes. This definition focuses on the causal effect that exposure to some STEM educational experience (e.g., the physics program at college A; an innovative remedial mathematics course at college B) has on advancing valued outcomes (Matsudaira, 2015; see also Matchett, Dahlberg, and Rudin, 2016). In this case, the valued outcomes of undergraduate STEM include mastery of STEM concepts and skills and attainment of STEM credentials.
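One compact way to express this definition (the notation below is illustrative; the committee states the definition only in prose) is as the average causal effect of exposure on a valued outcome, written in potential-outcomes form:

```latex
% Illustrative notation (the report states this definition only in prose):
% quality Q is the causal effect of exposure to a STEM educational
% offering on a valued outcome Y (e.g., credential attainment):
\[
  Q \;=\; \mathbb{E}\bigl[\,Y(\text{exposed})\,\bigr]
      \;-\; \mathbb{E}\bigl[\,Y(\text{not exposed})\,\bigr]
\]
% i.e., how much exposure to the offering raises the likelihood of the
% desired outcome relative to no exposure.
```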

In developing this definition of quality, the committee considered the larger social and political context. Responding to soaring college costs, high attrition rates, and rising student debt, parents, employers, policy makers, and taxpayers are asking questions about the quality of higher education: Are students learning? Are graduates earning? What is the value of higher education? Each of these questions is complex, and the committee acknowledges that they cannot be answered with single indicators. Nevertheless, the goals, objectives, and indicators presented later in this report contribute to answering these questions.

Employment Outcomes

In response to growing calls for accountability, one of the most widely used methods for measuring the quality or “value” of a college or university is to assemble and analyze data on graduates’ earnings. However, research has demonstrated that both graduation rates and postgraduation earnings vary widely, depending on the type and selectivity of the institution and the characteristics of incoming students (National Academies of Sciences, Engineering, and Medicine, 2016a; Matsudaira, 2015). Although economists are beginning to develop methods to adjust graduates’ earnings to account for the characteristics of incoming students, these methods are not yet fully developed, and further research would be needed to develop uniform quality measures (Matsudaira, 2015).
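As a rough illustration of what such an adjustment involves, the sketch below uses simulated data (the methods cited above are more elaborate than this) to residualize graduates’ earnings on incoming-student characteristics, isolating the portion of earnings not predicted by intake alone.

```python
# Minimal sketch with simulated data (the adjustment methods discussed
# in the text are more elaborate): residualize graduates' earnings on
# incoming-student characteristics via ordinary least squares, leaving
# the portion of earnings not explained by intake.
import numpy as np

rng = np.random.default_rng(0)
n = 500
test_score = rng.normal(1000.0, 100.0, n)        # incoming-student characteristic
family_income = rng.normal(60_000.0, 15_000.0, n)
# Hypothetical earnings: partly intake-driven, partly everything else.
earnings = 20_000 + 15 * test_score + 0.1 * family_income + rng.normal(0, 5_000, n)

X = np.column_stack([np.ones(n), test_score, family_income])
beta, *_ = np.linalg.lstsq(X, earnings, rcond=None)
adjusted = earnings - X @ beta                   # earnings net of intake
print(round(float(adjusted.mean()), 2))          # ~0.0 by construction
```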

In addition, graduates’ earnings are influenced by labor market demand, and the wage premium for STEM graduates varies by time, place, and field in ways that are characteristic of a market economy. Furthermore, many STEM majors enter occupations that are not traditionally considered part of the STEM workforce (National Science Foundation, 2016), but their STEM knowledge may indeed contribute to their earnings (Carnevale, Smith, and Melton, 2011): it is not practical to precisely identify these workers and measure this contribution. Moreover, individuals who have received STEM education may possess unobserved attributes that make them incomparable to their peers without STEM education. For all of these reasons, some experts and leaders in higher education agree that postgraduation earnings alone are not a suitable measure of institutional quality (e.g., Matchett, Dahlberg, and Rudin, 2016). Nevertheless, many states have already implemented performance-based funding systems that reward institutions on the basis of the average earnings of graduates.

Another, related method for measuring an institution’s “value” focuses on the extent to which graduates find jobs related to their chosen fields of study. This method, however, does not fully address the measurement challenges described above. U.S. colleges and universities intentionally offer a variety of majors to meet students’ varying interests and demands. Some STEM majors (such as engineering and nursing) are more closely tied to future occupations than others (such as social science and biology). Thus, across STEM fields, there is large variation in the flow of students from STEM majors to STEM occupations (Xie and Killewald, 2012), and STEM education can be good preparation for non-STEM careers: see Box 1-2. Labor market demand, external to higher education, influences whether STEM graduates find any type of job, a job in their specific STEM discipline, or a job in another STEM discipline. A graduate’s employment prospects are also influenced by the geographical location and movement of companies and academic research organizations, shifts in the dominance of certain industries and technologies, and changing demand for different skills as new technologies emerge and others become obsolete. Rates of employment and unemployment also vary by discipline. In 2013, most STEM professionals (93.3 percent) were employed in their field of study, with only 6.7 percent reporting that they worked in a job outside the field of their highest degree because a job in their field was not available: see Table 1-1 (National Science Foundation, 2016). Within this average, however, the rate of working outside one’s field of study varied by discipline, with higher rates in the social, life, and physical sciences and lower rates in engineering and computer and mathematical sciences.

In the committee’s view, care should be taken in interpreting these different job placement rates. They may reflect differences in employers’ skill demands, the intended purpose of the students’ majors, or students’ selections of fields. Thus, one should not attribute the observed differences in job placement rates simply to differences in the overall national quality of the undergraduate education programs in each discipline. Even when STEM majors enter occupations outside the STEM workforce, the knowledge and skills they developed in their undergraduate STEM programs contribute to the national economy (see “Vision,” above). In light of these complexities, the committee does not propose any overarching goals, more specific objectives, or indicators related to placement in STEM jobs.

The STEM Workforce

The committee defines the STEM workforce to include science and engineering occupations (currently about 5.7 million people) and science- and engineering-related occupations (currently about 7.4 million people; National Science Foundation, 2016). These two groups of occupations have been carefully defined and studied by the National Science Foundation (2014, 2015, and 2016). In this definition (National Science Foundation, 2016), the science and engineering occupations include computer and mathematical scientists; biological, agricultural, and environmental life scientists; physical scientists (e.g., physicists, chemists, geoscientists); social scientists (e.g., psychologists, economists, sociologists); engineers; and postsecondary teachers in science and engineering fields. The science- and engineering-related occupations include health care workers (e.g., physicians, audiologists, nurses); science and engineering managers (e.g., engineering managers, natural and social science managers); science and engineering precollege teachers (e.g., K–12 science teachers); technologists and technicians in science and engineering; and other science- and engineering-related occupations (e.g., actuaries, architects).

The committee notes that, although the concept of a “STEM workforce” is widely used and has been referenced in law, there is no consensus on how it is defined. Various reports use different definitions, leading to divergent and sometimes conflicting conclusions about the size and other characteristics of the STEM workforce: see Box 1-2. Furthermore, the STEM workforce is heterogeneous; it is composed of many different “sub-workforces” that can be characterized by field of degree, occupational field, the education level required, or some combination of these elements.

Several approaches have been proposed in recent years to define and measure the STEM workforce or the science and engineering workforce. One approach simply counts any job held by an individual with at least a bachelor’s degree in science or engineering as part of the science and engineering workforce; with this approach, the workforce totals 19.5 million people (National Science Foundation, 2014). In another approach, based on surveys of college graduates about their job requirements, 16.5 million people indicated that their position required a bachelor’s degree level of science and engineering knowledge (National Science Foundation, 2014). In yet another approach, Rothwell (2013) analyzed data on skill and knowledge requirements from the Occupational Information Network (O*NET) national database and found that 26 million jobs required significant STEM expertise. There is little agreement across these various approaches. Given this lack of consensus, the committee followed the National Science Foundation’s occupation-based approach, defining the STEM workforce to include science and engineering occupations (currently about 5.7 million people) and science- and engineering-related occupations (currently about 7.4 million people; National Science Foundation, 2016).

Learning Outcomes

In Chapter 2, the committee identifies increasing students’ mastery of STEM concepts and skills as one of three overarching goals for improving the quality of undergraduate STEM education. However, the committee does not propose any indicators that would directly measure student learning because of the complexities discussed here.

There is no simple way to address questions about whether students are acquiring the STEM concepts, skills, and abilities that will serve them for their lives after college, for several reasons. First, expectations for the holder of an associate’s degree or certificate differ from those for the holder of a baccalaureate degree. Second, faculty, employers, professional societies, accreditation agencies, testing companies, and curriculum committees all have different answers about the ideal and acceptable levels of proficiency and, more fundamentally, about what concepts and skills should be measured for proficiency. These groups have launched a variety of efforts to define proficiency, some of which focus on core knowledge and skills for all 2-year and 4-year graduates, across all fields of study (e.g., Association of American Colleges & Universities, 2007; Lumina Foundation, 2015), while others focus on specific disciplines (e.g., Arum, Roksa, and Cook, 2016). Leaders in life sciences education, for example, have identified core concepts, competencies, and disciplinary practices for “biological literacy” in undergraduate biology (Brewer and Smith, 2011).

Third, the STEM disciplines are characterized by rapid discoveries and the ongoing development of new knowledge and skills. Within and across these disciplines, new subdisciplines and interdisciplinary fields are continually being created, bringing differing views about the core knowledge and skills that define successful learning. Fourth, in U.S. higher education, there have never been national tests, graduation standards, or uniform STEM curricula. These would be incompatible with the tradition of state and system-level autonomy in public higher education and with the diversity of public, private nonprofit, and private for-profit institutions that provide undergraduate STEM education. And fifth, some learning outcomes are ways of seeing problems, analyzing them, and solving them, with appropriate tools and with collaborations among diverse groups (see, e.g., Association of American Colleges & Universities, 2007). These outcomes are not knowledge about specific content areas, nor can they be easily translated to a national-level measure.

Because of these complications and practical difficulties, the committee does not propose any indicators that would directly measure student learning. However, the committee does target increased acquisition of STEM concepts and skills as an overarching goal for improving undergraduate STEM education (see Chapters 2 and 3).

In the future, with the growth of online instruction and assessment, more detailed, automated proficiency exams and fine-grained records of accomplishment may become available: see Box 1-3. At that time, it will be important to revisit the conceptual framework and indicators proposed in this report. The committee envisions that its recommended indicator system will undergo continuous improvement and updating (see Chapter 7).

Goals of the Indicator System

In light of the pressures for accountability and the complexity of measuring quality described above, the committee stresses that the primary goal of the indicator system is to allow federal agencies to monitor the status and quality of undergraduate STEM education over time (based on data aggregated from individual institutions). For such monitoring, the proposed indicator system can use data from nationally representative samples of institutions and students (see Chapters 6 and 7); in contrast, a system designed for accountability or ranking would require data from the entire universe of 2-year and 4-year institutions.

The committee envisions that institutional data collected by federal agencies to inform the national indicators could also be accessible to, and used by, individual institutions, state higher education systems, or consortia of institutions to monitor their own programs over time for the purpose of improvement. Although such accessibility might also allow institutions, consortia, states, or individuals to compare and rate institutions with the goal of holding them accountable, that is not the intended purpose of the indicator system. Rather, the committee expects that the indicator system will be used by the National Science Foundation and other federal agencies to monitor nationwide progress toward improving the quality of undergraduate STEM education. The committee also anticipates that the interagency Committee on STEM Education will use the indicator system as it works to advance the objectives for undergraduate STEM education identified in the federal STEM education 5-year strategic plan of the National Science and Technology Council.

STUDY APPROACH AND ORGANIZATION OF THE REPORT

To address its charge, the committee met seven times over the course of the study. The meetings were organized to allow the committee to consider the testimony of expert presenters, as well as to deliberate privately on the weight of existing evidence. In Phase I, the committee gathered and reviewed a broad body of literature on discipline-based education research and change strategies for improving the quality of undergraduate STEM. It also obtained information on existing systems for monitoring the quality of undergraduate education, both generally and in STEM specifically, by inviting outside experts to speak to the committee and convening a workshop in February 2016 (see Appendix C). Drawing on these sources of evidence and its own expert judgment, the committee developed a draft report with a conceptual framework of goals and objectives for undergraduate STEM, releasing it for public comment in August 2016. In addition to soliciting comments and feedback online, the committee convened a public workshop in October 2016 to obtain further input (see Appendix A for a summary of comments received and committee responses). In Phase II of the study, the committee met four times in closed session to deliberate on the public comments, revise the framework of goals and objectives, and develop indicators of progress toward the objectives.

Based on the vision presented above, and its deliberations about the preliminary conceptual framework created in Phase I, the committee identified three overarching goals for STEM education: (1) increase students’ mastery of STEM concepts and skills; (2) strive for equity, diversity, and inclusion; and (3) ensure adequate numbers of STEM professionals. These three goals are discussed in greater detail in Chapter 2. In Phase II, the committee also reviewed additional literature as it deliberated on the proposed indicators and developed conclusions and recommendations for research and data collection to develop the indicator system. Throughout the study process, committee members drafted sections of text, which were shared, reviewed, edited, and revised by the entire committee.

The report is organized around the major tasks outlined in the committee’s charge. Chapter 2 presents the conceptual framework for the indicator system; Chapters 3, 4, and 5 discuss the committee’s three goals for improvement in undergraduate STEM, along with objectives and indicators to measure progress toward those goals. Chapter 6 reviews existing monitoring systems and data sources related to undergraduate STEM education, and Chapter 7 discusses alternative approaches to implementing the indicator system.

REFERENCES

American Association for the Advancement of Science. (2011). Vision and Change in Undergraduate Biology Education: A Call to Action. Washington, DC: Author. Available: http://visionandchange.org/finalreport [July 2017].

Arum, R., Roksa, J., and Cook, A. (2016). Improving Quality in American Higher Education: Learning Outcomes and Assessments for the 21st Century. Hoboken, NJ: John Wiley & Sons.

Association of American Colleges & Universities. (2007). College Learning for the New Global Century. Washington, DC: Author. Available: https://www.aacu.org/sites/default/files/files/LEAP/GlobalCentury_final.pdf [July 2017].

Biel, R., and Brame, C. (2016). Traditional versus online biology courses: Connecting course design and student learning in an online setting. Journal of Microbiology & Biology Education, 17, 417–422.

Bonvillian, W.B., and Singer, S.R. (2013). The online challenge to higher education. Issues in Science and Technology, 29(4). Available: http://issues.org/29-4/the-online-challenge-to-higher-education [October 2017].

Bowen, W.G., Chingos, M.M., Lack, K.A. and Nygren, T.I. (2012). Interactive Learning Online at Public Universities: Evidence from Randomized Trials. Available: http://www.sr.ithaka.org/wp-content/uploads/2015/08/sr-ithaka-interactive-learning-online-at-public-universities.pdf [October 2017].

Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J., Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G., Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE vision and change rubrics, version 1.0: A valid and equitable tool to measure transformation of life sciences departments at all institution types. CBE-Life Sciences Education, 15(4), art. 60. Available: http://www.lifescied.org/content/15/4/ar60.full [March 2017].

Brewer, C.A., and Smith, D. (2011). Vision and Change in Undergraduate Biology Education: A Call to Action. Final Report of a National Conference organized by the American Association for the Advancement of Science. Washington, DC: American Association for the Advancement of Science. Available: http://visionandchange.org/files/2011/03/Revised-Vision-and-Change-Final-Report.pdf [May 2017].

Brizius, J.A., and Campbell, M.D. (1991). Getting Results: A Guide for Government Accountability. Washington, DC: Council of Governors’ Policy Advisors.

Butz, W.P., Bloom, G.A., Gross, M.E., Kelly, T.K., Kofner, A., and Rippen, H.E. (2003). Is There a Shortage of Scientists and Engineers? How Would We Know? RAND Science and Technology Issue Paper. Available: https://www.rand.org/content/dam/rand/pubs/issue_papers/2005/IP241.pdf [August 2017].

Carnevale, A.P., Smith, N., and Melton, M. (2011). STEM: Science, Technology, Engineering, and Mathematics. Washington, DC: Georgetown University Center on Education and the Workforce. Available: https://cew.georgetown.edu/wp-content/uploads/2014/11/stem-complete.pdf [July 2017].

Dadgar, M., and Weiss, M.J. (2012). Labor Market Returns to Sub-Baccalaureate Credentials: How Much Does a Community College Degree or Certificate Pay? CCRC Working Paper No. 45. New York: Columbia University, Teachers College, Community College Research Center.

Ginder, S., and Stearns, C. (2014). Web Tables: Enrollment in Distance Education Courses by State: Fall 2012. NCES 2014-023. Washington, DC: U.S. Department of Education, National Center for Education Statistics. Available: https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2014023 [August 2017].

Hill, H., and Grossman, P. (2013). Learning from teacher evaluation: Challenges and opportunities. Harvard Educational Review, 82(1), 123–141.

Ho, A.D., Reich, J., Nesterko, S., Seaton, D.T., Mullaney, T., Waldo, J., and Chuang, I. (2014). HarvardX and MITx: The First Year of Open Online Courses. HarvardX and MITx Working Paper No. 1. Available: https://harvardx.harvard.edu/multiple-course-report [August 2017].

Holzer, H.J., and Lerman, R.I. (2007). America’s Forgotten Middle Skill Jobs: Education and Training Requirements in the Next Decade and Beyond. Available: http://www.urban.org/sites/default/files/publication/31566/411633-America-s-Forgotten-Middle-Skill-Jobs.PDF [March 2017].

Institute of Medicine. (2001). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press.

Kolowich, S., and Newman, J. (2013). The professors behind the MOOC hype. The Chronicle of Higher Education, March 18. Available: http://www.chronicle.com/article/The-Professors-Behind-the-MOOC/137905 [August 2017].

Lowell, B.L., and Salzman, H. (2007). Into the Eye of the Storm: Assessing the Evidence on Science and Engineering Education, Quality, and Workforce Demand. Washington, DC: The Urban Institute. Available: http://www.urban.org/sites/default/files/publication/46796/411562-Into-the-Eye-of-the-Storm.PDF [August 2017].

Lumina Foundation. (2015). The Degree Qualifications Profile: A Learning-Centered Framework for What College Graduates Should Know and Be Able to Do to Earn the Associate’s, Bachelor’s or Master’s Degree. Indianapolis, IN: Lumina Foundation. Available: https://www.luminafoundation.org/files/resources/dqp.pdf [November 2015].

Maeroff, G.I. (2003). A Classroom of One: How Online Learning Is Changing Our Schools and Colleges. New York: Palgrave Macmillan.

Mankiw, N.G. (2003). Principles of Microeconomics (third ed.). Boston, MA: South-Western College.

Matchett, K., Dahlberg, M., and Rudin, T. (2016). Quality in the Undergraduate Experience: What Is It? How Is It Measured? Who Decides? Summary of a Workshop. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/23514/quality-in-the-undergraduate-experience-what-is-it-how-is [July 2016].

Matsudaira, J. (2015). Defining and Measuring Quality in Higher Education. Paper commissioned for the Board on Higher Education and the Workforce Meeting on Quality in Higher Education, December 14-15. Available: http://sites.nationalacademies.org/cs/groups/pgasite/documents/webpage/pga_170937.pdf [July 2017].

National Academies of Sciences, Engineering, and Medicine. (2016a). Barriers and Opportunities for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse Pathways. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/21739/barriers-and-opportunities-for-2-year-and-4-year-stem-degrees [March 2016].

National Academies of Sciences, Engineering, and Medicine (2016b). Science Literacy: Concepts, Contexts, and Consequences. Washington, DC: The National Academies Press. Available: https://www.nap.edu/catalog/23595/science-literacy-concepts-contexts-and-consequences [August 2017].

National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. (2007). Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Future. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/11463/rising-above-the-gathering-storm-energizing-and-employing-america-for [March 2016].

National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. (2010). Rising Above the Gathering Storm, Revisited: Rapidly Approaching Category 5. Washington, DC: The National Academies Press. Available: https://www.nap.edu/catalog/11463/rising-above-the-gathering-storm-energizing-and-employing-america-for [August 2017].

National Research Council. (2011). Expanding Underrepresented Minority Participation: America’s Science and Technology Talent at the Crossroads. Washington, DC: The National Academies Press. Available: https://www.nap.edu/catalog/12984/expanding-underrepresented-minority-participation-americas-science-and-technology-talent-at [August 2017].

National Research Council. (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/13362/discipline-based-education-research-understanding-and-improving-learning-in-undergraduate [March 2016].

National Research Council. (2014). Capturing Change in Science, Technology, and Innovation: Improving Indicators to Inform Policy. Washington, DC: The National Academies Press. Available: https://www.nap.edu/catalog/18606/capturing-change-in-science-technology-and-innovation-improving-indicators-to [August 2017].

National Science and Technology Council. (2013). Federal STEM Education 5-Year Strategic Plan. Available: https://www.whitehouse.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf [March 2016].

National Science Foundation. (2014). Science and Engineering Indicators 2014. Arlington, VA: Author. Available: https://www.nsf.gov/statistics/seind14 [February 2018].

National Science Foundation. (2015). Revisiting the STEM Workforce: A Companion to Science and Engineering Indicators 2014. Arlington, VA: Author. Available: http://www.nsf.gov/pubs/2015/nsb201510/nsb201510.pdf [March 2016].

National Science Foundation. (2016). Science and Engineering Indicators 2016. Arlington, VA: Author. Available: https://www.nsf.gov/statistics/2016/nsb20161/# [July 2017].

National Science Foundation, National Center for Science and Engineering Statistics. (2017). Women, Minorities, and Persons with Disabilities in Science and Engineering: 2017. Special Report NSF 17-310. Arlington, VA: Author. Available: https://www.nsf.gov/statistics/2017/nsf17310 [August 2017].

Oakes, J. (1986). Educational Indicators: A Guide for Policymakers. Santa Monica, CA: Center for Policy Research in Education.

Odden, A. (1990). Educational indicators in the United States: The need for analysis. Educational Researcher, 19(5), 24–29.

Parsad, B., and Lewis, L. (2008). Distance Education at Degree-Granting Postsecondary Institutions: 2006–07 (NCES 2009–044). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC. Available: https://nces.ed.gov/pubs2009/2009044.pdf [August 2017].

Planty, M., and Carlson, D. (2010). Understanding Education Indicators: A Practical Primer for Research and Policy. New York: Teachers College Press.

President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Available: https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_feb.pdf [March 2016].

Radford, A.W. (2011). Stats in Brief: Learning at a Distance: Undergraduate Enrollment in Distance Education Courses and Degree Programs. (NCES 2012-154). Washington, DC: National Center for Education Statistics, U.S. Department of Education. Available: https://nces.ed.gov/pubs2012/2012154.pdf [August 2017].

Romer, P.M. (1990). Endogenous technological change. Journal of Political Economy, 98(5, Part 2), S71–S102.

Rothwell, J. (2013). The Hidden STEM Economy. Metropolitan Policy Program at Brookings Institution. Available: http://www.brookings.edu/~/media/research/files/reports/2013/06/10-stem-economy-rothwell/thehiddenstemeconomy610.pdf [July 2017].

Shavelson, R.J., McDonnell, L.M., and Oakes, J. (1989). Indicators for Monitoring Mathematics and Science Education. Santa Monica, CA: RAND. Available: http://www.rand.org/pubs/reports/R3742.html [July 2017].

Snyder, T.D., de Brey, C., and Dillow, S.A. (2016). Digest of Education Statistics 2014. (NCES 2016-006). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Available: https://nces.ed.gov/pubs2016/2016006.pdf [August 2017].

Solow, R.M. (1957). Technical change and the aggregate production function. The Review of Economics and Statistics, 39, 312–320.

Stevens, A.H., Kurlaender, M., and Grosz, M. (2015). Career Technical Education and Labor Market Outcomes: Evidence from California Community Colleges. (NBER Working Paper No. 21137). Cambridge, MA: National Bureau of Economic Research. Available: http://www.nber.org/papers/w21137 [August 2017].

Weisberg, D., Sexton, S., Mulhern, J., and Keeling, D. (2009). The Widget Effect: Our National Failure to Acknowledge and Act on Differences in Teacher Effectiveness. Brooklyn, NY: The New Teacher Project.

Wilson, S.M., and Anagnostopoulos, D. (in press). The seen and the foreseen: Will unintended consequences thwart efforts to (re)build trust in teacher preparation? Journal of Teacher Education.

Xie, Y., and Killewald, A.A. (2012). Is American Science in Decline? Cambridge, MA: Harvard University Press.

Next: 2 Conceptual Framework for the Indicator System »
Indicators for Monitoring Undergraduate STEM Education

Science, technology, engineering, and mathematics (STEM) professionals generate a stream of scientific discoveries and technological innovations that fuel job creation and national economic growth. Ensuring a robust supply of these professionals is critical for sustaining growth and creating jobs at a time of intense global competition. Undergraduate STEM education prepares the STEM professionals of today and those of tomorrow, while also helping all students develop knowledge and skills they can draw on in a variety of occupations and as individual citizens. However, many capable students intending to major in STEM later switch to another field or drop out of higher education altogether, partly because of documented weaknesses in STEM teaching, learning, and student supports. Improving undergraduate STEM education to address these weaknesses is a national imperative.

Many initiatives are now under way to improve the quality of undergraduate STEM teaching and learning. Some focus on the national level, others involve multi-institution collaborations, and others take place on individual campuses. At present, however, policy makers and the public do not know whether these various initiatives are accomplishing their goals and leading to nationwide improvement in undergraduate STEM education.

Indicators for Monitoring Undergraduate STEM Education outlines a framework and a set of indicators that document the status and quality of undergraduate STEM education at the national level over multiple years. It also identifies areas where additional research is needed to develop appropriate measures. This publication will be valuable to government agencies that invest in higher education, institutions of higher education, private funders of higher education programs, and industry stakeholders. It will also be of interest to researchers who study higher education.
