Indicators for Monitoring Undergraduate STEM Education (2018)


7

Implementing the Indicator System

As detailed in Chapter 6, nationally representative data are not currently available from public or proprietary sources for most of the committee’s proposed indicators. This limits policy makers’ ability to track progress toward the committee’s goals of (1) increasing students’ mastery of STEM concepts and skills; (2) striving for equity, diversity, and inclusion; and (3) ensuring adequate numbers of STEM professionals. That chapter outlines steps toward developing each of the 21 indicators by revising various public and proprietary data sources to provide the data needed for each one.

This chapter aims to reduce the complexity of implementing the indicator system by presenting three options for obtaining the data required for all of the indicators: (a) creating a national student unit record data system; (b) expanding National Center for Education Statistics (NCES) data collections; and (c) combining existing data from nonfederal sources. It also discusses new data collection and analysis systems that could potentially be used in the future to support the proposed indicator system. The chapter ends with the committee’s conclusion about moving forward, including a caution about the intended use of the proposed indicator system.

OPTION 1: CREATE A NATIONAL STUDENT UNIT RECORD DATA SYSTEM

A national student unit record data system incorporating administrative data on individual students would provide reliable and usable data for many of the proposed indicators focused on students’ progress through STEM programs. Relative to the current Integrated Postsecondary Education Data System (IPEDS), such a system would incorporate more accurate and complete data. As noted in Chapter 6, IPEDS data do not include students who transfer, those who attend multiple institutions, and those who enroll part time; a student unit record data system would include all of these groups.1 In addition, because IPEDS consists of aggregated, institution-level data on cohorts of students, it cannot be used to track the academic progress of individual students, or of different groups of students, over time. In contrast, a student unit record data system is well suited for such longitudinal monitoring. Data from such a system could be aggregated to monitor the progress and outcomes of intended STEM majors among students of different genders, racial and ethnic groups, disability statuses, and Pell grant eligibility (socioeconomic) statuses. A student unit record data system would allow longitudinal analyses of trends over time for 8 of the 21 proposed indicators (an illustrative sketch of such an analysis follows the list):

  • Indicator 2.1.2: Entrance to and persistence in STEM academic programs
  • Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
  • Indicator 2.2.2: Diversity of students who transfer from 2-year to 4-year STEM programs in comparison with diversity of students in 2-year STEM programs
  • Indicator 2.2.3: Time to degree for students in STEM academic programs
  • Indicator 3.1.1: Completion of foundational courses, including developmental education courses, to ensure STEM program readiness
  • Indicator 3.2.1: Retention in STEM programs, course to course and year to year
  • Indicator 3.2.2: Transfers from 2-year to 4-year STEM programs in comparison with transfers to all 4-year programs
  • Indicator 3.3.1: Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics
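
To make the contrast with aggregated IPEDS reporting concrete, the following is a minimal sketch of how indicator-style persistence rates might be computed from student-level records. It is illustrative only: the field names, demographic categories, and tiny dataset are assumptions, not the schema of any existing or proposed federal system.

```python
# Illustrative sketch only: aggregates hypothetical student unit records into
# year-to-year STEM persistence rates by demographic group. Field names and
# values are assumptions, not the design of any federal data system.
import pandas as pd

records = pd.DataFrame(
    [
        ("s1", "F", "Hispanic", True,  True,  True),
        ("s2", "M", "White",    False, True,  False),
        ("s3", "F", "Black",    True,  True,  True),
        ("s4", "M", "Asian",    False, False, False),
    ],
    columns=["student_id", "gender", "race_ethnicity",
             "pell_eligible", "stem_year1", "stem_year2"],
)

# Restrict to students who began in a STEM program in year 1.
stem_entrants = records[records["stem_year1"]]

# Persistence rate = share of year-1 STEM entrants still in STEM in year 2,
# disaggregated by whatever demographic grouping is of interest.
persistence_by_group = (
    stem_entrants.groupby(["gender", "pell_eligible"])["stem_year2"]
    .mean()
    .rename("year1_to_year2_persistence")
)
print(persistence_by_group)
```

Because the records are student level, the same data can be re-aggregated by any combination of characteristics, which is precisely what aggregated IPEDS submissions cannot support.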

In this option, the federal government would require institutions to collect and provide to the national system standardized unit record data on student educational experiences and activities. Currently, the creation of such a system is prohibited by the 2008 amendments to the Higher Education Act (see Chapter 6).

___________________

1 The National Center for Education Statistics has added new survey components to begin capturing information on part-time and transfer students: see Chapter 6.


At this time, however, there are bipartisan bills in Congress (H.R. 2434 and S. 1121, the College Transparency Act) that would amend the Higher Education Act to repeal the current ban on a national student unit record data system and direct NCES to create such a system. If such a bill becomes law, NCES could, in creating the new system, draw on the lessons learned from the many state higher education systems and multi-institution education reform consortia that have successfully collected and used student unit record data to monitor undergraduate education.

Creating a national database of student unit records appears to be both technically and financially feasible and could reduce institutions’ current burden of reporting IPEDS data, as shown in two feasibility studies.

In 2005, NCES commissioned the first study to examine the feasibility of creating a student unit record data system that could replace the aggregated institution-level data included in IPEDS. The study (Cunningham and Milam, 2005) presented three findings. First, the authors found that NCES at the time had most of the computing hardware and software necessary to implement such a system, including equipment for web-based data collection and servers for storing large amounts of student data. However, to ensure the security and confidentiality of the data, NCES would have had to create a new, permanent database storage system protected by physical and software firewalls; the authors did not estimate the cost of these modifications.

Second, the authors found that implementing the new system at that time would present colleges and universities with technical challenges, requiring expenditures for new technology, training in the use of the new reporting system, and personnel. Cunningham and Milam (2005) gathered estimates of implementation costs from hundreds of people at individual institutions, state higher education agencies, and higher education associations. The cost estimates varied widely, depending on whether an institution was already participating in a state-level student unit record data system (see Chapter 6) and on its information technology and institutional research capabilities. Another key factor was whether an institution was already uploading student data to the National Student Loan Data System (NSLDS; see Chapter 6); at the time, nearly all institutions were doing so. Given these complex factors influencing costs, Cunningham and Milam (2005) did not estimate an average per-institution cost, but they noted the possibility of providing federal support to defray these costs.

Third, the authors found that institutional costs would eventually decline, partly because some IPEDS reporting would be eliminated.

Cunningham and Milam (2005) concluded that it was technically feasible for most institutions to report student data to a national student unit record data system, given time for transition. They did not address, however, whether this new reporting would be financially feasible for participating colleges and universities.

More recently, Miller (2016) analyzed various approaches to developing a national student unit record data system to be overseen by the Department of Education. Like Cunningham and Milam (2005), the author noted that nearly all institutions were already reporting data to NSLDS, which included much of the data needed to track the progress of students receiving financial aid over time (e.g., enrollments, transfers, field of study, completions). Because, on average, 70 percent of all students receive financial aid (Executive Office of the President of the United States, 2017), NSLDS already includes much of the data needed for a national system. Miller thus proposed that a national data system could best be created by building on the existing capability of NSLDS.

Expanding NSLDS to include data on all students appears technically feasible, given the system’s recent history of adding 17 percent more student records between February 2010 and July 2013. Miller (2016) estimated that the programming changes needed to accommodate such growth would cost around $1 million. Miller cautioned, however, that NSLDS already has significant technical limitations and a history of poor processing speeds. Adding millions of additional records on students who do not receive financial aid could slow the system’s ability to perform its core function of ensuring that students receive financial aid and repay their loans. To address this problem, Miller (2016) proposed a complete modernization of NSLDS, which would require additional funding; he did not provide a cost estimate.

In this proposal, NCES would handle access to the student unit record data system by policy makers, researchers, and the public (Miller, 2016). At least once a year, a data extract would be transmitted to NCES, which would be responsible for generating public reports on higher education and populating IPEDS with data no longer being reported to it. NCES would also establish and implement protocols for allowing access to the database while maintaining the privacy and confidentiality of individual student records.

Miller (2016) argues that moving to the system he proposes would not burden most institutions with massive, costly changes in their reporting, for two reasons. First, NSLDS requires institutions to report data on only those students who receive federal financial aid. Yet many institutions submit data on all of their students to the National Student Clearinghouse (NSC), which in turn reports on financial aid recipients to NSLDS on behalf of these institutions. For this large group of public and private institutions (more than 3,600 according to NSC; see Chapter 6), moving to a student unit record system would simply mean passing along data they are already assembling. Second, the student unit record data system would replace seven components of IPEDS entirely and most of an eighth component. Using NCES estimates of the average number of hours required to complete each component in 2015–2016, Miller (2016) estimated that moving to a student unit record system would reduce reporting time by about 60 percent, saving around 88.6 hours per institution per year.

The current lack of a student unit record data system makes it difficult to develop a national picture of the nation’s postsecondary students and institutions and to monitor trends over time; this difficulty extends to undergraduate STEM education.

However, even if the prohibition against such a system is removed, a national database of student unit records would not provide information on instructors,2 who play a key role in engaging students in evidence-based educational experiences inside and outside the classroom, nor on programs and perceptions related to equity, diversity, and inclusion. Therefore, student and instructor surveys about educational experiences and activities would still be necessary to support the proposed indicator system. Table 7-1 shows how each indicator would be supported under this option.

___________________

2 As noted above, the committee uses the term “instructor” to refer to all individuals who teach undergraduates, including tenured and tenure-track faculty, part-time and adjunct instructors, and graduate student instructors.

TABLE 7-1 Data for Indicators in Option 1

Objective | Indicator | Proposed Data Source
1.1 Use of evidence-based STEM educational practices both in and outside of classrooms | 1.1.1 Use of evidence-based STEM educational practices in course development and delivery | Renewed and expanded NSOPF
1.1.2 Use of evidence-based STEM educational practices outside the classroom | Renewed and expanded NSOPF
1.2 Existence and use of supports that help STEM instructors use evidence-based learning experiences | 1.2.1 Extent of instructors’ involvement in professional development | Renewed and expanded NSOPF
1.2.2 Availability of support or incentives for evidence-based course development or course redesign | Renewed and expanded NSOPF
1.3 An institutional climate that values undergraduate STEM instruction | 1.3.1 Use of valid measures of teaching effectiveness | Renewed and expanded NSOPF
1.3.2 Consideration of evidence-based teaching in personnel decisions by departments and institutions | Renewed and expanded NSOPF
1.4 Continuous improvement in STEM teaching and learning | No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.
2.1 Equity of access to high-quality undergraduate STEM educational programs and experiences | 2.1.1 Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students | Extended and expanded BPS
2.1.2 Entrance to and persistence in STEM educational programs | Unit record data system
2.1.3 Equitable student participation in evidence-based STEM educational practices | Extended and expanded BPS
2.2 Representational equity among STEM credential earners | 2.2.1 Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields | Unit record data system
2.2.2 Diversity of students who transfer from 2-year to 4-year STEM programs in comparison with diversity in 2-year STEM programs | Unit record data system
2.2.3 Time to degree for students in STEM academic programs | Unit record data system
2.3 Representational diversity among STEM instructors | 2.3.1 Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders | Revised IPEDS Human Resources Survey
2.3.2 Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students | Revised IPEDS Human Resources Survey
2.4 Inclusive environments in institutions and STEM departments | 2.4.1 Students pursuing STEM credentials feel included and supported in their academic programs and departments | Extended and expanded BPS
2.4.2 Instructors teaching courses in STEM disciplines feel supported and included in their departments | Renewed and expanded NSOPF
2.4.3 Institutional practices that are culturally responsive, inclusive, and consistent across the institution | Renewed and expanded NSOPF
3.1 Foundational preparation for STEM for all students | 3.1.1 Completion of foundational courses, including developmental education courses, to ensure STEM program readiness | Unit record data system
3.2 Successful navigation into and through STEM programs of study | 3.2.1 Retention in STEM programs, course to course and year to year | Unit record data system
3.2.2 Transfers from 2-year to 4-year STEM programs in comparison with transfers to all 4-year programs | Unit record data system
3.3 STEM credential attainment | 3.3.1 Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics | Unit record data system

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Postsecondary Education Data System; NSOPF, National Study of Postsecondary Faculty.


OPTION 2: EXPAND NCES DATA COLLECTIONS

A currently legal and more immediately feasible approach would rely on IPEDS and other NCES surveys to support the proposed indicators. As discussed in Chapter 6, nearly all public and private, for-profit and nonprofit institutions provide annual data to IPEDS, reporting on the vast majority of students currently enrolled in STEM courses and majors. Using this option to inform the proposed indicators would tap the capabilities of an established and reliable data collection mechanism, although it would provide less robust data than Option 1. The usefulness of IPEDS data for the indicators is limited by the statistical and analytic challenges of working with aggregated, institution-level data as the unit of analysis.

Currently, IPEDS data partly support only 2 of the 21 proposed indicators:

  • Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
  • Indicator 3.3.1: Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics

The IPEDS data do not fully support these two indicators because they do not permit the disaggregation of STEM credentials by disability status and Pell status (a proxy for socioeconomic status).3 Fortunately, states and voluntary multi-institution data initiatives have developed and implemented an expanded range of measures that could fill some of the gaps in IPEDS (see below).

Under this option, NCES (with support from the National Science Foundation [NSF]) would expand IPEDS institutional surveys to include institution-level measures of student progress toward degrees and certificates. In addition, NCES would extend existing student surveys and revive a major faculty survey. Table 7-2 shows how each indicator would be supported under this option.

In comparison with the first option, this option would place a greater burden on institutions of higher education. In Option 1, institutional research staff would only have to upload student unit record data to the national student unit record system. In this option, institutional research staff would be required to collect additional data (beyond their current collections) and use them to calculate additional measures for reporting to IPEDS.

___________________

3 Beginning in 2017–2018, NCES will gather information on students’ Pell grant status as part of the new outcome measures survey within IPEDS.


TABLE 7-2 Data for Indicators in Option 2

Objective | Indicator | Proposed Data Source
1.1 Use of evidence-based STEM educational practices both in and outside of classrooms | 1.1.1 Use of evidence-based STEM educational practices in course development and delivery | Renewed and expanded NSOPF
1.1.2 Use of evidence-based STEM educational practices outside the classroom | Renewed and expanded NSOPF
1.2 Existence and use of supports that help STEM instructors use evidence-based learning experiences | 1.2.1 Extent of instructors’ involvement in professional development | Renewed and expanded NSOPF
1.2.2 Availability of support or incentives for evidence-based course development or course redesign | Renewed and expanded NSOPF
1.3 An institutional climate that values undergraduate STEM instruction | 1.3.1 Use of valid measures of teaching effectiveness | Renewed and expanded NSOPF
1.3.2 Consideration of evidence-based teaching in personnel decisions by departments and institutions | Renewed and expanded NSOPF
1.4 Continuous improvement in STEM teaching and learning | No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.
2.1 Equity of access to high-quality undergraduate STEM educational programs and experiences | 2.1.1 Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students | Extended and expanded BPS
2.1.2 Entrance to and persistence in STEM educational programs | Extended and expanded BPS
2.1.3 Equitable student participation in evidence-based STEM educational practices | Extended and expanded BPS
2.2 Representational equity among STEM credential earners | 2.2.1 Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields | Revised and expanded IPEDS
2.2.2 Diversity of students who transfer from 2-year to 4-year STEM programs in comparison with diversity in 2-year STEM programs | Revised and expanded IPEDS
2.2.3 Time to degree for students in STEM academic programs | Revised and expanded IPEDS
2.3 Representational diversity among STEM instructors | 2.3.1 Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders | Revised IPEDS Human Resources Survey
2.3.2 Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students | Revised IPEDS Human Resources Survey
2.4 Inclusive environments in institutions and STEM departments | 2.4.1 Students pursuing STEM credentials feel included and supported in their academic programs and departments | Extended and expanded BPS
2.4.2 Instructors teaching courses in STEM disciplines feel supported and included in their departments | Renewed and expanded NSOPF
2.4.3 Institutional practices that are culturally responsive, inclusive, and consistent across the institution | Renewed and expanded NSOPF
3.1 Foundational preparation for STEM for all students | 3.1.1 Completion of foundational courses, including developmental education courses, to ensure STEM program readiness | Revised and expanded IPEDS
3.2 Successful navigation into and through STEM programs of study | 3.2.1 Retention in STEM programs, course to course and year to year | Revised and expanded IPEDS
3.2.2 Transfers from 2-year to 4-year STEM programs in comparison with transfers to all 4-year programs | Revised and expanded IPEDS
3.3 STEM credential attainment | 3.3.1 Number of students who attain STEM credentials over time (disaggregated by institution type, transfer status, and students’ demographic characteristics) | Revised and expanded IPEDS

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Postsecondary Education Data System; NSOPF, National Study of Postsecondary Faculty.

Overall, this option would require NCES to change its data collection in three ways: expanding IPEDS, expanding the Beginning Postsecondary Students Longitudinal Study (BPS), and renewing and expanding the National Study of Postsecondary Faculty.

Expanding IPEDS

Under this option, IPEDS surveys would be expanded to require institutions to report on several measures that have been developed and tested in voluntary data collections by states and higher education reform consortia: see Box 7-1. These new measures, like the committee’s proposed indicators, are designed to represent important dimensions of undergraduate education in a readily understandable form.

Specifically, NCES would expand the IPEDS surveys to include the following measures, as defined by Janice and Voight (2016, p. iv); an illustrative sketch of how the first measure might be computed from course records follows the list:

  • Program of study selection: The percentage of students in a cohort who demonstrate a program of study selection by taking nine credits (or three courses) in a meta-major in the first year. Meta-majors include: education; arts and humanities; social and behavioral sciences and human services; science, technology, engineering, and math (STEM); business and communication; health; and trades.
  • Enrollment: A 12-month headcount that includes all undergraduate students who enroll at any point during the calendar year, disaggregated by program of study selection.
  • Gateway course completion: The percentage of students completing college-level, introductory mathematics and English courses tracked separately in their first year, disaggregated by program of study selection.
  • Retention rate: The percentage of students in a cohort who are either enrolled at their initial institution or transfer to a longer duration program at the initial or a subsequent institution, calculated annually for up to 200 percent of program length, disaggregated by program of study selection.
  • Transfer rate: The percentage of students in a cohort who transfer into longer programs at the initial or subsequent institution(s), for up to 200 percent of program length, disaggregated by program of study selection.
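
To make the first of these measures concrete, here is a minimal sketch of how an institution might derive “program of study selection” from its own course records. The course data, the mapping of courses to meta-majors, and the field names are assumptions for illustration; only the thresholds (nine credits or three courses in a meta-major in the first year) come from the definition above.

```python
# Illustrative sketch: assigns each student a first-year "program of study
# selection" meta-major if they earned >= 9 credits or completed >= 3 courses
# in that meta-major. Course records and meta-major mappings are hypothetical.
import pandas as pd

first_year_courses = pd.DataFrame(
    [
        ("s1", "MATH101", "STEM", 4),
        ("s1", "CHEM101", "STEM", 4),
        ("s1", "PHYS101", "STEM", 4),
        ("s2", "HIST101", "Arts and Humanities", 3),
        ("s2", "PSYC101", "Social and Behavioral Sciences", 3),
    ],
    columns=["student_id", "course_id", "meta_major", "credits"],
)

totals = (
    first_year_courses
    .groupby(["student_id", "meta_major"])
    .agg(credits=("credits", "sum"), courses=("course_id", "count"))
    .reset_index()
)

# A student "selects" a meta-major if either threshold is met in year 1.
selected = totals[(totals["credits"] >= 9) | (totals["courses"] >= 3)]
print(selected[["student_id", "meta_major"]])  # s1 -> STEM; s2 -> no selection
```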

Two additional changes to IPEDS data collection would be necessary to obtain the data needed for the proposed indicators. First, all of the measures of student progress and credentials outlined above would have to be disaggregated by Pell grant status and disability status. As noted above, institutions currently provide IPEDS data on completions disaggregated by gender and by race and ethnicity, so this change would require institutions to make additional calculations using their administrative records. Second, the data collected from institutions through the IPEDS Human Resources Survey would have to be expanded to include the department in which faculty and graduate students teach. The Human Resources Survey, administered each spring, asks institutions to report data on employee status (full or part time, faculty status); full-time instructional staff (academic rank, gender, and contract length or teaching period); and total salary outlay and number of months covered, by academic rank and gender. However, the survey does not ask for disciplinary department, which is needed to identify STEM faculty and staff.

The proposed additional measures would capture and follow a wider range of students intending to major in STEM than are currently included in IPEDS, including first-time full-time, transfer full-time, first-time part-time, and transfer part-time students. These additional measures would not only support the proposed indicator system, but would also improve the capacity of IPEDS, allowing policy makers and higher education leaders to track students’ progress and completion generally, across all fields or majors.

Different combinations of the proposed additional student measures have been used in voluntary data-sharing programs among institutions participating in higher education reform consortia, such as Achieving the Dream, Completion by Design, and Complete College America, and by some state higher education data systems (see Chapter 6). Of particular importance to the proposed indicator system is the “program of study selection” measure used by Achieving the Dream. As noted above, this measure assigns students into one of seven meta-majors, one of which is STEM, based on their course-taking patterns in their first year of higher education.


This measure permits STEM students to be identified early in their college careers, even before they officially declare their intended majors.

If it is not possible to add the program-of-study-selection measure to IPEDS, NCES could instead require institutions to use the data on program of study selection that they already report to NSLDS to disaggregate enrollment, gateway course completion, retention rate, and transfer rate data by Classification of Instructional Programs (CIP) codes. Since 2014, postsecondary institutions have been required to report to NSLDS student unit record data on program of study selection for those programs with enrolled students who receive Title IV aid.4 These reports do not capture students’ intentions as early as the measure of program of study selection outlined above, and they miss the 30 percent of students who do not receive or apply for some type of federal aid (Miller, 2016). Nonetheless, this approach would allow institutions to report more detailed information to IPEDS on the programs in which their students are enrolled.
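
If institutions relied on CIP codes in this way, identifying STEM programs would reduce to a lookup against a designated list of STEM CIP families. The sketch below is illustrative only: the two-digit families shown are a commonly cited shorthand for STEM, not the official list that NCES, NSF, or any other agency would necessarily adopt.

```python
# Illustrative sketch: flags a program as STEM by its two-digit CIP family.
# The families listed are illustrative assumptions, not an official STEM list.
ASSUMED_STEM_CIP_FAMILIES = {
    "11",  # Computer and Information Sciences
    "14",  # Engineering
    "26",  # Biological and Biomedical Sciences
    "27",  # Mathematics and Statistics
    "40",  # Physical Sciences
}

def is_stem_program(cip_code: str) -> bool:
    """Return True if the CIP code falls in an assumed STEM family."""
    family = cip_code.split(".")[0].zfill(2)
    return family in ASSUMED_STEM_CIP_FAMILIES

print(is_stem_program("27.0101"))  # Mathematics, General -> True
print(is_stem_program("52.0201"))  # Business Administration -> False
```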

Expanding the Beginning Postsecondary Students Longitudinal Study

The second expansion necessary to support the proposed indicator system under this option would be to extend and expand NCES’s Beginning Postsecondary Students Longitudinal Study (BPS) to include additional and larger cohorts. To date, BPS has followed cohorts of students who entered postsecondary education in 1990, 1996, 2004, and 2012. Students in the BPS complete three surveys: one at the end of their first academic year, and two others 3 years and 6 years after they entered postsecondary education. The survey includes a section on student characteristics, with items on students’ educational experiences relevant to the proposed indicators, and this section could readily accommodate a STEM-specific branch of questions. Specifically, to support the proposed indicators in this option, NCES would need to: continue to add new BPS cohorts every 6–8 years; expand the sample in each cohort to a size sufficient to allow statistical analyses by gender, race and ethnicity, disability status, and Pell grant (socioeconomic) status; and add STEM-specific questions to the survey.
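
One rough way to see why larger cohorts would be needed is the standard sample-size calculation for estimating a proportion within a subgroup. The sketch below is illustrative only; the margin of error, confidence level, and subgroup share are assumptions, not design targets set by the committee or NCES, and a real survey design would also account for clustering, nonresponse, and weighting.

```python
# Illustrative sketch: approximate subgroup sample size needed to estimate a
# proportion (e.g., a persistence rate) within a given margin of error.
# All numeric targets are assumptions chosen for illustration.
import math

def required_subgroup_n(margin_of_error: float, p: float = 0.5, z: float = 1.96) -> int:
    """Large-sample formula n = z^2 * p * (1 - p) / e^2."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

subgroup_n = required_subgroup_n(margin_of_error=0.05)  # about 385 respondents
assumed_subgroup_share = 0.05                           # subgroup is 5% of the cohort
total_cohort = math.ceil(subgroup_n / assumed_subgroup_share)
print(subgroup_n, total_cohort)                         # 385 and 7700
```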

___________________

4 Title IV includes students with loans under the Federal Family Education Loan Program or William D. Ford Federal Direct Loan (Direct Loan) Program, as well as students who have Federal Perkins Loans, students who have received Federal Pell Grants, Teacher Education Assistance for College and Higher Education Grants, Academic Competitiveness Grants or Science and Math Access to Retain Talent Grants, and students on whose behalf parents borrowed Parent PLUS loans.


Renewing and Expanding the National Study of Postsecondary Faculty

The third expansion to current NCES data collection under this option would be to renew and expand the National Study of Postsecondary Faculty (NSOPF). The NSOPF was administered in 1992, 1998, and 2003 to 3,000 department chairs and 11,000 instructors at a wide range of postsecondary institutions. This survey, which asked faculty about their employment and activities, would have to be reinstated and modified in three ways to support the proposed indicator system. First, it would be administered to a sufficiently large sample of faculty to allow meaningful disaggregation along important dimensions of diversity, including race and ethnicity, gender, part- or full-time status, and tenure-track or non-tenure-track status. Second, it would be expanded to include graduate teaching assistants, who provide a substantial amount of instruction for STEM students, especially during the first 2 years of higher education. Third, it would be modified to include questions on instructors’ use of evidence-based STEM educational practices, including course redesign, evidence-based teaching strategies, and involvement in professional development. Although the survey was administered to faculty from all disciplines, it was designed to measure teaching activities that are discipline specific. Hence, adding questions specifically about evidence-based STEM teaching strategies would be possible within the NSOPF framework.

Revising and expanding the NSOPF and IPEDS as proposed under this option would likely increase the data collection burden on institutions. For example, a group of 18 public and private 2-year and 4-year institutions joined in the Voluntary Institutional Metrics Project to develop meaningful measures of the quality and outcomes of higher education (HCM Strategists, 2013). The participating institutions shared their student unit record data, including all full-time, part-time, and transfer students, and analyzed the data to report on measures similar to those that would be added to IPEDS in this option.

The participating institutions reported that the burden of analyzing student records to report on the measures was substantial. Following 4-year students for 200 percent of the expected time to completion (8 years) from their initial enrollment sometimes required analyzing millions of records over an extended period during which records systems and processes sometimes changed. In addition, institutions lacked data on students who transferred to other institutions, especially when the other institution was located in another state. However, if this option were adopted and institutions had difficulty obtaining data on students who transfer to another institution, they might be able to obtain the data from NSC (see Chapter 6). Nevertheless, the additional burden on institutions of reporting data on these measures is undeniable. This challenge points to a benefit of the first option: if a national student unit record data system were created, the burden of calculating the proposed indicators related to STEM students’ progress and completion would fall on NCES rather than on institutions.

OPTION 3: COMBINE EXISTING DATA FROM NONFEDERAL SOURCES

The third option, which could be undertaken by federal agencies or other organizations (e.g., a higher education association), would take advantage of existing data sources and require little or no federal investment. This option has the potential to provide limited support for the committee’s proposed indicator system by combining data from states and voluntary multi-institution education reform consortia to create a national picture of undergraduate STEM education. As noted above, many 2-year and 4-year institutions voluntarily share unit record data on cohorts of students through state data warehouses and/or with one or more education reform consortia. They report on student progress using measures similar to those outlined earlier in this chapter (see Box 7-1). Institutions participating in these consortia and state data-sharing efforts also report IPEDS data to NCES and data on students receiving financial aid to NSLDS; these data are relevant to the proposed indicators.

In this option, the federal government or a private foundation would seek to tap these existing data resources by commissioning research on the feasibility of creating a nationally representative sample of postsecondary institutions that are already reporting the measures outlined in Option 2. Specifically, the Department of Education, NSF, or a private foundation would commission research to first identify institutions that are already submitting student unit record data to a consortium or to their state data systems. One starting point would be the map listing voluntary data collection efforts in each state and territory.5

The research would then examine whether a representative national sample of student unit record data could be derived from these institutions. The committee’s preliminary review of this information suggests that such data sharing is most frequent among public 4-year institutions, moderate among 2-year institutions, less frequent among private nonprofit institutions, and rare among private for-profit institutions. As noted in Chapter 6, state unit record data systems have limited or no coverage of private nonprofit institutions. However, statistical weighting methods could be used to correct for this gap and represent all institution types. With stratification and weighting, a small sample could provide nationally representative student unit record data.

___________________

5 See the Institute for Higher Education Policy website: http://www.ihep.org/postsecdata/mapping-data-landscape [August 2017].


Because the sample would be composed of institutions already engaged in voluntary data collection and sharing, these institutions might welcome the opportunity to share and analyze their data to support the proposed indicators. To encourage participation and reduce the burden of additional data collection, federal agencies or foundations might offer to reimburse institutions for the costs of the additional staff time required to share and analyze existing student unit record data and to gather additional data.
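
The stratification-and-weighting step might work along the following lines: institutions in the voluntary sample are grouped into strata (for example, by sector), and each sampled institution is weighted by its stratum’s share of the national population relative to that stratum’s share of the sample. The sketch below is illustrative only; the strata and counts are invented, and a real design would use finer strata and adjust for institution size and nonresponse.

```python
# Illustrative sketch: post-stratification weights so that a voluntary sample
# of institutions better represents the national population by sector.
# Population and sample counts below are invented for illustration.
population_counts = {"public_4yr": 700, "public_2yr": 900,
                     "private_nonprofit": 1600, "private_forprofit": 800}
sample_counts = {"public_4yr": 300, "public_2yr": 200,
                 "private_nonprofit": 80, "private_forprofit": 20}

total_population = sum(population_counts.values())
total_sample = sum(sample_counts.values())

# Weight for each sampled institution in a stratum:
# (stratum share of population) / (stratum share of sample)
weights = {
    stratum: (population_counts[stratum] / total_population)
             / (sample_counts[stratum] / total_sample)
    for stratum in population_counts
}
for stratum, weight in weights.items():
    print(f"{stratum}: weight {weight:.2f}")
```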

Assuming that a nationally representative sample could be assembled, one potential limitation is that Completion by Design (CBD) is the only reform consortium using the critical measure of program of study selection (Engle, 2016), which allows STEM students to be identified before they officially declare their majors. Because CBD includes only about 200 2-year institutions in Ohio, North Carolina, and Florida, few institutions in the proposed national sample are currently calculating this measure. To address this limitation and identify students in STEM programs, the Department of Education, NSF, or another entity (e.g., a higher education association) might require or ask institutions in the national sample either to analyze their administrative records to calculate the program of study selection measure or to rely on the CIP codes and level-of-credential data they already report to NSLDS. As noted above, federal or foundation funding for institutions might encourage compliance with these requirements or requests.

In this option, the survey data needed would be collected through a combination of existing proprietary surveys of students and faculty. As discussed in Chapter 6, several long-standing national surveys owned and administered by universities or nonprofit organizations could be used to collect data for the indicators. These include the National Survey of Student Engagement (NSSE), the Community College Survey of Student Engagement (CCSSE), and the Freshman Survey, College Senior Survey, and Faculty Survey of the Higher Education Research Institute.

In this option, the Department of Education, NSF, or other entity would contract with these survey providers to extend their surveys and administer them to nationally representative samples of institutions and students. First, the federal government or other entity would commission the survey organizations to develop short survey modules for both students and faculty, designed to elicit detailed information on evidence-based STEM educational practices and other elements of the committee’s proposed indicators. Fortunately, most of these surveys are designed to include shorter, customized modules of questions, increasing the feasibility of this approach. Second, the survey organizations would be commissioned to administer the extended surveys to a nationally representative sample of 2-year and 4-year institutions and of STEM students. Third, given the decline in survey response rates, the survey organizations would be provided with support for incentives or other mechanisms to boost response rates. Table 7-3 summarizes how the indicators could be supported in this option.

TABLE 7-3 Data for Indicators in Option 3

Objective | Indicator | Data Source
1.1 Use of evidence-based STEM educational practices both in and outside of classrooms | 1.1.1 Use of evidence-based STEM educational practices in course development and delivery | Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions
1.1.2 Use of evidence-based STEM educational practices outside the classroom | Same as above
1.2 Existence and use of supports that help STEM instructors use evidence-based learning experiences | 1.2.1 Extent of instructors’ involvement in professional development | Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions
1.2.2 Availability of support or incentives for evidence-based course development or course redesign | Same as above
1.3 An institutional climate that values undergraduate STEM instruction | 1.3.1 Use of valid measures of teaching effectiveness | Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions
1.3.2 Consideration of evidence-based teaching in personnel decisions by departments and institutions | Same as above
1.4 Continuous improvement in STEM teaching and learning | No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.
2.3 Representational diversity among STEM instructors | 2.3.1 Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders | Nationally representative sample of institutions drawn from appropriate voluntary reform initiatives
2.3.2 Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students | Same as above
2.4 Inclusive environments in institutions and STEM departments | 2.4.1 Students pursuing STEM credentials feel included and supported in their academic programs and departments | Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions
2.4.2 Instructors teaching courses in STEM disciplines feel supported and included in their departments | Same as above
2.4.3 Institutional practices that are culturally responsive, inclusive, and consistent across the institution | Same as above
3.1 Foundational preparation for STEM for all students | 3.1.1 Completion of foundational courses, including developmental education courses, to ensure STEM program readiness | Nationally representative sample of institutions drawn from appropriate voluntary reform initiatives
3.2 Successful navigation into and through STEM programs of study | 3.2.1 Retention in STEM programs, course to course and year to year | Nationally representative sample of institutions drawn from appropriate voluntary reform initiatives
3.2.2 Transfers from 2-year to 4-year STEM programs in comparison with transfers to all 4-year programs | Same as above
3.3 STEM credential attainment | 3.3.1 Number of students who attain STEM credentials over time (disaggregated by institution type, transfer status, and demographic characteristics) | Nationally representative sample of institutions drawn from appropriate voluntary reform initiatives

CONCLUSIONS

CONCLUSION 6 Three options would provide the data needed for the proposed indicator system:

  1. Create a national student unit record data system, supplemented with expanded surveys of students and instructors (Option 1).
  2. Expand current federal institutional surveys, supplemented with expanded surveys of students and instructors (Option 2).
  3. Develop a nationally representative sample of student unit record data, supplemented with student and instructor data from proprietary survey organizations (Option 3).

Option 1 would provide the most accurate, complete, and useful data to implement the proposed indicators of students’ progress through STEM education. As noted above, legislation has been introduced in Congress to repeal the current ban on a student unit record data system and direct NCES to create it. Although creating a national student unit record data system would require investment of federal resources, the system would provide valuable information to policy makers about the status and quality of undergraduate education generally, not only in STEM fields. Institutions would be required to share their student unit record data with the federal government, but they would not be required to gather any additional data or make any additional calculations beyond what they already provide to IPEDS. This option for implementing the indicator system would also require regular surveys of students and faculty for data not covered by a student unit record data system.

Option 2 would take advantage of the well-developed system of institutional surveys that NCES uses to obtain IPEDS data annually from the vast majority of 2-year and 4-year institutions. Under this option, NCES would add to these surveys some of the new measures of student progress, developed by higher education reform consortia, that cover part-time and transfer students. Some of the measures are closely related to the proposed indicators. Like the first option, this option would require an investment of federal resources, but it would draw on the strengths of the well-established system of institutional reporting for IPEDS. In comparison with Option 1, this option would increase institutions’ burden for IPEDS reporting, requiring them to calculate additional measures based on their internal student unit record data. The additional measures would provide much of the student data needed for the indicator system, but the system would also require data from regular surveys of students and faculty.

Option 3 could be carried out by the federal government or another entity (e.g., a higher education association). It would take advantage of the rapid growth of higher education data collection and analysis by state higher education systems and education reform consortia across the country and would require little or no federal investment. As noted above, some of these new measures of student progress are similar to the committee’s indicators. As in Options 1 and 2, additional data from surveys would be needed to support the indicators.

Research, Evaluation, and Updating of the Proposed Indicator System

Many of the indicators proposed by the committee represent new conceptions of key elements of undergraduate STEM education to be monitored over time. Some indicators require research as the first step toward developing clear definitions and identifying the best measurement methods, prior to beginning data collection and implementing the indicator system (refer to Table 6-2). Once the system has been implemented and the indicators are in use, the committee suggests that the federal government conduct or commission an evaluation study to ensure that the indicators measure what they are intended to measure.

In addition, ongoing research may identify new factors related to the quality of undergraduate STEM education beyond those included in the proposed objectives and indicators. For example, there is promising evidence that three psychological competencies are related to students’ persistence and academic success (National Academies of Sciences, Engineering, and Medicine, 2017): (1) a sense of belonging on campus; (2) utility value (recognizing the value and relevance of academic subjects to one’s own life); and (3) a growth mindset (the belief that one’s intelligence is not fixed but can grow). Given the current lack of common definitions and high-quality assessments of these and other competencies (e.g., interest), the committee did not propose any objectives or indicators related to them. In the future, however, as further research evidence emerges, it may be appropriate to add objectives and indicators of these psychological competencies.

Furthermore, given that the structure of undergraduate education continues to evolve in response to changing student demographics, funding sources, educational technologies, and the growth of new providers, it may be a challenge to ensure that the proposed STEM indicators remain relevant and informative over time.

A number of trends in this evolution are clear. First, entering students will continue to come from increasingly diverse socioeconomic and ethnic backgrounds, with different work experiences, and at different stages of life. The committee’s proposed options (creating a national student unit record data system, revising IPEDS surveys, or using data from states or voluntary data initiatives) are designed to measure the progress of these more diverse student groups.


Second, an increasing number of students will earn STEM credentials through nontraditional educational pathways. Already, various credentials are provided by massive open online courses, companies (STEM certificates and badges), and competency-based educational programs. To date, most higher education institutions do not award credit toward STEM degrees for these courses and credentials, but this may change in the future. As new approaches emerge, it will be important to update the indicator system to capture students’ changing trajectories toward evolving STEM credentials.

Finally, with the spread of learning management systems across postsecondary education, institutions will have access to new kinds of data on student learning and faculty activities. These data include student scores on online assessments, student work products, faculty assignments, syllabi, and a host of behavioral information about how students and faculty are working. Combining such information with more traditional information, such as student grades and course-taking patterns, using increasingly sophisticated data analytic techniques will allow new approaches to monitoring student progress and achievement. The Signals Project at Purdue University (Sclater and Peasgood, 2016), the use of analytics to study virtual learning environments at the University of Maryland, Baltimore County (University of Maryland, Baltimore County, 2017), and the development and distribution of predictive models for student success at Marist College (Marist College, 2017) are part of an emerging international movement to use data analytics to develop more accurate predictive models of student grades, retention, persistence, and graduation. This work will expand and grow more sophisticated over the next decade, and as it does, features may emerge that allow more easily obtained and more accurate measures for monitoring some of the proposed indicators.
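
As a deliberately simplified illustration of the kind of predictive modeling this movement involves, the sketch below fits a logistic regression that predicts retention from a few learning-management-system features. The features, data, and model choice are assumptions for illustration, not a description of the Purdue, UMBC, or Marist systems, which rely on far richer data and validation.

```python
# Illustrative sketch: a toy predictive model of term-to-term retention from
# learning-management-system activity. Data and features are invented; real
# learning-analytics systems use much larger datasets and careful validation.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: weekly LMS logins, share of assignments submitted, average quiz score
X = np.array([
    [1, 0.2, 0.55],
    [5, 0.9, 0.80],
    [2, 0.4, 0.60],
    [7, 1.0, 0.95],
    [0, 0.1, 0.40],
    [4, 0.7, 0.75],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = student retained to the next term

model = LogisticRegression().fit(X, y)

# Predicted retention probability for a hypothetical new student.
new_student = np.array([[3, 0.5, 0.65]])
print(model.predict_proba(new_student)[0, 1])
```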

These and other developments in undergraduate education imply that in the coming years, it will be important to review, and revise as necessary, the committee’s proposed STEM indicators and the data and methods for measuring them.

A Note of Caution

The proposed indicator system would create a picture of the current status of undergraduate STEM education and allow policy makers to monitor change over time, including movement toward the three goals that underlie the indicator system. Although individual institutions or consortia of institutions may wish to adopt some or all of these indicators to monitor their own STEM educational programs, the indicator system is not intended to support ranking systems or inter-institutional comparisons. Many of the indicators are influenced by the socioeconomic status, parental education, and high school preparation of potential STEM students, long before these students begin postsecondary education. Thus, an individual institution’s performance on these indicators could be more strongly influenced by the background characteristics of its entering students than by factors that are under the institution’s control. Moreover, the indicator system is designed to capture the increasing number of students who pursue STEM credentials while attending multiple postsecondary institutions. It would, therefore, be impossible to fairly apportion the credit that a particular institution should receive for such students on many of these indicators. For these reasons, it would be inappropriate to use these indicators to rank or compare the performance of different postsecondary institutions.

REFERENCES

Cunningham, A.F., and Milam, J. (2005). Feasibility of a Student Unit Record System Within the Integrated Postsecondary Education Data System. (NCES 2005–160). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Engle, J. (2016). Answering the Call: Institutions and States Lead the Way Toward Better Measures of Postsecondary Performance. Seattle, WA: Bill & Melinda Gates Foundation. Available: http://postsecondary.gatesfoundation.org/wp-content/uploads/2016/02/AnsweringtheCall.pdf [June 2017].

Executive Office of the President of the United States. (2017). Using Federal Data to Measure and Improve the Performance of Institutions of Higher Education. Washington, DC: Author. Available: https://collegescorecard.ed.gov/assets/UsingFederalDataToMeasureAndImprovePerformance.pdf [September 2017].

HCM Strategists. (2013). The Voluntary Institutional Metrics Project: A Better Higher Education Data and Information Framework for Informing Policy. Washington, DC: Author. Available: https://www.luminafoundation.org/resources/a-better-higher-education-data-and-information-framework-for-informing-policy [July 2017].

Janice, A., and Voight, M. (2016). Toward Convergence: A Technical Guide for the Postsecondary Metrics Framework. Washington, DC: The Institute for Higher Education Policy. Available: http://www.ihep.org/research/publications/toward-convergence-technical-guide-postsecondary-metrics-framework [July 2017].

Marist College. (2017). Learning Analytics Project Wins Innovation Award. Available: http://www.marist.edu/publicaffairs/eduventuresaward2015.html [July 2017].

Miller, B. (2016). Building a Student-level Data System. Washington, DC: Institute for Higher Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/docs/resources/building_a_student-level_data_system.pdf [June 2017].

National Academies of Sciences, Engineering, and Medicine. (2017). Supporting Students’ College Success: The Role of Assessment of Intrapersonal and Interpersonal Competencies. Washington, DC: The National Academies Press. Available: https://www.nap.edu/catalog/24697/supporting-students-college-success-the-role-of-assessment-of-intrapersonal [October 2017].

Sclater, N., and Peasgood, A. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. Available: https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education [July 2017].

University of Maryland, Baltimore County. (2017). Division of Information Technology Analytics. Available: http://doit.umbc.edu/analytics [July 2017].
