Indicators for Monitoring Undergraduate STEM Education (2017)

Chapter: 7 Implementing the Indicator System

Suggested Citation:"7 Implementing the Indicator System." National Academies of Sciences, Engineering, and Medicine. 2017. Indicators for Monitoring Undergraduate STEM Education. Washington, DC: The National Academies Press. doi: 10.17226/24943.

PREPUBLICATION COPY, UNCORRECTED PROOFS

As detailed in Chapter 6, nationally representative data are not currently available from public or proprietary sources for most of the committee's proposed indicators. This limits policy makers' ability to track progress toward the committee's goals of (1) increasing students' mastery of STEM concepts and skills; (2) striving for equity, diversity, and inclusion; and (3) ensuring adequate numbers of STEM professionals. That chapter outlines steps toward developing each of the 21 indicators by revising various public and proprietary data sources to provide the data needed for each one. This chapter aims to reduce the complexity of implementing the indicator system by presenting three options for obtaining the data required for all of the indicators: (a) creating a national student unit record data system; (b) expanding National Center for Education Statistics (NCES) data collections; and (c) combining existing data from nonfederal sources. It also discusses new data collection and analysis systems that could potentially be used in the future to support the proposed indicator system. The chapter ends with the committee's conclusion about moving forward, including a caution about the intended use of the proposed indicator system.

OPTION 1: CREATE A NATIONAL STUDENT UNIT RECORD DATA SYSTEM

A national student unit record data system incorporating administrative data on individual students would provide reliable and usable data for many of the proposed indicators focused on students' progress through STEM programs.
Relative to the current Integrated Postsecondary Education Data System (IPEDS), such a system would incorporate more accurate and complete data. As noted in Chapter 6, IPEDS data do not include students who transfer, those who attend multiple institutions, and those who enroll part time; a student unit record data system would include all of these groups.1 In addition, because IPEDS consists of aggregated, institution-level data on cohorts of students, it cannot be used to track the academic progress of individual students, or different groups of students, over time. In contrast, a student unit record data system is well suited to such longitudinal monitoring. Data from such a system could be aggregated to monitor the progress and outcomes of intended STEM majors among students of different genders, racial and ethnic groups, disability statuses, and Pell grant eligibility (socioeconomic) statuses. A student unit record data system would allow longitudinal analyses of trends over time for 8 of the 21 proposed indicators:

• Indicator 2.1.2: Entrance to and persistence in STEM academic programs
• Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
• Indicator 2.2.2: Diversity of students who transfer from 2- to 4-year STEM programs in comparison with diversity of students in 2-year STEM programs
• Indicator 2.2.3: Time to degree for students in STEM academic programs
• Indicator 3.1.1: Completion of foundational courses, including developmental education courses, to ensure STEM program readiness
• Indicator 3.2.1: Retention in STEM programs, course to course and year to year
• Indicator 3.2.2: Transfers from 2- to 4-year STEM programs in comparison with transfers to all 4-year programs
• Indicator 3.3.1: Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics

1 The National Center for Education Statistics has added new survey components to begin capturing information on part-time and transfer students: see Chapter 6.

In this option, the federal government would require institutions to collect and provide to the national system standardized unit record data on student educational experiences and activities. Currently, the creation of such a system is prohibited by the 2008 amendments to the Higher Education Act (see Chapter 6). At this time, however, there are bipartisan bills in Congress (H.R. 2434 and S. 1121, the College Transparency Act) that would amend the Higher Education Act to repeal the current ban on a national student unit record data system and direct NCES to create such a system. If the bills become law, NCES could, in creating the new system, take advantage of lessons learned from the many state higher education systems and multi-institution education reform consortia that have successfully collected and used student unit record data to monitor undergraduate education.

Creating a national database of student unit records appears to be both technically and financially feasible and could reduce institutions' current burden of reporting IPEDS data, as shown in two feasibility studies. In 2005, NCES commissioned the first study, to examine the feasibility of creating a student unit record data system that could replace the aggregated institution-level data included in IPEDS. The study (Cunningham and Milam, 2005) presented three findings.
First, the authors found that NCES had at the time most of the computing hardware and software necessary to implement such a system, including equipment for web-based data collection and servers for storing large amounts of student data. However, to ensure the security and confidentiality of the data, NCES would have to create a new, permanent database storage system, protected by physical and software firewalls; the authors did not estimate how much these modifications would have cost. Second, the authors found that implementing the new system at that time would present colleges and universities with technical challenges, requiring expenditures for new technology, training in the use of the new reporting system, and personnel. Cunningham and Milam (2005) gathered estimates of implementation costs from hundreds of people at individual institutions and state higher education agencies. The cost estimates varied widely, depending on whether an institution was currently participating in a state-level student unit record data system (see Chapter 6) and on its information technology and institutional research capabilities. Another key factor was whether an institution was already uploading student data to the National Student Loan Data System (NSLDS; see Chapter 6); at the time, nearly all institutions were doing so. Given these complex cost factors, Cunningham and Milam (2005) did not estimate an average per-institution cost, but they noted the possibility of providing federal support to defray these costs. Third, the authors found that institutional costs would eventually decline, partly because some IPEDS reporting would be eliminated. Cunningham and Milam (2005) concluded that it was technically feasible for most institutions to report student data to a national student unit record data system, given time for transition. They did not address, however, whether this new reporting would be financially feasible for participating colleges and universities.

More recently, Miller (2016) analyzed various approaches to developing a national student unit record data system to be overseen by the Department of Education. Like Cunningham and Milam (2005), Miller noted that nearly all institutions were already reporting data to NSLDS, which included much of the data needed to track the progress of students receiving financial aid over time (e.g., enrollments, transfers, field of study, completions). Because, on average, 70 percent of all students receive financial aid (Executive Office of the President of the United States, 2017), NSLDS already includes much of the data needed for a national system. Miller thus proposed that a national data system could best be created by building on the existing capability of NSLDS. Expanding NSLDS to include data on all students appears technically feasible, based on the system's recent history of adding 17 percent more student records between February 2010 and July 2013; Miller (2016) estimated that the programming changes to accommodate this growth would cost around $1 million. Miller cautioned, however, that NSLDS already has significant technical limitations and a history of poor processing speeds. Adding millions of additional records on students who do not receive financial aid could slow the system's ability to perform its core function of ensuring that students receive financial aid and repay their loans. To address this problem, Miller (2016) proposed a complete modernization of NSLDS, which would require additional funding; he did not provide a cost estimate. In this proposal, NCES would handle access to the student unit record data system by policy makers, researchers, and the public (Miller, 2016).
At least once a year, a data extract would be transmitted to NCES, which would be responsible for generating public reports on higher education and for populating IPEDS with data no longer being reported to it directly. NCES would also establish and implement protocols for allowing access to the database while maintaining the privacy and confidentiality of individual student records. Miller (2016) argued that moving to the system he proposes would not burden most institutions with massive, costly changes in their reporting, for two reasons. First, NSLDS requires institutions to report data on only those students who receive federal financial aid. Yet many institutions submit data on all of their students to the National Student Clearinghouse (NSC), which in turn reports on financial aid recipients to NSLDS on behalf of these institutions. For this large group of public and private institutions (over 3,600 according to NSC; see Chapter 6), moving to a student unit record system would simply mean passing along data they are already assembling. Second, the student unit record data system would replace seven components of IPEDS entirely and most of an eighth component. Using NCES estimates of the average number of hours required to complete each component in 2015-2016, Miller (2016) estimated that moving to a student unit record system would reduce reporting time by about 60 percent, saving around 88.6 hours per institution per year. The current lack of a student unit record data system makes it difficult to develop a national picture of the nation's postsecondary students and institutions and to monitor trends among them over time; this difficulty extends to undergraduate STEM education.
However, even if the prohibition against such a system is removed, a national database of student unit records would not provide information on instructors,2 who play a key role in engaging students in evidence-based educational experiences inside and outside the classroom, nor on programs and perceptions related to equity, diversity, and inclusion. Therefore, student and instructor surveys about educational experiences and activities would still be necessary to support the proposed indicator system. Table 7-1 shows how each indicator would be supported under this option.

2 As noted above, the committee uses the term "instructor" to refer to all individuals who teach undergraduates, including tenured and tenure-track faculty, part-time and adjunct instructors, and graduate student instructors.

OPTION 2: EXPAND NCES DATA COLLECTIONS

A currently legal and more feasible approach would rely on IPEDS and other NCES surveys to support the proposed indicators. As discussed in Chapter 6, nearly all public and private, for-profit and nonprofit institutions provide annual data to IPEDS, reporting on the vast majority of students who are currently enrolled in STEM courses and majors. Using this option to inform the proposed indicators would tap the capabilities of an established and reliable data collection mechanism, although it would provide less robust data than Option 1. The usefulness of IPEDS data for the indicators is limited by the statistical and analytic challenges that result from using aggregated data as the unit of analysis. Currently, IPEDS data partly support only 2 of the 21 proposed indicators:

• Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
• Indicator 3.3.1: Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics

The IPEDS data do not fully support these two indicators because they do not permit the disaggregation of STEM credentials by disability status and Pell status (a proxy for socioeconomic status).3 Fortunately, states and voluntary multi-institution data initiatives have developed and implemented an expanded range of measures that could fill some of the gaps in IPEDS (see below).
Under this option, NCES (with support from the National Science Foundation [NSF]) would expand the IPEDS institutional surveys to include institution-level measures of student progress toward degrees and certificates. In addition, NCES would extend existing student surveys and revive a major faculty survey. Table 7-2 shows how each indicator would be supported under this option. In comparison with the first option, this option would place a greater burden on institutions of higher education. In Option 1, institutional research staff would only have to upload student unit record data to the national student unit record system. In this option, institutional research staff would be required to collect additional data (beyond their current collections) and use them to calculate additional measures for reporting to IPEDS. Overall, this option would require NCES to change its data collection in three ways: expanding IPEDS, expanding the Beginning Postsecondary Students Longitudinal Study (BPS), and renewing and expanding the National Study of Postsecondary Faculty.

3 Beginning in 2017-2018, NCES will gather information on students' Pell Grant status as part of the new outcome measures survey within IPEDS.

Expanding IPEDS

Under this option, the IPEDS surveys would be expanded to require institutions to report on several measures that have been developed and tested in voluntary data collections by states and higher education reform consortia (see Box 7-1). These new measures, like the committee's proposed indicators, are designed to represent important dimensions of undergraduate education in a readily understandable form. Specifically, NCES would expand the IPEDS surveys to include the following measures, as defined by Janice and Voight (2016, p. iv):

• Program of study selection: The percentage of students in a cohort who demonstrate a program of study selection by taking nine credits (or three courses) in a meta-major in the first year. Meta-majors include: education; arts and humanities; social and behavioral sciences and human services; science, technology, engineering, and math (STEM); business and communication; health; and trades.
• Enrollment: A 12-month headcount that includes all undergraduate students who enroll at any point during the calendar year, disaggregated by program of study selection.
• Gateway course completion: The percentage of students completing college-level, introductory mathematics and English courses (tracked separately) in their first year, disaggregated by program of study selection.
• Retention rate: The percentage of students in a cohort who are either enrolled at their initial institution or transfer to a longer-duration program at the initial or a subsequent institution, calculated annually for up to 200 percent of program length, disaggregated by program of study selection.
• Transfer rate: The percentage of students in a cohort who transfer into longer programs at the initial or subsequent institution(s), for up to 200 percent of program length, disaggregated by program of study selection.

Two additional changes to IPEDS data collection would be necessary to obtain the data needed for the proposed indicators.
First, all of the measures of student progress and credentials outlined above would have to be disaggregated by Pell grant status and disability status. As noted above, institutions currently provide IPEDS data on completions disaggregated by gender and by race and ethnicity, so this change would require institutions to make additional calculations using their administrative records. Second, data collected from institutions through the IPEDS Human Resources Survey would have to be expanded to include the department in which faculty and graduate students teach. The Human Resources Survey, administered each spring, asks institutions to report data on employee status (full- or part-time, faculty status); full-time instructional staff (academic rank, gender, and contract length or teaching period); and total salary outlay and number of months covered, by academic rank and gender. However, the survey does not ask for disciplinary department, which is needed to identify STEM faculty and staff. The proposed additional measures would capture and follow a wider range of students intending to major in STEM than are currently included in IPEDS, including first-time full-time, transfer full-time, first-time part-time, and transfer part-time students. These additional measures would not only support the proposed indicator system, but would also improve the capacity of IPEDS, allowing policy makers and higher education leaders to track students' progress and completion generally, across all fields or majors.
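The disaggregation described here is, computationally, a simple cross-tabulation over administrative records; the real work lies in assembling reliable Pell and disability fields. A minimal sketch with an invented record layout (the field names and values are hypothetical, not an IPEDS specification):

```python
from collections import Counter

# Hypothetical completion records: (credential_field, pell_eligible, has_disability).
records = [
    ("STEM", True, False),
    ("STEM", False, False),
    ("STEM", True, True),
    ("non-STEM", True, False),
]

def completions_by_group(records, field="STEM"):
    """Count completions in one field, cross-tabulated by (Pell, disability) status."""
    return dict(Counter(
        (pell, disab) for f, pell, disab in records if f == field
    ))

print(completions_by_group(records))
# {(True, False): 1, (False, False): 1, (True, True): 1}
```

A production version would read from a student information system rather than an in-memory list, but the aggregation step an institution would add is no more complex than this.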

Different combinations of the proposed additional student measures have been used in voluntary data-sharing programs among institutions participating in higher education reform consortia, such as Achieving the Dream, Completion by Design, and Complete College America, and by some state higher education data systems (see Chapter 6). Of particular importance to the proposed indicator system is the "program of study selection" measure used by Achieving the Dream. As noted above, this measure assigns students to one of seven meta-majors, one of which is STEM, based on their course-taking patterns in their first year of higher education. This measure permits STEM students to be identified early in their college careers, even before they officially declare their intended majors. If it is not possible to add the program-of-study-selection measure to IPEDS, NCES could instead require institutions to use the data on program of study selection they already report to NSLDS to disaggregate enrollment, gateway course completion, retention rate, and transfer rate data by Classification of Instructional Programs (CIP) codes. Since 2014, postsecondary institutions have been required to report to NSLDS student unit record data on students' program of study selection for those programs with enrolled students who receive Title IV aid.4 These reports do not capture students' intentions as early as the measure of program of study selection outlined above, and they miss the 30 percent of students who do not receive or apply for some type of federal aid (Miller, 2016). Nonetheless, this requirement would allow institutions to report more detailed information to IPEDS on the programs in which their students are enrolled.
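The program-of-study-selection measure is essentially a threshold rule applied to first-year course records: a student demonstrates a selection by earning nine credits, or taking three courses, in one meta-major. A hedged sketch of that rule, assuming a hypothetical record layout in which each course already carries a meta-major assignment (in practice that mapping would come from CIP codes or local course catalogs):

```python
from collections import defaultdict

def program_of_study_selection(courses, credit_threshold=9, course_threshold=3):
    """Return {student_id: meta_major} for students who demonstrate a program of
    study selection: at least 9 credits OR 3 courses in one meta-major in the
    first year, per the Janice and Voight (2016) definition sketched above.
    `courses` is an iterable of (student_id, meta_major, credits) tuples."""
    credits = defaultdict(float)
    counts = defaultdict(int)
    for student, meta_major, cr in courses:
        credits[(student, meta_major)] += cr
        counts[(student, meta_major)] += 1
    selected = {}
    for (student, meta_major), total in credits.items():
        if total >= credit_threshold or counts[(student, meta_major)] >= course_threshold:
            # If a student somehow qualifies in two meta-majors, the last one
            # encountered wins here; a real measure would need a tie-break policy.
            selected[student] = meta_major
    return selected

records = [
    ("s1", "STEM", 4), ("s1", "STEM", 3), ("s1", "STEM", 3),  # 10 credits in STEM
    ("s2", "STEM", 3), ("s2", "health", 3),                   # no meta-major reaches a threshold
]
print(program_of_study_selection(records))  # {'s1': 'STEM'}
```

The point of the sketch is that the measure can be computed from course-level records institutions already hold, which is why consortia such as Achieving the Dream were able to adopt it without new data collection instruments.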
4 Title IV includes students with loans under the Federal Family Education Loan Program or William D. Ford Federal Direct Loan (Direct Loan) Program; students who have Federal Perkins Loans; students who have received Federal Pell Grants, Teacher Education Assistance for College and Higher Education Grants, Academic Competitiveness Grants, or Science and Math Access to Retain Talent Grants; and students on whose behalf parents borrowed Parent PLUS loans.

Expanding the Beginning Postsecondary Students Longitudinal Study

The second expansion necessary to support the proposed indicator system under this option would be to extend and expand NCES's Beginning Postsecondary Students Longitudinal Study (BPS) to include additional and larger cohorts. To date, BPS has followed cohorts of students who entered postsecondary education in 1990, 1996, 2004, and 2012. Students in the BPS complete three surveys: one at the end of their first academic year, and two others 3 years and 6 years after they entered postsecondary education. The survey includes a section on student characteristics, with items on students' educational experiences relevant to the proposed indicators, and this section could easily accommodate a STEM-specific branch. Specifically, to support the proposed indicators in this option, NCES would need to: continue to add a new BPS cohort every 6-8 years; expand the sample in each cohort to a size sufficient to allow statistical analyses by gender, race and ethnicity, disability status, and Pell grant (socioeconomic) status; and add STEM-specific questions to the survey.

Renewing and Expanding the National Study of Postsecondary Faculty

The third expansion to current NCES data collection under this option would be to renew and expand the National Study of Postsecondary Faculty (NSOPF). The NSOPF was administered in 1992, 1998, and 2003 to 3,000 department chairs and 11,000 instructors at a wide range of postsecondary institutions. This survey, which asked faculty about their employment and activities, would have to be reinstated and modified in three ways to support the proposed indicator system. First, it would be administered to a sufficiently large sample of faculty to allow meaningful disaggregation of the sample along important dimensions of diversity, including race and ethnicity, gender, part- or full-time status, and tenure- or non-tenure-track status. Second, it would be expanded to include graduate teaching assistants, who provide a substantial amount of instruction for STEM students, especially during the first 2 years of higher education. Third, it would be modified to include questions on instructors' use of evidence-based STEM educational practices, including course redesign, evidence-based teaching strategies, and involvement in professional development. Although the survey was administered to faculty from all disciplines, it was designed to measure teaching activities that are discipline specific. Hence, adding questions specifically about evidence-based STEM teaching strategies is possible within the NSOPF framework.

Revising and expanding the NSOPF and IPEDS as proposed under this option would likely increase the data collection burden on institutions. For example, a group of 18 public and private 2- and 4-year institutions joined in the Voluntary Institutional Metrics Project to develop meaningful measures of the quality and outcomes of higher education (HCM Strategists, 2013). The participating institutions shared their student unit record data, covering all full-time, part-time, and transfer students, and analyzed the data to report on measures similar to those that would be added to IPEDS in this option. The participating institutions reported that the burden of analyzing student records to report on the measures was substantial.
Following 4-year students for 200 percent of the expected time to completion (8 years) from their initial enrollment sometimes required analyzing millions of records over an extended period, during which records systems and processes sometimes changed. In addition, institutions lacked data on students who transferred to other institutions, especially if the other institution was located in another state. However, if this option were adopted and institutions had difficulty obtaining data on students who transfer to another institution, they might be able to obtain the data from NSC (see Chapter 6). Nevertheless, the additional burden on institutions of reporting data on these measures is undeniable. This challenge points to a benefit of the first option: if a national student unit record data system is created, the burden of calculating the proposed indicators related to STEM students' progress and completion would fall on NCES rather than on institutions.

OPTION 3: COMBINE EXISTING DATA FROM NONFEDERAL SOURCES

The third option, which could be undertaken by federal agencies or other organizations (e.g., a higher education association), would take advantage of existing data sources and require little or no federal investment. This option has the potential to provide limited support for the committee's proposed indicator system by combining data from states and voluntary multi-institution education reform consortia to create a national picture of undergraduate STEM education. As noted above, many 2- and 4-year institutions voluntarily share unit record data on cohorts of students through state data warehouses and/or with one or more education reform consortia. They report on student progress using measures similar to those identified at the beginning of this chapter (see Box 7-1, above).
Institutions participating in these consortia and state data sharing also report IPEDS data to NCES and data on students receiving financial aid to NSLDS; these data are relevant for the proposed indicators.

In this option, the federal government or a private foundation would seek to tap these existing data resources by commissioning research on the feasibility of creating a nationally representative sample of postsecondary institutions that are already reporting the measures outlined in Option 2. Specifically, the Department of Education, NSF, or a private foundation would commission research to first identify institutions that are already submitting student unit record data to a consortium or their state data systems. One starting point would be the map listing voluntary data collection efforts in each state and territory.5 The research would then examine whether a representative national sample of student unit record data could be derived from these institutions. The committee's preliminary review of this information suggests that such data sharing is most frequent among public 4-year institutions, moderate among 2-year institutions, less frequent among private nonprofit institutions, and rare among private for-profit institutions. As noted in Chapter 6, state unit record data systems have limited or no coverage of private for-profit institutions. However, statistical weighting methods could be used to correct for this gap and represent all institution types. With stratification and weighting, a small sample could provide nationally representative student unit record data. Because the sample would be composed of institutions already engaged in voluntary data collection and sharing, these institutions might welcome the opportunity to share and analyze their data to support the proposed indicators. To encourage participation and reduce the burden of additional data collection, federal agencies or foundations might offer to reimburse institutions for the costs of the additional staff time required to share and analyze existing student unit record data and gather additional data.
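The weighting idea here is standard post-stratification: each responding institution in a stratum receives a weight equal to the stratum's population share divided by its sample share, so under-covered sectors (such as for-profits) count more heavily. A toy sketch with invented counts (the strata and numbers are illustrative, not actual IPEDS figures):

```python
def poststratification_weights(population, sample):
    """population, sample: {stratum: institution count}. Returns a per-institution
    weight for each stratum so that the weighted sample matches the population's
    proportions across strata."""
    n_pop = sum(population.values())
    n_samp = sum(sample.values())
    return {
        s: (population[s] / n_pop) / (sample[s] / n_samp)
        for s in population
    }

# Hypothetical counts by sector stratum:
population = {"public-4yr": 700, "public-2yr": 900, "private-nonprofit": 1600, "for-profit": 800}
sample     = {"public-4yr": 90,  "public-2yr": 60,  "private-nonprofit": 40,  "for-profit": 10}

weights = poststratification_weights(population, sample)
# The badly under-covered for-profit stratum gets a weight of 4.0 here. Large
# weights inflate the variance of estimates for that stratum, so a real survey
# design would also cap or smooth extreme weights.
```

This is why the report can plausibly claim that a small sample suffices: stratification fixes representativeness, while the residual cost is precision in the thinly sampled strata.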
Assuming that a nationally representative sample could be assembled, one potential limitation is that Completion by Design (CBD) is the only reform consortium using the critical measure of program of study selection (Engle, 2016), which allows STEM students to be identified before they officially declare their majors. Because CBD includes only about 200 2-year institutions in Ohio, North Carolina, and Florida, few institutions in the proposed national sample are currently calculating this measure. To address this limitation and identify students in STEM programs, the Department of Education, NSF, or another entity (e.g., a higher education association) might require or request institutions in the national sample either to analyze their administrative records to calculate the program of study selection measure or to rely on the Classification of Instructional Programs codes and level-of-credential data they already report to NSLDS. As noted above, federal or foundation funding for institutions might encourage compliance with these requirements or requests.

In this option, the survey data needed would be collected through a combination of existing proprietary surveys of students and faculty. As discussed in Chapter 6, several long-standing national surveys owned and administered by universities or nonprofit organizations could be used to collect data for the indicators. These include the National Survey of Student Engagement (NSSE), the Community College Survey of Student Engagement (CCSSE), and the Freshman Survey, College Senior Survey, and Faculty Survey of the Higher Education Research Institute. In this option, the Department of Education, NSF, or another entity would contract with these survey providers to extend their surveys and administer them to nationally representative samples of institutions and students.
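The CIP-code route mentioned above can be sketched in a few lines: a student's program counts as STEM if the 2-digit family of its Classification of Instructional Programs (CIP) code falls in a designated STEM list. The family list below is an illustrative subset only, not an official STEM definition, and the record fields are hypothetical.

```python
# Illustrative subset of 2-digit CIP families commonly treated as STEM.
STEM_CIP_FAMILIES = {
    "11",  # Computer and Information Sciences
    "14",  # Engineering
    "26",  # Biological and Biomedical Sciences
    "27",  # Mathematics and Statistics
    "40",  # Physical Sciences
}

def is_stem_program(cip_code: str) -> bool:
    """A program counts as STEM if its CIP code's 2-digit family is on the list."""
    return cip_code.split(".")[0].zfill(2) in STEM_CIP_FAMILIES

# Hypothetical unit records: (student id, CIP code of selected program).
records = [
    ("s001", "14.0801"),  # civil engineering
    ("s002", "52.0201"),  # business administration
    ("s003", "26.0101"),  # biology
    ("s004", "27.0501"),  # statistics
]

stem_students = [sid for sid, cip in records if is_stem_program(cip)]
print(stem_students)  # ['s001', 's003', 's004']
```

In practice, any official implementation would need an agreed-upon STEM CIP list (these vary across agencies) and rules for students with multiple or changing programs; the point of the sketch is only that institutions already reporting CIP codes to NSLDS could identify STEM students with minimal extra calculation.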
First, the federal government or other entity would commission the survey organizations to develop short survey modules for both students and faculty, designed to elicit detailed information on evidence-based STEM educational practices and other elements of the committee's proposed indicators. Fortunately, most of these surveys are designed to include shorter, customized modules of questions, increasing the feasibility of this approach. Second, the survey organizations would be commissioned to administer the extended surveys to a nationally representative sample of 2- and 4-year institutions and of STEM students. Third, given the decline in survey response rates, the survey organizations would be provided with support for incentives or other mechanisms to boost response rates. Table 7-3 summarizes how the indicators could be supported in this option.

5 See the Institute for Higher Education Policy website: http://www.ihep.org/postsecdata/mapping-data-landscape [August 2017].

CONCLUSIONS

CONCLUSION 6 Three options would provide the data needed for the proposed indicator system:

1. Create a national student unit record data system, supplemented with expanded surveys of students and instructors (Option 1).
2. Expand current federal institutional surveys, supplemented with expanded surveys of students and instructors (Option 2).
3. Develop a nationally representative sample of student unit record data, supplemented with student and instructor data from proprietary survey organizations (Option 3).

Option 1 would provide the most accurate, complete, and useful data to implement the proposed indicators of students' progress through STEM education. As noted above, legislation has been introduced in Congress to repeal the current ban on a student unit record data system and direct NCES to create it. Although creating a national student unit record data system would require investment of federal resources, the system would provide valuable information to policy makers about the status and quality of undergraduate education generally, not only in STEM fields.
Institutions would be required to share their student unit record data with the federal government, but they would not be required to gather any additional data or make any additional calculations beyond what they already provide to IPEDS. This option for implementing the indicator system would also require regular surveys of students and faculty for data not covered by a student unit record data system.

Option 2 would take advantage of the well-developed system of institutional surveys that NCES uses to obtain IPEDS data annually from the vast majority of 2- and 4-year institutions. Under this option, NCES would add to these surveys some of the new measures of student progress developed by higher education reform consortia, measures that cover part-time and transfer students. Some of the measures are closely related to the proposed indicators. Like the first option, this option would require investment of federal resources, but it would draw on the strengths of the well-established system of institutional reporting for IPEDS. In comparison with Option 1, this option would increase institutions' burden for IPEDS reporting, requiring them to calculate additional measures based on their internal student unit record data. The additional measures would provide much of the student data needed for the indicator system, but the system would also require data from regular surveys of students and faculty.

Option 3 could be carried out by the federal government or another entity (e.g., a higher education association). It would take advantage of the rapid growth of higher education data collection and analysis by state higher education systems and education reform consortia across the country and would require little or no federal investment. As noted above, some of these new measures of student progress are similar to the committee's indicators. As in Options 1 and 2, additional data from surveys would be needed to support the indicators.

Research, Evaluation, and Updating of the Proposed Indicator System

Many of the indicators proposed by the committee represent new conceptions of key elements of undergraduate STEM education to be monitored over time. Some indicators require research as the first step toward developing clear definitions and identifying the best measurement methods, prior to beginning data collection and implementing the indicator system (see Table 6-2, in Chapter 6). Once the system has been implemented and the indicators are in use, the committee suggests that the federal government conduct or commission an evaluation study to ensure that the indicators measure what they are intended to measure.

In addition, ongoing research may identify new factors related to the quality of undergraduate STEM education beyond those included in the proposed objectives and indicators. For example, there is promising evidence that three psychological competencies are related to students' persistence and academic success (National Academies of Sciences, Engineering, and Medicine, 2017): (1) a sense of belonging on campus; (2) utility value (recognizing the value and relevance of academic subjects to one's own life); and (3) a growth mindset (the belief that one's intelligence is not fixed but can grow).
Given the current lack of common definitions and high-quality assessments of these and other competencies (e.g., interest), the committee did not propose any objectives or indicators related to them. In the future, however, as further research evidence emerges, it may be appropriate to add objectives and indicators of these psychological competencies.

Furthermore, given that the structure of undergraduate education continues to evolve in response to changing student demographics, funding sources, educational technologies, and the growth of new providers, it may be a challenge to ensure that the proposed STEM indicators remain relevant and informative over time. A number of trends in this evolution are clear. First, entering students will continue to come from increasingly diverse socioeconomic and ethnic backgrounds, with different work experiences, and at different stages of life. The committee's proposed options concerning a national student unit record data system, revising IPEDS surveys, or using data from states or voluntary data initiatives are designed to measure the progress of these more diverse student groups. Second, an increasing number of students will earn STEM credentials through nontraditional educational pathways. Already, various credentials are provided by massive open online courses, companies (STEM certificates and badges), and competency-based educational programs. To date, most higher education institutions do not award credit toward STEM degrees for these courses and credentials, but this may change in the future. As new approaches emerge, it will be important to update the indicator system to capture students' changing trajectories toward changing STEM credentials. Finally, with the spread of learning management systems across postsecondary education, institutions will have access to a different kind of data on student learning and faculty activities.

These data include student scores on online assessments, student work products, faculty assignments, syllabi, and a host of behavioral information about how students and faculty are working. Combining such information with more traditional information, such as student grades and course-taking patterns, using increasingly sophisticated data analytic techniques will allow new approaches to monitoring student progress and achievement. The Signals Project at Purdue University (Sclater and Peasgood, 2016), the use of analytics to study virtual learning environments at the University of Maryland, Baltimore County (Sclater and Peasgood, 2016; University of Maryland, Baltimore County, 2017), and the development and distribution of predictive models for students' success at Marist College (Marist College, 2017) are part of an emerging international movement to use data analytics to develop more accurate predictive models of student grades, retention, persistence, and graduation. This work will expand and grow more sophisticated over the next decade, and as it does, features may emerge that will allow for more easily obtained and more accurate measures for monitoring some of the proposed indicators. These and other developments in undergraduate education imply that in the coming years, it will be important to review, and revise as necessary, the committee's proposed STEM indicators and the data and methods for measuring them.

A Note of Caution

The proposed indicator system would create a picture of the current status of undergraduate STEM education and allow policy makers to monitor change over time, including movement toward the three goals that underlie the indicator system. Although individual institutions or consortia of institutions may wish to adopt some or all of these indicators to monitor their own STEM educational programs, the indicator system is not intended to support ranking systems or inter-institutional comparisons.
Many of the indicators are influenced by the socioeconomic status, parental education, and high school preparation of potential STEM students, long before these students begin postsecondary education. Thus, an individual institution’s performance on these indicators could be more strongly influenced by the background characteristics of its entering students than by factors that are under the institution’s control. Moreover, the indicator system is designed to capture the increasing number of students who pursue STEM credentials while attending multiple postsecondary institutions. It would, therefore, be impossible to fairly apportion the credit that a particular institution should receive for such students on many of these indicators. For these reasons, it would be inappropriate to use these indicators to rank or compare the performance of different postsecondary institutions.
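As a concrete illustration of the learning-analytics trend described in the preceding section, the sketch below fits a minimal logistic-regression model predicting retention from learning-management-system activity. All data and features here are synthetic and hypothetical; real deployments such as those at Purdue, UMBC, and Marist use far richer data and validated models, and, consistent with the caution above, such models are meant to inform student support, not institutional rankings.

```python
import math

def sigmoid(z):
    # Clamp extreme values to avoid overflow in math.exp.
    if z < -60:
        return 0.0
    if z > 60:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training data: (weekly LMS logins, mean assignment score) -> retained?
data = [((2, 0.55), 0), ((3, 0.60), 0), ((5, 0.70), 1), ((8, 0.85), 1),
        ((1, 0.40), 0), ((7, 0.90), 1), ((6, 0.75), 1), ((2, 0.50), 0)]

# Fit by plain stochastic gradient descent on the logistic log-loss.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(5000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y                # gradient of log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(logins, score):
    """Predicted probability that a student with this activity is retained."""
    return sigmoid(w[0] * logins + w[1] * score + b)

# A low-activity student receives a low predicted retention probability,
# flagging them for early outreach; a high-activity student does not.
print(round(predict(1, 0.45), 2), round(predict(7, 0.85), 2))
```

The same caveat applies to this sketch as to the indicators themselves: predictions driven largely by student background or prior preparation say little about institutional quality, so such models belong in early-warning and advising workflows rather than in inter-institutional comparisons.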

REFERENCES

Cunningham, A.F., and Milam, J. (2005). Feasibility of a Student Unit Record System Within the Integrated Postsecondary Education Data System (NCES 2005–160). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Engle, J. (2016). Answering the Call: Institutions and States Lead the Way Toward Better Measures of Postsecondary Performance. Seattle, WA: Bill & Melinda Gates Foundation. Available: http://postsecondary.gatesfoundation.org/wp-content/uploads/2016/02/AnsweringtheCall.pdf [June 2017].

Executive Office of the President of the United States. (2017). Using Federal Data to Measure and Improve the Performance of Institutions of Higher Education. Washington, DC: Author. Available: https://collegescorecard.ed.gov/assets/UsingFederalDataToMeasureAndImprovePerformance.pdf [September 2017].

HCM Strategists. (2013). The Voluntary Institutional Metrics Project: A Better Higher Education Data and Information Framework for Informing Policy. Washington, DC: HCM Strategists. Available: https://www.luminafoundation.org/resources/a-better-higher-education-data-and-information-framework-for-informing-policy [July 2017].

Janice, A., and Voight, M. (2016). Toward Convergence: A Technical Guide for the Postsecondary Metrics Framework. Washington, DC: The Institute for Higher Education Policy. Available: http://www.ihep.org/research/publications/toward-convergence-technical-guide-postsecondary-metrics-framework [July 2017].

Marist College. (2017). Learning Analytics Project Wins Innovation Award. Available: http://www.marist.edu/publicaffairs/eduventuresaward2015.html [July 2017].

Miller, B. (2016). Building a Student-level Data System. Washington, DC: Institute for Higher Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/docs/resources/building_a_student-level_data_system.pdf [June 2017].

National Academies of Sciences, Engineering, and Medicine. (2017). Supporting Students’ College Success: The Role of Assessment of Intrapersonal and Interpersonal Competencies. Washington, DC: National Academies Press. Available: https://www.nap.edu/catalog/24697/supporting-students-college-success-the-role-of-assessment-of-intrapersonal [October 2017].

Sclater, N., and Peasgood, A. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice. Available: https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education [July 2017].

University of Maryland, Baltimore County. (2017). Division of Information Technology Analytics. Available: http://doit.umbc.edu/analytics [July 2017].

BOX 7-1
New Measures in Higher Education

Seeking data to guide improvement in higher education, states and multi-institution reform consortia launched new surveys and developed new measures of student progress to fill gaps in existing state and federal data sets (Engle, 2016). But these new data collection efforts were rarely aligned with the older, existing federal and state data systems, yielding a patchwork of individual, unconnected data systems.

In 2015, with support from the Bill & Melinda Gates Foundation, the Institute for Higher Education Policy (IHEP) convened a working group of data experts to discuss ways to improve the quality of higher education data systems in order to inform state and federal policy conversations. IHEP commissioned these experts to write a series of papers examining technical, resource, and policy considerations related to current data collection efforts and data systems, and offering recommendations for improvement.*

Building on this work, Janice and Voight (2016) recommended the use of approximately 40 specific performance measures organized around the three high-level dimensions of performance, efficiency, and equity. The authors proposed that these measures could frame a comprehensive data system to address important questions about characteristics of college students, student outcomes, and college costs. The proposed measures include 20 measures of students’ progression and completion that are relevant to the committee’s proposed indicators, and they have been used by institutions, states, and education reform consortia to collect and interpret data.

*The papers are available at: http://www.ihep.org/postsecdata/mapping-data-landscape/national-postsecondary-data-infrastructure [August 2017]

TABLE 7-1 Data for Indicators in Option 1

Objective 1.1: Use of evidence-based STEM educational practices both in and outside of classrooms
  Indicator 1.1.1: Use of evidence-based STEM educational practices in course development and delivery (data source: renewed and expanded NSOPF)
  Indicator 1.1.2: Use of evidence-based STEM educational practices outside the classroom (data source: renewed and expanded NSOPF)

Objective 1.2: Existence and use of supports that help STEM instructors use evidence-based learning experiences
  Indicator 1.2.1: Extent of instructors’ involvement in professional development (data source: renewed and expanded NSOPF)
  Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign (data source: renewed and expanded NSOPF)

Objective 1.3: An institutional climate that values undergraduate STEM instruction
  Indicator 1.3.1: Use of valid measures of teaching effectiveness (data source: renewed and expanded NSOPF)
  Indicator 1.3.2: Consideration of evidence-based teaching in personnel decisions by departments and institutions (data source: renewed and expanded NSOPF)

Objective 1.4: Continuous improvement in STEM teaching and learning
  No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.

Objective 2.1: Equity of access to high-quality undergraduate STEM educational programs and experiences
  Indicator 2.1.1: Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students (data source: extended and expanded BPS)
  Indicator 2.1.2: Entrance to and persistence in STEM educational programs (data source: unit record data system)
  Indicator 2.1.3: Equitable student participation in evidence-based STEM educational practices (data source: extended and expanded BPS)

Objective 2.2: Representational equity among STEM credential earners
  Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields (data source: unit record data system)
  Indicator 2.2.2: Diversity of students who transfer from 2- to 4-year STEM programs in comparison with diversity in 2-year STEM programs (data source: unit record data system)
  Indicator 2.2.3: Time to degree for students in STEM academic programs (data source: unit record data system)

Objective 2.3: Representational diversity among STEM instructors
  Indicator 2.3.1: Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders (data source: revised IPEDS Human Resources Survey)
  Indicator 2.3.2: Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students (data source: revised IPEDS Human Resources Survey)

Objective 2.4: Inclusive environments in institutions and STEM departments
  Indicator 2.4.1: Students pursuing STEM credentials feel included and supported in their academic programs and departments (data source: extended and expanded BPS)
  Indicator 2.4.2: Instructors teaching courses in STEM disciplines feel supported and included in their departments (data source: renewed and expanded NSOPF)
  Indicator 2.4.3: Institutional practices that are culturally responsive, inclusive, and consistent across the institution (data source: renewed and expanded NSOPF)

Objective 3.1: Foundational preparation for STEM for all students
  Indicator 3.1.1: Completion of foundational courses, including developmental education courses, to ensure STEM program readiness (data source: unit record data system)

Objective 3.2: Successful navigation into and through STEM programs of study
  Indicator 3.2.1: Retention in STEM programs, course to course and year to year (data source: unit record data system)
  Indicator 3.2.2: Transfers from 2- to 4-year STEM programs in comparison with transfers to all 4-year programs (data source: unit record data system)

Objective 3.3: STEM credential attainment
  Indicator 3.3.1: Number of students who attain STEM credentials over time, disaggregated by institution type, transfer status, and demographic characteristics (data source: unit record data system)

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Postsecondary Education Data System; NSOPF, National Study of Postsecondary Faculty.

TABLE 7-2 Data for Indicators in Option 2

Objective 1.1: Use of evidence-based STEM educational practices both in and outside of classrooms
  Indicator 1.1.1: Use of evidence-based STEM educational practices in course development and delivery (data source: renewed and expanded NSOPF)
  Indicator 1.1.2: Use of evidence-based STEM educational practices outside the classroom (data source: renewed and expanded NSOPF)

Objective 1.2: Existence and use of supports that help STEM instructors use evidence-based learning experiences
  Indicator 1.2.1: Extent of instructors’ involvement in professional development (data source: renewed and expanded NSOPF)
  Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign (data source: renewed and expanded NSOPF)

Objective 1.3: An institutional climate that values undergraduate STEM instruction
  Indicator 1.3.1: Use of valid measures of teaching effectiveness (data source: renewed and expanded NSOPF)
  Indicator 1.3.2: Consideration of evidence-based teaching in personnel decisions by departments and institutions (data source: renewed and expanded NSOPF)

Objective 1.4: Continuous improvement in STEM teaching and learning
  No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.

Objective 2.1: Equity of access to high-quality undergraduate STEM educational programs and experiences
  Indicator 2.1.1: Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students (data source: extended and expanded BPS)
  Indicator 2.1.2: Entrance to and persistence in STEM educational programs (data source: extended and expanded BPS)
  Indicator 2.1.3: Equitable student participation in evidence-based STEM educational practices (data source: extended and expanded BPS)

Objective 2.2: Representational equity among STEM credential earners
  Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields (data source: revised and expanded IPEDS)
  Indicator 2.2.2: Diversity of students who transfer from 2- to 4-year STEM programs in comparison with diversity in 2-year STEM programs (data source: revised and expanded IPEDS)
  Indicator 2.2.3: Time to degree for students in STEM academic programs (data source: revised and expanded IPEDS)

Objective 2.3: Representational diversity among STEM instructors
  Indicator 2.3.1: Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders (data source: revised IPEDS Human Resources Survey)
  Indicator 2.3.2: Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students (data source: revised IPEDS Human Resources Survey)

Objective 2.4: Inclusive environments in institutions and STEM departments
  Indicator 2.4.1: Students pursuing STEM credentials feel included and supported in their academic programs and departments (data source: extended and expanded BPS)
  Indicator 2.4.2: Instructors teaching courses in STEM disciplines feel supported and included in their departments (data source: renewed and expanded NSOPF)
  Indicator 2.4.3: Institutional practices that are culturally responsive, inclusive, and consistent across the institution (data source: renewed and expanded NSOPF)

Objective 3.1: Foundational preparation for STEM for all students
  Indicator 3.1.1: Completion of foundational courses, including developmental education courses, to ensure STEM program readiness (data source: revised and expanded IPEDS)

Objective 3.2: Successful navigation into and through STEM programs of study
  Indicator 3.2.1: Retention in STEM programs, course to course and year to year (data source: revised and expanded IPEDS)
  Indicator 3.2.2: Transfers from 2- to 4-year STEM programs in comparison with transfers to all 4-year programs (data source: revised and expanded IPEDS)

Objective 3.3: STEM credential attainment
  Indicator 3.3.1: Number of students who attain STEM credentials over time (disaggregated by institution type, transfer status, and students’ demographic characteristics) (data source: revised and expanded IPEDS)

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Postsecondary Education Data System; NSOPF, National Study of Postsecondary Faculty.

TABLE 7-3 Data for Indicators in Option 3

Objective 1.1: Use of evidence-based STEM educational practices both in and outside of classrooms
  Indicator 1.1.1: Use of evidence-based STEM educational practices in course development and delivery (data source: revised and expanded proprietary surveys, administered to a nationally representative sample of all types of 2- and 4-year institutions)
  Indicator 1.1.2: Use of evidence-based STEM educational practices outside the classroom (data source: same as above)

Objective 1.2: Existence and use of supports that help STEM instructors use evidence-based learning experiences
  Indicator 1.2.1: Extent of instructors’ involvement in professional development (data source: revised and expanded proprietary surveys, administered to a nationally representative sample of all types of 2- and 4-year institutions)
  Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign (data source: same as above)

Objective 1.3: An institutional climate that values undergraduate STEM instruction
  Indicator 1.3.1: Use of valid measures of teaching effectiveness (data source: revised and expanded proprietary surveys, administered to a nationally representative sample of all types of 2- and 4-year institutions)
  Indicator 1.3.2: Consideration of evidence-based teaching in personnel decisions by departments and institutions (data source: same as above)

Objective 1.4: Continuous improvement in STEM teaching and learning
  No indicators: see “Challenges of Measuring Continuous Improvement” in Chapter 3.

Objective 2.3: Representational diversity among STEM instructors
  Indicator 2.3.1: Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders (data source: nationally representative sample of institutions drawn from appropriate voluntary reform initiatives)
  Indicator 2.3.2: Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students (data source: same as above)

Objective 2.4: Inclusive environments in institutions and STEM departments
  Indicator 2.4.1: Students pursuing STEM credentials feel included and supported in their academic programs and departments (data source: revised and expanded proprietary surveys, administered to a nationally representative sample of all types of 2- and 4-year institutions)
  Indicator 2.4.2: Instructors teaching courses in STEM disciplines feel supported and included in their departments (data source: same as above)
  Indicator 2.4.3: Institutional practices that are culturally responsive, inclusive, and consistent across the institution (data source: same as above)

Objective 3.1: Foundational preparation for STEM for all students
  Indicator 3.1.1: Completion of foundational courses, including developmental education courses, to ensure STEM program readiness (data source: nationally representative sample of institutions drawn from appropriate voluntary reform initiatives)

Objective 3.2: Successful navigation into and through STEM programs of study
  Indicator 3.2.1: Retention in STEM programs, course to course and year to year (data source: nationally representative sample of institutions drawn from appropriate voluntary reform initiatives)
  Indicator 3.2.2: Transfers from 2- to 4-year STEM programs in comparison with transfers to all 4-year programs (data source: same as above)

Objective 3.3: STEM credential attainment
  Indicator 3.3.1: Number of students who attain STEM credentials over time (disaggregated by institution type, transfer status, and demographic characteristics) (data source: nationally representative sample of institutions drawn from appropriate voluntary reform initiatives)


  1. ×

    Welcome to OpenBook!

    You're looking at OpenBook, NAP.edu's online reading room since 1999. Based on feedback from you, our users, we've made some improvements that make it easier than ever to read thousands of publications on our website.

    Do you want to take a quick tour of the OpenBook's features?

    No Thanks Take a Tour »
  2. ×

    Show this book's table of contents, where you can jump to any chapter by name.

    « Back Next »
  3. ×

    ...or use these buttons to go back to the previous chapter or skip to the next one.

    « Back Next »
  4. ×

    Jump up to the previous page or down to the next one. Also, you can type in a page number and press Enter to go directly to that page in the book.

    « Back Next »
  5. ×

    To search the entire text of this book, type in your search term here and press Enter.

    « Back Next »
  6. ×

    Share a link to this book page on your preferred social network or via email.

    « Back Next »
  7. ×

    View our suggested citation for this chapter.

    « Back Next »
  8. ×

    Ready to take your reading offline? Click here to buy this book in print or download it as a free PDF, if available.

    « Back Next »
Stay Connected!