
Indicators for Monitoring Undergraduate STEM Education (2018)


6

Existing Data Sources and Monitoring Systems

This chapter addresses the committee’s charge to review existing systems for monitoring undergraduate STEM education. The first section provides an overview of currently available data on higher education in STEM fields. The next two sections review public and proprietary data sources, respectively. The fourth section discusses existing monitoring systems that contain elements related to the committee’s proposed indicators. The final section focuses directly on the committee’s indicators, summarizing for each indicator current data sources, potential new data sources, and the research and data development that would be required to tap those potential sources for the purpose of ongoing monitoring of undergraduate STEM education.

OVERVIEW

Although many different postsecondary education data sources are available, they are limited in their ability to track students’ progress into and through STEM programs and monitor the status of the committee’s goals for undergraduate STEM:

  • Goal 1: Increase students’ mastery of STEM concepts and skills by engaging them in evidence-based STEM educational practices and programs.
  • Goal 2: Strive for equity, diversity, and inclusion of STEM students and instructors by providing equitable opportunities for access and success.
  • Goal 3: Ensure adequate numbers of STEM professionals by increasing completion of STEM credentials as needed in the different STEM disciplines.

The various public and proprietary data sources currently available are summarized in Table 6-1. These data sources rely primarily on three types of data: (1) student and faculty unit record administrative data, (2) aggregated institution-level data, and (3) surveys of individual students and instructors (see Box 6-1).

TABLE 6-1 Major Sources of Data on Undergraduate STEM Education

Federal^a

• IPEDS. Frequency: annual. Coverage and representativeness: nationally representative; reporting is mandatory, so coverage is virtually 100 percent. Feasibility of disaggregation: strong for race and ethnicity, gender, institution type, and discipline; does not allow disaggregation by disability or Pell grant (socioeconomic) status.
• Beginning Postsecondary Students Longitudinal Study (BPS). Frequency: every 6 to 8 years. Coverage and representativeness: nationally representative; 82 percent response rate in the most recent cycle (BPS 04/09). Feasibility of disaggregation: limited for disaggregating by both demographic characteristics and field of study.
• National Study of Postsecondary Faculty. Frequency: discontinued in 2004. Coverage and representativeness: nationally representative of full-time, but not part-time, faculty. Feasibility of disaggregation: strong for individual and institutional characteristics.

Proprietary^b

• National Student Clearinghouse. Frequency: annual. Coverage and representativeness: 98 percent of institutions represented, but institutions do not always provide students' demographic characteristics, disciplines, and degree programs. Feasibility of disaggregation: limited for student characteristics.
• HERI^c Freshman Survey. Frequency: annual. Coverage and representativeness: nationally representative of first-time, full-time, 4-year students. Feasibility of disaggregation: good for 4-year student characteristics, but limited for 2-year student characteristics.
• HERI Your First College Year/Senior Survey. Frequency: annual. Coverage and representativeness: limited coverage of 2-year institutions. Feasibility of disaggregation: strong for student characteristics; weak for institutional characteristics.
• HERI Faculty Survey. Frequency: every 3 years. Coverage and representativeness: strong coverage among 4-year nonprofit institutions; nationally representative of full-time faculty at 4-year institutions. Feasibility of disaggregation: strong for faculty at 4-year institutions.
• HERI Diverse Learning Environments Survey. Frequency: occasional. Coverage and representativeness: very limited coverage among 2-year and 4-year institutions; student response rates within institutions average 25 percent. Feasibility of disaggregation: strong for student characteristics; weak for institutional characteristics.
• National Survey of Student Engagement. Frequency: annual. Coverage and representativeness: broad coverage among 4-year institutions; student response rates within institutions average 30 to 35 percent. Feasibility of disaggregation: strong for student and institutional characteristics of 4-year institutions.
• Community College Survey of Student Engagement. Frequency: annual. Coverage and representativeness: moderate coverage of 2-year institutions; poor student response rates. Feasibility of disaggregation: limited for student characteristics due to small sample sizes.
• Faculty Survey of Student Engagement. Frequency: annual. Coverage and representativeness: limited coverage of 2-year institutions; faculty response rates average 20 to 25 percent. Feasibility of disaggregation: strong for individual characteristics of faculty at 4-year institutions.

a These data are publicly available.

b These data may or may not be publicly available. Access may require intellectual property negotiations and fees.

c HERI, Higher Education Research Institute.


The major federal system, the Integrated Postsecondary Education Data System (IPEDS), focuses primarily on credential attainment by full-time students at the institution at which they began their studies. This focus does not always match students’ trajectories through undergraduate STEM education. For example, many undergraduate STEM students enroll part time: a recent analysis of data on first-time students entering higher education in 2003–2004 with plans to major in STEM found that, on average, only 33 percent of those at 2-year institutions and 68 percent of those at 4-year institutions were enrolled full time over the course of their studies (Van Noy and Zeidenberg, 2014).

In terms of student trajectories, IPEDS and other datasets are not always aligned with time to degree and student mobility. First, students are taking more time than expected to attain STEM credentials: Eagan and colleagues (2014a) found that only 22 percent of first-time, full-time STEM aspirants entering 4-year institutions in fall 2004 completed a STEM degree within 4 years, while 52 percent completed within 6 years. Among all full-time students (not only in STEM) who entered 2-year institutions in 2010, only 29 percent had completed a degree or certificate within 150 percent of the expected time (i.e., 3 years) (National Center for Education Statistics, 2015). However, an analysis of data on students entering 2-year institutions in 2007 found that one-third transferred to 4-year institutions (either before or after completing a degree or certificate), and 42 percent of these transfers (or 14% of the original cohort entering in 2007) completed a bachelor’s degree within 6 years (Jenkins and Fink, 2016). In their analysis of students entering 2-year STEM degree programs, Van Noy and Zeidenberg (2014) found that, after 6 years, 30 percent had attained a credential or were still enrolled in STEM, 33 percent had attained a credential or were still enrolled in a non-STEM field, and 37 percent were neither enrolled nor had attained a credential.

Another aspect of STEM student trajectories that is not always reflected in current federal data sources is mobility.1 Students often transfer among institutions and some enroll at more than one institution at the same time. Many students take a semester or more off, rather than maintaining continuous enrollment. For example, in their analysis of 4-year entrants to STEM, Eagan and colleagues (2014a) found that about 15 percent transferred to 2-year institutions, 13 percent transferred laterally from one 4-year institution to another, and 9 percent were simultaneously enrolled in more than one institution. The frequency of “swirling,” or movement between multiple institutions, was similar for 2-year college STEM students (National Academies of Sciences, Engineering, and Medicine, 2016).

In addition to their limitations in measuring actual student trajectories, existing data collection systems (national, state, and institutional) are often not structured to gather the information needed to understand the quality of undergraduate STEM education (National Academies of Sciences, Engineering, and Medicine, 2016). Overall, there are several reasons that measuring students’ progress through STEM programs is difficult:

  • Representative data are available only for full-time, first-time students.
  • Information on intended major when students first enrolled is only available for 4-year students.
  • Data on the quality of students’ educational experiences are very limited.

___________________

1 IPEDS has recently expanded its data collections to include part-time students and transfer students, as discussed further below.

  • Data on the training and qualifications of undergraduate instructors are no longer collected.
  • Degree completion data only cover up to 6 years.
  • Data on subgroups among Hispanics and Asian Americans are not available.
  • The sample sizes are sometimes too small for meaningful analysis of groups, such as Native Americans, first-generation students, veterans, and students with disabilities.

The lack of nationally representative data on student trajectories through undergraduate STEM education results partly from policy decisions. In 2005, the U.S. Department of Education’s National Center for Education Statistics (NCES) proposed to address this data gap by expanding the IPEDS database to include an individual student unit record data system (Cunningham and Milam, 2005). With privacy and confidentiality protections, the system would have used administrative records of individual students’ progress over time (enrollment status, grades, field of study, etc.). However, Congress effectively banned the creation of any national unit-record database in the 2008 reauthorization of the Higher Education Act (P.L. 110-315).

PUBLIC DATA SOURCES

The committee reviewed several public data sources, considering the frequency of data collection and release, their coverage and representativeness of 2-year and 4-year institutions, and the feasibility of disaggregating the data. Disaggregation is especially important for indicators of equity, diversity, and inclusion. For many data sources, disaggregation by multiple dimensions of student and institutional characteristics produces sample sizes too small to yield statistically reliable estimates.
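
To illustrate how quickly disaggregation erodes sample sizes, the sketch below walks through a hypothetical case; the overall sample size and the shares of STEM majors, subgroup members, and 2-year students are invented for illustration and are not drawn from any of the surveys discussed in this chapter.

```python
# Hypothetical illustration of how disaggregation shrinks analytic cells.
# The starting sample size and the proportions below are invented; they
# are not taken from any of the data sources described in this chapter.

national_sample = 18_000   # respondents in a hypothetical national survey
share_in_stem = 0.15       # share majoring in a STEM field
share_subgroup = 0.01      # share belonging to a small demographic subgroup
share_two_year = 0.40      # share enrolled at 2-year institutions

stem_cell = national_sample * share_in_stem          # 2,700 students
subgroup_cell = stem_cell * share_subgroup           # about 27 students
subgroup_by_sector = subgroup_cell * share_two_year  # about 11 students

print(round(stem_cell), round(subgroup_cell), round(subgroup_by_sector))
# With only a handful of respondents left in the final cell, estimates for
# that group are too imprecise to support reliable comparisons.
```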

Federal and state sources have two major strengths for use in the proposed indicator system: their data are publicly available, and the federal sources are generally of high quality, providing nationally representative data that cover all types of institutions (2-year and 4-year; public, private for-profit, and private nonprofit) and all student groups. As discussed below, the coverage of institution types within state systems is uneven.

The Integrated Postsecondary Education Data System

The National Center for Education Statistics (NCES) operates IPEDS as its core postsecondary education data collection program. IPEDS is a series of interrelated surveys that are conducted annually. Every college, university, technical, and vocational institution that participates in the federal student financial aid programs is required under Title IV of the Higher Education Act as amended in 1992 (P.L. 102-325) to provide data annually. Because of this requirement, response rates are very high. For example, in the spring 2010 data collection, the response rate for each of the survey components was more than 99 percent (Knapp, Kelly-Reid, and Ginder, 2012). According to the NCES Handbook of Survey Methods (Burns, Wang, and Henning, 2011), IPEDS includes the universe of postsecondary institutions participating in federal student financial aid programs.

In 2014, about 7,300 institutions complied with the mandate to respond, and an additional 200 institutions that did not participate in federal financial aid programs voluntarily provided data (National Center for Education Statistics, 2014). Individual institutions or, in some cases, state higher education systems responding on behalf of multiple institutions provide data describing their institutional characteristics, enrollments, completions and completers, graduation rates and other outcome measures, faculty and staff, finances, institutional prices, student financial aid, admissions, and academic libraries. To do so, institutional research staff or administrators aggregate internal administrative records (i.e., student unit record data) to create institution-level data files and submit them to IPEDS.

IPEDS data are collected and released three times each year and are made publicly accessible in two online platforms: the College Navigator, which can be used by students, families, educational policy makers, and others,2 and the IPEDS Data Center.3 To ensure data quality, the NCES Statistical Standards Program publishes statistical standards and provides methodological and statistical support to assist NCES staff and contractors in meeting the standards, with the goal of providing high-quality, reliable, and useful statistical information to policy makers and the public (National Center for Education Statistics, 2012). Several data elements in IPEDS (see National Center for Education Statistics, 2014) are relevant to the committee’s proposed indicators.

12-Month Enrollment

Data on 12-month enrollment for undergraduate and graduate students are collected in the fall. The data include unduplicated headcounts and instructional activity in contact or credit hours. Instructional activity is used to compute a standardized, 12-month, full-time-equivalent enrollment.
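
As a rough illustration of how instructional activity translates into a full-time-equivalent count, the sketch below divides annual credit or contact hours by a full-time-year divisor. The divisors shown (30 semester credit hours, 900 contact hours) are common conventions used here for illustration; the divisors IPEDS actually applies depend on an institution's calendar system and on student level.

```python
# Illustrative sketch: deriving a 12-month full-time-equivalent (FTE)
# enrollment figure from annual instructional activity. The divisors are
# illustrative conventions, not official IPEDS specifications.

def twelve_month_fte(credit_hours: float = 0.0, contact_hours: float = 0.0,
                     credit_divisor: float = 30.0,
                     contact_divisor: float = 900.0) -> float:
    """Estimate undergraduate FTE from 12-month instructional activity."""
    return credit_hours / credit_divisor + contact_hours / contact_divisor

# Example: 240,000 undergraduate credit hours reported over 12 months
# correspond to roughly 8,000 full-time-equivalent students.
print(twelve_month_fte(credit_hours=240_000))  # 8000.0
```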

___________________

2 See http://nces.ed.gov/collegenavigator [August 2017].

3 See http://nces.ed.gov/ipeds/datacenter [August 2017].


Completions

Completion data covering all degrees (associate’s, bachelor’s, master’s, and doctoral) and sub-baccalaureate awards are collected in the fall. These data are disaggregated by race and ethnicity, gender, and field of study. They include all STEM degrees and awards received by students, both those who began at the reporting institution and those who transferred to that institution.

Graduation Rates

The graduation data cover the initial cohort of full-time, first-time, degree- and certificate-seeking undergraduate students at 2-year and 4-year institutions; the number of those students who complete their degrees or certificates within 150 percent of the normal time (i.e., 3 years or 6 years); and the number of those students who transferred to other institutions. Data are reported by race and ethnicity, gender, and field of study. The data also include 100 percent graduation rates: 4-year bachelor’s degree rates have been reported since 1997; 2-year certificate and degree rates have been reported since 2008–2009.

It is important to note that these data do not include part-time students, students who transfer into the reporting institution, or students who transfer out and later graduate from another institution. Given the high rates of student “swirl” in STEM fields, these data do not accurately capture STEM graduation rates.
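
The cohort logic behind these rates can be illustrated with a short sketch; the field names are hypothetical, and the rules are simplified to the features described above (first-time, full-time entrants only, with completion at the reporting institution within 150 percent of normal time).

```python
# Simplified sketch (hypothetical field names) of an IPEDS-style
# graduation rate for a first-time, full-time, degree-seeking cohort.
# A student counts as a completer only if the credential is earned at the
# reporting institution within 150 percent of normal time (3 years for a
# 2-year program, 6 years for a 4-year program); part-time entrants are
# excluded and transfers out are tracked separately.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CohortStudent:
    full_time: bool
    first_time: bool
    years_to_completion: Optional[float]  # None if no credential earned here
    normal_time_years: int                # 2 or 4, depending on the program

def graduation_rate_150(students: list) -> float:
    cohort = [s for s in students if s.full_time and s.first_time]
    completers = [
        s for s in cohort
        if s.years_to_completion is not None
        and s.years_to_completion <= 1.5 * s.normal_time_years
    ]
    return len(completers) / len(cohort) if cohort else 0.0
```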

200 Percent Graduation Rates

In 2009, IPEDS added a new survey component, Graduation Rate 200, which reports graduation rates at 200 percent of normal time. It is collected in the winter, separately from the graduation rate component, so as not to confuse the two different cohorts being reported on. Graduation rates at 200 percent of normal time are calculated for all full-time, first-time bachelor’s degree-seeking students at 4-year institutions and for all full-time, first-time degree- and certificate-seeking undergraduate students at 2-year institutions.

Although this survey component reflects the current reality of extended time to degree, it, too, excludes part-time and transfer students.

Outcome Measures Survey

To track completion of part-time and transfer students, IPEDS began to implement a new outcome measures survey in 2015–2016. The new survey is designed to help policy makers track the progress of low-income students who receive federal aid (Pell grants and Stafford loans), asking institutions to report separately on students who do and do not receive federal aid. In a second, broader change, the survey asks institutions to report on four cohorts of entering students: (1) first-time, full-time students; (2) first-time, part-time students; (3) transferring full-time students; and (4) transferring part-time students. For each entering cohort, institutions report the number and proportion of students who completed their intended credential; were still enrolled; had enrolled in another institution; or had unknown whereabouts. All these outcomes are to be reported at 4, 6, and 8 years after entry. NCES has released preliminary data from this new survey component (Ginder et al., 2017) and plans to release final data for 2015–2016 in early 2018.
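
The reporting structure described above can be sketched as a set of nested counting cells; the labels below paraphrase the survey's categories and are not the official IPEDS field names.

```python
# Illustrative sketch of the outcome measures reporting structure: each
# entering cohort is reported separately by federal-aid status, with
# student counts in four outcome categories at 4, 6, and 8 years after
# entry. Labels are paraphrased, not official IPEDS field names.

COHORTS = [
    "first-time full-time",
    "first-time part-time",
    "transfer-in full-time",
    "transfer-in part-time",
]
AID_STATUS = ["received federal aid", "did not receive federal aid"]
OUTCOMES = ["completed credential", "still enrolled",
            "enrolled elsewhere", "status unknown"]
YEARS_AFTER_ENTRY = [4, 6, 8]

# One institution's submission is then a nested mapping of counts:
report = {
    cohort: {aid: {year: {outcome: 0 for outcome in OUTCOMES}
                   for year in YEARS_AFTER_ENTRY}
             for aid in AID_STATUS}
    for cohort in COHORTS
}
report["first-time part-time"]["received federal aid"][6]["completed credential"] = 120
```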

The Beginning Postsecondary Students Longitudinal Study

In contrast to IPEDS, which focuses on enrollment and completion (inputs and outcomes), the NCES Beginning Postsecondary Students Longitudinal Study (BPS) provides more detailed information on students’ progress (process). Unlike IPEDS data, which are collected annually and released three times each year, BPS data are collected and published less frequently, about every 3 to 4 years. However, they provide richer, more detailed data than those in IPEDS. To date, the BPS has followed cohorts of students who entered postsecondary education in 1990, 1996, 2004, and 2012. In each cycle, BPS tracks a cohort of students as they enter 2-year or 4-year institutions and collects data on their course-taking, persistence, completion, transition to employment, and demographic characteristics, among other data elements. Students included in the nationally representative sample complete three surveys: one at the end of their first academic year, one 3 years after they began postsecondary education, and another at 6 years.

Data are currently available from the 2004/09 BPS (BPS 04/09). The BPS 04/09 study followed a sample of more than 18,000 students who began higher education in academic year 2003−2004 for a total of 6 years, through 2009, and it merged data from the 2009 Postsecondary Education Transcript Study to supplement information collected from student and institutional surveys. NCES designed the BPS survey to sample institutions and students within institutions. The agency used multiple methods to obtain responses, yielding strong response rates and nationally representative data. For example, the BPS 04/09 sample comprised all 18,640 students determined to be eligible during the previous cycle of data collection (BPS 04/06). NCES obtained data from 16,680 respondents, either from the student interview or administrative sources, for a response rate of 89 percent (Wine, Janson, and Wheeless, 2011). The full BPS 04/09 dataset provides rich information on students’ course histories, enrollment and matriculation pathways, college experiences and perceptions, and retention and graduation outcomes. Data can be disaggregated by race and ethnicity, gender, socioeconomic status, enrollment status, disability status, field of study, and institution type. However, the sample sizes do not allow simultaneous disaggregation by demographic characteristics, field of study, and institution type.

The current cohort, BPS 12/17, began college in 2012, was followed up in 2014, and was followed up again in 2017; the data are not yet available. Continuing these regular cycles will be critical for informing some of the proposed indicators. In addition, more frequent data collection would allow the indicators to be updated annually, rather than only once every 3 years.

The National Study of Postsecondary Faculty

Nationally representative data from 4-year institutions on undergraduate STEM instructors were formerly available from the NCES National Study of Postsecondary Faculty (NSOPF) (Cataldi, Fahimi, and Bradburn, 2005). The study was based on faculty surveys conducted in 1988, 1993, 1999, and 2004. All four survey cycles included part-time as well as full-time faculty, and the 1993, 1999, and 2004 surveys included non-faculty personnel with teaching responsibilities. Topics included sociodemographic characteristics; academic and professional background; field of instruction; employment history; current employment status, including rank and tenure; workload; courses taught; publications; job satisfaction and attitudes; career and retirement plans; and benefits and compensation.

NCES ended this survey after the 2003–2004 academic year. The National Science Foundation (NSF) has expressed interest in working with NCES to revive the survey and expand it to include evidence-based teaching practices; it requested funding for fiscal year 2017 to reinstitute the survey and expand it to provide data on teaching practices, the evolving role of technology in education, and the changing nature of faculty work. To date, however, the committee is not aware of any steps taken to revive the survey.

National Student Loan Data System

The Department of Education’s Office of Federal Student Aid operates the National Student Loan Data System (NSLDS) as the central database for monitoring student financial aid. The office uses the database primarily for operational purposes, such as tracking federal grant and loan disbursements, the enrollment and repayment status of aid recipients, payments and remaining balances on federal loans, and borrower status (Executive Office of the President of the United States, 2015). NSLDS receives data from institutions of higher education, agencies that service federal loans, the Pell grant program, and other federal financial aid programs.4 Relevant to the committee’s goals and objectives, NSLDS includes students’ enrollment by academic term and program (including STEM programs).

Because about 70 percent of all graduating students have received Pell grants or other federal aid, the NSLDS covers the majority of the nation’s students. In addition, the characteristics of federal aid recipients at an individual institution are generally similar to those of the overall student population at that institution, in terms of admissions test scores, race and ethnicity, age, and marital status (Executive Office of the President of the United States, 2015). However, students who receive federal aid at an institution tend to have lower incomes than that institution’s overall student population. In addition, the percentage of students receiving financial aid varies across different types of institutions (public and private, for-profit and nonprofit, 2-year and 4-year).

Although the NSLDS data on students’ enrollment in STEM programs are not nationally representative of all students and institutions, the data could provide a rough approximation for the committee’s proposed indicators. Perhaps more importantly, the creation of this database has helped the Department of Education develop the technical capacity to obtain student unit record information from institutions and maintain that information in a central database. The department could potentially apply this technical capacity and expertise if the current legal ban on creation of a national student unit record database were overturned. For example, a new national student unit record system could be created by expanding the NSLDS (Miller, 2016; see Chapter 7 for further discussion).

The NSLDS has a unique relationship with the National Student Clearinghouse (NSC), a proprietary data source, which is described below. Many institutions voluntarily provide student unit record data to NSC, relying on it to prepare the enrollment reports they are required to send to NSLDS. Approximately once a month, these institutions submit to NSC a roster of detailed administrative information on all their students; NSC then matches those data to rosters of students sent to it by NSLDS and, based on its matching, provides reports on behalf of the institutions to NSLDS.
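
The matching step can be illustrated with a simplified sketch; the matching key and field names below are hypothetical, since the text does not specify NSC's actual matching criteria, but the basic idea is a join of institution-submitted enrollment records against the roster sent by NSLDS.

```python
# Illustrative sketch of the roster-matching step described above. The
# matching key and field names are hypothetical; NSC's actual matching
# criteria are not described here. The idea is simply that enrollment
# records submitted by institutions are joined against the NSLDS roster
# of aid recipients, and matched enrollment statuses are reported back.

def match_rosters(institution_records: list, nslds_roster: list) -> list:
    """Return enrollment reports for students appearing on the NSLDS roster."""
    enrollment_by_id = {rec["student_id"]: rec for rec in institution_records}
    reports = []
    for aid_recipient in nslds_roster:
        rec = enrollment_by_id.get(aid_recipient["student_id"])
        if rec is not None:
            reports.append({
                "student_id": rec["student_id"],
                "enrollment_status": rec["enrollment_status"],
                "term": rec["term"],
            })
    return reports
```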

___________________

4 As defined in Title IV of the Higher Education Act, such financial aid includes loans under the Federal Family Education Loan Program or William D. Ford Federal Direct Loan (Direct Loan) Program, as well as Perkins Loans, Pell Grants, Teacher Education Assistance for College and Higher Education Grants, Academic Competitiveness Grants or Science and Math Access to Retain Talent Grants, and Parent PLUS loans.


State Unit Record Data Systems

Partly in response to the federal legislation prohibiting the creation of a federal unit record data system and partly to address their own information needs, many states have constructed their own student unit record systems. Since 2007, NCES has supported these efforts, providing technical assistance and awarding federal funds to 47 states through the Statewide Longitudinal Data Systems Grant Program (grants were last awarded in 2015). NCES has encouraged state K–12 education agencies to collaborate with the respective state higher education and workforce development agencies to develop linked state datasets on education and employment. State higher education governing boards often manage these unit record data systems, using them to respond to questions from state policy makers that are not easily answered by other datasets. For example, these data systems can provide state-level information about the effect of policies (e.g., remedial and developmental education reforms, transfer policies) on student success (Armstrong and Zaback, 2016).

Although they are well established, data systems in some states are challenged by gaps in data coverage, concerns about privacy, and a lack of resources (Armstrong and Zaback, 2016). Created by state higher education systems to provide information on state-supported public higher education, these systems have limited coverage of private institutions and thus are not representative of all students and institutions in each state. A recent survey of 47 state data systems by the State Higher Education Executive Officers Association (Whitfield and Armstrong, 2016) found that all of them included 4-year public institutions and most (42) included public 2-year institutions, but only 27 included private for-profit institutions, and fewer than one-half (18) collected data from private nonprofit institutions. Among the states collecting data from private nonprofit institutions, most reported that they collected data only from those institutions that participated in state financial aid programs or that volunteered to provide data.

In some states, policy makers have adopted or are considering legislation—stemming from concerns about student privacy—that prevents linking of K–12, postsecondary, and employment databases. Although the federal Family Educational Rights and Privacy Act (FERPA) provides strict guidelines for when and how personally identifiable student information can be shared, such legislation typically prevents agencies from using personally identifiable information to link datasets.

Some respondents to the recent survey (Whitfield and Armstrong, 2016) noted a lack of funding and an inability to retain quality data analysts on staff as barriers to effective maintenance and use of these systems (Armstrong and Zaback, 2016). Federal funding of state data systems has been essential, but not all states have provided their own funding to maintain these systems after the federal grants expired.

PROPRIETARY DATA SOURCES

In reviewing private proprietary data sources, the committee considered the alignment of data elements with the committee’s goals and objectives, the frequency of data collection and release, their coverage and representativeness of 2-year and 4-year institutions, and the feasibility of disaggregating the data.

Private proprietary data sources have three primary weaknesses as potential sources for the committee’s proposed indicator system. First, because the data are proprietary, federal officials might be required to pay fees and negotiate intellectual property rights to access and use the data, or they might not be able to access the data at all. Second, most of the data from these sources are not disaggregated by discipline, limiting their usefulness for STEM indicators. Third, the coverage and representativeness of these data are uneven.

National Student Clearinghouse

The NSC is a private nonprofit organization launched in 1993 to streamline student loan administration; it now partners with 2-year and 4-year institutions to track student enrollment and verify educational achievements. Although institutional participation is voluntary, the organization states that more than 3,600 institutions, enrolling 98 percent of students in U.S. public and private institutions, share enrollment and degree records (National Student Clearinghouse, 2016b). Because institutions voluntarily submit their enrollment data, the quality of NSC data depends partly on how each postsecondary institution maintains its data and the processes used to extract those data for NSC. Responding to two researchers’ open letter about the quality of the data, the National Student Clearinghouse (2016a) posted the following statement on its website: “The accuracy and completeness of the data can realistically be measured only by the institutions themselves.” In a review of NSC data, Dynarski, Hemelt, and Hyman (2013) drew on several other national data sources and conducted a case study of Michigan students enrolled in higher education, concluding that coverage was highest among public institutions and lowest (but growing) among for-profit colleges. In addition, they found that enrollment coverage was lower for minority students than for other students but similar for males and females, and that there was substantial variation in coverage across states, institutional sectors, and time.


Institutional variation in reporting to NSC affects the data’s relevance to undergraduate STEM education. Some institutions do not report students’ academic field for every term, limiting the data’s usefulness for indicators of progress or retention specifically in STEM. In addition, the uneven reporting of all data elements poses a challenge to disaggregation of the data by demographic characteristics, such as racial and ethnic group, for the proposed indicators of equity, diversity, and inclusion.

Higher Education Research Institute Surveys

The Higher Education Research Institute (HERI) at the University of California, Los Angeles, conducts four relevant surveys: the Freshman Survey, the First-Year College/College Senior Survey, the Faculty Survey, and the Diverse Learning Environments Survey.

Freshman Survey

The HERI Freshman Survey gathers data from incoming first-time, full-time college students on their educational attitudes and aspirations. More than 1,900 four-year institutions have participated in the survey since 1966. In 2015, HERI identified 1,574 institutions that offer baccalaureate degrees and were included in IPEDS and invited them to participate. This national population of institutions was divided into 26 stratification groups, based on institutional race, type (e.g., university, 4-year college, 2-year college), control (e.g., public, private nonsectarian, Roman Catholic, other religious), and selectivity. Of the 1,574 institutions, 308 responded, a 19 percent response rate. Generally, the response rate among students in the institutions that participated has been high, averaging 75 percent. Since 2011, data from institutions have been included in the “national norms sample” only if there was a response rate of at least 65 percent among incoming full-time first-year students. Data from institutions just below this cutoff are included if the survey administration methods showed no systematic biases in freshman class coverage. In 2015, data from 199 institutions, representing 141,189 student responses, met these criteria and were included in the national norms sample (Eagan et al., 2016).

In 2015, the survey data were weighted by a two-step procedure. The first weight was designed to adjust for response bias within institutions, and the second weight was designed to compensate for nonresponding institutions within each stratification group by gender. The weighted data are nationally representative of first-time, full-time freshmen in nonprofit 4-year colleges and universities in the United States (Eagan et al., 2016; National Academies of Sciences, Engineering, and Medicine, 2016). Reflecting the high quality of these data, NSF relies on them for the undergraduate education section of the National Science Board’s biennial Science and Engineering Indicators report (National Science Foundation, 2016).

Because it does not include 2-year institutions and undersamples part-time students, the HERI Freshman Survey does not provide data that are nationally representative of the U.S. population of 2-year and 4-year students. In terms of disaggregation, although the data include the proportion of entering students who stated an intention to major in a STEM field and later completed a degree in that field, they do not measure students’ actual selections of major field (e.g., switching into STEM majors). The data can be disaggregated by demographic characteristics.

First-Year College/College Senior Survey

The HERI first-year college/college senior survey primarily includes private 4-year institutions, with no sampling of 2-year institutions. Within participating institutions, student response rates range from 25 to 75 percent. The resulting data are not nationally representative of the universe of 2-year and 4-year institutions and students. Although the data are disaggregated by demographic characteristics, increasing their relevance for equity indicators, they do not allow disaggregation by institutional type, and, as noted above, do not cover 2-year institutions.

Faculty Survey

With the suspension of the NCES National Study of Postsecondary Faculty in 2004 (see above), researchers and policy makers have increasingly relied on the HERI Faculty Survey. Although this survey is designed to include full- and part-time faculty members, most participating institutions choose to sample only full-time faculty. In addition, although both 2-year and 4-year institutions are invited to participate, 4-year nonprofit institutions predominate in the survey. The survey includes questions about working conditions and activities and teaching approaches.

In 2014, HERI identified a national population of 1,505 institutions that grant baccalaureate degrees that had responded to the IPEDS 2012–2013 human resources survey and invited them to participate in the HERI Faculty Survey. The national population was divided into 20 stratification groups based on type, control, and selectivity. Of those invited, 148 institutions participated, a 9 percent response rate. HERI also developed a supplemental sample of 67 institutions to enhance the number of respondents from types of institutions that participated at a lower rate than others to create a normative national sample of institutions (Eagan et al., 2014b).

To be included in the normative national sample, colleges were required to have responses from at least 35 percent of full-time undergraduate faculty, and universities were required to have responses from at least 20 percent of full-time undergraduate faculty. In 2014, data from 133 participating institutions and 63 supplemental sample institutions met these criteria and were included in the normative national sample. Among these institutions, faculty response rates have averaged 40 to 50 percent. In 2015, the sample data were weighted using a three-step procedure. The first weight was designed to adjust for response bias either within the participating institutions or the supplemental sample. The second weight was designed to correct for between-stratification cell differences in institutional participation. The third weight was the product of the first and second weights. Weighting each response in the norms sample brought the counts of full-time undergraduate faculty up to the national population number within each stratification cell, so that the data are representative of the national population of full-time undergraduate faculty at 4-year institutions (Eagan et al., 2014b).
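
The effect of this kind of cell-based weighting can be illustrated with a simplified sketch: within each stratification cell, respondents are weighted up so that the weighted count equals the known national population count for that cell. The cell label and counts below are hypothetical, and the sketch omits the response-bias and supplemental-sample adjustments described above.

```python
# Simplified sketch of cell-based weighting in the spirit of the procedure
# described above: within each stratification cell, respondents are
# weighted so that the weighted count of full-time undergraduate faculty
# matches the national population count for that cell. The cell label and
# counts are hypothetical.

def poststratification_weights(respondents_per_cell: dict,
                               population_per_cell: dict) -> dict:
    """Return one weight per stratification cell (population / respondents)."""
    return {
        cell: population_per_cell[cell] / n_resp
        for cell, n_resp in respondents_per_cell.items()
        if n_resp > 0
    }

weights = poststratification_weights(
    respondents_per_cell={"public university, high selectivity": 800},
    population_per_cell={"public university, high selectivity": 24_000},
)
# Each respondent in that cell represents 30 faculty members nationally.
print(weights)
```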

Because this survey provides data on faculty members’ discipline and teaching practices, it has the potential to provide data on evidence-based STEM educational practices to monitor progress toward Goal 1. However, the teaching practices it includes do not necessarily reflect the evidence-based STEM educational practices identified in the relevant research. In addition, it is based primarily on full-time faculty at nonprofit 4-year institutions and thus is not nationally representative of all types of faculty at both 2-year and 4-year institutions.

Diverse Learning Environments Survey

Students’ perceptions of their learning environments become their lived realities, and data for indicators of equity, diversity, and inclusion would likely need to be collected from surveys of STEM students and faculty. The HERI Diverse Learning Environments Survey includes a validated sense of belonging measure based on four items (I feel a sense of belonging to this college; I see myself as a part of the campus community; I feel I am a member of this college; If asked, I would recommend this college to others). The same instrument asks students the extent to which they agree that “faculty members are approachable” in their academic program and that the respondent has “a peer support network among students” in their major.

This survey does not provide nationally representative data related to students’ perceptions of equity, diversity, and inclusion. Only about 30 institutions have administered the survey in each of the past 2 years, and response rates have been low, averaging 25 percent of students at the participating institutions.


National Survey of Student Engagement

In 1998, the National Center for Higher Education Management Systems launched the development of a new survey focused on student engagement, college outcomes, and institutional quality. The survey designers drew on research and theory linking student engagement with persistence and success. That survey is now the National Survey of Student Engagement (NSSE) and is administered by the Center for Postsecondary Research at Indiana University. It asks students about their own learning activities, instructors’ behavior, and their perceptions of the college experience, including self-reported learning gains in areas such as acquiring job-related knowledge and skills, writing clearly and effectively, and contributing to the welfare of their community. NSSE also includes questions about students’ engagement in “high-impact practices” (see Chapter 3), which overlap to some degree with evidence-based STEM educational practices, and the data are broken down by race and ethnicity and gender.

Since 2000, more than 1,600 colleges and universities have administered the survey to first-year and fourth-year students (Center for Postsecondary Research, 2017b). Although coverage of different types of 4-year institutions is good, the survey does not cover 2-year institutions, so the data are not nationally representative. According to the NSSE website, in 2016, student response rates within participating institutions ranged from 5 to 77 percent, with an average of 29 percent.5 In a recent study, Fosnacht and colleagues (2017) used data from NSSE administrations between 2010 and 2012 to simulate the effects of low response rates and low respondent counts. They found institution-level estimates for several measures of college student engagement to be reliable under low response rate conditions (ranging from 5% to 25%) and with as few as 25 to 75 respondents, based on a conservative reliability criterion (r ≥ .90), albeit with greater sampling error and less ability to detect statistically significant differences with comparison institutions.
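
The flavor of such a simulation can be conveyed with a simplified sketch, which is not a reproduction of the authors' method: repeatedly draw subsamples of respondents at a given response rate, compute the institution-level mean of an engagement measure from each draw, and examine how much the estimates vary around the full-sample mean. The population values below are simulated.

```python
# Rough illustration of the kind of simulation described by Fosnacht and
# colleagues (2017), not a reproduction of their method: draw repeated
# subsamples at a given "response rate," compute the institution-level
# mean of an engagement measure from each subsample, and compare the
# spread of the estimates with the full-sample mean.

import random
import statistics

def simulate_estimates(scores, response_rate, n_draws=1000, seed=0):
    """Institution-level means from repeated subsamples of respondents."""
    rng = random.Random(seed)
    k = max(1, int(len(scores) * response_rate))
    return [statistics.mean(rng.sample(scores, k)) for _ in range(n_draws)]

# Hypothetical institution with 1,000 students; a 5 percent response rate
# yields 50 respondents per draw, and the spread of the draws shows the
# sampling error that a low response rate introduces.
rng_pop = random.Random(42)
population = [rng_pop.gauss(3.0, 0.8) for _ in range(1000)]
estimates = simulate_estimates(population, response_rate=0.05)
print(round(statistics.mean(population), 2), round(statistics.stdev(estimates), 3))
```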

Some authors have raised questions about the validity of NSSE data. For example, Porter (2013) examined students’ self-reported learning gains by developing and testing a theory of college student survey response. He found little evidence of the construct and criterion validity of self-reported learning gains. Campbell and Cabrera (2014) analyzed NSSE data from student responses regarding their participation in three “deep approaches to learning” at a single large public research university. Using confirmatory factor analyses and structural equation modeling, the authors found that the three scales were internally consistent, but participation in deep learning was not related to students’ cumulative grade point averages.

___________________

5 See http://nsse.indiana.edu [August 2017].


Community College Survey of Student Engagement

The Community College Survey of Student Engagement (CCSSE), developed in partnership with NSSE, was established in 2001 as a project of the Community College Leadership Program at the University of Texas at Austin. Students are asked about their engagement in five groups of benchmark practices thought to enhance learning: active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners. This survey has moderate coverage of 2-year institutions, but the data are not nationally representative. The questions about the validity of student self-reported learning gains noted above (Porter, 2013) also apply to the data from this survey.

Faculty Survey of Student Engagement

The Center for Postsecondary Research at Indiana University conducts the Faculty Survey of Student Engagement (FSSE) as a complement to NSSE. The web-based survey, which is designed for all instructors (faculty members, other instructors, graduate student instructors), asks questions about instructors’ perceptions:

  • how often students engage in different activities;
  • the importance of various areas of learning and development;
  • the nature and frequency of their interactions with students; and
  • how they organize their time, in and outside the classroom.

According to the Center for Postsecondary Research (2017a), more than 250,000 instructors from more than 800 institutions have responded to FSSE since 2003. However, only about 20–25 percent of faculty members at participating institutions have responded, and few 2-year institutions participate. The resulting data are not nationally representative.

MONITORING SYSTEMS

The committee was not able to locate any existing systems that are designed specifically for monitoring the status and quality of undergraduate STEM education. However, it did identify existing monitoring systems that include elements relevant to undergraduate STEM, as discussed below.

Science and Engineering Indicators

Science and Engineering Indicators (SEI) is a congressionally mandated biennial report on U.S. and international science and engineering prepared by NSF’s National Center for Science and Engineering Statistics (NCSES) under the guidance of the National Science Board (National Science Foundation, 2016; Khan, 2016). The report presents indicators, defined as “quantitative representations that might reasonably be thought to provide summary information bearing on the scope, quality, and vitality of the science and engineering enterprise” (National Science Foundation, 2016, p. F-2). The indicators are designed to enhance understanding of the current environment and to inform policy development.

Chapter 2 of SEI presents indicators on human capital, including STEM education at the K–12, undergraduate, and graduate levels, along with statistics on STEM graduates who are in the workforce. These indicators aim to inform stakeholders about inputs, processes, outputs, and outcomes of the STEM education system. Key indicators for undergraduate education include: enrollment by type of institution, field, and demographic characteristics; intentions to major in STEM fields; and recent trends in the number of earned STEM degrees.

Enrollment

The levels and flows of enrollment in STEM show how the different STEM fields are changing over time and so can inform decision makers charged with directing resources to undergraduate education. For postsecondary education, the enrollment data include the number of students enrolled in STEM relative to other fields; change over time in the number of undergraduate degrees conferred; demographic characteristics of students enrolled in STEM fields, including citizenship status; and the number of students enrolled in 2-year institutions by demographic characteristics. These statistics are tabulated from the IPEDS fall enrollment survey.

Intentions and Attrition

In response to policy makers’ interest in retaining students in STEM to ensure an adequate supply of STEM professionals, SEI reports intentions of students to major in STEM fields by ethnicity, race, and gender. Since 1971, the data source for this indicator has been the HERI Freshman Survey (described above). SEI also presents statistics on attrition in STEM fields, mainly citing studies by Eagan and colleagues (2014a) and Chen and Soldner (2013).

Earned STEM Degrees

The health of a field of study is often represented by growth rates and other statistics that show dynamics of the system. The most recent SEI (National Science Foundation, 2016) presents the number and growth rates of associate and baccalaureate degrees awarded in STEM, by demographic characteristics, drawing on the IPEDS completion survey.

Data Gaps

A recent review of SEI (National Research Council, 2014) noted that, although it provides a bevy of statistics on 4-year and postgraduate enrollments and degrees, it needs improved information on 2-year students who later earn higher degrees in STEM. Noting an increase in students who attend 2-year institutions as part of their 4-year STEM education, the review recommended that NCSES track graduates of 2-year institutions in STEM fields and publish data on these students’ persistence at different levels of education (National Research Council, 2014, p. 18).

Proprietary Monitoring Systems

In response to growing accountability pressures in higher education, many multi-institution consortia have been formed to promote reform and improvement. These groups often emphasize the collection and analysis of data on student progress as a way to inform internal improvement efforts and create consortium-wide benchmarks of progress. Building on the work of these consortia, other groups have emerged to focus specifically on approaches to data collection and analysis and development of new measures of higher education quality. Examples of these groups include Access to Success, Achieving the Dream, Completion by Design, Complete College America, National Community College Benchmarking Project, and Voluntary Institutional Metrics Project.

These groups often gather data from institutions in the form of specific measures of student progress and outcomes. Like the indicators proposed by the committee, these measures compile data on key aspects of higher education quality into an easily understandable form that educational policy makers and practitioners can use to monitor quality over time. For example, Complete College America (2014) and the National Governors’ Association developed measures for state higher education systems to use in voluntary data gathering: see Box 6-2. The Voluntary Institutional Metrics Project expanded this work by using similar measures to gather data on public, private nonprofit, and private for-profit institutions, including data on students who enroll anytime during the academic year, not only in the fall (HCM Strategists, 2013).

Measures similar to these have been used by groups such as Complete College America and Achieving the Dream to gather institutional survey data and student unit record data from many public 2-year and 4-year institutions. The institutions sampled are not nationally representative of the universe of public and private 2-year and 4-year institutions, and because the resulting data are proprietary, they might not be available for the committee’s proposed indicator system. However, because the content of some measures used by higher education reform consortia overlaps with the content of some of the proposed indicators, they could be incorporated in expanded IPEDS institutional surveys (see Chapter 7).

DATA FOR EACH INDICATOR

Drawing on the above review of data sources and monitoring systems, the committee identified potential data sources and research needs for each of its 21 proposed indicators. For each indicator, the committee considered how the potential sources would need to be revised to support that indicator, such as by revising the data content to align with the content of the indicator or by expanding a survey to ensure national coverage of all groups of students and institutions. For some indicators, the committee determined that research would be needed to more clearly define them and to develop the best measurement approaches prior to data collection. Through this process, the committee found that the availability of data was limited. For some indicators, nationally representative datasets are available, but when these data are disaggregated, first to focus on STEM students and then to focus on specific groups of STEM students, the sample sizes become too small to yield statistically reliable estimates. For other indicators, no data are available from either public or proprietary sources. The committee’s analysis is presented below and summarized in Table 6-2.

Indicator 1.1.1: Use of Evidence-Based STEM Educational Practices in Course Development and Delivery

Data Available and Potentially Available

Currently, few data are available on the extent to which educators (faculty members, graduate student instructors, adjunct instructors, and others) throughout the nation use evidence-based practices in course development and delivery. The limited data currently available come primarily from self-report surveys, often targeted to instructors who participated in professional development programs (National Research Council, 2012, Ch. 8; Manduca et al., 2017). Because those who participate in professional development may be more motivated than other instructors to learn about and adopt evidence-based practices, the survey responses are unlikely to be representative of the teaching practices of all instructors in the discipline, nationally.


TABLE 6-2 Research Needs and Potential Data Sources for the Proposed Indicators

Indicator Sourcea Needs for Research and Modification of Data Collection Instruments and/or Systems Needs for Coverage Improvements
1.1.1 Use of Evidence-Based Educational Practices in Course Development and Delivery HERI Faculty Survey Research to more clearly define evidence-based STEM educational practices; develop and add items Include 2-year institutions; more systematic inclusion of graduate teaching assistants
NSOPF Research to more clearly define evidence-based STEM educational practices; renew survey and develop and add items Expand number of institutions sampled to allow more granular disaggregation
Faculty Survey of Student Engagement Research to more clearly define evidence-based STEM educational practices; develop and add items Include more 2-year institutions and increase response rates at participating institutions
1.1.2 Use of Evidence-Based STEM Educational Practices Outside the Classroom None Research to more clearly define evidence-based STEM educational practices outside the classroom
1.2.1 Extent of Instructors’ Involvement in Professional Development HERI Faculty Survey None Include 2-year institutions; more systematic inclusion of graduate student instructors
NSOPF Renew faculty survey and review professional development items from the 1988 department chairperson survey for possible inclusion Expand number of institutions sampled to allow more granular disaggregation
1.2.2 Availability of Support or Incentives for Evidence-Based Course Development or Course Redesign None Research to identify and clearly define key supports or incentives
1.3.1 Use of Valid Measures of Teaching Effectiveness None Research to identify valid measures of teaching effectiveness
1.3.2 Consideration of Evidence-Based Teaching in Personnel Decisions by Departments and Institutions None Research on how to measure consideration of evidence-based teaching
2.1.1 Institutional Structures, Policies, and Practices That Strengthen STEM Readiness for Entering and Enrolled Students None Research to identify and define key structures, policies, and practices; components could be added to IPEDS on the basis of this research
2.1.2 Entrance to and Persistence in STEM Educational Programs BPS None Expand number of institutions and students sampled to allow more granular disaggregation
HERI Freshman Survey and NSC None Incorporate 2-year institutions in Freshman Survey; increase coverage of students’ academic programs in NSC data provided by institutions
2.1.3 Student Participation in Evidence-Based STEM Educational Practices HERI First College Year/Senior Survey Research to more clearly define evidence-based educational practices; develop and add items Include public and 2-year institutions and universities
NSSE Same as above Expand sample of 4-year institutions
CCSSE Same as above Increase coverage of 2-year institutions
2.2.1 Diversity of STEM Degree and Certificate Earners in Comparison with Diversity of Degree and Certificate Earners in All Fields IPEDS Include items on students’ Pell (socioeconomic) status and disability status in data provided by institutions None
2.2.2. Diversity of Transfers from 2- to 4-year STEM Programs in Comparison with Diversity of Students in 2-year STEM Programs NSC Add student attributes (gender, race and ethnicity, Pell status, disability status) to the data voluntarily submitted by institutions to NSC More comprehensive participation among and coverage of all types of postsecondary institutions
2.2.3. Time to Degree for Students in STEM Academic Programs NSC Same as above Increase coverage of students’ academic programs in the data voluntarily submitted by institutions to NSC
2.3.1. Diversity of STEM Faculty Members in Comparison with Diversity of STEM Graduate Degree Holders IPEDS Add departmental discipline None
HERI Faculty Survey None Include 2-year institutions
NSF Survey of Doctoral Recipients Add departmental discipline for faculty Misses those with faculty appointments who lack doctorates
2.3.2. Diversity of STEM Graduate Student Instructors in Comparison with Diversity of STEM Graduate Students IPEDS Add departmental discipline None
HERI Faculty Survey None Include 2-year institutions; more systematic inclusion of graduate teaching assistants
2.4.1. Students Pursuing STEM Credentials Feel Included and Supported in their Academic Programs and Departments HERI Diverse Learning Environments Survey None Expand coverage of 2-year and 4-year institutions
2.4.2. Faculty Teaching Courses in STEM Disciplines Feel Supported and Included in Their Departments HERI Faculty Survey with Campus Climate Module None Include 2-year institutions; more systematic inclusion of graduate teaching assistants
2.4.3. Institutional Practices Are Culturally Responsive, Inclusive, and Consistent across the Institution None
3.1.1 Completion of Foundational Courses, including Developmental Education Courses, to Ensure STEM Program Readiness BPS 04/09 Research to more clearly define developmental and foundational courses Expand number of institutions and students sampled to allow more granular disaggregation
3.2.1 Retention in STEM Programs, Course to Course and Year to Year BPS 04/09 None Expand number of institutions and students sampled to allow more granular disaggregation
HERI Freshman Survey and NSC None Incorporate 2-year institutions in Freshman Survey; increase coverage of students’ academic programs in NSC data provided by institutions
3.2.2 Transfers from 2-year to 4-year STEM Programs in Comparison with Transfers to All 4-year Programs NSC None Increase coverage of students’ academic programs in data provided to NSC by institutions
3.3.1 Number of Students Who Attain STEM Credentials over Time (disaggregated by Institution Type, Transfer Status, and Demographic Characteristics) IPEDS Include items on students’ Pell (socioeconomic) status and disability status in data provided by institutions None

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; CCSSE, Community College Survey of Student Engagement; HERI, Higher Education Research Institute; IPEDS, Integrated Postsecondary Educational Data System; NSC, National Student Clearinghouse; NSOPF, National Survey of Postsecondary Faculty; NSSE, National Survey of Student Engagement.

a Refer to Table 6-1.

More generally, the validity of data from any self-report survey can be threatened by faking, cheating, and respondents’ motivations. Respondents may be influenced by social desirability (the tendency to think of and present oneself in a favorable light) and therefore not respond accurately. For example, Manduca and colleagues (2017) recently found that instructors reported frequently using evidence-based teaching practices on self-report surveys, but observational methods (discussed below) indicated that these instructors rarely did so. If self-report surveys are used for high-stakes purposes (e.g., to inform decisions about promotion and tenure), this can create additional incentives to tailor one’s responses to present oneself in the best possible light (Sackett, 2012).

Measuring teaching is difficult, and different measurement methods (e.g., self-report surveys, interviews, observations) have varying strengths, weaknesses, and costs (William T. Grant Foundation, Spencer Foundation, and Bill & Melinda Gates Foundation, 2014). Observational methods, such as the Reformed Teaching Observation Protocol (Piburn and Sawada, 2000) and the more recent Classroom Observation Protocol for Undergraduate STEM (Smith et al., 2013), require trained experts who analyze either videotapes or actual instruction using protocols that describe various teaching practices. These methods provide high-quality data, but they are time-consuming and expensive to implement, even for small groups of instructors. For practical reasons, and to capture information on teaching practices among larger samples of instructors, development of self-report surveys is continuing (e.g., Wieman and Gilbert, 2014). For these same practical reasons, self-report surveys of instructors would be the most likely source of national data for indicators to monitor the use of evidence-based practices in and outside the classroom.

Since 2004, when NCES last administered the National Survey of Postsecondary Faculty (NSOPF), the HERI has conducted the only comprehensive, nationally representative survey of faculty, and this survey covers only faculty at 4-year colleges and universities. The HERI Faculty Survey includes questions about instructors’ activities related to research, teaching, and service, as well as their perceptions of students, campus administration, and workplace stressors. It also invites respondents to report on their participation in a range of professional development opportunities. The resulting (weighted) dataset represents the national population of full-time faculty with responsibility for teaching undergraduates at 4-year colleges and universities. However, the HERI Faculty Survey data have four significant limitations as indicators of use of evidence-based educational practices. First, the data are self-reports, which as noted above sometimes do not correspond with more direct measures of instruction, such as observational protocols. Second, the HERI data are collected primarily from full-time instructors at 4-year institutions, missing the growing numbers
of part-time instructors, as well as the large population of full-time and part-time instructors at 2-year colleges. This gap is significant given the committee’s charge to focus especially on indicators of the first 2 years of undergraduate education. Third, it is not clear which instructional practices mentioned in the HERI Faculty Survey items are evidence based, although some survey items refer to instructional practices that are commonly cited in discipline-based educational research, such as cooperative learning, group projects, and undergraduate research experiences (National Research Council, 2012). Finally, reports of instructional behaviors—even those that are evidence based (e.g., using classroom response systems and technology to provide students with fast feedback)—do not include information about whether the instructional behavior or approach was carried out appropriately and effectively.

The FSSE (described above) also collects data on teaching practices. As with the HERI Faculty Survey, data are collected primarily from full-time instructors teaching at 4-year colleges and universities. Thus, neither FSSE nor the HERI Faculty Survey provides sufficient information for this indicator from full-time community college instructors or from part-time instructors at 2-year and 4-year colleges and universities.

NSF requested funding for fiscal year 2017 to re-institute the National Survey of Postsecondary Faculty in partnership with NCES (National Science Foundation, 2016, p. 52). Specifically, NSF planned to participate in revising the survey to gather data on teaching practices, the evolving role of technology in education, and the rapidly changing nature of faculty work, which can inform approaches to professional development. Such an endeavor would overcome the representation shortcomings of the HERI Faculty Survey and the FSSE, but the committee can offer no evaluation as to whether the content of the yet-to-be-developed instrument would include items that sufficiently address the data needed to measure Indicator 1.1.1.

Evidence on students’ exposure to evidence-based instructional practices in mathematics may soon be available through the work of the National Science and Technology Council Interagency Working Group on STEM Education. In a quarterly progress report from the council, Handelsman and Ferrini-Mundy (2016) indicated that the interagency working group was on track to add an item on undergraduate mathematics instruction to the second follow-up of the NCES High School Longitudinal Study of 2009. Since that time, the survey data have been collected, including data on undergraduate mathematics instruction. NCES is currently preparing the data files and expects to release the new data in early 2018.6

___________________

6 See https://nces.ed.gov/surveys/hsls09 [November 2017].

Data and Research Needs

The first step in developing indicators of instructors’ use of evidence-based practices in course development and delivery is additional research to develop common definitions of evidence-based course development and delivery and to identify its specific characteristics and components. That research could lead to new elements for inclusion in existing federal surveys. Alternatively, the federal government could contract with one or more of the existing survey organizations to include new elements in their existing surveys and administer the revised surveys to nationally representative samples of instructors at 2-year and 4-year institutions.

Indicator 1.1.2: Use of Evidence-Based Practices Outside the Classroom

Data Available and Potentially Available

No nationally representative data are currently available on the extent to which students in 2-year and 4-year institutions participate in evidence-based educational practices outside the classroom. There are research and measurement challenges both in defining such evidence-based practices outside the classroom and in documenting the extent to which they are used (National Academies of Sciences, Engineering, and Medicine, 2017). These measurement challenges are very similar to those for Indicator 1.1.1.

Data and Research Needs

Further research is needed to clearly define a specific set of evidence-based practices outside the classroom that show evidence of effectiveness for improving mastery of STEM concepts and skills and persistence in undergraduate STEM courses. A possible starting point for this research would be to review the recent report on undergraduate research experiences (National Academies of Sciences, Engineering, and Medicine, 2017) and Estrada’s (2014) review and synthesis of research and evaluation studies on co-curricular programs. This research would provide a foundation for developing new survey questions that could potentially be included in the redesign of the National Survey of Postsecondary Faculty. However, because advising, mentoring, summer bridge programs, and other experiences are often provided by student support staff or others, the survey sample might have to be broadened to include such non-faculty groups. As an alternative, new survey questions could be included in the HERI Faculty Survey, but it, too, would have to be broadened to include student support staff, and all types of instructors at both 2-year and 4-year institutions.

Indicator 1.2.1: Extent of Instructors’ Involvement in Professional Development

Data Available and Potentially Available

National-level data on the use of professional development by instructors are not currently available. Surveys of instructors would be the most likely source to fill this gap. Complementary surveys of academic deans, department chairs, and central administrators would help to develop a picture of what kinds of professional development programs are offered at an institution. (The limitations of current surveys of faculty and other instructors are discussed above, under Indicator 1.1.1.)

Data and Research Needs

Data and research needs in the area of faculty surveys are discussed above, under Indicator 1.1.1.

Indicator 1.2.2: Availability of Support or Incentives for Evidence-Based Course Development or Course Redesign

Data Available and Potentially Available

There are no national data currently available on the extent of support available to faculty for evidence-based course development or redesign. Several dimensions of support have the potential to be measured in future surveys. For example, the Partnership for Undergraduate Life Sciences Education (PULSE) rubrics, developed to measure change in life sciences departments, include rubrics for measuring departmental support for course redesign by instructors at 2-year and 4-year institutions (Brancaccio-Taras et al., 2016).7 Future surveys might use similar rubrics to provide a measure of the extent of support at the institutional level. This approach would need to consider potential variations in scores across different STEM areas within an institution and, ideally, would weight the results by the number of instructional faculty.

Data and Research Needs

Research is needed to determine which of the dimensions of support for instructors identified in previous research are most critical to successful evidence-based course development and redesign and what additional dimensions may be needed.

___________________

7 See http://www.lifescied.org/content/12/4/579.full [July 2017].

For example, the PULSE rubrics address several dimensions of support, including support for teaching/learning needs in STEM and faculty mentoring for the teaching role (PULSE Fellows, 2016). Such research is the first step toward developing new survey questions that might be included in existing surveys of institutions (e.g., IPEDS) or in a revived National Survey of Postsecondary Faculty.

Indicator 1.3.1: Use of Valid Measures of Teaching Effectiveness

Data Available and Potentially Available

The committee was not able to find any existing national data on the extent to which 2-year and 4-year institutions use valid measures of teaching effectiveness to measure instructors’ performance. New instruments have been developed to measure instructors’ teaching practices (e.g., Wieman and Gilbert, 2014; Walter et al., 2016), but to date they have been administered primarily to small groups of instructors for the purpose of measuring the instruments’ reliability and validity. Drinkwater, Matthews, and Seiler (2017) report on one larger-scale use: The authors administered the Teaching Practices Inventory (Wieman and Gilbert, 2014) to measure the use of evidence-based teaching approaches in 129 courses across 13 departments and compared the results with those from a Canadian institution to identify areas in need of improvement.

Data and Research Needs

Research is needed to identify instruments designed to measure teaching practices in the classroom, determine which ones are valid and reliable, and consider whether and how the detailed questions they contain might be translated for inclusion in large-scale national surveys. A useful starting point would be a description of 12 strategies for measuring teaching effectiveness in Berk (2005): student ratings, peer ratings, self-evaluation, videos, student interviews, alumni ratings, employer ratings, administrator ratings, teaching scholarship, teaching awards, learning outcomes, and teaching portfolio. Researchers will need to take care when evaluating the various assessment instruments that have been developed using each of these strategies; the data resulting from the different instruments can be interpreted and used in very different ways. For example, typical student evaluations ask students what they liked about a particular aspect of a class, but better information is gained by evaluation instruments that ask students what they learned from particular aspects of a class (Seymour et al., 2000).

Indicator 1.3.2: Consideration of Evidence-Based Teaching in Personnel Decisions by Departments and Institutions

Data Available and Potentially Available

No national data are currently available on the extent to which departments or institutions explicitly consider use of evidence-based teaching practices when making personnel decisions.

Data and Research Needs

Further research is needed to identify and analyze potential sources of data for this proposed indicator. Possible data sources could be based on (1) an evaluation of institutional policies (e.g., institutions could be asked to submit policy statements that could be analyzed with a rubric); (2) self-reported data from institutions about their hiring and promotion practices (e.g., institutional leaders could be asked whether departmental or institutional policy statements explicitly consider evidence-based teaching in personnel decisions); and (3) the perceptions of administrators, faculty, and others about organizational features (e.g., instructors could be asked about their perceptions about the extent to which their institution explicitly considers evidence-based teaching in personnel decisions). For (1) and (2), it is important to note that official policies can often differ substantially from actual practices.

Indicator 2.1.1: Institutional Structures, Policies, and Practices That Strengthen STEM Readiness for Entering and Enrolled College Students

Data Available and Potentially Available

Currently, no existing data source provides information about the prevalence of programs and practices that can strengthen STEM readiness for U.S. college students. NCES does collect data on institutions’ developmental education offerings (e.g., presence, broad academic area, number of students enrolled) through IPEDS and on student enrollment in such offerings through occasional longitudinal surveys such as BPS. However, there is no survey that measures institutional practices and programs beyond developmental education that bolster students’ STEM readiness.

Data and Research Needs

Research is needed to more clearly define the specific types of programs and practices that support diverse students’ entrance to and progression through STEM degree programs. Based on this research and conceptualization, counts of the presence of such programs (e.g., use of multiple measures to determine appropriate mathematics and writing placement, availability of supplemental instruction for introductory STEM courses) could potentially be included as new elements in IPEDS.

Indicator 2.1.2: Entrance to and Persistence in STEM Academic Programs

Data Available and Potentially Available

Currently, only limited national data are available for this indicator. Longitudinal surveys developed and administered by NCES (e.g., BPS and Baccalaureate and Beyond) are the primary means of tracking student persistence in STEM and other specific academic programs. These in-depth surveys provide academic data submitted by institutions, which may be more reliable than students’ self-reported major fields. In addition, several of the HERI surveys—the Freshman Survey, Your First College Year Survey, and the College Senior Survey—allow researchers to examine STEM persistence using matched samples. However, although the HERI data indicate whether students who stated an intention to major in a STEM field completed a degree in that field, they do not provide a measure of students’ actual major field selections between college entrance and graduation, such as switching into STEM majors. Moreover, these data are self-reported by student respondents and are therefore less reliable than data provided directly by institutions.

Data and Research Needs

The nationally representative longitudinal surveys administered by NCES face challenges with respect to frequency of data collection and coverage of data. Without more frequent data collection for new cohorts, policy makers may wait years for an updated indicator of the status of diverse students’ entrance to and persistence in STEM academic programs. Connecting the HERI Freshman Survey with NSC term-to-term enrollment and completion data, supplemented with academic major or field of study, would provide substantial flexibility in understanding the STEM persistence and completion patterns of diverse students. However, the data would be limited to students who attend nonprofit 4-year colleges and universities.

HERI’s Freshman Survey collects a wealth of demographic information on entering first-year students at 4-year institutions, which enables greater flexibility in how data are disaggregated across student characteristics. The survey also provides respondents with the opportunity to indicate their intended major, though such self-reported data may pose some reliability challenges. The NSC data offer the possibility of examining points of departure from STEM fields measured across academic terms or years of study. However, as discussed in Chapter 3, HERI’s Freshman Survey does not include 2-year institutions, and part-time students attending 4-year institutions are often undersampled. In addition, neither HERI nor NSC provides full coverage of all 4-year institutions. Finally, data from both HERI and NSC are proprietary, which may delay or impede federal agencies’ efficient and timely access to the data necessary for this indicator.

Indicator 2.1.3: Equitable Student Participation in Evidence-Based STEM Educational Programs and Experiences

Data Available and Potentially Available

Currently, no nationally representative data are available on students’ participation in evidence-based STEM educational programs and experiences (National Academies of Sciences, Engineering, and Medicine, 2017). Several national surveys collect data on students’ participation in "high-impact practices" (see Chapter 3), including the National Survey of Student Engagement, the Community College Survey of Student Engagement, and HERI’s Your First College Year and College Senior Survey. Although comprehensive in their collection of data on the extent of students’ engagement with those practices, the surveys do not focus on evidence-based STEM educational practices and programs as defined by the committee. These surveys also have a number of other limitations: in particular, most of them include only a small number of participating institutions, so the resulting data are not necessarily representative of the national universe of either 2-year or 4-year institutions and their students (as discussed above). In addition, these surveys and the resulting data are proprietary, which may delay or restrict access for use in the proposed indicator system, and students self-report their participation in these activities, which may introduce substantial measurement error. This concern is underscored by the likelihood that students completing any of these instruments may not share a common definition of high-impact practices, and the respective surveys do not offer respondents the opportunity to associate participation in a practice with a particular discipline or field of study (e.g., STEM).

Data and Research Needs

As noted in the previous chapter, additional research is needed to develop common definitions of evidence-based STEM educational practices and programs and to identify the specific characteristics and components of those experiences and programs that are critical to advancing student outcomes in STEM. That research could lead to new elements for inclusion in existing federal surveys. Alternatively, the federal government could contract with one or more of the existing survey organizations to include new elements in their existing surveys and administer the revised surveys to nationally representative groups of 2-year and 4-year institutions.

Indicator 2.2.1: Diversity of STEM Degree and Certificate Earners in Comparison with Diversity of Degree and Certificate Earners in All Fields

Data Available and Potentially Available

The IPEDS Completion Survey can provide data to measure this indicator. Completions can be disaggregated by type of credential, field of credential, institutional type, gender, and race and ethnicity. These opportunities for disaggregation represent an important starting point for measuring this indicator. However, the IPEDS Completion Survey does not allow disaggregation by all dimensions in the committee’s broad definition of diversity, including persons with disabilities, socioeconomic status, and first-generation status. In addition, the level of granularity allows disaggregation only by broad categories of race and ethnicity (e.g., Asian, Hispanic). More refined categories of race and ethnicity (e.g., Southeast Asian American, Chinese American, Mexican American, Cuban American) would offer important insight into the nation’s progress toward achieving representational diversity in earners of STEM credentials.
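To illustrate how such a comparison might be computed, the following sketch (in Python) calculates each demographic group’s share of STEM credentials and of credentials in all fields and takes the ratio of the two shares. It is a minimal illustration only; the column names, group categories, and STEM field list are hypothetical and do not correspond to actual IPEDS variable names.

```python
# Hypothetical illustration: compare the demographic mix of STEM credential
# earners with that of credential earners in all fields. Input is assumed to be
# one row per (field, demographic group) with a count of awards; all names are
# illustrative, not IPEDS variable names.
import pandas as pd

completions = pd.DataFrame({
    "field":  ["Biology", "History", "Engineering", "English", "Biology"],
    "group":  ["Hispanic", "Hispanic", "White", "White", "White"],
    "awards": [120, 80, 300, 150, 400],
})
stem_fields = {"Biology", "Engineering"}  # assumed STEM definition

def group_shares(df):
    """Return each group's share of the credentials in df."""
    totals = df.groupby("group")["awards"].sum()
    return totals / totals.sum()

stem_shares = group_shares(completions[completions["field"].isin(stem_fields)])
all_shares = group_shares(completions)
# A ratio near 1.0 means a group earns STEM credentials at about the same rate
# as it earns credentials overall; values below 1.0 indicate underrepresentation.
parity = (stem_shares / all_shares).rename("stem_parity_ratio")
print(parity)
```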

Indicator 2.2.2: Diversity of Transfers from 2-Year to 4-Year STEM Programs in Comparison with Diversity of Students in 2-Year STEM Programs

Data Available and Potentially Available

The data in IPEDS do not offer the possibility of exploring the characteristics of students who transfer to 4-year colleges and universities from 2-year institutions. However, data submitted to and reported by the NSC may prove useful in measuring this indicator. Since more than 3,600 2-year and 4-year colleges and universities participate, NSC data cover the vast majority of postsecondary students in the United States.

Term-to-term enrollment data include information about students’ degree program, their fields of study, and the institution(s) in which they enrolled. Therefore, measures of the proportion of students who transfer from 2-year institutions to STEM-focused 4-year programs could be derived from NSC data. Although NSC data appear promising, they have significant limitations with respect to the availability of demographic characteristics; for example, they have limited coverage of students’ gender, race and ethnicity, and Pell status.8 Thus, new data or changes in the data voluntarily submitted by institutions to NSC would be required before the data could be used to inform this indicator.

Indicator 2.2.3: Time-to-Degree for Students in STEM Academic Programs

Data Available and Potentially Available

Only limited data are currently available for this indicator. Data from IPEDS do not offer the possibility of isolating 2-, 3-, 4-, or 6-year graduation rates for specific disciplines, and graduation rates in IPEDS are cohort based and restricted to first-time, full-time students. NSC data could inform the measurement of this indicator, since they identify the first term a student is enrolled in a degree program and track the number of terms in which the student designated a STEM major/focus for that particular credential. Since there is broad institutional participation in NSC, its data may be able to account for students who earn a STEM credential after enrolling at multiple postsecondary institutions.
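As a rough illustration of how time-to-degree could be derived from such term-level records, the sketch below counts the terms elapsed between a student’s first enrollment and the term in which a STEM credential was awarded. The record layout and field names are hypothetical, not NSC’s actual file format.

```python
# Hypothetical term-level records: one row per student per term, with a flag for
# the term in which a STEM credential was awarded. Field names are illustrative.
import pandas as pd

terms = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2],
    "term_index": [1, 2, 3, 1, 2],            # sequential counter of enrolled terms
    "stem_credential_awarded": [False, False, True, False, False],
})

first_term = terms.groupby("student_id")["term_index"].min()
award_term = (terms[terms["stem_credential_awarded"]]
              .groupby("student_id")["term_index"].min())
# Terms elapsed from first enrollment through the credential-award term;
# students who never earn a STEM credential are dropped.
time_to_degree = (award_term - first_term + 1).dropna().rename("terms_to_stem_credential")
print(time_to_degree)  # student 1: 3 terms; student 2 excluded
```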

As discussed above, NSC data have several limitations, including limited opportunities for disaggregation across student demographic characteristics, incomplete data coverage among participating institutions, and their proprietary nature. These limitations could lead to inaccurate or imperfect measures for this indicator. Further research and development would be needed to analyze the nature and scope of these limitations and to identify and implement strategies to address them before NSC data could provide accurate measures for this indicator.

___________________

8 Because Pell grants are given to low-income students, data on their recipients can be used to roughly measure low-income students.

Indicator 2.3.1: Diversity of STEM Instructors in Comparison with the Diversity of STEM Graduate Degree Holders

Indicator 2.3.2: Diversity of STEM Graduate Student Instructors in Comparison with the Diversity of STEM Graduate Students

Data Available and Potentially Available

Currently, no nationally representative data are available to support these two indicators. Although data on the diversity of postsecondary faculty and enrolled graduate students are available in IPEDS’ Human Resources Survey and Enrollment Survey, respectively, they only allow for the disaggregation of faculty by classification (e.g., instructional faculty, research faculty), race and ethnicity, gender, and employment status (i.e., full time, part time). These IPEDS data do not provide any information regarding instructors’ departmental unit, field of study, or highest degree earned, so they cannot currently be used to measure these indicators. (As noted above, NCES has discontinued the National Study of Postsecondary Faculty.)

As discussed above, the HERI Faculty Survey represents the most comprehensive and representative data source on postsecondary faculty, but those data have several limitations. The survey’s sample typically includes few, if any, 2-year institutions, and part-time faculty at both 2-year and 4-year institutions are not well represented. In addition, the data are proprietary, which may delay or impede efficient, timely access. With these limitations in mind, however, the Faculty Survey data do provide opportunities to examine the diversity of faculty teaching at 4-year institutions in STEM or STEM-related departments. The data can be analyzed across academic rank, tenure status, employment status, race and ethnicity, and gender.

Indicator 2.4.1: Students Pursuing STEM Credentials Feel Included and Supported in Their Academic Programs and Departments

Indicator 2.4.2: Instructors Teaching Courses in STEM Disciplines Feel Included and Supported in Their Departments

Data Available and Potentially Available

No nationally representative data are currently available to inform these two indicators, although there are several surveys that include questions on institutional climate.

One of them, the Diverse Learning Environments Survey,9 includes a validated sense-of-belonging measure based on four survey items: "I feel a sense of belonging to this college." "I see myself as a part of the campus community." "I feel I am a member of this college." "If asked, I would recommend this college to others." The survey also asks students the extent to which they agree that "faculty are approachable" in their academic program and that they have "a peer support network among students" in their major. These data can be disaggregated by demographic characteristics and major, allowing for the examination of racial and ethnic or disciplinary differences.
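As an illustration only, a composite score for such items might be computed as a simple mean of Likert-coded responses, as in the sketch below; this coding and aggregation are assumptions for exposition, and HERI’s actual scoring of its validated measure may differ (e.g., it may use factor or construct scores).

```python
# Minimal sketch: average four Likert-coded belonging items (assumed 1-5 scale)
# into a single scale score. The coding and aggregation are assumptions, not
# HERI's documented scoring procedure.
belonging_items = {
    "sense_of_belonging": 4,   # "I feel a sense of belonging to this college."
    "part_of_community": 5,    # "I see myself as a part of the campus community."
    "member_of_college": 4,    # "I feel I am a member of this college."
    "would_recommend": 3,      # "If asked, I would recommend this college to others."
}
scale_score = sum(belonging_items.values()) / len(belonging_items)
print(f"Belonging scale score (1-5): {scale_score:.2f}")  # 4.00
```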

The second indicator above can be measured with institutional climate data collected through surveys of faculty. Several national surveys (e.g., the HERI Faculty Survey and the Faculty Survey of Student Engagement administered by Indiana University) contain measures of teaching practices, satisfaction, and, in the case of HERI’s survey, sources of stress. HERI’s Faculty Survey, for example, includes items related to a faculty member’s sense of satisfaction with the collegiality of faculty in her or his department and a set of items about various sources of stress, including one that relates to stress due to “discrimination (e.g., prejudice, racism, sexism, homophobia, transphobia).”

Leveraging any measure that relies on individuals’ responses to surveys has several limitations. Generalizability and nonresponse bias will be of concern if effective sampling strategies are not used or response rates to the survey are low. In addition, surveys represent a snapshot in time, so responses may be affected by recent events (in the world, institution, department, or one’s personal life). Surveys on institutional climate generally have had limited participation, and institutions often conduct such assessments in reaction to an incident of bias or discrimination on campus.

Data and Research Needs

Extensive survey research and development would be required to fully develop these two indicators, including design of nationally representative sampling frames and development of effective sampling strategies to address the problem of limited participation in institutional climate surveys.

___________________

9 This survey is administered by the HERI and the Culturally Engaging Campus Environments project at Indiana University.

Indicator 2.4.3: Institutional Practices Are Culturally Responsive, Inclusive, and Consistent across the Institution

Data Available and Potentially Available

Currently, there is no data source that catalogs institutions’ and STEM departments’ use of culturally responsive practices on a nationwide basis. Most of the research in this area is qualitative in nature and seeks to describe these practices and the mechanisms by which they effectively support STEM learning and persistence among diverse student populations. Similarly, there is no source with comprehensive data on the types of training and professional development available to faculty search committees or on the extent to which search committees engage in such training or development.

Data and Research Needs

Additional research is needed to identify and establish common definitions of what instructional and educational practices qualify as being culturally responsive. The ADVANCE program at the National Science Foundation10 can serve as a valuable source of information about practices that are effective in building and retaining diverse STEM faculty; additional surveys and databases would need to be developed in order to provide a systematic, nationwide measure of these practices.

Indicator 3.1.1: Completion of Foundational Courses, Including Developmental Education Courses, to Ensure STEM Program Readiness

Data Available and Potentially Available

As described above, the BPS can provide information for this indicator. Data currently available from the full BPS 04/09 dataset can support the measurement of this indicator through the associated transcript records. To measure retention and completion of students in developmental education courses designed to ensure STEM readiness, analysts can divide the number of completed developmental STEM courses by the total number of such courses attempted in order to calculate the percentage of attempted courses that students finished. Given the diversity of students and institutions represented in this dataset, this proportion can be disaggregated by institutional type, as well as by students’ demographic characteristics.

___________________

10 ADVANCE: Increasing the Participation and Advancement of Women in Academic Science and Engineering Careers. See https://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5383 [August 2017].

A similar proportion can be calculated to measure the share of students who complete foundational courses (college-level mathematics, English, science, and technology) that are designed to prepare students for STEM program readiness.
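The calculation described above can be expressed compactly in code. The sketch below, using hypothetical transcript fields, computes the share of attempted developmental STEM courses that were completed and disaggregates it by a student characteristic; it is illustrative only and does not reflect the actual BPS transcript file layout.

```python
# Illustrative completion-rate calculation: completed developmental STEM courses
# divided by attempted developmental STEM courses, by student group. All field
# names and values are hypothetical.
import pandas as pd

transcripts = pd.DataFrame({
    "student_group": ["A", "A", "B", "B", "B"],
    "dev_stem_course": [True, True, True, True, False],
    "completed": [True, False, True, True, True],
})

dev_courses = transcripts[transcripts["dev_stem_course"]]
completion_rate = (dev_courses.groupby("student_group")["completed"]
                   .mean()
                   .rename("dev_stem_completion_rate"))
print(completion_rate)  # group A: 0.5, group B: 1.0
```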

Data and Research Needs

Several challenges would need to be addressed when considering the use of these data for this indicator. First, the BPS is conducted only periodically, so data for national indicators would not always be current. Second, the study design includes sampling of both institutions and students in institutions. In order to develop a national indicator with this design, it would be necessary to apply statistical population weights to the dataset, but not all institutions or all students would be represented. Third, one would have to determine which courses across such diverse institutions count as either developmental or foundational STEM preparation courses. Chen (2016) began to address the third challenge, using the BPS 04/09 data to track the progress of students who enrolled in at least one developmental education course through their credential completion. The author identified developmental courses on the basis of course titles and content descriptions. In general, courses described with terms like developmental, remedial, pre-collegiate, and basic skills were considered developmental (Chen, 2016).
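A simple version of that title-based classification might look like the sketch below; the keyword list mirrors the terms cited above but is illustrative, and any operational definition would need the fuller course-content review that Chen (2016) describes.

```python
# Minimal sketch of keyword-based flagging of developmental courses from course
# titles. Keywords and example titles are illustrative only.
DEV_KEYWORDS = ("developmental", "remedial", "pre-collegiate", "basic skills")

def is_developmental(course_title: str) -> bool:
    """Flag a course as developmental if its title contains any keyword."""
    title = course_title.lower()
    return any(keyword in title for keyword in DEV_KEYWORDS)

print(is_developmental("Basic Skills Mathematics"))  # True
print(is_developmental("Calculus I"))                # False
```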

A national student unit record data system could address the challenge that BPS is administered only periodically. Such a system would compile course histories for all postsecondary students nationwide to provide universal coverage for this indicator (see Chapter 7).

Indicator 3.2.1: Retention in STEM Degree or Certificate Programs, Course to Course and Year to Year

Data Available and Potentially Available

As described above for Indicator 3.1.1, the BPS 04/09 dataset can provide a solid foundation to measure average course completion rates in STEM. Dividing the number of STEM-related courses that students completed or passed by the number of STEM-related courses they attempted during a given period provides overall statistics, which can be disaggregated by institutional characteristics, as well as by student demographic and enrollment characteristics.

Given its focus on a single cohort of students, the BPS 04/09 data can also provide information on year-to-year retention rates in STEM degree programs.

Such a measure can be derived by taking the number of students intending to pursue a credential with a STEM focus as the baseline of students with an initial interest in STEM and then using this total as the denominator for calculating retention rates in subsequent terms. STEM retention rates would be calculated by removing from the baseline total any baseline students who switch to non-STEM credential programs by the end of any specified period and dividing this difference by the original baseline total. This proportion can then be disaggregated by specific credentials (e.g., associate’s degrees, bachelor’s degrees), institutional characteristics, or student demographic or enrollment characteristics.
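Expressed in code, the retention calculation described above might look like the following sketch, where students with an initial STEM intention form the baseline and those still in a STEM credential program at a later point count as retained; the field names are hypothetical.

```python
# Illustrative year-to-year STEM retention calculation. Field names and values
# are hypothetical; "retained" here means still pursuing a STEM credential.
import pandas as pd

students = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "initial_stem_intent": [True, True, True, False],
    "in_stem_program_year2": [True, False, True, False],
})

baseline = students[students["initial_stem_intent"]]        # initial STEM intenders
retention_rate = baseline["in_stem_program_year2"].mean()   # retained / baseline
print(f"Year-2 STEM retention rate: {retention_rate:.2f}")  # 0.67
```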

Data and Research Needs

The same challenges to BPS 04/09 identified for Indicator 3.1.1 apply for this indicator with respect to the relative infrequency of data collection and selective coverage. More frequent data collection for new cohorts is needed to provide an updated indicator of progress toward the objective of successful navigation into and through STEM programs of study. One alternative with different challenges would be data from HERI’s Freshman Survey, matched with NSC’s enrollment and completion data. The Freshman Survey asks students to report their intended major and is administered annually, and its coverage of 4-year institutions matches or is more complete than that of BPS. However, it oversamples first-time, full-time students and does not provide adequate coverage of students in 2-year institutions. Expanding the survey to include representative samples of 2-year students would address this problem.

The NSC provides term-to-term enrollment information and increasingly includes students’ academic degree programs, as well as each credential and its associated field or discipline. The NSC covers 2-year and 4-year institutions, but it is voluntary: institutions elect to submit their data. If NSC added student-level attributes (e.g., race and ethnicity, Pell grant status as a proxy for socioeconomic status, full coverage of the field or discipline associated with each degree program) and achieved fuller, more comprehensive participation among and coverage of postsecondary institutions, the data could be merged with IPEDS data to provide opportunities for disaggregation by institutional characteristics. Such a combination could provide data sufficient to calculate year-to-year program retention and completion rates in STEM-related fields.
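The kind of merge envisioned here is straightforward in principle, as the sketch below suggests: student-level records are joined to institution-level characteristics on a shared institution identifier so that rates can be disaggregated by institution type. The identifiers and field names are hypothetical stand-ins for NSC and IPEDS data elements.

```python
# Hedged sketch: join hypothetical student-level records to institution-level
# characteristics on a shared institution ID, then disaggregate a retention
# measure by institutional sector. All identifiers and fields are illustrative.
import pandas as pd

student_records = pd.DataFrame({
    "student_id": [1, 2, 3],
    "institution_id": [101, 101, 202],
    "retained_in_stem": [True, False, True],
})
institutions = pd.DataFrame({
    "institution_id": [101, 202],
    "sector": ["Public 4-year", "Public 2-year"],
})

merged = student_records.merge(institutions, on="institution_id", how="left")
retention_by_sector = merged.groupby("sector")["retained_in_stem"].mean()
print(retention_by_sector)
```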

Indicator 3.2.2: Transfers from 2-Year to 4-Year STEM Programs in Comparison with Transfers to All 4-Year Programs

Data Available and Potentially Available

The NSC’s term-to-term enrollment data, which increasingly include information about the academic focus of students’ degree programs, may be sufficient for this indicator. To calculate this figure, analysts might take the number of students from 2-year institutions who transfer to 4-year institutions and declare a STEM-related major each term, and divide that number by the total number of students who transferred from a 2-year to a 4-year institution in any given academic term or time period.
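In code, that calculation reduces to a simple proportion, as in the sketch below; the record layout is hypothetical and stands in for the NSC elements described above.

```python
# Illustrative calculation: among students who transferred from a 2-year to a
# 4-year institution in a given period, the share who declared a STEM-related
# major. Field names and values are hypothetical.
import pandas as pd

transfers = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "stem_major_at_4yr": [True, False, True, False, False],
})

stem_transfer_share = transfers["stem_major_at_4yr"].mean()
print(f"Share of 2-year to 4-year transfers entering STEM programs: {stem_transfer_share:.2f}")  # 0.40
```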

Data and Research Needs

As described above, the NSC does not have complete data from all participating institutions (e.g., academic field is not available for every term for all institutions), and it does not have complete coverage of all postsecondary institutions. Additional coverage of data elements (e.g., academic field) and broader coverage of 2-year and 4-year institutions would be needed for an accurate indicator.

Indicator 3.3.1: Percentage of Students Who Attain STEM Credentials over Time, Disaggregated by Institution Type, Transfer Status, and Demographic Characteristics

Data Available and Potentially Available

The IPEDS Completion Survey provides comprehensive coverage of the credentials that students earn and the discipline or field of study associated with those credentials for all postsecondary institutions that receive Title IV funding.11 Data are collected at 100 percent, 150 percent, and 200 percent of expected completion time and are broken down by race and gender. The system provides information on the number and type of credentials (certificates, associate’s degrees, and bachelor’s degrees) by discipline or field, as measured by Classification of Instructional Programs (CIP) codes.12

___________________

11 As noted above, almost all U.S. institutions receive Title IV funds.

12 See https://nces.ed.gov/ipeds/cipcode/Default.aspx?y=55 [August 2017].

This level of granularity can provide analysts and policy makers with ample flexibility in determining which credentials "count" and which disciplines and fields of study represent STEM.

Importantly, the IPEDS Completion Survey collects data on all credentials earned in a given year, which helpfully avoids the issues related to cohorts and enrollment status (i.e., full time or part time) in IPEDS data on graduation rates. In addition, the credential counts reported by IPEDS can be disaggregated by gender, race and ethnicity, and institutional type.
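One way an analyst might exercise that flexibility is to define STEM by selected CIP series, as in the sketch below; the series listed are an assumption for illustration, not an official STEM definition.

```python
# Hypothetical STEM definition based on 2-digit CIP series (e.g., 11 computer and
# information sciences, 14 engineering, 26 biological sciences, 27 mathematics
# and statistics, 40 physical sciences). The chosen series are an assumption.
STEM_CIP_SERIES = {"11", "14", "26", "27", "40"}

def is_stem_cip(cip_code: str) -> bool:
    """Treat a dotted CIP code such as '26.0101' as STEM based on its 2-digit series."""
    return cip_code.split(".")[0].zfill(2) in STEM_CIP_SERIES

print(is_stem_cip("26.0101"))  # True: biological sciences
print(is_stem_cip("52.0201"))  # False: business administration
```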

Data and Research Needs

Currently, IPEDS does not allow disaggregation of credentials by students’ disability status and socioeconomic status; however, since institutions do report information related to students’ receipt of Pell grants, institutions could be asked to submit data regarding credentials awarded to Pell recipients, which would further support measurement of this indicator. Given current data coverage, flexibility in defining the fields and disciplines constituting STEM, the ability to measure specific types of credentials, and possibilities of disaggregation, the IPEDS Completion Survey represents the best source of data for this indicator.

SUMMARY AND CONCLUSIONS

The committee considered whether, and to what extent, various federal and proprietary data sources are representative of the national universe of 2-year and 4-year institutions and include data relevant to the goals and objectives in the conceptual framework and, more specifically, to the committee’s proposed indicators.

Focusing first on federal data sources and systems, the committee reviewed the IPEDS, which includes high-quality, current data related to the committee’s goals and objectives, including detailed annual data on completion of degrees and certificates in different fields of study. However, these data focus on students who start at an institution and also graduate from it, thus excluding students who transfer or attend multiple institutions, as well as part-time students.

CONCLUSION 2 To monitor the status and quality of undergraduate STEM education, federal data systems will need additional data on full-time and part-time students’ trajectories across, as well as within, institutions.

Although they are conducted less frequently than IPEDS institutional surveys, federal longitudinal surveys of student cohorts, such as the BPS 04/09 study, provide useful data related to the committee’s goals and objectives.

The survey samples are carefully designed to be nationally representative, and multiple methods are used to obtain strong response rates. The resulting data can be used to track students’ trajectories across institutions and fields of study, including STEM fields. Previously, the discontinued National Study of Postsecondary Faculty provided data related to the committee’s proposed indicators, including faculty members’ disciplinary backgrounds, responsibilities, and attitudes.

CONCLUSION 3 To monitor the status and quality of undergraduate STEM education, recurring longitudinal surveys of faculty and students are needed.

The committee found that IPEDS and other federal data sources generally allow data to be disaggregated by students’ race and ethnicity and gender. However, conceptions of diversity have broadened to include additional student groups that bring unique strengths to undergraduate STEM education and may also encounter unique challenges. To fully support the indicators, federal data systems will need to include additional student characteristics.

CONCLUSION 4 To monitor progress toward equity, diversity, and inclusion of STEM students and faculty, national data systems will need to include demographic characteristics beyond gender and race and ethnicity, including at least disability status, first-generation student status, and socioeconomic status.

The committee also reviewed the many new, proprietary data sources that have been developed over the past two decades in response to growing accountability pressures in higher education. Although not always nationally representative of 2-year and 4-year public and private institutions, some of these sources include large samples of institutions and address the committee’s goals and objectives.

Based on its review of existing public and proprietary data sources, the committee considered research needs and data availability for each of the 21 proposed indicators. It found that, for some indicators, further research is needed to develop clear definitions and measurement approaches, and overall, the availability of data for the indicators is limited. For some indicators, nationally representative datasets are available, but when these data are disaggregated, first to focus on STEM students and then to focus on specific groups of STEM students, the sample sizes become too small to support reliable estimates. For other indicators, no data are available from either public or proprietary sources.

CONCLUSION 5 The availability of data for the indicator system is limited, and new data collection is needed for many of them:

  • No data sources are currently available for most of the indicators of engaging students in evidence-based STEM educational practices (Goal 1).
  • Various data sources are available for most of the indicators of equity, diversity, and inclusion (Goal 2). However, these sources would need enhanced coverage of institutions and students to be nationally representative, along with additional data elements on students’ fields of study.
  • Federal data sources are available for some of the indicators of ensuring adequate numbers of STEM professionals (Goal 3). However, federal surveys would need larger institutional and student samples to allow finer disaggregation of the data by field of study and demographic characteristics.

REFERENCES

Armstrong, J., and Zaback, K. (2016). Assessing and Improving State Postsecondary Data Systems. Washington, DC: Institute for Higher Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/docs/resources/state_postsecondary_data_systems-executive_summary.pdf [June 2016].

Berk, R.A. (2005). Survey of 12 strategies for measuring teaching effectiveness. International Journal of Teaching and Learning in Higher Education, 17(1), 48–62.

Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J., Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G., Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE vision & change rubrics, version 1.0: A valid and equitable tool to measure transformation of life sciences departments at all institution types. CBE-Life Sciences Education, 15(4), ar60. Available: http://www.lifescied.org/content/15/4/ar60.full [March 2017].

Brick, J.M., and Williams, D. (2013). Explaining rising nonresponse rates in cross-sectional surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 36–59.

Burns, S., Wang, X., and Henning, A. (Eds.). (2011). NCES Handbook of Survey Methods. (NCES 2011-609). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Available: https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2011609 [July 2017].

Campbell, C.M., and Cabrera, A.F. (2014). Making the mark: Are grades and deep learning related? Research in Higher Education, 55(5), 494–507.

Cataldi, E.F., Fahimi, M., and Bradburn, E.M. (2005). 2004 National Study of Postsecondary Faculty (NSOPF:04) Report on Faculty and Instructional Staff in Fall 2003. (NCES 2005-172). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Available: http://nces.ed.gov/pubs2005/2005172.pdf [July 2016].

Center for Postsecondary Research. (2017a). About: FSSE. Bloomington: Indiana University Center for Postsecondary Research. Available: http://fsse.indiana.edu/html/about.cfm [June 2017].

Center for Postsecondary Research. (2017b). About: NSSE. Bloomington: Indiana University Center for Postsecondary Research. Available: http://nsse.indiana.edu/html/about.cfm [June 2017].

Chen, X. (2016). Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions: Scope, Experiences, and Outcomes. (NCES 2016-405). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Available: https://nces.ed.gov/pubs2016/2016405.pdf [July 2017].

Chen, X., and Soldner, M. (2013). STEM Attrition: College Students’ Paths Into and Out of STEM Fields. Washington, DC: U.S. Department of Education.

Complete College America. (2014). Four-Year Myth: Make College More Affordable, Restore the Promise of Graduating on Time. Indianapolis, IN: Author.

Cunningham, A.F., and Milam, J. (2005). Feasibility of a Student Unit Record System Within the Integrated Postsecondary Education Data System. (NCES 2005-160). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Drinkwater, M.J., Matthews, K.E., and Seiler, J. (2017). How is science being taught? Measuring evidence-based teaching practices across undergraduate science departments. CBE-Life Sciences Education, 16(1), ar18, 1–11. Available: http://www.lifescied.org/content/16/1/ar18.full.pdf [October 2017].

Dynarski, S.M., Hemelt, S.W., and Hyman, J.M. (2013). The Missing Manual: Using National Student Clearinghouse Data to Track Postsecondary Outcomes. (NBER Working Paper No. 19552). Cambridge, MA: National Bureau of Economic Research. Available: http://www.nber.org/papers/w19552 [September 2017].

Eagan, K., Hurtado, S., Figueroa, T., and Hughes, B. (2014a). Examining STEM Pathways Among Students Who Begin College at Four-Year Institutions. Paper prepared for the Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees. Washington, DC. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_088834.pdf [April 2015].

Eagan, M.K., Stolzenberg, E.B., Berdan Lozano, J., Aragon, M.C., Suchard, M.R., and Hurtado, S. (2014b). Undergraduate Teaching Faculty: The 2013–2014 HERI Faculty Survey. Los Angeles: Higher Education Research Institute, University of California, Los Angeles. Available: http://heri.ucla.edu/monographs/HERI-FAC2014-monograph.pdf [August 2016].

Eagan, M.K., Stolzenberg, E.B., Ramirez, J.J., Aragon, M.C., Suchard, M.R., and Rios-Aguilar, C. (2016). The American Freshman: Fifty-Year Trends, 1966–2015. Los Angeles: Higher Education Research Institute, University of California, Los Angeles. Available: http://www.heri.ucla.edu/monographs/50YearTrendsMonograph2016.pdf [August 2016].

Estrada, M. (2014). Ingredients for Improving the Culture of STEM Degree Attainment with Co-curricular Supports for Underrepresented Minority Students. Paper prepared for the Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_088832.pdf [July 2017].

Executive Office of the President. (2015). Using Federal Data to Measure and Improve the Performance of U.S. Institutions of Higher Education. Washington, DC: Executive Office of the President. Available: https://collegescorecard.ed.gov/assets/UsingFederalDataToMeasureAndImprovePerformance.pdf [February 2018].

Fosnacht, K., Sarraf, S., Howe, E., and Peck, L. (2017). How important are high response rates for college surveys? Review of Higher Education, 40(2), 245–265.

Ginder, S., Kelly-Reid, J.E., and Mann, F.B. (2017). Graduation Rates for Selected Cohorts, 2008–2013; Outcome Measures for Cohort Year 2008; Student Financial Aid, Academic Year 2015–2016; and Admissions in Postsecondary Institutions, Fall 2016: First Look (Preliminary Data). (NCES 2017-150). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Available: https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2017150 [November 2017].

Handelsman, J., and Ferrini-Mundy, J. (2016). STEM Education: Cross-Agency Priority Goal Quarterly Progress Update, FY2016 Quarter 1. Washington, DC: Office of Science and Technology Policy.

HCM Strategists. (2013). A Better Higher Education Data and Information Framework for Informing Policy: The Voluntary Institutional Metrics Project. Washington, DC: HCM Strategists. Available: http://hcmstrategists.com/wp-content/themes/hcmstrategists/docs/gates_metrics_report_v9.pdf [January 2017].

Jenkins, D., and Fink, J. (2016). Tracking Transfer: New Measures of Institutional and State Effectiveness in Helping Community College Students Attain Bachelor’s Degrees. New York: Community College Research Center, Columbia University. Available: http://ccrc.tc.columbia.edu/media/k2/attachments/tracking-transfer-institutional-state-effectiveness.pdf [September 2017].

Khan, B. (2016). Overview of Science and Engineering Indicators 2016. Presentation to the Committee on Developing Indicators for Undergraduate STEM Education, February 22. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_171321.pdf [June 2016].

Knapp, L.G., Kelly-Reid, J.E., and Ginder, S.A. (2012). Enrollment in Postsecondary Institutions, Fall 2011; Financial Statistics, Fiscal Year 2011; and Graduation Rates, Selected Cohorts, 2003–2008: First Look. (Provisional data, NCES 2012-174). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Manduca, C.A., Iverson, E.R., Luxenberg, M., Macdonald, R.H., McConnell, D.A., Mogk, D.W., and Tewksbury, B.J. (2017). Improving undergraduate STEM education: The efficacy of discipline-based professional development. Science Advances, 3(2), 1–15.

Miller, B. (2016). Building a Student-level Data System. Washington, DC: Institute for Higher Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/docs/resources/building_a_student-level_data_system.pdf [June 2017].

National Academies of Sciences, Engineering, and Medicine. (2016). Barriers and Opportunities for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse Pathways. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington, DC: The National Academies Press.

National Center for Education Statistics. (2012). 2012 Revision of NCES Statistical Standards: Final. Washington, DC: National Center for Education Statistics. Available: https://nces.ed.gov/statprog/2012/ [September 2017].

National Center for Education Statistics. (2014). Integrated Postsecondary Education Data System (IPEDS). Available: http://nces.ed.gov/statprog/handbook/pdf/ipeds.pdf [June 2016].

National Center for Education Statistics. (2015). Table 326.20. Digest of Education Statistics. Available: http://nces.ed.gov/programs/digest/d14/tables/dt14_326.20.asp [June 2016].

National Research Council. (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: The National Academies Press.

National Research Council. (2014). Capturing Change in Science, Technology, and Innovation: Improving Indicators to Inform Policy. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/18606/capturing-change-in-science-technology-and-innovation-improving-indicators-to [June 2016].

National Science Foundation. (2016). Science and Engineering Indicators 2016. Arlington, Virginia: National Science Foundation. Available: https://www.nsf.gov/statistics/2016/nsb20161/#/ [February 2018].

National Student Clearinghouse. (2016a). Notes from the Field #3: Research Center Notes on Letter of Sara Goldrick-Rab and Douglas N. Harris, University of Wisconsin-Madison. Herndon, VA: Author. Available: https://nscresearchcenter.org/workingwithourdata/notesfromthefield-3 [July 2016].

National Student Clearinghouse. (2016b). Who We Are. Herndon, VA: Author. Available: http://www.studentclearinghouse.org/about [June 2016].

Piburn, M., and Sawada, D. (2000). Reformed Teaching Observation Protocol (RTOP) Reference Manual. Available: http://files.eric.ed.gov/fulltext/ED447205.pdf [September 2017].

Porter, S.R. (2013). Self-reported learning gains: A theory and a test of college student survey response. Research in Higher Education, 54(2), 201–226.

PULSE Fellows. (2016). The PULSE Vision and Change Rubrics Version 2.0. Available: http://api.ning.com/files/Kfu*MfW7V8MYZfU7LNGdOnG4MnryzUgUpC2IxdtUmucnB4QNCdLaOwWGoMoULSeKw8hF9jiFdh75tlzuv1nqtfCuM11hNPp3/PULSERubricsPacketv2_0_FINALVERSION.pdf [May 2017].

Sackett, P. (2012). Faking in personality assessments: Where do we stand? In M. Ziegler, C. MacCann, and R.D. Roberts (Eds.), New Perspectives on Faking in Personality Assessment (pp. 330–344). New York: Oxford University Press.

Seymour, E., Wiese, D., Hunter, A., and Daffinrud, S.M. (2000). Creating a Better Mousetrap: On-line Student Assessment of Their Learning Gains. Paper presented at the National Meeting of the American Chemical Society, San Francisco, CA, March 27.

Smith, M.K., Jones, F.H.M., Gilbert, S.L., and Wieman, C.E. (2013). The classroom observation protocol for undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE-Life Sciences Education, 12(4), 618–627. Available: http://www.lifescied.org/content/12/4/618.full [June 2016].

Van Noy, M., and Zeidenberg, M. (2014). Hidden STEM Knowledge Producers: Community Colleges’ Multiple Contributions to STEM Education and Workforce Development. Paper Prepared for the Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_088831.pdf [June 2017].

Walter, E.M., Beach, A.L., Henderson, C., and Williams, C.T. (2016). Describing instructional practice and climate: Two new instruments. In G.C. Weaver, W.D. Burgess, A.L. Childress, and L. Slakey (Eds.), Transforming Institutions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue University Press.

Whitfield, C., and Armstrong, J. (2016). The State of State Postsecondary Data Systems: Strong Foundations 2016. Boulder, CO: State Higher Education Executive Officers. Available: http://www.sheeo.org/sites/default/files/publications/SHEEO_StrongFoundations2016_FINAL.pdf [June 2016].

Wieman, C., and Gilbert, S. (2014). The teaching practices inventory: A new tool for characterizing college and university teaching in mathematics and science. CBE-Life Sciences Education, 13(3), 552-569.

William T. Grant Foundation, Spencer Foundation, and Bill & Melinda Gates Foundation (2014). Measuring Instruction in Higher Education: Summary of a Convening. New York: William T. Grant Foundation. Available: http://wtgrantfoundation.org/library/uploads/2015/11/Measuring-Instruction-in-Higher-Education.pdf [October 2017].

Wine, J., Janson, N., and Wheeless, S. (2011). 2004/09 Beginning Postsecondary Students Longitudinal Study (BPS:04/09) Full-scale Methodology Report. (NCES 2012-246). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Available: https://nces.ed.gov/pubs2012/2012246_1.pdf [June 2017].
