Indicators for Monitoring Undergraduate STEM Education (2018)

3

Goal 1: Increase Students’ Mastery of STEM Concepts and Skills

As noted in Chapter 1, the committee does not propose indicators to directly measure student learning. Although some disciplines have begun to identify the core concepts and skills that all undergraduates should master (e.g., Arum, Roksa, and Cook, 2016; Brewer and Smith, 2011) and develop assessments of them, there is currently no agreement on a uniform set of STEM-wide concepts and skills, nor on standardized assessments of such concepts and skills. Rather, the committee expects that engaging students in evidence-based STEM educational practices (Goal 1) and striving for equity, diversity, and inclusion (Goal 2) will increase all students’ mastery of STEM concepts and skills. Advancing these goals is expected to improve persistence among students already interested in STEM and attract other students to STEM majors, thus increasing the number of students earning STEM credentials and ensuring adequate numbers of STEM professionals (Goal 3). These expectations echo the President’s Council of Advisors on Science and Technology (2012), which recommended widespread adoption of evidence-based teaching and learning approaches to increase the number of STEM graduates and ensure an adequate supply of STEM professionals.

The major sections of this chapter address the committee’s four objectives for Goal 1.

1.1: Use of evidence-based STEM educational practices both in and outside of classrooms

1.2: Existence and use of supports that help STEM instructors use evidence-based educational practices

1.3: An institutional culture that values undergraduate STEM instruction

1.4: Continuous improvement in STEM teaching and learning

In Appendix B, the committee offers potential measures for some of the indicators: specific quantitative variables that provide a reliable method for monitoring progress toward achieving the objective.
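To make the notion of a measure concrete, the sketch below computes one hypothetical quantitative variable of the kind Appendix B describes: the share of STEM course sections reported as using evidence-based practices. The record format, field names, and example values are assumptions for illustration only, not part of the committee’s proposal.

```python
# A minimal sketch of one hypothetical measure for an indicator:
# the share of STEM course sections reported as using evidence-based practices.
# Record format and field names are assumptions for this illustration only.

sections = [
    {"dept": "BIO",  "uses_evidence_based_practices": True},
    {"dept": "CHEM", "uses_evidence_based_practices": False},
    {"dept": "MATH", "uses_evidence_based_practices": True},
    {"dept": "PHYS", "uses_evidence_based_practices": False},
]

def share_using_ebp(records):
    """Fraction of course sections flagged as using evidence-based practices."""
    if not records:
        return 0.0
    return sum(r["uses_evidence_based_practices"] for r in records) / len(records)

print(f"Share of sections using evidence-based practices: {share_using_ebp(sections):.0%}")
```

Tracked over successive years, such a variable could show whether the share is rising, which is the sense in which a measure provides a reliable method for monitoring progress toward an objective.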

The systems view reflected in these four objectives aligns with current approaches to systemic reform of undergraduate STEM education. For example, the Association of American Universities Undergraduate STEM Education Initiative is guided by a framework that places pedagogy at the center, surrounded by scaffolding and cultural change (Miller and Trapani, 2016); this scaffolding includes providing professional development and ongoing collection and analysis of data to evaluate and improve program performance.

Each section of the chapter summarizes the research that demonstrates the importance of the objective for improving the quality of undergraduate STEM education and presents the committee’s proposed indicators to monitor progress toward that objective. For each indicator, the committee discusses its meaning and identifies the additional research needed to fully develop it: see Table 3-1.

TABLE 3-1 Objectives and Indicators of Increasing Students’ Mastery of STEM Concepts and Skills

Objective 1.1. Use of evidence-based educational practices both in and outside of classrooms.
  Indicator 1.1.1: Use of evidence-based educational practices in course development and delivery
  Indicator 1.1.2: Use of evidence-based educational practices beyond the classroom

Objective 1.2. Existence and use of supports that help STEM instructors use evidence-based educational practices.
  Indicator 1.2.1: Extent of instructors’ involvement in professional development
  Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign

Objective 1.3. Institutional culture that values undergraduate STEM instruction.
  Indicator 1.3.1: Use of valid measures of teaching effectiveness
  Indicator 1.3.2: Consideration of evidence-based teaching in personnel decisions by departments and institutions

Objective 1.4. Continuous improvement in STEM teaching and learning.
  No indicators: see “Challenges of Measuring Continuous Improvement” in this chapter

OBJECTIVE 1.1: USE OF EVIDENCE-BASED EDUCATIONAL PRACTICES BOTH IN AND OUTSIDE OF CLASSROOMS

Importance of the Objective

Students’ mastery of STEM concepts and skills is supported by evidence-based STEM educational practices. As discussed in Chapters 1 and 2, there is a growing body of work that presents and reviews practices that are supported by rigorous research (e.g., Freeman et al., 2014). The committee expects that this body of work will continue to expand and evolve. For this reason, we do not provide a prescriptive list of those practices that we currently view as “evidence based.” Rather, we define them as educational practices meeting at least one of the following criteria:

  • the preponderance of published literature suggests that the practice will be effective across settings or in the specific local setting, or
  • the practice is built explicitly from accepted theories of teaching and learning and is faithful to best practices of implementation, or
  • there is locally collected, valid, and reliable evidence, based on a sound methodological research approach, suggesting that the practice is effective.

In this chapter, we use the term “evidence-based educational practices” to represent the variety of educational practices that meet the above criteria. These various practices have been shown to increase students’ mastery of STEM concepts and skills, as well as promote positive attitudes toward learning and persistence toward a degree (Fairweather, 2012; Kober, 2015; National Research Council, 2012a). These practices are not restricted to classroom environments; there is also emerging evidence suggesting that programs outside classrooms, such as internships and undergraduate research, can benefit college students from many backgrounds (Mayhew et al., 2016; National Academies of Sciences, Engineering, and Medicine, 2017). In this section the committee presents examples of such practices and reviews research demonstrating their effectiveness for supporting learning and persistence, both generally and in STEM fields. These examples are not intended to be exhaustive but, rather, descriptive.

In the Classroom

Active Learning as a General Class of Evidence-Based Practices

There is no generally agreed-upon definition of “active learning” in the research literature, but there are characteristics that such approaches have in common. In this report, the committee uses the term “active learning” to refer to that class of pedagogical practices that cognitively engage students in building understanding at the highest levels of Bloom’s taxonomy (Bloom, Krathwohl, and Masia, 1964; Anderson, Krathwohl, and Bloom, 2001). Active learning instructional practices have been shown to improve students’ academic achievement both generally, across all fields of study (Mayhew et al., 2016), and in STEM specifically (National Research Council, 2012a). These practices include collaborative classroom activities, fast feedback using classroom response systems (e.g., clickers), problem-based learning, and peer instruction (Bonwell and Eison, 1991; Prince, 2004): see Box 3-1. The core idea behind all active learning approaches is that learning requires mental activity, which is more likely to occur when students are engaged in activities or discussions focused on the content than when students are passively listening to an instructor lecture. There is ample evidence showing that engaging students in active learning improves academic achievement in comparison with traditional teaching methods. In an updated review of research on how college affects students, Mayhew et al. (2016, p. 51) reported:

. . . across active learning methods and disciplinary fields of study, the weight of evidence has found students who actively engage in the learning process gained greater subject matter competence and were more adept at higher order thinking in the discipline than peers who were less actively engaged in their learning.

Much of the evidence of the effectiveness of active learning approaches is based on studies focusing on specific STEM disciplines, referred to as discipline-based education research. This research has shown that active learning increases students’ STEM content knowledge, conceptual understanding, and problem-solving skills (National Research Council, 2012a; Faust and Paulson, 1998; Prince, 2004). Educational practices shown to be effective include interactive lectures in which students are active participants, collaborative learning activities, lecture-tutorial approaches, and laboratory experiences that incorporate realistic scientific practices and the use of technology (National Research Council, 2012a). In engineering, for example, first-year courses that engage students in teams to solve real-world engineering problems have been shown to increase student persistence in the field (Fortenberry et al., 2007; Lichtenstein et al., 2014).

A recent meta-analysis of studies spanning STEM disciplines provides some of the most conclusive evidence to date that active learning increases student performance in science, engineering, and mathematics (Freeman et al., 2014). Moreover, some research shows that, in comparison with a traditional lecture course, students from traditionally underrepresented groups taking a course with active learning methods are less likely to fail or withdraw (Haak et al., 2011). Appropriate use of active learning and formative assessment increases achievement and persistence for all students, but particularly for traditionally underrepresented students (Freeman et al., 2014). The use of evidence-based practices is especially important for improving outcomes for students in the critical high-enrollment “gateway” courses that are required for STEM majors (Gasiewski et al., 2012). Currently, these courses often act to discourage students from persisting in STEM majors (President’s Council of Advisors on Science and Technology, 2012).

Formative Assessment

Rapid feedback to students and instructors on students’ learning progress is another example of an evidence-based practice, and it is also often a component of active learning instruction. Such feedback is provided through formative assessment. Formative assessments are used to diagnose where a student is relative to learning goals and to inform students and instructors of any actions needed to address learning gaps. In contrast to summative assessment, which aims to evaluate student learning after an extended period of instruction and has stakes attached (e.g., a grade, a degree), formative assessments are administered in the course of instruction and have low or no stakes attached to them.

As with active learning, there is significant agreement in the research literature about the importance of formative assessment processes for improving students’ acquisition of STEM concepts and skills (National Research Council, 2012a). The core idea is that students need high-quality feedback about their learning in order to improve. Based on an extensive review of the formative assessment literature, Black and Wiliam (1998) concluded that the positive effect of formative assessment on student learning is larger than that of most other educational innovations. There are many ways for instructors to use formative assessment. For example, in one study of several interactive geoscience classrooms, the instructors lectured for 10–20 minutes and then posted a selected-response formative assessment on the blackboard or screen. Based on their evaluation of student responses, the instructors then either led a whole-class discussion, briefly explained the reason for the correct answer and continued lecturing, or asked students to discuss the reasons for their answers with their neighbors (peer tutoring; see Box 3-1). Students in these interactive classrooms showed a substantial improvement in understanding of key geoscience concepts (McConnell et al., 2006). Student response systems (clickers) are an increasingly popular way for instructors to promote formative assessment in large-lecture introductory STEM courses, in both 2-year and 4-year institutions: see Box 3-2.

A recent study by Roediger and colleagues (2011) indicates that low-stakes assessments can support learning whether or not the results are used for formative purposes (i.e., to guide changes in teaching and learning). In three classroom-based experiments, the authors found that frequent, low-stakes assessments provided students with practice in trying to remember (retrieving) course material. The retrieval process promoted learning and retention of course material.

Outside the Classroom

There is emerging evidence that programs, support services, and other experiences outside the classroom—including advising, mentoring, internships, and undergraduate research experiences—can support students’ mastery of STEM concepts and skills, as well as persistence and positive attitudes toward STEM learning (National Academies of Sciences, Engineering, and Medicine, 2016, 2017). They are sometimes referred to as “co-curricular” activities, but it is important to note that internships are often part of the core curriculum in undergraduate engineering, required for completion of an engineering degree. Below, we present examples of various “outside the classroom” experiences that can be part of undergraduate STEM education and review research on their effectiveness. These examples overlap to some degree with a group of educational approaches referred to as “high impact practices” (see Box 3-3). As in the previous section, these examples are not intended to be exhaustive but, rather, descriptive.

In a review of evaluations of interventions specifically designed to encourage persistence and success in undergraduate STEM, Estrada (2014, p. 5) found “emerging evidence that many programs result in participants pursuing STEM careers at higher rates than those who do not participate in interventions.” Evaluation studies have found that summer bridge programs (Packard, 2016; Strayhorn, 2011) and living-learning programs (Brower and Inkelas, 2010) facilitate intended STEM majors’ successful transition into college and persistence in STEM, particularly for women and students of color. Internships, student professional groups, and peer tutoring programs can also have a positive effect on STEM outcomes by promoting STEM learning, expanding peer and professional networks, and developing students’ scientific identity (National Academies of Sciences, Engineering, and Medicine, 2016; Eagan, 2013). A recent review of the research on undergraduate research experiences concluded that they are beneficial for all students and help to validate disciplinary identity among students from historically underrepresented groups (National Academies of Sciences, Engineering, and Medicine, 2017).

Experiences outside the classroom can help all college students develop a basic understanding of STEM concepts and processes, in addition to their value for STEM majors. The higher education community has placed a renewed emphasis on the importance of developing such basic understanding of STEM for all college students, including it among the goals of a liberal arts education (Association of American Colleges & Universities [AAC&U], 2007; Savage, 2014). STEM experiences outside the classroom can develop students’ knowledge of the physical and natural worlds, quantitative literacy, and critical thinking and analytical skills—all of which are among what AAC&U calls essential learning outcomes for the 21st century. Many colleges and universities have devised programs and activities (e.g., first-year seminars, sustained enrichment programs, intensive intersession experiences) that require students to engage with scientific evidence and evaluate scientific claims (see, e.g., Savage and Jude, 2014). For both STEM majors and non-STEM majors, such experiences develop STEM competencies and provide opportunities for students to apply these competencies to complex, real-world problems (Savage and Jude, 2014; Estrada, 2014).

Advising

Advising relationships are intended to support students’ academic progress and degree completion and are important for all students, especially those majoring in STEM. An effective advisor provides accurate information about general education and degree requirements, guides students through the academic planning process, and ensures that students complete the administrative tasks necessary to move through the higher education institution in a timely manner (Baker and Griffin, 2010; Drake, 2011; Pizzolato, 2008). Light (2004) suggested that good advising was an important characteristic of a successful college experience.

Research suggests that the quality of advising influences student success. For example, a study of more than 1,000 public university students reported that students’ satisfaction with the quality of the advising they received was positively related to retention to the second year, partly because it was associated with a higher grade point average during the first year (Metzner, 1989). More generally, Pascarella and Terenzini (2005, p. 404) found that “research consistently indicates that academic advising can play a role in students’ decisions to persist and in their chances of graduating”; it was not clear whether advising had a direct or indirect effect on students.

Some studies suggest that high-quality advising is particularly valuable for 2-year college students. For example, Seidman (1991) found that 2-year college students who received academic advising three times during their first semester to discuss course planning and involvement opportunities persisted at a rate 20 percent higher than those who participated only in the first-year orientation program. In another study, focusing on full-time 2-year college students in California, Bahr (2008) found that academic advising improved students’ success in completing developmental mathematics and transferring to a 4-year institution. Bahr found that the benefits of advising were greater for students underprepared in mathematics than for those who were ready for college-level mathematics. Field research in multiple community colleges suggests that they can best facilitate student success by redesigning curriculum, instruction, and student supports around coherent programs of study (Bailey, Jaggars, and Jenkins, 2015); high-quality advising is a central element of this “guided pathways” approach (see Chapter 5 for further discussion).

Although better research on the direct effects of academic advising on student outcomes is needed, academic advising is consistently associated with variables that predict student success—namely, student satisfaction with the college experience, effective educational and career decision making, student use of campus support services, student-faculty contact outside the classroom, and student mentoring (Habley, Bloom, and Robbins, 2012).

The quality of advising is critically important for all students, but particularly for STEM majors. Research links advisors’ misinformation about rigid course sequences or career opportunities to attrition from STEM majors (Haag et al., 2007). For example, a student who neglects to enroll in the appropriate mathematics course in a given semester might delay the completion of her degree by a semester or more because of the nature of STEM course prerequisites. High-quality advising enables students to make good academic decisions based on accurate information, contributing to successful completion of a STEM degree (Baker and Griffin, 2010).

Appropriate advising on STEM is important not only for entering students who intend to pursue STEM majors, but also for all students. Many students transfer into a STEM major after initially focusing on another field of study (Chen, 2013; National Academies of Sciences, Engineering, and Medicine, 2016). Such transfers suggest that many students with interest and ability in STEM would benefit from more guidance and information about STEM programs and careers.

Mentoring

Effective mentoring practices can also support students’ success in undergraduate STEM. Mentoring has been defined as a concept, a process, a developmental experience, or a set of activities (Crisp and Cruz, 2009). It may involve formal or informal interactions that occur only briefly or are sustained over time. Effective mentors can help students by bringing together ideas from different contexts to promote deeper learning. Although most studies report that mentoring has a positive effect on academic success, the varying definitions of roles and interactions have made it difficult to fully evaluate the impact of mentoring (Crisp and Cruz, 2009; National Academies of Sciences, Engineering, and Medicine, 2017).

Often, an individual faculty member mentors a student, with ongoing interactions centered on the student’s personal and professional development (Baker and Griffin, 2010; Packard, 2016). Effective mentors of STEM students go beyond information sharing; they provide psychosocial support to students, assist students in building key STEM competencies (Packard, 2016), and act as a sounding board as students work through academic and career decisions (Baker and Griffin, 2010). High-quality mentors also use their resources (social capital) and their positions in the institution and in their STEM fields to provide valuable experiences and opportunities to help their mentees meet personal and career goals (Baker and Griffin, 2010; Packard, 2016). Though mentoring requires large investments of faculty time and effort, it is a valuable practice, with positive effects on the outcomes of STEM majors, especially those from historically underrepresented populations (Packard, 2016).

Mentoring may also be provided by peers, such as STEM majors enrolled in advanced classes. Such peer mentors typically receive guidance from faculty to support first- and second-year students. Combining the concept of peer mentoring with the special needs of historically underrepresented populations, Montana State University matches incoming underrepresented and/or first-generation students intending to major in STEM with older underrepresented and/or first-generation STEM students who are succeeding at the university.1 One recent study found that students who received peer mentoring experienced increased satisfaction with, and commitment to, a STEM major (Holland, Major, and Orvis, 2012). Another study found that students who received peer mentoring reported an increased sense of belonging and science identity, as well as improved self-efficacy, all factors that are important for increasing persistence of underrepresented minorities in STEM (Trujillo et al., 2015). Still, a recent review noted that research on undergraduate mentoring programs needs more rigorous designs (Gershenfeld, 2014).

___________________

1 See http://www.montana.edu/empower/mentoring.html [November 2017].

Limited Use of Evidence-Based STEM Educational Practices

Despite the growing body of research supporting the effectiveness of evidence-based educational practices in and outside the classroom, they have not been widely implemented to date (National Research Council, 2012b; Weaver et al., 2016). For example, Eagan (2016) looked at teaching practices among groups of faculty from the physical and natural sciences, engineering, and mathematics, along with social sciences and humanities. He found a persistent gap in the use of student-centered teaching techniques between faculty in the natural and physical sciences and engineering and those in the social sciences and the arts and humanities.

Recognizing the potential of evidence-based educational practices to significantly increase students’ learning, federal policy makers (President’s Council of Advisors on Science and Technology, 2012; National Science and Technology Council, 2013) and disciplinary associations (e.g., Saxe and Braddy, 2016) continue to recommend wider use of them. Timely, nationally representative information on the extent to which these practices are being used would help policy makers, associations, institutions, and educators devise effective strategies to promote their use and thus increase students’ learning and persistence in STEM.

Proposed Indicators

Given what is known about the value of evidence-based STEM educational practices and the relative lack of their widespread adoption, the committee proposes two indicators to monitor progress toward the objective of using evidence-based practices in and outside of classrooms.

Indicator 1.1.1: Use of Evidence-Based Practices in Course Development and Delivery

In the National Science and Technology Council’s 5-year strategic plan for STEM education (2013), the first recommendation for strengthening undergraduate education was to “identify and broaden implementation of evidence-based instructional practices” (p. 29). Research shows that STEM educators (faculty members, graduate student instructors, adjunct instructors or others) are indeed aware of instructional alternatives to extensive lecturing (Henderson, Dancy, and Niewiadomska-Bugaj, 2012) and that such alternatives are being made available in forms more accessible to educators (Kober, 2015). But knowing of these practices is not the same as using them, especially if the departmental or institutional context does not provide support for that use (Weaver et al., 2016; see further discussion below). Surveys conducted within different STEM disciplines suggest that educators have made only limited use of research-based approaches to date (e.g., Henderson and Dancy, 2009). However, little is known about the extent to which STEM educators nationally are drawing on research to redesign and deliver their courses. Indicator 1.1.1 is designed to fill this gap, measuring the extent to which all STEM instructors (tenured and tenure-track faculty, part-time and adjunct faculty, instructors, and graduate student instructors) incorporate evidence-based educational practices in course development and delivery.

Indicator 1.1.2: Use of Evidence-Based Practices Outside the Classroom

Educational experiences outside the classroom also support students’ mastery of STEM concepts and skills, retention in STEM, and, in some cases, entry into STEM careers (Estrada, 2014; National Academies of Sciences, Engineering, and Medicine, 2017). In particular, there is evidence that undergraduate research experiences are beneficial for all students and help underrepresented minority students develop identification with their chosen STEM discipline (National Academies of Sciences, Engineering, and Medicine, 2017; Laursen et al., 2010).

OBJECTIVE 1.2: EXISTENCE AND USE OF SUPPORTS THAT HELP STEM INSTRUCTORS USE EVIDENCE-BASED EDUCATIONAL PRACTICES

Importance of the Objective

Advancing adoption of evidence-based educational practices is difficult because of such barriers as little support from faculty members, other instructors, and departments; few incentives for improved teaching; inappropriate classroom infrastructure; limited awareness of research-based instructional practices; and lack of time (Henderson, Beach, and Finkelstein, 2011; National Research Council, 2012a; National Academies of Sciences, Engineering, and Medicine, 2016). Departments are a critical unit for change in undergraduate STEM since they represent not only individual instructors’ values and aspirations, but also the whole curriculum, beyond the individual courses (e.g., Wieman, Perkins, and Gilbert, 2010). Hence, departmental and institutional supports, including professional development, are essential to help instructors learn about and adopt evidence-based educational practices.

Incorporating new approaches into established teaching practices is a challenge for many instructors. And even when they try a new instructional method, many abandon it due to lack of support (Henderson and Dancy, 2009; Henderson, Dancy, and Niewiadomska-Bugaj, 2012; National Research Council, 2012a). For example, Ebert-May and colleagues (2011) surveyed faculty who had participated in workshops introducing evidence-based teaching practices and also videotaped their classroom teaching. Although 75 percent of the participants indicated in the survey that they were using evidence-based practices, including student-centered and cooperative learning approaches, analysis of the videotapes showed that only 25 percent had moved toward these approaches, and 60 percent had not changed their teaching practices at all.

If instructors are to make lasting positive changes to their pedagogy, they often need instructional support, which can include time and resources for professional development opportunities (e.g., through a center for teaching and learning), mini-grants for instructional improvement, and development of instructional facilities that support different types of evidence-based educational practices. Support can also be provided by external entities. For example, many professional societies offer programs to introduce instructors to active learning strategies and support the use of these strategies (Council of Scientific Society Presidents, 2013).

Research on instructional change has advanced in the past decade, so more is now known about the factors that constrain movement toward evidence-based STEM teaching (Smith, 2013). However, much less is known about what supports adoption of evidence-based educational practices. Because there have been few efforts to systematically identify sources of instructional support for the use of evidence-based instructional practices and track how those resources are allocated by administrators and used by instructors, research in this area is needed. The limited research available indicates that the objective of existence and use of supports is complex and multifaceted. For example, physical classroom and laboratory space and technology infrastructure are likely to play a role in instructors’ use of evidence-based practices. Although multiple indicators of multiple factors may be needed to fully measure the status of this objective, the committee proposes two indicators as a starting point, focusing on professional development and support for course development and design.

Proposed Indicators

Indicator 1.2.1: Extent of Instructors’ Involvement in Professional Development

One barrier to increased use of evidence-based educational practices is a lack of awareness of such practices (Henderson, Beach, and Finkelstein, 2011). Professional development is perhaps the most frequent approach to developing such awareness. Over the past decade, STEM disciplinary societies and departments have begun offering professional development programs, and colleges and universities have been establishing centers for teaching and learning to help instructors across all disciplines improve their teaching practices (National Research Council, 2012a). At the same time, discipline-based education research has advanced, identifying evidence-based practices to incorporate in these growing professional development programs.

Some evaluations of such programs suggest that instructors are more likely to adopt new, evidence-based teaching practices when they participate in professional development that includes three features (Henderson, Beach, and Finkelstein, 2011):

  • sustained efforts that last 4 weeks, one semester, or longer;
  • feedback on instructional practice; and
  • a deliberate focus on changing instructors’ conceptions about teaching and learning.

A more recent evaluation of a professional development program in the geosciences yielded positive but somewhat different results. Manduca and colleagues (2017) evaluated “On the Cutting Edge,” a program that included workshops and a website to share teaching resources, to determine whether participation had led to use of evidence-based teaching practices. The authors surveyed program participants in 2004, 2009, and 2012, asking about teaching practices, engagement in education research and scientific research, and professional development related to teaching. In addition, they directly observed teaching using the Reformed Teaching Observation Protocol2 and conducted interviews to understand what aspects of the program had supported change. Analysis of the survey data indicated that use of evidence-based educational practices had become more common, especially among faculty who invested time in learning about teaching. Instructors who had participated in one or more workshops and regularly used resources provided by the program’s website were statistically more likely to use evidence-based educational strategies. Respondents also reported that learning about teaching, the availability of teaching resources, and interactions with peers had supported changes in their teaching practice. The authors suggest that even one-time participation in a workshop with peers can lead to improved teaching by supporting a combination of affective and cognitive learning outcomes.

___________________

2 See http://www.public.asu.edu/~anton1/AssessArticles/Assessments/Biology%20Assessments/RTOP%20Reference%20Manual.pdf [July 2017].

Henderson, Beach, and Finkelstein (2011) emphasize instructors’ extended time in formal professional development programs as key to implementation of evidence-based teaching practices, while Manduca and colleagues (2017) emphasize instructors’ extended time in informal professional development (learning about teaching through interactions with peers, accessing the website, etc.) as critical for change in teaching practices. But both groups of authors found that investing time in learning about evidence-based teaching practices supports the implementation of such practices; this finding contributed to the committee’s proposed indicator on the extent of instructors’ involvement in professional development.

More and better data about the nature of instructional supports and their use are needed. Beach and colleagues (2016) recently pointed out that faculty development is entering an age of evidence that will require faculty development units (e.g., campus teaching and learning centers) to collect more data about development services and make better use of the data for program improvement. In response to this push for evidence, it is possible that data about instructors’ use of faculty development units and the resources available from such units are being collected at the institution level, but the committee knows of no efforts to aggregate those data across institutions.

Indicator 1.2.2: Availability of Support or Incentives for Evidence-Based Course Development or Course Redesign

Course development and course redesign are time-intensive activities (Dolan et al., 2016). For example, when participants in the National Academies Summer Institutes on Undergraduate Education3 were surveyed after returning to their home institutions, they frequently commented that it took 3 or more years of experimenting with learner-centered teaching strategies before they could implement those strategies effectively (Pfund et al., 2009). Studies conducted at research-intensive universities that may not always value teaching have found that instructors’ course development and redesign can sometimes be accelerated when they receive appropriate support, such as instructional resources, establishment of faculty learning communities (see, e.g., Tewksbury and MacDonald, 2005) and teaching and learning centers, and help from trained instructional assistants (see, e.g., Wieman, Perkins, and Gilbert, 2010). Faculty learning communities have also been developed at 2-year colleges (Sipple and Lightner, 2013).

___________________

3 The week-long summer institutes focusing on life sciences education engaged participants in active learning and formative assessment, to help them both understand and experience these evidence-based educational practices. See http://www.hhmi.org/news/hhmi-helps-summer-institute-expand-regional-sites [September 2017].

Support for the time instructors need to develop or redesign a course usually comes from the department or institution in the form of course buyouts during the academic year (Dolan et al., 2016). Financial support can come as additional compensation during the academic year as “overload” or as payment during unfunded summer months. Instructional resources can include content repositories, course templates, assessment/learning objective alignment tools, and successful course design models. Support can also come from different types of people, including content developers (collaborating faculty, co-instructors, postdoctoral fellows, graduate students, undergraduate students), experts in pedagogy, assessment, and instructional technology, and other instructors in peer learning communities. All of these various forms of support are helpful, if not essential, but they require specific department, college, and institutional cultures that routinely demonstrate, in both words and actions, that evidence-based course development and redesign are valued.

Developing a course is not a one-time activity; it is an ongoing exploration of how to engage all students and give them the opportunity to learn (Weaver et al., 2016). Targeted experimentation, whether developed locally or as a replication of published work, signals an approach that values ongoing, or continuous, educational improvement (see further discussion below). Full engagement with evidence-based course development or redesign forces examination of learning objectives, instructional activities and approaches, assessment of student learning outcomes, connections with preceding and subsequent courses, and interdisciplinary connections. The work fosters instructional experimentation that activates engagement in the scholarship of teaching and learning in a manner closely linked to the process STEM faculty members use in their own research.

OBJECTIVE 1.3: AN INSTITUTIONAL CULTURE THAT VALUES UNDERGRADUATE STEM INSTRUCTION

Importance of the Objective

The behavior of individuals in an organization is based on both their individual characteristics and the characteristics of their organizational contexts (Henderson and Dancy, 2007; Henderson, Beach, and Finkelstein, 2011). In academic contexts, organizational culture includes both easily observable surface-level features (e.g., the way people dress, official policies, classroom layouts) and deeper, often unconscious values that groups develop as they respond to external environmental challenges and the need for internal integration (Schein, 2004). The deeper-level shared, tacit assumptions that come to be taken for granted partly determine group members’ daily behavior (Schein, 2010). Institutional values are what those in the organization perceive as important. For example, if a department’s culture values research more highly than teaching, then it is unlikely to provide support for evidence-based course development or redesign.

Because culture is dynamic, continually evolving, and includes unstated, unconscious assumptions, it is very difficult to measure. Nevertheless, proxy measures used in the research literature suggest that institutional culture is related to students’ persistence in STEM majors. For example, Griffith (2010) found that persistence was explained not only by students’ cumulative grade-point averages, but also by an institution’s ratio of undergraduates to graduate students and its share of funding going to undergraduate education relative to research (proxies for a commitment to undergraduate education). Titus (2004) found that students’ persistence was influenced by institutional contexts, concluding that “persistence is positively influenced by student academic background, college academic performance, involvement, and institutional commitment” (Titus, 2004, p. 692).
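As a purely arithmetical illustration of such proxies, the sketch below computes the two institution-level quantities mentioned above from invented figures; none of the numbers come from Griffith (2010) or Titus (2004).

```python
# Illustrative computation of the two proxy measures discussed above.
# All figures are invented for this sketch; they are not from Griffith (2010).

undergraduates = 20_000
graduate_students = 5_000
undergrad_education_funding = 120.0  # hypothetical, in millions of dollars
research_funding = 300.0             # hypothetical, in millions of dollars

# Proxy 1: ratio of undergraduates to graduate students.
ug_to_grad_ratio = undergraduates / graduate_students

# Proxy 2: share of funding going to undergraduate education relative to research.
undergrad_funding_share = undergrad_education_funding / (
    undergrad_education_funding + research_funding
)

print(f"Undergraduate-to-graduate ratio: {ug_to_grad_ratio:.1f}")
print(f"Undergraduate share of combined funding: {undergrad_funding_share:.0%}")
```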

A growing body of research indicates that many dimensions of current departmental and institutional cultures in higher education pose barriers to educators’ adoption of evidence-based educational practices (e.g., Dolan et al., 2016; Elrod and Kezar, 2015, 2016a, 2016b). For example, the well-established norm in some STEM departments of allowing each individual instructor full control over his or her course, including its learning outcomes, can cause instructors to resist working with colleagues to establish shared learning goals for core courses, a process that is essential for improving teaching and learning. One recent analysis found that the University of Colorado Boulder science education initiative made little progress in shifting ownership of course content from individual instructors to the departmental level because of this dimension of departmental culture (Dolan et al., 2016).

Given the challenge of measuring organizational culture, the committee focused on a closely related construct, organizational climate. Organizational climate is defined as the shared perceptions of people in an organization about specific and identifiable practices and policies that represent the “surface features” of organizational culture (Denison, 1996; Peterson and Spencer, 1990; Schneider, 1975; Schneider and Reichers, 1983; Schneider, Ehrhart, and Macey, 2013). Since organizational climate can be more directly measured than culture, it is a more useful construct for the purpose of developing indicators. In addition, organizational climate is considered more malleable than culture, changing in response to evolving strategic imperatives, leadership messages, and policies and procedures (Kozlowski and Ilgen, 2006).

One example comes from Michigan State University. With leadership and funding from the provost and assistance from the Association of American Universities Undergraduate STEM Initiative, the university developed new policies and practices for introductory genetics. Previously, no single college or department had “owned” the course. A new management committee developed a set of common learning objectives and assumed responsibility for assigning instructors. Through this process, instructors developed new shared perceptions of course goals and instructional approaches (i.e., institutional climate) and increased coordination of instructional and assessment practices. Follow-up assessment showed a substantial increase in student performance relative to the prior format of introductory genetics (Association of American Universities, 2017).

Developing good indicators of climate is important because, although nearly every 4-year college and university has a mission statement declaring that it values high-quality teaching (i.e., surface features of the organizational culture), a closer look at institutional structures and the behavior of individuals within particular types of institutions often suggests otherwise. One of the most direct ways to understand climate is to examine not only the written policies of an organization, but also its actual practices. Thus, the committee proposes two indicators related to institutional practices.

Proposed Indicators

Indicator 1.3.1: Use of Valid Measures of Teaching Effectiveness

This proposed indicator would focus on how instructors’ performance is measured. It is well known that monitoring and feedback can lead to significant improvements in organizational and individual learning and performance (e.g., Harry and Schroeder, 2005; National Research Council, 2001), including performance in undergraduate STEM teaching. However, the way this monitoring and feedback is implemented is crucial. In particular, the data sources used shape what the institution pays attention to, and the overall integrity of the measurement and feedback process demonstrates the importance of improvement in teaching and learning to the institution.

Student evaluations of teaching are the most common method institutions use to assess teaching effectiveness (Berk, 2005; Henderson et al., 2014) for such purposes as feedback and improvement or to inform personnel decisions (e.g., promotion and tenure). However, there is disagreement in the literature about what student evaluations actually measure (e.g., Ory and Ryan, 2001), and there are many who argue that typical student evaluations are not a valid measure of teaching effectiveness (e.g., Emery, Kramer, and Tian, 2003; Zabaleta, 2007). Furthermore, there is nearly universal agreement that high-quality assessment of teaching effectiveness requires multiple measures (e.g., Anderson et al., 2011; Chism and Stanley, 1999). An interview-based study with 72 physics instructors (Henderson et al., 2014, p. 16) concluded: “. . . both instructors and institutions use a limited repertoire of the possible assessment methods. Both groups would benefit from including additional methods.”

Indicator 1.3.2: Consideration of Evidence-Based Teaching in Personnel Decisions by Departments and Institutions

As noted above, individual instructors’ decisions about teaching practices are influenced by departmental and institutional cultures and contexts that may facilitate or discourage use of evidence-based educational practices (Austin, 2011). Reward and promotion policies are concrete manifestations of culture, and the cultures and policies of research universities have traditionally rewarded research more highly than teaching (see, e.g., Fairweather, 2005; Fairweather and Beach, 2002), with little or no attention to the quality of teaching.

Recent experience in institutional and multi-institutional STEM reform efforts indicates that a lack of rewards for evidence-based teaching discourages wider use. For example, a study of 11 California campuses undertaking institutional-level STEM education reform identified inadequate incentives and rewards for faculty participation in STEM reform projects as one of several common barriers to change (Elrod and Kezar, 2015, 2016a, 2016b). In another example, tenure-track instructors participating in extensive, weekly professional development workshops as part of a university-wide course redesign project at Purdue University reported that there was no incentive to spend much time on course redesign, especially if it reduced time for research, because promotion and tenure decisions would be based entirely on research productivity (Parker, Adedokun, and Weaver, 2016).

A 2014 survey of instructors at eight research universities participating in the Association of American Universities Undergraduate STEM Initiative revealed concerns that evidence-based teaching activities would not be recognized and rewarded in personnel evaluations (Miller and Trapani, 2016). Among the 1,000 respondents (a 37 percent response rate), about 60 percent were tenured and tenure-track faculty, 26 percent were graduate students, and the rest were other types of instructors. On average, the respondents agreed that teaching was important to administrators, but they were less likely to agree that effective teaching played a role in their annual performance review and salary, posing a challenge to adoption of evidence-based teaching practices.

Case studies suggest that consideration of evidence-based teaching practices in personnel decisions changes the academic climate toward more positive views about such practices. For example, Anderson and colleagues (2011) and Wieman, Perkins, and Gilbert (2010) found that requiring faculty to demonstrate excellence in teaching for promotion, while also providing adequate support and structures to develop faculty teaching ability, changed departmental climates that previously did not support undergraduate STEM education.

A climate that values undergraduate STEM education would be expected to reward and encourage individuals to seek out and make use of current knowledge related to teaching and learning. Such a climate would reflect a shared understanding of the importance of instructors who know about this evidence base and build their teaching practices on it. In such a supportive climate, administrators would use validated measures other than typical student evaluations to not only measure, but also reward, instructors’ use of evidence-based teaching practices.

Because administrators and promotion and tenure committees must consider many factors when making personnel decisions, the committee did not specify how they should weigh information provided by the use of validated measures of teaching quality. In the committee’s view, the mere fact that such information is a factor in personnel decisions would be significant evidence of a climate that values undergraduate STEM education.

OBJECTIVE 1.4: CONTINUOUS IMPROVEMENT IN STEM TEACHING AND LEARNING

Importance of the Objective

Just as students’ mastery of STEM concepts and skills is supported by ongoing formative assessment and rapid feedback, instructors’ work on course redesign and implementation is supported by ongoing formative and summative assessment of student learning to determine which teaching approaches are most effective and thus inform continued course improvement. At the same time, department-level improvement in STEM teaching and learning can be supported by instructors’ collaborative work to develop common learning goals for all students and engage in ongoing evaluations of students’ progress toward those goals in order to guide continued improvement. At the institutional level, campus-wide commitment to ongoing organizational learning and collection and analysis of data on program effectiveness is critical to facilitate the systemic reform of undergraduate STEM education that increases students’ mastery of STEM concepts and skills (Elrod and Kezar, 2015, 2016b). This process of ongoing evaluation and improvement is referred to as continuous improvement: see Box 3-4.

Institutional-level improvement efforts are essential when attempting nationwide improvement in undergraduate STEM education. Although most institutions are engaged in multiple quality improvement efforts in different departments, schools, and classrooms, these efforts are often disconnected, rather than linked for systemic continuous improvement. Therefore, the committee sought to identify examples and evidence of 2-year and 4-year institutions that are engaged in coordinated continuous quality improvement efforts. Because students directly experience courses and programs of study, the committee focused on aspects of courses and programs that signal the continuous improvement process of clearly articulated goals with aligned assessment leading to outcomes-based action aimed at improving student success.

At the classroom and department levels, for example, the Science Education Initiative at the University of Colorado used a backward design approach to course transformation. As the first step in redesigning a course, faculty were asked to define specific, measurable learning goals and to develop conceptual assessments for measuring student progress toward them, so that instruction could be revised accordingly. Whenever possible, learning goals were established through faculty consensus in each department (Wieman, Perkins, and Gilbert, 2010).
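Conceptual assessments of this kind are often administered before and after instruction. One common analysis in discipline-based education research, offered here only as an illustration of how such assessments can quantify progress toward a measurable learning goal, is the average normalized gain (Hake’s gain): the fraction of the possible improvement that students actually achieve. The Python sketch below uses hypothetical scores.

    # Average normalized gain on a pre/post concept inventory.
    # Scores are percentages (0-100); all data here are hypothetical.

    def normalized_gain(pre: float, post: float) -> float:
        """Fraction of the possible improvement actually achieved."""
        if pre >= 100.0:
            return 0.0  # no headroom left; treat the gain as zero
        return (post - pre) / (100.0 - pre)

    def average_gain(pairs: list[tuple[float, float]]) -> float:
        """Mean normalized gain over matched (pre, post) score pairs."""
        gains = [normalized_gain(pre, post) for pre, post in pairs]
        return sum(gains) / len(gains)

    # Hypothetical matched scores for one course section
    section = [(35.0, 70.0), (50.0, 65.0), (20.0, 55.0), (60.0, 85.0)]
    print(f"average normalized gain: {average_gain(section):.2f}")  # 0.48

A rising average gain across successive offerings of a redesigned course is one signal, though not the only one, that the redesign is moving students toward the stated goals.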

At the institutional level, Bradforth and colleagues (2015, p. 283) observed: “Universities accumulate volumes of longitudinal data that have been underused in assessments of student learning and degree progress.” They noted that a few universities have integrated such datasets to create centralized, campus-wide analytics for use in measuring student progress and guiding improvements in teaching and support services. In one example of such integration, the iAMSTEM HUB at the University of California, Davis, offers tools to visualize student pathways through courses and programs. At its most basic level, the ribbon-flow tool informs recruitment and retention efforts by visualizing the majors students start with as freshmen and the majors they complete as seniors.
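The report describes the ribbon-flow tool only at a high level, but the aggregation underlying such a visualization is straightforward to picture: for each entering major, tally how many students complete each major. The record format and values in this Python sketch are hypothetical.

    # Tallying entry-major -> completion-major transitions, the
    # aggregation behind a ribbon-flow (alluvial) visualization.
    # The records here are hypothetical.
    from collections import Counter

    records = [
        ("Biology", "Biology"),
        ("Biology", "Psychology"),
        ("Physics", "Physics"),
        ("Chemistry", "Biology"),
        ("Physics", "Economics"),
    ]

    flows = Counter(records)  # weight of each entry->completion ribbon

    for (entry, completion), count in sorted(flows.items()):
        print(f"{entry:>10} -> {completion:<11} {count}")

Each (entry, completion) count becomes the width of one ribbon; summing counts by entry major recovers the entering cohort sizes, and summing by completion major recovers the graduating totals.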

Challenges of Measuring Continuous Improvement

After considering alternative approaches to measuring continuous improvement in STEM teaching and learning, the committee did not propose specific indicators for this objective. The first step in continuous improvement, establishing clearly articulated student learning goals and assessments of students’ progress toward those goals, could potentially be measured. The importance of this step is supported by a large and growing body of theory and empirical evidence: see Box 3-5. Institution- or department-level measures of the percentage of STEM programs or individual STEM courses that have established learning goals and aligned assessments would provide some information about continuous improvement. For example, the PULSE assessment rubric (Brancaccio-Taras et al., 2016) addresses the degree to which programs have developed and used course and curricular learning goals for students, along with assessments aligned with the desired learning outcomes at both the course and whole-curriculum levels. The rubric includes two major rating categories, Course-Level Assessment and Program-Level Assessment. Although this PULSE rubric focuses on the life sciences, most of its criteria are applicable across the broad range of STEM disciplines. In addition, some higher education accrediting bodies require institutions to report on student learning goals and assessment results. For example, the Senior College and University Commission of the Western Association of Schools and Colleges asks institutions to clearly state student learning outcomes and standards of performance at the course, program, and institution levels and to engage faculty in developing and widely sharing these student learning outcomes.4

___________________

4 See https://www.wscuc.org/resources/handbook-accreditation-2013/part-ii-core-commitments-and-standards-accreditation/wasc-standards-accreditation-2013/standard-2-achieving-educational-objectives-through-core-functions [November 2017].
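If an institution maintained even a simple inventory of its STEM courses, a measure of the kind discussed above could be computed directly. The inventory structure in this Python sketch is hypothetical; an actual implementation would draw on curriculum databases or rubric scores such as PULSE ratings.

    # Share of STEM courses that have both stated learning goals and
    # assessments aligned to those goals, one possible institution-
    # level measure. The inventory structure is hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Course:
        name: str
        has_learning_goals: bool
        has_aligned_assessments: bool

    def percent_goal_aligned(courses: list[Course]) -> float:
        """Percentage of courses meeting both criteria."""
        meeting = sum(
            1 for c in courses
            if c.has_learning_goals and c.has_aligned_assessments
        )
        return 100.0 * meeting / len(courses)

    inventory = [
        Course("BIO 101", True, True),
        Course("CHM 101", True, False),
        Course("PHY 201", False, False),
        Course("MTH 110", True, True),
    ]
    print(f"{percent_goal_aligned(inventory):.0f}% of courses")  # 50% of courses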

However, the steps in continuous improvement that follow the establishment of clearly articulated learning goals and the assessment of students’ progress toward those goals are much more difficult to measure. Analyzing assessment data to identify and understand the causes of gaps in student learning, and then developing and implementing strategies to address those gaps, are complex, multilevel processes. Research suggests that the use of assessment data is most effective for improving student learning when it involves, and engages the support of, multiple stakeholders, including instructors, administrators, institutional research offices, and students (Austin, 2011; Blaich and Wise, 2010). Strategies to address gaps in student learning may therefore include not only the actions of individual instructors to develop or redesign their courses, but also departmental or institutional policies that support and encourage instructors in carrying out this work, such as those measured in Indicator 1.2.2 above. Continuous improvement also includes ongoing evaluation to monitor the effects of whatever strategies are adopted. Given the complexity and variety of the actions that may be taken at different levels of an institution, the committee does not propose a specific indicator of continuous improvement in STEM teaching and learning. Before indicators can be developed, further research is needed to conceptualize more clearly what “continuous improvement in STEM teaching and learning” means and to define its key components.

REFERENCES

Anderson, L.W., Krathwohl, D.R., and Bloom, B.S. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.

Anderson, W.A., Banerjee, U., Drennan, C.L., Elgin, S.C.R., Epstein, I.R., Handelsman, J., Hatfull, G.F., Losick, R., O’Dowd, D.K., Olivera, B.M., Strobel, S.A., Walker, G.C., and Warner, I.M. (2011). Changing the culture of science education at research universities. Science, 331(6014), 152–153.

Arum, R., Roksa, J., and Cook, A. (2016). Improving Quality in American Higher Education: Learning Outcomes and Assessments for the 21st Century. Hoboken, NJ: John Wiley & Sons.

Association of American Colleges & Universities. (2007). College Learning for the New Global Century. Washington, DC: Author. Available: https://www.aacu.org/sites/default/files/files/LEAP/GlobalCentury_final.pdf [June 2016].

Association of American Universities. (2017). Progress Toward Achieving Systemic Change: A Five-Year Status Report on the AAU Undergraduate STEM Education Initiative. Washington, DC: Author. Available: https://www.aau.edu/sites/default/files/AAU-Files/STEM-Education-Initiative/STEM-Status-Report.pdf [October 2017].

Austin, A. (2011). Promoting Evidence-Based Change in Undergraduate Science Education. Paper commissioned by the Board on Science Education. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf [June 2016].

Bahr, P.R. (2008). Cooling out in the community college: What is the effect of academic advising on students’ chances of success? Research in Higher Education, 49(8), 704–732.

Bailey, T., Jaggars, S.S., and Jenkins, D. (2015). What We Know About Guided Pathways. New York: Columbia University, Teachers College, Community College Research Center.

Baker, V.L., and Griffin, K.A. (2010). Beyond mentoring and advising: Toward understanding the role of faculty “developers” in student success. About Campus: Enhancing the Student Learning Experience, 14(6), 2–8.

Beach, A.L., Sorcinelli, M.D., Austin, A.E., and Rivard, J.K. (2016). Faculty Development in the Age of Evidence: Current Practices, Future Imperatives. Sterling, VA: Stylus.

Berk, R.A. (2005). Survey of 12 strategies for measuring teaching effectiveness. International Journal on Teaching and Learning in Higher Education, 17(1), 48–62.

Black, P., and Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–148. Available: www.pdkintl.org/kappan/kbla9810.htm [July 2017].

Blaich, C.F., and Wise, K.S. (2010). Moving from assessment to institutional improvement. New Directions for Institutional Research, 2010(S2), 67–78.

Bloom, B.S., Krathwohl, D.R., and Masia, B.B. (1964). Taxonomy of Educational Objectives, Handbook 1: Cognitive Domain. New York: Longman.

Bloom, B.S., Hastings, J.H., and Madaus, G.F. (1971). Handbook on Formative and Summative Evaluation of Student Learning. New York: McGraw-Hill.

Bonwell, C., and Eison, J. (1991). Active Learning: Creating Excitement in the Classroom (ASHE-ERIC Higher Education Report No. 1). Washington, DC: George Washington University. Available: http://www.ed.gov/databases/ERIC_Digests/ed340272.html [July 2017].

Borrego, M., Cutler, S., Prince, M., Henderson, C., and Froyd, J.E. (2013). Fidelity of implementation of Research-Based Instructional Strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425. doi:10.1002/jee.20020.

Bradforth, S.E., Miller, E.R., Dichtel, W.R., Leibovich, A.K., Feig, A.L., Martin, J.D., Bjorkman, K.S., Schultz, Z.D., and Smith, T.L. (2015). Improve undergraduate science education. Nature, 523, 282–284. Available: https://www.nature.com/polopoly_fs/1.17954!/menu/main/topColumns/topLeftColumn/pdf/523282a.pdf [May 2017].

Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J., Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G., Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE Vision & Change Rubrics, version 1.0: A valid and equitable tool to measure transformation of life sciences departments at all institution types. CBE-Life Sciences Education, 15(4), ar60. Available: http://www.lifescied.org/content/15/4/ar60.full [March 2017].

Brewer, C., and Smith, D. (Eds.). (2011). Vision and Change in Undergraduate Biology Education. Washington, DC: American Association for the Advancement of Science.

Brower, A.M., and Inkelas, K.K. (2010). Living-learning programs: One high-impact educational practice we know a lot about. Liberal Education, 96(2), 36–43.

Brownell, J.E., and Swaner, L.E. (2010). Five High-Impact Practices: Research on Learning Outcomes, Completion, and Quality. Washington, DC: Association of American Colleges & Universities.

Casagrand, J., and Semsar, K. (2017). Redesigning a course to help students achieve higher-order cognitive thinking skills: From goals and mechanics to student outcomes. Advances in Physiology Education, 41(2), 194–202. doi:10.1152/advan.00102.2016.

Chen, X. (2013). STEM Attrition: College Students’ Paths Into and Out of STEM Fields. Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Chism, N.V.N., and Stanley, C.A. (1999). Peer Review of Teaching: A Sourcebook. Bolton, MA: Anker.

Council of Scientific Society Presidents. (2013). The Role of Scientific Societies in STEM Faculty Workshops. Available: http://www.aapt.org/Conferences/newfaculty/upload/STEM_REPORT-2.pdf [July 2017].

Crouch, C.H., and Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970–977.

Crisp, G., and Cruz, I. (2009). Mentoring college students: A critical review of the literature between 1990 and 2007. Research in Higher Education, 50(6), 525–545.

Denison, D.R. (1996). What is the difference between organizational culture and organizational climate? A native’s point of view on a decade of paradigm wars. Academy of Management Review, 21(3), 619–654.

Dolan, E.L., Lepage, G.P., Peacock, S.M., Simmons, E.H., Sweeder, R., and Wieman, C. (2016). Improving Undergraduate STEM at Research Universities: A Collection of Case Studies. Tucson, AZ: Research Corporation for Science Advancement. Available: https://www.aau.edu/sites/default/files/STEM%20Scholarship/RCSA2016.pdf [June 2017].

Drake, J.K. (2011). The role of academic advising in student retention and persistence. About Campus, 16(3), 8–12.

Eagan, M.K. (2013). Understanding Undergraduate Interventions in STEM: Insights from a National Study. Presented to the Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_085900.pdf [July 2017].

Eagan, K. (2016). Becoming More Student-Centered? An Examination of Faculty Teaching Practices Across STEM and Non-STEM Disciplines Between 2004 and 2014. A report prepared for the Alfred P. Sloan Foundation. Available: https://sloan.org/storage/app/media/files/STEM_Higher_Ed/STEM_Faculty_Teaching_Practices.pdf [May 2017].

Ebert-May, D., Derting, T.L., Hodder, J., Momsen, J.L., Long, T.M., and Jardeleza, S.E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. Bioscience, 61(7), 550–558.

Elrod, S., and Kezar, A. (2015). Increasing Student Success in STEM: A Guide to Systemic Institutional Change. Washington, DC: Association of American Colleges & Universities. Available: https://www.aacu.org/peerreview/2015/spring/elrod-kezar [May 2017].

Elrod, S., and Kezar, A. (2016a). Increasing Student Success in STEM: A Guide to Systemic Institutional Change. Washington, DC: Association of American Colleges & Universities.

Elrod, S., and Kezar, A. (2016b). Increasing student success in STEM: An overview of a new guide to systemic institutional change. In G.C. Weaver, W.D. Burgess, A.L. Childress, and L. Slakey (Eds.), Transforming Institutions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue University Press.

Emery, C.R., Kramer, T.R., and Tian, R.G. (2003). Return to academic standards: A critique of student evaluations of teaching effectiveness. Quality Assurance in Education, 11(1), 37–46.

Estrada, M. (2014). Ingredients for Improving the Culture of STEM Degree Attainment with Cocurricular Supports for Underrepresented Minority Students. Paper prepared for the Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_088832.pdf [July 2017].

Fairweather, J. (2005). Beyond the rhetoric: Trends in the relative value of teaching and research in faculty salaries. Journal of Higher Education, 76(4), 401–422.

Fairweather, J. (2012). Linking Evidence and Promising Practices in Science, Technology, Engineering, and Mathematics (STEM) Undergraduate Education. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072637.pdf [June 2016].

Fairweather, J., and Beach, A. (2002). Variation in faculty work within research universities: Implications for state and institutional policy. Review of Higher Education, 26(1), 97–115.

Faust, J.L., and Paulson, D.R. (1998). Active learning in the college classroom. Journal on Excellence in College Teaching, 9(2), 3–24.

Fink, L.D. (2003). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Hoboken, NJ: John Wiley & Sons. Available: http://www.unl.edu/philosophy/[L._Dee_Fink]_Creating_Significant_Learning_Experi(BookZZ.org).pdf [June 2017].

Finley, A., and McNair, T. (2013). Assessing Underserved Students’ Engagement in High-Impact Practices. Washington, DC: Association of American Colleges & Universities.

Fortenberry, N.L., Sullivan, J.F., Jordan, P.N., and Knight, D.W. (2007). Engineering education research aids instruction. Science, 317(5842), 1175–1176. Available: http://itll.colorado.edu/images/uploads/about_us/publications/Papers/SCIENCE%20PUBLISHED%20VERSION%202007Aug31.pdf [February 2016].

Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., and Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410–8415. Available: http://www.pnas.org/content/111/23/8410.full [July 2016].

Froyd, J.E., Borrego, M., Cutler, S., Henderson, C., and Prince, M.J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393–399.

Fryer, K.J., Antony, J., and Douglas, A. (2007). Critical success factors of continuous improvement in the public sector: A literature review and some key findings. The TQM Magazine, 19(5), 497–517. doi: 10.1108/09544780710817900.

Fulcher, K.H., Smith, K.L., Sanchez, E.R.H., Ames, A.J., and Meixner, C. (2017). Return of the pig: Standards for learning improvement. Research & Practice in Assessment, 11, 10–40.

Gasiewski, J.A., Eagan, M.K., Garcia, G.A., Hurtado, S., and Chang, M.J. (2012). From gate-keeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53(2), 229–261.

Gershenfeld, S. (2014). A review of undergraduate mentoring programs. Review of Educational Research, 84(3), 365–391.

Griffith, A.L. (2010). Persistence of women and minorities in STEM field majors: Is it the school that matters? Economics of Education Review, 29(6), 911–922.

Haag, S., Hubele, N., Garcia, A. and McBeath, K. (2007). Engineering undergraduate attrition and contributing factors. International Journal of Engineering Education, 23(5), 929–940.

Haak, D.C., HilleRisLambers, J., Pitre, E., and Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216.

Habley, W.R., Bloom, J.L., and Robbins, S. (2012). Increasing Persistence: Research-Based Strategies for College Student Success. Hoboken, NJ: John Wiley & Sons.

Harry, M.J., and Schroeder, R.R. (2005). Six Sigma: The Breakthrough Management Strategy Revolutionizing the World’s Top Corporations. New York: Currency.

Henderson, C., and Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics: Physics Education Research, 3(2), 020102.

Henderson, C., and Dancy, M.H. (2009). Impact of physics education research on the teaching of introductory quantitative physics in the United States. Physical Review Special Topics—Physics Education Research, 5(2), 020107. Available: https://journals.aps.org/prper/abstract/10.1103/PhysRevSTPER.5.020107 [February 2018].

Henderson, C., Beach, A., and Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. doi: 10.1002/tea.20439.

Henderson, C., Dancy, M., and Niewiadomska-Bugaj, M. (2012). Use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Physical Review Special Topics—Physics Education Research, 8(2), 020104.

Henderson, C., Turpen, C., Dancy, M., and Chapman, T. (2014). Assessment of teaching effectiveness: Lack of alignment between instructors, institutions, and research recommendations. Physical Review Special Topics—Physics Education Research, 10, 010106.

Holland, J.M., Major, D.A., and Orvis, K.A. (2012). Understanding how peer mentoring and capitalization link STEM students to their majors. The Career Development Quarterly, 60(4), 343–354.

Jha, S., Noori, H., and Michela, J. (1996). The dynamics of continuous improvement. Aligning organizational attributes and activities for quality and productivity. International Journal of Quality Science, 1(1), 19–47.

Kober, N. (2015). Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering. Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

Kozlowski, S.W.J., and Ilgen, D.R. (2006). Enhancing the effectiveness of work groups and teams. Psychological Science in the Public Interest, 7(3), 77–124.

Kuh, G.D. (2008). High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter. Washington, DC: Association of American Colleges & Universities.

Kuh, G.D., and O’Donnell, K. (2013). Ensuring Quality and Taking High-Impact Practices to Scale. Washington, DC: Association of American Colleges & Universities.

Laursen, S., Hunter, A.B., Seymour, E., Thiry, H., and Melton, G. (2010). Undergraduate Research in the Sciences: Engaging Students in Real Science. Hoboken, NJ: John Wiley & Sons.

Lasry, N., Mazur, E., and Watkins, J. (2008). Peer instruction: From Harvard to the two-year college. American Journal of Physics, 76(11), 1066–1069.

Lichtenstein, G., Chen, H.L., Smith, K.A., and Maldonado, T.A. (2014). Retention and persistence of women and minorities along the engineering pathway in the United States. In A. Johri and B.M. Olds (Eds.), Cambridge Handbook of Engineering Education Research. New York: Cambridge University Press.

Light, R.J. (2004). Making the Most of College. Boston, MA: Harvard University Press.

Manduca, C.A., Iverson, E.R., Luxenberg, M., Macdonald, R.H., McConnell, D.A., Mogk, D.W., and Tewksbury, B.J. (2017). Improving undergraduate STEM education: The efficacy of discipline-based professional development. Science Advances, 3(2), 1–15.

Mayhew, M.J., Rockenbach, A.N., Bowman, N.A., Seifert, T.A.D., Wolniak, G.C., Pascarella, E.T., and Terenzini, P.T. (2016). How College Affects Students: 21st Century Evidence that Higher Education Works (vol. 3). San Francisco, CA: Jossey-Bass.

Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ: Prentice Hall.

McConnell, D.A., Steer, D.N., Owens, K.D., Knott, J.R., et al. (2006). Using ConcepTests to assess and improve student conceptual understanding in introductory geoscience courses. Journal of Geoscience Education, 54(1), 61–68.

Metzner, B.S. (1989). Perceived quality of academic advising: The effect on freshman attrition. American Educational Research Journal, 26(3), 422–442.

Miller, E., and Trapani, J. (2016). AAU Undergraduate STEM Initiative: Measuring Progress. Presentation to the Committee on Developing Indicators for Undergraduate STEM Education, Washington, DC, April 1. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_183497.pdf [December 2017].

National Academies of Sciences, Engineering, and Medicine. (2016). Barriers and Opportunities for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse Pathways. Washington, DC: The National Academies Press. Available: http://www.nap.edu/catalog/21739/barriers-and-opportunities-for-2-year-and-4-year-stem-degrees [June 2016].

National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington, DC: The National Academies Press.

National Research Council. (2001). Testing Teacher Candidates: The Role of Licensure Tests in Improving Teacher Quality. Washington, DC: National Academy Press.

National Research Council. (2012a). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: The National Academies Press.

National Research Council. (2012b). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century. Washington, DC: The National Academies Press.

National Science Foundation. (2016). FY 2017 Budget Request. Arlington, VA: Author. Available: http://www.nsf.gov/about/budget/fy2017 [June 2016].

National Science and Technology Council. (2013). Federal STEM Education 5-Year Strategic Plan. Available: https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf [March 2017].

Ory, J.C., and Ryan, K. (2001). How do student ratings measure up to a new validity framework? New Directions for Institutional Research, 2001(109), 27–44.

Packard, B.W. (2016). Successful STEM Mentoring Initiatives for Underrepresented Students: A Research-Based Guide for Faculty and Administrators. Sterling, VA: Stylus.

Park, S., Hironaka, S., Carver, P., and Nordstrum, L. (2013). Continuous Improvement in Education. Stanford, CA: Carnegie Foundation for the Advancement of Teaching. Available: https://www.carnegiefoundation.org/wp-content/uploads/2014/09/carnegiefoundation_continuous-improvement_2013.05.pdf [June 2017].

Parker, L.C., Adedokun, O., and Weaver, G.C. (2016). Culture, policy, and resources: Barriers reported by faculty implementing course reform. In G.C. Weaver, W.D. Burgess, A.L. Childress, and L. Slakey (Eds.), Transforming Institutions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue University Press.

Pascarella, E.T., and Terenzini, P.T. (2005). How College Affects Students (vol. 2). K.A. Feldman (ed.). San Francisco, CA: Jossey-Bass.

Pascarella, E.T., Martin, G.L., Hanson, J.M., Trolian, T.L., Gillig, B., and Blaich, C. (2014). Effects of diversity experiences on critical thinking skills over 4 years of college. Journal of College Student Development, 55(1), 86–92. Available: http://aquila.usm.edu/cgi/viewcontent.cgi?article=9211&context=fac_pubs [May 2017].

Peterson, M.W., and Spencer, M.G. (1990). Understanding academic culture and climate. In W.G. Tierney (Ed.), Assessing Academic Cultures and Climates: New Directions for Institutional Research (pp. 3-18). San Francisco, CA: Jossey-Bass.

Pfund, C., Miller, S., Brenner, K., Bruns, P., Chang, A., Ebert-May, D., Fagen, A.P., Gentile, J., Gossens, S., Khan, I.M., Labov, J.B., Pribbenow, C.M., Susman, M., Tong, L., Wright, R., Yuan, R.T., Wood, W.B., and Handelsman, J. (2009). Professional development: Summer institute to improve university science teaching. Science, 324(5926), 470–471. Available: http://science.sciencemag.org/content/324/5926/470.long [May 2017].

Pizzolato, J.E. (2008). Advisor, teacher, partner: Using the learning partnerships model to reshape academic advising. About Campus, 13(1), 18–25.

President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. Available: https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf [July 2017].

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

PULSE Fellows. (2016). The PULSE Vision and Change Rubrics Version 2.0. Available: http://api.ning.com/files/Kfu*MfW7V8MYZfU7LNGdOnG4MnryzUgUpC2IxdtUmucnB4QNCdLaOwWGoMoULSeKw8hF9jiFdh75tlzuv1nqtfCuM11hNPp3/PULSERubricsPacketv2_0_FINALVERSION.pdf [May 2017].

Roediger, H.L., Agarwal, P.K., McDaniel, M.A., and McDermott, K.B. (2011). Test-enhanced learning in the classroom: Long-term improvements from quizzing. Journal of Experimental Psychology: Applied, 17(4), 382–395.

Savage, A.F. (2014). Science literacy: A key to unlocking a fully-engaged citizenry. Diversity and Democracy, 17(3). Available: https://www.aacu.org/diversitydemocracy/2014/summer/savage [March 2017].

Savage, A.F., and Jude, B.A. (2014). Starting small: Using microbiology to foster scientific literacy. Trends in Microbiology. Available: http://dx.doi.org/10.1016/j.tim.2014.04.005.

Saxe, K., and Braddy, L. (2016). A Common Vision for Undergraduate Mathematical Sciences Programs in 2025. Washington, DC: Mathematical Association of America.

Schein, E.H. (2004). Organizational Culture and Leadership (3rd ed.). San Francisco, CA: Jossey-Bass.

Schein, E.H. (2010). Organizational Culture and Leadership (4th ed.). Hoboken, NJ: John Wiley & Sons, Inc.

Schneider, B. (1975). Organizational climates: An essay. Personnel Psychology, 28(4), 447–479.

Schneider, B., and Reichers, A.E. (1983). On the etiology of climates. Personnel Psychology, 36(1), 19–39.

Schneider, B., Ehrhart, M.G., and Macey, W.H. (2013). Organizational climate and culture. Annual Review of Psychology, 64, 361–388.

Seidman, A. (1991). The evaluation of a pre/post admissions/counseling process at a suburban community college: Impact on student satisfaction with the faculty and the institution, retention, and academic performance. College and University, 66(4), 223–232.

Simon, B., and Taylor, J. (2009). What is the value of course-specific learning goals? Journal of College Science Teaching, 39(2), 52–57.

Sipple, S., and Lightner, R. (2013). Developing Faculty Learning Communities at Two-Year Colleges: Collaborative Models to Improve Teaching and Learning. Sterling, VA: Stylus.

Smith, D. (2013). Describing and Measuring Undergraduate Teaching Practices. Washington, DC: American Association for the Advancement of Science. Available: http://ccliconference.org/files/2013/11/Measuring-STEM-Teaching-Practices.pdf [May 2017].

Strayhorn, T.L. (2011). Bridging the pipeline: Increasing underrepresented students’ preparation for college through a summer bridge program. American Behavioral Scientist, 55(2), 142–159. doi: 10.1177/000276421038187.

Tewksbury, B.J., and MacDonald, R.H. (2005). Designing Effective and Innovative Courses. Available: http://wp.stolaf.edu/cila/files/2014/08/Assignment_Part_1.2.pdf [July 2017].

Titus, M.A. (2004). An examination of the influence of institutional context on student persistence at 4-year colleges and universities: A multilevel approach. Research in Higher Education, 45(7), 673–699.

Trujillo, G., Aguinaldo, P., Anderson, C., Busamante, J., Gelsinger, D., Pastor, M., Wright, J., Marquez-Magana, L., and Riggs, B. (2015). Near-peer STEM mentoring offers unexpected benefits for mentors from traditionally underrepresented backgrounds. Perspectives on Undergraduate Research and Mentoring, 4.1. Available: http://blogs.elon.edu/purm/2015/11/11/near-peer-stem-mentoring-offers-unexpected-benefits-for-mentors [July 2016].

Tyler, R.W. (1949). Basic Principles of Curriculum and Instruction. Chicago, IL: University of Chicago Press.

Weaver, G.C., Burgess, W.D., Childress, A.L., and Slakey, L. (2016). Transforming Institutions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue University Press.

Wieman, C., Perkins, K., and Gilbert, S. (2010). Transforming science education at large research universities: A case study in progress. Change: The Magazine of Higher Learning, 42(2), 7–14.

Wiggins, G., and McTighe, J. (2005). Understanding by Design (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Zabaleta, F. (2007). The use and misuse of student evaluations of teaching. Teaching in Higher Education, 12(1), 55–76.
