
6
Structuring the Learning Environment

This session of the workshop focused on the role of learning environments in supporting science, technology, engineering, and mathematics (STEM) learning. Speakers presented different approaches to addressing the challenges that large introductory courses can pose to students’ academic success. These approaches include a variety of strategies to make large classes more interactive, as well as programs to engage undergraduate students in research experiences.

STUDIO COURSES

Karen Cummings discussed a studio physics course at Rensselaer Polytechnic Institute (RPI). Like studio art, studio physics involves learning by doing: studio instruction is a whole-course modification built around collaborative, hands-on learning in specially designed classrooms. The focus on hands-on activities requires longer class periods than are typical in introductory physics courses; at RPI, the studio courses meet twice a week for 2 hours per session. Instructors in studio physics courses use technology in various ways to maximize instructional time and improve learning outcomes.

Studio physics has its origins in the work of the University of Washington’s Physics Education Research Group (see Chapter 5), which gave rise to the development of Workshop Physics at Dickinson College, a calculus-based physics course with a published curriculum (Jackson, Laws, and Franklin, 2003) that is taught without lectures. According to Cummings, studio physics is a more efficient model of Workshop Physics, and it differs from Workshop Physics because it is not a curriculum. Instead, it is a pedagogical approach and a classroom structure.

The first studio physics course was established at RPI in 1993. By 2008, all introductory physics courses at RPI were studio courses. Cummings said that there are 15 to 20 sections of studio physics at RPI every semester, and each section contains approximately 50 students.

In one evaluation, Cummings compared a traditional lecture course with two forms of the studio course, one of which incorporated interactive lecture demonstrations and cooperative problem solving that had been shown to be effective in previous research. Studying 10 sections of approximately 50 students each, she used student surveys, students’ formal course evaluations, and validated instruments to measure conceptual learning outcomes and attitudinal outcomes. The studio students were divided into two groups: standard studio and “studio plus” (the studio format that incorporated the lecture demonstrations and cooperative problem solving). Both groups did the same homework, saw the same lectures, took the same exams, and met in the same classrooms. The only difference was that studio plus incorporated research-based curricular materials.

The standard studio course was more efficient than the traditional lecture course because lecture and laboratory time was combined, but it was no more effective in terms of learning outcomes (Cummings, 2008). When instructors incorporated research-based curricular materials, however, students at all levels made significant gains on the Force Concept Inventory and its associated attitudinal survey. In Cummings’s view, these data suggest that the studio format alone is not sufficient to improve students’ conceptual understanding.

Cummings also described an introductory biology course at RPI that blends a studio-style course with a web-based learning activity that students can pursue outside the time and space constraints of the classroom (asynchronous learning). To evaluate this course, McDaniel and colleagues (2007) administered a survey that assessed knowledge of biological concepts to students in a standard lecture course and the studio course with the asynchronous component. They measured normalized gains, or the ratio of how much students learned compared with how much room they had to learn based on their pretest scores.1 Students in the studio course performed significantly better in ecology and evolution than students in the traditional biology lecture course.

Studio courses are expensive to implement. As a result, instructors at many institutions are implementing less expensive hybrid models.

1 As defined by Hake (1998), normalized gain = (posttest – pretest)/(100 – pretest). For example, students who score 80 on the pretest and 90 on the posttest gain only 10 percentage points, but those 10 percentage points represent half of what they did not know.
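To make the footnote’s arithmetic concrete, here is a minimal sketch of the calculation in Python (the function name is illustrative, and the 0-100 score scale simply follows the formula above):

```python
def normalized_gain(pretest: float, posttest: float) -> float:
    """Hake (1998) normalized gain: the fraction of the available
    improvement (100 - pretest) that students actually achieved.
    Assumes scores are reported on a 0-100 scale, as in the formula above."""
    return (posttest - pretest) / (100.0 - pretest)

# The footnote's example: 80 on the pretest, 90 on the posttest.
# The 10-point raw gain is half of the 20 points the students
# did not already know, so the normalized gain is 0.5.
print(normalized_gain(80, 90))  # 0.5
```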


With these models, instructors increase interactivity during lectures and create tight links between lecture materials and laboratory activities without modifying the classroom space or schedule. Based on her research, Cummings said that if instructors use physics education research–based materials in these hybrid models, students’ conceptual understanding can improve significantly.

Although hybrid models can yield appreciable gains in conceptual understanding, the appeal of the studio model lies in its ability to promote other skills. With studio models, Cummings said, students are more responsible for their own learning and develop lifelong learning skills. For example, they are required to communicate about scientific content and about their intentions in applying the scientific method. They also must work efficiently in groups that they did not select, which mirrors many work environments. For Cummings, these potential gains raise the question: “What can the studio environment be proven to do that less expensive models and implementations cannot?”

REDESIGNING LARGE CLASSES FOR LEARNING

Project SCALE-UP

Robert Beichner (North Carolina State University) discussed the SCALE-UP (Student-Centered Active Learning Environment for Undergraduate Programs) project, which aims to restructure classes with large enrollments following the studio model (see http://www.ncsu.edu/per/scaleup.html). More than 50 colleges and universities have adapted the SCALE-UP approach in physics, chemistry, mathematics, engineering, and literature courses. Although the implementation of SCALE-UP varies by institution, its central feature is a redesigned learning environment that facilitates collaborative, hands-on learning and interaction among students and instructors. SCALE-UP classrooms typically have round tables with an instructor station in the middle of the room, and some contain whiteboards, public thinking spaces, and storage facilities that make equipment accessible to students. Students in SCALE-UP courses are formally assigned to mixed-ability groups that sit at the round tables, and each table has several networked laptops.

Similar to the studio approach described in the previous section, a typical SCALE-UP class meets for five or six hours a week, combining lecture time and lab time. Classes often begin with a short lecture to set the stage for the day’s activities and relate them to the previous class. Students spend the remainder of the time in activities called “tangibles,” which are hands-on observations or measurements; “ponderables,” which are complex, real-world questions; and simulations. Classes typically end with a whole-group follow-up discussion and a brief summary lecture.

Suggested Citation:"6 Structuring the Learning Environment." National Research Council. 2011. Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics Education: Summary of Two Workshops. Washington, DC: The National Academies Press. doi: 10.17226/13099.
×

Data from several institutions show that across all performance levels (top, middle, and bottom of the class), students in SCALE-UP (studio) physics courses made greater normalized gains in their conceptual knowledge than students in lecture courses. Students in the top third of the class made greater normalized gains than students in the middle or bottom third. These gains were particularly pronounced at the Massachusetts Institute of Technology, where students learn the content by teaching each other. Similarly, in a study on physics problem solving at North Carolina State University, students in the SCALE-UP course outperformed their peers on eight of the nine exam questions; the SCALE-UP course had not yet covered the content of the ninth question (Beichner, 2008a).

Researchers at the institutions that are implementing SCALE-UP have also studied the program’s effect on other outcomes. For example:

  • Attendance at North Carolina State University is not required, yet attendance in SCALE-UP courses there averages 93 percent (Beichner et al., 2007).

  • At Florida International University, the drop/failure/withdrawal (DFW) rate for studio-based courses is one-fourth the rate for traditional courses. Enrollment requests for those courses exceed capacity by roughly four times, and faculty and student evaluations of the courses are overwhelmingly positive. After taking the course, 10-20 percent of the students pursue physics majors or minors (Kramer, Brewe, and O’Brien, 2008).

  • Clemson University uses a SCALE-UP model for all introductory mathematics courses. DFW rates in those courses have dropped from 44 to 22 percent (Biggers et al., 2007).

  • A 5-year study with 16,000 students at North Carolina State University showed that failure rates are significantly lower for students in SCALE-UP courses than for students in traditional courses, even though course requirements for SCALE-UP are more rigorous (Beichner, 2008a).

  • Female students in SCALE-UP courses at Pennsylvania State University, Erie, began the semester with significantly lower pretest scores in a variety of mathematics and science areas. By the end of the semester, their grades were the same as those of male students (Beichner, 2008b).

  • At North Carolina State University, students with SAT mathematics scores of less than 500 fail an advanced engineering course 17 percent of the time if they take an introductory SCALE-UP physics course as the prerequisite. If their introductory course is lecture-based, they fail the later course 31 percent of the time (Beichner, 2008b).


Beichner summarized some of the objectives that he and his colleagues have measured for SCALE-UP physics and the methods they have used to assess those objectives (see Table 6-1).

Based on his research and experiences with SCALE-UP, Beichner identified three issues that warrant further study. First, he explained that in many team settings the input of underrepresented groups is devalued. This phenomenon does not occur in SCALE-UP courses, and it is important to understand why. Second, he called for research on the factors that influence the adoption of these reforms, similar to the work of Henderson and Dancy (2007, 2008a; see Chapter 8, this volume). Finally, he said that a large-scale, international study of SCALE-UP implementation would shed light on how it varies from generation to generation and the effect of different implementations on learning outcomes and affective outcomes.

Online Problem-Based Learning Case Discussions

Marcy Osgood (University of New Mexico) presented the work that she and her colleagues have done to redesign large, introductory biochemistry courses. She explained that the University of New Mexico is a large university with a racially and ethnically diverse student body. Many students come from rural high schools, and many are older students returning to college after serving in the military.

Osgood teaches in the Department of Biochemistry in the School of Medicine. Because her department administers an undergraduate major in the School of Arts and Sciences, she and her colleagues frequently work with undergraduate students. Drawing on their experiences at the medical school, in the early 2000s they transformed a large undergraduate biochemistry class into a hybrid class with lectures and small-group, problem- and case-based tutorials.

TABLE 6-1 Summary of Objectives and Assessment Methods for SCALE-UP Physics

Objective | Assessment Method
Conceptual Understanding | Pre-posttests, interviews, portfolios
Problem Solving | Comparison tests, interviews, portfolios
Laboratory | Practical testing, portfolios
Technology | In-class observations, practical testing, portfolios
Communication | In-class observations, video recording, interviews
Attitudes | Maryland Physics Expectations Survey, interviews, in-class observations
Positive Learning Experience | Course evaluations, interviews, focus groups

SOURCE: Beichner (2008b). Reprinted with permission.


They adopted this approach because they knew that problem-based cases effectively engage students in the content, and they believed that an interactive approach would provide more opportunities for diverse students to excel.

After two years of implementing problem-based learning in large biochemistry classes, Osgood and her colleagues found that students in those classes performed better on content-based exams than students in traditional courses. However, the approach was time- and space-intensive, so they shifted to an online format.

In the online format, groups of approximately six to eight students use the scientific method to solve vague problems posed by the instructors. Through iterative postings on the course project’s website over a period of weeks, the groups develop hypotheses about what the problem means, develop an approach to solving the problem, and design experiments to investigate the problem. Instructors provide students with data based on their experimental design. Students integrate their data analysis with the course content, reflect on what they have learned, and identify how they might further address the problem.

With the online approach, Osgood can proctor 10 small groups at once, as opposed to proctoring one face-to-face group at a time. She has developed a rubric to grade student postings for content and to evaluate group dynamics and progress; the rubric allows her to grade 10 groups in approximately one hour per day. She converts the rubrics to bar graphs that illustrate the groups’ progress (see Figure 6-1). The first group in Figure 6-1 was successful: the graph shows a steady increase in the content of the students’ postings, and the group made nearly 80 postings in a two-week period. In the end, that group solved the case. The second group was slower to start and eventually failed the assignment.

In addition to evaluating the whole group, Osgood and her colleagues use the rubrics to analyze individual student contributions. On the basis of these analyses, they have developed a typology of students. As Osgood explained, one category is the “serial” or “shotgun investigator.” These students conduct all possible tests without checking the results, considering cost-benefit analysis, or asking their colleagues what might be happening. “Summarizers” constitute a second category. As the name suggests, these students summarize the results of their colleagues’ experiments and identify the next steps without conducting any experiments of their own. The third category is “the lonely scientist.” Students in this category conduct all of the steps themselves and typically are the only ones posting to their groups. A final category is the “beginning expert,” who understands the concepts, integrates the methods and content appropriately, and brings the rest of the class along with him or her in the understanding of the problem.

FIGURE 6-1 Graph for assessing the progress and dynamics of small, online groups in a large biochemistry course.

SOURCE: Osgood, Mitchell, and Anderson (2008). Reprinted with permission.

Osgood believes that understanding the relationship between students’ practice in groups and their practice as individuals will help instructors to offer assistance that targets students’ specific needs.

Active Learning Strategies for Introductory Geology Courses

David McConnell (North Carolina State University) discussed his efforts to redesign introductory geology courses at the University of Akron. With colleagues in cognitive psychology and science education, McConnell sought to

  • determine if students are prepared to use higher order thinking skills;

  • teach an introductory course for nonmajors in which students improve their higher order thinking skills and conceptual understanding; and

  • identify strategies that others can use to assess ongoing student learning.

To address the first goal, McConnell and his colleagues assessed 741 introductory geology students using the Group Assessment of Logical Thinking (GALT) (Roadrangka, Yeany, and Padilla, 1982, 1983), a 12-item test in which students answer questions and explain why they answered the way they did. Success on the GALT requires competence in proportional reasoning, controlling variables, combinational reasoning, probabilistic reasoning, and correlational reasoning. On the basis of their results, students are placed on a continuum of ability to think abstractly. Concrete thinkers who prefer a fact-based approach and rely on memorization are at the low end of the continuum, and abstract thinkers who can understand previously unseen ideas are at the high end. Transitional thinkers, who prefer to apply ideas in a practical way, fall in the middle of the continuum.

A total of 43 percent of the University of Akron students were classified as capable of broad, abstract thought in physical science, based on their GALT scores (McConnell, 2008). The remaining 57 percent required support to grasp abstract concepts. As a result, McConnell and his colleagues sought to design learning environments that would foster the ability of all students to grasp abstract information, including the concrete and transitional thinkers who required additional support in this area. Drawing on similar work at other institutions, they divided lectures into small segments, assigned students to work together in groups, and used formative assessments during class to determine student understanding and progress. Because McConnell and his colleagues implemented these changes in the context of aging lecture halls in which the seats are bolted down and closely spaced, he believes that this implementation is one of the least expensive variations of redesigning a large learning environment.

At several points during each lecture, instructors gave students a variety of opportunities for collaborative learning. According to McConnell, these exercises targeted different levels of Bloom’s taxonomy. For example, the tasks required students to confront their preconceptions, allowed them to reflect on their understanding of key concepts, linked information to previous knowledge, and asked questions requiring the use of a range of thinking skills. Other course activities ranged from assigned reading and homework, to concept tests (asking and answering questions among peers), to graphical work products (concept maps, Venn diagrams) that demonstrated analysis and conceptual understanding.

Discussing the results of in-class assessments, McConnell explained that after three days of lecturing, fewer than half of students responded correctly to a question about the number of tectonic plates.


After discussing the topic in groups, 75 percent of students answered the question correctly on a retest. When instructors used rudimentary models rather than standard lecture to introduce plate tectonics, 56 percent of students answered the question correctly the first time, and 84 percent answered correctly after discussion in groups.

McConnell also shared results from a study he and others conducted in his classes about the use of models to explain the seasons (McConnell et al., 2005). Students in two control classes learned about the seasons through standard lecture with some demonstration. Students in six experimental classes received rudimentary models—a foam ball on a skewer with a small flashlight—and instructions about how to model different scenarios related to the seasons. Students in the experimental classes had favorable views about using the models and showed greater gains in their conceptual understanding of the seasons than students in the control classes. In addition, students in the experimental classes made greater gains in their logical thinking skills as measured by the GALT (McConnell et al., 2005).

DOING SCIENCE: PROVIDING RESEARCH EXPERIENCES

Another way to address the challenges that large introductory classes can pose to academic success is to engage students in research. Research experiences allow students to work directly with, and learn from, individual science faculty. Noting that the best way to learn science is by doing science, committee member David Mogk introduced speakers to discuss two programs that provide research experiences for undergraduate students.

University of Michigan Undergraduate Research Opportunity Program

Sandra Gregerman (University of Michigan) discussed the Undergraduate Research Opportunity Program (UROP), which was launched in 1988 to increase the retention and academic success of underrepresented minority students at the University of Michigan. In this year-long program, first- and second-year students spend 6-12 hours per week conducting research on ongoing faculty projects in the sciences and other disciplines. The program contains academic and social support components, including peer advising, skill-building workshops, and research peer groups in which students discuss a variety of research-related issues. Each year, the program culminates in a symposium; in 2008, 750 students presented their research in poster form and 20 students delivered oral presentations on their research (Gregerman, 2008).

Gregerman and her colleagues have conducted many studies of the program over the years. Results of one longitudinal study with an experimental design2 show that participating in UROP increases retention rates for some students. For example, 75 percent of African American men who participate complete their degrees, compared with 56 percent who do not participate (Gregerman, 2008). To better understand these results, evaluators conducted interviews and focus groups with students in the experimental and control groups. In those interviews, UROP students were more likely than students in the control group to mention that faculty members and graduate students cared about their success and to discuss the possibility of graduate school. They also were more likely than students in the control group to report going to faculty members’ office hours and seeking help from someone in their network instead of the library. A survey of alumni revealed that UROP participants also were significantly more likely to attend graduate or professional school (82 versus 56 percent of nonparticipants).

Center for Authentic Science Practice in Education

Gabriela Weaver (Purdue University) and Donald Wink (University of Illinois, Chicago) discussed their work with the Center for Authentic Science Practice in Education (CASPiE), a multi-institutional partnership to increase student retention in the sciences through authentic research experiences. The partner institutions include a wide range of 2- and 4-year colleges and universities (see http://www.purdue.edu/discoverypark/caspie/partners.html). These partners have developed a model in which first- and second-year science students participate in faculty research projects as part of their regular coursework. Undergraduate research experiences through CASPiE include skill-building workshops, access to sophisticated research equipment, guidance and mentoring from faculty, and opportunities for peer networking and support.

Evaluation results indicate that CASPiE participants learn chemistry as well as nonparticipants and are more likely to perceive their labs as authentic and relevant to the future (Wink and Weaver, 2008). Evaluation data also suggest that CASPiE students increase their ability to communicate the meaning of their work, despite the absence of prescribed steps in their lab manuals.3

2 In this study, researchers matched program applicants on the basis of demographic and academic characteristics and randomly accepted every other applicant. Students who were accepted to the program constituted the experimental group and those who were not selected represented the control group (Gregerman, 2008).
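As an illustration only, the alternating-acceptance design described in this footnote might be sketched as follows; the function, the matching key, and the shuffling step are assumptions made for the sketch, not details reported by the study:

```python
import random

def assign_groups(applicants, match_key):
    """Illustrative sketch of the evaluation design in footnote 2:
    applicants are grouped by matching characteristics, ordered at
    random within each group, and every other applicant is accepted
    into UROP (experimental group); the rest form the control group."""
    experimental, control = [], []
    matched = {}
    for applicant in applicants:
        matched.setdefault(match_key(applicant), []).append(applicant)
    for members in matched.values():
        random.shuffle(members)  # random order within each matched group
        for i, applicant in enumerate(members):
            (experimental if i % 2 == 0 else control).append(applicant)
    return experimental, control
```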

3 For more detail about the evaluation methods and results, see the workshop paper by Wink and Weaver (http://www7.nationalacademies.org/bose/Wink_Weaver_CommissionedPaper.pdf).


SMALL-GROUP DISCUSSIONS

In small groups, participants discussed the day’s presentations. The following points emerged during the summaries of those discussions:

  • Systemic reform is difficult and takes time. The research base is more developed than it was 10 years ago, but practice has not changed on a broad scale. Gaps in the evidence still exist, and evidence alone is not sufficient to drive change.

  • The evidence suggests that teaching methods matter and that some instructional strategies are more effective than others. For example, active, cooperative learning seems to work in different contexts.

  • The research does not fully illustrate why certain practices work, for which students, and in which contexts. Additional gaps include research on the affective domain, instructor effects (implementation of the promising practice, relationship with students, and belief in students’ abilities), the effect of culture, students’ social construction of knowledge, the expert-novice continuum, departmental and institutional change, and cost-benefit analyses.

  • Dissemination of promising practices could be more effective. The disparate pieces have not been pulled together into a coherent whole.

  • The learning goals of a particular promising practice should determine what evidence and methods are needed to assess its effectiveness.

  • Different stakeholders—students, faculty, administrators, industry—have different standards of evidence and different metrics for success.
