
3
Surveying Promising Practices

PROMISING PRACTICES FOR FACULTY AND INSTITUTIONS AND PREDICTING SUCCESS IN COLLEGE SCIENCE

Moderator Melvin George (University of Missouri) introduced three panelists to discuss a range of promising practices. Each panelist was asked to address the following questions:

  1. How would you categorize the range of promising practices that have emerged over the past 20 years? Consider practices that are discipline-specific as well as those that are interdisciplinary.

  2. What types of categories do you find are most useful in sorting out the range of efforts that have emerged? Why did you choose to aggregate certain practices within a category?

  3. As you chose exemplars for your categories, what criteria did you use to identify something as a promising practice?

Jeffrey Froyd (Texas A&M University) began by describing a framework that he developed to categorize promising undergraduate teaching practices in science, technology, engineering, and mathematics (STEM).1 The framework begins with a set of decisions that faculty members must make in designing a course:

1 For more detail about this framework, see the workshop paper by Froyd (http://www.nationalacademies.org/bose/Froyd_Promising_Practices_CommissionedPaper.pdf).

  • Expectations decision: How will I articulate and communicate my expectations for student learning?

  • Student organization decision: How will students be organized as they participate in learning activities?

  • Content organization decision: How will I organize the content for my course? What overarching ideas will I use?

  • Feedback decision: How will I provide feedback to my students on their performance and growth?

  • Gathering evidence for grading decision: How will I collect evidence on which I will base the grades I assign?

  • In-classroom learning activities decision: In what learning activities will students engage during class?

  • Out-of-classroom learning activities decision: In what learning activities will students engage outside class?

  • Student-faculty interaction decision: How will I promote student-faculty interaction?

The next component of Froyd’s framework relates to two types of standards against which faculty members are likely to evaluate a promising practice: (1) implementation standards and (2) student performance standards. Implementation standards include the relevance of the practice to the course, resource constraints, faculty comfort level, and the practice’s theoretical foundation. Student performance standards relate to the available evidence on the practice’s effectiveness, which may include comparison studies or implementation studies.

Froyd then identified eight promising practices related to teaching in the STEM disciplines and analyzed each in terms of his implementation and student performance standards (see Table 3-1).

Jeanne Narum (Project Kaleidoscope) identified three characteristics of institutional-level promising practices in STEM, noting that they (1) connect to larger goals for what students should know and be able to do upon graduation, (2) focus on the entire learning experience of the student, and (3) are kaleidoscopic (Narum, 2008). She explained that promising practices can focus on student learning goals at the institutional level, the level of the science discipline, and the societal level. To illustrate these points, Narum described examples of institutional transformation at the University of Maryland’s Baltimore Campus, Drury University, and the University of Arizona. As she explained, each institution set specific learning goals, designed learning experiences based on the goals, and assessed the effectiveness of the learning experiences. Narum also provided examples of other institutions engaged in promising practices related to assessment and pedagogies of engagement. In closing, Narum said that the best institutional practices arise when administrators and faculty share a common vision of how the pieces of the undergraduate learning environment in STEM fit together and a commitment to work together as an institution to realize that vision.

TABLE 3-1 Summary of Promising Practices

Promising Practice | Rating with Respect to Implementation Standards | Rating with Respect to Student Performance Standards
1: Prepare a set of learning outcomes | Strong | Good
2: Organize students in small groups | Strong | Strong
3: Organize students in learning communities | Fair | Fair to good
4: Scenario-based content organization | Good to strong | Good
5: Providing students feedback through systematic formative assessment | Strong | Good
6: Designing in-class activities to actively engage students | Strong | Strong
7: Undergraduate research | Strong or fair | Fair
8: Faculty-initiated approaches to student-faculty interactions | Strong | Fair

NOTE: Strong = easy and appropriate to implement, good = slightly less so, and fair = even less so.

SOURCE: Froyd (2008). Reprinted with permission.


Philip Sadler (Harvard University) focused on lessons from precollege science education. He described a large-scale survey that he and his colleagues conducted of students in introductory biology, chemistry, and physics courses at 57 randomly chosen postsecondary institutions. The study focused on aspects of high school STEM education (e.g., advanced placement courses, the sequencing of high school science courses) that predict students’ success or failure in their college science courses. Sadler reported that 10 percent of students in introductory science courses had previously taken an advanced placement (AP) course in the same subject in high school, and those students performed only slightly better in their introductory college courses than non-AP students. Moreover, AP students who took introductory (101-level) courses did better in subsequent 102-level courses than AP students who began with 102-level courses. These findings led Sadler to recommend against AP courses for most high school students.

Next, Sadler discussed the effect of high school science course-taking on students’ performance in introductory college science courses. Overall, students who took more mathematics in high school performed better in all of their science courses than students who took fewer mathematics courses. Moreover, students who took multiple high school courses in a given science discipline performed better in college science courses in that discipline.


However, Sadler and his colleagues found no cross-disciplinary effects: students who took multiple high school chemistry courses did not perform significantly better in college biology, students who took multiple high school physics courses did not perform better in college chemistry, and so on. Sadler also reported that the use of technology in high school science classes did not predict success in college science, whereas experience in solving quantitative problems, analyzing data, and making graphs in high school did seem to predict success in college science courses.

SMALL-GROUP DISCUSSIONS AND FINAL THOUGHTS

In small groups, participants identified what they considered to be the most important promising practices in undergraduate STEM education. The following list emerged from the small-group reports:

  1. Teaching epistemology explicitly and coherently.

  2. Using formative assessment techniques and feedback loops to change practice.

  3. Providing professional development in pedagogy, particularly for graduate students.

  4. Allowing students to “do” science, such as learning in labs and problem solving.

  5. Providing structured group learning experiences.

  6. Ensuring that institutions are focused on learning outcomes.

  7. Mapping course sequences to create a coherent learning experience for students.

  8. Promoting active, engaged learning.

  9. Developing learning objectives and aligning assessments with those objectives.

  10. Encouraging metacognition.

  11. Providing undergraduate research experiences.

To close the workshop, steering committee members reflected on the main themes that were covered throughout the day. Susan Singer focused on the question of evidence and observed that the workshop addressed multiple levels of evidence. Explaining that assessment and evidence are not synonymous, she pointed out that classroom assessment to inform teaching generates one type of evidence that workshop participants discussed. Another type of evidence is affective change, and she observed that some people gather evidence to convince their colleagues to change their practice. Singer said the workshop clearly showed that scholars in some disciplines have given careful thought to the meaning of evidence and have begun to gather it to build a general knowledge base.


Melvin George began his reflections by asking, “Why do we need any evidence at all?” He noted that one reason for gathering evidence is to discover what works in science education, but he said that evidence alone does not cause faculty members to change their behavior. Suggesting that the problem might lie with ineffectual theories of change rather than a lack of evidence, George proposed that it might be more productive to direct more attention and resources to making change happen.

David Mogk (Montana State University) observed that the participants discussed a continuum of promising practices ranging from individual classroom activities to courses to curricula to departments to institutional transformation. Discussing the day’s themes, Mogk described a desire to identify promising practices that promote mastery of content and skills while addressing barriers to learning, and he recalled discussions about the difficulty of articulating and assessing some of those skills. He identified the use of technology as a promising practice that cuts across disciplines and suggested a need to examine the cognitive underpinnings of how people learn in each domain. Mogk called for better alignment of learning goals, teaching and learning activities, and assessment tools.

William Wood reflected on the issue of domain-specific versus generic best practices. He noted that many of the practices discussed during the workshop seem universally applicable across disciplines and even across different levels, such as the classroom, the department, and the institution as a whole. He also suggested that university faculty might apply some of these principles when encouraging their colleagues to transform their teaching practice. Rather than transmitting the evidence in a didactic manner and expecting colleagues to change, Wood proposed taking a more constructivist approach to build colleagues’ understanding of promising practices.

Kenneth Heller remarked on the different grain sizes of the promising practices that the participants discussed. He noted that the different goals and different kinds of evidence associated with each grain size present a challenge to generating useful evidence about promising practices. He agreed with previous speakers that evidence is important but not sufficient to drive change. Heller concluded with a cautionary quotation from Voltaire about the risk of gathering more evidence instead of putting existing research into practice: “The best is the enemy of the good.”


