3
Surveying Promising Practices

PROMISING PRACTICES FOR FACULTY AND INSTITUTIONS AND PREDICTING SUCCESS IN COLLEGE SCIENCE

Moderator Melvin George (University of Missouri) introduced three panelists to discuss a range of promising practices. Each panelist was asked to address the following questions:

  1. How would you categorize the range of promising practices that have emerged over the past 20 years? Consider practices that are discipline-specific as well as those that are interdisciplinary.

  2. What types of categories do you find are most useful in sorting out the range of efforts that have emerged? Why did you choose to aggregate certain practices within a category?

  3. As you chose exemplars for your categories, what criteria did you use to identify something as a promising practice?

Jeffrey Froyd (Texas A&M University) began by describing a framework that he developed to categorize promising undergraduate teaching practices in science, technology, engineering, and mathematics (STEM).1 The framework begins with a set of decisions that faculty members must make in designing a course:

  • Expectations decision: How will I articulate and communicate my expectations for student learning?

  • Student organization decision: How will students be organized as they participate in learning activities?

  • Content organization decision: How will I organize the content for my course? What overarching ideas will I use?

  • Feedback decision: How will I provide feedback to my students on their performance and growth?

  • Gathering evidence for grading decision: How will I collect evidence on which I will base the grades I assign?

  • In-classroom learning activities decision: In what learning activities will students engage during class?

  • Out-of-classroom learning activities decision: In what learning activities will students engage outside class?

  • Student-faculty interaction decision: How will I promote student-faculty interaction?

1 For more detail about this framework, see the workshop paper by Froyd (see http://www.nationalacademies.org/bose/Froyd_Promising_Practices_CommissionedPaper.pdf).



The next component of Froyd's framework relates to two types of standards against which faculty members are likely to evaluate a promising practice: (1) implementation standards and (2) student performance standards. Implementation standards include the relevance of the promising practice to the course, resource constraints, faculty comfort level, and the theoretical foundation for the promising practice. Student performance standards relate to the available evidence on the effectiveness of the promising practice, which may include comparison studies or implementation studies.

Froyd then identified eight promising practices related to teaching in the STEM disciplines and analyzed each in terms of his implementation and student performance standards (see Table 3-1).

TABLE 3-1 Summary of Promising Practices

Promising Practices | Rating with Respect to Implementation Standards | Rating with Respect to Student Performance Standards
1: Prepare a set of learning outcomes | Strong | Good
2: Organize students in small groups | Strong | Strong
3: Organize students in learning communities | Fair | Fair to good
4: Scenario-based content organization | Good to strong | Good
5: Providing students feedback through systematic formative assessment | Strong | Good
6: Designing in-class activities to actively engage students | Strong | Strong
7: Undergraduate research | Strong or fair | Fair
8: Faculty-initiated approaches to student-faculty interactions | Strong | Fair

NOTE: Strong = easy and appropriate to implement, good = slightly less so, and fair = even less so.
SOURCE: Froyd (2008). Reprinted with permission.

Jeanne Narum (Project Kaleidoscope) identified three characteristics of institutional-level promising practices in STEM, noting that they (1) connect to larger goals for what students should know and be able to do upon graduation, (2) focus on the entire learning experience of the student, and (3) are kaleidoscopic (Narum, 2008). She explained that promising practices can focus on student learning goals at the institutional level, the level of the science discipline, and the societal level. To illustrate these points, Narum described examples of institutional transformation at the University of Maryland, Baltimore County; Drury University; and the University of Arizona. As she explained, each institution set specific learning goals, designed learning experiences based on the goals, and assessed the effectiveness of the learning experiences. Narum also provided examples of other institutions engaged in promising practices related to assessment and pedagogies of engagement. In closing, Narum said that the best institutional practices arise when administrators and faculty share a common vision of how the pieces of the undergraduate learning environment in STEM fit together and a commitment to work together as an institution to realize that vision.

Philip Sadler (Harvard University) focused on lessons from precollege science education. He described a large-scale survey that he and his colleagues conducted of students in introductory biology, chemistry, and physics courses at 57 randomly chosen postsecondary institutions. The focus of the study was on certain aspects of high school STEM education (e.g., advanced placement courses, the sequencing of high school science courses) that predict students' success or failure in their college science courses. Sadler reported that 10 percent of students in introductory science courses had previously taken an advanced placement (AP) course in the same subject in high school, and those students performed only slightly better in their introductory college courses than non-AP students. Moreover, AP students who took introductory (101-level) courses did better in 102-level courses than AP students who began with 102-level courses. These findings led Sadler to recommend against AP courses for most high school students.
Next, Sadler discussed the effect of high school science course-taking on students' performance in introductory college science courses. Overall, students who took more mathematics in high school performed better in all of their science courses than students who took fewer mathematics courses. Moreover, students who took multiple high school courses in a given science discipline performed better in college science courses in that discipline. However, Sadler and his colleagues found no cross-disciplinary effects, meaning that students who took multiple chemistry courses did not perform significantly better in college biology; students who took multiple high school physics courses did not perform better in college chemistry; and so on. Sadler also reported that the use of technology in high school science classes did not predict success in college science; however, experience in solving quantitative problems, analyzing data, and making graphs in high school did seem to predict success in college science courses.

SMALL-GROUP DISCUSSIONS AND FINAL THOUGHTS

In small groups, participants identified what they considered to be the most important promising practices in undergraduate STEM education. The following list emerged from the small-group reports:

  1. Teaching epistemology explicitly and coherently.

  2. Using formative assessment techniques and feedback loops to change practice.

  3. Providing professional development in pedagogy, particularly for graduate students.

  4. Allowing students to "do" science, such as learning in labs and problem solving.

  5. Providing structured group learning experiences.

  6. Ensuring that institutions are focused on learning outcomes.

  7. Mapping course sequences to create a coherent learning experience for students.

  8. Promoting active, engaged learning.

  9. Developing learning objectives and aligning assessments with those objectives.

  10. Encouraging metacognition.

  11. Providing undergraduate research experiences.

To close the workshop, steering committee members reflected on the main themes that were covered throughout the day. Susan Singer focused on the question of evidence and observed that the workshop addressed multiple levels of evidence. Explaining that assessment and evidence are not synonymous, she pointed out that classroom assessment to inform teaching generates one type of evidence that workshop participants discussed. Another type of evidence is affective change, and she observed that some people gather evidence to convince their colleagues to change their practice. Singer said the workshop clearly showed that scholars in some disciplines have given careful thought to the meaning of evidence and have begun to gather it to build a general knowledge base.
Melvin George began his reflections by asking, "Why do we need any evidence at all?" He noted that one reason for gathering evidence is to discover what works in science education, but he said that evidence alone does not cause faculty members to change their behavior. Suggesting that the problem might lie with ineffectual theories of change rather than a lack of evidence, George proposed that it might be more productive to direct more attention and resources to making change happen.

David Mogk (Montana State University) observed that the participants discussed a continuum of promising practices ranging from individual classroom activities to courses to curricula to departments to institutional transformation. Discussing the day's themes, Mogk described a desire to identify promising practices that promote mastery of content and skills while addressing barriers to learning, and he recalled discussions about the difficulty of articulating and assessing some of those skills. He identified the use of technology as a promising practice that cuts across disciplines and suggested a need to examine the cognitive underpinnings of how people learn in each domain. Mogk called for better alignment of learning goals, teaching and learning activities, and assessment tools.

William Wood reflected on the issue of domain-specific versus generic best practices. He noted that many of the practices discussed during the workshop seem universally applicable across disciplines and even across different levels, such as the classroom, department, and institution as a whole. He also suggested that university faculty might apply some of these principles when encouraging their colleagues to transform their teaching practice. Rather than transmitting the evidence in a didactic manner and expecting colleagues to change, Wood proposed taking a more constructivist approach to build their understanding of promising practices.

Kenneth Heller remarked on the different grain sizes of the promising practices that the participants discussed. He noted that the different goals and different kinds of evidence associated with each grain size present a challenge to generating useful evidence about promising practices. He agreed with previous speakers that evidence is important but not sufficient to drive change. Heller concluded by quoting Voltaire as a cautionary message about gathering more evidence instead of putting existing research into practice: "The best is the enemy of the good."