Workshop on Developing Assessments to Meet the Goals of the 2012 Framework for K-12 Science Education
September 13, 2012
National Academy of Sciences Building, 2101 Constitution Ave., NW
Auditorium
Washington, DC
AGENDA
8:30 | Registration, check-in for workshop
9:00-9:15 | Welcome, Introductions, Overview of the Agenda
(9:00) Stuart Elliott, Director, Board on Testing and Assessment
(9:05) Martin Storksdieck, Director, Board on Science Education
(9:10) David Heil, Collaborative Mentor, CCSSO’s State Collaborative on Assessment and Student Standards (SCASS) in Science
Part I: Problem Statement: Laying Out the Problem/Challenges
This session will review what the Framework calls for and discuss the challenges it poses for assessment.
Moderator: Mark Wilson, University of California at Berkeley, Committee Cochair
9:15-10:15 | What is the vision of learning and instruction laid out in the Framework? What are the implications for assessment?
(9:15) Helen Quinn, Stanford University, Committee Member
(9:35) Jim Pellegrino, University of Illinois at Chicago, Committee Cochair
Reactions and Questions
(9:55) James Woodland, Nebraska Department of Education
(10:00) Robin Anglin, West Virginia Department of Education
(10:05) Audience Q and A
10:15-10:30 | Break
Part II: Exploring Alternatives: Strategies for Assessing Learning as Envisioned in the Framework
Assessing the proficiencies depicted in the Framework will require changes to the status quo. Innovative assessment formats and technology enhancements may offer the means for assessing some of the skills and performances on large-scale, external tests. Some of the skills and performances may not be well suited to large-scale, external testing formats, but other ways of measuring them may produce results that can be used in new ways. This session will focus in detail on some of the alternatives.
10:30-12:00 | Large-Scale Assessments
In this session a series of panelists will discuss examples of large-scale assessments that assess science practices in conjunction with core ideas and crosscutting concepts, similar to those depicted in the Framework. Focus will be on how these strategies can be used to measure learning as envisioned in the Framework.
Moderators:
Catherine Welch, University of Iowa, Committee Member
Kathleen Scalise, University of Oregon, Committee Member
Presenters will address the following questions:
1. How are content knowledge, crosscutting concepts, and science practices assessed in the program? If possible, please provide one or more sample tasks and discuss the content and practices that are assessed.
2. How is the assessment administered? How long does it take and what materials and/or technologies are needed?
3. How are the tasks scored and how are scores reported? Are scores reported separately for content knowledge, crosscutting concepts, and practices or is a composite score created?
4. What steps, if any, are taken to ensure that scores are comparable from one administration to the next?
5. What was involved in developing the assessment tasks/items? What challenges were encountered and how were they handled? Please discuss any practical, cost, or feasibility issues that arose and how they were addressed.
(10:30) NAEP 2009 Science Assessment: Hands-On and Interactive Computer Tasks
Alan Friedman, National Assessment Governing Board
Peggy Carr, National Center for Education Statistics
(10:50) College Board’s Advanced Placement Tests in Biology
Rosemary Reshetar, College Board
(11:10) SimScientists
Edys Quellmalz, WestEd
Reactions and Questions
(11:30) Moderators’ follow-up questions to panelists
(11:40) Yvette McCulley, Iowa Department of Education
(11:50) Audience Q and A
12:00-12:45 | Lunch in Great Hall
12:45-2:30 | Assessments Embedded in Curricular Units
The Framework calls for an approach to instruction and assessment that uses learning progressions and associated curricular units. What assessment strategies can be used to measure students’ achievement in relation to a learning progression? What types of activities/tasks allow us to make inferences about where a student is on the progression? This session will feature examples of work to develop assessments of learning progressions in conjunction with curricular units.
Moderator: Mark Wilson | |
(12:45) Introductory Remarks by the Moderator | |
Assessing Science Knowledge That Inextricably Links Core Disciplinary Ideas and Practices
(1:00) Joe Krajcik, Michigan State University
(1:15) Nancy Butler Songer, University of Michigan, Committee Member
(1:30) Brian Reiser, Northwestern University, Committee Member
(1:45) Rich Lehrer, Vanderbilt University, Committee Member
Reactions and Questions
(2:00) Roberta Tanner, Loveland High School, Committee Member
(2:10) Beverly Vance, North Carolina Department of Public Instruction
(2:20) Audience Q and A
2:30-3:15 | Measurement Challenges
This session will consider the featured sample assessments—both large-scale and curriculum-embedded—and discuss the measurement challenges associated with these approaches. The session will focus on issues such as: (1) to what extent do these approaches offer viable alternatives for assessing science learning consistent with the Framework? (2) to what extent are these approaches likely to yield scores that support the desired inferences and policy purposes? and (3) what practical, technical, and psychometric challenges might arise with these approaches?
Moderator: Mark Wilson | |
(2:30) Ed Haertel, Stanford University, Committee Member
Reactions and Questions
(2:50) Anita Bernhardt, Maine Department of Education
(2:57) Jeff Greig, Connecticut State Department of Education
(3:05) Audience Q and A
3:15-3:30 | Break
Part III: Developing Systems of Assessments
This session will address different strategies for gathering assessment information—some based on summative assessment, some based on end-of-course assessments, and some based on collections of classroom work—and consider how to integrate/combine the information. The session will discuss models used in other countries and settings that provide ways to integrate a broad range of assessment information.
3:30-4:30 | Moderator: Jerome Shaw, University of California, Santa Cruz, Committee Member
Presenters:
(3:30) Joan Herman, CRESST, Committee Member
(3:45) Knut Neumann, University of Kiel, Committee Member
Reactions and Questions:
(4:00) Susan Codere Kelly, Michigan Department of Education
(4:10) Melinda Curless, Kentucky Department of Education
(4:20) Audience Q and A
Part IV: Synthesis
4:30-5:45 | Moderators: Jim Pellegrino, Mark Wilson
Panel
(4:30) Peter McLaren, Rhode Island Department of Elementary and Secondary Education, Committee Member
(4:40) Richard Amasino, University of Wisconsin–Madison, Committee Member
(4:50) Shelley Lee, Wisconsin Department of Public Instruction
(5:00) Matt Krehbiel, Kansas State Department of Education
(5:10) Comments from the Moderators
(5:20) Audience Q and A
Questions for Discussion | |
• What are the main takeaway points from the workshop discussions?
• Considering the sample assessments discussed during the workshop, which approaches to assessment seem most promising and consistent with the goals of the Framework? What challenges do they help solve? What challenges would still need to be solved?
• What additional issues should the committee explore?
5:45 | Adjourn