
APPENDIX B

Research About Student Learning as a Basis for Developing Assessment Materials: An Example from Science

Research about how students build their understanding of conceptual areas in science in classroom settings can inform the development of assessment materials. In particular, such investigations can serve as a foundation for developing scoring rubrics that reflect levels and types of understanding, based on observations of how students learn the concepts being assessed. In the volume of research papers that accompanies this report, Minstrell (1999) presents a synthesis of his investigations of how students build their understanding in several specific areas in the study of force and motion: (1) separating fluid/medium effects from gravitational effects, (2) average velocity, and (3) forces during interactions. Minstrell originally conducted his research to identify ways to improve the instruction of individual students in high school physics classes. We present it as an example of how information from research about student cognition and learning can be applied to the development of large-scale assessment materials.

Table B-1 contains three examples of "facet clusters" drawn from Minstrell's research. These facet clusters are sets of related elements grouped around a physical situation (e.g., forces on interacting objects) or around a conceptual idea (e.g., the meaning of average velocity). The individual facets of students' thinking refer to single pieces, or constructions of a few pieces, of knowledge and/or strategies of reasoning. They have been derived from research on students' thinking and from classroom observations by teachers. Within a cluster, facets can be sequenced in an approximate order of development. Those ending in 0 or 1 in the units digit tend to be appropriate, acceptable understandings for introductory physics. The facets ending in 9, 8, or 7 tend to be the more problematic, because they represent limited understandings or, in some cases, serious misunderstandings. Those with middle digits frequently arise from formal instruction but may represent over- or undergeneralizations in a student's knowledge structure.


TABLE B-1 Three Examples of Facet Clusters

EXAMPLE A

Separating Fluid/Medium Effects from Gravitational Effects—Facets of Student Understanding

310 – pushes from above and below by a surrounding fluid medium lend a slight support (net upward push due to the difference in pressure with depth)

311 – a mathematical formulaic approach

314 – surrounding fluids don't exert any forces or pushes on objects

315 – surrounding fluids exert equal pushes all around an object

316 – whichever surface has the greater amount of fluid above or below the object receives the greater push by the fluid on that surface

317 – fluid mediums exert an upward push only

318 – surrounding fluid mediums exert a net downward push

319 – weight of an object is directly proportional to medium pressure on it

EXAMPLE B

Average Speed or Average Velocity—Facets of Student Understanding

220 – average speed = (total distance covered)/(total time)

221 – average velocity = (net displacement)/(total time)

225 – rate expression is overgeneralized (e.g., average velocity = x_f/t_f)

226 – rate expression is misstated

228 – average rate is not differentiated from another rate (e.g., velocity = speed or average velocity = average acceleration)

229 – average rate (speed/velocity) is not differentiated from an amount (e.g., average velocity = p_f or average velocity = …)


EXAMPLE C

Forces During Interactions—Facets of Student Understanding

470 – all interactions involve equal magnitude and oppositely directed action and reaction forces that are on the separate interacting bodies

474 – effects (such as damage or resulting motion) dictate relative magnitude of forces during interaction

475 – equal force pairs are identified as action and reaction but are on the same object

476 – stronger exerts more force

477 – one with more motion exerts more force

478 – more active/energetic exerts more force

479 – bigger/heavier exerts more force

SOURCE: Adapted from Minstrell (1999).

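To make the units-digit convention described above concrete, the sketch below shows one way a facet cluster might be represented in software. The codes and abbreviated descriptions follow Example A of Table B-1, but the data structure and variable names are illustrative assumptions, not part of Minstrell's work or of NAEP.

```python
# A minimal sketch of a facet cluster as a lookup table, assuming a
# simple dict representation.  Codes and abbreviated descriptions
# follow Example A of Table B-1; the structure itself is an
# illustrative assumption.

FLUID_EFFECTS_CLUSTER = {
    310: "fluid medium lends slight net upward support (pressure gradient)",
    311: "a mathematical, formulaic approach",
    314: "surrounding fluids exert no pushes on objects",
    315: "surrounding fluids push equally all around an object",
    316: "surface with more fluid above or below it gets the greater push",
    317: "fluid mediums push upward only",
    318: "fluid mediums exert a net downward push",
    319: "weight is directly proportional to medium pressure",
}

# The units digit encodes the approximate developmental ordering
# described above: 0-1 acceptable, 7-9 problematic, others intermediate.
for code, description in sorted(FLUID_EFFECTS_CLUSTER.items()):
    print(f"{code}: {description}")
```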

This type of systematic knowledge of the levels at which students understand and represent physical concepts, principles, and/or situations is a starting point for developing highly informative assessment materials that could be used in large-scale survey assessments such as NAEP. Figure B-1 is an example of a constructed-response item designed to probe levels of understanding from the first facet cluster in Table B-1.

As discussed by Minstrell (1999), student responses to this item can be mapped to the facets in this cluster in a relatively straightforward manner. Students may think that weight is due to the downward push by air (319); that fluids (air or water) push only downward (318) or only upward (317); that fluids push equally from above, below, and all around (315); that fluids do not push at all on objects in them (314); or that the push differs depending on how much fluid is above or below the object (316). Students who do understand that the push from below is greater than the push from above, because pressure increases with depth, may express that understanding in a formulaic way (311) or with a rich conceptual description (310).



FIGURE B-1 Example constructed-response item: separating fluid/medium effects from gravitational effects.

In a simple application, such facet clusters could be adapted for use as scoring rubrics if such an item were to be administered as part of a large-scale assessment, with responses reflecting facets of understanding that end in 0 or 1 scored as correct, responses reflecting facets ending in 9, 8, or 7 scored as incorrect, and responses reflecting intermediate facets scored at one or more levels of partial credit. Evaluators of students' responses must therefore be able to recognize which facet(s) are represented in a wide variety of student responses. In a large-scale scoring setting, this poses challenges for the recruitment and training of scorers.
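As a concrete illustration of such a rubric, the sketch below converts a facet code, as judged by a human rater, into partial-credit points. The point values and function name are illustrative assumptions; only the units-digit convention comes from the text.

```python
# A sketch of applying the facet-based rubric described above.  A human
# rater first judges which facet a response reflects; this function only
# converts that facet code into points.  The point values (2/1/0) and
# the function name are illustrative assumptions.

def rubric_points(facet_code: int) -> int:
    """Map a facet code to partial-credit points using the
    units-digit convention: 0-1 correct, 7-9 incorrect, else partial."""
    units = facet_code % 10
    if units in (0, 1):
        return 2  # complete, acceptable understanding
    if units in (7, 8, 9):
        return 0  # problematic understanding or misconception
    return 1      # intermediate understanding: partial credit

# Example: facet 316 (differential push) earns partial credit, facet
# 318 (net downward push) earns none, and facet 310 earns full credit.
assert rubric_points(316) == 1
assert rubric_points(318) == 0
assert rubric_points(310) == 2
```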



Minstrell provides some examples of student responses and their relationship to facets of understanding in his research paper. Although this application shows how the facet clusters could be modified to fit current large-scale assessment scoring strategies, greater value would be realized by using the facet clusters as a basis for reporting how frequently various facets of understanding occur in students' responses and as a foundation for the types of interpretive reports we discuss in Chapter 4.
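For example, if each scored response were recorded as a facet code rather than only as a point value, facet frequencies could be tabulated directly for reporting. The sketch below assumes hypothetical, already-coded response data.

```python
# A sketch of facet-frequency reporting, assuming each response has
# already been coded with a facet from Example A of Table B-1.
# The coded responses below are hypothetical data.

from collections import Counter

coded_responses = [319, 318, 316, 316, 315, 310, 318, 316, 311, 314]

frequencies = Counter(coded_responses)
total = len(coded_responses)
for facet, count in frequencies.most_common():
    print(f"facet {facet}: {count}/{total} ({count / total:.0%})")
```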

Single items such as that shown in Figure B-1, even when coupled with qualitative evaluation frameworks such as the facet clusters in Table B-1, seldom provide sufficient information to determine how specific or general a student's understanding is, or how appropriate it is. However, sets of items, or item families, can be constructed to assess the context specificity of understanding. Figure B-2 is a multiple-choice item that expands the analysis of medium effects in the context of the first facet cluster in Table B-1.

FIGURE B-2 Example multiple-choice item: separating fluid/medium effects from gravitational effects.


By considering response patterns across pairs or sets of items, such as those shown in Figures B-1 and B-2, one can evaluate how closely a student's understanding is tied to the specific surface features of a given problem. For example, for these items and for this conceptual domain more generally, it is not uncommon for students' understanding of the effects of a medium to be more sophisticated in the water context than in the air context. Interpretable patterns of responses across items can also be obtained for other physical concepts and situations, and the use of an array of such item families in NAEP would provide a sound basis for the more interpretive analyses of student performance recommended throughout this report. In his research paper, Minstrell provides additional examples, in multiple concept areas in the physical sciences, of the application of a facet-based approach to the development of items and the evaluation of student responses.
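A response-pattern analysis of this kind might be sketched as follows. The item contexts, student identifiers, and facet codes are hypothetical; the comparison simply ranks each response by the units-digit convention and contrasts the air and water contexts.

```python
# A sketch of comparing facet-based performance across an item family.
# Item contexts, student identifiers, and facet codes are hypothetical;
# only the units-digit ranking convention comes from the text.

def level(facet_code: int) -> int:
    """Rank understanding: 2 acceptable, 1 intermediate, 0 problematic."""
    units = facet_code % 10
    return 2 if units in (0, 1) else 0 if units in (7, 8, 9) else 1

# Each student's facet code on paired air-context and water-context items.
responses = {
    "student 1": {"air": 318, "water": 316},
    "student 2": {"air": 317, "water": 310},
    "student 3": {"air": 315, "water": 315},
}

for student, codes in responses.items():
    air, water = level(codes["air"]), level(codes["water"])
    if water > air:
        print(f"{student}: more sophisticated in the water context")
    elif air > water:
        print(f"{student}: more sophisticated in the air context")
    else:
        print(f"{student}: consistent across contexts")
```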

REFERENCE

Minstrell, James. 1999. Facets of student understanding and assessment development. In Grading the Nation's Report Card: Research from the Evaluation of NAEP, James W. Pellegrino, Lee R. Jones, and Karen J. Mitchell, eds. Committee on the Evaluation of National and State Assessments of Educational Progress, Board on Testing and Assessment. Washington, DC: National Academy Press.


