APPENDIX B

Research About Student Learning as a Basis for Developing Assessment Materials: An Example from Science

Research about how students build their understanding of conceptual areas of science in classroom settings can inform the development of assessment materials. In particular, such investigations can serve as a foundation for scoring rubrics that reflect levels and types of understanding, grounded in observations of how students learn the concepts being assessed. In the volume of research papers that accompanies this report, Minstrell (1999) presents a synthesis of his investigations of how students build their understanding in three specific areas in the study of force and motion: (1) separating fluid/medium effects from gravitational effects, (2) average velocity, and (3) forces during interactions. Minstrell originally conducted this research to identify ways of improving the instruction of individual students in high school physics classes. We present it here as an example of how research about student cognition and learning can be applied to the development of large-scale assessment materials.

Table B-1 contains three examples of "facet clusters" drawn from Minstrell's research. These facet clusters are sets of related elements, grouped around a physical situation (e.g., forces on interacting objects) or around a conceptual idea (e.g., the meaning of average velocity). The individual facets of students' thinking refer to single pieces, or constructions of a few pieces, of knowledge and/or strategies of reasoning. They have been derived from research on students' thinking and from classroom observations by teachers. Within a cluster, facets can be sequenced in an approximate order of development. Those ending in 0 or 1 in the units digit tend to be appropriate, acceptable understandings for introductory physics. The facets ending in 9, 8, or 7 tend to be the more problematic, because they represent limited understandings or, in some cases, serious misunderstandings. Facets with middle digits frequently arise from formal instruction but may represent over- or undergeneralizations in a student's knowledge structure.

TABLE B-1 Three Examples of Facet Clusters

EXAMPLE A Separating Fluid/Medium Effects from Gravitational Effects—Facets of Student Understanding

310 – pushes from above and below by a surrounding fluid medium lend a slight support (net upward push due to differences in the depth pressure gradient)
311 – a mathematical formulaic approach
314 – surrounding fluids don't exert any forces or pushes on objects
315 – surrounding fluids exert equal pushes all around an object
316 – whichever surface has the greater amount of fluid above or below the object receives the greater push by the fluid on that surface
317 – fluid mediums exert an upward push only
318 – surrounding fluid mediums exert a net downward push
319 – weight of an object is directly proportional to medium pressure on it

EXAMPLE B Average Speed or Average Velocity—Facets of Student Understanding

220 – average speed = (total distance covered)/(total time)
221 – average velocity = (total displacement)/(total time)
225 – rate expression is overgeneralized (e.g., average velocity = x_f/t_f)
226 – rate expression is misstated
228 – average rate is not differentiated from another rate (e.g., velocity = speed, or average velocity = average acceleration)
229 – average rate (speed/velocity) is not differentiated from an amount (e.g., average velocity = p_f)

EXAMPLE C Forces During Interactions—Facets of Student Understanding

470 – all interactions involve equal-magnitude and oppositely directed action and reaction forces that act on the separate interacting bodies
474 – effects (such as damage or resulting motion) dictate the relative magnitude of forces during an interaction
475 – equal force pairs are identified as action and reaction but act on the same object
476 – the stronger object exerts more force
477 – the object with more motion exerts more force
478 – the more active/energetic object exerts more force
479 – the bigger/heavier object exerts more force

SOURCE: Adapted from Minstrell (1999).
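
The units-digit convention that organizes these clusters is mechanical enough to encode directly. The following Python sketch is purely illustrative: the facet codes and abbreviated descriptions come from Example A above, while the dictionary layout and the classify_facet helper are our own construction, not part of Minstrell's work.

```python
# Illustrative encoding of facet cluster Example A (codes and shortened
# descriptions from Table B-1). Classification follows the units-digit
# convention described in the text: 0 or 1 = acceptable goal
# understanding, 7-9 = problematic, anything else = intermediate.

FLUID_MEDIUM_CLUSTER = {
    310: "pushes from above and below by a surrounding fluid lend slight support",
    311: "a mathematical formulaic approach",
    314: "surrounding fluids don't exert any forces or pushes on objects",
    315: "surrounding fluids exert equal pushes all around an object",
    316: "surface with more fluid above or below it receives the greater push",
    317: "fluid mediums exert an upward push only",
    318: "surrounding fluid mediums exert a net downward push",
    319: "weight of an object is directly proportional to medium pressure",
}

def classify_facet(code: int) -> str:
    """Classify a facet code by its units digit."""
    units = code % 10
    if units in (0, 1):
        return "acceptable"    # goal understanding for introductory physics
    if units in (7, 8, 9):
        return "problematic"   # limited or serious misunderstanding
    return "intermediate"      # over- or undergeneralization

for code, description in sorted(FLUID_MEDIUM_CLUSTER.items()):
    print(f"{code} ({classify_facet(code)}): {description}")
```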

This type of systematic knowledge of the levels at which students understand and represent physical concepts, principles, and situations is a starting point for developing highly informative assessment materials that could be used in large-scale survey assessments such as NAEP. Figure B-1 is an example of a constructed-response item designed to probe levels of understanding from the first facet cluster in Table B-1.

FIGURE B-1 Example constructed-response item: separating fluid/medium effects from gravitational effects.

As discussed by Minstrell (1999), student responses to this item can be mapped to the facets in this cluster in a relatively straightforward manner. Students may be thinking that weight is due to the downward push by air (319); they may believe that fluids (air or water) push only downward (318) or only upward (317), that fluids push equally from above, below, and all around (315), that fluids do not push at all on objects in them (314), or that the push differs depending on how much fluid is above or below the object (316). If they do understand that there is a greater push from below than from above, owing to the greater pressure at greater depth, they may express it in a formulaic way (311) or with a rich conceptual description (310).

In a simple application, such facet clusters could be adapted for use as scoring rubrics if such an item were administered as part of a large-scale assessment: responses reflecting facets of understanding that end in 0 or 1 would be scored as correct, responses reflecting facets ending in 9, 8, or 7 would be scored as incorrect, and responses reflecting intermediate facets would be scored at one or more levels of partial credit. Evaluators of students' responses must therefore be able to recognize which facet(s) are represented in a wide variety of student responses.
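
As a sketch of this simple rubric adaptation, the following illustrative function maps a scorer's facet assignment to a score level; the three-point scale and the function itself are assumptions made for the example, not an actual NAEP rubric.

```python
def rubric_score(facet_code: int) -> int:
    """Score a response from the facet a scorer has assigned to it,
    following the simple adaptation described above. The 0/1/2 point
    values are illustrative only."""
    units = facet_code % 10
    if units in (0, 1):
        return 2  # goal understanding: full credit
    if units in (7, 8, 9):
        return 0  # problematic understanding: no credit
    return 1      # intermediate facet: partial credit

# Example: a response to the Figure B-1 item coded as facet 316
# (push judged by the amount of fluid above or below) earns partial credit.
assert rubric_score(316) == 1
```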

In a large-scale scoring setting, this poses challenges for the recruitment and training of scorers. Minstrell provides some examples of student responses and their relationship to facets of understanding in his research paper. Although this application shows how facet clusters could be modified to fit current large-scale assessment scoring strategies, greater value would be realized by using the facet clusters as a basis for reporting the frequency of occurrence of various facets of understanding in students' responses and as a foundation for the types of interpretive reports we discuss in Chapter 4; a minimal sketch of such frequency reporting follows Figure B-2 below.

Single items such as that shown in Figure B-1, even when coupled with qualitative evaluation frameworks such as the facet clusters in Table B-1, seldom provide sufficient information to ascertain the specificity versus generality and the appropriateness of a student's understanding. However, sets of items, or item families, can be constructed to assess the context specificity of understanding. Figure B-2 is a multiple-choice item that expands the analysis of medium effects in the context of the first facet cluster in Table B-1.

FIGURE B-2 Example multiple-choice item: separating fluid/medium effects from gravitational effects.
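
To make the reporting idea above concrete, the sketch below tallies how often each facet appears among the responses to an item; the scored_responses data and the facet_frequencies helper are invented for illustration. Comparing the resulting distributions for items set in different contexts (e.g., water versus air) is one way to surface the cross-item patterns discussed next.

```python
from collections import Counter

# Hypothetical scorer output: one (item_id, facet_code) pair per response.
scored_responses = [
    ("B-1", 316), ("B-1", 318), ("B-1", 310), ("B-1", 316),
    ("B-2", 315), ("B-2", 310), ("B-2", 319), ("B-2", 310),
]

def facet_frequencies(responses, item_id):
    """Proportion of responses to one item exhibiting each facet."""
    codes = [code for item, code in responses if item == item_id]
    counts = Counter(codes)
    return {code: n / len(codes) for code, n in counts.items()}

for item in ("B-1", "B-2"):
    print(item, facet_frequencies(scored_responses, item))
```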

By considering response patterns across pairs or sets of items, such as those shown in Figures B-1 and B-2, one can evaluate how much a student's understanding is tied to the specific surface situation described in a given problem. For example, for these items, and for this conceptual domain more generally, it is not uncommon for students' understanding of the effects of a medium to reach a more sophisticated level in the water context than in the air context. Interpretable patterns of responses across items can also be obtained for other physical concepts and situations, and the use of an array of these sorts of item families in NAEP would provide a sound basis for the more interpretive analyses of student performance that have been recommended throughout this report. In his research paper, Minstrell provides additional examples, in multiple concept areas in the physical sciences, of the application of a facet-based approach to the development of items and the evaluation of student responses.

REFERENCE

Minstrell, James
1999 Facets of student understanding and assessment development. In Grading the Nation's Report Card: Research from the Evaluation of NAEP, James W. Pellegrino, Lee R. Jones, and Karen J. Mitchell, eds. Committee on the Evaluation of National and State Assessments of Educational Progress, Board on Testing and Assessment. Washington, DC: National Academy Press.