ThinkerTools is an inquiry-based curriculum that allows students to explore the physics of motion. The curriculum is designed to engage students’ conceptions, to provide a carefully structured and highly supported computer environment for testing those conceptions, and to steep students in the processes of scientific inquiry. The curriculum has demonstrated impressive gains in students’ conceptual understanding and ability to transfer knowledge to novel problems.
White and Frederiksen (1998) designed and tested a reflective assessment component that provides students with a framework for evaluating the quality of an inquiry investigation—their own and others'. The assessment categories included understanding the main ideas, understanding the inquiry process, being inventive, being systematic, reasoning carefully, using the tools of research, teamwork, and communicating well. The performance of students who engaged in reflective assessment was compared with that of matched control students who were taught with ThinkerTools but were asked to comment on what they did and did not like about the curriculum without a guiding framework. Each teacher's classes were evenly divided between the two treatments. There were no significant differences in initial average standardized test scores (the Comprehensive Test of Basic Skills served as the measure of prior achievement) between the classes randomly assigned to the different treatments.
Students in the reflective assessment classes showed higher gains both in understanding the process of scientific inquiry and in understanding the physics content. For example, one of the outcome measures was a written inquiry assessment that was given both before and after the ThinkerTools inquiry curriculum was administered. It was a written test in which students were asked to explain how they would investigate a specific research question: “What is the relationship between the weight of an object and the effect that sliding friction has on its motion?” (White and Frederiksen, 2000:22). Students were instructed to propose competing hypotheses, design an experiment (on paper) to test the hypotheses, and pretend to carry out the experiment, making up data. They were then asked to use the data they generated to reason and draw conclusions about their initial hypotheses.
Presented below are the gain scores on this challenging assessment for both low- and high-achieving students and for students in the reflective assessment and control classes. Note first that students in the reflective assessment classes gained more on this inquiry assessment, and that this was particularly true for the low-achieving students. This is evidence that the metacognitive reflective assessment process is beneficial, particularly for academically disadvantaged students.
This finding was further explored by examining the gain scores for each component of the inquiry test. As the figure below shows, the effect of reflective assessment is greatest for the more difficult aspects of the test: making up results, analyzing those results, and relating them back to the original hypotheses. In fact, the largest difference in gain scores is for a measure termed "coherence," which reflects the extent to which the experiments the students designed addressed their hypotheses, their made-up results related to their experiments, their conclusions followed from their results, and their conclusions were related back to their original hypotheses. The researchers note that this kind of overall coherence is a particularly important indication of sophistication in inquiry.