were uncritical and unlikely to be of much help in sorting and selecting from a number of choices.
Committee's response: Clarify the criteria and add specific recommendations for reviewer training.
The degree to which the instructional materials involved students in scientific inquiry did not appear to be an important criterion for the reviewers, even though inquiry is an essential element of the Standards.
Committee's response: Strengthen this important criterion and add recommendations for reviewer training.
Consideration of the cost of the materials, an element in the prototype tool, seemed to confuse reviewers, required extensive research, and did not contribute to the evaluation.
Committee's response: Move this consideration to a new selection phase of the tool. It was retained rather than deleted because cost remains an important final consideration.
In one of the three field-test groups, most of the reviewers had previous experience in instructional materials review and strongly suggested the use of a rubric for each criterion. In the education profession, a rubric is a scale that includes a detailed definition of each rating level for each criterion.
Committee's response: Refrain from recommending the use of rubrics in order to remain flexible in meeting local needs and to encourage and honor the individual judgments of reviewers.
Experience with all three review groups indicated that evaluators would require training to ensure that reference to standards is an integral part of the process, that inquiry-based learning is considered an important feature of instructional materials, and that individual, independent judgment is exercised.
Committee's response: Prepare a training guide to accompany the tool.
The field test revealed that some school districts use separate procedures for evaluation and for selection. The prototype tool blurred the distinct considerations, and the different people involved, in these two processes.