To what extent does the entire assessment process (including the exercises, scoring rubrics, and scoring mechanisms) yield results that reflect the specified knowledge, skills, dispositions, and judgments?
Is the passing score reasonable? What process was used for establishing the passing score? How is the passing score justified? To what extent do pass rates differ for various groups of candidates, and are such differences reflective of bias in the test?
To what extent do the scores reflect teacher quality? What evidence is available that board-certified teachers actually practice in ways that are consistent with the knowledge, skills, dispositions, and judgments they demonstrate through the assessment process? Do knowledgeable observers find them to be better teachers than individuals who failed when they attempted to earn board certification?
This chapter begins with a discussion of the approach we took to the psychometric evaluation and the resources on which we relied. We then describe the national board’s approach in relation to our two broad questions. We first address Question 1 and discuss the national board’s approach to developing the standards and assessments. This is followed by a discussion of the process for scoring the assessments and setting performance standards. We then turn to Question 2 and discuss the assessment’s technical characteristics, including reliability, validity, and fairness. At the end of the chapter we return to the original framework questions, summarize the findings and conclusions, and make recommendations.
Our primary resource for information about the psychometric characteristics of the assessments is the set of Assessment Analysis Reports, annual reports prepared for the NBPTS by its contractor at the time, the Educational Testing Service, to summarize information from each year's administrations. We reviewed the three most recent sets of these reports, which covered the 2002-2003, 2003-2004, and 2004-2005 administration cycles. The reports of the Technical Analysis Group (TAG), the body formed to provide supplementary psychometric expertise as a re-