attention is being paid to issues that were critical during the development stage. This might be expected as a testing program matures and evolves. However, perhaps in part because of staff turnover and a change in location, historical documentation about the assessment has been difficult to locate. We initially had significant difficulty obtaining documentation detailed enough to allow us to evaluate the development of the standards and the design of the assessments, although the board eventually provided most of the information we needed to conduct our review.

We found this deficiency particularly troublesome as we explored the content-related validity evidence for the national board assessment. Ordinarily, the primary focus in an evaluation of a credentialing assessment is content-related validity evidence—that is, evidence that the assessment measures the knowledge and skills it is intended to measure, as defined by the content standards that guide its development. This evidence, such as documentation of how the content standards were established, who participated in the process, what the process involved, and how the standards were translated into test items, was the most difficult for us to obtain from the NBPTS.

The NBPTS is unusual in that its mission includes policy reform goals as well as the operation of an assessment program. In our opinion, however, the assessment is its primary responsibility. Ongoing evaluation of an assessment program is critical to maintaining its quality and credibility, and thorough documentation that is readily accessible to outside evaluators is an essential element of that evaluation. The NBPTS should be able to readily provide documentation demonstrating that its assessments are developed, administered, and scored in accord with high standards, such as those laid out in the standards documents for credentialing assessments. We note that during the course of our evaluation, the NBPTS began developing a technical guide, and we encourage the NBPTS to finalize this document and make it available to researchers and others interested in learning about the technical attributes of the assessments.

Our key recommendations relating to the assessment itself are as follows:


Recommendation 5-1: The NBPTS should publish thorough technical documentation for the program as a whole and for individual specialty area assessments. This documentation should cover processes as well as products, should be readily available, and should be updated on a regular basis.


Recommendation 5-2: The NBPTS should develop a more structured process for deriving exercise content and scoring rubrics from the content standards.


