For social survey data on youth development, the access issues are somewhat different. Several states have provided funding and support to field community-level surveys that focus on youth. Oregon, Vermont, and several other states have done this in the past with the YRBS. Colorado and Vermont have done the same with the PSL-AB. The state of Minnesota fields its own Minnesota Student Survey every three years, administering it on a voluntary basis to all public school youth in grades 6, 9, and 12. In addition, many individual communities have contracted with the Search Institute to field the PSL-AB, using the results for comprehensive youth development planning.

Most communities, however, do not have access to the unique and important information that can only be gathered using such surveys as the ones reviewed above. The costs of fielding such surveys are themselves a barrier, although clearly many communities have decided that the information gained is worth the expense. In 2000, for example, the PSL-AB could be fielded for between $1.65 and $2.00 per youth, with additional charges of several hundred dollars each for producing reports (Search Institute, 2000).

For individual community youth programs seeking to use social indicator data to monitor their own activities and success, the items in these surveys may be too general. For many purposes, such programs need to monitor elements that are specific to their program, looking at issues of process and implementation as well as outcomes. For them, a more tailored approach is needed, one that usually requires professional technical assistance.

ASSESSING PROGRAM IMPLEMENTATION AND OPERATION

When a program has just been launched, program managers, staff, participating youth and their parents, funders, and other stakeholders need information on whether it has been implemented according to design and, if not, how services and operations differ from those envisioned in the program’s underlying model. When a program has moved past implementation into the routine operating stage, information to determine whether it has continued to operate according to design and at the desired level of quality and efficiency will be similarly useful. Also useful will be data on total program costs and average cost per youth served.

Until enough time has passed to allow collection of any but the most short-term outcome indicators, all that stakeholders have for assessing program progress and the possible need for change is information on program implementation and operation.


