already put in place have not been evaluated in a sufficiently rigorous and systematic manner. This lack of evaluation makes it difficult for other reform-minded jurisdictions to undertake similar initiatives with confidence that they can be implemented successfully and will achieve the desired effects. However, the committee is impressed with the reformers’ ability to generate and consolidate stakeholder coalitions, build a consensus regarding the necessary changes, create the infrastructure needed to maintain momentum, and sustain the effort over the long run. This accumulated experience inspires optimism that juvenile justice reform can be achieved successfully on a national scale.
On the basis of this perspective, the reader is asked to assume that policy makers in a state are committed to transforming their juvenile justice system so that it is grounded in a developmental perspective. The following section summarizes what has been learned from efforts to implement policy change, what obstacles impede successful innovation, and what can be done to address them.
Assembling and Using Data
The issue of data quality and inadequacy has been discussed throughout this report. In Chapter 3, we note the inadequacy of juvenile arrest data, the incompleteness of court data, and the limited availability of juvenile justice data due to privacy restrictions. In Chapter 6, we attribute the failure to identify effective programs to inadequate data for tracking youth outcomes. In Chapter 8, we note the lack of racial/ethnic data on youth at various processing stages.
An essential prerequisite to designing, implementing, and sustaining reform is the compilation of critical data and analytical tools. Many agencies lack the data needed for their internal operations (individual, process, and outcome data) and cross-system data (education, mental health, child welfare).43 Without these data, it is difficult to see the true picture of who is detained, how the system operates, what the impact is on minority youth, whether youth are receiving the designated services, and what the impact is of the treatment they do receive. Agencies need to distinguish between data required for routine monitoring of processes (i.e., outputs and outcomes, such as numbers served, services delivered, costs, and quality of services) and data required for empirically based research evaluations (i.e., treatment outcome data, comparison data for different youth samples). A common measure of performance for many
43 Presentation by Laurie Garduque, Director of Juvenile Justice, Program on Human and Community Development, John D. and Catherine T. MacArthur Foundation, to the committee, October 11, 2010.