
Evaluating AIDS Prevention Programs: Expanded Edition (1991)

Chapter: B Oversight and Coordination Strategy

Suggested Citation:"B Oversight and Coordination Strategy." National Research Council. 1991. Evaluating AIDS Prevention Programs: Expanded Edition. Washington, DC: The National Academies Press. doi: 10.17226/1535.

B Oversight and Coordination Strategy

In this appendix, the panel outlines a strategy for overseeing and conducting independent multisite experiments to compare different approaches to reducing the risk of HIV transmission. The strategy described here is illustrated by the cooperative endeavor of six community-based organization (CBO) projects, but it is also relevant for evaluations of the media campaign and the counseling and testing program. We believe that coordinated results from the individual sites, and especially the cross-site results, can lead to the development of sound health education and risk reduction policy.

In the strategy, projects are selected on the basis of their capacity and willingness to engage in randomized tests of two or three interventions that promise better results relative to standard interventions. Each experiment is directed by a different principal investigator and involves CBO practitioners and an independent evaluation group. As in most community-based endeavors, it is expected that the organization and style of operation of each project will vary despite common goals, as will the character of the local cooperation between the project team and the independent evaluation group. The CBOs and their evaluation associates will vary considerably in experience and in their skill in executing high-quality experiments. The results themselves will vary, and integrating the findings across sites, in the interest of developing health education and risk reduction policy, will then be complex. Given the limited resources available for oversight, the panel suggests a project review team to facilitate the generation of high-quality evidence that is useful both locally and at the aggregate level for determining what works better.

Compared with a conventional advisory board approach, a project review team involves a far deeper commitment and involvement. While this level of involvement and commitment demands more of an oversight body, the panel believes that the potential benefits of such an approach more than justify its use.

THE PROJECT REVIEW TEAM

Three major missions seem appropriate and feasible for the project review team:

· overseeing the progress of each project's evaluation experiments;
· conducting periodic, confirmatory reanalysis of the evaluative data produced by the project experiments; and
· undertaking cross-site analyses of all project evaluations.

The first mission, oversight, is a common function for most advisory boards and is basic to the quality control of projects. For AIDS-related work, however, this oversight must be deeper and more extensive than is customary for advisory boards because the CBOs are engaged in difficult missions that are complicated further by the agreement to cooperate in controlled field tests.

The second mission, periodic confirmatory reanalysis of evaluation data, is not a common function of advisory boards. Following the model used in projects currently sponsored by the National Institute of Justice (NIJ), the team will receive evaluation data and analyses from each site and then conduct reanalyses of those data to confirm the site's original work. Conducted quarterly in the NIJ-sponsored projects, these confirmatory reanalyses have led to improvements in the timeliness of site data production, the quality of site data production systems, the quality of data, and the collective understanding of the operations of the projects (see NIJ, 1989). Use of the project review team approach in AIDS evaluation efforts should offer the same benefits.

The team's third mission, which can be undertaken only after each project evaluation is completed, is to produce cross-site analyses and recommendations that go beyond single-site considerations of experimental results. The product of this aspect of the team's work is an understanding of the variability in project effectiveness (e.g., why projects A, B, and C produced positive effects at various levels while projects D and E produced no detectable effects) and of the policy options suggested by the collective results of the several experiments.

The team might also be responsible for screening the proposals for evaluation submitted by each site. Using the team to both select and oversee the evaluations enhances the continuity of the evaluation process and the uniformity of evaluation designs across sites.
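The flavor of the cross-site analysis in the team's third mission can be suggested with a small, hypothetical sketch. The fragment below pools invented site-level effect estimates (differences in the proportion of clients reducing risk behavior) with inverse-variance weights and computes a simple heterogeneity statistic; the project names, estimates, and standard errors are illustrative only, and an actual cross-site analysis by the team would be considerably more elaborate.

    import math

    # Hypothetical site-level results: estimated effect of the new intervention
    # (difference in the proportion of clients reducing risk behavior) and its
    # standard error. All numbers are invented for illustration.
    site_results = {
        "Project A": (0.12, 0.04),
        "Project B": (0.09, 0.05),
        "Project C": (0.15, 0.06),
        "Project D": (0.01, 0.05),
        "Project E": (-0.02, 0.06),
    }

    # Fixed-effect (inverse-variance) pooling of the site estimates.
    weights = {site: 1.0 / se ** 2 for site, (_, se) in site_results.items()}
    total_weight = sum(weights.values())
    pooled = sum(weights[site] * est
                 for site, (est, _) in site_results.items()) / total_weight
    pooled_se = math.sqrt(1.0 / total_weight)

    # Cochran's Q gauges whether effects vary across sites more than chance
    # alone would suggest (the "variability in project effectiveness").
    q_stat = sum(weights[site] * (est - pooled) ** 2
                 for site, (est, _) in site_results.items())

    print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
    print(f"Heterogeneity Q: {q_stat:.2f} on {len(site_results) - 1} degrees of freedom")

A large heterogeneity statistic relative to its degrees of freedom would signal that the site effects genuinely differ and that the team should look for explanations in local conditions and implementation, rather than simply reporting the pooled estimate.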

Given these responsibilities, it seems wise to constitute the project review team as a group of five or six experts in various aspects of designing and executing randomized experiments in community-based contexts. The skills required would include those common to experts in the areas of statistics, evaluation, methodology, CBO operations, and public health. Beyond these requirements, membership should be tailored to the special audiences of the projects (e.g., adolescents, women at risk, IV drug users). To centralize team activities, we suggest that one member of the team be directly responsible for a small support group based at a university or research entity that will run the individual site and cross-site analyses.

OPERATIONS

As a practical matter, carrying out the oversight functions will require considerable effort by a number of actors: CBO staffs, the principal investigators responsible for the evaluation experiment at each site, the project review team, and at least one senior CDC staff person with research or evaluation expertise.

First, quarterly meetings of the team, the CBO staffs, principal investigators, and the CDC representative are essential to understanding the projects and the evaluation experiments. The format of the meetings can be designed to cover sequentially the major problems that are usually encountered in controlled field experiments and the options for their solution. The meetings can also be used to discuss pre-experiment studies, resolve problems in data collection or random assignment that appear during the course of the experiment, and address other issues of concern. The development of an invisible "college" composed of all these actors is part of the potential product of these quarterly meetings. In addition, a quarterly meeting can serve as the venue for presenting evaluative data and analyses produced by personnel at each site. A representative from each site must provide the project review team with updated data files at each meeting to permit the team's subsequent confirmatory reanalyses.
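The following is a minimal sketch of such a confirmatory reanalysis, assuming (purely for illustration) that a site's quarterly data file can be reduced to per-arm counts of clients and of clients achieving the outcome of interest, and that the site has reported its own effect estimate. The counts, the reported estimate, and the tolerance below are all hypothetical.

    # Minimal sketch of a confirmatory reanalysis of one site's quarterly data.
    # The counts and the site-reported estimate below are hypothetical.
    site_reported_effect = 0.12   # effect the site's own analysis reported

    # Per-arm counts drawn from the site's updated data file.
    new_method = {"clients": 210, "reduced_risk": 88}
    standard = {"clients": 195, "reduced_risk": 58}

    p_new = new_method["reduced_risk"] / new_method["clients"]
    p_std = standard["reduced_risk"] / standard["clients"]
    recomputed_effect = p_new - p_std

    # Flag the site if the recomputed effect disagrees with the reported one
    # by more than a small tolerance; discrepancies become agenda items for
    # the next quarterly meeting rather than automatic judgments of error.
    tolerance = 0.01
    if abs(recomputed_effect - site_reported_effect) > tolerance:
        print(f"Discrepancy: site reported {site_reported_effect:.3f}, "
              f"reanalysis gives {recomputed_effect:.3f}")
    else:
        print(f"Confirmed: reanalysis gives {recomputed_effect:.3f}")

In practice the reanalysis would work from the individual-level data file and reproduce the site's full analysis, but even a simple recomputation of this kind illustrates how discrepancies can surface before each quarterly meeting.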

The second team activity relevant to its mission involves repeated site visits to the CBOs participating in the evaluation project. The objective for each visit is to understand local operations, conditions, and problems, and to identify and discuss options (not recommendations) for resolving these problems. The team then reports on these aspects of the projects to CDC.

The site visit reports and the sites' progress reports and provision of data at each quarterly meeting form the basis for the third team activity, reporting to and advising CDC. Such advice includes whether each project's performance justifies another year of funding.

The fourth activity is the quarterly reanalysis of data produced by each site. This activity is carried out by a group based in a university or other research environment that is organized to produce the reanalyses and provide them to the project review team.

The fifth and final activity is the analysis of data across sites and the production of a report for CDC. This activity can be undertaken using the university-based or research unit-based reanalysis group working under the direction of the team, in combination with the team members' independent work.

The specific products of the project review team process described above are the following:

· reports to CDC on the progress of project evaluations;
· verification or disconfirmation of project evaluations conducted by the sites and reported to CDC;
· advice to CDC on individual project performance and justification and on continuance of each evaluation; and
· cross-site analyses for policy decisions.

The result of a team approach is better evaluative data, more knowledgeable projects, and a greater understanding of what makes a difference and what works better to reduce high-risk behavior.

The discussion above describes the project review team approach for the health education/risk reduction projects of a selected set of CBOs. Yet the approach is also relevant to overseeing and coordinating selected counseling and testing projects and projects supported in the media campaign. For example, a small number of counseling and testing sites could be invited to participate in comparative experimental tests of new methods for encouraging individuals to return for their test results. The new methods might involve different forms of pretest counseling or more time allotted to such counseling. In agreeing to try out the new approaches, the sites would also agree to randomly assign individuals to the new method and to the standard approach and to cooperate in the multisite evaluation and related activities. Financial support would be provided to each site for implementing the new approach alongside the standard methods and for participating in the cooperative evaluations. Support would also be provided to the local evaluation group with which each site collaborates in implementing the experiment. Finally, support would also be provided to the project review team for project selection, oversight, confirmatory analyses, and so forth.
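The random assignment to which such sites would agree can also be sketched briefly. The fragment below, offered only as an illustration, assigns incoming clients to the new or the standard pretest counseling approach in balanced blocks so that the two arms stay roughly the same size within a site; the client identifiers, block size, and seed are hypothetical, and an actual assignment protocol would be developed with each site's local evaluation group.

    import random

    def blocked_assignment(client_ids, block_size=4, seed=1991):
        """Assign clients to 'new' or 'standard' counseling in balanced blocks."""
        rng = random.Random(seed)
        assignments = {}
        for start in range(0, len(client_ids), block_size):
            block = client_ids[start:start + block_size]
            # Half of each block receives the new approach, half the standard one.
            half = len(block) // 2
            arms = ["new"] * half + ["standard"] * (len(block) - half)
            rng.shuffle(arms)
            assignments.update(zip(block, arms))
        return assignments

    # Hypothetical client identifiers at one counseling and testing site.
    clients = [f"client-{i:03d}" for i in range(1, 13)]
    for client, arm in blocked_assignment(clients).items():
        print(client, arm)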

In the case of counseling and testing sites, as with health education/risk reduction projects, one of the objectives is mutual education about implementing the new counseling regimens and executing high-quality randomized tests. Insofar as the project review team activity is intensive, it can offer effective oversight, provide options for resolving problems, and confirm the data analyses produced by each site. The team's cross-site analyses would be dedicated to providing evidence on what works better and addressing policy implications for CDC's consideration in administering the counseling and testing program and developing new initiatives.

For media projects, an evaluation might randomly assign half of a group of communities to a new regimen (set of ads or other materials) or to a control condition, thus creating two groups of communities that do not differ systematically except for the presence of the new regimen in half the group. However, if each community is independently responsible for implementing the regimen and for the collection of evaluation data, an oversight and coordinating body will be essential to producing interpretable results. Here again, the project review team is likely to be a useful option and a more valuable approach than an advisory board, especially if the projects are independent members of a loosely organized cooperative group.

SUMMARY AND DISCUSSION

The project review team approach to evaluation project coordination and oversight is relevant to conducting independent multisite experiments to compare different approaches to reducing the risk of HIV transmission. The team approach, as well as the multisite cooperative endeavor, is innovative in the social sciences, but there are some precedents (see Boruch et al., 1989) and in the medical clinical trials arena (see Controlled Clinical Trials).

A distinctive feature and advantage of the team approach is routine periodic reanalysis of data produced by each cooperating site. Although not commonly used in the social and behavioral sciences, the idea comports well with contemporary scientific practice in two respects. First, the feature can be regarded as a data sharing activity because it invites access to the data by a number of scientists beyond the original principal investigators and also allows access among the cooperating sites and the CDC. This kind of data sharing is encouraged by, among others, the National Research Council's Committee on National Statistics (Fienberg, Martin, and Straf, 1985). Given that mistakes and corruption of data may be far more severe a problem in controversial research areas, the purpose of data sharing in this kind of effort is to reduce the likelihood of mistakes in complex data analysis and decrease even further the low likelihood of fraud.

Data sharing may also enhance the usefulness of a scarce resource, in this case, high-quality data on what works better to reduce HIV transmission. The repeated use of the same data set in the interests of confirming the quality of data and analyses and testing new hypotheses is an economic justification for the activity.

A second distinctive feature is that the project review team approach facilitates the coordination of independent project staff and evaluators. Such coordination provides opportunities for the development of local expertise in conducting controlled experiments, an expertise that will, in the long run, enhance local and societal understanding of how to generate sound evidence about what interventions work. An alternative approach, a contract with a single large organization that is employed to develop cooperation and run the experiments locally, generally leads to less local learning (Boruch, Dennis, and Carter-Greer, 1988), although quality control is more certain. Another advantage of the single-contract approach is that the sponsor (CDC) contracts with only one group responsible for executing all experiments in the sites, rather than developing a number of separate contracts with each site that has the capability and willingness to conduct experiments and with the group (i.e., university or research organization) responsible for support of the project review team.

Finally, the team approach is arguably better than conventional advisory boards for coordination and oversight of AIDS prevention project evaluation. The involvement of team members is more extensive and more sustained than is common for advisory board members, leading to a higher quality product. The involvement is such that a "college" of practitioners and researchers develops as a consequence of interactions among various actors, which can lead to more and better mutual education. The team's constitution as, essentially, a blue-ribbon panel, in addition to its level of activity, is likely to lead to high-quality data, especially when coupled with the routine confirmatory reanalyses of site evaluative data. Finally, cross-site analyses are complex enough that conventional advisory boards cannot handle the work; a project review team is designed to handle it well in the interest of providing a timely, policy-relevant product to the sponsoring agency.

REFERENCES

Boruch, R. F., Reiss, A., Garner, J., and Larntz, K. (1989) Coordinating the Spouse Assault Replication Projects: The Project Review Team. Unpublished report. Northwestern University.

Boruch, R. F., Dennis, M., and Carter-Greer, K. (1988) Lessons from the Rockefeller Foundation's experiments on the minority female single parent program. Evaluation Review 12(4):396-426.

Fienberg, S. E., Martin, M. E., and Straf, M. L., eds. (1985) Sharing Research Data. Report of the NRC Committee on National Statistics. Washington, D.C.: National Academy Press.

National Institute of Justice (NIJ) (1989) Spouse Assault Replication Project Review Team. Washington, D.C.: U.S. Department of Justice.
