
Evaluation Design for Complex Global Initiatives: Workshop Summary (2014)

Chapter: Appendix D: Evaluation Information Summary for Core Example Initiatives

Suggested Citation:"Appendix D: Evaluation Information Summary for Core Example Initiatives." Institute of Medicine. 2014. Evaluation Design for Complex Global Initiatives: Workshop Summary. Washington, DC: The National Academies Press. doi: 10.17226/18739.

SCOPE OF THE EVALUATION

Overarching evaluation aim/question (What did the requestor and the evaluand want to know from the evaluation?)
PEPFAR (President's Emergency Plan for AIDS Relief): Assess the performance and impact on health of PEPFAR-supported programs in partner countries and make recommendations for improvements.
Global Fund (The Global Fund to Fight AIDS, Tuberculosis and Malaria): Assess the organizational efficiency and effectiveness of the Global Fund; the effectiveness of its partner environment; and the effects of increased resources on the reduction in the burden of the three diseases. Study Area 1: Does the Global Fund, through both its policies and operations, reflect its critical core principles, including acting as a financial instrument (rather than as an implementation agency) and furthering country ownership, and, in fulfilling these principles, does it perform in an efficient and effective manner? Study Area 2: How effective and efficient is the Global Fund's partnership system in supporting HIV, tuberculosis (TB), and malaria programs at the country and global levels? What are the wider effects of the Global Fund partnership on country systems? Study Area 3: What is the impact of scaling up against HIV, TB, and malaria? What is the Global Fund's contribution?
AMFm (Affordable Medicines Facility–malaria): Determine whether, and to what extent, AMFm Phase 1 achieves its objectives of (i) increasing availability of quality-assured artemisinin-based combination therapies (QAACTs) in outlets across the public, private for-profit, and not-for-profit sectors; (ii) increasing affordability of QAACTs to patients; (iii) increasing market share of QAACTs; and (iv) increasing use of QAACTs by patients, including vulnerable groups.
PMI (President's Malaria Initiative): Assess and evaluate the PMI program: identify lessons learned across countries; assess population-based outcomes and impact; and share lessons and experiences with other U.S. Government (USG) engagements in global health initiatives.

Requestor of the evaluation (Who asked for the evaluation?)
PEPFAR: U.S. Congress (mandated in PEPFAR reauthorization legislation)
Global Fund: Global Fund Board
AMFm: Global Fund Board
PMI: PMI leadership

Funder of the evaluation (Who paid for the evaluation?)
PEPFAR: U.S. Department of State, Office of the U.S. Global AIDS Coordinator (OGAC)
Global Fund: Global Fund
AMFm: Global Fund
PMI: USAID, through contractor GH Tech

Primary intended audiences (For whom is the evaluation primarily performed, based on the statement of task or terms of reference or the intent communicated by the requestor/funder?)
PEPFAR: U.S. Congress; OGAC; PEPFAR implementers (USG agencies, country programs, implementing partners)
Global Fund: Global Fund
AMFm: The Global Fund Board, to guide its decision on whether to "expand, accelerate, modify, terminate, or suspend the AMFm business line"
PMI: PMI, USAID, CDC

Secondary audiences (Other major audiences for the evaluation findings)
PEPFAR: PEPFAR partner country stakeholders (government and nongovernment); global HIV stakeholders (e.g., multilateral agencies, other bilateral and foundation donors, advocates)
Global Fund: Global Fund funders; Global Fund partner countries; global HIV stakeholders (e.g., other multilateral agencies, bilateral and foundation donors, advocates)
AMFm: AMFm pilot country malaria program managers; global malaria stakeholders (bilateral and multilateral agencies, Roll Back Malaria, etc.)
PMI: U.S. Congress; other USG global health implementers; PMI countries, both national program personnel and USG personnel

Time period of the evaluated initiative that was assessed in the evaluation
PEPFAR: 2003–2012
Global Fund: After one complete (5-year) grant cycle, 2002–2007, with data collection through 2009
AMFm: 2010–2011, beginning from the point of signature of AMFm country grants (mid-2010 onward) through the completion of data collection. Endline outlet surveys were completed in December 2011; data collection for the additional studies (remote areas and logo studies) was completed in April 2012; household survey data from 2011 through July 2012 were included.
PMI: 2005–2010

Total budget of evaluated initiative during the time period covered in the evaluation
PEPFAR: $28.5 billion (current USD) appropriated to PEPFAR country programs from FY2004–FY2011 (see Figure 4-3, p. 103, of the IOM's 2013 Evaluation of PEPFAR report)
Global Fund: $4.96 billion (2002–2007); $9.97 billion (2002–2009)
AMFm: $216 million copayment fund (to February 2012), plus $42.4 million for supporting interventions (disbursed to November 2011)
PMI: $1.2 billion for the first 5 years of the PMI project (FY2006–FY2010)

Duration of the evaluation
PEPFAR: 2009–2013 (45 months, separated into planning and implementation phases)
Global Fund: 2007–2009 (23 months)
AMFm: April 2009 to December 2012
PMI: May–November 2011

Cost of the evaluation
PEPFAR: $8.2 million USD
Global Fund: Almost $17 million USD for the three distinct evaluations
AMFm: $10.7 million USD
PMI: $292,935 USD (Boston University [BU] purchase order only); three additional consultant members of the Evaluation Team were paid directly by GH Tech outside of the BU contract

Geographic scope of the evaluation (How many countries/regions were included in the scope of the evaluation?)
PEPFAR: "Whole of PEPFAR." Scope varied by data source, ranging from all countries receiving PEPFAR funds (100+ countries), to the subset of countries where most investment was focused (31 countries submitting country operational plans at the time of the evaluation), to a subset of those countries with in-depth data collected (13 countries).
Global Fund: Whole of the Global Fund; scope of data collection varied by study area: 16 countries for SA 2 and 18 countries for SA 3.
AMFm: All 8 Phase 1 pilot countries (Ghana, Kenya, Madagascar, Niger, Nigeria, Tanzania [mainland], Uganda, and Zanzibar).
PMI: 15 PMI focus countries in Africa.

GOVERNANCE STRUCTURE AND REPORT

Evaluators: Structure and composition of evaluation team, subteams, subcontractors, and advisory mechanisms
PEPFAR: Single IOM evaluation team composed of a committee of volunteer experts, the IOM staff, and paid consultants for both qualitative and quantitative methods. The evaluation team was divided into subteams for data collection field visits and into working groups by topic area. An external panel of expert volunteers reviewed the evaluation report prior to release to the evaluand and the public.
Global Fund: Independent consultants conducted three interlinked studies. Macro International was awarded the contract for the evaluation of all three areas and enlisted a consortium of universities and service providers, including Johns Hopkins University, Harvard School of Public Health, Washington University, Axios International, Development Finance International (DFI), the Indian Institute for Health Management Research, the African Population and Health Research Centre, and WHO.
AMFm: ICF International was the primary contractor for the independent evaluation (IE), with a subcontract to the London School of Hygiene and Tropical Medicine. Outlet survey data collection was contracted separately by the Global Fund, without a direct line of accountability between the data collectors and the IE. The AMFm Ad Hoc Committee provided oversight of the evaluation and reported the findings to the Board; the Technical Evaluation Reference Group (TERG) provided guidance on the technical parameters of the evaluation design; and an Expert Advisory Group advised the Global Fund Secretariat.
PMI: The evaluation was managed by a management firm (GH Tech), which subcontracted the evaluation to BU and three independent consultants. The evaluation team (made up of the BU team and the three consultants) consisted of members with broad public health experience, malaria expertise, extensive experience in the use of both quantitative and qualitative methods, and proven experience in complex multicountry evaluations. The evaluation team was divided into groups to visit different countries and address the different evaluation objectives.

Relationship of evaluator to evaluand (For example: independent external contractor; internal independent evaluation unit within the same organization as the evaluand; internally conducted by the evaluand)
PEPFAR, Global Fund, AMFm, and PMI: independent and external in all four cases

Oversight mechanisms (For example, legislative oversight, oversight mechanisms in the evaluator's institution, oversight by the evaluand or the evaluand's advisory bodies)
PEPFAR: Oversight by the IOM's standard mechanisms governing consensus studies, including committee formation processes to ensure balance and avoid conflicts of interest, compliance with the transparency requirements of the Federal Advisory Committee Act, and external report review. OGAC appointed a liaison, the head of Strategic Information, to whom the IOM staff could communicate their needs for the evaluation (i.e., data and documentation requests, interview scheduling, information needed for IOM logistics planning for field work, etc.).
Global Fund: Oversight of the evaluation was provided by the TERG, an advisory body that provides independent assessment and advice to the Global Fund Board and advises the Global Fund Secretariat. The TERG organized regular consultative meetings with the contractor to assess progress and discuss issues.
AMFm: Building on prior work completed by the Roll Back Malaria Partnership, the Global Fund Secretariat, with input from the TERG, the AMFm Ad Hoc Committee, AMFm Phase 1 country officials, the Expert Advisory Group, and key technical partners, prepared the initial evaluation design, which was put out to tender. The AMFm Ad Hoc Committee met approximately twice per year and received process updates and interim evaluation findings (e.g., baseline results). At the encouragement of the IE team and the TERG, the Ad Hoc Committee, with the TERG Chair, identified independent consultants to propose "success benchmarks."
PMI: GH Tech had administrative oversight. The Evaluation Team submitted a draft report for comments to PMI, USAID, and CDC staff members; the Evaluation Team reviewed the comments and responded as it deemed appropriate.

Reporting requirements (Frequency of reports, such as preliminary, periodic, and final reports)
PEPFAR: Two reports were published during the evaluation period: (1) a planning report, issued after the planning phase (1.5 years), that detailed the strategic approach to the evaluation and the evaluability assessments but contained no preliminary findings; and (2) a final report with findings, conclusions, and recommendations at the conclusion of the evaluation.
Global Fund: A number of interim reports were submitted to the TERG for review. Final reports were released by the independent consultants for each of the three study areas. A synthesis report, based on the findings and recommendations of the final study area reports, was also released by Macro International.
AMFm: Three reports were published: an inception report, a baseline report, and a final report. In addition, a supplement with the results of the secondary analysis of household data on antimalarial use was published separately because the data became available after the main report was completed.
PMI: Periodic and draft reports were submitted to the management firm for transmission to the evaluand. Comments and feedback were incorporated into the final report as appropriate; the Evaluation Team had full discretion to accept or reject comments as it saw fit.

Evaluand's access to preliminary findings or draft reports
PEPFAR: Per the institutional requirements of the IOM, the evaluand had no access to preliminary findings or draft reports before final release.
Global Fund: Preliminary and progress reports were submitted regularly to the TERG and were discussed in detail with the contractor.
AMFm: The evaluand had the opportunity to comment on reports before they were finalized.
PMI: The PMI had access to the draft report and shared it with personnel from USAID and CDC.

Authorship and ownership of evaluation findings/report
PEPFAR: Final report authored by the convened committee and issued by the IOM; the content of the report belongs to the IOM.
Global Fund: Final reports for each of the three study areas, as well as a synthesis report, were authored by the independent contractors and released by Macro International. The TERG released a summary for the synthesis report and for each of the three study areas; in the study area summaries, the TERG provided an "assessment" of and comments on each of the contractor's recommendations.
AMFm: All reports authored by the IE team; the content of the reports belongs to the IE team.
PMI: Final report authored by the Evaluation Team; the content of the report belongs to the Evaluation Team.

Other notable aspects of the Terms of Reference (Any other issues not mentioned above)
PEPFAR: The congressional mandate for the study afforded limited ability to negotiate terms, which led to the need for communication and explanation to clarify appropriate expectations with respect to complying with specific terms of the mandate.
Global Fund: The TERG emphasized that the contractor should ensure a clear focus on capacity building in the conduct of evaluations under SA 3: not necessarily building or strengthening systems, but mobilizing dormant capacity at the country level through the contractor and the in-country Impact Evaluation Task Force.
AMFm: The original terms of reference called for household surveys to measure antimalarial use, but these were not included in the final evaluation design because of budget implications. Instead, data from national surveys that were appropriately timed (close to baseline and endline) were to be used to report on use. In the end, such surveys were available for only five of the pilots, which did not include the two "fastest-moving" pilots, Ghana and Kenya.
PMI: None noted.

EVALUATION APPROACH

Evaluability assessments (What assessments were conducted before the evaluation was conducted? For example, exploration of the program's objectives, activities, and theory of change; data mapping; assessment of available/accessible data; assessment of the feasibility of data collection; and evaluation design development, including analysis of methodological issues and assessment of how evaluation findings can improve the program or its objectives)
PEPFAR: A two-year design and operational planning phase prior to the evaluation implementation phase included

•   Clarification and interpretation of the evaluation mandate

•   Research on PEPFAR's legislative and programmatic intent and objectives, program design, complexity, and operational structure

•   Design planning to choose the conceptual framework, the methods to be applied, and the analytical plan; develop in-depth evaluation questions; and map data availability, accessibility, and quality

Evaluability assessments continued, and adjustments were made, during the implementation phase as needed.
Global Fund: The TERG nominated an Impact Evaluation Task Force in each of the countries that were to participate. The TERG developed the scope of work, study design, and research questions for the evaluation and, upon approval by the Global Fund Board, identified the independent consultant to carry out evaluation activities after a competitive bidding process.
AMFm: An AMFm Phase 1 Monitoring and Evaluation Technical Framework was developed by the Global Fund Secretariat with input from a broad range of stakeholders, including the TERG, the AMFm Ad Hoc Committee, AMFm Phase 1 country officials, and technical partners; it included elements of the proposed independent evaluation and data mapping for national-level surveys planned to be implemented by others. The data collection and analysis methods for the outlet surveys that had been developed by the ACTwatch project were adapted for use in the IE.
PMI: On an annual basis, the PMI team compiles all routine data, together with any survey data or data from other nonroutine sources (resistance monitoring, special studies, etc.), into an annual report. This report provides a summary of the various data collected during the year, examines progress and trends, and points out areas for improvements or enhancements in the coming year.

Overall evaluation approach (For example, quasi-experimental; theory of change; case study)
PEPFAR: Retrospective, quasi-cross-sectional, time-trend, nonexperimental, mixed methods approach using a program impact pathway logic model framework; hybrid case-study approach by country and by topic area; benchmarking analysis for legislative, policy, and other programmatic objectives.
Global Fund: Retrospective, nonexperimental, mixed methods; separate examination of the different study areas; country case studies employing interviews with stakeholders, program managers, and Global Fund staff, as well as household and facility surveys and health information system record reviews in the areas of HIV/AIDS, TB, and malaria.
AMFm: Nonexperimental design with pre-test and post-test assessment, with each country treated independently as a case study. Nationally representative outlet surveys were conducted at baseline and endline. Context and process factors were assessed using key informants and document review. A theory of change was used to interpret and attribute changes, based on the logic model together with context and process factors. Additional studies used outlet surveys to explore QAACT price, availability, and market share in remote areas of two countries (Ghana and Kenya) and qualitative methods to study understanding of the AMFm logo (in four countries).
PMI: Mixed methods approach using document review, key-informant interviews, field visits, electronic surveys, and trend analysis of publicly available and relevant datasets.

Sampling used (How were choices made about the program components to evaluate? How was sampling done for data collection/analysis?)
PEPFAR: Program areas evaluated were determined by mapping closely to the Statement of Task and the initiative's predetermined technical areas. Comprehensive sample for most financial data (all PEPFAR countries); existing sample for program monitoring data (PEPFAR's preexisting selection of countries writing country operational plans); existing sample for clinical data (PEPFAR's preexisting selection of Track 1.0 implementing partners); purposeful sampling for in-depth qualitative data collection (interviews and some document review); systematic literature reviews for some document review.
Global Fund: Program components: priority evaluation questions for the Five-Year Evaluation were discussed, reviewed, and refined by the TERG; input was sought from the Board of the Global Fund and through an extensive stakeholder consultation. Sampling for data collection/analysis: purposive sampling of countries for the evaluation of impact (SA 3) (n=20) and for SA 2 (n=16), considering the following criteria: (1) regional and disease balance, (2) availability of existing impact and baseline data, (3) magnitude of Global Fund disbursement, (4) duration of programming, and (5) opportunities for partner harmonization.
AMFm: Outlet survey sample sizes were estimated for each country to be able to detect a 20 percentage point change between baseline and endline in QAACT availability, separately for rural and urban domains, pooling across outlet types and sectors (an illustrative sample-size sketch follows this entry). The ACTwatch cluster sampling approach was used; a full census was undertaken in Zanzibar. Key informants were selected purposefully.
PMI: Core objectives were evaluated based on the PMI evaluation framework: Leadership, Management, and Resources (Objective 1); Putting Core Operating Principles into Practice (Objective 2); Wider Partnership Environment (Objective 3); Assess Program Outcomes and Impacts (Objective 4); Assess Operational Research Activities (Objective 5); and Make Actionable Recommendations (Objective 6). Purposeful sampling was used for key informants; systematic literature reviews were used for document review. Countries for field visits were selected by the evaluand from among countries with at least two malaria survey data points (Malaria Indicator Survey or DHS).

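As a rough illustration of the kind of power calculation behind the AMFm outlet survey samples (not the evaluators' actual computation), the sketch below applies the standard two-proportion sample-size formula for detecting a 20 percentage point change, inflated by a design effect for cluster sampling. The baseline availability level (20 percent), power (80 percent), significance level (5 percent), and design effect (2.0) are illustrative assumptions, not figures from the AMFm evaluation.

    # Illustrative sample-size sketch for a baseline-vs-endline outlet survey.
    # Only the 20-percentage-point detectable change comes from the text; every
    # other input (baseline level, power, alpha, design effect) is assumed.

    def outlets_per_domain(p1: float, delta: float, deff: float = 2.0,
                           z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
        """Outlets needed per survey round in one domain (e.g., urban or rural)."""
        p2 = p1 + delta
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        n_srs = ((z_alpha + z_beta) ** 2) * variance / delta ** 2  # simple random sample
        return round(n_srs * deff)  # inflate for the cluster design

    print(outlets_per_domain(p1=0.20, delta=0.20))  # about 157 outlets per domain per round

In practice, country-specific baseline levels, design effects, and anticipated response rates would change the result, which is one reason the evaluation estimated sample sizes separately for each country.
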
Data types/Data sources
PEPFAR: Financial data; routine program monitoring data; program clinical data; semi-structured interview data; document review; globally reported indicator data.
Global Fund: Study Area 1: interviews, an organizational development (OD) assessment of Global Fund governance and management, performance review, benchmarking of results and processes, and document review. Study Area 2: in-depth qualitative assessments (850 interviews), extensive literature review, and in-depth review and analysis of performance data on Global Fund grants. Study Area 3: national health accounts, district facility censuses, household surveys, civil society organization surveys, record reviews, and follow-up studies of patients.
AMFm: Outlet surveys; key informant interviews and document review; focus group discussions; secondary analysis of household survey data.
PMI: Routine program monitoring data; semi-structured interview data; document review; globally reported malaria data.

Data analysis methods and triangulation for mixed methods
PEPFAR: Analysis and interpretation within each data type appropriate to methodological standards; iterative interpretation and triangulation within and across data types, sources, and investigators.
Global Fund: Study Area 3 used a stepwise approach to examine trends in health outcomes, coverage, and risk behaviors; access to and quality of services; and funding. Secondary analysis of existing data and record reviews were conducted in 18 countries, and new data collection was carried out in 8 of these countries.
AMFm: Analysis followed a predetermined tabulation plan and adjusted for the survey design (see the illustrative sketch following this entry). A theory of change developed for AMFm was used to integrate findings across the outlet surveys and the context/process documentation.
PMI: Iterative interpretation and triangulation within and across data types, sources, and investigators.

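"Adjusted for the survey design" typically means that point estimates are weighted by the outlets' sampling probabilities and that uncertainty is computed in a way that respects the cluster sampling. The minimal sketch below illustrates that idea with invented data and a crude between-cluster standard error; it is not the evaluation's actual tabulation plan or estimator.

    # Illustrative only: weighted point estimate plus a crude cluster-level
    # standard error. The outlet data, weights, and cluster IDs are invented.
    import numpy as np

    def weighted_share(values, weights):
        """Weighted proportion (e.g., share of outlets stocking a QAACT)."""
        return float(np.average(values, weights=weights))

    def clustered_se(values, weights, clusters):
        """Between-cluster (PSU-level) standard error of the weighted proportion."""
        ids = np.unique(clusters)
        means = np.array([weighted_share(values[clusters == c], weights[clusters == c])
                          for c in ids])
        return float(means.std(ddof=1) / np.sqrt(len(ids)))

    stocked  = np.array([1, 0, 1, 1, 0, 1])               # 1 = outlet stocks a QAACT
    weights  = np.array([1.5, 1.5, 0.8, 0.8, 2.0, 2.0])   # inverse-probability weights
    clusters = np.array([1, 1, 2, 2, 3, 3])               # primary sampling units
    print(weighted_share(stocked, weights), clustered_se(stocked, weights, clusters))

Design-based analyses of this kind are usually run in dedicated survey software (for example, Stata's svy prefix or R's survey package) rather than coded by hand.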
 
DISSEMINATION AND UPTAKE OF EVALUATION FINDINGS, CONCLUSIONS, AND RECOMMENDATIONS
Report materials and other communications products beyond the required reports
PEPFAR: Report summary (15 pages); report summary booklet with chapter main messages (~100 pages); report brief (4 pages); online interactive experiences.
Global Fund: Summaries of each study area; each country participating in the impact study (SA 3) produced its own evaluation report (n=18).
AMFm: Main report (403 pages) and appendixes (287 pages), which together include a short summary (5 pages) and a longer executive summary (45 pages); supplementary report on antimalarial use, based on household survey data (68 pages); Lancet paper (Tougher et al.), published November 2012.
PMI: N/A

Dissemination activities
PEPFAR: Briefings with congressional staff, OGAC, and USG agencies; public briefing events (all webcast); presentations at conferences and meetings; published commentaries.
Global Fund: Local presentation of results of the country-level impact evaluation, particularly the household and facility survey and record review results; summary reports by the Global Fund and posting of reports on the Global Fund website; TERG briefings of the Global Fund Board.
AMFm: Feedback to country programs (June 2012); presentations to the AMFm Ad Hoc Committee, the AMFm Working Group, and the MDAG (July–October 2012) in the lead-up to the November 2012 Board meeting; presentations at the International Health Economics Association, the American Society of Tropical Medicine and Hygiene (ASTMH), the Multilateral Initiative on Malaria (MIM), and DFID.
PMI: Briefings with congressional staff (public event co-hosted by PATH) and USG agencies; online dissemination of the final report.

Responses to evaluation
PEPFAR: PEPFAR prepared an internal written response to the recommendations. Bidirectional exchange between the IOM and implementing agencies during technical briefings about specific considerations for recommendations and agency implementation. Commentaries in journals and blog postings authored by stakeholders external to the IOM.
Global Fund: TERG summaries of the final study area reports. The Global Fund Secretariat implemented provisions to address some observations and recommendations.
AMFm: Written feedback received from the TERG and from the AMFm Working Group. Commentaries in scientific journals and media coverage.
PMI: The PMI generated a management report identifying which recommendations it wished to act upon and describing how it would implement the proposed changes. The report was highlighted on the PMI website and broadly disseminated electronically.

Mechanisms for tracking implementation of recommendations or impact of the evaluation
PEPFAR: The IOM has limited formal and resourced internal mechanisms for tracking recommendation implementation and other report impact.
Global Fund: Unknown.
AMFm: No formal mechanism for tracking the impact of the evaluation.
PMI: Beyond the scope of the Evaluation Team.

OTHER

What other information should be known that influenced or informed the evaluation?
PEPFAR: Not a financial audit or an evaluation of the operational structure of PEPFAR or its placement within the USG. Not an evaluation of the U.S. contribution to the Global Fund or of programming and funding to NIH. The committee interpreted additional areas of study not explicitly stated in the mandate but determined to be needed to be responsive to the charge (i.e., funding, knowledge management, country ownership, and sustainability).
Global Fund: There was early realization that it would be impossible to separate the potential impacts of the different funding/donor mechanisms on disease burden.
AMFm: The implementation period was short (6.5–15.5 months from the first arrival of copaid drugs), and some countries had not commenced supporting interventions at the time of endline data collection. Preliminary findings were presented to a consultative forum of country stakeholders for review and debate.
PMI: The Evaluation Team was asked to avoid comparing the PEPFAR mechanisms with the PMI approach. The Evaluation Team had very little country-level financial data and was constrained by the lack of multiple time data points for assessing country trends and impacts.

SOURCE: Information compiled from evaluation summary documents and members of the respective evaluation teams.

