Appendix C

Evaluation Methods

OVERVIEW

The committee’s evaluation employed a mix of methods and layers of investigation and analysis involving a range of primary and secondary data sources, taking into account the methodological design considerations described in Chapter 2. This included mapping of investments using financial data, assessing trends over time using program monitoring indicators and clinical data from the Office of the U.S. Global AIDS Coordinator (OGAC) and PEPFAR implementing partners, benchmarking progress against stated programmatic targets and goals, reviewing extensive documents, and analyzing primary data collected through more than 400 semi-structured interviews with a range of stakeholders on visits to 13 PEPFAR partner countries, at the U.S. headquarters (HQ) of PEPFAR, and at other institutions and multilateral agencies.

Primary and secondary data were analyzed, using appropriate methodologies, by the members of the evaluation committee, the Institute of Medicine (IOM) study staff, and consultants with specialized knowledge in qualitative and quantitative methodologies. The contracted consultant for quantitative methodologies was Statistics Collaborative, Inc. (SCI), a biostatistical firm in Washington, DC; the contracted consultant for qualitative methodologies was Dr. Sharon Knight in Greenville, North Carolina. The committee, IOM staff, and consultants took steps to assess and ensure the quality and completeness of the data used for the evaluation, and took these factors into account during data interpretation.



The methods used to ensure the quality of the primary data collected and the secondary data received through data requests are described in more detail in the sections that follow. When externally analyzed data were used, the committee, IOM staff, and consultants reviewed and assessed the quality of the data and the methodologies used.

As described in Chapter 2, the mandate of the committee was to draw conclusions and make recommendations across the whole of the PEPFAR initiative. Wherever possible, data were gathered and data analyses and interpretation were conducted and presented across all 31 of the PEPFAR partner countries that were the focus of the evaluation;1 however, only very limited data were comparable and comprehensive across all countries. In order not to limit the committee's findings to data consistently available across the whole of the program and all of these countries, which would have been a significant constraint, the evaluation drew on those subsets of countries, programmatic areas, or intervention components implemented within PEPFAR for which sufficient data could be gathered to contribute to the assessment. Data presentations and analyses representing these subsets were therefore interpreted with care to inform conclusions about the whole of the program. For example, analysis of country visit interview data was limited to the countries selected for visits by the committee. In addition, some analyses drew on existing data sources that were available only for some countries, programs, and partners, such as Track 1.0 partner data. Some evaluation questions were applicable only for a subset of countries, such as countries with concentrated epidemics driven by injecting drug use.

1 To represent the greatest intensity of PEPFAR's investment, the scope of this evaluation was defined to focus on the 31 partner countries submitting an annual Country Operational Plan (COP) at the time of the initiation of the planning phase for this evaluation in 2009. They include the original 15 focus countries (Botswana, Republic of Côte d'Ivoire, Federal Democratic Republic of Ethiopia, Cooperative Republic of Guyana, Republic of Haiti, Republic of Kenya, Republic of Mozambique, Republic of Namibia, Federal Republic of Nigeria, Republic of Rwanda, Republic of South Africa, United Republic of Tanzania, Republic of Uganda, Socialist Republic of Vietnam, and Republic of Zambia), as well as the following additional countries: Republic of Angola, Kingdom of Cambodia, People's Republic of China, Democratic Republic of the Congo, Dominican Republic, Republic of Ghana, Republic of India, Republic of Indonesia, Kingdom of Lesotho, Republic of Malawi, Russian Federation, Republic of the Sudan, Kingdom of Swaziland, Kingdom of Thailand, Ukraine, and the Republic of Zimbabwe.

Finally, the time and resources available limited the scope of some analyses, such as those involving review of Country Operational Plans (COPs), for which the sheer volume of the documents over all countries and years limited the feasibility of comprehensive review across all countries. Throughout the report, where data analyses that do not represent the whole of the program are presented, the scope of these data is described. Because the committee was not charged to draw conclusions or make recommendations at the level of specific countries, partners, or programs, analyses of data from subsets of countries or partners are presented in a manner designed to maintain anonymity.

By applying this mix of methods and layers of investigation and analysis using a range of available primary and secondary data sources, the committee arrived at findings that could be triangulated to draw conclusions about the performance and impact of PEPFAR, even when no single data source was sufficient and no single methodological approach was feasible. Building on the interpretation of the available data, the conclusions and recommendations presented in this report represent the consensus reached through the deliberations of the evaluation committee. Over the course of the evaluation, the full committee met six times in person, with participation of the IOM staff and consultants. One additional meeting was conducted using Web-based conferencing. In addition, working groups within the committee that were focused on specific content areas held additional meetings by teleconference as needed for ongoing deliberations as well as for data analysis and interpretation. These committee activities were augmented by ongoing communications via telephone and e-mail among the committee members, staff, and consultants.

The following sections describe some of the overarching processes that the committee used to frame and shape the evaluation. Subsequent, more detailed sections describe the methods for each of the data sources used in the evaluation.

Development of Evaluation Questions and Mapping of Data Sources

Through working groups consisting of subsets of committee members, the evaluation committee identified proposed evaluation questions based on major content areas, the statement of task (see Appendix A), the Program Impact Pathway (PIP) framework (see Chapter 2), and the preliminary work reported in the Strategic Approach (IOM and NRC, 2010).
Once the working groups established their initial questions and subquestions, IOM staff and consultants developed and provided to the committee the following information pertaining to each question:

• The domains of the PIP to which the question belonged (i.e., input, activity, output, outcome, or impact)
• The type of data necessary to answer the question (e.g., financial data; program monitoring, surveillance, and clinical data; qualitative interview data; literature and document review)
• A description of the potential data sources that had been identified
• Limitations associated with the data sources, such as issues related to availability, the feasibility of accessing the data, and any other relevant issues that could inform considerations for formulating data requests and for the utility of the data

Mapping of Potential Data Sources

The IOM staff and consultants then carried out an extensive data-mapping effort for more than 150 evaluation questions, building on the preliminary work conducted during the strategic planning and operational planning phases. The data-mapping process relied on document review, stakeholder interviews, information obtained from preliminary data requests, and information gathered during two pilot country visits. The data mapping served to assess the feasibility of collecting and using data from each source, taking into consideration the burden that data requests would place on each source's resources and staff time. In addition, the data mapping assessed whether data from each source would require new data analysis in order to answer the evaluation questions posed by the committee.

The categories of available data sources that were mapped and ultimately used for the evaluation included financial data; program monitoring, surveillance, and clinical data; qualitative interview data; and literature and document review. The sources included central OGAC data, data from multilateral organizations, data from implementing partners, and data from publicly available documents and other sources. The data sources used for the evaluation are described in more detail in subsequent sections of this appendix.

Priority Evaluation Questions

Committee members then worked with IOM staff and consultants to finalize a set of priority evaluation questions based on relevance to the statement of task and related evaluation considerations, relative importance among subquestions, and the feasibility of answering each question with the time, resources, and data available.
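As an illustration of the per-question record described above, each evaluation question could be represented together with its PIP domain, candidate data sources, and known limitations. This is a sketch only; the question text, sources, and limitation shown are hypothetical examples, not actual committee records.

```python
# Illustrative sketch: one way to record the per-question mapping.
# All field values below are hypothetical, not actual committee data.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationQuestion:
    text: str
    pip_domain: str                 # input, activity, output, outcome, or impact
    data_types: List[str] = field(default_factory=list)
    data_sources: List[str] = field(default_factory=list)
    limitations: List[str] = field(default_factory=list)
    feasible: bool = True           # judgment made during prioritization

questions = [
    EvaluationQuestion(
        text="How did planned treatment funding change over time?",
        pip_domain="input",
        data_types=["financial data"],
        data_sources=["PEPFAR Operational Plans", "COPs"],
        limitations=["regional funding not attributable to single countries"],
    ),
]

# Prioritization keeps only the questions judged answerable with the
# time, resources, and data available.
priority_questions = [q for q in questions if q.feasible]
```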
The relative contribution of data sources to different content areas and evaluation questions, and ultimately to the committee's conclusions and recommendations, varied depending on data availability and appropriateness.

Overview of Data Collection

A summary of the data request and data collection processes for each major data source is provided in the sections that follow, along with a description of the analyses for which the data were used.

Requests for interviews and requests for secondary data not readily publicly available were made by the IOM independently, with OGAC and partner country mission teams serving as liaisons only when necessary. Participation in the evaluation was voluntary. Except when reference is made to existing published materials, findings, examples, and comments are not attributed to individuals, and the identities of individuals, programs, partners, and countries are protected.

FINANCIAL DATA

Global Financial Data

To contextualize PEPFAR's financial contribution within the broader donor funding landscape for HIV/AIDS, the committee examined disbursement data on official development assistance for HIV/AIDS as reported to the Organisation for Economic Co-operation and Development (OECD) Creditor Reporting System (OECD, 2012). Disbursements represent the sum of two OECD sector codes: sexually transmitted disease (STD) control (which includes HIV/AIDS) and social mitigation of HIV/AIDS. The committee examined data for the 31 PEPFAR countries that were writing COPs when the IOM evaluation study process began in 2009.

PEPFAR Financial Data: Available, Obligated, and Outlaid

Each quarter, OGAC submits summary financial status reports to Congress on "the allocation, obligation and expenditure of funds appropriated for [PEPFAR]" (OGAC, n.d.-b, p. 1). These reports are publicly available. The committee used the fourth-quarter report from each fiscal year (FY) from 2004 through 2011 to calculate annual appropriations, obligations, and outlays for the PEPFAR program (OGAC, n.d.-b).

PEPFAR Financial Data: Annual Expenditure Data Calculated from Agency Reporting

In May 2012, in response to a committee data request, SCI received from OGAC PEPFAR funding obligations and outlays for FYs 2004 through 2011 for all countries receiving PEPFAR funding.
Upon review of the data and through clarifications with OGAC, IOM staff and consultants realized that these financial data corresponded to the cumulative amount of funding available, obligated, and outlaid from each budget year rather than the actual annual amounts. Another request was therefore made to OGAC for funding data that would clearly distinguish funding by budget year and reporting year and that would represent actual annual expenditures, regardless of the year in which the money was appropriated or obligated. In July and August 2012, SCI received from OGAC cumulative agency-specific funding for each reporting year. Annual expenditures were derived as described below.

Data Description

OGAC sent 78 Excel spreadsheets containing financial data for the six agencies that received PEPFAR funding between FY 2004 and FY 2011:

• Department of Defense (DOD)
• Department of Health and Human Services (HHS)
• Department of Labor (DOL)
• Department of State (STATE)
• Peace Corps (PC)
• U.S. Agency for International Development (USAID)

With the exception of STATE, each agency reported all of its financial information to OGAC in a consolidated format. STATE, however, reported its PEPFAR funding through five distinct offices/bureaus:

• Bureau of African Affairs (AF)
• Bureau of East Asian and Pacific Affairs (EAP)
• Bureau of Population, Refugees and Migration (PRM)
• Bureau of Western Hemisphere Affairs (WHA)
• Office of the U.S. Global AIDS Coordinator (OGAC)

Each file contained cumulative budget information on available, obligated, and outlaid funds, by country, for each FY. As discussed in Chapter 4, most PEPFAR funding does not have an annual use-or-lose requirement (i.e., unspent funding from one FY can often be carried over to be spent in subsequent years) (OGAC, 2008). Therefore, the money spent during a particular year had the potential to come from the budgets of multiple prior FYs.

Consolidating Data into Consistent Files

Each agency provided funding to a different group of PEPFAR countries, many of which received funding from more than one agency. Therefore, each agency's set of budgetary files was first consolidated into a single data file, creating a total of 10 unique datasets, one per agency or bureau. Each country's annual funding was retained within each data file to enable potential analyses requiring subgroups of PEPFAR countries based on country attributes. Next, these datasets were harmonized into a single dataset so that the data could be used together to comprehensively represent PEPFAR spending across agencies, in total and by country.

Documenting Discrepancies, Notes, and Comments

The data extraction process revealed embedded comments within spreadsheet cells and footnotes explaining data nuances; this information was recorded in a separate file. Additionally, some funding numbers changed from one reporting year to the next. Increases in funding amounts were expected over time, as more of the funding from a particular fiscal year was obligated or outlaid. Decreases, however, were not expected from one year to the next; that is, the amount of available funding from a specific fiscal year budget was not expected to decrease in subsequent reporting years. These unexpected changes in the funding data were therefore documented.

For all three of these situations (embedded comments, footnotes, or unexpected changes in funding), the following information was recorded for each instance:

• Agency/bureau: which agency's spreadsheets contained the comment, footnote, or inconsistency
• Country/region: the country or region affected by the comment, footnote, or inconsistency
• Reporting year: the reporting year with the observation
• Budget year: the year during which the budget was issued
• Comment, footnote, or inconsistency: verbatim comments and footnotes from the spreadsheet; inconsistencies were described as clearly as possible
• Detected by agency or IOM: an indicator variable reflecting whether the comment or footnote was already in the spreadsheet or whether the inconsistency was encountered by IOM staff during the data extraction process

To further assess the most notable discrepancies in the available totals by country and by year, these discrepancies were compiled in a separate spreadsheet and compared from the inception year through 2011.
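The year-over-year consistency check described above (the cumulative "available" amount reported for a given budget year should not decrease in later reporting years) can be sketched as follows. All dollar figures below are hypothetical, not actual PEPFAR data.

```python
# Sketch of the consistency check: flag reporting years in which the
# cumulative "available" amount for one budget year decreased.
# Figures are hypothetical.

def flag_decreases(series):
    """Return (earlier, later) reporting-year pairs where the amount fell."""
    years = sorted(series)
    return [(prev, curr)
            for prev, curr in zip(years, years[1:])
            if series[curr] < series[prev]]

# Cumulative available funds for one agency's 2005 budget year,
# as reported in successive reporting years.
reported = {2005: 100_000, 2006: 120_000, 2007: 95_000}
print(flag_decreases(reported))  # [(2006, 2007)]
```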
In particular, major discrepancies occurred when the dollar amounts reported as available for a given budget year changed (both increases and decreases were observed) in subsequent reporting years, although one would expect the amount to be a fixed constant for a budget year after the year in which it was made available. These discrepancies ranged in magnitude, with a maximum difference of $214 million between two reports for one budget year for one agency. As a result, it was difficult to assess which figures for the total amounts made available were correct. Overall, the number of discrepancies and the magnitude of changes from year to year diminished in later reporting years, and the same degree of discrepancy was not seen in the reporting of outlays.

Calculation of Annual Expenditures

Once the funding data were completely extracted into a single data file, serial subtractions of each reporting year's cumulative outlay data were performed to obtain the amount of money actually spent (outlaid) during each reporting year, regardless of the fiscal year's budget from which the money came. To obtain the annual expenditure for a given FY, all prior-year outlays were subtracted from the cumulative total outlays reported for that year. Given the data discrepancies described above, the data for all of the FYs were taken from the FY 2011 reports in order to have one consistent source reflecting the most recently available data.

Quality Control

When all of the data had been extracted into consistent data files, SCI compared the extracted data files against the raw data files sent from OGAC. The validator worked with the original data extractor and reconciled all inconsistencies uncovered within the extracted data files. This independent validator also verified the serial equations used to calculate the amount of funding spent during each reporting year. The validated datasets were not reconfirmed with OGAC.

Data Presentation

Once all of the data had been validated, SCI imported the data into the analytic software SAS® version 9.3,2 which it used to generate financial presentations of the annual expenditure over time. These presentations were provided to the committee in November 2012.

PEPFAR Financial Data: Planned/Approved Funding for All PEPFAR Countries

Planned/approved funding reflects how OGAC and PEPFAR mission teams plan to obligate and outlay funds.
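The serial subtraction described above under "Calculation of Annual Expenditures" can be sketched as follows: the amount outlaid in a reporting year is the difference between successive cumulative totals. The figures below are hypothetical, in millions of dollars, not actual PEPFAR amounts.

```python
# Sketch of the serial-subtraction step: derive per-year outlays from
# cumulative totals keyed by reporting year. Figures are hypothetical.

def annual_outlays(cumulative):
    """Return annual outlays derived from cumulative outlays by year."""
    years = sorted(cumulative)
    annual = {years[0]: cumulative[years[0]]}
    for prev, curr in zip(years, years[1:]):
        # Outlay in `curr` = cumulative through `curr` minus cumulative
        # through the previous reporting year.
        annual[curr] = cumulative[curr] - cumulative[prev]
    return annual

cumulative_outlays = {2004: 50, 2005: 130, 2006: 240}
print(annual_outlays(cumulative_outlays))  # {2004: 50, 2005: 80, 2006: 110}
```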
2 SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration.

Each year, OGAC releases an operational plan for PEPFAR that includes summary budget information regarding the planned and approved use of PEPFAR funding, including which activities will be implemented by which agencies, as determined during the interagency planning process. These PEPFAR Operational Plans report planned/approved funding for four technical areas that correspond to the primary categories of HIV/AIDS services and systems strengthening efforts: Prevention, Care, Treatment, and Other. PEPFAR funding is planned through budget codes, which capture funding information about more specific activities within these categories (OGAC, 2011c).

Data Extraction Process

The planned/approved funding was extracted from the PEPFAR Operational Plans by year and by budget code (OGAC, 2005a, 2006c, 2007b, 2008, 2010, 2011b,c). Planned/approved funding data were extracted independently by two IOM staff into identically formatted spreadsheets; records were then compared across the datasets to identify and correct inconsistent values. Each staff member extracted data on total PEPFAR funding by implementing agency and year, as well as total PEPFAR funding by budget code and year.

Data Presentations

The data extraction was validated, and SCI converted the data into constant 2010 USD to allow for a consistent interpretation of funding over time. The consultants then generated a final dataset to be used for data presentations showing funding by agency, type of program, and budget code.

PEPFAR Financial Data: Planned/Approved Funding for the Subset of 31 Countries

Data on planned/approved funding for the subset of 31 countries that were the focus of this evaluation were gathered through a separate data extraction. These data were used for the committee's analysis of funding by country characteristic.

Data Extraction Process

The planned/approved funding was extracted according to the following classifications:

• By country (31 countries in total)
• By year (FY 2005 through FY 2011, for the years during which a given country was completing a COP)

This funding information was extracted from the following publicly available data sources, which were determined to be the most comprehensive across the classifications for the data extraction (OGAC, n.d.-a,c):

• FY 2005 through FY 2007 (focus countries only): PEPFAR Operational Plans
• FY 2008: PEPFAR Operational Plans for focus countries; individual COPs for non-focus countries
• FY 2009 (all countries): individual COPs
• FY 2010 and FY 2011 (all countries): PEPFAR Operational Plans

SCI developed specifications for the variables necessary for the PEPFAR financial data extraction process and developed dataset specifications for two separate extraction processes. The first data extraction compiled annual, country-specific funding by agency; the second compiled annual, country-specific funding by technical area and budget code. Data were not extracted by both agency and technical area simultaneously, but rather either by agency or by technical area. During the extraction process, any funding corresponding to regions (e.g., Central America, Central Asia, and the Caribbean) was omitted, and the process was limited to the 31 countries that were preparing COPs at the time this evaluation was initiated. Funding amounts were rounded to the nearest whole dollar. During the extraction by technical area, some budget codes switched from one technical area to another across reporting years; these differences were tracked in an effort to make consistent comparisons over time.

Data Extraction Quality Control

Two SCI consultants extracted the data independently into comparably formatted spreadsheets. Each consultant produced a spreadsheet of funding data by year, country, and agency, as well as a second spreadsheet by year, country, technical area, and budget code.
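A double-entry extraction of this kind is typically validated by comparing the two spreadsheets record by record and flagging every mismatch for reconciliation. A minimal sketch follows; the country, budget codes, and amounts are hypothetical.

```python
# Sketch of double-entry validation: two independently extracted
# datasets, keyed by (country, fiscal year, budget code), are compared
# record by record. Keys and amounts below are hypothetical.

def flag_mismatches(a, b):
    """Return keys missing from either dataset or carrying different values."""
    return sorted(key for key in set(a) | set(b) if a.get(key) != b.get(key))

extraction_1 = {("Kenya", 2009, "HTXS"): 2_500_000,
                ("Kenya", 2009, "HVAB"): 1_000_000}
extraction_2 = {("Kenya", 2009, "HTXS"): 2_400_000,   # amounts differ
                ("Kenya", 2009, "HVAB"): 1_000_000}

print(flag_mismatches(extraction_1, extraction_2))  # [('Kenya', 2009, 'HTXS')]
```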
Once all of the data had been extracted across all budget years, one of the consultants developed a tool to compare individual records across the two datasets and to flag inconsistent values. This comparison tool flagged every record with inconsistent information, whether the inconsistency lay in how the extractors recorded a particular budget code or in differing budget amounts. Together, the consultants then reconciled the inconsistent records. Once their datasets matched 100 percent, a third, independent SCI consultant imported the data into SAS software and used a random number generator to select 50 of the 1,302 records (about 4 percent) summarizing the financial information by agency and 80 of the 4,123 records (about 2 percent) summarizing the financial information by budget code; the consultant then crosschecked these 130 values against the information written in the PEPFAR Operational Plans and COPs. Some of these records corresponded to countries, agencies, or budget codes that were not specified during a particular year; this selection of records therefore also confirmed that particular combinations of years, countries, agencies, and budget codes had not been inadvertently incorporated into the datasets. All 130 validation records matched the operational plans exactly, thus confirming the quality of the data extraction process.

Data Presentations

Once the validation process was complete, SCI generated a final dataset to be used, along with publicly available data from global sources, for data presentations showing PEPFAR funding by HIV prevalence, average funding per person living with HIV, and country income level.

Planned/Approved Funding by Prime Partner

The committee examined planned/approved funding data extracted from a range of publicly available data sources. The process of extracting and compiling these data was time intensive, so to be feasible within the resources and time available for the study, the committee's analysis had to be limited to a subset of partner countries. The committee chose to compile these data for the same 13 countries purposefully selected for country visits, as described later in this appendix. Within this subset of countries, the committee was able to compare partner data and planned/approved PEPFAR funding for the focus countries for FY 2004 through FY 2010 and for the non-focus countries for FY 2008 through FY 2010.
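The random spot-check described above under "Data Extraction Quality Control" can be sketched as drawing a fixed-size random sample of record indices from each extracted dataset for manual crosschecking against the source documents. The seed below is arbitrary; only the record counts mirror those reported in the text.

```python
# Sketch of the random spot-check: sample record indices from each
# dataset for manual validation. The seed is arbitrary.
import random

rng = random.Random(0)
agency_sample = rng.sample(range(1_302), 50)   # records by agency
code_sample = rng.sample(range(4_123), 80)     # records by budget code

records_to_validate = len(agency_sample) + len(code_sample)
print(records_to_validate)  # 130
```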
Data Extraction

Data were extracted according to the following characteristics:

• By country
• By year
• By prime partner

This funding information was extracted from the data sources determined to be the most comprehensive available across the classifications for the data extraction. For FY 2004 to FY 2006, the prime partner funding data were extracted from a Center for

Each site visit included an interview; some of these were in-depth, open-ended interviews, while others were informal interviews conducted during walking tours of the sites. At least one designated delegation team member visiting the site took handwritten field notes during the visit; these notes were reviewed and reconciled by team members using the same procedures as interview notes, discussed below.

The team conducted the majority of interviews in English. For interviewees who preferred a language other than English, the delegation team hired professional interpreters from the partner country and oriented them to the purpose and process of qualitative data collection and to their role in that process.

Participant validation of data summaries

A commitment to anonymity and confidentiality and a focus on cross-country data reporting precluded the sharing of country-specific findings with interviewees and their agencies or organizations during or after individual partner country visits. Interviewees were able to assess the scope and content of their key messages, however, in response to an end-of-interview summary of key messages that the co-facilitator offered at the conclusion of every interview. Following this summary, co-facilitators explicitly invited interviewees to convey any additions, corrections, or further information they wished to offer. Thus, all interviewees had an opportunity to affirm, modify, or extend their key messages, a process that not only affirmed that their viewpoints had been clearly understood and documented by the interview team but also verified the accuracy and completeness of the key messages shared with the team.
Researcher reflexivity Because delegation team members served as “in- struments” of qualitative data collection, they were aware of the need to be reflexive and have a “simultaneous awareness of self and other and of the interplay between the two” (Rossman and Rallis, 2012, p. 10). In other words, engagement in reflexivity facilitated individuals’ emergent self-awareness of personal predilections, assumptions, biases, and beliefs so that each individual could potentially recognize and thus minimize her or his impact on interviewees and the research environment as well as the impact of the research environment on them (Patton, 2002). Team leaders and consultants urged all team members to engage in reflection and reflexiv- ity throughout the evaluation by using at least one of two primary strate- gies: maintaining a private reflective and reflexivity journal or engaging in verbal reflexivity during any of the interview or team debriefings. Members of the evaluation team frequently, openly, and voluntarily shared their self- awareness of personal assumptions, biases, and beliefs verbally during one or more of the multiple peer debriefings and synthesis processes associated

OCR for page 741
APPENDIX C 783 with data collection. At times, peers encouraged a team member to be reflexive when that individual’s personal assumptions or biases emerged during discussions and debriefings related to the evaluation. During discus- sions, it was not unusual for team members to reference a personal need for self-reflexivity regarding some topic. Thus, the need for all investigators to become increasingly self-aware about their personal beliefs, assumptions, values, and biases that could impact the research or the research environ- ment and vice versa was frequently reinforced during each country visit. Audit trail Maintaining an audit trail served as a means for the evaluation team to establish study credibility and confirmability (Wolf, 2003). Evalu- ation team members were charged with organizing and maintaining vari- ous electronic and hardcopy audit trail evidence related to the evaluation. Evidentiary documents related to the process of the evaluation included • An agenda log maintained electronically for each country visit chronicled interview scheduling and contact information, evalu- ation-related contacts, and information on the participants and questions covered in completed interviews. • An activity log maintained electronically throughout the evalua- tion process chronicled process and methodological decisions and action items both within and across country visits. • Analysis and interpretation notations were indicated on flip chart paper and electronic notes during facilitated team debriefings and the mid-week and exit synthesis process. When evaluation team members recounted interviewees’ viewpoints and experiences re- lated to evaluation topics, they not only reported the content of interviewees’ perspectives (“what they said”) gleaned during in- terviews, but also differentiated interviewees’ narratives from how they as team members interpreted what interviewees shared with them. 
Team members also discussed emerging linkages among participants’ interview data and other data, such as documents and observations.
• A codebook, initially developed and then revised based on evaluation topics and, to a lesser degree, on data that emerged from the interviews and site visits. The codebook fostered team members’ ability to consistently label or code segments of the narrative data.

Evaluation team debriefings The evaluation team engaged in a multistage process of data debriefings that were instrumental in verifying and communicating interview content, facilitating reflection and personal reflexivity,
and synthesizing data findings according to evaluation topic. The types and content of the peer debriefings are outlined below:

• Individual Interview Debriefings
  o Using the co-facilitator’s end-of-interview summary as a basis, interview team members documented interviewees’ key points or messages, reflected on the interview process, engaged in and acknowledged personal reflexivity, and participated in a preliminary analysis and interpretation of the data collected during the given individual in-depth interview or group interview.
• Daily or Every-2-Day Interview Debriefings
  o All delegation team members convened to share key points that emerged from the interviews of which they were a part, their perspectives about and interpretations of the data, and their personal reflections and reflexivity.
• Synthesis (End of Week 1)
  o All delegation team members engaged in a midpoint synthesis of interview findings, facilitated by the team leader and structured according to evaluation topic. To assist with the synthesis, each team member received a copy of the interview debriefings that had been conducted so far on the country visit.
  o Committee members often participated during the first week of the 2-week country visits. The synthesis process at the end of Week 1 was thus critical in eliciting committee member insights into country visit data and interpretation before they exited the country.
• Exit Synthesis (End of Week 2)
  o All delegation team members still in country engaged in an 8- to 10-hour process of verbally synthesizing the findings associated with data collection prior to exiting the country. As with the synthesis process at the end of Week 1, each team member received a copy of the interview debriefings that had been conducted.
The team leader facilitated the exit synthesis process, which was structured according to evaluation topics and included data documentation, reflection and reflexivity regarding the data collected, and verbal analysis and interpretation notations.
• Across-Country Debriefings and Discussions
  o Periodically, between clusters of country visits, IOM study staff participated in a discussion and synthesis of the qualitative findings according to evaluation topic and identified consonance or differences in these findings across a number of countries.
  o At committee meetings that occurred periodically between clusters of country visits, committee members, either as a whole committee or in working groups focused on specific content areas, participated in discussions of the analysis and interpretation of interview data, including review of draft data presentations.

Accuracy of data collection Accuracy was critical in documenting the data collected for this evaluation. With participants’ permission, interviews were digitally recorded in conjunction with handwritten notes taken by members of the interview team. Professional transcriptionists ultimately transcribed the digitally recorded interviews, but the need for timeliness, efficiency, ease of comprehension, and engagement in data analysis from the onset of data collection led the evaluation team to rely on their own typed transcription of handwritten interview notes as the primary source of interview data for analysis. To ensure the completeness and accuracy of these interview notes, interview team members engaged in an independent, detailed review of the note-taker’s transcribed handwritten notes. This process involved an initial draft by the assigned note taker, a review by another team member who participated in the interview, and a final resolution round by the original note taker.

During the end-of-interview summary provided by the co-facilitator (or the facilitator when there was no co-facilitator), interviewees addressed the accuracy of the main end-of-interview points that the co-facilitator shared with them by affirming, correcting, or adding to the summary. In addition, the interview team debriefed each interview shortly after it occurred to affirm the accurate documentation of main points, using the co-facilitator’s potentially revised summary as a foundation and contributing additional details.
As an additional accuracy check, team members could reference the digital recording of an interview when clarifying segments of narrative or resolving disagreements about the content of a particular interview.

Collection of Non-Country Visit Interview Data

As part of the data collection effort for the evaluation, IOM staff and consultants also conducted a series of 32 non-country visit interviews with key stakeholders. The interviewees included the USG at PEPFAR HQ level (including OGAC, CDC, and USAID), U.S.-based implementing partners at HQ level, and other organizations that work in the global response to HIV, including multilateral organizations, NGOs, and another bilateral donor. As with the country visit interviewees, non-country visit interviewees were not only selected through purposeful sampling, but also prioritized on
the basis of targeted focus areas within the evaluation and the process of mapping data sources for evaluation questions. Semi-structured interviews were conducted using the same methodology as the country visit interviews, with interview guides whose questions and prompts were adapted as appropriate for each interview.

Analyses of Qualitative Data

In-country data analysis process In-country data review and preliminary analysis occurred at various levels and at several times during country visits, during the debriefings and synthesis discussions described previously. As soon as possible after each interview, team members conducted a post-interview debriefing to discuss and document the main points shared by the interviewee(s). Delegation members also convened routinely as an entire team during the country visit to engage in debriefings to share the main points from the data across all the interviews that were conducted.

At the close of the first week of each 2-week country visit and again at the close of the country visit, the team conducted, respectively, an end-of-week debriefing and an exit synthesis debriefing that used an inductive analysis approach to identify dominant themes emerging from the data. Both processes began with team members individually reading the debriefing notes from interviews conducted during the week to review key data and to identify emerging concepts and themes. Delegation members then collectively discussed the data and dominant themes that arose from the interviews, systematically using categories pre-selected based on the evaluation objectives. The delegation team differentiated between evidence, the responses heard during the interviews, and analysis and interpretation, which reflected the delegation’s interpretations of what the evidence meant, focusing on its meaning in relation to the evaluation objectives.
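The structure of an exit synthesis, grouping points by pre-selected evaluation category while keeping interviewees’ evidence separate from the team’s analysis and interpretation, can be sketched in data terms. The categories and note entries below are invented for illustration; they do not reproduce any actual evaluation data.

```python
from collections import defaultdict

# Hypothetical debriefing-note entries: each point carries a pre-selected
# evaluation category and is marked as either evidence (what interviewees
# said) or interpretation (what the team took it to mean).
notes = [
    {"category": "treatment", "kind": "evidence",
     "point": "clinics reported longer waiting times"},
    {"category": "treatment", "kind": "interpretation",
     "point": "demand may be outpacing site capacity"},
    {"category": "funding", "kind": "evidence",
     "point": "budget flat for two years"},
]

def synthesize(notes):
    """Group points by category, separating evidence from interpretation."""
    out = defaultdict(lambda: {"evidence": [], "interpretation": []})
    for n in notes:
        out[n["category"]][n["kind"]].append(n["point"])
    return dict(out)

synthesis = synthesize(notes)
print(sorted(synthesis))                        # ['funding', 'treatment']
print(len(synthesis["treatment"]["evidence"]))  # 1
```

Keeping the two kinds of entries in parallel lists mirrors the delegation’s practice of never letting interpretation overwrite the underlying responses.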
The output from these processes was an exit synthesis document capturing the key evidence, analysis, and interpretation from the interviews grouped by evaluation category, and a key messages document capturing the main themes that emerged across the interviews.

These documents were then included as part of the country visit summary, which was reviewed by the members of the trip delegation and then posted on the committee portal. The country visit summary also included other information provided to the delegation in advance of the trip in the form of a country brief, including background research on the country context and the PEPFAR program as well as basic financial data and OGAC and other programmatic or indicator data (including UNAIDS data). The country visit summary was a compilation of the data from these multiple
sources, not a triangulated analysis of the data and evidence available for each country. The goal was to provide a “snapshot” overview to inform the rest of the committee about the visit and the country and to provide a centralized source for country data.

Synthesis of exit syntheses To provide the committee with a sense of the overall findings emerging from the interview data, for some of the evaluation categories the IOM staff and consultant conducted a synthesis to identify and present the dominant themes that emerged in the exit syntheses across countries. This synthesis was conducted and presented in a variety of ways, ranging from an analytical synthesis presented in narrative form to data grouped in bulleted form by sub-themes, which offered less synthesis and analysis but was closer to the “raw” data.

Additional analysis of interview data Additional data summaries, syntheses, and analyses from both the country visit and non-country visit interview data were generated using the methods detailed below.

Members of the IOM staff used NVivo software (version 9.0) to conduct macro-level coding of the data, using detailed interview notes generated by IOM staff and consultants or transcripts generated by contracted professional transcriptionists from audio-recordings of interviews. The subset of data coded in NVivo comprised more than half of the interviews, purposefully selected for representation across countries and stakeholder types. This coding was based on a standardized project codebook, with each code reflecting important data concepts with inclusion and exclusion criteria. The data concepts represented in the codebook were based on evaluation topics identified in the evaluation planning phase (IOM and NRC, 2010), the evaluation committee’s development of priority evaluation questions, and the exit synthesis process and review of initial data collected from the pilot country visits and other early country visits.
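The extraction step, querying coded interview data for a single macro-level code or a combination of codes, can be illustrated in miniature. The actual analysis used NVivo 9.0; this is only a simplified sketch of the query logic, and the codes, citation tags, and segments below are invented.

```python
# Hypothetical coded segments: each interview segment carries the set of
# macro-level codebook codes applied to it.
coded_segments = [
    {"interview": "Country A, interview 4", "codes": {"treatment", "funding"},
     "text": "..."},
    {"interview": "Non-country visit, interview 2", "codes": {"funding"},
     "text": "..."},
    {"interview": "Country B, interview 9", "codes": {"treatment"},
     "text": "..."},
]

def query(segments, *codes):
    """Return segments tagged with every requested code."""
    wanted = set(codes)
    return [s for s in segments if wanted <= s["codes"]]

# Single-code query, as described in the text:
print(len(query(coded_segments, "funding")))               # 2
# Combination query across macro-level codes:
print(len(query(coded_segments, "treatment", "funding")))  # 1
```

Treating each segment’s codes as a set makes a combination query a simple subset test, which is essentially what a qualitative analysis package does when codes are intersected.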
For synthesis and analysis, these coded data were separated and extracted by querying for a single code or for combinations of the macro-level codes across interviews. In some cases, data were also extracted from the NVivo dataset using targeted word search queries.

Building on this initial thematic identification, IOM staff or Dr. Knight then conducted a more in-depth and refined analysis through repeated reading, reflection, and continued micro-level coding of the data for narrower subconcepts. This led to inductive identification of themes, patterns, and categories that emerged as findings from the data, followed by deductive confirmation and disconfirmation of those findings and determinations of data saturation for topics and themes (i.e., whether any new data had emerged). Prolonged engagement in data collection also led team members to affirm data saturation. Delegation evaluation teams recognized
data saturation, through multiple iterations of individual and group analyses and discussions described below, as the repetition of information to the point of redundancy, which indicated that data collection could reasonably be concluded (Merriam, 2002, 2009; Patton, 2002).

BOX C-1
Interview Citation Key

Country Visit Exit Synthesis Key: Country # + ES
Country Visit Interview Citation Key: Country # + Interview # + Organization Type
Non-Country Visit Interview Citation Key: “NCV” + Interview # + Organization Type

Organization Types:
United States: USG = U.S. Government; USNGO = U.S. Nongovernmental Organization (NGO); USPS = U.S. Private Sector; USACA = U.S. Academia
Partner Country: PCGOV = Partner Country Government; PCNGO = Partner Country NGO; PCPS = Partner Country Private Sector; PCACA = Partner Country Academia
Other: CCM = Country Coordinating Mechanism; ML = Multilateral Organization; OBL = Other (non-U.S. and non-Partner Country) Bilateral; OGOV = Other Government; ONGO = Other Country NGO

In the next iteration of the analytical process, drafts of data analysis outputs were read for discussion and revision by members of the project staff, consultants, and evaluation committee members who were familiar with the interview data and had participated in data collection and in-country data analyses. In addition, interview debriefing and exit synthesis documents from all interviews, including those not in the initial coded dataset, were used to carry out supplementary deductive confirmation and disconfirmation of findings that emerged from the coded data, and to identify specific additional interview notes and transcripts to enrich the analysis of the coded data.

These interview data findings and analyses were presented in a number of ways, including in narrative form with accompanying illustrative quotations, in summary tables, or in bulleted groupings by subconcepts.
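The citation tags defined in Box C-1 are regular enough to be classified mechanically. The sketch below assumes hyphen-separated components, which is an assumption about punctuation; the components themselves (country or “NCV” prefix, interview number, organization type, and the “ES” exit synthesis marker) come from the box. The example tags are invented.

```python
# Organization-type abbreviations, taken from Box C-1.
ORG_TYPES = {
    "USG": "U.S. Government", "USNGO": "U.S. NGO",
    "USPS": "U.S. Private Sector", "USACA": "U.S. Academia",
    "PCGOV": "Partner Country Government", "PCNGO": "Partner Country NGO",
    "PCPS": "Partner Country Private Sector",
    "PCACA": "Partner Country Academia",
    "CCM": "Country Coordinating Mechanism", "ML": "Multilateral Organization",
    "OBL": "Other Bilateral", "OGOV": "Other Government",
    "ONGO": "Other Country NGO",
}

def parse_citation(tag):
    """Classify a tag as an exit synthesis, country visit interview,
    or non-country visit interview citation (hyphen separators assumed)."""
    parts = tag.split("-")
    if len(parts) == 2 and parts[1] == "ES":
        return {"kind": "exit synthesis", "country": parts[0]}
    source, number, org = parts
    kind = "non-country visit" if source == "NCV" else "country visit"
    return {"kind": kind, "source": source, "interview": int(number),
            "org_type": ORG_TYPES[org]}

print(parse_citation("396-12-USG")["kind"])     # country visit
print(parse_citation("NCV-3-ML")["org_type"])   # Multilateral Organization
print(parse_citation("116-ES")["kind"])         # exit synthesis
```

A tag scheme like this trades the readability of a demographic phrase for compactness, which is why the report pairs it with the key in Box C-1.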
Quotations were presented when one person’s words provided a memorable description of an issue that resonated with multiple interviewees or perspectives, or in some cases when one person’s words represented a meaningful disconfirming perspective. For this report, single
quotation marks were used to denote an interviewee’s perspective with wording extracted from transcribed notes written during the interview, and double quotation marks were used to denote an exact quote from an interviewee, either confirmed by listening to the audio-recording of the interview or extracted from a full transcript of the audio-recording.

Interview data presented in the report are accompanied by a citation key. Interviews in qualitative research are often cited with a brief descriptive demographic phrase; however, this was not feasible for an evaluation of this scope, with more than 400 interviews and frequent citations of multiple interviews. Therefore, a citation tag was developed to allow the reader to identify the key characteristics relevant to the analysis and interpretation of the data for this evaluation, including the range of countries and interviews represented and the stakeholder type. The interview citation key is shown in Box C-1.

REFERENCES

Bendavid, E., and J. Bhattacharya. 2009. The President’s Emergency Plan for AIDS Relief in Africa: An evaluation of outcomes. Annals of Internal Medicine 150(10):688-695.
Bendavid, E., C. B. Holmes, J. Bhattacharya, and G. Miller. 2012. HIV development assistance and adult mortality in Africa. Journal of the American Medical Association 307(19):2060-2067.
Bouey, P., and J. De Leon. 2011. Interview. Washington, DC, April 27, 2011.
Central Intelligence Agency. n.d. The world factbook. https://www.cia.gov/library/publications/download (accessed June 15, 2012).
CGD (Center for Global Development). 2008. PEPFAR funding data full dataset. Washington, DC: CGD.
Creswell, J. W. 2007. Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks, CA: SAGE.
Institute for Health Metrics and Evaluation. n.d. Global health data exchange.
http://www.healthmetricsandevaluation.org/ghdx/record/adult-mortality-estimates-country-1970-2010 (accessed June 15, 2012).
International epidemiologic Databases to Evaluate AIDS. n.d. International epidemiologic databases to evaluate AIDS. http://iedea.org (accessed March 17, 2011).
IOM and NRC (Institute of Medicine and National Research Council). 2010. Strategic approach to the evaluation of programs implemented under the Tom Lantos and Henry J. Hyde U.S. Global Leadership Against HIV/AIDS, Tuberculosis, and Malaria Reauthorization Act of 2008. Washington, DC: The National Academies Press.
McCullough, R., and L. Miller. 2009. Surveying the global HIV/AIDS landscape. In From the ground up: Building comprehensive HIV/AIDS care programs in resource-limited settings. Washington, DC: Elizabeth Glaser Pediatric AIDS Foundation.
Merriam, S. B. 2002. Assessing and evaluating qualitative research. In Qualitative research in practice, edited by S. B. Merriam and Associates. San Francisco: Jossey-Bass.
Merriam, S. B. 2009. Qualitative research: A guide to design and implementation. San Francisco: Jossey-Bass.
OECD (Organisation for Economic Co-operation and Development). 2012. OECD.Stat (database). Paris: OECD.
OGAC (Office of the U.S. Global AIDS Coordinator). 2005a. Emergency Plan for AIDS Relief fiscal year 2005 operational plan: June 2005 update. Washington, DC: OGAC.
OGAC. 2005b. The President’s emergency plan for AIDS relief: Indicators, reporting requirements, and guidelines for focus countries. Washington, DC: OGAC.
OGAC. 2006a. The President’s Emergency Plan for AIDS Relief. FY 2007 supplemental COP guidance resource guide. Washington, DC: OGAC.
OGAC. 2006b. The President’s emergency plan for AIDS relief: FY 2007 country operational plan guidance. Washington, DC: OGAC.
OGAC. 2006c. The U.S. President’s Emergency Plan for AIDS Relief fiscal year 2006: Operational plan. 2006 August update. Washington, DC: OGAC.
OGAC. 2007a. The President’s emergency plan for AIDS relief: Indicators, reporting requirements, and guidelines. Indicators reference guide: FY 2007 reporting/FY 2008 planning. Washington, DC: OGAC.
OGAC. 2007b. The U.S. President’s Emergency Plan for AIDS Relief fiscal year 2007: Operational plan. 2007 June update. Washington, DC: OGAC.
OGAC. 2008. The U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) fiscal year 2008: PEPFAR operational plan. June 2008. Washington, DC: OGAC.
OGAC. 2009a. Guidance for PEPFAR partnership frameworks and partnership framework implementation plans. Version 2.0. Washington, DC: OGAC.
OGAC. 2009b. The President’s Emergency Plan for AIDS Relief: Next generation indicators reference guide. Version 1.1. Washington, DC: OGAC.
OGAC. 2010. The U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) fiscal year 2009: PEPFAR operational plan. November 2010. Washington, DC: OGAC.
OGAC. 2011a. PEPFAR response to data request from IOM for the PEPFAR II evaluation. Washington, DC: OGAC.
OGAC. 2011b. The U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) fiscal year 2010: PEPFAR operational plan. Washington, DC: OGAC.
OGAC. 2011c. The U.S.
President’s Emergency Plan for AIDS Relief (PEPFAR) fiscal year 2011: PEPFAR operational plan. Washington, DC: OGAC.
OGAC. n.d.-a. Country operational plans. http://www.pepfar.gov/countries/cop/index.htm (accessed April 9, 2012).
OGAC. n.d.-b. Obligation and outlay reports. http://www.pepfar.gov/about/c24880.htm (accessed November 29, 2012).
OGAC. n.d.-c. Operational plans. http://www.pepfar.gov/about/c19388.htm (accessed April 9, 2012).
OGAC. n.d.-d. Partners. http://www.pepfar.gov/funding/budget/partners/index.htm (accessed July 31, 2012).
Oomman, N., M. Bernstein, and S. Rosenzweig. 2008. The numbers behind the stories: PEPFAR funding for fiscal years 2005 to 2006. Washington, DC: CGD.
Patton, M. Q. 2002. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks, CA: SAGE.
Rajaratnam, J. K., J. R. Marcus, A. Levin-Rector, A. N. Chalupka, H. Wang, L. Dwyer, M. Costa, A. D. Lopez, and C. J. Murray. 2010. Worldwide mortality in men and women aged 15-59 years from 1970 to 2010: A systematic analysis. Lancet 375(9727):1704-1720.
Rossman, G. B., and S. Rallis. 2012. Learning in the field: An introduction to qualitative research. Thousand Oaks, CA: SAGE.
Sessions, M. 2006. Overview of the President’s Emergency Plan for AIDS Relief (PEPFAR). Washington, DC: CGD.
U.S. Department of Commerce. 2010. International data base population estimates and projections methodology. http://www.census.gov/population/international/data/idb/estandproj.pdf (accessed March 15, 2011).
U.S. Department of Commerce. n.d. United States census bureau: International database. http://www.census.gov/population/international/data/idb/informationGateway.php (accessed March 15, 2011).
UNAIDS (Joint United Nations Programme on HIV/AIDS). 2009. Monitoring the declaration of commitment on HIV/AIDS: Guidelines on construction of core indicators: 2010 reporting. Geneva: UNAIDS.
UNAIDS. 2011. UNAIDS AIDSinfo database. http://www.aidsinfoonline.org (accessed March 15, 2011).
UNAIDS. n.d. Data tools. http://www.unaids.org/en/dataanalysis/datatools (accessed June 15, 2012).
UNICEF (United Nations Children’s Fund). 2011. The state of the world’s children 2011: Adolescence, an age of opportunity data. http://www.unicef.org/sowc2011/statistics.php (accessed March 14, 2011).
United Nations, Department of Economic and Social Affairs. n.d. World population prospects. http://esa.un.org/wpp/Excel-Data/mortality.htm (accessed June 15, 2012).
USAID (U.S. Agency for International Development). 2011. MEASURE DHS: Demographic and health surveys. http://www.measuredhs.com (accessed March 15, 2011).
USAID. n.d. Data online for population, health, and nutrition. http://dolphn.aimglobalhealth.org (accessed March 17, 2011).
WHO (World Health Organization). n.d.-a. Global atlas of the world health workforce. http://apps.who.int/globalatlas/default.asp (accessed March 17, 2011).
WHO. n.d.-b. Global health observatory data repository. http://apps.who.int/gho/data/node.main (accessed March 15, 2011).
WHO. n.d.-c. WHO global health workforce statistics. http://www.who.int/hrh/statistics/hwfstats/en (accessed March 17, 2012).
WHO. n.d.-d. WHO mortality database. http://www.who.int/healthinfo/mortality_data/en/index.html (accessed June 15, 2012).
Wolf, Z. R.
2003. Exploring the audit trail for qualitative investigations. Nurse Educator 28(4):175-178.
World Bank. n.d. World data bank: World development indicators. http://databank.worldbank.org/ddp/home.do (accessed March 15, 2011).