3

Data Synthesis, Software Redesign, and Evaluation

The data collection and the software programming for SMART Vaccines proceeded simultaneously, and both were informed by feedback from various stakeholders. The committee chose to retain the United States and South Africa, the test countries selected for Phase I of the SMART Vaccines development, for use in Phase II. These two countries not only have different income, health, and demographic profiles, but they also have different social and economic priorities for developing and delivering new vaccines. South Africa was chosen, in part, because data were available from that country with which to test the vaccine candidates selected in both Phase I and Phase II. The early part of this chapter is devoted to describing the committee’s data synthesis efforts and the latter part to describing the software prototyping efforts.

Selection of Vaccine Candidates

In Phase I the committee selected influenza, tuberculosis, and group B streptococcus as test vaccine candidates for the United States, and tuberculosis as a test vaccine candidate for South Africa. Supporting data for these candidates are presented in an appendix of the 2012 Institute of Medicine (IOM) report.

The committee was tasked to test three additional vaccine candidates in Phase II. The committee members began with a list of hypothetical vaccine candidates for seven infectious agents: cholera, dengue, human immunodeficiency virus, human papillomavirus, rotavirus, pneumococcal infection, and malaria.







The committee chose human papillomavirus, rotavirus, and pneumococcal infection as the test cases for evaluation; licensed vaccines currently exist for the causative agents of each of these three diseases.

The purpose of including these candidate vaccines in SMART Vaccines was to demonstrate the functionality of the software. Each vaccine candidate offers a scenario that may arise in the process of developing and delivering a new preventive vaccine. These scenarios may include decision points that arise in the development and distribution of a vaccine that is aimed at a particular disease and that has certain intended health and economic benefits.

Because vaccines for human papillomavirus, rotavirus, and pneumococcal infection currently exist, the committee considered their inclusion in the model as providing test examples of the process one goes through in developing improved vaccines by such methods as including adjuvants, increasing effectiveness, or reducing doses. Another reason for the selection of these three particular vaccines is that each targeted disease affects a different population and has different health implications: human papillomavirus infects sexually active individuals and can lead to anal or cervical cancer over time; rotavirus affects children, and this burden is greater in low-income countries; pneumococcal disease is known to affect young children and the elderly population worldwide.

Disease profiles for these three diseases, as well as for the diseases targeted by the vaccine candidates evaluated by the Phase I committee (influenza, tuberculosis, and group B streptococcus), are provided in Appendix B. A snapshot of the data needs for SMART Vaccines is presented in Table 3-1.

Due to time constraints in the Phase I study, data for South Africa were collected only for tuberculosis; for the United States, data for influenza, tuberculosis, and group B streptococcus were collected. In this study, the data for human papillomavirus, pneumococcal infection, and rotavirus were collected for both the United States and South Africa. Thus, a total of six datasets for the United States and four for South Africa are available as downloadable spreadsheets (along with the SMART Vaccines software package) on the Institute of Medicine and the National Academies Press websites. Data sources for the necessary parameters are provided in the spreadsheets along with explanatory notes and references. For ease of use, SMART Vaccines 1.0 contains these datasets pre-populated as defaults.
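
One way to picture such a dataset is as a small, structured parameter set, one per country-disease pair. The field names below are hypothetical and chosen only to mirror the parameter groups summarized in Table 3-1; the released spreadsheets define their own layout. Python is used for illustration, although SMART Vaccines itself is built in MATLAB.

```python
# Hypothetical in-memory layout for one SMART Vaccines dataset (one
# country-disease pair).  Field names are invented for illustration;
# the released spreadsheets define their own columns and notes.

def make_dataset(country, disease):
    """Return an empty parameter set for a country-disease pair."""
    return {
        "country": country,
        "disease": disease,
        "demographics": {        # life tables, HUI2, wages
            "life_table": [],
            "standard_life_expectancy": None,
            "health_utility_index": [],
            "hourly_wage_rate": None,
        },
        "disease_burden": {"incidence_per_100k": [], "case_fatality_rate": []},
        "morbidity": {"disability_weights": [], "duration": [], "costs": {}},
        "vaccine": {
            "coverage": None, "effectiveness": None,
            "doses_per_person": None, "cost_per_dose": None,
            "herd_immunity": False, "time_to_adoption": None,
        },
    }

# Two of the ten datasets shipped with SMART Vaccines 1.0:
us_hpv = make_dataset("United States", "Human papillomavirus")
za_rota = make_dataset("South Africa", "Rotavirus")
```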

TABLE 3-1 A Snapshot of Data Required for SMART Vaccines 1.0

Demographics
  Available in the public domain: Life Tables(a); Standard Life Expectancy(a)
  Requiring user estimation: Health Utility Index 2; Hourly Wage Rate

Disease Burden
  Available in the public domain: Incidence
  Requiring user estimation: Case Fatality Rate

Disease Morbidity
  Available in the public domain: Disutility (Tolls)(b); Disability Weights(b); Duration(b)
  Requiring user estimation: Percent of Cases; Costs (Hospital, Outpatient, Medication)(b)

Vaccine Characteristics
  Available in the public domain: Target Population(a); Coverage; Effectiveness; Length of Immunity; Doses Required per Person
  Requiring user estimation: Herd Immunity; Time to Adoption; Cost per Dose; Administration Cost; Research and Development Costs; Licensure Costs; One-Time (Start-Up) Costs

(a) Standard data irrespective of the vaccine candidates.
(b) Requires case-by-case judgment and modification for specific vaccine complications or morbidity.

Data Sourcing and Quality

The data gathered by the committee are by no means the best or the most detailed estimates for each disease. They are neither precise projections nor comprehensive analyses. For example, there are data available on the burden of influenza and on the impact of seasonal influenza vaccines in the United States, but because there are no currently licensed vaccines for group B streptococcus, the only data available from the United States for that disease concern the disease burden, with nothing on the impact of a vaccine if it were licensed; thus, the vaccine information for group B streptococcus is largely hypothetical. In fact, much of the information required for SMART Vaccines, especially the information related to the use of the vaccines in low-income countries, was based on the opinions of the committee members.

A significant concern regarding the committee’s data analysis was the variability and the lack of standardization in surveillance methods.
While data may be widely available for certain parameters, the committee thought it important to use only those data that had been collected using standard, comparable methodologies. To ensure the quality of the data, public sources such as peer-reviewed literature, the World Health Organization, the Centers for Disease Control and Prevention, and publications of national health agencies were used as often as possible. Data sourcing and methodology are discussed in Appendix C.

Development of a Test Model to Inform Software Redesign

In Phase II, as part of the model enhancements, the committee developed a spreadsheet prototype to illustrate the possibilities of a dynamic weight-adjustment tool and to show how real-time graphical changes could facilitate the user’s prioritization process. Figure 3-1 shows an early prototype interface that allowed the user to rank selected values. This interface served as an evolving “test bench” prototype that the committee used to make changes and to incorporate stakeholder feedback obtained during the public presentations. In short, the spreadsheet in the screenshot is an experimental draft shown in order to illustrate the committee’s back-end work as SMART Vaccines 1.0 underwent interface redesign.

This prototype spreadsheet allowed the committee members to select up to 10 attributes, with pop-up boxes featuring quick definitions. In Figure 3-1, for example, nine attributes have been chosen (indicated by check marks in the left-hand column). Those nine attributes are ranked from 1 to 9 (in the second column). The most important attribute is ranked 1, and the least important is ranked 9. The fourth column (in yellow) shows the weights as calculated by the rank order centroid method. The slider bars in the third column (labeled “fine adjustment”) allow users to adjust the computed weights. This feature illustrates the committee’s early efforts to provide users with an option to carry out intuitive sensitivity analyses without needing to understand the details of the multi-attribute utility model.

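The rank order centroid method mentioned above has a standard closed form: with n ranked attributes, the attribute ranked i receives weight w_i = (1/n) * (1/i + 1/(i+1) + ... + 1/n). A minimal sketch in Python (SMART Vaccines itself is MATLAB-based):

```python
# Rank order centroid (ROC) weights: a standard way to convert a strict
# rank order of n attributes into weights that sum to 1.  The attribute
# ranked i (1 = most important) receives
#     w_i = (1/n) * sum(1/k for k = i..n)

def roc_weights(n):
    """Return ROC weights for ranks 1..n (index 0 holds rank 1's weight)."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n
            for i in range(1, n + 1)]

# Nine ranked attributes, as in the Figure 3-1 prototype:
weights = roc_weights(9)
# weights[0] is the largest and weights[8] the smallest; the list sums to 1.
```

Because the weights depend only on the ranks, a user can express priorities without directly assigning numbers; the sliders then refine these defaults.
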
The attributes shown in Column 5 are collected into groups, each with a colored heading: purple for “health considerations,” maroon for “economic considerations,” yellow for “demographic considerations,” dark blue for “intangible values,” and so on. These same colors appear in the bar graph at the lower right corner of the screen, which shows the calculated SMART Scores for five hypothetical candidate vaccines: an influenza vaccine with 1-year efficacy, an influenza vaccine with 10-year efficacy, a group B streptococcus vaccine costing $100 per dose, a group B streptococcus vaccine priced at $50 per dose, and a tuberculosis vaccine that does not achieve any herd immunity. Each vaccine bar is divided into colored sections showing how much each of the nine attribute categories adds to the SMART Score for that vaccine.
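
The stacked bars just described are a direct visualization of an additive multi-attribute score: each attribute contributes its weight times a single-attribute utility, scaled to a score out of 100. The linear form and the 0-to-1 utilities below are illustrative assumptions, not the committee’s exact model.

```python
# Sketch of per-attribute contributions to a SMART Score, as drawn in a
# stacked bar chart.  Assumptions (for illustration only): utilities lie
# in [0, 1] and weights sum to 1.

def score_contributions(weights, utilities):
    """Per-attribute contributions to the total score (out of 100)."""
    return [w * u * 100 for w, u in zip(weights, utilities)]

def smart_score(weights, utilities):
    """Total score: the sum of the stacked segments."""
    return sum(score_contributions(weights, utilities))

# A hypothetical vaccine scored on three attributes:
segments = score_contributions([0.5, 0.25, 0.25], [1.0, 0.5, 0.0])
total = smart_score([0.5, 0.25, 0.25], [1.0, 0.5, 0.0])
```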

FIGURE 3-1 An early experimental prototype developed for committee use. This prototype was adjusted based on feedback obtained from stakeholders after the public presentations made by committee members. This spreadsheet design informed the subsequent redesign of SMART Vaccines in a MATLAB platform.

As users change the ranking and then fine-tune the weights for each chosen attribute, the heights of the bars for each candidate vaccine adjust automatically. Thus, users can interactively see the effect of altering their weights immediately, whether by making changes to the rank order or by fine-tuning the weights calculated by the rank order centroid method as part of the sensitivity analysis.

Interface Redesign for SMART Vaccines 1.0

In Phase I the blueprint of SMART Vaccines Beta was developed using three software tools: MATLAB for the algorithm, Java servlets for the middleware, and Axure for visual interface design, with Microsoft SQL Server used for preliminary database management. Stakeholder feedback made it clear that SMART Vaccines needed to be developed in a simpler, platform-independent fashion to aid the end users. Therefore, the committee elected to use MATLAB as the sole programming platform for developing, testing, and producing a downloadable and executable package for SMART Vaccines 1.0. This choice was made easier by enhancements to MATLAB that allowed it to be used both for implementation of the model and for the creation of a dynamic, cross-platform user interface. Data can be directly entered or imported from spreadsheets into SMART Vaccines for application and storage.

To illustrate the current operational features of SMART Vaccines 1.0, this section includes a step-by-step screenshot tour. SMART Vaccines 1.0 is substantially different from the SMART Vaccines Beta presented in the 2012 report (IOM, 2012). The committee appreciated how direct data entry using the previous software interface format could be burdensome to the user, and hence it spent substantial effort to simplify data entry with the goal of making it more efficient and intuitive. Figure 3-2 shows the welcome screen of SMART Vaccines 1.0.
Here, users are presented with a disclaimer stressing that SMART Vaccines is a decision-support system, not a decision-making tool. By clicking on the radio buttons (selectable circles) at the top, the user can select what to enter and how to use the program. Relevant screens appear when the user selects any of the “Specify” or “Evaluate” buttons. For instance, by selecting “Attributes” the user is taken to a screen where each vaccine candidate’s attributes are chosen; selecting “Weights” takes the user to a screen where attributes are ranked and weighted; and selecting “Priorities” allows the user to observe the priority rankings calculated by SMART Vaccines once all of the relevant data entry has been completed. The user has the option either to proceed linearly through the program
using the “Continue” buttons or to skip to certain sections, thereby making possible a division of labor among data collection, attribute selection, and weighting.

FIGURE 3-2 Welcome page presenting the terms of agreement and disclaimer. SMART Vaccines 1.0 was developed on a MATLAB platform with a redesigned user interface.

The next screenshot (see Figure 3-3) shows a typical data page: in this instance, demographic data for females in the United States that can be specified using a pull-down menu. As noted earlier, the basic population data can normally be taken directly from institutions that maintain various databases, such as the World Health Organization.

For infants, for children from 1 through 4 years of age, and then for each 5-year age group after that (5 through 9, 10 through 14, and so on), SMART Vaccines requires the number of persons in each age group, the number living at the end of the period, the life years that the group members are expected to have, their life expectancy, and a standard life expectancy used in calculating disability-adjusted life years (DALYs). The health utilities index (HUI2) provides the quality adjustment for a typical person in each age category, which is used in calculating quality-adjusted life years (QALYs). Finally, the hourly wage rate (converted to U.S.
dollars) gives a simple estimate of the value of time lost to illness for this population.

FIGURE 3-3 SMART Vaccines 1.0 screen where the user specifies the population information (by age and sex) to be used for ranking vaccines.

In the screenshot shown in Figure 3-4, the user defines the characteristics of the disease for which candidate vaccines might be targeted. SMART Vaccines treats the disease characteristics separately from vaccine attributes, because the user may wish to consider a number of different vaccines that might apply against the same disease.

In the example shown in Figure 3-4, the first block of data describes the disease impact on the relevant population (in this case, females in the United States), categorized by age group, but in less refined groupings than the actual population data. This approach is intended to reduce user burden in data entry, reflecting the many cases where more refined disease burden data may not be available. The population data include the number of people in each age group (calculated automatically from the population data if entered previously), the annual incidence per 100,000 people, and the case fatality rate (the probability of death, conditional upon contracting the disease). The second block of data on this page shows the disease burden,
breaking the cases down into categories of severity, including death, and categories of required treatment (without outpatient treatment, with outpatient treatment, and with inpatient hospital care). For each of these categories the user must enter the costs of providing each type of treatment (hospital costs, outpatient costs, medication costs, and other costs) as well as the disease duration and the disability tolls (for DALYs) or weights (for QALYs).

FIGURE 3-4 SMART Vaccines 1.0 screen where the user defines the burden of the selected disease, including morbidity scenarios and quality-of-life scores. Mouse-over pop-ups guide the user with additional information on the parameters.

The user then enters vaccine characteristics, a central component of the priority-setting process, in the screen shown in Figure 3-5. In this example, which involves information concerning an influenza vaccine for the U.S. female population, separated into several age groups, the user specifies (using check marks) which age groups might appropriately receive the vaccine, the percent receiving the vaccination (coverage), and the effectiveness of the vaccine. The screen also provides the option of making herd immunity present or absent by using a check box.

The second block on this screen requires data about the vaccine candidate itself: the duration of immunity conferred, the time to adoption, doses required, cost per dose, administration cost, and estimates of research and development cost, licensure cost, and one-time start-up costs.

FIGURE 3-5 SMART Vaccines 1.0 screen where the user enters such information as the product profile characteristics and other characteristics related to the vaccines being ranked.

In the next step, the user selects the vaccine attributes of interest. The attributes selected and the weights attached to them apply to every candidate vaccine (see Figure 3-6). In SMART Vaccines 1.0, the user can click a radio button for any category to bring up the list of attributes within that group. Using a check box, the user can then select the attributes that will be entered into the analysis.

In this screenshot the set of attributes in the category “Health Considerations” is shown, and the user has selected “Incident cases prevented per year” and “Quality-adjusted life years gained.” In the subsequent screenshot (see Figure 3-7), the selection of attributes in the category “Scientific and Business Considerations” is shown. The user has selected “Likelihood of financial profitability for the manufacturer,” “Demonstrates new production platforms,” and “Interest from NGOs and philanthropic organizations.” This set of attributes might be chosen by, say, a vaccine manufacturer, whereas a different user might select a completely different set.

FIGURE 3-6 SMART Vaccines 1.0 attribute selection page that permits the user to select and subsequently rank the attributes of importance, listed in nine categories from health to policy, including up to seven user-defined attributes.

Figure 3-8 shows the empty fields in which users can enter user-defined attributes. Currently, SMART Vaccines 1.0 can only handle binary options for user-defined attributes; that is, any attribute defined by the user is answered with either yes or no.

The next screen (see Figure 3-9) appears in the form of a ranking dashboard and shows the attributes selected by this hypothetical user from all of the categories (note the color coding). The user assigns a rank to each of the seven chosen attributes. The weights calculated by the rank order centroid method appear in the bar chart on the right, with the greatest weight applied to the attribute with the highest ranking.

As with the prototype discussed earlier, the slider bars allow the user to modify these preliminary weights up and down (see Figure 3-10), and SMART Vaccines automatically recalculates the weights on the other attributes so that the weights continue to sum to 100 percent (a requirement of the multi-attribute utility model). More radical changes in weights can be accomplished by altering the rankings altogether. In this example, the user has increased the weight placed on “Likelihood of financial profitability for the manufacturer” from 4 percent in Figure 3-9 to 31 percent in Figure 3-10, making it the most heavily weighted attribute.
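
A common way to implement this slider behavior is to pin the edited weight at its new value and rescale the remaining weights in proportion to their old values. The proportional rule below is an assumption for illustration; the report states only that the other weights are recalculated automatically so the total stays at 100 percent.

```python
# Hypothetical slider adjustment: fix the edited weight and rescale the
# remaining weights proportionally so the full set still sums to 1.

def adjust_weight(weights, index, new_value):
    """Return a new weight list with weights[index] set to new_value."""
    assert 0.0 < new_value < 1.0 and 0.0 < weights[index] < 1.0
    scale = (1.0 - new_value) / (1.0 - weights[index])
    return [new_value if i == index else w * scale
            for i, w in enumerate(weights)]

# Raise a 4 percent weight to 31 percent, as in the Figure 3-9/3-10 example:
w = adjust_weight([0.04, 0.46, 0.50], 0, 0.31)
# w still sums to 1, and the 0.46 : 0.50 ratio of the others is preserved.
```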

FIGURE 3-13 Output screen for SMART Vaccines permitting sensitivity analysis based on adjustment of weights.

In a further scenario, two users evaluate two new candidates: a rotavirus vaccine and a pneumococcal vaccine. Though the users choose the same attributes (see Figure 3-20 for user X and Figure 3-21 for user Y), their rank orders for the selected attributes are different. User X has ranked incident cases prevented per year as most important, whereas user Y has selected net savings resulting from vaccine use as having the highest priority, with the other ranks also varying according to the different perspectives of the two users.

Figure 3-22 shows user X’s comparative scores: a 33 for the rotavirus vaccine and an 82 for the pneumococcal vaccine. User Y’s results are shown in Figure 3-23: a SMART Score of 52 for rotavirus and of 77 for pneumococcal. The users may have arrived at these scores independently, but now their SMART Scores could enable a discussion between them. In this case, the “winner” for both users is the pneumococcal vaccine, albeit with slightly different scores. If user X and user Y had settled on other sets of attributes and value judgments, then their preferences could have led to quite different results, as often happens in real-world scenarios. Regardless of the outcome, however, the SMART Scores can help start a discussion between the users in which they compare their differing values and results.
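
This comparison can be mimicked with a toy additive model: identical attributes and vaccines, different weight vectors. Every number below is invented; only the qualitative pattern (different scores, same winner) follows the text.

```python
# Toy re-creation of the user X / user Y comparison.  Utilities and
# weights are invented for illustration, not taken from the report.

def smart_score(weights, utilities):
    """Additive score out of 100, rounded as in the displayed results."""
    return round(100 * sum(w * u for w, u in zip(weights, utilities)))

# Per-vaccine utilities on two attributes (cases prevented, net savings):
utilities = {
    "rotavirus":    [0.30, 0.55],
    "pneumococcal": [0.85, 0.70],
}
user_weights = {
    "X": [0.75, 0.25],  # ranks cases prevented highest
    "Y": [0.25, 0.75],  # ranks net savings highest
}

results = {user: {v: smart_score(w, u) for v, u in utilities.items()}
           for user, w in user_weights.items()}
# The pneumococcal vaccine scores higher for both users, by different margins.
```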

FIGURE 3-14 Output screen for SMART Vaccines permitting sensitivity analysis based on adjustment of development risk.

Users’ Evaluation of the Prototype

The committee engaged seven potential users to provide comments on the user interface and functionalities relating to an early prototype of SMART Vaccines 1.0. These consultant evaluators participated in a webinar led by a committee member. Three evaluation sessions were conducted, with two of them lasting 1 hour each (one and two participants, respectively) and a third session lasting about 90 minutes (four participants). These sessions, which were carried out via a remote desktop connection, were intended to illustrate the dynamic capabilities of the software and to engage the evaluators in constructing possible evaluation scenarios. The evaluators were given a set of framing questions (see Box 3-1) in advance of the demonstration sessions as a way of directing the focus of their feedback during those sessions.

The reactions of the evaluators were overall very positive concerning the design and innovation underlying SMART Vaccines. In addition to this positive response, the consultants also provided feedback about possible further improvements and explored potential additional applications of SMART Vaccines, which are discussed in Chapter 4. Moreover,
during the review process, external reviewers of this report participated in a webinar session containing the software demonstration and offered feedback. Subsequently, the prototype evaluators were re-engaged to allow hands-on interaction with SMART Vaccines and to provide additional feedback prior to the software and report release.

FIGURE 3-15 Output screen for SMART Vaccines permitting adjustment or reassessment of original scores.

BOX 3-1 Framing Questions for Evaluators of SMART Vaccines 1.0

• Do you foresee using SMART Vaccines in the decision-making process of your organization?
• What additional features would be desirable in SMART Vaccines?
• Could the SMART Score be persuasive in guiding you to a decision?

FIGURE 3-16 Attribute structure and ranks created by a hypothetical federal agency director (user A) for evaluating a new human papillomavirus vaccine and a new influenza vaccine.

FIGURE 3-17 Comparison of SMART Scores for two hypothetical new vaccines resulting from user A’s selected attributes and ranking system.

FIGURE 3-18 Attribute structure and ranks created by a hypothetical senior executive of a major pharmaceutical company (user B) for prioritizing development between a new human papillomavirus vaccine and a new influenza vaccine.

FIGURE 3-19 Comparison of SMART Scores for two hypothetical new vaccines based on user B’s selected attribute and ranking structure.

FIGURE 3-20 Attribute and rank structure selected by a hypothetical health minister (user X) in South Africa.

FIGURE 3-21 Attribute and rank structure selected by a hypothetical finance and trade minister (user Y) in South Africa.

FIGURE 3-22 Comparison of SMART Scores for a new rotavirus vaccine and a new pneumococcal vaccine with user X’s rank and value structures.

FIGURE 3-23 Comparison of SMART Scores for a new rotavirus and a new pneumococcal vaccine with user Y’s rank and value structures.