6 Pathway to an Improved Survey

This final chapter is based on the information and analysis in previous chapters of the report. It describes three potential prototypes to consider in redesigning the Consumer Expenditure Surveys (CE) and offers recommendations to the Bureau of Labor Statistics (BLS) from the Panel on Redesigning the BLS Consumer Expenditure Surveys on research and other inputs needed in redesigning the CE.

OVERVIEW

The CE has many purposes and a diverse set of data users. This is both the strength of the program and the foundation of its problems. The CE program tries to be “all things to all users.” The current design creates an undesirable level of burden on households and quality issues with its data. The Interview survey asks respondents for a very high level of detail collected over an entire year, with potentially counterproductive effects on their motivation and/or ability to report accurately. Because the Interview survey was deemed not to satisfy all user needs, the program also includes the Diary survey. The Diary survey supplies much of the same information but at an even higher level of detail over a shorter period of time by using a different collection mode and a different set of respondents. However, Diary respondents appear to lack motivation to report consistently throughout the two-week collection period. Unfortunately, these two surveys are designed independently, so the resulting data are not statistically consolidated to achieve their potential precision and usefulness. The Consumer Price Index (CPI) drives the level of detail asked in the
current CE surveys. The CPI is a Principal Economic Indicator of the United States and a crucial user of the CE. The CPI currently uses CE data for over 800 different expenditure items to create the budget shares required for those indexes. Most, but not all, budget shares come from the CE. In theory, a number of survey designs can provide the information required by the CPI, collecting a significant level of expenditure data without inflicting the level of burden on households that the current CE does. These designs, including a number of “matrix type” sample designs, involve asking each household only a portion of the total detail required while using weighting and more sophisticated modeling to produce the needed estimates. The data from these types of designs can provide the needed level of detail with the precision needed by the CPI but with less burden on each household (Eltinge and Gonzalez, 2007; Gonzalez and Eltinge, 2008, 2009). This family of designs would also meet most of the needs of government agencies in program administration and would allow BLS to continue to publish standard expenditure data tables. However, these types of designs are not optimal for other uses of the CE. Researchers and policy analysts use the CE microdata to examine the impact of policy changes on different groups of households and to study consumers’ spending habits and trends. Many such uses are described in Chapter 2. These data users generally do not need the same level of item-level detail required by the CPI. To them, the value of the CE lies in the “complete picture” of demographics, expenditures, income, and assets all collected for the same household. A comprehensive set of data at the household level allows microdata users to look at the multivariate relationships between spending and income in different types of situations for different groups of households.
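The “matrix type” designs mentioned above can be illustrated with a small simulation. This sketch is the panel's reviewer's illustration, not a BLS procedure: item names, spending levels, and the equal-probability block assignment are all hypothetical, and a production design would use far more sophisticated weighting and modeling.

```python
# Hypothetical sketch of a matrix sample design: each household is asked
# about only ONE block of expenditure items (reducing its burden), and
# per-item means are estimated from the subsample asked about that item.
import random

random.seed(7)

# Hypothetical item blocks; a real design would partition hundreds of items.
ITEM_BLOCKS = {
    "A": ["groceries", "dining_out"],
    "B": ["utilities", "phone"],
    "C": ["clothing", "recreation"],
}

# Simulated "true" monthly mean spending per item (invented numbers).
TRUE_MEANS = {"groceries": 400, "dining_out": 150, "utilities": 220,
              "phone": 90, "clothing": 120, "recreation": 80}

# Simulate 3,000 households with noisy spending around the true means.
households = [
    {item: random.gauss(mu, mu * 0.3) for item, mu in TRUE_MEANS.items()}
    for _ in range(3000)
]

# Each household reports only the items in one randomly assigned block,
# so it answers a third of the full item detail.
block_ids = list(ITEM_BLOCKS)
totals = {item: 0.0 for item in TRUE_MEANS}
counts = {item: 0 for item in TRUE_MEANS}
for spend in households:
    block = random.choice(block_ids)
    for item in ITEM_BLOCKS[block]:
        totals[item] += spend[item]
        counts[item] += 1

# Estimated mean per item uses only the subsample that was asked about it.
estimates = {item: totals[item] / counts[item] for item in TRUE_MEANS}
for item, est in estimates.items():
    print(f"{item}: estimated mean ${est:,.0f} (true ${TRUE_MEANS[item]})")
```

With roughly a thousand reporting households per block, each item-level mean is still recovered closely; the trade-off, as the text notes, is that no single household provides a complete expenditure picture, which is exactly the compromise BLS would need to weigh against microdata uses.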
These data users also use the “panel” component of the CE, which provides the same information for a given household over multiple quarters. Parker, Souleles, and Carroll (2011) and Chapter 4 (“Feedback from Data Users”) further describe the usefulness of panel data in this type of analysis. The multiple and divergent CE data uses are difficult to satisfy efficiently within a single design. Survey designs always involve compromise, and the current CE design tries to provide the breadth and detail of data to meet the needs of all users and then compromises by accepting the heavy burden and unsatisfactory data quality that emerges. The panel recommends that BLS redesign the CE only after rethinking what those compromises should be so that the trade-offs associated with redesign possibilities can be articulated and assessed within a well-developed priority structure. Determining these types of priorities for the CE is ultimately the responsibility of BLS, and is beyond what would be appropriate or realistic for the panel to undertake. Therefore, the panel makes the following recommendation:
Recommendation 6-1: It is critical that BLS prioritize the many uses of the CE data so that it can make appropriate trade-offs as it considers redesign options. Improved data quality for data users and a reduction in burden for data providers should be very high on its priority list.

The panel recommends a major redesign of the CE once the priorities for a redesign are established. In its work, the panel concluded that many response and nonresponse issues in both the Diary and Interview surveys create burden and lead to quality problems with the expenditure data. The panel also concluded that less invasive cognitive and motivational corrections that might be made to improve recall and the reporting of specific expenditures would most likely increase overall burden. Since burden is inextricably connected with much of the survey’s problems, increasing it would be counterproductive.

Recommendation 6-2: The panel recommends that BLS implement a major redesign of the CE. The cognitive and motivational issues associated with the current Diary and Interview surveys cannot be fixed through a series of minor changes.

The charge to this panel was to provide a “menu of comprehensive design options with the highest potential, not one specific all-or-nothing design” (see Appendix B). Before BLS sets prioritized objectives for the CE, the panel’s most effective course of action is to suggest alternative design prototypes, each of which has a higher potential for success when enlisted to achieve a different prioritized set of objectives. With that said, these prototypes share much common ground. The statistical independence of the current interview and diary samples is eliminated. The prototypes orient data collection methods toward an increasingly computer-literate society in which new tools can make the reporting tasks easier for respondents while providing more accurate data.
The new prototypes are geared to increase the use of records and decrease the effects of proxy reporting. There is an increased emphasis on self-administration of survey components, while creating tools and an infrastructure that will monitor and support the respondent in these endeavors. The field representatives’ role will still be important in directly collecting data, but their role will grow to also provide support in additional ways. The panel proposes incentives that will increase a respondent’s motivation to comply and report accurately. Finally and most importantly, all three prototypes propose new procedures and techniques that have not been researched, designed, and tested. The prototypes that the panel offers are contingent upon new research undertakings and rigorous assessment. There is a great deal of relevant background theory and research available, and the BLS research program and Gemini Project deserve praise for much of that work. However, the panel wishes to state clearly that the empirical evidence on how well each of the proposed prototypes would work is missing. As with the current CE surveys, the new prototypes include some diary-type data collection and some recall-type data collection. They include some self-administered data collection and some interviewer-assisted data collection. Notwithstanding, the new prototypes are sufficiently different from the current CE surveys that BLS cannot and should not use the current CE to extrapolate how well these prototypes will work in regard to response, accuracy, and underreporting. Considerable investment must be made in researching elements of the proposed designs, to find specific procedures that not only are workable but also are most effective. Some ideas will ultimately be successful, while others will be shown to have serious flaws. The critical point is that these prototypes are not operationally ready, and the process of selecting a prototype or components of a prototype for implementation should be based not only on BLS’ prioritization of goals of the CE, but also on empirical evidence that the proposed procedures can meet those goals.

Recommendation 6-3: After a preliminary prioritization of goals of the new CE, the panel recommends that BLS fund two or three major feasibility studies to thoroughly investigate the performance of key aspects of the proposed designs. These studies will help provide the empirical basis for final decision making.

Issues related to nonexpenditure items on the CE were discussed in detail in Chapter 5.
These issues include such things as synchronization of expenditure and nonexpenditure items over similar reference periods, and collecting changes in employment status and other life events. These types of issues are important to the research uses of the CE. The panel offers the following recommendation, which should be viewed within the context of BLS prioritization of the goals of the CE.

Recommendation 6-4: A broader set of nonexpenditure items on the CE that are synchronized with expenditures will greatly improve the quality of data for research purposes, as well as the range of important issues that can be investigated with the data. The BLS should pay close attention to these issues in the redesign of the survey.

With a new design, some existing uses of data may fall by the wayside. New and important uses will emerge. BLS has a talented and knowledgeable staff of statisticians and researchers who have worked with the CE
for many years. They understand the survey well, and the cognitive issues described by the panel are not a surprise to that staff. Using the framework that the panel has put forward, BLS statisticians will be able to pull together and test the specific details of a redesign that is appropriate for BLS’ collective priorities, budget, and time frame. The rest of this chapter describes the three different prototypes, with many commonalities but each with its own focus. A more detailed discussion of those commonalities comes first, and then the report describes and compares the three prototypes. The final sections of the chapter begin a roadmap for moving toward a new design, including a discussion of important research issues.

PANEL’S APPROACH TO DESIGN AND THE COMMONALITIES THAT EMERGED

The panel considered many approaches to a redesign of the CE, and sorted through those numerous options by focusing on the following fundamentals:

• Improve data quality.
• Be mindful that the resources (both out-of-pocket and staff) available to support this survey are constrained.
• Be mindful that the survey processes have to be workable across the entire population of U.S. households—the more distinct processes that need to be designed for different population groups, the more resources will be required.
• Keep it simple—to the extent possible.
• Provide respondents with relief from the current burden level of the CE.
• Provide respondents with sufficient motivation to participate.
• Support the use of records and receipts.
• Support the current uses of the CE to the extent possible, and provide options in support of the prioritization of those uses in the future.
• Utilize newer data collection methodology and external data sources when supportive of the above fundamentals.
It is not reasonable for the panel to discuss all of the options that it considered and laid aside, but this section of the report is intended to illuminate the concepts and strategies that emerged with broad consensus during discussion of some of the major decision points in the panel’s deliberations. These commonalities can be seen in the design of the three prototypes.
Implement a Major Integrated Redesign

The panel came to an early conclusion that the cognitive issues with the existing surveys cannot be fixed with minor corrections, and that it would be a mistake to focus independently on the various cognitive issues addressed in this report. The best approach is an overall redesign of the CE, with component pieces being shaped to minimize these cognitive problems at each phase of data collection. The sample design for the new CE should be developed with a view toward integrating sample panels and data collection periods on a panel via statistical modeling in the estimation process, rather than generating independent estimates for each panel and data collection period. This method ensures that all data collected within the design can be fully utilized to minimize the variance of estimates by capitalizing on the temporal components of the design or by integrating sample panels that collect different, but related, variables from respondents. It may be possible that sophisticated sample designs along with appropriate modeling can provide needed data products with reduced burden on respondents. In investigating this possibility, it will be important to avoid creating household-level data with such a complicated structure of measurement error or statistical dependencies that it makes research use very difficult. At the least, any reductions in possible use of the data need to be consistent with newly clarified BLS priorities.

Reduce Burden

The extreme detail associated with the current CE, and the amount of time and effort it takes to report those details, are major causes of underreporting of expenditures. These need to be significantly reduced for most respondents. The panel identified a number of ways to reduce burden, and more than one burden-reducing concept is included in each redesign prototype.
Burden-reducing opportunities include (1) reducing the overall detail that is collected on expenditures, income, and/or assets; (2) asking details (or certain sets of details) of only a subsample of respondents, providing burden relief for the remaining sample; (3) reducing the number of times a household is interviewed or the number of tasks they are asked to do; (4) reducing the overall sample size and using more sophisticated estimation and modeling to maintain levels of precision; and (5) making response cognitively and physically easier. The panel spent considerable time identifying ways to reduce burden. It realizes that several of these options may be at odds with collecting a complete picture of income and expenses from each individual household over longer reporting periods. This is why it is essential for BLS to further clarify its priorities for data uses, recognizing that one survey cannot satisfy all of the possible data users.
Use Incentives to Increase and Focus Motivation

The CE surveys are very complex and burdensome, and even with the burden-reducing changes, the CE will remain a difficult challenge for households. Respondents currently have little motivation to respond, or more precisely to respond accurately, on the CE. The panel anticipates that respondents will have additional responsibility under a redesign to keep and record expenditures. The panel collectively agreed that respondents need greater motivation to carry out these tasks and proposed that an incentive structure composed of monetary and nonmonetary incentives be developed and implemented. The structure should be based on the amount of effort asked of a respondent and used to effectively encourage recordkeeping and reporting from those records. The panel speculated that the incentive payments would need to be fairly large to effect the needed motivation to report accurately. Components of an effective incentive program are discussed in more detail later in this chapter.

Support Accurate Use of Records

The panel envisions a redesign that will increase respondents’ use of records in reporting expenses. This can be accomplished in a variety of ways. In the three prototypes, incentives are offered. There is an increased emphasis in each prototype on incorporating supported self-administration in a way that provides a structure to promote accurate reporting and increased use of records. This means incorporating flexibility to allow respondents to provide data at a time and in a way that is most convenient for them, and to answer questions in the order that they prefer. It means redesign of data collection instruments (whether self-report or interviewer-driven, paper or electronic), technology tools, training, reinforcement, and incentives to facilitate recordkeeping.
Minimizing proxy reporting of detailed information is another improvement that can lead to more accurate reporting and use of records and receipts.

Redesign Survey Instruments

The new CE will need redesigned data collection instruments that simplify the respondent’s task. The panel sees a movement toward self-administered data collection with the field representative acting in a support role. However, the prototypes also incorporate interviewing by field representatives. Even though the panel envisions a wide acceptance of tablet-based interfaces, paper instruments will be needed for the foreseeable future. The current instruments may not suffice for this purpose.
Increase Use of Self-Administration

The panel discussed the advantages and disadvantages of various collection modes, and considered changes from the current CE surveys. The panel expressed concern about the shift in the current CE toward telephone data collection (primarily due to constrained resources), and felt this was not the best shift for data quality. The panel’s final recommendations move toward self-administration of these complex surveys. There are several reasons for this shift. The first is to encourage the use of records as discussed in the paragraphs above. This mode allows respondents to provide data in a way and at a time that is most convenient for them. When paired with an appropriate incentive structure, it can encourage respondents to take the time needed to use those records and receipts. A second reason is to take advantage of newer technology that can allow consistent, remote monitoring of self-administered data collection without the cost of having an interviewer present.

Reduce Proxy Reporting

The current CE surveys use proxy reporting because of the additional cost associated with working separately with multiple survey respondents within a household. The panel looked for solutions that will allow (and encourage) individual members of a household to report their expenditures without the accompanying increase in cost. The solution is a shared household tablet that each member of the household can use to enter expenses, but there is still a “primary household respondent” who oversees the entire process. This solution does not provide confidential reporting, and thus does not solve the problem when household members are reluctant to share details of certain expenditures with other household members. However, it does have the potential to eliminate much of the current proxy report process with minimal added cost per household.
Utilize Newer Data Collection Technology

The time is right to emphasize new technological tools in data collection. This is an essential component of the panel’s concept of supported self-administration. The panel discussed many technological alternatives and found one tool that was particularly appealing across a variety of designs—the tablet computer. The panel proposes the use of tablets in each of its redesign prototypes as an effective data collection tool. Lightweight and easy-to-use tablets represent stable (robust) technology, are commonplace, feature more than sufficient computing power, and are economical in price. The panel envisions that the tablet would sit on the
kitchen counter and be used by multiple household members in a “shared” approach to recording expenditures. The panel also considered such alternatives as Web-based data collection, smartphone apps, and portable scanners for receipts. All are interesting tools and potentially could be used together in a redesigned CE. However, the panel stuck with its fundamentals—keep it simple and be mindful that the survey processes have to be workable across the entire population of U.S. households and that each additional approach (tool) will require additional resources to build and support. The panel looked for the one tool with the most potential. Web data collection is not that tool. The Bureau of the Census (2010) estimated that only 44 percent of all U.S. households had Internet access either within or outside the home. This percentage varied greatly by household demographics and income. So requiring Internet access to use the electronic instrument would relegate the majority of households to the “paper” option. Additionally, building high-quality Web-based instruments that work on multiple platforms (different computers, different browsers, high-speed versus dial-up Internet access, smartphone browsers) can be very resource intensive. By providing the tablet to the household, BLS would be developing for a single platform, and the panel hypothesizes that a substantially greater percentage of households would be able to use the tool than if BLS relied on Web collection. The panel saw similar issues with using smartphone apps—lack of coverage of the population of households, and considerable variability in hardware and software platforms. These devices are growing in popularity, but BLS would have to develop and maintain multiple versions even for use within the same household. Portable scanners would allow respondents to scan receipts and upload them to a waiting file.
These devices could be used along with the tablet PC for recording receipts. However, the array of formats and abbreviations used on printed receipts would likely require considerable intervention after the scanning to properly record each expenditure. Adding these scanners to the household would also require additional training. The use of technology tools, and the tablet PC in particular, is discussed in more detail later in this chapter. The panel will recommend that BLS begin using this one simple tool, knowing that its implementation will be challenge enough for the short run.

Use Administrative Data Appropriately but with Caution

The potential use of external records or alternative data sources as a replacement or adjunct to current survey data for the CE is often raised in discussions of a CE redesign. Whether at the aggregate or the micro level,
the appeal of “readily available” information appears, at first glance, to be low-hanging fruit. Although such information might hold great promise, upon closer inspection the panel realized that use of these data is accompanied by increased risk and significant resource outlays. There is a cost/quality/risk trade-off that needs to be fully investigated and understood. The panel discussed the potential use of these external data at the micro level and identified several concerns: permission from household members to access such things as personal financial data, utility bills, and shopping data (loyalty cards) would be difficult to obtain, so such data would replace only a small percentage of survey data; BLS would have to develop an in-house infrastructure to access and process data from each external source (a significant drain on available resources); and BLS would have to continue to field a complete survey for the majority of households. That said, there are scenarios under which these data could be quite useful, particularly at a more macro level. However, caution is warranted. This subject is discussed in greater detail later in this chapter.

Create a Panel Component and Measure Life Event Changes

Economic analysts utilize the panel component of the current CE in much of their research. The report incorporates a panel component (with data collection from the same households at a minimum of two points in time) within each of the three prototypes. Each design also includes a re-measurement of income and “life events” (such as employment status, marital status, and disability) at each wave. However, the panel components differ considerably from design to design in the length of the response period, and this will significantly affect their relative usefulness in economic research.
Of the three prototypes described, Design B has the most comprehensive panel component, with three waves and a response period of six months for each wave. Design C has two waves with a response period of three months for each wave. Design A has two waves, but with more variable response periods for each wave.

REDESIGN PROTOTYPES

In this section, the panel presents three specific redesign prototypes. All three designs meet the basic requirements presented in Consumer Expenditure Survey (CE) Data Requirements (Henderson et al., 2011). All three prototypes strive for increased use of records, incorporate self-administration (supported by the field representative, a tablet computer, and a centralized support facility) as a mode of data collection, and use incentives to motivate respondents. All three prototypes continue to use field representatives for interviewing and other support, and they all feature
either a single sample or integrated samples. However, each prototype is different—a better fit for a specific use of the data. BLS needs to prioritize the various data requirements of the CE and move toward a redesign that is best for its carefully considered prioritization. In overview,

• Design A focuses on obtaining expenditure data at a detailed level. To do this, the panel proposes a design with concurrent collection of expenditures through a “supported journal”—diary-type self-administered data collection with tools that reduce the effort of recordkeeping while encouraging the entry of expenditures when memory is fresh and receipts are available. It also features a self-administered recall survey to collect larger and recurring expenses that need a longer reporting period. This design collects a complete picture of household expenses but with reports over different reporting periods.

• Design B provides expenditure data for 96 expenditure categories, rather than the more detailed expenses provided by Design A, but provides a complete picture of household expenditures over an 18-month period. It builds a dataset that would be excellent for economic and policy analysis. This design makes use of a recall interview coupled with a short supported journal. Two subsequent contacts with the same households are made over 13 months, repeating the original data collection using supported self-administration to the extent possible. This design also recommends that a small subsample be subsequently interviewed intensively over the following two calendar years, with collation of records, the use of financial software, and emphasis on a budget balance. This is discussed separately at the end of the description of Design B.

• Design C incorporates elements of both Designs A and B. It collects the detail of expense items as in Design A, while providing a household profile for six months.
To do both, it uses a more complex sample design, collects different information from different samples, and requires more extensive use of modeling to provide expenditure estimates and the household profile.

Design A—Detailed Expenditures Through Self-Administration

This prototype features a sample of households with two data collection waves, each involving the concurrent reporting of expenditures over a two-week period using a supported journal. The design also incorporates a self-administered recall survey for larger, less frequent expenses. The design maximizes the use of supported self-administration and concurrent reporting of expenses. Figure 6-1 provides a flow outline of Design A.
[…]
…methods that produce multiple values for planned or unplanned missing data from questionnaires, along with an understanding of how estimates and their standard errors can be generated by users. In addition, leveraging administrative and commercial data on expenditures will necessitate expertise in statistical methods for data linkage and integration. Estimation methods will require greater reliance on models and potentially the ability to create synthetic (fully imputed) datasets that can provide users with information to measure consumer behavior over time. BLS staff must be hired or trained to carry out these activities. Knowledge of sampling techniques and weighting will not be enough. More expertise is needed in model-assisted and model-based estimation methods for sampling, imputation, estimation, data integration, and error modeling to generate data products, evaluate methodological research, and quantify error in estimates (including the impact of methodological changes).

• Develop a more fluid bridge between operations, research, and expertise in other organizations. More flexibility will be gained if research and operations staffs have closer ties. Production staff can help think through the practical issues that might arise with a new method (e.g., what might work well, what problems could arise) and gain exposure to possible future changes in the survey well before they are called upon to implement them. Research staff may develop more effective experiments and gain an understanding of aspects of data collection they are not familiar with. However, it will be important for program staff and research staff to have the technical abilities necessary to communicate with one another. This means that the two staffs must have a basic understanding of what each will bring to the table for solving both the statistical and operational problems that are sure to arise in implementing any new CE survey design.
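The multiple-imputation methods described above have a standard recipe for how users generate estimates and standard errors: Rubin's combining rules. The sketch below illustrates those rules only; the five "completed-data" estimates and variances are invented for illustration and do not come from any CE dataset.

```python
# Minimal sketch of Rubin's combining rules for m multiply imputed datasets:
# the combined point estimate is the mean of the per-dataset estimates, and
# the total variance adds within- and between-imputation components.
import math

# Hypothetical per-dataset results from m = 5 imputed datasets.
q = [410.0, 405.5, 412.3, 408.8, 407.1]   # point estimates
u = [25.0, 26.1, 24.3, 25.7, 25.2]        # sampling variances

m = len(q)
q_bar = sum(q) / m                                 # combined point estimate
u_bar = sum(u) / m                                 # within-imputation variance
b = sum((qi - q_bar) ** 2 for qi in q) / (m - 1)   # between-imputation variance
t = u_bar + (1 + 1 / m) * b                        # total variance (Rubin, 1987)

print(f"estimate = {q_bar:.2f}, standard error = {math.sqrt(t):.2f}")
```

The between-imputation term `b` is what reflects the extra uncertainty introduced by the missing (or deliberately unasked) data, which is why a user who simply averaged the imputed datasets and used `u_bar` alone would understate the standard error.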
Other agencies have extensive expertise in areas that will be of interest to BLS as it redesigns the CE and other surveys. For example, the Census Bureau, which currently has responsibility for the CE data collection, has expertise in using administrative data to augment survey datasets and is devoting considerable energy to expanding its abilities in this area. Census staff have also conducted research on survey designs that administer partial questionnaires to each respondent. Joint research endeavors can be used to leverage expertise in these areas. In addition, where institutional barriers prove detrimental to conducting responsive research, it may be wise to develop partnerships with survey vendors who are able to provide a quicker and more effective research service than is possible within BLS.
Recommendation 6-9: BLS should increase the size and capability of its research staff to be able to effectively respond to changes in the contextual landscape for conducting national surveys and maintain (or improve) the quality of survey data and estimates. Of particular importance is to facilitate ongoing development of novel survey and statistical methods, to build the capacity for newer model-assisted and model-based estimation strategies required for today's more complex survey designs and nonsampling error problems, and to build better bridges between researchers, operations staff, and experts in other organizations that face similar problems.

Obtaining Necessary Expertise Through Others

The development of tablet-based applications requires technical expertise that is likely not available at BLS or the Census Bureau at present. Rather than take the time to develop that expertise in-house, the panel urges BLS to pursue outside expertise to speed up the development and evaluation of tablet-based applications. This will require detailed knowledge of the development environment (e.g., Android, Apple iOS) and familiarity with data collection tools such as those envisioned for the CE. Access to design and usability expertise will also be critical for the successful development of such applications, which will likely require close collaboration with BLS subject-matter staff. Relying on in-house expertise to develop the apps would likely result in development delays and possibly suboptimal designs. If BLS decides to pursue the tablet path to redesign, developing requirements for such contract work and engaging outside experts as soon as possible will be critical to the success of the redesign.
Recommendation 6-10: BLS should seek to engage outside experts and organizations with experience in developing tablet computer applications and in combining that development with appropriate survey methods.

Targeted Research Needed for Successful Implementation of Design Elements

The panel views the CE redesign not as one major effort, but as an ongoing process to continually address changes in the population and in survey methods. However, it is important to set priorities for aspects of the design that need the most immediate attention to achieve basic change. The CE survey methods group at BLS has undertaken many important projects to better understand survey errors and identify design features that can help reduce these errors. The panel's objective is to provide guidance on
important areas for further research and to assist in their prioritization. Below, the pathways for further research are divided into those needed to inform improvements to the surveys and those needed to inform the general design that has been suggested.

Research Needed to Support the CE Redesign

The Bureau of Labor Statistics will need to conduct research to support and inform the redesign and its implementation. The panel recommends the use of tablet technology as an important new technology for collecting expenditure data, and there are numerous aspects of its implementation that require evaluation. The panel also recommends the collection of fewer, less detailed expenditure categories in two of the prototypes, which requires evaluation of how this structure can be used to compute the CPI and how best to collect these data. The panel also recommends research on several other promising areas that may lead to further improvements of the survey to reduce burden and help obtain better quality data, presented in a separate subsection. This is by no means a comprehensive list of research areas, but an identification of several areas, related to implementation of the proposed designs, that need additional research.

Use of a tablet device. All three of the proposed designs recommend the use of a tablet device. There are numerous potential benefits in using a tablet, yet they depend on how a device is selected and implemented. Even the overall advantage of a tablet over a paper instrument is an assertion that needs to be evaluated rather than taken for granted.

The highest priority research is an evaluation of the tablet technology. Criteria may include utility, interface, robustness, data storage, transmission, and cost. Related to this is an evaluation of the optimum type of interface. The panel recommends one resembling the TurboTax model, which features separate modules that can be selected in any order and provides the respondent with an option to enter information directly into a form or to do so through a structured conversational guide. Other types of interfaces are possible, such as one that more closely aligns with a typical survey questionnaire or an interface that is more like an event history calendar. Conditional on a selected interface design, experimentation with different visual design elements that provide visually appealing and easy-to-understand features will also be beneficial. Whatever the interface design, it needs to have self-evident navigation, as in mobile apps.

Experimentation with the structure of the instrument will also be needed. For example, the instrument can be modularized by type of expenditure (e.g., food, clothing), it can have a linear structure (e.g., reporting all
expenditures for the day), or it can offer both options. Each approach has unique advantages and disadvantages, yet an understanding of any measurement differences is needed first. Experimentation will be needed to determine the best way for the technology to assist with key entry of items; possibilities include drop-down menus, automated "completion" of a word being entered, and other options. Experimentation is also needed to develop prompts that encourage reporting of forgotten expenditures.

In addition to designing an intuitive interface, a critical aspect of the design of the application is to maintain respondent engagement and to effectively motivate respondents to report all expenditures. For example, since respondents are provided with the tablet device, it is possible to include games and utilities that improve user engagement. The interface itself can use features of "gamification": applying the psychological and attitudinal factors underlying successful games to motivational strategies that improve user engagement, such as virtual badges, points, and status levels. Experiments will be needed to achieve a design that attains a high level of respondent engagement, which can in turn help collect more accurate and complete data.

The choice of a tablet device, the design of the interface, and the addition of any motivation features are all prerequisites for the successful implementation of a tablet device to collect expenditure data. Yet there are fundamental questions about the cost feasibility and ubiquitous use of tablet technology that need to be addressed. First among these is what proportion of households (consumer units, or CUs) will have the motivation and ability to use a tablet, once an intuitive interface has been constructed. The answer to this question does not necessarily affect the use of a tablet device, but it can inform the overall design and resource allocation.

Several direct comparisons to the use of a paper-and-pencil instrument will be needed to determine the desirability of using a tablet device. Such comparisons will also allow for a better understanding of measurement differences between the two modes, as a likely final design will involve the use of both modes by different sample members. Comparisons between the two modes will be needed for both cost and quality outcomes, as a trade-off is likely. Finally, much of the objective in the CE redesign lies in the reduction of respondent burden; the tablet and paper-and-pencil administration therefore need to be compared in terms of respondent burden.

How people keep financial records. A fruitful line of research may be to gain a better understanding of how different people and households keep financial and expenditure data. This could inform the methods used to collect the expenditure data and the design of the tablet interface, as well as
identify different subgroups for which the methods need to differ. For example, some households may have electronic records of nearly all expenditures, such as credit card and bank account statements; others may keep paper receipts for expenditures; a third group may use a combination of methods; and a fourth group may not maintain sufficient records in any form. Among those who keep electronic records, some may even use specialized software that serves as a single repository of expenditure data, such as tax-related software packages. Within a household, some may rely on a single person to be responsible for all expenditure records, while other households divide this responsibility by the type of expenditure or by the person who made the expenditure. These examples are certainly an oversimplification of what is invariably a complex and multifaceted recordkeeping phenomenon, but studies are needed to improve understanding of how households, and the individuals within them, keep expenditure records today.

Collecting data on a reduced set of 96 expenditure categories. Much of the burden in the CE surveys stems from the data requirements imposed on the surveys. A study is needed to investigate designs that minimize the number of questions and reduce burden on respondents while still acquiring accurate data. Both Design B and Design C require this research. The instrument can be reduced in a number of ways, but at a minimum, an evaluation is needed of the impact of collecting 96 categories of expenditures instead of the 211 more detailed expenditure categories now collected. A preliminary evaluation of the impact on the CPI, for example, can be conducted using extant data.

Use of incentives. The U.S. population has become more reluctant to participate in surveys (e.g., Groves and Couper, 1998; Stussman, Dahlhamer, and Simile, 2005), and incentives can help mitigate the effect on nonresponse. Key, however, is how incentives are incorporated into the survey design, if they are included. The panel did not venture to recommend a particular design, as this choice can only be informed through experimentation. Aspects that may warrant experimental manipulation include the structure (e.g., prepaid versus promised, household versus individual), timing (e.g., prior to, during, or after completion of the supported journal), form (e.g., cash, and if cash, whether it is delivered by electronic transfer), criteria for payment (e.g., a certain level of supported-journal completeness), amounts, and the potential use of differential incentives (e.g., for lower-compliance groups, or based on burden such as the number of people in the household). More detail on this topic is covered earlier in this chapter under "Guidelines for the Use of Incentives."
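The experimental manipulation described above amounts to crossing several incentive factors. As an illustration only (the factor levels and amounts below are hypothetical, not panel recommendations), a full factorial layout and random assignment of households to conditions can be sketched as:

```python
import itertools
import random

# Hypothetical factor levels for an incentive experiment, following the
# dimensions listed above (structure, timing, form, amount).
factors = {
    "structure": ["prepaid", "promised"],
    "timing": ["prior", "during", "after"],
    "form": ["cash", "electronic"],
    "amount": [20, 40],
}

# Full factorial: every combination of levels is one experimental condition.
conditions = [dict(zip(factors, levels))
              for levels in itertools.product(*factors.values())]

def assign(households, seed=0):
    """Randomly assign each sampled household to one condition."""
    rng = random.Random(seed)
    return {hh: rng.choice(conditions) for hh in households}
```

A full factorial with these levels yields 2 x 3 x 2 x 2 = 24 conditions; in practice a fractional design would likely be needed to keep sample sizes per cell workable.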
Instrument development. A substantial amount of research will be needed on instrument development. For example, experiments are needed to (1) investigate the optimum period over which to ask households to keep a supported journal, (2) evaluate measurement error as well as nonresponse rates and nonresponse error, and (3) evaluate the stability of the estimates. Other important areas for study, especially pertinent to the proposed designs, are the optimum recall period for different types of expenditures and the optimum time between interviews. All the proposed designs include a shift to greater reliance on self-reports, yet still involve interviewer administration to some degree. Thus, it would be beneficial to experiment with interviewer administration and self-administration of different types of questions.

Privacy versus open access. Each household member can input his or her own expenses, but there is a design choice in whether to allow each respondent to see the expenditures of others in the household. The panel recommends allowing all household members to view the recorded expenditures of the household. Such transparency within the household can limit duplication of expenses, but it raises a number of potential issues, ranging from unwanted disclosure of expenditures to other members of the household to intentional underreporting of expenditures due to the lack of privacy. Research is needed to compare providing household members with complete privacy (e.g., an individual login for each member and no information sharing between accounts) versus being able to see and assess the completeness of total household expenditures.

Potential impact from reducing proxy reporting of expenditures. The use of proxy reporting invariably involves error trade-offs, which need to be evaluated. Understanding whether the additional measurement error overwhelms the reduction in nonresponse error, compared to not using proxy reporting, can inform the use or avoidance of proxy reporting in the redesigned survey. Schaeffer (2010) provided useful guidance to BLS for evaluating the use of proxy reporting, including separate comparisons by topic, reference period, and the relationship between the sample member and the proxy respondent. As Mathiowetz (2010) pointed out, validation studies of the accuracy of self versus proxy reports for objective measures are virtually nonexistent; further research on the CE is needed. A further complication in the direct comparison is cost: proxy reporting will likely reduce data collection costs, and this saving will need to be weighed against the likely higher measurement error in proxy reports. However, the designs proposed by the panel encourage multiple respondents within the household at a minimum of additional cost.
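The proxy trade-off described above can be framed as a comparison of total error. A toy mean-squared-error comparison, with all numbers purely illustrative (not CE estimates), shows the structure of the evaluation:

```python
def mse(total_bias, variance):
    """Mean squared error decomposition: squared (net) bias plus variance."""
    return total_bias ** 2 + variance

# Hypothetical error components, in dollars, for a weekly spending estimate:
# self-reports only -> larger nonresponse bias, smaller measurement bias;
# with proxy reports -> smaller nonresponse bias, larger measurement bias.
self_only = mse(total_bias=5.0, variance=9.0)    # -> 34.0
with_proxy = mse(total_bias=3.5, variance=8.0)   # -> 20.25

# Under these made-up numbers the proxy design wins; flipping the bias
# figures flips the conclusion, which is why empirical estimates of each
# component are needed before choosing a design.
```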
Experiments with imputation methods and other statistical approaches. An important recommendation that is reflected in all three alternative designs, and especially Design C, is a greater integration of statistical methods into the survey design. The designs vary along this dimension with increasing reliance on statistical methods; two designs use subsamples with more intensive methods to calibrate the rest of the collected data, in order to reduce measurement error and provide accurate estimates for detailed types of expenditures. At least two lines of research are needed to inform the proposed designs. All designs involve the collection of data for a small number of weeks in a year, so weighting or imputation methods will be needed to produce annual expenditure estimates at the household level. And whichever data need is addressed, different statistical methods will need to be evaluated.

Evaluation of the effectiveness of using more intensive methods. The proposed designs suggest the use of more intensive methods to improve the accuracy of the data collected on all sample members, or the use of more intensive methods on a subsample in statistical adjustments for measurement error. Whether such approaches are warranted, and how they are implemented, depend on the effectiveness of the more intensive methods in obtaining more accurate data. That effectiveness, in turn, depends on what the methods entail. Therefore, experimentation is needed with different designs.

Additional Research

Although not necessitated by any of the proposed designs, several avenues for additional research could prove beneficial to the CE redesign, particularly in the long run.

Experiment with other technologies to record and extract data. Many technologies can be used to help record or extract available data on expenditures, such as scanners (including bar code scanners and receipt scanners), handheld devices and smartphones with cameras, and software that can facilitate the importing of statements. These technologies are rapidly evolving and therefore involve the risk of becoming outdated in both hardware and software. Furthermore, the way expenditure data are stored is also changing, and a challenge for any of these technologies is to remain relevant for the foreseeable future. Cautious investigation of technologies is recommended, but no single technology is likely to replace the collection of survey data.

Split questionnaire design. An area to which BLS has devoted considerable attention is the potential use of a split questionnaire design: a form of matrix sampling of survey questions in which several distinct forms of the
survey are constructed (thus modules are sampled, rather than individual questions) and respondents are randomly (although not necessarily with equal probability) assigned to one survey form.

Evaluate the utility of, and ability to obtain, data from additional sources. Attempts can be made to retrieve expenditure data either from other sources or directly from records that respondents have retained. These may include credit card and bank account statements, utility statements, pay stubs, and tax records. Some guidance on obtaining permission to access such records may be drawn from other surveys, such as the Residential Energy Consumption Survey (RECS). Most of this evaluation, however, may need to be tailored to the CE.

Augment the sample with wealthy households. The wealthiest households tend to be nonrespondents at a higher rate, causing substantial problems for some uses of the CE data. A potential remedy is to augment the CE with additional samples of wealthy households, based, for example, on IRS records or on income data linked to small geographic areas. BLS also has possibilities to link the CE sample of households to existing administrative data sources, such as IRS records, that can provide better information about nonrespondents. Design C has a base survey that would easily facilitate the selection of additional higher income households in its follow-on components.

Identify and evaluate sources of auxiliary data (e.g., retailer data). Replacement of some CE data with data from other sources, such as retailer data or data from other surveys such as the RECS, is a risky expectation, but certain survey or diary data elements may be replaced or augmented using other sources of data. More likely uses of auxiliary data, given their different error properties and the different reasons for their collection, are as benchmarks that can help evaluate the CE estimates and changes in those estimates, and as aids in sampling, post-survey adjustments, and estimation. One example would be to leverage auxiliary data (such as income) obtainable on sampled households from the Census Bureau to estimate nonresponse bias and improve nonresponse adjustments. A broad range of auxiliary data, such as IRS data, can be considered for a multitude of uses. Permission may be needed from a household to access these data, and research could explore "opt-out" permission (rather than "opt-in") for the CE surveys, which would allow access to a household's data unless the household sends a "no access" notification. Research by Pascale (2011) and Singer, Bates, and Hoewyk (2011) on the use of administrative records in other contexts provides useful background for conducting such research. Furthermore, these data sources change over time, so investigation of such sources needs to be ongoing rather than conducted at a single point in time.
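One standard way to use auxiliary data for nonresponse adjustment is a weighting-class adjustment: sampled cases are grouped by an auxiliary variable known for respondents and nonrespondents alike (here, a hypothetical income category), and respondent weights are inflated by the inverse of the weighted response rate within each class. A minimal sketch, with made-up data:

```python
from collections import defaultdict

def adjust_weights(sample):
    """Weighting-class nonresponse adjustment.

    `sample` is a list of dicts with keys: 'class' (auxiliary category
    known for the full sample), 'responded' (bool), and 'weight' (base
    design weight). Returns adjusted weights for respondents only."""
    base = defaultdict(float)   # sum of weights over all sampled cases
    resp = defaultdict(float)   # sum of weights over respondents
    for unit in sample:
        base[unit["class"]] += unit["weight"]
        if unit["responded"]:
            resp[unit["class"]] += unit["weight"]
    return [
        unit["weight"] * base[unit["class"]] / resp[unit["class"]]
        for unit in sample if unit["responded"]
    ]

# Hypothetical sample: high-income households respond at a lower rate,
# so the responding high-income household carries extra weight.
sample = [
    {"class": "high", "responded": True,  "weight": 100.0},
    {"class": "high", "responded": False, "weight": 100.0},
    {"class": "low",  "responded": True,  "weight": 100.0},
    {"class": "low",  "responded": True,  "weight": 100.0},
]
adjusted = adjust_weights(sample)
```

Note that the adjusted weights sum to the total base weight of the full sample, so class-level totals are preserved; the adjustment removes bias only to the extent that response propensity and expenditures are homogeneous within classes.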
Research Specific to a Single Prototype

These topics were listed earlier in the chapter under the descriptions of the panel's three prototypes, but they are repeated here so that the research needs appear together in one place in this report.

Design A—Detailed Expenditures Through Self-Administration:

• Develop models that would estimate quarterly and annual expenditures and income at the household level from the four weeks of reported detailed data plus the data reported on larger and routine expenditures.

Design B—A Comprehensive Picture of Expenditures and Income:

• Investigate the assumption that a "bounding" interview is unnecessary to avoid telescoping and other issues.
• Investigate the accuracy and completeness of aggregated expenditures for periods up to six months, and of estimates of averages (e.g., average monthly spending on gasoline) used in this prototype to construct a full set of microdata for the entire six-month period.
• Develop appropriate models to "disaggregate" aggregated expenses using data from the one-week supported journal.
• Develop methodology for a component that will use an intensive interview and process, based on prior collation of records and financial software, to achieve a budget balance for the year at the household level. Extend existing research by Fricker, Kopp, and To (2011) to fully evaluate its potential and limitations.

Design C—Dividing Tasks Among Multiple Integrated Samples:

• Research and develop models for estimation using the base survey and two waves of data collection.
• Research and develop models for imputing, at the household level, the "smaller expense items" collected on the detailed expenditure component (and not on the household profile component) into the household-level dataset, to complete the overall household expense profile.
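Several of the items above call for models that turn a few reported weeks into quarterly or annual household estimates. As a baseline against which such models could be judged, the crudest estimator simply scales the reported weeks to a year and adds the separately reported larger and routine expenditures. A sketch of that baseline, with hypothetical figures:

```python
def naive_annual_estimate(weekly_totals, large_routine_annual):
    """Scale the reported weeks to a full year (assuming, unrealistically,
    that the reported weeks are representative and ignoring seasonality),
    then add larger/routine expenditures reported directly on an annual
    basis."""
    weeks_reported = len(weekly_totals)
    scaled = sum(weekly_totals) * 52.0 / weeks_reported
    return scaled + large_routine_annual

# Hypothetical household: four reported weeks of detailed spending plus
# directly reported annual rent and insurance.
estimate = naive_annual_estimate(
    weekly_totals=[310.0, 295.0, 402.0, 333.0],
    large_routine_annual=14_400.0,
)  # -> 31820.0
```

The models the panel calls for would replace the flat 52/4 scaling factor with adjustments for seasonality, purchase frequency, and week-to-week variance, which is exactly why this research is needed.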
Recommendation 6-11: BLS should engage in a program of targeted research on the topics listed in this report that will inform the specific redesign of the CE.
The redesign of the CE is not a static operation, and the panel anticipates a long-term need for BLS to continue to propose, test, and evaluate new data collection methods and technologies. Thus the panel recommends that BLS maintain a methods panel to allow such testing into the future.

Recommendation 6-12: BLS should fund a "methods panel" (a sample of at least 500 households) as part of the CE base, which can be used for continued testing of methods and technologies. Thus the CE would never again be in the position of maintaining a static design with evidence of decreasing quality for 40 years.

SUMMARY

The current CE design has been in place since the late 1970s, and change is needed. The uses of the CE have grown over that time, and the current program tries to meet the needs of many users. The result is that the current surveys create an undesirable level of burden, and the data suffer from a number of quality issues. The panel believes that change should begin with BLS prioritizing the breadth and detail of data currently supporting the many uses of the CE, so that a new design can most efficiently and effectively target those priorities.

The panel offers three prototype designs, each of which meets the basic requirements presented in Consumer Expenditure Survey (CE) Data Requirements (Henderson et al., 2011). A given prototype may be a better fit than others, depending on the revised objectives of the CE. The prototypes also have considerable commonality: all are designed to promote an increased use of records; all incorporate self-administration (supported by the field representative, a tablet computer, and a centralized support facility) as a mode of data collection; and all use incentives to motivate respondents.

This report provides guidance to BLS on the next steps toward redesign. It recommends that BLS produce a roadmap for redesign within six months. The report provides guidance on how to incorporate new technology, particularly the tablet computer. The redesigned CE will still be a difficult survey for respondents, and the panel recommends developing an effective program of incentives to enhance motivation; the report provides guidance for doing so.

The panel understands that a redesign of the CE will require significant targeted research to develop specific procedures that are workable and most effective. The report outlines the research that is needed and offers the panel's suggestions on the priority of those research endeavors. The panel recommends that BLS enhance the size and capability of its in-house research program in order to carry out the targeted research and also to
meet additional challenges in future years. Finally, the panel recommends that BLS reach out to other organizations for assistance in implementing the tablet-based data collection system and the apps that will make it work smoothly.

The panel has great confidence that BLS, with its dedicated and knowledgeable staff, will be able to move forward successfully toward a new CE. We trust that this report has helped in that process. It has been a challenging opportunity to consider these issues and make recommendations.