
Reengineering the Census Bureau's Annual Economic Surveys (2018)

Suggested Citation:"4 Harmonization of Questionnaires and Data Collection Processes." National Academies of Sciences, Engineering, and Medicine. 2018. Reengineering the Census Bureau's Annual Economic Surveys. Washington, DC: The National Academies Press. doi: 10.17226/25098.

4

Harmonization of Questionnaires and Data Collection Processes

This chapter examines the data collection processes for the U.S. Census Bureau’s annual economic surveys covered in this report. It considers the development and harmonization of questionnaire content and field data collection operations, working toward the panel’s recommended Annual Business Survey System (ABSS). At present, the surveys largely operate as “silos,” with each survey having its own way of working, its own methodology, and its own implementation plans. Even some Census Bureau employees seemed surprised by the many differences that were revealed in the Bureau’s presentations to the panel.

In recent years, the Census Bureau has taken steps toward streamlining the survey and statistical processes for the annual economic surveys. By 2015, the Bureau had transferred survey data collection efforts from the staff of the individual surveys to a centralized management division, and subsequently described to the panel its planned efforts under the direction of an “Econ Hub” to harmonize questionnaire content and style across the surveys.1 The intent of these efforts is to use similar processes across the annual economic surveys to achieve improvements in quality and reductions in respondent burden and costs. In a recent presentation at the International Conference on New Techniques and Technologies, Ron Jarmin, the Bureau’s Associate Director for Economic Programs, explained that the Census Bureau wants to move from “siloed governance to corporate governance” (Jarmin, 2017). There has been considerable progress toward harmonization across the different annual surveys, but more remains to be done.

___________________

1 Based on a presentation by Jessica Wellwood, U.S. Census Bureau, “Econ Hub: Content Harmonization,” during the panel’s June 3, 2016, meeting.

Several national statistical offices, such as those in Canada and the Netherlands, have moved away from specialized business surveys toward a more integrated, process-oriented organization in which new organizational units have each been assigned a specific process for all of the surveys. The result is that questionnaires, communication materials, and field operations all can be developed in a coordinated fashion, with the goal of harmonizing as much as possible while tailoring some materials to the circumstances of particular sampled industries. The panel notes that one of the goals of the Census Bureau’s Econ Hub is to achieve this questionnaire and process alignment and that the Bureau has taken initial steps to move in this direction.2 With a more harmonized system, one potential benefit is that staff time for data collection process management and information technology support can be reduced. Although the achievable labor savings are likely to vary across organizations, the 2000 redesign of the Annual Structural Surveys at Statistics Netherlands (see discussion in Chapter 8) made it possible to reduce staffing for these surveys by 40 percent.3

4.1 DATA SOURCES

An important consideration in reengineering the Census Bureau’s annual economic surveys is the choice of data sources, which currently include surveys as the primary source, together with federal administrative data and annual reports and 10-K reports that public companies are required to provide to both shareholders and the Securities and Exchange Commission.4 With the increasing difficulty and cost of maintaining high response rates for the annual economic surveys and the growing availability of alternative data sources, there is the potential to move the surveys and a future ABSS toward a mix of data sources, including sources not in current use, that could better meet users’ needs for timeliness, consistency, and additional relevant content.

As discussed in Chapter 3, administrative data supplied by the Internal Revenue Service (IRS) are the backbone of the Census Bureau’s Business Register, which also draws on data from the Social Security Administration and the Bureau of Labor Statistics (BLS). The register is regularly updated by the Company Organization Survey (COS) and the Business and Professional Classification Survey (SQ-CLASS). Table 4-1 summarizes the

___________________

2 Based on a presentation by Susanne Johnson, U.S. Census Bureau, “Data Collection Strategy and Response Monitoring,” during the panel’s June 3, 2016, meeting.

3 Personal communication from Wim Vosselman and Jeroen van Velzen, Statistics Netherlands, July 21, 2016.

4 The required form gives a comprehensive summary of a company’s financial performance.


TABLE 4-1 Uses of Administrative and Other Data for the Census Bureau’s Annual Economic Surveys

            Uses of Administrative Data from the Business Register                 Uses of Company Reports
Survey      Sampling-      Editing and       Imputation      Use of Data in       Imputation for
            Related        Evaluation of     for Missing     Lieu of              Missing
            Purposes^a     Survey Responses  Responses       Questionnaires       Responses
ASM         x              x                 x               x^b
M3UFO       x              x                 x               x
MOPS        x
ARTS        x              x                 x               x^c                  x
AWTS        x              x                 x               x^c                  x
SAS         x              x                 x               x
ACES        x              x
ICTS^d      x              x
ASE         x                                ^e
SQ-CLASS    x
COS         x

NOTES: See text for full names of surveys.

^a Includes weighting; some surveys draw new samples each year from the register, while others draw completely new samples only once every 5-6 years (see Chapter 5).

^b ASM uses Business Register administrative data for sampled employer businesses that it defines as “small” and “medium.” (ASM does not sample nonemployer businesses.)

^c ARTS and AWTS use administrative data for sampled nonemployer businesses. They draw IRS administrative records data from the Census Bureau’s annual Nonemployer Statistics (NES) program (see https://www.census.gov/programs-surveys/nonemployer-statistics/about.html [November 2017]) and not directly from the Business Register. The same IRS records feed both the NES and the register.

^d Survey is currently suspended.

^e ASE uses data from the previous economic censuses for imputation.

SOURCE: Information provided to the panel by the Census Bureau; see also Appendix C.

ways in which the administrative data contained in the Business Register are used in each annual survey program. The register provides the sample frame for all 11 surveys (see Chapter 5), including the Annual Survey of Manufactures (ASM), Manufacturers’ Unfilled Orders Survey (M3UFO), Management and Organizational Practices Survey (MOPS), Annual Retail Trade Survey (ARTS), Annual Wholesale Trade Survey (AWTS), Service Annual Survey (SAS), Annual Capital Expenditures Survey (ACES), Information and Communication Technology Survey (ICTS),5 Annual Survey of Entrepreneurs (ASE), SQ-CLASS, and COS. Several surveys use register data for editing and imputation (see Chapter 6), while some surveys obtain information on smaller businesses from IRS tax records—the same information that is used in the register—instead of sending them a questionnaire.

Some surveys do not incorporate data from the Business Register when survey data are not available. For example, as detailed in Appendix C, ASE considers the register information on receipts to be “unreliable”; instead, it uses a model incorporating information (generally older) from the economic censuses to impute missing information on receipts. In contrast, ARTS and AWTS use IRS data for nonemployer businesses in lieu of asking for survey responses, and ASM uses IRS data for employer businesses that it defines as small and medium in lieu of asking for survey responses. In some cases, when IRS data are not available on a timely basis, these surveys perform imputations for nonemployer businesses. Finally, several surveys use 10-K and annual reports for public companies to address item nonresponse.
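The mix of sources just described amounts to a small set of decision rules. The sketch below restates those rules in code purely for illustration; the field names, size classes, and rule ordering are assumptions of this example, not documented Census Bureau practice:

```python
from dataclasses import dataclass

@dataclass
class Business:
    """Hypothetical frame record; field names are illustrative only."""
    is_employer: bool
    size_class: str            # "small", "medium", or "large" (survey-defined)
    is_public: bool            # files 10-K and annual reports with the SEC
    irs_data_current: bool     # timely IRS administrative data on file

def choose_data_source(b: Business) -> str:
    """Pick a primary data source for one sampled business.

    Mirrors the pattern described in the text: IRS records in lieu of a
    questionnaire for nonemployers (ARTS/AWTS) and for small and medium
    employers (ASM); imputation when IRS data arrive late; questionnaires
    otherwise, with 10-K filings backstopping item nonresponse.
    """
    if not b.is_employer:
        return "irs_records" if b.irs_data_current else "impute"
    if b.size_class in ("small", "medium"):
        return "irs_records" if b.irs_data_current else "impute"
    # Large employers are asked to respond directly; public companies'
    # 10-K filings can later fill item nonresponse.
    return "questionnaire_with_10k_backstop" if b.is_public else "questionnaire"
```

In a harmonized system, a single rule set of this kind would replace the survey-by-survey decisions each program makes today.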

Overall, each survey program has apparently come to different conclusions about whether and to what extent the Business Register administrative records-based data are suitable to use for editing and imputation for incorrect or missing data or for substitution for a survey questionnaire. These decisions reflect a variety of considerations, including not just the quality of the register data, but also other factors such as the units for which data are available on the register in comparison with the units as defined for the survey. However, the panel believes that any concerns about the quality or content of register data can be addressed (see Chapter 3) and that expanding the use of these data could yield substantial efficiencies.

The Census Bureau is allowed to use administrative data where possible in the production of its statistics and specifically to use administrative data in place of “direct inquiries” (U.S. Code Title 13). Section 6 “requires that the Census Bureau use administrative data from other agencies, state and local governments and other instrumentalities, and private organizations instead of conducting direct inquiries if such data meet the quality and timeliness standards of the Census Bureau” (Gates, 2009). Three such uses could include: (1) directly incorporating administrative data for smaller businesses in place of a questionnaire; (2) replacing selected questions with administrative data for larger businesses; or (3) informing larger businesses of the administrative values on file at the Census Bureau and asking if they are correct. Any of these uses would reduce respondent burden and could improve quality and timeliness. For the third option, research would be needed to determine the extent to which businesses might respond that the administrative values on file were correct when actual consultation of their records would have resulted in a different answer. The additions to the content of the Business Register that we recommend (see Recommendation 3-1, in Chapter 3) would make it more desirable and feasible to expand the uses of the register data along these lines.

___________________

5 The ICTS has been suspended since 2014 due to lack of funds.

Other statistical agencies around the world have made a deliberate choice to use administrative data more systematically where possible rather than collecting survey data (see, e.g., Bakker and van Rooijen, 2012; U.N. Economic Commission for Europe, 2007; Wallgren and Wallgren, 2007). For example, at Statistics Netherlands, following years of study and increasing incorporation of administrative information for statistical purposes, the use of register data was formalized in 2004 with the new Act on Statistics Netherlands. This law states that, in order to reduce response burden, secondary sources must be considered first, before conducting surveys (see, e.g., Snijkers, Göttgens, and Hermans, 2011). At Statistics Canada, the Integrated Business Statistics Program systematically incorporates the use of administrative data to reduce respondent burden (Statistics Canada, 2015).

More widespread use of administrative records in survey-based programs may reduce both survey costs and respondent burden. Following U.S. Census Bureau Statistical Quality Standards: Section B2, Acquiring and Using Administrative Records (U.S. Census Bureau, 2013), administrative data and any other potentially useful sources for the annual economic surveys and an ABSS (e.g., those similar to the annual reports and 10-K forms already being used for public companies) would need to be evaluated for quality, the timeliness of their delivery, and the availability of metadata (the definitions of the data). Daas and colleagues (2011) provide an in-depth discussion of quality issues related to secondary sources; see also Bakker and van Rooijen (2012) and work conducted by the European Statistical System network (ESSnet).6 For an in-depth discussion of the use of multiple data sources in constructing statistics, see National Academies of Sciences, Engineering, and Medicine (2017).

RECOMMENDATION 4-1: The Census Bureau should carry out a comprehensive evaluation of the potential for using administrative data sources as a supplement to or an alternative for collecting survey responses for its various annual economic surveys. The Bureau should move toward a mixed-data sources approach for an Annual Business Survey System to reduce respondent burden and, potentially, to reduce costs and improve data quality.

___________________

6 See, e.g., the ESSnet project on the use of administrative and accounts data for business statistics, https://ec.europa.eu/eurostat/cros/content/use-administrative-and-accounts-data-business-statistics_en [November 2017]; and the ESSnet project on the Quality of Multisource Statistics—KOMUSO, https://ec.europa.eu/eurostat/cros/content/essnet-quality-multisource-statistics-komuso_en [November 2017].

4.2 KEY CONCEPTS FOR DATA COLLECTION

The first step in harmonizing data collection for the Census Bureau’s annual economic surveys (whether data are collected in a survey or obtained from other sources) is to identify and harmonize the key concepts that are collected across the questionnaires. This does not mean that industry-specific content is to be ignored, but, rather, that industry-specific variation should be avoided except when necessary. In addition, it is important to understand that aligning concepts is different from aligning questions. Concepts may be defined similarly for all industries, but operational definitions may be tailored to various industries. For example, when measuring total annual revenue, the appropriate reference period may be the calendar year or a fiscal year. This reference period may vary across industries, and questions could be tailored to include the most common reference period for different industries. Alternatively, respondents could be allowed to define whether a calendar or fiscal year is appropriate for their reporting, with a question asked of all respondents about what reference period was used. We refer to this as a “tailoring within harmonization” approach.

Harmonization efforts should be forward looking, working toward the panel’s recommended ABSS. For example, any harmonization initiative that is undertaken should review the concepts underlying the transition to the North American Product Classification System across all industries, not just manufacturing (see Chapter 3). The initial harmonization effort that the Census Bureau has launched under its Econ Hub can be the key to building a question repository from which items for any survey and industry can be obtained for future data collection.
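A question repository of the kind envisioned for the Econ Hub could store each harmonized concept once, with industry-tailored wordings attached. The schema below is a hypothetical sketch (all class, field, and question names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class ConceptItem:
    """One harmonized concept with industry-tailored question wordings."""
    concept_id: str                     # e.g., "total_annual_revenue"
    definition: str                     # single harmonized definition
    base_wording: str                   # default question text
    tailored_wordings: dict = field(default_factory=dict)  # NAICS prefix -> wording

    def wording_for(self, naics: str) -> str:
        """Return the most specific tailored wording for a NAICS code,
        falling back to the harmonized base wording."""
        for prefix in sorted(self.tailored_wordings, key=len, reverse=True):
            if naics.startswith(prefix):
                return self.tailored_wordings[prefix]
        return self.base_wording

# Illustrative entry: one concept, one retail-sector tailoring.
revenue = ConceptItem(
    concept_id="total_annual_revenue",
    definition="Total operating revenue for the reference year",
    base_wording="What was this business's total operating revenue?",
    tailored_wordings={"44": "What were this business's total retail sales?"},
)
```

The point of the design is that the definition lives in exactly one place, so tailoring a wording for an industry cannot silently change the underlying concept.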

As the Census Bureau proceeds to harmonize and align key concepts across the various annual economic surveys, it will be important to keep in mind: (1) the user perspective, related to the relevance of the data; (2) the respondent perspective, related to tailoring questionnaires to industries and minimizing respondent burden; and (3) the methodological perspective, especially with respect to data quality. In addition, to ensure that all data sources are measuring the same concept to the extent feasible, harmonization will need to consider the concepts underlying administrative records (and other data sources) that are planned for use in place of survey questions for some or all sampled businesses.


4.2.1 Key Concepts in Current Surveys

The key concepts currently measured in the annual economic surveys are shown in Table 4-2. Some of the surveys, such as M3UFO, ACES, and ICTS, collect specialized information that does not have a counterpart in the surveys for industry sectors. Focusing on four of the industry sector surveys—ASM, ARTS, AWTS, and SAS—and delving into the questionnaire details, it becomes clear that these surveys do not collect a consistent set of information and that concepts that may appear the same are not necessarily the same. We offer three examples of these differences:

  1. ASM is the only one of the four surveys that collects any information on employment, yet such information for other sectors (including on alternative work arrangements, such as contracting, telework, and others—see “Data Gaps,” Section 4.3 below) would be of considerable interest to users.
  2. All four surveys collect information on e-commerce, but the definitions are slightly different, as is the requested information, and it is not known which version produces the most accurate data with the most comparability across industry sectors:
    • ASM asks for the percentage of the reported “total value of products shipped and other receipts” accounted for by e-commerce, defined as “goods that were ordered or whose movement was controlled or coordinated over electronic networks.”
    • ARTS asks for the dollar amount of sales from e-commerce, defined as “the sale of goods and services where the buyer places an order, or the price and terms of the sale are negotiated, over an Internet, mobile device (m-commerce), extranet, EDI [Electronic Data Interchange] network, electronic mail, or other comparable online system. Payment may or may not be made online.”
    • AWTS asks for the dollar amount of e-commerce sales, separately for EDI network sales and online systems sales, using a similar definition of e-commerce to that used in ARTS.
    • SAS asks for either the dollar amount or the percentage of revenues accounted for by e-commerce, defined as revenues from “customers entering orders directly on the company’s website or mobile applications, . . . on third-party websites or mobile applications, . . . or via any other electronic systems (such as private networks, dedicated lines, kiosks, etc.).”
  3. All four surveys indicate that goods delivered on a rental or leasing basis are to be included when calculating total revenues, but the instructions for ASM ask the respondent to include the estimated market value of “high cost office and production equipment” leased to a customer and not the lease payments, while lease payments requested in ARTS, AWTS, SAS, and ASM presumably include information for other leased products.
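One practical consequence of these definitional differences is that the four e-commerce measures are not directly comparable: ASM reports a percentage of shipments while ARTS and AWTS report dollar amounts. A minimal conversion step, shown here only to illustrate the comparability issue, would be:

```python
def ecommerce_dollars(total_shipments: float, ecommerce_pct: float) -> float:
    """Convert a percentage-of-shipments report (as in ASM) to a dollar
    figure comparable with dollar-amount reports (as in ARTS and AWTS)."""
    if not 0.0 <= ecommerce_pct <= 100.0:
        raise ValueError("percentage must be between 0 and 100")
    return total_shipments * ecommerce_pct / 100.0
```

Arithmetic alone cannot reconcile the differing definitions of what counts as e-commerce, of course; that reconciliation is the harmonization task itself.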

Harmonization likely would have value for many of the concepts asked across multiple questionnaires in the annual economic surveys. Harmonization would need to take into account the implications for the corresponding concepts in the economic censuses and the monthly and quarterly economic indicator surveys.

The experience of Statistics Netherlands with the harmonization of its Annual Structural Surveys, undertaken in 2000, may be illuminating. The basic approach was what we have referred to as “tailoring within harmonization.” The Census Bureau’s annual economic surveys are tailored to some extent within a specific industry—for example, SAS has questionnaire versions for subsectors of the service industry. Statistics Netherlands took that concept further to tailor its surveys across all North American Industry Classification System (NAICS) groups.

In this process, Statistics Netherlands staff focused on the key concepts and objectives underlying the questions on the various surveys included in the harmonization effort. After identifying those concepts and objectives, they worked toward a harmonized questionnaire design (see Section 4.5 below, “Questionnaire Design and Development”). The result was a core questionnaire with 58 versions, tailored by NAICS main group and business size (one version for each industry for large businesses and a shorter version for smaller businesses). Each tailored NAICS-specific core questionnaire collects information on the same concepts but asks the core questions in a way intended to be meaningful to respondents in that industry (Snijkers and Willimack, 2011). Today, Statistics Netherlands has 90 tailored versions of the core questionnaire, though in some cases the difference between these versions is as small as one question asked only of respondents in a particular NAICS main group. In addition to the tailored core questionnaires, NAICS subgroup–specific concepts were identified, including data items that only need to be collected for specific sectors of the economy. These items are included on subquestionnaires administered to businesses in the appropriate NAICS codes. The entire questionnaire for a business consists of the applicable core questionnaire supplemented with the questions in the industry-specific subquestionnaire.

TABLE 4-2 Key Concepts Measured in the Census Bureau’s Annual Economic Surveys

ASM: Total annual employment, payroll, cost of materials, operating expenses, value of shipments, e-commerce shipments, value added, end-of-year inventories, inventory held outside the United States, and value of product shipments

M3UFO: Total annual sales and total year-end value of unfilled orders from domestic (U.S.) companies for selected in-scope manufacturing activities

MOPS: For manufacturing establishments, use of performance indicators; basis for managers’ bonuses; promotion methods; treatment of underperforming staff; whether establishment or headquarters decides on hiring, pay increases, product pricing, new product introduction, and advertising; availability and use of data to support decision making; use of predictive analytics; and background characteristics of employees

ARTS: Total annual sales, e-commerce sales, end-of-year inventories, inventory held outside the United States, gross margins, total operating expenses (detail every 5 years), purchases, taxes, accounts receivable, and merchandise lines

AWTS: Total annual sales, e-commerce sales, end-of-year inventories, inventory held outside the United States, purchases, and total operating expenses (detail every 5 years)

SAS: Operating revenue and expenses for both taxable and tax-exempt companies and organizations, e-commerce data, sources of revenue by type, export and inventory data for selected industries only

ACES: Capital expenditures for new and used structures and equipment, capital leases, capitalized computer software, internally developed software, prepackaged software, vendor-customized software, domestic depreciable assets, gross sales, operating receipts, and revenue

ICTS^a: Two categories of noncapitalized expenses (purchases; operating leases and rental payments) and capital expenditure data for four types of information and communication technology equipment and software (computers and peripheral equipment; information and communication technology equipment, excluding computers and peripherals; electromedical and electrotherapeutic apparatus; and computer software, including payroll associated with software development)

ASE: Core questionnaire: demographic information (gender, ethnicity, race, veteran status, educational attainment), and reasons for ownership; hours worked, job functions, activity level (e.g., seasonal) for owner(s); sources of funding for business; effects of regulations on profitability; outside advice sought; types of customers and employees; whether have e-commerce revenue or operations overseas. Supplement content varies.

SQ-CLASS: Two months of sales or receipts, principal lines of merchandise, company organization, North American Industry Classification System codes, wholesale inventories, class of customer (for retail and wholesale businesses), and tax status

COS: Establishment-level employment, quarterly and annual payroll, operational status classification, and physical location

NOTES: See text for full names of the surveys.

^a Survey is currently suspended.

SOURCE: Key concepts are those identified on the Census Bureau website for each survey and by the panel from an inspection of the survey questionnaires.
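The Statistics Netherlands assembly rule, a tailored core questionnaire plus a NAICS-specific subquestionnaire, can be expressed compactly. The version names and lookup keys below are invented for illustration:

```python
def assemble_questionnaire(naics_main: str, size: str,
                           core_versions: dict, subquestionnaires: dict) -> list:
    """Build the full instrument for one business.

    core_versions maps (NAICS main group, size class) -> list of core items;
    subquestionnaires maps a NAICS group -> extra sector-specific items.
    """
    core = core_versions[(naics_main, size)]
    extras = subquestionnaires.get(naics_main, [])
    return core + extras

# Illustrative content for one retail NAICS group ("44"), two size classes.
core_versions = {
    ("44", "large"): ["revenue", "e_commerce", "inventories", "expenses_detail"],
    ("44", "small"): ["revenue", "e_commerce"],  # shorter version for small firms
}
subquestionnaires = {"44": ["merchandise_lines"]}
```

Every version asks about the same harmonized concepts; only the selection and wording of items varies by industry and business size.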

4.2.2 Users’ Perspective

The panel’s proposed harmonization process provides the opportunity for the Census Bureau to assess the relevance of key concepts to producing useful statistics. This process requires addressing two questions: whether the existing key concepts are still relevant, considering user needs and a changing economy; and in what areas the list of key concepts may need to be expanded. Additional concepts that have been identified by data users as crucial to providing greater understanding of the economy, such as information on innovation and global value chains (see Chapter 2), could be included at the stage of concept identification: see Section 4.3 below (“Data Gaps”) for the panel’s suggestions in this regard. Answering these two questions about users’ data needs will require additional discussions with stakeholders, especially regarding decisions that may be made to exclude items or concepts from the set of concepts used in a core questionnaire or subquestionnaire.

4.2.3 Respondents’ Perspective

During the panel’s proposed concept review, issues related to question overlap will need to be addressed and resolved. First, respondent burden can be reduced by eliminating duplicative questions. Second, response burden related to the selected concepts and data items can be identified through feasibility studies and the results of any pretests of the annual survey questionnaires. In particular, when identifying concepts, it is important to determine not only whether the information requested is easily available in business records, but also whether the Census Bureau’s understanding of the concepts aligns with respondents’ understanding.

A team of survey methodologists at the Census Bureau has done work on this issue. Their methods include cognitive interviews with businesses, respondent debriefings, and usability testing, among others. In a presentation to the panel, Willimack described a number of concepts that the team has found are difficult for businesses to report or to find in their records, as well as survey questions that use language that does not align with the language used by businesses. One example is that “cost of R&D [research and development]” in survey questions is interpreted as “R&D expense” by responding businesses. In other words, there has been a disconnect between the concept of R&D “performed” that was of interest to the survey designers and the concept of R&D “paid for” that respondents understood to be what was requested.7

A third set of issues includes difficulties encountered by respondents in providing a detailed accounting of employees or operating expenses by type. Empirical information about potential data quality problems will be essential to incorporate in the major redesign of questionnaires that leads to an ABSS.

4.2.4 Methodological Perspective

Along with the perspective of respondents, the expertise of methodologists is also needed, together with the empirical evidence they can provide on the quality of responses to various questions. For instance, comparisons between the information provided by businesses in response to a question and corresponding information from the Business Register may be illuminating in terms of potential measurement errors and whether it would be an improvement to obtain the information directly from the register in lieu of a question. Examination of item nonresponse rates, imputation rates, and editing rates may also provide insights into the reliability and validity of responses, as well as the response burden they impose on businesses.

Methodologists also may be able to identify instances in which concepts and questions overlap, which can result in dropping selected questions from the survey questionnaires. Alternatively, it may be that a matrix sampling approach for content, whether in the current surveys, or in the core questionnaire or one or more subquestionnaires for an ABSS, would be an appropriate means of reducing burden and improving response. In such an approach, all sampled businesses would receive some questions (Y) in common, but different subsamples would receive subsets of other questions. For example, subsample A might receive A, B, and Y questions, while subsample B receives B, C, and Y questions, and subsample C receives C, A, and Y questions. Imputation would be used to supply missing values for respondents not asked a question subset by modeling the responses for the other subsamples. This approach adds complications for questionnaire design and processing, but it could be useful to explore when all of the data are needed, there is clear evidence of burden that adversely affects the quality of responses, and appropriate administrative records are not available to replace questions.
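The matrix sampling design described above can be made concrete with a small sketch; the block labels and rotation rule here are illustrative assumptions (in practice, assignment would be randomized within strata):

```python
# Matrix-sampling design: every business gets the common block Y;
# each of three subsamples gets two of the three rotating blocks,
# so every block is observed in two of the three subsamples.
BLOCKS = {
    "A": ["A", "B", "Y"],
    "B": ["B", "C", "Y"],
    "C": ["C", "A", "Y"],
}

def assign_blocks(business_index: int) -> list:
    """Deterministically rotate businesses across the three subsamples.

    A simple modulus stands in for randomized assignment. Block "Y" is
    always asked; the one block a subsample skips would later be imputed
    by modeling responses from the other two subsamples.
    """
    subsample = "ABC"[business_index % 3]
    return BLOCKS[subsample]
```

Because each rotating block is answered by two-thirds of the sample, the design trades some imputation variance for a one-third reduction in the rotating-question burden on each respondent.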

Similarly, methodologists may conclude that some questions, for which responses are known to be stable over time, could be asked less frequently than once a year. Currently, the Business Expenses Supplement is included

___________________

7 Based on a presentation by Diane K. Willimack, U.S. Census Bureau, “Testing and Evaluation of Questions and Instruments,” during the panel’s June 3, 2016, meeting.


on ASM annually but only every 5 years on ARTS and AWTS, and MOPS is a quinquennial supplement to ASM. Reducing the frequency of some questions could be investigated more fully as part of harmonizing concepts and questions among the annual economic surveys and for an ABSS, although it must be recognized that asking questions in some years and not others impedes users’ ability to construct time series.

Finally, methodologists can assist with various kinds of research, in addition to pretesting, that are important for concept evaluation. In social surveys, results from pretesting often deal with comprehension issues, the first step in the Tourangeau question-answer model (Tourangeau, Rips, and Rasinski, 2000). This approach also is applicable for business surveys, but in the business context the process of retrieving information from records also needs to be studied carefully (Giesen, 2007; Willimack and Nichols, 2001). Retrieval in business surveys involves the overall response process for a business, including the number of response coordinators, respondents, and data providers (e.g., in departments such as accounting, human resources, and others) who play a role in this process, as well as the various sources from which the data need to be retrieved. Time also plays a role, as the data may not yet be available when the survey is conducted but may become available at a later date. Bavdaz (2010), Haraldsen, Snijkers, and Zhang (2015), and Lorenc (2007) address these issues in business responses. A number of methods can be used to study the response (or questionnaire completion) process and to appropriately tailor a questionnaire to the business context, such as early-stage scoping, feasibility studies, recordkeeping studies, or accounting expert reviews. Snijkers and Willimack (2011) and Willimack (2013) provide an overview of methods that can be used in the concept and data item definition stage. The results of these investigations can be used to inform the design of a questionnaire, as well as the Census Bureau’s communications strategy. In addition, when a questionnaire has been drafted, it is critical that it be pretested with organizations that are or could be selected into the survey as respondents.

When Statistics Netherlands carried out a sequential process of the sort just described, questions on the original questionnaires that eventually were dropped included those with high item nonresponse rates or other indications that the responses were of poor quality. Stakeholders were consulted before the questions were dropped, with negotiation between Statistics Netherlands and users about balancing content, response burden, and data quality. Although data users always want more data with high levels of detail, it is nonetheless important for a statistical agency not to defend retaining questions that have high response burdens and low-quality answers.

In the redesign of its Annual Business Survey, Statistics Canada had the same experience. Data users wanted to have all the detailed questions on the questionnaire, but empirical evidence about the poor quality of some of


the questions resulted in those items being dropped. Being able to propose other solutions, such as substituting modeled estimates or obtaining the data from other sources, can be an important factor in user acceptance of a decision to drop a question. If questions that have high item nonresponse rates and perform poorly on other quality metrics are to be retained, it must be absolutely clear why they are needed and that there is no other way of obtaining the data.

4.2.5 A Concept Harmonization Team

Table 4-3 displays information about the Census Bureau units that are currently responsible for the development of the annual economic survey questionnaires. All of the annual surveys receive operational support from the Economic Management Division (EMD). The Innovation and Technology Office and the Application Services Division are responsible for the development of the instruments. Subject-matter experts within various divisions, including the Economy-Wide Statistics Division (EWD), the Economic Indicator Division (EID), and the Center for Economic Studies (CES), are responsible for the content of the various annual surveys. Although in information provided to the panel some surveys cited input from the Economic

TABLE 4-3 Groups Responsible for Content in Questionnaires Used for the Census Bureau’s Annual Economic Surveys

Survey Subject Matter within EWD Subject Matter within EID Subject Matter within CES ESMD
ASM x
M3UFO x
MOPS x x
ARTS x
AWTS x
SAS x x
ACES x
ICTSa x
ASE x x
SQ-CLASS x
COS x x

NOTES: See text for full names of the surveys.

aSurvey is currently suspended.

SOURCE: Information provided by the Census Bureau to the panel.


Statistical Methods Division (ESMD) Data Collection Methodology and Research Branch, in many cases mentions of methodological and questionnaire design expertise were notably missing.

We envision that EMD would establish a content harmonization team, separate from the staffs of the individual annual economic surveys, to develop harmonized concepts to be introduced into the various annual economic surveys and ultimately into a new ABSS. The team would have a goal of developing concepts for a core questionnaire and tailored subquestionnaires. We envision a small team that includes both subject-matter experts and survey methodologists with expertise in the business context; to avoid protracted and conflicting discussions of content, the team should not be too large. One possible approach would be to form a subgroup of subject-matter specialists, including one subject-matter expert from each of the industry sectors. Such a group of subject-matter experts would be assigned to specify the content and concepts to be measured in the new core instrument and those to be included in subquestionnaires. Additional subject-matter experts could be brought in at that point to flesh out the content for the various subquestionnaires.

Survey methodologists and measurement experts would be critical at this stage in evaluating existing and proposed concepts. Methodologists would be able to provide input into how companies keep their records and whether information is available at the level of aggregation desired (establishment or enterprise). It is important to reiterate that, as discussed in U.S. Census Bureau Statistical Quality Standards: Section A2, Developing Data Collection Instruments and Supporting Materials (U.S. Census Bureau, 2013), the role for survey methodologists and measurement experts goes beyond the evaluation of individual questions. Studying the business context at an early stage provides important input to questionnaire development, as do previous pretest results.

Questionnaire development would draw on the same team, but with different roles. Just as the subject-matter experts would have primary responsibility for concept specification, the survey methodologists with the expertise to write survey questions across industries would have primary responsibility for designing and developing the questionnaires and for pretesting questionnaire drafts; they would work with information technology staff on programming the questionnaires and continue to work with subject-matter experts. The questionnaire development team could be managed by a project manager reporting directly to a more senior manager who identifies the roles and responsibilities in the concept and questionnaire harmonization process.

In a presentation to the panel, Jessica Wellwood said that the Census Bureau’s Econ Hub would be launching a content harmonization team that would be given the charge to harmonize concepts and definitions for sales (also called shipments, receipts, and revenue), expenses, payroll, inventory,


and administrative information about companies, including the employer identification number, months in operation, collection period, operating status, contact information, certification, and NAICS codes.8 This commitment to aligning concepts and questions across industries was restated in the Econ Hub blueprint document subsequently shared with the panel. The panel views this as an important and necessary effort, and one that the Census Bureau can build on.

RECOMMENDATION 4-2: The Census Bureau should use a sequential approach for evaluating and harmonizing key concepts in the annual economic surveys, starting with concepts that are common across multiple surveys and most critical for economy-wide statistics and then moving to less-central concepts. The Census Bureau should then evaluate each concept and item asked or expected to be asked in the annual economic surveys for potential inclusion, revision, or deletion for an Annual Business Survey System.

RECOMMENDATION 4-3: The Census Bureau should involve both subject-matter experts and methodologists in questionnaire redesign for the annual economic surveys, moving toward an Annual Business Survey System. These experts should be part of a content harmonization team that works on concept harmonization and feasibility studies, through new question construction and pretesting, to final questionnaire design.

4.3 DATA GAPS9

Although there are likely to be benefits from deleting some of the concepts and data items included in the current annual economic surveys, there is also additional information not currently being collected that would be especially valuable to data users. As discussed in Chapter 2, topics of particular interest include innovation, investment in new technologies,

___________________

8 Based on a presentation by Jessica Wellwood, U.S. Census Bureau, “Econ Hub: Content Harmonization,” during the panel’s June 3, 2016, meeting.

9 As outlined in Section 8.1, “The Concept of an ABSS,” in Chapter 8, the Census Bureau recently gained approval to field a new Annual Business Survey (ABS), beginning in June 2018, which combines ASE, the quinquennial Survey of Business Owners, the innovation portion of the Business R&D and Innovation Survey (BRDIS), and the Business R&D and Innovation Survey-Microbusinesses (BRDI-M). This development occurred after the panel had completed its work. It may well be that the ABS will fill gaps identified in this section. Careful coordination of the ABS with the panel’s recommendations for an ABSS will be required to minimize unnecessary overlap while ensuring that the ABS includes sufficient information on such topics as contracting out and offshoring to facilitate industry-specific data and analysis.


organizational practices, the use of workers under alternative employment arrangements, the contracting out of business functions, and offshoring and global value chains. Implicit in many of these topics is a broad interest in the role of intangible capital, such as intellectual property and human capital, as contrasted with tangible capital, such as equipment and structures, in the modern economy. Among the reasons to be interested in data on these topics is the contribution such data can make to better understanding ongoing structural changes in the U.S. economy and the sources of productivity growth.

The Census Bureau already collects some information on several of these topics, but gaps remain. The Business R&D and Innovation Survey (BRDIS), funded by the National Center for Science and Engineering Statistics at the National Science Foundation and fielded by the Census Bureau, collects information on product and process innovation and on R&D spending. An important omission in BRDIS, however, is that it does not collect information on marketing or organizational innovations (National Research Council, 2014).

ICTS, a supplement to ACES, collects information about capital spending on information and communication technology equipment and computer software. Given the growing importance of information technology to business operations, the recent suspension of this survey has created an important data gap. The panel also notes, however, that companies are changing how they meet their information technology needs, with many opting to pay for information technology services rather than purchasing information technology equipment (see Byrne, Corrado, and Sichel, 2017). In view of these ongoing changes, the design of the ICTS may need some rethinking, but there is certainly a need to understand how companies are acquiring and using information technology.

MOPS, first fielded in 2010 and then in 2015, is planned as a quinquennial supplement to ASM, asking questions about the information that managers use to guide production and investment decisions, the centralization or decentralization of business decisions, the use of performance-based compensation, and the share of workers with flexible hours or work-at-home arrangements. By design, however, MOPS covers only the manufacturing sector, so that similar information is not available for other sectors.

ASE, launched in 2014 as a collaboration among the Census Bureau, the Ewing Marion Kauffman Foundation, and the Minority Business Development Agency, covers all nonfarm employer businesses in most sectors of the economy. In addition to core information about the characteristics of business owners, the length of time the business has been in operation, payrolls, and employment, each year’s ASE has contained a supplemental module with questions on some special topic. The 2014 module focused on business innovation and R&D activity; the 2015 module focused on


business management practices; and the 2016 module focused on business advice and planning.

The panel commends the efforts the Census Bureau has made to address gaps in the information available on topics data users have identified as particularly important and notes that the development of an ABSS will create new opportunities to fill the gaps in existing information. Given the Bureau’s desire to streamline the annual economic surveys and minimize the burden on survey respondents, the panel does not believe that it would be desirable to add large numbers of new questions to the core questionnaires. The desire for better information on innovation may best be satisfied through modifications to BRDIS. Reinstating an updated ICTS as a supplement to ACES would be a natural way to obtain information on how companies are meeting their information technology needs. Mesenbourg (2015) suggests a set of possible questions about how companies use information technology that could be considered for incorporation into ICTS.

For the other topics listed above—organizational practices, the use of workers under alternative employment arrangements, the contracting out of business functions, and offshoring and global value chains—supplements to one or more of the current annual economic surveys could be developed. In turn, experience with such supplements could inform the design of an ABSS that includes periodic supplements carried out on a regular schedule, which would be a good way to meet important information needs. The periodic supplements could be administered to all or a subsample of the enterprises that receive the core ABSS questionnaires, with their coverage spanning all sectors of the economy (see Chapter 8).

The questions on current Census Bureau data collections will in some cases be a natural starting point for questions to be included on the envisioned supplements. Many of the topics of interest relate to how businesses organize themselves to carry out their essential functions. The 2015 ASE supplemental module, for example, included a useful question about the share of employment accounted for by full-time employees; part-time employees; paid day laborers; temporary staffing from a temporary help agency; leased employees; and contractors, subcontractors, independent contractors, or outside consultants. This question was followed by a series of questions about the categories of work performed by each type of worker, categorized into 10 functional areas (procurement, logistics, and distribution; operations related to the company’s main business activity; marketing, sales, and customer accounts; customer and after-sales service; product or service development; technology and process development; general management; human resources management; strategic management; and other).

Questions developed by researchers outside the Census Bureau also may be helpful for devising questions for an ABSS supplemental module to ad-


dress data gaps. Similar to the questions developed for the 2015 ASE, a set of questions piloted by Brown, Sturgeon, and Cole (2013) asks companies about the extent to which various functions are conducted by the company’s own employees, contracted out domestically, conducted by an international affiliate, or offshored to a foreign provider. The functions listed in their survey questions are similar to those used in the questions developed for the ASE, but the questions are designed with the goal of determining the importance of contracting out and offshoring in performing these functions. Mesenbourg (2015) suggests a similar set of questions for learning about how companies carry out their business operations, along with related questions about companies’ motivations for their operational decisions.

Looking to the future, new topics on which modules would be especially valuable may emerge. As recommended in Chapter 2, the Census Bureau will wish to engage with data users on a continuing basis to ensure that those topics are identified.

RECOMMENDATION 4-4: The Census Bureau should develop the ability to collect data on emerging and missing topics of special interest for users of economic survey data on a timely basis. Topics identified by users for which additional information is needed include innovation, investment in new technologies, organizational practices, the use of workers under alternative employment arrangements, the contracting out of business functions, and offshoring and global value chains. The Census Bureau should determine which of these topics are appropriate for the annual economic surveys and an Annual Business Survey System and develop new or expanded questions and supplemental rotating modules accordingly.

4.4 DATA COLLECTION MODES

Having identified the concepts and sets of data items that will be collected (both core data items and those specific to a subquestionnaire or supplemental module), the next step in a harmonization process is the design and development of the actual questionnaires for the annual economic surveys and an ABSS. This first requires decisions about the data collection mode(s) to be used because question design will be affected by mode choice.

4.4.1 Currently Used Data Collection Modes

The modes of questionnaires and data collection available to respondents in the existing annual economic surveys are shown in Table 4-4. The Census Bureau has been working for more than two decades on moving respondents to electronic reporting. This transition was recently completed,


TABLE 4-4 Mode of Invitation, Data Collection, and Follow-Up for the Census Bureau’s Annual Economic Surveys

Survey Mode
Initial Letter Sent by Mail Web-Based Questionnaire Worksheet Available (offline use only) Downloadable Spreadsheet Available (can be uploaded) Paper Questionnaire Available Telephone Nonresponse Follow-Up
ASM x x x x
M3UFO x x x x
MOPS x x x x
ARTS x x x x
AWTS x x x x
SAS x x x x
ACES x x x x
ICTSa x x x x
ASE x x x b
SQ-CLASS x x x x
COS x x x x

NOTES: See text for full names of the surveys.

aSurvey is currently suspended.

bASE uses mail follow-ups.

SOURCE: Information provided to the panel by the Census Bureau.


and web-based questionnaires are now available for all of the annual surveys currently fielded.

ASM and COS provide a downloadable spreadsheet (in Excel) for sampled businesses, along with worksheet or PDF versions to aid respondents in determining reporting requirements and gathering data. M3UFO and MOPS offer respondents a choice of electronic or paper questionnaires, while ICTS had paper questionnaires when this survey was last conducted. AWTS, ARTS, SAS, ACES, and ASE provide a worksheet for respondents to use to gather data, but all of the data must be reported through the online instrument (the worksheet cannot be uploaded). Some of the surveys (M3UFO, MOPS, and SQ-CLASS) provide the instruments or worksheets online separately; others (ASM, AWTS, ARTS, SAS, ACES, ASE, and COS) do not, requiring survey respondents to log in to the survey to see a worksheet.

Some of this variation across the annual economic surveys is likely due to two different in-house web survey software programs having been used as platforms for electronic data collection. Until 2015, ASM and COS were conducted using an electronic instrument software application called Surveyor that required downloading to a respondent’s PC, while AWTS, ARTS, SAS, ACES, and ASE have used Centurion, which is web-based. To inform businesses about the questionnaire, all surveys send initial letters by mail. For nonresponse follow-up, almost all of the surveys use telephone, encouraging nonrespondents to participate through one of the other modes (ASE uses mail follow-ups).

4.4.2 Benefits of Electronic Data Collection

For business surveys, there are some clear benefits to prioritizing electronic data collection from sampled enterprises or establishments. Electronic data collection removes data entry steps, which accelerates postsurvey processing time. It also permits real-time edits to be built into a survey questionnaire, allowing businesses to resolve discrepancies while data are being collected. Skip patterns and other dependencies can be built into the survey instrument, reducing unnecessary errors by responding businesses. Furthermore, in electronic data collection, information is available from businesses that partially complete the survey. Well-designed electronic questionnaires should reduce response burden. The up-front costs associated with developing a robust online survey instrument must be recognized, but in the case of large surveys, electronic data collection should result in production cost reductions. As noted above, the Census Bureau has taken significant steps toward moving respondents to electronic data collection.
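As a concrete illustration, real-time edits of the kind just described can be expressed as rules run while a respondent completes a screen. The field names, the 1 percent balance tolerance, and the specific rules below are hypothetical, not the Bureau's actual edit specifications.

```python
def edit_checks(resp: dict) -> list[str]:
    """Return human-readable messages for failed edits (empty list = clean).

    resp maps hypothetical survey item names to reported values.
    """
    problems = []
    # Consistency check: payroll should not exceed total receipts.
    if resp.get("annual_payroll", 0) > resp.get("total_receipts", 0):
        problems.append("Annual payroll exceeds total receipts; please verify.")
    # Balance check: detail items should sum (within 1%) to the reported total.
    detail = sum(resp.get(k, 0) for k in ("receipts_products", "receipts_services"))
    total = resp.get("total_receipts", 0)
    if total and abs(detail - total) > 0.01 * total:
        problems.append("Receipt detail does not sum to total receipts.")
    # Skip-pattern check: e-commerce detail only if e-commerce was reported.
    if not resp.get("has_ecommerce") and "ecommerce_receipts" in resp:
        problems.append("E-commerce detail reported but e-commerce marked 'no'.")
    return problems
```

Raising such messages at collection time lets the business correct a discrepancy while its records are still at hand, rather than during later analyst follow-up.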

Electronic questionnaires also offer multiple possibilities to facilitate the response process. In particular, the questionnaire can be tailored to the


business context for a particular industry or type of business unit, using routing, flexible question wording, edit checks, and other features (see Snijkers, 1992). A disadvantage of electronic questionnaires, especially when the questionnaire becomes complex, is that respondents do not have an overview of the information that will be requested. To facilitate responding on electronic questionnaires, respondents need to be provided with the option to obtain a hard copy of the instrument or to print it out. This option would allow respondents who will be completing the questionnaire to have an overview of all data that are requested. They also may need to distribute the questionnaire to multiple informants within the business, a process that is facilitated by having a print-out or hard copy readily available (Dowling and Stettler, 2007; Giesen, 2007). Use of a sequential mixed-mode design may be another reason for having a paper questionnaire: in the initial contact, the electronic questionnaire is provided, while in a mailed nonresponse follow-up, a paper questionnaire could be included to stimulate response (Snijkers and Jones, 2013, pp. 422–426).

Developing a separate paper questionnaire can be costly, especially for an integrated survey system. Electronic and paper questionnaires need to be consistent in content, including question wording, definition instructions, order of questions, and so on, with a unified visual design for the two modes. Making it possible for respondents to print out the electronic version of the questionnaire may be a less expensive option. Furthermore, especially for larger businesses, the login portal for an electronic instrument may need to allow multiple IDs and passwords to facilitate logging in across the organization or sharing the survey among several employees.

Another advantage to having an electronic questionnaire is that paradata for the questionnaire completion process, such as the number of logins, the amount of time spent on a survey question or screen, and questions for which respondents are particularly likely to break off or change answers, can be collected.10 Although these data are not currently used for evaluating and improving problematic questions in the annual economic surveys, an integrated ABSS would afford an opportunity to develop both survey-level and item-specific paradata measures for key variables that could be helpful for such evaluation and improvement. Paradata may be less informative if many respondents first enter their information on a worksheet and then transfer it to the web instrument, although at this writing, there is no empirical information about the proportion of respondents to the annual surveys who do this or how it affects web completion.
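A minimal sketch of how such paradata might be aggregated, assuming a hypothetical event log of enter/leave/change actions per question (this event shape is an assumption, not the Bureau's actual instrumentation):

```python
from collections import defaultdict

def summarize_paradata(events: list[dict]) -> dict[str, dict]:
    """Aggregate per-question time on screen and answer changes.

    events: list of {"question": str, "action": "enter"|"leave"|"change",
    "t": float}, with t in seconds since the session began.
    """
    stats = defaultdict(lambda: {"seconds": 0.0, "changes": 0})
    entered = {}  # question -> time it was most recently entered
    for e in events:
        q, action = e["question"], e["action"]
        if action == "enter":
            entered[q] = e["t"]
        elif action == "leave" and q in entered:
            stats[q]["seconds"] += e["t"] - entered.pop(q)
        elif action == "change":
            stats[q]["changes"] += 1
    return dict(stats)
```

Questions with unusually long dwell times or many answer changes would then be candidates for redesign or cognitive testing.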

An innovative approach for the Census Bureau to consider for an ABSS is to move toward a data collection system that accesses information

___________________

10 Paradata are data collected in real time about survey processes; for an overview of paradata related to business survey errors, see Snijkers and Haraldsen (2013, pp. 431–457).


directly from business records rather than requiring the business to supply answers to a survey questionnaire. This approach includes system-to-system data collection using eXtensible Business Reporting Language (XBRL) or Standard Business Reporting technology, also known as electronic data interchange, automated data capture, or, at the Census Bureau, passive interviewing. A session at the 5th International Conference on Establishment Surveys was devoted to this development.11 The BLS provides a U.S. example: beginning in 1995, BLS undertook to extract employment, payroll, and hours from payroll files for large multi-establishment businesses for its Current Employment Statistics monthly survey. Currently, BLS obtains data from payroll files for more than 40 percent of the 275,000-odd work sites in the Current Employment Statistics survey.12 The Census Bureau is experimenting with such methods (U.S. Census Bureau, 2016). Their use could well facilitate data collection in the annual economic surveys from businesses that use standard third-party software for such functions as payroll and accounting or have centralized files for multiple establishments.
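A toy sketch of this kind of record-based collection, assuming an invented CSV export layout from a third-party payroll package (the column names and the mapping to survey items are hypothetical):

```python
import csv
import io

# Hypothetical mapping from payroll-export columns to survey data items.
FIELD_MAP = {"emp_count": "employment", "gross_pay": "payroll", "hrs": "hours"}

def extract_items(payroll_csv: str) -> dict[str, float]:
    """Sum mapped fields across the establishments in a payroll export,
    yielding company-level totals for the mapped survey items."""
    totals = {item: 0.0 for item in FIELD_MAP.values()}
    for row in csv.DictReader(io.StringIO(payroll_csv)):
        for src, item in FIELD_MAP.items():
            totals[item] += float(row[src])
    return totals
```

In a production system the mapping would be negotiated per software package, with the same edit checks applied to extracted values as to keyed responses.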

RECOMMENDATION 4-5: For the annual economic surveys and a future Annual Business Survey System, the Census Bureau should continue to develop and enhance a data collection strategy that takes maximum advantage of electronic data collection modes at all stages in the survey process. The Bureau should use a consistent software platform across all surveys and make use of paradata to identify questions that are problematic for respondents and would benefit from redesign. To facilitate response, businesses should be able to readily obtain a print version of every questionnaire.

4.5 QUESTIONNAIRE DESIGN AND DEVELOPMENT

Questionnaire design and development is a major part of a redesign process, and it needs to begin as soon as possible in harmonizing the annual economic surveys and developing a future ABSS. Questionnaire design begins with the steps discussed above: concept development, identification of alternative data sources, and identification of preferred data collection modes. This section discusses the design of actual questionnaires, including core questionnaires and subquestionnaires.

___________________

11 See http://ww2.amstat.org/meetings/ices/2016/proceedings/ICESV_TOC.pdf [November 2017].

12 See https://www.bls.gov/opub/mlr/2016/article/one-hundred-years-of-current-employmentstatistics-data-collection.htm [November 2017].


4.5.1 Questionnaire Elements

Table 4-5 displays elements of a questionnaire that would need to be harmonized at the levels of individual questions, question blocks, and questionnaire in moving toward an ABSS. A set of individual items or questions that deal with the same topic constitute a block or section of a questionnaire; all blocks together make up the questionnaire. Items within a block need to be ordered in a systematic way, as do the blocks within the questionnaire.
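The hierarchy just described (questions grouped into blocks, and blocks into a questionnaire) can be sketched as a simple data model; the class and field names below are illustrative assumptions, not a Census Bureau schema.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    wording: str
    qtype: str = "open"      # open, closed, or partially closed
    instructions: str = ""   # definitions, tasks, routing at the question level

@dataclass
class Block:
    topic: str
    questions: list[Question] = field(default_factory=list)

@dataclass
class Questionnaire:
    title: str
    blocks: list[Block] = field(default_factory=list)

    def all_questions(self) -> list[Question]:
        """Flatten the blocks into an ordered list of questions."""
        return [q for b in self.blocks for q in b.questions]
```

Harmonization then amounts to sharing Question and Block definitions across survey instruments rather than maintaining divergent copies.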

For a good questionnaire, all of the elements need to work together. Business survey questionnaires should be designed according to best practices and principles of visual design for layout and format that will facilitate accurate business response (see Haraldsen, 2013). Questionnaires also

TABLE 4-5 Questionnaire Design Elements for Harmonizing the Annual Economic Surveys and a Future Annual Business Survey System

Individual Questions
  Questions
    • Question type: open, closed, partially closed
    • Question wording
  Response options
    • Closed questions: wording of options, order of options
    • Open questions: answer format (e.g., $ or 1,000 $)
  Instructions at the question level
    • Definitions
    • Tasks
    • Routing
  Range checks and error messages
  Prefilled information

Question Blocks
  Question blocks
    • Order of questions
    • Format: sequence versus matrix/grid
  Instructions at the block level
    • Topic of the block
    • Routing instructions at the block level
  Consistency checks within a block and error messages

Questionnaire
  Order of blocks
  Instructions
    • Completion instructions
    • Usability instructions (for the web)
  Consistency and range checks over blocks
  Error messages
  Visual design and layout
  Usability elements (for the web)
    • Buttons
    • Progress indicator (if used)
    • Questionnaire overview (if used)

should be thoroughly tested with potential respondents for usability issues, with all necessary skips, contingencies, and edits built into the instrument. The panel recognizes that the Census Bureau’s Economic Directorate has Guidelines on Questionnaire Design (Morrison et al., 2008), but encourages the Bureau to review the extensive research on questionnaire design published in the decade since those guidelines were written as input to the process of revising questionnaires for the annual economic surveys and developing an ABSS questionnaire.

At the question level, similar questions across questionnaires can be designed as one, with necessary tailoring to industry differences (“tailoring within harmonization”). In this effort, the redesign team will have to confront whether differences across industries are truly necessary or are simply historical artifacts. Some of the current industry-specific surveys ask explicit questions for key concepts (e.g., “What were the total sales of merchandise and other operating receipts for this EIN [employer identification number] in 2013?” in ARTS); others label boxes to be filled in without asking an explicit question (e.g., “Total value of products shipped and other receipts” in ASM). Some industries have explicit include and exclude instructions, and others do not. ASM provides information about reports from prior years but other surveys do not. Additionally, some surveys ask about dollars rounded to the thousands, while others ask for precise numbers: see example in Figure 4-1.

This kind of variation across industries in question type (e.g., explicit question or boxes with labels), inclusion of instructions, and provision of prior information seems unnecessary. It may well lead to confusion and additional burden for companies that receive multiple surveys. Most importantly, if there is empirical evidence to support a “best practice” when making questionnaire design decisions, this evidence should guide decisions for how to standardize questionnaires (see Haraldsen, 2013).

Error checks and messages are another area for harmonization in electronic questionnaires. To the extent possible, harmonized error checks need to be designed at the time of questionnaire development. Because the data file resulting from the questionnaire is input to the editing process, data edits also need to be considered and harmonized at the time of questionnaire development.
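The idea of defining each edit once and reusing it both in the collection instrument and in downstream editing can be sketched as follows. The rule identifiers, field names, and messages here are hypothetical, not actual Census Bureau edits:

```python
# Hypothetical sketch: a shared rule registry so the same edit is applied
# in the web instrument (as an error check) and in post-collection editing.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EditRule:
    rule_id: str
    message: str                    # error message shown to the respondent
    check: Callable[[dict], bool]   # returns True when the response passes

# Harmonized rules, defined once and reused by every survey instrument.
RULES = [
    EditRule(
        rule_id="RCPT-001",
        message="Total receipts must equal the sum of reported components.",
        check=lambda r: r["total_receipts"] == sum(r["receipt_components"]),
    ),
    EditRule(
        rule_id="RCPT-002",
        message="Total receipts must be non-negative.",
        check=lambda r: r["total_receipts"] >= 0,
    ),
]

def validate(response: dict) -> list[str]:
    """Return the messages for every rule the response fails."""
    return [rule.message for rule in RULES if not rule.check(response)]

response = {"total_receipts": 1200, "receipt_components": [800, 300]}
print(validate(response))  # the components sum to 1,100, so RCPT-001 fails
```

Because the rules live in one registry, the web instrument can show the messages interactively while the editing system applies the identical checks to the resulting data file.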

All of the elements in Table 4-5 (above) have to be specified before the programming of a questionnaire can begin. Constraints on software and technical capabilities are likely to limit the types of edit, range, and consistency checks that can be included, as well as the visual design options. Thus, the design and development of harmonized questionnaires is unlikely to be a linear process.

A challenge with a fully electronic, integrated set of questionnaires (core and subquestionnaires) for a future integrated ABSS is questionnaire

FIGURE 4-1 Examples of two questions about receipts in ARTS and ASM.
NOTES: ARTS, Annual Retail Trade Survey; ASM, Annual Survey of Manufactures.
SOURCE: Figure provided to the panel by the Census Bureau.

version management. Doing this well will require clear and thorough documentation systems for all questionnaire items for all industries, along with linkages across the items and the industries. At all times, it should be clear to a user of the developing system which are the latest versions of each of the core and subquestionnaires.

A flowchart can be a helpful tool for designing core and subquestionnaires. It can be developed at various levels (block, core questionnaire, or subquestionnaire) to show how the questions are related. For an ABSS, a flowchart can provide a visual display of the structure of the entire system of questionnaires. A flowchart also can be used to manage version control, helping to ensure that all parts of a questionnaire have been addressed each time the version changes.


4.5.2 Pretesting

The panel recognizes the importance of pretesting drafts of revised annual economic survey questionnaires and future ABSS questionnaires. This is in fact mandated by Statistical Policy Directive No. 2 (U.S. Office of Management and Budget, 2006) and by U.S. Census Bureau Statistical Quality Standards (U.S. Census Bureau, 2013). Such pretesting will be essential to ensure a successful transition to a new survey system. Pretesting includes lab and field testing of questions, using qualitative and quantitative methods, as well as a thorough testing of all possible paths in the questionnaire (see Willimack, 2013). Pretesting also includes functional and performance tests.

According to a Census Bureau presentation to the panel, the Bureau currently uses expert reviews by methodologists and subject-matter specialists, exploratory early-stage scoping interviews, cognitive interviews, usability testing, low-fidelity prototypes, and other methods for development and testing of the agency’s annual economic surveys.13 Other methods, such as recordkeeping studies or field pilot data, are used less routinely, while analysis of paradata from web survey instruments is only beginning now that the agency has staff members with the appropriate skills. This sort of pretesting has revealed a number of ways in which the current annual economic surveys do not match business records. Accurate, reliable, and valid reports for a future ABSS will depend on businesses being able to accurately and easily retrieve the requested information from their records. The panel views the redesign of the annual economic surveys as an opportunity to abandon or change questions seeking information that sampled businesses cannot easily retrieve or report.

Pretesting needs to be planned carefully. Often, survey organizations find that questionnaire design and development take much more time than anticipated, leaving no time for pretesting (Snijkers and Willimack, 2011; Willimack, 2013).

RECOMMENDATION 4-6: In harmonizing questionnaires across the current annual economic surveys, leading to an Annual Business Survey System, the Census Bureau should develop a template providing standardized question, question block, and questionnaire elements to be used wherever possible. This work should begin as early as possible, allowing time for feasibility studies at the concept and data item defining stage and pretesting of questionnaire drafts at the questionnaire design stage.

___________________

13 Based on a presentation by Diane K. Willimack, U.S. Census Bureau, “Testing and Evaluation of Questions and Instruments,” during the panel’s June 3, 2016, meeting.


4.6 SURVEY RECRUITMENT AND PARTICIPATION

A business survey communications and recruitment strategy consists of a number of prefield, field, and postfield communication measures, aimed at obtaining accurate, complete, and timely responses from businesses selected for the survey sample (Snijkers and Jones, 2013). The objectives of a communications strategy are to establish contact, gain survey cooperation, and communicate information, instructions, and procedures. To obtain responses, a communications strategy may include advance letters, due-date reminder letters, nonresponse follow-ups (including reminder letters and reminder telephone calls), websites with frequently asked questions (FAQs), and a help desk. The Account Manager Program (see Chapter 3) is part of such a strategy.14 Among other things, that program seeks to develop and maintain a working relationship with key enterprises that are large enough to be included in many survey samples. Another important part of a communications strategy is timing, including when a survey is fielded and when letters are sent (see Snijkers, 2014; Snijkers and Jones, 2013). Finally, it is important to establish an active monitoring program to track the success of survey recruitment and participation and make adjustments as needed.

Not surprisingly, elements of communication differ across the current annual economic surveys. Some of these differences may be appropriate for the industries targeted, but many are candidates for harmonization. Elements that would benefit from harmonization across the surveys and for an ABSS include letters and other communication materials, websites, and FAQs. In many cases, a common survey calendar may be appropriate (see Section 4.6.2, “Timing,” below). The template for these elements may be tailored to the target group, but commonality across data collection is desirable to the extent possible. During the transition from the current surveys to an ABSS, communications with business units that are or have been sampled for one or more of the current surveys will need to explain what has changed and why.

4.6.1 Communication Methods

Current Census Bureau practice with regard to communications for the annual economic surveys has been informed by lessons learned through experience. For each of the current annual surveys, the first contact with a sampled business unit is a letter followed by a due date reminder shortly

___________________

14 Based on a presentation by Charles Brady, U.S. Census Bureau, “Annual Survey Account Manager Program,” during the panel’s June 3, 2016, meeting.


before the due date, follow-up letters after the due date with increasing intensity (in wording and delivery), and telephone follow-up.15

Each of the initial letters begins with text reading “A Message from the Director, U.S. Census Bureau,” and the remainder of the text of the letters is generally similar, emphasizing that response is required by law and that data will be kept confidential. The letters differ, however, with respect to their specificity concerning the uses of the data to be collected. The letters also provide information to respondents about a help line and ways to reach the Census Bureau if assistance is needed. Each of the surveys provides an estimated time burden, as required by the U.S. Office of Management and Budget. Some surveys refer to the questionnaire’s form number (e.g., MA 10000(S)), which can be meaningful to some respondents who are in the sample year after year, but may be less meaningful to others.

The due date reminder letter is sent in the week prior to the survey’s due date. According to information provided to the panel, an experimental test of a due date reminder in the 2014 ARTS indicated that it increased the timeliness of the responses and reduced the need for more expensive certified letters and telephone follow-up attempts. According to the Census Bureau’s website, the use of a due date reminder was extended to the 2015 ASM, M3UFO, MOPS, AWTS, ARTS, SAS, ACES, SQ-CLASS, and COS.

After regular mail follow-up attempts, certified delivery letters sometimes are used to communicate the official and important nature of the survey request. A test during the 2012 economic censuses indicated that a certified follow-up letter increased the check-in rate (the percentage of employers logging onto the survey website) more than a noncertified letter and that it was more cost-effective to use certified letters in later follow-ups.16 Certified follow-up has been used as the second or third follow-up for ASM, ARTS, AWTS, SAS, ACES, ASE, and COS.

A telephone call is the most expensive mode, and thus this final contact attempt is reserved for the hardest-to-reach cases. Responses are monitored to set priorities and target the lowest-responding industries and highest-impact companies to balance response coverage and costs.17 Telephone follow-up has been used for ASM, M3UFO, AWTS, ARTS, SAS, ACES, SQ-CLASS, and COS.

The panel welcomes experimentation aimed at improving the efficiency of contact strategies for the annual economic surveys and encourages such

___________________

15 Based on a presentation by Charles Brady, U.S. Census Bureau, “Annual Survey Account Manager Program,” during the panel’s June 3, 2016, meeting.

16 Based on a presentation by Susanne Johnson, U.S. Census Bureau, “Data Collection Strategy and Response Monitoring,” during the panel’s June 3, 2016, meeting. This test is described in Marquette, Kornbau, and Toribio (2015); see also Thompson and Kaputa (2017).

17 Based on a presentation by Susanne Johnson, U.S. Census Bureau, “Data Collection Strategy and Response Monitoring,” during the panel’s June 3, 2016, meeting.


experimentation, not only to provide evidence for greater harmonization among the separate surveys, but also to lead to an integrated set of best contact practices for an ABSS. Harmonization, whether for the current surveys or an ABSS, likely would include tailoring to specific industries. Such “tailoring within harmonization” would facilitate effective communication with sampled businesses and continued improvements by Census Bureau staff, since their efforts could be focused on an integrated survey system and not be spread among separate surveys as is currently the case.

The current communications strategy for each of the annual economic surveys also includes FAQs available on websites about the surveys. The FAQs are not fully harmonized across surveys. Although most of the current annual survey FAQ web pages follow a common structure, highlighting “General” questions, “Completing the Report,” and “Survey Definitions,” the ASM instead uses a “How Do I Get Started?” page, and M3UFO has a completely different format. Development of a common FAQ structure for the surveys would be a helpful first step toward an integrated set of FAQs for an ABSS. The home page for an ABSS FAQ site could cover general topics, including how to complete the core questionnaire, with links to FAQs for specific industries and topics. This structure would be more helpful to businesses and easier for the Census Bureau to develop and maintain than the currently separate sets of FAQs.

A key part of a communications strategy is a website where a respondent logs into the survey. Having a single portal for a business to access all of the future ABSS questionnaires it is asked to complete would be an important goal of the redesign process. Currently, the web questionnaires for the annual economic surveys are accessed through different sites, with different login pages, and with different information presented to and required of a respondent. ASM and COS use a two-step login procedure, requiring respondents to create sign-in information on a portal and then use that information to access the survey. The other surveys provide businesses with a login ID and password, and the business can sign in directly. The look and feel of these portals varies dramatically across surveys, with different information made prominent or salient to survey participants. These differences are a potential problem for large businesses operating in more than one industry sector, as they may be asked to respond to separate surveys for the different sectors, as well as some of the topical surveys, such as ACES (see Table 5-4 in Chapter 5). With an ABSS, through the single-portal access site that we recommend, such businesses would be able to readily complete the core questionnaire plus any relevant industry and topic modules.


4.6.2 Timing

Timing is an important part of a communications strategy. Table 4-6 shows the current timelines for each of the annual economic surveys. ASM, SAS, and COS are fielded between January and September, but other surveys are fielded during different parts of the year. Harmonizing timing will require harmonizing reference periods. ASE was fielded from September through the subsequent January, possibly because the data collected are not on a calendar or fiscal year basis.

In developing an ABSS, to the extent that similar information is being requested from respondents in different industries, similar timing for the collection of that information is likely to be appropriate. Depending on the nature of the data being collected for an industry, however, some tailoring of timing may be required.

One Census Bureau concern relayed by Diane Willimack at a panel meeting was that some respondents to the annual economic surveys find the timing of the survey requests inconvenient, as they arrive when respondents are busy with other reporting obligations, such as filings to the Securities and Exchange Commission, tax filings, or preparation of internal financial reports.18 Survey timelines are based in part on when the survey samples can be generated and on data dissemination requirements, but the Census Bureau also attempts to apply best practices for contacting respondents. Respondents’ time constraints vary, however, so while the currently established due dates may be fine for some, they may be challenging for others. The input from respondents suggests that some tailoring of the timeline to a particular business context may be appropriate. Such tailoring of survey timelines to facilitate response, however, could undercut one of the advantages of harmonization of timelines—namely, the benefit to users from having data disseminated on similar schedules. Preparation of preliminary estimates for industries for which data are collected on a later timeline could be a means to reconcile this tension (see Section 5.8, “Preliminary Estimates,” in Chapter 5).

4.6.3 Assessing a Communications Strategy

Data and other information about the survey fieldwork will be needed to ensure that the fielding strategy is efficient. Pretests, including full operational tests with small samples of business units, are important sources of relevant information. Postfield questionnaire debriefings, complaints, comments, and incoming calls to the survey helpdesk also can be helpful in this

___________________

18 Based on a presentation by Diane K. Willimack, U.S. Census Bureau, “Testing and Evaluation of Questions and Instruments,” during the panel’s June 3, 2016, meeting.


TABLE 4-6 Periods of Data Collection (year-month) for the Census Bureau’s Annual Economic Surveys: ASM, M3UFO, MOPS, ARTS, AWTS, SAS, ACES, ICTS,a ASE, SQ-CLASS, and COS, with collection periods falling between January of the survey year and February of the following year.

NOTE: See text for full names of the surveys.

aSurvey is currently suspended.

SOURCE: Information provided to the panel by the Census Bureau.


regard. Account managers may be able to convey useful information from the companies for which they are responsible.

Another source of data relevant to optimizing a fielding strategy is response process paradata and indicators, such as the return (or check-in) rate, the unweighted unit response rate (URR), and the weighted response rate (or total quantity response rate, TQRR). These indicators can provide useful information about how well a survey is progressing as well as, in some cases, any need for tailoring. Paradata from the current surveys can be analyzed as input for developing the fielding strategy for an ABSS. Census Bureau staff presented analyses of this kind to the panel.19
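The two rate indicators can be illustrated with a small sketch; the unit records and the use of payroll as the key quantity are invented for illustration:

```python
# Illustrative sketch: computing the unweighted unit response rate (URR)
# and the total quantity response rate (TQRR) from a list of sampled units.
# Unit records and payroll values (in $1,000) are hypothetical.
units = [
    {"responded": True,  "weight": 1.0, "payroll": 5000},  # certainty unit
    {"responded": False, "weight": 1.0, "payroll": 3000},  # certainty unit
    {"responded": True,  "weight": 4.0, "payroll": 200},
    {"responded": False, "weight": 4.0, "payroll": 100},
]

# URR: the share of sampled units that responded, ignoring weights.
urr = sum(u["responded"] for u in units) / len(units)

# TQRR: the weighted share of a key quantity (here payroll) accounted
# for by the responding units.
respondent_qty = sum(u["weight"] * u["payroll"] for u in units if u["responded"])
total_qty = sum(u["weight"] * u["payroll"] for u in units)
tqrr = respondent_qty / total_qty

print(f"URR = {urr:.0%}, TQRR = {tqrr:.0%}")  # → URR = 50%, TQRR = 63%
```

In this example the TQRR exceeds the URR because the responding certainty unit accounts for most of the weighted payroll, which is why the two indicators can tell different stories about the same collection.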

Paradata related to respondents’ login experiences, such as the number of login attempts, successful and unsuccessful login attempts, and logins by unique persons, also can be informative. For example, the panel heard that about 7 percent of ASE IDs currently have an unsuccessful login attempt.20 The Census Bureau is currently reviewing paradata to evaluate the new portal login procedure implemented in the 2016 ASM and COS.

Learning from the available paradata on an ongoing basis will require the development of dashboards that allow for continuous monitoring of fieldwork. These dashboards ideally would provide indicators of performance not only for the survey sample as a whole, but also for subgroups of the sample that are of particular interest (e.g., certainty units). Response rates, both URR and TQRR (the latter based on a quantity such as revenues or payroll), and costs also need to be actively monitored during data collection.

Active monitoring of paradata on completion rates and the correlates of completion rates can assist in developing an adaptive or responsive design for a future ABSS. Such designs seek to optimize data collection costs and data quality by using paradata to alter the survey design in real time, such as prioritizing particular kinds of sample units for nonresponse follow-up. The goal is to target resources so as to minimize nonresponse bias in the statistics of interest. These types of designs originated for household surveys (the seminal paper is Groves and Heeringa, 2006), but increasingly they are being used in business surveys. For example, Statistics Canada has implemented an active fieldwork management approach (Hunter and Carbonneau, 2005; Laflamme, Maydan, and Miller, 2008) based on real-time monitoring of such key fieldwork indicators as the URR and TQRR, thus allowing for timely corrective action if needed (see Snijkers and Haraldsen, 2013).
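One simple step in such a design, prioritizing nonresponse follow-up by potential impact, can be sketched as follows; the scoring rule and unit records are illustrative assumptions, not the Bureau’s actual procedure:

```python
# Hypothetical sketch of one adaptive-design step: rank nonrespondents by
# their weighted contribution to a key quantity so that expensive telephone
# follow-up targets the cases with the largest potential nonresponse bias.
nonrespondents = [
    {"id": "A", "weight": 1.0, "payroll": 9000, "industry": "manufacturing"},
    {"id": "B", "weight": 5.0, "payroll": 300,  "industry": "retail"},
    {"id": "C", "weight": 2.0, "payroll": 150,  "industry": "services"},
]

def follow_up_priority(unit: dict) -> float:
    # Impact score: the weighted amount of the key quantity the unit
    # represents; units with higher scores are called first.
    return unit["weight"] * unit["payroll"]

queue = sorted(nonrespondents, key=follow_up_priority, reverse=True)
print([u["id"] for u in queue])  # → ['A', 'B', 'C']
```

In practice the score would also reflect response propensity and industry-level coverage targets, and the queue would be recomputed as paradata arrive during collection.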

___________________

19 Based on a presentation by Susanne Johnson, U.S. Census Bureau, “Data Collection Strategy and Response Monitoring,” during the panel’s June 3, 2016, meeting.

20 Based on a presentation by Patrice Norman, U.S. Census Bureau, “Annual Survey of Entrepreneurs (ASE) Content Determination,” during the panel’s June 3, 2016, meeting.


Census Bureau staff have carried out studies that could be inputs for such a system, including response analyses and nonresponse bias analyses (see, e.g., Lineback and Thompson, 2010; Thompson and Oliver, 2012). Thompson and Oliver (2012) state that the Census Bureau is moving toward a standardized statistical process control framework. The framework presented in their paper would allow program managers to see how a process operated in the past and how it is operating currently. The idea is that it would show “when an intervention is necessary to bring the process into control or to improve the process” (Thompson and Oliver, 2012, p. 236). This approach is in line with an active fieldwork approach, with real-time monitoring of field data collection. Active monitoring of field progress is only one component of a responsive or adaptive design.

The Census Bureau has noted that there are challenges in setting up a fieldwork monitoring system.21 Cumulative response rate charts, which provide a visual display of the pattern of response over time, have proven to be one relatively simple but useful tool, making it easy to see if response rates are falling behind where they had been at a comparable point in previous years. Similar charts could be developed to show TQRR, as well as key estimates over the course of the data collection period. Figure 4-2 provides an example for the 2013, 2014, and 2015 SAS check-in rate. The panel commends the Census Bureau’s work in developing measures and tools for active monitoring, and encourages the continuation and extension of that work.
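The comparison behind such a chart reduces to a small computation; the weekly check-in counts below are invented for illustration:

```python
# Sketch of the comparison underlying a cumulative check-in rate chart:
# flag the weeks in which the current year's cumulative rate trails the
# prior year's rate at the same point. Weekly counts are hypothetical.
def cumulative_rates(weekly_checkins, sample_size):
    """Cumulative check-in rate after each week of collection."""
    rates, total = [], 0
    for count in weekly_checkins:
        total += count
        rates.append(total / sample_size)
    return rates

current = cumulative_rates([50, 120, 90, 60], sample_size=400)
prior = cumulative_rates([70, 130, 80, 50], sample_size=400)

weeks_behind = [week + 1 for week, (c, p) in enumerate(zip(current, prior))
                if c < p]
print(weeks_behind)  # → [1, 2, 3, 4]: the current year trails in every week
```

A dashboard version of this comparison would plot the two cumulative series and could trigger earlier or more intensive follow-up when the gap exceeds a chosen threshold.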

RECOMMENDATION 4-7: The Census Bureau should work to harmonize its communications and fieldwork strategies across the annual economic surveys, leading to a strategy for an Annual Business Survey System (ABSS) aimed at efficiently achieving accurate, complete, and timely responses from sampled businesses. This strategy should be informed by available information from pretests, postfield evaluations, experiments to test alternative approaches, and active monitoring of the response process through paradata. A goal should be to develop a paradata-based monitoring system (or process control framework) that supports a responsive (or adaptive) design for the fieldwork for an ABSS.

___________________

21 Based on a presentation by Susanne Johnson, U.S. Census Bureau, “Data Collection Strategy and Response Monitoring,” during the panel’s June 3, 2016, meeting.

FIGURE 4-2 Cumulative return (check-in) rates for the Service Annual Survey, 2013, 2014, and 2015.
SOURCE: Figure provided to the panel by the Census Bureau.

4.7 REFERENCES

NOTE: All URL addresses were active as of November 2017.

Bakker, B.F.M., and van Rooijen, J. (Eds.). (2012). Methodological challenges of register-based research. Statistica Neerlandica, 66(Special Issue 1), 1–84. doi: https://doi.org/10.1111/j.1467-9574.2011.00520.x.

Bavdaz, M. (2010). The multidimensional integral business survey response model. Survey Methodology, 36(1), 81–93. Available: http://www.websm.org/uploadi/editor/1368460644Bavdaz_2010_The_multidimensional_integral_business_survey.pdf.

Brown, C., Sturgeon, T., and Cole, C. (2013). The 2010 National Organizations Survey: Examining the Relationships between Job Quality and the Domestic and International Sourcing of Business Functions by United States Organizations. Working Paper No. 156-13. Berkeley, CA: Institute for Research on Labor and Employment. Available: http://irle.berkeley.edu/workingpapers/156-13.pdf.

Byrne, D.M., Corrado, C., and Sichel, D.E. (2017). The Rise of Cloud Computing: Minding Your Ps and Qs. Presentation to the Conference on Research in Income and Wealth (CRIW)/ National Bureau of Economic Research (NBER) Conference on Measuring and Accounting for Innovation in the 21st Century, Washington, DC, March. Available: https://bea.gov/about/pdf/acm/2017/the-rise-of-cloud-computing-minding-your-ps-and-qs.pdf.


Daas, P., Ossen, S., Tennekes, M., Zhang, L.-C., Hendriks, C., Foldal Haugen, K., Bernardi, A., Cerroni, F., Laitila, T., Wallgren, A., and Wallgren, B. (2011). BLUEETS Deliverable 4.1: List of Quality Groups and Indicators Identified for Administrative Data Sources. Luxembourg: Eurostat. Available: https://www.blue-ets.istat.it/fileadmin/deliverables/Deliverable4.1.pdf.

Dowling, Z., and Stettler, K. (2007). Factors influencing business respondents’ decision to adopt web returns. In Proceedings of the 3rd International Conference on Establishment Surveys (ICESIII) (pp. 102–1039). Alexandria, VA: American Statistical Association. Available: http://ww2.amstat.org/meetings/ices/2007/proceedings/ICES2007-000237.pdf.

Gates, G.W. (2009). Expanding Statistical Use of Administrative Data: A Research Proposal Focused on Privacy and Confidentiality. Unpublished Working Paper. Washington, DC: U.S. Census Bureau. Available: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.178.7063&rep=rep1&type=pdf.

Giesen, D. (2007). The response process model as a tool for evaluating business surveys. In Proceedings of the 3rd International Conference on Establishment Surveys (ICESIII) (pp. 871–880). Alexandria, VA: American Statistical Association. Available: http://ww2.amstat.org/meetings/ices/2007/proceedings/ICES2007-000056.pdf.

Groves, R.M., and Heeringa, S.G. (2006). Responsive design for household surveys: Tools for actively controlling survey errors and costs. Journal of the Royal Statistical Society: Series A (Statistics in Society), 169(3), 439–457. doi: https://doi.org/10.1111/j.1467-985X.2006.00423.x.

Haraldsen, G. (2013). Questionnaire communication in business surveys. In G. Snijkers, G. Haraldsen, J. Jones, and D.K. Willimack (Eds.), Designing and Conducting Business Surveys (Ch. 8). Hoboken, NJ: John Wiley & Sons.

Haraldsen, G., Snijkers, G., and Zhang, L.-C. (2015). A Total Survey Error Approach to Business Surveys. Presentation to the International Total Survey Error (TSE) Conference, Baltimore, MD, September 19–22. Available: https://www.eiseverywhere.com/file_uploads/e98531cd1321d4fe4cb4e09ef143fa78_TSE15Abstracts_FINAL.pdf.

Hunter, L., and Carbonneau, J.-F. (2005). An Active Management Approach to Survey Collection. Presentation to the 2005 International Statistics Canada Methodology Symposium: Methodological Challenges for Future Information Needs, Ottawa, Canada, October 25–28.

Jarmin, R. (2017). Modernizing Economic Statistics: Opportunities and Challenges. Presentation to the 2017 International Conference on New Techniques and Technologies, Brussels, Belgium, March 13–17. Available: http://nt17.pg2.at/data/presentations/presentation_261.pdf.

Laflamme, F., Maydan, M., and Miller, A. (2008). Using paradata to actively manage data collection survey process. In Proceedings of the Survey Research Methods Section, Joint Statistical Meetings (pp. 630–637). Alexandria, VA: American Statistical Association. Available: https://ww2.amstat.org/sections/srms/Proceedings/y2008/Files/300608.pdf.

Lineback, J.F., and Thompson, K.J. (2010). Conducting nonresponse bias analysis for business surveys. In Proceedings of the Government Statistics Section, Joint Statistical Meetings (pp. 317–331). Alexandria, VA: American Statistical Association. Available: https://ww2.amstat.org/sections/srms/Proceedings/y2010/Files/306113_55883.pdf.

Lorenc, B. (2007). Using the theory of social distributed cognition to study the establishment survey response process. In Proceedings of the 3rd International Conference on Establishment Surveys (ICESIII) (pp. 881–891). Alexandria, VA: American Statistical Association. Available: http://ww2.amstat.org/meetings/ices/2007/proceedings/ICES2007-000247.pdf.


Marquette, E., Kornbau, M., and Toribio, J. (2015). Testing contact strategies to improve response in the 2012 Economic Census. In Proceedings of the Section on Government Statistics. Alexandria, VA: American Statistical Association.

Mesenbourg, T.L., Jr. (2015). Technology and Emerging Data Needs. Unpublished paper. Fairfax, VA: Key Concepts Knowledgebase, LLC.

Morrison, R.L., Stokes, S.L., Burton, J., Caruso, A., Edwards, K.K., Harley, D., Hough, C., Hough, R., Lazirko, B.A., and Proudfoot, S. (2008). Economic Directorate Guidelines on Questionnaire Design. Washington, DC: U.S. Census Bureau. Available: https://www.census.gov/srd/Economic_Directorate_Guidelines_on_Questionnaire_Design.pdf.

National Academies of Sciences, Engineering, and Medicine. (2017). Innovations in Federal Statistics: Combining Data Sources While Protecting Privacy. Washington, DC: The National Academies Press. doi: https://doi.org/10.17226/24652.

National Research Council. (2014). Capturing Change in Science, Technology, and Innovation: Improving Indicators to Inform Policy. Panel on Developing Science, Technology, and Innovation Indicators for the Future. R.E. Litan, A.W. Wyckoff, and K.H. Fealing (Eds.). Committee on National Statistics and Board on Science, Technology, and Economic Policy. Washington, DC: The National Academies Press. doi: https://doi.org/10.17226/18606.

Snijkers, G. (1992). Computer assisted interviewing: Telephone or personal? In A. Westlake, R. Banks, C. Payne, and T. Orchard (Eds.), Survey and Statistical Computing (pp. 137–146). Amsterdam: North-Holland.

Snijkers, G., and Haraldsen, G. (2013). Managing the data collection. In G. Snijkers, G. Haraldsen, J. Jones, and D.K. Willimack (Eds.), Designing and Conducting Business Surveys (Ch. 10). Hoboken, NJ: John Wiley & Sons.

Snijkers, G., and Jones, J. (2013). Business survey communications. In G. Snijkers, G. Haraldsen, J. Jones, and D.K. Willimack (Eds.), Designing and Conducting Business Surveys (Ch. 9). Hoboken, NJ: John Wiley & Sons.

Snijkers, G., and Willimack, D.K. (2011). The Missing Link: From Concepts to Questions in Economic Surveys. Presentation to the 2nd European Establishment Statistics Workshop (EESW11), Neuchâtel, Switzerland, September 12–14. Available: https://s3.amazonaws.com/sitesusa/wp-content/uploads/sites/242/2014/05/Willimack_2012FCSM_VII-B.pdf.

Snijkers, G., Göttgens, R., and Hermans, H. (2011). Data Collection and Data Sharing at Statistics Netherlands: Yesterday, Today, Tomorrow. Presentation to the 59th Plenary Session of the Conference of European Statisticians (CES), United Nations Economic Commission for Europe (UNECE), Geneva, Switzerland, June 14–16. Available: http://www.unece.org/fileadmin/DAM/stats/documents/ece/ces/2011/20.e.pdf.

Statistics Canada. (2015). Integrated Business Statistics Program Overview. Catalogue No. 68-515-X. Ottawa: Statistics Canada. Available: http://www.statcan.gc.ca/pub/68-515-x/2015001/mi-rs-eng.htm.

Thompson, K.J., and Kaputa, S.J. (2017). Investigating nonresponse subsampling in establishment surveys through embedded experiments. Journal of Official Statistics, 33(3), 835–856. doi: http://dx.doi.org/10.1515/JOS-2017-0038.

Thompson, K.J., and Oliver, B.E. (2012). Response rates in business surveys: Going beyond the usual performance measure. Journal of Official Statistics, 28(2), 221–237.

Tourangeau, R., Rips, L.J., and Rasinski, K. (2000). The Psychology of Survey Response. New York: Cambridge University Press.

U.N. Economic Commission for Europe. (2007). Register-Based Statistics in the Nordic Countries: Review of Best Practices with Focus on Population and Social Statistics. Geneva, Switzerland: Author. Available: http://www.unece.org/fileadmin/DAM/stats/publications/Register_based_statistics_in_Nordic_countries.pdf.


U.S. Census Bureau. (2013). U.S. Census Bureau Statistical Quality Standards Section A2, Developing Data Collection Instruments and Supporting Materials. Washington, DC: Author. Available: https://www.census.gov/content/dam/Census/about/about-the-bureau/policies_and_notices/quality/statistical-quality-standards/Quality_Standards.pdf.

U.S. Census Bureau. (2016). Economic Directorate’s Big Data and Passive Data Collection Overview. Washington, DC: Author. Available: https://www2.census.gov/cac/sac/meetings/2016-09/2016-hogue.pdf.

U.S. Office of Management and Budget. (2006). Statistical policy directive no. 2: Standards and guidelines for statistical surveys. Federal Register, 71, 55522. Available: https://www.federalregister.gov/d/06-8044.

Wallgren, A., and Wallgren, B. (2007). Register-Based Statistics: Administrative Data for Statistical Purposes. Hoboken, NJ: John Wiley & Sons.

Willimack, D.K. (2013). Methods for the development, testing, and evaluation of data collection instruments. In G. Snijkers, G. Haraldsen, J. Jones, and D.K. Willimack (Eds.), Designing and Conducting Business Surveys (Ch. 7). Hoboken, NJ: John Wiley & Sons.

Willimack, D.K., and Nichols, E. (2001). Building an alternative response process model for business surveys. In Proceedings of the Joint Statistical Meetings. Alexandria, VA: American Statistical Association. Available: http://www.websm.org/db/12/15974/Web_Survey_Bibliography/Building_an_alternative_response_process_model_for_business_surveys/?menu=1&lst=&q=search_0_1_-1&qdb=12&qsort=0.
