The panel’s review of several aspects of the survey’s statistical quality was challenging. At several points in this report, some of the methods and practices used in ARMS are characterized as “unique” or “unconventional.” In large part, the unique nature of the survey is due to its complexity, with multiple modes and phases and with the goal of collecting, classifying, and aggregating several types of information from three interrelated but not entirely overlapping reporting units. ARMS also reflects some unique practices that are part of the U.S. Department of Agriculture’s (USDA) way of doing business, such as its board review process, which are not within the panel’s purview to assess. Nonetheless, the panel has been able to document and assess the adequacy of the survey design, data collection, analysis, and dissemination.
The panel concludes that the National Agricultural Statistics Service (NASS) and the Economic Research Service (ERS) are paying appropriate attention to the basic elements of survey quality, although much more could be done to improve important features of the survey. Several aspects of survey operations need more attention, including the use of analytical tools to investigate the quality of survey responses; tighter control and further automation of the interview process; a shift in focus from nonresponse rates to nonresponse bias; the introduction of new methods for imputing missing values, together with documentation of the results of imputation; improvements to variance estimation that are more compatible with the kinds of data analysis now conducted with the survey; and greater attention to facilitating access to the data files for research and analysis.
In addition to identifying areas of needed improvement in current methods and practices, our review identifies several emerging challenges. These challenges are associated with the changing structure of farming, overall trends in federal surveys, such as the growing difficulty of obtaining satisfactory survey response, and the growing sophistication of survey data users both inside and outside the federal government. The agencies have attempted to respond to these challenges with some foresight by adding new questions, testing such initiatives as incentives to increase response, developing proposals to collect longitudinal data, and enhancing the provision of microdata files in a protected environment. Our review leads to the conclusion that several areas still need attention, and the recommendations that follow may be considered a roadmap to the future for ARMS.
We are aware that our list of recommendations is long and that some of them will be costly to implement. Full implementation of all of them would require a significant fraction of the ARMS budget. In our view, if additional funds cannot be obtained, at least those recommendations involving methodological research and development directly related to data quality assurance should be undertaken, even at the expense of reducing the size or scope of the survey. For other costly recommendations, notably the training