4
Sample and Questionnaire Design

In this chapter, the panel considers issues of design and development of key methodological aspects of the Agricultural Resource Management Survey (ARMS), focusing on the sample frame for ARMS, the foundation and implementation of the multiphase stratified sample design of the survey program, and the design and development of the survey questionnaire. These methodological issues are addressed in light of survey goals and are evaluated in consideration of the panel’s understanding of the current state of the art in survey and questionnaire design.

The panel recognizes that the survey has evolved over the years into an “as-is” state that must be taken into account when considering possible improvements. In summary, ARMS is executed in three distinct but interrelated phases. Each phase is complicated; together they form a complex mosaic of question variations, differing modes of administration, and tailored sample designs to represent the different populations covered. All of these factors must be considered jointly in the overall design process.

  • The Phase I survey (screening survey) serves to screen a standing list of farms for commodities of interest, as well as for whether or not the farm is in business. The Phase I screening is sometimes combined with screening for the vegetable chemical use survey or the crops/stocks survey. It is commodity-specific and has up to 48 state-specific versions so as to avoid asking respondents about commodities not usually grown in their areas. It employs telephone, web, and mail data collection modes.

  • The Phase II survey (production practices survey) covers the use of chemicals, fertilizers, and pesticides and has from one to four versions of the questionnaire, based on commodity. As noted earlier, the ARMS Phase II questionnaire was recently and temporarily integrated with another survey questionnaire for the Conservation Effects Assessment Project (CEAP) for those operations that were selected for both the ARMS and CEAP samples. It has only one collection mode: face-to-face personal interviews.

  • The Phase III survey (cost and returns survey) asks about farm and household economics and farm/farm operator characteristics. It includes several questionnaire versions (a general questionnaire, from one to three commodity-specific questionnaires, and a core questionnaire) and adds sample units specifically to produce sufficiently reliable estimates for the 15-state oversample. The sample for this phase is drawn from both the list frame (also the source of sample units for Phases I and II) and an area frame, using a sample selected from eligible farms identified in the annual June area survey. Every five years it is integrated with the Census of Agriculture. It incorporates several modes of data collection (face-to-face, telephone, and mailout-mailback with face-to-face follow-up for mail nonrespondents, with the Internet soon to come).

Each phase of the survey operations is a survey design challenge in itself. When combined in one program, they pose an intricate, interwoven series of design challenges that must be addressed holistically. It is useful to consider those challenges in light of sampling and questionnaire design goals, such as minimizing respondent burden, minimizing cost, and ensuring compatibility among the pieces and across time.

SURVEY DESIGN GOALS

Minimizing Respondent Burden

In a survey with so many phases and lengthy questionnaires on highly technical topics, the issue of respondent burden is pressing. The response burden (in minutes) for ARMS estimated by the National Agricultural Statistics Service (NASS) varies substantially for the different components of each survey phase (Table 4-1). The costs and returns survey (Phase III) is especially burdensome, containing questions that are difficult for the respondent to answer—often because the data are hard to retrieve or estimate and sometimes because the question is conceptually complex or unclear from the respondent’s point of view.

These observations have several implications. One is that the survey suffers both from item nonresponse—that is, missing values of variables—and from unit nonresponse—that is, entirely missing observations in at least one phase.


TABLE 4-1 Estimated Response Burden by Survey and Phase, 2006

Survey                                              Minutes per Response
Integrated screening survey (Phase I)                                 15
Vegetable chemical use                                                32
Practices and costs (Phase II)                                        57
Costs and returns (Phase III)                                         83
15-state core, costs and returns (Phase III)                          57
Commodity costs of production (Phase III)                            105
Organic soybeans practices and costs (Phase III)                      57
Organic soybeans costs and returns (Phase III)                       105

SOURCE: National Agricultural Statistics Service, submission to U.S. Office of Management and Budget, 2007.

Another consequence is that there has been a conscious effort on the part of survey managers to avoid repeat visits to the same farm in successive years when a farm is included in ARMS. The decision to avoid repeat visits limits the ability of the survey to follow farms longitudinally. As discussed later in this chapter, ARMS employs a special sampling routine (called Perry-Burt or P-B; see below) to reduce the likelihood that a farm will be selected for another survey in the same year or for ARMS two years in a row. A further result is that the scope of the survey is circumscribed beyond what analysts consider desirable.1

Minimizing Costs

ARMS is a very expensive program. In an effort to hold down those expenses, as the survey has evolved, many steps have been taken to build in efficiencies and control costs. The reliance of the survey on many of the same respondents for more than one phase of data collection reflects, in part, an attempt to achieve collection efficiencies. The use of mixed modes for data collection is often an important means of controlling costs. The cooperative agreement arrangement with the National Association of State Departments of Agriculture provides a substantially more economical means of collecting data than would be possible working with any other data collection organization, in the view of NASS.2 These and other measures that are designed in part to achieve cost efficiencies have potential implications for data quality.

1

Presentation by Katherine Smith, Economic Research Service, February 2, 2006.

2

Presentation by Robert Bass, Census and Surveys Division, National Agricultural Statistics Service, June 8, 2006.


Ensuring Compatibility

To ensure that the survey is comprehensible to the respondent and that there is theoretical coherence to the concepts employed by the analysts of the ultimate data, it is important that ARMS use a common conceptual framework across the phases and versions of the survey. This means that ARMS must establish consistent concepts and definitions among the phases and versions and that they should be coded for retrieval in a consistent manner as well. For the most part, ARMS has been successful in ensuring consistency of concepts and definitions, but there are exceptions. Since consistency of concepts and definitions relies on the common wording of questions and common formatting from version to version, the impact of truncating the number of questions in the self-administered or core questionnaire (version 5) in Phase III is of concern. For example, do the answers to fewer questions obtained by mail in the core questionnaire compare directly with the more detailed answers to the questions obtained by face-to-face interviews in the other versions of the Phase III questionnaire? These issues are discussed in this chapter.

Similarly, there are variations in compatibility across time. ARMS collects data on production practices for a rotating list of specific agricultural commodities, meaning, in practice, that some of the questions must change from one survey round to the next because production practices vary from commodity to commodity (see Table 2-1 for recent commodity coverage). Also, topics of special policy or research interest may be introduced and subsequently eliminated as the rationale for questions on the topics changes. These design features of the survey are sometimes incompatible with a desire to maintain a consistent core of questions over time. The goal should be consistency across time for commodities to the degree possible.

These changes in the questionnaires are likely to have affected the time series in unknowable ways. Particularly because the effects cannot now be quantified, they are of concern. Various design decisions could be made to increase consistency, such as implementing a panel design with consistent cost-of-production data on at least one commodity over a period of years to study dynamics over time, evaluate the effect of periodic changes in programs, and predict what factors change behavior.

SAMPLING FRAME

Target Population

The sampling frame is developed to ensure coverage of the population of interest (the target population) in the sample population. The target population for ARMS Phase III is the official U.S. Department of Agriculture (USDA) farm population in the 48 contiguous states. This population includes all establishments (except institutional farms, see footnote 3) that sold or would or could normally have sold at least $1,000 (nominal) of agricultural products during the year.

This target population was originally established for the Census of Agriculture. By using this definition for both the census and the surveys, USDA appropriately ensures consistency between the census and the surveys. Furthermore, the definition of the target population has been consistently employed since 1974, so it has become ingrained as the appropriate population of interest. Nonetheless, a recent review of the Census of Agriculture concluded that applying a target population that extends coverage to very small farms with little overall effect on agricultural production imposes significant challenges for NASS in finding its target population and getting that population to respond—and respond accurately—to the census and surveys (Council on Food, Agricultural and Resource Economics, 2007, p. 23).

Some of the changes in American agriculture outlined in Chapter 2 have increased the challenges imposed by this long-standing definition of the target population. The Census of Agriculture review likewise noted that the growth of large complex agricultural operations, integrated production, nontraditional farms, and “life-style” farms has made practical interpretation of the definition a continuing challenge (Council on Food, Agricultural and Resource Economics, 2007, p. 13). While it is appropriate that these matters of definition be considered in the context of the Census of Agriculture to continue to ensure compatibility between the census and the surveys, the impact of the definition on ARMS should be recognized.

There are often exceptions to the general rule in statistical surveys, made for practical purposes. Some types of farming operations that might be considered to meet the farm population definition (e.g., “abnormal” farms) are not considered part of the ARMS target population. “Point farms,” those with only the potential to sell at least $1,000 of agricultural products, are difficult to find as consistently as more clearly commercially oriented farms.3 Such factors impose an added burden, since the exceptional types of farms must be identified so that they can be contacted and asked scoping questions to determine whether they should be included or excluded.

3

A point farm is a farm that did not sell at least $1,000 of agricultural products during the year but could have. Point farms are included under the “would normally have sold” part of the farm definition. The determination of whether an agricultural establishment qualifies as a point farm is made by assigning specific point values for crop acreage and livestock inventory. Any establishment with at least 1,000 points will be defined as a point farm. Each assigned point is assumed to represent one dollar in potential sales. It is necessary to correctly identify these point farms to ensure their representation in the summary.

An abnormal farm is out of scope for the survey. It is defined as a business (i.e., operates land for agricultural purposes or with potential for agricultural production) that does not fit the criteria for the ARMS sample population. This includes Indian reservations, prison farms, private or university research farms, not-for-profit farms operated by religious organizations, and high school Future Farmers of America farms. These institutional farms do not have the same expenses or income patterns as traditional farms (National Agricultural Statistics Service, 2005, p. 4).



Dual Frame

NASS develops two sampling frames to select farms for ARMS and other periodic surveys. The primary sample is derived from the NASS list frame. The list frame for NASS surveys is different from and less comprehensive than the list for the agricultural census in that it does not contain potential farming and ranching operations that are available to NASS but have not yet been screened for agricultural activity.

Emphasis in constructing the list frame is placed on farms producing significant amounts of commodities for which NASS provides annual estimates of acreages, yields, and production. A special effort is made to identify and include cases in which a few holdings provide a large share of production of an important commodity, such as cattle in feedlots, hogs, poultry, potatoes, or rice. NASS attempts to keep the list frame as complete as possible, especially for the large producers. Recently, however, NASS has devoted extra attention to ensuring coverage of small farms and ranches and minority operators of farms and ranches. As recommended by the 1998 USDA National Commission on Small Farms, NASS has stepped up its outreach efforts to communities representing small and minority farms as it constructs the list frame (U.S. Department of Agriculture, 2006a).

The list is constructed and maintained from many different sources, including other NASS surveys and administrative files, as well as third-party commercial databases and USDA program files. Names obtained from such sources that are not already on the NASS list frame are screened to determine their farm status prior to inclusion in the list. Records for approximately 1.3 million farms were carried on the list frame in 2005.

The second sampling frame for ARMS is the NASS area frame. This frame supports the NASS June area survey sample, which is constructed anew each spring. ARMS selects a subsample of the June area survey sample that is not on the ARMS list sample and meets the official USDA definition of a farm. This process provides coverage of eligible farms that are not included in the list frame. The eligible farms not on the list frame are also known as nonoverlap (NOL) farms.
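The estimation logic of this screening dual-frame design is simple: because area-sample farms that overlap the list frame are screened out, the list and NOL domains partition the farm population, and an estimated total is just the sum of the two weighted domain totals. The following is a minimal sketch assuming simple expansion (Horvitz-Thompson) weights; all values and weights are invented for illustration.

```python
# Minimal sketch of a screening dual-frame total: because overlap farms
# are removed from the area sample, the list and NOL domains partition
# the population and their weighted totals simply add.
def ht_total(values, weights):
    """Horvitz-Thompson estimate of a population total."""
    return sum(v * w for v, w in zip(values, weights))

# Hypothetical reported values and sampling weights (1 / inclusion prob.)
list_values, list_weights = [52_000, 8_400, 310_000], [40.0, 220.0, 4.0]
nol_values, nol_weights = [3_100, 9_800], [610.0, 380.0]

total = ht_total(list_values, list_weights) + ht_total(nol_values, nol_weights)
print(f"Estimated population total: ${total:,.0f}")
```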

In developing the area frame, NASS relies on satellite imagery, other aerial photographs, and maps to divide the U.S. land area into small segments. Each segment is about 1 square mile, and each has unique and identifiable boundaries. In most states, the segments are divided for sampling purposes into four broad land use categories classified by intensity of cultivation: land intensively cultivated for crops; land used primarily for livestock production; residential and business areas in urban areas; and areas devoted to parks, military installations, and other uses.


An initial area frame sample is randomly selected from these segments. The resulting probability sample contains about 15,400 area segments—roughly 0.7 percent of the total land area in the 48 contiguous states. Each June, NASS uses this area sample file to conduct a major multiple frame survey outside ARMS, in which about 52,000 farmers are visited by enumerators to get a firsthand accounting of their agricultural activities. This midyear survey identifies all land uses within the segment and collects information about crops, operator households, animals, grain storage facilities, and environmental factors. The resulting information can be used to stratify farm operations in the selected segments to target crops for follow-on surveys.

In principle, the area frame sample provides coverage of all agricultural activity in the United States, regardless of changes in farm boundaries and management. This sample frame construction technique tends to guard against omissions or duplication in the list frame. Indeed, in 2005 the area frame added about 1,600 eligible nonoverlap units to the frame for the Phase III ARMS when combined with the Phase I screened sample.

The dual frame approach used in ARMS has several benefits, as well as some possible drawbacks. There is no doubt that the area listing operation identifies a large number of small and other types of farms that are not identified in the more traditional listing operation. Many small farms are below the radar for local extension services and others who provide input to the lists. These farms often come into business and exit again relatively quickly. They share with other small businesses the characteristic of being hard to identify on a timely basis.

Given the farm population that is required to be covered by the sample, there are no meaningful alternatives to using the area frame for including omissions from the list frame. NASS does maintain a number of potential farm records on the list frame that are screened on a regular basis for agricultural activity. As mentioned previously, these potential farm records are included on the mail list for the Census of Agriculture but are not included on the sampling frame for ARMS. A relatively low percentage of these records are actual farms. Also, the number of potential farm records varies widely during the five-year census cycle, with the largest number available only in the year preceding the census. So, for consistency and efficiency reasons, NASS has decided to include on the ARMS sampling frame only records that have been identified as farms on the basis of previous survey data.

There is a question as to whether the benefits of finding these small farms are worth the cost. On one hand, inclusion of small farms, in the larger view, adds little to the estimates of the volume of production and the understanding of the overall impact of agriculture in the U.S. economy. On the other hand, there is a serious national concern about the well-being of small farms and the farm families on them. Besides, the costs of developing the area frame for ARMS are marginal. The June survey, which bears most of the expense of developing the area frame sample, supports multiple survey operations and provides useful information in its own right. Still, if some way could be found to bring small farms and new operations into the regular listing operation for ARMS with more certainty, the now-marginal costs involved in the area frame operation could be further reduced or eliminated.

SAMPLE DESIGN

Three major objectives establish the sample design parameters for the ARMS: adequately representing all size classes, reducing respondent burden, and attaining an expected level of precision.

Sample Selection

The selection of the target population, described above, is largely done to ensure representation of all size classes of farms selling at least $1,000 of agricultural products. The target population defines the frame, which, in turn, plays an important role in defining the design of the survey. Translated into operational terms, ARMS covers all noninstitutional agricultural establishments with a farm value of sales (FVS) of at least $1,000 in agricultural products.4 The design objective changes from year to year as different types of farms are targeted for the cost of production component of the survey. To accomplish these objectives, the sample frame is subsequently stratified by farm value of sales and farm type (Table 4-2) for sampling and estimation purposes. The selection of the sample from the list frame follows these steps:

  1. The population is classified into five strata defined by farm value of sales.

  2. The Phase I sample is selected. The sample is selected independently by state. For each state, a systematic sample is selected within strata. The strata are formed based on farm value of sales. Within each stratum the population is sorted by type of farm before the stratified systematic sample is drawn.

  3. The Phase I sample is reselected to eliminate duplication with other surveys. After the sample is drawn, poststrata are formed based on type of farm within the farm value of sales strata. The purpose of the poststrata is to control the way the Perry-Burt procedure (explained below) will reselect the sample to reduce respondent burden. If there are five FVS strata and 17 types of farms, then this will result in a maximum of 85 poststrata. The Perry-Burt procedure cross-classifies these poststrata with the strata definitions for the other surveys, as well as the strata definitions for last year’s ARMS sample, and reselects a sample within these cells that has less burden than the original sample. The poststrata ensure that only similar types of farms are replaced.

4

The farm value of sales is calculated by assigning points on a per-head/per-area basis that reflect expected sales.
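The point-value calculation described in footnotes 3 and 4 can be sketched as follows. The per-unit point values here are invented for illustration; NASS’s actual point tables are not reproduced in this report.

```python
# Stylized point-value computation for determining farm status: each acre
# or head is assigned points approximating a dollar of expected sales,
# and an operation totaling at least 1,000 points qualifies.  The
# per-unit values below are invented, not NASS's actual point tables.
POINTS_PER_UNIT = {"corn_acres": 350, "soybean_acres": 220,
                   "beef_cows": 550, "market_hogs": 110}

def fvs_points(operation):
    """Total points for an operation given its acreage/inventory counts."""
    return sum(POINTS_PER_UNIT[item] * qty for item, qty in operation.items())

op = {"corn_acres": 2, "beef_cows": 1}          # a very small operation
points = fvs_points(op)
print(points, "points; qualifies as a (point) farm?", points >= 1_000)
```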


TABLE 4-2 ARMS Sample by Farm Types and Sizes, 2005

Type of Farm                       $1,000-    $100,000-    $250,000-     $500,000-   $1,000,000+     Total
                                  $100,000     $250,000     $500,000    $1,000,000
Oilseeds, grains, beans              3,830        1,856        1,534         1,371         1,055     9,646
Tobacco                                128           50           46            32            19       275
Cotton                                  65          131          188           269           304       957
Vegetables, melons & potatoes          172          101          131           187           746     1,337
Fruit, tree nuts & berries             795          358          281           291           563     2,288
Greenhouse, nursery                    125          470          303           244           810     1,952
Cut Christmas trees                     60           21            4            11            23       119
Other crops & hay                    1,684          241          106            98           131     2,260
Hogs & pigs                             87          113          163           256           437     1,056
Milk                                    99          584          601           420         1,026     2,730
Cattle & calves                      2,945          976          518           347           570     5,356
Sheep & goats                          139            6           11             7            14       177
Equine                                 470            4                                                 474
Poultry & eggs                          85          347          708         1,203         1,228     3,571
Aquaculture                             28           75           17            25            67       212
Other animal                            73           19            8             4             8       112
Land/cropland                                                                                           71
NOL (area)                                                                                           1,610
Total                               10,846        5,360        4,620         4,766         7,001    34,203

SOURCE: Economic Research Service.



  4. The Phase I sample is screened for target crops and “in-business” status. (“In-business” status means the screening survey response indicates the operation meets the ARMS definition for a farm operation [at least $1,000 gross value of sales or potential] for the survey reference year.)

  5. The Phase III samples for costs and returns and the core versions are selected from the “good reports” to represent all agriculture in a state.

  6. The Phase III list sample is then supplemented with farms that were found in the area listing operation.

  7. The Phase II commodity samples are selected from the “good reports” with target crops. (“Good reports” are screened samples that are in business and meet any other survey criteria, such as farm type, commodity of interest, or organic certification, based on the commodity mix and questionnaire version for which the sample is targeted.)

After assignment into the FVS strata, the sample is further stratified by farm type. Each farm operation is classified into one of 17 farm types, and the type of farm forms the substrata within the design strata. The following types of farms were used for classification in 2005: oilseeds, grains, and beans; tobacco; cotton; vegetables, melons, and potatoes; fruit, tree nuts, and berries; greenhouse and nursery; cut Christmas trees; other crops and hay; hogs and pigs; milk; cattle and calves; sheep and goats; equine; poultry and eggs; aquaculture; other animals; and total land/cropland of all types. Table 4-2 shows the resulting ARMS sample for 2005 by type of farm and size.
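A minimal sketch of the selection mechanics described in steps 1 and 2 (a systematic sample drawn within FVS strata after sorting each stratum by farm type) follows. The frame, strata, and allocation are hypothetical; sorting by type before the systematic pass is what spreads each stratum’s sample across the farm-type substrata.

```python
import random

def systematic_sample(units, n):
    """Equal-probability systematic sample of n units (assumes n <= len)."""
    k = len(units) / n                       # sampling interval
    start = random.uniform(0, k)             # random start in [0, k)
    return [units[int(start + i * k)] for i in range(n)]

# Hypothetical frame records: (farm_id, fvs_stratum, farm_type)
frame = [(i, random.randrange(5), random.randrange(17)) for i in range(10_000)]
allocation = {0: 60, 1: 40, 2: 30, 3: 30, 4: 50}   # sample size by FVS stratum

sample = []
for stratum, n in allocation.items():
    in_stratum = [r for r in frame if r[1] == stratum]
    in_stratum.sort(key=lambda r: r[2])      # sort by farm type, so the
    sample += systematic_sample(in_stratum, n)   # systematic pass spreads
                                                 # the sample across types
print(len(sample), "units selected")
```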

Sample coverage varies significantly by farm type and size. The sample included 10,846 farms in the smallest size class ($1,000 to $100,000 FVS) in 2005, or less than 0.6 percent of the approximately 1.8 million farms in that size class. Large farms are oversampled. About one-fourth of the approximately 28,000 farms in the largest size class ($1,000,000 or more FVS) are in the sample. As might be expected, given the thinness of the sample for many of the farm types and sizes, not every farm type and size class is covered in each state.

Note that, left unadjusted, inflation over time will cause the nominally fixed dollar limits in ARMS to admit ever “smaller” farms, because the $1,000 limit of actual or potential income falls in inflation-adjusted terms. Similarly, the stratification categories would shift in real terms. Although these changes may be small in the short run, they are likely to cumulate to a substantial change over a decade of operation of ARMS. To avoid this cumulative effect, it would be useful to fix the dollar amounts in the dollars of a chosen base period, holding the thresholds constant in real terms until some event spurs more fundamental reconsideration of the sample design.
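One way to implement this suggestion is to state the cutoffs in base-period dollars and convert them to current dollars with a price index each survey year. A minimal sketch follows; the index values are invented for illustration.

```python
# Hold the FVS strata boundaries fixed in real terms: state each cutoff in
# base-year dollars, then convert to current dollars with a price index.
# The index levels below are invented for illustration (base year = 100).
BASE_CUTOFFS = [1_000, 100_000, 250_000, 500_000, 1_000_000]   # base-year $

price_index = {2005: 100.0, 2006: 103.2, 2007: 106.1}

def current_dollar_cutoffs(year, base_index=100.0):
    factor = price_index[year] / base_index
    return [round(c * factor) for c in BASE_CUTOFFS]

# The nominal boundaries rise with inflation, so the real definition of a
# "small" or "large" farm stays constant across survey years.
print(current_dollar_cutoffs(2007))
```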

Strategy to Reduce Respondent Burden

The second major influence on the design is the objective of reducing respondent burden. As mentioned earlier, a major strategy for reducing burden is to avoid revisiting respondents from other NASS surveys and those who reported to ARMS in prior years.

NASS employs a method developed in the early 1990s by Charles Perry, Jameson Burt, and William Iwig to control sampling so as to minimize the number of times NASS samples a farm operation across several surveys. Called the P-B method, it is designed to reduce the likelihood that a farm will be in the survey two years in a row (Kott and Fetter, 1999). The P-B method cross-classifies the Phase I ARMS sample with the samples selected the previous year for ARMS and four other recurring USDA surveys—hogs, cattle, crops/stocks, and labor. The P-B method groups the four non-ARMS surveys to identify duplications across the surveys (first stage) and then groups the ARMS sample across years (second stage). As mentioned above in the discussion of the sample selection steps, the ARMS sample is then redrawn to have less overlap with the other surveys and with itself over the years. Essentially, then, the Perry-Burt procedure cross-classifies the ARMS poststrata with the strata definitions for the other surveys, as well as the strata definitions for prior years’ ARMS samples, and reselects a sample within these cells that is less burdensome to the respondents than the original sample. The cross-classification is done within farm type substrata that are defined within the larger FVS strata. This is done to minimize bias in the final sample. Without the substrata, the P-B method could trade one type of farm for another; for example, a nursery for a dairy.
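The published details of the P-B algorithm are in Perry et al. (1994) and are not reproduced here; the following stylized sketch conveys only the general idea of swapping burdened sample units for similar units within the same poststratum cell. It is an illustration of the concept, not NASS’s actual procedure.

```python
import random

# Stylized illustration only (NOT the actual Perry-Burt algorithm): within
# each poststratum cell (farm type within FVS stratum), a sampled unit
# that is already "burdened" -- selected for another survey or for last
# year's ARMS -- is swapped for an unburdened unit from the same cell, so
# any replacement is a similar type and size of farm.
def reselect_within_cells(sample, frame_by_cell, burdened_ids, rng=random):
    chosen = {u["id"] for u in sample}
    result = []
    for unit in sample:
        if unit["id"] not in burdened_ids:
            result.append(unit)
            continue
        spares = [u for u in frame_by_cell[unit["cell"]]
                  if u["id"] not in burdened_ids and u["id"] not in chosen]
        pick = rng.choice(spares) if spares else unit   # keep unit if no spare
        chosen.add(pick["id"])
        result.append(pick)
    return result

# Tiny demonstration with one hypothetical cell of six dairy farms
cell = [{"id": i, "cell": "dairy/$250-500K"} for i in range(6)]
frame_by_cell = {"dairy/$250-500K": cell}
print(reselect_within_cells([cell[0]], frame_by_cell, burdened_ids={0}))
```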

Following application of the P-B method to selection of the Phase I sample, NASS draws the Phase II and Phase III list samples from “good” reports in Phase I. Beginning in 2005, NASS has used sequential interval Poisson (SIP) sampling to select the samples for each phase of ARMS (Kott, 2003). As in Phase I, the objective is to reduce burden, so each operation is selected for one and only one sample. In the end, over 33,000 sample units in Phase III were from the list frame and about 1,600 from the area (NOL) frame.
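SIP sampling belongs to the family of permanent-random-number (PRN) techniques. The sketch below shows plain Poisson sampling coordinated with PRNs, in which giving each phase a disjoint slice of the PRN interval keeps any operation out of more than one sample; it is a simplified stand-in for, not a rendering of, the SIP procedure described in Kott (2003).

```python
import random

# Sketch of Poisson sampling coordinated with permanent random numbers
# (PRNs).  Each unit keeps one PRN for life; shifting the selection
# interval by phase keeps the phase samples disjoint.
frame = [{"id": i, "prn": random.random(), "pi": 0.10} for i in range(1000)]

phase2 = [u for u in frame if 0.00 <= u["prn"] < u["pi"]]          # [0, pi)
phase3 = [u for u in frame if 0.50 <= u["prn"] < 0.50 + u["pi"]]   # [0.5, 0.5+pi)

# No operation can land in both samples, because the intervals are disjoint
assert not {u["id"] for u in phase2} & {u["id"] for u in phase3}
print(len(phase2), "Phase II units;", len(phase3), "Phase III units")
```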

There is some concern that the procedures used to reduce the probability of a respondent’s inclusion in multiple surveys might lead to a biased sample. Is the set of cases that would otherwise be selected more than once systematically different from other cases? Does conditioning on selection under two sets of criteria say anything important about the cases, so that omitting such overlap might tend to induce systematic bias?

There has not been any investigation of potential bias induced by the P-B method since the initial analysis that led to the decision to employ it over a decade ago (Perry et al., 1994). The 1994 analysis concluded that the “potential for bias resulting from the second stage of the algorithm will be much less than one percent of the … estimates, hence undetectable in light of the coefficients of variation associated with the estimates.” Since then, NASS has continued to assume that any bias would be overwhelmed by the size of the sampling error and would be undetectable.

NASS is considering several changes to ARMS sampling in the future. Research is under way on moving to a multivariate probability proportional to size (MPPS) design for Phase I. This design would allow more flexibility to further target the sample where it is needed most—rare and poorly represented farm types. The agency is also considering controlling burden and overlap with other surveys using SIP sampling, which would replace the Perry-Burt method of burden reduction (Kott, 2003). These important research areas are the type of work that is suggested for management under the interagency research and development program recommended in this report (Recommendation 3.3).

Level of Precision

A third major influence on the design is the specification of the expected level of precision of the key estimates. The expected level of precision defines the size and design of the survey sample (U.S. Office of Management and Budget, 2006a, p. 7). This level of precision is specified as target coefficients of variation—the ratio of the standard error of an estimate to the mean value of the estimate. A small coefficient of variation (say, 1 percent) indicates that an estimate varies only slightly due to sampling error, whereas a large coefficient of variation means that the estimate is quite imprecise. The most common way to improve the coefficient of variation is to increase the sample size.
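Under simple random sampling the relationship between the coefficient of variation and sample size is direct, as the sketch below shows. The numbers are illustrative only; a stratified design such as ARMS would also need to account for design effects and finite-population corrections.

```python
import math

# The coefficient of variation of an estimated mean under simple random
# sampling is cv = (s / sqrt(n)) / ybar, so it shrinks with sqrt(n) and
# a tighter target generally requires a larger sample.
def cv_of_mean(s, n, ybar):
    return (s / math.sqrt(n)) / ybar

def n_for_target_cv(s, ybar, target_cv):
    """Smallest n meeting the target CV, ignoring the design effects and
    finite-population corrections a stratified survey must add back."""
    return math.ceil((s / (target_cv * ybar)) ** 2)

# Illustrative numbers only: unit SD of $90,000, mean expenses of $60,000
print(n_for_target_cv(s=90_000, ybar=60_000, target_cv=0.025))   # -> 3600
```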

The expected level of precision for key ARMS estimates is set forth in NASS Policy and Standards Memoranda (PSM) 45 (Standards for Target Coefficients of Variation for Major Probability Surveys). Stratification of the eligible population, sample sizes, and sample allocations are determined to achieve the target coefficients of variation specified in this document, subject to budget constraints.

Table 4-3, taken from PSM 45, presents the NASS targets and applies to ARMS as well as other surveys. The PSM was first issued in 1999, before the program expanded to provide state-level estimates for the 15 states, and was updated in 2004.


TABLE 4-3 Target Coefficients of Variation for Expenditures from the Agricultural Resource Management Survey

Item                                                                    U.S.   Category*
Expenditures for:
  Total                                                                  2.5         8.0
  Fuels, interest, farm services, seeds, taxes, fertilizers, chemicals   3.5        10.0
  Feeds, labor, buildings and improvements, farm supplies                7.5        15.0
  Livestock                                                             10.0        20.0

*Maximum values for categories. A category is defined in three ways:

1. Region: Appalachian (KY, NC, TN, VA, WV); Corn Belt (IL, IN, IA, MO, OH); Delta (AR, LA, MS); Lake (MI, MN, WI); Mountain (AZ, CO, ID, MT, NV, NM, UT, WY); Northeast (CT, DE, ME, MD, MA, NH, NJ, NY, PA, RI, VT); Northern Plains (KS, NE, ND, SD); Pacific (CA, OR, WA); Southeast (AL, FL, GA, SC); Southern Plains (OK, TX).

2. Economic class: $1,000 to $9,999; $10,000 to $49,999; $50,000 to $99,999; $100,000 to $249,999; $250,000 to $499,999; $500,000 to $999,999; $1,000,000+.

3. Farm type: livestock, crop.

NASS reports that the agency meets 100 percent of the U.S. targets specified in PSM 45. As for the detailed targets (regional targets, targets for each of the 15 states, and targets by expense class), NASS met 86 percent of the targets in the 2004 ARMS, 75 percent in the 2005 ARMS, and 97 percent in the 2006 ARMS.

QUESTIONNAIRE DESIGN AND DEVELOPMENT

The key linkage between concept and response is the design of the questionnaire. If done well, a questionnaire will yield information consistent with the concepts and definitions. If not, the questionnaire can be a major source of measurement error, defined as “the discrepancy between respondents’ attributes and their survey responses” (Groves, 1987).

There are generally understood to be four sources of measurement error: the interviewer, the respondent, the questionnaire, and the mode of data collection, together with the associated data processing methods (Groves, 1987, pp. S163-S166). In addressing issues of measurement error arising from the questionnaire itself, the panel undertakes to discuss best practices in organizing for and designing questionnaires.

In the past two decades, the science of questionnaire design has been refined to include two groups of specialists who are drawn on in a coordinated effort: content specialists and design specialists. In ARMS, the content specialists are mainly the subject-matter experts in the Economic Research Service (ERS) with program or survey development responsibilities, and the design specialists are mainly the NASS survey professionals who design and evaluate questionnaires, prepare training materials, and attend to the myriad tasks pertaining to capturing information from respondents.

This is not a unique arrangement for a federal survey program. It is often the case that these specializations are based in different agencies, with each agency bringing its strength to the questionnaire design and evaluation process. When properly organized for the task, content and design specialists will perform as an integrated working group, constituting a questionnaire design and evaluation team that reaches out to incorporate the interviewer and the respondent in the process of selecting content and design through field and cognitive testing of each collection mode (Esposito, 2003).

In this section, the panel describes and critiques the questionnaire design process as it has been implemented in ARMS. Several concerns are noted with the current process, and recommendations are made for improvements. Although the panel has concerns about the highly technical nature of some parts of the ARMS questionnaires and about respondents’ understanding of some specific questions, we view a detailed item-by-item review of the questionnaires as beyond the scope of this study.

Periodic Major Redesigns

Questionnaires used in all surveys conducted on a recurring basis need to be evaluated and revised from time to time. The need for revision arises because the topics of interest to the survey sponsors and data users may change over time, respondents’ understanding of questions changes over time, the behaviors and opinions about which respondents might be questioned change over time, and so on. Without proper accommodation, all of these factors could reduce the statistical value and substantive relevance of a questionnaire. Some items may never have worked as intended, so periodic questionnaire revision provides an opportunity to improve them.

The process of changing questionnaire content is probably less daunting if there is a set of prescriptions or guidelines to follow. This reduces the revision task to one of matching existing items to the conditions in each guideline and then, when there is a match, taking the prescribed action. The problem with prescriptions and guidelines is that by definition they are general in nature and rarely fit particular questionnaire items in a straightforward way. Even when it is clear that they apply, the right action is not always clear. Sometimes a guideline may fit a questionnaire item in a superficial way, but because this approach does not take into account the constraints and nuances of the particular survey, the prescription may not actually be appropriate. Finally, the application of guidelines by themselves does not necessarily include an evaluation of their impact—have they helped or hurt or had no impact?

Annual Questionnaire Updating

NASS and ERS follow a more limited process for annual updating of the ARMS questionnaires. As discussed in Chapter 3, this is a shared responsibility between the two agencies. ERS provides questionnaire changes to NASS. The recommendations to add or delete questions are based on current policy issues and data requests from users. ERS writes a justification for each question. In some cases, questions that were tested but did not work in the past are subjected to further testing in this process. The NASS role is to determine the feasibility of the questions, considering question content, timing, and space constraints, in light of U.S. Office of Management and Budget (OMB) requirements.

Potential new or changed questions and combined questionnaires are subjected to at least a rudimentary cognitive review by NASS. The objective of the cognitive pretesting, mainly of paper questionnaires, is to determine whether respondents are able to answer the questions, not to measure data quality. The agency uses two main types of cognitive testing: (a) observing enumerators interviewing respondents, with the observer probing for more information about apparent problems or for general impressions of the questionnaire, and (b) using enumerators as test respondents. Following the fieldwork, the agency summarizes the results and makes appropriate changes.

This ongoing cognitive testing program is constrained by limited resources and by OMB rules requiring formal clearance of questionnaires involving 10 or more subjects. As a result, the cognitive testing is usually limited to fewer than 10 interviews.

NASS is considering several initiatives to strengthen the ongoing cognitive testing program.5 One of these initiatives would be to obtain so-called generic OMB clearance for testing of questionnaires, following the example of several other agencies. OMB has approved such clearances for pretests, cognitive tests, and similar categories of information under which agencies are granted a continuing authorization to modify the instruments and information collected within the limits approved by OMB. Generic clearances require submission of applications for OMB approval and are processed in the same way as other clearances, but they provide greater flexibility for subsequent modification and a simplified process of notification to OMB of changes.

Other initiatives to improve the cognitive testing program include changing the emphasis to evaluating questions based on the quality of the reported data, not just the ability to give an answer; conducting multiple iterations of pretests; and using an independent contractor, as NASS did in the development of the mail version of the Phase III survey, to obtain an outside perspective.

5

Presentation by Kathy Ott, June 8, 2006.



While applauding these initiatives, the panel observes that other federal agencies follow a more extensive research and development approach to questionnaire design, with an ongoing, overall conceptual reevaluation followed by a theoretically guided redesign and rigorous empirical testing of the questionnaire. For example, the Census Bureau has codified standards for developing and pretesting survey instruments and materials and has clearly demarcated the responsibilities of program areas, the Statistical Research Division, and the Economic Statistics and Methods Processing Division. In a statement of policy, the Census Bureau comprehensively sets out standards and guidelines for all bureau programs (U.S. Census Bureau, 2003).

This deliberative process is often very labor-intensive, requiring highly developed skills in the cognitive sciences and in design. The development process is complex, often requiring a variety of methods and more than one iteration, as tests clarify both design issues and the conceptual framework that is feasible to probe. This more deliberative approach would better serve the ARMS program because it would produce more tailored solutions whose effectiveness is empirically grounded.

A recent effort (2003) to develop a short-form, self-administered mail version of ARMS Phase III for the 15-state oversample had some of the character of the methodology research and development approach the panel advocates. The objective was to redesign and improve the shortened ARMS form for mail administration: improving the design for self-administration and comprehension, incorporating instructions into the questionnaire, making it more user-friendly, and advancing the visual design. NASS engaged the Social and Economic Sciences Research Center of Washington State University for this work.6

In the redesign, the investigators considered the current state of science in understanding the linkage between how people perceive and respond to objects in their environment and desirable features of questionnaire design. They developed a series of visual design principles applied to the construction of this questionnaire.

The empirical test of the redesign, however, was not conclusive. The response rate, prior to enumerator follow-up, was only 28 percent, considerably lower than the final response rate achieved in the traditional modes of collection. In their search for a reason for the low response rate, the investigators wondered if the respondents’ perception of length could contribute to that lower mail response rate. Although the redesigned mail questionnaire was 16 pages, in contrast to the more than 30-page interviewer questionnaire, respondents may have been less aware of the survey length when the survey was administered by an enumerator. Moreover, the redesigned version is visually quite dense. These intriguing issues could not be answered in this limited redesign effort, but they could be considered in the context of an ongoing cognitive research and development program.

6

Presentation by Danna L. Moore, Don A. Dillman, and Nikolay Ponomarev, June 8, 2006.

Although there was no experimental control group, NASS was able to conduct two data quality tests comparing the distribution of data for 15 variables and item nonresponse for 9 variables with the concurrently collected enumerator-administered version. However, these limited data quality tests did not permit any conclusion as to which questionnaire was better. Despite the inconclusive findings, NASS elected to continue to use the redesigned form for future surveys because “the redesigned form is much more visually appealing and user friendly as a self-administered instrument” (Ott and Crouse, 2005).

A continuing research and testing program could cognitively test respondents who complete questions to obtain their evaluations of the visual features of the questionnaire and the questions. Other research objectives could address mode effects, including the comparability of data collected through future versions of self-administered paper and web questionnaires with enumerator-mediated surveys.

Several issues warrant focused investigation in the continuing cognitive research and development program, regardless of interview mode:

  1. What information goes into answers? Which questions do respondents answer on the basis of what they know, which by searching their records and financial reports, and which by guessing? It is likely that both record checking and guessing occur in a given interview. How do these different approaches affect data quality, and what can be done to maximize quality for a given approach?

  2. How well do ARMS concepts fit respondents’ concepts? The ARMS questionnaires use fairly technical terminology that may be relatively unfamiliar to some respondents or may mean something different to them than to the question authors. These problems may be of particular concern for small farms. Can question wording be made less technical? What can be done—particularly in enumerator-mediated interviewing—to detect conceptual discrepancies and bring the thinking of respondents and question authors into alignment?

  3. When do examples stimulate respondents’ thinking by helping to define the concept and when do examples restrict respondents’ thinking to just the examples?


UNDERSTANDING RECORD-KEEPING PRACTICES

In addition to understanding the effects of question wording and collection mode, it is important to understand the linkage among questions requiring reference to records, the existence of records, and records that are actually referenced. The kind of information NASS collects on all three phases of ARMS is based on hard facts about the farming operations and thus could be expected to rely heavily on records. Although some items may be quite familiar to every farm operator (acreage, management practices, hours worked, personal and family characteristics, and the like), it seems unlikely that farmers can accurately answer most of the inquiries on pesticide use and costs and returns without reference to some sort of written business record. Of course, the larger and more complex the operation, the more likely it is that responding to a question will require reference to a business or family record.

There is a general understanding of the role that records play in ARMS responses. At the conclusion of the Phase III survey, NASS has asked enumerators to record whether respondents looked at their records, how often they used those records, and which records were used. The concluding questions are tailored to the various questionnaires, since the form of records is understood to vary across operations. The costs and returns (Phase III) questionnaire, for example, asks what record was referred to when reporting most of the income and expense data: a general ledger or personal record book, a formal farm records book or account book, loose receipts, or a computer or computer printout.

In the case of some data items, the existence of records may be assumed. For example, farmers who apply pesticides are required by the Federal Pesticide Recordkeeping Program to maintain the necessary records of restricted-use pesticides to ensure the applicator’s compliance with the regulation.7

Records for other important aspects of farmers’ businesses may be less accessible. A study in Minnesota and Wisconsin found that one-third of farmers kept records only as needed for tax purposes, 43 percent used whole-farm record keeping, and just 2 percent did enterprise budgeting. The usual kinds of financial statements that are common in nonfarm businesses are often inadequate or missing altogether for farm businesses; according to lenders, an average of only 40 percent of all farm loan applicants prepare financial statements, and 16 percent prepare business plans (Van Schaik, 2003).

One recent formal NASS study of respondent record-keeping practices gave equally discouraging results for a group of 96 farm operations in Missouri and Virginia that had previously reported on the ARMS Phase III survey. This 1998-1999 study, called the Panel Plus Pilot Study, arranged for a supervisor to accompany the enumerator to the selected units to observe the interview and report back on operator reactions, availability of data, types of records, and the feasibility of using alternative data collection strategies (Ott, 1999).

7

The 1990 Farm Bill mandated the secretary of agriculture to require certified private applicators to maintain records regarding the use of federally restricted-use pesticides.



The idea that some items could be answered without reference to records was affirmed in this survey: every respondent answered at least one core question using no records. About half used tax forms or loose receipts, about one-quarter used a computer, and less than one-fifth used a settlement sheet. The link between a solid source of data and the answer was quite tenuous. Almost 20 percent of respondents used no records at all for the entire interview. The conclusion of the Panel Plus Pilot Study was that a relatively low percentage of farm operators have a formal record-keeping system.

In the absence of formal records, a significant number of respondents were observed to answer the questions by guesswork. Some questions seemed more susceptible to generating guesses than others—particularly expenses for utilities, farm labor hours, value of land and buildings, market value of equipment, household expenses for food, nonfarm transportation, and other living expenses. In some cases, respondents asked enumerators to help in making a guess, leading to the possibility of differential bias or variability across enumerators.

Although the few reviews of record-keeping practices that have been conducted have been relatively informal and based on very small samples, they are disquieting in that they cast doubt on the quality of the responses to several key data items in ARMS. The nonsampling bias introduced by data misreported to ARMS because of poor record-keeping practices may not be trivial.


Recommendation 4.1: The methodology research and development program the panel recommends should systematically (1) evaluate current instruments and practices, (2) collect data that inform both the revision of existing items as well as the creation of new items, (3) test revised instruments before they are put into production, (4) use experimental control groups to evaluate the differences between the old and new questionnaires, (5) improve understanding of respondent record-keeping practices and their effect on survey quality, and (6) designate a subsample of the existing ARMS sample for research and testing purposes. Key parts of this work would best be conducted in a cognitive or usability laboratory facility. It would be enabled by obtaining a generic clearance from the Office of Management and Budget for testing of all phases of the survey to allow for broader cognitive testing, evaluate the quality of data reported in response to each question, and evaluate the impact of mode of data collection across the three phases.


APPROPRIATENESS OF THE SURVEY QUESTIONS

Although the panel did not interpret its mandate to include review at the question level, we do think that certain questions warrant review in the context of the systematic methodology research and development activities we recommend. For example, the collection of income information in ARMS Phase III involves several questions about personal income.

As a general rule, personal income is among the most sensitive topics that survey respondents are asked about (Bradburn et al., 1989), and this variable seems to be unreported or misreported relatively often as a result (Moore et al., 1997). One technique that seems to help improve the quality of income reports is to allow the reporting of range values rather than a single amount. For a respondent who truly does not have an exact answer, ranges may yield more honest answers. Experience with the Survey of Consumer Finances of the Federal Reserve Board shows that using ranges rather than point estimates virtually eliminated “don’t know” as a response (Kennickell, 1997). Another such approach is the use of so-called unfolding brackets (e.g., Juster and Smith, 1997), in which respondents who are unable or unwilling to answer an open income question are then asked if their income is above or below a particular and relatively low dollar figure (e.g., $5,000). If they say above, they are asked whether it is above or below a higher figure (e.g., $50,000) and, depending on their answer, whether it is above another figure. The process continues until the respondent has assigned the income in question to a relatively narrow bracket.
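The unfolding-bracket sequence is easy to express as a decision procedure. The sketch below uses a simplified ascending ladder with hypothetical thresholds; production instruments typically branch both up and down from the entry question.

```python
# Sketch of an unfolding-bracket follow-up for income nonresponse: the
# respondent answers only above/below questions, and the sequence narrows
# the income to a bracket.  This simplified ladder only ascends; the
# thresholds are hypothetical.
def unfolding_brackets(is_above, thresholds=(5_000, 50_000, 100_000)):
    """is_above(t) answers 'Is your income above $t?' with True/False."""
    lower, upper = 0, None
    for t in thresholds:
        if is_above(t):
            lower = t          # income exceeds t; keep unfolding upward
        else:
            upper = t          # first "no" closes the bracket
            break
    return lower, upper        # upper of None means above all thresholds

# Example: a respondent whose income is $62,000 lands in ($50,000, $100,000]
print(unfolding_brackets(lambda t: 62_000 > t))   # -> (50000, 100000)
```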

Another approach is computerized self-administration, which is widely used to increase the reporting of sensitive behavioral information, such as drug use (e.g., Tourangeau and Smith, 1996). It is now common for interviewers who enter responses into a laptop computer in a face-to-face interview to allow the respondent to directly enter his or her answers into the laptop for the sections of the questionnaire that concern sensitive topics. In ARMS that could take place when collecting income data and possibly data on other topics. It is hard to separate the sense of privacy created by self-administration in general, whether paper or on computer, from the benefits of computerization in particular, for example, automatically selecting the next question depending on the previous answer(s) or flagging suspect or out-of-bounds answers. Computerization may increase respondents’ sense of the study’s legitimacy. This could increase a respondent’s willingness to provide income data and facilitate response by automating such calculations as adding multiple sources of income. Thus, self-administered computerized data collection may improve the honesty, completeness, and precision of income data in ARMS.
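
To make the computerization benefits just described concrete, the following sketch hard-codes two hypothetical income items with automatic routing and an out-of-bounds check. The item names, bounds, and routing rule are illustrative assumptions, not anything taken from the ARMS instrument:

```python
# Minimal sketch of computerized self-administration logic: each item
# carries a plausible-value range, out-of-bounds entries are flagged
# for confirmation, and routing skips items that no longer apply.
ITEMS = [
    {"id": "off_farm_income", "text": "Total off-farm income ($):",
     "lo": 0, "hi": 5_000_000},
    {"id": "income_sources", "text": "Number of off-farm income sources:",
     "lo": 0, "hi": 20},
]

def administer(items, read=input):
    responses = {}
    for item in items:
        while True:
            try:
                value = float(read(item["text"] + " "))
            except ValueError:
                print("Please enter a number.")
                continue
            if not item["lo"] <= value <= item["hi"]:
                # Suspect answer: ask for confirmation rather than
                # silently accepting or rejecting it.
                if read("That value looks unusual; keep it? (y/n) ") != "y":
                    continue
            responses[item["id"]] = value
            break
        # Routing: with no off-farm income, the follow-up is skipped.
        if item["id"] == "off_farm_income" and value == 0:
            responses["income_sources"] = 0
            break
    return responses

# administer(ITEMS) would run the two items interactively at a console.
```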

Collection of income data is exactly the sort of methodology research and development issue that might be on the agenda of a dedicated staff of methodologists, as recommended above. Even if this kind of work is conducted and NASS and ERS determine that the accuracy of income data is not sufficiently improved by new methods to warrant a change, the decision would be empirically informed.

Consistency of Questions Across Survey Versions

ARMS Phase III is a multiversion survey. The commodity-specific versions (2-4) and the self-administered core version (5) contain fewer questions than the main version (1); the commodity-specific versions add questions about the commodity enterprise to the core version and are therefore generally longer than version 5. Most of the questions are the same across versions. However, there are three classes of exceptions:

  • When a particular “hot” topic is added in a given year (such as Internet use by farm businesses in 2005), only the main version includes these questions, not versions 2-5.

  • Questions specific to the commodity enterprises appear only in versions 2-4.

  • In the farm debt section, versions 2-5 contain fewer questions than the main version 1, and the common questions are not comparable across versions: version 1 asks about specific loans, whereas versions 2-5 ask about types of loans (under which several loans may be combined).

There are often good reasons for differing questions on different versions of the questionnaire. Asking fewer questions on versions 2-5 presumably reduces respondent burden, and there is a genuine trade-off between collecting a larger sample (more respondents) and collecting more information from each respondent. These practices nonetheless limit research on particular topics to the subsample of version 1 respondents.

Consistency of Variables over Time

Questions in ARMS will need to change from time to time to meet unfulfilled needs, address new topics, and maintain relevance. When such changes occur, there is usually tension between maintaining consistent time series of variables and having the flexibility to add new questions. Clear procedures are needed to distinguish between core questions (which should remain constant) and noncore questions (which can change), and the time series of core variables should be kept consistent.


TABLE 4-4 Variables Included in the Debt Section of the ARMS Phase III Questionnaire, 1996-2005

Rows: survey years 1996 through 2005. Columns: Lender Type, Loan Balance, Interest Rate, Loan Term, Year Obtained, Percent for Farm Use, Loan Guarantee, Type of Loan, Purpose of Loan, Number of Other Loans, Balance on Other Loans. [The year-by-year entries marking which variables were collected in which years did not survive this rendering of the printed table.]


Table 4-4 illustrates which questions have been asked consistently over the years and which have not.
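
One way to make such inconsistencies visible is to compute, from item metadata, which variables were collected in which years, in the spirit of Table 4-4. A minimal sketch, assuming a hypothetical mapping from year to the set of variables collected (not the actual ARMS metadata):

```python
# Sketch of a consistency audit in the spirit of Table 4-4. The
# year-to-variables mapping is an illustrative assumption.
collected = {
    1996: {"lender_type", "loan_balance", "interest_rate"},
    1997: {"lender_type", "loan_balance", "interest_rate", "loan_term"},
    1998: {"lender_type", "loan_balance", "loan_term"},
}

years = sorted(collected)
for var in sorted(set().union(*collected.values())):
    missing = [y for y in years if var not in collected[y]]
    status = "all years" if not missing else f"missing in {missing}"
    print(f"{var:15} {status}")
# interest_rate -> "missing in [1998]", flagging the kind of broken
# time series that Recommendation 4.2 aims to prevent.
```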


Recommendation 4.2: ERS and NASS should improve the consistency of variables across ARMS versions and over time.

SUPPLEMENTAL DATA

As part of changes to the questionnaire design process, NASS and ERS may wish to consider administrative arrangements and design changes that would allow the microrecords of the survey to be supplemented with additional data items on a recurring basis. One way to accomplish this would be to accelerate the program to enrich the collected data with data from other sources. Another would be to introduce a recurring, formal supplemental portion of the ongoing survey; such a supplement would require significant design changes.

The managers of ARMS have already done extensive work to enrich the ARMS microdata with additional cross-tabulations and variables drawn from other sources, in response to researchers' calls for additional data from the survey. For example, ERS adds a number of external variables to the farm business and household and crop production practices research microdata files, including zip code, county and state codes, and administrative region designations. Other variables generated by combining ARMS data with other sources have also been added to the microdata records, such as cost-of-production estimates, farm typology, and commodity cost and return estimates.
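
Mechanically, this kind of enrichment is a record-level join between the survey microdata and an external table keyed on a shared identifier such as a county code. A minimal pandas sketch with hypothetical column names (the actual file layouts are not specified here):

```python
import pandas as pd

# Hypothetical ARMS microrecords keyed by county FIPS code; the column
# names and values are illustrative, not the actual file layout.
arms = pd.DataFrame({
    "record_id": [101, 102, 103],
    "county_fips": ["19153", "19153", "31055"],
    "farm_income": [52_000, 118_500, 74_300],
})

# Hypothetical external table of county-level attributes to append,
# such as an administrative region designation or a weather summary.
county_attrs = pd.DataFrame({
    "county_fips": ["19153", "31055"],
    "ers_region": ["Heartland", "Heartland"],
    "growing_degree_days": [2840, 2710],
})

# Left join so every survey record is retained even when no external
# attributes exist for its county.
enriched = arms.merge(county_attrs, on="county_fips", how="left")
print(enriched)
```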

In some cases there may be both a reduction in respondent burden and an increase in data quality from using administrative data to supplement or replace some survey data. Data on program participation, tax-related information, and geographically linked data obtained from other sources (such as satellite monitoring and local weather records) are obvious starting points.

In an effort to use existing data and ask only the additional questions that are needed, NASS draws on administrative sources for ARMS analysis and estimation. For example, NASS uses administrative data from the California Environmental Protection Agency's Mandatory Pesticide Use Reporting System, and from a similar system in Arizona, instead of asking respondents for that information. NASS reports that it is investigating available USDA program payment data for potential use in its census and survey programs and is continually searching for new sources that would be helpful.
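
At the record level, substituting an administrative figure for a survey item amounts to a coalesce: use the administrative value where one exists, otherwise keep the survey response. A hedged sketch with hypothetical field names:

```python
import pandas as pd

# Hypothetical linked frame holding a survey-reported value and an
# administrative value for the same item (say, pesticide applied);
# the field names are illustrative.
df = pd.DataFrame({
    "record_id": [1, 2, 3],
    "pesticide_lbs_survey": [12.0, None, 40.0],
    "pesticide_lbs_admin": [11.5, 8.0, None],
})

# Prefer the administrative figure where one exists and fall back to
# the survey report, keeping a provenance flag for analysts.
df["pesticide_lbs"] = df["pesticide_lbs_admin"].combine_first(
    df["pesticide_lbs_survey"])
df["source"] = df["pesticide_lbs_admin"].notna().map(
    {True: "admin", False: "survey"})
print(df[["record_id", "pesticide_lbs", "source"]])
```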


Even with these additions, much of what researchers need is simply not available on the questionnaire, nor is it available in a form that could be easily added to the individual record from other sources. Much of the needed data will have to be collected anew.

ARMS managers have put in place a capability to introduce ad hoc questions about “hot topics.” A more formal recognition of the need to collect supplemental information as a regular part of ARMS might be useful. For example, a section of the questionnaire could be set aside for the collection of special items, and provision could be made for soliciting input from the general user community for items to be collected, perhaps on a cost-reimbursable basis. One model of such an arrangement is the Current Population Survey, which provides the opportunity to add questions with cost reimbursement by the organization that commissions the supplemental collection.

Because such data collection could be seen as ancillary to the central purpose of ARMS, efforts would need to be devoted to special training for enumerators and to motivating respondents about the particular importance of the data. The overall burden of collecting supplemental information might be reduced if collection were limited to specific subgroups, such as farms in a particular type of watershed.

Although ARMS is already perceived as a survey with a high level of respondent burden, additional data collection may well be justified if there are issues of sufficiently great importance that require joint analysis with other data already collected in the survey.


Recommendation 4.3: NASS and ERS should explore the collection of auxiliary information on a formal basis, as well as the feasibility of enriching the ARMS data files with information from administrative data sources, geospatial data, and the like.
