
7

Quality of Key Estimates

Primary objectives of the redesign of the Survey of Income and Program Participation (SIPP) were to reduce the survey’s costs and respondent burden, but there was also a belief that a reengineering of the survey could help to counter a long-run decline in the quality of the survey’s measures and could address the troublesome seam bias that has plagued the survey from its beginning. On the face of it, reducing the number of yearly interviews from three to one seemed bound to threaten two of the survey’s most distinctive contributions: its measurement of short-term dynamics and its detailed monthly economic data. Although the 2009 National Research Council (NRC) study panel was skeptical that the use of an event history calendar (EHC) could offset the loss of the survey’s trademark frequent interviews (National Research Council, 2009, p. 105), this innovation offered at least a prospect of success. In addition, annual interviews seemed likely to help with the measurement of sources of income for which the 4-month reference period seemed to present problems. A risk was that any improvements in the data might turn out to be concentrated in areas for which SIPP data are rarely used or have no comparative advantage over other Census Bureau surveys.

This chapter presents an assessment of data quality in the 2014 SIPP panel relative to the 2008 panel1 and, in some cases, independent sources.

___________________

1 The 2014 SIPP panel data and 2008 SIPP panel data, from which these comparisons were made, were generated through different survey designs, questionnaires, reference periods, and imputation methodology. In this chapter, the study panel compared the final estimates for various critical variables without attempting to ascertain which changes were the most influential.


The study panel’s assessment covers income and program participation, recall bias, intrayear dynamics, and nonresponse.

After generating basic statistics and analysis tables, panel members collectively reviewed the results. The study panel developed trend information, comparisons with benchmarks, and graphical representations to help identify estimates that appeared questionable. When a questionable result was found, panel members went back to double-check the programming code and to make sure they were constructing the relevant variables correctly. The panel corrected any errors found. When the study panel identified suspected data problems, panel members informed Census Bureau staff and asked them to investigate. When the Census Bureau found a data problem, it was generally fixed in the next iteration of the internal data. The section in this chapter titled "Treatment of Excessive Hourly Wages" provides a good example of a major data problem found in this review.

This review process occurred repeatedly over time. The study panel began its analysis using iteration 5 of the internal data and recreated its analysis and review several times as the Census Bureau continued to correct the underlying data. This report is based on iteration 12 of the internal data, and the Census Bureau released a public use data file based on iteration 13. In working with iteration 12, results for a few variables still concerned the study panel, either because of a remaining data problem or because the panel had not yet correctly constructed the variable in its computer code. The study panel excluded from this report any analysis of these specific variables. An example is participation in the WIC (Women, Infants, and Children) program. The study panel understands that the Census Bureau made modifications to the WIC variables before issuing the public use data file.

To provide a suitable context for our assessment, this chapter begins with the study panel’s view of quality measurement for survey estimates of income and program participation, followed by a review of income reporting in prior SIPP panels. The chapter then describes the methods used in the study panel’s assessment of the new SIPP data.

QUALITY MEASUREMENT FOR INCOME AND PROGRAM PARTICIPATION

For estimates of receipt and amount of income, program participation, and transitions, the quality of survey data is commonly assessed by comparison to a benchmark—based, if possible, on administrative data or, barring that, exceptionally high-quality or widely used survey data.2

___________________

2 When SIPP was launched, the initial publications of statistics from the survey included comparisons of selected estimates with carefully constructed benchmarks drawn from administrative data. These benchmark comparisons underscored the importance the Census Bureau accorded to data quality.


When benchmarks are not available, comparisons can be made to other surveys but with the understanding that the comparison survey may not be uniformly better than the survey being evaluated. Indeed, the objective of such comparisons may be to better understand the relative strengths and weaknesses of the two surveys. If the comparison survey has been assessed previously against a set of benchmarks, comparing the two surveys may serve as an indirect benchmark comparison for the survey being evaluated.

For any benchmark comparison, universe differences and definitional differences must be taken into account. Definitional differences tend to be more of an issue with income than with program participation. For example, receipt of unemployment compensation or enrollment in Medicaid has a very clear definition in administrative data, but interest or dividend income will differ substantially depending on whether or not the amounts earned in retirement accounts are included in the definition. Universe issues, on the other hand, tend to be more common with program participation. Medicaid, for example, serves a large institutional population, so unless the institutional beneficiaries can be removed from an administrative benchmark, an estimate of Medicaid enrollment from a survey of the non-institutional population can be expected to fall short by at least the level of institutional enrollment.
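As a simple illustration of such a universe adjustment, the sketch below uses entirely hypothetical numbers to remove an institutional population from an administrative count before computing a survey-to-benchmark coverage ratio.

```python
# Hypothetical numbers: adjust an administrative enrollment count for a
# universe difference before comparing it to a noninstitutional survey estimate.
admin_total_enrollment = 60.0e6      # all beneficiaries in administrative records (hypothetical)
institutional_enrollment = 1.5e6     # beneficiaries living in institutions (hypothetical)
survey_estimate = 55.0e6             # weighted survey estimate, noninstitutional population (hypothetical)

# The survey does not cover the institutional population, so remove it from the benchmark.
benchmark = admin_total_enrollment - institutional_enrollment
coverage_ratio = survey_estimate / benchmark
print(f"Survey captures {coverage_ratio:.1%} of the adjusted benchmark")
```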

For income, comparisons can be based on means (or aggregates), medians, or distributional statistics, such as quantiles. Administrative benchmarks may be limited to means or aggregates. For some purposes, comparisons of aggregates may hold particular interest, because they provide a basis for assessing the completeness of reporting. But aggregates may be influenced unduly by observations in the upper tail of a distribution, and surveys can underestimate the upper tail because of nonresponse as well as underreporting. Another factor affecting means and aggregates in public use data is topcoding, where the highest values are replaced with lower amounts—which may be calculated in different ways—in order to protect the confidentiality of respondents. One strategy for topcoding that preserves means and aggregates is to assign the mean of the topcoded values to all of the observations that are topcoded. The Census Bureau has used this strategy for select variables in SIPP and does so for an expanded set of income sources in the 2014 panel public use data file.
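The mean-preserving strategy described above can be sketched as follows. This is a minimal unweighted illustration, not the Census Bureau's production procedure; with sample weights, the weighted mean of the topcoded cases would be assigned instead.

```python
import pandas as pd

def topcode_preserving_mean(values: pd.Series, threshold: float) -> pd.Series:
    """Replace every value at or above `threshold` with the mean of those values.

    Because each topcoded observation receives the group mean, the unweighted
    mean and aggregate of the series are unchanged.
    """
    out = values.astype(float).copy()
    mask = out >= threshold
    if mask.any():
        out[mask] = out[mask].mean()
    return out

# Hypothetical example: the two largest incomes are replaced by their mean.
incomes = pd.Series([20_000, 45_000, 70_000, 300_000, 950_000])
topcoded = topcode_preserving_mean(incomes, threshold=250_000)
assert abs(incomes.mean() - topcoded.mean()) < 1e-9
```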

Our assessment of SIPP data quality in this chapter makes use of benchmark comparisons where possible, but the study panel was also interested in how the 2014 panel compares to the 2008 panel across a wide range of measures. These measures include not only receipt and amount of income by source, receipt of noncash benefits, amount of aggregate income, and level of income relative to poverty, but also transitions into and out of a variety of statuses including employment, poverty, and benefit receipt. With each set of comparative measures presented below, the report discusses specifically how the panel assesses quality in that context. Finally, in addition to assessing the quality of various estimates from the 2014 SIPP panel, the chapter examines nonresponse—both unit and item—because this provides an important measure of overall data quality. The standard of comparison for this assessment is nonresponse in recent SIPP panels and, for item nonresponse, in another Census Bureau survey whose income estimates are widely used.

INCOME REPORTING IN PRIOR SIPP PANELS

The 2009 NRC report documented modest declines in the quality of SIPP’s measurement of income from a variety of sources between the survey’s debut in 1983 and 1996 (National Research Council, 2009, p. 32). The quality of survey estimates of income is often assessed by comparing estimates of aggregate amounts to established benchmarks—commonly the totals prepared by the Bureau of Economic Analysis for the National Income and Product Accounts (NIPA). Table 7-1, prepared by the current study panel, reports ratios of SIPP annual aggregates to NIPA totals for 16 income sources between 1990 and 2012 (with 2 years, 2000 and 2008, excluded because there were not 12 months of data collected in those years). The trends vary by source, with some sources showing improvements followed by declines. However, for nearly every source the 2012 share of the NIPA total is lower than the 1990 share. The declines between 2009 and 2012 are especially striking because they occur entirely within the 2008 panel. Wage and salary earnings, the single largest source of income, shows small ups and downs between 1990 and 2009, but stood at 85 percent in 2009, the same as 1990, before descending to 80 percent in 2012. Self-employment earnings exceeded the NIPA total over much of the period but turned down sharply with the start of the 2004 panel (dropping 25 percentage points to 82 percent of the NIPA total) and continuing to decline to 65 percent in 2012.3
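A minimal sketch of the kind of calculation behind Table 7-1 follows; the file layout, column names, and usage values are hypothetical, and the actual estimates require the survey's monthly person weights.

```python
import pandas as pd

def sipp_to_nipa_ratio(person_months: pd.DataFrame, amount_col: str, nipa_total: float) -> float:
    """Weighted annual aggregate for one income source divided by the NIPA total.

    `person_months` is assumed to hold one record per person per month with a
    `month_weight` column; `amount_col` is the monthly amount for the source.
    """
    aggregate = (person_months[amount_col] * person_months["month_weight"]).sum()
    return aggregate / nipa_total

# Hypothetical usage:
# df = pd.read_csv("sipp_person_months_2012.csv")
# print(sipp_to_nipa_ratio(df, "wage_salary_amount", nipa_total=6.9e12))
```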

Private pensions and military retirement also exhibit steep declines between 2003 and 2004 in Table 7-1, suggesting that changes in the questions may have played a role. Yet federal pensions remain steady, and state and local pensions show a relatively modest decline. Because of the dominance of private pensions within the pensions category, total pension income declines from 92 percent of the NIPA total to 75 percent between 1990 and 2012.

___________________

3 The 2004 panel separated self-employment income into two components: the salary a business owner pays to himself or herself and the profit realized by the business. The profit amount could be a loss, and for the first time in the history of SIPP the total income from a business was allowed to be negative. This could account for the reduced share of NIPA dollars captured by the SIPP data after 2003.


The most dramatic declines over the 23 years occur for property income, with the three sources (interest, dividends, and rents and royalties) showing a collective decline from 57 percent to 12 percent of the NIPA total. Each of the three sources displays a steady erosion over the entire period, rather than an abrupt downturn that could be attributed to questionnaire changes in a single SIPP panel.

Transfer income shows the least decline overall, due to steady reporting of Social Security benefits and Supplemental Security Income (SSI). As noted in Chapter 5, confusion between the two sources is common in surveys, and that is reflected here in a substantial overreporting of income from SSI, which slightly depresses reporting of income from the much larger Social Security program.

Summing all of these sources, SIPP’s reporting of aggregate income declined relative to the NIPA total, falling from 86 percent in 1990 to 73 percent in 2012. Can the new SIPP reverse this trend? This is a critical question in our assessment of the quality of reporting of income and program participation in the redesigned survey.

METHODS

The partial overlap between wave 1 of the 2014 SIPP panel and the final waves of the 2008 panel afforded the study panel an opportunity to assess how estimates from the two panels compare over the same period of time. The earlier NRC report recommended that the Census Bureau start the new SIPP panel during the final year that the 2008 panel was in the field. This timing would have provided a 2-year overlap, with the first seam between waves of the new panel falling in the middle. The Census Bureau did not adopt this recommendation because fielding both surveys at the same time for an entire year of data collection was infeasible from both an operational and a cost standpoint. However, with the 1-year reference period of the new panel, it was possible to achieve an overlap of about 1 year without starting data collection for the 2014 panel until data collection for the 2008 panel was complete. By extending data collection for the 2008 panel through December 2013, the Census Bureau obtained 2008 panel data for the first 11 months of the initial reference period of the 2014 panel. Due to events outside the Census Bureau’s control, the data from the 2008 panel turned out to be less complete than was intended, but the data nevertheless make it possible to compare key estimates from the two surveys for a common reference period.

With the rotation group design of the earlier SIPP panels, conducting the final interviews in December meant that the final month of data, November 2013, was collected from only one rotation group.


TABLE 7-1 Survey of Income and Program Participation (SIPP) Annual Income as a Proportion of National Income and Product Accounts (NIPA) Total, by Source and Year, 1990-2012

Source of Income 1990 1991 1992 1993 1994 1995 1996 1997
Earnings 0.89 0.89 0.87 0.86 0.85 0.84 0.89 0.87
Wages and salary 0.85 0.86 0.84 0.84 0.84 0.83 0.87 0.85
Self-employment 1.19 1.19 1.06 0.98 0.96 0.93 1.06 0.97
Property 0.57 0.54 0.49 0.48 0.46 0.45 0.40 0.35
Interest 0.66 0.69 0.63 0.59 0.61 0.58 0.55 0.44
Dividends 0.40 0.35 0.35 0.39 0.34 0.35 0.28 0.26
Rent and royalties 0.59 0.48 0.41 0.41 0.40 0.36 0.34 0.32
Transfers 0.93 0.92 0.89 0.88 0.87 0.88 0.87 0.87
Social security + railroad retirement 0.97 0.95 0.93 0.91 0.90 0.90 0.87 0.87
Supplemental security income 0.84 0.87 0.85 0.84 0.85 0.88 0.90 0.95
Family assistancea 0.91 0.91 0.87 0.96 0.94 0.95 0.95 0.93
Other cash welfare 0.21 0.24 0.23 0.26 0.21 0.24 0.29 0.33
Unemployment compensation 0.70 0.77 0.77 0.78 0.70 0.68 0.59 0.56
Workers’ compensation 0.65 0.60 0.58 0.54 0.56 0.51 0.72 0.66
Veterans’ payments 0.78 0.76 0.78 0.74 0.73 0.66 0.73 0.69
Pensions 0.92 0.95 0.93 0.95 0.97 0.95 0.86 0.93
Private pension 0.98 0.98 0.91 0.95 0.99 0.99 0.87 0.96
Federal pension 0.78 0.83 0.83 0.84 0.87 0.91 0.77 0.79
Military retirement 0.88 0.88 0.89 0.89 0.90 0.86 1.00 1.10
State and local pensions 0.78 0.83 0.82 0.81 0.79 0.78 0.67 0.71
Total Income 0.86 0.87 0.84 0.84 0.83 0.82 0.85 0.83

TABLE 7-1 Continued

Source of Income 1998 1999 2001 2002 2003 2004 2005 2006 2007 2009 2010 2011 2012
Earnings 0.85 0.84 0.87 0.88 0.89 0.86 0.85 0.83 0.83 0.85 0.82 0.80 0.78
Wages and salary 0.84 0.84 0.83 0.85 0.86 0.87 0.86 0.85 0.84 0.85 0.83 0.81 0.80
Self-employment 0.88 0.86 1.10 1.07 1.07 0.82 0.77 0.75 0.80 0.84 0.78 0.74 0.65
Property 0.33 0.37 0.35 0.32 0.29 0.30 0.24 0.21 0.18 0.21 0.17 0.14 0.12
Interest 0.36 0.42 0.41 0.56 0.39 0.58 0.30 0.27 0.22 0.26 0.17 0.13 0.10
Dividends 0.29 0.33 0.25 0.14 0.17 0.18 0.19 0.16 0.14 0.16 0.16 0.13 0.11
Rent and royalties 0.31 0.34 0.39 0.37 0.39 0.39 0.27 0.29 0.22 0.22 0.19 0.19 0.18
Transfers 0.86 0.85 0.85 0.83 0.84 0.91 0.91 0.90 0.89 0.87 0.87 0.88 0.88
Social security + railroad retirement 0.87 0.87 0.88 0.87 0.88 0.94 0.94 0.93 0.92 0.91 0.92 0.93 0.93
Supplemental security income 0.94 0.95 0.99 1.01 1.04 1.06 1.22 1.23 1.25 1.13 1.19 1.25 1.24
Family assistancea 0.73 0.52 0.40 0.39 0.37 0.44 0.44 0.42 0.40 0.42 0.37 0.37 0.35
Other cash welfare 0.19 0.13 0.76 0.39 0.16 0.27 0.19 0.14 0.14 0.10 0.09 0.08 0.06
Unemployment compensation 0.54 0.57 0.61 0.54 0.58 0.70 0.69 0.64 0.63 0.67 0.67 0.63 0.64
Workers’ compensation 0.62 0.64 0.50 0.54 0.53 0.46 0.47 0.46 0.43 0.52 0.47 0.52 0.54
Veterans’ payments 0.70 0.69 0.74 0.77 0.75 0.74 0.77 0.69 0.62 0.81 0.67 0.62 0.61
Pensions 1.00 1.03 0.87 0.94 0.96 0.81 0.83 0.79 0.76 0.79 0.78 0.76 0.75
Private pension 1.07 1.08 0.92 1.04 1.07 0.79 0.81 0.74 0.72 0.74 0.71 0.68 0.67
Federal pension 0.79 0.82 0.80 0.80 0.83 0.85 0.80 0.79 0.72 0.76 0.74 0.73 0.74
Military retirement 1.25 1.39 0.95 1.09 1.00 0.69 0.68 0.64 0.59 0.64 0.62 0.58 0.56
State and local pensions 0.73 0.72 0.64 0.67 0.70 0.68 0.70 0.70 0.66 0.72 0.71 0.70 0.69
Total Income 0.82 0.82 0.83 0.85 0.86 0.83 0.82 0.79 0.78 0.81 0.78 0.75 0.73

NOTES: Omitted years (2000 and 2008) did not have 12 months of data collected by SIPP.

aWith the replacement of Aid to Families with Dependent Children (AFDC) by Temporary Assistance for Needy Families (TANF) under the Personal Responsibility and Work Opportunity Reconciliation Act of 1996, the NIPA total began to include funds expended on services other than cash assistance, whereas family assistance as measured in SIPP includes only cash assistance. The subsequent decline in the proportion of the NIPA total captured in SIPP is due primarily if not entirely to this difference in the two measures.

SOURCE: Panel generated with data from the SIPP public-use data file.

This also meant that, in principle, two rotation groups would have had October data, three would have had September data, and all four rotation groups would have 2013 data from August back through January. However, a federal government shutdown during the first 2 weeks of October 2013 resulted in the Census Bureau’s canceling the October SIPP data collection. With the loss of an entire rotation group in the final wave, there was one fewer rotation group for the months of June through September (the 4-month reference period for the October interviews). The end result was that, in 2013, data for the 2008 SIPP panel were collected from all four rotation groups for just the first 5 months, January through May. Data were collected from three rotation groups for June through August, two rotation groups for September and October, and just one rotation group for November.

Fewer than four rotation groups for any month means reduced precision and greater volatility in the survey’s estimates. Estimates based on a single rotation group have standard errors twice as large as estimates based on all four rotation groups. In addition, biases may develop in the estimates from individual rotation groups due to attrition and other unit nonresponse. As a result of the patterning of 2008 panel data in 2013, the study panel focused most of its comparisons on the months of January through May. Because of the general similarity of results across these months, most of the results presented in the next section represent a single month, April, which was chosen because the data for all four rotation groups were collected in the same interview wave, wave 15, representing interviews conducted between May and August of 2013.
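The factor of two follows from simple sample-size arithmetic. Under the simplifying assumption of independent observations (the actual SIPP design is clustered), a quarter-sized sample doubles the standard error:

```latex
\mathrm{SE}_{\text{1 group}} \;\approx\; \frac{\sigma}{\sqrt{n/4}}
  \;=\; 2\,\frac{\sigma}{\sqrt{n}} \;=\; 2\,\mathrm{SE}_{\text{4 groups}}
```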

The panel does not present estimates of the proportion of persons who ever participated in a program over the course of a year (annual-ever participation) because it could not construct comparable estimates from the 2008 panel without addressing the loss of data due to missed interviews. The design of the new SIPP lends itself very well to calendar year estimates because these can be constructed from a single wave. Because of this design feature, there is no need to deal with intrayear attrition. While the older SIPP provided calendar year weights that included adjustments for attrition, there are no calendar year weights for 2013 because, as noted, the 2008 panel collected only 5 months of data from all four rotation groups and collected no data at all for the month of December.

The study panel generated extensive tabulations for several iterations of the internal 2014 panel data, including the unweighted data, iteration 5 (which was supposed to have had “near final weights”), iteration 11, and finally iteration 12. The public use data file is based on the next iteration of the data. The estimates presented here represent a small subset of our analyses; they were selected to address what the study panel believes are the most important questions that they could answer with respect to the 2014 SIPP panel.


A potential concern in comparing the 2008 and 2014 panels with respect to their estimates for 2013 is the impact of attrition on the representativeness of the 2008 sample at that point.4 The 2008 panel estimates for 2013 are derived from waves 14 through 16, and as shown later in this chapter, the households responding to wave 15—the source of most of our 2008 panel estimates—were two-thirds of the households responding to wave 1. The monthly sample weights in both panels are post-stratified to essentially the same population estimates, so the demographic composition of the two weighted samples (on age, sex, race, and Hispanic origin) is virtually identical. The SIPP sample weights also incorporate a non-interview adjustment that uses hundreds of adjustment cells derived from about a dozen characteristics to correct for differential nonresponse in wave 1 and between successive waves (U.S. Census Bureau, n.d.-b). SIPP panels capture too few new immigrants each year, however, and neither the non-interview adjustment nor the post-stratification of the weights fully addresses this growing divergence. In 2013, the (weighted) foreign-born percentage in the 2008 panel was 1.7 points lower than in the 2014 panel. The panel also observed differences in levels of education, with the 2014 panel having somewhat more college graduates but also more persons who did not graduate from high school.
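A minimal sketch of post-stratification follows, with hypothetical column names; the actual SIPP weighting also applies a non-interview adjustment and uses far more detailed cells.

```python
import pandas as pd

def poststratify(sample: pd.DataFrame, controls: pd.DataFrame, cell_cols: list) -> pd.Series:
    """Scale base weights so that weighted totals match population controls by cell.

    `sample` needs a `base_weight` column; `controls` has the cell columns
    plus a `population` column giving the control total for each cell.
    """
    merged = sample.merge(controls, on=cell_cols, how="left")
    weight_sum_in_cell = merged.groupby(cell_cols)["base_weight"].transform("sum")
    return merged["base_weight"] * merged["population"] / weight_sum_in_cell

# Hypothetical usage:
# sample["final_weight"] = poststratify(sample, controls, ["age_group", "sex", "race", "hispanic"]).to_numpy()
```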

Overall, however, the share of the population with earnings was nearly the same between the two samples, and it was also nearly identical between the two samples at each level of education, marital status, and race and very similar for both native and foreign-born shares of the population. Furthermore, when the panel substituted the 2014 panel’s education distribution for the 2008 panel’s distribution, the 2008 panel’s average 2013 earnings changed (rose) by less than 1 percent. On balance, then, it did not appear likely that uncorrected differences in sample composition due to attrition would be a significant factor in the comparisons the panel conducted.
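The education-distribution check described above amounts to a direct standardization; a stylized version with made-up means and shares is sketched below.

```python
import pandas as pd

# Made-up means and shares, for illustration only.
mean_earnings_2008 = pd.Series({"no_hs": 22_000, "hs_grad": 33_000, "some_college": 40_000, "ba_plus": 62_000})
education_share_2008 = pd.Series({"no_hs": 0.11, "hs_grad": 0.29, "some_college": 0.30, "ba_plus": 0.30})
education_share_2014 = pd.Series({"no_hs": 0.12, "hs_grad": 0.27, "some_college": 0.29, "ba_plus": 0.32})

observed_mean = (mean_earnings_2008 * education_share_2008).sum()
standardized_mean = (mean_earnings_2008 * education_share_2014).sum()
print(f"Change from substituting the 2014 education distribution: {standardized_mean / observed_mean - 1:+.2%}")
```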

INCOME AND PROGRAM PARTICIPATION

Because of findings such as those reported in the previous section of this chapter showing that surveys underestimate benchmark estimates of aggregate income—and for some sources underestimate by substantial margins—comparative evaluations of reported income can generally rely on the principle that “more is better” to determine which survey’s estimate of a given income source is more accurate.

___________________

4 Comparing the 2014 panel with wave 1 of the 2008 panel was not a viable option because wave 1 of the 2008 panel was conducted during the middle of the Great Recession.


For the months of January through November 2013, the panel compared the 2014 and 2008 panels with respect to monthly estimates of receipt or participation and, where applicable, average monthly income per recipient and aggregate monthly income, for 27 income sources and programs.5 These encompass income from earnings and assets, government and interhousehold transfers, and pensions and individual retirement plans. The comparisons also include estimates of health insurance coverage.

One of the questions the panel explored was whether there was any evidence that the quality of reporting declined between the end of the reference period and the beginning—that is, from November 2013 (the last month with any 2008 panel data) back to January 2013. The Census Bureau examined this question in each of the test panels that preceded the launch of the 2014 panel and found mixed evidence of “reverse telescoping,” which was described as “the possibility that respondents will report less accurately about events further in the past” (U.S. Census Bureau, 2013). Using linked survey and administrative records that allowed monthly reports of program participation to be checked for accuracy, the bureau found that only one program in one year showed evidence of reverse telescoping in the form of underreported participation. Four programs showed a decline in overreporting of participation over the course of the reference year.

The study panel’s assessment was hampered to some degree by the reduction in 2008 panel data after May 2013, as discussed above. Therefore, in addition to basing our assessment on comparisons between the two panels, the study panel also looked at the consistency of 2014 panel reporting over the full 12 months. With two notable exceptions, unemployment compensation and the Supplemental Nutrition Assistance Program (SNAP), which are discussed in a later section, the panel found little to no evidence of weaker reporting earlier versus later in the reference year. To simplify the presentation, the monthly estimates discussed in this section are based on April 2013.

Income from Employment and Assets

This section examines the reporting of earnings from employment and self-employment and of income from interest, dividends, and rental property. Table 7-2 presents estimates of the percentage of the population (15 years and older) reporting the receipt of income from each of these sources (called the “estimate of receipt”), the mean monthly amount per recipient, and the aggregate monthly amount6—all for April 2013.

___________________

5 In these comparisons, the study panel used survey weights provided in the public use file for the 2008 SIPP data and the survey weights in the wave 1 internal file (iteration 12) for the 2014 data. Both sets of weights were post-stratified to (near) identical population totals. The other adjustments for nonresponse and attrition were discussed above.

6 In this context, “aggregate amount” refers to the estimate of the total amount of income received by the receiving population for that income source.


TABLE 7-2 2014 and 2008 Survey of Income and Program Participation (SIPP) Panel Estimates of Earned Income and Asset Income, by Source, April 2013

Source Percentage of Populationa Mean Amountb Aggregate Amount ($ millions)c
2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008
Wages and Salaries 50.75 51.48 1.01 $3,677 $4,281 1.16* $465,253 $549,178 1.18*
Self-Employment 5.84 6.29 1.08* $3,866 $5,118 1.32* $56,308 $80,238 1.42*
Interest 38.46 49.48 1.29* $27 $52 1.93* $2,603 $6,422 2.47*
Dividends 6.80 11.62 1.71* $241 $472 1.96* $4,080 $13,679 3.35*
Rent 2.68 4.17 1.56* $402 $460 1.14 $2,680 $4,771 1.78*

aPercentage of population (15 years and older) that received income from each source.

bThe mean amount of income from each source, by month per recipient.

cAggregate amount refers to the estimate of the total amount of income received by the receiving population for that income source.

*Difference between estimates is statistically significant at the p ≤ .05 level.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.


Estimates of the proportion of the population with wage and salary income are very similar between the two panels, whereas the receipt of self-employment income is slightly higher in the 2014 panel. However, the estimate of receipt for each type of asset income is substantially higher in the 2014 panel, with the interest estimate increasing by 29 percent, rental income by 56 percent, and the dividends estimate by greater than 70 percent.

Turning to amounts, the estimate of mean wage and salary income in April was 16 percent higher in the 2014 panel and aggregate income was 18 percent higher. The estimate of mean monthly self-employment income was 32 percent higher, and aggregate income was 42 percent higher. Estimates of mean monthly interest and dividends were nearly twice as high in the 2014 panel, while the corresponding aggregate amounts showed an even greater rise, reflecting the higher estimates of both the means and increased recipiency. The estimates of mean rental income were similar between the two panels, but the aggregate amount—boosted by the sharp increase in receipt—is 78 percent higher in 2014.
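A minimal sketch of the three statistics reported in Tables 7-2 through 7-4 (receipt, mean monthly amount per recipient, and aggregate monthly amount) is shown below, assuming a hypothetical person-month extract with illustrative column names.

```python
import pandas as pd

def monthly_income_stats(persons: pd.DataFrame, amount_col: str) -> dict:
    """Weighted receipt rate, mean amount per recipient, and aggregate for one month."""
    weights = persons["month_weight"]
    is_recipient = persons[amount_col] > 0
    weighted_recipients = (weights * is_recipient).sum()
    aggregate = (weights * persons[amount_col]).sum()
    return {
        "receipt_pct": 100 * weighted_recipients / weights.sum(),
        "mean_amount_per_recipient": aggregate / weighted_recipients,
        "aggregate_amount": aggregate,
    }

# Hypothetical usage for April 2013 wages:
# print(monthly_income_stats(april_2013_persons, "wage_salary_amount"))
```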

Transfer Income

The sources of transfer income examined in this section encompass both government transfers and transfers between households. Government transfers include Social Security (for adults and for children), SSI, family assistance (Temporary Assistance for Needy Families and general assistance), workers’ compensation, and veterans’ benefits. Transfers between households include child support and alimony.

Differences between the two panels are mixed. The estimates of receipt for Social Security for adults, SSI, family assistance, and child support are smaller for the 2014 panel (with statistically significant differences), while the other sources showed no statistically significant change (see Table 7-3). The 2014 panel’s estimates of mean monthly amounts of Social Security for adults, veterans’ benefits, and child support are higher, while mean monthly SSI and workers’ compensation amounts are lower. Estimates of aggregate SSI, family assistance, and workers’ compensation are all lower by about a third, while the aggregate for veterans’ benefits is higher by a quarter. The aggregate for adult Social Security is more modestly higher, while the estimates for the remaining sources were not statistically different between the two panels.

Retirement Income

The study panel’s retirement income comparisons include railroad retirement, the four main types of pensions (private, federal, military, and state and local), and withdrawals from IRA, 401k, Keogh, and thrift plans.


TABLE 7-3 2014 and 2008 Survey of Income and Program Participation (SIPP) Panel Estimates of Safety Net Program Participation, by Source, April 2013

Source Percentage of Populationa Mean Amountb Aggregate Amount ($ millions)c
2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008
Social Security-Adult 20.69 18.88 0.91* $1,131 $1,363 1.21* $58,343 $64,171 1.10*
Social Security-Child 0.70 0.79 1.13 $736 $694 0.94 $1,289 $1,361 1.06
Supplemental Security Income 3.34 2.79 0.84* $634 $510 0.80* $5,273 $3,550 0.67*
Family Assistance 0.64 0.38 0.59* $337 $345 1.02 $541 $329 0.61*
Workers’ Compensation 0.24 0.28 1.17 $1,365 $764 0.56* $803 $528 0.66*
Veterans’ Benefits 1.35 1.35 1.00 $941 $1,171 1.24* $3,161 $3,952 1.25*
Child Support 2.20 1.76 0.80* $455 $568 1.25* $2,503 $2,464 0.98
Alimony 0.24 0.27 1.13 $1,230 $1,104 0.90 $731 $752 1.03

aPercentage of population (15 years and older) that received income from each source.

bThe mean amount of income from each source, by month per recipient.

cAggregate amount refers to the estimate of the total amount of income received by the receiving population for that income source.

*Difference between estimates is statistically significant at the p ≤ .05 level.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.


For each of these sources, the estimate of receipt in the 2014 panel was either significantly lower than or not significantly different from the estimate of receipt in the 2008 panel (see Table 7-4). Mean amounts of income were significantly higher in the 2014 panel estimates for railroad retirement and private pensions; (statistically) significantly lower for state and local pensions; and not statistically different for federal pensions, military pensions, and the various types of retirement accounts. However, estimates of aggregate amounts were (statistically) significantly lower for all four pension types in the 2014 panel. Given the pronounced secular decline documented earlier in SIPP’s reporting of aggregate income from private pensions and military retirement, this evidence of further erosion in the measurement of these important sources of income for older Americans, combined with declines in selected transfers, suggests that the 2014 panel may herald a weakening in SIPP’s measurement of the well-being of low-income households—an issue of concern that the report examines more directly below.

Health Insurance Coverage

In contrast to many of the income sources, reported health insurance coverage is virtually unchanged between the 2008 and 2014 panels. For the three major sources—employer-sponsored insurance (ESI), Medicare, and Medicaid—the estimates from the 2014 panel are nearly identical to those from the 2008 panel (see Table 7-5).7 Significant differences appear among the smaller sources, and they are large enough to suggest a fair amount of confusion among respondents in one or both panels. The estimate of private, non-ESI coverage decreased in the 2014 panel, while the estimates of military health insurance and “other” health insurance increased roughly two- to three-fold. As a result, the estimated proportion of the population of all ages covered by health insurance was more than two percentage points higher in the 2014 panel than in the 2008 panel. Questionnaire wording changes designed to improve the reporting of coverage purchased through the exchanges created by the Affordable Care Act may explain the sharp increase in “other” health insurance, which would have been reported more correctly as private, non-ESI (or nongroup) coverage. The 2014 panel (with its first reference year of 2013) is well timed to provide a key data source for examining the changing dynamics of health insurance coverage with the implementation of the Affordable Care Act. This timing, in combination with the new panel’s performance in identifying persons with some form of health insurance coverage, should be welcomed by health researchers.

___________________

7 Medicaid coverage as measured in the SIPP includes coverage provided under the Children’s Health Insurance Program.


TABLE 7-4 2014 and 2008 Survey of Income and Program Participation (SIPP) Panel Estimates of Retirement Income, by Source, April 2013

Source Percentage of Populationa Mean Amountb Aggregate Amount ($ millions)c
2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008 2008 Panel 2014 Panel Ratio of 2014 to 2008
Railroad Retirement 0.14 0.10 0.71 $1,269 $1,890 1.49* $449 $466 1.04
Private Pensions 6.10 4.50 0.74* $1,091 $1,203 1.10* $16,600 $13,509 0.81*
Federal Pensions 0.96 0.76 0.79* $2,115 $2,017 0.95 $5,075 $3,814 0.75*
Military Pensions 0.70 0.62 0.89 $1,783 $1,804 1.01 $3,105 $2,805 0.90*
State and Local Pensions 2.76 2.27 0.82* $2,107 $1,894 0.90* $14,473 $10,736 0.74*
IRA/401k/Keogh/Thrift 1.23 1.12 0.91 $1,900 $2,528 1.33 $5,808 $7,032 1.21

aPercentage of population (15 years and older) that received income from each source.

bThe mean amount of income from each source, by month per recipient.

cAggregate amount refers to the estimate of the total amount of income received by the receiving population for that income source.

*Difference between estimates is statistically significant at the p ≤ .05 level.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.


TABLE 7-5 2014 and 2008 Survey of Income and Program Participation (SIPP) Panel Estimates of Health Insurance Coverage, by Source, April 2013

Source Percentage of Population of All Ages Covered
2008 Panel 2014 Panel Ratio of 2014 to 2008
Employer-Sponsored Insurance (ESI) 53.16 53.50 1.01
Medicare 15.95 15.84 0.99
Medicaid 15.82 15.80 1.00
Private Non-ESI 9.12 7.97 0.87*
Military Health Insurance 2.20 4.20 1.91*
Other Health Insurance 1.24 3.35 2.70*
Any Health Insurance 83.13 85.46 1.03*

*Difference between estimates is statistically significant at the p ≤ .05 level.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.

Comparison with Independent Sources

Table 7-6 compares annual aggregates calculated from the 2014 SIPP panel with NIPA totals and the 2014 Current Population Survey Annual Social and Economic Supplement (CPS ASEC). The latter introduced a revised income section on a split-sample basis in that year. Among the sources explicitly targeted by the revisions, the estimate of interest income showed a very substantial increase in aggregate dollars relative to the traditional questionnaire, and retirement income (including withdrawals from retirement accounts) showed a more modest increase (Czajka and Rosso, 2015; Semega and Welniak, 2015).

Both the CPS ASEC at 97 percent and the 2014 SIPP panel at 95 percent were very close to the NIPA total for wage and salary income, which represents a marked increase for SIPP relative to the 2008 panel (compare with 80% in 2012; see Table 7-1). The new SIPP’s estimate of self-employment income (91% of the NIPA total) was substantially better than the CPS ASEC (35%). SIPP also exceeded the CPS ASEC estimate of combined earnings at 94 percent versus 89 percent of the NIPA total.

The 2014 SIPP panel estimates of unearned income (all sources other than earnings) compare much less favorably to the NIPA estimates but are comparable to the CPS ASEC for most of these sources. Whereas the results discussed above show that SIPP 2014 estimates of both interest and dividends are improved over the 2008 panel, aggregate annual interest is only 19 percent of the NIPA total, while dividends are 41 percent.


TABLE 7-6 2014 Survey of Income and Program Participation (SIPP) and Current Population Survey Annual Social and Economic Supplement (CPS ASEC) Aggregate Income as a Proportion of National Income Product Accounts (NIPA) Total, by Income Source

Source of Income 2013 Aggregate ($ millions)a Proportion of NIPA
NIPA 2014 CPS ASEC 2014 SIPP 2014 CPS ASEC 2014 SIPP
Earnings $8,116,590 $7,219,917 $7,672,997 0.890 0.945
Wages and salaries $7,045,479 $6,842,946 $6,694,653 0.971 0.950
Self-employment $1,071,112 $376,972 $978,344 0.352 0.913
Property $947,980 $418,199 $297,476 0.441 0.314
Interest $399,192 $182,842 $76,632 0.458 0.192
Dividends $404,083 $146,813 $163,958 0.363 0.406
Rentb $144,705 $88,544 $56,887 0.612 0.393
Transfers $986,834 $843,032 $932,669 0.854 0.945
Social Security + railroad retirement $763,494 $684,570 $804,707 0.897 1.054
Supplemental security income $49,959 $47,104 $43,089 0.943 0.862
Family assistance $19,289 $4,409 $4,247 0.229 0.220
Unemployment compensation $61,844 $39,825 $26,240 0.644 0.424
Workers’ compensation $21,052 $15,631 $6,582 0.742 0.313
Veterans’ payments $71,197 $51,493 $47,804 0.723 0.671
Pensions $741,907 $376,672 $374,605 0.508 0.505
Private pension $317,045 $169,099 $163,290 0.533 0.515
Federal pension $90,428 $55,282 $45,793 0.611 0.506
Military retirement $70,091 $29,139 $34,280 0.416 0.489
State and local employee pension $264,343 $123,152 $131,243 0.466 0.496
Total $10,793,311 $8,857,820 $9,277,747 0.821 0.860

NOTE: Table includes all individuals ages 15 and older.

aAggregate amount refers to the estimate of the total amount of income received by the receiving population for that income source.

bCPS ASEC rent value includes royalties, but royalties in the 2014 SIPP are combined with other investment income and cannot be separated for the purpose of adding to rent. The NIPA rent amount also excludes royalties.

SOURCE: Panel generated with data from SIPP (iteration 12), NIPA, and CPS ASEC.


The latter is better than the CPS ASEC, but the 2014 SIPP estimate of interest income falls well below the greatly improved CPS ASEC estimate. With respect to transfer income, the 2014 SIPP estimate of Social Security plus railroad retirement exceeds the NIPA total by 5 percent, whereas the CPS ASEC is 10 percent below the NIPA total. The reverse is true of SSI, with the 2014 SIPP estimate standing at 86 percent of the NIPA total and the CPS ASEC at 94 percent. SIPP compares closely with the CPS ASEC on cash family assistance, where the NIPA total does not provide a suitable standard of comparison because it includes funds allocated for noncash benefits. The 2014 SIPP is also close to the CPS ASEC on veterans’ payments (67% of NIPA versus 72% for the CPS ASEC). However, the 2014 SIPP is well below the CPS ASEC estimates of unemployment compensation and workers’ compensation. For transfer income as a whole, the 2014 SIPP estimate is 94 percent of the NIPA total, compared with 85 percent for the CPS ASEC, but this result is driven largely by SIPP’s overestimate of Social Security income, which is by far the largest source of transfer income.

For pension income overall, the 2014 SIPP matches the CPS ASEC in capturing half the NIPA total. Comparing Table 7-6 with Table 7-1, the results for the 2014 SIPP panel represent a substantial fall-off from the 2008 panel, which captured 75 percent of the NIPA total pension income in 2012.

Income Relative to Poverty

Many of the uses of SIPP reviewed in Chapter 3 focus on the low-income population. How well the new SIPP captures income at the lower end of the income distribution is critical to the survey’s measurement of economic well-being and its estimates of eligibility for means-tested programs that serve the low-income population. Measures of income relative to poverty provide an important indicator of the effectiveness of SIPP in capturing income at the lower end of the distribution.

Table 7-7 compares the 2014 and 2008 panels with respect to monthly estimates of the percentage of persons in families below 50 percent, between 50 and 100 percent, and between 100 and 200 percent of the federal poverty threshold.8 Two alternative sets of estimates are provided for the 2014 panel: one with and one without an allocation of income from persons who lived in panel households for only part of the year (and were not living there at the time of the interview).9

___________________

8 Federal poverty thresholds are income cutoffs updated annually for inflation; the closely related poverty guidelines issued every year by the U.S. Department of Health and Human Services (HHS) are used to determine eligibility for certain programs and benefits, including savings on Marketplace health insurance and Medicaid and CHIP coverage. In 2013, the HHS poverty guideline was $11,490 for a one-person household and $23,550 for a four-person household.


TABLE 7-7 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of the Percentage of the Population in Selected Ranges of Poverty Thresholds, by Month, 2013

Month Percentage Below 50% of Threshold Percentage 50% to 100% of Threshold Percentage Above 100% but Less Than 200% of Threshold
2008 Panel 2014 Panel With Type 2 Income 2014 Panel Without Type 2 Income 2008 Panel 2014 Panel With Type 2 Income 2014 Panel Without Type 2 Income 2008 Panel 2014 Panel With Type 2 Income 2014 Panel Without Type 2 Income
January 7.73 10.06 10.37 8.82 7.35 7.36 20.74 17.55 17.58
February 7.83 10.33 10.62 8.98 8.20 8.24 20.95 18.68 18.75
March 7.66 9.89 10.19 9.00 7.48 7.52 21.02 17.63 17.62
April 7.53 9.75 10.03 8.81 7.60 7.64 21.13 18.03 18.05
May 7.16 9.34 9.63 8.77 7.59 7.63 20.67 17.53 17.54
June 7.47 9.23 9.51 8.99 7.83 7.87 20.99 18.18 18.20
July 7.42 9.17 9.45 8.63 7.74 7.77 20.97 17.80 17.81
August 6.90 9.04 9.26 8.67 7.68 7.74 21.12 17.68 17.70
September 6.69 8.91 9.08 8.52 7.87 7.86 21.47 18.24 18.26
October 6.60 8.58 8.71 8.37 7.62 7.64 21.08 17.93 17.87
November 7.00 8.71 8.80 8.41 7.85 7.85 20.99 18.27 18.26
December n/a 8.64 8.74 n/a 7.78 7.77 n/a 17.90 17.84

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.


The income reported for these “Type 2” persons was obtained by proxy or imputed, and it consists of an estimate of annual (not monthly) income, which must be apportioned across the months that each Type 2 person lived with a panel household. The data on Type 2 persons are used in the Census Bureau’s estimates of poverty in two ways. Each Type 2 person increments the poverty threshold of his or her panel family or household (depending on the measure), and each Type 2 person’s estimated monthly income adds to that family’s or household’s resources.
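The sketch below is a schematic illustration of this apportionment, not the Census Bureau's actual procedure; the even monthly split and the function name are assumptions.

```python
def apportion_type2_income(annual_income: float, months_present: list) -> dict:
    """Spread a Type 2 person's reported annual income over the months of coresidence.

    Uses one-twelfth of the annual amount per month as a simplifying assumption;
    the resulting monthly amounts would be added to the panel household's resources.
    """
    return {month: annual_income / 12 for month in months_present}

# Hypothetical example: $18,000 of annual income, coresident March through June.
print(apportion_type2_income(18_000, months_present=[3, 4, 5, 6]))
```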

In presenting these results, the panel also reminds the reader that the sample base of the 2008 estimates becomes thinner after May 2013, declining to less than one-quarter of the May sample base by November. Comparisons between the two panels become progressively less reliable after the first 5 months of the year.

The comparative poverty rates provide clear evidence that the 2014 panel captures less income than the 2008 panel at the very bottom of the distribution. With Type 2 persons and their income included, the estimated percentage of persons in families below 50 percent of the poverty threshold over the first 5 months of the year is about 2.25 percentage points higher, on average, for the 2014 panel compared with the 2008 panel. In April, for example, the 2014 panel estimate is 9.75 percent versus 7.53 percent in the 2008 panel. Later in the year, the gap narrows to around 2 percentage points or less, but as noted, so does the reliability of the 2008 panel estimates. Against the backdrop of an improving economy, the 2014 panel estimate declines from greater than 10 percent to around 8.6 percent—something the report discusses further in the section below on “Poverty.” Over this timespan, the 2008 panel estimate declines as well.

Between 50 and 100 percent of the poverty threshold, the relative estimates for the two panels reverse. The estimate of persons falling into this range of family income is higher in the 2008 panel by about 1.5 percentage points. The difference between the two panel estimates declines over time to as low as 0.6 percentage points. Between 100 and 200 percent of the poverty threshold, the gap between the two panels is even greater; estimates of the fraction of the population falling into that range are more than 3 percentage points lower in the 2014 panel than in the 2008 panel. This difference narrows to less than 3 percentage points late in the year.

In summary, the 2014 SIPP panel appears to find less income than the 2008 panel among families at the bottom of the income distribution, resulting in a larger fraction of the population being found below 50 percent of the poverty threshold.

___________________

9 These individuals are called “Type 2,” as explained in the Chapter 4 section on “Rosters.” Further discussion of the handling of Type 2 persons is provided in the Chapter 6 section, “Handling of Part-Year Residents.”


A little higher in the income distribution, though, the 2014 panel captures more income than the 2008 panel, with the result that the fraction of the population between 50 and 100 percent of the threshold is lower in the 2014 panel. Between 100 and 200 percent of the threshold, the 2014 panel finds a share of the population that is 2 to 3 percentage points smaller.

Even with the shift between 50 and 100 percent of the poverty threshold, the 2014 panel still finds a slightly greater fraction below 100 percent of the threshold—between 0.6 and 1.0 percentage points more than the 2008 panel estimate in most months. However, the 2014 panel finds fewer persons than the 2008 panel below 200 percent of the poverty threshold. The difference exceeds 2 percentage points early in the year and falls to around 1.6 percentage points later in the year.

One can hypothesize that the lower reporting of most sources of transfer income is a factor in the new SIPP’s performance at the bottom of the income distribution. Whereas the overall quality of SIPP income data has declined over the years, SIPP’s performance at the low end of the income distribution was a strength through at least the first panels of the 21st century. Compared to the 2008 panel, the 2014 panel appears to be less successful at the very bottom of the income distribution but more successful a little higher in the distribution.

In view of the limited income information collected from Type 2 persons in the redesigned SIPP, the impact of this information on estimated poverty status is of interest. The study panel’s analysis found that the impact depends on the range of family income relative to poverty within which it is observed. Without Type 2 income (and excluding Type 2 persons from families’ poverty thresholds), the 2014 panel estimate of persons in families below 50 percent of the threshold is about 0.3 percentage points higher through the first 7 months of the year, although the difference falls to about 0.1 percentage point in the final months of 2013. Including Type 2 income has no impact on estimates of persons in families between 50 and 100 percent of the threshold or between 100 and 200 percent.

One must ask whether the addition to resources from Type 2 persons in families above 50 percent of the poverty threshold is truly offset by their addition to the unit’s poverty threshold, or whether the very crude estimate of their added income obtained by the 2014 SIPP panel tends to understate their contribution. However, for estimates of the proportion of the population below 100 percent and even below 200 percent of the poverty threshold, including Type 2 income shows the same 0.3 percentage point impact early in the year that is observed for the population below 50 percent of the threshold, although the difference drops to 0.1 percentage point later in the year. Why would the effect be different below 100 percent and below 200 percent than it is between 50 and 100 and between 100 and 200 percent of the poverty threshold?


Perhaps the reason is that when one examines the impact within a range that is open at both ends, the effect of Type 2 income in moving people out of the top of the range (say, above 100 percent of the threshold) is offset by its impact in moving people into the bottom of the range (say, above 50 percent of the poverty threshold). When one combines the ranges under 50 percent and between 50 and 100 percent of the threshold, the impact of Type 2 income is confined to moving people out of the top of the range—that is, above 100 percent of the poverty threshold.
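A stylized simulation (made-up income distribution, not SIPP data) illustrates the offsetting argument: a small, across-the-board addition to income leaves the share inside an open range nearly unchanged, while the cumulative share below a cut point can only fall.

```python
import numpy as np

rng = np.random.default_rng(0)
ratio = rng.uniform(0.0, 3.0, size=100_000)   # income as a multiple of the poverty threshold
ratio_plus = ratio + 0.02                     # small addition, standing in for Type 2 income

def share_in_range(x: np.ndarray, lo: float, hi: float) -> float:
    return float(np.mean((x >= lo) & (x < hi)))

# Inflow at the bottom of the range offsets outflow at the top...
print("50-100% of threshold:", share_in_range(ratio, 0.5, 1.0), "->", share_in_range(ratio_plus, 0.5, 1.0))
# ...but below a single cut point, people can only move out.
print("Below 100% of threshold:", float(np.mean(ratio < 1.0)), "->", float(np.mean(ratio_plus < 1.0)))
```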

One other aspect of our results is notable. The proportions of the population below 100 percent and below 200 percent of the poverty threshold show a sharp rise between January and February followed by a comparable decline between February and March (see Table 7-8). The poverty rate rises by 1.12 percentage points between January and February and falls by 1.16 percentage points between February and March. Similarly, the percentage of the population below 200 percent of the threshold rises by 2.25 percentage points between January and February and falls by 2.20 percentage points between February and March. This fluctuation is due to how the Census Bureau assigns earnings to months for persons whose pay periods vary with the number of days in the month (e.g., persons paid biweekly instead of semimonthly).10 Such persons receive lower earnings in shorter months than in longer months, and February, being three days shorter than the longest months, shows the greatest impact. This phenomenon is much less evident in the 2008 panel, where the manner in which monthly income was collected encouraged more uniform (and quite likely less precise) reporting of earnings across months. The report explores this problem further below in the discussion of estimates of transitions, but note here that it highlights a genuine issue in the measurement of monthly poverty rates when incomes that vary with the number of days in the month are compared to poverty thresholds that do not. Allowing the poverty thresholds to vary as well would eliminate this fluctuation in monthly poverty rates for persons with incomes that vary with the days in the month but would introduce fluctuation in the reverse direction for those whose incomes are constant across months.
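The mechanism can be illustrated with a stylized sketch (not the Census Bureau's production code): a worker paid at a constant weekly rate, with earnings allocated to calendar months in proportion to the days they contain, earns noticeably less in February.

```python
import calendar

weekly_pay = 700.0
daily_pay = weekly_pay / 7

for month in (1, 2, 3):  # January, February, March 2013
    days_in_month = calendar.monthrange(2013, month)[1]
    print(calendar.month_name[month], round(daily_pay * days_in_month, 2))
# January and March each receive 3100.0, while February receives only 2800.0,
# producing the sawtooth pattern in the monthly estimates.
```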

Treatment of Excessive Hourly Wages

The 12-month reference period allowed respondents to the 2014 SIPP to report their earnings, interest, and other sources of income as annual amounts, which could facilitate more accurate reporting for sources with

___________________

10 This issue exists only in the 2014 SIPP because in the 2008 SIPP the Census Bureau did not allocate earnings to a month based on how many days that month contained.

TABLE 7-8 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of the Percentage of the Population in Selected Poverty Threshold Ranges, by Month, 2013

Month Percentage Below 100% of Poverty Threshold Percentage Below 200% of Poverty Threshold
2008 Panel 2014 Panel With Type 2 Income 2014 Panel Without Type 2 Income 2008 Panel 2014 Panel With Type 2 Income 2014 Panel Without Type 2 Income
January 16.55 17.41 17.73 37.29 34.96 35.32
February 16.81 18.53 18.86 37.76 37.21 37.61
March 16.67 17.37 17.71 37.69 35.01 35.33
April 16.34 17.35 17.67 37.47 35.38 35.72
May 15.94 16.93 17.25 36.61 34.46 34.79
June 16.46 17.06 17.38 37.44 35.24 35.57
July 16.05 16.91 17.22 37.02 34.71 35.03
August 15.57 16.72 16.99 36.69 34.40 34.69
September 15.21 16.78 16.93 36.68 35.02 35.19
October 14.97 16.20 16.35 36.06 34.13 34.23
November 15.41 16.56 16.66 36.40 34.83 34.91
December 16.42 16.50 34.33 34.34

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.

annual but not monthly income statements.11 At the same time, respondents were encouraged to report their earnings in whatever other unit of time was easiest. This includes hourly rates, which were then multiplied by reported weekly hours and total weeks worked to obtain estimates of annual income. While this flexibility was expected to improve the accuracy of reporting—and perhaps even reduce item nonresponse—it could also lead to errors when respondents reported a correct amount but for the

___________________

11 For wage and salary income, the annual reporting period could also make it easier for respondents to report gross rather than net income, as gross earnings rather than take-home pay are what appear on employees’ W-2 forms, whereas take-home pay is much more prominent—and presumably more salient to recipients—on the paychecks received throughout the year. Some respondents’ reporting of net instead of gross earnings has been suggested as a possible reason for the earlier SIPP panels’ substantial underreporting of annual earnings compared to the CPS ASEC (Roemer, 2000).

wrong unit of time. For example, a respondent might report a monthly or annual salary as an hourly rate, or an hourly rate as a monthly amount. The survey instrument included prompts to the interviewer if a particular reported amount seemed excessively high or low. Despite this mechanism for correcting such errors, the Census Bureau reports that some apparent errors did occur. In particular, the hourly pay rate variable includes around 20 instances of extremely high hourly wages, while the annual earnings variable includes around the same number of extremely low amounts. The high amounts are of particular concern because this small number of cases accounted for nearly as much aggregate income as the rest of the sample, raising the overall estimate of annual aggregate income far above the NIPA total.
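
A brief sketch shows how large a unit-of-time error can be under the annualization logic described above (hourly rate times weekly hours times weeks worked). The function and the dollar amounts are illustrative only; they are not the SIPP instrument's fields or edits.

```python
# Hedged sketch of how a misreported unit of time inflates annualized earnings.
# Names and amounts are hypothetical.

def annualize(hourly_rate, hours_per_week, weeks_worked):
    """Annual earnings implied by an hourly rate, weekly hours, and weeks worked."""
    return hourly_rate * hours_per_week * weeks_worked

# Correct report: $25 per hour, 40 hours per week, 50 weeks worked
print(annualize(25, 40, 50))        # 50,000

# Error: a $50,000 annual salary keyed as if it were an hourly rate
print(annualize(50_000, 40, 50))    # 100,000,000 -- a single case like this can
                                    # rival the aggregate earnings of thousands of
                                    # correctly reported cases
```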

When preparing the public use file, the Census Bureau topcoded12 the hourly wage variable, assigning each topcoded record the median of the topcoded values. Other rates of reported earnings were topcoded using the mean of the topcoded values. Separate means were calculated for selected subpopulations in order to preserve the reported aggregate amounts for these subpopulations. The topcoded hourly wages were then used to generate monthly and annual earnings for the public use file. The study panel produced most of its estimates from internal data that were not topcoded. To enable us to generate more reasonable estimates of aggregate earnings, the Census Bureau identified the high and low outliers to our programmer, and the panel removed these observations—about 40 in total—from our estimates of monthly and annual earnings. The net effect is rather astonishing. Without the 40 outlier observations, the panel obtained an annual aggregate wage and salary amount of $6.69 trillion, which is 95 percent of the NIPA total, as reported in Table 7-6. With these 40 additional observations included, this figure rose to $12.95 trillion, which exceeds the NIPA total by 83 percent.
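
The general shape of the median-based topcoding and outlier screening described above can be sketched as follows. The limit, wage values, and earnings amounts are hypothetical, and the Census Bureau's actual procedures (including subpopulation-specific means for earnings reported at other rates) are more elaborate.

```python
# Illustrative sketch of median-based topcoding and outlier screening.
# The limit and the reported values are hypothetical.
import statistics

def topcode_with_median(values, limit):
    """Replace every value above `limit` with the median of the values above it."""
    above = [v for v in values if v > limit]
    if not above:
        return list(values)
    replacement = statistics.median(above)
    return [replacement if v > limit else v for v in values]

hourly_wages = [12.50, 18.00, 25.00, 40.00, 9_500.00, 12_000.00]
print(topcode_with_median(hourly_wages, limit=200.00))
# -> [12.5, 18.0, 25.0, 40.0, 10750.0, 10750.0]

# Dropping a handful of extreme reports can change an aggregate dramatically.
annual_earnings = [35_000, 52_000, 48_000, 2_500_000_000]     # one unit-of-time error
print(sum(annual_earnings))                                    # dominated by the outlier
print(sum(e for e in annual_earnings if e < 10_000_000))       # outlier removed
```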

This excessive total cannot be generated from the public use file due to the topcoding applied to hourly wages. But neither can users correct what are likely to be overstated earnings for the roughly 20 cases whose excessive amounts were handled by topcoding. The Census Bureau opted not to edit the almost surely incorrect reports of hourly wages, although these values—and other outliers—were excluded from use as donors in the Census Bureau’s imputations. The Census Bureau remains reluctant to edit reported amounts although it was willing to do so for Social Security and SSI when linked administrative records provided evidence of program confusion. The report addresses this issue in our recommendations.

___________________

12 Topcoding is a procedure in which the value of a particular variable that is above an upper limit is censored or changed. The Census Bureau’s topcoding in the SIPP is primarily used as part of confidentiality protection on the public use file.

RECALL BIAS

With the lengthened reference period of the redesigned SIPP and with interviews conducted between 1 and 5 months after the end of the reference period, there is reason to be concerned that the quality of the data provided by respondents will deteriorate with increased temporal distance of the period in question from the interview. This could be reflected in lower reports of program participation when such participation ended during the reference year and in fewer transitions in general.13 Reduced quality could also be reflected in less accurate reports of dollar amounts earlier versus later in the reference period. The Census Bureau evaluated this aspect of data quality during its successive pilot tests of the redesigned survey and found mixed evidence of “reverse telescoping” or “the possibility that respondents will report less accurately about events further in the past” (U.S. Census Bureau, 2013). Basing its assessment on matched administrative records, the bureau found that only Medicaid in 2010 showed evidence of this phenomenon in the gross underreporting of program participation, while Medicaid, Medicare, Social Security, and SNAP exhibited declines over the reference year in gross overreporting of participation.

In its own evaluation, briefly described above, the panel compared the 2014 SIPP panel estimates of monthly program participation in 2013 with estimates derived from the 2008 panel over the same period. Where possible to assemble monthly data on program participation from administrative sources, the panel compared SIPP estimates to these data as well.

Two programs showed striking evidence of progressively worse reporting of participation with increased distance from the end of the reference period: unemployment compensation and SNAP.14 The report examines findings for these two programs in turn.

Table 7-9 compares 2014 and 2008 SIPP panel estimates of persons receiving unemployment compensation in each month of 2013 with beneficiary counts from unemployment insurance administrative data.15 With the improving economy, the administrative data show a pronounced downward trend, with the number of beneficiaries declining from 5.8 million in January to 3.9 million in October before rising to 4.4 million in December.

___________________

13 This phenomenon is generally ascribed to respondents’ recall bias, although in principle a similar blurring of more distant events could result from limitations of the survey instrument, hurried data collection or other survey limitations, as well as from respondents’ memory limitations. The report makes no attempt here to distinguish among these mechanisms for bias in the data.

14 We did not examine changes in reported participation as a function of distance from the interview. That would have been possible with the internal data but is not possible with the public use data, because the interview date is not included in the public file.

15 As a note, some analysts think of unemployment insurance as a social insurance program and not as a safety net program, at least in normal economic times.

TABLE 7-9 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Unemployment Beneficiaries, by Month, 2013

Month Thousands of Persons SIPP as Percentage of Administrative Data
Unemployment Insurance Program Admin. Data 2008 SIPP Panel* 2014 SIPP Panel 2008 SIPP Panel 2014 SIPP Panel
January 5,772 3,779 2,348 65.5 40.7
February 5,592 3,602 2,076 64.4 37.1
March 5,309 3,412 2,101 64.3 39.6
April 4,939 3,193 2,167 64.6 43.9
May 4,621 3,025 2,083 65.5 45.1
June 4,539 2,931 2,191 64.6 48.3
July 4,663 2,805 2,302 60.2 49.4
August 4,295 2,660 2,335 61.9 54.4
September 3,966 2,429 2,089 61.2 52.7
October 3,904 2,624 2,121 67.2 54.3
November 4,021 2,806 2,459 69.8 61.2
December 4,411 2,778 63.0

*Estimates for months after May are inflated by the ratio of 4 to the actual number of rotation groups surveyed for that month.

SOURCE: Panel generated with data from the 2008 (public-use data file), 2014 SIPP (iteration 12) panels, and administrative data from the Federal-State Unemployment Insurance Program.

The 2008 SIPP panel captures this general trend, declining from 3.8 million in January to a minimum of 2.4 million in September and then reversing direction to rise back to 2.8 million in November. The 2008 panel estimates are 64 to 65 percent of the administrative totals from January through June but show some volatility over the remaining months as the sample gets thinner. The 2014 panel data start out at 2.3 million or just 41 percent of the administrative total. Over the next 9 months, the estimates vary between 2.0 and 2.3 million but then turn up in the final 2 months, reaching 2.8 million in December. As a share of the administrative totals, SIPP estimates rise to 63 percent by the end of the year—a 22 percentage point increase from January (or 26 percentage points if measured from the low point in February).
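
Two simple calculations recur in these comparisons: inflating 2008 panel estimates for months with fewer than four rotation groups (see the table notes) and expressing a SIPP estimate as a percentage of an administrative benchmark. The sketch below shows both with hypothetical inputs; it is not the panel's estimation code.

```python
# Hedged sketch of two calculations used in the tables above; inputs are hypothetical.

def inflate_for_missing_rotations(estimate, groups_surveyed, total_groups=4):
    """Scale a 2008-panel estimate up by the ratio of 4 to the rotation groups surveyed."""
    return estimate * total_groups / groups_surveyed

def pct_of_benchmark(sipp_estimate, admin_total):
    """SIPP estimate as a percentage of an administrative count."""
    return 100 * sipp_estimate / admin_total

raw = 1_800                       # thousands of beneficiaries from 3 of 4 rotation groups
inflated = inflate_for_missing_rotations(raw, groups_surveyed=3)
print(round(inflated))                               # 2400
print(round(pct_of_benchmark(inflated, 4_000), 1))   # 60.0 percent of a 4.0 million benchmark
```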

Why is unemployment compensation so different from nearly all of the other sources of income that were examined? The crucial difference, the panel suggests, is that participation in this program was declining, which implies that considerably more people were leaving the program each month than entering it. The panel surmises that those who left early in the reference year were less likely than those who left later in the year to report that they had participated. New entrants may have been better reporters because many of them were still participating. The timing of their entry mattered less because the 2014 panel collects program participation starting with the interview month and works back to the start of the reference period.

SNAP is another major program with declining enrollment in 2013, although SNAP enrollment actually peaked in the first half of 2013 and did not begin to decline until the latter part of the year (see Table 7-10). Even so, SNAP enrollment declined by 0.7 million in the final months of the year. Estimates of SNAP participation in the 2008 panel were likewise fairly stable over the first half of the year, holding steady at between

TABLE 7-10 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Supplemental Nutrition Assistance Program (SNAP) Beneficiaries, by Month, 2013

Month Thousands of Persons SIPP as Percentage of Administrative Data
SNAP Administrative Data 2008 SIPP Panel* 2014 SIPP Panel 2008 SIPP Panel 2014 SIPP Panel
January 47,772 43,663 35,043 91.4 73.4
February 47,558 43,715 35,666 91.9 75.0
March 47,725 43,557 36,035 91.3 75.5
April 47,549 43,400 36,115 91.3 76.0
May 47,635 43,623 36,319 91.6 76.2
June 47,760 42,585 36,898 89.2 77.3
July 47,637 42,101 37,197 88.4 78.1
August 47,665 42,368 37,623 88.9 78.9
September 47,306 41,196 37,599 87.1 79.5
October 47,416 40,615 38,188 85.7 80.5
November 47,363 40,612 38,586 85.7 81.5
December 47,079 38,875 82.6

*Estimates for months after May are inflated by the ratio of 4 to the actual number of rotation groups surveyed for that month.

SOURCE: Panel generated with data from the Food and Nutrition Service, the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

91 and 92 percent of the administrative totals. Between May and November, however, the 2008 panel’s estimates of SNAP enrollment declined by 3.0 million as the reporting rate dropped to 86 percent. Why this should have occurred is not at all clear, but it could be just an artifact of growing sampling error as the size of the sample declined by a factor of four. The 2014 panel shows an entirely different trend, with estimated SNAP enrollment growing by 3.9 million over the year as the reporting rate rose from 73.4 to 82.6 percent. The comparatively low reporting of SNAP over much of the year is especially notable, as SNAP reporting has been one of the long-time strengths of SIPP.

Estimates of SNAP recipient units (termed “households” even though there can be more than one SNAP unit in a household) show much the same pattern but with lower reporting rates. Administrative counts of SNAP households declined by just 0.3 million between their peak in June and the end of the year (see Table 7-11). The 2008 SIPP panel estimates

TABLE 7-11 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Supplemental Nutrition Assistance Program (SNAP) Households, by Month, 2013

Month Thousands of Households SIPP as Percentage of Administrative Data
SNAP Administrative Data 2008 SIPP Panel* 2014 SIPP Panel 2008 SIPP Panel 2014 SIPP Panel
January 23,089 18,919 15,438 81.9 66.9
February 23,007 19,062 15,632 82.9 67.9
March 23,115 18,958 15,767 82.0 68.2
April 23,039 19,064 15,822 82.7 68.7
May 23,071 19,161 15,875 83.1 68.8
June 23,117 18,649 16,077 80.7 69.5
July 23,074 18,566 16,159 80.5 70.0
August 23,094 18,733 16,325 81.1 70.7
September 23,000 18,406 16,290 80.0 70.8
October 23,054 18,167 16,433 78.8 71.3
November 23,015 17,970 16,582 78.1 72.1
December 22,861 16,651 72.8

*Estimates after May are inflated by the ratio of 4 to the actual number of rotation groups surveyed for that month.

SOURCE: Panel generated with data from the Food and Nutrition Service, the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

declined by 1.2 million between their May peak and November, with the reporting rate declining from 82 percent early in the year to 78 percent at the end of the year. Monthly estimates from the 2014 panel run counter to these trends with an increase of 1.2 million between January and December. Here, though, the increase in the reporting rate was more modest than for total beneficiaries, rising from 66.9 to 72.8 percent between the beginning and end of the year.16

SNAP benefits show a pattern somewhat similar to that for SNAP households, with administrative tallies dropping in the final months of the year (see Table 7-12). In this case, monthly benefit payments peak in October, then show a marked 7 percent ($429 million) 1-month drop in November, driven largely by the October 31 expiration of a temporary statutory expansion of the maximum monthly benefit amount per SNAP household.17 Estimates from the 2008 SIPP panel, although impaired by small sample size, dip by about 2 percent ($172 million) in November, while the 2014 panel estimates are essentially unchanged in that month. Over the year as a whole, reported payments as a share of true payments fell in the 2008 panel from close to 80 percent in the first 5 months of the year to percentages in the low 70s late in the year, while the 2014 panel’s share rose modestly, from 63.5 percent in January to 67.2 percent in October, before jumping to 73.5 percent at the end of the year as the administrative totals fell.

To summarize, estimates from the 2014 panel for two of the most important safety net programs exhibit trends that run counter to program data because of a substantial recall bias that contributes to a marked decrease in reporting accuracy from the end of the reference period back to the beginning. In addition, the 2014 panel shows markedly less complete reporting of participation in these two programs over much of the year compared to the 2008 panel. Even at the end of the year, the best reporting of participation in SNAP in the 2014 panel is well below that of the 2008 panel at the beginning of the year, when the data reflect the full sample.

We can contrast these findings with those for employment, which shows minimal evidence of recall bias, very high reporting in both panels, and improvement in the 2014 panel over the 2008 panel. Although there is no administrative source of monthly employment data (as opposed to jobs, for which a single employed person may hold more than one), the CPS does provide official estimates of monthly employment and unemployment for the United States. The CPS estimates are based on a single week during each month, whereas SIPP employment measures any labor activity over the

___________________

16 The lower reporting rate in both panels for households versus persons could reflect an underreporting of multiple units within the same household (i.e., two units being reported as one) or a greater underreporting of small units than large units.

17 See https://www.fns.usda.gov/arra/snap [August 2017].

TABLE 7-12 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Aggregate Supplemental Nutrition Assistance Program (SNAP) Benefits, by Month, 2013

Month $ Millions SIPP as Percentage of Administrative Data
SNAP Administrative Data 2008 SIPP Panel* 2014 SIPP Panel 2008 SIPP Panel 2014 SIPP Panel
January $6,326 $5,025 $4,015 79.4 63.5
February $6,303 $5,032 $4,080 79.8 64.7
March $6,342 $5,024 $4,108 79.2 64.8
April $6,298 $5,025 $4,117 79.8 65.4
May $6,321 $5,048 $4,123 79.9 65.2
June $6,346 $4,904 $4,177 77.3 65.8
July $6,288 $4,827 $4,202 76.8 66.8
August $6,340 $4,779 $4,266 75.4 67.3
September $6,300 $4,599 $4,250 73.0 67.5
October $6,355 $4,539 $4,268 71.4 67.2
November $5,926 $4,366 $4,270 73.7 72.1
December $5,818 $4,276 73.5

*Estimates after May are inflated by the ratio of 4 to the actual number of rotation groups surveyed for that month.

SOURCE: Panel generated with data from the Food and Nutrition Service, the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

entire month. If reported with the same level of accuracy, SIPP estimates of monthly employment ought to be higher than those from the CPS.

Table 7-13 compares monthly employment (both wage and salary and self-employment) from the two SIPP panels and the CPS. All three surveys show similar upward trends, with the 2014 SIPP panel estimates running 1 to 2 percentage points higher than the CPS and the 2008 panel running 2 to 3 percentage points lower in all but one month. CPS employment increases by 3.5 million between January and July and ends the year down 0.7 million from its peak. SIPP employment from the 2008 panel rises by 4.9 million from its January level to a peak in October but drops 1.7 million in the next month, reflecting the volatility arising from a reduced sample. SIPP employment from the 2014 panel increases by 4.5 million to a peak in August and ends the year 0.6 million below that peak. The 2014 panel estimates, as a percentage of the CPS estimates, are about a percentage point higher in the second half of the year than the first

TABLE 7-13 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Employment, by Month, 2013, Compared with Current Population Survey (CPS) Estimates

Month Thousands of Persons SIPP as Percentage of the CPS
Current Population Survey 2008 SIPP Panel* 2014 SIPP Panel 2008 SIPP Panel 2014 SIPP Panel
January 141,614 138,383 142,936 97.7 100.9
February 142,228 137,861 143,660 96.9 101.0
March 142,698 138,431 144,310 97.0 101.1
April 143,724 138,828 144,857 96.6 100.8
May 144,432 140,240 145,785 97.1 100.9
June 144,841 140,064 146,957 96.7 101.5
July 145,113 140,401 146,812 96.8 101.2
August 144,509 141,678 147,549 98.0 102.1
September 144,651 142,357 146,976 98.4 101.6
October 144,144 143,258 147,353 99.4 102.2
November 144,775 141,545 147,295 97.8 101.7
December 144,423 146,963 101.8

*Estimates after May are inflated by the ratio of 4 to the actual number of rotation groups surveyed in that month.

SOURCE: Panel generated with data from the Bureau of Labor Statistics, the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

half, but there is no trend beyond that. That the 2014 SIPP panel should perform so differently for employment than for SNAP or unemployment compensation may reflect the greater salience of employment, its greater visibility to all household members (potential respondents), and its highly positive image as an activity. Respondents, it appears, are not reluctant to report employment.

For waves 2 and later of the 2014 panel, the data collected for the early part of the reference period during the prior wave should eliminate the effects of recall bias on reported program participation during those months. Whether dependent interviewing—reminding respondents of program participation that they reported in the previous interview—reduces recall bias over the remaining months of the reference period remains to be seen. The panel completed its work months before usable wave 2 data became available and therefore was not able to assess the impact of the data collection design and editing strategy on this phenomenon.

The next section, Intrayear Dynamics, discusses data quality problems that are likely produced in part by recall problems.

INTRAYEAR DYNAMICS

While longitudinal studies may not dominate the SIPP literature, as demonstrated in Chapter 3, the ability to support analyses of short-term dynamics remains one of SIPP’s primary purposes. The SIPP web page describes the survey in the following terms:

The main objective of SIPP has been, and continues to be, to provide accurate and comprehensive information about the income and program participation of individuals and households in the United States. The survey’s mission is to provide a nationally representative sample for evaluating: (1) annual and sub-annual income dynamics; (2) movements into and out of government transfer programs; (3) family and social context of individuals and households; and (4) interactions among these items.

(U.S. Census Bureau, n.d.-a)

Several in the series of briefs issued with the release of the first public use file of 2014 SIPP data describe SIPP as “a nationally representative panel survey . . . that collects information on the short-term dynamics of employment, income, household composition, and eligibility and participation in government assistance programs” (Moore et al., 2017).

Arguably, the most significant limitation of the older SIPP design with respect to the measurement of short-term dynamics was the tendency for a disproportionate share of reported transitions to fall at the seam between interview waves—a seam effect or seam bias. Whereas 25 percent of transitions, on average, might be expected to fall between the fourth month of the previous wave reference period and the first month of the current wave reference period, this fraction for participation in many programs exceeded 80 percent and approached 100 percent for some programs. Table 7-14 reports the percentages of transitions into and out of selected programs, employment, and poverty that occurred at the seams between waves in the final 12 months of the 2008 SIPP panel.

With the earlier SIPP’s rotation group design, the seam months were distributed uniformly across calendar months, so that the seam effect in the underlying data was not evident in the distribution of transitions across calendar months. But the seam effect hampered longitudinal analysis, producing “heaping” at multiples of four months in distributions of spell length and spikes in transitions every 4 months. Because the seam effect was present in so many of the phenomena that SIPP measured, it generated

TABLE 7-14 Percentage of Transitions into and out of Selected Statuses Reported at the Seam Between Interview Waves for the Final 12 Months of the 2008 Survey of Income and Program Participation (SIPP) Panel

Status Transitions into Status Transitions out of Status
Receipt of Unemployment Compensation 45.8 59.4
Receipt of SNAP Benefits 79.3 59.5
Medicaid Coverage 90.3 94.4
Employment 51.8 42.1
Poverty 47.8 59.9

NOTE: SNAP = Supplemental Nutrition Assistance Program.

SOURCE: Panel generated with data from the 2008 SIPP panel (public-use data file).

false associations between program transitions and changes in a variety of other characteristics, complicating efforts to investigate the circumstances promoting transitions into and out of programs. In response, analysts often dropped reference months 1 to 3 from their longitudinal analyses or designed their analyses at the wave level. Another concern raised by the seam effect in the older SIPP is the occurrence of false transitions—potentially a significant fraction of all reported transitions. Whether due to a change in respondent between waves, misreporting by the same respondent in consecutive waves, or imputation using donors with seam effects embedded in their data, transitions recorded between waves were sometimes, and perhaps often, false. Compounding the problem, a misreported program status in one wave, corrected in the next wave, yielded two false transitions—one at each end of the misreported wave.
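
Readers who wish to reproduce a seam tabulation like that in Table 7-14 can follow the basic logic sketched below for a single person's monthly participation indicators. The data layout and the 4-month wave structure are assumptions made for illustration; this is not the panel's actual code.

```python
# Minimal sketch: fraction of month-to-month transitions that fall at a wave seam.
# Data layout and wave length are assumed for illustration.

def seam_share(monthly_status, months_per_wave=4):
    """Count transitions and seam transitions for one person.

    `monthly_status` is a list of 0/1 participation indicators in calendar order;
    a seam is the boundary between the last month of one wave and the first month
    of the next.
    """
    transitions = seam_transitions = 0
    for m in range(1, len(monthly_status)):
        if monthly_status[m] != monthly_status[m - 1]:
            transitions += 1
            if m % months_per_wave == 0:      # month m opens a new wave
                seam_transitions += 1
    return seam_transitions, transitions

# Hypothetical respondent: 12 months of data collected in 4-month waves
status = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1]
seams, total = seam_share(status)
print(f"{seams} of {total} transitions occur at a seam")
```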

A significant challenge in evaluating estimates of transitions—including documenting excess transitions—is the general absence of independent estimates that align with SIPP concepts and are sufficiently accurate to serve as benchmarks. Program administrative data could provide the best estimates of transitions into and out of programs if they were more often used for this purpose or were more readily accessible for research. However, very few programs are administered at the national level, and the federal government obtains data suitable for estimating transitions from a limited set of state-administered programs. Medicare and Social Security—the two major national programs—gain millions of new beneficiaries each year, but most of those who join remain beneficiaries for life. Exits due to death are not of great interest to most researchers studying program dynamics and for a variety of reasons are likely to be underestimated in surveys anyway. SSI is also a federal program, although states may supplement the federal benefits, but SSI is only about 7 percent of the size of Social Security, and transitions out of the program tend to be comparatively rare as well. Medicaid, which is comparable in size to Medicare, is administered at the state level, although the states are required to submit microdata files to the federal government. The Centers for Medicare & Medicaid Services now publishes statistics on new enrollments (transitions into the program), but these relatively recent data do not include program exits and did not begin until October 2013.

All of the other major transfer programs are state administered. The largest of these, SNAP, does not provide microdata to the federal government other than a quality control sample. Selected states have begun to submit microdata files to the Census Bureau, but these have not yet been used to generate statistics on transitions. The quality control sample data can be used to estimate new beneficiaries, and this report makes use of these data, but as a sample of active cases they contain no information on program exits. States also submit microdata from their unemployment insurance (UI) programs to the Department of Labor and to the Administration for Children and Families within the Department of Health and Human Services—and, for a special project, to the Census Bureau. Statistics on new beneficiaries are published by the Department of Labor, but the statistics on exits cover only those beneficiaries who have exhausted their benefits, not those whose benefits stopped because they obtained work or failed to comply with program requirements.

Moving beyond programs, transitions into and out of employment and into and out of poverty are also of great interest, but there are no administrative data that capture these transitions. The Department of Labor releases statistics on gross and net new jobs, but these do not translate cleanly into new employment, because job changes cannot be separated from all new jobs. Furthermore, there are no comparable data on transitions out of employment. New UI claims come close but exclude losses of employment that did not result in UI claims.

In sum, less is known than one might wish about how frequent the transitions measured in SIPP ought to be. Given that starting point, the study panel’s goals with respect to evaluating the measurement of short-term dynamics in the 2014 SIPP panel were very limited. By comparing the 2014 and 2008 panels with respect to their estimates of transitions for a small number of programs and economic statuses, then comparing both sets of estimates to administrative statistics when available (and, in one case, independent survey statistics), the panel sought to establish whether the 2014 panel appears capable of providing sufficiently credible estimates of transitions to support future research. Given the availability of just a single wave of data, the panel necessarily focused on intrayear dynamics.

Unemployment Compensation

Table 7-15 compares 2008 and 2014 SIPP panel estimates of new recipients of unemployment compensation with administrative estimates of first payments.18 The administrative data show a January seasonal spike that is nearly twice the average monthly level during the balance of the year. Except for a secondary spike in July, the monthly counts of first payments are relatively flat over most of the year before rising in December toward what was likely a January 2014 spike. Neither SIPP series replicates the pattern in the administrative data. The 2008 panel estimates in January and February are a little higher than the rest of the estimates for the year, which end in August, but they show nothing like the steep January-to-March decline seen in the administrative data.19 The first wave of the 2014 panel does not provide an estimate of new entrants in January, but with SIPP’s historic seam bias the estimate for January 2014 could very well match or exceed the spike in the administrative data, although not necessarily for all the right reasons. Table 7-9 showed that the 2014 panel understated the unemployment compensation caseload by a much greater margin early than late in the reference year, and this is reflected in Table 7-15 by transitions that start low in February, rise and flatten out over the next few months, and then rise more steeply at the end of the year. As a fraction of the administrative estimates, the 2014 SIPP estimates are roughly similar to the 2008 panel estimates from April through the end of that series in August. In the final 2 months of the year, the transitions in the 2014 panel, on average, match or exceed the administrative numbers. These high numbers raise the overall average relative to the administrative transitions to 67.1 percent, compared to 65.1 percent for the 2008 panel for a somewhat different set of months. As previously explained, there is reason to believe that the recall bias that affects the 2014 panel estimates of caseloads and transitions will

___________________

18 The Department of Labor also publishes monthly data on final payments, but these count only persons who have exhausted their benefit entitlements. Final payments do not include termination of benefits due to re-employment or failure to meet program requirements.

19 Because of the pronounced seam bias prevalent in the older SIPP panels, estimates of transitions between a given pair of calendar months are distributed very unevenly across rotation groups. The loss of an entire wave of data for one rotation group due to the federal government shutdown in October 2013 means that there are no data from the rotation group that would provide the seam transitions between May and June 2013. For that reason the panel has not produced a 2008 panel estimate of transitions between those months (which would have been reported in the month of June). Similarly, the end of data collection for the 2008 panel in the latter part of 2013 means that no seam transitions were obtained for the months of September through November, so no transitions are reported for those months as well. Seam transitions were obtained for the months of July and August; in those months the loss of the October interviews means that data were obtained for only two of the three off-seam transitions. The estimates reported for July and August incorporate an adjustment to compensate for the missing off-seam transitions.

TABLE 7-15 2014 and 2008 Survey of Income and Program Participation (SIPP) Panel Estimates of New Entrants to Unemployment Compensation Compared to Administrative Estimates of First Payments, by Month, 2013

Month Thousands of Persons SIPP as Percentage of Administrative Data
First Payments from Administrative Data 2008 SIPP Estimates of New UI Entrants 2014 SIPP Estimates of New UI Entrants 2008 SIPP Panel 2014 SIPP Panel
January 1,125 539 47.9
February 729 549 262 75.3 35.9
March 600 363 307 60.5 51.2
April 596 427 414 71.6 69.5
May 567 401 294 70.7 51.9
Junea 558 419 75.1
Julyb 740 447 415 60.4 56.1
Augustb 574 399 365 69.5 63.6
Septembera 478 263 55.0
Octobera 542 380 70.1
Novembera 546 664 121.6
Decembera 763 674 88.3
Average 652 446 405 65.1 67.1

aSeam transitions were not collected in the 2008 panel for this month, so no estimate of transitions is reported from that panel.

bThe 2008 panel estimates have been adjusted to compensate for the absence of off-seam data from one rotation group.

SOURCE: Panel generated with data from the Office of Unemployment Insurance, U.S. Department of Labor (available: https://oui.doleta.gov/unemploy/5159report.asp/), data from the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

be diminished in future waves, due to a combination of data collected for the first part of the reference period in the prior wave and the use of dependent interviewing. With this expectation, the findings for unemployment compensation suggest that the 2014 SIPP panel may ultimately capture transitions at least as well as, if not better than, the 2008 panel. However, as explained below, unemployment compensation is alone in this regard.

SNAP

SNAP is the only other program for which the study panel was able to assemble administrative estimates of transitions that could be aligned with SIPP estimates. The administrative numbers come from SNAP quality control (QC) sample data, which are compiled on a continuous basis from monthly samples of SNAP cases that are selected for detailed review in each state. An annual microdata file is produced from the data collected during these reviews. The estimates of transitions into SNAP that the panel compared to SIPP estimates are based on new certifications of cases that began to receive benefits in the month that they were sampled (recertifications are excluded). Because the estimates are based on samples, rather than complete counts, they are subject to sampling variability.

Table 7-16 compares 2008 and 2014 SIPP monthly estimates of SNAP entrants with the estimates from QC sample data. The estimates from the QC data exhibit no trend over the first 8 months of the year, and that is true for the 2008 SIPP estimates as well. With one exception, the 2008 SIPP estimates fall between 53 and 69 percent of the QC sample estimates. New certifications in the QC data show what may be a seasonal peak in the fall months. The 2014 SIPP estimates also peak in October but show a modest upward trend over the year, which reflects gradual improvement relative to the QC sample estimates. Nevertheless, the 2014 SIPP estimates remain well below the 2008 SIPP estimates in the share of actual new cases that they capture, averaging 28 percent over 11 months compared to 60 percent for the 2008 panel over 7 months. Thus the 2014 SIPP’s capture of new SNAP cases falls well short of the survey’s capture of new unemployment compensation recipients.

Employment

While there are no administrative estimates of transitions into and out of the state of being employed, the Bureau of Labor Statistics (BLS) uses matched samples of respondents to the monthly CPS labor force questionnaire to estimate gross flows into and out of employment, unemployment, and the labor force. As the source of the official monthly unemployment rate, labor force participation rate, and other labor statistics, the CPS provides high-quality cross-sectional estimates. The fact that the underlying estimates of labor force status are based on reports for the survey week makes the estimates of gross change less susceptible to recall error than those from either SIPP panel. In addition, the labor force questionnaire itself is very brief, and respondents are being asked the same questions that nearly three-quarters of them answered the previous month. The gross change estimates, the panel suggests, provide an available benchmark for

TABLE 7-16 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of New Entrants to the Supplemental Nutrition Assistance Program (SNAP) Compared to Quality Control Sample Estimates of Spells Starting the Prior Month, 2013

Month Thousands of Persons (All Ages) SIPP as Percentage of Quality Control Sample
New SNAP Certifications 2008 SIPP Estimates of SNAP Entrants 2014 SIPP Estimates of SNAP Entrants 2008 SIPP Panel 2014 SIPP Panel
January 2,649 1,411 53.3
February 2,369 1,516 678 64.0 28.6
March 2,173 1,410 541 64.9 24.9
April 2,236 1,306 336 58.4 15.0
May 2,311 1,452 456 62.8 19.7
Junea 2,450 882 36.0
Julyb 2,661 1,226 692 46.1 26.0
Augustb 2,426 1,680 850 69.2 35.0
Septembera 2,965 718 24.2
Octobera 3,140 1,061 33.8
Novembera 2,715 881 32.4
Decembera 2,235 692 31.0
Average 2,528 1,429 708 59.8 27.9

aSeam transitions were not collected in the 2008 panel for this month, so no estimate of transitions from that panel is reported.

bThe 2008 panel estimate has been adjusted to compensate for the absence of off-seam data from one rotation group.

SOURCE: Panel generated with data from the 2008 (public-use data file), 2014 SIPP (iteration 12) panels, and the 2013 SNAP Quality Control Sample.

assessing SIPP estimates of movement into and out of employment, even though they, too, are survey-based.

Table 7-17 compares 2008 and 2014 SIPP panel estimates of transitions into and out of employment to the BLS estimates based on the CPS. The 2008 panel estimates of transitions into employment range from 50 to 68 percent of the BLS estimates, with a mean of 61 percent, while the 2014 panel estimates run from 19 to 42 percent of the BLS estimates, with a mean of 29 percent. This result, with the 2014 panel estimating fewer than half as many transitions as the 2008 panel, is similar to what the panel found for transitions into SNAP. The study panel’s analysis also finds that the 2014 panel mirrors the BLS series in showing a marked increase in new

TABLE 7-17 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Estimates of Transitions into and out of Employment, Compared to Current Population Survey (CPS) Matched Estimates of Gross Flows (in thousands of persons), by Month, 2013

Month CPS Gross Flows 2008 Panel 2014 Panel SIPP as % of CPS Transitions to Employed SIPP as % of CPS Transitions to Not Employed
Transitions to Employed Transitions to Not Employed Transitions to Employed Transitions to Not Employed Transitions to Employed Transitions to Not Employed 2008 Panel 2014 Panel 2008 Panel 2014 Panel
January 5,147 6,635 3,420 3,488 66.4 52.6
February 6,246 5,665 3,093 3,412 1,209 510 49.5 19.4 60.2 9.0
March 5,755 5,302 3,376 2,683 1,325 614 58.7 23.0 50.6 11.6
April 5,976 5,018 3,334 2,776 1,449 894 55.8 24.2 55.3 17.8
May 6,177 5,516 4,174 2,698 1,969 997 67.6 31.9 48.9 18.1
Junea 6,589 6,232 2,738 1,489 41.6 23.9
Julyb 6,351 6,043 3,818 3,855 1,564 1,645 60.1 24.6 63.8 27.2
Augustb 6,546 7,166 4,405 3,764 2,287 1,478 67.3 34.9 52.5 20.6
Septembera 6,506 6,408 2,039 2,632 31.3 41.1
Octobera 5,538 6,077 1,807 1,419 32.6 23.4
Novembera 5,509 4,905 1,427 1,452 25.9 29.6
Decembera 5,061 5,453 1,247 1,573 24.6 28.8
Average 5,950 5,868 3,660 3,239 1,733 1,337 60.8 28.6 54.8 22.8

aSeam transitions were not collected in the 2008 panel for this month, so no estimates of transitions from that panel are reported.

bThe 2008 panel estimates have been adjusted to compensate for the absence of off-seam data from one rotation group.

SOURCE: Panel generated with data from the Bureau of Labor Statistics (available: https://www.bls.gov/webapps/legacy/cpsflowstab.htm), the 2008 (public-use data file), and 2014 SIPP (iteration 12) panels.

employment in June and higher estimates in August and September than the surrounding months, but these patterns are more pronounced in SIPP data than in the BLS data.20 The 2014 panel also shows a more substantial decline in new employment in the final months of the year than does the BLS series.

Both SIPP panels’ estimates of transitions out of employment run lower relative to the BLS estimates than do their estimates of transitions into employment. The 2008 panel estimates range from 49 to 64 percent of the BLS estimates, with a mean of 55 percent, while the 2014 panel estimates run from 9 to 41 percent of the BLS estimates, with a mean of 23 percent. Even more so than the transitions into employment, the 2014 panel estimates of transitions out of employment run much lower in the first part of the year than later. This suggests that respondents are more likely to forget (or to telescope to an earlier period) employment that ended early in the year than employment that began early in the year—presumably because the employment that began early in the year is still continuing or ended recently. The study panel notes as well that transitions out of employment have a substantial peak in September in the 2014 SIPP panel—consistent with students leaving summer employment to return to school—whereas the peak in the BLS series occurs in August. As with transitions into employment, this 1-month shift in timing may reflect the differences in how employment is measured in the two surveys.

Medicaid

There are only limited administrative data on transitions into the Medicaid program and no survey estimates comparable to what the CPS provides for employment transitions. However, like employment, Medicaid is characterized by sizable numbers of monthly transitions in both directions. Comparing the two panels with respect to estimates of these transitions gives us additional leverage for assessing the reporting of transitions in the 2014 panel, although the study panel lacks the basis for saying which panel’s estimates are more correct if they differ.

Table 7-18 compares the two panels with respect to transitions into and out of Medicaid. The results reflect even less well on the 2014 panel than did those for SNAP and employment. Over the months of February through May, the 2014 panel finds only 6 to 9 percent as many transitions into Medicaid as the 2008 panel and only 5 to 7 percent as many transitions

___________________

20 The BLS estimates are based on labor force status during a single week during the nominal survey month, whereas the SIPP estimates treat someone as employed if that person had a job for any part of the month. This difference in measurement may contribute to differences in the observed transitions in the two surveys.

TABLE 7-18 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Data Transitions into and out of Medicaid (in thousands of persons), by Month, 2013

Month 2008 Panel 2014 Panel Ratio of 2014 to 2008
Transitions into Medicaid Transitions out of Medicaid Transitions into Medicaid Transitions out of Medicaid Transitions into Medicaid Transitions out of Medicaid
January 2,221 2,440
February 2,389 2,391 224 119 0.09 0.05
March 2,196 2,333 191 104 0.09 0.04
April 2,405 2,211 137 149 0.06 0.07
May 2,308 2,385 150 146 0.06 0.06
Junea 326 271
Julyb 2,035 2,303 251 207 0.12 0.09
Augustb 2,304 2,429 417 256 0.18 0.11
Septembera 247 205
Octobera 381 240
Novembera 395 333
Decembera 318 213
Average 2,265 2,356 276 204 0.10 0.07

aSeam transitions were not collected in the 2008 panel for this month, so no estimates of transitions from that panel are reported.

bThe 2008 panel estimates have been adjusted to compensate for the absence of off-seam data from one rotation group.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.

out of Medicaid. The 2014 panel improves in July and August, with 12 to 18 percent as many transitions into Medicaid and 9 to 11 percent as many transitions out of Medicaid. On average, the 2014 panel estimates of transitions into Medicaid are only 10 percent of the 2008 panel estimates, while the 2014 panel estimates of transitions out of Medicaid are only 7 percent of the 2008 panel estimates.

To monitor the impact of the Affordable Care Act, the Centers for Medicare & Medicaid Services began publishing administrative counts of new eligibility determinations by month for Medicaid and for the Children’s Health Insurance Program. This series started in October 2013. For the final three months of 2013, these determinations of new eligibility averaged 2.2 million per month. These are strikingly consistent with the estimates from the 2008 panel, but they far exceed the highest estimates from the 2014 panel. Given that the two panels produced virtually identical estimates of overall Medicaid enrollment, these discrepancies are difficult to explain. One possibility is that 2014 panel respondents reported their Medicaid enrollment at the time of their interviews to the same degree of accuracy as 2008 panel respondents but rarely reported any change in that enrollment between then and the previous January. In our examination of monthly reporting of program participation, the panel found little change over the year in most programs, including Medicaid. In addition, Table 7-14 shows that the seam effect on Medicaid reporting in the 2008 panel was especially high, with 90 percent of transitions into Medicaid and 94 percent of transitions out of Medicaid being on the seam. Whatever accounts for the minimal reporting of within-wave Medicaid transitions in the 2008 panel may have been operational within the first wave of the 2014 panel as well, despite the much longer reference period.

Growth over the year in the estimated numbers of transitions in both directions in the 2014 panel is suggestive of recall bias as well, as the estimates from the 2008 panel are quite flat for the months that they are reported. As noted above, the panel did not observe much change in reported levels of Medicaid enrollment over the year, but that does not necessarily imply the same uniformity in the reporting of change in Medicaid enrollment.

Poverty

Similar to employment and Medicaid, the 2014 SIPP panel captures substantially fewer transitions into and out of poverty than the 2008 panel, although the 2014 panel does substantially better for poverty than for Medicaid. As with transitions into employment, the transitions into poverty are on average about half as numerous in the 2014 panel as in the 2008 panel (see Table 7-19). For exits from poverty, the transitions estimated by the 2014 panel are, on average, 64 percent as numerous as those estimated by the 2008 panel. This is better than for exits from employment, where the transitions estimated by the 2014 panel were well below half of those estimated by the 2008 panel, but it is still not good.

Variation in the frequency of monthly transitions estimated by the 2014 panel is much more substantial than in the 2008 panel. One factor is the short-month phenomenon (fewer days in the month) explained earlier. Whereas in most months the 2014 panel finds 2 to 3 million people becoming poor and a comparable number leaving poverty, the estimated entrances into poverty number 5.1 million in February and the exits—representing many of the same people, the panel suspects—number 4.8 million in March. Less dramatic upticks in poverty entrances are shown in the short months

TABLE 7-19 2008 and 2014 Survey of Income and Program Participation (SIPP) Panel Data Transitions into and out of Poverty (in thousands of persons), by Month, 2013

Month 2008 Panel 2014 Panel Ratio of 2014 to 2008
Entrances into Poverty Exits from Poverty Entrances into Poverty Exits from Poverty Entrances into Poverty Exits from Poverty
January 4,175 4,752
February 4,849 4,254 5,057 1,661 1.04 0.39
March 3,750 4,142 1,079 4,782 0.29 1.15
April 3,733 4,483 1,945 2,065 0.52 0.46
May 3,315 4,530 1,569 2,957 0.47 0.65
Junea 2,899 2,597
Julyb 4,350 4,680 2,184 2,736 0.50 0.58
Augustb 4,098 4,709 2,090 2,809 0.51 0.60
Septembera 3,174 3,044
Octobera 1,606 3,475
Novembera 2,976 2,024
Decembera 2,352 2,841
Average 4,039 4,507 2,448 2,817 0.56 0.64

aSeam transitions were not collected in the 2008 panel for this month, so no estimates of transitions from that panel are reported.

bThe 2008 panel estimates have been adjusted to compensate for the absence of off-seam data from one rotation group.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels.

of April, June, September, and November, with less pronounced increases in poverty exits the next month in each case. The March and October estimates of entrances appear to be depressed by the high numbers of entrances in the prior months. This makes sense in that the somewhat easier entry into measured poverty in a short month may bring in people who would otherwise have entered in the next month. Excluding February and March, the entrances estimated by the 2014 panel are 47 to 52 percent of the entrances estimated by the 2008 panel. Excluding the same two months, the 2014 panel estimates of poverty exits are 46 to 65 percent of those captured in the 2008 panel.

SIPP is the only data source that can provide estimates of monthly transitions into and out of poverty. Unlike employment (and, for transitions in one direction, unemployment compensation, SNAP, and Medicaid), there is no alternative source that can be consulted to determine which SIPP panel is more correct in its estimates of poverty transitions. The panel has noted concerns that the frequent interviews in the older SIPP could produce overstated transitions. However, the panel has seen no evidence of that in the limited comparisons to administrative records that it was able to perform. And while the results for unemployment compensation demonstrate that one must be careful in generalizing from findings for one source or status to another, the markedly fewer transitions estimated by the 2014 panel than by the 2008 panel for two major programs and two critical economic indicators provide compelling evidence that in the measurement of short-term dynamics, the 2014 panel falls well short of the 2008 panel.

Using the EHC to Capture Spells

As identified in Chapter 4, the EHC offers a potentially significant advantage over the standard questionnaire approach to capturing spells (critical for measuring intra-year dynamics) by allowing respondents and interviewers to move back and forth across programs and activities, reporting events that are linked in time. However, the study panel learned that this potential advantage is not fully realized because interviewers appear to use the EHC primarily in a sequential manner that does not take advantage of its flexibility. One possible explanation is that training is insufficient to give interviewers the direction and confidence to use the EHC to move back and forth between spells. The report discusses training in Chapter 9.

Second, an interviewer is asked to identify a spell in the EHC and then to immediately move out of the EHC to ask a series of supplemental questions using the standard screen. The study panel is concerned that the identification and interconnectedness of spells within and across timelines can easily be overlooked through this linear process. A secondary adverse consequence of the structure of the questionnaire is that each identified spell becomes a filter question that is immediately followed by supplementary questions. Respondents will learn that if they report a spell, a series of supplementary questions will be asked. This may discourage the reporting of additional spells. An alternative construction of the questionnaire would be to have the interviewer use the EHC to identify all spells up front. Only after all spells (across all programs/activities) were identified would the interviewer leave the EHC and ask the supplemental questions for programs/activities for which spells had been identified.
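
To make the alternative construction concrete, the sketch below contrasts the two interview orderings. It is purely illustrative: the functions and example timelines are hypothetical stand-ins, not the actual SIPP instrument or its software.

```python
# Illustrative sketch only: hypothetical data structures, not the SIPP instrument.

def identify_spells(timeline):
    """Stand-in for the EHC screen: each timeline already lists its spells."""
    return timeline["spells"]

def ask_supplemental_questions(spell):
    """Stand-in for the follow-up questions asked on the standard screens."""
    return {"spell": spell, "details": f"supplemental answers for {spell}"}

def current_flow(timelines):
    """Current design: leave the EHC for supplemental questions as soon as
    each spell is identified, one spell at a time."""
    answers = []
    for timeline in timelines:
        for spell in identify_spells(timeline):
            answers.append(ask_supplemental_questions(spell))
    return answers

def proposed_flow(timelines):
    """Alternative design: record all spells across all timelines first,
    review them together inside the EHC, then ask the supplemental questions."""
    spells = [s for t in timelines for s in identify_spells(t)]
    # cross-timeline review would happen here, before leaving the EHC
    return [ask_supplemental_questions(s) for s in spells]

timelines = [
    {"name": "employment", "spells": ["job at employer A (Jan-Jun)"]},
    {"name": "SNAP", "spells": ["SNAP receipt (Mar-Aug)"]},
]
# Same data are collected either way; only the ordering of the questions differs.
assert current_flow(timelines) == proposed_flow(timelines)
```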

A third issue with the way the EHC is constructed is that, because of space limitations, it does not include all programs and activities. As a result, there can be a real disconnect between spells of related activities and programs. For example, spells of unemployment, employment, and time out of the labor force should be cross-checked within the interview, but the fact that employment is captured within the EHC while unemployment compensation and time out of the labor force are captured through the standard screens makes cross-checking very difficult. The report previously discussed the confusion between participation in Old-Age, Survivors, and Disability Insurance and SSI (see Recommendation 5-3).

NONRESPONSE

Response rates to sample surveys have been declining for decades, and while federal surveys (primarily those conducted by the Census Bureau) have enjoyed higher response rates than surveys conducted by other entities—even federal contractors—the federal survey organizations have not been immune to this trend. Recent studies have documented an accelerating decline in response rates to federal surveys in the past few years (Atrostic et al., 2001; Brick and Williams, 2013; Czajka and Beyler, 2016; Williams and Brick, 2017). Households that participate in surveys have also shown an increasing reluctance to answer questions relating to income, a trend that has been documented for the CPS ASEC (Czajka and Beyler, 2016). Being launched during a period of increasing survey nonresponse only added to the challenges confronting the implementation of a fundamental redesign of SIPP. This section examines both unit and item nonresponse in the 2014 panel.

Unit Nonresponse by Wave

The 2014 SIPP panel has continued the decline in response rates observed over the previous three SIPP panels. Moreover, in keeping with recent trends in survey response rates, the decline in the wave 1 response rate between the first wave of the 2008 panel and the first wave of the 2014 panel exceeded the decline between the initial waves of the 2001 and 2008 panels. Between the 2001 and 2008 panels the initial wave response rate fell from 86.7 percent to 80.8 percent, a decline of 5.9 percentage points (see Table 7-20). The wave 1 response rate for the 2014 panel, 70.1 percent, was 10.7 percentage points lower than that of the 2008 panel. The wave 2 response rate—among households that responded to the first wave and remained eligible—was 74.2 percent, or nearly 18 percentage points lower than the wave 2 response rates for the previous three SIPP panels. The wave 3 response rate of 59.4 percent represented a wave-to-wave plunge unlike any seen in a recent SIPP panel, where the largest previous decline between consecutive waves was 4.4 percentage points, recorded between waves 15 and 16 of the 2008 panel. The next largest decline—of 4.2 percentage points—occurred between waves 3 and 4 of the 2008 panel and between waves 2 and 3 of the 2004 panel. The 2014 panel's wave 3 response rate of 59.4 percent is 2.4 percentage points below the wave 16 response rate to the 2008 panel.

TABLE 7-20 Response Rates and Cumulative Sample Loss by Wave, 2001-2014 Survey of Income and Program Participation Panels

Wave | Eligible Households | Interviewed Households | Response Rate (%) | Type A Nonresponse (%) | Type D Nonresponse (%) | Cumulative Sample Loss (%)

2014 Panel
1 | 42,348 | 29,685 | 70.1 | 29.9 | – | 29.9
2 | 31,085 | 23,060 | 74.2 | 21.9 | 3.9 | 48.0
3 | 32,119 | 19,090 | 59.4 | 33.7 | 6.9 | 58.3

2008 Panel
1 | 52,031 | 42,032 | 80.8 | 19.2 | – | 19.2
2 | 42,481 | 39,000 | 91.8 | 6.9 | 1.3 | 25.9
3 | 42,779 | 37,651 | 88.0 | 9.7 | 2.3 | 29.0
4 | 43,176 | 36,195 | 83.8 | 13.2 | 2.9 | 32.4
5 | 43,422 | 35,873 | 82.6 | 14.0 | 3.3 | 33.3
6 | 43,544 | 34,891 | 80.1 | 15.9 | 4.0 | 35.5
7 | 43,619 | 33,827 | 77.6 | 18.2 | 4.2 | 37.5
8 | 43,609 | 33,417 | 76.6 | 19.0 | 4.3 | 38.2
9 | 43,621 | 32,567 | 74.7 | 20.4 | 4.7 | 39.7
10 | 43,690 | 31,445 | 72.0 | 22.7 | 5.1 | 41.9
11 | 43,720 | 31,007 | 70.9 | 23.5 | 5.3 | 42.7
12 | 43,678 | 30,716 | 70.3 | 24.0 | 5.6 | 43.4
13 | 43,654 | 30,213 | 69.2 | 25.2 | 5.6 | 44.4
14 | 43,600 | 29,810 | 68.4 | 26.0 | 5.5 | 44.9
15 | 43,653 | 28,885 | 66.2 | 27.5 | 5.8 | 46.5
16 | 32,566 | 20,135 | 61.8 | 31.4 | 6.1 | 53.0

2004 Panel
1 | 51,363 | 43,711 | 85.1 | 14.9 | – | 14.9
2 | 44,150 | 40,587 | 91.9 | 6.6 | 1.4 | 21.9
3 | 44,614 | 39,117 | 87.7 | 9.9 | 2.5 | 25.5
4 | 44,930 | 38,309 | 85.3 | 11.6 | 3.1 | 27.6
5 | 45,350 | 37,446 | 82.6 | 13.7 | 3.7 | 29.8
6 | 45,638 | 36,931 | 80.9 | 15.0 | 4.1 | 31.2
7 | 45,688 | 36,289 | 79.4 | 16.1 | 4.5 | 32.5
8 | 45,684 | 35,966 | 78.7 | 16.1 | 5.2 | 33.1
9 | 21,296 | 16,587 | 77.9 | 16.9 | 5.2 | 34.0
10 | 21,342 | 16,235 | 76.1 | 18.5 | 5.3 | 35.5
11 | 21,347 | 15,894 | 74.5 | 19.7 | 5.7 | 36.9
12 | 21,332 | 15,952 | 74.8 | 18.9 | 6.4 | 36.6

2001 Panel
1 | 40,489 | 35,102 | 86.7 | 13.3 | – | 13.3
2 | 30,514 | 28,086 | 92.0 | 6.4 | 1.7 | 21.9
3 | 30,899 | 27,453 | 88.8 | 8.6 | 2.7 | 24.7
4 | 31,111 | 27,179 | 87.4 | 9.5 | 3.2 | 25.9
5 | 31,300 | 26,775 | 85.5 | 10.9 | 3.6 | 27.5
6 | 31,449 | 26,635 | 84.7 | 11.6 | 3.7 | 28.2
7 | 31,540 | 26,471 | 83.9 | 12.3 | 3.8 | 28.9
8 | 31,623 | 26,012 | 82.3 | 13.3 | 4.5 | 30.3
9 | 31,684 | 25,481 | 80.4 | 14.7 | 4.8 | 31.9

NOTE: Response, nonresponse, and sample loss rates are unweighted.

SOURCES: Panel generated with data as follows: for 2001, 2004, and 2008—Jason M. Fields unpublished tabulation. For 2014—James Treat (Nov. 21) memorandum “Sample Loss Rates for SIPP 1985 Through SIPP 2008 Panels” to Jason M. Fields, U.S. Census Bureau.

The cumulative impact of nonresponse is reflected in the sample loss rate, which is 100 percent minus the number of households responding to a given wave expressed as a percentage of the estimated number of households that would have been in the eligible sample for that wave had the response rate to every previous wave been 100 percent. The denominator for the sample loss calculation is the number of eligible households at wave 1 multiplied by a growth factor that accounts for eligible households splitting into two or more households due to original sample members moving out. These household splits are observed for households that remain as respondents but have to be estimated for households that do not. After wave 2 the cumulative sample loss rate for the 2014 panel stood at 48.0 percent, a loss rate that was not reached by the 2008 panel until after wave 15 and was never approached by the shorter 2001 and 2004 panels. After wave 3 the cumulative sample loss rate for the 2014 panel reached 58.3 percent, more than 5 percentage points beyond that of the final (16th) wave of the 2008 panel.
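
As an illustration of the arithmetic, the sketch below reproduces the 48.0 percent wave 2 sample loss for the 2014 panel from the counts in Table 7-20. The growth factor is not published with the table; the value used here (about 1.047) is simply the factor implied by the reported loss rate, so the example illustrates the calculation rather than providing an independent check.

```python
def sample_loss(interviewed, wave1_eligible, growth_factor):
    """Cumulative sample loss: the shortfall of interviewed households relative to
    the households that would have been in the eligible sample had every prior wave
    achieved a 100 percent response rate (wave 1 eligible count times a growth factor)."""
    expected = wave1_eligible * growth_factor
    return 100 * (1 - interviewed / expected)

# 2014 panel, wave 2 (Table 7-20): 23,060 interviewed; 42,348 eligible at wave 1.
# A growth factor of ~1.047 is the value implied by the published 48.0 percent loss.
print(round(sample_loss(23_060, 42_348, 1.047), 1))  # 48.0
```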

One concern expressed about the design of the 2014 panel was that the gap between interviews—anywhere from 9 to 16 months, with an average around 12 months—might make it more difficult to retain contact with SIPP households between waves. However, the concurrent growth in Internet usage has made it increasingly easy to locate individuals. The role of lost contacts in the decline in response rates between waves is reflected in the Type D nonresponse rate, which measures nonresponse by households that moved to an unknown address or moved to a known address more than 100 miles from the nearest field representative and could not be interviewed by telephone.21 Type D nonresponse has declined slightly over time and has become less and less important relative to Type A nonresponse, which includes refusals, inability to make contact, and inability to conduct an interview due to a language barrier (less important after wave 1). (The study panel thinks it is important to track these components of Type A nonresponse separately but did not have the data to do so.) Type D rates for the 2014 panel are indeed higher than among the three previous panels, even when one compares waves that reflect roughly the same gap in time, although the more frequent contact attempts with the older SIPP design mean that even similar gaps in time between waves are not strictly comparable in terms of the potential for losing contact. Wave 4 interviews in earlier panels, occurring 12 months after the wave 1 interviews, had a Type D nonresponse rate of 2.9 percent in the 2008 panel, compared to 3.9 percent for wave 2 of the 2014 panel. Wave 7 interviews, coming 12 months after wave 4 in earlier panels, had a Type D nonresponse rate of 4.2 percent for the 2008 panel, compared to 6.9 percent for wave 3 of the 2014 panel. Nevertheless, despite its growth relative to prior panels, Type D nonresponse was no more important as a factor in the overall nonresponse to waves 2 and 3 of the 2014 panel than it was for almost any wave of the prior three panels.

___________________

21 By definition, Type D nonresponse cannot occur in wave 1.
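
One way to see that last point is to express Type D nonresponse as a share of total (Type A plus Type D) nonresponse for roughly comparable waves, using the rates in Table 7-20. A minimal sketch of that calculation:

```python
def type_d_share(type_a, type_d):
    """Type D nonresponse as a percentage of all nonresponse at a given wave."""
    return 100 * type_d / (type_a + type_d)

# Rates (percent) taken from Table 7-20.
print(round(type_d_share(21.9, 3.9), 1))  # 2014 panel, wave 2: ~15.1
print(round(type_d_share(13.2, 2.9), 1))  # 2008 panel, wave 4: ~18.0
print(round(type_d_share(33.7, 6.9), 1))  # 2014 panel, wave 3: ~17.0
print(round(type_d_share(18.2, 4.2), 1))  # 2008 panel, wave 7: ~18.8
```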

In comparison with the redesigned SIPP, the Panel Study of Income Dynamics and the National Longitudinal Survey of Youth now collect data biennially, yet they have adopted techniques to keep their panel participants engaged. It may be helpful for the Census Bureau to consult with these organizations to help mitigate the large wave-to-wave losses that the redesigned SIPP is experiencing.

Unit Nonresponse to the Supplement

A supplemental survey, sponsored by the Social Security Administration (SSA), was added to the SIPP program between wave 1 and wave 2. This important client needed data from survey questions, primarily from topical modules, that had been dropped with the SIPP redesign. The supplement targeted all respondents from wave 1 and was implemented as a telephone survey with a fairly limited nonresponse follow-up. A telephone survey is generally expected to have lower response rates than a face-to-face survey, and staff from the Census Bureau and the SSA discussed those expectations. Table 7-21 shows that the unweighted response rate to the SSA supplement was 52.2 percent, where the eligible households constituting the denominator are those that responded to wave 1. Whether this approach to collecting supplemental data provides the quality needed by the client will be answered as SSA staff and other users proceed with their analyses and evaluations of the data. A related issue is whether the response rate achieved with the supplement will discourage prospective sponsors of future supplements (with similar designs and expectations of nonresponse).

TABLE 7-21 Response Rate to the Social Security Administration Supplement

Wave | Eligible Households | Interviewed Households | Response Rate (%) | Type A Nonresponse (%) | Type D Nonresponse (%) | Sample Loss (%)
1a | 29,685 | 15,498 | 52.2 | 47.8 | – | 47.8

NOTE: Response, nonresponse, and sample loss rates are unweighted.

SOURCE: Personal correspondence from Matthew Marlay and Cindy Easton, U.S. Census Bureau, to John Czajka, chair of the Panel on the Review and Evaluation of the 2014 Survey of Income and Program Participation Content and Design.

Item Nonresponse

Questions about income and assets routinely receive among the highest rates of nonresponse. This reflects a combination of the sensitivity of the questions and respondents’ uncertainty about the correct responses. The role of the latter is evident from the fact that nonresponse tends to be highest to questions that typically involve relatively small amounts that are received infrequently or only on paper—that is, not as cash payments. Thus, nonresponse to questions about interest received is much higher than nonresponse to questions about earnings, despite the seemingly greater sensitivity of the latter.

In an effort to reduce respondent burden and speed up the interviews, the Census Bureau had, in earlier panels, stopped repeating questions about monthly income from sources that tended to pay the same amount each month, instead imputing the amounts from the responses given at longer intervals. With its longer reference period, the 2014 panel collects the most recent monthly amounts and then asks only about changes to these amounts. Because of these different approaches, imputation rates cannot be compared between the 2014 and earlier SIPP panels for many sources of income.
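
The general logic of the 2014 approach can be sketched as follows. This is a simplified illustration of filling in 12 monthly amounts from the most recently reported amount plus any reported changes; it is not the Census Bureau's actual editing or imputation code, and the function and field names are hypothetical.

```python
def monthly_amounts(latest_amount, changes, months=12):
    """Fill monthly amounts by carrying the most recently reported amount across the
    year and then applying any reported changes (month number -> amount that begins
    in that month). Simplified illustration only."""
    amounts = [latest_amount] * months
    for month, new_amount in sorted(changes.items()):
        for m in range(month - 1, months):  # the new amount applies from that month on
            amounts[m] = new_amount
    return amounts

# Example: the most recent monthly amount is $750; the respondent reports the amount
# was $700 at the start of the year and rose to $750 beginning in July.
print(monthly_amounts(latest_amount=750, changes={1: 700, 7: 750}))
# [700, 700, 700, 700, 700, 700, 750, 750, 750, 750, 750, 750]
```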

Table 7-22 reports the percentage of annual dollars imputed for 17 income sources in the 2014 panel and 5 sources in the 2008 panel. The table also reports the percentage of dollars imputed for the 17 sources in the 2014 CPS ASEC.

TABLE 7-22 Proportion of Total Dollars of Income Imputed by Source: 2008 and 2014 Survey of Income and Program Participation (SIPP) Panels and 2014 Current Population Survey Annual Social and Economic Supplement (CPS ASEC)

Source of Income | 2008 SIPP | 2014 SIPP | 2014 CPS ASEC
Wages and Salaries | 0.171 | 0.201 | 0.375
Self-employment | 0.382 | 0.422 | 0.479
Interest | 0.514 | 0.651 | 0.759
Dividends | 0.216 | 0.586 | 0.672
Rent | (a) | 0.463 | 0.412
Social Security (adult) | (a) | 0.218 | 0.375
Railroad Retirement | (a) | 0.415 | 0.585
SSI | 0.070 | 0.269 | 0.296
Family Assistance | (a) | 0.094 | 0.341
Unemployment Compensation | (a) | 0.190 | 0.351
Workers’ Compensation | (a) | 0.292 | 0.340
Veterans’ Benefits | (a) | 0.159 | 0.380
SNAP | (a) | 0.151 | 0.274
Private Pensions | (a) | 0.179 | 0.396
Federal Pensions | (a) | 0.221 | 0.355
Military Pensions | (a) | 0.140 | 0.349
State and Local Pensions | (a) | 0.161 | 0.349

NOTE: SSI = Supplemental Security Income, SNAP = Supplemental Nutrition Assistance Program.

(a) Imputation rate is not comparable to the other surveys because it reflects logical imputations of monthly amounts that were not asked except in selected months in order to reduce burden.

SOURCE: Panel generated with data from the 2008 (public-use data file) and 2014 SIPP (iteration 12) panels, and the 2014 CPS ASEC.

For every one of the five sources with imputation rates reported in both SIPP panels, the imputation rate is lower in the 2008 panel. There are relatively small differences for wages and salaries and for self-employment but large differences for dividends and SSI. However, in comparison with the CPS ASEC, the 2014 SIPP panel has lower imputation rates for 16 of 17 sources. For many items the imputation rates in the CPS ASEC—the official source for estimates of household income and poverty statistics in the United States—are nearly twice as high as they are in the 2014 SIPP panel. For example, the imputation rate of 20.1 percent for wages and salaries in SIPP compares to 37.5 percent in the CPS ASEC. Retirement and most safety net programs exhibit similarly lower rates of imputation in SIPP. Rent is the one exception, with a higher imputation rate in SIPP than in the CPS ASEC, where the amount is combined with royalty income. Interest and dividends have the highest rates of imputation in both surveys, with rates of 65.1 and 58.6 percent, respectively, in SIPP and 75.9 and 67.2 percent in the CPS ASEC.
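
For reference, the statistic reported in Table 7-22 is the imputed share of dollars for each source. A minimal sketch of that calculation follows, with illustrative record fields (amount, weight, and an imputation flag) rather than actual SIPP variable names; whether and how the published rates are weighted is not stated here, so the use of weights is an assumption for generality.

```python
def share_of_dollars_imputed(records):
    """Weighted dollars flagged as imputed, as a proportion of all weighted dollars.
    Each record: (amount, weight, imputed_flag). Illustrative field layout only."""
    imputed = sum(amount * weight for amount, weight, flag in records if flag)
    total = sum(amount * weight for amount, weight, _ in records)
    return imputed / total if total else 0.0

records = [
    (12_000, 1.0, False),
    (8_000, 1.5, True),    # imputed
    (20_000, 0.8, False),
]
print(round(share_of_dollars_imputed(records), 3))  # 0.3 (12,000 of 40,000 weighted dollars)
```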

SUMMARY OF FINDINGS WITH RECOMMENDATIONS

Finding 7-1: Between 1990 and 2012, SIPP’s estimates of annual aggregate income for nearly every major source declined relative to NIPA totals. When sources are combined, SIPP’s share of the NIPA total fell from 86 percent to 73 percent.

Finding 7-2: Earnings and asset income (interest, dividends, and rent) are better reported in the 2014 panel than in the 2008 panel.

Finding 7-3: Differences in the estimates of transfer income between the 2008 and 2014 panels are mixed.

Finding 7-4: Pension income is less well reported in the 2014 panel than in the 2008 panel, but withdrawals from retirement accounts are more fully reported.

Finding 7-5: Reported health insurance coverage is virtually unchanged between the 2008 and 2014 SIPP panels.

Finding 7-6: Compared to the 2008 panel, the 2014 SIPP panel captures less income at the very bottom of the income distribution, as reflected in higher estimates of the monthly percentage of the population in families below 50 percent of the federal poverty threshold. Above that threshold and up to 200 percent of the poverty threshold, the 2014 panel finds a smaller percentage of the population than the 2008 panel, although the poverty rate (percentage of population below the threshold) in the 2014 panel remains higher than the poverty rate in the 2008 panel.

Finding 7-7: Apparent misreports of monthly or annual earnings as hourly wages by a small number of respondents nearly doubled the estimate of annual aggregate wage and salary income in the SIPP 2014 wave 1 data. On the public use file, these excess earnings are largely eliminated by topcoding, but they remain on the internal file.


Finding 7-8: While most income sources show little to no evidence of more complete reporting later than earlier in the reference period, unemployment compensation and SNAP display markedly lower reported participation at the beginning of the reference period than at the end. This produces an apparent rise in participation over the year that runs counter to the decline in participation from earlier to later in the same period documented in program administrative data.

Finding 7-9: Employment shows minimal evidence of recall bias and very high reporting in both panels, but with higher reporting in the 2014 panel.

Finding 7-10: Where comparisons of estimated transitions with administrative statistics or other suitable benchmarks are possible, the 2014 SIPP panel generally captures a smaller share of transitions than the 2008 panel. Transitions that occur early in the year are measured particularly poorly. Averaging over the year, unemployment compensation performs best among the income programs the study panel was able to evaluate. The 2014 SIPP panel’s estimates of transitions into unemployment compensation are comparable to the 2008 panel and close to administrative estimates in the final months of the 2013 reference year. For SNAP, the 2014 panel is well below the 2008 panel in the share of actual new cases that it captures, although this shows improvement over the reference year. For Medicaid, the 2014 panel’s estimates of transitions into and out of the program are just a small fraction of the transitions reported in the 2008 panel, which are consistent with administrative estimates for the final 3 months of the year. The 2014 panel’s estimates of transitions into and out of employment and poverty are about half as high as those obtained from the 2008 panel. For employment, the 2014 panel estimates are about a quarter as high as survey-based benchmarks.

Finding 7-11: As an artifact of the way that reported earnings are allocated to calendar months, the incidence of poverty tends to be higher in months with fewer than 31 days. Poverty spikes in February, with 5.1 million persons entering poverty in that month and 4.8 million exiting the next month.

Finding 7-12: Reflecting a secular trend of declining survey response rates, the wave 1 response rate for the 2014 SIPP panel was more than 10 percentage points below the wave 1 response rate of the 2008 panel. By wave 3 of the 2014 panel, the cumulative sample loss—58.3 percent—exceeded by 5 percentage points the sample loss at wave 16 of the 2008 panel.

Finding 7-13: For the small number of income sources where a comparison of imputation rates between the 2008 and 2014 SIPP panels is appropriate, the fraction of dollars imputed is higher in the 2014 panel and for two income sources markedly so. However, imputation rates for 16 of 17 sources are lower in the 2014 SIPP panel than in the 2014 CPS ASEC, with the rates in the latter being nearly twice as high in many cases.

Finding 7-14: Misreporting of the unit of measure caused outliers in the 2014 SIPP wage estimates. The study panel found approximately 20 instances of extremely high hourly wages and approximately the same number of extremely low amounts in annual earnings. Including these 40 observations doubled the annual wage and salary amount. The Census Bureau opted not to edit these reports but to instead use topcoding on the public use file to address the problem data. Thus the internal data and the public use data are inconsistent.

Finding 7-15: The study panel received data only from wave 1 of the 2014 SIPP and therefore was unable to evaluate seam effects in the redesigned survey. Limitations in the implementation of computer audio-recorded interviewing for the 2014 SIPP did not allow us to evaluate recordings from the EHC module of the questionnaire to look for potential causes of seam effects.

The panel was charged with comparing the previous SIPP with the redesigned SIPP and assessing the extent to which the new design improves upon, maintains, or underperforms the old design in terms of key variables. The findings listed above lead to the following conclusion:

CONCLUSION 7-1: The panel’s assessment of variables key to the Survey of Income and Program Participation’s (SIPP’s) purpose indicates that the redesigned SIPP sometimes outperforms, sometimes matches, and sometimes underperforms the older design; neither design is uniformly better. The new SIPP design outperforms the old design most notably with regard to reports of earnings and asset income (interest, dividends, and rent). The new SIPP design underperforms the old design in a number of areas that are particularly important to SIPP’s focus on poverty and within-year dynamics of low-income households. These areas include reported participation early in the calendar year, overall underreporting of income from programs such as Supplemental Security Income, family assistance (Temporary Assistance for Needy Families and General Assistance), unemployment compensation, and the Supplemental Nutrition Assistance Program, and underreporting of transitions in program participation.

The panel’s assessment illuminates a number of areas in which the redesigned SIPP underperforms, and the study panel believes this assessment can help the Census Bureau develop a roadmap of research and changes as part of the continuous improvement of SIPP over the coming years. The panel also offers some specific recommendations.

RECOMMENDATION 7-1: The Census Bureau should develop a research program to examine the effects of seam bias on the 2014 Survey of Income and Program Participation and should develop ways to mitigate these effects.

RECOMMENDATION 7-2: The Census Bureau needs to establish and document procedures to identify and handle outliers in the Survey of Income and Program Participation (SIPP) data. The procedures should include early identification during the interview process in order to correct erroneous data. The procedures should identify data that significantly influence key estimates and methods to address those influences. These outlier procedures should be applied to internal SIPP data, apart from any topcoding of the public use data.
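
As one illustration of the kind of screen Recommendation 7-2 envisions, the sketch below flags hourly wage reports whose implied annual earnings exceed a documented cap so they can be verified during the interview. The cap and work-schedule assumptions are placeholders, not values the study panel or the Census Bureau has endorsed.

```python
def flag_suspect_hourly_wage(hourly_wage, hours_per_week=40, weeks_per_year=52,
                             annual_cap=2_000_000):
    """Flag an hourly wage whose implied annual earnings exceed a documented cap.
    The cap and work-schedule assumptions here are placeholders for illustration."""
    implied_annual = hourly_wage * hours_per_week * weeks_per_year
    return implied_annual > annual_cap, implied_annual

# A monthly salary of $4,000 mistakenly entered in the hourly wage field:
suspect, implied = flag_suspect_hourly_wage(4_000)
print(suspect, f"${implied:,.0f}")  # True $8,320,000 -> review or re-ask the unit of measure
```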

RECOMMENDATION 7-3: The Census Bureau should conduct analyses to identify and address issues with recall bias in reporting income and program participation for months during the early part of the reference year. Such bias appears to be particularly evident for lower income groups.

RECOMMENDATION 7-4: The Census Bureau should conduct research to identify why the redesigned Survey of Income and Program Participation (SIPP) has difficulty relative to the earlier SIPP panels in accurately capturing income from households at the bottom of the income distribution.

RECOMMENDATION 7-5: Accurately measuring transitions is a critical function for the Survey of Income and Program Participation (SIPP). The Census Bureau should research the areas where the redesigned SIPP performs poorly relative to the 2008 SIPP in measuring these transitions, and should develop a plan to test and improve such measurements. This research should include a careful assessment of field representatives’ use of the event history calendar (EHC), and whether the EHC, as implemented, can be effective in assisting respondents’ recall of spells. The testing of a more integrated EHC should be a part of this research. The research should evaluate an option for restructuring the questionnaire so that all income and program participation spells are recorded before any follow-on questions are asked, with a mechanism to review cumulative spells before leaving the EHC. If the research to restructure the EHC does not lead to a marked improvement in the accurate collection of spells, the Census Bureau should explore a return to a shorter reference period.

RECOMMENDATION 7-6: The Census Bureau should address the “short month” discontinuity between income measurement and poverty measurement by handling “month size” similarly in both estimates.

RECOMMENDATION 7-7: The substantial decline in response between waves 2 and 3 of the 2014 Survey of Income and Program Participation panel raises concerns about attrition bias and the adequacy of the sample for multiyear longitudinal analysis. The Census Bureau should investigate factors contributing to the decline and whether there is evidence of attrition bias. This investigation should include the adequacy of weights in compensating for attrition.

RECOMMENDATION 7-8: The poor response rate to the Social Security Administration Supplement could discourage prospective sponsors of future supplements. The Census Bureau should conduct a nonresponse bias analysis to determine whether the low response rate affected the representativeness of the results and should investigate alternative approaches to collecting supplemental data that might provide better overall quality.
