also emphasized that even though these grants may not be appropriate for making comparisons across the two cohorts, collecting certain data and doing some sensitivity analysis could help give these comparisons more validity.

It was suggested that researchers could collect information on the work and wage histories of all former recipients from administrative data. Labor market histories could help quantify who is “worse off” in terms of past labor market experience and who is “better off.” It was also suggested that conducting sensitivity tests on how different economic conditions affect caseloads could help in making comparisons. Analyzing how the economy affects welfare exit rates and the rates at which former recipients become self-sufficient could help explain differences across the two study cohorts.
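
The kind of sensitivity test described above could be run with standard statistical tools. The sketch below is illustrative rather than part of the workshop discussion: it regresses quarterly welfare exit rates on local unemployment rates, and the data file and column names are hypothetical.

```python
# Illustrative only: a simple check of how sensitive welfare exit rates are to
# local economic conditions. The data file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

# Hypothetical panel: one row per county-quarter, with caseload counts and the
# local unemployment rate for that quarter.
df = pd.read_csv("county_quarterly_caseloads.csv")
df["exit_rate"] = df["cases_closed"] / df["cases_open_at_start"]

X = sm.add_constant(df[["unemployment_rate"]])
fit = sm.OLS(df["exit_rate"], X, missing="drop").fit()
print(fit.summary())

# Rerunning the model separately for each study cohort indicates how much of
# any difference in exit rates could be attributed to economic conditions at
# the time each sample was drawn.
```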

It was also suggested that states could study those who stay on welfare and, at the least, compare them to those who leave. This could be done within each of the two cohorts: stayers in the first cohort could be compared to leavers in the first cohort, and similarly for the second. If outcome definitions and measures for these groups are similar across states, the relative well-being of leavers and stayers can be compared across states.

Finally, workshop participants suggested that the 14 grants could be used in conjunction with other research efforts at the state or county level to help evaluate the new policies. Many states are conducting research on other groups (besides welfare leavers) who may be affected by the policy changes. Linking research efforts could make further evaluations possible.

DATA ISSUES FOR TRACKING WELFARE LEAVERS

One of the primary capacity-building purposes of the 14 state and county grants is to help states develop the know-how to collect data on the populations that may be affected by the policy changes. Discussion at the workshop covered overall data collection issues, administrative data, survey data, and, finally, how survey and administrative data can be used together.

Overall Data Issues

Who Does the Sample Population Represent?

One concern of workshop participants involved what population the drawn samples would represent. All 14 of the projects are drawing a sample of the cash assistance recipients who left the caseload in one of two time periods. As discussed above, because the caseload dynamics of each state vary over time, no state's sample of leavers will represent all leavers in that state; rather, it will represent only leavers at a given time. Workshop participants therefore emphasized that results of the studies be reported in the context of what is happening to the entire population of poor families in the state or county. For example, in reporting a result for those who leave welfare, it would be useful to know what proportion of the state's poor is on welfare and how that result compares with the outcomes of those who were poor but not on welfare at the time the sample of leavers was drawn.

Tools

Workshop participants also emphasized that state and county administrators should carefully consider the computer tools they choose to store, link, access, and process data. The number of administrative data sets most states are using and the number of years of data they hope to use will place significant demands on computing capacity. Because there is a public-use requirement attached to these grants and because confidentiality rules will need to be applied, the questions of who has access to the data and how they get access will also require much consideration. Workshop participants urged administrators to consider all the options carefully and to start early in getting confidentiality agreements in place.

Administrative Data

Discussion at the workshop highlighted some of the advantages and disadvantages of using administrative data. One disadvantage is the limited information available on why a person left welfare. Many states and counties plan to conduct subgroup comparisons of leavers by the reason for leaving, but many recipients leave by simply not showing up for their next monthly check. Unless such a person had reached a time limit, it is difficult to determine from administrative data why the person left. This gap is also a problem for studies that plan to oversample certain types of closed cases (e.g., sanctioned cases) in their surveys. Administrative records on earnings may suggest whether an individual would no longer have been eligible for welfare, but they cannot capture whether an individual married or moved in with other family members and no longer needed assistance.

Another problem is that state- or county-level administrative data cannot track those who leave the region or recipients who live and work in different counties or states. Furthermore, if a recipient works for the federal government, the administrative records will not appear in UI data sets. There will also be some former recipients whose employers are not covered under the UI reporting system and some whose employers do not report them as working in order to avoid unemployment insurance taxes.5

Several data sources were suggested to fill these gaps or at least to assess the extent of the problem. The expanded Federal Parent Locator Service (FPLS) was suggested as a way to overcome the geographic constraints of administrative data on earnings. The FPLS contains the new hires database, a national database that includes UI quarterly wage earnings data, UI compensation claims, and many federal agencies' reports of hires. The Federal Child Support Registry is another potential national-level data set, containing child support order information for all cases in the United States. To address the problem of UI underreporting, some workshop participants urged states and counties to consider other administrative sources from which earnings and income could be captured, such as tax returns; at least two states (Massachusetts and Wisconsin) have proposed to do so.

5. One estimate of this problem, using data from the Illinois Department of Employment Security, found that firms do not report 13.6 percent of their workers and 4.2 percent of their UI taxable wages to UI administrators (Blakemore et al., 1996).

Another problem with using multiple administrative databases is linking them to each other and to the survey data also being collected. A common identifier for each case is often not available. A recipient's Social Security number is the most commonly used identifier, but it is sometimes misreported or, in the case of immigrants and children of immigrants, replaced by a temporary identification number that may be program specific. Two methods were suggested as possible ways to solve this problem. Some states use a single identification code for each recipient, which is the same code used for all programs in which the individual may be enrolled; this number does not change, so if a recipient leaves a program but later returns, he or she is assigned the same number. Another possible solution is probabilistic record matching, based on several items of an individual's background (e.g., name, Social Security number, age, ethnicity, and other characteristics). A drawback of this method is that it can be difficult and expensive to get the technical assistance needed to do it well.
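
As a rough illustration of probabilistic record matching, the sketch below scores pairs of records on a few background items. The fields, weights, and threshold are assumptions made for the example, not a recommended production linkage system.

```python
# Minimal sketch of probabilistic record matching; weights and fields are
# illustrative assumptions only.
from difflib import SequenceMatcher

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Score the likelihood that two records refer to the same person."""
    score = 0.0
    # Exact agreement on Social Security number is the strongest evidence.
    if rec_a.get("ssn") and rec_a.get("ssn") == rec_b.get("ssn"):
        score += 0.6
    # Fuzzy name comparison tolerates typos and misreported spellings.
    name_sim = SequenceMatcher(None, rec_a.get("name", "").lower(),
                               rec_b.get("name", "").lower()).ratio()
    score += 0.25 * name_sim
    # Agreement on date of birth adds a smaller weight.
    if rec_a.get("dob") and rec_a.get("dob") == rec_b.get("dob"):
        score += 0.15
    return score

# Pairs scoring above a chosen threshold (e.g., 0.8) would be treated as
# matches; borderline scores would be reviewed by hand.
pair = ({"ssn": "123-45-6789", "name": "Jane Q. Doe", "dob": "1970-02-03"},
        {"ssn": "123-45-6789", "name": "Jane Doe",    "dob": "1970-02-03"})
print(match_score(*pair))
```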

Another problem highlighted at the workshop is that administrative data are often limited to information about the single household or family member who is the program participant, yet the outcome of interest is often a family-level measure, such as family income, or a measure about the children in the family. Tax revenue data, used in conjunction with information collected in surveys, could help address this problem.
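
One way to build a family-level measure from person-level records, sketched below under assumed file and column names, is to link individual earnings records to a household roster collected in the survey and sum earnings within families.

```python
# Illustrative sketch: rolling person-level administrative earnings up to a
# family-level income measure. File and column names are hypothetical.
import pandas as pd

earnings = pd.read_csv("ui_quarterly_earnings.csv")  # person-level wage records
roster = pd.read_csv("survey_household_roster.csv")  # survey: person_id -> family_id

family_income = (
    earnings.merge(roster, on="person_id", how="inner")
            .groupby(["family_id", "quarter"], as_index=False)["earnings"]
            .sum()
            .rename(columns={"earnings": "family_earnings"})
)
print(family_income.head())
```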

Workshop participants also discussed the idea of linking administrative records from schools, which contain information on attendance, test scores, and delinquency for the children of recipients. Participants were optimistic about this as a potential data source; however, these data sets are often not computerized and do not include children in private schools. Records from community resources that might be used by recipients and former recipients are another potential administrative source. Although there were questions about how much of these data are recorded, workshop participants agreed that at least listing these community resources would benefit further research.

Some workshop participants cautioned about systematic biases in administrative databases. One such bias, experienced in at least one state, is that administrative data sets tend to underreport data at the end of the year. Another bias participants discussed is that reports of income in administrative data sets differ from reports of income in surveys: surveys tend to capture more income for program participants than administrative data do. Because participants may become ineligible for programs, or may have their benefits reduced, if they report all their income, there is a disincentive to report income to caseworkers, one that for some participants may override the legal obligation to do so.

Workshop participants urged that efforts be made to improve the quality of both survey and administrative data. They discussed several actions that can be taken to make sure the data are clean and of high quality. The first is to make sure that those entering data into the system, often from paper records, understand that the data are important and that entering them correctly matters. Another is to randomly choose a sample of administrative records from the automated system and double-check their accuracy against the paper copies of the records. It was also suggested that if the sample sizes of the studies are relatively small, checking every record against the paper copy would not be unreasonable. A final suggestion is to develop computer algorithms to detect when data values fall outside reasonable ranges based on the rules of the programs that are the source of the data.
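
A minimal sketch of such a range-check algorithm appears below. The variable names and allowed ranges are placeholders; real checks would be derived from each program's eligibility and benefit rules.

```python
# Flag administrative records whose values fall outside plausible ranges.
# Ranges shown are illustrative placeholders, not actual program rules.
import pandas as pd

VALID_RANGES = {
    "monthly_benefit": (0, 2000),   # dollars; hypothetical ceiling
    "household_size": (1, 15),
    "age_of_head": (14, 110),
}

def flag_out_of_range(df: pd.DataFrame) -> pd.DataFrame:
    """Return rows with at least one value outside its allowed range."""
    mask = pd.Series(False, index=df.index)
    for col, (lo, hi) in VALID_RANGES.items():
        if col in df.columns:
            mask |= ~df[col].between(lo, hi)
    return df[mask]

records = pd.read_csv("assistance_cases.csv")  # hypothetical extract
print(flag_out_of_range(records))
```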

Despite these limitations of administrative data sets, workshop participants agreed that using them is valuable because they provide an inexpensive alternative to survey data. Administrative data sets can also serve as a cross-check for survey data. Workshop participants encouraged states and counties to collect as much information from administrative data sets as they can. Participants also urged that information collected not be limited only to cash assistance leavers, but also include current recipients and those using other benefit programs, such as Medicaid and food stamps.

Survey Data

Survey Design and Implementation

For all 14 projects, the sample population consists of those who have left welfare. Drawing a sample from this population is difficult because, once they have left, former recipients no longer appear in current caseload records.

Another potential problem, related to identifying the sample frame for a survey, involves the data collection mode. Because of cost considerations, most studies are relying on telephone surveys, which may miss many leavers who do not have telephone service. To address this problem, some studies are supplementing their telephone surveys with in-person interviews for those who cannot be reached by telephone. Another suggested method is to conduct an in-person visit with sample members who do not have telephone service and, during that visit, give the respondent a cellular telephone and a toll-free number so the respondent can later call the interviewer and complete the survey over the phone, thus avoiding a possible survey mode bias.

Response Rates

Several topics related to response rates for surveys were discussed at the workshop. How do you define a response rate? What is an appropriate response rate? How can a higher response rate be achieved? How can you tell if respondents are significantly different from nonrespondents? What can be done if the response rate is low?

How a response rate is defined was briefly discussed. One issue is whether those in the survey sample who were never located should be counted as nonrespondents or excluded from the response rate calculation altogether. Another is whether former recipients who had died since leaving welfare or who were institutionalized (incarcerated or in a health care facility) should be counted as nonrespondents. Participants noted that the survey research profession has standards for these definitions; the American Association for Public Opinion Research (1998) has recently published a guide to defining response rates that could be used as a reference.
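
To illustrate how much these definitional choices can matter, the sketch below computes a response rate with and without out-of-scope cases in the denominator. The case counts are invented for the example; the AAPOR (1998) guide cited above defines the standard case dispositions.

```python
# Invented example: how the denominator choice changes a reported response rate.
completed = 620
refusals = 130
never_located = 180
deceased_or_institutionalized = 70

# Including every sampled case in the denominator:
rr_all = completed / (completed + refusals + never_located + deceased_or_institutionalized)

# Excluding cases that are arguably out of scope (deceased or institutionalized):
rr_in_scope = completed / (completed + refusals + never_located)

print(f"All sampled cases:      {rr_all:.1%}")       # 62.0%
print(f"Excluding out-of-scope: {rr_in_scope:.1%}")  # 66.7%
```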

Another issue that received a great deal of attention at the workshop was what constitutes an appropriate response rate and how nonresponse bias can be detected. Participants were somewhat divided on what counts as a good response rate: some argued that a 60 percent response rate was adequate; others argued that the standard should be set at 80 percent. All participants were cautious about interpreting results from surveys with low response rates. Conducting sensitivity tests to detect nonresponse bias is very important. However, the standard method of detecting
