No single source can meet all demands for data needed to describe, monitor, and analyze alternative work arrangements (AWAs) in the U.S. labor market. Chapter 3 cited selected evidence, based on a wide range of data sources, to support recommendations for designing future iterations of the Contingent Worker Supplement (CWS). In this closing chapter, we give a brief overview of how other data sources have added to our understanding of the prevalence and characteristics of AWAs and the workers who engage in them—sometimes in ways that would not be possible or practical for a Current Population Survey (CPS) supplement.
The range of other data sources from which insights can be drawn includes household and business surveys, government administrative records, and commercial data. In some cases, these data sources allow researchers to measure dimensions of AWAs that are beyond the scope of the CWS, such as the relationship between work arrangements and health outcomes. In other cases, data sources such as administrative and commercial data may allow more accurate or detailed measurement of some aspects of AWAs, such as the prevalence of some forms of on-demand platform work.
Over the long term, there may be potential for linking the CWS with other data sources to enhance our understanding of AWAs. Continued exploration of this broader ecosystem of data sources will be an important strategy if we are to generate the most comprehensive statistical information about AWAs possible given budgetary constraints.
Chapter 3 already identified specific questions from non-CWS surveys that could be adapted for the CWS or at least provide insights into how to improve the CWS. Comparing results based on data from different household surveys can generate insights into the ways definitions, question wording, and respondent interpretations may affect estimates. In this section, we provide some examples of household surveys that can inform research on AWAs as well as others that, while focused on topics largely beyond the scope of the CWS, can complement our understanding of the nature of work. The goal here is to illustrate the importance to research and policy of being able to draw from multiple data sources; it is not to comprehensively catalog the data that could be used for this purpose.
For example, although worker safety is of significant policy interest because of its direct link to job quality and worker well-being, the CWS is not positioned to collect data on this outcome. One instrument that provides rich information on workplace safety is the Occupational Health Supplement to the National Health Interview Survey (NHIS), a major data collection program of the National Center for Health Statistics of the Centers for Disease Control and Prevention (CDC). This supplement, which so far has been fielded in 1988, 2010, and 2015, generates evidence of the impact of work schedules on health, among other things. The most obvious effects are on sleep quality and quantity, which in turn are linked to a wide range of outcomes. Impacts on exercise, diet, smoking, substance use, and work-life balance and conflict are also important considerations. One finding from this survey is that injury rates are distinctly higher among temporary help agency workers than among direct-hire employees; after adjusting for occupational differences, the injury rates of temp agency workers have been estimated to be about twice as high. Temp agency workers actually tend to have less frequent exposure to workplace health and safety hazards, but on average they also have less safety training and less experience for the jobs to which they are assigned (Fabiano et al., 2008).1
Another reason data sources beyond the CWS are needed is that some research questions require longitudinal estimates. Longitudinal datasets are needed, for example, to accurately measure people’s transitions into and out of different kinds of work, which is important for understanding the implications of these work arrangements and the underlying reasons people engage in them. Despite the strengths of longitudinal analysis, the opportunity to employ it is limited in the CPS. Household members are
observed for only 8 months over a 16-month period. Although the CWS has asked retrospective questions about workers’ entry into and exit from AWAs, as discussed in Chapter 3, there are concerns about the ability of respondents to answer such questions accurately. For data that span longer time periods, researchers typically rely on longitudinal surveys, such as the National Longitudinal Surveys (NLS), which are sponsored by the BLS.
The NLS have gathered information at regular time intervals on the labor market experiences and other significant life events of several nationally representative cohorts of men and women. For the past four decades, the NLSY79 has collected labor force information for a cohort of individuals who were ages 14 to 22 when they were first surveyed in 1979. These individuals, currently ages 54 to 62, were interviewed annually through 1994 and have been interviewed biennially since then.2 The NLSY97 has collected labor force information since 1997 for a cohort currently ages 34 to 39; this group, containing individuals who were ages 12 to 16 at the end of 1996, was interviewed annually through 2011 and has been interviewed biennially since then.3
From the beginning, the NLS interviews have attempted to identify whether workers are self-employed. The precise question sequence has varied somewhat over time, but since 2006 in the NLSY97, if workers do not say they are self-employed, they are asked if they are independent contractors, freelancers, or independent consultants. Wage and salary workers who are not independent contractors, freelancers, or independent consultants are asked if they are temporary help workers, on-call workers, or contract company workers (with questions akin to those in the CWS). Similar changes were made beginning in 2002 in the NLSY79 to better identify workers who are not in traditional wage and salary jobs.4 The longitudinal structure of the survey makes it possible for researchers to analyze transitions into and out of different employment arrangements—for example, when studying topics such as determinants of women’s entry into self-employment (Taniguchi, 2002) or the “relative importance of family financial and human capital in the transition into self-employment” (Dunn and Holtz-Eakin, 2000).
4 Changes included a clarification of what defines self-employment: “On the basis of answers to the job classification questions, the respondent is classified as self-employed if he or she owned at least 50 percent of the business, was the chief executive officer or principal managing partner of the business, or was supposed to file a form SE for federal income taxes. Respondents also are classified as self-employed if they identify themselves as independent contractors, independent consultants, or freelancers. A job is classified as nontraditional employment if the respondent is paid by a temp employment agency.” Available: https://www.nlsinfo.org/content/cohorts/nlsy79/topical-guide/employment/class-worker.
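The branching sequence just described can be summarized as a small decision rule. The sketch below is a loose paraphrase assuming hypothetical response fields (`self_employed`, `independent_contractor`, and so on); the actual NLSY instruments use different variable names and more elaborate skip logic.

```python
def classify_work_arrangement(resp):
    """Sketch of the post-2006 NLSY97 branching described above.
    Field names are hypothetical stand-ins, not actual survey variables."""
    if resp.get("self_employed"):
        return "self-employed"
    # Workers who do not report self-employment are asked whether they are
    # independent contractors, freelancers, or independent consultants.
    if resp.get("independent_contractor"):
        return "independent contractor"
    # Remaining wage and salary workers receive CWS-style probes.
    for key, label in [("temp_agency", "temporary help worker"),
                       ("on_call", "on-call worker"),
                       ("contract_company", "contract company worker")]:
        if resp.get(key):
            return label
    return "traditional wage and salary"
```

A respondent answering no to every probe ends up classified as a traditional wage and salary worker, mirroring the survey's residual category.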
The Survey of Income and Program Participation (SIPP), the statistical system’s premier source of information on individuals participating in government assistance programs, is another example of a dataset that allows career histories to be followed—something that, again, is largely outside the scope of the CWS. Beginning with the 2014 panel, the SIPP interview structure was changed so that households are interviewed annually for 4 consecutive years.
The SIPP asks several questions of those who identify as employed and, in particular, collects information about what might be considered “informal” work. The 2014 SIPP asks people who reported performing work for pay the following: “Was that for an employer, self-employed or did you have some other arrangement? Other arrangements include odd jobs, on-call work, day labor, one-time jobs and informal arrangements like babysitting, lawn mowing or leaf raking for neighbors.” Answers are recorded as self-employed, employee, or other arrangement. Drawbacks to this question include its 1-year reference period, which again is likely to result in recall error, as well as the lack of detail regarding both self-employment and “other” arrangements.
A final example of a complementary household survey data source is the Quality of Worklife module, included in the 2002, 2006, 2010, and 2014 editions of the General Social Survey (GSS). The GSS is a biennial, nationally representative, personal interview survey of U.S. households conducted by the National Opinion Research Center and funded by the National Science Foundation. In addition to questions about earnings and benefits, the GSS module asks respondents how often they are allowed to change schedules and how often they are allowed to change their starting and quitting times on a daily basis.5 As discussed in Henley and Lambert (2014),6 the GSS also includes questions about unpredictable and unstable scheduling. Instead of asking about usual hours, the GSS asks how many hours respondents worked during the week prior to the survey. These questions have allowed researchers to study the impacts of these job characteristics on such outcomes as work-family conflict and work stress (Golden, 2015).
Data supplied by businesses can provide important information for policy makers and researchers that is largely complementary to that captured in household surveys such as the CPS and CWS. Key areas where business data may be called on to fill gaps include contract company work and the use of temporary help services. These work categories were identified in Chapter 3 as areas where household surveys are deficient because respondents have difficulty reporting reliably on the contract arrangements of their employers.
5 Available: http://gss.norc.org/Pages/quality-of-worklife.aspx.
Experience from various household surveys suggests that additional probing may be needed so that independent contractors can correctly identify their status. Reporting about subcontracting relationships may be especially difficult for household survey respondents. Bernhardt, Spiller, and Theodore (2013) attempted to identify subcontracted jobs in their in-depth, long-form survey of low-wage workers, but they abandoned the effort because of workers’ inability to accurately identify whether their employer was a contractor or not. As discussed in Chapter 3, the CWS currently attempts to measure only a subset of subcontracted work (on-site, for one client), but the potential universe of subcontracted work arrangements is much broader and more varied. Subcontracted workers may work off-site, at multiple sites, for multiple clients, or with names on their paychecks that they do not recognize or that do not match who they think their employer is.
Establishment and firm-level surveys can provide an important source for measuring complex AWAs, such as subcontracting, that workers understandably have a difficult time identifying correctly. Firm-to-firm contracting arrangements for services within the United States—variously called subcontracting, fissuring, or domestic outsourcing—appear to be increasingly common in a wide range of industries. These arrangements likely affect many more workers than has been recognized, ranging from low-wage service workers such as janitors, security guards, warehouse workers, and hotel housekeepers to professional and technical workers such as programmers, health care technicians, and accountants (Bernhardt et al., 2016; Weil, 2014).
In response, researchers have harnessed a variety of datasets that include establishment- and firm-level data to begin to measure the prevalence of subcontracting (Dorn, Schmieder, and Spletzer, 2018; Goldschmidt and Schmieder, 2017). For example, the share of payroll employment in professional and business services, a sector composed primarily of contractor companies, nearly doubled from 1980 to 2016, rising from 7.3 to 13.9 percent (Bernhardt et al., 2016). Similarly, Yuskavage, Strassner, and Medeiros (2008) report that the share of gross domestic product (GDP) accounted for by domestic providers of outsourcing services—which they defined as purchased services excluding telecommunications and financial services—rose from 7 to 12 percent between 1982 and 2006. And Goldschmidt and Schmieder (2017) show that the outsourcing of cleaning, food, security, and logistics services accounts for a sizable share of the growth in wage inequality in Germany since the 1980s. Because subcontracting inherently involves two (or more) firms, the long-term promise
of this emerging research is the ability to identify and link the industry and firm characteristics of both user firms and contractor firms in order to study the impact of contracting on job quality outcomes such as wages, benefits, and other working conditions.
Bernhardt and Houseman (2017), Dey, Houseman, and Polivka (2010), and Foster and others (2019) provide detailed overviews and analysis of firm- and establishment-level surveys and potential measures of subcontracting. A comprehensive menu of surveys and recommendations for each lies beyond the scope of this chapter; here we briefly highlight several key opportunities afforded by existing business surveys.
Many of the efforts within the federal statistical system to collect data from U.S. businesses are spearheaded by the Census Bureau, which conducts a wide range of business surveys, including the quinquennial economic censuses, annual economic surveys, and quarterly and monthly indicator surveys. The full suite of annual surveys is currently being reengineered, which may create opportunities to implement changes that better capture trends and motivations regarding AWAs from the employer’s perspective.7
For example, as suggested by a recent CNSTAT panel, the annual surveys could be expanded to more fully capture firms’ expenditures on, and use of, contracted services. Currently only aggregate categories are used, such as transportation and warehousing services, or professional and technical services. But these are the areas where growth in contracting out is concentrated (Yuskavage, Strassner, and Medeiros, 2008). Berlingieri (2014) finds that sectoral reallocation toward the outsourcing of professional and business services accounts for a very high percentage of increases in service sector employment and a corresponding decrease in manufacturing employment. Greater detail in expenditure data would allow better estimation and tracking of the scale, scope, and growth of contracting out as a firm-level practice, especially since Economic Census data are the primary source of BEA’s input-output tables, which could be leveraged to measure inter-firm contracting (see Bernhardt and Houseman, 2017).
7 See National Academies of Sciences, Engineering, and Medicine (NASEM) (2018), which covers the scope, operation, and major uses of the following economic surveys conducted by the Census Bureau, listed here by topic area. Manufacturing: Annual Survey of Manufactures (ASM); Manufacturers’ Unfilled Orders Survey (M3UFO); Management and Organizational Practices Survey (MOPS). Trade: Annual Retail Trade Survey (ARTS); Annual Wholesale Trade Survey (AWTS). Services: Service Annual Survey (SAS). Multisector: Annual Capital Expenditures Survey (ACES); Information and Communication Technology Survey (ICTS). Demographic: Annual Survey of Entrepreneurs (ASE). Other surveys related to the Business Register (Sampling Frame): Business and Professional Classification Survey (SQ-CLASS); Company Organization Survey (COS). The NASEM (2018) report also provides a framework for redesign of the annual surveys.
The Annual Survey of Entrepreneurs (ASE) and the Annual Business Survey (ABS) are examples of Census firm-level surveys that shed light on businesses’ use of various work arrangements. The ASE, which was administered in 2014 through 2016, and the ABS, which has been administered annually since 2017, ask firms if they use each of six categories of workers: full-time employees; part-time employees; temporary agency workers; day laborers; workers from professional employer organizations (PEOs); and contract, subcontracted, independent contractors, or outside consultants. A 2015 module to the ASE also asked firms to report the percentage of their workforce in each of these arrangements and the functions performed by each type of worker. Brown, Earle, and Lee (2019) use evidence from the employer-provided data to ask, “Who hires nonstandard labor?” with nonstandard defined as workers who are not on the firm’s payroll. The resulting firm-based data allowed these researchers to determine, for example, that contract work is the most common type of nonstandard work; that the fraction of firms using contract workers is around 30 percent; that young firms are more likely to use nonstandard workers; and that larger firms tend to hire more temporary workers and contractors than do smaller firms. These types of insights into the use of AWAs require employer-based data. Questions like those on the 2015 ASE module could usefully be included in future business surveys.
The BLS also produces relevant statistics collected through employer surveys. In addition to the payroll employment statistics already mentioned, the BLS conducts the Survey of Occupational Injuries and Illnesses (SOII), which produces workforce health and safety measures. Although the injuries and illnesses suffered by subcontracted workers should in principle be counted in the survey, they unfortunately are not separately identified in the data.8 Since 2011, however, the Census of Fatal Occupational Injuries (CFOI) has collected information on contractor status, including the industry of the firm for which a job was performed and the industry of the contractor firm. Although these data are limited to the most serious safety incidents faced by subcontracted workers, they could be used to investigate long-standing concerns that outsourcing, on average, may lead to higher rates of workplace death.
8 The information reported on the SOII is drawn from OSHA Form 300. The OSHA form states: “The employer must also record injuries and illnesses that occur to workers who are not on the employer’s payroll if the employer supervises these workers on a day-to-day basis (including employees of temporary help services, employee leasing services, personnel supply services and contractors).” There is evidence that employers are often confused by these instructions and do not report correctly. See, for example, the September 2016 issue of Monthly Labor Review at https://www.bls.gov/opub/mlr/2016/article/an-update-on-soiiundercount-research-activities.htm.
An example of a nongovernment survey that illuminates an aspect of contract company work is the American Staffing Association (ASA) Staffing Employment and Sales Survey (SESS). This survey,9 conducted on a quarterly basis since 1992, collects information from staffing firms to estimate temporary and contract staffing industry employment, sales, and payroll. In a presentation to the panel (described in Appendix B to this report), Steve Berchem defined staffing companies as those that are employers of temporary and contract workers. A key indicator derived from the survey is the ASA Staffing Index, a weekly measure of changes in employment by staffing firms. Berchem reports that the index closely tracks and serves as a leading indicator of GDP. For example, in July 2009, at the end of the recession, the staffing index began to tick up, revealing the beginning of the subsequent expansion ahead of many other indicators.
In sum, establishment- and firm-level surveys are an underexplored source of data on AWAs. Especially in the case of subcontracted work, these surveys have the potential to yield vital information about the prevalence and nature of firms’ contracting-out activities. Similarly, as discussed next, tax data are an important complementary data source to the CWS.
While household and business surveys will continue to provide critical information on the changing nature of employment arrangements in the U.S. economy, surveys are limited to questions to which respondents know the answer and that they can answer relatively easily. Moreover, survey response rates, including in government surveys, have been falling, raising concerns about costs and the ability of the statistical system to rely solely on survey data in the future. In this context, nonsurvey data sources are being increasingly called on to fill the void. As has been documented in numerous reports—most recently and prominently by the Commission on Evidence-Based Policymaking (2017)—the use of administrative data can improve the overall efficiency of data programs by reducing agency expenditures, lowering respondent burden, encouraging the sharing of information across agencies, and potentially increasing the accuracy of the information collected.10
9 The SESS is a stratified sample of about 100 companies of various sizes (encompassing about 10,000 establishments) that uses the Economic Census as a benchmark. The survey is now a web-based instrument of no more than seven questions; employment, sales, and payroll are measured from quarter to quarter within each company-size stratum. The SESS used to take about 20-25 minutes, but the number of questions has been reduced so that it now takes about 10 to 12 minutes to complete. The SESS is based on the methods of the BLS’s establishment surveys—for example, even collecting data for the week containing the 12th day of the month.
Government administrative tax data are a key example of sources of information on AWAs that can complement the CWS. Researchers have recently focused on using tax data for measuring and tracking the scale and scope of independent contracting, including the use of on-demand labor platforms (Collins et al., 2019; Jackson, Looney, and Ramnath, 2017; Lim et al., 2019).
Tax data have several advantages over survey data. They offer a relatively clear delineation between employees (measured by W-2 income) and independent contractors (measured by sole proprietor income and/or 1099 income). The 1099-K form can be used to identify work for on-demand labor platforms, which is of particular public policy interest. Tax data can also capture sources of income that some workers might not report in surveys such as the CWS.11 As a result, tax data are valuable for estimating the prevalence of all sources of paid income and, importantly, for understanding how these income sources (including from on-demand labor platforms) are combined by workers.
The above-described features of tax data have allowed researchers to generate novel findings that would be difficult to generate using household survey data alone. For example, Lim and colleagues (2019) use tax data to study trends in independent contracting. They define independent contractors as tax filers who earned income reported on a form 1099-MISC or 1099-K and had less than $10,000 in non-car, non-travel-expense deductions (if filing as an individual) or less than $10,000 in total deductions (if filing as a business). Although the largest share of independent contractors are people whose Form 1099 earnings supplement their wage and salary incomes, until quite recently the rate of growth in independent contracting was most rapid among those for whom those earnings are the primary source of labor income.
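As a rough illustration, the Lim and colleagues screen amounts to a simple filter over tax records. The sketch below uses hypothetical field names and omits additional filters applied in the actual study, which works with IRS microdata.

```python
def is_independent_contractor(filer):
    """Simplified sketch of the Lim et al. (2019) screen described above.
    Field names are hypothetical; the actual study applies further filters."""
    has_1099_income = filer["income_1099_misc"] > 0 or filer["income_1099_k"] > 0
    if not has_1099_income:
        return False
    if filer["files_as_business"]:
        # Business filers: cap applies to total deductions.
        return filer["total_deductions"] < 10_000
    # Individual filers: cap applies to deductions other than car and travel.
    return filer["total_deductions"] - filer["car_travel_deductions"] < 10_000
```

The deduction caps serve to exclude filers whose 1099 income reflects a substantial business operation rather than labor income from contracting.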
10 The Evidence-Based Policymaking Commission Act of 2016 defines administrative data as data that are “(1) held by an agency or contractor or grantee of an agency (including a State or unit of local government); and (2) collected for other than statistical purposes” (Commission on Evidence-Based Policymaking, 2017, p. 9). Unlike survey data collected specifically for statistical purposes, administrative data are typically collected in support of an agency’s or other organization’s routine program operations.
11 Presentations to the panel by Dmitri Koustas and by Mike Udell and Diane Lim, summarized in Appendix B, include a detailed description of the use of tax data in research—for example, to capture payments by firms to unincorporated individuals for nonemployee services.
Collins and colleagues (2019) examine the universe of tax returns “to reconcile seemingly contradictory facts about the rise of alternative work arrangements in the United States.” They find that, among the tax-paying workforce—defined as workers who receive a W-2 or a form 1099 and file a form 1040, or receive a W-2 and do not file—expansion of 1099 work since 2013 has been driven almost exclusively by online platform work.12 They also find that, for most people engaged in online platform work, that work is secondary, generating income supplementary to a primary W-2 job.13 At the same time, most of the work done for platform companies is accounted for by the smaller group of people who do it full time and as their primary source of income. While the research of Lim and colleagues emphasizes results for a longer post-2000 period, the focus of Collins and colleagues is on the past few years, during which patterns of independent contracting work have changed significantly.
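The primary-versus-supplementary distinction can be made concrete with a toy calculation. The comparison rule and data layout below are illustrative assumptions for exposition, not the paper's exact definitions.

```python
def platform_income_role(w2_wages, platform_income):
    """Classify a worker's online platform earnings as primary or supplementary
    labor income, in the spirit of Collins et al. (2019). The simple
    larger-source rule here is an illustrative assumption."""
    if platform_income <= 0:
        return "none"
    return "primary" if platform_income > w2_wages else "supplementary"


def platform_dollars_by_role(workers):
    """Total platform dollars earned by primary vs. supplementary earners,
    showing how most *workers* can be supplementary even while most *work*
    (measured in dollars) is done by primary earners."""
    totals = {"primary": 0.0, "supplementary": 0.0}
    for w2_wages, platform_income in workers:
        role = platform_income_role(w2_wages, platform_income)
        if role != "none":
            totals[role] += platform_income
    return totals
```

On toy data with three supplementary earners ($2,000 of platform income each alongside W-2 jobs) and one full-time platform worker ($30,000, no W-2), three-quarters of the workers are supplementary earners, yet more than 80 percent of the platform dollars accrue to the primary earner.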
Accurate information of this kind is difficult to attain using a household survey, where W-2 workers often neglect to report supplementary independent contractor income (Abraham et al., 2020; Abraham et al., forthcoming). Thus, even these early tax-based studies are contributing significantly to our understanding of trends in independent contracting, and online platform work in particular. Going forward, there is significant potential to learn more about how workers use independent contracting—for example, whether over the life course or to manage career disruptions. There is also potential to learn about how firms use independent contracting, for example, whether it is to supplement or substitute for their W-2 workforce, and how firms’ use varies over the business cycle.
That said, tax data have their own shortcomings. While tax data may better measure the prevalence of independent contracting, they still do not capture the full universe of such activity. For example, they do not measure off-the-books and other informal work. Nor is measurement of sole proprietor income using tax data free from error, due to underreporting of income and/or overreporting of expenses (IRS, 2019; Slemrod, 2018). Additionally, the 1099-K form will likely be an unstable source for measuring on-demand platform work because of its current high reporting threshold of $20,000 in annual gross income and 200 transactions.14 In addition, tax data are available only on an annual basis, and when multiple sources of income are reported in tax filings, one cannot tell whether the work relationships were held at the same time (and so reflect multiple job holding) or were held sequentially.
12 They find that “the share of the workforce with income from alternative, non-employee work arrangements has grown by 1.9 percentage points of the workforce from 2000 to 2016. More than half of this increase occurred over 2013 to 2016 and can be attributed almost entirely to dramatic growth in work mediated through online labor platforms.”
13 Collins et al. (2019) find that approximately 44 percent of the overall growth in the 1099 economy comes from people who do not file self-employment taxes. Examining the relationship between 1099s and self-employment tax records more generally, they find that the previously documented increases in self-employment tax filings since 2007 are largely driven by workers without 1099s.
In conclusion, recent research has demonstrated the utility of access to confidential tax data for measuring the prevalence of independent contracting activity and, in particular, how that activity is combined with W-2 work. As such, tax data should be considered an important complementary data source to the CWS for AWA measurement purposes; they can also serve as an external source of information to help refine CWS questions in the future. However, the quality, and therefore the value, of tax data are determined by tax rules and by the extent of taxpayer compliance. An important example is the 1099-K form: were its filing threshold significantly lowered, it could serve as an accurate source of data on the prevalence of on-demand platform work over time, but at the existing threshold it provides incomplete information and thus an imperfect picture.
In addition to tax data, a number of researchers have begun using financial and other commercial data to reveal aspects of AWAs that are often not well understood. Koustas (2019) analyzes the income, spending, and liquid assets of rideshare drivers using personal financial service data;15 Hall and Krueger (2018) study pricing in the rideshare industry using Uber data; and Parrott and Reich (2018) use Uber data to determine, among other things, that about two-thirds of rideshare drivers in New York City do this work full time.
Recent work from the JPMorgan Chase Institute (JPMCI) illustrates how administrative (commercial) datasets can contribute value in measuring labor market trends. Using JPMCI financial accounts data, Farrell, Greig, and Hamoudi (2018) estimate platform participation among all families regardless of labor force status, and expose the fact that people are particularly likely to turn to platforms for income when they are between jobs or when their income from other sources dips. The researchers find that platform work is a supplementary source of income for most people, suggesting that many workers engaged in this form of AWA may also hold more traditional jobs. Indeed, it is the flexible nature of contingent work
that may make it possible for individuals to fit it in around their traditional work schedules. Some student-age individuals (ages 18 to 24) and older adults (65+), who might self-identify as not in the labor force, nevertheless generate income on platforms. Put differently, certain forms of work, such as platform work, might not fit into traditional concepts of labor force participation. Thus, conditioning our understanding on labor force participation might underestimate the share of the population engaged in contingent and alternative work arrangements.
As discussed in Chapter 3, defining the reference period as a week may miss many individuals who engage in work activities only sporadically, resulting in undercounting of contingent and alternative work arrangements. The JPMCI data reveal that most individuals who participate in platforms do so for no more than 3 months of the year. As a result, estimated platform participation rates are much lower when they are examined for a particular month (e.g., 1.6 percent in March of 2018) than when they are measured to include any point in the prior year (e.g., 4.5 percent for the 12 months leading up to March 2018). Another benefit of a quarterly or annual time frame, which some commercial data make possible, is the ability to make comparisons with government administrative datasets, such as unemployment insurance wage records or tax data, which are also collected on a quarterly or annual basis. Commercial data can provide alternative reference periods and more granular data with which to analyze work activity.
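The sensitivity of measured participation to the reference period can be illustrated with a short calculation over hypothetical payout records; the data layout below is an assumption for exposition, not the JPMCI schema.

```python
from collections import defaultdict

def participation_rates(payouts, population, window_months):
    """Monthly vs. trailing-window platform participation rates.
    `payouts` is a hypothetical list of (person_id, "YYYY-MM") payout
    records; `window_months` is an ordered list of months ending with
    the reference month."""
    active = defaultdict(set)  # month -> set of participants
    for person, month in payouts:
        active[month].add(person)
    reference_month = window_months[-1]
    monthly = len(active[reference_month]) / population
    window_participants = set().union(*(active[m] for m in window_months))
    windowed = len(window_participants) / population
    return monthly, windowed
```

Because most participants are active in only a few months, the trailing-window rate can be several times the single-month rate, mirroring the 1.6 percent versus 4.5 percent contrast described above.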
Sources such as the JPMCI Online Platform Economy dataset, which passively captures information from daily administrative operations used to manage customer accounts, offer some advantages as compared to existing survey datasets. First, these datasets are very large. The example from Farrell, Greig, and Hamoudi (2018) leverages a sample size of 2.3 million families who received income from 128 online platforms, allowing the construction of narrow confidence intervals around estimates over time. These data also permit analysis of differences across demographic and geographic subgroups, as well as subcategories of work, such as transportation platforms versus nontransport work. Second, they offer a continuous high-frequency lens over the observed period, such that trends can be ascertained on a weekly or monthly basis rather than only for a specific reference period. This is critical in the case of contingent work, since their research revealed just how sporadic work is in the online platform economy. In addition, the continuous lens paired with the large sample size is particularly valuable in picking up new trends and forms of work in their infancy when the prevalence of the activity may be very low.
Third, because commercial data are based on real transactions or operations, they offer an unfiltered perspective unaffected by low (and falling) survey response rates, by the respondents’ interpretations, by recall bias, or by proxy reporting when answering survey questions. As described
above, respondents’ varying interpretations of a question pose a particularly vexing measurement challenge when the question relates to new and rapidly changing concepts, such as electronically mediated work. Most recent attempts to measure electronically mediated work have yielded an unrealistically high level of participation due to affirmative responses that the BLS subsequently deemed to be false positives. Finally, administrative data often offer a view into a range of other outcomes and attributes. In the case of the banking data, Farrell, Greig, and Hamoudi (2018) are able to observe all income deposited to an account and thus measure how reliant families are on platforms.
For all of the promise of nonsurvey data sources, caution is needed when interpreting them. The potential sources of measurement error, and the approaches to addressing error, are very different in administrative (commercial or government) data than in survey data. For example, commercial data sources will rarely be representative of the full target population. Reweighting can make them more closely approximate the general population, but it cannot help represent subgroups that are missing entirely. In the case of the JPMCI data, for example, the unbanked and those who do not bank with Chase will always be absent. Additionally, while the unit of observation in administrative data is determined by a design choice about how to aggregate the data rather than by how a survey question is asked, there are practical challenges in delineating concepts such as an individual, family, or household within administrative data. In the case of banking data, Farrell, Greig, and Hamoudi (2018) aggregated linked accounts to approximate platform participation for an entire family, but accounts held by family members are not always linked, nor do individuals always funnel all their income through a single financial institution.
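The limits of reweighting noted above can be seen in a minimal post-stratification sketch: each demographic cell in a sample is scaled up or down to match its known population share, but a cell with zero sample records stays empty no matter what. The cell labels, shares, and counts below are invented for illustration, not taken from any real dataset.

```python
# Sketch of post-stratification reweighting. A cell's weight is its
# population share divided by its sample share. Invented numbers.
population_shares = {"banked_urban": 0.50, "banked_rural": 0.30, "unbanked": 0.20}
sample_counts     = {"banked_urban": 800,  "banked_rural": 200,  "unbanked": 0}

n = sum(sample_counts.values())
weights = {}
for cell, share in population_shares.items():
    if sample_counts[cell] > 0:
        weights[cell] = share / (sample_counts[cell] / n)
    else:
        weights[cell] = 0.0  # no records to reweight: the cell stays missing

print(weights)
# Overrepresented cells are down-weighted and underrepresented cells
# up-weighted, but the unbanked remain absent regardless of the weights.
```

This is why, for the JPMCI data, no reweighting scheme can recover the unbanked or customers of other banks: reweighting adjusts the mix of what was observed, not what was never observed.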
Similarly, while users of administrative data do not have to wrestle with ensuring that a question will be (or has been) interpreted correctly by a respondent, another critical design choice is which operations within an account to count as contingent or nonstandard work. The set of platforms included within the JPMCI Online Platform Economy data has expanded from 30 to 128 over time as new platforms have emerged, allowing a window into not only labor platforms but also capital platforms. While these design choices can be applied uniformly across all accounts (families, in this case), there are judgment calls and practical challenges in defining and implementing such inclusion criteria. For instance, researchers need to ask: Which platforms should be included? Can transactions associated with those platforms be identified?
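One way to picture these inclusion criteria is as a classifier applied uniformly to every account's inflows: payer descriptions are matched against a curated platform list, and anything unmatched is set aside for review. The platform names and transaction strings below are fictitious, and real implementations would be far messier (abbreviated or noisy payer strings, payment processors standing between the platform and the account).

```python
# Sketch of uniform inclusion criteria: classify inflows by matching payer
# descriptions against a curated platform list. All names are invented.
LABOR_PLATFORMS   = {"rideshareco", "deliverly"}
CAPITAL_PLATFORMS = {"roomlet"}

def classify(payer_description):
    name = payer_description.lower()
    if any(p in name for p in LABOR_PLATFORMS):
        return "labor"
    if any(p in name for p in CAPITAL_PLATFORMS):
        return "capital"
    return "other"  # judgment call: unmatched payers need manual review

inflows = ["RIDESHARECO PAYOUT 0423", "ROOMLET HOST PMT", "ACME CORP PAYROLL"]
print([classify(t) for t in inflows])  # ['labor', 'capital', 'other']
```

Expanding the platform list, as JPMCI did from 30 to 128 platforms, amounts to changing these rules over time, which is itself a source of measurement change that users of the data must track.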
Finally, while surveys are created for the purposes of measurement and designed to make observations comparable, commercial administrative data are a byproduct of operations that can change in ways that distort the perspective over time. For example, if online platforms change or expand the ways in which they pay participants, measurements of participation based solely on direct deposits into checking accounts may miss an increasingly large share of the activity.
These shortcomings underscore that commercial administrative data are best viewed as a complementary source of information; for the foreseeable future, government surveys will remain an important part of the data collection infrastructure of statistical agencies. That said, there may be some domains where private sector data are intrinsically superior or where government data collection would be impractically expensive. In those cases, private sector data could even serve as a substitute for government surveys.
The contribution of commercial and administrative data lies not just in providing additional estimates with which to triangulate measures of contingent work and AWAs, but also in offering an additional perspective that can reveal how each lens, from surveys to administrative data, filters the information. Reconciling not just the estimates but also the lenses can help inform efforts to improve the design of government surveys. For example, Farrell, Greig, and Hamoudi (2018) note that, all told, the JPMCI and BLS estimates for participation in labor platforms were remarkably similar: both found that 1.0 percent of individuals were earning income from labor platforms through electronically mediated work in May 2017. The closeness of the two estimates lends confidence that they are in the right ballpark, which is striking in light of the very different approaches JPMCI and BLS took to measuring electronically mediated work. These differences in approach reveal opportunities for improvement in measuring contingent and alternative work arrangements.
Declining response rates—even for federal surveys with historically high response rates—and the multitude of data sources now available are leading to broad-based efforts to expand the statistical system beyond the survey-centric approach developed during the 20th century. The Commission on Evidence-Based Policymaking16 advocated for expanded use of administrative data and improved data linkage across federal statistical and regulatory agency sources to help guide decision making. That work created a climate in which movement to this new paradigm may have the opportunity to flourish, including through legislative changes. Of course, policies and procedures are needed to ensure that access to restricted data is limited to qualified researchers and policy makers while protecting the privacy of people's records.17 The capacity to improve measurement of social and economic phenomena, including trends in employment and work arrangements, will be largely driven by how effectively multiple data sources—survey and nonsurvey, national and local, public and private—can be drawn from and combined (NASEM, 2019). Survey redesigns will increasingly presume that these instruments will need to be linked to other data sources.
16 The Commission was a 15-member group of experts charged by the U.S. Congress and the President with examining how government could better use its existing data sources to provide high-quality evidence for policy and government decision-making. The Commission was created in March 2016 by the Evidence-Based Policymaking Commission Act (P.L. 114-140), legislation jointly filed by Speaker of the House Paul Ryan (R-WI) and Senator Patty Murray (D-WA). Available: https://www.congress.gov/bill/114th-congress/house-bill/1831.
The emergence of a multiple-data-source paradigm will no doubt influence the way employment and other labor market statistics are generated going forward. Research is already being conducted in cooperation with the U.S. Census Bureau that successfully links survey and administrative microdata (e.g., Abraham et al., 2020; Abraham et al., forthcoming). One finding, among many, is that CPS-based and administrative data-based employment estimates, such as those for wage and salary employment, multiple job holding, and self-employment, differ, and some of these differences have grown over time.18
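The kind of linked-microdata comparison described here can be sketched in miniature: join survey records to administrative records on a shared (protected) identifier and tabulate where the two sources disagree on an employment indicator. The identifiers, variable names, and records below are invented; this illustrates only the generic join-and-compare logic, not the methods of the cited studies.

```python
# Sketch of comparing a survey indicator with an administrative indicator
# after linking on a shared identifier. All records are invented.
survey = {"p1": {"self_employed": True},  "p2": {"self_employed": False}}
admin  = {"p1": {"schedule_c": False},    "p2": {"schedule_c": False},
          "p3": {"schedule_c": True}}     # p3 has no survey record

linked = {pid: (survey[pid]["self_employed"], admin[pid]["schedule_c"])
          for pid in survey.keys() & admin.keys()}
disagreements = {pid for pid, (s, a) in linked.items() if s != a}

print(disagreements)  # {'p1'}: reports self-employment, but no Schedule C
```

Even this toy version surfaces the two issues real linkage work confronts: incomplete overlap between the sources (p3 is never compared) and substantive disagreement among linked records, the kind of divergence the cited research documents at scale.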
Even though this panel strongly supports ongoing statistical agency work of this kind, at this time we do not recommend linking CWS data to tax (or other administrative) data as a priority for the BLS. One reason for this is that the single-week reference period for most of the CWS questions cannot easily be compared to the annual reference period for tax data. In addition, there would be technical and legal barriers to carrying out such linkages.19 We do, however, endorse the long-term goal of leveraging multiple data sources to better measure and understand the evolving nature of alternative work arrangements in the United States.
17 One model for accomplishing this is provided by the Federal Statistical Research Data Centers (FSRDCs)—a partnership between federal statistical agencies and leading research institutions in which secure facilities provide authorized access to restricted-use microdata for statistical purposes only. FSRDCs (coupled with legislation) have allowed agencies and outside researchers to combine IRS data with existing statistical agency surveys. Available: https://www.census.gov/fsrdc.
19 As described by Spletzer, CPS supplement data have been linked with tax data for various projects, but the personal identifier code needed to link the data is collected only in the March Annual Social and Economic Supplement and so would not be available for all CWS respondents, since the CWS has been fielded in February and May.