In this chapter, we explore options for advancing collection of data on dimensions of social capital. Our starting point is the federal statistical system, particularly the Civic Engagement Supplement of the Current Population Survey (CPS). Later, we consider complementary and substitute data options—public and private, survey and nonsurvey, along with experimental strategies, some of which involve administrative or “big data” sources.
The recommendations in this and the next chapter are intended to improve information about civic engagement, social cohesion, and other elements of social capital for research and policy purposes. They fall into two categories: (1) those directed toward improving data collection in the near term, taking advantage primarily of existing survey vehicles; and (2) those that are more forward looking, anticipating the role of government surveys alongside emerging data sources, including unstructured digital data produced as a by-product of day-to-day business, communication, and social and civic activities. Underlying our guidance is the recognition that the viability of large national surveys is at a crossroads; a real possibility exists that major surveys conducted by the federal statistical system will take a starkly different form in the not-too-distant future.
RECOMMENDATION 1: For data collection in areas of social capital, a multipronged strategy should be pursued in which large population surveys conducted by the federal statistical
system play a role, but one that is increasingly complemented and supplemented by new, innovative, experimental alternatives. The greatest promise lies in specific-purpose surveys such as those focused on health, housing, and employment issues (especially those that have a longitudinal structure) and in the exploitation of nonsurvey sources ranging from administrative data (e.g., local-level incident-based crime reports) to digital communications and networking data that are amenable to community-level analyses. Many of the surveys will continue to be conducted or funded by the federal government, while many of the nonsurvey sources will originate elsewhere.
Some of the data from nongovernment sources are traditional survey-based products (e.g., from Pew, Gallup, and similar organizations), and some originate from private-sector activities that organically generate information as a byproduct of day-to-day processes. The quality of the nation's information and its research capacity will in large part be determined by the effectiveness with which these disparate data sources can be exploited and coordinated to work in a complementary fashion.
Some elements of social capital are best measured through surveys of individuals or households while, for others, it is possible to gather information using nonsurvey methods. Among the data elements for which surveys are required, some can be effectively collected using instruments administered to national samples while others are better approached using specialized, more focused ones. As discussed in Section 5.1, measurement of some behaviors, actions, and attitudes may also be enhanced by linking survey data to nonsurvey sources and through modeling or other methods.
In this section, we discuss the prospects for data collection on civic engagement and volunteering using existing federal surveys. We describe attributes of the federal statistical system that enhance data collection and those which create constraints. The role of the CPS, specifically the September and November Supplements, is considered, as are other federal survey options.
Data collection performed by the federal statistical system has the advantage of methodological transparency and, in turn, credibility with users. The objective of federal statistics is to produce information that is publicly available—with adequate privacy and confidentiality protection—and that meets the quality and accuracy standards required by
decision makers. As articulated in Capps and Wright (2013), “…official statistics in the United States are grounded in the scientific method and constantly subject to scientific review; they are understood, they are authoritative, and they are credible.”
Government surveys and statistics have also historically offered regular, replicated content that provides continuity over time. This long history has yielded a wide range of methodological advances in probability sampling methods that allow population estimates to be generated, assessment of nonsampling errors, dissemination of data and access to data by users, and protection of privacy and confidentiality of respondents. Perhaps most importantly, the distributional properties of the government’s survey samples are known, and decades of research have honed the statistical agencies’ ability to collect reliable data and interpret their meaning. As a result, when key information on covariates has been included in carefully designed surveys, research that can support inferences has been possible.
That the government collects data about civic engagement—specifically, volunteering and voting—also sends a signal that these activities are important to society. And the historically high response rates of government surveys (e.g., 92-94 percent for the CPS in 2003-2005 and 86-88 percent for the volunteer supplements) give them a comparative advantage over nongovernment surveys. This advantage is particularly important for measuring activities such as volunteerism for which participation in the survey correlates with the propensity to volunteer.1 Moreover, if other sources of national data on voting and volunteering (such as the American National Election Studies funded by the National Science Foundation) are discontinued at some time, the CPS Voting and Registration Supplements would become all the more vital.
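The mechanics of the nonresponse bias noted in footnote 1 can be illustrated with a toy simulation. All numbers here are hypothetical (a 25 percent true volunteering rate, response propensities of 0.9 for volunteers and 0.6 for nonvolunteers); they are chosen only to show the direction and rough size of the effect, not to reproduce the Abraham et al. (2009) results.

```python
import random

random.seed(1)

# Hypothetical population: True means the person volunteers.
population = [random.random() < 0.25 for _ in range(100_000)]

# Volunteers are assumed to respond to the survey at a higher rate (0.9)
# than nonvolunteers (0.6), so respondents over-represent volunteers.
respondents = [v for v in population
               if random.random() < (0.9 if v else 0.6)]

true_rate = sum(population) / len(population)        # close to 0.25
survey_rate = sum(respondents) / len(respondents)    # biased upward, near 1/3
```

Under these assumptions the unadjusted survey estimate lands near 0.9 × 0.25 / (0.9 × 0.25 + 0.6 × 0.75) = 1/3, overstating the true rate by about a third; and because the bias operates within subgroups as well, conventional weighting adjustments do not remove it.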
Government data collection also has limitations. As noted in Chapter 3, some questions may be viewed by the public as inappropriate for inclusion in government surveys on grounds of privacy or sensitivity (e.g., political or religious affiliation or sexual orientation). Though the government does ask about sensitive matters such as drug abuse, alcoholism, and people's habits, some questions—such as those about trusting particular political parties or about some personal social behaviors—are generally considered beyond the scope of what government should be asking about. Indeed, the CPS was initially rejected as a home for a civic engagement module on the grounds that, to maintain high response rates, questions judged to be politically, morally, or otherwise sensitive should not be included. This concern could be interpreted to apply to questions about “religious activities and interactions with individuals of specific racial or ethnic groups—key components of social capital within the U.S.” (Hudson and Chapman, 2002, p. 5).
1The high variability in survey estimates of volunteering is due to the “greater propensity of those who do volunteer work to respond to surveys” (Abraham et al., 2009, p. 1129). The authors analyzed data from the American Time Use Survey (ATUS)—based on a sample drawn from the CPS—and the CPS Volunteer Supplement to show that “CPS respondents who become ATUS respondents report much more volunteering in the CPS than those who become ATUS nonrespondents” (ibid). And this bias, replicated within subgroups, cannot be corrected for using conventional adjustment methods. Although nonresponse “leads to estimates of volunteer activity that are too high, it generally does not affect inferences about the characteristics of volunteers” (p. 1129).
While government has traditionally not ventured very far into the realm of asking citizens about attitudes, the movement to measure subjective (self-reported) well-being may be changing this view. This change is clear in some countries—including Brazil, Canada, Chile, and the United Kingdom—where questions about life satisfaction and day-to-day emotions are being fielded in flagship surveys. In the U.S. federal statistical system, the stance has been more wait-and-see. For the purpose of assessing people's social connectedness, group cohesion, attitudes toward others in the community, and the like, establishing convincing links to outcomes in specific policy realms (health, crime, resilience to disaster) would support the case for survey coverage in these areas. That is, if it is established that communities are better off when characteristics x, y, and z are present and worse off when they are absent, there would be a strong argument for collecting the relevant data. In some cases, other organizations such as Pew and Gallup have a comparative advantage in doing this kind of attitudinal work. Gauging the public's consumer confidence (as is done by surveys from the University of Michigan's Survey Research Center and from the Conference Board, a nonprofit research group) is an example where the nongovernment sector has shown a comparative advantage in data collection.
It is not possible or desirable to make the CPS the source for all data related to social capital needed for policy, research, and general information purposes. The primary purpose of the core, monthly CPS is as an employment survey, and adding a major new component could increase respondent burden and jeopardize its high response rates.
The purpose of the CPS Civic Engagement Supplement—which has now been fielded in 2008, 2009, 2010, 2011 and, with a half sample, in 2013—was stated in the justification document to the U.S. Office of Management and Budget (OMB) (2011, p. 3):
…collect data for the Civic Health Assessment, an annual report mandated by the Serve America Act that is produced in partnership with
the National Conference on Citizenship (NCoC). The Civic Engagement Supplement provides information on the extent to which American communities are places where individuals are civically active. It also provides information on the number of Americans who are active in their communities, communicating with one another on issues of public concern, and interacting with public institutions and private enterprises.
At national and state levels, the CPS Civic Engagement Supplement fulfills several elements of this mandate for descriptive information.2 As we argue above, some elements of social capital data collection are well served by broad population surveys fielded by the federal statistical system, while others are not—not because they are unimportant, but because they either require a different measurement approach or can be collected using less costly vehicles.3
CONCLUSION 5: Current Population Survey (CPS) supplements, which offer only a limited amount of survey space (about 10 minutes is allotted for a given monthly supplement), are most appropriate for collecting data on variables that (1) can be estimated from a small set of questions, (2) deal with people’s behaviors, (3) would be difficult to ascertain through nonsurvey methods, and (4) need to be correlated with personal attributes that are also captured on the survey in order to study how they interrelate for groups such as the elderly, minorities, or immigrants. Also critical is that the CPS data are useful when the research and policy questions of interest require information aggregated at the federal-, state-, or (in some cases) metropolitan-area level.
By these criteria, the Civic Engagement and Volunteer Supplements to the CPS are well suited for generating statistics on a subset of narrowly defined dimensions of civic engagement (see the top two rows in Table 2-1 in Chapter 2).4 The series produced from these data have, historically, proven to be useful, particularly for describing national-level trends. Volunteering is a particularly important form of engagement because, unlike “memberships,” it requires a time commitment. Working for a campaign, for example, is a stronger indication of civic engagement than simply belonging to a political party or even voting.
2The data have been used to describe characteristics at more local levels (though not for generating statistically valid estimates) and, in combination with other data sources, to motivate community action. See, for example, the Greater New Haven Community Index Project, compiled by the nonprofit organization DataHaven; available: http://www.ctdatahaven.org/communityindex [February 2014].
3An example is Hersch (2013), who used voter lists and digital obituary data in place of traditional survey approaches to reveal patterns of political behavior among the families and neighbors of 9/11 victims.
4As described in Chapter 1, the Civic Engagement Supplement has been fielded most years since 2008. The Volunteer Supplement has been fielded each September since 2002. See Appendix B for a complete schedule of CPS supplements.
CONCLUSION 6: Information about the population’s political participation and voting activities can be adequately captured with a small number of questions. Likewise, the Current Population Survey (CPS) has proven useful for understanding volunteering rates and patterns—especially when linked with data from the survey’s time use (American Time Use Survey) module. Thus, the CPS Volunteer (September) and Civic Engagement (November) Supplements are best focused on political and civic participation.
These supplements are less well suited for generating data on dimensions of social cohesion, connectedness, trust, and characteristics of the broader social environment (e.g., the bottom three rows in Table 2-1). Relative to voting and volunteering behavior, these attitudes and interactions are quite complex.5 Measuring social cohesion and related constructs requires a larger number of questions and perhaps the use of nonsurvey methods that are beyond the scope and acceptable burden levels of the CPS.
CONCLUSION 7: Although even a short module can generate useful information, the Current Population Survey does not offer a comparative advantage for data collection on complex behaviors and attitudes indicative of social cohesion, individual and group connectedness, and civic health generally. These phenomena cannot be satisfactorily characterized by data collected from a small set of questions.
Even for a comparatively well-defined element of social capital, such as individuals' connectedness, it can be misleading to rely on one or very few proxy measures. For example, if a survey asks about family ties, it may miss a trend whereby friendship networks are increasingly substituting for those centered around family. And asking only about in-person contacts will miss increasing use of remote personal communication and social networking options that may substitute for or complement conventional interpersonal interactions (a person may be almost as happy to hear from a distant grandchild or friend by email, Skype, or Facebook as in person). An exclusive focus on family or on in-person relationships may thus miss counterbalancing trends. Ultimately, the number of measures needed is an empirical question to be tested; there are examples where researchers have successfully reduced lengthy scales to even a single item that is valid and reliable, a process that has typically involved robust psychometric assessment of the underlying concepts early on.
5Forrest and Kearns (2001) summarized this complexity well, stating that studies of social cohesion may emphasize “the need for a shared sense of morality and common purpose; aspects of social control and social order; the threat to social solidarity of income and wealth inequalities between people, groups and places; the level of social interaction within communities or families; and a sense of belonging to place” (p. 2129).
In the current budgetary environment, cost reduction has become an increasingly prominent objective. Strategies relevant for the CPS include (1) combining the Civic Engagement and Volunteer Supplements, with a reduced number of questions on each topic, in order to field both each year; (2) moving to a rotating schedule in which the full content of each is fielded, but only in alternating years; or (3) cutting sample sizes in order to field both supplements with the full complement of questions annually. Indeed, this was essentially the set of alternatives faced by the Corporation for National and Community Service (CNCS) during planning for the 2013 supplements. CNCS ultimately chose to implement option (3)—using smaller samples—so that both the volunteer and civic engagement question sets could be fielded; this option also did not require a questionnaire redesign or cuts to content, processes that would have involved a redesign study (such as that described below).
The major cost of selecting the reduced sample option, fully acknowledged by CNCS, is that it increases the standard errors of estimates,6 thereby compromising the ability of data users to conduct subgroup analyses or to produce statistically valid findings at the metropolitan statistical area level. While this was a practical short-term decision, it would not be the best approach long term.
RECOMMENDATION 2: Due to the importance of substate and subgroup analyses, under a cost-reduction scenario the panel favors a combined civic engagement and volunteer supplement to the Current Population Survey (CPS) even though it would require reducing the number of questions in each category.
6See Appendix C for standard error estimates for state-level samples for the September 2011 CPS Volunteer Supplement for full- and half-sample scenarios.
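The sampling-error tradeoff behind this recommendation can be sketched numerically. The sample sizes and the 26 percent volunteering rate below are hypothetical, chosen only to show the mechanics; the actual full- and half-sample standard errors are reported in Appendix C.

```python
import math

def se_proportion(p, n):
    """Standard error of a proportion estimated from a simple random sample."""
    return math.sqrt(p * (1 - p) / n)

p = 0.26                         # illustrative state-level volunteering rate
full = se_proportion(p, 2000)    # hypothetical full state sample
half = se_proportion(p, 1000)    # half sample, as in the 2013 supplements

# Halving the sample inflates the standard error by sqrt(2), i.e., roughly
# 41 percent wider confidence intervals for every state and subgroup estimate.
inflation = half / full
```

Because the inflation factor is a property of the 1/sqrt(n) rule rather than of the particular rate estimated, the same 41 percent widening applies to every subgroup and substate estimate, which is why the reduced-sample option degrades precisely the analyses (elderly, minority, immigrant, and metropolitan-area breakdowns) that motivate the supplements.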
Question streamlining would be accomplished by (1) narrowing the subject matter now covered in the Civic Engagement Supplement based on assessment of what information can and cannot be collected effectively in a short survey module; (2) identifying and eliminating redundancies across the CPS Civic Engagement and Volunteer Supplements; and (3) identifying and eliminating questions for which comparable data can be found in other government surveys or elsewhere, while recognizing there is analytic value in having both volunteering and civic engagement data, along with covariate information, for the same respondents.
Moreover, it is not necessary to have identical content each year since some behaviors change slowly over time. Therefore, CNCS and the Census Bureau should experiment with the periodicity of various questions. For items where change and granularity are needed, sample size and frequency tradeoffs can be exploited such that a core set of questions is asked each year; other questions could be asked less frequently. We cover the first two parts of this streamlining plan in the rest of this section; the third part is covered in Section 4.4.
Setting Appropriate Scope
If one accepts the position articulated above—that, while it is possible to measure some dimensions of social capital in a short survey module, others are too complex to address meaningfully—a logical first step in streamlining to a combined supplement is to limit it to volunteering and civic (particularly, political) engagement topics. In order to stay within CPS time and length requirements, priority questions must be identified.7 Precise question wording, ordering, and other aspects of survey design require development testing.8 The details of such testing are beyond the panel's charge, but we offer for consideration some general ideas (illustrated with a few examples) for recasting the supplements to take advantage of their strengths and address their limitations.
Intragroup (bonding) and intergroup (bridging) cohesion, for example, are phenomena with the potential to affect the dynamics of political and social movements and should be measured and studied. But questions falling into these categories (listed in the lower rows of Table 2-1) cannot be adequately covered in a 10-minute supplement that is also covering volunteerism and civic engagement activities. One content area of the Civic Engagement Supplement to consider scaling back concerns interactions with friends, family, and neighbors (see questions S12-S16 in Appendix E). Questions about activities such as “How often did you eat dinner with other members of your household?” (S12) and “How often did you see or hear from friends or family…?” (S13) are examples for which data may be collected more comprehensively, and in a better-connected way, elsewhere. The connectedness topic is important, but these questions need empirical backing, and more research is needed to understand what they are measuring. For example, the phrasing “How often did you hear from or see…” does not identify the intensity of contact—does a “hi” on the street equal a long visit?
7Although the volunteering supplement contains 19 questions, plus some follow-up questions, many respondents do not answer all of them. Those who reply “no” to the first two questions, which establish whether or not the person volunteered, take a very short survey.
8In 2011, CNCS contracted with Abt Associates for such testing.
For questions on social connectedness, Pew and Gallup have developed survey models that are conducive to measuring weak and strong ties as well as diversity and cohesion.9 Alternatives to the current types of attitudinal questions on connectedness and polarization might be phrased along the following lines:
- “In your personal life—for example, in choosing friendships—how important is each of the following: religion, race, ethnicity, language, politics?” This formulation of a social networking stem question is similar to those used in some general social surveys.
- “Do you want your child to marry x, live next door to x, be friends with x?” where x is a person of a different race, political view, religion, etc. Similarly, “Do you have strong preferences in the x, y, and z of people you associate with?”
Data from these kinds of questions provide insights into general attitudes about tolerance and diversity.
Internet use (Civic Engagement Supplement, question S3) may also be peripheral to the core volunteer and engagement constructs coverable by the CPS, and it is likely that better nonsurvey sources of this information exist. The assessment by Abt Associates (2011) found that respondents had trouble interpreting the questions about Internet activities; similarly, respondents were uncertain about what kinds of organizations “counted” in questions about participation (S5) and about what level of participation qualified as a “yes” to the question. Question S2 asks people whether they have expressed views to public officials—without specifying whether in person or otherwise (e.g., by phone or email)—and question S3 specifically asks how often a respondent has expressed political or community views using the Internet. In order to characterize the interaction, it is important to differentiate mode more precisely: a person who spends all day at home posting opinions on social media may not have the same level of engagement as a person writing op-eds that are published, but the two kinds of activities may appear similar under the current questions.
9For the Pew questions on social isolation and new technology, see http://www.pewinternet.org/~/media//Files/Reports/2009/PIP_Tech_and_Social_Isolation.pdf [February 2014].
Survey questions also require periodic updating to account for changes taking place in a society’s norms, habits, and activities. For example, membership in civic or service organizations such as the American Legion, Rotary, and Lions Club (asked about in question S5) is no longer as commonplace as it once was in the United States.10 One way to accommodate such changes is to use more generic categories; for example, for most purposes, it will not matter whether a respondent is a member of the Lions Club, the Rotary Club, or some other club, so new response options may be needed. This would also apply to questions in other areas, such as those in Box 4-1. For example, the question about communication technology use (Facebook, Twitter, Instagram, or Snapchat) might have the most meaningful value as part of long-term longitudinal data collection efforts if the question ended generically at “…network site.”
Another option is to structure questions such as S5 in an open-ended “yes/no” fashion parallel to question S1 of the Volunteer Supplement (“Since September 1st of last year, [have you/has NAME] done any volunteer activities through or for an organization?”). A “no” response ends the line of questioning. A “yes” prompts (unconstrained) identification of the organization. Among the advantages of this more open-ended structure is that it: (1) captures the changing nature of organizations, modes of engagement and communication, etc., and eliminates preconceived notions about what kinds of organizations, volunteer activities, and personal interactions should “count”; (2) streamlines the survey, since “no” answers allow respondents to move quickly to the next item, thus reducing burden; and (3) allows analysts to interpret results in greater detail (e.g., motivations for volunteering at a church, at a homeless shelter, or for a political candidate may be quite different). Modern computing offers tools to take advantage of such data, which are richer and which reflect the direction in which surveys appear to be headed.
10Meeting attendance for such clubs had, by Putnam's (2000) estimates, declined by 50-60 percent by the end of the 20th century. Of course, changing norms affect many aspects of society and the economy in ways that leave statistics out of date. For example, standard industrial classification systems required updating as manufacturing sectors became relatively less dominant while service and high-tech industries grew to account for a larger share of economic activity.
Sample Open-Ended Engagement Questions
Please tell me if you have done any of the following in the last year (Yes/No) (whether face to face, over the Internet, or in writing):
- Donated money or goods to a charitable or political cause.
- Attended a meeting to discuss a public issue.
- Contacted the media or a public official to express your opinion about a public issue.
- Talked to family, friends, or coworkers about a public issue.
Are you currently registered to vote?
(if YES to above): To your knowledge, were there any local elections in the last year for which you were eligible to vote?
(if YES to above): Did you vote?
Do you currently have, or have access to, the Internet in your home?
(if YES): Do you currently participate in a social network site (such as Facebook, Twitter, Instagram, or Snapchat)?
In the past year have you volunteered your time for any social, political, or charitable cause?
(IF YES: some follow-ups on type of activity, amount of time)
Are you currently affiliated with or a member of any volunteer organizations or associations (give some examples of types)?
(IF YES: ask a few follow-ups on types, number)
Eliminating Overlap Among the Supplements
Minimizing overlap within the CPS supplements is another source of streamlining and is no doubt something that will be studied during the design of a combined instrument. As just one illustrative example, the question on participation in groups or organizations in the Civic Engagement Supplement (S5, S6) could be merged with the questions on the
Volunteer Supplement (S3, S4) about “organizations volunteered for.”11 Both versions of these participation questions are likely not needed in a single supplement.
The panel did not examine the idea of integrating the CPS Voting and Registration Supplement into the combined instrument. The main reason is that the Civic Engagement (November) and Volunteer (September) Supplements have been fielded every year, except for 2012, when the Civic Engagement Supplement was skipped for budget reasons. The Voting and Registration Supplement is fielded only in even-numbered (election) years. Considering the amount of space available over a 2-year cycle, and assuming it is important to have at least a core of civic engagement and volunteer questions fielded every year, it may not be efficient to try to combine all three supplements. However, it may be worthwhile to consider moving specific questions from one supplement to another, with the needed frequency (every year versus every other year) being a key criterion. A combined Civic Engagement/Volunteer Supplement could be fielded in either the September or the November slot. The current supplement schedule (see Appendix B) suggests that there is less competition for the September slot, since it is already occupied by the Volunteer Supplement. The Voting and Registration Supplement obviously needs to remain in November.
Furthermore, the content of the Voting and Registration Supplement—which asks about participation and registration in national elections and about reasons for not voting—might be changed in one respect. Data from this supplement allow states to ascertain demographic and voter registration information; and, historically, CPS data have proven very useful for quantifying and understanding voting and registration behavior by population age, education, sex, race, and Hispanic origin, and for analyzing the effects of such policies as absentee voting and same-day registration on voter turnout. Voting data are important in the calculation of statistics used to assess the strength of democracy (see Dalton, 2008). The local election question (S1) in the Civic Engagement Supplement could be transferred to the voting supplement and dropped from the Civic Engagement Supplement.
The nonprofit sector relies heavily on surveys of volunteer activities. Data from the survey performed by Independent Sector—published in Giving and Volunteering in the U.S.—are comprehensive and frequently cited; however, this survey is conducted irregularly. Although there are difficult questions of compatible definitions, standards, editing, and other elements, it may be possible to coordinate these efforts. Also, it is important to note that the time spent in various volunteer activities can be estimated using the American Time Use Survey, which could be used in place of question S6 of the Volunteer Supplement if the two sources can be shown to produce comparable estimates (or if the supplement version is shown to be less accurate). Space could thereby be freed up and refocused on other important research questions, such as why respondents choose to volunteer and what types of volunteering are being done.
11Question S5 of the 2011 Civic Engagement Supplement reads: “Next, I will give you a list of types of groups or organizations in which people sometimes participate. (Have you/Has NAME) participated in any of these groups during the last 12 months, that is, since November 2010.” Five preset categories follow (see Appendix E for the exact question wording).
Tradeoffs in Sample Size and Question Frequency
At the end of Chapter 3, we identified basic survey characteristics that at least indirectly guide what kinds of information can be effectively collected and used for measurement purposes. Here, we apply some of these considerations as they relate to the CPS Civic Engagement and Volunteer Supplements.
The CPS maintains a sample size of about 60,000 households per month, which is, by design, sufficient to generate national and state employment and unemployment statistics. Additionally, substate data are published for 54 large metropolitan areas, 22 metropolitan divisions, and 41 cities, although this requires pooling data over the course of the year to create annual averages. The Civic Engagement and Volunteer Supplements are both conducted annually, budget permitting (the Civic Engagement Supplement was not fielded in 2012). This schedule allows for year-to-year tracking of responses, though the monthly sample size constrains research to national- and, in some cases, state-level analyses. Unlike, say, the American Community Survey (ACS), CPS samples are not large enough for county-level, much less neighborhood-level, research. Thus, activities, actions, and attitudes that are inherently interesting and important to track only at those levels are not strong candidates for the CPS.
Additionally, the frequency of data collection—whether annual or at longer intervals—has an impact on the precision of estimates. One approach for estimating smaller areas is to accumulate data over time, creating moving averages or “period prevalence estimates,” as is routinely done with the ACS (e.g., many statistics are derived using 5-year moving totals that allow users to drill down to construct local area estimates). Less frequent data collection reduces an analyst’s ability to estimate change measures and to pool data across time to increase precision for smaller geographic areas and shorter time periods.12
Reducing the frequency of questions to a supplement fielded only every other year—as would be the case if the Volunteer and Civic
12CNCS has pooled data across years of the Civic Engagement Supplement to publish more precise estimates for smaller geographic areas.
Engagement Supplements were rotated each November (or September)—reduces the effective sample size on an average annual basis by half. To maintain confidence interval widths, data pooling would have to encompass a time period twice as long. And it obviously would not be possible to estimate year-to-year changes, even nationally or at the state level.
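The sample-size arithmetic behind this tradeoff can be sketched as follows. This is a minimal illustration for a proportion estimated from a simple random sample; the volunteering rate and sample sizes are hypothetical, and the calculation ignores the CPS's complex design and design effects:

```python
import math

def ci_halfwidth(p, n, z=1.96):
    """Approximate 95% confidence-interval half-width for a proportion
    p estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.25           # hypothetical volunteering rate
n_annual = 60_000  # illustrative annual supplement sample

w_annual = ci_halfwidth(p, n_annual)
w_biennial = ci_halfwidth(p, n_annual / 2)  # question fielded every other year

# Halving the effective annual sample widens the interval by sqrt(2) ...
print(round(w_biennial / w_annual, 3))  # ~1.414

# ... so pooled data must span twice as long a period to restore the width.
w_pooled = ci_halfwidth(p, 2 * (n_annual / 2))
print(round(w_pooled / w_annual, 3))    # ~1.0
```

Because the half-width shrinks only with the square root of the sample size, each halving of effective annual sample size must be offset by doubling the pooling period to hold confidence-interval widths constant.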
The characteristics of interest, and the way they change temporally or vary spatially, create opportunities for sample design tradeoffs and experimentation with the periodicity with which questions appear on a module. Particularly with a combined November supplement, as described above, it may not be optimal to have identical content each year, and it is important to assess which information would suffer least from less frequent collection. For phenomena that do not change rapidly, less frequent sampling is not a bad tradeoff to exploit. If, for example, patterns of volunteering abroad (question S15) do not change quickly, that question could be a candidate to be fielded every other year, which would open up survey space for other questions. If there is not much demand for research on short-interval trends in participation, voting, and other phenomena, there is less need for annual data collection. On the other hand, if one wanted to track erosion in a population’s confidence in a rapidly changing political climate for the purpose of anticipating social unrest, an infrequent survey is an ineffective option (indeed, even a more frequent survey might not be the best way to tap into such feelings). If measuring trends is a priority (as is the case, for example, for survey data on which monthly unemployment rates are based), adequate sample size becomes important for establishing statistical significance.
The 2-year cycle framework suggests a core set of questions to be asked each year and another set that might be asked every 2 years, or even less frequently, thereby clearing space for additional rotating questions. Core questions would be reserved for items where annual change estimates, geographic granularity, or both are needed; where neither is needed, questions become candidates for less frequent inclusion on a rotating basis.
Another issue that supports our recommendation for a combined supplement—as opposed to separate, biannual Volunteer and Civic Engagement Supplements—involves the way the overall CPS survey sample is rotated. Currently, analysts can take advantage of the fact that the sample overlaps from year to year because respondents are in the sample for 4 months, out for 8 months, then back in for 4 additional months. This sample rotation format means that half the sample respondents in any given month were also present in the sample 1 year earlier; therefore, more precise estimates of annual change can be obtained, a feature that would be lost if only 2-year change estimates were possible.
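The precision gain from this year-to-year overlap can be sketched under simple assumptions: equal variances in both years, and a hypothetical positive correlation `rho` for respondents interviewed in both years (all numbers below are illustrative, not CPS estimates):

```python
def var_change(sigma2, n, overlap_frac, rho):
    """Variance of the difference between two annual sample means when a
    fraction `overlap_frac` of the n respondents per year is shared across
    years, with correlation `rho` for the shared respondents:
    Var(ybar2 - ybar1) = 2*sigma2/n - 2*Cov(ybar1, ybar2)."""
    var_mean = sigma2 / n
    cov = overlap_frac * rho * sigma2 / n  # covariance of the two annual means
    return 2 * var_mean - 2 * cov

sigma2, n, rho = 1.0, 60_000, 0.6  # hypothetical values

v_overlap = var_change(sigma2, n, 0.5, rho)  # CPS-style 50% overlap
v_indep = var_change(sigma2, n, 0.0, rho)    # fully independent samples

# With 50% overlap, the variance of the annual change estimate shrinks by
# a factor of (1 - overlap_frac * rho); here 1 - 0.5 * 0.6 = 0.7.
print(v_overlap / v_indep)
```

The stronger the year-to-year correlation in the measured characteristic, the larger the precision advantage of overlapping samples for change estimation, which is precisely the feature lost if only 2-year change estimates were possible.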
The above discussion leads to the following conclusions:
- The content of an annual combined Volunteer and Civic Engagement Supplement need not be identical each year. While some questions should be asked annually, others could appear less frequently. Such a strategy should be considered for items where research suggests that the measurement objective pertains to phenomena that do not change rapidly. Of course, this strategy cannot be exploited without negative consequence if a level of geographic granularity is desired that requires pooling data across years. And there is also the chance of a big event occurring (e.g., 9/11, Katrina) that creates a need and value for greater temporal information.
- A rotating question schedule would allow for collection of data on a greater range of variables—for example, many researchers of immigration and social mobility have called for a question on parents’ occupation, earnings, or country of birth (which might fit well into the CPS, though the ACS is really the goal, given its capability to produce more geographically granular estimates).
- Respondent burden can also be reduced by rotating questions or using split-sample questionnaires. The latter involves asking different sets of questions of random subsamples of respondents. The downsides of this approach are increased costs (for the same total sample size) and reduced item-level precision due to smaller sample sizes for any given question.
- Given the nature of subject matter falling under the social capital rubric, the frequency and geographic specificity of data from the CPS are inadequate for measuring many of its dimensions. Since the panel recommends refocusing the CPS Civic Engagement Supplement primarily on volunteering and voting, question rotation—while still potentially useful—becomes less crucial because the scope of the survey content will have been narrowed.
Developing a comprehensive data collection strategy in the areas of social capital requires consideration of other survey vehicles with potentially greater relevance and direct applicability to research on specific domains; the CPS supplements should not be evaluated in isolation. In weighing what to prioritize for the CPS, it is also necessary to identify
overlapping content of the Civic Engagement and Volunteer Supplements with other federal government surveys.
While, as indicated in Table 4-1 (and the accompanying Appendix D, which gives greater detail of questionnaire content), there are few ongoing surveys specializing in social capital, there are many that ask questions touching on relevant topics.13 Several of these surveys provide the covariate context required for deeper analysis of the relationship between social capital variables and outcomes in specific domains. The primary focus of the CPS is the labor force, and it asks about union membership and contacts (the supplements then delve more deeply into voting, volunteering, time use, and nonmarket activities). The ATUS also captures volunteering and is important for studying labor expended in the production of nonmarket goods and services; time-use measurement makes sense within the CPS because of its relationship with market labor hours. The American Housing Survey, under the auspices of the U.S. Department of Housing and Urban Development (HUD), asks about trust and neighborhoods in the context of housing. The Health and Retirement Study asks about support contacts in the context of health. The Panel Study of Income Dynamics asks about organizational memberships and contacts in the context of caregiving and well-being. The National Longitudinal Survey of Youth asks about volunteerism, religious affiliation, and political attitude in the context of education and work. The Health and Retirement Study and the English Longitudinal Study of Ageing each include a series of questions about connectedness with one’s children or grandchildren, which are useful for supporting research examining the effects of interpersonal relationships on the health, longevity, and happiness of older people. When justifying the addition of questions to surveys, it is highly persuasive when a specific purpose such as those noted above can be identified.
It was beyond this panel’s charge to review the entire battery of government surveys for which development and placement of social capital questions may be appropriate or useful. However, we can generalize to say that specific research and policy questions (and the covariate information demanded by these questions) dictate the content of many of these surveys. While the panel recognizes that surveys often have different design standards, and transparency is not uniform across them,
13Some of these surveys—the National Crime Victimization Survey, the American Time Use Survey, and the Neighborhood Social Capital Module of the American Housing Survey—are fully sponsored and administered by the federal statistical system. Others—such as the Health and Retirement Study, the national longitudinal surveys, the General Social Survey, the American National Election Survey, the Social Capital Community Benchmark Survey, and Giving and Volunteering in the United States—are supported in part by the federal government and administered by nongovernment institutions.
the working group proposed in Recommendation 3 below would review and investigate the ability of existing data collection instruments to serve multiple purposes and to be streamlined.
One of the most compelling and timely examples of a promising survey vehicle for data on social capital is an addition to HUD’s 2013 American Housing Survey (AHS) (conducted by the Census Bureau): the neighborhood social capital module was designed to help researchers study neighborhood effects. The module was created as a “rotating topical module that collects data on shared expectations for social control, social cohesion, and trust within neighborhoods, and neighborhood organizational involvement.”14
The content of this new AHS module included 21 questions—about trust, values of neighbors, how well people get along, etc.—each drawn from existing neighborhood-level surveys that have been field tested and revised over the past 18 years. The design of this module drew heavily from the research by Sampson (e.g., 2006, 2012, 2013) described above and was intended to measure the “extent of social cohesion among residents and their willingness to intervene on behalf of the common good [collective efficacy]” (Sampson, 2013). The module does so by asking questions about respondents’ attitudes—such as how likely they would be to intervene if a fight were to break out among neighbors—and about levels of trust and willingness to help out in the community. The presence of such a survey module (if it were to become permanent), and the coverage it creates, should allow the CPS Civic Engagement Supplement to focus more narrowly on traditional political and civic participation questions.
The AHS module seems like an ideal fit for studying neighborhood effects—and the survey is large enough to allow for analysis of these small areas.15 Documentation in the data collection request for the AHS (OMB supporting statement 2528-0017) reveals that:16
While the content is nearly identical to previous surveys, the previous surveys have only been administered in a small number of metropolitan areas, including Chicago. Therefore, the AHS will provide a much larger and geographically diverse sample, thereby permitting detailed neighborhood social capital assessments in 25 metropolitan areas.…HUD PD&R consulted with Robert Sampson (Harvard University) and Cathy Haggerty and Michele Zimowski (NORC at the University of Chicago) to identify a group of questions that it expects will provide the best
15In statistical terms, a small area was defined as “a domain of interest for which the sample size is insufficient to make direct sample-based estimates of adequate precision” (National Research Council, 2013a).
16See footnote 14.
TABLE 4-1 Social Capital, Civic Engagement, and Social Cohesion Content of Major U.S. Surveys
|Survey||Sponsor||Subject Matter||Frequency|
|Current Population Survey||Census & BLS||Labor force statistics||Monthly|
|Civic Engagement Supplement||Census & BLS||Civic engagement||Resource and policy driven|
|Volunteer Supplement||Census & BLS||Volunteering||Annually|
|Voting and Registration Supplement||Census & BLS||Voting and registration||Biennial|
|ASEC Supplement||Census & BLS||Income, poverty, geographic mobility/migration, and work experience||Annually|
|NCVS||BJS||Characteristics of criminal victimization||Biannual|
|NHES Civic Involvement||NCES||Adult and youth civic involvement||Resource and policy driven|
|ATUS (CPS)||BLS & Census||Time use, employment||Annual|
|AHS - Neighborhood Observation/Social Capital||HUD||Housing||Biannual|
|HRS||NIA & SSA (U. of Michigan)||Health and aging||Biennial|
|Most Recent Year||Population Sampled and Sampling Mode||Capacity for Small-Area Estimates|
|2013 (ongoing)||Probability selected sample of about 60,000 occupied households/CATI & CAPI||State and 12 select MSAs|
|2011 (full sample); 2013 (half sample)||“||State and 12 select MSAs|
|2011 (full sample); 2013 (half sample)||“||“|
|2013 (ongoing)||Nationally representative sample of about 90,000 households/CAPI & CATI||National|
|1999||Nationally representative random-digit-dialing sample|
|2012 (ongoing)||Nationally representative sample of about 25,000 people/CATI||National|
|2013 (ongoing)||190,000 housing units, address based, longitudinal; computer-assisted personal interview||National and 29 large metropolitan areas|
|2012 (ongoing)||Varies by wave, but generally over age 50||National|
|Survey||Sponsor||Subject Matter||Frequency|
|NLSY97||BLS||Educational and labor market experiences, relationships with parents, contact with absent parents, marital and fertility histories, dating, sexual activity, onset of puberty, training, participation in government assistance programs, expectations, time use, criminal behavior, and alcohol and drug use||Annually 1997-2013, now biennial|
|NLSY79||BLS||Labor force behavior, educational attainment, training investments, income and assets, health conditions, workplace injuries, insurance coverage, alcohol and substance abuse, sexual activity, and marital and fertility histories||Annually 1979-2010, now biennial|
|NLSY79 Child & Young Adult||BLS||Schooling, training, work experiences and expectations, health, dating, fertility and marital histories, and household composition||Began in 1986 for ages 0-14. Since 1994 ages 15+. Biennial|
|NHIS||CDC/NCHS & Census||Health of adults and children||Annual|
|Sample Adult Core||CDC/NCHS & Census||Health conditions, limitations, behaviors, access and utilization of insurance||Annual|
|Most Recent Year||Population Sampled and Sampling Mode||Capacity for Small-Area Estimates|
|2013 (ongoing)||8,984 respondents born between 1980 and 1984||National|
|2012 (ongoing)||American youth born 1957-1964; 9,964 respondents remain in the eligible samples||National|
|2010||Children of NLSY79 females||National|
|2013 (ongoing)||Varies, around 40,000 households and 100,000 individuals||National|
|2013 (ongoing)||Varies, around 40,000 households and 100,000 individuals||National|
|Survey||Sponsor||Subject Matter||Frequency|
|PSID||U. of Michigan with funding from multiple government agencies, foundations, and other organizations||Employment, income, wealth, expenditures, health, marriage, childbearing, child development, philanthropy, education of families over multiple generations||Biennial|
|Disability and Use of Time Supplement of the PSID||U. of Michigan with funding from multiple government agencies, foundations, and other organizations||Detailed well-being, caregiving, time diary (24 hrs.) from previous day||Every 4 years|
|Transition into Adulthood Supplement of the PSID||U. of Michigan with funding from multiple government agencies, foundations, and other organizations||Health and emotional well-being, time use, community involvement, self-identity and perception, expectations for the future, family, peer, and romantic relationships, work, schooling||Biennial|
|GSS 2012||NSF (conducted by NORC)||Societal trends in behavior, attitudes, and opinions||Biennial|
|ANES||NSF (conducted by Stanford and U. of Michigan)||Voting, public opinion, and political participation||Biennial|
|Most Recent Year||Population Sampled and Sampling Mode||Capacity for Small-Area Estimates|
|2013||Began with nationally representative sample of over 18,000 individuals living in 5,000 families (sample additions/drops depend on demographics and funding throughout 45-year history)||National|
|2013||394 married couples over age 50 from PSID main||National|
|2011||Over 1,500 aged 18 years and older; no longer attending high school; participated in the CDS baseline interview (1997, 2002/2003, or 2007); and participated in main PSID 2009 interview||National|
|2012 (ongoing)||National probability sample; two waves with sample target of 1,500 adults for each wave. Face-to-face CAPI (online option added in 2012), some CATI||Census region|
|2012||Cross-section, equal probability, sample. 5,916 face-to-face, CAPI, and Internet||National|
|Survey||Sponsor||Subject Matter||Frequency|
|SCBS 2000||41 local community groups||Social capital and civic engagement||One time|
|SCCS 2006||Consortium of charitable foundations and local community groups||Social capital and civic engagement||One time|
|Giving & Volunteering in U.S.||Consortium of charitable foundations and Independent Sector (conducted by Westat)||Volunteering and giving patterns and the motivations that correlate with such behavior||Biennial, 1988-2001|
NOTES: ANES, American National Election Studies; ASEC, Annual Social and Economic Supplement; BLS, Bureau of Labor Statistics; CAPI, computer-assisted personal interviewing; CATI, computer-assisted telephone interviewing; CDC, Centers for Disease Control and Prevention; GSS, General Social Survey; NCHS, National Center for Health Statistics; NHIS, National Health Interview Survey; HUD, Department of Housing and Urban Development;
insights into the scalability of results from neighborhood-level surveys of social capital to larger areas.…Ten of these questions were cleared by OMB as part of the Choice Neighborhoods Demonstration—baseline research project.
Further work will be needed on the new module to determine the precision and statistical properties of its small-area estimates. The survey should approach a sample size of 179,000 (though this includes both a national sample and a metropolitan-area sample), which is considerably larger than the CPS—and it is longitudinal. Since a complete sample and questionnaire redesign is scheduled for the AHS in 2015, this is a crucial time for studying options for the permanent core questions and topical supplements.
|Most Recent Year||Population Sampled and Sampling Mode||Capacity for Small-Area Estimates|
|2000 (inactive)||National sample of 3,000 respondents and community respondents in 42 communities nationwide (across 29 states) covering an additional 26,700 respondents||National and 41 communities|
|2006 (inactive)||National adult sample of 2,741 respondents and 22 communities sample (11 of which were from the 2000 SCBS) totaling 9,359 community respondents||National and 22 communities|
|2001 (inactive)||Nationally representative sample of 4,216 adults aged 21 and older, random digit dialing||National|
NCVS, National Crime Victimization Survey; NIA, National Institute of Aging; NLSY79, National Longitudinal Surveys 1979 wave; NSF, National Science Foundation; PSID, Panel Study of Income Dynamics; SCBS, Social Capital Benchmark Survey; SCCS, Social Capital Community Survey; SSA, Social Security Administration.
RECOMMENDATION 3: The Corporation for National and Community Service should establish a technical (research and evaluation) working group tasked with systematically investigating the content of, and redundancies or overlap in, federal surveys in areas related to social capital measurement. A good place to start is with the Current Population Survey (CPS) Civic Engagement Supplement and the Neighborhood Social Capital Module of the American Housing Survey. Other candidate pairings are the CPS Volunteer Supplement with the American Time Use Survey, and the CPS Voting and Registration Supplement with other national election administration and voting surveys. The technical working group should be charged with finding effective ways to coordinate the content of these options.
The reference list of surveys in Table 4-1 provides a further roadmap for this assessment.
Ideally, both parts of the data collection strategy identified in Recommendation 1—national population surveys conducted by the statistical agencies and detailed special studies—should be pursued. A viable approach to optimizing the value of public resources would be to give priority to supporting sustained, locally intensive research models (e.g., the Chicago neighborhoods and NYC immigration studies).
RECOMMENDATION 4: For measuring relationships between such phenomena as social cohesion and neighborhood environment on one hand, and health, social, and economic outcomes on the other, statistical and funding agencies should take an experimental approach, sponsoring subnational-level studies and in-depth, longitudinal pilot data collections. This suggests that additional research and testing will be needed before committing to the content and structure of specific survey instruments. The statistical agencies’ advisory groups may be especially helpful in thinking creatively about what kinds of research and survey projects offer the most promise.
New, innovative work might involve conducting experiments (i.e., randomized treatment and control), but it might also include observational analysis, focus groups, cognitive interviews, and the like. Conducting experiments to identify causal effects is not the comparative advantage of the federal statistical agencies—they are best suited for collecting large-scale, high-quality, representative measures of political, economic, and societal indicators with the goal of tracing trends in society over time. Such data collections (whether from surveys or administrative records) then enable scholars to leverage exogenous shocks (or randomized treatments) to test causal claims. And now is the right time to move on the measurement and design issues implied in the above recommendation because federal statistics in this subject matter area have not yet become deeply rooted.
Additionally, numerous national polling organizations regularly conduct surveys intended to gauge various aspects of civic engagement and social cohesion. These data collections, such as the Gallup World Survey and various surveys conducted by the Pew Research Center, have high value and are often more nimble in reacting to changing conditions and the emergence of new issues and questions. The Pew 2012 survey project, Civic Engagement in the Digital Age, is one example.