Using the American Community Survey for the National Science Foundation's Science and Engineering Workforce Statistics Programs

Appendix A
Workshop Summary and Agenda

The Panel on Assessing the Benefits of the American Community Survey for the NSF Division of Science Resources Statistics held a public workshop on October 5, 2007, in Washington, D.C., to discuss NSF, Census Bureau, and data user needs for the Scientists and Engineers Statistical Data System (SESTAT), particularly the National Survey of College Graduates (NSCG). The workshop goals were to:

- clarify issues concerning alternative approaches to using the ACS as a sampling frame for the NSCG;
- identify issues related to the use of field-of-degree information on the ACS with regard to statistical methodology, data quality, and data products;
- consider the use of field-of-degree and other information from the ACS as a screening element for subsequent surveys such as the NSCG, which until now has used level-of-degree information from the decennial census long form; and
- consider the relevance and adequacy of ACS products for meeting current and emerging data needs at NSF.

The workshop agenda appears at the end of this appendix.

Among the highlights of the workshop, NSF provided a background discussion of the range of workforce surveys in SESTAT, including the NSCG, the National Survey of Recent College Graduates (NSRCG), the Survey of
Doctorate Recipients (SDR), and the Survey of Earned Doctorates (SED). These surveys offer a comprehensive and integrated system of information on the employment, education, and demographic characteristics of scientists and engineers in the United States. The three sample surveys together include more than 100,000 respondents, representing a population of over 21 million people who have science and engineering (S&E) or S&E-related degrees or occupations. The NSCG, the survey of primary interest to the panel, captures data on people with at least a bachelor's degree, who account for 85-90 percent of the SESTAT population; it is also the only source of information on people with non-U.S. degrees.

NSF staff also reviewed the several mandates under which the agency operates. Under the 1950 act that created the agency, NSF is mandated to serve as a clearinghouse of information on the S&E enterprise. The amended act calls for NSF to collect and analyze demographic and education information on individuals with degrees in science and engineering and to design, establish, and maintain a data collection and analysis capability for identifying and assessing the number and characteristics of scientists and engineers in the United States. Additional congressional mandates require NSF to produce the biennial reports Women, Minorities, and Persons with Disabilities in Science and Engineering and Science and Engineering Indicators.

The panel heard from five SESTAT data users, including three of the panel's own members, on the various uses of and needs for the data. The uses include reconstructing answers from the census long form to evaluate the quality of the Census Bureau's imputation of education, assessing gender and racial earnings gaps, evaluating the relationship between work activity and earnings, and determining the contribution to U.S.
science from foreign-born versus native-born workers. The NSCG is especially useful to researchers interested in determining how the labor force is changing and the effects of immigration. The users stressed that within confidentiality and privacy limits, particularly under Title 13, data linkages between the ACS and the NSCG and links from the SDR and the NSCG to the U.S. Patent and Trademark Office database would be helpful in answering additional research questions. The latter of these linkages would facilitate research into the role of entrepreneurial activities in the fields of science and engineering. The Census Bureau provided a comprehensive overview of the content testing planned for the field of bachelor’s degree question. There are two versions of the question: categorical or forced choice, and open-ended. Each version of the question was mailed to 15,000 housing units in July 2007, and nonresponse follow-up was conducted by telephone and personal visit in August and September. A content follow-up reinterview was conducted by telephone to assess the reliability of the responses. This
reinterview attempted to reach the original respondent and asked both versions of the field-of-degree question.

The Census Bureau will use several decision criteria as evaluation measures for the ACS content test: comparing the content test results (distributions and percentages) with the NSCG; evaluating item missing-data rates and the reliability of the estimates (gross difference rates and the L-fold index of inconsistency); assessing the correspondence and rate of inconsistency between responses to the open-ended and categorical questions; determining the general impact on NSCG's sampling frame; and comparing item nonresponse rates for the educational attainment question that precedes the field-of-degree question. Preliminary results from this evaluation are expected in early 2008, with the goal of obtaining approval from the U.S. Office of Management and Budget (OMB) by July 2008.

Census Bureau personnel also addressed issues associated with using the ACS as a sample frame for other surveys. Under the current ACS OMB terms of clearance, surveys that previously used decennial long-form data, such as the NSCG, may use the ACS as a sampling frame. To clarify the rules of access to ACS data, the Census Bureau has developed a policy describing the criteria for determining appropriate uses of the ACS as a frame for reimbursable follow-on surveys. Thus far, NSF is one of only two survey sponsors that have officially requested use of the ACS for this purpose.

There are several technical issues associated with using the ACS as a sample frame for the NSCG. First, the ACS sample over a 12-month period does not capture enough of the rare populations needed in the NSCG: most populations require two annual cycles of the ACS, and the rarest populations may need up to five rounds.
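The reliability measures named above have standard textbook forms: the gross difference rate is the share of original/reinterview pairs whose responses disagree, and the L-fold index of inconsistency scales that disagreement by the disagreement expected if the two responses were independent with the observed margins. A minimal sketch of those standard formulas follows; the function name and data are illustrative, and this is not the Census Bureau's production methodology code.

```python
from collections import Counter

def reliability_measures(original, reinterview):
    """Gross difference rate (GDR) and L-fold index of inconsistency
    for paired categorical responses from an interview and reinterview.

    GDR: fraction of pairs that disagree.
    L-fold index: GDR divided by the chance-expected disagreement rate
    given the observed margins, reported here on the conventional
    0-100 scale (roughly: <20 low, 20-50 moderate, >50 high).
    """
    assert len(original) == len(reinterview) and original
    n = len(original)
    pairs = Counter(zip(original, reinterview))
    disagree = sum(c for (a, b), c in pairs.items() if a != b)
    gdr = disagree / n
    # Margins of the original and reinterview classifications.
    row, col = Counter(original), Counter(reinterview)
    # Sum of products of matching-category margins, scaled by n^2.
    expected_agree = sum(row[k] * col[k] for k in set(row) | set(col))
    index = (n * disagree) / (n * n - expected_agree)
    return gdr, 100 * index
```

For example, six pairs with two disagreements and balanced 50/50 margins give a GDR of 1/3 and an index of about 67, since chance alone would already produce a 50 percent disagreement rate.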
NSF personnel offered their thinking on four options for drawing the NSCG sample from the ACS:

1. Draw the sample once a decade (the current approach).
2. Conduct selective updates, oversampling to capture certain populations, with a regularly scheduled major redesign.
3. Divide the survey into multiple panels on a 2- to 3-year cycle.
4. Create rotating panels.

The panel also solicited comments from Graham Kalton at the workshop regarding the options provided by NSF. An additional option stemming from that conversation was to design rotating panels for the rare populations and cross-sections for the rest of the population.
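To illustrate the rotating-panel idea in the options above: the sample is split into panels, a new panel enters each year, and the oldest panel rotates out after a fixed number of years, so every year's interviews mix fresh and continuing cohorts instead of relying on a single once-a-decade draw. A generic sketch, with the panel life and labels invented for illustration (this is not NSF's actual NSCG design):

```python
from collections import deque

def rotating_panel_schedule(panel_life, start_year, n_years):
    """Map each year to the list of active panels under a simple
    rotation: one new panel enters per year, and each panel stays in
    the sample for `panel_life` consecutive years before rotating out.
    Generic illustration only, not an actual survey design.
    """
    active = deque()
    schedule = {}
    for i in range(n_years):
        active.append(f"P{i + 1}")      # new panel enters this year
        if len(active) > panel_life:
            active.popleft()            # oldest panel rotates out
        schedule[start_year + i] = list(active)
    return schedule
```

With a three-year panel life starting in 2010, the 2013 sample would consist of panels P2 through P4: each year refreshes part of the sample while retaining continuing panels for longitudinal comparison.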
Panel on Assessing the Benefits of the American Community Survey for the NSF Division of Science Resources Statistics
October 5, 2007
The Keck Center of the National Academies
500 Fifth Street, N.W.
Washington, D.C. 20001
Room 101

WORKSHOP AGENDA

Friday, October 5

8:00-8:30 am
Call to Order and Introductions
Hal Stern, Chair

8:30-10:30 am
Session 1: Future Design of the NSF Workforce Surveys
Moderator: Dan Black
Mary Frase, Deputy Director, Division of Science Resources Statistics, NSF

10:45 am-12:15 pm
Session 2: Requirements for the National Survey of College Graduates
Moderator: Cathy Weinberger
Roundtable Discussion of Uses by Panel Members
Using NSCG Data in Wage Inequality Research
  Donna Ginther, University of Kansas (by phone)
Using NSCG Data in Assessing the Quality and Composition of the Scientific Workforce
  Sharon Levin, University of Missouri, Kansas City (by phone)

12:15-1:00 pm
Working Lunch: Roundtable Discussions of User Needs
1:00-2:30 pm
Session 3: Use of the American Community Survey as a Sample Frame and for Analytical Purposes
Moderator: Chet Bowie
Plans for Methods Panel and Testing FOD Question
  Jennifer Tancreto, Chief, ACS Data Collection Methods Staff
Access to ACS Data for Sample Design and Analysis
  Cheryl Landman, Chief, Demographic Surveys Division

2:45-4:00 pm
Session 4: Sample Design Options and Criteria
Moderator: Robert Santos
Discussion of Options
  Stephen H. Cohen, Chief Statistician, Science Resources Statistics Division, NSF

4:00-5:30 pm
Open Discussion and Summary
Hal Stern, Chair
Guest Panelist: Graham Kalton, Chair, NRC Panel on the Functionality and Usability of Data from the American Community Survey