Uses in Research
This chapter opens with a discussion of the importance of research on changes in the skill requirements of jobs. It then presents a brief overview of recent applications of the O*NET to timely, policy-relevant topics in labor market research and human resource management research. The next sections consider the major limitations of the O*NET as a research tool. The final section presents the panel’s conclusions and recommendations related to the use of O*NET in research.
Researchers, educators, public officials, and private employers are keenly interested in understanding how the skill requirements of jobs in the United States are changing. Such information is critical for developing education and workforce training policies; for assessing the impact of potentially disruptive economic forces, such as the rise of international offshoring; for evaluating how recent technological changes, such as the computer revolution, are reshaping job skill requirements; and for understanding the degree to which U.S. natives and foreign immigrants compete in the labor market or, alternatively, occupy distinct and potentially complementary niches.
Answering such questions requires, as a starting point, data that accurately characterize the attributes of jobs performed in the United States. Such data have not always been readily available. Standard labor force survey data sets, such as the Current Population Survey, the Decennial Census, and the American Community Survey, provide two types of information that are commonly used to measure U.S. job skill requirements and their changes over time: (1) measures of the human capital of the workforce, in particular, the distribution of educational attainment and experience of
employed workers, job-seekers, and labor force nonparticipants, and (2) measures of the share of overall employment consisting of various broad (or detailed) occupational categories, such as professional and technical occupations; clerical, administrative, and sales occupations; precision production, craft, and repair occupations; operators, fabricators, and laborers; service occupations; and farm occupations.
Both types of measures have strengths and limitations. Human capital variables, such as schooling or experience, measure the credentials that workers bring to the job. These measures are useful for roughly comparing education and experience requirements among various occupations, but they do not tell us why these occupations employ workers with these credentials—that is, what job tasks the workers in these occupations perform that demand the levels of educational attainment or experience that they possess.
Broad occupational categories, by contrast, provide a more precise sense of what tasks workers do on the job—for example, accountants perform bookkeeping and other quantitative and analytical reasoning tasks—but these occupational categories do not facilitate comparisons of job skill requirements across jobs. For example, how do the skill requirements of operators, fabricators, and laborers compare with those of workers in farm occupations? To answer this question rigorously requires a common metric or taxonomy that classifies occupations into their constituent task requirements. Such a taxonomy should be based on sound social science and grounded empirically in direct measurements of the job tasks, aptitudes, and duties of incumbents in each occupation.
Since its inception in 1999, O*NET has become the primary database used by labor market researchers to assess how the skill requirements of jobs in the United States have changed over the recent past and how these requirements are likely to evolve. Relative to human capital measures and occupational categories, O*NET has three key strengths for this kind of research:
It offers the only contemporaneous U.S. data source that comprehensively measures what workers in America do at their jobs. That is, to the panel’s knowledge, O*NET does not have any close substitutes or close competitors as a source of information on the content of jobs performed by the U.S. workforce.1
O*NET provides a tool for comparing job attributes and skill requirements across occupations at a point in time—for example, operators, fabricators, and laborers relative to farm occupations—and for evaluating changes in these job attributes over time.
O*NET provides an exceptionally rich set of scales for assessing job content along numerous dimensions.
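As an illustration of the second and third strengths, one can treat an occupation's descriptor scores as a vector and compare occupations with a standard similarity measure. The sketch below uses invented scores and descriptor names; it is not an O*NET-endorsed method, only an example of the kind of cross-occupation comparison the database makes possible:

```python
import math

def cosine_similarity(profile_a, profile_b):
    """Cosine similarity between two occupations' descriptor-score vectors."""
    dot = sum(a * b for a, b in zip(profile_a, profile_b))
    norm_a = math.sqrt(sum(a * a for a in profile_a))
    norm_b = math.sqrt(sum(b * b for b in profile_b))
    return dot / (norm_a * norm_b)

# Invented importance scores on three hypothetical descriptors
# (manual dexterity, information processing, interpersonal contact).
fabricators = [4.5, 2.0, 1.5]
farm_workers = [4.0, 1.5, 2.0]
accountants = [1.0, 4.8, 3.0]

print(cosine_similarity(fabricators, farm_workers))  # high: similar task mix
print(cosine_similarity(fabricators, accountants))   # lower: different task mix
```

Comparisons of this general kind, applied to the published occupation-level descriptor scores, underlie much of the research described in this chapter.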
RESEARCH USES OF O*NET
Human Resources and Organizational Behavior Research
The O*NET database has been employed with increasing frequency in research on human resource and organizational behavior topics, as researchers have become aware of its potential and accessibility. O*NET contributes to basic research in these areas in three primary ways.
The most frequent use of O*NET in human resource research has been to provide data on job characteristics in studies of a wide range of human resource and organizational behavior topics. These topics include job autonomy levels (e.g., Andreassi and Thompson, 2007), job control (Liu, Spector, and Jex, 2005), work context (Dierdorff and Ellington, 2008; Dierdorff and Morgeson, 2007), knowledge and skill retraining time (AIR research), occupational literacy requirements (AIR research), skill level estimation (Wiita and Palmer, 2009), and job level (Tracey, Sturman, and Tews, 2007). These studies address topics as diverse as work-family conflict, personality testing, stress, and emotional labor, indicating that researchers in different domains see O*NET as a potentially valuable source of job and occupational characteristics information for understanding a wide range of organizational phenomena. Most of this research is relatively recent (i.e., from the past 3 years), so this use of O*NET may grow as more researchers become aware of the database and how it might benefit their research.
Second, O*NET questionnaires have been used by human resources researchers in exploring issues in job analysis and other topics (Dierdorff and Rubin, 2007; Morgeson, Reider, and Campion, 2005). In these cases, the Skills, Abilities, Generalized Work Activities (GWAs), Tasks, Work Styles, and Work Context measures have all been employed by researchers collecting their own data but wishing to examine a research question
involving one or more aspects of the content model. The accessibility of all O*NET measurement tools gives researchers the opportunity to use these questionnaires to address specific, emerging research questions. It also provides an opportunity for feedback to the O*NET Center, as independent researchers using the tools may generate suggestions for improvement. In addition, data collected by researchers using the same instruments that underlie O*NET may serve as a point of comparison for some occupations.
Finally, the O*NET content model and database have been examined by human resources researchers as an object of research in themselves, as detailed in Chapter 2. Research on O*NET itself is a continued focus of human resources researchers interested in understanding the nature of work, the effectiveness of various job analytic methods, and the similarities and differences among jobs. That is, O*NET has become a useful tool in enhancing understanding and advancing theory and practice in the area of job analysis.
In sum, the O*NET database as well as the O*NET questionnaires have been used by researchers in human resources and organizational behavior to address a wide range of questions regarding job characteristics, such as how job characteristics relate to worker satisfaction and health and how they inform selection and training of workers.
Economic and Labor Market Research
The O*NET database is used with increasing frequency and prominence by economists and sociologists studying the evolution of the labor market. Three areas of particular focus have been the effects of computerization on labor demand, the susceptibility of U.S. jobs to international offshoring, and the impact that low-skilled immigrants have on the employment and earnings of U.S. natives.
Howell and Wolff (1991) were the first researchers to study the impact of computerization on the labor market using job task measures. Their work predates O*NET and thus relies on the Dictionary of Occupational Titles (DOT). Autor, Levy, and Murnane (2003) extended the work of Howell and Wolff by offering a formal hypothesis for how the spread of computerization shapes the demand for workplace skills and testing this hypothesis using DOT data. Stated simply, their core hypothesis is that computerization leads to the automation of a large set of “middle education” routine cognitive and manual tasks, such as bookkeeping, clerical work, and repetitive production tasks, with a concomitant reduction in these tasks’ share of total national employment. Although the initial work of Autor,
Levy, and Murnane relied on the DOT, recent work exploring the same hypothesis has extended this analysis using O*NET. Notably, Goos, Manning, and Salomons (2009) have applied O*NET job content measures to analysis of data from numerous Organisation for Economic Co-operation and Development countries.
A second research topic of substantial recent interest has been the potential impact of international offshoring on U.S. employment. Blinder (2007) argues that the major constraint on the outsourcing of U.S. jobs is the degree to which these jobs must be performed in person; jobs that do not suffer substantial quality degradation when performed at a distance are likely to be increasingly sourced offshore, where employers can take advantage of lower labor costs. To gauge the susceptibility of U.S. jobs to offshoring, Blinder (2007) used O*NET to classify occupations according to their need for in-person interactions. The major conclusion of this work is that between 22 and 29 percent of all U.S. jobs are or will be potentially offshorable within one to two decades. A number of recent papers follow up on this work, including Smith and Rivkin (2008) and Blinder and Krueger (2009).
A third prominent topic in which the O*NET database has found application is the analysis of the impact of low-skilled immigrants on the employment and wages of U.S. natives. A voluminous and contentious literature, commencing with Card (1990), studies the economic consequences for U.S. workers of rising immigration flows from Central and South America. The bulk of this literature has concerned itself primarily with the employment rates and wages of natives. Recent contributions by Cortes (2008) and Peri and Sparber (in press) have advanced the debate by using O*NET to document substantial differences in the patterns of occupational specialization of immigrants and natives with similar levels of education and experience. An intuitive but nonetheless important finding of both studies is that, for given levels of education and experience, U.S. natives are much more likely than immigrants to be employed in language- and communication-intensive occupations, whereas immigrants are more likely to be employed in occupations demanding physical labor. This finding in part helps to explain why similarly educated immigrants and natives do not appear to compete more directly in the labor market.
In sum, these research examples highlight the broad applicability of the O*NET to core and emerging topics in labor market research. Without O*NET (and its predecessor, the DOT), empirical analysis of these topics would be substantially impoverished. By providing a tool for looking within occupations, O*NET affords researchers the opportunity to better assess how computerization, offshoring, and immigration differentially affect distinct job categories according to their core task requirements.
SHORTCOMINGS OF O*NET AS A RESEARCH TOOL
Although O*NET has unique value as a research tool, it has not reached its full potential. Its shortcomings fall under three headings: (1) the O*NET survey instrument, (2) the data collection effort, and (3) the dissemination of O*NET data. Some of the issues identified would require substantial rewriting of the survey instrument to address. Others are readily addressable, requiring only better dissemination of data already collected.
The O*NET Survey Instrument
This section briefly highlights a subset of the weaknesses in the survey instruments that are most salient to labor market researchers (see also Chapter 4).
Redundancies and Ambiguities
As noted in Chapter 2, the O*NET content model reflects the earlier content model of the Advisory Panel for the Dictionary of Occupational Titles (1993), incorporating “everything about jobs that had been studied, in the name of explaining occupational choices, occupational performance, and work/occupational satisfaction.” The result is an array of survey questions that, to researchers not deeply versed in the history of O*NET, appear redundant or, at best, not obviously distinct. Chapter 2 points out that “problem solving” appears as an item on four separate O*NET questionnaires: Abilities, Skills, Work Styles, and GWAs. Confronted with this ambiguity, researchers who wish to analyze the problem-solving requirements of occupations must either choose among the four alternative scales or construct some aggregate of two or more of them. Neither approach is as satisfactory as having a single, unified problem-solving scale in O*NET or clearer conceptual distinctions among the four measures.
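Faced with the four overlapping scales, a researcher might, for example, standardize each scale across occupations and average the results into a single composite. The sketch below, with invented ratings, illustrates this workaround; it is not a recommended O*NET procedure:

```python
from statistics import mean, stdev

def zscores(values):
    """Standardize a list of scale scores across occupations."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def composite(scales):
    """Average the standardized scores of several scales, occupation by occupation.

    `scales` maps scale name -> ratings, one per occupation, in the same order.
    """
    standardized = [zscores(ratings) for ratings in scales.values()]
    return [mean(column) for column in zip(*standardized)]

# Invented problem-solving ratings for four occupations on two of the four
# O*NET scales on which "problem solving" appears.
scales = {
    "Skills: Complex Problem Solving": [4.1, 2.0, 3.2, 1.5],
    "GWAs: Making Decisions and Solving Problems": [4.4, 2.3, 3.0, 1.8],
}
print(composite(scales))  # one composite score per occupation
```

The arbitrariness of such an aggregate, compared with a single well-defined scale, is precisely the difficulty described above.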
Another weakness of the survey lies in its inclusion of both the Level and Importance scales in four of the seven O*NET descriptor domains (Abilities, Knowledge, Skills, and GWAs). As discussed in Chapter 4, the responses to these scales are highly correlated and largely redundant. To cut through these redundancies, researchers have little option but to make arbitrary choices about which scales to employ and which to discard.
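The redundancy can be checked directly: a researcher with the published occupation-level scores can correlate the Level and Importance ratings for a given descriptor across occupations. The ratings below are invented for illustration and constructed to mimic the high correlation the panel describes:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length rating series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Invented mean Level and Importance ratings for one descriptor
# across six occupations.
level = [5.5, 2.1, 4.0, 3.2, 6.1, 1.0]
importance = [4.8, 2.0, 3.7, 3.0, 4.9, 1.2]
print(round(pearson(level, importance), 3))
```

When such correlations approach 1, the two scales convey nearly the same information, and little is lost by retaining only one.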
Complexity of Survey Questions
Two of the seven questionnaires (Abilities and Skills) for evaluating occupations are completed by occupational analysts rather than by job incumbents. These questionnaires include many technical terms that research has shown to be unfamiliar to most employees (see Chapter 4). For example, although the Abilities questionnaire was originally intended to be completed by job incumbents (Peterson et al., 1999), it contains terms such as “fluency of ideas,” “category flexibility,” “speed of closure,” and “rate control” that are unlikely to be familiar to laypersons.
The question of whether job incumbents’ or analysts’ ratings most accurately reflect the actual requirements of the occupation remains open and warrants further study (see Chapter 4).
Vague Job Content Measures
Some O*NET job content measures are so complex and vague that it is doubtful whether they measure a single, well-defined construct. For example, Item 30 of the Skills questionnaire asks respondents to rate the importance of Systems Evaluation, defined as “Identifying measures or indicators of system performance and the actions needed to improve or correct performance, relative to the goals of the system” (National Center for O*NET Development, no date). Setting aside the question of whether lay respondents understand this definition, the panel is unsure whether “systems evaluation” is a specific job skill or a loosely defined admixture of other, more generic skills. In fact, the Systems Evaluation skill appears to combine a handful of comparatively well-defined, specific building-block job tasks from the GWAs questionnaire, including gathering information, monitoring, evaluating information to determine compliance with standards, and making decisions and solving problems.
Problematic Survey Anchors
As detailed in Chapter 4, many of the behavioral anchors offered to guide respondents in rating aspects of their jobs are problematic. Hubbard et al. (2000) observe that numerous anchors offer examples from specialized occupations that may not be well known to a substantial share of job incumbents (e.g., computer programmers, loan evaluators, managers of road repair crews). Even anchors drawn from commonplace occupations may be sufficiently specific to their occupational domain to impede ready comparison to the incumbent’s own occupation. Returning, for example, to the Systems Evaluation question, the O*NET survey anchors include
Level 2, determining why a coworker was unable to complete a task on time; Level 4, understanding why a client is unhappy with a product; and Level 6, evaluating the performance of a computer system (National Center for O*NET Development, no date).
It is difficult for the panel to conceive of a well-defined metric that would unambiguously place the level of systems evaluation required for “evaluating the performance of a computer system” either above or below the level of systems evaluation required for “understanding why a client is unhappy with a product.” These tasks appear to us noncomparable on any ordinal scale of the “level” of occupational performance. In addition, it seems unlikely that the typical worker performing customer service would be sufficiently well versed in computer engineering to determine whether the level of systems evaluation required in her job is above or below that required for a computer systems engineer (and vice versa for a typical systems engineer).
The O*NET Data Collection Effort
The following section focuses on data collection issues that are most relevant to research uses of the O*NET. See Chapter 4 for a general discussion of data collection issues.
The questionnaires used to populate the previous database, O*NET 13.0, contained 277 survey questions, many of which invite respondents to rate both the Level and Importance of various descriptors. This amounts to a burdensome data collection effort for both the U.S. Department of Labor (DOL) and the O*NET respondents themselves.
While some respondent burden is a necessary cost of surveying, excessive length and complexity erode the quality of survey results in two ways. First, all else equal, greater respondent burden reduces survey response rates, thus shrinking the sample size and potentially skewing its representativeness. Second, because the O*NET job incumbent questionnaires must be completed by three different sets of respondents, the reliability of comparisons of responses across domains within an occupation is reduced. For example, if the GWA responses for an occupation do not correlate tightly with the Knowledge responses for that occupation, is that because the two questionnaires have truly distinct content or because different occupational incumbents answered each survey?
Survey Detail, Sample Size, and Refresh Frequency
In addition to respondent burden, the O*NET survey data collection effort faces trade-offs along three dimensions that affect data quality: the size of the sample collected for each occupation, the number of detailed occupations individually surveyed (rather than subsumed within broader occupation categories), and the time interval between successive waves (or refreshes) of the data. Improving O*NET along any one of these dimensions increases the total cost of data collection; holding constant data collection costs, improvements on one dimension necessitate cutbacks along either or both of the remaining dimensions.
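The three-way trade-off can be made concrete with a back-of-the-envelope cost identity. All figures below are invented for illustration, not actual O*NET costs:

```python
def annual_collection_cost(n_occupations, sample_per_occupation,
                           refresh_interval_years, cost_per_response):
    """Annual cost if every occupation is resampled once per refresh interval."""
    responses_per_year = (n_occupations * sample_per_occupation
                          / refresh_interval_years)
    return responses_per_year * cost_per_response

# Invented baseline: 900 occupations, 25 incumbents each, 5-year refresh,
# $100 per completed response.
baseline = annual_collection_cost(900, 25, 5, cost_per_response=100)
# Doubling the per-occupation sample at a fixed budget forces a compensating
# change elsewhere, e.g., doubling the refresh interval to 10 years.
adjusted = annual_collection_cost(900, 50, 10, cost_per_response=100)
print(baseline, adjusted)  # identical annual cost
```

The identity makes plain that, at a fixed budget, larger samples, more detailed occupations, and more frequent refreshes can be purchased only at one another's expense.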
While it is not possible for us to stipulate the optimal trade-offs without further study, the panel found little evidence that these trade-offs were carefully considered in the design of O*NET or that they are currently weighed on an ongoing basis. Most important, it appears that disproportionate precedence has been given to preserving the integrity and completeness of the O*NET content model over the other dimensions of survey quality (sample depth, occupational precision, and refresh frequency). The prototype content model attempted to follow the very wide-ranging content model of the Advisory Panel for the Dictionary of Occupational Titles (1993). To address the low response rates in the field test of the prototype, DOL charged a study group with “making changes that would reduce the respondent burden (thereby increasing response rates) while keeping intact the Content Model” (Hubbard et al., 2000, p. 5). As a result, the current content model is largely unchanged from the prototype. This history suggests that DOL may not have given due consideration to the costs that a lengthy and complex survey instrument imposes on other dimensions of data quality. The fact that DOL must, for each occupation, survey three different sets of job incumbents as well as occupational experts and occupational analysts to obtain the requisite data on the seven primary domains substantially elevates the cost of administering O*NET for any given sample size or refresh interval, making any increase in sample size or survey frequency correspondingly more costly.
The panel’s conclusion in Chapter 2 that the O*NET survey includes a substantial number of redundant questions suggests that the burden of the survey instrument can be reduced at relatively low cost. Eliminating redundant material through the research recommended in that chapter would free resources for improvements along the other quality dimensions with no significant loss on any relevant dimension.
Treatment of New and Emerging Occupations
It is imperative that O*NET collect data on new and emerging occupations. These occupations are often of substantial interest to policy makers. A leading example is the recent policy interest in green jobs.
While identifying new and emerging occupations for inclusion in the occupational classification system is necessary, it is important to avoid a costly proliferation of niche occupations. The DOT featured more than 12,000 occupations, a number far too large to sample scientifically and refresh regularly at current expenditure levels. The question of how new and emerging occupations are identified and selected for inclusion in O*NET deserves careful study and policy development.
Dissemination of O*NET Data
One of the key shortcomings of O*NET as a research tool could be addressed simply by improving data dissemination. Although researchers, especially those in labor market studies and human resource research, increasingly rely on O*NET, they are hindered in their analyses by not having ready access to demographic and other data on respondents—such as education, age, gender, and race—at both the aggregate occupation level and the individual respondent level. In addition, researchers would like easy access to data on when each occupation was sampled or refreshed, as well as to survey data from prior sampling waves.
These limitations can be readily addressed. Some of the desired data elements are already available. For example, the O*NET Center makes older versions of the database available on request, but many researchers are not aware of this. Other data elements are already available from the O*NET website but are not easily located. Still other data elements probably cannot be released to researchers without first establishing safeguards to protect respondent confidentiality.
The panel believes that DOL and the O*NET Center should strive to make O*NET source data accessible to researchers and end users, provided that doing so does not compromise confidentiality or entail substantial costs. Such data sharing increases the value of the O*NET resource and generates knowledge that is potentially invaluable to researchers, policy makers, and ultimately the O*NET Center itself.
This data sharing must be governed by transparent and consistently applied policies that stipulate what data are available and how they are accessed. Nowhere is this issue more salient than in the provision of individual-level O*NET survey data (also referred to as microdata) for research. O*NET microdata offer a potentially rich research resource; using them, researchers can explore detailed questions that cannot be adequately
addressed using aggregate, occupation-level data available on the O*NET website. Authorizing qualified researchers to access O*NET microdata for purposes of analysis and publication would encourage the growth of a body of research to address these types of important research questions. Such a body of research would complement studies using individual-level O*NET data published by research groups that include one or more members of the O*NET development team (e.g., Dierdorff and Morgeson, 2007, 2009; Dierdorff, Rubin, and Morgeson, 2009; Peterson et al., 1999).
Sharing individual-level O*NET data with outside researchers also entails risks; if researchers breach the survey’s confidentiality, this will compromise cooperation between respondents and government surveyors. Thus, data sharing must be thoughtfully managed and carefully overseen. Federal statistical agencies have developed several approaches to managing access to data sets, such as removing all direct and indirect identifiers, making confidentiality edits, restricting access to qualified researchers who agree to confidentiality protections, and establishing disclosure review boards to oversee all data sharing activities (National Research Council, 2007, 2009).
CONCLUSIONS AND RECOMMENDATIONS
Many of the panel’s recommendations related to the content model and data collection would address the shortcomings discussed in this chapter. In addition, we conclude that, although researchers, especially those in labor market studies and human resource research, increasingly make use of O*NET, they are limited in their analyses by the lack of demographic and other data on the respondents and by the lack of access to individual-level data. They are also limited by the lack of access to previous versions of the database, which would enable longitudinal studies of changes in the skills and other requirements of jobs.
Recommendation: To increase researchers’ access to O*NET data, the Department of Labor, with advice and guidance from the technical advisory board recommended in Chapter 2 and the user advisory board recommended in Chapter 6, should actively inform the research community about the detailed information on the samples obtained in each occupation that is currently available and should provide additional detailed information, including the survey response rate, the date ranges of survey responses collected, and similar sample characteristics.2
Recommendation: The Department of Labor, with advice and guidance from the technical advisory board and the user advisory board, should consider the feasibility of making available additional statistics (in addition to means) for survey responses in each occupation, such as median, standard deviation, 25th percentile, and 75th percentile. In addition, DOL should consider the feasibility of making available data on the demographic characteristics of respondents in each occupation, including income,3 education, gender, race, and age.
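The additional statistics named in this recommendation are straightforward to compute once the underlying responses for an occupation are available. A sketch with invented incumbent ratings:

```python
from statistics import mean, median, quantiles, stdev

def occupation_summary(responses):
    """Summary statistics for one descriptor's ratings within one occupation."""
    p25, _, p75 = quantiles(responses, n=4)  # 25th, 50th, 75th percentiles
    return {
        "mean": mean(responses),
        "median": median(responses),
        "sd": stdev(responses),
        "p25": p25,
        "p75": p75,
    }

# Invented incumbent ratings on a 1-7 importance scale for one descriptor.
ratings = [4, 5, 3, 6, 4, 5, 2, 7, 4, 5]
print(occupation_summary(ratings))
```

Publishing such distributional statistics alongside the means would let researchers gauge within-occupation agreement, not just central tendency.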
Recommendation: The Department of Labor, with advice and guidance from the technical advisory board and the user advisory board, should explore ways of making the availability of successive waves of O*NET survey responses for each occupation (with dates) widely known to the research community, for use in longitudinal analysis.4
Recommendation: The Department of Labor and the O*NET Center, with advice and guidance from the technical advisory board and the user advisory board, should draw up policies allowing researchers to access individual-level data and communicate these policies clearly to DOL and O*NET Center staff, contractors, and outside researchers. These policies should include appropriate techniques to protect individual privacy, such as restricting access to qualified researchers who agree to confidentiality protections.
REFERENCES
Advisory Panel for the Dictionary of Occupational Titles. (1993). The new DOT: A database of occupational titles for the twenty-first century (final report). Washington, DC: U.S. Department of Labor, Employment and Training Administration.
Andreassi, J.K., and Thompson, C.A. (2007). Dispositional and situational sources of control: Relative impact on work-family conflict and positive spillover. Journal of Managerial Psychology, 22, 722-740.
Autor, D.H., and Handel, M. (2009). Putting tasks to the test: Human capital, job tasks and wages. Working paper, Massachusetts Institute of Technology, May.
Autor, D.H., Levy, F., and Murnane, R.J. (2003). The skill content of recent technological change: An empirical investigation. Quarterly Journal of Economics, 118(3), 1279-1333.
Blinder, A. (2007). How many U.S. jobs might be offshorable? Princeton University Center for Economic Policy Studies Working Paper No. 142, March. Available: http://www.princeton.edu/ceps/workingpapers/142blinder.pdf [accessed December 2009].
Blinder, A.S., and Krueger, A.B. (2009). Alternative measures of offshorability: A survey approach. Princeton University Center for Economic Policy Studies Working Paper No. 190, August. Available: http://www.princeton.edu/ceps/workingpapers/190blinder.pdf [accessed November 2009].
Card, D. (1990). The impact of the Mariel Boatlift on the Miami labor market. Industrial and Labor Relations Review, 43, 245-257.
Cortes, P. (2008). The effect of low-skilled immigration on U.S. prices: Evidence from CPI data. Journal of Political Economy, 116(3), 381-422.
Dierdorff, E.C., and Ellington, J.K. (2008). It’s the nature of the work: Examining behavior-based sources of work-family conflict across occupations. Journal of Applied Psychology, 93, 883-892.
Dierdorff, E.C., and Morgeson, F.P. (2007). Consensus in work role requirements: The influence of discrete occupational context on role expectations. Journal of Applied Psychology, 92, 1228-1241.
Dierdorff, E.C., and Morgeson, F.P. (2009). Effects of descriptor specificity and observability on incumbent work analyst ratings. Personnel Psychology, 62, 601-628.
Dierdorff, E.C., and Rubin, R.S. (2007). Carelessness and discriminability in work role requirement judgments: Influences of role ambiguity and cognitive complexity. Personnel Psychology, 60, 597-625.
Dierdorff, E.C., Rubin, R.S., and Morgeson, F.P. (2009). The milieu of managerial work: An integrative framework linking work context to role requirements. Journal of Applied Psychology, 94, 972-988.
Dustmann, C., Ludsteck, J., and Schönberg, U. (in press). Revisiting the German wage structure. Quarterly Journal of Economics.
Felstead, A., Gallie, D., Green, F., and Zhou, Y. (2007). Skills at work, 1986 to 2006. ESRC Centre on Skills, Knowledge and Organisational Performance, University of Oxford.
Goos, M., Manning, A., and Salomons, A. (2009). Job polarization in Europe. American Economic Review Papers and Proceedings, 99(2), 58-63.
Handel, M.J. (2007). A new survey of workplace skills, technology, and management practices (STAMP): Background and descriptive statistics. Paper presented at Workshop on Research Evidence Related to Future Skill Demands, National Academies, Washington, DC. Available: http://www7.nationalacademies.org/cfe/Future_Skill_Demands_Michael_Handel_Paper.pdf [accessed July 2009].
Handel, M.J. (2008a). Measuring job content: Skills, technology, and management practices. Institute for Research on Poverty Discussion Paper No. 1357-08. Madison: University of Wisconsin.
Handel, M.J. (2008b). What do people do at work? A profile of U.S. Jobs from the Survey of Workplace Skills, Technology, and Management Practices (STAMP). Paper presented at the Labor Seminar, Wharton School of Management, University of Pennsylvania.
Handel, M.J. (2009). The O*NET content model: Strengths and limitations. Paper prepared for the Panel to Review the Occupational Information Network (O*NET). Available: http://nrc51/xpedio/groups/dbasse/documents/websupport/051537.doc [accessed July 2009].
Howell, D.R., and Wolff, E.N. (1991). Trends in the growth and distribution of skills in the U.S. workplace, 1960-1985. Industrial and Labor Relations Review, 41, 486-502.
Hubbard, M., McCloy, R., Campbell, J., Nottingham, J., Lewis, P., Rivkin, D., and Levine, J. (2000). Revision of O*NET data collection instruments. Raleigh, NC: National Center for O*NET Development. Available: http://www.onetcenter.org/reports/Data_appnd.html [accessed September 2009].
Liu, C., Spector, P.E., and Jex, S.M. (2005). The relation of job control with job strains: A comparison of multiple data sources. Journal of Occupational and Organizational Psychology, 78, 325-337.
Morgeson, F.P., Reider, M.H., and Campion, M.A. (2005). Selecting individuals in team settings: The importance of social skills, personality characteristics, and teamwork knowledge. Personnel Psychology, 58, 583-611.
National Center for O*NET Development. (no date). Questionnaires. Available: http://www.onetcenter.org/questionnaires.html [accessed July 2009].
National Research Council. (2007). Putting people on the map: Protecting confidentiality with linked social-spatial data. Panel on Confidentiality and Data Access Arising from the Integration of Remotely Sensed and Self-Identifying Data. M.P. Gutmann and P.C. Stern (Eds.). Committee on the Human Dimensions of Global Change. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2008). Research on future skill demands: A workshop summary. M. Hilton, Rapporteur. Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2009). Protecting student records and facilitating education research. M. Hilton, Rapporteur. Committee on National Statistics and Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Peri, G., and Sparber, C. (in press). Task specialization, immigration and wages. American Economic Journal: Applied Economics.
Peterson, N.G., Mumford, M.D., Borman, W.C., Jeanneret, P.R., and Fleishman, E.A. (1999). An occupational information system for the 21st century: The development of O*NET. Washington, DC: American Psychological Association.
Peterson, N.G., Mumford, M.D., Borman, W.C., Jeanneret, P.R., Fleishman, E.A., Levin, K.Y., Campion, M.A., Mayfield, M.S., Morgeson, F.P., Pearlman, K., Gowing, M.K., Lancaster, A.R., Silver, M.B., and Dye, D.M. (2001). Understanding work using the occupational information network (O*NET): Implications for practice and research. Personnel Psychology, 54, 451-492.
Smith, T., and Rivkin, J.W. (2008). A replication study of Alan Blinder’s “How many U.S. jobs might be offshorable?” Harvard Business School Working Paper No. 08-104, June. Available: http://hbswk.hbs.edu/item/5975.html [accessed November 2009].
Spitz-Oener, A. (2006). Technical change, job tasks and rising educational demands: Looking outside the wage structure. Journal of Labor Economics, 24(2), 235-270.
Tracey, J.B., Sturman, M.C., and Tews, M.J. (2007). Ability versus personality: Factors that predict employee job performance. Cornell Hotel and Restaurant Administration Quarterly, 48(3), 313-322.
U.S. Department of Labor. (1972). Handbook for analyzing jobs. Washington, DC: Author, Manpower Administration.
U.S. Department of Labor. (1991). Dictionary of occupational titles: Revised fourth edition. Washington, DC: Author, Employment and Training Administration.
Wiita, N.E., and Palmer, H.T. (2009, April). O*NET in-demand occupations: Estimating skill levels for success. Paper presented at the 23rd Annual Conference of the Society for Industrial and Organizational Psychology.