Minority Report

ERNEST J. McCORMICK¹

The report of the Committee on Occupational Classification and Analysis deals generally with the issue of alleged discrimination in pay in the form of inequitable treatment based on sex or race. In connection with the report there are two issues with which I am in disagreement and that make it impossible for me to concur with portions of the report. Because of my disagreement on these issues I am writing this minority report. Before setting forth the bases of these disagreements, I would like to refer to the composition of the committee, with particular reference to

¹Professional background especially relevant to committee activities. Work experience: chief, planning unit, occupational research program, U.S. Employment Service, 193~1939; chief occupational analyst, population census, Bureau of the Census, 193~1941; chief, occupational statistics, Selective Service System, 1941-1943; personnel classification officer, U.S. Navy, 194~1945; professor of industrial psychology, Purdue University (including primary research activities for 20 years in job analysis and teaching courses in job analysis), 194~1977. Relevant publications: Job Analysis: Methods and Applications, American Management Associations, 1979; chapter on "Job and Task Analysis" in Handbook of Industrial and Organizational Psychology, M. D. Dunnette, ed., Rand McNally, 1976; chapter on "Job Information: Its Development and Applications" in ASPA Handbook of Personnel Relations, D. Yoder and H. H. Heneman, Jr., eds., Bureau of National Affairs, 1974; chapters on "Job and Task Analysis" and "Job Evaluation" in Industrial Engineering Handbook, G. Salvendy, ed., John Wiley & Sons (in press); 36 technical reports dealing with job analysis and job evaluation; 5 chapters in technical reports dealing with job analysis; several papers in professional journals. Other: consultant on job analysis to the U.S. Employment Service; leader in job analysis workshops.
Currently: professor emeritus, Purdue University; 1315 Sunset Lane, West Lafayette, Indiana 47906.
116 WOMEN, WORK, AND WAGES

the topical domain with which it dealt, namely, the matter of procedures for establishing equitable pay rates for jobs, which can involve the processes of job evaluation. The report of any committee that deals with a controversial subject typically reflects the composition of the committee. In the case of this committee there was no member who was a full-time practitioner in the field of job evaluation, and only a very few members had had any specific experience with, or involvement with, practical job evaluation procedures or with the job analysis processes that are basic to job evaluation and wage determination. The original committee appointments did include an industrial engineer who was deeply involved in job evaluation processes as a consultant, but he resigned shortly after the committee was formed because of possible conflict of interests. I recommended that a person with his background and experience be appointed to replace him, but this was not done. [Biographical sketches of committee members and staff appear at the end of the volume.] Neither was there anyone on the staff who, to my knowledge, had had practical experience in the field of job evaluation. Furthermore, most of the members of the staff had had no experience in the occupational field as broadly conceived, including exposure to job evaluation and its underlying job analysis processes. I believe the activities of the committee were influenced by (and in my opinion seriously impaired because of) the very limited representation on the committee and staff of persons with familiarity with, or practical experience in, job evaluation processes and the underlying field of job analysis.

Criterion for Determining Job Values

One of the critical issues of this minority statement relates to the standard or the criterion that should be used in judging the "worth" of jobs.
Before discussing such standards, however, it would be appropriate to differentiate between two frames of reference in which alleged discrimination is discussed. One frame of reference concerns the matter of "equal pay for equal work," which deals with pay for jobs that are the same or very similar in content. The other reference point concerns the concept of equal pay for "comparable" work or work of "comparable worth" or "comparable" or "equal value." (The committee report deals largely with the "comparable worth" frame of reference.) The Equal Pay Act of 1963 and Title VII of the Civil Rights Act of 1964 both provide for equal pay for equal work and state that it is discriminatory for men and women performing equal work to be paid differently. On legal and rational grounds there is no justification for such discrimination. The report of the committee strongly supports the
objectives of ensuring equal pay for equal work as construed in the frame of reference of equal pay for jobs that are similar in content, and I fully concur in the portions of the report that deal with this concept. My concern deals primarily with the question of the standards or criteria that might be relevant for evaluating the "comparability" of jobs. The committee report is sprinkled with direct references or implications relating to alleged discrimination in the case of certain jobs in which women tend to dominate (these are sometimes called "women's jobs") as contrasted with those in which men tend to dominate (these are sometimes called "men's jobs"). It is alleged that the pay scales for some women's jobs are lower than they "should be," and that such rates are as low as they are because employment in them is dominated by women. Such alleged discrimination is sometimes referred to as institutional discrimination, the theory being that cultural and other factors have resulted in the "tracking" of women into such jobs, with accompanying pay scales below what they "should be." The argument that differences in the pay of women's jobs and men's jobs reflect a form of discrimination has given rise to the concept of equal pay for comparable work (or for work of comparable value or equal value), the implication being that there is some concept of comparability of jobs that would make it possible to justify the establishment of equal pay for jobs that are different in content but comparable in terms of the concept of worth or value. This argument immediately raises the question as to the basis on which jobs might be considered comparable in worth (or noncomparable).
In the report of the committee there are numerous statements that either directly, or by implication or inference, take issue with the principle that the prevailing rates of pay in the labor market should serve as the primary basis for the establishment of pay scales for jobs in specific situations. Furthermore, the committee report implies that the determination of the comparability in worth between jobs should be independent of current wages and salaries found in the labor market. It is with these portions of the report that I am in disagreement, since it is my firm conviction that current wages and salaries are indeed one indication of the underlying relationship between jobs. This relationship between worth and pay, albeit imperfect, is a product of real, impartial forces (as well as of the various possible biases that trouble the committee) and thus cannot rationally be ignored. It is my contention that there is no conceptually appropriate, economically viable, or practical basis for determining the comparability of jobs without considering the value system that underlies the wages and salaries paid to all jobs throughout the entire occupational structure
of our economy. Stated differently, I am convinced that the comparable worth or value as reflected in the going rates of pay assigned to jobs will over time closely correlate with the underlying hierarchy of values that has evolved in our world of work. This hierarchy of values generally reflects the fact that job values are influenced by a variety of factors such as skill, effort, responsibility, type of work activity, and working conditions. Furthermore, this value system is essentially a function of the supply of, and the demand for, individuals who possess the relevant job skills, who have the ability to apply the relevant effort, who are capable of assuming the relevant responsibilities, who can perform the work activities in question, and who are able and willing to work under the working conditions in question. To ignore the value system because it does not produce results that fit certain preconceptions of job worth (whether for or against any class) reflects, in my opinion, a biased frame of reference. The committee report views the labor market as one that tends to undervalue "women's jobs" relative to "men's jobs" and concludes that the market is discriminatory and therefore should be disregarded in establishing rates of pay. Such a view of the labor market seems to me to be naive and unrealistic. The labor market is the generic term for a value system rooted in the hierarchy of skills, effort, responsibility, and work activities (and to some extent working conditions) that compose jobs, and the supply and demand forces that operate as organizations and workers compete in our economy. As a matter of interest, statements of female or minority "undervaluation" seem to be based upon the concept that there is a value system but that some types of individuals in certain jobs are not paid according to the underlying system.
If there is no available hierarchy of worth, there is no objective basis upon which to make claims of bias. Accordingly, I am convinced that the labor market must be the arbiter of basic rates of pay and that there is no other logical, economic, or practical basis for determining the values of jobs, be it in terms of "equal" or "comparable" worth. There are two general approaches that an organization can follow in relating pay scales for its jobs to those of corresponding jobs in the labor market. In the first place, if the jobs in question have identical counterparts in the labor market, the prevailing pay rates (or pay ranges) can be used directly for setting pay scales within the organization. In the second place, when jobs do not have identical counterparts in the labor market, the organization can use some type of job evaluation system for setting the compensation rates for its jobs.² The most effective

²The committee's interim report (Treiman, 1979) provides an extensive discussion of job evaluation, and I will not discuss the process of job evaluation in this minority report.
job evaluation system usually is one that accurately examines the content of jobs (skills, effort, responsibilities, activities, working conditions, etc.) and yields relative job values (usually point values) that correspond closely with (i.e., are correlated with) prevailing rates in the labor market. In effect this means that job evaluation systems (or the procedures for deriving relative job values) should be based on, or related to, those job characteristics that gave rise to prevailing market values. To develop a job evaluation system that did not first examine (and compare) the content of jobs and, second, relate the job content to a value system that underlies our entire economy is not realistic, practical, or economically or socially desirable.

The Use of Structured Job Analysis Procedures

The other point that underlies the preparation of this minority report is clearly related to the matter of determining the "comparability" of jobs, either to determine their equality (or inequality) or to determine their comparability in the framework of "comparable worth." Such comparisons are basic to the processes of resolving questions about the equity of pay at various levels at which such questions might be raised, such as within an organization, when a complaint is brought to the attention of a regulatory agency, or in the courts of law. In this regard the committee lamented the problems of making such comparisons but chose to virtually ignore the very substantial amount of research and experience over more than two decades relating to the systematic, quantitative analysis of human work that has been demonstrated to be of substantial value in making comparisons for many types of jobs. The most directly relevant research and experience deals with what usually are called structured job analysis procedures.
Actually, the committee did include a passing, very cursory reference (in Chapter 4) to such procedures in saying: "Moreover, methods of systematic job analysis, such as structured job analysis and task analysis, ought to be explored for their applicability to job evaluation, in particular, the job component method of job evaluation (McCormick and Ilgen, 1980:Ch. 18), which uses structured job analysis." Not only has the committee chosen to ignore structured job analysis procedures as they have direct relevance to the issue of comparability, but they also have failed to recognize the importance of such procedures to the fundamentals of job analysis per se, which is the foundation for all job evaluation systems. The committee report has alluded to the role that job analysis serves in the job evaluation process. Actually, job analysis can be the Achilles' heel of any job evaluation technique. Clearly, the impact of unreliable,
invalid, and biased job analysis information on the job evaluation process could lead to an "unfair" pay plan. Again, the committee failed to acknowledge the value of structured job analysis procedures in the realm of job analysis in general and specifically in the application of such procedures to job evaluation. The bibliography to this minority report includes a limited sample of some of the research literature regarding the systematic analysis of human work, particularly that relating to structured job analysis procedures. Many of these listed works clearly demonstrate the practical utility of structured job analysis procedures and support the contention that, for certain purposes, such as comparing jobs with each other in quantitative terms, such procedures are superior to conventional verbal descriptions. Several researchers have clearly shown how data obtained with structured job analysis questionnaires can be used for such key personnel administrative functions as job evaluation. Structured job analysis procedures have two possible applications that are directly relevant to the interests of the committee, these two applications being closely related. One application deals with the actual comparison of jobs in terms of similarities and differences, and the other deals with establishing pay rates for jobs that would minimize possible differentials based on sex or race. Basically, structured job analysis procedures provide for the documentation of the content of jobs in terms of a set of job elements. These elements typically are descriptive of work tasks ("job-oriented" elements) or of basic human job behaviors ("worker-oriented" elements) and are listed together in a job analysis questionnaire.
In the analysis of jobs with such a questionnaire, the person making the analysis rates each element in terms of its relevance to the job, or, in certain instances, simply indicates whether the element does or does not apply to the job. (The reader interested in a further explanation of the nature of structured job analysis questionnaires is referred to Appendix A.) As indicated previously, one relevant application of structured job analysis procedures is that of comparing jobs in terms of their similarities and differences. At a simple job-to-job level, two or more jobs can be compared even by a visual review of the ratings given to the jobs on the various job elements being used, as illustrated by the following hypothetical example:
[Table: ratings of four hypothetical jobs (A, B, C, and D) on job elements a through d; ratings range from 0 (low) upward.]

Jobs A and B are identical; job C is almost the same as jobs A and B, but job D differs markedly from the other three. The fact that data for the job elements are quantified makes it possible to compare jobs in quantitative terms. Most typically, some statistical index of similarity is derived for each pair of jobs. In turn, such indexes frequently are used for grouping jobs into groups that have reasonably similar profiles of job element values. In the use of task inventories in the U.S. Air Force, for example, such procedures are used to identify "job types," that is, groups of positions that are reasonably similar in the combinations of tasks that are performed (Christal, 1974). As another example, Taylor and Colbert (1978) obtained data with a "worker-oriented" type of structured job analysis questionnaire used to study jobs in an insurance company, and they found 13 job families, each family being characterized by a group of jobs with very similar behavioral components. The possibility of being able to use statistical procedures for comparing positions or jobs in terms of their similarities, and of grouping them into job types or job families, would seem to be substantially relevant in instances of possible discrimination. In this regard the use of job-oriented questionnaires (task inventories) would be most appropriate in connection with the equal work concept if "equal work" is viewed in the framework of the specific tasks of the positions or jobs in question. If the equal work concept were interpreted as embracing similarities in the basic human behaviors involved in jobs, however, the worker-oriented type of structured job analysis questionnaire would be appropriate.
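The kind of pairwise similarity index described above can be sketched in a few lines of code. The profiles, element names, and the particular distance-based index below are illustrative assumptions for this sketch, not data or formulas from the committee's work:

```python
# Illustrative sketch: comparing hypothetical job profiles of ratings on
# job elements, using a simple distance-based similarity index.
# The jobs, elements, and ratings here are invented for illustration.

def similarity(profile_a, profile_b, max_rating=4):
    """Return a 0-1 similarity index: 1.0 means identical ratings."""
    assert len(profile_a) == len(profile_b)
    max_distance = max_rating * len(profile_a)
    distance = sum(abs(a - b) for a, b in zip(profile_a, profile_b))
    return 1 - distance / max_distance

# Ratings on four job elements (a, b, c, d), one profile per job.
jobs = {
    "A": [3, 4, 0, 2],
    "B": [3, 4, 0, 2],   # identical to A
    "C": [3, 4, 1, 2],   # almost the same as A and B
    "D": [0, 1, 4, 0],   # markedly different
}

for name in ["B", "C", "D"]:
    print(f"similarity(A, {name}) = {similarity(jobs['A'], jobs[name]):.2f}")
# similarity(A, B) = 1.00
# similarity(A, C) = 0.94
# similarity(A, D) = 0.25
```

In practice such indexes would be computed for every pair of positions in a sample and then fed into a grouping procedure, as described in the text.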
The second possible application of data from structured job analysis questionnaires that would be relevant to the charge of the committee is with regard to their use in job evaluation. Their use for this purpose is distinctly different from conventional job evaluation methods in that the judgmental evaluation process is eliminated, the job values being derived statistically. Such a procedure is called the job component job evaluation method (Jeanneret, 1980; McCormick, 1979:317-21; McCormick and Ilgen, 1980:376-78). The procedure as typically carried out involves the following steps:
1. The analysis of a sample of jobs in terms of an appropriate structured job analysis questionnaire with job elements consisting of tasks or basic human behaviors, and usually working conditions. The individual job elements, or statistically related groups thereof, can be considered as job components.

2. The derivation, for this sample of jobs, of money values for the individual components, in particular indexes of the extent to which the individual components contribute to the going rates of pay for the jobs. (This is a statistical procedure.)

3. The analysis with the structured job analysis questionnaire of specific jobs for which evaluations are to be made.

4. The derivation of an index of the total monetary value for each such job. This is done by "building up" the total value for each job from the indexes of the relevance of the individual components to the job, in combination with the money values of the components as previously derived from the original sample of jobs as described in steps 1 and 2. (A more specific description of the job component method of job evaluation can be found in Appendix B.)

In line with the comments made earlier, if the concept of equal work is interpreted in terms of specific work activities such as tasks, the job-oriented type of questionnaire (a task inventory) would be required in the job component method of job evaluation. Certain applications of this approach serve as illustrations, such as the study by Miles (1952) in the case of office jobs and the study by Tornow and Pinto (1976) in the case of managerial jobs. A variation of this general approach is suggested by Christal (1974). If the concept of equal work were interpreted as applying to the similarity of basic human behaviors in jobs (as contrasted to work tasks), the worker-oriented type of structured questionnaire would be relevant.
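The four steps above can be sketched as a small regression exercise. All of the component ratings and pay rates below are invented for illustration; an actual application would use ratings from a structured job analysis questionnaire and going rates of pay drawn from the labor market:

```python
# Illustrative sketch of the job component method (steps 1-4 above),
# with invented data. Component "weights" are derived by least-squares
# regression of going rates of pay on component ratings (steps 1-2),
# then applied to a new job (steps 3-4).
import numpy as np

# Step 1: a sample of jobs rated on three job components; the leading
# 1.0 column serves as an intercept term in the regression.
X = np.array([
    [1.0, 2, 0, 1],
    [1.0, 4, 1, 2],
    [1.0, 1, 3, 0],
    [1.0, 3, 2, 2],
    [1.0, 5, 0, 1],
])
pay = np.array([8.5, 13.0, 9.5, 12.5, 13.0])  # hypothetical going rates

# Step 2: derive the money value ("weight") of each component.
weights, *_ = np.linalg.lstsq(X, pay, rcond=None)

# Steps 3-4: analyze a new job on the same components and "build up"
# its total value from the component weights.
new_job = np.array([1.0, 3, 1, 1])
estimated_pay = new_job @ weights
print(f"estimated rate of pay: {estimated_pay:.2f}")  # 11.00 with these data
```

The regression weights play the role of the money "values" of the components discussed in Appendix B; with real data the fit would be approximate rather than exact, and the adequacy of the method rests on how accurately the weighted components predict going rates of pay.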
In this regard, for example, Jeanneret (1972) used such a questionnaire to place various utility company jobs in several pay grades and then compared the average actual pay of men and women in each of the "new" pay grades. In this instance he found a salary difference of $108 a month in favor of men. In studies completed for another utility company and for a savings and loan organization, similar comparisons also revealed appreciable salary differences (Jeanneret, 1978). (In these companies the salaries of women were subsequently adjusted.) The general indication from such studies is that a worker-oriented type of structured job analysis questionnaire can, as Jeanneret (1978) expresses it, "document the content of jobs without regard to sex of the incumbents . . . and fairly evaluate jobs without regard to the sex of the incumbents."
If the objective of a job evaluation plan is to derive estimates of equal pay for comparable work (as opposed to equal pay for equal work) the worker-oriented type of structured job analysis questionnaire definitely would be the more appropriate. Thus, it is believed that such structured job analysis questionnaires could serve as the basis for determining the "comparability" of jobs if ultimately the law or the courts would provide the basis for "equal pay for comparable work" as contrasted with "equal pay for equal work." In summary, I would like to emphasize the point that there have been significant developments in the past couple of decades in the development and use of systematic methods of analysis of human work and in the use of such methods for various practical purposes such as the quantitative comparison of jobs with each other, the identification of job types or job families, and job evaluation. There seems to be no question but that the nature and scope of these developments have substantial potential relevance to the objectives of the committee. In view of this I feel that the committee report is seriously deficient since it refers to such work with only a casual one-sentence comment. In my opinion the failure of the committee to include more adequate discussion of structured job analysis procedures reflects the fact that the staff and most members of the committee were not sufficiently familiar with the developments in this area over the past couple of decades and therefore failed to appreciate the possible relevance of such procedures to the objectives of the committee. The two issues raised in this minority report would seem to be compatible with each other.
The job component method of job evaluation is of course based on the use of going rates of pay as the standard or criterion for determining the money values of various types of work behaviors, but at the same time the use of structured job analysis procedures in this process seems to make it possible to "document the content of jobs without regard to sex of the incumbent . . . and fairly evaluate jobs without regard to sex of the incumbents" (Jeanneret, 1978).

APPENDIX A: STRUCTURED JOB ANALYSIS QUESTIONNAIRES

A structured job analysis questionnaire consists of a specific list of job elements that can be used in the analysis of jobs. There are various types of job elements that can be used in structured job analysis procedures, although there are two types that are particularly relevant. In the first place, some structured job analysis questionnaires, commonly
called task inventories, provide for the analysis of jobs in terms of each of a number of tasks that might be performed by individuals within a given occupational area. Examples of such occupational areas are health services, office operations, automobile mechanics, and engineering. Examples of tasks that might be included in task inventories are: types straight copy from rough draft; removes and replaces spark plugs; takes orders for meals from customers; and estimates costs of building materials from building plans and specifications. Task inventories have been referred to as "job-oriented" questionnaires in that they provide for describing jobs in terms of the output or end result of tasks. In the usual task inventory it is typically the practice for the job incumbent to indicate, for each task, whether he or she performs the task or not, and in addition, to indicate something about the degree of involvement with each task, such as the frequency of performance or the time spent on the task. Task inventories frequently are used as the basis for identifying "job types" that consist of jobs or positions with relatively similar combinations of tasks. This sometimes is carried out with a hierarchical grouping technique that involves the derivation of a statistical index of the degree of similarity of the tasks performed for every possible pair of jobs or positions in the sample being used. Such statistical indexes conceivably could be relevant in comparing the similarity of jobs about which some discrimination issue has been raised. Furthermore, the possibility of identifying job types by statistical procedures might also have some relevance in connection with charges relating to discrimination. It should be pointed out that the use of any given task inventory would be restricted to the specific occupational area for which it was prepared.
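The grouping of positions into "job types" from task-inventory data can be sketched as follows. The positions, tasks, overlap index, and threshold-based grouping rule are invented for this sketch; it is a simplified stand-in for the hierarchical grouping procedures cited in the text, not a reproduction of any of them:

```python
# Illustrative sketch of grouping positions into "job types" from
# task-inventory data. Each position is represented by the set of
# tasks it performs (invented here); positions whose pairwise task
# overlap meets a threshold are merged into the same group.

def overlap(tasks_a, tasks_b):
    """Jaccard index: shared tasks divided by all tasks in the pair."""
    return len(tasks_a & tasks_b) / len(tasks_a | tasks_b)

def group_positions(positions, threshold=0.5):
    groups = []
    for name, tasks in positions.items():
        for group in groups:
            # Join a group only if similar enough to all its members.
            if all(overlap(tasks, positions[m]) >= threshold for m in group):
                group.append(name)
                break
        else:
            groups.append([name])  # start a new job type
    return groups

positions = {
    "clerk 1":  {"type copy", "file records", "answer phone"},
    "clerk 2":  {"type copy", "file records", "sort mail"},
    "mechanic": {"replace spark plugs", "adjust brakes"},
}
print(group_positions(positions))
# [['clerk 1', 'clerk 2'], ['mechanic']]
```

A true hierarchical grouping procedure would merge the most similar pair at each pass rather than use a single threshold, but the end product is the same: clusters of positions with relatively similar combinations of tasks.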
It is expected that there are certain types of occupational areas for which task inventories might not be feasible. The second type of structured job analysis questionnaire provides for the analysis of jobs in terms of more generalized, basic human job behaviors that transcend or cut across occupational areas. Such questionnaires are referred to as "worker-oriented" questionnaires in that they provide for analyzing jobs in terms that describe, or imply, the basic human behaviors involved in work activities. One such questionnaire, reported by McCormick (1979, pp. 147-49) and McCormick, Jeanneret, and Mecham (1972), provides for analyzing jobs in terms of each of 187 job elements. Some examples (in paraphrased form) are: uses visual displays (as a source of job information); uses measuring devices; arranges or positions objects in a specific position or arrangement; operates keyboard devices; conducts interviews with others; works under high-temperature conditions. In the analysis of a job with
this questionnaire the analyst uses an appropriate rating scale to indicate the involvement of the job incumbent with each job element. Various types of rating scales are used, such as the degree of importance, the amount of time involved, the "extent of use" of various kinds of materials, and so forth. This particular questionnaire has been subjected to a form of factor analysis (specifically, principal components analysis) that identifies the job elements that tend to "go together" in jobs and that form what are called job dimensions. Each such job dimension can be thought of as being based primarily on the group of job elements that tend to occur in common across jobs in general. Some examples of job dimensions are: interpreting what is sensed; processing information; performing handling and related activities; exchanging job-related information; being alert to changing conditions; and potentially hazardous situations. Collectively, these job dimensions can be viewed as reflecting the "structure" of human work in terms of the basic types of human behavior that are involved in work activities. With the analysis of a job using the structured job analysis questionnaire in question it is possible to derive a score for the job on each dimension. Such scores represent a "profile" for the job, and can be used as quantitative indexes of the dimensions.

APPENDIX B: THE JOB COMPONENT METHOD OF JOB EVALUATION

The job component method of job evaluation is based on the use of a structured job analysis questionnaire in the analysis of jobs. There are various ways in which data from such questionnaires can be used in the derivation of money values of jobs, but the basic procedure referred to earlier is the one dealt with in this appendix. In actual use in deriving indexes of job values the method involves two steps. In the first place the jobs are analyzed with the structured job analysis questionnaire being used.
In the second place a statistical equation is used to derive an index of the total value for each job. The equation incorporates a weight for each job "component." The job components may be individual job elements, or combinations of elements based on factor analysis, the factors usually being called job dimensions. In deriving a total value for a job the "weight" for each component is multiplied by the value of that component for the job, resulting in an arithmetic "product" for each component. These products for all components are then added together to derive a total value for the job. The central basis for the job component method of job evaluation is in the derivation of the weights for the individual components. For this
purpose data for a sample of jobs are used. Each job is analyzed with the structured job analysis questionnaire, producing a value for each component for each job. In addition, information on the rate of pay for each job is obtained. Regression analysis is then used for the total sample to determine the statistical contribution of each component to the rates of pay for the jobs in the sample. From this analysis it is possible to derive the appropriate weight for each component. These weights can be thought of as reflecting the money "values" of the individual components in the labor market; that is, how much the individual components are "worth" in the labor market. If, collectively, data on the "values" of the job components of a structured job analysis questionnaire predict going rates of pay with acceptable accuracy for a sample of jobs, these money values can then be used as the basis for the estimation of total rates of pay for other jobs. In a sense, then, the central objective of the job component method of job evaluation is to develop regression equations that, by and large, reflect the approximate contributions of different job components to the market values of jobs. In the operational use of this method a regression equation based on a broad, varied sample of jobs from various industries and geographical areas has been found to be reasonably applicable in various situations. However, it would be expected that, in the long run, the derivation of job values would be more accurate if the money values of the various job components were derived from data on samples of jobs of different major types and within particular labor market areas. Although the job component method of job evaluation has not as yet been used extensively, research and experience with it offer substantive evidence that could be used in many situations to provide the basis for the establishment of equitable rates of pay for jobs.

BIBLIOGRAPHY

Archer, W. B.
Computation of group job descriptions from occupational survey data. USAF, Personnel Research Laboratory, PRL-TR-12, 1966.

Archer, W. B. and Fruchter, D. A. The construction, review, and administration of Air Force job inventories. USAF, Personnel Research Laboratory, Technical Documentary Report No. 63-21, 1963.

Arvey, R. D. and Mossholder, K. V. A proposed methodology for determining similarities and differences among jobs. Personnel Psychology, 1977, 30, 363-73.

Arvey, R. D., Passino, E. M. and Lounsbury, J. W. Job analysis results as influenced by sex of incumbent and sex of analyst. Journal of Applied Psychology, 1977, 62, 411-16.

Baehr, M. E. A factorial framework for job descriptions for higher-level personnel. Industrial Relations Center, The University of Chicago, 1967.
Minority Report 127 Boast, R. R. and Cunningham, J. W. Systematically derived dimensions of human work. Center for Occupational Education, North Carolina State Um~crsity, 1975. Brumback, G. B., Romashko, T., Hahn, C. P. and Flcishman, E. A. Model Procedures for Job Analysis, Test Dcvelopmcat and Validation, City of New Yorlc, Dcpartmcut of PcrsonocI, July 1974. Brumback. G. B. and Vincent, J. W. Factor analysis of work-performed data for a sample of administrati~c, professional, and scientific positions. Personnel Psychology, 1970, 23, 101~7. Cam, M. J. The Samoa method of determining technical, orgen~tional, and commu mcational dimensions of task clusters. USN, Naval PcrsonDel Research Activity, Tech- nical Bulletin STB 68-5, 1967. Canter for Vocational Education, The Ohio State University. Directory of taslc in~catoncs Volumes 1, 2, and 3~. Cbristal, R. E. Stability of consolidated job dc~ptions based OR task m~coto~ icy information. USAF, Pcrsonocl Research Division, Air Force Systems Command, AFHRL~ 71~48, 1971. Christal, R. E. New directions in the Air Force occupational recomb program. USAF, Pcrsonocl Research Division, AFHRL, 1972. a~nstal, R. E. The United States Air Fores occupational arch project. USAF, Air Force Systems Command, Brooks Air Force Base, Texas, January 1974. Cragun, ]. R. and McCormick, E. J. Job inventory information: taslc and scale reliabilities and scale interrelationships. USAP, Personnel Research Laboratory, PRO 67-15, 1967. Cunningham, J. W. An S-O-R approach to the study of job commomlides relevant to occupational education. Center for Oct:upatiional Education, Now Carolina State Un~vcrsity. 1968. C~ngham, J. W. The job-cluster concept and its curncWar implications. Center for Occupational Education, Center Monograph No. 4, North Carolina State University, 1969. Cunningham, J. W., Tuttle, T. C.. Floyd, J. R. and Bates, J. A. Arc development of the Occupation Analysis Invocatory: an "ergometric" approach to as educational problem. 
Center for Occupational Education, Center Research Monograph No. 6, North Carolina State University, 1971. Fanna, A. J., Jr. Development of a taxonomy of human performance: a rc~cw of de- scnpt~ve schemes for human task behavior. American Institutes for Research, Technical Report No. 2,1969. Fenna, A. J. and Wheaton, G. R. Development of a taxonomy of bumaD performance: Zinc task charactenstics approach to performance prediction. Amp Institutes for Monarch, Technical Report No. 7,1971. Farrell, W. T. Hierarchical clustering: A bibliography. E~luadon of die Divide Colps task analysis program. California State University, Los Angeles, July 197S. Enroll, W. T., Stone, C. H. and Yoder, D. Guidelines for rc~rch ~ and design task analysis, Technical Report No. 4, Evaluation of the Mume Amps l-lo His Program. California State University, LLos A~geles, September HIS. Fine, S. A. Functional Job Analysis Scales: A Desk Aid. W. E. Upjohn Institute for Employment Research, Kalamazoo, Mich., April 1973. Dcishman, E. A. Development of a behavior taxonomy for describing human tams: a corrclational~c~perimcntal approach. Amcnc" Institutes for R~h, 1966. Fugill, 1. W. K. Task difficulty and task aptitude b@D~m=i ~ I"
128 WOMEN, WORK, AND WAGES and electronics career ficids. USAP, Air Force Systems Command, Brooks Air Force Base, Texas, April 1972. Gilbert, A. C. F. Dimensions of certain anny officer positions dented by factor analysis. U.S. Army Rcscarch Institute for the Behavioral and Social Sciences, Dceembcr 1975. Gregg, D. B. Identification of logistics officer job type groups. USAP, AFHRL-7~39, lg70. Gregg, D. B. An occupational survey of an airman career ladder: supply warchousing- mspection. USAF, Pcrsonncl Rcscarch Laboratory, Technical Report No. 62-19, 1962. Hcmphill, J. K. Describing managenal work. l~c Confercoce on the Executive Study. Educational Testing Service, 1961. Hcmphill, ]. K. Job descriptions for c%crutives. Harvard Busuless Review, 1959, 37, pp S3 67. ~canncrct, P. R. Personal conununication, 1978. Jcanocrct, P. R. Equitabic job evaluation and classification with the Position Analysis Questionnaire. Cornpcns~on Reavow, First Quarter, 1980, pp. 3202. AMACOM, American Managemcat Aviations. l-=nar, W. B. Ares methods for estimating difficulty of job tasks. USAP, AFHRL- TR-71-30, July 1971. Lends, I. M., Romashl~o, T. and Freshman, E. A. Dc~clopment of a Economy of human performance: evaluation of an abilities classification system for integrating and cacralizing research findings. American Institutes for Research, Technical Report No. 12, 1971. McCormick, E. I. Job Analysis: Methods and Applications. New York: AMACOM, American Management Associations, 1979. McCormick, E. J. and Ilgen, D. R. ludh~strial Psychology (7th cd.~. Englewo~od Miffs, New Jersey, 1980. McCormick, E. I., ~canncret, P. R. and ldecham, R. C. A study of job charactenstics and job dimensions as based on the Position Analysis Questionnaire (PAQ). Journal of Applied Psychology, 1972, 56, 3470. Mayo, C. C. A method for determining job types for low aptitude airmen. USAF, AFHRL-TR 69-35, 1969. Mead, D. F. sod Distal, R. E. 
Development of a constant standard weight eanat'an for evaluating job difficulty. USAF, AFHRL-~-70~4, 1970. Mead, D. F. Dc~elopmcat of an equation for e~raluadog job difficulty. USAP, AFHRL- TR-70~42, 1970. Mccham, R. C. and McConnick, E. ]. Tic use in job evaluation of job clemcuts sad job dimensions based on the Position Analysis Qucstionnairc. Psychology Department, Purdue University, Report No. 3, 1969. Melching, H. W. and Boucher, S. D. Procedures for constructing and using Task h~cn- toncs. Crater for Vocational and Tc~cal Education, We Ohio State Ll~vcrsity, Rcscarch and Dc~clopment Series No. 91, 1973. Mikes, M. C. Studies in job c~ralustion: a. Validity of a cheelc List for evaluating office jobs.Joun~lof Applied Psychology,l9S2, 36, 97-101. Morsh9 J. E. Analog WON Chariot. ~ paper presented at the A=cncan Psychological Association, 1963. Morsh, J. E. Evolution of a job mvcDtory "d tryout of teslc rating factors. USA, Per~onocl Research I^boratory, Tropical Repon No. 65~22, 1965. Morsh, J. E. Identification of job types ~ the personnel career field. I1SAF, Pcrsomcl Researeb Iaboratory, Technical Report No.6S-9, 1965. a -, ~
Minority Report 129 Morsh, J. E. Job Apes identified with an inventory constructed by Tclectromcs c - nears. USAF, Personnel Research Laboratory, Technical Rcpon No. 66~6, 1966. Morsh, 1. E. and Archer, W. B. Procedural guide for conducting occupational surveys in the United States Air Force. USAP, PCESODDCI Research I~boratory, Technical Report 67 11, 1967. Morsh, J. E. and Chnstal, R. E. Impact of the computer on job analog ~ the United States Air Force. USAF, Pcrsonacl Research I~bc~ratory, TO Report 6619, lo. Pass, J. J. and Cunningham, J. W. A systematic p~durc for estimating the attribute requirements of occupations. Report No. 11 of the Ergometric Re~:h and Dc~cl- opmcnt Scncs. Center for Occupational Education, North Carolina State Umvcrsity, a. Riccobono, J. A. and Cunningham, J. W. World dimensions denved through systematic job analysis: a study of the Occupation Analysis Invcotory. Center for Occupational Education, Center Research Monograph No. 8, North Carolina State University, 1971. Riccobono, J. A. and Cunningham, J. W. Work dimc~ons Denver through systematic job analysis: a replicated study of the Occupation Analogs Invcotory. Center for Oc- cupational Education, Center Rcscarch MoDograph No. 9, North Carolina State Um- vcrsity, 1971. Riccobono, J. A., Cunningham, J. W. and Boesc, R. R. austere of oocupations based on systematically derived work dimensions: an c%ploratory' study. Report No. 10 of the Ergometric Research and Development Scnes. Center for Occupational Education, North Carolina State VnjYCrS.itY, 1974. Silverman. J. A computer technique for clustering tasks USN, Naval Pcrsonacl Rcscarch Activity. Technical Bulletin STB 66-23, 1966. Silverman. ]. Nc~ techniques in task analysis. USN, Naval Personnel Rcscarch Activity, Research Memorandum No. SRM 68-12, 1967. Sprecher. T. B. Dimensions of engineers' job performance. Educational Testing Scnocc, Research Bulletin, 1965. Stone, C. H. 
Evaluation of the Manne Corps Teslc Analysis Program, Technical Report No. 16. California State UniYcrs.ity, Las Angeles, dune 1976. Taylor. L. R. and Colbert, G. A. Empincally denoted job families as a foundladon for the study of validity generalization: Study I1. Mine construction of job families based on oompany-specific PAQ job dimensions. Personnel Psychology, 1978, 31, 341-S5. Tcichncr, W. H. and Whitehead, J. Development of a Economy of human performance: Evaluation of a task classification system for gener~li~ng research findings Mom a data base. American Institutes for Research, Technical Rcpon 8, April 1971. Torno~, W. W. and Pinto. P. R. Tic development of a ~agenal job taxonomy: A System for dewing. classifying, and evaluating c~ccutive Motions. Fount of Applied Psychology, 1976, 61, 41~18. Treiman, Donald. Job Evaluanor': An Analytic Reavow. Istenm Report to the Equal Employment Opportunity Commission. Co~nec on Occupational Cl~ificat~ and Analysis, National Research Council. Washington, D.C.: National Academy of Sci. caces, 1979. Tuttle, T. C. and Cunningham, J. W. Affe~vc corvettes of - ~madcally dented world dimensions: Validation of the Occupation Analysis Invocatory. Ergometric r~h and development sends, Report No. 7. Ccutcr research moDo~aph No. 10. Cctltcr for Ocx:upational Education. North Carolina State University, 1972. U.S. Dcpa. unent of Health, Education, aDd Welfare. Major Oslo aDd }~kdgc chimers Involved in performance of electronic Cans' world, 1966. 1
130 WOMEN, WORK, AND WAGES U.S. Air Force, PcrsoQ~cl Rcscarch Division. AFHRL. Pro~ngs of Division 19 of the American Psychological Association. Division of Military Psychology Symposium: Collecting, yang, and reporting information Ascribing jobs and occupations, 1969. U.S. Department of Labor, Manpower Administration. Task analysis inventories: A method for collecting job information, 1973. Whcaton, G. R. Dc~elopmcut of a taxonomy of human performance: 8 review of clas- sificatory systems relating to testis and performance. American Institute for Research Technical Report 1,1968. Wiley, L. N., Jenkins, S. and Cagwin. L. P. Job types of communications officers, USAF, Pcrsonscl Research I^boratory, Technical Report No. 6617, 1966.