
Human Factors in Automated and Robotic Space Systems: Proceedings of a Symposium (1987)

Chapter: Sharing Cognitive Tasks Between People and Computers in Space Systems


SHARING COGNITIVE TASKS BETWEEN PEOPLE AND COMPUTERS IN SPACE SYSTEMS

William H. Starbuck

WHAT ARE THE COMPARATIVE ADVANTAGES OF PEOPLE AND COMPUTERS?

Mankind's capabilities change very slowly, whereas computers' capabilities have been changing fast. The cost of a memory element has dropped forty percent per annum for over thirty years, and memory sizes have grown even more rapidly than that (Albus, 1981; Toong and Gupta, 1982). Computation speeds have been accelerating nearly 25 percent yearly, the cost of logic hardware has been dropping equally rapidly, and the computation work done with each unit of energy has been rising thirty percent per annum. Computers have grown more reliable and very much smaller, and users' interfaces and programming languages have improved considerably, especially over the last decade. If human beings had evolved as rapidly as computers since the mid 1950s, the best runner could now finish a 26-mile marathon in 2.3 seconds, a bright student would complete all schooling from kindergarten to a Ph.D. in a bit over two days, normal eaters would consume one calorie per month, and half of American families would earn more than $141,000,000 annually.

The improvements in computers' speeds, sizes, and costs have generally exceeded the most optimistic forecasts of yesteryear, as has the proliferation of computers. Unfulfilled, however, have been the forecasts predicting that computers would shortly be able to imitate human beings. For example, in 1960 Simon optimistically predicted that "Duplicating the problem solving and information-handling capabilities of the brain is not far off; it would be surprising if it were not accomplished within the next decade" (Simon, 1960:32). Computers have not, in fact, developed an ability to reason very much like people, and computer simulation of human thought has had little success (Albus, 1981). When computers look most effective at solving problems, the computers use quite different techniques than people apply (Weizenbaum, 1965; Winograd and Flores, 1986). For example, Newell et al. (1957) studied students' efforts to prove theorems in mathematical logic, and inferred that the students search for proofs, using heuristics that generally lead toward proofs but do not guarantee them. Challenged by such work, Wang (1963) devised a computer program that efficiently proved all 200 theorems in the first five chapters of Principia Mathematica.
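The contrast between human heuristic search and machine-oriented methods can be made concrete. The short Python sketch below is an illustration added to this transcription, not part of the original paper and not the method of Newell's students or of Wang's program; it simply verifies a propositional theorem by exhaustively checking every truth assignment, a brute-force tactic that is trivial for a computer and hopeless as a human strategy.

    # Illustrative sketch only: prove a propositional formula by checking
    # every truth assignment, rather than by human-style heuristic search.
    from itertools import product

    def is_tautology(formula, variables):
        """Return True if formula holds under every assignment of truth values."""
        for values in product([False, True], repeat=len(variables)):
            if not formula(**dict(zip(variables, values))):
                return False
        return True

    # One of the simplest Principia-style theorems: (p or p) implies p.
    print(is_tautology(lambda p: (not (p or p)) or p, ["p"]))       # True
    # A non-theorem for comparison: p implies q.
    print(is_tautology(lambda p, q: (not p) or q, ["p", "q"]))      # False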

Job-shop scheduling affords another example: Scientific-management studies of human production schedulers led to the development of Gantt charts to portray graphically the activities of various machines, and thus to help human schedulers visualize the cascading implications of alternative assignments. Computers generate job-shop schedules by solving integer-programming problems that no human could solve correctly without machine assistance.

The differences between people and computers have an illusory quality, insofar as people tend to take prevalent human abilities for granted and to notice rare or inhuman abilities. If computers did operate exactly like people do--working at the same speeds, making the same mistakes, showing the same fatigue, complaining about unpleasant tasks, and so on--people would regard computers merely as inhuman labor. Computers most impress people when they augment human abilities significantly--by working silently and tirelessly, by calculating with dazzling speed, or by displaying total consistency. But the quite real differences between people and computers are persistent and profound. Rather than regard computers as potential imitators of human beings, it makes better sense to look upon them as a distinct species--a species that prefers different languages, reasons with somewhat different logic, finds comfort in different habitats, and consumes different foods.

Computers are much better symbol manipulators and much stricter logicians than people; and computers are much more decisive, literal, precise, obedient, reliable, consistent, and transparent. Computers can act both much more quickly and much more slowly than people. If so instructed, computers will carry out utterly absurd instructions or they will remain completely calm in the face of impending disaster. Computers easily simulate what-if conditions; and they can extrapolate even the most farfetched implications of theories or conjectures.

People, on the other hand, possess brains that are so much more complex than the largest computers that comparisons make no sense. These brains carry on numerous simultaneous and interacting processes, some of which operate entirely automatically. Without even trying, people process visual and auditory data of great complexity. People can shift levels of abstraction from detail to generality and back, they separate foreground images from background images, they distinguish patterns while remaining aware of contexts, and they attend to important or unusual stimuli while ignoring unimportant or routine stimuli. People have quite extensive memories that possess meaningful structures; and if they have relevant information in their memories, they can usually retrieve it when they need it. People can operate with imprecise and somewhat incomplete plans, and they can extrapolate beyond past experiences to novel situations while recognizing that they are indeed operating outside the limits of their direct experience (Allen, 1982; Dreyfus and Dreyfus, 1986; Moray, 1986; Reason, 1986; Winograd and Flores, 1986).

Perhaps most importantly, people are more playful than computers and better at making mistakes. Whereas computers obey instructions literally, people often ignore or forget instructions, or interpret them loosely. Not only do people tend to deviate from plans and to test the limits of assumptions, but many human perceptual skills and response repertoires depend on noticing deviations from expectations or goals that may be evolving. Sometimes, people begin to doubt even their most basic beliefs. Again, people generally expect to make mistakes and to learn from them, and motivated people may be very good at learning from mistakes. If they have sufficient time, people can learn to correct their mistakes and they can reprogram themselves to take advantage of new situations. Although computers also can observe and react to deviations, they have not yet exhibited much capability to revise goals for themselves, to reprogram themselves, or to question their basic premises (Valiant, 1984). Computers must be told to learn from their experiences, and efforts to enable computers to learn have, so far, been restricted to very narrow domains of activity. Also, computers are good at not making mistakes in the first place, so they have less need to learn from mistakes.

People are, however, pretty diverse and flexible. Some people can learn skills and perform tasks that other people find impossible; and since NASA can choose from a large pool of applicants, the extreme capabilities of exceptional people are more important in space systems than the average capabilities of ordinary people. The people who operate space systems first receive thorough training, so their deficits of experience should be small; but this training itself may induce serious liabilities, such as a tendency to rely on well-practiced habits in unusual situations.

Because people are flexible and complex, they often surprise scientists and systems designers: People may change their behaviors significantly in response to ostensibly small environmental changes, or people may change their behaviors hardly at all in response to apparently large environmental changes. How people react to a situation may depend quite strongly on the sequence of events leading up to that situation, including the degree to which the people see themselves as having helped to create the situation. Accurate statements about microscopic details of human behavior rarely prove accurate as statements about general, macroscopic behavioral patterns, or vice versa. For example, experimental studies of people who are being paid low hourly wages for making repeated choices between two clearly defined, abstract symbols that have no implications for later events probably say little about human behavior in real-life settings where actions may have persistent and personally significant consequences and the actors may not even perceive themselves as having choices. Conversely, broad generalizations about the behaviors of many people in diverse situations probably say little about the behaviors of carefully selected people who are performing unusual tasks in which they have great experience.

The research issues that are important for designing human-computer systems seem to be ones concerning the proper balances among existing advantages and disadvantages, rather than ones demanding new concepts; and the best resolutions of these issues are certain to shift as computers acquire greater capabilities. Consequently, I will not attempt to state any generalizations about the proper dividing lines between human and computer responsibilities in space systems, and I am not advocating any research aimed at describing human capabilities in general. The designers of space systems should not depend on general theories, but should test fairly realistic mock-ups of interfaces, hardware, and software, with people who are as well trained and as able as real astronauts and controllers. The designers should also investigate the sensitivity of performance measures to small variations in their designs (Gruenenfelder and Whitten, 1985): Do small design changes produce large changes in performance? Both to improve the quality of designs and to improve users' acceptance of designs, experienced astronauts and controllers should participate in the designing of interfaces and systems; and because early decisions often constrain later modifications, astronauts and controllers should participate from the beginning of any new project (Grudin, 1986).

PEOPLE INTERACTING WITH COMPUTERS

Today's computers cannot imitate people very closely, but the differences between people and computers imply that combinations of the two can achieve results beyond the capabilities of each alone. For that reason, NASA should devote research effort to improving the interactions and synergies between people and computers. Five research topics seem especially interesting and important because (a) we can see how to pursue them and (b) we can foresee some research findings that would translate directly into improved performances by space systems.

1. Fostering Trust Between People and Expert Systems
2. Creating Useful Workloads
3. Anticipating Human Errors
4. Developing Effective Interface Languages
5. Using Meaningful Interface Metaphors

Fostering Trust Between People and Expert Systems

Decision-support systems are computer programs and data bases that are intended to help people solve problems. Some decision-support systems merely afford their users easy access to data; other decision-support systems actually propose solutions, possibly basing these proposals on data supplied by their users (Woods, 1986b). Expert systems are decision-support systems that attempt to embody the specialized knowledge of human experts. Their proponents argue that expert systems can, in principle, make specialists' knowledge available to nonspecialists: every CPA might be able to draw upon the combined expertise of several tax specialists; every general practitioner might be able to make subtle diagnoses that reflect advanced training in many specialties.

Expert systems might perform even better than human experts: Computers may be able to obtain data that would be unavailable to people (Burke and Norman, 1987). Computers' huge memories and high speeds might enable them to investigate more alternatives or to take account of more contingencies than people consider. Computers may also avoid some of the typical errors to which people typically fall prey, and thus may draw some inferences that people would miss (Bobrow et al., 1986). Advocates of statistical decision theory value computers' ability to adhere quite strictly to such formulas. Some proposals would have computers formulating recommendations and people then screening these recommendations and deciding whether to adopt them (Burke and Norman, 1987; Dreyfus and Dreyfus, 1986; Woods, 1986a, 1986b).

Not everyone holds an optimistic view of expert systems' potential. Stanfill and Waltz (1986:1216) remarked: "Rule-based expert systems ... tend to fail badly for problems even slightly outside their area of expertise and in unforeseen situations." Dreyfus and Dreyfus (1986:108) have argued that human experts do not follow decision rules but instead they recognize "the actual outcomes of tens of thousands of situations", and that "If one asks the experts for rules one will, in effect, force the expert to regress to the level of a beginner and state the rules he still remembers but no longer uses." Consequently, Dreyfus and Dreyfus (1986:109) predicted "that in no domain in which people exhibit holistic understanding will systems based upon heuristics consistently do as well as experienced experts, even if those experts were the informants who provided the heuristic rules."

Dreyfus and Dreyfus' critique may be valid. Dutton and I (1971) spent six years studying an expert production scheduler named Charlie, including one full year investigating his procedure for estimating how much production time any schedule represented. Charlie estimated time by using the relation:

Production Time = Schedule Length / Speed

"We gradually were disabused of the idea that Charlie has a computation procedure for speed and were convinced that he obtains his speed estimates by a table look-up. That is, Charlie has memorized the associations between speed and schedule characteristics, and he looks up speeds in his memory in somewhat the way one looks up telephone numbers in a directory. In our interviews, Charlie talked as if the existence of a computation procedure was a novel idea, intriguing to contemplate but difficult to conceive of. He thinks of the speeds in his table as discrete numbers distilled from a long series of his experiences. Although he can interpolate and extrapolate these numbers--implying that the stored speeds must be specific examples from a systematic family of numbers--he doubts the interpolated values and speaks of them as hypotheses to be tested in application. The stored values are so much more reliable that they must be a different kind of number. ... In fact, Charlie remembers, for a large proportion of his table entries, specific situations in which the order was run and the speed occurred. The only ones that he does not so remember, apparently, are those appropriate to situations arising almost daily" (Dutton and Starbuck, 1971:230).
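The two estimating strategies that this passage and the next paragraph describe can be sketched in a few lines. The Python fragment below is purely hypothetical: the memorized speeds and the linear formula are invented for illustration and do not come from Dutton and Starbuck's study. It contrasts a table of remembered speeds, looked up and cautiously interpolated, with a simple parameterized relation that can be extrapolated more confidently to novel situations.

    # Hypothetical numbers throughout -- for illustration only.
    memorized_speeds = {30: 110.0, 40: 95.0, 60: 70.0, 80: 55.0}   # schedule characteristic -> speed

    def speed_by_lookup(x):
        # Table look-up, as Charlie appears to do; interpolated values are
        # treated only as hypotheses (assumes x lies within the table's range).
        if x in memorized_speeds:
            return memorized_speeds[x]
        lo = max(w for w in memorized_speeds if w < x)
        hi = min(w for w in memorized_speeds if w > x)
        frac = (x - lo) / (hi - lo)
        return memorized_speeds[lo] + frac * (memorized_speeds[hi] - memorized_speeds[lo])

    def speed_by_formula(x, a=140.0, b=-1.05):
        # A simple relation with a few parameters and a physical interpretation,
        # of the kind described in the next paragraph.
        return a + b * x

    def production_time(schedule_length, speed):
        # The relation Charlie used: Production Time = Schedule Length / Speed
        return schedule_length / speed

    print(production_time(1000, speed_by_lookup(50)))    # from an interpolated table entry
    print(production_time(1000, speed_by_formula(50)))   # from the fitted relation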

We calculated that Charlie had memorized approximately 5000 production speeds covering various situations. But we also discovered that Charlie's production-time estimates could be predicted quite accurately by a simple linear equation that had a meaningful and generalizable interpretation in terms of the physics of the production process. Rather than thousands of memorized speeds, this linear equation required only a few hundred parameters. Thus, we could state a procedure that was simpler than the one Charlie used; and because this artificial procedure had a physical interpretation, a user could more confidently extrapolate it to novel production situations.

One of the best-known expert-system projects not only produced a heuristic program, DENDRAL, but also led to the development of an efficient algorithm for generating molecular structures (Bennett et al., 1981). Evidently, the heuristic program has received little practical use whereas the algorithm has had much (Dreyfus and Dreyfus, 1986).

One obvious question is: Why must expert systems closely resemble human experts? The proponents of expert systems typically equate expertise with human beings, so they see imitating human expertise as essential to creating expert systems; and the critics focus on the differences between computers and people. Yet, computers possess different abilities than people. Computer programming efforts that have begun by imitating human behavior have often ended up using techniques that made no pretense of imitating human behaviors; and engineers and scientists have devised, without imitating human expertise, many techniques that enable computers to exceed the best of human capabilities.

Other questions arise concerning people's willingness to depend upon computer-based expertise. Collins (1986) interviewed actual and potential users of several widely known expert systems for accounting, chemical analysis, mathematics, medical diagnosis, and computer-components ordering. She found only one of these expert systems that has active users: the one for ordering computer components. It uses straightforward logical processes and it draws no subtle inferences; it mainly helps sales personnel forget no details when they fill out orders, and the sales personnel said they appreciated not having to waste their time worrying about details or waiting for answers from a human expert. It may be relevant that the users of this system sold computing equipment. Concerning the other expert systems, potential users expressed considerable distrust, of other human experts as well as of computers; and the potential users may view these systems as threatening their own expertise. However, the people who actually participated in creating these systems said they do trust them and would, but do not, use them. Collins inferred that trust in an expert system comes either from participating in the design process or from being able to change the system to reflect one's own expertise. This inference meshes with the general pattern of psychological research, but neither of these options was available to the computing-equipment sales personnel, who were the users voicing the greatest trust in an expert system.

Complex issues surround the idea that a user should screen an expert system's recommendations and decide whether to adopt them. If an expert system draws the same inferences that its user would draw and if it recommends the same actions that the user would choose, that user will readily learn to trust the system. Such seems to be the case with the expert system for components ordering. Such a system may relieve people from having to perform boring or easy work, but it adds very little to a user's intellectual capabilities, whereas in principle, computers' precise logic and extensive computational capabilities and the incorporation of exceptionally high-quality expertise might enable expert systems to draw substantially better inferences than their users and to choose distinctly better actions. Yet a user is quite likely to distrust an expert system that draws significantly different inferences and that chooses significantly different actions than the user would do. If the expert system also uses a computational procedure that diverges quite dramatically from human reasoning, the system may be unable to explain, in a way that satisfies users, why it draws certain conclusions and not others. Such users may never discover whether an expert system is making good recommendations or bad ones.

This calls to mind the experience of a manufacturing firm that installed one of the first computer-based systems for job-shop scheduling. The system's creators promised that computer-generated schedules would produce considerable savings in comparison to human-generated schedules. The factory's managers, however, were not entirely sure of the goodness of computer-generated schedules, and they wanted to minimize the possible insult to their human production schedulers, so the managers told the schedulers to follow the computer's recommendations as long as they agreed with them, but to substitute their own judgment when they thought the computer had made bad recommendations. An evaluation conducted after one year showed that the computerized system had yielded no improvement whatever.

But research may be able to suggest some answers to these issues, at least in part; and good design may be able to resolve them: Expert systems, even the ones that cannot meaningfully explain the reasoning that led them to make certain recommendations, should be able to explain why they believe their recommendations to be good. People who cannot formulate a good heuristic may be able to recognize a good recommendation or a bad one, and people do sometimes recognize their own limitations. At least some of the people who manage factories have learned to trust computer programs for production scheduling or inventory control even though these people could not themselves generate the computers' solutions.

The foregoing observations highlight the practical significance of research about the factors that influence people's trust in computers' expertise. In what ways should a decision-support system's knowledge and logical rules fit each user individually? Given opportunities to tailor interfaces to their personal preferences, inexperienced users may design interfaces poorly (Lenais and Knauer, 1982): Do users trust systems more or less when tailoring is postponed until the users gain considerable experience? How do task characteristics affect a user's willingness to trust a decision-support system? In what circumstances does a user decide to trust an expert system that captures the knowledge of experts whom the user does not know personally? What kinds of experiences lead a user to trust a decision-support system that the user regards, at least partly, as a black box? What kinds of experiences encourage a user to see a decision-support system's limitations and to override bad recommendations?

Creating Useful Workloads

Automation tends to make machines responsible for routine, easy tasks and to leave the nonroutine, difficult tasks for people. One reason for this may be the assumption that nonroutine tasks are interesting and challenging, and thus worthy of human attention, whereas routine tasks appear easy and uninteresting, and so insult people. But a more important reason may be the practical one that designers can figure out how to automate routinized activities whereas they cannot effectively automate activities that vary. This division of labor produces the consequence that, as automation proceeds, people's work becomes more and more diverse and unpredictable and it takes on more and more of an emergency, fire-fighting character. At the same time, cutting people out of routine tasks isolates them from ongoing information about what is happening and forces them to acquire this information while they are trying to perform nonroutine, difficult tasks. The human controllers in a system may not even be warned of gradually developing problems until the system exceeds critical limits and alarms go off (Wiener, 1985). Thus, people's work grows less do-able and more stressful (Senders, 1980); and extreme stress and extreme time pressure may cause people to do poorer work and less of it.

In many tasks, automation also increases the short-term stability of the variables used to monitor performance; as Wiener (1985:83) put it, "automation tunes out small errors and creates opportunities for large ones." De Keyser (1986) has suggested that this short-term stabilization causes the human operators to shift from an anticipation logic to a recovery logic: instead of keeping track of events and trying to manage them, the operators wait for significant undesirable events to occur. Furthermore, "At the highest automation stage, the production operator has only very sketchy operating images of process and installation.... He will not make a huge investment in observation, in judging, establishing relationships, gathering of data without being certain of its usefulness. The operator does not invest psychologically in a role which escapes him" (De Keyser, 1986:234-235).

Hence, De Keyser et al. (1986:135) have advocated that "the person still play an active part in the ongoing activity, not because this presence is required, but because it automatically keeps the person up to date on the current state of the system, the better to respond if an emergency situation develops." This seems a plausible hypothesis, but an equally plausible hypothesis would be that operators tend to work mechanistically when they are performing the kinds of activities that could be automated.

De Keyser also, however, pointed out that serious emergencies call for as much automation as possible because they produce extreme time pressures, extremely complex problems, and extreme dangers, all of which greatly exceed the capabilities of human operators. Of course, people are utterly unable to respond as quickly as some emergencies demand. This poses a Catch-22. As long as the designers of a system have sufficient understanding to be able to prescribe how the system should respond to a serious emergency, they should incorporate this understanding into the system's automatic responses. But such complete understanding would imply that the automatic system works so well that a planned-for serious emergency never occurs. Consequently, when a serious emergency does arise, is not design error one prominent hypothesis about its cause, and does not that hypothesis render suspect the diagnostic information being produced by the system? Any system-design process establishes a frame of reference that identifies some events as relevant and important, and other events as irrelevant or unimportant; and a cost-effective system monitors the relevant and important events and ignores the irrelevant and unimportant ones. But this is likely to mean that the system lacks information about some of the events that produce a serious emergency, and the incomplete information that the system does have available may well lead human diagnosticians astray. Moreover, human operators who participate continuously in a system might grow so familiar with the system and its current states that they overlook anomalies and lack the objectivity to respond effectively to a serious emergency.

Trying to diagnose the causes of an unexpected emergency and to develop remedies, human operators must understand computers and other machines thoroughly, which implies that they are quite comfortable with computers and with the conceptual models they incorporate; but on the other hand, human operators must distrust their computers and models sufficiently to be able to sift computer-generated information with skeptical eyes. Similarly, confidence in their training can help people remain calm in an emergency, but confidence in their training also blinds people to its shortcomings. It thus seems likely that the people who do the most good in emergencies have an ability to discard their preconceptions and to look at situations from new points of view (Luchins and Luchins, 1959; Watzlawick et al., 1974). NASA should investigate the degrees to which such an ability varies among people and can be predicted or taught.

Workloads vary in duration as well as intensity. People can cope with very heavy workloads for short periods, yet they experience stress from moderate workloads that persist for long periods (Turner and Karasek, 1984). Some physiological reactions to stress, such as ulcers and vulnerability to infection, take time to develop. Thus, the short-duration shuttle flights do not afford a good basis for forecasting the workloads to be experienced on long-duration tours on a space station. NASA should continue to investigate the workload experiences incurred during long stays in confined spaces such as Antarctica, Sealab, and nuclear submarines (Bluth, 1984).

Anticipating Human Errors

Overloading causes people to make errors, but so do boredom, inattention, and indifference. Human errors are both prevalent and inevitable (Senders, 1980), and many human errors are desirable despite their risks. People experiment, and some of their experiments turn out badly. People deviate from their instructions, and some of these deviations have bad consequences.

Norman (1983, 1986) and Reason (1979, 1986) have initiated research into the causes of errors and ways to prevent or correct them. Norman, for instance, distinguished errors in intention, which he called mistakes, from errors in carrying out intentions, which he called slips. He classified slips according to their causes, and then sought to prescribe remedies for various slips. Table 1 lists some of Norman's categories and prescriptions.

TABLE 1 Some Error Categories and Prescriptions

Forming the Wrong Intentions

Mode errors (misclassifications of systems' states): Eliminate modes. Give better indications of mode. Use different commands in different modes.

Description errors (ambiguous statements of intentions): Arrange controls meaningfully. Give controls distinctive shapes.

Misdiagnoses: Make it difficult or impossible to take actions that have serious, irreversible effects. Suggest alternative explanations. Point out discrepancies that might be overlooked.

Activating the Wrong Behaviors or Triggering Behaviors at the Wrong Times

Omissions: Remind people of uncompleted actions.

Capture errors (very familiar behaviors replace less familiar behaviors): Monitor actual behaviors where similar behaviors diverge. Minimize overlapping behaviors.

SOURCE: Norman (1983, 1986)

Recognizing errors' importance, NASA's Human Factors Research Division is currently conducting some well-thought-out research on error detection and on error-tolerant systems. Error-detection systems could warn people when they appear to have omitted actions, to have acted out of order, or to have taken harmful actions. Error-tolerant systems would first detect human errors through unobtrusive monitoring and then try to remedy them. This research has much to recommend it. But some errors are very costly to tolerate, and some errors are very costly or impossible to correct. So human-computer systems should also try to predict human errors in order to make serious errors unlikely in advance (Schneider et al., 1980; Shneiderman, 1986). That is, prevention may be cheaper and more effective than cure, and research on error prevention might usefully complement the current projects.

Of course, all human-computer systems express some assumptions about their human participants. These assumptions have nearly always been implicit; and they have nearly always been static, insofar as the assumptions have not changed in response to people's actual behaviors (Rouse, 1981; Turner and Karasek, 1984). For many tasks, it would be feasible to explicate fairly accurate models of people. In fact, models need not be very accurate in order to make useful predictions or to suggest where adaptability to people's actual behaviors might pay off. Computers might, for example, predict that people who respond to stimuli quickly are more alert than people who respond slowly; or they might predict that experienced people would respond more quickly than inexperienced ones; or they might predict that people would be more likely to behave in habitual ways than in unusual ways; or they might predict that people would be less concerned about small discrepancies when much activity is occurring. Based on a review of human-factors research, Simes and Sirsky (1985) hypothesized that:

· experience or frequent use of a computer system decreases users' need for immediate feedback (closure),
· experience or frequent use decreases the importance of human limitations in information processing,
· experience or frequent use decreases the impact of sensory overstimulation,
· task complexity increases people's need for immediate feedback,
· task complexity increases the importance of human limitations in the information processing by inexperienced people, and
· task complexity increases the impact of sensory overstimulation.

As NASA's human-factors scientists well understand, systems that predict, detect, and correct human errors raise issues about who is actually in control. When should people have the right to experiment or to deviate from their instructions?

Developing Effective Interface Languages

Communication between people and computers may resemble communication between people who come from very different backgrounds, say a tribesman from the Kalahari desert and a whiz-kid mathematician from Brooklyn. Because computers do differ from people, the people who interact with computers need to remain aware of these differences, and the interfaces for human-computer interaction should remind users of these differences. This need became clear during the 1960s, when Weizenbaum created a program, ELIZA, that converses in English. ELIZA had almost no understanding whatever of the topics discussed with it. Instead, it imitated blindly the vagaries of the people with whom it conversed; in effect, ELIZA merely repeated people's words back to them. Yet Weizenbaum (1976:6) observed: "I was startled to see how quickly and how very deeply people conversing with ELIZA became emotionally involved with the computer and how unequivocally they anthropomorphized it." Weizenbaum's more colorful examples concerned people who did not have close acquaintance with computers.

Nearly all of the research on human-computer interaction has focused on people who lacked thorough training and who had little experience with computers. Although such research certainly can benefit the design of training programs, design characteristics that have strong effects on novices may have negligible effects on expert users, so most of these findings may not extrapolate to the well-trained and experienced operators of space systems. There is need for studies of well-trained and experienced users.

Sheppard, Bailey, and their colleagues (Sheppard et al., 1980, 1984) have run experiments with professional programmers having several years of experience. The first three experiments involved programs or program specifications that were stated either in flowchart symbols, or in a constrained program-design language, or in carefully phrased, normal English. These experiments asked experienced programmers to answer questions about program specifications, to write and debug programs, or to correct faulty programs. The fourth experiment omitted flowchart symbols and substituted an abbreviated English in which variables' names replaced their English descriptions; and the programmers were asked to add instructions to programs. Table 2 summarizes the results: Normal English turned out to be consistently inferior, and the program-design language proved consistently superior.

TABLE 2 How Experienced Programmers' Performances Vary with Different Languages

First experiment: answer questions about program specifications

                                          Normal     Flowchart    Program-design
                                          English    Symbols      Language
Time needed to answer:
  Forward-tracing questions               45.9       37.6         35.1
  Backward-tracing questions              46.8       37.6         35.8
  Input-output questions                  42.9       39.4         41.0
Percent of programmers preferring         14         33           53

Second experiment: write and debug programs

                                          Normal     Flowchart    Program-design
                                          English    Symbols      Language
Time needed to write and debug programs   29.7       23.9         20.5
Editor transactions before solution       37         39           32
Attempts before solution                  3.0        2.7          2.2
Semantic errors                           2.4        1.4          .8
Percent of programmers preferring         6          35           59

Third experiment: correct faulty programs

                                          Normal     Flowchart    Program-design
                                          English    Symbols      Language
Time needed to correct faulty programs    18.7       14.2         14.5
Attempts before solution                  1.9        2.2          1.9
Percent of programmers preferring         33         34           33

Fourth experiment: modify and debug programs

                                          Normal     Abbreviated  Program-design
                                          English    English      Language
Time needed to modify and debug           28.1       26.6         25.0
Semantic errors                           .9         1.3          1.0
Percent of programmers preferring         18         32           50

SOURCE: Sheppard et al. (1980, 1984)

One liability of a natural language such as English is its generality: Because vocabularies are large and linguistic structures are flexible, much ambiguity is possible in each word, phrase, and sentence. Speakers can make statements that mean anything--or nothing. Even a restricted natural language, probably because it resembles unrestricted natural language, may make users uncertain about what inputs are legitimate and meaningful to the computer system (Jarke et al., 1985; Shneiderman, 1986). Ambiguity and unused flexibility create noise. Both people and computers absorb information faster and more accurately when their interactions make good use of themes, chunking, and sequences (Badre, 1982; Simes and Sirsky, 1985). Overall themes can help people or computers to predict what information to expect and what information is important. Effective chunking aggregates information into batches that have meaning within the context of specific tasks. Effective sequencing presents information in a familiar, predictable order. Themes, chunking, and sequences can improve communication in any language, but they may become more important when a language has more generality.

A second liability is that natural language evokes the habits of thinking and problem solving that people use in everyday life. Green et al. (1980:900-901) remarked, for example: "The fundamental strategies of parsing used by people seem, in fact, to be aimed first and foremost at avoiding parsing altogether ... (i) if the end of the sentence can be guessed, stop listening; (ii) if semantic cues or perceptual cues (boldface, indenting, pitch and stress in speech) are enough to show what the sentence means, stop parsing; (iii) if syntactic signals (and, as, -ly, etc.) are available, use them to make a guess at the sentence structure; (iv) if there is no help for it, make a first shot at parsing by cementing together the closest acceptable pairings--noun to the nearest verb, if to the next then, etc.; (v) only if that first shot fails, try to figure out the structure by matching up constituents properly. Not until step (v) does the human start to parse in a manner anything like the computer scientists' idea of parsing, and the phrase 'figure out' has been used advisedly, for by the time that step is reached people are doing something more like problem solving than routine reading or listening."

Information displays can improve comprehension by offering symbolic and, especially, perceptual cues that help people to interpret messages. However, designing good displays is made complicated by the potentially large effects of apparently small cues. In a study of a command language, for instance, Payne et al. (1984) found that users' errors dropped 77 percent when the operator words were displayed in upper case and the operands were displayed in lower case, thus providing a visual distinction between the two categories.

Further, changes that improve performance in one context often degrade performance in another context, and changes that improve one dimension of performance often degrade another dimension of performance. A flowchart, for example, may help users to trace forward to the consequences of some initial conditions but it may impede their backward inferences about the antecedents of some terminal conditions (Green, 1982).

A third liability may be that natural languages lead users to assume that computers' reasoning resembles human reasoning, whereas artificial programming or query languages remind users that computers' reasoning differs from human reasoning. This suggests that languages resembling natural ones might be more effective media for communication between people and computers in contexts where the computers closely simulate human reasoning and understanding, even though artificial languages might be more effective communication media in applications where computers deviate from human reasoning.

Unstudied so far are the interactions between social contexts and interface languages; virtually all studies of interface languages have involved people working on tasks that they could perform alone. Yet space systems create strong social contexts. The operators talk with each other while they are interacting with computers: Queries between people instigate queries to computers, and messages from computers become oral statements to other people. De Bachtin (1985) found that sales personnel who were interacting with a computer and customers simultaneously greatly preferred an interface that allowed them to phrase queries in rather free sequence and phrasing. Thus, interface languages that approximate natural languages might turn out to be more valuable in space systems than in the situations that have been studied.

Using Meaningful Interface Metaphors

One very significant contribution to human-computer interaction was Xerox's Star interface, which derived from many years of research by many researchers. The Star interface embodies a number of design principles that evolved from experiments with prototypes. According to Canfield Smith et al. (1982:248-252), "Some types of concepts are inherently difficult for people to grasp. Without being too formal about it, our experience before and during the Star design led us to the following classification:

Easy          Hard
concrete      abstract
copying       creating
choosing      filling in
recognizing   generating
editing       programming
interactive   batch

"The characteristics on the left were incorporated into the Star user's conceptual model. The characteristics on the right we attempted to avoid.

"The following main goals were pursued in designing the Star user interface:

familiar user's conceptual model
seeing and pointing versus remembering and typing
what you see is what you get
universal commands
consistency
simplicity
modeless interaction
user tailorability

"...We decided to create electronic counterparts to the physical objects in an office: paper, folders, file cabinets, mail boxes, and so on--an electronic metaphor for the office. We hoped this would make the electronic 'world' seem more familiar, less alien, and require less training.... We further decided to make the electronic analogues be concrete objects. Documents would be more than file names on a disk; they would be represented by pictures on the display screen. They would be selected by pointing to them.... To file a document, you would move it to a picture of a file drawer, just as you take a physical piece of paper to a physical file cabinet."

NASA's Virtual Environment Workstation illustrates a more avant-garde metaphor (Fisher et al., 1986). This project would give a robot's operator the sensations and perspective of the robot: Screens in the operator's helmet would show views taken by cameras on the robot; sensors would pick up the operator's arm and finger movements and translate them into movements of the robot's arms; and the operator's gloves would let the operator feel pressures that the robot's fingers feel. The operator would have the sensation of being inside the robot, and the robot would become an extension of the operator's arm and hand movements, even though the robot might be many miles from the operator.

Although metaphors constitute a fairly new frame of reference for the designers of interfaces, a designer or user can look upon every interface as a metaphor of something, and thus the design issue is not whether to adopt a metaphor but what metaphor to adopt. Each metaphor has both advantages and disadvantages. As Star's designers noted, an effective metaphor can both reduce the amount of learning that inexperienced users must do and accelerate that learning. An effective metaphor can also tap into users' well-learned habits and thereby reduce errors and response times; and experienced users as well as inexperienced users show such improvements. For instance, Ledgard et al. (1980) slightly modified a text editor so that its commands resembled short English sentences: The original notational command RS:/KO/,/OK/;* became CHANGE ALL "KO" TO "OK", and the notational command FIND:/INCH/ became FORWARD TO "INCH".

As Table 3 shows, such changes improved the performances of fairly experienced users as well as inexperienced users.

TABLE 3 Text Editing With Different Command Languages

                                              English-like   Notational
                                              Commands       Commands
Users with less than 6 hours of experience:
  Percentage of tasks completed correctly     42             28
  Percentage of erroneous commands            11             19
Users with more than 100 hours of experience:
  Percentage of tasks completed correctly     84             74
  Percentage of erroneous commands            5.6            9.9

SOURCE: Ledgard et al. (1980)

But every interface metaphor breaks down at some point, both because a metaphor differs from the situation it simulates and because an interface differs from the computer it represents. People in real offices can take actions that users cannot simulate in Star's electronic office, and Star's electronic office allows actions that would be impossible in a real office. Similarly, a robot might be unable to reproduce some of its operator's instinctive finger movements, and an operator in a shuttle or space station would lack the mobility of an unconfined robot. Yet, users are likely to draw strong inferences about a computer's capabilities from the human-computer interface. Ledgard et al. (1980:561) noticed that "the users made no distinction between syntax and semantics.... To them, the actual commands embodied the editor to such an extent that many were surprised when told after the experiment that the two editors were functionally identical."

One implication is that an interface metaphor, like an interface language, should incorporate intentional artificiality in order to warn users of its limitations. Are some of the intuitive expectations that users bring to metaphors especially important to fulfill? For example, in designing the Virtual Environment Workstation, might it be essential to use cameras that closely approximate the spacing and movements of human eyes in order to avoid having to retrain the operator's stereoscopic vision?

Under stress, people tend to revert from specific, learned, complex models back to generic, common-sense, simple models: Which of the expectations that users have unlearned through training does stress reawaken? Does stress, for instance, increase users' responsiveness to concrete, visible stimuli and decrease their responsiveness to abstract, invisible stimuli?

A second implication is that designers should carefully explore the limitations of an interface metaphor before they adopt it, and they should look upon a metaphor as one choice from a set of alternatives, each of which has advantages and disadvantages. However, the existing interface metaphors have been developed separately, with considerable emphasis being given to their uniqueness; and the processes that created them have been poorly documented. So, interface designers need to be able to generate alternative metaphors, they need conceptual frameworks that highlight the significant properties of different metaphors, and they need systematic research to document these properties.

* * *

All of the foregoing topics imply that a computer should adapt both its appearance and the displays and programs it presents to its user--taking account, for example, of its user's topical expertise, experience, frequency of use, or manual dexterity. This argues for development of sophisticated interface software (a so-called User Interface Management System) that will recognize the needs of different users, allow different users to express their personal preferences, and protect users' individuality. Obviously, the computer needs to be able to identify a user quickly and unequivocally, and if possible, without imposing an identification procedure that would irritate people or delay their actions in an emergency.

PEOPLE AND COMPUTERS IN SPACE

Efforts to justify space systems in economic terms will keep pressing for higher and higher levels of measurable productivity, and so plans will tend to program the operators' activities in detail. But overly heavy workloads raise the probabilities of human error, and computers will always be better than people at working tirelessly and obediently adhering to plans. People contribute to space systems their ability to deal with the unexpected, and in fact, to create the unexpected by experimenting and innovating. They can make these contributions better if they are allowed some slack.

Space systems' tasks are not all located in space. Space systems inevitably make educational contributions that transcend any of their immediate operational goals. One of the major contributions of the space program to date has been a photograph--a photograph of a cloud-bedecked ball of water and dirt isolated in a black void. Before they saw that photograph, people's understanding that mankind shares a common fate had to be abstract and intellectual; the photograph has made this understanding more palpable and concrete. People play central roles in educational activities because they serve as identifiable points of reference in settings that would otherwise seem mechanistic, remote, and alien.

Another of the space program's major contributions, one that put space exploration into words which caught the human imagination, was Neil A. Armstrong's unforgettable observation: "That's one small step for a man, one giant leap for mankind" (July 20, 1969).

SUMMARY OF QUESTIONS AND SUGGESTIONS FOR RESEARCH

Fostering Trust Between People and Expert Systems

In what ways should a decision-support system's knowledge and logical rules fit each user individually?

Do users trust systems more or less when tailoring is postponed until the users gain considerable experience?

How do task characteristics affect a user's willingness to trust a decision-support system?

In what circumstances does a user decide to trust an expert system that captures the knowledge of experts whom the user does not know personally?

What kinds of experiences lead a user to trust a decision-support system that the user regards, at least partly, as a black box?

What kinds of experiences encourage a user to see a decision-support system's limitations and to override bad recommendations?

Creating Useful Workloads

Does performing activities that could be automated actually keep human operators up to date on the states of a system, or do operators tend to work mechanistically when they are performing routine activities?

Do human operators who perform activities that could be automated respond more effectively to a serious emergency because their participation updates them on the current status of the system, or does continuous participation make operators so familiar with the system and its current status that they overlook anomalies and lack the objectivity to respond effectively to a serious emergency?

NASA should investigate the degrees to which an ability to discard preconceptions varies among people and can be predicted or taught.

What have been the workload experiences during long stays in confined spaces such as Sealab, Antarctica, and nuclear submarines?

Anticipating Human Errors

Research on error prevention might usefully complement the current projects on error detection and error tolerance.

Developing Effective Interface Languages

Virtually all studies of interface languages have involved individual people working on tasks that they could perform alone. Because space systems create strong social contexts, interface languages that approximate natural languages may turn out to be of much more value in space systems.

Using Meaningful Interface Metaphors

Are some of the cognitive expectations that users bring to metaphors especially important to fulfill? Under stress, people tend to revert from specific, learned, complex models back to generic, common-sense, simple models: Which of the expectations that users have unlearned through training does stress reawaken? Interface designers need to be able to generate alternative metaphors, they need conceptual frameworks that highlight the significant properties of different metaphors, and they need systematic research to document these properties.

General

NASA should develop a sophisticated User Interface Management System that will recognize the needs of different users, allow different users to express their personal preferences, and protect users' individuality. Is there a way for a computer to identify its user quickly and unequivocally, without imposing an identification procedure that would irritate people or delay their actions in an emergency?

Since NASA can choose from a large pool of applicants, the extreme capabilities of exceptional people are more important than the average capabilities of typical people. The people who operate space systems first receive thorough training, so their deficits of experience should be small. Nearly all of the research on human-computer interaction has focused on people who lacked thorough training and who had little experience with computers, so most of these findings may not extrapolate to the well-trained and experienced operators of space systems. There is need for studies of well-trained and experienced users. Avoid research aimed at describing human capabilities in general. Instead, test fairly realistic mock-ups of interfaces and systems, with people who are as well trained and as able as real astronauts and controllers.

Investigate the sensitivity of performance to small variations in designs: Do small design changes produce large changes in performance? Both to improve the quality of designs and to improve users' acceptance of designs, experienced astronauts and controllers should participate in the designing of interfaces and systems. Because early decisions often constrain later modifications, astronauts and controllers should participate from the beginning of any new project.

ACKNOWLEDGMENTS

This report has been improved by constructive suggestions from Michael Burlm, Janet Dukerich, Kenneth Laudon, Henry Lucas, Frances Milliken, Jon Turner, Jane Webster, Keith Weigelt, and Harry Wolbers.
