Index

A

Abstraction hierarchies, 198

Accidents

Bhopal, 328

Chernobyl, 1

in large-scale systems, 1

at power plants, 198

Three Mile Island, 1, 256, 328

by Vincennes, 328

ACR. See Architecture commitment review

ACT-R model, 167, 244, 321

Active Risk Management, 311

Activity view, concurrent levels of, 4144

ACWA. See Applied cognitive work analysis method

Adaptability, 6, 26

Adjustable methods, 24

Advanced spectroscopic portals (ASPs), 105

Advocacy.

See also Nonadvocate technical experts

for consideration of HSI, 15

Affinity analysis, 175

Afghanistan, current needs in, 93

AFQT. See Armed Forces Qualification Test

Aggregation, of features, 26

Agile methods, 35, 37

Air Force Falconer Air Operations Center, 16

Air Force Research Laboratory, 2

Air traffic control systems, 13, 21

AIRPRINT, 296

Alarms, with melodies, 111112

ALARP. See As low as reasonably practicable

Alertness level, 227228

Ambiguity, 63

American Anthropology Association, 154 n.1

Anchor point milestone reviews, 23, 25, 37, 44

development commitment, 4446

Anthropometric models, 244

Applied cognitive work analysis (ACWA) method, 62

Applied Physics Lab, 144

Applied Psychology Research Unit, 10

Aptima, Inc., 162

Archetypes, composite user, 65

Architecting phase, 3

and design, 247

point-solution, 33

Architectural prototypes, 236

Architecture commitment review (ACR), 44

procedures, 46

Architectures, back-end, 238

Armed Forces Aptitude Test Battery (ASVAB), 1920



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
Terms of Use and Privacy Statement



Below are the first 10 and last 10 pages of uncorrected machine-read text (when available) of this chapter, followed by the top 30 algorithmically extracted key phrases from the chapter as a whole.
Intended to provide our own search engines and external engines with highly rich, chapter-representative searchable text on the opening pages of each chapter. Because it is UNCORRECTED material, please consider the following text as a useful but insufficient proxy for the authoritative book pages.

Do not use for reproduction, copying, pasting, or reading; exclusively for search engines.

OCR for page 365
Human-System Integration in the System Development Process: A New Look Index A Abstraction hierarchies, 198 Accidents Bhopal, 328 Chernobyl, 1 in large-scale systems, 1 at power plants, 198 Three Mile Island, 1, 256, 328 by Vincennes, 328 ACR. See Architecture commitment review ACT-R model, 167, 244, 321 Active Risk Management, 311 Activity view, concurrent levels of, 41–44 ACWA. See Applied cognitive work analysis method Adaptability, 6, 26 Adjustable methods, 24 Advanced spectroscopic portals (ASPs), 105 Advocacy. See also Nonadvocate technical experts for consideration of HSI, 15 Affinity analysis, 175 Afghanistan, current needs in, 93 AFQT. See Armed Forces Qualification Test Aggregation, of features, 26 Agile methods, 35, 37 Air Force Falconer Air Operations Center, 16 Air Force Research Laboratory, 2 Air traffic control systems, 13, 21 AIRPRINT, 296 Alarms, with melodies, 111–112 ALARP. See As low as reasonably practicable Alertness level, 227–228 Ambiguity, 63 American Anthropology Association, 154 n.1 Anchor point milestone reviews, 23, 25, 37, 44 development commitment, 44–46 Anthropometric models, 244 Applied cognitive work analysis (ACWA) method, 62 Applied Physics Lab, 144 Applied Psychology Research Unit, 10 Aptima, Inc., 162 Archetypes, composite user, 65 Architecting phase, 3 and design, 247 point-solution, 33 Architectural prototypes, 236 Architecture commitment review (ACR), 44 procedures, 46 Architectures, back-end, 238 Armed Forces Aptitude Test Battery (ASVAB), 19–20

OCR for page 365
Human-System Integration in the System Development Process: A New Look Armed Forces Qualification Test (AFQT), 19 Army Comanche Helicopter program, 13 Army Human Engineering Directorate, 19 Army Research Laboratory, 2 Artifact models, 50, 176, 280 As low as reasonably practicable (ALARP), 259 ASPs. See Advanced spectroscopic portals Assessment of HSI of contributors to system adaptability and resilience, 330 of risks, 327–329 ASVAB. See Armed Forces Aptitude Test Battery Asynchronous communication patterns, 147 ATM machine withdrawals, 197 hierarchical task analysis of, 158–159, 161 AT&T Architecture Review Board procedures, 46 Attention management, 207, 318 Automated methods, based on rules and guidelines, 272 Automatic external defibrillator, example FTA for a hypothetical, 263 Automation, 11 B Back-end architectures, 238 Baselines, generating, 278–279 BDUF. See Big design up front activities BEAST. See Boeing Engineering Aerospace Simulation Tool program Behavioral patterns, 186 Bell Laboratories, 10 Best-effort definitions, 49 Best practices for HSI, 57 for risk mitigation, 67–74 tables of, 44 Bhopal accident, 328 “Bifocal tools,” 212 Big design up front (BDUF) activities, 39 “Blogs,” 22, 289–290, 292 “Bobby” tool, 245, 272 Boeing Engineering Aerospace Simulation Tool (BEAST) program, 146 Bootstrap process, 165–166 Borg Ratings of Perceived Exertion, 220 Budget constraints, 23 tailoring methods to, 24 Business case, 77 viability of, 4 Business Week, 1 C CAIV. See Cost as an independent variable Cameras, reconceptualizing, 65 “Cardboard computers,” 172, 212–213 Case studies, 91–125 “next-generation” intravenous infusion pump, 105–125 port security, 97–105 unmanned aerial systems, 92–97 Cassette loading, semiautomatic, 112–113 Cause-effect relationships, 50 CBP. See Customs and border protection CDM. See Critical decision method Centralization, 145 Change. See also Rapid change in conditions and requirements in the workplace, designing to accommodate, 26, 300–301 CHAOS report, 191 Chernobyl accident, 1 Child safety, concerns about, 10 Choke points, identifying, 103, 202 Circadian rhythm, 227 Cmap Tools software suite, 315 COGNET/iGEN model, 244 Cognitive task analysis (CTA), 161–169 a bootstrap process, 165–166 contributions to system design phases, 167 overview, 161–166 relationship to task analysis, 165 representative methods, 162–165 shared representations, 166–167 strengths, limitations, and gaps, 168–169 in the unmanned aerial systems case study, 166 uses of methods, 167 Cognitive walkthroughs, 272 Cognitive work analysis methodology, 199 analytic tools involved in, 202

OCR for page 365
Human-System Integration in the System Development Process: A New Look Collaboration-at-a-distance, 289 Collaboration failures, 5 Collaboration-intensive systems, 4, 25 Color touch screens, large, 111 Colors, stoplight, 83 Command and control (C2), 286–287, 300 Command and control vehicles, 12 Committee on Human Factors, 2 Committee on Human-System Design Support for Changing Technology, 2 charge and scope of, 16–17 report organization, 27–28 Common ground, 61 Common Industry Specification for Usability Requirements, 194–195, 267 Communication. See also Shared representations for communication among members of the development team, 195 creating shared representations for, 25 between customers and suppliers, 195 of risk, improving, 329–330 Compatibility, evidence of, 45 Complexity of systems, 1, 144–145, 308 Composite stories, 231 Composite user archetypes, 65 Computational tools, paucity of, 206 Computer simulations, 25, 106, 240, 267 Concept mapping, 163 of the role of cold fronts in the Gulf Coast, 164 Concurrent engineering process models, 34, 37, 48, 51 Concurrent systems, definition and development, 32, 105 Conditions, accommodation to changing, 101–102 Consensus building, 2n. 1 Consequence levels, assessing, 82–83 Consumer Product Safety Commission, 10 Context of use analysis methods, 136–138 cognitive task analysis, 161–169 contextual inquiry, 175–177 defining opportunities and, 55, 129, 135–188, 279–280 event data analysis, 177–188 field observations and ethnography, 150–157 organizational and environmental context, 139–150 participatory analysis, 169–175 task analysis, 157–161 Contexts, 18–22 military sector, 18–20 private sector, 20–22 of use, 138 Contextual design, 216–217 contributions to the system design process, 217 shared representations, 217 strengths, limitations, and gaps, 217 Contextual inquiries, 114–115, 149, 175–177 affinity analysis, 175 contributions to the system design process, 139, 177 interpretation, 175 overview, 175–176 shared representations, 176 strengths, limitations, and gaps, 177 Control, of information, 22 Control rooms, 157 for power plants, 139 Cornell Musculoskeletal Discomfort Survey, 219 Cost as an independent variable (CAIV), 76 Cost-competitive contracts, 33 Cost-effective systems, 23 Costs, providing a basis for controlling, 195 Cougaar, 311 Crisis response systems, 15 Critical decision method (CDM), 162 Critical success factor (CSF) aspects of top five software projects, 52 CTA. See Cognitive task analysis Cultural analysis, 147–148 Cultural models, 144, 176, 251–252 Current-point-in-time shapshot requirements, 33 Customer observations, negative, 13–14, 112–113 Customs and border protection (CBP), 98, 104 D D-OMAR model, 245 DART, 311

OCR for page 365
Human-System Integration in the System Development Process: A New Look Data analysis, 183–184 collection, 6, 183, 187, 319 mining, 178, 318–319 privacy of, 319 rate of change of, 22 representation, 184 Data-reduction methods, 178 DATech, 271 DCR. See Development commitment review Decentralization, of information, 22 Decompositions goal/task, 279–280 hierarchical, 283 Defense Advanced Research Projects Agency, 311 Defense systems, 24 Defibrillators, automatic external, 263 Defining requirements and design, 56, 129, 189–252 contextual design, 216–217 methods for mitigating fatigue, 226–229 models and simulations, 240–252 participatory design, 210–216 personas, 233–235 physical ergonomics, 217–223 prototyping, 235–239 scenarios, 230–233 situation awareness, 223–226 usability requirements, 191–197 work domain analysis, 197–207 workload assessment, 207–210 Delphi group decision-making technique, 262 Descriptive methods, in ethnography, 151 Design as an innovative process, 189 as a socially constructed process, 61 Design cycle time, pressure to reduce, 12 Design issues decisions, 109 meaning, 63 and methods used, 124–125 opportunities and constraints, 5 solutions, 56, 280–281 Design team members, 11 Development commitment review (DCR), 44, 47, 49 procedures, 46 Development phase, 3 risk of destabilizing, 50 DHS. See U.S. Department of Homeland Security Diagrams, 25 Differentiation, 144–145 Digital human physical simulations, 243–244 Disuse, operational stage risk of, 59 Diversity, managing, 152 Documentaries, multimedia, 173 DoD. See U.S. Department of Defense Domain knowledge, repository of, 206 Drawing workshops, 171–172 Dutch Musculoskeletal Survey, 219 E E-commerce web sites, 255 Ease of use, 249–250 Eclipse Process Framework OpenUP, 302 ECR. See Exploration commitment review EDA. See Event data analysis Education of HSI specialists. See also Human-system integration, developing as a discipline opportunities for, 14 Electronic models, 25 Electronic Systems Center, 16 Emergency Care Research Institute, 110 Emergency medical missions, 15 Emergent behavior, 15, 53 Emergent requirements, 33 Emotional models, 251–252 Encyclopedias. See also Wikipedia online, user-constructed, 22 End-state operational system risks, 57 Energy systems, 255 Engineering development risk, management of, 59 Enterprise resource planning (ERP) package, 39 Environmental context, 5, 17, 139–150 contributions to system design phases, 149 methods and respective sources of data, 141 overview, 139–141 shared representations, 141–144 strengths, limitations, and gaps, 149 uses of methods, 144–149

OCR for page 365
Human-System Integration in the System Development Process: A New Look “Envisioned world” problem, 163–164 EPIC model, 245, 321 “Epistemic status,” 231 ERP. See Enterprise resource planning package Errors operational stage risk of, 59 taxonomies of, 328 ESDA. See Exploratory sequential data analysis Ethical considerations, 6, 316–320 Ethnographic inquiry, 231 contributions to the system design process, 154–156 interviews, 153 methods, 213 observations, 153–154 practices, 152–154 principles, 150–152 shared representations, 155 strengths, limitations, and gaps, 155–157 European Union, 319 Evaluation, 56, 247–248, 281–282 heuristic, 271–272 of remaining plan activities, 90 of success accomplishments, 90 system-level, 14 Event data analysis (EDA), 5, 95, 177–188 contributions to system design phases, 185–186 ethical implications, 316–320 examples of uses of, 180–181 overview, 177–178 shared representations, 178–179 uses of methods, 180–185 Event trees, 260 Evolutionary system growth, 51 Evolvability, designing for, 26 Excel spreadsheets, 321 Expert COCOMO/COCOTS, 311 Exploration commitment review (ECR), 46–47 Exploration phase, 3, 246–247 Exploratory sequential data analysis (ESDA), 184 F Failure modes, effects, and criticality analysis (FMECA), 259n.1 Failure modes and effects analyses (FMEA), 115, 124, 159, 253, 259n. 1, 260 advantages and disadvantages of, 264 of Symbiq™ IV Pump, 119–121 Failures. See Collaboration failures; Human-system failures; Product failures; System failures Fallback plans, identification of, 89 Fault tree analysis (FTA), 253 advantages and disadvantages of, 265 and other technique variations, 260–262 steps in performing, 264 FDA. See Food and Drug Administration Feasibility evidence of, 45 rationale for, 50 Feature needs large color touch screens, 111 medication libraries with hard and soft dosage limits, 110–111 semiautomatic cassette loading, 112–113 special alarms with melodies, 111–112 special pole mounting hardware, 113 stacking requirements, 113–114 and their rationales, 110–114 tubing management, 114 “Feedreaders,” 289 “Field modification,” 28 Field observations, 5, 123, 150–157 ethnographic practices, 152–154 ethnographic principles, 150–152 Fitts’s law, 321 Flash™ animations, 119, 119n. 1 Flow models, 176 FMEA. See Failure modes and effects analyses steps in performing, 261 FMECA. See Failure modes, effects, and criticality analysis Focus groups, 122 “Folksonomies,” 22, 290 Food and Drug Administration (FDA), 110, 113 Formalization, 145 Formative evaluation, uses of methods in, 267–268 FTA. See Fault tree analysis Full-scale warfare, 15 Function allocation, 131

OCR for page 365
Human-System Integration in the System Development Process: A New Look Funders of research, lack of commitment to HSI by, 2, 5 The future, 7, 275–330 conclusions and recommendations, 296–330 scenarios for, 277–295 Future-vision stories, 232 Future workshops, 171 Futures table, 144 G Gantt charts, 209, 279–281 Global context, 16 Goal/task decompositions, 279–280 GOMS (goals, operators, methods, and selection rules) method, 167, 242–243, 249–250, 281, 321 Google, 178 Government organizations, 4–5 Graphic user interface (GUI) simulations, 119 interactive builder, 32–33 Grounded theory, 152–153 Group narratives, 183 “thinking with,” 63 Groupware support systems, 38 Guidelines, 271 H Habitability, in the military sector, 1 Hand-off functions, 94 Handbook of Systems Engineering and Management, 34 Hardware models, with integrated usability tests, 119–120 Harms, defining, 257 Health hazards. See also Safety and health considerations in the military sector, 18 Heuristic evaluation, 271–272 Hierarchical task analysis (HTA), 157–158 formalism of, 166 graphic representation of, 159 Hierarchies abstraction, 198 decomposing, 283 deep supplier, 50 High assurance, incremental development for accommodating, 49–51 High-level languages, 251 Holistic methods, 16 in ethnography, 151 of measuring risk, 83 Home media systems, 13 Hospira customer service organization, 110 Hospital systems, 12 HSI. See Human-system integration HTA. See Hierarchical task analysis Human capabilities and needs, considering early, 1 Human cognitive characteristics, 2 Human-computer interaction, 10 Human digital modeling, 221, 223 Human error analysis, 255–265 contributions to system design phases, 262 general model of, for security screening, 100 identification of hazards, 257–259 shared representations, 259–262 strengths, limitations, and gaps, 262–265 Human factors analysis of, 295 events in the growth of, 10 introducing early enough, 14 professionals in, 59 Human Factors and Ergonomics Society, 10, 125, 313 Human factors engineering, 1, 11 in the military sector, 18 Human-in-the-loop evaluations/simulations, 87, 209, 232, 240–241, 265 Human-intensive systems, future of, 3 Human-system domain experts, 2 Human-system engineering, 2 Human-system failures, 13 Human-system integration (HSI), 2, 4, 9, 31 accommodating the emergence of requirements, 303 activities, participants, methods, and shared representation, 130 beginning early and continuing throughout the development life cycle, 297

OCR for page 365
Human-System Integration in the System Development Process: A New Look developing as a discipline, 7, 284–286, 312–313 modeling, 99 operational requirements in contracts and acquisition documents, 303–304 risks, 57 sizing the effort, 309–310 system development led by, 282–284 for UASS in the context of the risk-driven spiral, 96–97 Human-system integration (HSI) in the context of risk-driven incremental commitments, 98–103 accommodation to changing conditions and workplace requirements, 101–102 HSI methods tailored to time and budget constraints, 99–100 scalable methods, 102–103 shared representations used to communicate, 100–101 Human-system integration (HSI) in the incremental commitment model, 57–60, 94–96 Human-system integration (HSI) in the system development process, 55–74, 127–274 best practices for risk mitigation, 67–74 case studies, 91–125 conclusion, 66 defining opportunities and context of use, 55, 129, 135–188 defining requirements and design, 56, 129, 189–252 evaluating, 56 function allocation, 131 managing risks, 75–90 methods for evaluation, 129, 253–274 performance measurement, 131–133 shared representations, 61–66 the system development process, 31–54 Human-system integration methods tailored to time and budget constraints, 99–100 Human-system model development, 320–322 I IBM/Rational Unified Process, 302 ICM. See Incremental commitment model Identification of fallback plans, 89 of hazards and when risk management is conducted, 257–259 of HSI contributors to system adaptability and resilience, 330 of risks, 78–84, 327–329 IEC. See International Electrotechnical Commission IMPRINT (Improved Performance Research and Integration Tool), 19, 241, 249–250, 321 Incremental commitment model (ICM), 9, 23, 31, 36–51 for accommodating rapid change and high assurance, 49–51 activity categories and level of effort, 41 anchor point milestone feasibility rationale, 46 concurrent levels of activity view, 41–44 development commitment anchor point milestone review, 44–46 different risks creating different processes, 40 life-cycle process elaboration, 36, 45 milestone reviews, 37, 46–47 phases of, 44 principles, 33 process model generator view, 39–40 project experience with, 50–53 spiral view of, 47–49 top-5 projects explicitly using, 51, 51n.1 Incremental growth, of system definition and stakeholder commitment, 32, 103–104 Individual stories, 231 Information, sharing across domains, 7 Input/output system diagrams, 142 Institute for Human and Machine Cognition, 315 Institute for Safe Medical Practices, 110 Institutionalization, of a system development process based on the success factors, 302 Insurgency suppression missions, 15 Integrated product team (IPT), 80–81, 283, 286–287, 298 structuring HSI-led system development, 284

OCR for page 365
Human-System Integration in the System Development Process: A New Look Integrated usability tests, integrated hardware and software models with, 119–120 Integration of human systems and systems engineering, 27, 145, 250–251, 278–286, 298, 301–314 accommodating the emergence of HSI requirements, 303 defining opportunities and requirements and defining the context of use, 279–280 design solutions, 280–281 developing HSI as a discipline, 284–286, 312–313 evaluation, 281–282 fostering more synergy between research and practice, 314 generating a baseline, 278–279 HSI-led system development, 282–284 humans in the design process, 1 institutionalizing a system development process based on the success factors, 302 knowledge-based planning aids for HSI, 310–312 managing system development, 6, 305 meaning of, 282 operational requirements in contracts and acquisition documents, 303–304 shared representations, 307–308 sizing the HSI effort, 309–310 systems of systems, 308–309 traceability and requirements, 305–307 Interconnectedness and interdependency, 26 International Council on Systems Engineering, 313 International Electrotechnical Commission (IEC), 112 International Ergonomics Association, 217n. 2, 313 International Journal of Human-Computer Studies, 131 International Journal of Human System Integration, 313 International Organization for Standardization. See ISO standards Internet, the. See also Web 2.0 influence on culture, 154 Interpretation, 175, 231 Interviews, 153 IPT. See Integrated product team Iraq, current needs in, 93 ISO standards, 4, 115, 302 ISO 9241-11, 192, 268, 271 ISO/IEC 15288, 196, 310 ISO/IEC 9126-1, 191 ISO/PAS 18152, 44, 57, 311 ISO/TR 18529, 310 Iteration, 48, 60 system definition and development, 32 system growth, 51 usability tests, 122 IV pumps tube management features, 107 two channel, 107 J JSAF model, 244 K Keystroke-level analysis, 13, 186, 249 Knowledge acquisition techniques, 161 Knowledge-based planning for HSI, 286–287, 310–312 tools for, 7 L Labor savings, 14 Laboratory studies, 153 Lag sequential analysis, 184 Large-scale systems. See Systems of systems LCA. See Life-cycle architecture package Lead systems integrator (LSI), 15 Lean development process, 51 Lean methods, 37 Libraries. See Medication libraries LibraryThing, 290 Life-cycle architecture (LCA) development phases, 3 of the ICM and EDA, 185 operational stage risk of high costs, 59 package, 44, 50 planning, 124 LIFT tool, 272 Likelihood levels, assessing, 82–83 Limitations, 249–252 of cultural, team, and emotional models, 251–252 ease of use, 249–250

OCR for page 365
Human-System Integration in the System Development Process: A New Look of high-level languages, 251 of integration, 250–251 Limited warfare, 15 Link Trainer, 240 Linkage of system engineering principles to HSI activities that reduce risks, 58 LMM. See Lumbar motion monitor Logistics planning tools, 311 Lose-lose situation, 38 Low-technology representations, 172–173, 212, 212n. 1, 213 LSI. See Lead systems integrator Lumbar motion monitor (LMM), 221 M Macromedia Flash Player, 119n. 1 Manpower, personnel, and training (MPT) domains, 18 Manpower considerations, 1, 5, 11 in the military sector, 18–19 MANPRINT (Manpower Personnel Integration) program, 10, 17, 24, 296, 298, 307–308 Manufacturing sector, 17, 21 Maps, territory, 64 Market capture goals, 4 Marketing Requirement Document, 110 Markov modeling, 184 “Mash-up” technologies, 26, 289, 291 Matrix organization, 146 Maximizing the cost-effectiveness of usability evaluation, 326–327 Medical equipment possible harms and hazards from the use of, 258 standards for, 114 use of an automatic external defibrillator, 258 use of an automatic needle injection device, 258 Medical Research Council Laboratory, 10 Medication libraries, with hard and soft dosage limits, 110–111 Mental workload, 207–208 Meta-design approaches, 293 Method acting, 234 Methods. See also Types of methods; Uses of methods application instrumentation, 270 for assessing discomfort, 219 for assessing injury risk, 221 for assessing posture, 220 based on models and simulation, 270 collecting data from usage of an existing system, 270 issues and research needs, 116–117 satisfaction surveys, 270 tailoring to time and budget constraints, 24, 299 types of, 6, 268–272 uses of, 266–272 web metrics, 270 Methods and shared representations, 211–214 ethnographic methods, 213 low-technology representations, 212–213 scenarios, 211 theatrical approaches, 213 workshop methods, 213–214 Methods based on expert assessment of the characteristics of a system, 271–272 cognitive walkthrough, 272 guidelines and style guides, 271 heuristic evaluation, 271–272 usability walkthrough, 272 Methods based on observing users of a real or simulated system, 268–270 formative methods, 268 summative methods, 268–269 Methods for defining opportunities and context of use, 314–320 tools to support capture and dissemination of results of context of use analyses, 315–316 user participation in systems engineering and event data analysis and their ethical implications, 316–320 Methods for defining requirements and design, 320–324 human-system model development, 320–322 prototyping training and organizational design, 322–324 Methods for evaluation, 129, 253–274, 324–330 analysis of human error, 256–265 identifying and assessing HSI contributors to system adaptability and resilience, 330

OCR for page 365
Human-System Integration in the System Development Process: A New Look identifying and assessing HSI risks, 327–329 improving the communication of risk, 329–330 improving the use of usability objectives, 324–326 maximizing the cost-effectiveness of usability evaluation, 326–327 risk analysis, 253–256 usability methods, 265–274 Methods for mitigating fatigue, 226–229 assessment, 220–221 contributions to system design phases, 229 overview, 226–227 shared representations, 228 strengths, limitations, and gaps, 229 uses of, 227–228 Micro-Saint-based models, 321 Microergonomics interventions, 140 Microsoft Office, 270 Milestone B commitment, 39 Military sector context, 10, 12, 18–20. See also Command and control habitability and survivability in, 1 manpower, 19 personnel, 19–20 training, 20 Mission-critical subsystems, 34 Mitigation efforts, “off the books,” 89 MITRE Corp., 162, 311 Models, 3, 5, 7, 25, 240–252. See also Artifact models; Cultural models; Emotional models; Flow models; Hardware models; Human digital modeling; Human-system model development; Incremental commitment model; Network models; Physical models; Sequence models; Software models; Team models contributions to system design phases, 246–248 derived from human cognitive operations, 242–243 overview, 240 strengths, limitations, and gaps, 248–252 that mimic human cognitive and perceptual-motor behavior, 244–246 types and uses of, 240–246 ModSAF model, 244 Motivation behind the design, 106 MPT. See Manpower, personnel, and training domains Multidimensional scaling, 184 Multimedia documentaries, 173 Multiple systems. See Systems of systems Multitasking, 207 Muscle Fatigue Assessment method, 220 N Napping, strategic, 227 NASA. See National Aeronautics and Space Administration National Academies Committee on Human Factors, 2 study on organizational models, 252 National Aeronautics and Space Administration (NASA), 60n. 1 Near Earth Asteroid Rendezvous project, 144 TLX scales, 208 National Aerospace System, 241 National Institutes for Occupational Safety and Health (NIOSH), 219–220 National Science Foundation, 313 National Transportation Safety Administration, 10 Naval Postgraduate School, 285, 312 Navy Tactical Decision Support systems, 13 Near Earth Asteroid Rendezvous (NEAR) project, 144 Negative business outcomes. See also Customer observations resulting from HSI faults, 259 Negotiation facilitating, 64 terms oriented to, 34 Nested techniques, 144 Network management, 21 Network models, of human-system performance, 241–242 New technologies, 26 feasibility of inserting, 4 governmental and commercial uses of, 22 “Next-generation” intravenous infusion pump, 105–125 motivation behind the design, 106

OCR for page 365
Human-System Integration in the System Development Process: A New Look summary of design issues and methods used, 124–125 user-centered design process in the ICM context, 106–124 NIOSH. See National Institutes for Occupational Safety and Health Nonadvocate technical experts, 79 Nordic Musculoskeletal Questionnaire, 219 Norman, Don, 62–63 North Atlantic Treaty Organization, 313 Nuclear power plants, 21 work domain representation for a pressurized water reactor, 200 Nuclear Regulatory Commission, 262 NYNEX Science and Technology organization, 13 O Observations, 153–154. See also Customer observations Observer-participant approach, 153 Occupational Safety and Health Administration (OSHA), 10 Occupational repetitive action (OCRA) methods, 221 OCR. See Operations commitment review OCRA. See Occupational repetitive action methods Operation of the Defense Acquisition System, 2, 4–5, 14 Operational requirements, of HSI, 4 Operational return on investment, 31 Operational stage, 3, 248 Operational stage risks, 59 use-error-induced, 92 Operations commitment review (OCR), 47 Operator fatigue, 226 Opportunity-driven approach, to determining needs for HSI activity, adopting, 298 “Opt-in” and “opt-out” approaches, 320, 320n. 1 Optimization schemes, 140 Options assessment assuming the risk, 87 avoiding the risk, 85–86 handling, 85 mitigating the risks, 87–88 transferring the risk, 86 Ordinal values, 82 Organization charts, 141–142 Organizational context, 5, 139–150 contributions to system design phases, 149 methods and respective sources of data, 141 overview, 139–141 shared representations, 141–144 strengths, limitations, and gaps, 149 uses of methods, 144–149 Organizational design example of, 146 modeling approaches, 308 Organizational system scan, 144–147 Organizational variances, table of, 142 OSHA. See Occupational Safety and Health Administration Ovako working posture analysis, 220 “Over-confidence” bias, 12 P PageRank algorithms, 178n. 2 Paper prototypes, 119 Parameter estimation, 13 Part-task simulations, 249 Participatory analysis, 5, 95, 169–175, 210–216, 230 contributions to the system design process, 173, 214 fitting into the system development process, 174 methods, 211–214 overview, 169–173, 210–211 scenarios in, 172 shared representations, 173–174, 211–215 strengths, limitations, and gaps, 174–175, 215–216 workshops in, 170–172 Participatory workshops, 170–172 drawing and other visual workshops, 171–172 future workshops, 171 low-technology representations, 172–173 multimedia documentaries, 173 strategic design workshops, 171 Pass/fail reviews, 14

OCR for page 365
Human-System Integration in the System Development Process: A New Look Pathfinder network scaling, 179, 179n. 3, 184 Pattern recognition, 318 PDR. See Product requirements document Performance measurement, 131–133 Personas, 233–235, 279 contributions to the system design process, 234 shared representations, 233–234 strengths, limitations, and gaps, 234–235 Personnel considerations, 1, 5, 11 back-up, 12 in the military sector, 18–20 “Personnel subsystems,” 10 PERT charts, 209–210 Photo documentaries, 172 Physical ergonomics, 5, 217, 217n. 2, 218–223 assessing, 207 contributions to system design phases, 222 overview, 217–218 shared representations, 218–219 strengths, limitations, and gaps, 222–223 uses of methods, 219–221 Physical models, 25, 176 Physical performance characteristics, 2 Physical prototypes, foam model of a blood analyzer prototype, 237 Physical simulations, digital human, 243–244 PLIBEL, 219 Point-solution architecture, 33 Pole mounting hardware, 113 Policy recommendations, 4, 301–330 methods for defining opportunities and context of use, 314–320 methods for defining requirements and design, 320–324 methods for evaluation, 324–330 realizing the full integration of human systems and systems engineering, 301–314 Polyvinyl toluene sensors, 105 Port security, 97–105. See also Radiation portal monitoring (RPM) systems HSI in the context of risk-driven incremental commitments, 98–103 principles of system development in, 103–105 use of work domain analysis in, 202 Power plants. See also Nuclear power plants accidents at, 198 control rooms, 139, 204 “Practicum” environment, 286 Preventive action, 124 Price systems, 311 Principles-based comparison, of alternative process models, 34–36 Prioritized capabilities, specifying, 34 Prioritized risks, 84 Privacy of data, 319 options in, 320 Private sector context, 4–5, 12, 20–22 Process control, 17, 21 Process model generator view, 39–40 Process tracing, 183 Product design methodologies, 2 Product failures, reducing risk of, 195 Product introduction, 124 Product requirements document (PDR), 119, 121 Product usability characteristics evaluation methods, 271–272 automated methods based on rules and guidelines, 272 methods based on expert assessment of the characteristics of a system, 271–272 Product variation, 145 Program award fee criteria, 4 Program impacts, assessing, 83–84 Program management risks, 57 Program managers, lack of commitment to HSI by, 2 Program schedules, 89–90 Progress monitoring, 14 Project Ernestine, 243 Protocols analysis of, 182 RSS, 289 think-aloud, 225 Prototypes, 3, 5–6, 25, 119, 235–239, 267, 324 architectural, 236 contributions to system design phases, 238

OCR for page 365
Human-System Integration in the System Development Process: A New Look overview, 235–236 paper, 119 rapid, 22 shared representations, 236–237 strengths, limitations, and gaps, 238–239 “throwaway,” 212 training and organizational design, 322–324 uses of methods, 236 Q “Qualitative and quantitative personnel requirements inventory,” 10 “Quality in use,” evaluation of, 265 Quick Exposure Checklist, 220 Quick look reports, 60, 60n. 1 R Radiation portal monitoring (RPM) systems, 98–99, 202 large-scale, 97 Rapid change, 33 incremental development for accommodating, 49–51 Ratio values, 82 Rational unified process (RUP), 37, 41, 51 R&D. See Research and development Real options theory, 38 Reason’s error classification, 257 Rebaselining, 32 Recommendations, 2, 4, 296–330 adopting a risk- and opportunity-driven approach to determining needs for HSI activity, 298 beginning HSI contributions to development early and continuing them throughout the development life cycle, 297 designing to accommodate changing conditions and requirements in the workplace, 300–301 ensuring communication among stakeholders of HSI outputs, 299 integrating across human-system domains as well as across the system life cycle, 298 tailoring methods to time and budget constraints, 299 Recording language, standard, 64 Recording technologies, 153 Reductions assessing achievement of, 90 in the development effort, 195 Relationships, cause-effect, 50 Reliability, 12 Remotely piloted vehicles (RPVs), 92 Reports, 25 Representations. See Diagrams; Low-technology representations; Models; Prototypes; Reports; Shared representations; Simulations; Spreadsheets; Stories; Storyboards; Time lines Representative methods, 162–165 for defining opportunities and context of use, 137 for defining requirements and design, 190 for evaluation, 254 Requirements analysis of, 4, 304 classification of, 192 “creep” of, 294 specification of, 195 specification of inappropriate, 3 Research agenda, 3, 5–7 full integration of human systems and systems engineering, 6–7 methods and tools, 5–6 preliminary, 108–109 shared representations, 5 Research and development (R&D), 86 support for, 13 Research recommendations, 301–330 methods for defining opportunities and context of use, 314–320 methods for defining requirements and design, 320–324 methods for evaluation, 324–330 realizing the full integration of human systems and systems engineering, 301–314 Residual risk, 259 Resilience, 6, 14, 309, 328, 330 Resources failure to assign, 14, 24 suboptimal, 84, 145 Reusable components, 7, 33 Risk assuming, 87

OCR for page 365
Human-System Integration in the System Development Process: A New Look avoiding, 85–86 identification of, 78–81 prioritizing, 84 residual, 259 transferring, 86 @Risk, 311 Risk analysis, 5, 78–84, 253–256 assess likelihood and consequence levels, 82–83 assessing program impacts, 83–84 defining use error, 255–256 determining level of, 83 determining method of, 82 overview, 253–256 revised, 124 steps in, 82 Risk-driven ICM approach, 51 for accommodating rapid change and high assurance, 49 adopting, 23–24 to determine needs for HSI activity, adopting, 298 Risk-handling options, decision flow of, 85 Risk management, 48, 75–90, 104–105 e-commerce web sites, 255 early, 115–119 energy systems, 255 executing risk mitigation, 88–90 handling options assessment, 85 identification of hazards when conducting, 257–259 risk-driven activity levels and anchor point milestones, 32–33 techniques for, 255 transportation systems, 255 weapons systems, 255 Risk mitigation, 87–88 best practices for, 67–74 developing a plan, 88–89 evaluating plan activities, 90 evaluating success accomplishments, 90 executing, 88–90 identifying fallback plans, 89 incorporating into program schedules, 89–90 progressive, 23 steps in, 88 Risk of product failure, reducing, 195 Risk priority number (RPN) values, 115, 119 Robust systems, 198 Role networks, 142–144 for NASA’s Near Earth Asteroid Rendezvous project, 144 Role variances, examples of, 150 Root concept, 231 RPM. See Radiation portal monitoring systems RPN. See Risk priority number values RPVs. See Remotely piloted vehicles RSS protocol, 289 Rules and guidelines, automated methods based on, 272 RUP. See Rational unified process S Safety and health considerations, 1, 11 Safety-case submittals, 168 Safety-critical systems, 24, 34, 252 Sample size formula, 327 Satisfaction surveys, 270 Satisficing, 283. See also Stakeholders defining, 2n. 1 “Say-do-make” approach, 214 Scalable methods, 24, 102–103 multidimensional, 184 Scenarios, 7, 211, 230–233, 280 contributions to system design phases, 232 overview, 230 in participatory analysis, 172 shared representations, 231–232 strengths, limitations, and gaps, 233 uses of methods, 230–231 Scenarios for the future, 277–295 integrated methodology, 278–286 knowledge-based planning for HSI, 286–287 user participation, 288–295 Schematic representations, for a compact power plant control room, 204 Screening. See Security screening Seaports. See Radiation portal monitoring (RPM) systems SEAPRINT (Systems Engineering, Acquisition, and Personnel Integration), 18, 296 Search and rescue missions, 15 Second round prototypes, for interface to MRI device, 237

OCR for page 365
Human-System Integration in the System Development Process: A New Look Security screening in complex labor situations, 104 general model of human error analysis for, 100 likely tightening of, 22 SEER/SEM, 311 Self-report instruments, 218–219 Sensors, polyvinyl toluene, 105 Sequence models, 157–159, 176 Service industries, 17 Service-oriented architectures (SOAs), 22, 289–290 Shared language, 63 Shared representations, 141–144, 155, 159–160, 166–167, 173–179, 194–195, 201, 209, 215–219, 228, 231–237, 259–262, 273, 307–308 artifact model, 176 attributes of good, 63–64 composite stories, 231 cultural model, 176 cultural profile, 144 for defining requirements and design, 190 in the design process, 64–66 for evaluation, 254 flow model, 176 and FMEA, 259–260 and FTA, 260–262 future-vision stories, 232 futures table, 144 individual stories, 231 input/output system diagram, 142 organization charts, 141–142 physical model, 176 providing a basis for controlling costs, 195 reducing risk of product failure, 195 reducing the development effort, 195 and role networks, 142–144 sequence model, 176 for specification of requirements, 195 table of organizational variances, 142 tracking evolving requirements by providing a format to document usability requirements, 195 usefulness of, 62–63 Shared representations for communication, 5, 100–101 among members of the development team, 195 of concepts to engineering staff, 100 creating, 25 between customers and suppliers, 195 of HSI issues and opportunities, 61–66 Signal detection theory, 242, 321 Simulations, 3, 5, 7, 25, 240–252 contributions to system design phases, 246–248 overview, 240 part-task, 249 strengths, limitations, and gaps, 248–252 types and uses of, 240–246 Single-user systems, 21 Situation awareness, 11, 139, 223–226 contributions to system design phases, 225 measuring, 224–225 overview, 223–224 strengths, limitations, and gaps, 225–226 Situation Awareness Global Assessment Technique, 224 Situation Awareness Rating Technique, 225 SOAs. See Service-oriented architectures Social network analysis, 185 “Social software” services, 22, 289 “Social tagging,” 22 Socially constructed processes, design as, 61, 63 Sociotechnical systems approach, 141, 148–149 Software models, with integrated usability tests, 119–120 Software Technology Risk Advisor, 311 “Sourcing,” of information, 22 Space program, 248. See also National Aeronautics and Space Administration Special causes, 141 Spimes, 294 Spiral models, 34, 37, 39, 47–49 development of, 35 simplified view of the ICM, 48 win-win, 51 Spreadsheets, 5, 25 Stacking requirements, 113–114 “Staged world” techniques, 163–164 Stakeholders, 2, 5, 11 analyzing, 148–149 concurrence of, 40 conflicting requirements of, 15

OCR for page 365
Human-System Integration in the System Development Process: A New Look of HSI outputs, ensuring communication among, 299 satisficing, 31–32, 48, 283 success-critical, 38, 103 user-centered activities for, 196 Standard recording language, 64 Standardized interface, 22 Standish Group, 191 Stories, 5, 25, 183 Storyboards, 7, 280 Straddle carriers, 102 Strategic design workshops, 171 Style guides, 271 Subjectivity issues, 219 Suboptimal resources. See Resources Success-critical stakeholder satisficing, 103 Successful system development concurrent system definition and development, 32 incremental growth of system definition and stakeholder commitment, 32 iterative system definition and development, 32 principles for, 2–3, 32–33 risk-driven activity levels and anchor point milestones, 32–33 stakeholder satisficing, 32 Summative methods, 267–270 Supplier hierarchies, deep, 50 Survivability, 11 in the military sector, 1, 18 “Sweeps,” 162 Symbiq™ IV Pump, 105–107, 114–115, 125, 159 excerpts from failure modes and effects analyses (FMEA), 120–121 Synergy between research and practice fostering more, 7, 314 lack of, 14 System design phases, 11 architecting and design, 247 contributions to, 149, 160, 167, 185–186, 196, 205–206, 209, 222, 225, 229, 232, 238, 246–248, 262, 273 evaluation, 247–248 exploration and valuation, 246–247 operation, 248 System design process, contributions to, 154–155, 173, 177, 214, 217, 234 System developers, 14 System development principles, 103–105 concurrent system definition and development, 105 incremental growth of system definition and stakeholder commitment, 103–104 risk management, 104–105 success-critical stakeholder satisficing, 103 System development process, 31–54 conclusion, 53–54 evolving nature of system requirements, 33–34 incremental commitment model, 36–39 institutionalizing based on success factors, 302 participatory methods fitting into, 174 principles-based comparison of alternative process models, 34–36 principles for successful system development, 32–33 project experience with ICM principles, 51–53 views of the incremental commitment model, 39–51 System diagrams, inputs and outputs, 142 System engineers, 2 System failures, catastrophic, 9 System-level evaluation, 14 System life-cycle processes, 196 activity level of HSI methods across phases of, 56 issues involved in, 2 System performance, compromises in, 24 System requirements emergent, 33 evolving nature of, 33–34 rapid change, 33 reusable components, 33 System resilience. See Resilience System safety, in the military sector, 18 System scoping, 3 System simulations. See Simulations Systems engineering for user participation, 291–295 Systems of systems, 6, 14, 36, 300, 308–309 complexity of, 1, 4, 308 defining, 15 very large, 50

OCR for page 365
Human-System Integration in the System Development Process: A New Look T TADMUS (Tactical Decision Making Under Stress) program, 13 Task analysis, 5, 157–161 contributions to system design phases, 160 overview, 157–159 relationship to, 165 shared representations, 159–160 strengths, limitations, and gaps, 160–161 traditional, 201 uses of methods, 160 Task flow diagrams, 115, 118 Taxonomies, of error, 328 Team models, 12, 251–252 Technique for human error rate prediction (THERP), 256 Technologies. See also New technologies “mash-up,” 26, 289, 291 potential insertion opportunities for, 105 recording, 153 wearable, 292 Territory maps, 64 Testing of alarm criticality and alerting, 120–122 of display readability, 122 rapid, 22 of usability requirements, 194 Theater Response Package, 16 Theatrical approaches, 213, 215 Themes, 23–27 adopting a risk-driven approach, 23–24 creating shared representations for communication, 25 designing to accommodate changing conditions and requirements in the workplace, 26 integrating HSI contributions across life-cycle phases and human-system domains, 27 tailoring methods to time and budget constraints, 24 Theory-based analysis, 99 Theory W approach, 38 THERP. See Technique for human error rate prediction Think-aloud protocols, 225 Threat-based RPM display, graphical representation of work flow with, 101 Threat detection, 99 Three Mile Island accident, 1, 256, 328 “Throwaway” prototypes, 212 Time constraints, 23 tailoring methods to, 24 Time of day, and alertness level, 228 Time lines, 7, 279 TIPS cards, 123 TLX scales, 208 Tools. See also individual tools for product design, 2 to support capture and dissemination of results of context of use analyses, 315–316 Top-5 projects, explicitly using ICM principles, 51, 51n. 1 Touch screens, large color, 111 Traceability, 6 and requirements, 305–307 Tracking evolving requirements, by providing a format to document usability requirements, 195 Trade-offs, 3, 19, 34, 140 Training considerations, 1, 5, 11 deficiencies in, 20 in the military sector, 18, 20 Transportation systems, 255 Trustworthiness, 12 Tubing management, 114 Types of methods, 268–272. See also Methods; Uses of methods expert-based evaluation, 272 product usability characteristics evaluation, 271–272 user behavior evaluation, 268–270 Types of models and simulations, 240–246 digital human physical simulations, 243–244 human-in-the-loop simulation, 240–241 models derived from human cognitive operations, 242–243 models that mimic human cognitive and perceptual-motor behavior, 244–246 network models of human-system performance, 241–242 signal detection theory, 242

OCR for page 365
Human-System Integration in the System Development Process: A New Look U UASs. See Unmanned aerial systems Unintended relations and features, detection of, 62 Unmanned aerial systems (UASs), 92–97 conclusion and lessons learned, 96–97 hypothetical case, 93–94 in the ICM context, 94–96 U.S. Army, 10, 18–19, 241 U.S. Department of Defense (DoD), 2, 4–5, 10, 14, 18–19, 241, 250, 297, 301–304, 313 development milestone reviews, 23, 37 DoD Instruction 5000.2, 2, 4–5, 14, 302 Milestone B commitment, 39 U.S. Department of Health and Human Services, 271 U.S. Department of Homeland Security (DHS), 97 U.S. Navy, 18–19, 250 U.S. Rehabilitation Act, 245 US WEST, 13 Usability approaches to ensuring, 266 contributions to system design phases, 196, 273 evaluation methods, 5, 232, 265–274 of an existing system, measuring, 193 improving the use of objectives, 324–326 overview, 191–192, 265–266 practitioners of, 274 quantifying, 325 setting objectives, 115 shared representations, 194–195, 273 strengths, limitations, and gaps, 197, 273–274 tools to support capture and dissemination of results, 315–316 uses and types of methods, 193–194, 266–272 walkthrough, 272 Usability requirements, 191–197 specifying for new systems, 193–194 USC COCOMO/COSYSMO, 311 Use-error faults, 254 defining, 255–256 risk analysis, 159, 255 Use-error-induced operational risks, 92 Use of methods, 193–194, 208. See also Methods; Types of methods instructions for development and testing, 123 measuring usability of an existing system, 193 shared representations, 62–63 specifying usability requirements for the new system, 193–194 testing whether usability requirements have been achieved, 194 User-based evaluation methods, types of, 269 User behavior evaluation methods, 268–270 methods based on models and simulations, 270 methods based on observing users of a real or simulated system, 268–270 methods that collect data from usage of an existing system, 270 User-centered design process in the ICM context, 106–124 activities for stakeholder requirements, 196 contextual inquiry, 114–115 design decisions, 109 early risk management, 115–119 feature needs and their rationales, 110–114 field studies, 123 focus groups, 122 instructions for use development and testing, 123 integrated hardware and software models, 119–120 iterative usability tests, 122 life-cycle planning, 124 preliminary research, 108–109 product introduction, 124 prototypes, 119 revised risk analysis, 124 setting usability objectives, 115 tests of alarm criticality and alerting, 120–122 tests of display readability, 122 validation usability tests, 123–124 User-created dynamic pages, 22

OCR for page 365
Human-System Integration in the System Development Process: A New Look User participation in systems engineering, 288–295 approaches to capturing user input, 288–291 ethical implications, 316–320 Uses of methods, 144–149, 160, 167, 180–185, 201–205, 219–221, 227–231, 236 assignment and diagnosis, 185 cultural analysis, 147–148 data analysis, 183–184 data collection, 183 data representation, 184 ethnographic inquiry, 231 in formative and summative evaluation, 267 human digital modeling, 221 interpretation, 231 methods for assessing discomfort, 219 methods for assessing fatigue, 220–221 methods for assessing injury risk, 221 methods for assessing posture, 220 organizational system scan, 144–147 other example applications, 203–205 problem scenarios and claims, 231 root concept, 231 stakeholder analysis, 148–149 strengths, limitations, and gaps, 187–188 use of work domain analysis in the port security case study, 202 Uses of models and simulations, 240–246 digital human physical simulations, 243–244 human-in-the-loop simulation, 240–241 models derived from human cognitive operations, 242–243 models that mimic human cognitive and perceptual-motor behavior, 244–246 network models of human-system performance, 241–242 signal detection theory, 242 USS Vincennes, Iranian Air Bus downed by, 13, 328 UTOPIA project, 65, 239 V V-model, 37, 39 updates, 34 Validation usability tests, 123–124 Valuation commitment review (VCR), 46–47 Valuation phase, 3, 246–247 Value-based systems and software engineering, 38 Variability, maximization of, 153 VCR. See Valuation commitment review Vincennes. See USS Vincennes Visual workshops, 171–172 Visualizations, novel, 203 Voice recognition applications, 13 W Walkthroughs. See Cognitive walkthroughs Warfare, limited or full-scale, 15 Waterfall models, 34 sequential, 34, 39 “Weak links,” 330 Weapons systems, 255 Wearable technologies, 292 Web 2.0, 22, 26, 288, 290–291, 294, 305, 316, 318 Web metrics, 270 Web sites, designing, 157 “Weblogs,” 22, 289. See also “Blogs” WebSAT, 272 Whole-systems approach, 139 Wikipedia, 290 Win-lose situations, 38 Win-win spiral process, 51 Wireframes, 119 Work-arounds, 26 Work-centered design approaches, 139 Work domain analysis, 197–207 contributions to system design phases, 205–206 overview, 197–200 representation for a pressurized water reactor nuclear power plant, 200 shared representations, 201 strengths, limitations, and gaps, 206–207 use in the port security case study, 202 uses of methods, 201–205 Work flow graphical representation of, 101 problems with, 187 Workload, managing, 19

OCR for page 365
Human-System Integration in the System Development Process: A New Look Workload assessment, 207–210 contributions to system design phases, 209 overview, 207–208 shared representations, 209 strengths, limitations, and gaps, 209–210 use of method, 208 Workplace investigations, 175 Workplace requirements, accommodation to, 101–102 Workshop methods, 213–214, 280. See also Drawing workshops; Future workshops; Participatory workshops; Strategic design workshops; Visual workshops Workstations, 12 World War II, 10 X XML interface, 22, 289 Y Yahoo!, 291