APPENDIX G

Microcomputer Desktop Simulation

This appendix discusses the background and uses of microcomputer desktop simulators. In this appendix, a basic microcomputer desktop simulator consists of a microcomputer with hard drive, a single cathode ray tube color monitor, keyboard, an auxiliary control device such as a mouse or trackball, data input-output capabilities, and simulation software. The data input-output capabilities would consist of one drive or a combination of floppy or other types of drives, possibly a network connection (e.g., for microcomputers linked into a classroom training system), and "read-only" devices such as a CD-ROM. This envisioned basic configuration would be analogous to personal computer systems found in many homes, businesses, and aboard many ships. The microcomputer workstation would be configured for individual use, although individuals might be linked to an instructor console. More-elaborate desktop systems might include such features as specially designed data-entry devices (or consoles) or multiple monitors.

INTEREST IN DESKTOP SIMULATIONS

The rapid increase in the computational power and videographics capabilities of microcomputers has stimulated interest within the marine community in using these capabilities to bring simulators onto ships' bridges and into the classroom and the home. The U.S. Coast Guard (USCG) is interested in understanding whether and how microcomputer simulators could be used as a supplement or an alternative to ship-bridge simulators. The agency views microcomputer-based simulators as a possible reduced-cost option for applications to mariner
training and performance evaluations. The agency currently requires radar observer certification, which uses radar simulators as the training medium. Movement toward using simulators, including desktop simulators, in marine licensing was recommended by an internal USCG focus group to provide a competency-based, rather than knowledge-based, license process (Anderson et al., 1993). There is a growing belief that it might be possible to more accurately and completely assess a license applicant's ability to apply knowledge and skills for some tasks using desktop simulators rather than multiple-choice written examinations. If desktop simulators prove to be feasible, practical, and suitable with respect to training or licensing objectives for broad application in the professional development process, the reduced cost could potentially lead to wider availability of simulator-based training, performance evaluations, and licensing. Such a development would constitute a substantial change relative to the current practice of multiple-choice written examinations (ECO, 1987).
ASSESSMENT FACTORS

The following topics should be considered in determining the suitability of desktop simulations:

- the technical and instructional state of practice in microcomputer desktop simulators;
- the research and development basis for using microcomputer simulations in marine training and licensing;
- the possible use and applications of the instructional design process;
- the training potential of microcomputer desktop simulators, including the ability to produce user behavior that would occur during actual operations and the potential for developing and retaining knowledge and skills;
- the changes to marine licensing recommended by the internal USCG study group relative to microcomputer simulations (Anderson et al., 1993);
- the potential of microcomputer simulators for reinforcing skills between scheduled, structured training courses;
- the quality of learning in controlled and self-instruction training environments;
- simulator evaluation methods and their applicability;
- diagnostic capabilities of simulators;
- characterization of trainee populations, tasks, and functions for which microcomputer simulator applications may be suitable;
- suitability for direct or indirect support of actual operations;
- the need for simulator and simulation validation;
- the ability of the simulator to be user-friendly; and
- cost effectiveness.
GENERAL STATE OF PRACTICE OF MICROCOMPUTER DESKTOP SIMULATORS

Computer-assisted learning has been used for some time by a number of organizations that offer license-preparation courses. Available courseware includes tutorials to aid in the acquisition of knowledge and to practice responding to questions in the multiple-choice format.

There is a growing library of simulator software designed for marine applications. Desktop training simulations are commercially available for general navigation, radar navigation, piloting, shiphandling, maneuvering, automatic radar plotting aids, rules-of-the-road training, port entry, and the global maritime distress and safety system. Some of these software packages are already being used to some extent for training and for simulation of port entries in the classroom and aboard some ships; other packages are undergoing field evaluations. Because of computational requirements and presentation fidelity needs, the available software requires or works best with higher-level microprocessors and videographics array or super videographics array color monitors.

The technological capability also exists to emulate electronic navigation equipment, such as radars and automatic radar plotting aids, at modest cost using microcomputer hardware and software. Because such emulations are driven by software, there is flexibility for upgrades without changing hardware. In concept, the visual presentation on the monitor emulates that available from real equipment. Unless a functional mockup were to be used, however, the control configuration would not physically resemble the actual equipment or its controls. The absence of actual equipment and bridge configurations distinguishes desktop simulators from ship-bridge simulators.

Considerable advances have been made with respect to courseware (specially designed instructional software).
Courseware design is either traditional show-and-tell for instructor-centered use, or interactive, with the student having a direct link to the software. Either form can include still graphics as well as embedded video and simulations. Although application of these capabilities has been limited in marine transportation, there has been a recent, rapid proliferation of microcomputer systems configured to support multimedia applications. The principal multimedia feature of such systems is the CD-ROM. Hardware is quickly becoming a technological "nonissue."

Interactive courseware capabilities in instructional systems can include branching subroutines that are keyed to student responses. Diagnostics can be embedded in the program to provide additional instruction, matched to the student's level of knowledge acquisition, to facilitate the learning process. These systems can be set up to accommodate the student's rate of learning. Interactive courseware has been developed for various applications within the U.S. Department of Defense (DOD) and commercially for use in mariner training.

Interactive classrooms can be used to improve student retention through instruction using interactive courseware. Interactive classrooms may be in any of
several configurations. One form, which is instructor led, uses a button box or other simplified data-entry devices for student instruction and response. Another form includes a microcomputer workstation with a keyboard for each student. The instructor can electronically monitor student responses in either format. Diagnostics can be included in the software to assist in determining whether the student has responded correctly or appropriately to lesson material or questions and how long it takes a student to get to the correct responses (in the case of tutorial-based lessons). The U.S. Naval Academy has established an interactive classroom environment (Bush, 1993). Similar instructional systems have not yet appeared for mariner training in general.

MICROCOMPUTER TRAINING ENVIRONMENT

In contrast to the operation of most ships, which is normally conducted while standing, a basic microcomputer desktop simulator requires that an individual sit at a workstation. The person participating in a desktop simulation is in a simplified training environment compared to that aboard ship and in ship-bridge simulators. Placement of the trainee outside the normal bridge environment does not mean that desktop simulators are less effective for teaching and practice. One limitation of desktop simulators, however, is their inability to establish whether an individual can concurrently perform multiple tasks under the actual conditions found on a ship's bridge.

Interest is growing among marine research and manufacturing companies in the development of high-fidelity maneuvering simulations for automatic linkage into passage planning and execution. Several maneuvering simulations using bird's-eye views or simulated bridge window views, or a combination, are available.
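The branching and diagnostic capabilities described above for interactive courseware can be sketched as a short loop. This is an illustrative outline only: the lesson item, expected answer, remedial text, and log format below are all hypothetical, not drawn from any actual courseware product.

```python
import time

# One hypothetical lesson item: a question, its expected answer, and the
# remedial material the program branches to on an incorrect response.
LESSON = [
    {"q": "In a crossing situation, which vessel is the give-way vessel?",
     "a": "the vessel with the other on her starboard side",
     "remedial": "Review Rule 15 of the rules of the road before continuing."},
]

def run_lesson(lesson, ask=input, show=print):
    """Present each item, branch on the response, and log diagnostics.

    The log records, per item, whether the response was correct and how
    long the student took, which is the kind of data an instructor
    console could monitor electronically.
    """
    log = []
    for item in lesson:
        start = time.monotonic()
        answer = ask(item["q"] + " ").strip().lower()
        elapsed = time.monotonic() - start
        correct = answer == item["a"]
        log.append({"correct": correct, "seconds": elapsed})
        if not correct:  # branching subroutine keyed to the student response
            show(item["remedial"])
    return log
```

An interactive-classroom version of this sketch would aggregate such per-item logs from every workstation at the instructor console.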
Computer software is also used to some extent in expert systems aboard a small number of commercial ships with integrated bridge systems to assist in decision making and to control special maneuvers, such as constant-radius turns (NRC, 1994).

STANDARDS FOR SOFTWARE DEVELOPMENT

The development of microcomputer software for application in marine training and licensing is not guided by any industrywide technical performance or operational standards. Nevertheless, because of concerns over the possible misapplication of software instructional programs and concerns over liability, developers generally appear to be taking a very careful approach. Issues to be considered in the use of microcomputer simulators include:

- the possibility that desktop simulations might be treated as arcade games;
- the possibility that computer-assisted courseware might be used to "program" individuals to pass license examinations;
- the possibility that differences in the cue domain from actual conditions might create incorrect expectations about vessel maneuvering behavior or professional knowledge, skills, and proficiency; and
- the effectiveness of desktop simulators for evaluating human performance.

Development and use of standards might be adapted from other sectors. For example, DOD has established rigorous requirements to guide the development of interactive courseware to avoid redundancy and to conserve training development resources. Interactive courseware developed under contract to DOD must be recorded in a nationally accessible database. A search of this database is required as part of the development process for new interactive courseware.

RESEARCH, TECHNICAL, AND OPERATIONAL CONSIDERATIONS

General Research Basis

There are several research bases for using microcomputer simulators for marine training and licensing. These bases include general marine simulation research, U.S. Navy applied research in the development and use of interactive courseware and embedded simulations, and a small body of microcomputer-specific basic and applied research in the commercial marine transportation sector. Insights can also be obtained from human performance research in other sectors and potentially adapted to the marine setting (NRC, 1985; Hays and Singer, 1989).

The results of marine simulation research need to be carefully applied for several reasons:

- The research covers a period of great change and technological advances in both shipping and marine simulation.
- There is considerable variability in the simulators and methodologies used for research.
- The research basis consists of many experiments, a portion of which may not have been systematically confirmed by subsequent research across a representative range of simulators and research methodologies or through comparative analysis with actual operations.
Broad generalizations based on the existing research require considerable subjective interpretation, and the veracity of such generalizations must be treated cautiously. Another important factor is that most of the published literature focuses on the use of ship-bridge rather than microcomputer desktop simulators. There is currently a very limited basis for comparing the relative merits across this range of simulator capabilities with respect to stimulating mariner behavior in the same manner that mariners would perform operational functions and
tasks in real life. Nevertheless, there is a substantial literature base from which to seek lessons that might be useful to the application of microcomputer simulations in professional development and marine licensing (Douwsma, 1993).

Computer-Aided Operations Research Facility Studies

Extensive mariner performance research was conducted by the U.S. Maritime Administration (MarAd) and the USCG from the mid-1970s through the mid-1980s using the ship-bridge simulator at the Computer Aided Operations Research Facility (CAORF), Kings Point, New York. This research initially focused on developing a clearer understanding of factors that affect human performance. The research methodologies that were employed were affected, to some extent, by the fact that researchers were learning how to use ship-bridge simulators in research. The simulator itself was sophisticated for its time. As a result, before the experiments could proceed, mariners who participated in the research had to learn how to accomplish certain critical tasks, such as measuring distances, in the simulator environment. Although the results of these experiments are a useful starting point, they have not been systematically updated to reflect changes in operating conditions in the merchant fleets.

Important issues emerging from this research that need to be considered in the context of microcomputer simulators include:

- the adequacy of the visual scene and cues,
- the adequacy of the instrumentation cues,
- the relationship of cues to cognitive and motor skills,
- accuracy and fidelity requirements, and
- user indoctrination to the simulator-based training environment.

Mariner Licensing Device

The USCG has been searching for ways to improve rules-of-the-road testing for a decade.
The current multiple-choice, written examination format does not give the applicant the opportunity to demonstrate the ability to interpret dynamic maneuvering information within the context of other navigation and bridge operational activities associated with the level of the license being sought. The rules-of-the-road test currently uses static graphic representations of maneuvering situations, lights, and shapes. The examinee is expected to analyze the situation and select and apply the appropriate rule from a short list of multiple-choice answers. The examinee is not required to detect and identify vessels or to determine changes in range and relative bearing while maintaining a complete perspective on the navigational and maneuvering situation. An examinee's correct static interpretation of the range and bearing information provided suggests, but does not verify, that the individual can effectively collect,
interpret, and apply this information and integrate multiple sources of information during actual operations (ECO, 1987).

The USCG sponsored research and development of a prototype computer-based marine license testing device with embedded simulations. The goal was a device capable of authoring and administering examinations for rules of the road; recognition of lights, shapes, and signals; and visual and radar navigation (ECO, 1987). The testing objectives specified for the mariner licensing device are shown in Box G-1. Technical development specifications for the prototype are shown in Box G-2.

BOX G-1 Testing Objectives for Mariner License Testing Devices

- Applying appropriate rules of the road when in meeting, crossing, and overtaking situations under a variety of operational conditions.
- Applying the appropriate rules of the road when in special circumstances under a variety of operational conditions.
- Determining safe vessel speed under a variety of operational conditions.
- Shiphandling under various conditions of wind and current to:
  — hold course and speed to maintain a dead reckoning track,
  — avoid collision and pass other traffic at a safe distance,
  — maneuver safely in various left and right turns within confined channels,
  — maneuver after a loss or degradation of propulsion power or steering in confined channels, and
  — stop or slow.
- Shiphandling while executing emergency procedures.
- Using the whistle for maneuvering and warning signals under various operational situations.
- Using proper visual position-fixing techniques under various operational situations.
- Using proper radar navigation techniques under various operational situations.
- Recognizing and interpreting lights and shapes.
- Recognizing and interpreting sounds.
- Applying the appropriate rules of the road in restricted visibility.

SOURCE: ECO (1987).

BOX G-2 Development Criteria for U.S. Coast Guard License Testing Devices

- Provision of hardware and software capable of authoring and administering mariner license examinations.
- Visual, radar, and instrument display.
- Input controls other than keyboards for executing rudder, engine, radar, and whistle signal actions.
- Generation of visual and radar databases.
- A computer-based instructional tutorial system for examinee use of the device.
- "A flexible, inexpensive, and user-friendly examination authoring capability for continuous, dynamic simulation."
- "A fairly 'non-controversial' automated scoring system."
- A validation process involving scientific experiments with active mariners.
- System technical and application documentation.
- A comparative evaluation of the marine licensing testing device and the current licensing process.

SOURCE: ECO (1987).

The prototype system consisted of a modification of existing microcomputer ship-bridge simulators that had been previously developed for the U.S. Navy. The ship-bridge simulator featured a 90-degree video projection; ship hydrodynamic mathematical models to drive the simulation; a console with a helm, throttles, and instruments; and a radar simulation. Hardware modifications to meet specifications for the prototype testing device included the addition of visual bearing and "binocular-view" capabilities, whistle signal input controls, and a means for responding to multiple-choice questions (ECO, 1987).

The prototype was tested by senior licensed officers and operators of inland and coastal commercial vessels, and the results were compared to written examinations. The test found that pass or fail results were affected by experience with computers and that prior experience on simulators did not prove significant. The research found no statistically significant differences in answers to individual questions with respect to any of the following: years of actual experience, level of education or current license, recency of experience, familiarity with computers, or experience with ship simulators.
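The dynamic quantities that the static examination format leaves untested, such as changes in range and relative bearing to another vessel, reduce to simple geometry that a simulation-based testing device can compute continuously. The sketch below is illustrative only; the coordinate frame and positions are made up, and real devices work from full traffic scenarios rather than single fixes.

```python
import math

def range_and_relative_bearing(own_xy, own_heading_deg, target_xy):
    """Return range (m) and relative bearing (deg, clockwise from own bow).

    Positions are in a local east/north frame in metres; this flat-earth
    approximation is adequate over the short distances of a test scenario.
    """
    dx = target_xy[0] - own_xy[0]   # east offset to target
    dy = target_xy[1] - own_xy[1]   # north offset to target
    rng = math.hypot(dx, dy)
    true_bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return rng, (true_bearing - own_heading_deg) % 360.0

# A target 1,000 m due east of an own ship heading north lies abeam to starboard.
rng, rel = range_and_relative_bearing((0.0, 0.0), 0.0, (1000.0, 0.0))
print(f"range {rng:.0f} m, relative bearing {rel:.0f} deg")  # range 1000 m, relative bearing 90 deg
```

Sampling these values at successive time steps of a moving scenario gives exactly the changes in range and relative bearing that a static paper examination cannot present.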
The mean test score for written examinations was about 90 percent, while the mean score for the examination on the simulator was 20 to 25 percent lower, depending on the experience of the groups tested. About 70 percent of the mariners who participated in the testing program expressed the view that the simulation testing was superior to the current multiple-choice examinations in assessing actual capabilities (ECO, 1987).

An important result of the testing program was the indication that the ability of mariners to apply knowledge effectively, to maintain relatively complete situational awareness, to perform normal bridge functions, and to interpret all pertinent
information in this larger context appears to be somewhat less than suggested by tests of knowledge alone. The gap between knowledge and the application of knowledge indicated that more realistic means of measuring the ability to apply knowledge could improve the assessment of individual capabilities. Although the testing by itself did not improve the mariner's ability to close this gap, the success of the experiment suggests that the testing platform could be adapted to a training platform for this purpose. Test results also suggested that a more realistic appraisal of an applicant's competence in applying rules-of-the-road knowledge was possible using a relatively compact simulation capability.

Interactive Video for Pilotage Training

The feasibility of using interactive video for pilotage training was jointly researched by the Cleveland Cliffs Iron Company and MarAd. The objective of the research was to determine whether such training could be substituted for some of the round trips over the Detroit, St. Clair, and St. Mary's rivers pilotage routes normally required for a licensed master or mate to receive a pilot's endorsement on a USCG-issued license. A major motivation for the research was that an individual riding a vessel to observe the route had to remain on board for the entire voyage from the upper to lower lakes, a minimum of four days, to observe one trip over the pilotage routes.

Extensive planning was conducted to support the production of videotapes and scripts covering the pilotage routes. The videotaping covered various environmental conditions. A series of training tapes was made; each tape was about 60 minutes long and covered about 10 miles of pilotage route. Tapes were produced at the novice, intermediate, and mastery levels for each segment of the route. The novice level included narration, inserts and highlights of key landmarks and aids to navigation, and descriptions of alternatives for making turns.
Video overlays and taped questions requiring student response appeared at about 40-second intervals, and students were required to respond correctly to a minimum number of questions. At the intermediate level, the inserts and highlights were omitted and the audio narration was more limited; the question format was more difficult, and the number of allowable incorrect answers was reduced. At the mastery level, there was no narration and all questions were in the form of video overlays; an incorrect answer terminated the session, and the participant was advised to repeat the novice session for that segment. Test subjects were cadets from the Great Lakes Maritime Academy (Townley et al., 1985).

The research was considered successful, although the small sample size and other factors limited the results to general observations. The results were considered a valuable starting point for proving the concept. It was determined through the research (Townley et al., 1985) that interactive video "appears to be an extremely poor substitute for other training aids in the development of chart
sketching skills." It was determined, however, that "interactive video appears to be a cost-effective method of training pilotage candidates in preparation for ship-board observer time and as a means for reducing the trips necessary to master the waterway insofar as navigational (as versus shiphandling) skills are concerned." The interactive video format also appeared to be capable of imparting a fair degree of knowledge that was retained. The report recommended that further research be conducted to evaluate the potential of interactive video as a testing device for pilotage knowledge requirements. The report also identified and recommended a number of technical improvements for the production of interactive videos.

Maneuvering Simulations

Computer-generated imagery for microcomputer maneuvering simulations can take the form of simulated bridge window views, bird's-eye views on a maneuvering graphic or electronic chart, or a combination of both. Although development and use of a small-scale console is feasible, user instructions or reactions are normally entered via keyboard, mouse, or trackball pointing device rather than through an auxiliary console. Multiple monitors are feasible; however, a single computer monitor is normally used to convey all information. The user usually must switch between views for visuals (up to four scenes: ahead, port, starboard, and astern), radar emulations, electronic charts, and instrument displays and controls (often displayed concurrently with other graphics). Switching between views places the visual scene in front of the trainee rather than requiring the trainee to physically move to obtain this view, as would occur during actual operations.

Vessel behavior is driven by a mathematical model that incorporates hydrodynamic reactions between the vessel and its operating environment. Environmental data are usually entered manually, although for shipboard applications automated input of some data may be feasible.
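The mathematical models that drive such simulations range from full sets of hydrodynamic derivatives to simple response models. As an illustration of the latter (a sketch only, not any product's actual model), a first-order Nomoto steering equation can be integrated to produce a turning trajectory; the gain, time constant, speed, and rudder order below are all hypothetical values.

```python
import math

# First-order Nomoto steering model:  T * dr/dt + r = K * delta
# r = yaw rate (rad/s), delta = rudder angle (rad); K and T are ship-specific.
K, T = 0.05, 20.0   # hypothetical gain (1/s) and time constant (s)
SPEED = 6.0         # assumed constant speed through the water, m/s
DT = 1.0            # Euler integration step, s

def step(x, y, heading, r, delta):
    """Advance the ship state (position, heading, yaw rate) by one time step."""
    r += DT * (K * delta - r) / T          # yaw dynamics
    heading += DT * r                      # integrate yaw rate into heading
    x += DT * SPEED * math.sin(heading)    # east displacement
    y += DT * SPEED * math.cos(heading)    # north displacement
    return x, y, heading, r

# Hold 10 degrees of starboard rudder for five minutes from a straight course.
x, y, heading, r = 0.0, 0.0, 0.0, 0.0
for _ in range(300):
    x, y, heading, r = step(x, y, heading, r, math.radians(10.0))
print(f"after 300 s: heading {math.degrees(heading) % 360:.0f} deg, "
      f"yaw rate {math.degrees(r):.2f} deg/s")
```

The accuracy caveat in the text applies directly: however the trajectory is drawn on screen, its fidelity can be no better than the coefficients (here K and T) and the environmental data fed to the model.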
The accuracy of microcomputer maneuvering simulations is directly dependent on the accuracy of the trajectory predictions generated by the mathematical model and on the bathymetric and environmental data used.

The general configuration and format of basic microcomputer desktop simulators is such that the mariner does not receive visual or physical cues in the same manner as when aboard ship. The degree to which these differences may affect performance—including interpretation, decision making, and leadership—has not been systematically researched. There is some indication, however, that a mariner's performance in a ship-bridge simulator may vary to some extent from performance under actual conditions. If this is so for a ship-bridge simulator, which more closely replicates real-world conditions and cues, then it would be reasonable to suspect that similar effects would be associated with microcomputer maneuvering simulations.
A comparative analysis conducted by the Danish Maritime Institute (DMI) of real-life and simulated entries of a large passenger ferry to a new berth revealed that the swept-path plots of the simulated runs had closer tolerances than the real harbor entries. The report attributed this result to the variable starting points of the real-life tests caused by environmental conditions. The report also found that visual references were relied on more frequently in actual operations than in the simulator, where there was more frequent use of the simulator's instrumentation. Based on responses to inquiries by the test's participants, this result was attributed to "problems with the estimation of speed and distance in the simulator." The heavier reliance on instruments in the simulation may have been responsible to some extent for the tighter tolerances, although examination of this possibility was not reported as a research objective.

Depth perception is an important consideration, because mariners tend to rely more heavily on electronic navigation equipment under conditions where distance cannot be reliably estimated from visual cues. More attention to electronic positioning systems may or may not be appropriate for improving marine safety. Simulation-induced reliance on electronic navigation equipment would, under the conditions of normal daylight operations in unrestricted visibility, mean that mariners were performing somewhat differently than they might when aboard a vessel in identical conditions. The significance, if any, of differences between the actions evoked by the cue domain in a simulation and performance during actual operations is uncertain. It is also uncertain how human perception and cognitive responses might be affected by the microcomputer simulator format.
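The tolerance comparison in the DMI study rests on a simple statistic: the spread of each run's lateral (cross-track) offset from a reference track. The sketch below uses fabricated offsets purely for illustration; the DMI data are not reproduced here.

```python
import statistics

# Cross-track offsets (m) of each run from the planned track at one checkpoint.
# Fabricated values illustrating simulated runs clustering more tightly than
# real harbor entries, as the DMI comparison reported.
simulated_offsets = [-2.1, 0.4, 1.3, -0.8, 0.9]
real_offsets = [-6.5, 3.2, 5.1, -4.0, 2.7]

def spread(offsets):
    """Population standard deviation of cross-track offset, in metres."""
    return statistics.pstdev(offsets)

print(f"simulated spread: {spread(simulated_offsets):.1f} m")
print(f"real-entry spread: {spread(real_offsets):.1f} m")
```

A smaller spread means a tighter swept path; comparing the two numbers at several checkpoints along the track reproduces the kind of tolerance comparison the study reported.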
The normal use of a single, small screen to represent both visual scenes and instrumentation, and the limitations of graphic imagery in the single-monitor format, make it reasonable to presume that users of microcomputer maneuvering simulations with simulated bridge window views would experience depth perception problems similar to, or perhaps greater than, those observed in the DMI experiment. Such problems could possibly be mitigated with radar-like bird's-eye views, especially where integrated with electronic charting systems. The cue domain would still be altered from that associated with actual operations under unrestricted daylight operating conditions. The normal input devices, keyboards and pointing devices, are also different from those used aboard most ships. For training purposes, bridge-team and bridge-to-bridge interactions are artificial because they are usually conveyed by text messages on the screen rather than by actual interactions among individuals. Because the cues are different from those experienced during actual operations, different cognitive skills, or different levels of cognitive skills, would be exercised.

Users of microcomputer simulators may find it necessary to rely more heavily on instrument emulations for position keeping and maneuvering. The resulting trajectories and swept paths would represent what can be achieved using a
microcomputer simulation. How well the trajectories could be achieved during actual operations of all but the most sophisticated vessels on well-known routes is an open question. (It has been established through actual operations and extensive field testing involving passenger ferries serving ports in the Baltic Sea region that computers can be used to automatically maneuver ships on precise trajectories along well-known pilot routes [NRC, 1994].) Because of the apparent necessity to rely heavily on instruments for maneuvering decisions and the tighter swept paths obtained using simulation during the DMI research, better maneuvering accuracy may be possible during desktop simulations than during actual operations. There are no data to determine whether this is the case; there are no data or research to determine whether the results of ship-bridge and desktop simulators are comparable; and there are no data to determine whether different cognitive skills are used to achieve the results. In the absence of full-bridge instrumentation, accurate replication of essential visual information, and well-defined job-task criteria to guide assessments, there is a very limited scientific basis for ascertaining which tasks or individual skills might be evaluated in a desktop simulator or whether, or to what degree, the results could be correlated with actual operations.

The uncertainty over the results of desktop maneuvering simulations has implications for the application of this technology in passage planning. Although desktop simulators can potentially deliver accurate representations of maneuvering scenarios, there are uncertainties with respect to the degree to which the results represent vessel and mariner behavior in real life. Onboard maneuvering simulations would also be affected by the traffic conditions that exist at the time of passage, a factor that cannot be predicted for each individual transit.

REFERENCES

Anderson, D.B., T.L. Rice, R.G.
Ross, J.D. Pendergraft, C.D. Kakuska, D.F. Meyers, S.J. Szczepaniak, and P.A. Stutman. 1993. Licensing 2000 and Beyond. Washington, D.C.: Office of Marine Safety, Security, and Environmental Protection, U.S. Coast Guard.

Bush, B. 1993. U.S. Naval Academy, personal communication, November 8.

Douwsma, D.G. 1993. Background Paper: Shiphandling Simulation Training. Unpublished literature review prepared for the Committee on Ship-Bridge Simulation Training, National Research Council, Washington, D.C.

ECO (Engineering Computer Optecnomics). 1987. Mariner Licensing Device. Final report. Contract No. DTCG23-86-C-30029. Washington, D.C.: U.S. Coast Guard.

Hays, R.T., and M.J. Singer. 1989. Simulation Fidelity in Training System Design: Bridging the Gap Between Reality and Training. New York: Springer-Verlag.

NRC (National Research Council). 1985. Human Factors Aspects of Simulation. E.R. Jones, R.T. Hennessy, and S. Deutsch, eds. Working Group on Simulation, Committee on Human Factors, Commission on Behavioral and Social Sciences and Education. Washington, D.C.: National Academy Press.
NRC (National Research Council). 1994. Minding the Helm: Marine Navigation and Piloting. Committee on Advances in Navigation and Piloting, Marine Board. Washington, D.C.: National Academy Press.

Townley, J.L., A. Wilson, and M.L. Thompson. 1985. Interactive Video Pilotage Training, Deck Officers: Final Report. Report No. MA-RD-770-85014. Washington, D.C.: U.S. Maritime Administration.