This chapter provides highlights of the workshop session in which members of the steering committee and other participants offered their views on what work is needed in relation to human-automation interaction considerations for integrating Unmanned Aerial Systems (UASs) into the National Airspace System (NAS). As noted earlier, the workshop did not look at the hobbyist level of UASs.
Although the need for use cases and vignettes was discussed at length during the workshop, several participants acknowledged that there are shortfalls in this area, particularly in the analysis of the effects of unintended, criminal, or unanticipated usage. How function-task allocation is performed needs to be informed by an analysis of what is possible. Because of the overwhelming complexity of UASs, discrete modeling and simulation are simply not feasible. Therefore, the creation and use of cases and vignettes provide valuable tools for workload-assessment analyses, appropriate automation and contingency planning, and unintended-consequence analyses.
Another point made by participants during the discussion is that models that capture human interaction with a UAS do not yet exist. To model the entire system accurately and simulate its activities, mental models need to be developed that capture the way humans interact with UASs. This is probably the biggest obstacle to a comprehensive model of UAS activities that fully captures the complexity of system behavior in all of its variations.
Some participants also suggested that research may be needed into the development of complex scenarios dealing with varying levels of automation and monitoring functions in order to flesh out this problem space. Results of this research would be particularly important in understanding the decision space in terms of safety and assurance designs, as well as certification. One participant pointed out that use cases should also include low-probability but high-impact events, such as the use of the systems by terrorist organizations and how the systems can be triaged and protected.
Participants noted the absence of any discussion at the workshop of the actual mechanisms of interaction between humans and the system elements. Emerging technologies in this area, including brain interfaces, spoken interfaces, and eye-tracking command interfaces, already exist, and participants pointed out that they were not addressed. Some participants suggested that understanding how these emerging interaction mechanisms are being implemented, or considered for implementation, would be important for analyzing the transition and its effects on future systems; the physical and mental capabilities of the human operators need to be considered in this context.
Several participants pointed out that, although the workshop clearly focused on automated systems and the use of automation, and occasionally touched on how automation can make things worse, it did not cover the analytical question of when automation makes things worse rather than better. This issue is tightly bound to system design for optimal use and interfaces. Participants noted that some automation is clearly useful and needed, as illustrated by the skills, rules, knowledge, expertise (SRKE) diagram (see Chapter 8), but exactly how much, in which roles, and in what form are open questions that warrant much more attention. Participants suggested that new models may be required to truly understand the trade spaces for a variety of situations and where in those trade spaces the best solutions lie. Furthermore, participants noted, some of the answers may not be amenable to pure analytics and may, in fact, be best found through experimentation.
One issue noted by several participants is that automating some elements of a system creates transition challenges: workload often rises in the near term and falls only after the other elements of the system, particularly the human elements, learn how to interact with the new capability. In fact, these participants said, increased skills, manning, vigilance, workload, and training are often required in the near term. Understanding these issues will ease the transition to the future system, and that understanding requires consideration and probably research.
Several participants pointed out that the economics of manufacturing and technology are making even highly sophisticated technologies available to ordinary people. The integration of consumer electronics, open-source software, and flight systems has the potential to greatly change the evolutionary path of UASs and probably needs more thought and research.
Throughout the presentations and discussions, concepts associated with systems engineering and the exploration of a range of possible architectures came up repeatedly. Participants said that the challenges of designing for predictability and contingencies were almost always discussed from the perspective of the interfaces between the elements of the system, particularly between the autonomous and non-autonomous elements. But, as one participant noted, the devil is in the details. In this vein, participants stressed the great value of use cases and vignettes for understanding the design elements and for planning for contingencies, undesirable uses, illegal modification of system capabilities, and other problematic scenarios that could wreak havoc if not thought through carefully. It is recognized that not all potential contingency situations can be predicted, so it is important that the development of use cases and vignettes be deliberately inclusive of low-probability events in order to truly explore the potential problem space.
Some of the participants described several elements of this issue: the need for function-task allocation design guidance, appropriate assignment of roles, the development of concepts of operations based on risk-based certification standards, and analysis of the actual need for and appropriateness of automation within the overall architecture. Several participants also raised the related issues of levels of automation, how much is enough, and when automation might cause more harm than good. Although there is as yet no firm answer, participants suggested that techniques for understanding the SRKE axis, methodologies for determining the minimum amount of automation desired, and experimentation represent thoughtful approaches for gaining insight. Understanding who—or which automated system—does what, and when, was repeatedly mentioned by participants as an important design feature. Some participants suggested that other important considerations are the types of decisions, the timeliness of actions, and cognitive pressure points.
To aid the discussion, Nancy Tippens (steering committee chair) proposed a list of potential themes that she had heard throughout the presentations and discussions, and she encouraged participants to comment on it. This discussion resulted in four overall themes.
Theme 1: Controllers Must Be Involved in the Solutions
Participants noted at several points that air traffic controllers are a critical part of any solution. Many participants shared the perspective that reflexive thinking about automation tends to focus on pilots and operators. Participants repeatedly pointed out that the roles of pilot, operator, and air traffic controller are increasingly integrated in UAS operation, and, as a result, some of the functions most critical to human-systems interaction design for automation are control-focused activities. The discussion on this topic emphasized that the role of the air traffic controller is incorporated, to an ever greater extent, in automated elements of actual flight systems. Thus, participants pointed out, considering the solution only from the perspective of a pilot or operator would ignore this reality.
Theme 2: The System Is Changing, Which Requires Planning
Participants said that the traditional roles of the controller, pilot, and operator are blurring or being subsumed into alternative structures. This change is important to consider from multiple perspectives, including certification, regulation, policy guidance, and the development of mental models that are shared by all the people in the system. The language is changing as well, participants noted, perhaps reflecting the changing mental models. Participants also pointed out that all of these issues have implications for the design of new systems, the design of future control structures, and the transition from the existing NAS environment to the future NAS environment.
Some participants also pointed out that there will likely be some messiness in the transition, which should be considered in order to reduce its impact. The messiness resulting from change can be minimized by robust planning activities, which require foresight, use-case analysis, and validation of the suitability of planned changes. The transition from current states to future states must, of course, be a critical part of this effort. Transition planning includes a plan not only for the actual system changes but also for the required training and testing of human participants so that the changes are gracefully integrated.
Theme 3: Training and Procedures Will Become Increasingly Important
Participants stressed the importance of training and procedures for integrated operations. Some also pointed out that designs focused on helping humans succeed will reduce the need for repetitive training and memorization and will contribute to overall excellence in systemwide performance. Participants also mentioned the need for procedures that focus on contingency handling and remediation. As part of this discussion, it was pointed out that a more rigorous approach to human-factors testing and evaluation may be needed. In particular, an evaluation process for industry to follow for human-systems interaction assessment may be a useful development effort.
Theme 4: Bad Guys, Surprises, and Unexpected Behaviors
A few workshop participants mentioned that illicit use of UASs in controlled airspace needs to be considered. They mentioned recent examples of problems, including hobbyist systems making incursions into controlled airspace, criminals using UASs to aid or execute illegal activities, and the use of UASs to violate social norms, such as privacy. Several participants offered their views that this problem was likely to get worse and that, because of the relatively small size of UASs, misuse could be extremely difficult to detect unless safety violations occurred. It was noted, however, that hobbyists' use of unmanned aerial vehicles was outside the scope of the workshop.
The workshop ended with comments from Jay Shively (National Aeronautics and Space Administration [NASA]) about next steps and maintaining continuity. He reported that a group working on multiple UAS operations (known as SARP [science and research panel]) will hold an industry day in June that will include representatives from government, industry, and academia. He suggested that NASA may try to bring people together to work on particular problems with a bounded design space, such as collision avoidance. Such an approach would focus on which way to go, and why. Shively asked participants to contact him about future work, and he said NASA may consider holding another workshop in about a year.