For this workshop session, the steering committee had asked the panelists to consider a two-part question regarding Unmanned Aerial Systems (UASs) and the National Airspace System (NAS):
What are the minimum automation requirements in the joint GCS [ground-control system]-vehicle system to integrate UASs into the NAS and what challenges do they create for automation-based human performance issues that exist in legacy systems?
Ellen Bass (steering committee member) and Erik Theunissen (Netherlands Defence Academy) offered a joint presentation. Theunissen began by noting the drivers for automation when considering UAS integration into the NAS—specifically, how to determine the minimum amount of automation needed for both the air vehicle and the air-vehicle–GCS combination. He also discussed the trade space between required performance levels and the minimum amount of automation needed.
Bass noted that there is an urgent need for a methodology to systematically develop minimum automation requirements and evaluate their implementation. She explained that she and a colleague have developed such a methodology and submitted it to the Federal Aviation Administration; it comprises five steps: (1) develop an automation function allocation taxonomy, (2) perform task analysis, (3) categorize tasks by type, (4) generate function allocation rubrics for each task type, and (5) assign each task a recommended minimum function allocation strategy.1 Evaluating this methodology would be a topic for future work.
The joint presentation continued with a discussion of minimum automation requirements, which need to be considered in light of the context: the vehicle, the airspace, and the pilot. Other features pertaining to the automation were also discussed, including the performance of the algorithms, design constraints, performance boundaries, and degraded modes. In addition, Bass and Theunissen noted, a broader system perspective would include the performance of individual components and humans, the interplay of associated systems, communication constraints, and communication delays.
1 Pankok, Jr., C., and Bass, E.J. (2017). Appendix A—Function allocation literature review. In Pankok, Jr., C., Bass, E.J., Smith, P.J., Bridewell, J., Dolgov, I., Walker, J., Anderson, E., Concannon, R., Cline, P., and Spencer, A. (2017). A7—UAS Human Factors Control Station Design Standards (Plus Function Allocation, Training, and Visual Observer). Washington, DC: U.S. Department of Transportation, Federal Aviation Administration. Also see Appendix B. Available: http://www.assureuas.org/projects/deliverables/a7/Final%20Report%20Front%20Matter.pdf [February 2018].
The two presenters next considered the question of what is needed to determine a minimum required level of automation. They said the answer may include (1) a classification scheme, or taxonomy, of wide enough scope to capture the applicable types of automation and with sufficient resolution to distinguish among the possible levels; (2) a systematic breakdown of the system and design into components that are small enough to enable a single classification of the automation; (3) performance requirements; and (4) a structured overview of the options.
Bass explained that such an answer leads to the question of how the minimum can be determined. A critical first step is a combination of analysis using models of required information processing, consistency metrics (for automation across the system), and results from the application domain. One can also integrate findings from the theoretical and empirical literature, in combination with task analysis, to develop a hypothesis. She noted that, to help determine those minimum requirements, it would be helpful to pull together all available findings from engineering, psychology, and other domains and then turn to human factors experiments to close the gaps.
In response to Bass’s presentation, Jay Shively (National Aeronautics and Space Administration) offered some comments. He urged caution, for example, about asking pilots what they want in the cockpit and then saying “yes.” He continued: “If we do a task analysis and we look at what information is required to perform the task, and then we give them extraneous information, perhaps it may bias the way they do it, and it may affect performance negatively. [Any time] I’ve ever asked a pilot if he wanted [anything] in the cockpit the answer was always ‘yes.’ And so I’m just very cautious about that aspect of it.”
Bass responded: “You would be interested to know that one of the most contentious conversations we had was [whether one needs] to have automated landing capability in every UAS. I forget who brought up the last 50 feet, but if you lose [communication] link at that last bit, then it will be very difficult not to have auto-land. . . . And, similarly, there was a very contentious conversation about ground traffic displays around airports because of some of the visioning problems. It was interesting what you said about what the pilots think because it was really an issue of pride. Some of the pilots were like, well we can land it, it’s fine. And I’m like, well, what if there’s a gust of wind and you lose link right then, now you have to very quickly decide to go around, or you want to have an auto-land capability. So I think it’s very important if we do get subject-matter input, we are very careful about helping the people think through the contingencies, because they’re thinking about the best day, and we’re thinking about every day as potentially the worst day.”
Theunissen also responded to Shively’s comment about asking a pilot whether he wants something. “So DAA [detect and avoid] of course is intended to compensate for the fact that the pilot cannot look out of the window and do the visual separation task. So if he would be looking out the window and he sees an MQ-9 [a UAS] here that is supposed to be a problem, and he sees a King Air [airplane] over here, he would know that one is unmanned and the other one is manned. Now we take that away from him, put him behind the display, and say, well, you have two symbols, and by the way you cannot see how the wings move, et cetera, so you are a bit limited in your ability to anticipate. We give you some additional stuff to compensate and alerting, but still, we have taken one important element away from him, he doesn’t know this, whereas, in reality, he does. And so I would actually argue that in the MOPS [minimum operational performance standards] we may want to consider whether it has to be added or not. So I’m certainly not [in agreement], well, if pilots just say that they want it then they have to be careful. I think in reality they know it, and we would need a good reason to take it away from them.”