After the final workshop of the series had concluded, the planning committee met virtually on October 9, 2020. To begin the meeting, each participant described themes that he or she observed during the workshop series and additional thoughts about U.S. Air Force (USAF) time cycles. The following discussions should not be interpreted as conclusions or recommendations; rather, they are reflections, insights, and ideas for future activities shared by planning committee members.
Ms. Deborah Westphal, chairman of the board, Toffler Associates, conveyed her initial thoughts about the major themes of the workshop series:
- Planning for future warfare
  - The USAF is at its best when it is mission focused.
  - Concepts of operations (CONOPS) could inform decisions on what capabilities we need.
  - There could be a focus on outcomes versus inputs.
  - Deeper thinking about speed and velocity, time, and temporal dimensions could lead to an understanding of how these things will change. Enemies may think about time differently; different systems work in different time dimensions.
- Total war/hybrid war/CONOPS for warfare
  - Grow expertise and partnerships to support thought leadership and strategy making.
  - Consider use of social media, manipulation/forensics.
  - Consider financial, health care, and war for talent.
- New mindsets
  - Human capital strategy to prepare for technology (not sure of need for technologists to support development).
  - Top cover/career path for thinkers or those who do not fit the model.
- Building trusted relationships inside and outside of the USAF
  - Living behind walls prohibits communication and creates adversarial relationships.
- Authorities/who has responsibilities to escalate
  - How long should those authorities be held?
  - In crisis, who has ownership of capabilities and decisions?
  - What do authorities really look like when technology enablers are pushing to the edge?
- How work will get done
  - What are the new processes needed for human–machine teams?
- Technology/What comes after the current set of technologies?
  - Is there a low-tech strategy?
  - What are the dependencies we are not thinking about?
Referring to the presentation from Dr. Matt Turek (see Chapter 5), Dr. Michael Yarymovych, president, Sarasota Space Associates, noted that artificial intelligence (AI) is now pervasive. Paraphrasing Vladimir Putin, he said that “he who owns the artificial intelligence owns the world.” The United States is facing a world of “fictional reality,” which leads to a new hybrid war. He predicted that machines would fight machines in the near future, which blurs the lines of responsibility.
Dr. Julie Ryan, chief executive officer, Wyndrose Technical Group, shared several prominent themes that she observed during the workshop series:
- Technology is not a panacea; it provides time advantages and time vulnerabilities, depending on how the technology is integrated into the fabric of the overall operational construct.
- Inserting technology in some situations (e.g., the underwater robot mentioned by Dr. David Mindell [see Chapter 2]) can radically transform how and how fast processes are done as well as enlarge the pool of knowledge transfer from a time and space perspective (e.g., involve more people over a larger geographic area in a faster time period).
- Collaboration is a function of bandwidth, time (latency), and human context.
- Building a talent pool, of either machines or humans, takes time. If there is a lag in a program, the talent pool can evaporate (under current economic systems) and must be reconstituted, which results in wasted time.
- The longer a technology is used, the more time the enemy has to study it and discover weaknesses in the technology or the use of the technology.
- Curating and nurturing talent is a direct function of time. Talent could be nurtured so that it can be applied at scale quickly.
- Talent is key: not just what they can do today but what they can and are enabled to do tomorrow.
- Time is a critical element of the information battlespace; preventing misinformation/disinformation and viral memes from spreading can be key to controlling or dominating the information high ground.
- A knowledge supply chain approach may be appropriate, in which knowledge is integrated in technologies or humans to best use the advantages each brings. Knowledge security consists of three attributes:
- Knowledge is available—the knowledge needed must exist.
- Knowledge is sufficient—the needed knowledge must be robust and trustworthy, rising to the levels needed by the users.
- Knowledge is accessible—the people, processes, and technologies that operate based on a set of knowledge must be able to obtain the knowledge.
- Deception is an attack in the “information battlespace.” Surprise is the result of effective deception.
- Decision processes can be a hindrance; formal orders and rules of engagement can impede the ability to act quickly. In other words, centralized control hampers quick reactions.
- Bureaucracy serves its own purposes, which inevitably leads to significant delays from formalizations. The real-time imperatives of war can be effective in cutting through the red tape, but that is hardly a good model. The job of bureaucrats is to dot the i's and cross the t's, while the job of the military is to deliver effects on target. There is an inherent contradiction between the two missions that causes time delays.
- The use of contractors can be a tarpit, trapping the customer into a cycle of dependency that robs the customer of the ability to innovate, rapidly or otherwise, and have internal resiliency under attack or stress.
- CONOPS can help focus the overall efforts of the enterprise. CONOPS address how we engage in the world, not just in isolated scenarios. CONOPS can also help identify the ideal state, which can help the enterprise change toward a future state.
- Defining terms is critical, as are metrics.
- The rate of change is an important element of time. That rate includes human perception (e.g., trust and trustworthiness). Access, control, and exploitation of knowledge fuel the ability of domains to converge to deliver effects.
- Time to plan versus time to execute is important.
- Trust is critical to efficient observe-orient-decide-act (OODA) loop execution.
- Speed of defense is as important as speed of offense.
- Humans process information differently than machines (humans by nature are visual data processors); it is important to provide information to humans in a way that their brains are hardwired to accept in order to speed knowledge transfer.
- It is also important to have data rather than opinions.
- The exponential production of information—some of which is very sketchy, some of which is plagiarized, and some of which is simply chaff—creates an environment where the ability to critically assess and judge information becomes a transformational skill.
- The value of representational knowledge (what is published, etc.) is decreasing even while the value of knowledge itself is increasing.
- This is a transition time from the industrial age, in which humans were cogs in the machinery of progress, to the post-industrial age, which may be the knowledge age. This may require changes to how humans are “employed.”
- In the pre-industrial era, “unemployment” did not exist because actual employment was both rare and reserved for the upper class; in the industrial era, to be employed was to be a member of society, while to be unemployed was to be a drag on the economy. A person sold his time to an employer, which led to “time and motion” studies and timecards.
- In the knowledge age, machines will perform the repetitive tasks and humans will imagine and innovate. Measuring output in this paradigm cannot be done by counting hours at the office. A different approach to the valuation of human work could be developed.
- The integration of humans with pseudo-humans (robots, AIs, etc.) is misunderstood. To simply state that humans have always adapted to technology probably misses an important part of the current era of technology transformation. In the past, technology automated the physical elements of human labor (plowing, lifting, sorting, etc.). Now technology is automating parts of human cognition. It may be that some types of personalities can handle this better than others. Furthermore, the teaming of humans with pseudo-humans may require a different approach to management. Think, for example, of a nurse (or an electrician, janitor, or soldier) who is teamed with a pseudo-human. These people are not data scientists, computer scientists, or AI engineers. They are the people who do the work that the AIs are ill-suited for or are legally barred from doing (assuming that this occurs in the future). We already know that designers have built significant biases into AIs (e.g., facial recognition); recognizing that the AIs have to work with humans may help with that issue as well.
- What type of human (psychology, talents, etc.) is most suited to routinely work with a pseudo-human?
- How can humans be trained/educated in how to participate in human, pseudo-human teams? What does that training look like? What do the levels of proficiency look like?
- How does a human form trust in a relationship (team) with a pseudo-human?
- How does a manager assess the effectiveness of a human, pseudo-human team?
- If there are problems in such a team, how does a manager figure out what to do? Scrap the pseudo-human, get a software upgrade, retrain the human, or add another human into the mix?
- How does an enterprise detect problems with a human, pseudo-human team? For example, how can the enterprise detect if the level of trust in the team is being reduced, either by the participants in the team or by the consumers of the team output?
- What does a career path look like for a human in an enterprise in which human, pseudo-human teaming is the norm?
- Culture matters.
- It is important to understand time-based outcomes and time-value propositions.
- What does “winning” mean, and in what time scale?
- The marginal time value of activities may be important to consider. For example, the marginal time for 1 million people to generate a raft of deceptive data is small compared to the marginal time required for an analysis staff to determine what is good and what is bad.
- The concept of a time salient in competition is intriguing. Mr. David Kaufman (see Chapter 6) presented an example of a time salient when describing disaster response. He said that there was a 72-hour salient, and after 72 hours, the adversary (naysayers, finger-pointers, media, etc.) would start to eat away at the edges of the salient.
Dr. William Powers, retired vice president of research, Ford Motor Company, focused on five ways for the USAF to adapt to shorter time cycles: (1) Emphasize the use of the Internet of Things for database communication and develop a plan for eliminating fight by phone; (2) recognize that microelectronics is the basis for speed (but be cautious about moving forward when abandoning trusted foundries); (3) hire an information intelligence czar; (4) reevaluate the emphasis on Cost Assessment and Program Evaluation—and start moving from cost to value; and (5) consider how to use high-level champions to speed up nominal processes and how to learn from the U.S. Army and the U.S. Navy.
Lt. Gen. Ted Bowlds (USAF, ret.), chief technology officer, IAI North America, presented three themes that he observed over the course of the workshop series: (1) The speed of technology capability development and the speed of integrating established technologies are enabled by a quality champion, not necessarily an organization. (2) The speed of adoption into operational usage is enabled by a quality champion and possibly a smaller organization. (3) The speed of shifting vectors can counter an evolving threat or respond to something that has happened in technology development. He reiterated that all three transformations are achievable only with senior-level support and emphasized that they entail changes in culture, not policy.
Dr. Daniel Hastings, department head, Department of Aeronautics and Astronautics, and professor of aeronautics and astronautics, Massachusetts Institute of Technology, said that although technology may not be a panacea, some technologies could change capabilities substantially (e.g., quantum sensing and computing). The combination of quantum and AI, in particular, leads to new capabilities and new flows of information. However, separating information from misinformation remains an important issue. As AI and automation become more prevalent, humans could work with the automation proactively and productively in order to operate more quickly overall. He reiterated that trust is critical and that champions make a difference.
Dr. Joseph “Jae” Engelbrecht, president and chief executive officer, Engelbrecht Associates, LLC, echoed the importance of transformational leadership. Operational leadership from airmen is particularly important in light of future capability developments that include the integration of machines and humans; both a tactical view and a strategic view would be advantageous for operators. In an environment with competition and cooperation among great powers, a better understanding of time is critical. It is important that the operators consider desired outcomes and guide the technology in that direction. He advocated for the USAF to commit to educating its people at the start of and throughout their careers to guide their development as operational leaders.
Dr. Engelbrecht detailed his personal insights about and observations from the workshop series, in relation to the planning committee’s discussions prior to the start of the first workshop (see Preface):
One member asserted that the United States had no grand strategy for China (or Russia, Iran, or North Korea).
- Our discussions confirmed no grand strategy, and it was not our role to imagine one.
- A May 2020 White House statement noted, “To respond to Beijing’s challenge, the Administration has adopted a competitive approach to the PRC, based on a clear-eyed assessment of the CCP’s intentions and actions, a reappraisal of the United States’ many strategic advantages and shortfalls, and a tolerance of greater bilateral friction. Our approach is not premised on determining a particular end state for China.”1
- Speakers acknowledged that the United States was in a competition with great power competitors, including China and Russia.
- Some speakers reported that the United States lost to China (or surrogate) in recent wargames. They suggested the acquisition of U.S. capabilities to attain a competitive edge.
- Participants were not clear about the meaning of a competitive edge. There was little discussion of what “winning” or any other outcome might be in a great power competition. One might interpret the desired outcome to be deterrence of war, but without a defined outcome, airmen are left to infer if their actions are contributing properly.
- Several speakers and participants highlighted specific capabilities and technologies that the United States could pursue quickly and lamented that the normal acquisition system may not be sufficient to attain crucial gains. Multiple participants described focused and more rapid acquisition processes that could help while they acknowledged the challenges of maintaining top cover and consistent funding for “special” acquisitions.
- Some discussions highlighted the challenge of a potential conflict with a peer competitor who was intricately involved in the global economy. For example, any competition with China must recognize the potential global consequences if the competition turned to conflict.
- Unlike the U.S. strategy to contain the Soviet Union, which could be isolated economically, the United States will be in a “coopetition” with China. China and the United States will seek cooperation in areas such as climate change and much of the economy. U.S. interests demand the reinforcement of human rights, free speech, self-determination, and global cooperation in areas from trade to standards. China may be reassured if U.S. interests eschew interference in nations’ domestic politics. At the same time, the United States may compete in trade, technologies, and capabilities. Each of these areas works on different time patterns. Understanding where cooperation and competition are appropriate may allow planners to fit those time dimensions into planning for operations.
- The linkages across the economies and across potential domains of conflict raise concerns that exceed the tactical and operational domains to which military thinkers and planners have become accustomed.
- Global finance and markets, Global Positioning System, supply chains, standards setting, climate, and health are examples of links and risks.
- China suffered intense humiliation from the Western treaty ports that followed the Opium Wars in the 1860s and the later Japanese invasion. Any intrusion into the homeland is likely to spark an intense response. This psychological precondition, coupled with a historical demand to preserve domestic order among millions and a current desire to preserve the legitimacy of leadership, serves as a caution to U.S. planners.
- Furthermore, the United States has not witnessed an invasion on its homeland for more than 200 years (attacks on Pearl Harbor and 9/11 shifted attitudes but did not directly threaten each citizen’s life); an attack affecting all Americans’ lives would likely shake citizens and alter attitudes and actions about a conflict.
- The use of nuclear weapons could be of high concern given these conditions.
Another member stressed that the USAF is not thinking in terms of total war and that the current USAF has lost the art of imagination when it comes to airpower.
- Discussions noted that talented USAF experts focused on Title X responsibilities to train, organize, and equip. They practiced planning and exercises at the tactical level. Wargames considered tactical and operational play but seldom engaged in strategic issues. Our discussion of the various dimensions of time in operations and strategic affairs would be unfamiliar to almost all USAF officers. Pockets of innovators or strategic thinkers were rare, isolated, only occasionally offered top cover, and almost never promoted beyond the level of colonel.

1 White House, 2020, “United States Strategic Approach to The People’s Republic of China,” May 20, https://www.whitehouse.gov/wp-content/uploads/2020/05/U.S.-Strategic-Approach-to-The-Peoples-Republic-of-China-Report-5.20.20.pdf.
- This suggests to some that the USAF would benefit from a concerted effort to develop strategic thinking and planning through multiple efforts—for example, the USAF could caveat its Title X roles to emphasize mission outcomes. Organize for what outcome? Train for what capability? Equip to achieve what outcome? The USAF could focus professional military education on operational art and strategic outcomes. The USAF could recognize officers for accomplishments in strategic thinking, innovation, and entrepreneurship. The USAF might include such recognition in professional military education, within commands, and even in performance reports.
- Many speakers and participants noted the value of linking operators with technical experts to increase focused innovation and integration. Many highlighted the value of developing trust and relationships over time. The USAF might consider how to build those relationships without resorting to formalized organizational structures or procedures that tend to lose focus and value as participants flow through.
A member suggested that the current USAF is not adequately thinking about new command authority structures.
- Speakers highlighted the significant acceleration in time that joint all-domain command and control and the applications of AI, autonomy, and other technologies offer.
- Participants’ discussions of the various dimensions of time in operations highlighted different cultural perspectives and implications of complexity, chaos, and uncertainty when dealing with peer competitors linked in a flattened global economy and environment.
- Linking C2 across domains seems to be a useful early step in managing operations from different time dimensions.
- The hypothesis that such developments demand greater attention to command and authority structures seems valid and deserves more development focused on being able to attain desirable outcomes.
- Several speakers noted the challenges of operating with confusing or misplaced authorities. Several offered examples when they went around those reluctant to give up authorities. And several mentioned dismissing doctrine, rules of engagement, and other constraints and instead using trust and relationships to achieve outcomes.
Discussions of the time dimensions that the USAF might consider in developing future CONOPS highlighted the variety of ways to think of time and the need to understand time from the perspective of competitors.
- Participants discussed time as linear, cyclic, accelerating in areas and slowing in others, and buffeted by notions of complexity, chaos, and uncertainty. They highlighted that time perspectives familiar to USAF officers were likely quite different from competitors with different cultural, historic, and psychological experiences. Some suggested that the USAF could invest energy and talent to better understand potential competitors’ time perspectives and horizons.
- Participants noted that the context influences what aspects of time are salient and relevant. Some believe that the education of USAF officers could develop and practice assessing the context of situations and the perspectives on time.
Several speakers noted the importance of an abundance mindset.
- For example, some reported that when communication was evolving from expensive hardware and software toward cheaper and abundant bandwidth, it was appropriate to focus capabilities to take advantage of cheaper bandwidth.
- Others spoke of abundance in terms of a mindset that encouraged a focus on the velocity of achieving outcomes over speed and a focus on deployment rather than discovery. This view suggested recognizing what was available as abundance and being creative about getting it in the hands of operators quickly. Advocates of this approach argued that it retained and deployed talent.
Speakers who described innovation examples almost always described activity outside normal processes.
- Several highlighted the value of senior officer support and access to money.
- Many emphasized an outcome or warfighter focus.
- All noted that relationships and trust were important.
- Several recognized that “street cred” went a long way.
- All noted that innovations aimed at outcomes shortened time (and saved money).
- Several offered organizational transformation tips that emphasized which members of the organization to communicate with and focus on and who to ignore.
Dr. Brendan Godfrey, visiting senior research scientist, University of Maryland, provided context for the USAF’s current situation with a summary of recent USAF events and documents (see Box G.1). He commented that the threat to democracy from the manipulation of social media is just one aspect of hybrid warfare. He remarked that the United States does not yet seem to be thinking in terms of “whole of nation” engagement in future conflicts. He reiterated the suggestion for the United States to have an “information office,” to which the USAF would contribute. A disappointing revelation from the workshop series, he continued, was that speakers continually discussed how the USAF has difficulty adopting new ideas, especially when there are no champions willing to exert heroic efforts to overcome the system. While that is clearly not the right way to do business, it continues to be the way that business is done in the USAF. He cautioned that if the USAF is not in command of the most advanced technologies, adversaries will be in command of the USAF. Thus, he stated that it is crucial to make difficult changes and choices to enable the adoption and use of these technologies. The USAF has a serious issue in that it tends to focus too much on hardware and not enough on people.
Gen. Gregory “Speedy” Martin (USAF, ret.), GS Martin Consulting, Inc., pointed out that there is a difference between being slow (i.e., unable to make a decision) and acting slowly (i.e., exerting thoughtful control to make a decision). He said that the USAF could spend time developing a vision for the future through the lens of the necessary speed of action. Next, it would consider the concepts of action before moving to the concepts of operations—taking this perspective, gaps are revealed and solution sets can be developed and prioritized, whether they are organizational, systemic, or related to training or collaboration.
Dr. Richard Hallion, senior adviser, Science and Technology Policy Institute, said that the power of future AI creates a situation in which basic assumptions about conflict come to the fore. He highlighted three important themes of the workshop series: (1) hybrid war, (2) human–machine interface, and (3) AI fidelity and security. Transformational leadership and human relationships are critical, and technology cannot be viewed as more important than people. Some technology developments will drive major transformations in warfare, and the rapid speed of execution could result in reactive decision making, which impacts the acceleration of the pace of war. Thus, he continued, this environment is challenging in terms of planning and execution. For example, AI created new definitions of what constitutes legitimate and illegitimate targets (e.g., the banking, health care, or food supply system could now be targeted).
Mr. David Markham, managing director, Waywest Advisors, expressed hope that the workshops would stimulate additional activities. He described Hon. James Geurts’s idea (see Chapter 4) of finding ways to defeat the capabilities of the adversary without engaging in a head-to-head technology competition as “imagination of other concepts.” Although the United States has won previous conflicts based on its ability to innovate and outproduce the competition, he does not think that is possible with China—the United States buys so much from China, and China’s production capacity in several areas is greater than that to which the United States has access. He suggested that a future workshop further explore the slow rate of engagement from China as well as the point at which it becomes military.
Lt. Gen. Bowlds wondered if the world is approaching an inflection point at which an adversary will adopt a technology more quickly than the United States, putting the United States at an operational disadvantage. While the United States is busy experimenting and adopting too slowly, other countries might already be using the technology. Dr. Hallion said that, in some respects, the United States has already reached that point. He added that the United States may already be in an AI war without having recognized it. Gen. Martin noted that although history may present the USAF as backward-thinking, its decision to create the new U.S. Space Force (upon recognizing that space is a warfighting domain) and change related policy is progressive. However, he explained that conducting operations based on the law, or the interpretation of the law, against an enemy that does not use that same framework is restrictive. Ms. Westphal expressed concern about the USAF’s stovepiped focus on known threats, while the technology revolution continues around the world. She was apprehensive that the USAF is expecting technology to create a deterministic outcome. Aligning organizationally, scaling, and pushing to the edge are also crucial, but she has observed little willingness to accept the disruptions that will accompany such changes. She said that the USAF would benefit from adopting a system-of-systems perspective and focusing on the human component. The USAF is structured like a pyramid; instead, she suggested that it be structured more like a pancake, with systems aligned on top of one another. Gen. Martin emphasized focusing on collaboration instead of on the interface (i.e., confronting instead of only recognizing the problem) as well as on options for the future. Ms. Westphal added that the future world is going to be very different; the USAF has to consider how to envision its place within it.
Dr. Rama Chellappa, Bloomberg Distinguished Professor, Departments of Electrical and Computer Engineering and Biomedical Engineering, Johns Hopkins University, emphasized the importance of workforce development by hiring the right young people with innovative technology ideas. The Department of Defense (DoD) is losing valuable talent owing to its salary structure. Classified information and controlled unclassified information also create too many restrictions for who can work on important problems. If the United States is concerned about AI, he proposed that the nation engage people with the expertise to manage it. While he expressed appreciation for privacy issues, he said that many things have to be unshackled to make progress. The United States is too constrained right now based on the limited data to which it has access.
Ms. Westphal asked if there is a science of human architectures, given that DoD tends to focus its architecture only on technology and products. Gen. Martin explained that three architectures used to exist: the technical, the systems, and the operational. The operational architecture would be fundamentally human-based, yet the USAF does not design such an architecture to enable more informed decision making. He proposed a fourth architecture, the integrational architecture, which could address that deficit, if done correctly. He added that the architecture that Mr. Preston Dunlap has created (see Chapter 4) could enable the level of speed and fusion of information needed to make decisions on the time scale of important activities. Ms. Westphal asserted her support for a separate operational architecture because the operational/human layer is the core of future warfare. Gen. Martin underscored that the solution sets have to consider the organizational, system, training, and human collaboration implications. He surmised that a ruthless leader may be needed to enforce change.
Ms. Westphal asked the workshop planning committee to discuss important issues on which the National Academies’ Air Force Studies Board (AFSB) could be advising the USAF. She suggested hybrid warfare, human architecture (i.e., command and control, leadership, man–machine interface, and teaming), and speed of technology and adoption. Gen. Martin posed a question about the standard definitions of hybrid warfare and total warfare. Ms. Westphal suspected that no standard definitions exist, because warfare continues to evolve and it is difficult to understand what it will look like in the future (e.g., it may include agriculture, health, social media, or the education system). Dr. Ryan explained that the term hybrid warfare emerged from debates within the cyber community about whether a cyberattack should be considered an act of war. Dr. Godfrey said that many adversaries are already attacking without bullets, leaving the United States flat-footed. He shared a definition from Wikipedia of hybrid warfare: “a military strategy [that] employs political warfare and blends conventional warfare, irregular warfare, and cyber warfare with other influencing methods, such as fake news, diplomacy, lawfare, and foreign electoral intervention.”2 He described it as a definition that encompasses too much; however, he agreed that hybrid warfare means using all resources at the disposal of the state in its conflict with other states. For example, China’s strategy is to take control of standard-setting organizations around the world so that the standards that are developed favor China. Dr. Hallion portrayed hybrid war as a type of warfare that could be applied across the spectrum of conflict; it does not necessarily imply total war, which has traditionally been defined as a war of national survival. He defined hybrid warfare as both kinetic and non-kinetic warfare, with diffuse targeting across a nation-state and its population, simultaneous and parallel operations (including an emphasis on disinformation and exploitation of media), and an emphasis on disrupting an opponent, potentially carried to the extreme of paralyzing that opponent. Ms. Westphal added that this can be done over several years, reinforcing the importance of time. She wondered if people are being manipulated to the point of self-destruction, and Dr. Hallion replied that that is part of the disinformation, deception, and exploitation of hybrid warfare. Lt. Gen. Bowlds alluded to a previous cyberattack by a foreign adversary—the United States did not retaliate because it did not consider the attack “an act of war.” Dr. Hallion added that the Pentagon was subject to 20,000 hacking attempts per day in the early 1990s, so one can only imagine the number of attempts today. He explained that it is important to devise a plan to determine which attempts are indicative of more organized, more systemic, and more sinister approaches that may be intended to degrade the functioning of the system to the point that other aspects of hybrid warfare can be brought to bear. Lt. Gen. Bowlds described these cyberattacks as intelligent probing to find a vulnerability and pre-positioning for a major attack. In a kinetic world, that is considered an act of war, whereas in the cyber world it is “the nature of the beast.”
Ms. Westphal suggested that AFSB could propose a study on hybrid warfare (e.g., the spectrum, how it escalates/de-escalates, and the future role of the USAF). Another area of interest could be the future role for people in the USAF (e.g., talent, leadership, and teaming). Dr. Godfrey advocated for narrowing the focus to a study on the human–machine interface. Dr. Yarymovych commented that legal and ethical issues emerge when machines are fighting machines: Who controls the machines, and who takes responsibility when the machines kill someone? He suggested that AFSB conduct a study on the human command and control of modern machines. Dr. Hastings proposed a study on how to motivate humans associated with the USAF to think creatively, broadly, and strategically. Lt. Gen. Bowlds noted that as machines have become more capable, the volume of inputs the human receives within a given time cycle surpasses the human capacity to understand them (i.e., information overload). He added that the fusion of data has to become automated so that a person can make an intelligent decision. However, with so much information, the machines may drive the decision loop instead of the humans, which could create a dangerous scenario. An important issue to explore is the point at which the human in the loop starts to become minimized owing to the inability to “keep up.”
Dr. Engelbrecht put forth a related question for exploration: When is it acceptable for the machines to formulate or make a decision, and when is it appropriate for humans to build in constraints to control the outcomes? Ms. Westphal emphasized the need to understand what the people in the USAF are doing with all of the technology that the service is building. Although this workshop series focused on time, if people are the slow part of the process and if machines become capable of identifying bias, what is the future role for humans? Dr. Ryan added that lawyers could become obsolete with AI—tasks performed by legal clerks and paralegals are already being automated. Dr. Powers wondered if this is the point that Lt. Gen. S. Clinton Hinote was making in terms of moving from human-in-the-loop to human-on-the-loop (see Chapter 5). If so, the important question is how the machine could do more while the human remains in control, but from a different vantage point. Dr. Ryan described this process as having a human over the loop—the human becomes the coach instead of the actor. Dr. Powers noted that AI provides a means for collaboration—in essence, a decision support system. Gen. Martin added that a new data structure would facilitate an AI-driven future. Dr. Godfrey advised that the first step toward achieving this vision is developing alternative future scenarios.
Dr. Engelbrecht expanded the first topic of hybrid warfare to include future hybrid competition: When does competition become conflict and war, and what is the role of warriors? He emphasized that the future itself does not matter; it is only a tool for anticipating the competition. Lt. Gen. Bowlds mentioned the importance of planning for synchronization; a combination of fast and slow processes could induce more chaos. Dr. Ryan suggested that an attack on U.S. data centers and communication lines would paralyze the entire nation; owing to its overreliance on bureaucracy, it would be unable to organize itself without a steady supply of data. Because future data will reside on commercial web services, she continued, it is important to have a minimal essential set of data that could be accessed quickly in a denial-of-data scenario. She added that AIs need a backup plan in case they run out of fresh data while under attack. Dr. Hallion said that the fidelity of data and the preservation of the data stream are important. Lt. Gen. Bowlds discussed the need to identify the most important data in the stream for protection. Dr. Engelbrecht suggested a structure that “graded” data. Gen. Martin pointed out that a data and communications architecture that satisfies the future needs of the USAF does not yet exist (despite plans for such a strategy as early as 2000). He reiterated that the USAF is not organized, and does not allocate money, in a way that would enable such an architecture as a centerpiece of operations. He added that DoD does not have an architecture that can provide the right kind of access to information and action; DoD also has a bureaucratic process that is slow to change in terms of the human component. Dr. Chellappa noted that inferring logical thinking or roles for data is worthy of further study because it would enable the integration of commanders’ intent into AI.
Lt. Gen. Bowlds expressed concern about the lack of a demand signal emerging from the experimentation in the USAF, which suggests that it could be difficult to overcome the valley of death. Gen. Martin agreed and noted that the demand signal could come from the enemy or be created by a good leader. The hope is that the Air Force Warfighting Integration Capability would identify gaps and opportunities, and the Advanced Battle Management System (ABMS) would prompt experiments to occur in a more coherent way. Lt. Gen. Bowlds reiterated that while technology is important, the demand signal has to emerge from an operational/threat perspective. Dr. Ryan commented on a possible transition plan for ABMS; it has to be moved in stages, with operational readiness tests to ensure that it works with operational data once it starts to be populated. Lt. Gen. Bowlds added that the upcoming Corona will choose which experiments to transition to programs of record. A program of record then allows people to be trained and the technology to be sustained.
Dr. Godfrey and Ms. Westphal expressed disappointment that National Academies’ activities are not well attended by representatives of the USAF. Ms. Ellen Chou, director of AFSB, National Academies of Sciences, Engineering, and Medicine, commented on the success of this workshop series; it was an impactful, fact-filled information session. The workshop series posed many questions and raised deeper thoughts that could be explored further in additional workshops and/or studies.