3 Strategic Use of Information
Pages 17-28



From page 17...
... CYBER-ENABLED INFORMATION WARFARE AND INFLUENCE OPERATIONS
Lin framed his presentation with the question, "What would Hitler have been able to do with the Internet?" Information warfare and influence operations, he explained, denote the "deliberate use of information to confuse, mislead, and affect the choices and decisions" of an adversary.
From page 18...
... Knowledge, truth, and confidence are all damaged, he continued, when the adversary's decision process is infected with fear, anger, and uncertainty. Furthermore, he asserted, there are no noncombatants in information warfare and influence operations; everyone, including governments, universities, and news media, is a potential target.
From page 19...
... To illustrate, he pointed out that fluency bias and illusory truth bias allow simple, repeated messages to be received positively; confirmation bias causes people to seek out only information that already aligns with their beliefs; and emotional bias prevents people with strong emotional beliefs from considering rational arguments. Lin noted that some interest in policy changes has resulted from Russia's interference in the 2016 U.S. presidential election.
From page 20...
... However, Kerr noted, authoritarian states threatened by this method of discourse initiated regulations making it more difficult for citizens to communicate freely using the Internet. In response to such authoritarian clampdowns, political figures, such as Hillary Clinton, began publicly advocating for Internet freedom as a democratic freedom and individual right.
From page 21...
... Kerr highlighted two seemingly conflicting trends that shed light on this question. Pointing to Figure 3-2, she explained that, as general Internet penetration increased, the most authoritarian states (shown in red)
From page 22...
... Table 3-1 lists examples of these two different approaches to censorship. In Russia, for example, Kerr reported that new forms of control over the Internet and information flow included pressure on information technology businesses, improved surveillance capabilities, and the generation of new sorts of targeted content not readily identifiable as propaganda.
From page 23...
... The vulnerability of democratic systems to these new threats needs to be studied, she asserted, and the concept of soft power reexamined in the post–Cold War digital context. She suggested that authoritarian regimes may be more resilient than their democratic counterparts because they have been ad...

TABLE 3-1  Two Approaches to Internet Restriction

"First Generation"                 "Next Generation"
Site blocking                      Restrictive legal measures
Keyword filtering                  Informal takedown requests
Manual censorship of content       Regulation of private companies
Cellular or network shutdowns      Just-in-time blocking/DDoS attacks
Network traffic slowdowns          Patriotic hacking/trolling/blogging
"Walled garden" intranets          Targeted and mass surveillance
                                   Economic takeovers
                                   Trials/physical attacks

NOTE: DDoS = distributed denial of service.
From page 24...
... She suggested further that cyberconflict cannot be thoroughly understood through a framework of deterrence and military conflict, arguing that interdisciplinary work is needed that incorporates insights from media theory and from the study of state–society relations, public opinion, protest movements, government processes, and other relevant areas.

CYBER PERSISTENCE: RETHINKING SECURITY AND SEIZING THE STRATEGIC CYBER INITIATIVE
Harknett began by asking the audience to imagine that it was May 8, 1945, and World War II had just ended.
From page 25...
... He agreed with a statement by Vladimir Putin that whoever controls artificial intelligence will control the future of international dynamics, and noted that China has a grand strategy for translating cyber capabilities and artificial intelligence into strategic advantages, whereas the United States does not.

REMARKS FROM SUZANNE FRY
Fry began her remarks by observing that information has become such a part of the environment that it is like air or water.
From page 26...
... Fry observed further that, although information warfare and influence operations are not truly considered warfare in the traditional sense, when taken as a whole they represent attempts by another power to assert dominance. She posed the question, "How do we get our arms around the cumulative effects of this particular tool?"
From page 27...
... She suggested that research is needed to determine whether censorship inhibits the ability to accumulate wealth. Finally, returning to the topic of strategy, Fry suggested that it would be helpful to have "predictions about the patterns of geopolitical competition" and "how those strategies might evolve in the strategic environment."

DISCUSSION
An audience participant opened the panel discussion by suggesting that cyber issues may be one space in which alliances between state and nonstate actors are likely.
From page 28...
... With that analogy in mind, this participant asked whether the panelists thought there might be a way to link epidemiological methods to cyber issues. Lin responded that this idea has been advanced by other researchers and incorporated into models from various disciplines, such as environmental studies.

