4

The Misuse of Technologies

Two presentations at the workshop addressed the use of technologies to repress political change, perpetuate conflict, or otherwise undermine peacebuilding agendas. Countries, organizations, and individual actors can have objectives that are at odds with those of peacebuilders. In response to the application of technology to peacebuilding, they can be expected to both counter those applications and use technologies for their own ends. Peacebuilders need to recognize these countervailing forces and plan and act accordingly if they are to make progress in reducing conflict and violence.

EXERTING CONTROL OVER INFORMATION

Ivan Sigal, executive director of Global Voices, which conveys to global audiences the voices of bloggers, writers, digital media activists, and translators who work in the developing world, began his examination of the misuses of technology by analyzing one of the two broad themes of the workshop: the means used to shape conflicts.

Conflict involves contestation, and those involved—including peacebuilders—have both intention and agency. Thus the activists represented by Global Voices have agency and seek to shape or influence their communities, as do their opponents in governments. Many of these activists use a collaborative and distributed form of knowledge to push ideas forward. To do this,



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




they need not only access to authority and power but also relationships in information networks that allow them to influence those networks.

In the Arab spring, maps of Twitter influence revealed important “nodes” in information networks. The individuals in question were not gatekeepers to authority and did not have exclusive access to resources, but they were good listeners and understood what kinds of skills could be of use to the communities they were addressing. For example, the activist who helped to overthrow the Ben Ali regime in Tunisia had been active in a distributed network for six or seven years testing different information strategies, including the use of big data tactics and distributed data to demonstrate why the regime was corrupt. A follow-on of WikiLeaks was Tunileaks, which led to a series of stories revealing the extent of Ben Ali’s corruption from the perspective of the US government. These stories validated the claims of the opposition and further drove the conflict.

Governments, whether oppressive or not, can react to technology-enabled peacebuilding through their own use of technology. They may try to control leaks or access to information (as described in the next section). Moreover, oppressive regimes appear to be learning from each other and collaborating in their use of technologies, Sigal noted—techniques used in Syria to conduct surveillance or filtering are almost identical to those used by Iran, and many countries in the Commonwealth of Independent States have very similar filtering systems that appear to be the result of collaboration.

Sigal also observed that countries have collaborated on Internet governance that would treat the Internet as media and therefore subject to state jurisdiction.
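The chapter does not say how the Twitter influence maps mentioned above were built. As a purely illustrative sketch (the handles and the in-degree heuristic are invented here, not drawn from the workshop), the simplest way such “nodes” surface is by ranking accounts by how often the rest of the network amplifies them:

```python
from collections import Counter

# Toy retweet edges: (retweeter, original_author). All handles are
# invented; real influence maps are built from millions of such
# interactions and use richer centrality measures.
retweets = [
    ("a", "hub"), ("b", "hub"), ("c", "hub"), ("d", "hub"),
    ("a", "celeb"), ("b", "celeb"),
    ("hub", "local_reporter"), ("c", "local_reporter"),
]

# In-degree (how often a user is amplified) is the crudest proxy for
# being an influential node in the information network.
amplified = Counter(author for _, author in retweets)

ranking = amplified.most_common()
print(ranking[0])  # "hub" comes out on top
```

Real analyses replace raw in-degree with measures such as betweenness or PageRank, but the underlying idea is the same: influence is a property of position in the network, not of formal authority.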
The model of a “territorialized Internet, one where telecommunication borders and national borders are congruent, is one that is broadly appealing” among countries that seek to control Internet use. The United States and other countries “don’t have a vision for what we want the Internet to be—they do.”

Sigal also described efforts by governments to use economic rather than political means to block Internet use. The government of Kazakhstan, for example, has been able to essentially create a national firewall without declaring one by incentivizing the largest telecommunications company in the country to provide free access to any kind of data, whether file sharing, music, or videos, while people who go outside the network pay for the data they access. “Suddenly going to Google…becomes a decision. Do I want to go to Google, or do I want to go to the one that I can get for free with KazakhTelecom?” While it may be easy to criticize China for erecting

a firewall around the country, it is more difficult to argue that the price KazakhTelecom charges for people to search Google is a travesty of choice.

At a deeper level, Sigal warned about the temptation to view Big Brother as a metaphor for the evolution of cyberspace. Such a view assumes that regimes are monolithic, but they usually are not; rather, they shift or split their alliances to achieve multiple and contrasting objectives. A better paradigm is Aldous Huxley’s Brave New World. “Given enough freedom, we surveil ourselves. It’s not that there’s a watcher who will control everything that we do. [It’s] us, especially in free societies.”

Policymakers can take several steps to help communities of activists and prodemocracy organizations oppose the actions of oppressive governments. For example, some projects funded by the State Department have helped provide anonymity for activists. Since many of the technologies that repressive regimes use to track, spy on, and otherwise monitor activists come from Western companies, export controls can clamp down on the distribution of these technologies. However, this approach is more difficult with nondemocratic countries that are nominally allies, and such controls do not affect unfriendly countries where some of these technologies are made.

Some technology companies are working actively, though quietly, with activists toward positive ends. Some have hotlines and mediation processes so that if a government attempts to take down a posting, a company can assert that it is in fact a piece of rights documentation. “I want to commend those companies,” said Sigal. “That kind of process that allows for some kind of clarification about what the political value of that material is has a lot of impact.” Companies that build surveillance and privacy tools also have the option of conducting human rights audits among their clients, a strategy backed by many freedom-of-expression advocates.

A critical aspect of interpreting the information generated by technologies, said Sigal, is the creation of a frame for analysis. A set of events can occur that will not necessarily predict an outcome but make it more likely. For that reason, Global Voices analyzes, translates, and aggregates local citizen media for global audiences, focusing mostly on the developing world, and systematically tracks threats or events in fragile states. “We can see these events occur, almost like a rhythm, within a set of 50 to 60 countries around the world. That’s reactive, but it gives us a policy framework for imagining where these events might occur.”

He also noted that peacebuilding is not the only framework for looking at sensing and emerging conflicts. People involved in conflicts do not necessarily see them in a negative light. Through a lens of justice, democracy

building, or other activist frames, the same sort of data can be applied to a different agenda. He urged questioning “the normative assumption that conflict is always necessarily a bad thing. Because there is, I think, more of a continuum often between conflict which is creative, conflict which drives change, and conflict which is violent and negative.”

THE NEW SOCIAL REALITIES OF CYBERSPACE

Cyberspace has created a new social reality, said Rafal Rohozinski, principal with the SecDev Group, and laws have not been well adapted to govern this new reality. The use of new technologies to either protect or deny rights has not been defined legally or normatively. The result can be strong disruptions and distortions in political systems depending on how those systems operate.

Rohozinski observed that Western governments to some extent exhibit what he called “the complacency of empire” with respect to information technologies. The Internet was invented, developed, and propagated around the world by the West. This technology, which has grown far beyond its original intended purpose, has created a platform for extending diplomacy through NGOs. The scale, scope, and reach of NGOs have expanded in ways that would not have been possible without the Internet, as have the business models of companies such as Google that were founded on the characteristics of the Internet. As a result, people in Western countries tend to take their freedom of navigation through cyberspace entirely for granted.

But the Internet is changing. The vast majority of Internet users are no longer in North America, which represents only about 13 percent of the global Internet population and is declining. Two-thirds of all global Internet users are under the age of 35, and 40 percent are under the age of 25. Three out of five new Internet users live in states that are considered either failed or at risk of fragility. “The center for innovation, the drive to create things in this space, the impetus to try to describe it in policy terms, is no longer in Washington, no longer in Ottawa, the UK, or anywhere else. It’s shifting slowly but distinctly to the South and to the East,” said Rohozinski.

This shift will have an impact on the governance of cyberspace, Rohozinski predicted. As people have come online, so have state interests and politics. This makes sense, said Rohozinski, since “a space that is colonized by a majority of your citizens is going to have all sorts of behaviors which, if those behaviors are translated into real life, would have real consequence.” Thus, cyberspace has become a place to be regulated and policed.

Because of the way the Internet is run, governments do not have the ability to create the equivalent of a physical border around their corner of cyberspace and keep their citizens inside it while keeping others away. But they have an interest in doing so. One possibility is that in the future the Internet will no longer be neutral but will be subject to national laws. This could legitimate filtration, censorship, surveillance, and other forms of control pertaining to media, defamation, and other acts. People may no longer have the freedom of passage through cyberspace to which they are accustomed. Instead, the Internet could become much more fragmented and more like national telecommunication spaces.

One enabler of this change is that the intelligence in the Internet has shifted from the periphery to the center. Today, telecommunications providers have much more control over the Internet than in the past because they carry much more data, through television, mobile telephony, radio, and other forms of content. As a result, these companies are now able to measure, monitor, parcel, and direct traffic in ways that they could not before. As these central controllers pass cell phone service from one tower to the next, they can identify and track the user of that service. This may not matter as much in the United States as it does in other countries, but under authoritarian regimes, governments now have a way to know a lot about any individual “by essentially having them carry a digital dog tag everywhere.”

Intelligent networks that enable this kind of monitoring are spreading fast outside North America. Advanced networks have greater penetration in some parts of Africa and Latin America than elsewhere, “which means those intelligent networks are being built exactly in the places where their capabilities can be turned inwards for surveillance purposes.” Surveillance also has become a much greater undertaking since the days of wiretaps.
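The “digital dog tag” is a byproduct of ordinary network operation: the log of which tower is serving a phone, kept for routing purposes, already amounts to a coarse movement trace. A minimal sketch, with hypothetical tower locations and handoff records:

```python
from datetime import datetime

# Hypothetical tower locations as (lat, lon); purely illustrative.
towers = {
    "T1": (51.50, -0.12),
    "T2": (51.51, -0.10),
    "T3": (51.53, -0.08),
}

# Handoff log for one subscriber: which tower served the phone, when.
# Carriers record entries like these as a side effect of routing
# calls and data; no extra sensors are needed.
handoffs = [
    ("2013-05-01T09:00", "T1"),
    ("2013-05-01T09:12", "T2"),
    ("2013-05-01T09:30", "T3"),
]

# The movement trace falls out of the log directly: the subscriber
# was near each serving tower at the logged time.
trace = [(datetime.fromisoformat(ts), towers[tid]) for ts, tid in handoffs]
for when, (lat, lon) in trace:
    print(when.isoformat(), lat, lon)
```

The point of the sketch is that no deliberate surveillance step appears anywhere in it; turning operational records into a tracking capability is purely a question of who can read the log.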
Furthermore, because the media environment is more complex, the kind of data that individuals generate through the systems to which they are connected is much richer. As a result, new players have entered that space, both in the United States and elsewhere, and these companies can break encryption in almost real time, in part because laws in the United States and elsewhere require networks to support domestic surveillance for law enforcement.

Governments have gotten much smarter about how to exercise their monopoly on the use of violence, force, and regulation not only within their physical borders but in cyberspace. National firewalls can prevent unwanted content from going into or out of a country. Countries suffer negative consequences from erecting such barriers, so probably only about 12 to 15 do,

said Rohozinski, but more could do the same if they chose to create a border around their cyberperimeter.

Countries may also make information resources unavailable when it serves their purpose, through denial-of-service attacks, targeted filtering, or intentional disruption of protocols to make sure that opposition websites do not load. They may implement regulations and legislation to criminalize some online acts; in Belarus, for example, defamation of the president can be a cybercrime. Under this provision, the government can charge an independent media source with defamation and either filter a website or take it down. And governments can apply media law to all media content, forcing media to register locally or be subject to arbitrary filtration by the government. Finally, various activities can be criminalized, so that communication with known criminals, for instance, can become a criminal offense.

A final approach is to use technological means to identify and target dissent and to confound readers about posted information. For example, the Iranian Revolutionary Guard cyber command has a Facebook-like page where it posts pictures of protesters online and asks people to crowdsource who they are, which has the additional effect of intimidating people who might be considering activism. In Syria, the regime uses a technique called “eggshelling” on Twitter. Eggshelling is a way for a regime to control discourse on the Internet by putting out messages with ambiguous registrations that appear to support the government’s official positions. “Nobody really quite knows what it is. Is it a rumor? Is it really government stuff? If it’s not, is it quasi-believable? The sheer volume of it ends up pushing to one side a lot of stuff that comes from the opposition, which is less connected.” In other cases, criminal gangs have been hired to harvest damaging information or spread malware.
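One common response to flooding tactics like eggshelling is to flag anonymous, newly registered accounts posting at high volume. As a toy illustration only (the account fields and thresholds below are invented, not an actual detection method discussed at the workshop):

```python
# Toy heuristic for flagging "eggshell"-style accounts: anonymous,
# newly registered profiles posting at high volume. Fields and
# thresholds are hypothetical, chosen only to make the idea concrete.
def eggshell_score(account):
    score = 0
    if account["default_avatar"]:
        score += 1  # no profile picture, i.e. the Twitter "egg"
    if account["age_days"] < 30:
        score += 1  # registered very recently
    if account["followers"] < 10:
        score += 1  # nobody follows it
    if account["posts_per_day"] > 50:
        score += 1  # posts far faster than a typical human
    return score

accounts = [
    {"name": "longtime_blogger", "default_avatar": False,
     "age_days": 2100, "followers": 5400, "posts_per_day": 6},
    {"name": "egg_1234", "default_avatar": True,
     "age_days": 3, "followers": 2, "posts_per_day": 180},
]

flagged = [a["name"] for a in accounts if eggshell_score(a) >= 3]
print(flagged)  # ['egg_1234']
```

As the passage notes, the tactic works through sheer volume, so even a crude filter like this mainly helps readers down-weight the flood rather than identify its source.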
Big data also can be misused; digitized census records or weapons registrations can be sold to third-party commercial entities that then sell them to risk security companies. “Although the initial collection of that data…may have been for a very worthy cause, the way that it’s actually put to use by others ends up being antithetical to the kind of security that it was supposed to create for the community.”

Some kinds of activism require a public presence, which often requires divulging identity, Rohozinski observed. Some people may be willing to risk jail because it legitimizes their actions and their movement. In other cases, activists may not use the Internet, may work through multiple virtual private networks, or may work through external relationships. But even then, security may be impossible. “I have a community of friends who are part of the core Russian opposition movement,” said Rohozinski, “and they have

decided as part of their core tactics that they will do everything absolutely in the open. They have public meetings, and if you are not willing to be completely transparent about who you are and what your intentions are you can’t show up, because they figured they can’t beat the Russian security.”

There is some good news, said Rohozinski. “The more authoritarian a regime is, the more they’re caught in their own trap.” Governments want the benefits of modernization without the liabilities, yet the two are not easy to separate. “They want it both ways and realize that they can’t have it. They want to be connected and benefit from being members of a global community where science is cheap, where supply chains are accessible, etc., but at the same time they don’t want the politics of it.” As a result, governments do not want to jettison or ignore systems that activists can use to get around government restrictions. “Their headlong rush into the modern world also ties their hands because of the dependencies that it creates for them internally as well as externally.” What may be necessary in such an environment is to counter disinformation, as in the days of the Cold War. “Cyberspace is going from being the exceptional domain to one that reflects the complexity of real life. So I’m an optimist,” Rohozinski said.

Rohozinski also recommended looking at the work done by the World Health Organization on violence mapping and prevention as a public health issue. This work has combined precursor indicators of violence, drawn from such measures as demographics, economic conditions, reports of homicide, and the prevalence of a grey economy to gauge the likelihood of conflict at different levels. For example, the introduction of policing in ungoverned spaces in Brazil has relied heavily on this public health approach of understanding the precursors of violence, including messages sent on social media.
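The public health approach described above amounts to combining precursor indicators into a composite measure of risk. A minimal sketch, with hypothetical indicators and weights (actual violence-mapping models are far more elaborate and empirically calibrated):

```python
# Sketch of a public-health-style risk index: combine normalized
# precursor indicators into one score. Indicator names, weights, and
# district values are all hypothetical.
weights = {
    "homicide_rate": 0.4,       # reported homicides, scaled to 0-1
    "youth_unemployment": 0.3,  # economic conditions / demographics
    "grey_economy_share": 0.2,  # prevalence of the grey economy
    "incitement_reports": 0.1,  # e.g., flagged social media messages
}

def risk_index(indicators):
    # Each indicator is pre-normalized to [0, 1]; a weighted average
    # therefore also lies in [0, 1], with higher meaning riskier.
    return sum(weights[k] * indicators[k] for k in weights)

district_a = {"homicide_rate": 0.8, "youth_unemployment": 0.6,
              "grey_economy_share": 0.7, "incitement_reports": 0.9}
district_b = {"homicide_rate": 0.1, "youth_unemployment": 0.3,
              "grey_economy_share": 0.2, "incitement_reports": 0.1}

print(round(risk_index(district_a), 2))  # 0.73
print(round(risk_index(district_b), 2))  # 0.18
```

The value of such an index is comparative, not predictive: as Sigal put it earlier, a set of events does not predict an outcome but makes it more likely, and a score like this helps direct attention to where precursors are accumulating.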
This is a slightly different approach to the application of technology, because it is more about raising awareness. This awareness has not necessarily translated itself into action by the peacebuilding community, “but it should be incorporated.”

Rohozinski observed that security services are starting to be seen as a necessity, not an option. The professionalization of the provision of security tools will happen through market forces, which will gradually displace efforts offered through government agencies or other sources. USIP and the NAE could contribute to this evolution, he said, by acting as a focus of innovation for peacebuilding activities in both the public and private sectors. Individuals and organizations that recognize the new reality will be the ones that survive, he said, so training is essential to ensure that they remain up to date about the tools they use.

The US State Department offers many

good programs that can help prepare civil society organizations for the environments in which they operate, he said. He also pointed to organizations—domestic and international, public and private—that offer technical advice. “I don’t know if anybody has done an inventory of them, but there’s quite a few and they’re actually pretty good.”

Rohozinski concluded that cyberspace is now a domain where conflicts will occur and need to be mediated. It is a space of maneuver, not one where people have freedom of navigation. It will need to be treated like physical terrain, and individuals and organizations will need the capacity to operate in it as they do in physical space.

DISCUSSION

Several participants discussed various negative applications of technologies. Dennis King reiterated the use of new technologies to spread misinformation, disinformation, rumors, and incitement. Once incorrect information goes viral, correcting mistaken ideas can be very difficult. The fact that regimes use the new technologies to target individuals and organizations is more apparent with the use of social media than in the past. “Individuals connected to NGOs who’ve been involved in promotion of governance and technology have been imprisoned, killed, and attacked, and their NGOs have been banned,” he said. “The humanitarian space is already dwindling and shrinking. This is another way that the bad guys, the dark side, can further use [technology] to shrink the humanitarian space and access, and target civilians and human rights activists.”

Sanjana Hattotuwa asked what would happen if 3D printers could be used to make exact digital duplicates of AK47 rifles. In this and other ways, technology could be used to exacerbate rather than prevent conflict.

Chris Spence cited the social component of misuse, beyond the technical issues. People are fooled into giving up their passwords, or they let their computers be taken over by malware. “No matter what we do, the humans who aren’t thinking about this every day are the ones who are the soft targets.” Although his staff rely heavily on training, even they remain a target.

Sigal said that security is a process and not an end state. It requires continual investments as well as attention to the tools used to protect security, which can be turned against their creators to erode security. Hackers in some countries have been able to reverse engineer security tools and thereby put people at risk. “We need a Google for security,” he said, “a company that sees a business model in providing” security services.

OCR for page 33
He added that the dark side/light side division, or skeptic versus utopian, is a misleading way of framing the issues. People accused of being utopians are often the most skeptical, because they have the practical experience of trying different things and realizing what works and what does not work.
