2

The Technological Potential

Four presenters focused on the capabilities of new technologies in peacebuilding. The rapidly growing range and scope of applications point to tremendous potential, although the contributions of technology toward preventing and mitigating violence depend on both the specific application and the context.

THE TECHNOLOGICAL CAPABILITIES

Prabhakar Raghavan of Google described some of the many technological capabilities that are now available. For example, it is routine in many parts of the world to use the collective flow of information from smartphones on a highway to measure traffic; the information can then be conveyed back to individual drivers about the state of traffic and the time it will take to get somewhere. This approach of using a “swarm of sensors” has been completely mechanized and is no longer “deep” (futuristic) technology. Instead, creativity centers on the development of new applications for the technology. The variety of applications to which swarms of sensors could be applied was not foreseen ten years ago, Raghavan said. Indeed, people tend to overestimate what will be possible in one year but underestimate what will be possible in ten years.
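The traffic example can be sketched in a few lines: each phone in the swarm contributes an anonymous (segment, speed) report, and averaging the reports per road segment yields a live picture of traffic. This is a minimal illustration of the idea, not any particular system's API; the segment IDs and speeds below are invented.

```python
from collections import defaultdict

def segment_speeds(reports):
    """Average the speed reports for each road segment.

    Each report is a (segment_id, speed_kmh) pair from one anonymous
    phone; the crowd of phones acts as the "swarm of sensors."
    """
    totals = defaultdict(lambda: [0.0, 0])
    for segment, speed in reports:
        totals[segment][0] += speed
        totals[segment][1] += 1
    return {seg: s / n for seg, (s, n) in totals.items()}

# Hypothetical reports from three phones on two road segments.
reports = [("A1", 90.0), ("A1", 70.0), ("B2", 30.0)]
print(segment_speeds(reports))  # prints: {'A1': 80.0, 'B2': 30.0}
```

A real deployment would add time windows and outlier filtering, but the aggregation step is this simple.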

Another new trend is the remarkable power of machine learning. In the past, computer scientists tried to dissect every problem in minute detail,



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.


analyze it, and come up with the optimum solution, but over the past two decades they have made great progress using a different approach. Instead of analyzing problems, they feed large amounts of data into computers along with a machine learning algorithm. The computers then “learn” how to carry out actions based on their analysis of the data. For example, Andrew Ng and his colleagues at Stanford University have used this approach to teach an autonomous model helicopter how to fly patterns that no human pilot would ever fly.1 “In some sense, 200 years of wisdom in fluid dynamics and aeronautics got compressed simply by throwing a lot of data” at the problem, said Raghavan.

This approach is not universally applicable, but it has considerable promise. “This sort of machine learning and control has gotten us to the point where we almost have driverless cars on the road, and that’s a very exciting development if it can cut 30,000 road fatalities a year.”

The challenge is much greater for peacebuilding, Raghavan admitted. Once a machine learning program has seen 50 street corners, it has a pretty good model of what a street corner is. But machines will not perform as well after seeing 50 conflicts and trying to make inductive inferences about the 51st. Conflicts are far more detailed in their social and political underpinnings, so technological solutions can only go so far. Nevertheless, said Raghavan, “I’m a convert. I have tremendous faith in what machine learning is capable of accomplishing. There are times when you don’t have to get to the bottom of the detailed analysis. Machines can do things for you that are remarkably powerful.”

Raghavan also pointed out that most computer cycles are used not to compute but to communicate. In many emerging markets, many people do not have a car but they have a smartphone. In that sense, transportation is falling behind communication in the modern pyramid of human needs. People may not have 24-hour electricity, but they have enough to keep their phones charged. “There is something very powerful about that,” said Raghavan, and peacebuilding needs to tap into that development.

As technologies continue to develop and be applied in unanticipated ways, Raghavan suggested that pressure from the peacebuilding community directed at technology developers to apply these new technologies to the cause of peace could have tremendous benefits.

1  A video demonstration is available at http://heli.stanford.edu (May 14, 2013).
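The contrast Raghavan draws, fitting behavior from examples rather than deriving it analytically, can be illustrated with the simplest possible learner: a least-squares line fit, which recovers a relationship purely from observations. The data points are invented for illustration.

```python
def fit_line(xs, ys):
    """Least-squares fit of y ~ a*x + b, 'learned' purely from examples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Invented observations; the rule is inferred from the data alone,
# with no model of the underlying process.
a, b = fit_line([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.1, 7.9])
print(round(a, 2), round(b, 2))  # prints: 1.96 0.1
```

Modern machine learning replaces the straight line with models of millions of parameters, but the principle is the same: the data, not a hand-built analysis, determine the behavior.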

PERSPECTIVE FROM A SOCIAL SCIENTIST

Duncan Watts, a principal researcher with Microsoft Research, parsed the issues discussed at the workshop into three categories. In the first category is what he called the representation of ground truth, in which information is gathered and processed to yield a representation of what is happening. (Box 2-1 presents an example of such a representation.) What happens with that information can vary from good to bad, depending on who is using it.

Box 2-1
Sensing Conflict in Syria

As an example of the capabilities of new technologies, Rafal Rohozinski, principal with the SecDev Group, described a sensing exercise focused on Syria. Using social media analytics, his group has been able to identify the locations of ceasefire violations or regime deployments within 5 to 15 minutes of their occurrence. This information could then be passed to UN monitors and enable their swift response. In this way, rapid deductive cycles made possible through technology can contribute to rapid inductive cycles in which short-term predictions have meaningful results for actors on the ground.

Further analyses of these events and other data also made it possible to capture patterns not seen through social media analytics. For example, any time regime forces moved to a particular area, infrastructure such as communications, electricity, or water would degrade, partly because the forces turned off utilities, a normal practice, and partly because the movement of heavy equipment through urban areas caused electricity systems to go down. The electrical grid is connected to the Internet, so monitoring of Internet connections provided immediate warnings of force movements.

“These technologies are already quite powerful about being able to provide that kind of sensing,” said Rohozinski. However, there are ethical questions about whether gathering data at this level of granularity is consistent with international law, even for humanitarian actors. The collected data can become a risk to communities that humanitarian actors are trying to help. The shaping of conflicts can be countershaped by actors who pollute data streams to change the nature of the response. “It’s not an uncontested environment and we can’t simply see it as one that we own [either] from a technology or from a data point of view.”

The second category involves the ability to interpret a signal about what is happening to anticipate or predict what will happen. Technologically this

is no more difficult than the representation problem. But theoretically it is more difficult because it raises questions about what signals are informative.

The third category involves the facilitation of communication, resolution, and reconciliation. The technological problem in this category is comparatively simple, but the theoretical problem is immense. Giving people cell phones does not indicate whether things will change for the better—or worse.

Watts also classified the issues discussed at the workshop according to the audience to whom information is directed or the users of particular tools. External actors may be agencies, NGOs, and self-organizing communities focused on an issue or problem; internal actors include the local communities and people directly affected. The use of the information generated in any of the three categories above—representation, early warning, or communication and facilitation—is very different depending on which set of actors receives it. For example, early warning information bumps up against the problem of political will. Even if information indicates that something is going to happen, external agents may do nothing, or they may communicate information to a trusted network of internal actors. In the latter situation, internal actors need to worry about what to do with the information and what the likely consequences of that action might be. If a natural disaster is predicted, will a local population be better off or worse? Computer scientists refer to this kind of situation as the price of anarchy, where distributed decisions are not sorted by outcome. “Simply giving people more information doesn’t necessarily lead to a better outcome, although sometimes it does.”

Technical problems, such as building better real-time awareness tools, can yield an infusion of resources to produce better tools. But political and social problems, such as convincing a policymaker to take a particular action, tend to be harder to solve. Other such problems concern the coordination of responders who converge on a conflict zone to help, or the best ways to encourage local communities to resolve their conflicting agendas. An experimentalist approach to political and social problems, noted Watts, might be to instrument the world, conduct field experiments to gauge the impacts of different interventions, and measure the results. Such an approach, however, would be insufficient. The technology challenges may be seen as low-hanging fruit for the near term, while agendas for research could be laid out in other areas to work toward long-term solutions.

This way of looking at the issues prompts several questions, Watts noted. Are human analysts the best way to combine and analyze information, or can this sense making be better handled by machines? How can that capability

be tested? If human analysts are used, how should they be organized? What kinds of people are needed? How can their division of labor be established? “These are standard questions in industrial organization and organizational sociology,” said Watts, “and I think we have good answers to them, but this is certainly an interesting context in which to think about it.”

The most important question is what to do with information once it has been gathered. The answer is associated with a spectrum of social dynamics issues. Communities and nation-states are complex organizations with multiple scales and many things happening simultaneously. Even if someone has a good picture of what is happening at the moment, the ways to improve a situation are not necessarily obvious. Decisions will also depend on whether actions are to be taken by an external or internal actor. “I don’t have any answers to any of these questions,” said Watts. “But I wanted to emphasize that the technology is extremely exciting.” Many things are possible today that were not possible ten years ago. But it is an illusion, he said, to think that gathering more data and applying more processing power is going to lead inevitably to better outcomes without understanding how systems work.

BIG DATA FOR CONFLICT PREVENTION

The world’s population is generating and processing an immense quantity of digital information, observed Emmanuel Letouzé, a consultant for the United Nations and other international organizations and the author of UN Global Pulse’s white paper “Big Data for Development: Opportunities and Challenges.”2 He quoted a figure from the University of California that the world’s computers process about 10 zettabytes of information in a single year, the equivalent of 10 million million gigabytes. Furthermore, the number is increasing—“the growth is really ahead.”

“Big data” is not well defined, but it is often characterized in terms of three Vs: volume, variety, and velocity. The volume ranges from kilobytes to petabytes, the variety from ephemeral texts to archived records, and the velocity from real time to batch processing, but all three dimensions are relative and contextual, said Letouzé. Intent and capacity are the central factors affecting the application of technology, but how these play out exactly depends on the technology and the context in which it is applied.

2  The paper is available at www.unglobalpulse.org/sites/default/files/BigDataforDevelopment-UNGlobalPulseJune2012.pdf (May 14, 2013).
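The quoted volume figure is easy to sanity-check in decimal (SI) units, where a zettabyte is 10^21 bytes and a gigabyte is 10^9 bytes:

```python
ZETTABYTE = 10 ** 21  # bytes, decimal (SI) units
GIGABYTE = 10 ** 9    # bytes

annual_bytes = 10 * ZETTABYTE            # the quoted processing volume
in_gigabytes = annual_bytes // GIGABYTE  # 10**13
print(in_gigabytes == 10 * 10 ** 12)     # prints: True (10 million million)
```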

Global Pulse has defined four kinds of big data in its work on development. Data exhaust refers to the “passively collected transactional data from people’s use of digital services like mobile phones, purchases, web searches, etc.,” which create networked sensors of human behavior. Online information is “web content such as news media and social media interactions (e.g., blogs, Twitter), news articles, obituaries, e-commerce, job postings”; these data treat Web usage and content as sensors of human intent, sentiments, perceptions, and wants. Data from physical sensors include “satellite or infrared imagery of changing landscapes, traffic patterns, light emissions, urban development and topographic changes, etc.”—information derived from remote sensing of changes in human activity. And citizen-reported or crowdsourced data refers to “information actively produced or submitted by citizens through mobile phone–based surveys, hotlines, user-generated maps, etc.”; this information is critical for verification and feedback.

Global Pulse also has delineated three applications of big data. Early warning is “early detection of anomalies in how populations use digital devices and services,” which can enable faster response in times of crisis. Real-time awareness is the use of big data to produce “a fine-grained and current representation of reality,” which can inform the design and targeting of programs and policies. Real-time feedback is “the ability to monitor a population in real time,” making it possible to understand where policies and programs are failing and make necessary adjustments.

For the use of big data in conflict prevention, Letouzé distinguished between structural and operational efforts. The goal of the former is to understand the ecosystem while identifying the structural drivers of conflict. The goal of operational prevention is to detect and respond to anomalies through, for example, early warning and response systems. Big data can contribute to both forms of prevention, especially as data become more people centered, bottom up, and decentralized, said Letouzé.

Global Pulse, in partnership with several other organizations, has analyzed situations analogous to conflict prevention to get a sense of the potential for big data to serve peacebuilding. For example, it has looked at the sociopsychological effects of a spike in unemployment, as measured by online discussions, to seek proxy indicators of upcoming changes, just as the food price index has been a predictor of food riots. And the ability of tweets to anticipate the official influenza rate in the United States similarly demonstrates how big data might provide early warning of emerging events.

Mapping unstructured data generated by politically active users is further evidence of the potential of big data in conflict prevention, Letouzé said.
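The early-warning idea behind such proxy indicators, detecting when a digital signal departs sharply from its normal level, can be sketched with a simple deviation test. This is an illustrative toy rather than Global Pulse's method, and the daily counts are invented.

```python
from statistics import mean, stdev

def anomalies(counts, threshold=2.0):
    """Return indices of values that deviate from the series mean by
    more than `threshold` sample standard deviations."""
    m, s = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - m) > threshold * s]

# Invented daily counts of some proxy term; day 5 spikes sharply.
daily_mentions = [12, 14, 11, 13, 12, 40, 12]
print(anomalies(daily_mentions))  # prints: [5]
```

Production systems would model trends and seasonality rather than a flat mean, but the principle of flagging departures from a learned baseline is the same.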

For example, mining the social web during Iran’s postelection crisis in 2009 revealed some evidence for a shift from awareness and advocacy toward organization and mobilization and eventually action and reaction. Similarly, data visualization of the Iranian blogosphere has identified a dramatic increase in religiously oriented users, while a study of tweets associated with the Arab Spring found that, in 2010, socioeconomic terms (e.g., income, housing, and minimum wage) largely prevailed whereas in 2011, 88 percent of tweets mentioned “revolution,” “corruption,” “freedom,” and related terms.

The evidence, Letouzé explained, indicates that big data could help by providing digital “signatures” that can enhance understanding of human systems, along with digital “smoke signals” of anomalies for early warning and prevention.

However, big data also pose risks and challenges in conflict settings. (Chapter 4 discusses in detail the misuse of technology in conflict settings.) As Patrick Meier and Jennifer Leaning pointed out in 2009, information and communications technologies, including the use of big data, raise serious concerns about access and security because of the lack of economic development, the prevalence of oppressive regimes, and the increasingly hostile environment for humanitarian aid workers throughout the developing world.3

3  Patrick Meier and Jennifer Leaning. 2009. Applying Technology to Crisis Mapping and Early Warning in Humanitarian Settings. Cambridge, MA: Harvard Humanitarian Initiative.

In addition, the use of big data for conflict prevention faces many of the same challenges as its use for development, such as digital divides, lack of infrastructure and other resources, and political constraints. A related important challenge concerns the balance between access to data and protection of data producers. Reliability in conflict settings is another issue, especially when people have an incentive to “play the system” or suppress signals (e.g., by destroying cell towers). Though many people think that data are easy to access, in fact not all data are produced in easily accessible and storable forms, said Letouzé. Furthermore, in a conflict setting, the privacy challenge can become a security challenge.

But the biggest problem Letouzé identified is what he called arrogance or overconfidence. People have a tendency to believe that data mining invariably yields the truth. They may see patterns where none exist, confuse correlation and causation, not understand sampling techniques, be misled by sample bias, or lack sufficient computing capacities to appropriately interpret the data. Data scientists or econometricians often do not know the context in

which data are generated to be able to distinguish between a joke, an offhanded comment, or a real threat.

Big data can jeopardize the security and privacy of individuals and communities, and this risk may be greater in conflict zones, where it can create a new digital divide between and/or within communities and regions. At worst, big data could function as a sort of Big Brother for a world that is atheoretical, acontextual, and all automated, according to Letouzé.

Contextualization is key, especially when lives are on the line, Letouzé concluded. Big data should build on existing systems and knowledge and should be applied incrementally, iteratively, and over the long term, as a tool rather than a driver of change. Nevertheless, big data will continue to grow and develop and will likely eventually play a significant role in conflict prevention.

TECHNOLOGICAL CHALLENGES FOR PEACEBUILDING

Shortly before the workshop, USAID and Humanity United issued the Tech Challenge for Atrocity Prevention. Five key challenges in peacebuilding were presented at the workshop by Patrick Vinck, a research scientist at the Harvard School of Public Health and associate faculty with the Harvard Humanitarian Initiative (HHI).4 These challenges were:

1. Identification of uses of technology to deter enablers of violence—third parties such as multinational corporations and institutions that finance, arm, coordinate, or otherwise support perpetrators of violence.
2. Collection of evidence of sufficient quality to be used in court against the perpetrators.
3. Development of methodologies and indicators to assess vulnerability to inter- or intragroup violence.
4. Ability to communicate with and between conflict-affected communities and also the ability of affected communities to communicate with responders.
5. Development of simple, affordable, trainable, and scalable technologies to enable NGOs and human rights activists to gather or verify information from hard-to-access areas.

4  More information is available at www.thetechchallenge.org/#!enablers (May 14, 2013).
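The verification challenge often reduces to corroboration: trusting an incident report only when enough independent sources repeat it. A minimal sketch of that rule, with hypothetical incident and reporter IDs, might look like this.

```python
from collections import Counter

def corroborated(reports, min_sources=3):
    """Treat an incident as provisionally verified only when at least
    `min_sources` distinct reporters mention it."""
    # set() collapses duplicate (incident, reporter) pairs, so a single
    # reporter cannot corroborate an incident by repeating it.
    counts = Counter(incident for incident, reporter in set(reports))
    return [inc for inc, n in counts.items() if n >= min_sources]

# Hypothetical (incident_id, reporter_id) pairs.
reports = [("roadblock-7", "a"), ("roadblock-7", "b"), ("roadblock-7", "c"),
           ("fire-2", "a"), ("fire-2", "a"), ("fire-2", "b")]
print(corroborated(reports))  # prints: ['roadblock-7']
```

Real systems also weigh reporter reliability and spatial or temporal agreement, but independent-source counting is the usual starting point.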

The collection of information is a central component of these challenges, said Vinck. Significant progress has been made in mining information from new technological sources such as the Internet and social media. In a more active system, individuals in a community, whether volunteers or recruited for the task, would send information to a monitoring system. In particular, smartphones can be used to gather data more quickly, more accurately, and with better controls on where the information has been collected and when. An example from Eastern Congo is a project called Voix des Kivus, in which individuals were selected and trained to report information as it happened in the field. Another example of the adoption of a new technology is the use of satellite images to document the preparation of attacks, a step that has helped to democratize tools previously limited to military use.

These new technological tools hold promise, but there has been very little evaluation of their application, Vinck noted. And the evaluation that has been undertaken reveals a problem of linking information with responses. In the Central African Republic, for example, a system was set up to improve communication between affected communities in the Lord’s Resistance Army area with humanitarian groups. After six months, hundreds of messages had been received from the community, but no humanitarians indicated having responded directly to any of these messages, even though the system was supposed to be a two-way communication system. “They were gathering and collecting the information but they were not using it,” said Vinck. The same thing happened with the Voix des Kivus project: it was a success in collecting information, but no humanitarians indicated having responded directly to that information.

Vinck also pointed to a disconnect between the technologies discussed at the workshop and what is actually happening on the ground. In some places, less than a third of the population has access to a cell phone, and of that only a fraction may use text messaging. Text messaging may be common among the most educated people in the community but not, for example, among poor women, so the resulting information may be biased. Access to technology may vary by geography within a country, which may also distort the information provided. In some places even simple technologies like radios may not work because of a lack of electricity, equipment, or local capacity to fix equipment. Technology has great potential, Vinck said, but biased results may be detrimental to the situation on the ground.

The information collected by communities through technologies is also typically available to those communities, which therefore have a responsibility to respond to that information, according to Vinck. Responses are no

longer solely in the hands of international organizations or governments. With satellite imagery, for example, if credible evidence shows troops massing outside a village, the people of that village can respond; they may flee, or they may respond with violence.

Whoever compiles and provides information to a community has a responsibility for what happens with that information, which raises a host of ethical questions. What does information mean? How should it be interpreted? How should it be shared and with whom?

Finally, technology can bear witness to what has happened. Sensitive data need to be archived and protected, said Vinck. Many groups in the public and private sectors have collected large amounts of data, but there is no clear responsibility for storing the data.

DISCUSSION

Melanie Greenberg, president and CEO of the Alliance for Peacebuilding, called attention to issues associated with the sharing of data gathered using technologies (the subject of a previous NAE-USIP workshop that she cochaired5). There are particular ethical considerations associated with the sharing of data with the military, for example, as such sharing can affect the security of NGO personnel and their local partners.

5  The report, Using Data Sharing to Improve Coordination in Peacebuilding, was released in December 2012 and is available on the NAE website (www.nae.edu/66866.aspx) (May 14, 2013).

Matt Levinger, director of the National Security Studies Program at George Washington University, said his experience as an early warning analyst made him a skeptic about early warnings in general. “It’s hard to predict the future…any number of potential futures are possible.” A better approach, he said, is early detection and adaptive response. In his work on conflict analysis, he thinks of actors as either dividers (potential sources of polarization and conflict) or connectors (potential sources of cohesion). Generally speaking, peacebuilding involves trying to identify and mitigate the effects of the dividers and trying to identify and bolster the connectors. A key question, then, is, Where do technologies have the potential to make new kinds of connections and boost resilience? “If we start thinking about what information do we need and go from there, we will be in a much better place than if we ask what information the technology allows us to obtain.”

Robert Loftis, a consultant and former State Department official responsible for conflict stabilization, discussed the need to separate sensing

and shaping. Sensing essentially involves reacting to something that is happening. But most conflicts are not surprises, even though their timing may not be known for sure. Sensing technologies can direct humanitarian aid, but, unlike shaping, they do not necessarily change the conflict. (Chapter 5 addresses the path from sensing to shaping.) The question, then, is whether the use of technologies can, in fact, prevent a conflict. Can they be used to help resolve land tenure disputes or differences over water rights before these become violent conflicts? This more anticipatory and active approach involves the dissemination and use of information to reduce differences among people and groups.

Joseph Bock, director of global health training for the Eck Institute for Global Health at the University of Notre Dame, wondered whether some aspects of big data might be overly hyped. Flashpoints are often single precipitating events, not related to complex pattern analysis, and understanding them may be more important than analyzing big data. Still, he said, the latter could be immensely useful in tracking sentiment through media and communications, which today is a labor-intensive task. Combined with the use of sensors to detect conversations, big data could be “incredibly powerful,” though there is also a risk of being massively intrusive.

Fred Tipson called attention to the opportunities provided by technologies that promote collaboration. Peacebuilding is built on interactions among individuals and groups, and technology platforms can facilitate these interactions and broaden the range and effectiveness of the actors involved.
