This final chapter captures the major issues that arose repeatedly during the workshop discussions. The most significant revolved around the question of how peacebuilders can use data gathered from sensors. The other issues concerned the existence and significance of a digital divide, the role of the private sector, and the need for unity among peacebuilding organizations.
Fred Tipson set the context for the discussion of shaping policies on the basis of data by noting that the peacebuilding community often lacks actionable strategies to convert sensing into shaping. Early warnings, for example, can help people get out of the way, whether or not they change the course of events. The focus needs to be on how to assist the people involved to avoid the worst consequences of potential deadly violence. A continual challenge, he said, is “to think about how to translate information into action.”
One need is to engage policymakers who are in a position to shape conflicts. Several workshop participants observed that the Arab Spring movement has not been as influential as many hoped because it has been unable to gain much political representation and engage political institutions. As Chris Spence observed, the situation has been in some ways analogous to the
Occupy Wall Street movement, a leaderless movement that has been largely ineffective in bringing about policy change, compared with the Tea Party movement, which has been able to engage political institutions.
Libbie Prescott, strategic advisor to the US Secretary of State on science and technology, noted that the subject of political will arose several times during the workshop. Not all policymakers are comfortable with data and methodologies, she observed, and the information gathered through sensing may not be self-evident to those who must summon the political will to act. Policymakers have preexisting agendas, and simply presenting them with data does not guarantee a response. Presentations may need to be adapted to the individual. “The same data will not convince [different] people of the same outcome regardless of how accurate the data is. I don’t know if there is a technological fix for that, but it’s something to keep in mind.”
Prescott added that political will depends on a combination of the perceived certainty of information, the perceived cost of action, and the perceived cost of inaction. Data measurement and transparency can strongly influence these perceptions. As Secretary Clinton has said, data not only measure progress but inspire it. “Providing data in these environments allows for better accountability and greater governance,” Prescott said.
Prescott also asked whether a society is better off being able to detect something if it has no ability to change that thing. Surveillance is useful when there is a clear way to act on the information gathered. When policymakers receive information, they typically want to know what to do next, and asking for more money to study the situation further is typically not a satisfactory answer. If specific recommendations for action are lacking, policymakers may distance themselves from those who put them in an awkward situation, she said.
Neil Levine, director of the Office of Conflict Management and Mitigation at USAID, elaborated on this point by observing that early warnings often present decision makers with the difficulty of uncertain information and high costs. Sensing can help by clarifying the certainty or uncertainty of the information. Also, to the extent that sensing provides information further in advance of the onset of violence, it broadens the choices for policymakers and often reduces the cost.
Also on the issue of political will, Sanjana Hattotuwa noted that an emerging information landscape will make it more difficult for policymakers not to act when presented with actionable information. Information about atrocities such as ongoing genocides will inevitably reach the rest of the world rather than staying in a particular region, as might have happened in
the past. Policymakers may still choose not to act, but not because of a lack of information.
Despite the rapid advances of technologies in recent years, several workshop participants wondered whether digital divides between individuals, groups, regions, and countries still limit progress in the application of technology to peacebuilding. As technologies have become less expensive and more widespread, concerns about creating a culture of information haves and have-nots have faded, Prabhakar Raghavan noted, although he recognized that digital divides have not completely disappeared. But Moore’s law, which holds that computing power roughly doubles every two years, promises that divides will continue to diminish as computing devices become cheaper and more powerful.
Lawrence Woocher wondered whether digital divides will persist as more advanced technologies appear. “Perhaps we shouldn’t assume that there’s going to be a convergence but just a continuing trajectory upward around the world, [with] different paces for different places.” Raghavan acknowledged that the divide may never completely disappear, but technologies no one thought would become global are becoming routine everywhere, even though they may not spread in their most advanced form.
Duncan Watts clarified that inequality in communications technology is substantially smaller than other forms of inequality, such as access to health care, clean water, transportation, or education, and may even help reduce some of these other forms of inequality. Innovation will almost always accrue first to the wealthier parts of the world, he said, but inequality is less striking in communications than in other areas.
The role of the private sector in both advancing technology and contributing to peacebuilding came up in several contexts. Hattotuwa expressed concern about the privatization of information, noting that he is more comfortable with information being held by the United Nations than by corporations or other private organizations. Even when corporations want to be helpful, they may use information in a manner that differs from the expectations of the people who provided it.
Rafal Rohozinski made the related point that how the Internet functions depends on the deliberate acts of individuals and institutions. A generation of individuals has been behind the institutions running the Internet for the past 25 years, and that generation is now retiring. In its place, commercial interests are starting to colonize those institutions, including companies outside the United States.
Raghavan countered that companies want to not only make a profit but continue to exist. That desire “is not well served by doing anything that’s tactically expedient and strategically evil.” Companies such as Twitter have tried to act in a responsible manner, while institutions like the United Nations do not necessarily have the infrastructure to undertake similar functions. “Hopefully the people running these companies aren’t going to compromise their long-term integrity for a quick buck.”
Private companies may, however, apply standards to the posting of information. Fred Tipson recalled a comment made at a meeting by a Syrian activist who said that he and his colleagues count on YouTube to document the atrocities of the regime and mobilize the Syrian people and the international community. But YouTube has standards about what it will and will not allow in video depictions of violence and cruelty, which can undermine this strategy. Similarly, Google makes decisions about what to make available in different countries and what not to make available. “How transparent should the process be by which Google makes decisions around those issues?” There are no authoritative standards for privacy, transparency, or responsibility. “I think Google is trying to behave as responsibly as they can. I know they usually require a legal standard before they will take down something…. But that still raises the question of whether or not these activists deserve transparency in allowing people to see how awful the behavior of the regime has been.”
Companies confront the same ethical difficulties as other holders of information. In modern and open societies, information almost inevitably comes out after the fact, heightening the tension between transparency and caution. If analyses of information generate serious concern, should that information be made public, even if it could cause a panic? As smartphones make it possible to identify at least the approximate locations of their users, will phone companies allow geographic information to be sent with text messages? These are among the many practical questions that need to be answered as technologies continue to diffuse throughout societies.
Finally, Rohozinski made the point that global Internet companies should be worried because the creation of digital borders in cyberspace
through economic or political means could unravel their business models. These companies, too, have a vital interest in peacebuilding and in the free flow of information across borders. But the international rules relating to government controls on the Internet are up for renegotiation at the upcoming World Conference on International Telecommunications (WCIT-12), December 3–14 in Dubai.
Rita Grossman-Vermaas, senior international policy advisor for Logos Technologies, spoke about the need for greater collaboration and coordination between the peacebuilding and technology communities. For example, the peacebuilding community could identify the nature of conflicts and become part of the process to determine what kinds of technologies might be applied most usefully to those conflicts, from text messaging to satellite imagery.
Hattotuwa approached this issue in a somewhat different way. Some groups in the peacebuilding community demonstrate a marked resistance to sharing information, he said, and some even rely on withholding information. “The assumption that the peacebuilders themselves are benevolent creatures working in the best interests of their communities and their nations and their peoples is, I think, something that we need to question, because that is not always the case.”
Melanie Greenberg called attention to the intersection of peacebuilding organizations with organizations focused on democracy, development, health, education, and other issues as a way of building unity. Many of these organizations increasingly see themselves as engaged in peacebuilding, she said, and even those that do not are sensitive to doing their work in such a way as not to exacerbate tensions.
Patrick Vinck similarly pointed to the need to develop collaborations between established organizations and new organizations that have emerged around specific technologies. He mentioned Human Rights Watch and Physicians for Human Rights, which have considerable expertise with consent forms that new organizations could use.
Noel Dickover called for efforts to bridge the gap between formal organizations and the volunteer technology community. People will show up at a crisis. The Red Cross now brings in technology volunteers just as it brings in volunteers for food distribution. Can other institutions take advantage of technology volunteers to build a situational awareness network?
In wrapping up the workshop, Woocher returned to his original observation that peacebuilding is very broad and encompasses many different activities. He noted that the workshop was most successful in generating practical ideas when participants considered specific applications of technology, such as election monitoring. One way to extend this success may be to move discussions into the field. An example, noted earlier in the day by Dickover, is technology camps, in which people go into a community and work with local actors to identify key issues and approaches to moving forward.
Tipson spoke more broadly of the need for groups to know what kinds of societal goals they wish to achieve. “To some extent the peacebuilding community talks too much about peace and not enough about the agendas that peace should be part of.” If an organization’s only objective is peace, someone who does not have that objective has a major advantage. Peacebuilders need a positive agenda that attracts new and different sets of players for whom nonviolence is a key objective. “That’s true in all of the peacebuilding problems that we’re looking at—there has to be a broader agenda for what change we want to see a society accomplish.”
As an example, Tipson pointed to the need to be more insistent about determining rules governing the Internet. Governments need to come together to develop “some kind of consensus around the way the Internet and these technologies surrounding it are going to be managed,” he said. The United States needs to be proactive in engaging with other countries to counter the efforts of the governments of China, Russia, and other countries to advance a more restrictive approach. As governments have gotten more sophisticated in their approaches to controlling communications, countries and groups that support liberalization need to become more sophisticated as well.
Technology can serve civil disobedience and civil mobilization, Tipson said, as a component of broader strategies for political change. It can help people organize and mobilize around particular goals. It can spread a vision of society that contests the visions of authoritarian regimes. And it can contribute to experiments in peacebuilding, such as better elections or formal “truth and reconciliation” processes.
Tipson urged the workshop participants to clearly identify peacebuilding problems and then ask how technology could help solve those problems. The problems may be related to conflict prevention, conflict management,
dispute resolution, postconflict reconciliation, or opposition to authoritarian regimes. Those involved in peacebuilding and technological development can benefit by working together to determine what capabilities would help in each of these settings, and how technology can help provide those capabilities.