Members of the workshop steering committee and workshop participants reflected upon key themes that had arisen during the panel discussions, and identified a few additional concepts, as summarized below.
Susan Landau reiterated Jessica Staddon’s comment that the power of transparency should not be underestimated and also suggested that the accuracy of the information shared is critical to an organization’s credibility. She noted that transparency can be a challenge for the IC because many details of their operations are necessarily classified, suggesting that this makes it even more important for the IC to get the big picture right when communicating with the public. For example, she noted that inconsistency in the reported number of instances in which the use of metadata was critical for the IC had undermined the community’s credibility. Landau said that her experience with the Academies study on bulk collection of signals intelligence1 showed her how seriously the IC takes the rules around data collection, access, and use. She pointed out that this is generally not clear to those on the outside, and could be better communicated through deeper engagements with the academic community. While there are complexities associated with control of data flow and use, she suggested that better funding for privacy enforcement activities in general would be beneficial.
Tadayoshi Kohno pointed out that the absence (or perceived absence) of knowledge of IC activities among those in academia and the private sector made some of the privacy discussions challenging. He pondered the potential for developing “toy problems”—scenarios designed to embody some of the challenges that the IC faces at a classified level but without actually revealing classified or restricted information—to enable academic and industry researchers to better understand and make progress on some of the IC’s privacy challenges.
Kohno noted that, because time was limited, the workshop discussions of emerging technologies were not comprehensive. He pointed out that opportunities for discussing and learning about emerging technologies could also be found at other venues, such as the annual Consumer Electronics Show or through interactions with venture capital firms.
He reiterated Mark McGovern’s point that the audacity of new technologies makes it hard to anticipate their privacy implications. He pointed out that this is likely also true for any audacious new capabilities the IC might develop. He identified the recurring theme that a system’s privacy must be considered from conception to deployment, and take into account evolving uses and stakeholder needs. He identified the analogous need to continuously consider privacy throughout the IC’s activities, along with the public’s perception of this need. Finally, he pointed out the recurring theme of how difficult privacy is to define.
Frederick R. Chang expressed a hope that this workshop might serve as a tipping point and help seed important discussions, identify privacy challenges to be solved, and motivate solutions. He noted that privacy is a hard problem, confounded by human irrationality, uncertainty, inconsistency of research results, and a lack of resources for innovation. He suggested that advancement of a “science of privacy” might help to make progress, alluding to efforts under way within the IC. In particular, workshops could identify grand challenge problems; researchers could develop data sets to be shared; students could be encouraged to study this field. Chang suggested that some of the gaps between the participants from the IC and those from academia and the private sector may have narrowed today, and that additional meetings such as this one would provide even more added value.

1 National Research Council, 2015, Bulk Collection of Signals Intelligence: Technical Options, Washington, D.C.: The National Academies Press.
Helen Nissenbaum observed that the Snowden disclosures were an eye-opener for those working on privacy, in part because they clarified the importance of compliance with the law and internal policies, but also because the public’s strong reaction made it clear that people had actually expected more than just compliance.
She found that this meeting helped to start teasing out the intersection between technology, intelligence practices, and all of the values that are gathered together under the term privacy. She noted that the emerging technologies that define what the IC can do also define what everyone else can do, and that understanding how technologies change the world and the flow of information will enable everyone to be more intelligent and understanding about what should and should not be allowed to take place.
Fred H. Cate echoed and elaborated upon a number of points that arose in the panel discussions:
- The law alone is not enough to protect privacy, and it has become less sufficient over time. Industry has had to come to grips with this.
- Privacy is hard— but not impossible. Important (if imperfect) steps can be taken to make a difference.
- Transparency makes a difference. This is difficult for the IC, but, again, not impossible. The IC can articulate its values and its commitment to accountability, including a visible commitment to firing individuals who do not live up to those values.
- “Transparent” need not mean “public.” The use of advisory boards or councils can convey messages, values, and activities in a controlled way.
- Transparency also plays an important role in perception. If people feel comfortable with things they know you are doing, they are more likely to give you the benefit of the doubt on the things that are not made transparent. Many companies have recognized this.
- Rigorous data management is critical for protecting privacy. This requires great security, appropriate responses in the wake of a breach, and building accountability into the system.
- Creation of a value proposition can help build trust. Being clear and straightforward about the benefits provided by the existence or use of a given data set, tool, or authority is valuable.
Workshop participants discussed these reflections and ideas, reiterated recurring themes, and added a few final reflections. There was some discussion of the IC’s oversight and classification systems, with several participants suggesting a need for reform or improvements, in order to enable more transparency and assurance that privacy principles are being upheld.
Several participants discussed further the notions of an art of privacy or a science of privacy. One suggested that some sort of decision-making support tool, based upon fundamental principles, would be very helpful to organizations with limited resources for making privacy decisions, and wondered whether efforts toward a science of privacy framework might help to produce such a practical tool. Another participant recommended against the framing of an art or science of privacy, as it would seem to place the field of ethics, which provides a rigorous basis for deriving actionable principles, into the category of “art.” A third participant suggested that the notion of a science of privacy might not make sense because privacy is so context-dependent.
In response to the theme that privacy is difficult to define, one participant suggested that progress might be made by addressing individual facets of privacy and its associated values, such as the following:
- The right to be forgotten,
- The concept of freedom of thought,
- The concept of freedom from physical intrusion, and
- Avoidance of being “creepy.”2
A participant asked whether technology is upending our traditional notions of privacy. Another participant cautioned against thinking of technology as an independent force to which we must adapt, suggesting that society has a significant role—and responsibility—in shaping how technology is used.
Cate closed the workshop by thanking all participants, Academies staff, the workshop steering committee, and the panelists. He also thanked David Honey, director of science and technology, ODNI, Alexander W. Joel, and their colleagues from the IC for making the workshop possible. He suggested that many from academia and the private sector would be prepared to continue to engage on privacy with the intelligence community.
2 O. Tene and J. Polonetsky, 2015, A theory of creepy: Technology, privacy, and shifting social norms, Yale Journal of Law and Technology 16.1:2, http://digitalcommons.law.yale.edu/yjolt/vol16/iss1/2/.