Susan Landau, panel moderator and professor of cybersecurity policy at Worcester Polytechnic Institute, launched the session by observing that views around the privacy significance of metadata have changed in the past few years; people have become more aware that metadata can actually enable the extraction of sensitive information. She provided an example, noting that Google captures information about how users swipe their Android phones. Analysis of this metadata can inform improvements to the product’s design, but it could also be used to infer mood.
She observed that the many potential privacy implications of metadata illustrate the challenges posed by emerging technologies. Landau, as moderator, then introduced the following panelists and gave each of them 5 minutes for opening comments:
- Fuming Shih, senior product manager, Oracle Cloud;
- Tao Zhang, distinguished engineer, Cisco Systems;
- Mark McGovern, CEO, Mobile System 7; and
- Lee Tien, senior staff attorney and Adams Chair for Internet Rights, Electronic Frontier Foundation.
Fuming Shih presented some results from his research on user privacy preferences and behaviors with smart phones, including the factors that affect individual preferences for information disclosure to mobile apps, and on ways of making privacy conflicts more visible to users.
Shih introduced the concept of privacy “tipping points”—individual user experiences or thresholds that precipitate a shift in thinking about a given technology and can alter user privacy preferences and behaviors. He cited his own experience with the Google Now feature on an Android smart phone, in which the phone notified him, unprompted, that it was time to head home for the day, having “learned” his commuting habits by tracking his past behaviors. This experience caused him to rethink the amount of data he was comfortable sharing.
Shih then summarized several useful themes from his research findings about individuals’ behaviors around privacy.
- Context matters. The details of a given circumstance are critical to understanding why people disclose—or do not disclose—their information.
- Privacy is complicated. Shih found that reasons for disclosing—or not disclosing—information vary between individuals. He suggested that understanding why people disclose is more important than knowing what they disclose.
Tao Zhang started by noting that there are currently approximately 12 billion networked “things” worldwide. Analysis by Cisco and others suggests that there will be more than 50 billion devices connected to the Internet by 2020, in industrial systems, manufacturing plants, public services, connected vehicles, and consumer devices. There is even interest in embedding passive or active communication devices into people’s clothing.
Zhang expects that the world will become connected beyond what many of us can imagine today. This means that whenever we (or our devices) communicate, we will leave traces of ourselves behind—whether knowingly or unknowingly. Information will be left in many places, providing new opportunities for collection, and making it more difficult to enforce data collection and use policies.
He noted that people are becoming increasingly aware of the privacy implications of these technologies; they want to do something about it, but run into the challenge of competing requirements. For example, privacy and security can sometimes conflict; providing privacy on a network means hiding a user’s identity—or concealing their traces—which enables malicious actors to hide as well.
Mark McGovern began by making an analogy between security and privacy, noting that both can be difficult to define, and can mean different things to different stakeholders. He went on to discuss potential privacy implications of security technologies. In particular, enterprises are increasingly relying on mobile devices and storing data in the cloud. The increased number of access points creates uncertainty about who is using systems, increasing the need to monitor user activity. This trend could have the effect of tracking authorized users in a way they would not appreciate.
McGovern described the importance of balancing the needs of customers, investors, analysts, and team members when taking steps to improve security; security tools are often built before rules of acceptable practice have been established, necessitating a proactive approach to product support and build-out based upon the needs of investors, customers, users, and the public. He suggested that similar considerations likely apply to privacy as well.
Lee Tien addressed three main points: (1) the nature of privacy, (2) privacy as a social product, and (3) trust and transparency.
First, he identified privacy as an “essentially contested concept,”1 and suggested that it—not unlike beauty or justice—is not truly capable of being defined. He suggested that it is in fact a constitutive feature of privacy that its meaning must be debated and developed by society—it cannot simply be prescribed by a determined formula. He pointed out that, consistent with this picture, the Constitution does not say much about privacy directly, though privacy interests have been read into the First and Fourth Amendments.
He suggested that there are many different conceptions of privacy. Within the legal field, conceptions relate to the First, Fourth, and Fifth Amendments, associational privacy, and decisions in cases such as National Association for the Advancement of Colored People v. Alabama. Tien stated that all of these ideas are relevant to privacy discussions both within and outside the IC, but pointed out that people have many different worldviews and perspectives.
Tien then identified the notion of privacy as a social product, referencing the ideas of sociologists such as Howard Becker and Erving Goffman. He suggested that privacy is something that people create over time, based upon experiences, with the resources available. For example, an individual may create privacy by putting a letter into an envelope, or by going into a phone booth and closing the door. He identified encryption as a contemporary resource for the social production of privacy in a computerized world. He suggested that the use of a resource is necessary: Without the ability to mark a boundary, it is hard to achieve or demonstrate an expectation of privacy.
1 W. Bryce Gallie, 1955, Essentially contested concepts, Proceedings of the Aristotelian Society, High Wycombe, U.K.: Harrison & Sons.
Landau led the panel in discussion of privacy implications of some emerging technologies, as summarized below.
Landau asked the panelists to comment on the privacy implications of technologies such as the Internet of Things, mobile devices, and notifications.
McGovern pointed out that emerging technologies such as the Internet of Things are changing rapidly. As consumers and developers learn more about these technologies, their privacy expectations will change. The significant lag between the development and adoption of new technologies poses a large risk that developers will fail to anticipate privacy implications and user expectations around these technologies.
Zhang noted that concern has risen around the potential for correlating data collected from different streams (as in the Internet of Things) to identify information about a given user. Such concerns have generally prompted users to want more control over their data and have helped spur the development of enhanced user control mechanisms, such as user options for encryption and data storage schemes.
Shih questioned whether enhanced user control options would actually enhance privacy. He pointed out that some smart phones have hundreds of permissions or control options, resulting in a complex and poorly understood set of options. In his work on the Android system, he found that privacy choices are also often presented at bad times. For example, users are unlikely to read the permissions requirements carefully after downloading an app that they want to use immediately. Tien added that many people are truly surprised to learn how much can be inferred about an individual’s intimate details by analyzing seemingly innocuous data about their online purchases and other activities. He noted a huge gap between the technologically literate and the general public in their level of understanding of privacy risks, and that many have been surprised to learn of this gap, even in the corporate world.
According to Zhang, over the past 20 years, computing has shifted from local processing to the cloud, and most recently to “edge” models, which combine local and cloud computing and storage to minimize latency and increase performance. This has resulted in more distributed computing and storage, which can have additional privacy impacts. In particular, the increase in local storage minimizes the amount of data that are shared, but also means that personal information must be protected in more locations.
McGovern addressed notifications, suggesting that they can expose some of the less-visible functions of connected objects and devices. Unexpected notifications can make people realize that systems have learned something about them. Tien distinguished between notifications generated on a given device using locally stored data and notifications generated based upon data from somewhere on the cloud. While the privacy implications in each case are different, users may not be able to tell the difference. Shih described results suggesting that notifications about data collection may prompt a portion of users to opt out of a service entirely, but otherwise have minimal impact on most users’ behavior.
Zhang noted several emerging privacy concerns about connected vehicles. The navigation systems may store data in the vehicle that could be related to driving habits, and potentially be correlated with geolocation,
time, and other information. The vehicles’ networks provide potential pathways for accessing these data. He pointed out that there has been much debate about ownership of this sort of data. Some say that all of a vehicle’s data should be considered private, but others note that much of the collected data would enable car companies to improve vehicle function and maintenance. This is an open question, but much user data have been collected already.
Tien pointed out some emerging policy challenges with respect to connected vehicles. For example, some in California have proposed replacing the gas tax with a mileage-based tax that could potentially be determined either by tracking vehicle location over time or, alternatively, via some data-free method. He noted that perspectives about geolocation data have evolved over the past 7 years or so, owing partly to empirical studies showing how easy it can be to identify individuals based upon their physical paths. In this context, courts have found a reasonable expectation of privacy under the Fourth Amendment, in contrast to past decisions finding that individuals do not have a reasonable expectation of privacy if they are visible on a public road. Such shifts underscore the challenge of determining what counts as private.
Zhang explained that privacy has traditionally been generally supported by trust—we expect that data collectors will not abuse the information they hold. However, the public does not always consider this sufficient. The general idea of “privacy by design,” where privacy is considered at all phases of the engineering process, is becoming more common in certain contexts.2 For example, he suggested that customer concerns have been the major driver of joint work between industry and the Department of Transportation on designing privacy-preserving security systems for cars. Specifically, collision avoidance systems will require communication among a critical mass of vehicles to be effective, making it likely that such connectivity will be mandated in the future, prompting privacy concerns. Efforts are now under way to build communications systems that will not exchange data that could enable tracking of individual vehicles.
Shih suggested that most phone users seem to have the false notion that their data are not really at risk—for example, thinking that only celebrities will have their photos stolen and leaked. He suggested that people who do have concerns have no direct channel to voice them to phone designers.
Zhang suggested that the magnitude of users’ reactions to privacy issues depends on both the perceived value of a given product and the perceived user control over its functions. For example, the value of products such as smart phones is perceived to be high relative to the potential risk of information leaks; however, if significant risks become visible, or if users perceive that they do not really have choices, a tipping point can occur. Users may or may not provide feedback when they have privacy concerns, but often simply abandon a product. Shih added that his research suggests that both the accurate perception of control and the illusion of control make users more willing to stick with a product or situation.
Tien pointed out that customer concerns can prompt companies to simply reduce the visibility of the problematic functions; postponing real solutions in this way can result in much larger problems down the road. McGovern pointed out that future complications also arise from the ambition of technology developers; indeed, it can be difficult to anticipate the societal impacts of audacious technologies. He also suggested that regulated industries are very aware of privacy issues, and are often more sophisticated in their questions about the risk of storing privacy-sensitive information. He suggested that other sectors, such as manufacturing, pharmaceuticals, and biotech, may be more focused on security than on end-user privacy. Tien added that there are many companies, such as data brokers, whose customers are not the individual end users, so individuals’ interests or privacy concerns may not be very visible to them. However, widely held public perceptions and bad publicity can create some push-back. Attention from legislators can also move companies to make privacy improvements.
2 For background on this concept, see A. Cavoukian, 2009, “Privacy by Design: The 7 Foundational Principles,” Information and Privacy Commissioner of Ontario, Canada, https://www.ipc.on.ca/english/Privacy/Introduction-toPbD/.
The panel also touched on the siloing of data as a technical control, challenges with verification of such controls, and the complement and interplay between legal and technical controls in providing accountability.
A participant cautioned against accepting the notion that users are willing to give up privacy in exchange for value. One of the panelists pointed out that true trade-offs may not even be possible because there is not a sufficient range of choices to enable a fair trade.
The group further discussed notions of trust. A panelist suggested that organizations earn trust not just by following the rules at hand, but by also acting with consideration and restraint. Another panelist suggested that people care most about privacy only after something bad has happened, and that explaining what happened and demonstrating how the problems are being addressed could help to strengthen trust. Other participants pointed out that a user’s trust in a technology may reflect an accurate understanding of a technology’s protections, but it could also reflect positive past experiences with a technology, ignorance about its underlying functions, or resignation to a lack of other options. In that sense, trust can be positive or negative, justified or misplaced, and productive or undermining to privacy.
The group went on to discuss smart vehicles. One participant noted that vehicles already contain dozens of computers and collect a significant and increasing amount of data. Another also noted that the general public may not understand the implications or details of this or future collection, even if notice of such collection is given.
Some participants commented on incentives in the private sector. One participant pointed out that a firm may have an incentive to maximize its ability to collect data while alleviating customer concern by providing superficial elements of control and notice. Another participant noted that the potential for individuals to be satisfied with superficial controls is a serious concern. One of the panelists noted that companies are focused on their business goals (often with good intentions), and that their attention to privacy will similarly be driven by economics. Greater attention to privacy might be encouraged by a reduction in the cost of doing the right thing, the presence of regulatory oversight, or an increase in the value of privacy-respecting products. Another participant suggested that services and businesses whose competitiveness is tied strongly to their reputations are likely to be more attuned to privacy, and suggested that such companies often have more technological capacity and expertise than the organizations that govern them.
It was noted by a participant that the IC does not have at its disposal the same tools that companies have for building trust with individuals, such as notifications and the “right to tinker” with technologies, and has had varying success in communicating either its privacy-protecting measures or the benefits the public derives from intelligence activities. The participant asked how the IC might best approach the question of trust with the public.
A panelist agreed that the IC and law enforcement have a limited set of tools for building trust; for example, individual notification of the subject of an investigation is not an option, except after the fact. He suggested that one might consider the proxies for the citizen in different contexts, for example the Congress, and work with them on enhancing disclosure and transparency. But he also suggested that the congressional oversight process itself may need reevaluation. Another panelist suggested that trust could be gained by providing representative or model examples of how data are used, so that the public can understand the safeguards that are in place today, the privacy impact of different practices, and how those practices contribute to the public good. One of the panelists added that the government and the private sector play different roles in society, and one should not expect practices that are acceptable in the private sector to also be acceptable in the government sector.
Finally, a panelist asked how society characterizes “sufficient privacy.” He suggested that, without an answer, it will be hard to know whether we are on the right path.