Fred H. Cate, C. Ben Dutton Professor of Law at Indiana University, welcomed the group back for the second day of the workshop. As moderator, he introduced the following panelists and gave each of them 5 minutes for opening comments, noting that they had been tasked with addressing a difficult set of issues:
- Jennifer Glasgow, chief privacy officer, Acxiom;
- Rob Sherman, deputy chief privacy officer, Facebook;
- David C. Vladeck, professor of law, Georgetown University Law Center, and former director of consumer protection, Federal Trade Commission; and
- Helen Nissenbaum, professor of media, culture and communication, and computer science, New York University.
Jennifer Glasgow began by providing background on her company and her work. Acxiom is a 45-year-old data company with two main lines of business. First, it collects data from public records and other sources, aggregating it and bringing it to the marketplace, both for marketing and for fraud and risk identification purposes. Second, it hosts sophisticated data warehouses for large clients in consumer-facing industries, including 50 of the top 100 companies. In short, her company is all about data, including some of what people may perceive to be scary uses of data.
She went on to discuss three key themes, as summarized below.
Compliance is no longer enough. Acxiom has a saying that has been baked into its culture: “Just because you can do it doesn’t mean you should.” The challenge, however, is in determining what one should do. This takes a multitude of approaches and techniques, and the ability to make these determinations does improve over time.
She suggested that many of the tools and techniques available for data collection and use are probably being deployed by both the IC and the private sector, noting a synergy between what the IC is facing in terms of scrutiny and public reactions and what some in the private sector have been dealing with for a long time. She pointed out that, while there is always the possibility of legislation, gaps in the legal framework mean that both industry and agencies need to develop their own rules and to identify and adopt best practices. The company’s brand and its customers’ brands are at stake.
Determining appropriate uses of data. Glasgow stated that Acxiom is pushing hard for more self-regulation; it derives its own policies, as do many other large companies. In 2014, the company conducted more than 800 privacy impact assessments, taking into account all of the different stakeholders. She noted that the efficacy of the process improves with experience over time.
She identified public relations as a key part of the process that has been underemphasized in both the commercial sector and the IC. She highlighted the importance of understanding how best to talk about privacy issues—including what to say and what not to say. It is also important to understand
what your critics say about you; whether it is valid or totally off-base, you should have the ability to defend your practices and/or set the record straight—before an incident occurs, rather than after.
Learning from consumers’ views. Glasgow described some of what Acxiom has learned from its consumer-facing website. The site allows consumers to register and look at the data that Acxiom holds about them for marketing purposes, and enables them to change it, including to make corrections or to falsify information. At first, the company debated whether to allow a user to input data that were almost certainly inaccurate. Ultimately, it decided that, since the data were used for marketing, it would be appropriate for a consumer to receive marketing materials or targeted ads intended for those with their preferred profile (e.g., a 50-year-old wanting to be seen as a 30-year-old might actually be more interested in ads aimed at 30-year-olds). She suggested that, when it comes to data use, both facts and potential outcomes must be taken into account.
Rob Sherman began by noting that Facebook’s approach to privacy has evolved, and that it has learned a lot about privacy and its importance. Today, the company’s touchstone is building the trust of the people it serves, because the business relies upon its users being comfortable sharing data with the service. This drives much of the company’s efforts to be responsible data stewards, which involve both complying with the law and going beyond it to best meet user expectations. He noted that there are times when no direct feedback about a particular data practice is available, which can complicate an organization’s mission—a challenge that the IC likely also faces.
Facebook serves approximately 1.5 billion people globally, and the fact that they do not all have the same conception of privacy poses a major challenge. Facebook has increasingly used focus groups and found individuals to have thoughtful, sophisticated, and unique perspectives on privacy. People have different concerns and reasons for wanting to protect their information.
Sherman sees the role of Facebook’s privacy professionals as not only operating the business in a privacy-sensitive way, but also as empowering users to make their own choices, or “putting people first.” He noted that this requires providing people the necessary tools and controls, and making them comfortable that the company is doing the right thing even if no user tool is in place.
He noted that Facebook has an internal privacy process similar to the one that Glasgow described for Acxiom. It is referred to as cross-functional, meaning that the team comprises a broad range of stakeholders, including lawyers, engineers, security professionals, and communications experts. The engineering team is involved very early in the process, so that problems are dealt with before they are “baked in,” which has made “privacy review” almost synonymous with “product review.” This helps to avoid scenarios where privacy questions emerge only at the end of development, when the product must then either be scrapped or have an after-the-fact fix tacked on.
Sherman noted that Facebook’s approach to external communication has also evolved. In the past, the company made its product decisions internally and communicated them externally later on. The company has since learned that broader external engagement, including with privacy experts and advocates, earlier in the process actually helps the company to make better decisions, and to have a better relationship with its customers.
Finally, he described Facebook’s privacy process as an ongoing review. He noted that privacy challenges tend not to be static, and that expectations continue to change; even if a decision is right at the time, the company must be comfortable revisiting it if the environment or demands change.
David C. Vladeck commended other speakers for beginning with the premise that we now live in an era where law itself fails to provide adequate guidance on privacy issues. He suggested two reasons for this failure: (1) technological development has outstripped the ability of regulators and lawmakers to keep pace, and (2) legal instruments often lack the clarity that is needed to make tough decisions. He pointed out that there are ongoing battles over the meaning of the Fourth Amendment and suggested that the ambiguity inherent in many legal instruments has created difficulties for everyone. He suggested that the IC has, as a result, been forced to rely on self-generated norms, and the challenge has been in generating and enforcing these norms in a credible way.
He made an analogy with the private sector, suggesting that trust and public perception can be critical to a company’s business model. He reiterated Sherman’s point that Facebook’s thinking about privacy has evolved, and noted that the company now has a well-earned reputation for being respectful of privacy. He suggested that Facebook’s evolution on privacy (in the wake of a 2011 FTC enforcement action that called attention to its poor privacy practices) could serve as a model for the IC’s response in the wake of the Snowden disclosures.
For example, he suggested that norms need to be developed in a transparent way, with input from the public or some external validation process, and grounded in ethics- and value-based judgments of acceptable practices. He suggested that norms also need to be communicated broadly internally, and rigorously enforced; the norms themselves are valuable only if they are credible to the public, so violation of norms must be dealt with.
He suggested that the private sector may actually know more about individuals than the IC does, although the average American may not understand this. He reiterated the idea that the IC would be well-served to be more transparent about its practices, and suggested that having the public suspicious of the community would not help the IC in the long run. He noted that, in the aftermath of the Snowden disclosures, it became clear that the public had little sense of what the IC was doing. For example, it was a surprise that bulk collection was not viewed by the IC as “use”; there was a supposition that anything that was collected was used. He concluded by proposing that it is important for the IC to engage with the public about how data are being used, including the ways that could be inimical to individual welfare.
Helen Nissenbaum focused her remarks on the ethical value of privacy, political governance, and notions of freedom. She began by reiterating that constraints on the flow of information often protect individual interests, but are also valuable for society as a whole. She provided several examples of benefits to individuals that also benefit society as a whole:
- The ability for patients to speak freely and privately to physicians about health matters also serves public health interests.
- The ability for citizens to vote autonomously in democratic elections also helps to promote legitimate democracy.
- Confidentiality and privacy in tax records enable truthful financial disclosures and proper collection of funds.
She pointed out that technology can disrupt the flow of information, and people often hold onto established norms, but they do not do so blindly. Contextual norms are not arbitrary, and not made hastily; rather, they reflect a balance of needs among different stakeholders, and have been refined over time to promote different values and goals. She suggested that we should not blindly adapt our norms to accommodate new technologies, a point that also arose in previous panels. She suggested that norms emerging around new technologies must instead maintain a focus on the goals and values that inspired the preexisting ones.
Nissenbaum also addressed political governance. In this sphere, privacy (i.e., informational norms) protects and promotes important political values, including the various freedoms enshrined in the Constitution. Technologies disrupt these norms, just as the ability to see through walls might disrupt Fourth Amendment protections.
She pointed out that many post-Snowden commentaries focused on the National Security Agency’s bulk collection practices. Given that almost every action today is an informational action, some have asked how the intelligence agencies could avoid collecting data in bulk. But critics see such collection as informational dragnets that do not reflect what we expect from government actors in the context of liberal democracies.
Nissenbaum noted that some previously proposed mitigations are persuasive (for example, requiring that collected data only be read by machines or be subject to strict use requirements), and they may even be ethically defensible given certain conditions and assurances. Nonetheless, she suggested that it is important to understand what is at stake in these contexts.
She also noted that privacy protects critical freedoms against government interference in various activities and aspects of our lives. Freedom can be understood as non-interference; in other words, an individual is not a free citizen if government interferes with speech, association, or living in accordance with religious faith. The philosopher Philip Pettit defined freedom as non-domination, or security against arbitrary interference.1 In this sense, freedom means not only limiting what the powerful are permitted to do, but also reducing or eliminating their power to do it. Nissenbaum suggested that this view is relevant to both commercial and government actors; collection of data gives government and dominant commercial actors the capacity to exercise power, and the threat of action can be almost as menacing and debilitating as the action itself. Finally, Nissenbaum noted that data collection, accretion, and analytics are disruptive. She suggested that while these practices can be used for good, it is important to understand what is at stake.
The panel discussed a number of topics, including strategies for determining appropriate practices within institutions, re-use of data, resource constraints, and how to translate ethical values into practice.
Glasgow described Acxiom’s privacy impact assessment (PIA) process.
- A team identifies stakeholder concerns and possible consumer risks and harms around a given project, with a proxy advocate representing consumer interests.
- Enumeration of the possible negative impacts helps to elucidate how (or whether) a given decision would be defensible in a public space.
- Projects should be re-assessed periodically to account for shifting attitudes or laws.
- Assessment is an iterative process that evolves and improves with time.
Sherman noted a few important principles to inform privacy decision-making.
- Both actual and perceived harm (including violations of privacy or trust) can be damaging to an organization’s reputation.
- External consultation is valuable, and can help to provide a nuanced understanding of external expectations. This can be performed in a confidential setting.
Vladeck identified three key types of internal participants to engage in privacy-related decision-making:
- A “Cassandra”—someone whose job it is to identify every possible thing that could go wrong,
- Privacy officers—who can push privacy as a core value beyond simple legal requirements, and
- An institutional naysayer—someone who truly understands the organization’s mission and is also prepared to push back on the status quo and compel alternative strategies.
Nissenbaum suggested that external input on privacy decision-making is necessary, because a company or organization has a natural bias toward its own interests. She also pointed out that ethical approaches require consideration of contextual values, and how constraints on information flow would best promote the goals of a given activity.
The panel was asked to speak to decision-making around re-use of existing data. Glasgow and Sherman noted that Acxiom and Facebook conduct separate PIAs and analyses (respectively) on every proposed new use of existing data. Vladeck suggested that unvetted new uses of data are “the landmine of privacy.” Approximately half of the enforcement cases he saw while in law enforcement dealt with companies that otherwise had good data hygiene, but neglected to examine the implications of novel uses of data.
1 Philip Pettit, 1997, Republicanism: A Theory of Freedom and Government, Oxford University Press, Oxford, U.K.
The panel briefly discussed the personnel resources necessary for conducting privacy reviews.
- Sherman noted that Facebook has significantly grown its full-time privacy staff, to about 50 people (though others on different teams also address privacy in their work). He also described a more scalable staffing model for resource-constrained organizations, in which a core group is responsible for coordination of privacy activities, with designated individuals embedded in different business units responsible for keeping track of possible privacy issues.
- Glasgow noted that PIAs at Acxiom have become procedural, and are held twice a day; this has helped to make the process relatively quick and easy. The scale is managed in part by separating and fast-tracking the smaller, simpler decisions.
- Vladeck added that some companies have a dedicated privacy officer whose job it is to handle these issues. He noted that privacy decisions are complicated, and sometimes the wrong decision is made, but someone with oversight needs to be involved.
The moderator asked the panel to comment on how ethics and cultural values are or can be translated into the decision-making process.
Nissenbaum pointed out that ethics are unlike law in that they are based in foundational principles rather than on fundamental rules, which can lead to different analyses and strategies. For example, an ethics-driven approach could be to firmly define an explicit set of values and then evaluate whether a given action would be counter to these values. She also noted the sense in the United States that privacy is an ethical value.
Sherman noted that his company starts with a policy that reflects its values and works to communicate that policy and to respect users’ autonomy, often in concrete and tactical ways, such as by allowing a user to choose who gets to see a given status update. He also noted that there is an interest in making decisions more data-driven, in part to help meet the expectations of the users who do not speak the loudest. Facebook also documents its privacy decisions and maintains them in a database, which can assist in answering individual users’ questions about data use.
Nissenbaum asked how Facebook and Acxiom define a “privacy issue,” pointing out that even if Facebook users can control which friends see their posts, they may not want Facebook itself to use that information. Sherman acknowledged that such a definition is not straightforward and noted that Facebook makes use of individuals’ data in order to provide services such as prioritization of News Feed content. He identified a privacy issue as any negative outcome that could happen as a result of the way the company collects, uses, or stores people’s data. Identification of potential issues begins with legal analysis and then takes into account user expectations and alternative approaches. Glasgow defined the term as including anything that can be viewed as negative by a consumer, regulator, or client. To identify potential issues, Acxiom has organized its business around major use areas, and attached specific ethical obligations to each area.
Vladeck suggested that an organization might generate its own operational set of ethical norms and rules by first considering its mission, such as protecting citizens from harm, and then using background norms (such as the Bill of Rights) to come up with discrete norms (such as a decision not to look at citizens’ data without a clearly articulated need to do so). He suggested that there are many foundational materials for generating discrete norms.
Several participants discussed drawbacks to reliance upon externally generated rules, raising the following points:
- Companies subject to external regulation often go into compliance mode, rather than trying to anticipate future needs and rule changes.
- Regulators may not anticipate new technologies or uses.
- It may not be clear how existing rules apply to new technologies.
- The responsibility for setting norms is on the regulators, rather than the actors.
- Regulators do not want to regulate too early, as this would stifle innovation, and because early business models may change.
It was suggested that there will always be a lag between technological development and the ability of regulators to define an appropriate boundary, but that this can actually be a good thing. A participant recalled an earlier theme that people tend not to be good at protecting the privacy of others, suggesting that this is an argument against self-regulation or internal privacy standards. The participant suggested that making internal rules or standards of practice available to the public would enable review by those who do not share the organization’s mission. A participant suggested that organizations should always base decisions on whether a given practice would be perceived as fair or deceptive. Someone asked how this might apply to the IC. It was suggested that this should be debated in a public way.
A participant noted that privacy is generally secondary to the main mission of commercial companies, which is profit. Though many companies are sincerely concerned about privacy, privacy considerations often also serve the primary mission by shaping customer perceptions. The participant suggested that this is not necessarily the same for the IC, whose mission can include the security of U.S. citizens in the broadest sense, read to include securing their privacy.
Another participant pointed out that the mission of the IC is to obtain information about bad actors that would not otherwise be provided to the government, for the purpose of informing policy decisions. By definition, practices such as interception of communications intrude on privacy. It is challenging to protect privacy while intruding on privacy. The need for public trust also poses the challenge of how to be transparent while still keeping secrets. While lessons and strategies from the private sector can be helpful and informative, the IC faces additional challenges.
One of the panelists suggested thinking about privacy in terms of the “appropriate flow” of information: While some of the IC’s practices would be inappropriate for a private company, they might nonetheless be appropriate for preserving privacy in the context of the IC’s mission; however, this reasoning should be articulated publicly. Another participant noted that some feel the IC’s mission could be undermined by articulating strategies for protecting U.S. citizens that may be less popular with the IC’s foreign partners.
The group discussed user privacy controls and oversight over back-end data uses. It was noted that users may not have a clear understanding of how their data are and are not used. It was suggested that users may not have the time—or may not want to take the time—to understand the nuances of how data are used on the back end of any system, but that there may be other ways of communicating this to users for their own benefit. For example, the FTC and the Irish Data Protection Commissioner have audit authority over Facebook’s operations, and audit reports have been made available to the public.2, 3 Whether or not users read these reports, knowledge of this external oversight is likely a comfort to users.
2 Data Protection Commissioner, 2011, Facebook Ireland Ltd: Report of Audit, December 21, https://www.dataprotection.ie/documents/facebook%20report/final%20report/report.pdf.
A participant noted that the IC is subject to outside (though still governmental) oversight, in particular through its own dedicated civil liberties and privacy offices and from the Department of Justice, neither of which is necessarily visible to the public. The participant asked what the community could do to build trust with the public. One of the panelists suggested that a good public relations campaign would be needed to effectively teach the public about the existing oversight mechanisms. Another of the panelists agreed, and suggested that the problem is not necessarily what the IC is doing, but possibly the fact that people do not understand what the IC is doing. The panelist suggested that the IC should be commended for the processes it has undertaken, but that it also needed to address public misperceptions. It was again pointed out that perceptions can be just as damaging as reality.
A participant suggested that a straightforward media appearance could help communicate the IC’s mission and challenges to the public. Another participant suggested that this on its own would not be enough, but that there is a directed effort under way to enhance transparency. A panelist agreed, suggesting that a sea change is needed, and noting (from experience in the private sector) the importance of education, communication, consistency, and awareness within an organization, along with publicizing the remediating actions taken in response to a privacy breach or incident. Over time, an organization learns what to say, and who should say it, given the audience. Building trust requires an ongoing commitment to a culture of both internal and external communication.
Another panelist noted that some in the public perceive a gap between the IC’s public-facing commitments and its practices, and that the strategy for communicating actual practices can influence this perception. In addition, the panelist noted that the audience is diverse and could be divided into many different subgroups, for example, U.S. and non-U.S. audiences.
A participant noted that, while the laws that govern the IC’s actions are public, their meanings or interpretations are not, and some terms in the statutes are not clearly defined. In particular, many were surprised about some of the Foreign Intelligence Surveillance Court’s interpretations of law. Another participant agreed, and suggested that there is a process problem, rather than a compliance problem, which could be addressed by involving adversarial viewpoints in the process, both internally and externally. It was pointed out that the IC recognizes this, and is making transparency a priority.
The group discussed the private sector’s strategies for compliance with the data protection laws of different nations. Sherman noted that Facebook is structured around two data controllers, Facebook, Inc. and Facebook Ireland, with different regulators. The company also consults with other authorities. It aims to comply with all applicable laws in such a way that the user experience and protections are consistent across the board, regardless of what region the user lives in, although this is not always possible. For example, Facebook developed a Download Your Information tool in response to European Union requirements, but has made it available to U.S. and other users as well. In the case of Acxiom, products are rolled out on a country-by-country basis, and product PIAs are performed independently for each country, according to Glasgow.
Panelists concluded the session with a few comments about future strategies and the outlook for improving privacy.
Sherman said that it is important for organizations to stay humble and listen to the privacy concerns of those that they serve. He suggested that understanding and protecting privacy is an iterative process that requires persistent engagement and honest exchanges of view. The biggest challenge is that the landscape is not static, and organizations must be willing to revisit, adapt, and improve their practices. While current laws are a baseline, it is unlikely that they will predict upcoming technology changes, so continuous evaluation is needed.
Glasgow suggested that an organization should not expect perfection, especially in its early privacy strategies. Regardless of the success of any individual action, persistence is necessary to make significant progress on building trust. She also suggested that the private sector may begin to face some of the same challenges that the IC has experienced as big data and machine learning continue to pervade business strategies and the concept of notice and choice breaks down.
3 Data Protection Commissioner, 2012, Facebook Ireland Ltd.: Report of Re-Audit, September 21, https://dataprotection.ie/documents/press/facebook_ireland_audit_review_report_21_sept_2012.pdf.
Nissenbaum suggested that society is already on a path that has strayed from ethical activity; we seem to accept that massive collection and aggregation of data is the proper state of affairs, despite the absence of a rigorous argument for why this is defensible. She noted that it is hard to turn back the clock, but suggested that there might nonetheless be value in revisiting this acceptance.