Remarks of Speakers

The following sections summarize remarks by speakers at the workshop, along with discussion among attendees at the conclusion of each set of remarks. Speakers were clustered into two groups. The first group, consisting of Joel Reidenberg, Sasha Romanosky, Beth Givens, Tom Murphy, and Heather Adkins, offered empirical, consumer, and data holder perspectives. The second group, consisting of Bob Belair, James Harvey, David Vladeck, and Aaron Burstein, offered legal and policy perspectives.

RESILIENCE AND REMEDIATION

Joel Reidenberg, Fordham University

Joel Reidenberg, J.D., Ph.D., the Stanley D. and Nikki Waxberg Chair at Fordham University School of Law and a visiting research affiliate of the Center for Information Technology Policy at Princeton University, conducts research and teaches courses in information technology law, privacy, cybersecurity, and intellectual property. He introduced the types and impacts of data breaches and offered perspectives on their remediation.

The types of data breaches that are most salient for consumers are unlawful exfiltration of personal data and wrongful dissemination, destruction, or corruption of that data, Reidenberg said. These breaches are common—Reidenberg suggested that every adult person in the United States has likely had personal information breached in some
way—and in some cases these breaches can have significant consequences. For consumers, data breaches cause many types of harms, including loss of privacy, economic loss, safety hazards, fear of future damages, and inconvenience. Reidenberg mentioned a striking case from 1999 in which a woman’s personal information was acquired by a stalker and resulted in her murder. For companies, he said, loss of trust, economic loss, and business disruption are the most salient harms.

When a data breach occurs, the usual procedure is for the breached entity to notify those whose data has been compromised. Although warning victims is important and meant to be helpful, this notification can, unfortunately, work to decrease trust in systems, Reidenberg observed. The consumer, he said, is now concerned about financial or medical privacy, distrustful of the breached entity, and fearful for any potential short-term or long-term harm. In Reidenberg’s view, notifications of compromised data are not a systemic solution for resilience or remediation; while these notifications may treat a symptom, they do not provide a cure.

After notifying consumers of a data breach, most companies then offer free credit monitoring as remediation. That may reassure some consumers, or help consumers forestall any costs of identity theft, but Reidenberg observed that it fails to address safety issues, future breaches, the psychological impact on consumers, or the actual financial loss. If money has been stolen in some way, the attribution of economic loss often remains unaddressed: Who should be responsible for the loss? If someone adopts a false identity and receives medical care under the assumed name, how can the real and false medical records be identified and corrected to avoid potential safety problems in the context of future medical treatment for the identity theft victim? In Reidenberg’s view, there is currently no effective mechanism to deal with remediation of such potential downstream consequences of data breaches. “At the moment, we don’t have anything that can really effectively focus on all of these kinds of harms,” said Reidenberg. “This is a systemic need that will not be solved by one-off fixes; a solution is not going to be something as simple as a data breach notification.”

A further complication, he observed, is that it is nearly impossible to measure the psychological harms or the full extent of the damage from a data breach. In the breach of the dating website Ashley Madison, for example, personal information was publicized and harmed individuals’ privacy and reputations, but that harm is not easily quantified. In the case of Sony Pictures, the data breach likely had adverse economic impact on actors’ salaries as a result of the disclosures and also caused the cancellation of a movie
production. These economic harms are easier to quantify, but the costs of a myriad of indirect consequences are difficult to estimate, he said. The impact of compromised personal safety is similarly difficult to measure.

Industries are working to combat the problem of data breaches, of course. Information sharing and analysis centers (ISACs) are common in many industries and serve as central resources to help reduce the threat of cyberattacks, Reidenberg noted. Sharing information through ISACs can, for example, help mitigate a cyberattack on healthcare systems or a financial services malware attack. However, he said, the practice of sharing threat and attack information can also potentially increase the risk of exposing private data. “The more that personal data circulates across different organizations, the greater the vulnerability is,” said Reidenberg. There is little in the ISAC framework, he argued, that articulates consumer rights or the need for consumer protections.

Reidenberg highlighted two critical questions: Who is responsible for losses that result from a data breach, and what is the best remediation for consumers? In the context of credit cards, a 1978 statute provides some answers—the consumer may be liable for the first $50 and the rest is on the card issuer—but disputes between card issuers and merchants have generated a great deal of litigation in subsequent years, he noted, with no clear resolution. Without relevant statutes for other contexts, such as brokerage services and utilities, there is an even greater lack of clarity, he said. Who would be responsible, for example, if a power plant were breached, resulting in widespread blackouts? Would the utility company help bear the costs? Individuals and small businesses often wind up assuming the costs of data breaches, yet they have the fewest assets to absorb them.

Sometimes it is the consumer who, knowingly or not, undertakes risky behavior that leads to privacy breaches, Reidenberg noted, pointing to consumer education about risk and liability as one area in need of work. Another area that needs to be examined, he suggested, is the definition and management of “critical infrastructure.” Although an individual’s personal computer or smartphone may not itself control communications infrastructure or sensitive data, consumer devices, especially if used to access sensitive systems, can become an entry point for a massive breach and thus become a vector for harm.

Reidenberg also pointed to a need for clarity, oversight, and regulations to guide cybersecurity countermeasures. As an example, he cited the 2013 takedown of more than 1,000 networks that were part of the Citadel botnet, which used malware to control and access private financial information on millions of personal computers. Reidenberg observed that Microsoft, in collaboration with the Federal Bureau of Investigation (FBI), obtained a court order to proceed with a counterattack, but the process did not pay attention to how this takedown could adversely affect the unwitting owners of the
malware-infected computers and did not have a plan to address any collateral damage, passing the resulting costs on to the owners.

Concluding, Reidenberg stressed the need for academic research: “We have a huge need today for good, carefully thought-out empirical work to provide a sound basis for any kind of policy decision making,” he said. Four main areas he sees as particularly ripe for investigation include mapping the harms of data breaches to the breach type and characteristics of affected stakeholders; developing a comprehensive understanding of the full costs of remediation and the benefits of prevention; creating policies or procedures to standardize the reporting of security breaches; and developing a framework for accountability when implementing countermeasures.

COSTS AND CAUSES OF CYBER INCIDENTS

Sasha Romanosky, RAND Corporation

How much does a data breach cost a business? Which industries are at the most risk for losses? When does a data breach lead to litigation? The answers to these questions are crucial because they influence whether and to what extent businesses invest in preventing data breaches and other cyber incidents. The incentives (or disincentives) for strong data protections on the part of industry can shed light on business practices and help to inform policy decisions.

Sasha Romanosky, Ph.D., is a policy researcher at the RAND Corporation with expertise in the economics of security and privacy, cybercrime, national security, applied microeconomics, and law and economics. He summarized his research findings on the empirical costs of cyber incidents and discussed how those costs might (or might not) incentivize companies to implement better data protections.

Romanosky’s research is based on a data set of 12,000 cyber events collected by the insurance analytics company Advisen through news reports, Freedom of Information Act requests, academic databases, and other sources. This data set is more than twice as large as other publicly available data sources used for previous analyses. Even with so much data, Romanosky noted some limitations: Many events are not even publicly known; they may not have been detected by the company; they may have been detected but not disclosed; or they may have been disclosed but not entered into any industry or legal database. It is also challenging to put a dollar amount on the myriad impacts of cyber incidents. Despite these limitations, Romanosky’s analysis offers a rough estimate of the empirical costs of these events. With this knowledge, he noted, firms could choose to invest heavily in data security to prevent a breach, or take the opposite path and assume that breaches will happen and accept the financial burden of mitigating their effects.

Either way, he argued, better data can lead to more informed decisions, not only for businesses themselves, but also for insurers, consumers, and policy makers.

Based on the available data, Romanosky identified four main types of cyber incidents that put consumers or businesses at risk of financial loss: data breaches, defined as the unauthorized disclosure of personal information; security incidents, in which computers are used to attack a company; privacy violations, in which companies intentionally collect or use personal information in ways that violate individuals’ privacy; and phishing or skimming, which include various types of financial crimes against targeted individuals and companies. His analysis revealed that data breaches outnumber all other incident types by at least four to one, although security incidents appear to be rapidly becoming more frequent.

Quantifying the costs and risks of cyber events is challenging. Romanosky used three main measures to identify general trends: (1) incidents, as measured by the total number of incidents, and incident rate, or the proportion of companies that experience these events within an industry; (2) litigation, as measured by the total number of lawsuits, and litigation rate, or the proportion of companies that are sued for cyber event-related harms within an industry; and (3) costs, as measured by the total costs and losses per event. With this data, he claimed, companies can assess where their industry falls on the spectrum of severity of cyber events and better understand and prepare for the specific risks their company faces.
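
To make these three measures concrete, the minimal sketch below computes them for a handful of invented records; the field names, firm counts, and dollar amounts are assumptions for illustration and are not drawn from the Advisen data set.

```python
from statistics import mean

# Toy stand-ins for incident records; every value here is assumed.
incidents = [
    {"industry": "finance",    "firm": "A", "litigated": True,  "cost": 150_000},
    {"industry": "finance",    "firm": "B", "litigated": False, "cost": 90_000},
    {"industry": "healthcare", "firm": "C", "litigated": False, "cost": 210_000},
    {"industry": "healthcare", "firm": "D", "litigated": True,  "cost": 4_800_000},
]
# Assumed number of firms in each industry (the denominator for the rates).
firms_per_industry = {"finance": 120, "healthcare": 200}

for industry, n_firms in firms_per_industry.items():
    events = [e for e in incidents if e["industry"] == industry]
    hit_firms = {e["firm"] for e in events}                     # firms with at least one incident
    sued_firms = {e["firm"] for e in events if e["litigated"]}  # firms facing cyber-related lawsuits
    incident_rate = len(hit_firms) / n_firms
    litigation_rate = len(sued_firms) / n_firms
    cost_per_event = mean(e["cost"] for e in events)
    print(f"{industry}: incident rate {incident_rate:.1%}, "
          f"litigation rate {litigation_rate:.1%}, mean cost per event ${cost_per_event:,.0f}")
```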

Romanosky’s results show that breaches, small and large, affect a wide range of industries and that different industries suffer different consequences and costs from these events. While his analysis suggests that the highest numbers of incidents occur in the finance, insurance, and healthcare industries, government and education suffer the highest incident rates. His research also suggests that while the information, finance, and insurance industries receive the highest total number of lawsuits, the mining, oil, and gas industry and the companies providing administrative and support services have the highest litigation rates. Romanosky’s analysis suggests that the transportation industry has the highest cost per event, and the highest overall risks (combining cost, incident rate, and litigation rate) are borne by the manufacturing, retail, finance, and insurance industries.

Drilling deeper into the financial costs to firms from data breaches, Romanosky described two kinds of costs: “first-party costs” and “third-party costs.” First-party costs are those a company bears after a data breach, such as the costs to notify consumers, cover remediation, and implement increased security measures as necessary. Third-party costs come from litigation and settlements. As other workshop attendees pointed out, there are multiple ways to measure costs incurred by a company—for example, it can be
debated whether staff time should be counted if it is incurred by salaried staff who are not paid overtime. However, Romanosky said the analysis is likely not granular enough for such distinctions to affect the overall findings of this study.

Analyzing the 12,000 cyber incidents in the database, Romanosky found that although a few very large, expensive data breaches—such as those at Target, Sony, Anthem, and Home Depot—drive the mean cost of a breach toward the frequently cited $5 million range, the typical cost to an entity is less than $200,000.
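
The gap between the frequently cited multimillion-dollar average and the much lower typical cost is a consequence of skew. The short sketch below uses invented numbers to show how a few enormous losses pull the mean far above the median:

```python
from statistics import mean, median

# 1,000 hypothetical breach costs: most are modest, a handful are enormous.
# These numbers are invented for illustration; they are not Romanosky's data.
costs = [120_000] * 995 + [100_000_000, 150_000_000, 200_000_000,
                           250_000_000, 300_000_000]

print(f"mean cost:   ${mean(costs):,.0f}")    # dragged upward by the outliers
print(f"median cost: ${median(costs):,.0f}")  # stays at the typical value
```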

He said that although this may seem surprisingly low, an important additional finding is that nearly 40 percent of all companies affected have suffered multiple incidents—a group Romanosky calls “repeat players”—and these companies suffer higher costs for some types of events. “These repeat players don’t seem to be litigated more than the non-repeat players,” he said. “But the cost to these repeat players, at least in terms of data breaches, is significantly different—almost twice as much with repeat players.” So, even if the per-incident cost is low, he said, an entity hit again and again would have more cause for concern. In addition, trends within industries can be telling. For example, in the finance and insurance sectors, he found that 50 percent of incidents involve repeat players, a phenomenon that is ripe for further research.

Romanosky said a “back-of-the-envelope” calculation based on the 12,000-incident data set puts the total annual cost of cyber events at about $10 billion. While granting that this figure is probably not completely correct and that a lot depends on how many breaches go undetected or unreported, he argued that the estimate is useful in a broad sense. “It gives us a general sense based on the information that we have of what we think the annual cost of these events would be,” he said.

Based on a rough comparison to other sources of industry losses, such as retail theft, healthcare fraud, or loss of intellectual property, the aggregate loss from cyber events is relatively small, Romanosky argued. Although the costs of these events may total approximately $10 billion, these costs only represent about 0.4 percent of annual revenue, he said, making cyber events a far less significant risk to businesses than other losses, such as shrinkage and fraud.
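
As a worked example of what a figure on the order of 0.4 percent implies for a single firm, assume a company with $50 million in annual revenue (an assumed figure) and an incident costing $200,000, roughly the typical cost cited above:

```latex
\[
  \frac{\$200{,}000}{\$50{,}000{,}000} \;=\; 0.004 \;=\; 0.4\ \text{percent of annual revenue}
\]
```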

Data breaches and other cyber events are certainly increasing, but Romanosky’s research suggests that the typical costs to companies are relatively small, both in absolute terms and as a proportion of revenue. Thus, although there have been a number of rare, astoundingly costly events, he argued that companies overall may not have a strong financial incentive to invest in the infrastructure and systems required to rigorously protect data.

AN ON-THE-GROUND LOOK AT CONSUMER IMPACTS OF DATA BREACHES

Beth Givens, Privacy Rights Clearinghouse

Beth Givens directs the Privacy Rights Clearinghouse, an organization based in San Diego, California, whose mission is to educate and empower consumers to protect their data and their privacy. Her work focuses on the fallout for consumers when their data is breached, and she offered an on-the-ground perspective on the history of breach notification laws in California, research about the experiences of breach victims, and possible future trends in data breaches.

California was the first state to implement a data breach notice law. The catalyst for the law was a 2002 large-scale data breach of the state’s payroll database, which affected 265,000 employees, including the governor and the state legislature. The resulting law, implemented in 2003, mandated that firms notify those individuals whose personal information (specifically, name in combination with a driver’s license or state identification number, social security number, or financial account number) had been compromised.

Since the passage of California’s initial data breach notice law, the legal framework around data breaches has been continually updated in response to emerging threats and consumer harms. For example, the laws were updated in 2008 and 2009 to include medical and insurance data and harsher penalties after Los Angeles-area hospital staff were caught selling celebrities’ medical data. In 2012, the laws were updated to clarify the often opaque legal language used in data breach notifications. These updates also required additional information to be included in a data breach notification, such as the type of information that was breached, when the breach happened, and contact information for credit reporting agencies, and required breach notices to be posted on the website of the state attorney general.

A 2014 California law required local governments to send data breach notices and expanded the definition of personal information to include online login credentials, such as usernames, passwords, or a mother’s maiden name. In 2015, breached entities were required to provide free “identity theft prevention and mitigation,” which goes beyond credit monitoring, for 1 year if the data breached included a social security number or driver’s license number. In 2016, the law was again revised to better define “encryption” and outline required headings for data breach notices. Those headings are “What Happened,” “What Information was Involved,” “What We Are Doing,” “What You Can Do,” and “For More Information” and are intended to clarify the situation for consumers and empower them to take any needed action. Givens said it is unclear whether these improved notice formats would cause any change in consumer behavior after a breach. She suggested that this would be a fruitful topic for research.
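
The five required headings lend themselves to a simple template. The sketch below assembles a notice skeleton around them; the structure and placeholder text are illustrative assumptions, not statutory language.

```python
# The heading strings follow the notice format described above; the placeholder
# text and helper function are assumptions for illustration only.
NOTICE_HEADINGS = [
    "What Happened",
    "What Information was Involved",
    "What We Are Doing",
    "What You Can Do",
    "For More Information",
]

def draft_notice(sections: dict) -> str:
    """Assemble a plain-text breach notice with one block per required heading."""
    parts = []
    for heading in NOTICE_HEADINGS:
        body = sections.get(heading, "[to be completed]")
        parts.append(f"{heading}\n{body}")
    return "\n\n".join(parts)

print(draft_notice({"What Happened": "On [date], we learned that an unauthorized party accessed ..."}))
```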

A theme throughout Givens’ talk was that breaches are constantly changing, and so too must our responses to them. She noted that the rapid evolution of California’s legal framework around this issue demonstrates how state legislatures are generally able to be more active and nimble than the federal government on this issue. Givens said that the California experience has also shown that requiring public reporting of data breaches can increase transparency of and research access to these events. She noted, as an example, that in California the attorney general has published helpful breach reports and analyses based on the information collected over the years.

Givens also offered insights on the experiences of the consumer victims of data breaches. A key finding from her organization’s close work with affected individuals is that victims most often report confusion after a notification. She suggested that this confusion might stem from the lack of clarity in data breach notices—a factor that recent revisions to California’s data breach notification laws might help address.

Givens observed that the purpose of the notice, however, is not just to inform the victim of the situation, but also to prevent a data breach victim from becoming a fraud victim. To prevent identity fraud, companies typically offer free credit monitoring for 1 year. Givens said that credit monitoring is appropriate if certain information, such as a social security number, is compromised. However, she said it is not helpful if the data breached included credit card information or health information, for example. Credit monitoring in that case only offers a false sense of security to the victim, Givens argued.

One company Givens spoke with offers “identity theft service” that includes monitoring the black market for a customer’s information and making sure that individual is not defrauded. Givens said that this service is actually much less expensive than credit monitoring and can be used for a wider range of compromised information. Givens observed that a conundrum of data breach response is that credit monitoring is considered a “best practice” after a data breach, despite the fact that it is inappropriate and inadequate in many situations, is more expensive than identity theft service, and is widely ignored by consumers. Givens posited that companies are perhaps afraid not to offer it because it has become so widely expected by consumers and regulators.

Another crucial aspect of data breach remediation is what affected consumers choose to do with the information and resources offered to them, Givens said. In her organization’s analysis, Givens was surprised to discover that a mere 5 percent of breach
victims took advantage of the free credit monitoring offered to them after their data had been compromised. She cites two reasons for this low rate. One is that in order to enroll in a credit-monitoring service, users must enter their social security number and/or date of birth. She said that many victims may be understandably risk-averse after being told they were a victim of a breach and, as a result, are reluctant to share such personally identifiable information. The second reason is that, in her view, credit monitoring simply is not perceived as valuable by many consumers. What most want instead is a quick repair if there is identity fraud that results from the breach.

Credit monitoring and identity theft services are efforts to prevent a victim of identity theft from becoming a victim of identity fraud. Givens offered some statistics about fraud: In 2011, one in five breach victims became a fraud victim. In 2012, this proportion rose to one in four, and in 2013, it was one in three. In 2014, this proportion decreased to one in seven, mostly thanks to the remediation efforts stemming from the large point-of-sale data breaches at Target and Home Depot (figures from research provided by Javelin Strategy and Research), Givens said. To rectify those breaches, credit card companies issued cardholders new cards, which was expensive but effective at reducing fraud, as the numbers show.

Givens noted that the nature of data breaches is constantly evolving. Today, almost any nugget of data can be valuable to thieves, not just social security numbers or financial account information. Those involved in perpetrating fraud are casting a much wider net, including information such as school or medical records, online login credentials, and more. Givens said that according to Javelin’s 2015 Fraud Impact Report, “Nearly any piece of information that fraudsters can get their hands on can be used to initiate or strengthen an attack.”

Another change Givens observed is that it can no longer be assumed that data breaches are always financially motivated. Although they are not within her purview, Givens mentioned the recent breaches by suspected Chinese hackers, breaches of medical records, and breaches of personnel files at the Office of Personnel Management. This highly sensitive data may have been compromised for a variety of nonfinancial motivations.

As always, companies are trying their best to stop data breaches from happening in the first place. Chip cards have made credit and debit accounts more secure, but Givens likens this development to “squeezing the balloon—it’s just going to push fraud in other directions.” For example, she said fraudsters may switch to a focus on new-account fraud, in which a person’s social security number or other information is used to open a new credit account, which is quickly spent down before the fraud is detected. Or they may focus more on transactions in which the card is not present—for example, in online transactions.

In Givens’ view, the areas that are perhaps most vulnerable to future fraud include online commerce, healthcare institutions, government agencies, and schools, largely because those institutions all use social security numbers, which can be used to open fraudulent new accounts. Healthcare records in particular contain so much information about a person (and sometimes about the person’s family members) that they are worth far more than a credit or debit card number on the black market, she said. Healthcare institutions should, Givens urged, take every precaution to encrypt patient data and also segregate information based on whether it is medically pertinent. She also suspects that after bigger companies become smarter about data protection, fraudsters will target midsize and smaller places of business, which may not invest heavily in data protection or have time available to train staff in proper handling of sensitive records.

Transparency of data breaches could help consumers when deciding which financial or medical institutions to trust, Givens said. Deirdre Mulligan, University of California, Berkeley, likened breach notifications that are required to be made public (e.g., through publication on the attorney general’s website in the state of California) to the publication of pollution records by the U.S. Environmental Protection Agency. After a breach, she said, consumers might opt to take their business elsewhere, which could be a powerful motivator for companies to better protect their customers’ data. Givens agreed that this was an important point but noted that the research on this matter is somewhat mixed. She noted a study by Javelin found that 70 percent of consumers said they would take their business elsewhere following a breach when, in reality, only 19 percent actually did. However, a change in behavior even by just one in five could still have a significant impact, she said.

In conclusion, Givens said breaches will continue to happen and will likely expand into different spheres, depending on which data are the most valuable and the most vulnerable. The challenge for lawmakers, consumer advocates, and companies, she said, is to keep one step ahead of the attackers.

INFORMATION SECURITY IN THE UNIVERSITY ENVIRONMENT

Tom Murphy, University of Pennsylvania

Tom Murphy, who became the university chief information officer (CIO) for the University of Pennsylvania after decades of experience leading information technology strategy in private and public companies, began his talk with a stark summation of the challenges in information technology (IT) security. “I’ve worked in both public and private companies, and I can tell you from our perspective, information security is the boxing match that never ends,” he said. “And we just keep taking beating after beating.”

As a university CIO, he is responsible for providing appropriate and effective information security within a highly decentralized, diverse university ecosystem. In his talk, he outlined the challenges of working in a university environment and detailed the plan his team developed to balance information access with information security.

Higher education faces the same security challenges as large companies, he said, including the explosion of mobile and cloud computing, the need to handle large amounts of sensitive data, and the responsibility to uphold customers’ expectations of data privacy. However, he noted that higher education has some additional challenges that complicate its security landscape. Most universities, for example, have a tradition of embracing collaboration, freedom of expression, and decentralization of authority and are unlikely to take well to tight management or uniform regulation. In addition, he continued, universities today provide services that extend far beyond their primary educational missions, with many playing host to healthcare facilities, large housing complexes, major sporting events, and independent police departments. As Murphy described them, today’s universities are “more akin to a small city than a single, monolithic company.”

Because of these extra-educational services, most universities must already comply with laws such as the Health Insurance Portability and Accountability Act, the Family Educational Rights and Privacy Act, and numerous other state and federal regulations that protect personal and financial information. However, Murphy argued that universities must go even further in protecting their varied, highly sensitive, and extremely valuable data.

Murphy described his university: The University of Pennsylvania is an enormous, complex place, with an overall operating budget in fiscal year 2016 of $7.74 billion and a research budget of $940 million. It is a research university with 141 research centers and institutes, which means it has a high volume of proprietary and sensitive data. It houses sensitive personal data from about 25,000 students and more than 40,000 faculty and staff members distributed among a wide network of schools, centers, and institutes.

Murphy noted that the overall computing structure of a premier educational and research institution like the University of Pennsylvania must be quick and reliable, but different departments and schools have different information needs, different financial and human resources, and competing priorities within their vast, decentralized governance. Murphy and his team have no direct authority over students or staff, yet they are accountable for the security of all that information. They cannot impose mandates on technology use, yet their users—the students, faculty, staff, and alumni—expect universal, instantaneous access to information. Facilitating effective, appropriate IT security within this complex environment is challenging, but has also generated “a number of information security success stories,” Murphy said, with innovative and effective solutions arising across the university system’s diverse schools and centers.

In 2015, Murphy was tasked by the university’s Institutional Risk Committee and trustees to answer the question of how best to do information security in such a highly decentralized and complex environment. First, he and his colleagues took stock of the people, processes, and technology involved in IT security at the university, and then they added resources to better align with customer demand, including the first campus firewall, as well as analytics and central logging. With a team in the process of doubling from 5 to 10 full-time staff members, Murphy and his colleagues are working to establish a security baseline, create a set of universal recommendations, and increase campus-wide efficiency, training, and education.

Murphy described how each step of this process required a tremendous amount of fact-gathering and diplomacy to provide compelling cases to justify the group’s recommendations and expenditures. It also required relationship-building on Murphy’s part, to win over influential campus groups, such as the faculty senate, and to convince researchers that their life’s work is safer in a secured environment than on an unsecured personal laptop.

Murphy observed that relationships outside the university proved important as well: Peers at other universities, in particular the other Ivy League schools, were valuable resources for information and approaches to emulate, Murphy said. He also cited the value of close relationships with other university CIOs, as well as with Philadelphia’s FBI branch and police department. Despite all the research, diplomacy, and significant investment, however, Murphy emphasized the hard truth that no amount of spending could guarantee complete protection from a data breach.

After establishing a security baseline, Murphy’s team developed the Simplification, Automation, Visibility, and Engagement (SAVE) program. The primary goals of SAVE are to prevent privacy breaches; avoid loss of intellectual property, resources, or reputation; and reduce noncompliance. He noted that the program offers strong recommendations, but no mandates, to improve data security, access, awareness, and training for safe data handling.

While SAVE is meant to help prevent privacy intrusions and other harms from cyber incidents, Murphy’s team fully recognized that some breaches would be unavoidable. As a result, the program also includes a breach-response plan that empowers an incident response team to make decisions on behalf of the university. Within hours of a data breach, the team (following the Planning, Identification, Containment, Eradication, Recovery, Lessons learned model) begins determining a communications plan, identifying the cause, and putting containment measures in place.
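
A minimal sketch of such a phased plan is shown below. The phase names follow the model cited above, while the example actions are assumptions rather than the university's actual procedures.

```python
from enum import Enum

# Phase names follow the model named in the text; the example actions are
# illustrative assumptions, not Penn's actual incident response plan.
class ResponsePhase(Enum):
    PLANNING = "Agree on roles, contacts, and a communications plan in advance"
    IDENTIFICATION = "Confirm the incident and identify its likely cause and scope"
    CONTAINMENT = "Isolate affected systems and accounts to limit further exposure"
    ERADICATION = "Remove malicious code and close the vulnerability that was exploited"
    RECOVERY = "Restore services from a known-good state and watch for recurrence"
    LESSONS_LEARNED = "Document what happened and update the plan, tools, and training"

def walk_response_plan():
    """Print each phase in order with its illustrative first action."""
    for phase in ResponsePhase:
        print(f"{phase.name.replace('_', ' ').title()}: {phase.value}")

walk_response_plan()
```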

Murphy noted that the team also has an annual SPIA (Security and Privacy Impact Assessment) program that assesses internal compliance with the recommendations that keep student, operational, financial, medical, or other data secure. While no penalties
exist, finding security lapses motivates Murphy’s team to work closely with a department to eliminate such gaps and better protect data. Every school or center is also required to have a security liaison, a staff member who receives security training and becomes the “eyes and ears” for data security on their site.

University research is particularly vulnerable to intellectual property or data theft, and such an enormous, decentralized environment cannot fully control all its data access points. Murphy addresses these concerns through yearly assessments, and through the office of security liaisons. Extra staffing and security are also needed for applications that require special handling, such as data encryption or protection from physical theft. Yet, he said, his greatest fear remains the “unknown unknown”—an act that is completely unpredictable.

Murphy observed that there are also “known knowns” that even the most robust security system cannot protect against. One of the biggest, he said, is human error. “No matter how good the tools are that I put into place, we’re dependent on our constituents to help secure our data and our systems,” he said, emphasizing that for a system to truly be secure, it must be easy for nonexperts to use. While analogies to crime or war are common in discussions of cybersecurity, Murphy pointed to healthcare as perhaps a better analogy for cybersecurity today. In healthcare, most people are not doctors, but patients. The same is true in cybersecurity: most people are not security experts, but everyday users. Continuing the analogy, he noted that while everyone gets sick at some point, vaccines, immunizations, and basic hygiene can keep most people healthy most of the time. Similarly, in cybersecurity, he said, there might be inevitable security breaches, but basic security measures can keep most intruders at bay. In addition, no two diseases or breaches are exactly alike, and a given diagnosis may have multiple treatments, which may work in some situations but not in others.

Taking the discussion further, he noted that evidence-based medicine optimizes decision making by relying on well-designed research, which has led to effective solutions such as the use of checklists in improving hospital sanitation. Murphy argued that the information security community should use similar practices. A succinct checklist of responsible computer or smartphone ownership, repeated often, he suggested, could significantly reduce breaches. Users need frequent reminders, but it’s worth it. After all, even doctors need to be reminded to wash their hands. “The [IT] community must agree on the four or five most critical basics of what people need to know to safely own a computer and create a universal and oft-repeated campaign to make sure they are not forgotten,” Murphy said.
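
By way of illustration only, such a checklist might look something like the following; the specific items are assumptions offered as examples, not a list endorsed at the workshop.

```python
# Illustrative only: one possible set of "critical basics" of the kind Murphy
# describes. The specific items are assumptions, not a list from the workshop.
SECURITY_BASICS = [
    "Keep the operating system and applications up to date",
    "Use a unique, strong password or passphrase for every account",
    "Turn on two-factor authentication wherever it is offered",
    "Treat unexpected links and attachments as suspect before clicking",
    "Back up important data so it can be restored after an incident",
]

for number, item in enumerate(SECURITY_BASICS, start=1):
    print(f"{number}. {item}")
```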

Despite an emphasis on public education and awareness around data breaches, the truth is that people make mistakes or bad security decisions all the time, Murphy said. Information security could benefit from a “public health” campaign encouraging everyone to better protect their data, which would make for a more secure environment overall, Murphy argued. While his office is now launching a campus campaign to deeply engage the entire community, Murphy recognized that his efforts have their limits. “I can’t stop all of that behavior. All I can do is educate,” he said.

AN INDUSTRY PERSPECTIVE ON BREACH DISCLOSURES

Heather Adkins, Google, Inc.

As manager of information security at Google Inc., Heather Adkins offered the perspective of someone who spends each day assessing and responding to information security threats in the private sector. She is responsible for Google’s global data security—no small task at a company with more than 60,000 employees in more than 40 countries. Adkins’ talk focused on the benefits, and some downsides, of data breach disclosure in four main areas: assistance to customers, identification of attackers, deterring future attacks, and aid in data-security education for everyone.

Adkins noted that a primary reason companies disclose data breaches is that they are required to by law. At Google, Adkins said there are several additional factors at play. An important reason Google supports disclosure, for example, is because it is ethically the right thing to do, Adkins said, noting that disclosure aligns with one of Google’s mottos, which is to “put the user first, and all else will follow.” Disclosing a breach is the right thing to do because it empowers the user to take action to defend themselves against fraud, and because it empowers users to make informed choices as consumers.

Some consumers might elect to leave a company that has compromised their data; however, Adkins offered a few examples suggesting that this is uncommon, and perhaps impractical. She noted that the highly publicized breach at Target, for example, wound up costing the company an estimated 0.1 percent of its 2014 sales (after subtracting insurance payouts and tax deductions), and the company’s profits and stock have risen 21 percent since then, suggesting that few consumers actually made the decision to “vote with their feet” by taking their business away from Target. Another example she observed is that after the 2014 Sony breach, the company is estimated to have lost 0.9 to 2 percent of its sales for the year, yet the event gave the company free publicity for a movie, The Interview, that might otherwise have been more of a financial loss.

Because breaches have become so frequent and widespread, Adkins said consumers’ apparent reluctance to leave companies that disclose data breaches may also
be attributable to the fact that a consumer would be hard pressed to find a company that has not been compromised. The truth, she said, is that no company’s data is completely secure. “Every organization is compromised, and they’re compromised repeatedly whether they know it or not,” Adkins said.

Citing personal experience of data breaches and fraud, Adkins argued that credit monitoring should become a consumer’s default status, not a temporary, 1-year service used only after a known breach has occurred. Adkins described how she became a fraud victim in 2007 as a result of a data breach that had occurred in 2003. In such a case, she said, a 1-year period of credit monitoring after the 2003 breach would not have been sufficient to prevent or alert her to the subsequent fraud 4 years later. “We should just be offering [credit monitoring] by default all the time—it should be something you always have,” she said. However, she acknowledged that this is not a foolproof fix; despite near-constant credit monitoring and multiple credit card re-issues, she noted that her personal data is likely still vulnerable to breach and fraud.

Another reason to disclose data breaches is to aid in the deterrence of future attacks, Adkins argued. For example, disclosure and attribution of state-sponsored breaches could start a more official conversation between nations. Or, financially motivated attackers might reconsider targets that are publicizing their stringent security measures. However, she also observed that in some situations it is possible for disclosures to backfire by feeding valuable information to attackers, raising difficult questions around when and how to disclose a breach and its suspected causes or responsible parties.

Adkins said that it is useful to consider an attacker’s motivations when deciding how to respond; for example, a hack could be motivated by espionage, financial gain, or simply fun and fame. Companies such as Google sometimes use warnings or alerts as a remediation when a breach or attack is detected, but that can pose its own challenges. She noted that disclosing information about a breach can be dangerous if the breach is misattributed, misdiagnosed, or otherwise misunderstood. For example, Adkins posed a hypothetical situation in which two attackers hack into a company’s system but only one is caught: “Then the other one gets to watch us play incident response against the other, and they can do their jobs much better. In fact, disclosure gives the attacker situational awareness, and we must take that into account,” she said.

Often breaches become a game of cat-and-mouse, with each party trying to guess what the other knows. Adkins said that Google might choose to alert users who may be targeted by hackers by posting a pink status bar at the top of the person’s Gmail account. The attacker can also see the pink bar, and so they know that Google knows there has been a breach, but they might not know exactly what Google knows. In another example, she described a 2013 incident in which a security software company was breached; the company posted on its website some of the evidence for the breach, such as where the
malware was found. For the attacker, that disclosure may have offered valuable clues about what was discovered, and, more importantly, perhaps what was not discovered, such as a second “back door” that the attacker could continue to exploit. Acknowledging that this is a difficult tension, Adkins said that although it is not advisable to publicly lie, deceiving your attackers can be an important tactic. “I think we can make smart choices about what we disclose and in what detail . . . We need the laws to not be so prescriptive that we have no agility in those situations,” she said.

Another reason for breach disclosure is to improve the security awareness and habits of consumers and employees. Adkins told a story of a breach in 2010 that affected data from Google and at least 20 other companies. Employees were required to change their passwords, lost their access to virtual private networks, and were directed to take other precautions without knowing why, causing understandable frustration on the part of many affected workers. Once the breach was disclosed, she observed that the experience presented a useful starting point for a deeper conversation with employees about information security and vulnerabilities.

One of Adkins’ roles at Google is to engage employees on the day-to-day security issues that they encounter. Through education campaigns, she noted that Google employees have learned to identify phishing attacks, detect penetration tests, and identify software vulnerabilities. Disclosing known attacks to employees offers valuable fodder for engaging in collaborative conversations about improving data security.

Adkins closed with a few comments about formal groups for information sharing (ISACs), which have recently sprung up in many industries as a means to facilitate company-to-company exchange related to preventing breaches. One reason for skepticism about their benefits and effectiveness, she said, is that there are a large number of small startup companies that are left out of the conversation, largely because they are not well equipped to detect breaches affecting them and thus have little to share. “If you have nothing to share, nobody will share with you,” she said. But when large companies suffer and disclose breaches, “it’s a magnet for information sharing when you are able to disclose and to talk very openly about what happened about your organization.”

Adkins posited that information sharing works best among small communities in which players have established deep relationships. When asked by Bob Blakley, CitiGroup, Inc., about that viewpoint, Adkins noted that Google has indeed benefitted from Structured Threat Information eXpression (STIX™) and Trusted Automated eXchange of Indicator Information (TAXII), which seek to standardize and automate the exchange of
information about cyber threats. However, to truly have an impact, information sharing must go beyond indicators: “It has to be a rich conversation about attackers, who they are, their modus operandi, their speed of attack through victim A, through victim B. There’s a lot to a forensic analysis that is helpful outside of IP addresses,” Adkins said.
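
For context, the sketch below shows the general shape of a machine-readable indicator of the kind STIX and TAXII are designed to exchange, and why, by itself, it conveys little of the richer forensic context Adkins describes. The field names loosely follow STIX 2.x JSON conventions and the values are invented; the OASIS specifications are the authoritative reference for the actual schema.

```python
import json

# Field names loosely follow STIX 2.x JSON conventions; all values are invented.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": "indicator--00000000-0000-4000-8000-000000000000",   # placeholder UUID
    "created": "2016-01-12T00:00:00.000Z",
    "modified": "2016-01-12T00:00:00.000Z",
    "name": "Command-and-control address observed in a phishing campaign",
    "pattern": "[ipv4-addr:value = '203.0.113.10']",            # documentation-range IP
    "pattern_type": "stix",
    "valid_from": "2016-01-12T00:00:00Z",
}

print(json.dumps(indicator, indent=2))
```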

Companies are required to disclose information about data breaches to the government, and many also disclose this information to users and employees for a variety of other reasons. In addition, William Sanders, University of Illinois, Urbana-Champaign, pointed out that detailed information about these breaches could have a lot of value for academic research; measuring the user’s response, and any attacker’s counterresponse, to a warning on an Internet browser, could be a rich place for Google and academia to partner, for example. Adkins concurred, saying that conducting studies on users and employees would likely be more feasible than studies on adversaries. Sharing Google’s breach data would require removing any personally identifiable information, and so it might be easier, safer, and more useful for Google to share data around an attack: how it started, what techniques were used, and so forth. She further observed that additional complications include the fact that huge attacks, like those at Target and Home Depot, are not characteristic of most data breaches, and also, many breaches simply go undetected. Some academic researchers have tried creating sham companies and databases to lure attackers for research purposes, but Adkins noted that it is extremely difficult to pull off this approach.

For both academic research and companies’ breach prevention purposes, Adkins said that the information that is likely most useful is that concerning how the attack occurred. Although many companies are capable of identifying that an attack has occurred, she noted that fewer are equipped to “build out the full kill chain” and trace the root causes of the breach. Finding—and sharing—such information would be a valuable weapon against future breaches.

DATA BREACH AFTERMATH AND RECOVERY

Bob Belair, Arnall Golden Gregory, LLP

Bob Belair is a partner at the law firm Arnall Golden Gregory, LLP, where he serves as practice leader of the Privacy Practice Group and co-chair of the Government Relations Practice. He has been active in data privacy issues since 1971, when he was a law student assistant to Alan Westin, a celebrated legal scholar whose work was instrumental in defining information privacy in the modern era. Belair began by recounting how he had assisted Westin on his second book, Databanks in a Free Society, which grew out of a National Academy of Sciences study that articulated a comprehensive approach to
information privacy, later popularized by the Privacy Protection Study Commission and what was then the Department of Health, Education, and Welfare.

Belair’s presentation focused on the thorny legal issues companies face in the aftermath of a data breach, and it stressed the need for standards across all aspects of breach prevention and remediation. In his view, the current lack of good remedies for data breaches underscores the need for further scrutiny and the type of “thoughtful and creative” solutions for which the Academies are known.

Belair began with a discussion of common ramifications after a breach and the harms and remedies that may be involved. He contended that class-action lawsuits do not represent an effective remedy for most data breaches. The research, document discovery, and attempts to prove concrete harm for these suits add up to a great deal of expense, and worse, in his view, class-action suits are not generally beneficial to consumers because they tend to award little money to the actual victims—those consumers whose information has been breached.

He noted that other types of lawsuits can be brought after a breach and spoke specifically about shareholder derivative or employee lawsuits. He observed that although data breaches have not proven to cause significant financial losses for their companies in the long run, in the short run, they can be very damaging for stock prices, and thus for their shareholders, who might try to bring suit. In addition, he noted that employees could also sue the company if it is their data being compromised, regardless of whether it was an employee responsible for the breach in the first place.

Belair described how there can be numerous longer-term costs for companies after a data breach, apart from those related to litigation or settlements. Employees and senior executives can lose their jobs; for example, Target’s CEO resigned after the company’s 2013 breach. In addition, he noted that there is the clear potential for harm to the company’s reputation and its future costs of auditing and security can skyrocket. Reiterating the impact of even short-term stock plunges, Belair emphasized that these events can be a significant blow to a company’s bottom line as well as the stock market more broadly.

Belair described how companies also face regulatory actions after a data breach. He noted that the Office for Civil Rights of the U.S. Department of Health and Human Services, the Federal Trade Commission (FTC), and other government agencies may get involved, depending on the nature of the data that was breached. Despite often being their opponent in lawsuits, Belair praised these agencies for being reliable, vigilant, and fair. He singled out the FTC in particular for focusing on breaches that showed gross negligence or malicious intent (such as a shocking eight separate breaches at one state university system) instead of breaches with little evidence of wrongdoing. As an example, he characterized a data breach lawsuit from 2008, in which Belair argued against the FTC, as a fair investigation, and one that resonated throughout the industry. “The point is, we
need the FTC on the beat here,” he said.

In fact, Belair posited that the FTC’s power is currently too limited. Enabling the FTC to prescribe what constitutes adequate information security could lead to a more effective security standard that would help to prevent breaches. Other regulatory bodies, such as the Federal Communications Commission, the Securities and Exchange Commission, the Consumer Financial Protection Bureau, and state attorneys general, might also provide valuable input, he said.

Despite what he views as a true need for a national standard for data security and breach recovery, Belair said he is not optimistic that one will emerge. Pointing out that there have been a number of failed bills introduced in Congress, he contended that no privacy bill would be balanced in a way that would satisfy both consumer advocates and businesses. Privacy is a complex issue, and he observed that lawmakers can be reluctant to pass a bill that can have unforeseen consequences.

Belair offered an outline of some of the many questions to answer before a standard can be developed: for example, What is the right standard for liability—strict liability, negligence, reasonableness, or some combination of these? How much should be offered in damages? What types of information should be protected, and should there be state, federal, or court-imposed requirements for data security?

Belair brought up the breaches at Ashley Madison and VTech as situations without obvious financial harm but with potentially deep psychological harms. In response to a question from Yoshi Kohno, Belair considered whether it might become more common to see breaches in which personal or reputational damage is the main goal, rather than financial gain or fraud. For example, he said, in this presidential election year, it would not be surprising to see a major breach intended to undermine the campaign of a presidential candidate. Data breach laws, in his view, are not nuanced enough to cover these situations, even though such leaks can be serious, potentially devastating events in people’s lives.

Belair closed by highlighting three constructive changes he has observed over the past 20 years. First, he emphasized that real progress has been made around data security. For example, most states have some form of data privacy law in place, and the Gramm-Leach-Bliley Act requires financial institutions to safeguard consumer data. Second, he expressed optimism that technology will continue to offer fixes, such as encryption, that make data security easier and better, although, of course, companies will have to decide to invest in them. And finally, Belair said he believes the culture around data security is changing. Consumers increasingly have higher expectations for data security, and he noted that companies are now making clear investments in meeting these expectations by creating positions such as chief security officer, chief privacy officer, and chief compliance officer.

Prompted by Eric Grosse, Google, Inc., Belair addressed the tensions between information sharing and legally privileged company information. He offered the caveat that, as a lawyer who represents breached companies, his perspective is likely different from that of many others in the room. Belair noted that it is natural for companies and their legal teams to take advantage of whatever opportunities and resources are available. He said that companies can often satisfy the needs of the FTC or other regulators without divulging privileged information, whereas in the context of litigation, privilege must be used in a much more limited way.

In a theme that ran throughout his presentation and the discussion that followed, Belair expressed concern over the state and influence of cybersecurity insurance. With dozens of such insurance carriers today—up from just six in 2008—the vast majority of companies have cybersecurity insurance. Belair described how, when issuing policies, insurers collect enormous amounts of data on the companies they cover, requiring comprehensive questionnaires and documentation covering all aspects of a company’s security measures, business processes, and employee policies. Because of the lack of federal policies for security standards, Belair said these insurance companies are essentially creating security policy. Describing a specialist insurer client of his as an example, Belair said that the company holds such a large share of the cybersecurity insurance market that it “probably makes more pervasive and important decisions about cybersecurity inside America’s corporations than any other entity.”

Policy created by insurers may or may not be the optimal way to create national standards, Belair said: “The question for you guys is, does that make sense? Is that what we want?” The answer may be yes—Bob Blakley pointed to the success of fire and building codes, which were largely driven by the insurance industry, as an example of the potential benefits of this approach—but Belair emphasized that it is still a question worth asking.

BREACH RESPONSE—VIEW FROM THE TRENCHES

James Harvey, Alston & Bird

James Harvey, a partner in the Technology and Privacy Group at the multinational law firm Alston & Bird, was among the first attorneys in his firm to establish a special practice in data privacy and cybersecurity. As co-chair of Alston & Bird’s Privacy and Security Task Force and the firm’s Cybersecurity Preparedness and Response Team, he has focused nearly exclusively on data breaches and breach responses for the past several years.

In his talk, Harvey attempted to demystify the actions lawyers often recommend to a company after it discovers a data breach. From a legal perspective, everything changes once a company discovers or is notified of a breach, he said, describing the complexities of a breach response as “a pressure-cooker of an experience” and “truly a corporate root canal.” However, this process can begin slowly, he said; rather than a bright line being crossed, it is often more like an evolving realization as evidence accrues, until the point at which the company truly understands what has happened and what its liability is.

Harvey noted that it often takes a third party to discover that a company has been hacked. Sometimes companies are contacted, formally or informally, by a government agency such as the FBI. Reporters also break these stories through connections within the hacker community or at other companies. Harvey mentioned Brian Krebs, a former Washington Post journalist who now writes the blog Krebs on Security and is a well-known and widely respected source of information on security breaches. Harvey said Krebs is often remarkably accurate, and that he himself constantly monitors Krebs’s Twitter feed for stories about his clients. When Krebs notifies a company about a potential breach, Harvey said, that company goes on high alert.

Harvey described how the data breach at the payment processing company Global Payments in 2012 illustrates the power Krebs and other security journalists wield when they disclose a possible breach. He said that the breach was reported on between 9:00 and 9:15 a.m. on a Friday, and within 2 hours, the company’s share price had dropped 24 percent, and trading in the stock was halted altogether. Because Global Payments is a publicly traded company, the breach had enormous implications not only for the company itself, but also for its investors and the wider financial services industry. Harvey pointed out that companies suffer these implications regardless of whether a reported breach turns out to be true or not, and he cautioned that journalists should be cognizant of the disastrous consequences data breach stories can have.

Harvey discussed how, once they suspect a breach, executives have many decisions to make: Should the company put out a press release? When? What if the first release is wrong, and they have to retract the initial report? What effect will disclosure or nondisclosure have on the company’s stock price, customers, or reputation? It is the legal team’s job to be aware of the potential liabilities when advising clients. Harvey said that companies must also be aware that suspected or potential data breaches are tantalizing stories for the media, prosecutors, regulators, and other stakeholders. Disclosures and their timing can have immediate and lasting effects on stock prices, customers’ trust, and other downstream consequences. In some cases, he said, even Congress will probe exactly what the company did from the first red-flag event until the breach was investigated and disclosed.

Turning to Target’s major 2013 breach, Harvey noted that the company stated in an initial press release that it had “identified and resolved the issue.” Two weeks later, another press release admitted that the breach was more extensive than the company had realized and was not, in fact, resolved. While companies may want to rush to disclose, Harvey said that he advises them to choose their language carefully in order to avoid issuing retractions or incurring lawsuits or regulatory fines. He encourages clients to view breaches as a brand-building experience; customers might remember the breach if the company handled it well, but they will certainly remember it if the company botched it.

Cyber incidents are expensive, and Harvey said that one of the highest costs is the investigation. Within hours of learning about a potential breach, he said, incident response forensic investigators, who operate under legal privilege on the company’s behalf, are called in. In the case of a large breach, Harvey said that the cost of this response team can be five times higher than the legal costs.

Harvey highlighted how far-reaching and expensive data breaches are, as well as how complicated the web of liability can be. In the 2015 breach of Anthem, Inc., for example, although the data was breached on Anthem’s servers, it was Anthem’s customers—the administrators of benefit plans and insurance companies—who had to notify affected consumers, he said. In the 2013 Target hack, Target was liable, to the tune of tens of millions of dollars, to the credit card companies whose card numbers were stolen. In fact, Harvey observed that credit card companies have a great deal of power when it comes to data breaches involving credit card information; he categorized the companies as “judge, jury, and executioner” in these situations. “It’s a very difficult circumstance,” he said. “From a monetary exposure perspective, particularly for retailers and those involved in the payment card business, this is where the real money is at issue.”

Right now, Harvey said, the United States has a patchwork of state laws for handling data breaches, as opposed to one clear and straightforward system, yet plaintiffs often rush to bring data breach suits that can threaten confidential company information. When a company shares breach information, officially or unofficially, with a third party—such as law enforcement, regulators, or an information sharing and analysis center (ISAC)—that information may become public during a lawsuit, potentially exposing the data and the company to additional risk, he noted. Harvey expanded on this point in response to a question raised by Richard Danzig, Center for a New American Security, specifying that malware signatures and IP addresses will likely be shared, but companies have a legitimate interest in keeping personally identifiable data, as well as information about the ongoing investigation, under privilege in the context of breach investigations or lawsuits. Although forensics investigators may uncover a trove of information, that information is generally gathered under legal direction in anticipation of a lawsuit and treated accordingly. Harvey recognized that this delicate balance of sharing or not sharing information can raise tensions, but he pointed out that companies have a right to a certain level of privacy while they are trying to investigate and respond to a breach.

Harvey switched gears and asked a question that provoked a wide-ranging discussion among attendees: Is there true consumer harm when a credit card number is stolen? If no identity fraud is committed, he posited, the consumer has not suffered a tangible loss. “A lot of times there is no consumer harm, in terms of those pure dollars out of pocket,” he said. “The other types of harm are intangible, if you will, and much more difficult to quantify.”

Fred Schneider pointed out that time spent resolving the issue is indeed a cost and, therefore, represents a harm to a consumer. Fred Cate, Indiana University, suggested that credit card companies could reduce this burden by making the process of changing one’s credit card numbers after a breach more automated, for example, not just by sending a new card in the mail but by making it easy for consumers to automatically update the card numbers on all of their regular charges. Expanding on this idea, Bob Blakley noted that payment systems such as Apple Pay and Samsung Pay use one-time numbers that are distinct from the primary account number for each transaction, suggesting that such technological solutions could help further reduce the burden of replacing credit cards after a breach.
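
The one-time-number idea Blakley described can be illustrated with a small sketch. The model below is hypothetical and greatly simplified—it is not how Apple Pay or Samsung Pay actually implement tokenization, and the names `TokenVault`, `issue_token`, and `redeem` are illustrative rather than any real payment API—but it shows the core property: a single-use token stands in for the primary account number, so a token stolen from a breached merchant database cannot be reused.

```python
import secrets

class TokenVault:
    """Hypothetical issuer-side vault mapping one-time tokens to a
    primary account number (PAN). Real payment networks are far more
    involved; this only illustrates the single-use idea."""

    def __init__(self):
        self._tokens = {}  # token -> PAN, each usable exactly once

    def issue_token(self, pan: str) -> str:
        """Create a fresh single-use token for one transaction."""
        token = secrets.token_hex(8)
        self._tokens[token] = pan
        return token

    def redeem(self, token: str) -> str:
        """Resolve a token to the PAN, then invalidate it so a stolen
        copy from a breached merchant database is worthless."""
        return self._tokens.pop(token)  # raises KeyError if unknown or reused


vault = TokenVault()
t = vault.issue_token("4111111111111111")
print(vault.redeem(t))   # settles the one transaction
# vault.redeem(t)        # a second use would fail: the token is already spent
```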

Tying into this discussion, Harvey raised the issue of “breach fatigue,” suggesting that after so many breaches, consumers no longer judge breached companies as harshly and no longer take breach notifications as seriously as perhaps they once did. But while the disclosure of credit card numbers alone may represent limited harm in his view, combining credit card data with additional personal details such as Social Security numbers, fingerprints, or federal background checks amounts to a potentially much bigger problem. The threat is particularly acute, he noted, if such information falls into the hands of a foreign adversary as part of a nation-state-backed hack.

Paul Kocher, Cryptography Research, Inc., expanded on the different types of harms that can result from breaches, citing suicides that allegedly resulted from the Ashley Madison breach. In addition, he noted that a reporter told him she is wary of covering cybercrime because of the risk that hackers will make her personal information public, a practice known as doxxing. He pointed out that these are real harms that remuneration or credit monitoring cannot resolve.

Turning from consumer harms to the harms suffered by a breached company, Yoshi Kohno raised the question of whether there is any indication that breaches might—in the future, or even today—result not only in the exfiltration of data but in the modification of a company’s records. Harvey said this is a real danger that can call a company’s publicly reported numbers into question, but it depends on what data has been accessed. It can be extremely difficult to determine exactly what data has been accessed, acquired, or even modified, and all of these unknowns can have ramifications for the company and its liability, he said. As a result, data breaches can create a legal quagmire for consumers and for companies, with far more questions than answers.

THE CHALLENGES OF REMEDIATION

David Vladeck, Georgetown University

David Vladeck, currently a professor at Georgetown University Law Center, formerly served as director of the Bureau of Consumer Protection at the FTC, where he oversaw numerous data breach and privacy-related investigations. He shared his perspective on the inadequacy of current data breach remediation measures, addressed the motivations for breaches and the types of harms they cause, and raised pointed questions about ways to not only mitigate the effects of breaches, but actually prevent them from occurring in the first place.

Data breaches are all too common and affect a huge swath of the population, essentially making normal what should, in Vladeck’s view, be considered unacceptable. In 2014 alone, 110 million Americans—more than one-third of the population—had their data breached, he noted. Vladeck began by emphasizing how important it is to find better solutions for data breaches, which in his view have not been properly addressed in the policy sphere. “I’m really delighted that this group is taking a hard look at this problem, because this problem has plagued us for a long time and it has not received the attention I think it deserves from policy makers,” he said.

Expressing little confidence in current remediation efforts, Vladeck characterized typical breach remedies as “crude,” “a poor substitute for avoidance,” “the least robust remedy available,” “the last step in a cascade of bad options,” and “simply an effort to staunch a wound, when the wound has already been inflicted.” Financial remediation, he said, cannot return the user to the pre-breach status quo, because the information is still vulnerable. It also cannot begin to address the $25 billion aggregate annual losses to businesses from identity theft in the United States, a figure Vladeck noted is $11 billion larger than the aggregate loss from all property crime. In response to a question raised by Bob Blakley, Vladeck noted that although consumers likely do not end up footing the bill for that full figure in dollars, “they pay with their time, with their mental health” as they go through the difficult processes required to correct credit card fraud or claim identity theft.

In the case of a breach involving nonfinancial information, such as children’s data, medical data, or private photographs, Vladeck said, “remediation is essentially a mirage.” No amount of credit monitoring, credit freezes, or other currently available remediation tools can offer comfort to victims who, for example, had their medication use, their children’s information, or their personal photos exposed in situations in which they had an expectation of privacy. “There’s no remediation for this kind of outrageous assault on privacy,” Vladeck said.

Because remediation does not currently address the harms done by data breaches, Vladeck urged attendees to devise strategies “to reverse the tide of data breaches in the first place.” Two changes are needed if that is to happen, he said: companies and the government must secure data in a way that is appropriate to its level of sensitivity and volume, and lax security practices must be penalized to a far greater degree than they are currently.

In order to assess whether companies are securing data appropriately, what constitutes appropriate data security must first be defined. To do this, Vladeck suggested that researchers should conduct retrospective analyses of data breaches (once the details have been publicized and anonymized). Vladeck said that when the FTC brings data breach cases to court, it does make those elements public, but for every breach case, there are many more investigations where potentially valuable information—rich data from which researchers could glean insight into how breaches happen—is not publicized, due to existing FTC statutes. Although researchers, as a result, only have access to what Vladeck said is “the tip of the iceberg” in terms of total data breaches, that information can be a useful starting point for greater insights into what works and what does not in terms of data security protections.

Vladeck noted that the most common type of data breached—and the subject of the majority of the cases the FTC has brought so far—is personally identifiable information. He contended that these breaches can often be attributed to “inexcusably poor security measures by the company.” As an example, he cited the 2008-2009 breaches at Wyndham Worldwide Corp., which was breached three times in 18 months, resulting in the credit card information of more than 600,000 people winding up in the hands of the Russian mafia.

Vladeck noted that the release of personally identifiable information has concrete harms because it allows criminals to commit identity fraud, a crime that has skyrocketed in recent years. He said that in 2014, the FTC fielded 332,000 identity fraud complaints, a number that only includes those who completed the FTC’s exhaustive online form. The agency estimates that for every completed form, there are 10 to 20 other victims who either never completed the form or never even knew to file one in the first place—amounting to an estimated 1,000 identity theft victims each day.

The enormous growth in identity fraud coincides with the enormous growth of the Internet economy, said Vladeck; he cited several examples of vulnerabilities in home Internet use. TRENDnet recently settled a case in which its baby monitors were using an unsecured wireless video feed that allowed hackers to easily gain access to videos of children and individuals inside their homes. FrostWire, a peer-to-peer file-sharing application, settled with the FTC because its default setting made it too easy for customers to unwittingly expose their personal files. The so-called “Internet of Things,” a trend in which our daily lives and homes are becoming ever more tied to the Internet, further increases our vulnerability. Vladeck imagined a “smart” refrigerator alerting its owner that he has run out of beer through an insecure connection with no data encryption. While this may not be personally identifiable information, that doesn’t mean that it shouldn’t stay private.
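
Vladeck’s hypothetical refrigerator illustrates a gap that is straightforward to close in principle. The sketch below is only an illustration of the general point, not a prescription for any particular product; it assumes the widely used third-party Python `cryptography` package and an invented telemetry message, and shows authenticated symmetric encryption of a device report so that an eavesdropper on the connection sees only ciphertext.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would be provisioned to the device and the vendor's
# backend at manufacture or pairing time; generating it inline is only for
# illustration.
key = Fernet.generate_key()
device_side = Fernet(key)
backend_side = Fernet(key)

report = b'{"device": "fridge-42", "beer_remaining": 0}'  # hypothetical telemetry
ciphertext = device_side.encrypt(report)  # authenticated encryption (AES + HMAC)

# An eavesdropper sees only the opaque token; the backend can decrypt it and
# will detect any tampering (an InvalidToken exception is raised).
print(backend_side.decrypt(ciphertext))
```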

In addition to intentional hacks, Vladeck observed that the accidental release of information presents a serious concern for data security and privacy. As an example, Vladeck pointed to an accidental release of data about Prozac users by the drug company Eli Lilly in 2002. Pharmacies use unsecured trash receptacles to dispose of sensitive medical information. Stiff penalties might be the only way to stop companies from being so egregiously careless with private, potentially valuable information, Vladeck said.

Like the other speakers at the workshop, Vladeck recognized the difficulty of quantifying harm to a consumer after a breach. To address this, he suggested drawing ideas from fields that quantify harm probabilistically. For example, in medicine, it is impossible to say that being exposed to a toxic substance means a person is 100 percent likely to develop cancer, but the exposure is considered harmful if it raises the person’s cancer risk past a certain threshold. Similarly, Vladeck noted, when a person’s data is breached, it is not guaranteed that identity fraud will occur today, but, as the criminal’s intention is likely to use or sell that data, there is undoubtedly an increased risk of identity fraud in the future. In that sense, the harm is virtually certain, although it is impossible to predict when the fraud will occur. “We need to recognize that the risk of identity theft itself is a real harm, just the way the risk of cancer and the risk of other things that are bad are real harms,” Vladeck said.
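
Vladeck’s toxic-exposure analogy can be made concrete with a back-of-the-envelope calculation. The numbers below are invented purely for illustration; the point is only that an elevated probability of future fraud translates into a positive expected loss today, even before any fraud has occurred.

```python
# Hypothetical figures, chosen only to illustrate the risk-based framing.
baseline_fraud_rate = 0.03  # assumed annual chance of identity fraud, no breach
post_breach_rate = 0.12     # assumed annual chance after one's data is breached and sold
average_loss = 1400.0       # assumed average out-of-pocket and time cost per incident

added_risk = post_breach_rate - baseline_fraud_rate
expected_added_loss = added_risk * average_loss

print(f"Added annual risk attributable to the breach: {added_risk:.0%}")
print(f"Expected added loss per affected consumer: ${expected_added_loss:.2f}")
```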

As for tangible financial harm after a breach, Vladeck rebutted the suggestion that current measures to protect consumers against fraudulent credit card charges make a credit card breach essentially harmless: “It is simply not true that the people whose financial information was taken as the result of the big breaches suffered no loss.” While consumers are protected from unauthorized credit card charges, that right is time limited, he noted. In a case like Wyndham, in which the fraud unfolded over the course of 18 months and some victims were not notified until years later, that time window might have closed. In addition, to benefit from these protections, consumers must notice a fraudulent charge in the first place, which Vladeck suggested is often not the case. Thieves are creating sophisticated, believable charges that don’t raise red flags. They “are not stupid enough to put ‘Russian mafia’ on your credit card statement,” said Vladeck. Rather, they create a small charge, such as $7.99 or $9.99, with a generic name. He cited a scam based in Eastern Europe that used the names of U.S. presidents as part of the names of phony companies, in the hopes that consumers wouldn’t notice anything amiss. In that case, he said, the scammers judged consumers correctly and were able to skim hundreds of millions of dollars.

While one could argue that consumers must be responsible for their own protection, Vladeck contended that, practically speaking, that is incredibly difficult. A consumer must not just own a computer but be a skilled user. After a breach affecting financial data or personally identifying information, phone calls to the organizations theoretically capable of addressing identity theft or fraud typically offer little assistance: calling a credit monitoring service, or even the Internal Revenue Service, often gets a victim nowhere, he said. For these reasons, Vladeck is emphatic that consumer harm is real. There is virtually no pathway to follow after a breach of nonfinancial information, such as medical data or private photographs; in those situations, the onus of reclaiming one’s identity falls to consumers. When a child’s data is breached, the fraud often isn’t discovered until the child reaches adulthood and needs to rely on his or her identity to open a credit account, buy medical insurance, or apply for a loan. In these instances, again, current remediation efforts provide little value.

Vladeck concluded by naming another harm from today’s constant stream of data breaches: a growing mistrust of the Internet economy. Computer hacking was the crime most worrisome to Americans in 2014, and more than 60 percent of Americans are concerned about the security of their credit card information, phones, or computers.1

___________________

1 R. Riffkin, 2014, “Hacking Tops List of Crimes Americans Worry About Most,” October 27, http://www.gallup.com/poll/178856/hacking-tops-list-crimes-americans-worry.aspx.

In a question, Deirdre Mulligan raised the point that remediation is necessary because, no matter how solid data protections become, breaches will remain, at least to some degree, inevitable. Vladeck agreed but restated his position that companies must try as hard as they can to prevent breaches, while acknowledging that some data breaches take place even when strong security measures are in place. He observed that although it may be good news for companies that breaches ultimately are not very expensive under today’s remediation measures, that is bad news for the consumer. He contended that holding companies that fail to take adequate precautions to a strict liability regime, covering all types of data, would offer a stronger motivation to protect consumers’ data. “Sometimes a stick is as good as a carrot,” he said.

Noting that data is often sold or passed to multiple third parties, Yoshi Kohno asked which parties should be legally responsible for securing that data and responding to any breaches. Vladeck answered that whichever party suffered the breach is responsible, not necessarily the company that first collected or generated the data. Expanding on this point, he noted that even data that is properly secured, encrypted, or quarantined can be breached. While conceding that breaches will continue to happen, Vladeck emphasized that they are less significant when a company has done a thorough job securing data and that, in many cases, companies likely have the capability to better secure consumer data. After all, he observed, companies often aggressively guard their own trade secrets even while taking far less rigorous measures to protect payment information or other personally identifiable data.

Fred Cate pointed out that the government never intended for social security numbers to be secret; they only became so when banks started using them as account passwords. Perhaps instead of protecting certain numbers, there is a way to make the infrastructure more flexible when proving an identity, he suggested. Vladeck concurred that this is a promising idea that warrants further research. Picking up on Cate’s point, Schneider parsed the problem as a distinction between personal identifiers (such as labels, phone numbers, or names) and personal authenticators (such as a PIN, token, or biometric), suggesting that instead of protecting identifiers, systems should require authenticators. Vladeck cautioned that authentication can be difficult for the user to navigate, although he agreed that it is more secure.
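
Schneider’s distinction can be illustrated with a short sketch. In the hypothetical check below, an account is identified by a label that need not be secret (an SSN-like identifier) and is authenticated by proof of possession of a separate secret; knowing the identifier alone, as a data thief would after a breach, is not enough to pass the check. The helper names are illustrative and not drawn from any real system.

```python
import hashlib
import hmac
import secrets

# Server-side record: the identifier is not treated as a secret; the
# authenticator (a per-account key) is.
account_key = secrets.token_bytes(32)
accounts = {"123-45-6789": account_key}  # identifier -> authenticator key

def authenticate(identifier: str, challenge: bytes, response: bytes) -> bool:
    """Challenge-response check: the caller must prove possession of the
    account key, not merely knowledge of the identifier."""
    key = accounts.get(identifier)
    if key is None:
        return False
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# The legitimate holder of the key can answer a fresh challenge...
challenge = secrets.token_bytes(16)
good = hmac.new(account_key, challenge, hashlib.sha256).digest()
print(authenticate("123-45-6789", challenge, good))      # True

# ...but a thief who knows only the identifier cannot.
print(authenticate("123-45-6789", challenge, b"guess"))  # False
```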

Returning to an idea explored in several other presentations, Steve Lipner brought up the role insurers play in this sphere. Vladeck suggested that their presence improves data security discipline among companies, for example, by refusing coverage to companies that do a poor job of data security. However, on balance, he said he agreed with Belair that the interests of insurance companies are not always aligned with the consumer’s or the public’s, and thus these companies might not be the appropriate parties to, essentially, create security policy. Although security standards are needed, Vladeck said, insurers—and even government bodies—are not necessarily the only or best ones to create them. For example, private standard-setting organizations, such as the International Organization for Standardization, could collaborate to develop enforceable, effective security standards for companies.

DATA BREACHES AND THE FEDERAL TRADE COMMISSION

Aaron Burstein, Federal Trade Commission

Aaron Burstein is a senior legal advisor to Commissioner Julie Brill at the FTC, where he advises on enforcement and policy matters concerning privacy, data security, financial practices, and a range of other consumer protection issues. He has also focused on consumer protection issues in past roles at the White House and the U.S. Department of Commerce. He specified that he was speaking at the workshop for himself and not on behalf of the FTC.

Noting that his experience overlaps that of David Vladeck, Burstein expanded on and clarified some of the themes brought up in Vladeck’s presentation and subsequent discussion. He sought to widen the lens from a focus on data breach incidents themselves to the broader context of data security practices and the FTC’s role in policing those practices.

Burstein began by delineating the boundaries of the FTC’s authority. As a consumer protection agency, the FTC does not pursue criminal activity and does not try to chase down hackers. Rather, its charge is to protect consumers from unfair or deceptive practices, and it is within this framework that a data breach at a company is generally characterized. Burstein noted that not every breach the FTC investigates leads to an enforcement action, nor does every action involve a data breach. Of the hundreds of FTC investigations that have been conducted following reported breaches, he said that only about 60 cases have been brought. (In response to a question from Sasha Romanosky, Burstein said it is likely that there are more cases worth bringing than the FTC currently has the capacity to handle, but he did not specify how many more would potentially be brought if more attorneys were available.)

Through these cases, the FTC has developed a broad notion of both qualitative and quantitative harm; Burstein reiterated how difficult it is to measure qualitative harm and noted that the FTC proceeds cautiously in addressing it. One reason for this is that the FTC is governed by a legal standard of “unfairness,” which, in order to win a case, requires “proof of substantial injury to consumers, costs that don’t outweigh the benefits, and something that wasn’t reasonably avoidable by consumers.” However, the variety of potential consumer harms is increasing as data breaches evolve into new areas. Burstein observed that qualitative harm can now be caused by breaches of sensitive information such as private photos, children’s information, geolocation information, or medical data, for example. Burstein described a case in which a medical transcription company inadvertently made transcriptions of doctors’ patient notes available to public search engines. The case resulted in FTC enforcement action.

It is also possible for the FTC to bring cases before a data breach has occurred: Burstein described instances in which the FTC took action after discovering apparently inadequate data security measures. In response to a question from Paul Kocher, Burstein noted that disclaimers by a company may do little to protect it from FTC action, because the FTC is empowered to look more broadly at the underlying practices of a company. If the agency discovers statements that appear to be deceptive or practices that may be unreasonable, the FTC may still build a case, despite any stated disclaimers.

In the vast majority of cases in which it takes action against a company, Burstein said, the FTC enforces nonmonetary “conduct” remedies. Depending on the facts of a particular case, the agency can require a company to implement a comprehensive security program or to obtain and submit to the FTC biennial security assessments for a period of 20 years, for example. These conduct remedies are meant to fix errant behavior and send a strong message to other companies about security expectations. Burstein observed that conduct remedies do not offer immediate relief for the consumers whose data has been breached, and only in rare cases does the FTC recover money from a data security defendant. In response to a question raised by Paul Kocher about cases involving organizations with no revenue stream, such as open-source software projects, Burstein said that a monetary judgment can still be entered, although it is typically suspended for amounts that are beyond a defendant’s ability to pay. By entering the judgment into the record, Burstein said, it can help to inform future cases even if the company does not actually pay the full amount.

Turning to the question of what more could be done to prevent or reduce the impact of breaches, Burstein reiterated the idea, discussed earlier in the workshop, that technical measures may be employed to make information less useful after a breach. He noted, however, that especially with the trend toward the Internet of Things, the economics of convincing companies to invest properly in information security becomes more challenging when devices are low-cost, abundant, and designed to be used for only a relatively short period of time. In his view, the FTC can help address this by identifying appropriate information security practices, or, at the very least, identifying what constitutes an unreasonable vulnerability. Burstein observed that the FTC has learned a great deal about best security practices that can be useful to help companies identify and address weak security measures. He noted that through publications, workshops, and direct interaction with industry, the FTC aims to help software developers understand the lessons that have been learned—sometimes the hard way—by other companies.
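
One family of technical measures of the kind Burstein alluded to is storing data in a form that is of limited use if exfiltrated. As a minimal, hypothetical example using only the Python standard library, a service that merely needs to recognize a Social Security number it has seen before can store a salted, slow hash rather than the number itself, so a stolen table does not directly reveal SSNs; this is a sketch of one technique, not a complete data-protection design.

```python
import hashlib
import hmac
import os

def protect(ssn: str, salt: bytes) -> bytes:
    """Store a salted, slow hash instead of the raw value. For low-entropy
    fields like SSNs, a secret key ("pepper") held outside the database
    would also be advisable; it is omitted here for brevity."""
    return hashlib.pbkdf2_hmac("sha256", ssn.encode(), salt, 200_000)

# At enrollment: keep only (salt, digest), never the raw SSN.
salt = os.urandom(16)
stored = protect("123-45-6789", salt)

# Later, to check whether a presented value matches the stored record:
candidate = protect("123-45-6789", salt)
print(hmac.compare_digest(stored, candidate))  # True
```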

Yoshi Kohno asked whether the FTC had insights into the security practices of U.S. companies as they compare to companies in other countries, a question Burstein said he was unable to address because he had not been closely involved in examining companies outside of the United States.

In response to questions raised by Mulligan and Sanders regarding standard setting, Burstein said it would be a “tough balance” for the FTC to participate in setting technical standards because of its primary role as an enforcer, but that providing security guidance could be a better role for the FTC. For example, the FTC could theoretically recommend that consumer products be set to the highest privacy settings by default, or issue reminders for users to tighten their own security measures, such as by varying their passwords or using conservative privacy settings. He also noted that the FTC does not have authority to issue regulations in the way that is typical of other agencies, and that the breadth of companies the FTC oversees makes it challenging to create a single rule that would be binding on all of the companies subject to it.
