The workshop’s second session focused on the feasibility of segmenting encryption policies, for example, by creating avenues for government access to plaintext for certain types of information, certain types of users, or certain layers of information.
The session began with a brief exploration of use cases presented by Marc Donner. The discussion was moderated by planning committee members Fred Chang, director of the Darwin Deason Institute for Cyber Security at Southern Methodist University, and Susan Landau.
Drawing from his past experience working in the finance industry and with Google Health, Donner used the finance and health care sectors to illustrate how industries handle highly sensitive information, including their use of encryption.
In the context of finance, Donner described the complex networks of data holders, data flows, and transit points involved in carrying out daily transactions in both consumer banking and institutional finance. While some links are encrypted and there are set processes for protecting data, he said these practices are not consistently employed. Data in transit from one node to another is typically encrypted, but most core databases are not, largely because of the sheer number of people who must access them on a constant basis. He noted that a system like Google Keystore, which provides an encryption key for each data item, is a potentially viable approach in this context and could be implemented if legacy systems were updated to accommodate it.
In the context of health care, Donner said that, as in the financial industry, most systems do not use encryption for data in storage. Where encryption is incorporated, it is done largely for data in transit or for stored information at the level of individual departments or laboratories rather than systemwide. A further complication, he said, is that health care providers are struggling with tough questions about what information to share with consumers, and how best to share it, because most systems are organized to share information among care providers and insurers and thus are not structured to present information in ways appropriate for consumers. In addition, he cited Google’s abandoned effort to accumulate consumers’ medical data (an effort called Google Health, which he ran for 2.5 years before shutting it down) to illustrate the argument that consumers are afraid of concentrating their medical data, a fear he attributed partly to weaknesses in the health system and partly to weaknesses embedded in policy structure.
Broadly speaking, Donner speculated that encryption is not more widely used in these industries because decision makers do not perceive a need for it. This is further complicated by the fact that many are focused on the more basic struggle of staying ahead of the information-sharing needs that are central to their daily business. He said institutions are “chewing away at the problem,” but incorporating encryption more broadly requires a change in the organization of systems, and doing so without interrupting the delivery of daily services is challenging. Returning to the idea of implementing a system such as Google Keystore, Donner said that if data at rest are encrypted, then people with system privileges will have to retrieve a key to access the data, which, in a well-managed system, will leave footprints that can be tracked. While recognizing that implementing such a key management system is neither free nor easy, he said it is possible.
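The pattern Donner describes, per-item keys whose every retrieval leaves a trackable footprint, can be illustrated with a minimal sketch. The `AuditedKeystore` class and its method names below are invented for illustration; this is not a description of Google Keystore's actual design.

```python
import os
import time

class AuditedKeystore:
    """Hypothetical per-item keystore: every data item gets its own key,
    and every key retrieval is recorded in an audit log."""

    def __init__(self):
        self._keys = {}       # item_id -> key bytes
        self.audit_log = []   # (timestamp, principal, item_id)

    def create_key(self, item_id):
        key = os.urandom(32)  # fresh 256-bit key per data item
        self._keys[item_id] = key
        return key

    def get_key(self, principal, item_id):
        # Every access, even by a privileged user, leaves a footprint.
        self.audit_log.append((time.time(), principal, item_id))
        return self._keys[item_id]

ks = AuditedKeystore()
ks.create_key("patient-record-17")
ks.get_key("dba-alice", "patient-record-17")  # tracked retrieval
```

In such a design, the audit log itself becomes the artifact a well-managed institution can monitor for anomalous access.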
Regarding the different treatment of data in transit as compared to data at rest, Donner noted that there might not be much of a difference between the two from a communications theory standpoint, but one important difference is that an encryption key for data in transit can be discarded after the end of the communication, whereas with stored data, the encryption key needs to be preserved and managed until the data can be discarded.
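The lifecycle difference Donner points to can be sketched in a few lines of Python. The XOR one-time pad below is a toy stand-in for a real cipher, and the function and variable names are invented; the point is only when each key can be discarded.

```python
import os

def transit_encrypt(message: bytes) -> bytes:
    """Toy transit encryption: the session key exists only for the
    duration of the exchange and is discarded when it ends."""
    key = os.urandom(len(message))                           # ephemeral key
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    # The receiver decrypts during the session with the same key...
    assert bytes(c ^ k for c, k in zip(ciphertext, key)) == message
    del key  # ...after which the key can simply be thrown away
    return ciphertext

# Data at rest: the key must outlive the write and be managed until the
# data themselves are discarded.
storage_keys = {}  # item_id -> key, retained for the data's lifetime

def store(item_id: str, data: bytes) -> bytes:
    key = os.urandom(len(data))
    storage_keys[item_id] = key  # preserved, not discarded
    return bytes(d ^ k for d, k in zip(data, key))

def retrieve(item_id: str, ciphertext: bytes) -> bytes:
    key = storage_keys[item_id]  # still needed, perhaps years later
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

The persistent `storage_keys` table is exactly the long-lived key management burden that stored data imposes and transit data does not.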
In a broader discussion, participants considered how various aspects of an institution’s context influence the need, desirability, and technical framework for exceptional access.
Broadly speaking, when considering whether exceptional access is possible or desirable, it is useful to consider demonstrated attitudes toward information sharing in various sectors. On the whole, Donner said, the health care and finance industries want to comply with government requests and are eager to respond when asked for information. Andrew Sherman reiterated this point with regard to these industries, saying government wouldn’t need exceptional access to his laptop because the source data for it came from a system to which “the legal department would gladly give you access with the right paperwork.” In addition, Donner likened most institutional databases to a “maze of twisty passages,” suggesting that anyone seeking access, exceptional or otherwise, to most corporate data would likely need someone from the institution to help navigate the database to find the desired information anyway. Given this context, he posited that it is likely not reasonable to expect the government to expend the resources necessary to use a backdoor to tap into such databases when the institution could simply be asked directly to provide the information.
Speaking to the role that device encryption plays in protecting corporate data, Guy L. “Bud” Tribble, vice president of software technology at Apple, raised the point that although many corporate databases may have been sitting on a relatively secure server when they were initially set up, some part of the data may end up on a laptop or mobile device. Donner replied that this happens frequently and suggested that the consequences of such events largely depend on how well run the institution is. A larger, well-run institution (which he defined as one that has an established information technology department) would require full-disk encryption on laptops or mobile devices; however, many smaller institutions have no such requirement. Kevin Bankston reiterated the point that when data on corporate mobile devices and laptops is encrypted, it is protected by the same full-disk encryption as all other consumer products, and thus any vulnerabilities created by mandated mechanisms for exceptional access would extend to a variety of sectors, including health and finance. Therefore, Bankston continued, it is not possible to segment encryption policy between consumer and corporate data, because consumer software and devices are also widely used across corporate sectors.
Participants moved into a discussion about the feasibility of strategies for allowing government access to plaintext in a way that is segmented either vertically (i.e., by sector or user) or horizontally (i.e., by technological layer, such as a phone’s operating system versus the applications that run on it).
Eric Rescorla questioned whether it is possible to “wall off” the use of strong encryption in specific sectors when the security of these systems depends on the same commodity products used by all other applications. For example, he said, when you interact with your stockbroker, you are likely using the same browser that you use to interact with Facebook and the same voice-over-IP system that you use to talk with your doctor. As such, he questioned whether it is possible to allow “stronger” encryption (in this case, meaning encryption without exceptional access) for some applications but not others.
It is largely impractical for institutions to build their own encryption tools, Donner said, and as a result, when encryption is used in various sectors, it is based on standard, commercially available products. The difference, in his experience, is how well institutions manage their security activities. Getting consensus around what to encrypt and how to encrypt it is a significant challenge, because sectors like finance and health care are rapidly changing, involve complex, interacting workflows rather than simple transactions, and require supple yet powerful access control mechanisms.
Butler Lampson suggested that from a practical standpoint, the feasibility of walling off exceptional access for certain sectors would depend on the ability to escrow the encryption keys for one application separately from the keys for other applications such that one could access different sets of keys for different purposes.
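Lampson's suggestion might look like the following sketch, assuming hypothetical per-application escrow stores (the application names and functions are invented for illustration). The point is that a release scoped to one application exposes no keys held for any other.

```python
# Hypothetical per-application escrow: each application's keys live in a
# separate store, so different sets of keys can be released for
# different purposes.
escrow = {
    "messaging-app": {},  # app name -> {device_id: key}
    "health-app": {},
}

def escrow_key(app: str, device_id: str, key: bytes) -> None:
    escrow[app][device_id] = key

def release_keys(app: str) -> dict:
    """Release only the keys escrowed for the named application."""
    return dict(escrow[app])

escrow_key("messaging-app", "device-1", b"\x01" * 32)
# A release scoped to health-app exposes nothing escrowed for messaging:
assert "device-1" not in release_keys("health-app")
```

Whether real-world key material could be partitioned this cleanly, given that applications share platforms and libraries, is precisely the question the participants went on to debate.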
Brian LaMacchia, director of the Security and Cryptography Group at Microsoft Research, pointed to lessons learned in the 1990s about the pitfalls of effectively deploying “stronger” and “weaker” security for different contexts. At that time, companies would build worldwide products with “weak encryption” (meaning small key size, easily crackable by a reasonably resourced attacker) built in and then offer a “high encryption pack,” which added “strong encryption” capabilities to the product, to U.S. and Canadian customers only. Microsoft shipped “high encryption packs” for operating systems through Windows 2000. In 2000, U.S. export controls changed such that operating systems including strong encryption could generally be shipped worldwide, and the need for splitting out a high-encryption pack went away.
During the period when export controls were in effect, Microsoft added a “server-gated cryptography” feature to its Web browser Internet Explorer to allow selection of strong cryptography when connecting to servers outfitted with special certificates for their server keys, even if the underlying Windows operating system only supported weak encryption, LaMacchia said. Netscape implemented a similar feature for its browser that it called “International Step-Up.” The result, according to LaMacchia, was a mix of confusion and unintended consequences; while Microsoft forged a workable solution, it was difficult to test and deploy and also difficult to remove once export controls changed a few years later and server-gated cryptography was no longer needed. This export-control architecture is still deployed and continues to be a source of vulnerabilities, he added.
Lampson raised the concern that the fundamental goals of vertically segmented exceptional access have not been adequately articulated, making it impossible to envision a technical solution for achieving them. Given that, for example, medical data involve many different kinds of data held by different parties for different purposes, it is extremely difficult to imagine how one could provide stronger cryptography for medical data alone.
Given a general sense that segregating where exceptional access would or would not be required by sector is difficult, Landau moved the conversation toward the feasibility of segregating exceptional access horizontally. What, for example, would be the implications of requiring exceptional access at the platform layer but not at the application layer?
Daniel Kahn Gillmor said that in his view, there would be no way to make sense of such segregation from an engineering perspective: if exceptional access is required at the operating system layer, someone will simply make an operating system that does not provide it. Landau countered that an operating system cannot be built and widely deployed as easily as an app can. Gillmor further observed that if exceptional access were required at the hardware level (i.e., providing access to the plaintext of what is stored in random access memory), then all of the data are ultimately exposed, so segmentation is impossible when the lowest layers are vulnerable.
Tribble agreed that it is hard to imagine a way to effectively segment exceptional access horizontally. He added that it is probably feasible for apps to evade built-in exceptional access mechanisms, for example, by using a key that is in the user’s head and not stored on the device.
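Tribble's example can be sketched with a standard password-based key derivation from Python's standard library (the passphrase and parameters here are illustrative). Because the key is recomputed on demand from a passphrase the user memorizes and is never written to the device, an exceptional-access mechanism that reaches on-device key material would not capture it.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 256-bit key from a memorized passphrase via PBKDF2.
    Only the salt need be stored on the device; the key itself is not."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt = os.urandom(16)                      # stored alongside the data
key = derive_key("correct horse battery staple", salt)
# The same passphrase and salt always reproduce the same key:
assert derive_key("correct horse battery staple", salt) == key
```

An app encrypting with such a key could still comply with a device-level mandate while keeping the plaintext out of reach, which is the evasion Tribble describes.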
Building on these ideas, Rescorla cited the Communications Assistance for Law Enforcement Act as an example in which law enforcement was provided access to the bottom layer of the communications stack by requiring carriers to provide access to the raw data. However, applications can encrypt the data, and carriers have no obligation to provide data they do not have. Any requirement that guarantees access at lower layers is necessarily difficult to implement and involves breaking boundaries between different pieces of the system.
LaMacchia expanded on this point, explaining that segmenting horizontally would require engineers to break “abstraction boundaries” that are crucial to the integrity of computer programs. Abstraction boundaries are rigid design boundaries built into programs that serve to break a “hard problem”—that is, the overall function of the program as a whole—into many simpler problems that, when linked together, achieve the desired goal. This approach enhances security because each component can be built securely and engineers know that the communication points between the components are a key place to focus when defending against malicious behavior by another component. Breaking the abstraction boundaries, as a result, adds complexity, makes vulnerabilities more likely, makes it harder to maintain the software, and makes it more difficult to track and fix problems.
Wrapping up, Lampson said the problem of horizontal segmentation is compounded by the fact that apps operate essentially independently of operating systems, so building an exceptional access system that prevents apps from using superencryption[1] would be exceedingly difficult.
[1] Superencryption involves first encrypting using an encryption algorithm of the user’s choosing and then encrypting a second time using a preferred or mandated encryption algorithm, such as one that enables exceptional access.
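The two-layer scheme the footnote describes can be sketched with a toy cipher (XOR with a repeating key stands in for any real algorithm, and the key names are invented). An authority holding only the mandated outer key recovers the user's inner ciphertext, not the plaintext.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key. Encrypting and
    decrypting are the same operation."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

user_key = b"chosen-by-the-user"    # known only to the user
mandated_key = b"escrowed-key"      # hypothetical exceptional-access key

inner = xor_cipher(b"confidential message", user_key)  # user's own layer
outer = xor_cipher(inner, mandated_key)                # mandated layer on top

# Stripping only the mandated layer yields the inner ciphertext:
recovered = xor_cipher(outer, mandated_key)
assert recovered == inner
assert recovered != b"confidential message"
```

This is why Lampson regards superencryption as so hard to prevent: the inner layer is applied by the app before the mandated layer ever sees the data.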