As discussed in Chapter 1, the options for ensuring access to plaintext fall into the following broad categories:
- Take no legislative action but potentially pursue technical, law enforcement, and legal options to obtain or compel the cooperation of the target;
- Provide additional resources to access plaintext;
- Enact legislation that requires that device vendors or service providers provide government access to plaintext without specifying the technical means of doing so; and
- Enact legislation requiring a particular technical approach.
As noted previously, these are not necessarily mutually exclusive; for example, the second option can be pursued regardless of or in addition to the other three options.
Each of these categories may involve some combination of (1) legal and policy changes, (2) technical means, and (3) provision of additional financial or technical support. The sections that follow explain each of these three areas. Some are available today (but could be enhanced with additional resources), while others require new technologies, major investments, or legal changes. Although some of these options have been studied—some over a period of almost two decades—all but the status quo represent new initiatives that would undoubtedly lead to new technical, market, and legal responses if implemented. The final section of this
chapter considers some of the issues that would arise with legislation mandating government access.
Even without new legislation mandating government access, there are several legal avenues currently available to the government that may enable it to decrypt the information. This section discusses legal issues that arise in those situations.
There are many different situations in which the government has legal authority to obtain information, but where access to the information or the meaning of the information may be defeated by encryption technologies. For example, in many cases, the government may legally obtain a computer or handheld device that requires a fingerprint or other biometric information or passcode for access. Even where access to the computer or device is not limited, a particular file on a computer or device may be protected by a passcode or other means. The government may also have the legal authority to seize and obtain information stored in other places, like the cloud, but accessing such information may nevertheless require a biometric identifier or passcode.
The legal avenues available to the government in these cases depend in part on whether the information is protected by a biometric identifier or by a passcode as well as whether the government is seeking information directly from the user or from a third party like the provider.
Where the information is protected by a biometric identifier and the user of a device is in custody or otherwise available, the government may seek to compel the user to provide a fingerprint or other biometric data to unlock a device and allow access to the data within.
At the moment, the law is reasonably well settled that the government may obtain such an order in the context of certain physical acts.1 As the Supreme Court held in United States v. Hubbell, 530 U.S. 27, 34-35 (2000), “even though the act may provide incriminating evidence, a criminal suspect may be compelled to put on a shirt, to provide a blood sample or handwriting exemplar, or to make a recording of his voice.”
1 Holt v. United States, 218 U.S. 245 (1910); Schmerber v. California, 384 U.S. 757, 764 (1966); United States v. Dionisio, 410 U.S. 1, 5-6 (1973); Doe v. United States, 487 U.S. 201 (1988).
To be sure, complexities may arise under the Fifth Amendment when the act of providing the fingerprint could be deemed to be testimonial—for example, an implicit admission that the phone in question belongs to the subject—but obtaining a fingerprint per se, or compelling someone to touch a fingerprint sensor, is likely not itself a testimonial act protected by the Fifth Amendment, although there may be exceptions.2
As one court of appeals has explained,
[A]n act of production can be testimonial when that act conveys some explicit or implicit statement of fact that certain materials exist, are in the subpoenaed individual’s possession or control, or are authentic. The touchstone of whether an act of production is testimonial is whether the government compels the individual to use “the contents of his own mind” to explicitly or implicitly communicate some statement of fact. Put another way, the Court has marked out two ways in which an act of production is not testimonial. First, the Fifth Amendment privilege is not triggered where the Government merely compels some physical act, i.e. where the individual is not called upon to make use of the contents of his or her mind. The most famous example is the key to the lock of a strongbox containing documents, but the Court has also used this rationale in a variety of other contexts. Second, under the “foregone conclusion” doctrine, an act of production is not testimonial—even if the act conveys a fact regarding the existence or location, possession, or authenticity of the subpoenaed materials—if the Government can show with “reasonable particularity” that, at the time it sought to compel the act of production, it already knew of the materials, thereby making any testimonial aspect a “foregone conclusion.”3
There could be practical limits as well, of course, because some hardware devices are designed to remain locked after several failed attempts to open them biometrically—for example, by then requiring that the passcode be entered. And of course, not all users enable biometric features, and the features can be readily disabled.
2 For a discussion of some of the complexities, see K. Goldman, 2015, “Biometric passwords and the privileged against self-incrimination,” Cardozo Arts & Entertainment Law Journal 33:211; O. Kerr, 2016, “Can warrants for digital evidence also require fingerprints to unlock phones?,” Washington Post, October 19; and O. Kerr, 2016, “The Fifth Amendment and Touch ID,” Washington Post, October 21.
3 In re Grand Jury Subpoena Duces Tecum Dated March 25, 2011, 670 F.3d 1335, 1345-46 (11th Cir. 2012).
The situation is almost exactly reversed when it comes to compelled production of a passcode. In most situations, the law does not allow the government to compel disclosure of a passcode. Unlike providing a biometric marker, disclosing a passcode is generally understood as a testimonial act protected by the Fifth Amendment. Thus, under current case law, although providing a fingerprint may resemble providing a physical key to a safe, disclosing a passcode is more like revealing the combination to a safe, which is protected.4 At the same time, there may be circumstances where someone other than the owner (in the case of a privately owned device) or a corporation (in the case of a business-owned device) knows the passcode, and in that situation, the government could compel production of the passcode because that individual or organization would not have a Fifth Amendment right to refuse disclosure.
Where the government cannot obtain the assistance of the user of a device to defeat encryption, it may also seek assistance from third parties, such as the manufacturer of a device or the provider of a software operating system. To date, issues in this area have usually arisen under the All Writs Act (28 U.S.C. §1651); its application, however, is currently unsettled law. Issues may also arise under the “technical assistance” provisions of the Wiretap Act (18 U.S.C. §2511). Another option for the government is “lawful hacking,” which typically does not require compelled assistance from a third party, but accomplishes some of the same results and may raise some of the same questions. One controversy that arises in connection with compelled assistance is whether and on what time scale providers should be allowed to disclose that such assistance has been provided.
All Writs Act
Although the use of the All Writs Act in a decryption case came to public prominence in connection with efforts by the Federal Bureau of Investigation (FBI) to compel Apple to decrypt the phone of a dead terrorist in the San Bernardino, California, case, the act has long been used to compel assistance from third parties in implementing surveillance orders obtained by the government. In 1977, the Supreme Court addressed such technical assistance in U.S. v. New York Tel. Co.,5 where it held that the All Writs Act, which allows federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law,”6 could be used to compel a telephone company to assist with installation of a pen register, a mechanism to record the telephone numbers called by the phone. The pen register itself was authorized under Federal Rule of Criminal Procedure 41, but that rule did not explicitly require telephone companies to provide technical assistance in installing a pen register. In deciding that the government could invoke the All Writs Act to compel assistance, the Supreme Court noted that the Wiretap Act contains a provision requiring companies to provide technical assistance. The Court explained that in light of the Wiretap Act’s “direct command to federal courts to compel, upon request, any assistance necessary to accomplish an electronic interception, it would be remarkable if Congress thought it beyond the power of the federal courts to exercise, where required, a discretionary authority to order telephone companies to assist in the installation and operation of pen registers, which accomplish a far lesser invasion of privacy.”
4 See Doe v. U.S., 487 U.S. 201 (1988) at 210 n.9.
5 434 U.S. 159 (1977).
The government has started using the All Writs Act to seek considerably more in the way of technical assistance from providers or others to defeat encryption. To what extent the act applies to such cases and what kind of assistance can be compelled under it remain unsettled in the courts; Congress could conceivably step in to clarify as well. In the Eastern District of New York, for example, the Department of Justice and Apple engaged in a dispute about whether Apple could be compelled to unlock an iPhone for which there was a federal search warrant. The government relied on the All Writs Act and New York Tel. Co., while Apple claimed that the All Writs Act does not apply as a result of the Communications Assistance for Law Enforcement Act (CALEA),7 a 1994 statute that requires telecommunications providers to maintain their networks in certain ways that allow for wiretapping but does not apply to stored data on a handset. Apple argued that Congress considered and rejected the possibility of imposing mandates for law enforcement access on handset device providers when it adopted CALEA.
Apple’s main argument was that the All Writs Act could not be used to compel what Congress declined to address in CALEA—that is, that CALEA occupies the field of compelled assistance.8 In February 2016, a magistrate judge in Brooklyn, New York, ruled for Apple, concluding that “the relief the government seeks” under the All Writs Act “is unavailable,” primarily “because Congress has considered legislation that would achieve the same result but has not adopted it” in CALEA.9 The government’s appeal is pending. By contrast, a magistrate judge in the Central District of California reached the opposite conclusion.10 In this latter case, the FBI used technical means to obtain the data, and the lawsuit was dismissed. There will undoubtedly be more litigation in this area.
6 28 U.S.C. §1651.
7 47 U.S.C. §§1001-1010. Under one provision of the Communications Assistance for Law Enforcement Act (CALEA), 47 U.S.C. §1002(b)(3), a “telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.”
Assuming that the government prevails in its interpretation of the All Writs Act and can legally compel companies to provide technical assistance to defeat encryption where the government has a lawful warrant for the encrypted information, the extent and circumstances of such assistance will presumably be worked out on a case-by-case basis.
As noted above, in applying the All Writs Act in New York Tel. Co., the Supreme Court reasoned by analogy to the “technical assistance” provision in the Wiretap Act. It is therefore possible, in a case involving a wiretap (rather than access to stored data via a search warrant), that the government may seek to compel assistance from providers under that provision. There is today very little publicly available law on the limits of “technical assistance.” A divided panel of the Ninth Circuit held that the Wiretap Act could not be used to compel assistance with a wiretap in ways that entirely disabled the communications system for the particular customer targeted by the surveillance. The majority concluded that disabling the system was inconsistent with the statutory command that technical assistance be provided “in such a manner as will protect its secrecy and produce a minimum of interference with the services that such carrier . . . is providing that target of electronic surveillance”:
[T]he “a minimum of interference” requirement certainly allows for some level of interference with customers’ service in the conducting of surveillance. We need not decide precisely how much interference is permitted. “A minimum of interference” at least precludes total incapacitation of a service while interception is in progress. Put another way, eavesdropping is not performed with “a minimum of interference” if a service is completely shut down as a result of the surveillance.11
8 See Quinta Jurecic, DOJ and Apple File Briefs in EDNY Encryption Case, Lawfare (Oct. 26, 2015), https://www.lawfareblog.com/doj-and-apple-file-briefs-edny-encryption-case; see also H.R. Rep. 103-827(I), 103d Cong. 2d Sess. at 13 (1994) (“While the Supreme Court has read [18 U.S.C. §2518(4)] as requiring the Federal courts to compel, upon request of the government, ‘any assistance necessary to accomplish an electronic interception,’ United States v. New York Telephone, 434 U.S. 159, 177 (1977), the question of whether companies have any obligation to design their systems such that they do not impede law enforcement interception has never been adjudicated.”).
9 In re Apple, Inc., 149 F. Supp.3d 341, 344 (E.D.N.Y. 2016).
10 In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS300, California License Plate 35KGD203, 2016 WL 618401 (C.D. Cal. Feb. 16, 2016).
The majority further concluded that the assistance provision, unlike CALEA, does not require providers to redesign their systems to facilitate government surveillance.
An alternative to introducing lawful access mechanisms to defeat encryption is to use what is sometimes referred to as “lawful hacking,” which allows investigators to intrude into a computer system and access its content without the need to break encryption. For example, the government may obtain a warrant to secretly insert software on a targeted computer that surreptitiously records every keystroke. This can be used to capture the suspect’s passwords, thus allowing access to everything else.
The idea behind lawful hacking is that “[i]nstead of introducing new vulnerabilities to communications networks and applications, law enforcement should use vulnerabilities already present in the target’s communications device to wiretap in the situations where wiretapping is difficult to achieve by other means.”12 It is a technique that has been in use by the FBI since at least the early 2000s. Information must be in plaintext while a device is processing or displaying it, and it is thus susceptible to capture by appropriately tailored malware that defeats or evades the encryption on a subject’s device. Although such malware is not guaranteed to work or be sufficient in every circumstance, it is another option that may be effective in many cases.
Some have suggested that despite its limitations and challenges, lawful hacking offers potential middle ground for at least a subset of cases: “This proposal potentially offers an attractive solution to Going Dark challenges, which could theoretically satisfy equities on both sides of the debate.”13
11 In re U.S. for an Order Authorizing Roving Interception of Oral Communications, 349 F.3d 1132 (9th Cir. 2003).
12 S.M. Bellovin et al., 2014, “Lawful hacking: Using existing vulnerabilities for wiretapping on the Internet,” Northwestern Journal of Technology and Intellectual Property 12:i.
13 S. Hennessey and N. Weaver, 2016, “A judicial framework for evaluating network investigative techniques,” Lawfare Blog, July 28, https://www.lawfareblog.com/judicial-framework-evaluating-network-investigative-techniques.
Two important points about this technique should be noted. First, in discussing this technique, the committee is referring only to instances where the government has a lawful warrant to obtain the encrypted information or, in the case of foreign intelligence, a Foreign Intelligence Surveillance Act order. Second, by lawful hacking, the committee means the use of techniques that have been authorized by a court pursuant to law.
The legal dimensions of lawful hacking have not been extensively litigated and are still unclear. For example, it is uncertain whether there are circumstances when the government can be compelled to reveal details about the methods used in criminal investigations. It is also uncertain how tools used to generate admissible evidence in court would be vetted for their forensic soundness—for example, the government would have to demonstrate that it could reliably extract evidence without altering it.
Are the issues (particularly the legal issues) different for government-supplied and vendor-supplied tools? Would all techniques be widely available, or available only to certain agencies? How would lawful hacking tools be created and distributed? Under what circumstances is the government required to reveal the vulnerabilities it discovered to the companies whose products the hacking tools exploit?
Again, the extent and application of such authority is unsettled and will depend on the particular circumstances of the requested order authorizing lawful hacking.
This section examines high-level technical options for providing access to plaintext and the associated challenges with each approach. The discussion draws on the basic concepts of cryptography described in Chapter 2. It is not intended to be comprehensive but rather to introduce some major technical options and to illustrate the issues that arise in evaluating the associated benefits and risks.
A number of technical approaches to providing exceptional access to plaintext have been proposed (see, e.g., Box 5.1). The following are the general approaches most commonly discussed; they were also selected to be representative of the range of potential benefits and shortcomings of technical access schemes.
Required Vendor Unlock
One proposed approach would require vendors to maintain capabilities to unlock phones or other devices and access the data stored on them. When a law enforcement agency encounters a device that it needs to unlock, it would present an unlock request for a specified device along with the appropriate legal order. The vendor would then be responsible for validating the source of the request and the judicial documents and, depending on how the access arrangement is structured, either the vendor unlocks the phone when it is presented by law enforcement, or it provides the law enforcement agency with a token the agency can use to unlock the phone.
An unlocking scheme could take various forms. The simplest, but riskiest in terms of the potential scale of compromise, would involve a single master key that covers all phones from a vendor or all phones of a particular model. More likely, a scheme would create an unlock token by combining a vendor’s signing key with a unique key associated with a unique device identifier that the vendor creates and stores for each phone. The discussion below assumes a scheme of the latter sort.
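To make the token construction concrete, the following sketch shows one way a per-device unlock token could be derived by combining a vendor signing key with a stored per-device key. All function names, the message format, and the use of symmetric HMAC are hypothetical illustrations, not any vendor's actual design; a real scheme would likely use asymmetric signatures (so devices never hold the vendor's secret) and hardware-protected key storage.

```python
import hmac
import hashlib
import secrets

# Hypothetical sketch of a per-device unlock-token scheme.
# In practice the vendor signing key would live in an HSM.
VENDOR_SIGNING_KEY = secrets.token_bytes(32)
device_keys = {}  # per-device keys the vendor creates and stores

def provision_device(device_id: str) -> bytes:
    """At manufacture, create and store a unique key for each device."""
    device_keys[device_id] = secrets.token_bytes(32)
    return device_keys[device_id]

def make_unlock_token(device_id: str, warrant_id: str) -> bytes:
    """After validating the legal order, combine the vendor signing key
    with the device-specific key so the token authorizes one device only."""
    msg = f"{device_id}|{warrant_id}".encode()
    inner = hmac.new(device_keys[device_id], msg, hashlib.sha256).digest()
    return hmac.new(VENDOR_SIGNING_KEY, inner, hashlib.sha256).digest()

def device_verify(device_id: str, warrant_id: str,
                  token: bytes, per_device_key: bytes) -> bool:
    """The device recomputes the expected token before unlocking.
    (A real design would verify a public-key signature instead, so the
    device never needs the vendor's secret.)"""
    msg = f"{device_id}|{warrant_id}".encode()
    inner = hmac.new(per_device_key, msg, hashlib.sha256).digest()
    expected = hmac.new(VENDOR_SIGNING_KEY, inner, hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)
```

Because the token binds both the device identifier and the specific legal order, a token issued for one device is useless against any other device and cannot be replayed under a different order.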
An unlocking scheme will not provide direct access to prior communications unless they are stored on the phone (and not separately encrypted by the communications application). However, unlocking a phone by authenticating as a user provides much more capability than just access to the data stored on that device. For example, it allows access to remote accounts or services belonging to the device owner (including but not limited to messaging services) and associated data that is not stored on the phone. The scope of what investigators are permitted to access from an unlocked phone would likely be defined by the authorizing court order. In addition, limits could be set by the technical mechanism. At least one proposal for a vendor unlock mechanism includes a
provision to freeze the unlocked device and prevent it from being used directly to, for example, access remote accounts or services.
In terms of the process for interacting with law enforcement agencies, large online service operators such as Google, Facebook, and Apple already have processes in place to receive and validate warrants and other law enforcement requests and to manage and deliver unencrypted customer data that they hold. By contrast, device vendors such as Apple do not presently have processes in place to provide law enforcement agencies with unlock codes; doing so would involve not only validating law enforcement requests but also managing master signing keys and creating device-specific unlock codes. A workable solution would have to be deployable on billions of devices.
Vendors already have had to address at least some of the technical risks of an unlocking scheme. Most device and operating system vendors already maintain a master key that is used to authenticate software updates. Because the software update capability enables the vendor to modify the device software arbitrarily, vendors already have strong procedures in place to protect these update keys and limit their use.14 The process and workflow used by vendors in delivering updates is designed to prevent inadequately tested, unapproved, or malicious code in an update and includes controls that keep a single individual from releasing an update.
The key used for device unlock would, the argument goes, be handled by a system and procedures similar to those used to protect software update keys. If they are to cryptographically authenticate authorized unlock requests, manufacturers will have to store and protect secret signing keys for this purpose. Exposure of these keys would allow anyone to generate unlock tokens on their own. These signing keys, like the keys that manufacturers use to authenticate software updates, therefore pose a cybersecurity risk and will have to be protected against criminal organizations and foreign agents. So, too, would the system that stores the keys associated with individual devices. A single point of failure could potentially jeopardize the security of millions of devices.
There is an important difference, however, in the ease of use of an update signing key and a key to accomplish exceptional access. Stealing an update signing key does not give an immediate benefit. The only use for the key is to sign a piece of malicious software that will be used to attack a targeted system or systems. And creation of that malicious software requires significant skill if it is to be capable of being deployed to a large number of target systems or a target system whose attributes are unknown, or if it is intended to be long-lived (e.g., to survive past future valid updates to the target system). The impact of such a signed malicious update could be extremely serious, and preventing theft of signing keys is thus a major concern for software vendors.
14 The software update capability itself cannot be used to unlock a device because, on well-designed smartphones, a software update can only be installed on an unlocked device.
Stealing an exceptional access key, by contrast, enables a thief, who has physical access to a device, to open it. An attacker might have only temporary access to a device—perhaps at a border crossing, perhaps while the phone is left outside the protected facility during a meeting. If the exceptional access key can be used to decrypt an externally collected image of the device memory, theft of such a key is an especially serious threat. If the exceptional access key can only be used to decrypt the protected physical device (not a memory image) and if the device is rendered inoperable (frozen) by the act of decryption, as was the operating model for one concept presented to the committee, the device owner will at least have an indication that the device has been compromised.
The security risks of the scheme can be partly mitigated if the access scheme is hardware based (see below), so that an attack against a device unlock mechanism cannot be carried out remotely. This forces a party seeking to unlock a device to have physical possession of that device.
Several operational factors distinguish an unlocking system from a software update system. For one, such a system would most likely be used more often—perhaps several times per day—as compared to the keys used to sign software updates, which are used infrequently by a generally small group of highly trusted individuals. For another, a code intended to unlock a phone requires an individualized access key per phone (tied, e.g., to the phone’s unique identifier), so as many individual keys must be generated or retrieved as there are requests to unlock.15
Proponents and critics disagree about how much greater the risk of compromise would be as well as, at least implicitly, about how to weigh the incremental risk against the benefits of enabling government access.
How frequently might vendors be asked to unlock phones? It is difficult to predict the volume of requests to vendors, but a figure in the tens of thousands per year seems reasonable, given the number of criminal wiretaps per year in the United States and the number of inaccessible devices reported by just the FBI and the Manhattan District Attorney’s Office (see the section “Encryption as an Impediment to Investigations” in Chapter 4). As a result, each vendor, depending on its market share, needs to be able to handle thousands to tens of thousands of domestic requests per year.
15 S. Landau, 2017, “Punching the wrong bag: The deputy AG enters the crypto wars,” Lawfare Blog, October 27, https://www.lawfareblog.com/punching-wrong-bag-deputy-ag-enters-crypto-wars.
Such a change in scale, as compared to the software update process, would necessitate a change in process and may require a larger number of people authorized to release an unlock code than are authorized to release a software update, which would increase the insider risk.
Critics worry that using this approach might erode trust in the software updates issued by vendors and lead users to eschew important security updates, thus significantly increasing their exposure to Internet malware and attacks by an array of actors. This risk stems in part from proponents having used software updates as an analogy to the unlocking process. If advocates of required vendor unlock were to avoid this analogy, that would reduce the level of mistrust. So, too, would avoiding requests that vendors subvert the software update process (as contrasted with implementing a per-device unlocking scheme) to unlock devices.
A related process and scale issue is the need for vendors to validate court documents before releasing an unlock code or performing the unlock, in order to thwart malevolent actors seeking to surreptitiously unlock devices. The challenge is similar to the one faced by telecommunications carriers with CALEA16 or cloud providers served with requests for customer data. Such requirements could impose a burden on small vendors and constitute a barrier to entry for new vendors; small vendors would likely need to enlist trusted third parties, similar to how telecommunications carriers ensure compliance with CALEA.
Whether this burden would be reasonable depends, of course, on how one weighs the innovation and public safety equities. Although, as occurs in other sectors, small businesses could be protected from a significant financial burden in response to a law enforcement request, this “solution” is only partially effective. Should a small vendor grow large—which can happen quickly for Internet applications—its system architecture would suddenly need to accommodate the assistance requirement. Thus a small vendor would essentially have to build provisions for exceptional access into its architecture from the outset. This may mean that both the financial burdens and security risks of the requirement are present even before the vendor is formally subject to it.
Requiring a U.S. vendor to have the ability to unlock every phone has the potential to erode trust in that vendor’s products in the international market, but it is difficult to quantify the impact or assess how much additional impact the imposition of a U.S. requirement will have if other nations have already placed such requirements on a vendor. A competitor could argue that another vendor may cooperate with U.S. authorities to unlock a foreign phone, even in circumstances where the government lacks the authority. (Of course, even if the United States does not impose an access requirement, other countries certainly could as a condition of participating in their domestic markets.) Similarly, it would be unsettling if foreign vendors could unlock any phone belonging to U.S. individuals. In the hands of a foreign government, this capability could, for example, be used against a U.S. executive to undermine U.S. corporate secrets and national security.17 The threat with respect to foreign governments is, of course, much lower in the case where the unlocking mechanism requires one to have physical possession of a device than in the case where a device can be unlocked remotely.
16 Federal Communications Commission, Second Report and Order and Memorandum in the Matter of Communications Assistance for Law Enforcement Act and Broadband Access and Services, FCC 06-56, p. 11, https://apps.fcc.gov/edocs_public/attachmatch/FCC-06-56A1.pdf.
Key escrow is a scheme where the keys needed to decrypt data are held in escrow—by the vendor, a third party, or the government—so that an authorized third party can access the keys. The key escrow approach is applicable both to data at rest and data in motion. When this approach was studied extensively in the 1990s, several specific proposals were made, and some products that implemented key escrow for encrypted network communications were built and offered commercially.
The escrowing party could be the government, or it could be some other entity or entities. If vendors hold the keys, then each vendor can choose its own algorithms and formats. If a U.S. agency is to hold the recovery key, there are two possibilities. One is that the government determines the algorithms and formats that vendors use when implementing key escrow. Another is that the vendor provides the government with the necessary code to hold escrowed keys and perform the unlocking. The former would impose additional burdens on the vendor to integrate the government’s solution into its product or service (while also keeping it from selecting what it deems the best technical solution), while the latter would impose additional burdens on the government to maintain and operate a system for each product or service that it seeks to access. Note that the latter could also leave the government facing a situation similar to the one law enforcement confronts today: the complexity created by the plethora of changing formats, compression algorithms, and protocols used by software applications.
The use of a third party to escrow keys may be perceived as preferable to having the government itself hold the keys, and it may be an attractive alternative for vendors that do not want to manage keys and authenticate
17 Of course the security risks to U.S. travelers are well known and assumed as a fact of life by many U.S. business travelers. Those handling sensitive information are generally advised to travel with a “throwaway” device to minimize their exposure.
requests for them, but third parties themselves will be attractive targets for attackers, especially if the same escrow agent maintains the keys for a wide range of systems.
One way to protect the key is to split it into pieces known as "shares" and store each share with a different organization. By analogy, many doors have a lower and an upper lock for extra security. You could, of course, give the lower lock key to one friend and the upper lock key to another, making it hard for either friend to misuse your key and enter your home on his or her own. If you did the same with the key to an encrypted computer, an adversary would need to attack both of your friends to unlock it. More sophisticated schemes allow a cryptographic key to be split—for example, into five shares such that any three suffice to decrypt. Two shares, however, are of no help in decrypting; consequently, if two shares are compromised by an adversary, the data is still safe. This technique can be used to protect high-value secret keys. The proposals from the 1990s called for the secret keys enabling law enforcement access to be split into parts held by separate government agencies. Although such a scheme provides additional security, it also introduces additional technical and organizational complexity associated with retrieving and combining the shares and thus, potentially, creates additional risks.
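The threshold scheme just described (five shares, any three sufficient) can be sketched with Shamir's classic polynomial secret sharing. This minimal Python version, with all names and the choice of prime purely illustrative, omits the engineering a real deployment would need:

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is modulo P

def split(secret: int, n: int, k: int):
    # Random polynomial of degree k-1 whose constant term is the secret;
    # each share is one point on the polynomial.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

key = random.randrange(P)            # e.g., an encryption key as an integer
shares = split(key, n=5, k=3)
assert combine(shares[:3]) == key    # any 3 of the 5 shares recover the key
assert combine(shares[1:4]) == key
```

Any two shares are statistically independent of the secret, which is the property that makes the compromise of a single escrow agent survivable.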
The key escrow approaches that were proposed in the 1990s—and the Clipper proposal in particular—were reviewed extensively by independent researchers. Several weaknesses in Clipper that could interfere with government access were discovered,18 but none were found that would weaken the encrypted communication between parties using the Clipper devices to communicate.
Owing to its complexity, it is difficult to design and implement a large-scale key escrow system securely. Indeed, a 1996 National Research Council report on cryptography19 recommended that an escrow scheme be tested at scale before requiring its use, something that has never been done for an escrowed communications system.
That report did not study the then-proposed Clipper scheme in depth but found that while any scheme that includes key escrow would enhance law enforcement access to encrypted information, it would also weaken the security of authorized users' information. The finding of weakened security was based on the theoretical potential for abuse of, or failure of, the escrow mechanism. The report also found that there
18 M. Blaze, 1994, “Protocol failure in the escrowed encryption standard,” pp. 59-67 in Proceedings of the 2nd ACM Conference on Computer and Communications Security, Association of Computing Machinery, http://www.crypto.com/papers/eesproto.pdf.
19 National Research Council, 1996, Cryptography’s Role in Securing the Information Society, National Academy Press, Washington, D.C.
was some benefit to authorized users, especially of storage encryption systems, from an escrow mechanism that would allow users to recover their own stored data in the event of a failure of the encryption system or associated key storage. The report made no attempt to quantify either the benefit to law enforcement or the cost in weakened security. Once the government abandoned its attempts to press for key escrow, Clipper and similar communications key escrow schemes disappeared from the market. There were few purchasers within the United States, and even fewer abroad.20
By contrast, many storage encryption products today offer key escrow-like features to avoid data loss or support business record management requirements. For example, Apple’s full disk encryption for the Mac gives the user the option to, in effect, escrow the encryption key. Microsoft Windows’ BitLocker feature escrows the key by default but allows users to request that the escrowed key be deleted. Some point to the existence of such products as evidence that key recovery for stored data can be implemented in a way that sensibly balances risks and benefits at least in certain contexts and against certain threats. In any case, data that is recoverable by a vendor without the user’s passcode can be recovered by the vendor for law enforcement as well. Key escrow-type systems are especially prevalent and useful where the user, or some other authorized person such as the employer, needs access to stored data. Key escrow-type systems are less prevalent for transitory communications, including text messages, where the ability to retrieve past content is often less important. There are, however, some settings, such as the financial industry, where there are requirements that communications be stored and retrievable.
Hardware-Based Device-Level Key Escrow for Access to Stored Data
An alternative approach for access to mobile devices would be to escrow a device decryption key in the device itself—for example, by storing it in some form of secure hardware. To retrieve the key from the hardware module, an investigator would—after receiving proper legal authorization—be required to obtain an authentication token from the key holder. With this token, investigators could retrieve a key that can be used to decrypt the data stored on the device. This would in turn require manufacturers to maintain a service to store and protect the keys used to generate the authentication tokens, validate law enforcement requests, and produce the tokens. This key could be stored whole or broken into pieces, with each piece escrowed separately with a different party. The
20 W. Diffie and S. Landau, 2010, Privacy on the Line: The Politics of Wiretapping and Encryption, MIT Press, Cambridge, Mass.
mechanism for validating the authorization token and releasing the decryption key would be managed entirely by hardware and designed so that it could not be triggered by software running on the device. Compared to the key escrow option described above, this approach has the advantage of only allowing the decryption key to be retrieved if one has physical custody of the device, but, otherwise, it raises the same risks and complexities.
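A rough sketch of the token-gated release described above, using a shared-secret HMAC token purely for illustration (a real design would more likely use asymmetric signatures, so the hardware would not need to hold the manufacturer's signing key; all names are hypothetical):

```python
import hashlib
import hmac
import os

# Secrets provisioned at manufacture (hypothetical names).
manufacturer_key = os.urandom(32)      # held by the manufacturer's service
device_id = b"SERIAL-0001"
escrowed_device_key = os.urandom(32)   # stored inside the secure hardware

def issue_token(device_id: bytes) -> bytes:
    # Manufacturer's service: after validating the legal request,
    # sign an authorization token bound to this one device.
    return hmac.new(manufacturer_key, b"unlock:" + device_id,
                    hashlib.sha256).digest()

def secure_hardware_release(token: bytes):
    # Runs inside the hardware module: release the escrowed key only
    # if the token is valid for this specific device.
    expected = hmac.new(manufacturer_key, b"unlock:" + device_id,
                        hashlib.sha256).digest()
    if hmac.compare_digest(token, expected):
        return escrowed_device_key
    return None

token = issue_token(device_id)
assert secure_hardware_release(token) == escrowed_device_key
assert secure_hardware_release(b"\x00" * 32) is None
```

Because the release logic runs only in hardware and the token is bound to a single device identifier, a leaked token unlocks one device rather than all of them.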
There are several approaches to providing law enforcement access to encrypted information that fall into the general category of "weakening" encryption. One is to limit the key length so that law enforcement or intelligence agencies can reasonably recover plaintext by trying all possible keys. Another is to implement an encryption algorithm that incorporates a feature allowing authorized agencies to use a special key or algorithm to recover plaintext. The first alternative is very similar to the approach tried during the 1990s, when exportable encryption products were limited to 40-bit keys. The second alternative, as described, has not previously been implemented. A variation of the first alternative, the IBM Commercial Data Masking Facility, involved manipulating a 56-bit DES key by setting certain bits to zero and encrypting the modified key using constant keys. Under that scheme, key management software continued to negotiate 56-bit keys, but law enforcement (or any other attacker) only had to exhaust a 40-bit key space to recover plaintext.
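The effect of masking key bits, as in the Commercial Data Masking Facility, can be illustrated with a toy cipher. In this sketch a nominal 8-byte key carries only 16 bits of entropy, so anyone can exhaust the space quickly; the cipher and all numbers are invented for illustration:

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy cipher: XOR with a SHA-256 keystream. Illustrative only.
    ks = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(plaintext, ks))

def weakened_key(short: int) -> bytes:
    # Nominal 8-byte key, but everything beyond 16 bits is forced to
    # zero -- analogous to CDMF masking a 56-bit key to 40 effective bits.
    return short.to_bytes(2, "big") + b"\x00" * 6

secret = weakened_key(0x2A51)
ct = toy_encrypt(secret, b"attack at dawn")

# Anyone, not just an authorized agency, can exhaust the reduced space.
for guess in range(2 ** 16):
    if toy_encrypt(weakened_key(guess), ct) == b"attack at dawn":
        break
assert guess == 0x2A51
```

A 40-bit space is about 16 million times larger than this 16-bit toy, but it yields to the same loop on modern hardware, which is precisely the objection raised in the next paragraph.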
Both approaches to weakening encryption have generally fallen out of favor as possible solutions. With today's widespread availability of computing resources, many actors could exploit systems using shortened keys. Like any vulnerability, once discovered, such weaknesses can be exploited by anyone. And any solution that required shortened keys or use of a specific encryption method would create a legacy problem: systems would have to accommodate the shortened keys in their protocols even if the methodology were found to be insecure and abandoned. That would create long-term risks, not only to the systems that employed shortened keys, but also to all systems needing to interact with them.21
21 The so-called FREAK exploit discovered in 2015 took advantage of flawed implementations of OpenSSL and Apple TLS/SSL client software. In unpatched systems, it allows an attacker to use a man-in-the-middle attack to force vulnerable clients and servers to use weakened encryption. See B. Beurdouche, K. Bhargavan, A. Delignat-Lavaud, C. Fournet, M. Kohlweiss, A. Pironti, P.-Y. Strub, and J.K. Zinzindohoue 2015, “A messy state of the union: Taming the composite state machines of TLS,” pp. 535-552 in 2015 IEEE Symposium on Security and Privacy, doi:10.1109/SP.2015.39.
Require Vendor Assistance But Impose No Requirements on Deployed Technology
This variant of the above approaches would seek to compel vendors to render reasonable assistance on a case-by-case basis but not impose requirements on the technology they deploy to enable such assistance. It is an option that would presumably be pursued under the All Writs Act or some legislative clarification or extension of that law. Of course, there may be very different views as to what constitutes “reasonable”—both in terms of the costs to the vendor and the risks to a vendor (and its users) if the tool or technique used by the vendor is itself discovered or stolen and then exploited. This sort of divergence was on display in the San Bernardino case, where the FBI sought to have Apple prepare and sign software that would allow it to unlock a phone recovered from one of the shooters—a request that was withdrawn after the FBI found a third party that could unlock the phone. New legislation could help establish parameters for what is reasonable, but extensive litigation is likely. The effectiveness of this mechanism may erode as vendors improve security to respond to the general threat environment or—potentially—to specifically hamper such assistance.
It is also possible that the market will bifurcate. If the legal standard depends on the difficulty and cost of providing technical assistance, some companies will seek to raise those costs so as to not be required to provide such assistance. Alternatively, other companies will seek to make it as easy as possible to comply in order to keep costs down. One result would be that the accessibility of one’s data by the government could vary quite a bit by vendor.
New cryptographic techniques might change the parameters of the debate in the future. Standard encryption enables anyone who holds the secret key to fully decrypt, while all others learn nothing about the plaintext data. Modern cryptography now offers a richer set of capabilities than simply full access or no access. For example, when certain ciphers are used to encrypt, it is possible to issue a restricted secret key that lets the key holder ascertain whether a certain keyword or phrase appears in the plaintext but learn nothing else about the plaintext. In theory, this restricted key could enable law enforcement to determine whether a suspected device contained certain keywords or phrases, while learning nothing else about the contents of the device. Researchers continue to make advances in this general technical area, but technologies for a general-purpose search on encrypted data are not yet ready for mass adoption.
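A greatly simplified illustration of such a restricted capability uses per-keyword HMAC tags: the holder of a tag can test whether a specific word was indexed but learns nothing else. Real searchable-encryption schemes are far more sophisticated; this sketch (all names assumed) only conveys the idea.

```python
import hashlib
import hmac
import os

master_key = os.urandom(32)   # the full key: can decrypt everything

def keyword_tag(key: bytes, word: str) -> bytes:
    return hmac.new(key, word.lower().encode(), hashlib.sha256).digest()

def index_document(text: str):
    # Stored alongside the ciphertext: one opaque tag per word.
    return {keyword_tag(master_key, w) for w in text.split()}

def restricted_key(word: str) -> bytes:
    # A per-keyword token: its holder can test for this word only.
    return keyword_tag(master_key, word)

tags = index_document("meet at the dock at midnight")
assert restricted_key("midnight") in tags   # match found ...
assert restricted_key("noon") not in tags   # ... nothing else learned
```

In this toy version the tokens are deterministic, so an observer can correlate repeated queries; the research schemes alluded to above aim to avoid such leakage, at considerable added complexity.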
These techniques would also pose many challenges in the context of investigations. First, even simple obfuscation, such as avoiding likely keywords or replacing the letter "i" with the letter "l" (ell), will prevent a keyword match. Second, more aggressive text obfuscation—notably prior encryption using other cryptography—will prevent the search from working altogether. Finally, a scheme that uses a restricted secret key generated by a trusted authority has the same difficulties as the key escrow schemes discussed above.
In the 1990s, law enforcement authorities seemed willing to accept the risk that end-users would install encryption features that did not implement the (then-proposed) key escrow mechanisms. Today, a similar question about effectiveness arises with proposals to regulate the use of encryption in mass-market products and services. Sophisticated criminals have always had means to evade surveillance, while changing the defaults has the potential to affect a much wider range of investigations. Nevertheless, it is important to note that smart and determined actors can employ a variety of techniques to evade a mandate against default encryption. At the same time, most people accept vendor defaults, even when they may present risks to their security and privacy. The discussion below does not speak to whether bad actors will seek to circumvent regulations on encryption but rather how they might go about doing so.
A few possible techniques are discussed below.
Adapt to a Platform-Level Mandate by Adopting Application-Level Encryption
If cryptography is implemented inside an application, a mandate placed on the vendor’s hardware and operating system (such as an Apple iPhone and its operating system iOS or the multiple vendors of Android phones) will not provide access to that application’s data, because even once the phone’s disk is decrypted, the application’s data remains encrypted and unavailable to an investigator. Already, dozens of applications support such application-level encryption, and many are developed outside of the United States. These include stand-alone applications that would allow a user to store encrypted files on a smart phone or laptop. Of course, if the user chooses a weak password in the application or a flaw in the application can be identified and exploited, it may still be possible to gain access.
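The layering point can be illustrated with a sketch of an app that encrypts its own data under a password-derived key before handing it to the operating system; decrypting the device's disk then yields only this ciphertext. The toy XOR stream cipher and all names are assumptions for illustration only.

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Slow key derivation so weak passwords are harder to brute-force.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (XOR with a SHA-256 keystream); illustrative only.
    out = bytearray()
    for off in range(0, len(data), 32):
        ks = hashlib.sha256(key + off.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[off:off + 32], ks))
    return bytes(out)

# The app encrypts its own data before handing it to the OS, so even a
# decrypted device disk yields only this blob to an investigator.
salt = os.urandom(16)
app_key = derive_key("correct horse battery staple", salt)
stored_blob = salt + xor_stream(app_key, b"private note")

# Only someone who knows the app-level password recovers the plaintext.
key2 = derive_key("correct horse battery staple", stored_blob[:16])
assert xor_stream(key2, stored_blob[16:]) == b"private note"
```

This is why, as the text notes, a weak application password or an exploitable flaw in the app remains the investigator's most realistic avenue.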
One could imagine imposing a similar requirement on all application developers, but there are a number of complications. First, developers reside worldwide, and it is not clear how one would impose this mandate on everyone. In fact, as with mandated access to devices, this mandate could disadvantage U.S. developers relative to foreign developers who are not subject to it. Second, as in the case of mandated vendor unlock, discussed above, it raises scale and process challenges for small firms. Third, even if one can regulate application software produced by companies, there is a wide range of open-source software with encryption capabilities that is freely available and modifiable.22
With some platforms, such as Apple’s iPhone, only applications approved by the vendor can be installed through the normal software installation channel. On these platforms, it may be possible to block the installation of applications that provide unapproved encryption.23 However, there are several ways for a determined and knowledgeable user to bypass these restrictions.
On some platforms, applications can be loaded from any number of widely available application stores. Even on more restricted platforms, such as Apple's iPhone, applications can be "side loaded" using freely available developer tools and distributed on a limited basis by anyone with a developer account. Another possibility is to "jailbreak" the phone (see below) and disable or remove the offending features.24 Moreover, platforms such as Android and iOS provide at least a limited ability to run code entered by the user. For example, one can install a Python interpreter from Apple's app store that can access the system clipboard, and use it for encryption and decryption.
Applications providing encryption could even be run entirely within a Web browser, making it even more difficult to regulate their use. As with jailbreaking (see below), it is important to understand how much such bypassing erodes the benefits to law enforcement of access mandates. There are also platforms, such as the open-source Android operating system, that do not impose such restrictions on what software can be installed.
These considerations point to the difference between attempting to make an exceptional access regime work against a skillful adversary, which is impractical, and making it work for mass-market, default communications and storage products and services. The only way to guarantee that
22 A recent global survey of encryption products found that one-third were open source. See B. Schneier, K. Seidel, and S. Vijayakumar, 2016, A Worldwide Survey of Encryption Products, Berkman Center Research Publication No. 2016-2, https://ssrn.com/abstract=2731160.
23 Indeed, Apple recently removed encryption and VPN software from its China app store at the request of the Chinese government.
24 Because jailbreaking involves breaking the secure boot mechanism, it reduces protection against malicious software.
every form of encryption is subject to exceptional access is to certify the software that is allowed to run on every storage and communication device, which would be extremely expensive, intrusive, and bad for innovation. Consider by analogy the situation with physical search: a skillful adversary can make it effectively impossible for government to find physical objects, but it is tricky, expensive, and inconvenient to do this, so one does not abandon physical search just because it may not be effective in some circumstances.
Also, with respect to the possibility that third parties create applications that provide encryption without exceptional access, it is important to bear in mind that correctly designing and implementing systems that use encryption is challenging. One consequence in a world in which exceptional access is mandated is that systems without exceptional access may be less secure than mainstream systems that do provide exceptional access. The latter can benefit from the resources and expertise of the large enterprises or consortia that develop, deploy, and maintain them, which may stand in contrast to the groups that build capabilities intended to thwart a requirement.25
Install Alternative Operating System Software
Vendor mandates rely on devices running operating system software that properly responds to unlock codes. If the user is able to alter the existing operating system or install an alternative operating system (which on some platforms requires circumventing vendor security measures), the device may no longer respond to unlock codes. When faced with such a device, law enforcement might be unable to unlock it, even with the vendor’s assistance.
Jailbreaking of mobile devices is a fairly common occurrence, although by no means universal—and vendors are strongly motivated to prevent all devices they have supplied from being broken by a single “wholesale” attack of the sort needed for a jailbreak. It is worth noting that an exploit used to jailbreak is different from one that is used to circumvent a device lock and encryption; jailbreaks work only after a phone has been unlocked.
25 Note, however, that the messaging app Signal, which is believed to work securely, was developed by two people. A system that includes exceptional access would be more complicated and might require a larger team.
Use Legacy Devices or Software
This technique relies on continuing to use devices or software that are not compliant with a new regulation. It is in general difficult to stop people from using old hardware and software, although this is a time-bounded problem as people upgrade to take advantage of new capabilities. As discussed above, app store restrictions do not apply on all platforms and, in any event, are not foolproof for restricting access to noncompliant software. With respect to hardware, legacy smartphones could presumably be restricted through regulations on what devices cellular carriers allow to connect to their networks, but laptops and other hardware could not similarly be regulated.
Use Other Techniques for Concealing Messages and Stored Data
Finally, determined actors can use other techniques to conceal messages or stored data. One way is to use steganography, a technique where one hides private information by embedding it in public data. For example, one can hide secret information in public images such as pictures of cats. The information is hidden in the pixels representing the cats, and only someone who knows where to look will find it. To a law enforcement agent, the images look like normal, uninteresting images. Many free steganography applications are currently available for both Android and iOS, and all are quite easy to install and use. The use of such applications could be regulated, subject to the same caveats as encryption applications. It is sometimes possible to detect steganographic messages but not necessarily to decrypt them.
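Least-significant-bit embedding, the simplest form of image steganography, can be sketched as follows. The "pixels" here are placeholder bytes standing in for image data, and real tools typically add encryption and spread the bits less predictably:

```python
def embed(pixels: bytearray, message: bytes) -> bytearray:
    # Hide each bit of the message in the least significant bit of a
    # pixel byte; the image is visually unchanged.
    out = bytearray(pixels)
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    assert len(bits) <= len(out)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def extract(pixels: bytearray, length: int) -> bytes:
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
    )

# "Pixels" of an innocuous cat photo (here just placeholder bytes).
cat_photo = bytearray(range(256)) * 4
stego = embed(cat_photo, b"midnight")
assert extract(stego, 8) == b"midnight"
# Each byte changes by at most 1 -- imperceptible in a real image.
assert all(abs(a - b) <= 1 for a, b in zip(cat_photo, stego))
```

Statistical analysis of the LSB distribution is what makes such naive embedding detectable, which is why detection is sometimes possible even when the hidden content cannot be recovered.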
Another technique is to employ secret sharing, in which the user splits a file into two or more shares such that a single share reveals nothing about the file contents. Each share is stored on a different device, so that capture of a single device reveals nothing. An investigator who recovers a phone but not the other device will learn little from unlocking it.
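The two-share case reduces to a one-time pad: either share alone is uniformly random. A minimal sketch (file contents and names illustrative):

```python
import os

def split_file(data: bytes):
    # Two shares: a one-time pad and the XOR of pad and data.
    # Either share alone is uniformly random and reveals nothing.
    pad = os.urandom(len(data))
    return pad, bytes(a ^ b for a, b in zip(pad, data))

def recombine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

share_a, share_b = split_file(b"incriminating ledger")
# share_a stays on the phone, share_b on a second device;
# seizing either device alone yields only random-looking bytes.
assert recombine(share_a, share_b) == b"incriminating ledger"
```

The threshold variants discussed earlier in this chapter generalize this idea to k-of-n recovery, at the cost of more complex arithmetic.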
There are other avenues for investigators to gain access to plaintext and other digital information that may aid an investigation. Some have argued that these tools, especially in aggregate, may serve as at least a partial substitute for regulations that mandate exceptional access. Others, notably from the law enforcement community, have warned that these tools, although useful in some cases, will not be a satisfactory substitute in many others. The following are some important examples.
Although encryption hides content, current encryption systems generally do not hide metadata. For example, in a chat system, metadata includes information such as who communicated with whom, for how long, the participants' locations, and so on. Metadata is also needed to prevent spamming and denial of service, which is why many existing systems collect this information. Such metadata provides investigators with new sources of information that go well beyond the old "who called whom when" provided by the telephone network.26 Nonetheless, metadata is not necessarily a substitute for content (see the section "The Practical Utility of Alternatives to Exceptional Access" in Chapter 4). Nor is it a given that all metadata will remain unencrypted. Methods have been developed for hiding "who communicated with whom," and as these technologies become more robust and popular, even some basic metadata could become unavailable.
Access Data Stored in Cloud Services
In coming years, the majority of user data at rest will probably be stored in the cloud. Because in many instances users or service providers want access to this data (e.g., for searching, aggregation, and analysis), the data is typically stored in a way that enables the provider to access the data in the clear. Law enforcement can interact with cloud providers to obtain data that they need for investigations, such as data mirrored or backed up from mobile devices. Using this approach requires no modification of cryptographic systems on the mobile devices. However, some firms offer services that encrypt data so that the service provider cannot access it. Also, users who are sensitive to the possibility of government access can generally opt out of cloud storage for their data.
This section focuses on technical aspects of lawful hacking; legal aspects are discussed in the section “Legal Aspects of Lawful Hacking” above.
The scope of impact—and potential risk—will depend on the exploit that is used. Sometimes the government will succeed through an approach that affects only a single device, such as leveraging misconfigurations of a target system. Other times, the government will exploit a vulnerability in
26 Berkman Center for Internet and Society at Harvard University, 2016, Don’t Panic: Making Progress on the “Going Dark” Debate, February 1, https://cyber.harvard.edu/pubrelease/dont-panic/Dont_Panic_Making_Progress_on_Going_Dark_Debate.pdf.
the target system. When the government finds vulnerabilities in products that lead to exploits, it can choose either to retain these exploits or to report the vulnerabilities to the vendor. The product is vulnerable in either case; choosing which alternative to take is a matter of policy. But the fact that the vulnerability was discovered means it is possible that others will discover it too, although the probability of independent discovery is a matter of debate.
The equities for lawful hacking may vary considerably depending on the particular circumstances. For example, one consequence of pursuing this approach is that it increases the incentives for government (and the contractors that provide lawful hacking services to government) to acquire and hold exploits rather than report them to vendors. As with the use of hacking techniques for foreign intelligence collection, it requires that the benefits to investigators be balanced with the risks to users of systems with unpatched vulnerabilities. The Obama administration established the “vulnerabilities equities process” in an attempt to address these trade-offs, and it has recently been updated and made public.27 Another consideration with respect to equities is that vulnerabilities are fragile; they may be discovered and fixed if they are used. They may also be discovered and reused by other parties. A final risk is that information about vulnerabilities or hacking tools that use them can leak or be stolen from government agencies and then be used by malicious actors.
From a technical perspective, there are three domains where tools are needed: locked devices, encrypted data in the cloud, and encrypted communications. In each case, the challenge for investigators is acquiring the tools needed to cover all the devices and services that may arise in investigations—and obtaining the necessary resources. One question is for which devices law enforcement would have hacking tools and whether those tools would require physical possession of the device. Another is the time and effort required to use the tools. From the perspective of law enforcement, something fast and reliable would be best, but one can imagine tools that require thousands of dollars' worth of computation or days to weeks of effort. Moreover, given the fragility and specificity of lawful hacking approaches, law enforcement will need to develop or otherwise acquire a large number of exploits, which will be expensive and time-consuming. Such delay may be acceptable for some investigations, but for others, it may put law enforcement into a situation where they are always lagging the events they are charged to investigate.
Exploitable vulnerabilities will always be present in software, espe-
27 Executive Office of the President, 2017, Vulnerabilities Equities Policy and Process for the United States Government, November 15, https://www.whitehouse.gov/sites/whitehouse.gov/files/images/External%20-%20Unclassified%20VEP%20Charter%20FINAL.PDF.
cially in large and complex systems. However, it will not necessarily be easy for the government, especially state and local law enforcement agencies, to stay ahead of continually improving security technologies. Many vendors have made major investments in software security, which will likely raise the cost of discovering vulnerabilities. Patches are also available much more rapidly in today's environment, where users are accustomed to constant updates. Trusted boot, which is fundamental to device encryption, has made malware implantation harder. Other security advances that impede hackers include anti-hammering protections (which mitigate the risk of repetitive password attacks), biometric and two-factor authentication (which reduce reliance on passwords and the risk of phishing), and anonymous routing services (which make it more difficult to identify endpoints and targets). Additionally, if the lawful hacking attack is discovered and the attack vector is understood and publicized, it will be remediated, creating further challenges for those seeking access—assuming the software is updated, which may or may not be automatic depending on the particular vendor and context.
As a result, lawful hacking of individual communications applications such as Snapchat and devices such as iPhones and laptops with full disk encryption will require a level of effort that may well not scale to the number of investigations implicated and may well not be feasible for all investigative agencies. There are also limits on what tools may be appropriate for law enforcement agencies to use. For example, some of the means at the disposal of the National Security Agency would be inappropriate or illegal for traditional law enforcement.
Acquire Better Tools and Capabilities for Accessing and Analyzing Plaintext
As observed earlier, encryption is not the only barrier to effective use of plaintext. Even when information is not encrypted, it can be difficult for law enforcement or intelligence agencies to access and analyze information. Information may be transmitted using nonstandard protocols or stored in unfamiliar formats—and these may change on a regular basis as companies evolve software and services. Making effective use of data, especially when the volumes are large, requires specialized tools and expertise. Acquiring these capabilities will require additional resources; see the next section.
The following are options for providing additional financial or technical support to respond to the challenges posed by encryption:
- Provide law enforcement with additional financial resources. Law enforcement capabilities and spending have not kept pace with the growing role of digital evidence in investigations.28 In today’s world of multiple types of devices, applications, and networks, the “one-size-fits-all” solutions of the telephony era are no longer possible. Thus greater technical expertise is a necessity in modern investigations. With additional resources, the government could hire more specialists, pursue more sources of information, find additional clever workarounds when data is encrypted, expand capabilities for lawful hacking, and find and punish more criminals. As indicated previously, with more resources, law enforcement would be able to access and use sources of information that are now too difficult.
- Sharing and access to specialized services. When one law enforcement group learns how to use a new source of data, or finds (or develops) a useful tool, how do other law enforcement groups learn about this, as there are surely some that could use it? More broadly, there are clear economies of scale. A small town—or even small city—police department cannot maintain the technical staff needed to find new sources of data or learn about new methods and tools. At the same time, it may be desirable to limit the spread of sensitive techniques used to access plaintext, lest they leak out and bad actors either learn how to circumvent them or make use of the techniques for their own purposes.
Both formal and informal sharing institutions can play a role if properly staffed and funded. Existing specialized federal entities and government-affiliated nonprofits (Box 5.2) and the existing analysis capabilities of federal, state, and local law enforcement agencies can be leveraged to assist law enforcement groups that lack the necessary skills and equipment. For this assistance to partially offset the loss of plaintext, the capabilities and scale of these entities may well have to increase by orders of magnitude.
- Enhance corporate outreach to law enforcement. Companies could enhance their efforts to engage the law enforcement community, making sure law enforcement officials are familiar with their products and what data does and does not exist. (Fully implementing this would require a detailed discussion of the changing details of what data is retained, how its location is determined, and how long it is kept.) The U.S. Department of Justice's National Domestic Communications Assistance Center currently facilitates some level of assistance. Other avenues for cooperation include the following:
- Vendors supply source code and internal documentation. Some vendors, such as those selling secure messaging, might be willing to voluntarily supply information if the government's equities process provides sufficient priority to the release of vulnerabilities to vendors so that the vendors can remediate them. One objection is that for many programs the implementation changes several times a year, so guaranteeing that the supplied code is what is running on a particular device would be challenging. In the long run, this is likely to make the systems more secure; law enforcement's opportunity comes between the discovery of an exploitable vulnerability and when it is patched in the systems they care about. Pursuing this option assumes that the vulnerabilities equities process favors disclosure to the vendor and assumes that law enforcement agencies have the capability to discover new vulnerabilities.
28 The gap is recognized in the FBI's fiscal year (FY) 2017 budget request that included an increase of $38 million to "counter the threat of Going Dark" (U.S. Department of Justice, FY 2017 Authorization and Budget Request to Congress, https://www.justice.gov/jmd/file/821341/download). The FY2018 request calls for an increase of $22 million, 80 positions, and 20 agents for "Going Dark/Investigative Technology" (U.S. Department of Justice, FY 2018 Budget Request at a Glance, https://www.justice.gov/jmd/page/file/968261/download).
- Vendors voluntarily share information about vulnerabilities. Vendors will fix the security vulnerabilities they find, but in the interim, legitimate government interests as well as bad actors could potentially make use of the vulnerabilities. Absent a requirement to share these bugs, it is not clear, especially in the current environment, whether vendors would actually participate; they generally avoid sharing vulnerabilities for any offensive purpose in part because they fear legal liability and reputational damage if they enable their own customers to be attacked successfully. With both this case and the one that follows, a further complication to consider is the risk that the government agencies that hold a vulnerability may lose it; the risk is more than theoretical in light of reports about government-held tools being compromised.
- Third parties voluntarily share information about vulnerabilities. Third parties often discover security vulnerabilities and report them to vendors. They could share this information with the government before, or at the same time as, they provide it to vendors.
- Support innovation. Research funding may lead to innovative technical solutions that better accommodate government access and end-user security. One example would be searchable encryption, discussed above as well as in Chapter 2.
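To illustrate the kind of innovation the last bullet points to, the following is a toy sketch of symmetric searchable encryption, our own simplification rather than any scheme proposed in the policy debate. All names here are illustrative. A client derives a deterministic, keyed search tag for each keyword; a server stores only tags and document identifiers and can answer a query without learning the keyword itself, since forming a valid query token requires the client's key.

```python
import hmac
import hashlib

def search_tag(key: bytes, keyword: str) -> bytes:
    """Deterministic per-keyword tag; reveals only keyword equality, not the keyword."""
    return hmac.new(key, keyword.lower().encode(), hashlib.sha256).digest()

class EncryptedIndex:
    """Server-side index mapping opaque tags to document identifiers."""

    def __init__(self):
        self._index = {}  # tag -> list of document ids

    def add(self, tag: bytes, doc_id: str) -> None:
        self._index.setdefault(tag, []).append(doc_id)

    def search(self, tag: bytes) -> list:
        # The server matches tags byte-for-byte; it never sees plaintext keywords.
        return self._index.get(tag, [])

# Illustrative usage: the key never leaves the client.
key = b"client-secret-key"
server = EncryptedIndex()
server.add(search_tag(key, "wiretap"), "doc-17")
server.add(search_tag(key, "budget"), "doc-42")
print(server.search(search_tag(key, "wiretap")))  # prints ['doc-17']
```

A real design would also encrypt the document identifiers and hide access patterns; the sketch only shows the core idea that a query can be evaluated over encrypted metadata.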
As discussed earlier, there are two broad categories of possible legislation mandating government access to encrypted information.
- Enact legislation that requires that device vendors or service providers provide government access to plaintext without specifying the technical means of doing so. The mandate could be framed in a variety of ways, each raising different problems. For example, it might require that vendors be able to comply with warrants seeking access to the plaintext of information that their products and services have encrypted, while leaving it to industry to design the technical solution.
- Enact legislation requiring a particular technical approach. For example, a law or regulation could require vendors to implement hardware-based device-level key escrow for access to stored data or require vendors or third-party key escrow for access to communications. As a middle ground, a law could call for rulemaking to select a technical approach.
The first approach has the advantage of allowing industry greater flexibility in developing and selecting solutions that best fit their technical and business circumstances. On the other hand, if industry is left to choose, there may be a plethora of plaintext recovery solutions adopted. This cacophony in the marketplace may be a challenge to government agents, because they will have to rely on different techniques in different cases. But because the United States cannot regulate what applications and devices are developed outside U.S. borders, some degree of cacophony is likely to exist for investigators regardless of limits created by U.S. legislation.
By contrast, the second approach may be more burdensome if the selected approach is difficult for vendors to implement, and it would not foster innovative solutions. However, it may provide the greatest scale because law enforcement would have a repeatable and dependable way to access plaintext when authorized pursuant to law. At the same time, because everyone would adopt the same technical approach, it would magnify the risk of catastrophic failure if that common approach were to have exploitable security flaws.
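To make the second, technology-specific option concrete, here is a toy sketch of one escrow idea discussed in this debate: splitting a device key between two escrow agents so that no single repository holds a usable copy. This is our own minimal illustration using XOR secret sharing, not any proposed standard or vendor design; all names are illustrative.

```python
import secrets

def split_key(device_key: bytes) -> tuple:
    """Split device_key into two shares; each share alone is indistinguishable from random."""
    share_a = secrets.token_bytes(len(device_key))
    share_b = bytes(a ^ k for a, k in zip(share_a, device_key))
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine both escrowed shares to recover the device key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Illustrative usage: shares would be held by separate escrow agents,
# so plaintext access requires both to cooperate under legal process.
device_key = secrets.token_bytes(32)
share_a, share_b = split_key(device_key)
assert recover_key(share_a, share_b) == device_key
```

The sketch captures only the key-splitting step; a deployable system would also have to specify how shares are requested, authenticated, audited, and protected against the catastrophic-failure risk noted above.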
Even for a technically non-specific mandate, there are many details that would need to be worked out and specified in a legislative proposal. For example,
- Which companies, products, and services are covered, and what are the exact responsibilities of vendors and service providers?
- How will legacy devices be treated? Are they exempt from the requirements, or must they be taken out of service if they cannot comply? If they are not grandfathered, how will the requirements be enforced?
- How robust must the exceptional access mechanism be against user efforts to disable it?
- What rules apply to devices that are carried into the United States by foreign visitors? Must they be retrofitted or disabled at the border? Similarly, what rules apply to services provided by firms without a clear U.S. presence, and how would these rules be enforced?
In terms of cost, if industry is left to innovate, it will incur research and development costs, product (re)development costs, and the costs of complying with any access regime. (Some of those costs are incurred today with regard to access to plaintext, such as the costs of responding to judicial process.) If the government mandates a particular solution, there will be the cost of re-engineering systems, protecting the access mechanisms, and responding to government requests for data.