D
Security and Privacy

Voter registration systems are known to be points of risk in election administration systems. Indeed, the ostensible purpose of voter registration is to make the election system more secure against fraud in the first place. When a voter registration system is computer-based, security thus becomes an issue.

Security is the property of a computer system whereby the system does what is required and expected in the face of deliberate attack.1 For purposes of this report, privacy refers to the protection of the information contained within the VRD against improper access.

As the comments in this appendix indicate, privacy and security issues related to VRDs are not merely technical issues; a mix of policy and technology is relevant to their consideration, and these issues are nothing if not hard to resolve, especially on a limited timescale. For this reason, the committee does not view these issues as having easy resolution in the short term. Accordingly, the committee will address these issues in its future deliberations, and the final report will include both more substantial analysis and recommendations related to security and privacy.

SECURITY2

Although the security of electronic voter registration systems has not been subject to the levels of scrutiny directed at electronic voting systems, the security of VRD systems is nonetheless important. Security of computer systems is usually conceptualized in terms of confidentiality, integrity, and availability:3

  • Confidentiality. A secure system will keep protected information away from those who should not have access to it. Examples of failures that affect the confidentiality of a VRD include an unauthorized party obtaining voter information on a large scale, or a spouse abuser obtaining the address of his or her spouse from a VRD even if such information is supposed to be protected.

  • Integrity. A secure system produces the same results or information whether or not the system has been attacked. When integrity is violated, the system may continue to operate, but under some circumstances of operation it does not provide the accurate results or information that one would normally expect. An example of a failure that affects the integrity of a VRD is an unauthorized change that could force an individual to show identification at the polls when in fact there is no such requirement for that individual.

  • Availability. A secure system is available for normal use even in the face of an attack. An example of a failure in availability might be a system that is clogged with so much bad data that it no longer operates reliably (for example, a flood of bogus paper voter registration applications that overwhelms the data-entry staff in a particularly critical jurisdiction).

A number of security breaches of VRDs have been reported.4 For example, on October 23, 2006, an official from the not-for-profit Illinois Ballot Integrity Project reported that his organization had used the Chicago voter database remotely to compromise the names, Social Security numbers, and dates of birth of 1.35 million residents. According to a spokesman for the Chicago Election Board, the problem arose because the city's database allowing voters to locate their voting precinct once asked voters for detailed information such as Social Security numbers; even though the Web site was later updated to require only names and addresses to make a query, the links to the Social Security numbers and dates of birth were never eliminated.5

Developing secure systems (where "system" is intended to include the human and organizational aspects of a system as well as the technology) is a challenging task, and much has been written about such matters. But it is essential to consider three fundamental points about security.

First, good security practices require building security in from the start. Good system specifications inform analysts of what is "required and expected" behavior. Good software engineering enables the system to be implemented in a way that conforms to the system specification. Formal verification methods and other analysis tools may be helpful in showing that implementations faithfully conform to certain aspects of their specifications.

Second, security threats can arise even in systems that are not connected to the Internet. Although Internet connections are often an important source of vulnerability, they are most assuredly not the only source. The recent history of computer security is replete with examples of security compromises that had nothing to do with the Internet, such as data on stolen laptops, attacks from insiders abusing their privileges, and "social engineering" attacks involving humans posing as other humans, often over the telephone, in order to learn credentials such as passwords that can enable them to access systems and files they should not be able to access. For example, video surveillance cameras caught two intruders in Mississippi on June 23, 2006, stealing hard drives from 18 computers. The stolen data files contained names, addresses, and Social Security numbers of current and former city employees and registered voters, as well as bank account information for employees paid through direct deposit and for water system customers who paid bills electronically.6

Third, any realistic assessment of a system's security involves actual testing of the system's security by an adversary who is motivated to compromise it. Although testing cannot, and does not, necessarily reveal all security problems (and does nothing by itself to eliminate such problems), testing can often identify some remaining failures.

BOX D.1
Excerpts from a 2006 Study of Voter Registration Databases Relevant to Privacy and Security

The following material is reprinted from the executive summary and the main text of Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, a 2006 report by the U.S. Public Policy Committee of the Association for Computing Machinery.

2. Accountability should be apparent throughout each VRD. It should be clear who is proposing, making, or approving changes to the data, the system, or its policies. Security policies are an important tool for ensuring accountability. For example, access control policies can be structured to restrict actions of certain groups or individual users of the system. Further, users' actions can be logged using audit trails (discussed below). Accountability also should extend to external uses of VRD data. For example, state and local officials should require recipients of data from VRDs to sign use agreements consistent with the government's official policies and procedures.

3. Audit trails should be employed throughout the VRD. VRDs that can be independently verified, checked, and proven to be fair will increase voter confidence and help avoid litigation. Audit trails are important for independent verification, which, in turn, makes the system more transparent and provides a mechanism for accountability. They should include records of data changes, configuration changes, security policy changes, and database design changes. The trails may be independent records for each part of the VRD, but they should include both who made the change and who approved the change.

4. Privacy values should be a fundamental part of the VRD, not an afterthought. Privacy policies for voter registration activities should be based on Fair Information Practices (FIPs), which are a set of principles for addressing concerns about information privacy. FIPs typically address collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. There are many ways to implement good privacy policies. For example, we recommend that government both limit collection to only the data required for proper registration and explain why each piece of personal information is necessary. Further, privacy policies should be published and widely distributed, and the public should be given an opportunity to comment on any changes. . . .

6. Election officials should rigorously test the usability, security and reliability of VRDs while they are being designed and while they are in use. Testing is a critical tool that can reveal that "real-world" poll workers find interfaces confusing and unusable, expose security flaws in the system, or that the system is likely to fail under the stress of Election Day. All of these issues, if caught before they are problems through testing will reduce voter fraud and the disenfranchisement of legitimate voters. . . .

Security Against Technical Attacks

. . . [M]echanisms should be deployed to detect any penetration of system defenses, as well as any insider misuse. For example, application-specific intrusion detection systems could be used to monitor the number of updates to the VRD. Any large spike in activity, whether by an authorized user or in the aggregate, might warrant human attention. In addition, officials could consider contracting with a third-party network security monitoring service to detect network intrusions and attempted attacks on the system. . . .

. . . Officials should consider including an independent security review and publication of the software as part of the acceptance testing for the system. Claims that the security of the system will be endangered by such a review should be treated with extreme skepticism or rejected outright. . . .

SOURCE: U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf. © 2006 ACM. Excerpted with permission. ISBN: 1-59593-344-1. Permission to make digital or hard copies of portions of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permission from permissions@acm.org.

1 Reliability in the face of human, machine, or network failure is also an important dimension of system trustworthiness, but this appendix focuses on security against deliberate attack.

2 There is an extensive body of National Research Council work on computer security issues, beginning with Computers at Risk: Safe Computing in the Information Age, 1990, and continuing with Cryptography's Role in Securing the Information Society, 1996; Trust in Cyberspace, 1999; Realizing the Potential of C4I: Fundamental Challenges, 1999; Making IT Better: Expanding IT Research to Meet Society's Needs, 2000; Cybersecurity Today and Tomorrow: Pay Now or Pay Later, 2002; Software for Dependable Systems: Sufficient Evidence?, 2007; and Toward a Safer and More Secure Cyberspace, 2007, all published by the National Academy [Academies] Press, Washington, D.C. In addition, an extensive discussion of security and privacy issues specifically with reference to voter registration databases is contained in U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf. Excerpts from the executive summary of this report relevant to privacy and security are provided in Box D.1.

3 See, for example, NRC, Toward a Safer and More Secure Cyberspace, The National Academies Press, Washington, D.C., 2007.

4 See http://www.privacyrights.org/ar/ChronDataBreaches.htm. This site contains descriptions of a number of data breaches involving actual VRDs, and a number of others of potential relevance to VRDs.

5 See http://abcnews.go.com/Politics/story?id=2601085; http://www.electiondefensealliance.org/chicago_voter_registration_database_wide_open.

6 See http://www.privacyrights.org/ar/ChronDataBreaches.htm.

PRIVACY

Some of the information in VRDs is, by law, public information, although the specifics of which data items can be regarded as public vary from state to state. In addition, states often limit the purposes for which such information may be used. Nevertheless, the electronic availability of such information raises concerns about privacy, because electronic access greatly increases the ease with which the information can be made available to anyone, including those who might abuse it.

One of the thorniest issues regarding privacy is the tension it sometimes poses with transparency. In its starkest terms, maintaining privacy involves withholding certain information associated with individuals from public view, while transparency involves the maximum disclosure of information, even if such information is associated with individuals.

As an illustration of how these tensions play out, consider a proposition regarding the public disclosure of the reason(s) for removing specific individuals from voter registration lists. On one hand, the removal of a voter from a VRD is often associated with a stigmatizing condition, such as being a felon or being declared mentally incompetent. Those mistakenly removed from a VRD may experience adverse consequences from such association, and even if the removal is correctly performed, those individuals are still arguably entitled to some measure of privacy. Thus, a person balancing the scales in favor of privacy would argue that the reasons for removing individuals from the VRD should be kept confidential, as they already are in some states. On the other hand, advocates of greater transparency argue that removals from a VRD should be subject to public oversight in the same way that additions are. They point out that convictions and even arrest records are generally public, and thus argue that withholding the reasons for removal from a VRD does not really protect the privacy of these individuals anyway. At the same time, they argue that associating reasons for removal with specific individuals is critical to determining the qualification of voters, and that statistical tabulations alone would not provide the detail needed to investigate individual errors that might indicate systemic problems.
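To make concrete what a statistical tabulation of removal reasons might look like, the sketch below aggregates hypothetical removal records and suppresses small cells that could effectively identify individuals. It is illustrative only and is not drawn from the report: the record fields, the reason categories, and the suppression threshold are all assumptions.

```python
from collections import Counter

# Hypothetical removal records: (voter_id, reason). The field names and
# reason categories are illustrative assumptions, not an actual VRD schema.
removals = [
    ("v001", "moved out of state"),
    ("v002", "deceased"),
    ("v003", "moved out of state"),
    ("v004", "felony conviction"),
    ("v005", "moved out of state"),
    ("v006", "deceased"),
]

SUPPRESSION_THRESHOLD = 2  # assumed policy: hide cells smaller than this


def tabulate_removals(records, threshold=SUPPRESSION_THRESHOLD):
    """Aggregate removal reasons for publication; fold counts below the
    threshold into a residual cell so rare reasons cannot be tied back
    to specific individuals."""
    counts = Counter(reason for _, reason in records)
    published = {}
    suppressed_total = 0
    for reason, n in counts.items():
        if n >= threshold:
            published[reason] = n
        else:
            suppressed_total += n  # do not reveal rare reasons directly
    if suppressed_total:
        published["other (suppressed)"] = suppressed_total
    return published


print(tabulate_removals(removals))
```

Whether to suppress small cells at all, and at what threshold, is a policy choice for state officials rather than a technical given; the point of the sketch is only that aggregate publication and individual-level disclosure are separable design decisions.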

The committee noted significant value, without much negative impact on privacy, in statistical tabulations of the reasons for voters being dropped from a VRD and publication of such tabulations, as well as in personal and private notification of individual voters of the reason(s) for being dropped. But the different points of view described above were reflected within the committee, and thus the committee takes no position on the desirability or undesirability of the above proposition. The committee might address this point in its final report.

Other privacy advocates have raised concerns about the widespread availability of complete voter registration information in the context of the physical security of battered men or women. Such individuals have good reason to keep their addresses private, and may be apprehensive, with good reason, about the availability of their addresses to their batterers. A second concern relates to abuse of lists of validated addresses for commercial marketing purposes: many citizens would be upset to know that the information they provided to exercise their right to vote in a democracy is also being used for commercial purposes. Addressing such issues properly belongs to state policy makers, who can develop (and sometimes have developed) regulation and law to protect citizen interests; for example, some states allow only political parties to obtain voter registration lists.

A second set of privacy issues arises from matching and linking records. For example, voter registration lists may be matched against a list of convicted felons. If a list of voters removed from the VRD is made public, those removed from the list improperly or removed for other reasons (that is, all nonfelons removed from the list) may be tainted by association in the public eye. Similarly, if a voter registration list is made public that indicates the source of an individual application, those who registered to vote at public assistance agencies might regard their privacy rights as having been violated. Although overt public disclosure would violate the National Voter Registration Act (NVRA), accidental disclosure through a security breach might have a similar result. This could in turn reduce the likelihood that people will seek out public assistance if seeking it automatically places that information in a publicly accessible voter registration record. Alternatively, where registration is not automatic, it may reduce the number of individuals who take advantage of the ease of registering at a public assistance agency and thereby undercut the goal of the program.

A third set of privacy issues arises from insider access to the VRD. Insiders such as election officials can be expected to have access to the full set of information associated with any individual record, and possibly to some of the information in matched records existing in other databases. Although most election officials are trustworthy in this regard, a few might seek to use this access improperly for personal benefit or gain, and security measures (such as tamper-proof audit logs) are needed to prevent or deter such inappropriate insider access.
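Tamper-proof audit logs of the kind mentioned above are commonly approximated in practice with hash chaining, in which each log entry incorporates a cryptographic hash of the entry before it, so that an after-the-fact modification anywhere in the log invalidates every later hash. The sketch below is a minimal illustration of that idea, not an implementation from any actual VRD; the only details taken from the text are the entry fields recording who made and who approved a change, per the ACM recommendations excerpted in Box D.1.

```python
import hashlib
import json


def _entry_hash(prev_hash, record):
    """Hash an entry together with the previous entry's hash, so that
    altering any earlier entry invalidates every subsequent hash."""
    payload = json.dumps(record, sort_keys=True)  # canonical serialization
    return hashlib.sha256((prev_hash + payload).encode("utf-8")).hexdigest()


class AuditLog:
    """Append-only, tamper-evident log: each entry records who made and
    who approved a change, chained by hash to all prior entries."""

    GENESIS = "0" * 64  # fixed starting value for the chain

    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, made_by, approved_by, change):
        prev = self.entries[-1][1] if self.entries else self.GENESIS
        record = {"made_by": made_by, "approved_by": approved_by,
                  "change": change}
        self.entries.append((record, _entry_hash(prev, record)))

    def verify(self):
        """Recompute the chain; returns False if any entry was altered."""
        prev = self.GENESIS
        for record, stored in self.entries:
            if _entry_hash(prev, record) != stored:
                return False
            prev = stored
        return True


log = AuditLog()
log.append("clerk_17", "supervisor_3", "update address for voter 123")
log.append("clerk_17", "supervisor_3", "remove voter 456 (deceased)")
assert log.verify()

# An after-the-fact edit is detectable: the stored hashes no longer match.
log.entries[0][0]["change"] = "update address for voter 999"
assert not log.verify()
```

Because an insider with full write access to the log file could simply recompute the entire chain, a real deployment would also need to anchor the chain externally, for example by periodically publishing or having a third party countersign the latest hash.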