D
Security and Privacy

Voter registration systems are known to be points of risk in election administration systems. The ostensible purpose of voter registration is to make the election system more secure against fraud in the first place. When a voter registration system is computer-based, security thus becomes an issue.

Security is the property of a computer system whereby the system does what is required and expected in the face of deliberate attack.1 For purposes of this report, privacy refers to policies that protect the information contained within the voter registration database (VRD) against inappropriate access.

As the comments in this appendix indicate, privacy and security issues related to VRDs are not merely technical issues. Indeed, a mix of policy and technology is relevant to their consideration, and these issues are nothing if not hard to resolve.

SECURITY2

The security of the VRD is necessary to ensure that the VRD properly performs its function as an accurate and complete list of registered voters. Although the security of VRD systems has not been subject to the levels of scrutiny directed at electronic voting systems, it is nonetheless important. Security issues in VRDs arise for three reasons. First, state VRDs contain personal information associated with registered voters, and such information must be protected against disclosures not permitted by law. Second, the overall integrity of the VRD must be protected against unauthorized alterations (e.g., individual records being improperly added, deleted, or changed). Third, the VRD system must be available and functional when needed, both to perform the "real-time" updates required by HAVA and, most critically, on or before Election Day to enable real-time queries or to create poll books.

Security measures address both who is authorized to view or change information in the VRD and what information within any record in the VRD may be viewed or changed. In the security context, viewing information includes seeing individual records and sending or transferring records en masse; changing information includes adding entirely new records, altering one or more fields within one or more records, and deleting records.

The security of systems is usually conceptualized in terms of confidentiality, integrity, and availability.3 These apply in the context of VRD systems (where "system" is intended to include the human and organizational aspects of a system as well as the technology):

• Confidentiality. A secure system keeps protected information away from those who should not have access to it. Examples of failures that affect the confidentiality of a VRD include an unauthorized party obtaining voter information on a large scale or a spouse abuser obtaining the address of his/her spouse from a VRD even if such information is supposed to be protected from disclosure.

• Integrity. A secure system produces its intended results or information, regardless of whether or not the system has been attacked. When integrity is violated, the system may continue to operate, but under some circumstances of operation it does not provide accurate results or information that one would normally expect. Failures of integrity of a VRD include both inclusion of noneligible individuals and unauthorized exclusion of eligible registered voters, as well as unauthorized modifications to data fields such as addresses, birth dates, or voting histories.

• Availability. A secure system is available for normal use even in the face of high load or an attack. An example of a failure in availability might be a system that is clogged with so much bad data that the system no longer operates reliably (typically this refers to electronic attempts to overwhelm a system but also could occur in the nonelectronic domain; for example, a flood of bogus paper voter registration applications might attempt to overwhelm the data-entry staff in a particularly critical jurisdiction).

1 Reliability in the face of human, machine, or network failure is also an important dimension of system trustworthiness, but this appendix focuses on security against deliberate attack.

2 There is an extensive body of National Research Council work on computer security issues, beginning with Computers at Risk: Safe Computing in the Information Age, 1990, and continuing with Cryptography's Role in Securing the Information Society, 1996; Trust in Cyberspace, 1999; Realizing the Potential of C4I: Fundamental Challenges, 1999; Making IT Better: Expanding IT Research to Meet Society's Needs, 2000; Cybersecurity Today and Tomorrow: Pay Now or Pay Later, 2002; Software for Dependable Systems: Sufficient Evidence?, 2007; and Toward a Safer and More Secure Cyberspace, 2007, all published by the National Academy [Academies] Press, Washington, D.C. In addition, an extensive discussion of security and privacy issues specifically with reference to voter registration databases is contained in U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf. Excerpts from the executive summary of this report relevant to privacy and security are provided in Box D.1.

3 See, for example, NRC, Toward a Safer and More Secure Cyberspace, Seymour E. Goodman and Herbert S. Lin (eds.), The National Academies Press, Washington, D.C., 2007.

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.

Box D.1
Excerpts from a 2006 Study of Voter Registration Databases Relevant to Privacy and Security

The following material is reprinted from the executive summary and the main text of Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, a 2006 report by the U.S. Public Policy Committee of the Association for Computing Machinery.

2. Accountability should be apparent throughout each VRD. It should be clear who is proposing, making, or approving changes to the data, the system, or its policies. Security policies are an important tool for ensuring accountability. For example, access control policies can be structured to restrict actions of certain groups or individual users of the system. Further, users' actions can be logged using audit trails (discussed below). Accountability also should extend to external uses of VRD data. For example, state and local officials should require recipients of data from VRDs to sign use agreements consistent with the government's official policies and procedures.

3. Audit trails should be employed throughout the VRD. VRDs that can be independently verified, checked, and proven to be fair will increase voter confidence and help avoid litigation. Audit trails are important for independent verification, which, in turn, makes the system more transparent and provides a mechanism for accountability. They should include records of data changes, configuration changes, security policy changes, and database design changes. The trails may be independent records for each part of the VRD, but they should include both who made the change and who approved the change.

4. Privacy values should be a fundamental part of the VRD, not an afterthought. Privacy policies for voter registration activities should be based on Fair Information Practices (FIPs), which are a set of principles for addressing concerns about information privacy. FIPs typically address collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. There are many ways to implement good privacy policies. For example, we recommend that government both limit collection to only the data required for proper registration and explain why each piece of personal information is necessary. Further, privacy policies should be published and widely distributed, and the public should be given an opportunity to comment on any changes. . . .

6. Election officials should rigorously test the usability, security, and reliability of VRDs while they are being designed and while they are in use. Testing is a critical tool that can reveal that "real-world" poll workers find interfaces confusing and unusable, expose security flaws in the system, or show that the system is likely to fail under the stress of Election Day. All of these issues, if caught before they are problems through testing, will reduce voter fraud and the disenfranchisement of legitimate voters. . . .

Security Against Technical Attacks

. . . [M]echanisms should be deployed to detect any penetration of system defenses, as well as any insider misuse. For example, application-specific intrusion detection systems could be used to monitor the number of updates to the VRD. Any large spike in activity, whether by an authorized user or in the aggregate, might warrant human attention. In addition, officials could consider contracting with a third-party network security monitoring service to detect network intrusions and attempted attacks on the system. . . .

. . . Officials should consider including an independent security review and publication of the software as part of the acceptance testing for the system. Claims that the security of the system will be endangered by such a review should be treated with extreme skepticism or rejected outright. . . .

SOURCE: U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf. (c) 2006 ACM. Excerpted with permission. ISBN: 1-59593-344-1. Permission to make digital or hard copies of portions of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permission from permissions@acm.org.
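Box D.1's recommendation that audit trails record both who made and who approved each change can be illustrated with a hash-chained, append-only log: each entry's digest covers the previous entry's digest, so later tampering with any earlier entry is detectable. This is a minimal sketch under stated assumptions; the `AuditLog` class, its field names, and the sample actors are hypothetical illustrations, not part of any actual VRD system.

```python
import hashlib
import json

class AuditLog:
    """Append-only log; each entry's hash covers the previous entry's hash,
    so altering any earlier entry breaks the chain."""

    def __init__(self):
        self.entries = []

    def append(self, actor, approver, action, detail):
        # First entry chains from a fixed all-zero "genesis" value.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"actor": actor, "approver": approver,
                "action": action, "detail": detail, "prev": prev_hash}
        # Canonical JSON (sorted keys) so the digest is deterministic.
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        """Recompute every hash; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in
                    ("actor", "approver", "action", "detail", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append("clerk_17", "supervisor_3", "update_address", "record 40213")
log.append("clerk_17", "supervisor_3", "delete_record", "record 88001")
assert log.verify()
log.entries[0]["detail"] = "record 99999"   # simulated tampering
assert not log.verify()
```

In practice the log would also be replicated off-system and time-stamped; the chaining alone only makes tampering evident, not impossible.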

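The box's suggestion of application-specific intrusion detection — flagging any large spike in the number of VRD updates for human attention — can be approximated with a simple baseline-and-threshold check. The function below is a hypothetical sketch, not an actual monitoring product: a real deployment would tune the window and threshold and route alerts to an operator.

```python
from statistics import mean, stdev

def update_spike(counts, threshold=3.0):
    """Flag the latest update count if it exceeds the historical mean
    by more than `threshold` sample standard deviations."""
    history, latest = counts[:-1], counts[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest > mu  # flat history: any increase is anomalous
    return (latest - mu) / sigma > threshold

# A typical hour sees roughly 90-120 updates; a batch of 5,000 stands out.
normal = [100, 95, 110, 102, 98, 105, 93, 107]
assert not update_spike(normal + [115])
assert update_spike(normal + [5000])
```

Note that a spike may be a legitimate batch load rather than an attack, which is why the box recommends human attention rather than automatic blocking.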
A number of security breaches of VRDs have been reported.4 For example, on October 23, 2006, an official from the not-for-profit Illinois Ballot Integrity Project reported that his organization had demonstrated that it was possible to use the Chicago voter database remotely to compromise the confidentiality of the names, SSNs, and dates of birth of 1.35 million residents. According to a spokesman for the Chicago Election Board, the problem arose because the city's database allowing voters to locate their voting precinct once asked voters for detailed information such as Social Security numbers, and even though the Web site was updated to require only names and addresses to make a query, the links to the Social Security numbers and the dates of birth were never eliminated.5

Security threats can arise even in systems that are not connected to the Internet. Although Internet connections are often an important source of vulnerability, they are most assuredly not the only source. The recent history of computer security is replete with examples of security compromises that had nothing to do with the Internet, such as data on stolen laptops, attacks from insiders abusing their privileges, and "social engineering" attacks involving humans posing as other humans, often over the telephone, in order to learn credentials such as passwords that could enable them to access systems and files they should not be able to access.6

Developing secure systems is a challenging task, and much has been written about such matters.7 Below, some best practices for security measures are highlighted.

• Access control policies should be established and enforced that group people by established roles (based on function, jurisdiction, etc.) and assign to those roles the appropriate (minimal) level of access needed to carry out their job functions. In doing so, the "principle of least privilege" should be followed: access should be kept to the minimal necessary levels. This reduces the possibility of both intentional misbehavior and accidental mistakes.

• The number of people with administrative privileges should be limited. Very few users should have the ability to grant access to others. However, there should also be rules and procedures that allow trusted election officials to temporarily increase privileges available to others during emergencies or time-critical situations (such as on Election Day) in a controlled and fully audited manner. The specific number chosen here should balance the competing concerns of minimizing administrative privileges to minimize abuse and increasing them to ensure availability. This balance will vary depending on the size and other specifics of a jurisdiction, but a reasonable number might be expected to be at least 3 and no more than 10.

• Authorized users of the system should receive security training, including how to choose and protect passwords and how to resist "social engineering" attacks (attempts to deceive someone into performing certain actions).

• Encryption should be used to protect the confidentiality of data. For example, all communications channels used by the system should be secured via end-to-end cryptography to protect both the confidentiality and the integrity of the data. In many cases, this will be handled by the network or application layer (e.g., via the use of https on Web interfaces) rather than in the database system itself. Stored data—or at least sensitive data fields, such as SSN—should be encrypted as well, and under some circumstances, the data need not be decrypted for it to be used.8

• Firewalls should be used to severely limit connectivity between internal and external networks.

• Mechanisms (such as commercially available intrusion detection and anti-virus systems) should be deployed to detect and prevent any penetration of system defenses or insider misuse.

• It is easier to secure a computer if less software is installed on it. To the extent feasible, computers used for administering VRD systems should be dedicated to this purpose. (Election offices with limited resources may find it difficult to refrain from using computers for multiple purposes.) Furthermore, the number of computers that hold the complete VRD system and/or the complete VRD database (particularly sensitive information such as complete or last four digits of Social Security numbers) should be limited.

• Election officials should obtain independent security review of the VRD system before deployment and thereafter whenever significant changes are made to the VRD system. Periodic security review is also helpful, though state regulations may make such review more difficult.

• All changes to the VRD contents and system must be tracked (e.g., via immutable audit logs and associated policies for monitoring them) for accountability purposes.9 These include changes to individual VRD records, large-scale or batch updates, source code, database schemas, system configuration, and access control policies. Such logs also guard against individuals with authorized access viewing those records for unauthorized purposes; such unauthorized purposes may include satisfying curiosity (e.g., viewing details about a famous person) and making illicit money (e.g., selling an SSN). Immutable audit logs serve as a deterrent (because the use of such a log has been made known), a forensic tool when a breach is believed to have occurred, and a useful tool when conducting sample audits.

• Any realistic assessment of a system's security involves actual testing of the system's security by an adversary that is motivated to compromise it (such as a "red team" commissioned to find vulnerabilities). Although testing cannot necessarily reveal all security problems (and does nothing by itself to eliminate such problems), testing can often identify some remaining failures.

• Recovery from security failures and/or accidental mishap must be possible. This topic is discussed in more detail in Section 3.6 (Backup) in the main body of the report.

These measures address security issues for data under the control of the relevant election registrar. In the event that the election registrar releases data to another party (e.g., on demand to a requestor as required by policy or applicable law), there are few if any practical technical measures that the election registrar can take to ensure the subsequent security of the released data. Perhaps the only action that the election registrar can take is to ensure that the data released consist only of those data that are required to be released and no other data. Once the data leave the control of the election registrar, it is up to the recipient to enforce any relevant security measures.

4 See http://www.privacyrights.org/ar/ChronDataBreaches.htm. This site contains descriptions of a number of data breaches involving actual VRDs, and a number of others of potential relevance to VRDs.

5 See http://abcnews.go.com/Politics/story?id=2601085; http://www.electiondefensealliance.org/chicago_voter_registration_database_wide_open.

6 For example, video surveillance cameras caught two intruders in Mississippi on June 23, 2006, stealing hard drives from 18 computers. Data files contained names, addresses, and SSNs of current and former city employees and registered voters as well as bank account information for employees paid through direct deposit and water system customers who paid bills electronically. See http://www.privacyrights.org/ar/ChronDataBreaches.htm.

7 There is an extensive body of National Research Council work on computer security issues, beginning with Computers at Risk: Safe Computing in the Information Age, 1990, and continuing with Cryptography's Role in Securing the Information Society, 1996; Trust in Cyberspace, 1999; Realizing the Potential of C4I: Fundamental Challenges, 1999; Making IT Better: Expanding IT Research to Meet Society's Needs, 2000; Cybersecurity Today and Tomorrow: Pay Now or Pay Later, 2002; Software for Dependable Systems: Sufficient Evidence?, 2007; and Toward a Safer and More Secure Cyberspace, 2007, all published by the National Academy [Academies] Press, Washington, D.C. In addition, an extensive discussion of security and privacy issues specifically with reference to voter registration databases is contained in U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf.

8 For example, consider the use of a full SSN to facilitate matching of records. An individual's full SSN is usually regarded as sensitive information, but the use of a full SSN can greatly enhance the accuracy of performing matches. Encrypting the SSN creates a new but still unique identifier, which can itself be used as the match key without revealing the true SSN.

9 Immutable audit logs are further described in http://www.markle.org/downloadable_assets/nstf_IAL_020906.pdf.

PRIVACY

Distinct from security issues, privacy issues relate to policy regarding what information may be disclosed to which parties under what circumstances. Thus, a hypothetical law requiring that any registered voter's name and address (but not party affiliation or Social Security number) must be available

without restriction to the public reflects a policy choice rather than a security issue. A security issue arises if an unauthorized party is able to gain access through the VRD to the voter's Social Security number, which is supposed to be kept confidential. That said, technical measures to enhance security sometimes protect privacy as well.

Some of the information in VRDs is, by law, public information, although the specifics of which data items can be regarded as public information vary from state to state. In addition, states often limit the purposes for which such information may be used. Nevertheless, the electronic availability of such information raises concerns about the privacy of that information, because electronic access greatly increases the ease with which it can be made available to anyone, including those who might abuse it.

Many analysts of privacy issues point to fair information practices as a reasonable framework for privacy protection that balances privacy rights against user needs for personal information, and in the context of voter registration, the 2006 USACM report on statewide databases recommends the adoption of such practices as the basis for privacy policy regarding voter registration activities.10 Fair information practices (FIPs) generally include notice to and awareness of individuals with personal information that such information is being collected, providing individuals with choices about how their personal information may be used, enabling individuals to review the data collected about them in a timely and inexpensive way and to contest the data's accuracy and completeness, taking steps to ensure that the personal information about individuals is accurate and secure, and providing individuals with mechanisms for redress if these principles are violated. Box D.2 describes two versions of a code of fair information practices.

10 U.S. Public Policy Committee of the Association for Computing Machinery, Statewide Databases of Registered Voters: Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, 2006, available at http://usacm.acm.org/usacm/PDF/VRD_report.pdf.

Box D.2
Codes of Fair Information Practice

Fair information practices are standards of practice required to ensure that entities that collect and use personal information provide adequate privacy protection for that information. As enunciated by the U.S. Federal Trade Commission1 (other formulations of fair information practices exist), the five principles of fair information practice include:

• Notice and awareness. Secret record systems should not exist. Individuals whose personal information is collected should be given notice of a collector's information practices before any personal information is collected and should be told that personal information is being collected about them. Without notice, an individual cannot make an informed decision as to whether and to what extent to disclose personal information. Notice should be given about the identity of the party collecting the data, how the data will be used and the potential recipients of the data, the nature of the data collected and the means by which it is collected, whether the individual may decline to provide the requested data and the consequences of a refusal to provide the requested information, and the steps taken by the collector to ensure the confidentiality, integrity, and quality of the data.

• Choice and consent. Individuals should be able to choose how personal information collected from them may be used, and in particular how it can be used in ways that go beyond those necessary to complete a transaction at hand. Such secondary uses can be internal to the collector's organization, or can result in the transfer of the information to third parties. Note that genuinely informed consent is a sine qua non for observation of this principle. Individuals who provide personal information under duress or threat of penalty have not provided informed consent—and individuals who provide personal information as a requirement for receiving necessary or desirable services from monopoly providers of services have not, either.

• Access and participation. Individuals should be able to review in a timely and inexpensive way the data collected about them, and to similarly contest that data's accuracy and completeness. Thus, means should be available to correct errors, or at the very least, to append notes of explanation or challenges that would accompany subsequent distributions of this information.

• Integrity and security. The personal information of individuals must be accurate and secure. To assure data integrity, collectors must take reasonable steps, such as using only reputable sources of data and cross-referencing data against multiple sources, providing consumer access to data, and destroying untimely data or converting it to anonymous form. To provide security, collectors must take both procedural and technical measures to protect against loss and the unauthorized access, destruction, use, or disclosure of the data.

• Enforcement and redress. Enforcement mechanisms must exist to ensure that the fair information principles are observed in practice, and individuals must have redress mechanisms available to them if these principles are violated.

For reference purposes, the original "Code of Fair Information Practices" promulgated in 1972 by the Health, Education, and Welfare Advisory Committee on Automated Data Systems is provided below:2

The Code of Fair Information Practices is based on five principles:

1. There must be no personal data record-keeping systems whose very existence is secret.
2. There must be a way for a person to find out what information about the person is in a record and how it is used.
3. There must be a way for a person to prevent information about the person that was obtained for one purpose from being used or made available for other purposes without the person's consent.
4. There must be a way for a person to correct or amend a record of identifiable information about the person.
5. Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuses of the data.

1 See http://www.ftc.gov/reports/privacy3/fairinfo.htm.

2 U.S. Department of Health, Education and Welfare, Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers, and the Rights of Citizens, 1973.

In the context of government-operated voter registration systems, many tensions arise between these principles and the application of existing policy and law. For example, one of the thorniest issues regarding privacy is the tension it sometimes poses with transparency. In its starkest terms, maintaining privacy involves withholding from public view certain information associated with individuals, while transparency involves the maximum disclosure of information, even if such information is associated with individuals.

As an illustration of how these tensions play out, consider a proposition regarding the public disclosure of the reason(s) for removing specific individuals from voter registration lists. On one hand, the removal of a voter from a VRD is often associated with a stigmatizing condition, such as being a felon or being declared mentally incompetent. Those mistakenly removed from a VRD may experience adverse consequences from such association, and even if the removal is correctly performed, those individuals are still arguably entitled to some measure of privacy. Thus, a person balancing the scales in favor of privacy would argue that the reasons for removing individuals from the VRD should be kept confidential, as they are in some states already.

On the other hand, advocates of greater transparency argue that removals from a VRD should be

OCR for page 87
 IMPROVING STATE VOTER REGISTRATION DATABASES subject to public oversight in the same way that additions are. They point out that convictions and even arrest records are generally public, and thus argue that not disclosing reasons for removal from a VRD does not really protect the privacy of these individuals anyway. At the same time, they argue that associating reasons for removal with specific individuals is critical to determining the qualification of voters—and that statistical tabulations alone would not provide the detail needed to investigate indi - vidual errors that might indicate systemic problems. The committee noted significant value without much negative impact on privacy in statistical tabu - lations of the reasons for voters being dropped from a VRD and publication of such tabulations, as well as in personal and private notification of individual voters of the reason(s) for being dropped. But the different points of view described above were reflected in the committee, and thus the committee takes no position on the desirability or undesirability of the above proposition. Other privacy advocates have raised concerns about the widespread availability of complete voter registration information in the context of the physical security of battered men or women. Such indi - viduals have good reason to keep their addresses private, and might be apprehensive with good reason about the availability of their addresses to their batterers. A second concern relates to abuse of lists of validated addresses for commercial marketing purposes—many citizens would be upset to know that the information they provide to exercise their right to vote in a democracy is also being used for com - mercial purposes. Addressing such issues properly belongs to state policy makers, who can develop (and sometimes have developed) regulation and law to protect citizen interests—for example, some states allow only political parties to obtain voter registration lists. 
Another tension arises because some state laws also allow election officials to change voters' addresses of record without their explicit consent (e.g., when officials receive notice of a forwarding address).

Finally, FIPs require that the personal information provided by individuals be used only for specified purposes. But election officials rely on third parties to collect voter registration information, and they have no effective control over how those parties actually use the information they collect. In some cases, election officials must also release voter registration lists to political parties. The committee has no specific knowledge of whether third parties do in fact use voter registration information for their own purposes, but it recognizes that possibility as a potential conflict with implementing FIPs in a voter registration context.

One way to guard against large-scale misuse of voter registration data (e.g., using voter data for commercial purposes after contractually agreeing to use the data only for political purposes) involves seeding the database, before it is transferred, with one or more records that can be used to detect later misuse. For example, a seeded record might indicate that John Cue Smith is a registered voter at the registered address of 123 Special Street in a town within the relevant jurisdiction. If a piece of mail promoting the sale of tennis shoes later arrives for John Cue Smith at this address, possible misuse of the database may be worth investigating.

A second set of privacy issues arises from matching and linking records. For example, voter registration lists may be matched against a list of convicted felons. If a list of voters removed from the VRD is made public, those removed from the list improperly or removed for other reasons (that is, all nonfelons removed from the list) may be tainted by association in the public eye.
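The seeding technique just described can be sketched as follows. The canary identity reuses the report's John Cue Smith example; the record layout and function names are illustrative assumptions, not part of any deployed system.

```python
# Sketch of seeding a voter list with canary records before a bulk
# transfer, so that later misuse (e.g., commercial mailings) can be
# detected. Record layout and values are hypothetical.

CANARY_RECORDS = [
    {"name": "John Cue Smith", "address": "123 Special Street"},
]

def seed_for_transfer(records):
    """Return a copy of the dataset with canary records appended."""
    return list(records) + [dict(c) for c in CANARY_RECORDS]

def mail_is_suspicious(name, address):
    """True if mail is addressed to a canary identity, which suggests
    the transferred data may have been repurposed."""
    return any(c["name"] == name and c["address"] == address
               for c in CANARY_RECORDS)
```

In practice each transfer might get its own distinct canary, so that a suspicious mailing also identifies which recipient leaked or misused the data.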
Similarly, if a voter registration list that indicates the source of each individual application is made public, those who registered to vote at public assistance agencies might regard their privacy rights as having been violated. Although overt public disclosure would violate the NVRA, accidental disclosure through a security breach could have a similar result. This could in turn make people less likely to seek public assistance if seeking it automatically places that information in a publicly accessible voter registration record. Alternatively, where registration is not automatic, it may reduce the number of individuals who take advantage of the ease of registering at a public assistance agency and thereby undercut the goal of the program.

A third set of privacy issues arises from insider access to the VRD. Insiders such as election officials can be expected to have access to the full set of information associated with any individual record, and possibly to some of the information in matched records existing in other databases. Although most election officials are trustworthy in this regard, a few might seek to use this access, improperly, for personal benefit or gain, and measures (such as immutable audit logs) are needed to deter and investigate such inappropriate insider access.

A fourth set of issues arises in the context of transferring a VRD to another party en masse. Such a bulk transfer may occur, for example, when two VRDs must be compared to each other (e.g., to identify duplicate registrations between them), or when data must be provided to judicial authorities for jury selection, to political parties, or to any other party in accordance with applicable law. Because bulk transfers, by definition, involve personal information on a very large scale, potential threats to privacy are magnified in such circumstances. For example, voters may well provide personal information for voter registration without knowing that such information may be used for other purposes. Even if such uses are entirely legal, it is still desirable to protect voter privacy to the maximum extent consistent with law. Thus, voter registration records transferred for comparing VRDs should include only the records that need to be used or matched (i.e., active records), and the fields contained on each record should be limited to those necessary to perform matching (such as name and date of birth but not party affiliation), plus the voter's state-assigned voter ID. (The latter is necessary because without such a pointer, a record cannot be recalled or updated, and reconciliation audits become problematic.)
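Field minimization for such a comparison transfer might look like the sketch below. The field names and status values are assumptions for illustration; an actual VRD schema would differ.

```python
# Sketch of data minimization for a bulk VRD comparison transfer:
# keep only active records, and on each record only the matching
# fields plus the state-assigned voter ID. Field names hypothetical.

MATCH_FIELDS = ("voter_id", "name", "date_of_birth")

def minimize_for_transfer(records):
    """Project the dataset down to active records and match fields,
    dropping sensitive extras such as party affiliation."""
    return [
        {field: r[field] for field in MATCH_FIELDS}
        for r in records
        if r.get("status") == "active"
    ]
```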
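The immutable audit logs mentioned above as a deterrent to insider misuse are commonly approximated with a hash chain, in which each log entry commits to the entry before it, so any alteration or deletion mid-chain is detectable. This is a minimal sketch, not a production design; the entry fields are illustrative.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

def append_access(log, entry):
    """Append an access event to a hash-chained, tamper-evident log.
    Each entry's hash covers both the event and the previous hash."""
    prev = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"entry": entry, "prev": prev, "hash": digest})

def chain_is_intact(log):
    """Verify that no entry has been altered or dropped mid-chain."""
    prev = GENESIS
    for rec in log:
        payload = json.dumps(rec["entry"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True
```

A real deployment would also write the chain to append-only storage outside the control of the officials being audited; the chain alone only makes tampering evident, not impossible.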
Bulk transfers of data are also likely to persist in the absence of specific actions taken to decommission (remove from service) the data involved. Persistence after the data have served the original purpose of the transfer increases the likelihood of unintended disclosure or of repurposing inconsistent with the original reasons for the transfer.

Lastly, bulk transfers of data involve, by definition, large quantities of data. Without specific knowledge of precisely what data have been transferred (i.e., a complete copy of what was sent), it can be very difficult to determine who needs to be notified in the event that a problem arises (e.g., a data compromise). All too often, the only information kept regarding a bulk transfer is the selection criteria used to generate the data and the number of records sent; given changes to the database in the intervening period, this information is almost certainly insufficient to reproduce the transferred dataset.
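One way to retain specific knowledge of precisely what was transferred, without keeping a full second copy indefinitely, is to record a manifest at transfer time: the IDs of the records sent plus a digest of the exact payload. This is a sketch under assumed field names, not a prescribed practice.

```python
import hashlib
import json

def transfer_manifest(records):
    """Record exactly which records were sent in a bulk transfer, so
    affected voters can be identified if the recipient later reports
    a compromise. The 'voter_id' field name is hypothetical."""
    payload = json.dumps(records, sort_keys=True)
    return {
        "count": len(records),
        "voter_ids": sorted(r["voter_id"] for r in records),
        "payload_sha256": hashlib.sha256(payload.encode()).hexdigest(),
    }
```

The digest also lets either party later prove whether a disputed dataset is or is not the one that was actually sent.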