This imaginary future security checkpoint provides a concrete setting for many of the generic issues that arise when deciding whether and how to introduce new information collection, fusion, and analysis systems, and when managing their potential impacts on civil liberties, both in a specific implementation and in terms of general policies and legal frameworks.

E.1.3 Possible Privacy Impacts

The privacy impact of detection technologies can vary significantly depending on choices made during deployment. The committee suggests that future regulations should differentiate systems and deployments based on features that can significantly affect perceived privacy impact, including the following (see the sketch after this list):

  • Which data features are collected. For example, when capturing images of baggage contents, the images might or might not be associated with the name or image of the passenger. Anonymous images of baggage, even if stored for future data mining, might be perceived as less invasive than baggage images associated with the owner.

  • Covertness of collection. Images of passengers might be collected covertly, without the awareness of the individual, throughout the airport, or alternatively with the passenger’s awareness and implicit consent at the security checkpoint. Many will consider the former to be more invasive of their privacy.

  • Data dissemination. Data might be collected and used only for local processing, or disseminated more widely. For example, images of bags and passengers might be used only locally, or disseminated widely in a nationwide data store accessible to many agencies.

  • Retention. Data might be required by regulations to be destroyed within a specified time interval, or kept forever.

  • Use. Data might be restricted to a particular use (e.g., anatomically revealing images of airport passengers might be available for the sole purpose of checking for hidden objects), or unrestricted for arbitrary future use. The perceived impact on privacy can be very different in the two cases.

  • Use by computer versus human. The data might be used (processed) by a computer, or alternatively by a human. For example, anatomically revealing images might be accessible only to a computer program that determines whether there is evidence of a hidden object under clothing. Alternatively, these images might be examined by a human security screener to manually search for hidden objects. The former case may be judged as less invasive by many passengers. Note that if a computer examination identifies a suspicious case, then a manual examination can be the next step. If the computer examination is sufficiently accurate, such manual examinations would be needed only in rare cases.
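
The deployment dimensions above can be made explicit rather than left implicit in system design. The following Python sketch is purely illustrative: the class name, field names, and the simple scoring heuristic are assumptions introduced here, not drawn from the report, and are meant only to show how a regulation or procurement specification might record these dimensions for a given deployment so they can be audited and compared.

```python
# Hypothetical sketch: recording the privacy-relevant deployment choices
# discussed above as an explicit, auditable policy record. All names and the
# scoring heuristic are illustrative assumptions.
from dataclasses import dataclass
from datetime import timedelta
from typing import Optional


@dataclass(frozen=True)
class DetectionDeploymentPolicy:
    links_data_to_identity: bool              # data features: image tied to passenger identity?
    covert_collection: bool                   # collected without the passenger's awareness?
    disseminated_beyond_checkpoint: bool      # shared with a wider (e.g., nationwide) data store?
    retention_period: Optional[timedelta]     # None means indefinite retention
    restricted_to_screening_use: bool         # limited to checking for hidden objects?
    human_review_only_on_computer_flag: bool  # humans see images only when the computer flags a case?


def perceived_invasiveness(policy: DetectionDeploymentPolicy) -> int:
    """Toy heuristic: count the deployment choices that the section suggests
    passengers tend to perceive as more invasive (higher = more invasive)."""
    score = 0
    score += policy.links_data_to_identity
    score += policy.covert_collection
    score += policy.disseminated_beyond_checkpoint
    score += policy.retention_period is None          # indefinite retention
    score += not policy.restricted_to_screening_use   # unrestricted future use
    score += not policy.human_review_only_on_computer_flag
    return score


if __name__ == "__main__":
    checkpoint_only = DetectionDeploymentPolicy(
        links_data_to_identity=False,
        covert_collection=False,
        disseminated_beyond_checkpoint=False,
        retention_period=timedelta(days=30),
        restricted_to_screening_use=True,
        human_review_only_on_computer_flag=True,
    )
    print(perceived_invasiveness(checkpoint_only))  # 0: the least invasive combination
```

In this toy heuristic, each choice the section identifies as tending to be perceived as more invasive (identity linkage, covert collection, wide dissemination, indefinite retention, unrestricted use, routine human viewing) adds one point, so the checkpoint-only configuration shown scores zero.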


