The so-called “internet of things” enabled by RFID systems conceptually “make[s] it possible for computers to identify any object anywhere in the world instantly.”4 Such a vision holds tremendous promise in contexts such as inventory management and shipping and handling, as well as in hospital care, education, and safety monitoring. But, clearly, the promise is burdened by equally tremendous possibilities for misuse. The RFID Position Statement mentioned above (see footnote 2) lists five potential threats to privacy and civil liberties from the large-scale deployment of RFID technologies: hidden placement of tags, unique identifiers for all objects, massive data aggregation, hidden readers, and individual tracking and profiling.

As described previously, “Big Brother” scenarios in which commercial interests or government can track an individual’s every purchase and move by compiling vast quantities of minute data from electronic product codes within RFID tags are some time away from being realized. But possibilities for immediate misuse remain. Workshop participants argued that addressing social, ethical, legal, and cultural concerns is crucial for RFID technologies at every stage, including technological design and the development of industry standards, policy and regulation, and specific applications. Developing policy to incorporate social norms in emerging technologies is often a discouragingly long process in an industry that seems to move at lightning speed, almost always ahead of the policy discussions as well as ahead of the purview of government regulators. For that reason, this chapter on the societal and privacy concerns associated with RFID technologies necessarily takes a long view of the technology and its potential implications.

Particularly in the realms of social norms, ethics, and policy, RFID technologies confound simple discussion in a number of ways. First, the difference in the speeds at which policy and technology develop forces policy into what would seem to be the realm of science fiction. If policy discussions must make assumptions about future technological developments, policy may fail to fit appropriately with societal interests as those interests evolve along with the technology. Second, because of what is often referred to as “function creep,” technologies designed for one task are often adapted to accomplish another; thus the stated purpose of any new system will be an incomplete description of that system’s eventual use. Third, two primary means exist for incorporating social goals (be they privacy, security, manageability, reliability, or usability) into a system: regulation and design. The two are set in motion by very different forces. Fourth and finally, the most articulate discussion to date of social goals related to RFID technologies centers on notions of privacy. Privacy, however, is not universally defined, nor is it a flexible enough concept to encompass all of the issues that must be considered as RFID-incorporating systems become prevalent.5 And, of course, privacy is far from a homogeneous, single-dimension concept.6 Not only must the complexity of privacy as a concept be recognized, but other (sometimes related) concerns should be explicitly articulated as well.

Now is a good time for a thoughtful consideration of societal, cultural, and ethical issues related to RFID systems. A brief workshop cannot do justice to the complexity of all these

4 “The Internet of Things” was the title of a March 2002 Forbes article by Chana R. Schoenberger <http://www.forbes.com/global/2002/0318/092.html>, accessed December 14, 2004. (The quote is taken from <http://archive.epcglobalinc.org/aboutthecenter.asp>, accessed December 14, 2004.) The Auto-ID Center has been incorporated into EPCglobal, and archives of the former organization should be available at <http://www.epcglobalinc.org/>, accessed December 14, 2004.

5 One thing that makes “privacy” particularly challenging is that it has a weak feedback loop—it is not always immediately obvious when privacy has been affected or violated.

6 Individual thresholds vary with respect to privacy—what one person might consider deeply private, another might casually disclose. Moreover, some argue that privacy encompasses more than merely the concealment or revelation of information, being connected also to autonomy and trust. Privacy is likewise deeply tied to context—behavior in one circumstance may be considered much more acceptable from a privacy standpoint than the same behavior in another situation.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.