system useless for normal purposes. For example, an outside intruder may access a critical health information system not just to snoop on data but to insert a computer virus or Trojan horse that "crashes" the system at some later date or erases critical data files. Alternatively, an outsider could launch an e-mail attack in which a remote computer sends tens of thousands of e-mail messages in a very short time (e.g., an hour) to a given site, overwhelming the ability of the mail servers to process mail and rendering the system useless for ordinary e-mail purposes.
There are two basic approaches to countering organizational threats to the privacy and security of electronic health information: deterrence and imposition of obstacles. Deterrence seeks to prevent violations of policy by imposing sanctions on violators; these sanctions may include dismissal, civil liability, or criminal prosecution. Obstacles are erected to prevent violations of policy by making them hard to achieve. Practical systems adopt a mixture of the two approaches; thus, in physical security one may install a reasonably strong lock (an obstacle) and an alarm system (representing deterrence, because apprehension in the act of breaking in carries criminal sanctions).
Deterrence assumes that individuals who constitute a threat can be identified and subjected to such sanctions. Technical support for deterrence centers on mechanisms for identifying users and auditing their actions. Obstacles are most often used in situations in which the threat cannot be identified or in which it is not practical to impose sanctions, such as in the protection of military or diplomatic information. Technical support for the imposition of obstacles includes mechanisms that make a priori determinations of authorized use and then take active steps to prevent unauthorized acts.
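The two kinds of technical support described above can be illustrated with a minimal sketch: an access-control check stands in for an obstacle (an a priori determination of authorized use), and an audit trail stands in for support of deterrence (a reviewable record of who attempted what). All names here, including the roles and record types, are hypothetical and not drawn from any actual system discussed in this report.

```python
# Illustrative sketch only: a simple role-based access check (an obstacle)
# paired with an audit trail (technical support for deterrence).
# Roles, record types, and user names are hypothetical.

audit_log = []  # deterrence support: a record of every access attempt

# obstacle: an a priori determination of which roles may read which records
PERMISSIONS = {
    "physician": {"clinical_record"},
    "billing_clerk": {"billing_record"},
}

def access_record(user, role, record_type):
    """Allow or deny access; log the attempt either way for later review."""
    allowed = record_type in PERMISSIONS.get(role, set())
    audit_log.append((user, role, record_type,
                      "granted" if allowed else "denied"))
    return allowed

print(access_record("dr_smith", "physician", "clinical_record"))  # True
print(access_record("dr_smith", "physician", "billing_record"))   # False
```

Note that the denied attempt is logged as well as the granted one: the obstacle blocks the act, while the audit record preserves the evidence on which sanctions, and hence deterrence, depend.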
Three factors inhibit organizational adoption of obstacles: (1) the direct cost of the mechanisms, such as access control tokens and cryptographic devices; (2) the indirect cost of decreased efficiency and morale (e.g., the "hassle factor" of an additional inconvenience); and (3) the possibility that an obstacle may prevent necessary, legitimate access to or use of data (e.g., in an emergency or some other situation not anticipated by the mechanism's designer). Deterrence mechanisms also entail costs, but these costs tend to be more indirect (e.g., personnel costs in educating users about the existence of penalties for abusing access privileges).
Specific countermeasures have to be developed for each of the five