Health Applications of the Internet
Many health-related processes stand to be reshaped by the Internet. In clinical settings, the Internet enables care providers to gain rapid access to information that can aid in the diagnosis of health conditions or the development of suitable treatment plans. It can make patient records, test results, and practice guidelines accessible from the examination room. It can also allow care providers to consult with each other electronically to discuss treatment plans or operative procedures. At the same time, the Internet supports a shift toward more patient-centered care, enabling consumers to gather health-related information themselves; to communicate with care providers, health plan administrators, and other consumers electronically; and even to receive care in the home. The Internet can also support numerous health-related activities beyond the direct provision of care. By supporting financial and administrative transactions, public health surveillance, professional education, and biomedical research, the Internet can streamline the administrative overhead associated with health care, improve the health of the nation's population, better train health care providers, and lead to new insights into the nature of disease.
The capability of the Internet to support these applications depends on whether the relevant technical needs are met and whether the operational aspects of the systems involved are understood and manageable. As with any information technology system, the technical requirements depend heavily on the specific characteristics of the individual systems: the number of anticipated users, the degree of real-time interaction desired, the number of simultaneous sessions that must be supported, and so on.
Many of these factors, in turn, are influenced by considerations other than network performance. These include organizational competencies, changing preferences and expectations of consumers and care providers, reimbursement policies for different health services, availability of complementary technologies, and laws. The confluence of so many factors confounds attempts to predict viable future applications of the Internet in the health sector.
This chapter presents a broad overview of the types of applications that the Internet can support in consumer health, clinical care, financial and administrative transactions, public health, health professional education, and biomedical research. It draws on a series of site visits by the committee (these visits are summarized in Appendix A) and other briefings to the committee to examine both applications that have already been deployed and others that are still in the early stages of conceptualization. The chapter attempts to assess the technical capabilities demanded of the Internet in terms of bandwidth, latency, security, availability, and ubiquity (as defined in Chapter 1). Specific technical information is presented where possible, but because of the nascent nature of many Internet applications in the health sector, often the most that can be offered is a qualitative assessment. Accordingly, a ranking scale is used to assess the importance of each technical dimension to each class of applications. These dimensions are ranked on a scale of one to four, with one plus sign (+) indicating little importance relative to the other dimensions and four plus signs (++++) signifying the greatest importance. The chapter also identifies organizational- and policy-level issues that will influence the way the Internet is deployed in different health applications and notes, where applicable, other technologies that must be developed to make certain applications feasible. Specific technical, organizational, and policy issues are addressed in subsequent chapters of the report.
Consumer health is one of the areas that could be most dramatically reshaped by the Internet. Consumer health refers to a set of activities aimed at giving consumers a more pronounced role in their own health and health care, ranging from the development of tools for self-assessment of health risks and management of chronic diseases, to home-based monitoring of health status and delivery of care. This area is similar to public health (discussed later in this chapter) in that it aims to provide consumers with the information and tools needed to improve their health, but it is less concerned with the detection of regional outbreaks of disease and is not part of government-based reporting structures. The Internet could become a significant enabler of consumer health initiatives in that it provides an increasingly accessible communications channel for a growing segment of the population. Moreover, in comparison to television (also a widely available medium for reaching consumers), the Internet offers greater interactivity and better tailoring of information to individual needs. These capabilities may lead to significant changes in consumer behavior (e.g., cessation of smoking, changes in diet) that could greatly improve health.
Ongoing trends in health care are likely to reinforce the shift toward consumer-oriented health information. Since the mid-1960s, patients have been encouraged to take a more active role in their own health care, and care providers have recognized the value of engaging patients to participate more meaningfully in their own care. Furthermore, attempts by care providers and managed care plans to streamline services and cut costs have shortened hospital stays, increasing the need for patients and their families to understand how to provide care themselves. Greater emphasis is being placed on preventive care, which requires consumers to understand health risks and the effects of different behaviors (such as smoking and dietary habits) on their health. These trends heighten the need for consumers to have access to reliable health information and open channels of communication to care providers and other health professionals.
Consumer health initiatives that rely on the Internet reflect, and could even drive, significant changes in the structure of the health care industry. Concurrent with changes in the economics of the health care delivery system, the duration of a medical consultation is steadily declining, and the availability of practitioners for substantive discussions between visits is decreasing. Continuity of care is increasingly disrupted as patients change care providers in response to changes in their health insurance plans. These trends favor consumers who are well informed and autonomous. Consumer health initiatives attempt to involve patients more actively in care-related decision making and enable them to exercise greater control over their health. Indeed, the Internet could change the culture of health care from one in which patients are viewed as recipients of care to one in which they are partners in care. Eventually, they may be able to use the Internet to access and update their personal medical records or receive care in their homes.
Consumer-Oriented Health Web Sites
Over the past few years, leading providers of health information have identified the Internet as an effective medium for reaching large numbers of health consumers. The most visible aspect of this recognition is the explosion of Web sites geared to consumer health issues (Table 2.1). These sites are dedicated to the diagnosis and management of diseases, the promotion of various healthy lifestyles, and interventions to prevent the onset of disease. The formats range from mailing lists to interactive Web sites, chat sessions, or compilations of online resources. One recent survey suggested that consumers use these sites to gather information on diseases, medications, and nutrition, as well as to find care providers or participate in support groups (Table 2.2).
The network capabilities required by consumer health Web sites are not especially demanding today, but the requirements could grow over time. Most sites offer text and limited graphics, which do not require significant bandwidth, but the availability of greater bandwidth (especially in the local loop) could enable the design of more sophisticated sites offering educational videos for downloading over the Internet. Security requirements are also minimal because personal health information is generally not exchanged on these sites. Protection is needed for financial transactions related to the purchase of health products, but this requirement is no different from that for other e-commerce applications. Similarly, consumer health Web sites do not demand exceptional reliability because they are unlikely to be used for applications in which lives are at stake. However, consumer health Web sites may drive the need for improved privacy-enhancing technologies. The information sought by consumers on the Internet, and the purchases they make, can reveal much about personal health concerns and problems. To prevent organizations from compiling profiles of their health concerns, consumers may demand greater anonymity in their Web browsing and purchasing and tighter restrictions on the ways in which organizations can use information about their habits.
A larger issue is the need for tools to help consumers find information of interest and evaluate its quality. The sheer volume of health information available on the Internet can be overwhelming. For example, a simple Web search for "diabetes mellitus" can return more than 40,000 Web pages,1 and some 61,000 Web sites contain information on breast cancer (Boodman, 1999). To sort through this volume of material, consumers need effective searching and filtering tools that can identify and rank information according to their needs and capabilities and present it in a form that they can understand, regardless of educational and cultural background. Consumers also need a way to judge the quality, authoritativeness, and provenance of the information. The Internet enables anyone to publish information, so filtering and credentialing become more important. A recent study found that 6 percent of the 400 sites containing information on a form of cancer called Ewing's sarcoma contained erroneous information, and many more were misleading. Sites contained different (and often incorrect) estimates of basic information such as survival rates (Biermann et al., 1999).
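To make the notion of searching and filtering tools concrete, the sketch below ranks a small set of pages by how often a consumer's query terms appear in them, discarding pages with no match. The pages, query, and scoring rule are purely illustrative and far simpler than any production search engine; the sketch only shows the shape of the filtering task.

```python
import re
from collections import Counter

def score_page(query_terms, text):
    """Score a page by how often the query terms appear in its text."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    return sum(words[t] for t in query_terms)

def rank_pages(query, pages):
    """Return page titles sorted by descending relevance, dropping non-matches."""
    terms = query.lower().split()
    scored = [(score_page(terms, text), title) for title, text in pages.items()]
    return [title for score, title in sorted(scored, reverse=True) if score > 0]

# Hypothetical corpus: one relevant page, one irrelevant page.
pages = {
    "Diabetes overview": "Diabetes mellitus is a chronic disease; diabetes management matters.",
    "Gardening tips": "Water your plants daily.",
}
print(rank_pages("diabetes", pages))
```

A real tool would also weight pages by source credibility and adjust presentation to the reader's background, which this word-counting sketch does not attempt.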
Several initiatives are already under way to evaluate the quality of health information on the Internet. The Department of Health and Human Services' Scientific Panel on Interactive Health Communication calls for disclosure statements on Web sites to make it easy for consumers to evaluate the source and authority of information resources (SCIPICH, 1999). Other efforts focus on systems for classifying health Web sites according to metrics such as accuracy, timeliness, completeness, and clarity.2 With these evaluations, standard search engines could provide consumers with a measure of trust in the information they are retrieving, at least to the degree that they trust the organization performing the content labeling. The World Wide Web Consortium, for example, has created a system called the Platform for Internet Content Selection (PICS), which can help users control the types of information retrieved from the Internet.3 To accommodate different perspectives on health and health care (e.g., alternative as opposed to traditional medicine), a wide variety of organizations could rate health Web sites. Additional research may suggest ways of automating the evaluation process, perhaps using metrics such as the number of pointers to, or users of, a given site as indicators of the site's effectiveness (as some search and referral engines are currently doing). Technology could also be used to help prevent alterations of the site's rating to assure consumers that an evaluation was indeed performed by the stated third party. This function requires cryptographic authentication technologies that are currently available but have not yet been widely deployed for this purpose.
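The cryptographic protection of a rating label can be sketched as follows: the rating service signs its label, and a browser verifies the signature before trusting the rating. For a self-contained sketch, an HMAC with a shared key stands in for the public-key signature a real deployment would use; the service key, label fields, and URL are all hypothetical.

```python
import hashlib
import hmac
import json

SERVICE_KEY = b"rating-service-secret"  # hypothetical key held by the rating service

def sign_label(label: dict) -> str:
    """Produce an authentication tag over a canonical encoding of the label."""
    payload = json.dumps(label, sort_keys=True).encode()
    return hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()

def verify_label(label: dict, signature: str) -> bool:
    """Check that the label has not been altered since the service signed it."""
    return hmac.compare_digest(sign_label(label), signature)

label = {"url": "http://example.org/health", "accuracy": 4, "timeliness": 3}
sig = sign_label(label)
assert verify_label(label, sig)                      # untampered label verifies
assert not verify_label(dict(label, accuracy=5), sig)  # altered rating is detected
```

With true public-key signatures, any browser could verify a label using only the rating organization's published key, without sharing a secret with it.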
E-mail between Patients and Providers
The Internet can also be used to facilitate electronic communications between patients and care providers, typically in the form of electronic mail (e-mail). To date, e-mail has been used only sporadically between patients and providers, but it is of growing interest. It could prove to be an effective mechanism for improving care and lowering costs because more frequent communications might enable better tracking of a patient's progress or eliminate the need for an office visit. This premise has yet to be tested rigorously in clinical settings, and a number of technical and nontechnical issues need to be resolved (Mandl et al., 1998).
Bandwidth and availability are not issues in the near term because most messages currently consist of text only and are not used for time-critical communications. The most pressing technical issue is security. Most e-mail exchanges between patient and provider involve discussions of personal health information, which must be suitably protected from breaches of confidentiality and, to a lesser extent, alteration. Most e-mail is not encrypted during either transmission or storage, and its point of origin is not authenticated. It is therefore much easier to forge an e-mail message than a clinician's note or telephone call.
Several approaches are available for improving the security of e-mail exchanges. Secure Sockets Layer (SSL) encryption, which is commonly used to encrypt e-commerce transactions (see Chapter 3 for a description of the technology), can be used to protect communications between a user's personal computer and the electronic mail server. Other protocols, such as Pretty Good Privacy (also described in Chapter 3), can be used to protect communications as they move across the network between the sender and recipient. User authentication can be enhanced through the use of nontrivial user names and passwords or more secure forms of authentication, such as those based on public key encryption (also described in Chapter 3).
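As a minimal sketch of the first approach, the code below composes a patient note with Python's standard library and would deliver it over an SSL/TLS-protected SMTP session, so the message is encrypted between the client and the mail server. The addresses and mail host are hypothetical, and delivery is shown but not executed; note that this protects the message only in transit to the server, not in storage.

```python
import smtplib
from email.message import EmailMessage

def build_message(patient_addr: str, provider_addr: str, body: str) -> EmailMessage:
    """Compose a patient-to-provider note as a standard e-mail message."""
    msg = EmailMessage()
    msg["From"] = patient_addr
    msg["To"] = provider_addr
    msg["Subject"] = "Follow-up question"
    msg.set_content(body)
    return msg

def send_over_tls(msg: EmailMessage, host: str = "mail.example-clinic.org") -> None:
    """Deliver the message over an SSL/TLS-protected SMTP session."""
    with smtplib.SMTP_SSL(host) as server:  # hypothetical mail host
        server.send_message(msg)

msg = build_message("patient@example.com", "dr.smith@example-clinic.org",
                    "My blood pressure readings this week averaged 128/82.")
# send_over_tls(msg)  # not run here; requires a reachable mail server
```

End-to-end protection of the stored message would require an additional layer such as PGP, as the surrounding text notes.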
The more daunting barriers to patient-provider e-mail are institutional policies for confidentiality and for integrating e-mail into work flows. Most e-mail systems lack even the most basic protections for the confidentiality of message contents. Mail received at the place of work is, by law, fully accessible to the employer. One study showed that patients are hesitant to use e-mail from work to communicate about their health for fear that employers or insurance companies might use the information in ways that affect them personally (Fridsma et al., 1994). To avoid the risk of having messages discoverable at a place of work or other sensitive locations, individuals can store their e-mail files on the server of a trusted third party and/or encrypt messages for storage, but rules regarding disclosure still need to be developed.
Health care organizations are also concerned that e-mail might overload care providers with yet another task in the context of increased clinical and administrative burdens. There are related concerns about the liability of providers if, for example, they miss a subtle but (in retrospect) irrefutable and important question or comment in a patient's electronic note. Many organizations have yet to establish policies regarding the quality of service, such as a maximum time to respond or even acknowledge receipt, that patients can expect from e-mail with providers. Another important concern is economic: there are currently no mechanisms for paying providers for what could be as taxing or time-consuming a clinical activity as any in-person clinical visit. Furthermore, no policies and procedures have been developed for incorporating e-mail into electronic patient records. As a consequence, decisions made on the basis of e-mail information are at risk of having no documented basis in the record.
Safe and effective use of e-mail for clinical discussions between patients and providers will require the development of policies to govern its use. These policies will need to address issues of confidentiality, data integrity, authentication, timeliness, and the appropriateness of the use of e-mail for different kinds of discussions. In some cases, telephone or face-to-face conversations may be considered a more appropriate form of communication. These policies will need to be articulated to all consumers and also embodied in the e-mail user interfaces so that health care consumers can have realistic expectations about the use and safety of clinical e-mail.
Online Health Records
The Internet is emerging as a medium for giving consumers direct access to their personal health records. Historically, care providers have maintained voluminous records of patient encounters within their organizations, documenting dates and times of consultations, diagnoses, lab results, prescriptions, and more. These records are maintained and largely controlled by care providers, although patients have the right, in some states, to review their records and propose amendments as necessary. In the past two years, however, a number of new Web sites have begun to allow consumers to store their own health records online.4 The potential benefits of these sites are many. With them, consumers can create comprehensive, longitudinal records that capture information about the care received from different organizations over an extended period of time. Consumers can use these records to help monitor and evaluate their health status, and they can grant access, if they wish, to different providers for purposes of care. Many sites provide some sort of override feature that enables care providers to gain access to a patient's records in an emergency situation, something that is much more difficult to do if the records are not stored online.5
Like e-mail used for clinical purposes, Web-based medical records require considerable attention to security to minimize the risks of inappropriate disclosure. Personal medical records must be protected against inappropriate disclosure, both to outsiders who attempt to break into the system and to those who operate and maintain the Web sites. Most existing services use SSL encryption to protect data communications between users and the host Web site and a combination of user names and passwords (transmitted securely over the Internet) to authenticate end users.
Systems operating with user identification and passwords can provide reasonably (but not fully) secure access to many types of applications. If online records become more widely used in the provision of care, then it may be advisable to enhance the robustness of user authentication, perhaps with public key encryption systems and user certificates (see Chapter 3). The PCASSO system being developed by Science Applications International Corp. (SAIC) and the University of California at San Diego, for example, uses public key encryption and a challenge-response token, as well as a password, to protect patient information at a far higher level than is possible with SSL.6
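The challenge-response idea can be sketched briefly: the server issues a fresh random challenge for each login, the user's token computes a response from a secret that never crosses the network, and a replayed response fails because the challenge has changed. HMAC keeps this sketch self-contained; PCASSO itself layers public-key cryptography on top, and the secret shown is hypothetical.

```python
import hashlib
import hmac
import os

def issue_challenge() -> bytes:
    """Server side: generate a fresh random challenge for each login attempt."""
    return os.urandom(16)

def respond(secret: bytes, challenge: bytes) -> str:
    """Token side: derive a response without ever transmitting the secret."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(secret: bytes, challenge: bytes, response: str) -> bool:
    """Server side: accept only a response computed from this challenge."""
    return hmac.compare_digest(respond(secret, challenge), response)

secret = b"per-user token secret"      # provisioned on the user's token (illustrative)
c1 = issue_challenge()
r1 = respond(secret, c1)
assert verify(secret, c1, r1)          # correct response accepted
assert not verify(b"wrong secret", c1, r1)  # wrong credential rejected
```

Because each challenge is used once, an eavesdropper who captures one response gains nothing useful for a later login.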
Other technical requirements will be modest in the near future unless online patient records become more complex and more widely used in the provision of care. At present, most online medical records consist primarily of text and demand little bandwidth for fairly rapid downloading. If such records begin to include medical images (e.g., X rays, computed tomography (CT) scans, and mammograms), then much higher bandwidth would be needed for timely downloading (see the section on medical images below). Similarly, reliability requirements are not high because online records are still supplements to, as opposed to replacements for, the records maintained by provider organizations; an inability to access an online record is unlikely to interfere with the provision of care. If online records become more widely used and more complete than providers' records, then reliability could become more of a concern. Scalability is not an issue, either, because records are not needed simultaneously by multiple users.
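The bandwidth contrast between text records and image-bearing records can be made concrete with a back-of-envelope calculation. The file sizes below (a 50-kilobyte text record versus a 40-megabyte digitized mammogram) and link speeds are assumed for illustration only.

```python
def download_seconds(size_bytes: float, link_bits_per_sec: float) -> float:
    """Idealized transfer time, ignoring protocol overhead and congestion."""
    return size_bytes * 8 / link_bits_per_sec

TEXT_RECORD = 50e3   # bytes, assumed size of a text-only record
MAMMOGRAM = 40e6     # bytes, assumed size of a digitized mammogram

for link, bps in [("56 kbps modem", 56e3), ("1.5 Mbps line", 1.5e6)]:
    t_text = download_seconds(TEXT_RECORD, bps)
    t_image = download_seconds(MAMMOGRAM, bps)
    print(f"{link}: text record {t_text:.1f} s, mammogram {t_image:.0f} s")
```

Even under these idealized assumptions, a single mammogram takes on the order of an hour and a half over a dial-up modem but only a few minutes over a 1.5 Mbps line, which is why image-bearing records shift the bandwidth requirement so sharply.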
Ubiquity of access to the Internet is a significant consideration in the development of online medical records because it would ensure that all consumers could keep such records and that those records could be accessible from a large number of unpredictable locations, such as a consumer's home or office, a care provider's office, or an ambulance responding to an emergency. A number of business and policy issues need to be resolved as well. Organizations that store online health records will need to develop policies that balance the need for privacy and security against the need for ready access to records by patients and eventually by care providers and perhaps insurance companies, researchers, and others. Rules may also be needed to govern organizations' use of the online records they maintain. Under what conditions will they be able to provide consumers with recommendations about necessary medical tests or possible drug interactions? To what extent should they be allowed to mine patient records for information that might lead to direct marketing efforts? Under what circumstances should records be made available to public health agencies and researchers?
Patient Monitoring and Home Care
The Internet offers the opportunity for improved monitoring of consumer health and, potentially, provision of in-home care through video-based consultations with care providers (discussed in the Clinical Care section, below) and control of medical equipment (e.g., pacemakers and dosimeters) deployed in the home. The goals of such activities are to assist in the early detection of potential health problems, ranging from heart attacks to congestive heart failure and diabetes, and to reduce the need for clinical intervention and costly hospital stays.7 Remote consultations to the home may be most useful for monitoring patients with ailments such as congestive heart failure and end-stage liver disease. These applications do not require video imagery; the provider simply listens to heart and lungs, taking vital signs and pulse oximetry. In-home care is consistent with existing trends in the health care industry. Since 1975, the number of home health agencies has grown from 2,300 to almost 8,500, while the number of hospital beds per 1,000 enrollees has declined from 51 to 28.8 Similarly, the number of patients receiving home care nearly tripled between 1982 and 1994. These trends reflect, in part, attempts by health insurers and health management organizations to reduce the costs of care associated with long hospital stays.9
To date, few attempts have been made to monitor patients at home. Most efforts have focused on chronic conditions, such as diabetes, asthma, and congestive heart failure, for which well-established protocols exist for home care. The devices used for monitoring are minimally modified copies of devices used in hospitals. Little effort has been made to develop or distribute small devices that mimic the functionality of much larger hospital counterparts with automated quality control and calibration and remote polling and configuration by authorized care providers. Almost none of these devices is as portable or easy to use as a standard pager. In part because of these limitations, home monitoring has not grown as much in popularity as have consumer information on the Web and patient-provider e-mail.
In January 2000, however, Medtronic Inc. announced plans to work with IBM Corp. and Microsoft Corp. to develop a system that will enable heart patients with implanted pacemakers, defibrillators, and experimental cardiac-pacing and -monitoring devices to transmit cardiac data over the Internet to their cardiologists. Eventually, care providers may be able to program the devices over a secure Internet connection without requiring patients to visit their offices. Developers of the system posit that it will result in fewer office visits and hospitalizations, thereby lowering costs while improving patient monitoring and care, but a means of charging for the monitoring service has not yet been devised. Medtronic hopes that its secure Internet system will find utility beyond cardiac patients, perhaps allowing patients with implanted drug pumps to have their doctors change the drug regimen remotely over the Internet (Burton, 2000).
Continued advances in computing and communications technologies could enable more widespread deployment of home-based health monitoring systems. For more than two decades, the feasible density of transistors on an integrated circuit has been increasing by a factor of 10 every 7 years. Memory densities have increased even faster, gaining an order of magnitude every 6 years. As a result, medical devices such as stethoscopes, glucometers, and electrocardiogram monitors already can be equipped to support Internet connections and deployed to consumers at low cost. Over time, computing and communications capabilities will probably be incorporated into a number of other devices that could serve as sources of health information, whether bathroom scales or exercise equipment. If a house is networked, then it would be possible to use a personal computer to connect and control a number of medical monitoring devices. Although the number of homes with conventional local area networks (LANs) is small (mainly because of the high cost of wiring a house appropriately and the disruption involved), Ethernet-like connectivity can be provided to any room in a house through devices that are either wireless or attached to the existing telephone or electric wiring.
Indeed, advances in microelectromechanical systems (MEMS) devices, combined with those forecast in microelectronics, biosensors, and biomaterials, could lead to revolutionary changes in therapies, delivery of medication, and monitoring and alerting systems for the elderly and those with chronic conditions. Devices already on the market, such as pacemakers, wireless stethoscopes, and blood sugar monitors, could be augmented with networking capabilities. High-resolution digital video cameras that are acquired by consumers for recreational or other purposes might become useful in health care applications.
Home-based monitoring is unlikely to require high-bandwidth connections from homes to the Internet because individual messages tend to be small. In demonstration projects, however, investigators have had to work hard to ensure that all participating patients had uninterrupted access to even modest bandwidth, often contracting with the local cable or telephone company to hook up a specific home. The installations, connectivity, and subsequent support costs have accounted for a large portion of the cost of the monitoring efforts. Bandwidth is a more significant issue for provider organizations, which will need to ensure that their facilities can handle the aggregate load of monitoring numerous devices (e.g., if hundreds of thousands of patients with congestive heart failure are monitored at home). At this point, it is difficult to estimate the aggregate bandwidth needed by providers of monitoring services because it is not clear how many patients would be monitored simultaneously or by the same server. The load on the network might be reduced if monitoring hardware reported only summary data and any anomalies detected, unless detailed raw data were requested. Home-based monitoring would require high reliability to ensure that data can be regularly and routinely transmitted and high levels of security to prevent alteration of data as they transit the network.
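The summary-plus-anomalies reporting strategy can be sketched as follows: the home monitor batches raw readings locally and transmits only aggregate statistics together with any out-of-range values. The heart-rate readings and thresholds below are illustrative placeholders, not clinical guidance.

```python
from statistics import mean

LOW, HIGH = 50, 110  # assumed acceptable heart-rate band (beats per minute)

def reduce_readings(readings):
    """Collapse a batch of raw readings into a compact report for transmission."""
    anomalies = [r for r in readings if not (LOW <= r <= HIGH)]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 1),
        "min": min(readings),
        "max": max(readings),
        "anomalies": anomalies,  # raw values are retained only for outliers
    }

report = reduce_readings([72, 75, 71, 140, 74, 73])
print(report)
```

A report of this form is a few dozen bytes regardless of how many raw samples the device collected, which is what keeps the aggregate load on the provider's server manageable.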
Other factors are equally or more important to the evolution of home-based monitoring. Even modest monitoring efforts will not be effective unless mechanisms are deployed to enable care providers to review the monitored data, identify worrisome outliers, and respond in a timely way. The need for oversight of such large numbers of patients at home could result in the emergence of a new category of ancillary health professionals. Furthermore, the effective use of such large amounts of monitored data will require automated data reduction and intelligent data analysis techniques. For some populations (e.g., patients with diabetes or congestive heart failure), this approach could enable fine-grained medical oversight that could result in improved short- and long-term outcomes. But, if used inappropriately, it could also afford vast opportunities for unnecessary and unwanted intrusions into the privacy of all health care consumers.
The benefits of home monitoring cannot be fully realized unless reimbursement is provided for virtual home visits and remote monitoring. In addition, policies for protecting the confidentiality of data gathered in this way will have to acquire the force of law if abuses are to be prevented. Even the strongest cryptographic methods cannot prevent the subversion of a system by parties with strong financial interests in breaching patient data confidentiality. The challenges that must be overcome to provide this level of surveillance appear to be more nontechnical than technical, and they include issues of organizational structure and reimbursement rather than networking capabilities.
Beyond the use of the Internet for home monitoring is the possibility of using it to modify home medical devices remotely. After a remote consultation or review of home monitoring data, a care provider might, for example, want to change the setting of a threshold on a patient's pacemaker, alter the parameters for a programmable insulin pump, or increase the dose delivered by an infusion pump for an oncology patient. Such capabilities are already used to control spacecraft and other remote equipment and could have a large impact on health care, especially in rural environments. Although remote control of such equipment will be unnecessary (or unnecessarily paternalistic) for some patients, it might be appealing in cases involving disabilities or simply for the sake of convenience.
The control of remote medical equipment would pose a number of challenges for the Internet (or any other control network). Although bandwidth requirements would be minimal because the commands would likely consist of short messages, the requirements for security and availability would be extremely high. Data would need to be protected from intentional and unintentional corruption to ensure that commands are transmitted as intended. High levels of authentication would be needed on both ends of the connection to ensure that the appropriate equipment is being manipulated and that only authorized personnel send modifications. The network would have to be protected from denial-of-service attacks that could prevent the receipt of update information.
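The integrity and authentication requirements for such commands can be sketched in miniature: each command carries a sequence number and a message authentication code, so the device rejects forged, altered, or replayed instructions. The key, field names, and dose values below are illustrative, and a real system would use stronger, key-managed public-key mechanisms.

```python
import hashlib
import hmac
import json

DEVICE_KEY = b"per-device shared key"  # hypothetical key provisioned in the device

def make_command(seq: int, setting: str, value: float) -> dict:
    """Provider side: build a command and append an authentication tag."""
    body = {"seq": seq, "setting": setting, "value": value}
    payload = json.dumps(body, sort_keys=True).encode()
    body["mac"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return body

def accept_command(cmd: dict, last_seq: int) -> bool:
    """Device side: accept only authentic commands with a fresh sequence number."""
    body = {k: v for k, v in cmd.items() if k != "mac"}
    payload = json.dumps(body, sort_keys=True).encode()
    mac = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, cmd["mac"]) and cmd["seq"] > last_seq

cmd = make_command(seq=42, setting="basal_rate", value=0.8)
assert accept_command(cmd, last_seq=41)                      # authentic, fresh command
assert not accept_command(cmd, last_seq=42)                  # replayed command rejected
assert not accept_command(dict(cmd, value=8.0), last_seq=41)  # altered dose rejected
```

Note that none of this defends against denial of service: an attacker who blocks traffic can still prevent a legitimate command from arriving, which is why network availability is listed as a separate requirement.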
Technical Requirements for Consumer Health Applications
The technical capabilities needed to support consumer health applications of the Internet are modest, largely because the systems developed to date have had to rely on the existing Internet infrastructure. Early experimentation with more advanced systems that provide real-time video connections between care providers and patients (or their parents) at home demonstrates the increased demands that consumer health could place on networking resources. The discussion below reviews the technical needs for consumer health applications with respect to bandwidth, latency, availability, security, and degree of access. As noted at the beginning of this chapter, the importance of each capability is indicated on a four-point scale, with one plus sign (+) indicating limited needs and four plus signs (++++) signifying an important need.
Consumer health applications vary considerably in the bandwidth they demand. The retrieval of information from health-related Web sites demands little bandwidth on the consumer end, but the potentially large volume of requests made of any particular site could drive up the aggregate bandwidth requirement on the information provider's side. Access to patient health records could demand somewhat greater bandwidth than is typically available today or significantly greater if records include enhanced content, such as medical images or videotapes of telemedicine consultations.
In general, applications that support consumer health do not require the instantaneous delivery of information, so the latency requirements of the Internet are not great. In some patient-monitoring applications, timeliness is a concern, but delays of a few seconds would not threaten a patient's well-being. Latency could become more of an issue if online medical records became the norm and care provider organizations needed timely access to them for purposes of treating patients. In many instances, however, records could be uploaded from remote sites in advance of scheduled appointments, and latency would be a significant issue only in emergency situations.
The need for network availability differs significantly among consumer health applications. The Internet is already sufficiently available for the distribution of health information to consumers and for exchanges of e-mail between patients and providers. Somewhat greater availability would be needed for remote monitoring and remote control operations, although most home monitoring devices and medical equipment could be designed to buffer enough data to overcome short lapses of connectivity. Home monitoring and control will not become commonplace, however, until providers (and consumers) of such services receive guarantees that lengthy network outages will occur very infrequently.
Many consumer health applications demand high levels of security. Although this is generally not an issue with respect to the downloading of health information from consumer Web sites, access to online patient records demands confidentiality because such records contain personal information. The same is true for e-mail messages between patients and providers that contain personal health information. Data from remote patient monitoring devices also require security to prevent corruption (intentional or unintentional) during transit across the network or after storage. As described in greater detail in Chapter 3, both technological and administrative solutions are required to secure these types of consumer health information. For example, authentication technologies are needed to validate the identities of those requesting and transmitting data. Effective controls are needed to prevent users from accessing information about other consumers. Encryption technologies are needed to protect the confidentiality of data transmitted across the network and ensure its integrity. Policies will be needed to determine who can have access to consumer health information and under what conditions. Security requirements will grow as consumers use the Internet to store, retrieve, and update their personal health records.
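The access-control point above can be sketched in a few lines. The record store, user names, and authorization list below are entirely hypothetical; a real deployment would pair such checks with the authentication and encryption measures described in Chapter 3.

```python
# Illustrative sketch (not from the report): release a health record only to
# its owner or to a clinician explicitly authorized for that patient.

RECORDS = {
    "patient-17": {"owner": "alice", "data": "lab results ..."},
    "patient-42": {"owner": "bob", "data": "visit notes ..."},
}

# Hypothetical access-control list: which clinicians may see which patients.
CLINICIAN_ACL = {"dr-chen": {"patient-17"}}

def fetch_record(record_id, requester, role):
    """Return a record only if the requester is its owner (consumer role)
    or a clinician explicitly authorized for that patient."""
    record = RECORDS.get(record_id)
    if record is None:
        raise KeyError("no such record")
    if role == "consumer" and requester == record["owner"]:
        return record["data"]
    if role == "clinician" and record_id in CLINICIAN_ACL.get(requester, set()):
        return record["data"]
    raise PermissionError("access denied")
```

The sketch deliberately denies by default: any request that does not match an explicit rule is rejected, which mirrors the policy stance the text calls for.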
Consumer health applications also raise the issue of online anonymity. Searches for online information can reveal a lot about consumers' health concerns, as can their online purchases of prescription and nonprescription pharmaceuticals. Given the sensitivity of some of these conditions, the demand for anonymous Web browsing and even anonymous e-commerce could grow. Consumers may also demand greater anonymity in e-mail to online physician services offered by some consumer Web sites. Whether anonymity is desirable from a social perspective, and under what circumstances (e.g., anonymous Web browsing may be more plausible than anonymous e-commerce), is an issue for continued debate and discussion.
Key to the success of consumer health applications is widespread access to the Internet. As noted above, many consumer applications currently demand only moderate bandwidth and latency, meaning that standard modem access to the Internet, at 28.8 to 56 kilobits per second (kbps), may suffice. Additional bandwidth could be needed if online access to health records and downloading of educational videos become more popular and widespread and if online health records grow to include not just text but medical images and perhaps even videos. As discussed in the next section (Clinical Care), remote medical consultations to the home over the Internet could require bandwidth of 128 kbps or more in both directions, if such applications prove technically feasible and economically viable. The larger issue may be that of ensuring equitable access to health resources by different demographic groups. There are already considerable differences in access to health care in the United States; ensuring that differential access to the Internet along demographic lines does not exacerbate this imbalance could become an increasingly important issue, especially if the provision of health care moves online.
The Internet offers several avenues for augmenting the health care services in clinical settings. Remote video consultation, for example, could give consumers greater access to skilled health professionals regardless of geographic proximity. The use of the Internet to transfer medical images to expert interpreters could accelerate and improve the diagnostic process as well as reduce costs. Virtual reality tools could help surgeons plan medical procedures and improve their use of information during procedures. The use of the Internet to access and assemble health records could give a provider improved information for treatment purposes, regardless of whether the patient is a regular client or a stranger. Each of these applications poses a range of technical challenges for networking researchers and other information technologists. In most cases, the applications have not yet been demonstrated on a scale sufficient to determine their medical efficacy or influence on costs of care. As the discussion below demonstrates, the use of the Internet in clinical care will be influenced by a range of technical, organizational, and policy issues.
Remote medical consultation has long been pursued as a means of overcoming the unequal distribution of clinical expertise. It is a method of offering expert consultations to patients in remote rural areas, for example, or underserved urban areas or prisons. Even where clinical expertise is available, but inconvenient for either the patient or the provider, remote medical consultations may be a cost-effective alternative to staffing multiple clinics with subspecialists. Remote consultations may also be useful to specialized service organizations that attempt to establish economies of scale for particular types of clinical service, such as the interpretation of radiological images (e.g., CT and magnetic resonance images), while also developing more effective bargaining units for health care contracting. These organizations, which are becoming more numerous, can benefit insofar as their reach is extended beyond their immediate geographical area, allowing them to serve a broader pool of consumers.
The network performance required for remote consultation is variable and depends on a number of factors, including (1) the resolution required in the transmitted signal or image to support diagnosis, (2) the timeliness with which data must be received and interpreted (e.g., whether the system is used for real-time consultation or asynchronous review), (3) the degree to which the data may be compressed, (4) whether the entire data set must be transferred or application-specific decisions can be made about which subsets to transmit, and (5) whether the transmission can be considered only on a point-to-point basis or as part of aggregate traffic. These factors vary significantly across different applications and operating modes. For example, psychiatric evaluations may be viable with video that has lower resolution than a cineo-angiogram, but the application needs to operate in real time rather than in a store-and-forward mode for review at a later time.
No conclusive studies have been done regarding the bandwidth needed for different applications; the results of research on this issue generally depend on the provider involved and the study structure. However, reasonable guidelines can be gleaned from experiments conducted to date. Practitioners at East Carolina University (ECU) in Greenville, North Carolina, for example, have considerable experience with remote consultations, having conducted about 3,000 real-time consultations in 31 different specialties since establishing a telemedicine program in 1991 (see Appendix A for more information on the ECU program). The five most active specialty areas have been dermatology; cardiology; neurology; gastroenterology; and allergy, asthma, and immunology. Practitioners have found that the bandwidth needed for most real-time, video-based consultations varies from 128 kbps to 384 kbps, depending on the degree of resolution needed for diagnosis and the rate of motion in the video (Table 2.3).10
For some procedures, such as cineo-angiograms, echocardiograms, and gait analysis (Box 2.1), more bandwidth can be advantageous. Cineo-angiograms, for example, can be transmitted at 384 kbps, but 768 kbps produces better results. Cineo-angiograms are generally not performed in real time (because the source is film); therefore, they can be done in a store-and-forward mode, with bandwidth needs determined by the number of cases to be examined on a given day and the desired turnaround time. Adult echocardiograms are often done in real time and can be read adequately at 384 kbps, but pediatric echocardiograms may require 768 kbps because the area being observed is so small. Extensive testing with echocardiography indicates that data rates in excess of 1.5 megabits per second (Mbps) probably do not contribute to an increase in clinical efficacy, but further investigation is under way.11 For remote analysis of a patient's gait, bandwidth of up to 2.5 Mbps may be necessary, but this application has not been extensively evaluated.12
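These guidelines lend themselves to a simple lookup. The kbps figures below come from the text; the table structure and the single-session-at-a-time assumption are illustrative only.

```python
# (minimum, preferred) bandwidth in kbps for each consultation type, per the
# figures quoted in the text. The mapping itself is a hypothetical sketch,
# not an artifact of the ECU program.
CONSULT_BANDWIDTH_KBPS = {
    "general video consultation": (128, 384),
    "cineo-angiogram": (384, 768),
    "adult echocardiogram": (384, 384),
    "pediatric echocardiogram": (768, 768),
    "gait analysis": (2500, 2500),
}

def minimum_link_kbps(applications):
    """A shared link must satisfy the most demanding application's preferred
    rate (simplifying assumption: one session active at a time)."""
    return max(CONSULT_BANDWIDTH_KBPS[a][1] for a in applications)
```

For example, a clinic offering both adult and pediatric echocardiography would provision for the pediatric requirement, 768 kbps, even though adult studies read adequately at half that rate.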
For the most part, remote consultation programs rely on dedicated networks, not the Internet, to provide connectivity between remote clinics and a centralized consulting facility. The ECU program, for example, uses an amalgam of microwave links, T1 lines, telephone lines, and integrated services digital networks (ISDN; see Chapter 3) for a variety of applications. The ECU program and other experimental programs, such as the National Laboratory for the Study of Rural Telemedicine at the University of Iowa, also make use of statewide fiber-optic networks for connectivity between some sites.13 Although costly, dedicated lines have been viewed as the most effective means of guaranteeing access to adequate bandwidth as needed. The Internet does not yet offer the quality of service needed for real-time video consultations. Some organizations, including ECU, have begun to shift their systems to the Internet Protocol (IP) in anticipation of connections to the Next Generation Internet, but they will continue to rely on dedicated links until a more viable Internet-based infrastructure is available.
Continued advances in telecommunications infrastructure could cause remote consultations to become less the province of a few sites equipped with specialized telemedicine rooms and more a routine component of the services offered by health plans. A number of integrated health care delivery systems have begun to experiment with remote consultations (typically over leased lines) to provide specialty services in outlying areas. If the Internet could support such capabilities, then remote consultation could become more common, perhaps even extending beyond regional boundaries. Indeed, an enhanced Internet could help extend teleconferencing to the home, enabling consumers to ask for videoconferences with care providers whenever health problems require immediate attention. Such capabilities could dramatically transform health care by eliminating many office visits.
Regardless of whether the patient is at home or somewhere else, remote consultations require sustained bandwidth in two directions: from the patient to the provider and vice versa. This stands in contrast to many high-bandwidth applications, such as entertainment, education, or scientific visualization, in which sustained access to high bandwidth is needed in one direction only, from a centralized distributor of content to a recipient. Clearly, the need for high bandwidth in two directions is not unique to health care; many businesses need bidirectional bandwidth to support collaborations between workers in different locations. However, remote medical consultations to the home demand that such capabilities be available from many locations, not just from corporate offices that might already lease a high-bandwidth access line to the Internet or a private, corporate network. The most likely users of remote medical consultations would be primary care providers and patients living in rural or remote areas, many of whom have limited access to high-bandwidth Internet connections (see Chapter 3).
The future of remote consultations will be influenced by a number of factors beyond network technology. Other technical challenges arise from the need for appropriate data acquisition devices to digitize the observations involved in the consultation. Although many medical devices have been instrumented to allow for remote control and data acquisition, few of them have achieved mass market acceptance. As a result, such devices tend to be confined to a few specially equipped rooms in institutions supporting remote medical consultation. Many more of the devices used as part of routine physical exams could be modified for data acquisition and control. They could then be deployed in patients' homes to facilitate home-based consultations with patients whose diseases require intensive monitoring and oversight. In the near term, these devices might be individually configured to work with a home computer and to relay information to a remote care provider over the Internet. Remote care providers might even be able to exercise some control over these devices, whether adjusting their sensitivity or other operating parameters. Eventually, the devices could be designed to connect automatically to the Internet and be configured by a remote Web browser. Initially, this might be cost-effective for only small, high-risk populations, but remote consultations to the home could become more popular as the technology continues to evolve and costs decline.
Beyond technical challenges, a number of organizational and policy issues need to be resolved if remote consultations are to become more viable in the future. Health care organizations need to develop viable business models for remote consultations that meet the needs of different users. Will remote consultations bring in income directly, or will they generate revenues indirectly by channeling patients into a provider's health care system? East Carolina University, for example, has developed at least five different business models for its services, but only one, providing services to prison inmates, has proved profitable. The others typically operate with grants from federal agencies or are seen as experiments to broaden the reach of local provider organizations. Profitability is currently constrained by the fact that many health plans, including Medicare, do not yet routinely pay for remote consultations, although experiments are under way to examine alternative repayment schemes (see Chapter 5). Other issues, such as state-based licensure of health professionals, impede attempts to deliver remote consultations across state lines.
Closely related to the provision of remote medical consultations is the use of communications networks to transfer still medical images. This capability could enable care providers to retrieve digital images from an online repository (often referred to as a picture archiving and communications system, or PACS), send images to specialists for interpretation (a form of remote consultation), or receive information from emergency vehicles responding to a call. The potential benefits of such systems could include the following:
• Improved management and use of medical images (i.e., reduced probability that images will be lost or incorrectly filed). Physicians at the University of California at San Francisco (UCSF), for example, noted during the committee's site visit (see Appendix A) that before establishing their PACS, 15 to 20 percent of radiographic images were lost and hundreds went unread.
• Improved quality of care through expert interpretation. Five university medical centers, including UCSF, have established an expert radiographic interpretation center, Telequest, which accepts images from a variety of clients, including rural clinics, and provides transcribed diagnoses. This experimental program makes expert interpretation more widely available throughout the country.
• Reductions in the cost of radiological interpretation. Centralization of expertise could reduce health care costs by obviating the need for individual provider organizations to maintain more expert radiologists than they can keep busy.
From a networking perspective, the challenge in remote imaging (sometimes called tele-imaging) is the size of medical images. Typical uncompressed radiographic files range from about 25 kilobytes (kB) for a nuclear medicine image to 50 megabytes (MB) for digitized mammograms, but multiple images often are needed, either to provide a complete view of the object of interest from different angles or to compare various images. Hence, the size of an uncompressed radiographic study can range from 1 to 2 MB for a nuclear medicine study to almost 200 MB for digitized mammograms (Table 2.4). The size of these studies is expected to grow as imaging technology advances; image resolution is expected to improve by a factor of 10 or more in cases such as cineo-angiography. As of early 1999 researchers at UCSF were working with digitized cineo-angiograms that were 60 MB in size and with intravascular ultrasound images that were 50 MB.14 High-resolution electron microscopes can produce individual images that are 2 MB in size, but such instruments are available only at a small number of research centers.15
The bandwidth required to transmit these images is determined by several factors, including the amount of time in which the image must be transmitted and the degree of compression that is allowable without degrading the image so much as to impair interpretation and diagnosis. Lossless compression techniques can reduce image size by a factor of 3 or 4; lossy compression techniques can reduce images by a factor of 10 to 20 without sacrificing diagnostic quality in some applications (Lou et al., 1997). Acceptable compression levels vary by application domain (e.g., teleconferencing versus radiology) and by intended user (e.g., radiologist versus primary care physician). With digital mammography, the maximum degree of acceptable compression is controversial because of concerns over the loss of detail.
In many applications, images can be sent, often as e-mail attachments, to a remote site for interpretation and diagnosis within 1 or 2 days. This technique does not place extreme demands on the network and has been used successfully by several organizations, even at low bandwidth. Several years ago, for example, Massachusetts General Hospital in Boston used regular voice lines at 9.6 kbps to receive radiographs and CT images from Saudi Arabia. The images were compressed at ratios of 20 to 1 and 10 to 1, respectively, so that exams could be transmitted in 20 minutes to 1 hour and interpretations could be provided in 1 to 2 days (Huang, 1996; K.J. Dreyer, Partners Healthcare System, personal communication, 1999). Increased use of this technique could demand greater bandwidth, however. A busy mammography center may perform 80 to 100 examinations per day. If all these images were sent out for interpretation, an average sustained throughput of almost 1 Mbps would be required without compression and almost 100 kbps with 10 to 1 compression. This admittedly high-end application is within the average performance capabilities of the Internet backbones but not of all Internet service providers.
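The aggregate-throughput arithmetic can be checked with a short calculation. The 90-exam, 120 MB, 24-hour inputs below are assumptions chosen to illustrate the text's rough figures, not data from the report.

```python
def sustained_throughput_bps(exams_per_day, study_mb, window_s=86_400,
                             compression_ratio=1.0):
    """Average bit rate needed to move one day's exams within `window_s`
    seconds, after dividing their size by `compression_ratio`."""
    return exams_per_day * study_mb * 8e6 / compression_ratio / window_s

# Assumed inputs: 90 exams/day averaging ~120 MB each, spread over 24 hours.
uncompressed = sustained_throughput_bps(90, 120)
compressed = sustained_throughput_bps(90, 120, compression_ratio=10)
# uncompressed is on the order of 1 Mbps; 10-to-1 compression cuts it to
# roughly 100 kbps, consistent with the figures quoted in the text.
```

Note that a shorter delivery window (e.g., a working day rather than 24 hours) would raise the required rate proportionally.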
A desire for faster turnaround in the interpretation of medical images could increase demands on networking resources, even if the volume of examinations transferred across the network is small. Faster networks would enable various types of service improvements. For instance, they could enable remote experts to provide faster diagnoses or second opinions to help referring physicians plan follow-up treatments while their patients are still in the office, or they could enable the remote experts to provide real-time advice, such as suggesting a need for additional images to aid in diagnosis before a patient leaves the mammography center. In the first example, sometimes referred to as teleconsultation, a response may be desired within 30 minutes or so; in the second example, sometimes referred to as telemanagement, a response may be desired in near real time. Given the size of the images to be transmitted and the possible need for reference sets, the bandwidth demands could be tremendous. With lossless compression of 4 to 1, an entire mammography study would require 1.7 Mbps to be transmitted in 2 minutes to allow near-real-time interpretation. Such a capability would not be a necessity in most cases; expert reading of mammograms in real time is not needed for regular screenings, but it can be useful if potential abnormalities are discovered.
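The 1.7 Mbps figure can be reproduced with simple arithmetic, assuming a study of roughly 100 MB (the text quotes study sizes from about 100 to 200 MB depending on modality).

```python
def required_mbps(study_mb, deadline_s, compression_ratio=1.0):
    """Sustained bandwidth (Mbps) needed to deliver a compressed study
    within `deadline_s` seconds. Ignores protocol overhead."""
    bits = study_mb * 8e6 / compression_ratio
    return bits / deadline_s / 1e6

# Assumed inputs: ~100 MB study, 4-to-1 lossless compression, 2-minute
# deadline for near-real-time interpretation.
rate = required_mbps(100, 120, compression_ratio=4)
# rate comes out to roughly 1.7 Mbps, matching the figure in the text.
```

The same function shows why telemanagement (near real time) is so much more demanding than teleconsultation (a 30-minute window): shrinking the deadline by a factor of 15 raises the required rate by the same factor.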
In some imaging applications, high-speed networks become less important if the data can be intelligently processed prior to transmission. For example, during the development of PACS for integrated health care delivery systems, users often specify that several care providers need to be able to simultaneously access uncompressed medical images from any location in the system within 2 seconds. Such characteristics can translate into a requirement for network bandwidths of 100 Mbps (typically a LAN) and specialized high-resolution monitors, which can make such systems costly (in the $2 million to $3 million range). With some attention to physicians' schedules, however, systems can be developed that store images locally on a computer that the physician is likely to use, thereby reducing the strain on the network. Furthermore, systems can be developed that operate efficiently on 10 Mbps LANs and standard computer monitors. For example, most computer monitors cannot display a full-screen medical image at full resolution; if the maximum resolution of a clinician's video screen is 1,024 × 768 pixels × 12 bits of gray scale (or 1.2 MB), then there is no use in sending all 128 MB of a digital mammogram study. Instead, a lower resolution image could be sent and additional detail could be requested on smaller portions of the image as the radiologist identifies areas of interest. A variety of commercial solutions are now available that enable the design of tele-imaging systems that appear highly responsive without greatly increasing network performance requirements (Box 2.2). Such systems tend to transmit only portions of a complete image at any one time and therefore place less stringent demands on network capabilities than full-screen, full-resolution systems. Continued evaluation will be needed to determine the relative effectiveness of these alternative designs in diagnosing medical conditions.16
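The screen-versus-study mismatch can be made concrete. The 1,024 × 768 × 12-bit display figure comes from the text; the 4,096 × 5,120 full-resolution image dimensions are assumed for illustration.

```python
# Sketch of the "send an overview first, then full detail on request" idea.
# 12 bits of gray scale is counted here as 1.5 bytes per pixel.

BITS_PER_PIXEL = 12

def image_bytes(width, height):
    """Uncompressed size in bytes of a grayscale image."""
    return width * height * BITS_PER_PIXEL // 8

# A clinician's display can show at most this much data at once (~1.2 MB,
# matching the figure in the text).
screen = image_bytes(1024, 768)

# Hypothetical full-resolution mammogram image (dimensions assumed).
full = image_bytes(4096, 5120)

def bytes_to_send(roi_fraction):
    """Send a screen-sized overview plus full detail for only the requested
    region of interest, expressed as a fraction of the full image."""
    return screen + int(full * roi_fraction)
```

Even if a radiologist inspects a tenth of the image at full resolution, far fewer bytes cross the network than shipping the entire study, which is what makes these systems feel responsive on modest links.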
Data security is also important to teleradiology applications. Both patient confidentiality and data integrity must be maintained during image transmission and storage. Confidentiality can be maintained through the use of a host of technologies for authenticating users, controlling their access to images, and encrypting transmissions (see Chapter 3). Data integrity can be maintained (an especially important function given the ease with which a digital image can be altered) through the use of technologies such as digital signatures, which are used in a number of e-commerce applications. Nevertheless, trade-offs need to be made between the level of protection of digital images and other considerations, including cost and ease of use (Huang, 1996).
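One way to sketch the integrity check: compute a keyed digest of the image at the sending site and verify it on receipt, so any alteration in transit is detected. The example uses an HMAC with a hypothetical pre-shared key to stay self-contained; production teleradiology systems would use public-key digital signatures, as the text notes for e-commerce.

```python
# Integrity-protection sketch (illustrative, not a deployed design).
import hashlib
import hmac

KEY = b"shared-secret-between-sites"   # hypothetical pre-shared key

def sign(image_bytes):
    """Compute a keyed digest of the image at the sending site."""
    return hmac.new(KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes, tag):
    """Confirm at the receiving site that the image was not altered."""
    return hmac.compare_digest(sign(image_bytes), tag)
```

Because even a one-bit change to the image produces a completely different digest, tampering with a transmitted radiograph would be evident before interpretation; note that this scheme alone provides integrity, not confidentiality, which still requires encryption.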
Several transactions form essential components of clinical care: (1) administrative functions such as referrals, practice management, and billing (addressed in the section below on financial and administrative transactions), (2) distribution of medical supplies and purchasing and inventory control, and (3) clinical functions, such as reporting of results from testing laboratories and interinstitutional communication of health information. While some progress has been made in the first two of these areas, progress in the third has been limited. Some health care organizations use a Web-based infrastructure for reporting laboratory results within an institution, but few integrate the laboratory data into the clinical information system containing patient records. Fewer still use the Internet for exchanging patient records among affiliated or unaffiliated health care organizations. As a result, when patients arrive at a health care organization for the first time (perhaps after a referral or changing health plans) or visit the emergency room of a hospital they have not visited before, their medical records are either inaccessible or reduced to a photocopied or faxed subset of the paper record in another institution. This is often true of communications between departments even within the same institution because only a small minority of health care institutions have at least some form of enterprise-wide clinical information system.
Greater use of the Internet to facilitate exchanges of clinical information could improve the quality of care by making better and more complete information available to care providers. A recent report from the Institute of Medicine identified medical errors as the source of much unnecessary morbidity and mortality (IOM, 1999). By integrating the clinical transactions of all parties to health care delivery (hospitals, pharmacies, clinicians, insurance) across the Internet, there is a significant opportunity to detect and prevent such errors.17 Use of the Internet for transferring medical records would enable care providers to better treat patients who become ill or are injured while traveling or who have not previously been under their care.
Despite the lack of effort in this arena to date, the Internet appears to provide a viable medium for use by hospitals to share patient health records for the purpose of improving care. This capability was explored through the World Wide Web Electronic Medical Record System (W3EMRS), developed by researchers at Boston's Children's Hospital, Beth Israel Hospital, Massachusetts General Hospital, and the Massachusetts Institute of Technology. The original purpose of the system was to allow the sharing of clinical information across the Internet among emergency room clinicians at the three participating hospitals. The system was subsequently deployed for the sharing of birth data and perinatal maternal data among Brigham & Women's Hospital, Beth Israel Deaconess Medical Center, and Children's Hospital for the management of newborn infants with jaundice at Children's Hospital or one of its affiliated practices. It was also adapted for use within the seven affiliated hospitals of the Boston-area CareGroup; the system is accessible to all authorized clinicians and saves an estimated $1 million annually by reducing the time spent searching for records, time needed to admit a patient, number of admitted patients, length of hospital stays, and time spent in training. The impact on patient retention and member attraction is projected to increase revenues by $3 million to $4 million per year (see Box 2.3 for additional information about the W3EMRS system and its implementation within CareGroup).
The bandwidth needs for exchanges of clinical information vary with the size of the records to be exchanged, the number of records that are transmitted in a given period of time, and the timeliness with which records must be accessed. The size of a medical record transmitted electronically between sites can vary considerably, from as little as 1 kB to as much as several gigabytes if the record contains several medical images. In general, older and sicker patients have the largest records. Paper charts 3 to 4 inches thick and divided into several volumes are not unusual in hospitals, like those in CareGroup, that serve large numbers of such patients. The number of records transmitted also varies considerably.
The average person will see a doctor as an outpatient three to four times a year, with Medicare patients making five to six visits a year and healthy adults making approximately two visits per year. Children visit doctors at the same rates as the elderly, but their records are smaller. Overall, 116 hospitalizations occur per year per 1,000 people, with 332 hospitalizations per year per 1,000 people for those 65 and older. Overall, 380 emergency room visits occur per year per 1,000 people. The timeliness required depends on whether access is needed in an emergency room situation or whether the record is to be transmitted to a specialist for an appointment at a later date.
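These utilization rates translate into a rough estimate of annual record-exchange volume. The 50,000-person population below is a hypothetical service area; the per-1,000 rates are taken from the text.

```python
# Back-of-the-envelope estimate of how many encounters per year could each
# trigger a record retrieval, for a hypothetical service population.

def annual_record_transfers(population):
    """Scale the per-1,000-person encounter rates quoted in the text."""
    per_1000 = {
        "outpatient visits": 3500,       # 3 to 4 visits per person per year
        "hospitalizations": 116,
        "emergency room visits": 380,
    }
    return {kind: rate * population // 1000 for kind, rate in per_1000.items()}

transfers = annual_record_transfers(50_000)
```

For a 50,000-person area this works out to 175,000 outpatient visits, 5,800 hospitalizations, and 19,000 emergency room visits per year, each a potential record exchange; multiplying by a record size (1 kB to several GB, per the text) bounds the network load.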
Of greater concern is security. At issue are the confidentiality of clinical transactions, which tend to contain personal information and identities, and the integrity of the data transmitted. Confidentiality of the transmitted record can be addressed with encryption, as long as it is accompanied by strong methods for authenticating both the sender and receiver of the record. In pre-Internet (i.e., telephone or fax) communications, authentication is handled by the participants, who have some notions of how to verify that they have reached the correct party. These already shaky notions are ineffective in Internet communications. Internet protocols such as the hypertext transfer protocol (HTTP), which handles Web-based transactions, do have provisions for encrypting communications and validating users' identities, but effective mechanisms have yet to be deployed for distributing the tokens, certificates, or other technologies needed for authentication to all potential users of a system. Few organizations have adopted basic security procedures for protecting data integrity and data stored on computers that are accessible over the Internet.
System reliability is also an issue in clinical transactions. If health care organizations are to rely on Internet-based systems for access to patient records, then they must be assured that the systems will function properly 24 hours a day, 7 days a week. System outages of limited duration may be tolerable if records are being transmitted to a specialist or another institution in advance of a scheduled appointment, but they cannot be tolerated if access is needed on demand, such as for emergency room treatments.
A number of other challenges stand in the way of Internet use for clinical transactions. Perhaps the most daunting is the lack of agreement on data interchange standards and standardized vocabularies, or nomenclature, to describe clinical entities. Considerable support exists for the Health Level Seven (HL7) standards, which were developed for clinical transactions (Box 2.4),18 and the HL7 organization continues to improve the completeness of its HL7 data model to encompass more possible medical transactions (e.g., the current standard does not cover all drugs, X-ray studies, or nursing interventions or problems). Nevertheless, proprietary concerns and institutional inertia have led many vendors of clinical information systems and health care organizations to develop commercial applications and home-grown systems that are not compliant with the HL7 standard or idiosyncratically compliant with it. Furthermore, health care organizations have not been able to agree on a standardized vocabulary to use in describing different sets of clinical entities, despite significant support by the National Library of Medicine for multiple standardization efforts, including the construction and maintenance of a metathesaurus as part of the unified medical language system (UMLS).
There are many reasons why neither the data models, such as HL7, nor the vocabularies that make up the UMLS have gained widespread acceptance, but these are beyond the scope of this report. One important reason, however, is clear: there have been no sufficiently motivating arguments for data sharing across, or even within, institutions. Unlike billing or pharmaceutical transactions, clinical transactions have, at best, only an indirect effect on the profitability of health care organizations. The health care industry is also much less consolidated than the pharmaceutical industry, which has been more successful in deploying an interoperability standard. An additional inhibitor is the nature of clinical transactions themselves, which tend to be more complex and varied than commercial transactions. The slow pace at which clinical information systems have been deployed and accepted into practice has encouraged the development of consumer-driven health information systems in which third parties store and provide access to clinical data. Nonetheless, without the widespread adoption of standards for health data exchange and the development of robust means for authenticating users, the functionality of these consumer information systems will be no better than that of their counterparts in health care institutions.
Use of the Extensible Markup Language (XML) for Internet-based transactions may provide additional interoperability but could encounter similar barriers. XML is a universal format for structured documents and data on the Web. Like HTML, XML makes use of tags (words bracketed by "<" and ">") and attributes (of the form name="value"), but whereas HTML specifies what each tag and attribute means (and often how the text between them will look in a browser), XML uses tags only to delimit pieces of data, leaving the interpretation of the data to the application that reads it. An XML document can include a document type declaration, which contains or points to markup declarations that provide a grammar for a class of documents. This grammar is known as a document type definition (DTD). Therefore, before XML documents adhering to the HL7 data model can be created and exchanged reliably, a commonly accepted HL7 DTD must be ratified (Dolin et al., 1998). Without agreement on this common HL7 DTD, the exchange of clinical data across the Internet among health care systems will be significantly more cumbersome and will probably be further delayed.
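The division of labor between XML and the consuming application can be seen in a small example. The element names below are invented for illustration and are not part of any ratified HL7 DTD:

```python
import xml.etree.ElementTree as ET

# Hypothetical lab-result fragment. XML only delimits the data; nothing in
# the markup says that <value> holds a number or what "mg/dL" means. That
# interpretation is supplied by the reading application.
doc = """
<labResult>
  <patientId>12345</patientId>
  <test name="glucose">
    <value units="mg/dL">95</value>
  </test>
</labResult>
"""

root = ET.fromstring(doc)
value_elem = root.find("./test/value")
reading = float(value_elem.text)    # the application decides this is numeric
units = value_elem.get("units")     # and decides what to do with the units
assert (reading, units) == (95.0, "mg/dL")
```

Two systems can both emit well-formed XML and still fail to interoperate if they disagree on element names or structure, which is why a shared DTD (or an equivalent agreed grammar) matters for HL7 exchanges.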
New federal mandates are likely to encourage greater standardization that could facilitate use of the Internet for exchanges of clinical information. The Health Insurance Portability and Accountability Act (HIPAA, P.L. 104-191) requires the secretary of Health and Human Services to adopt standards for the electronic transmission of health information associated with the following transactions: health claims or equivalent encounter information, health claims attachments, enrollment and disenrollment in a health plan, eligibility for a health plan, health care payment and remittance advice, health plan premium payments, first report of injury, health claim status, and referral certification and authorization. The challenge will be to ensure timely compliance with HIPAA standards by the multitude of legacy applications, a task that will require at least as much effort as the Y2K remediation.
Furthermore, the emergence of Internet industries to host several of the aforementioned clinical transactions and their data repositories may provide the means for the widespread implementation of automated health transactions. However, hosting such applications has its own dangers. By slicing the space of clinical transactions into sharply demarcated segments (e.g., clinician documentation, laboratory reporting, medication ordering), there is a risk that important data relevant to patient care will become more dispersed and poorly integrated. Only by ensuring close adherence to HIPAA with a high degree of interoperability (i.e., adherence to data storage and communication standards) can this risk be abated.
Technical Requirements for Clinical Care
The technical capabilities required by clinical applications of the Internet are even more demanding than those required by consumer health applications because a number of factors converge. The need to protect the confidentiality of patient information is combined with the need for high bandwidth and low latency to support remote consultations; high availability is also required to ensure that patient records can be accessed when needed and that systems remain operational for the duration of a remote consultation. Although not all clinical applications of the Internet simultaneously stress each of these dimensions, the set of foreseeable clinical applications, taken as a whole, does.
Bandwidth requirements for clinical applications vary considerably, but many possible applications could demand high bandwidth. Remote consultations, for example, would require sufficient bandwidth for real-time video, at rates approaching 1 Mbps for some types of diagnostic procedures. The transmission of large medical images could also require high bandwidth in some instances, such as to support the transfer of large numbers of images between an imaging center and a remote interpretation center or to ensure rapid turnaround of diagnoses from a remote specialist. Even remote access to electronic medical records could demand relatively high bandwidth to the extent that such records include images or video. In many cases, records (or images) could be downloaded in advance of the need to view them, although this technique would not be as effective in emergency situations.
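A back-of-envelope calculation suggests why image transfers stress bandwidth. The payload size below is a rough assumption for illustration, not a figure from this report:

```python
# Transfer time for a clinical payload at a given access speed, ignoring
# protocol overhead and latency. Note the bits/bytes conversion: link
# speeds are quoted in megabits per second, file sizes in megabytes.

def transfer_seconds(size_megabytes: float, bandwidth_mbps: float) -> float:
    """Seconds to move a payload of the given size over the given link."""
    return (size_megabytes * 8) / bandwidth_mbps

# Assume a single uncompressed radiology image of roughly 10 MB.
assert round(transfer_seconds(10, 1.0)) == 80        # ~80 s over 1 Mbps
assert round(transfer_seconds(10, 100.0), 1) == 0.8  # sub-second at 100 Mbps
```

A study of dozens of such images per case makes the difference between dial-up-era links and broadband the difference between hours and seconds, which is the case for prefetching records when emergencies can be anticipated.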
Latencies across the Internet are adequate for many clinical applications, such as results reporting and downloading of most medical records (if these records are already stored remotely online), but applications like remote consultation would demand lower latencies to facilitate more natural interactions between participants. In virtual reality applications, low latencies are needed to create realistic spaces and interactions that are not distracting to users or participants.
Because of safety and timeliness considerations in patient care, availability of the network is vital. Clinicians awaiting a lab result, radiological examination, or connection to a patient's home cannot tolerate any unavailability of the network or the clinical applications running on it. Likewise, remote consultations will not be viable if network availability cannot be assured and connections are broken frequently. Neither care providers nor patients will tolerate delays, downtime, or lost connections in such applications.
Because patient information is so sensitive and its safety is paramount, the security of the network is vital for clinical applications. Without assurances that the confidentiality and integrity of patient data will be protected and that critical services will be available when needed, both the government and the public will resist the sharing of data across institutions. Security improvements will entail both technical measures (many of which would be deployed within specific applications) and the development of robust confidentiality policies to govern the disclosure of personal health information. Considerable technology is available for improving the security of many clinical transactions between established partners, but it is not widely deployed in health organizations (CSTB, 1997). More advanced security technologies, as outlined in Chapter 3, could improve protection, especially for data exchanges between unaffiliated health organizations and between consumers and care providers.
Ubiquitous access will be important for many clinical applications if for no other reason than the dispersion of health care providers, many of whom continue to work in private practice or remote clinics. For remote consultations, distributed collaboration, and home care, the ubiquity of network services is essential. High-bandwidth (broadband) access could also be important for applications requiring the transmission of large images or real-time video.
Financial and Administrative Transactions
The Internet is being evaluated as a medium for streamlining financial and administrative transactions in the U.S. health care system. Health care in the United States is financed largely by a network of so-called third-party payers: entities that insure and pay for health services but are not directly engaged in providing care. These entities range from government programs such as Medicare and Medicaid, which pay for the care of the elderly and the impoverished, to private-sector organizations, including traditional indemnity insurers, self-insured companies, and managed care organizations. The Internet could be used by providers to submit claims for payment or by individuals to enroll in, disenroll from, and change their coverage. Payers could quickly confirm an individual's eligibility for coverage and convey any changes to the health plans, which, in turn, could quickly relay the information to the person's designated providers. By accelerating these transactions, the Internet could reduce misunderstandings and disputes among parties, hasten payers' premium payments to plans and plans' payments to providers, and reduce administrative costs, which by some estimates constitute 30 percent of all health care expenditures in the United States. By one estimate, paper claims cost between $2 and $18 each to process, whereas electronic claims have costs measured in cents (McCormack, 2000).
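The processing-cost figures quoted above imply large potential savings. The arithmetic below is purely illustrative, using the claim volume cited in this section and a hypothetical 25-cent cost per electronic claim:

```python
# Rough annual-cost comparison for claims processing. The 4.7 billion
# claims figure is from this section; the 25-cent electronic cost is an
# assumption within the "measured in cents" range quoted above.

CLAIMS_PER_YEAR = 4.7e9

def annual_cost(per_claim_dollars: float,
                claims: float = CLAIMS_PER_YEAR) -> float:
    """Total yearly processing cost at a given per-claim cost."""
    return per_claim_dollars * claims

paper_low = annual_cost(2.00)    # low end of the paper estimate
paper_high = annual_cost(18.00)  # high end of the paper estimate
electronic = annual_cost(0.25)   # hypothetical electronic cost

# Even against the cheapest paper estimate, the gap exceeds $8 billion.
assert paper_low - electronic == 8.225e9
```

Such a sketch overstates net savings, since it ignores the transition costs discussed later (legacy systems, interoperability, security), but it shows why payers and providers find electronic submission attractive.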
Health care organizations have filed claims electronically for some time. Approximately 65 percent of the 4.7 billion claims submitted for payment by care providers and pharmacies in 1999 were submitted in electronic form. Hospitals and pharmacies have gone the farthest down this path, having submitted 85 percent and 89 percent of their claims, respectively, in electronic format in 1999, compared to just 43 percent for individual physician practices. Much of this progress is due to prompting from the Health Care Financing Administration (HCFA), which administers the Medicare and Medicaid programs, and from Blue Cross/Blue Shield organizations, each of which was expected to receive more than 80 percent of claims electronically in 1999. By contrast, HMOs and other commercial insurers, which together account for 44 percent of all health claims, were expected to receive just 18 percent and 45 percent of claims, respectively, in electronic format.19 Medicare has required its contracted managed care organizations to transmit electronically beneficiary enrollment, disenrollment, and correction data in batch mode to its data center (HCFA, 1999b). These managed care plans also obtain data on the disposition of their transactions and on plan membership and payments electronically.
Despite the trend to electronic formats, only a few provider organizations use the Internet to submit electronic claims or related transactions, and few payers are capable of accepting Internet-based transactions. Until recently, Medicare transactions, for example, have been conducted using telephone lines to access the data center. HCFA is replacing this system with the Medicare Data Communications Network, which will be accessed with Web browsers. HCFA's carriers for the conventional Medicare program, fiscal intermediaries, and most of the Medicare+Choice plans are using this network, and by July 1999 the remaining managed care plans were required to have made the transition. No such requirements apply to communications and data transfer between individual physicians' offices and the HCFA carriers that process their claims. Medicare and other payers plan to increase the electronic transmission of data related to the quality of care and the satisfaction of beneficiaries. Medicare, for example, foresees that its health plan management system will collect plans' quality-related data from the Health Plan Employer Data and Information Set (HEDIS) and the Consumer Assessment of Health Plans Survey (CAHPS) (HCFA, 1999a).
Private-sector organizations have also begun to experiment with Internet applications for financial and administrative transactions. Many such organizations see the Internet as a plausible means of achieving the long-held vision of seamless integration of information across organizations. Health organizations can assume that networking capabilities will be in place so they can concentrate their resources on higher-order functionality. The Internet may also make electronic claims submission practical for small group practices that cannot afford the hardware and staff needed for more conventional electronic data interchange (EDI) systems. The Regence Group of Seattle, Washington, for example, has developed a Web-based interface application called Network Data Express (NDEX) for determining beneficiary eligibility and making referrals. The system features claim status inquiries, provider directories, reference materials (such as the formulary), e-mail, and managed care data and reports. It processes about 20,000 transactions per month, doing the work of two to three full-time employees who would otherwise give the same information out by phone (see Appendix A for additional information on Regence's system).
Other efforts have been initiated at the state and regional level to promote health information exchanges. The Community Health Information Technology Alliance (CHITA) in Seattle, Washington, the Minnesota Health Data Institute, and the Affiliated Health Information Networks of New England, a project of the Massachusetts Health Data Consortium, are three examples.20 Such programs attempt to facilitate information exchange among a variety of organizations, including care providers, insurers, pharmacies and pharmaceutical benefits managers, accrediting organizations, and state health organizations. Considerable effort has been devoted to defining standards for data exchange and determining the types of data that must be exchanged for different transactions.
Security concerns have been a major impediment to greater sharing of information for payment and administration. Many such transactions, especially payment, contain sensitive information related to a particular patient's health, so their confidentiality must be maintained. Similarly strong requirements exist for data integrity. According to one information security officer interviewed as part of this project, the responsibilities of security officers in health care differ from those of their counterparts in other industries: the applicable state and federal laws are different, and the privacy and security concerns are greater. At the same time, the health care industry is driven by economics, not privacy, so there is a need to balance cost-effectiveness with security protections.
CHITA and the Foundation for Health Care Quality worked with the Massachusetts Health Data Consortium and Minnesota Health Data Institute on a three-state project focusing on electronic security. The goals were to determine how electronic security could be implemented affordably and to develop a business case for a community-wide, secure infrastructure for electronic business. The group worked with SAIC to develop a security and risk management plan for business-to-business health information networks. The plan identifies seven levels of increasing health care security, numbered 1 through 7. CHITA is working with Seattle-area health care organizations to implement level 6 security practices (HSL 6) within participating regional organizations.
HSL 6 supports remote access to a data repository but not direct remote access to the internal network of a health organization. It includes specifications for three network-based information services that are deemed essential to financial and administrative transactions: (1) authenticated, secure messaging, (2) authenticated, secure file escrow and transfer, and (3) authenticated, role-based access at the level of individual users. The security model has been developed and published (SAIC, 1998), and CHITA is in the process of identifying an organization that will function as a trusted intermediary to oversee a prototype implementation, followed by a wider pilot project in the region. Issues to be addressed include the identification of a certificate authority, which might be a nonprofit organization, the state or federal government, or a private corporation.
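The third HSL 6 service, authenticated role-based access at the level of individual users, can be sketched in a few lines. The roles, users, and permissions below are invented for illustration; the published SAIC model is far more detailed:

```python
# Minimal role-based access check. Each user is mapped to a role, and each
# role to a set of permitted actions; an action outside the role's set is
# denied, as is any request from an unknown (unauthenticated) user.

ROLE_PERMISSIONS = {
    "claims_clerk": {"submit_claim", "check_claim_status"},
    "eligibility_reviewer": {"check_eligibility"},
    "auditor": {"check_claim_status", "check_eligibility", "read_audit_log"},
}

# Hypothetical users, assumed already authenticated by an upstream service.
USER_ROLES = {"alice": "claims_clerk", "bob": "auditor"}

def authorized(user: str, action: str) -> bool:
    """Permit an action only if it falls within the user's role."""
    role = USER_ROLES.get(user)
    return action in ROLE_PERMISSIONS.get(role, set())

assert authorized("alice", "submit_claim")
assert not authorized("alice", "read_audit_log")  # outside her role
assert not authorized("mallory", "submit_claim")  # unknown user denied
```

The design point is that access decisions hinge on the role, not the individual, so adding a clerk or revoking an auditor is a one-line change to the user table rather than an edit to every permission list.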
HCFA is also a strong proponent of EDI but has prohibited the submission of payment information over the Internet owing to concerns about security and confidentiality. In January 1999, the agency revised its security policies to allow Internet-based transmission of information after it has been received from outside parties (HCFA, 1999a). HCFA does not allow claims transmission over the Internet except by those Medicare contractors participating in an interoperability pilot of the HCFA Internet security policy. The pilot, which began in September 1999 and was scheduled to run through December 1999, tested e-mail, batch, real-time, and Web-based transmissions, using various authentication and encryption technologies, including digital certificates issued through cooperating certificate authorities. Results of the pilot and accompanying recommendations were anticipated in February 2000. Depending on the findings and the cost-benefit analysis associated with the transmission of claims, HCFA will decide whether, and when, to allow Internet transmission on an operational basis.21 Acceptance of the Internet by HCFA could stimulate its wider use for submitting claims because HCFA processes a significant percentage of the nation's health care payments, and its acceptance could signal that such submissions can be handled securely.
Advances in the use of the Internet for financial and administrative transactions will be accelerated by HIPAA and by the Balanced Budget Act of 1997 (P.L. 105-34). HIPAA requires providers, provider organizations, payers, and clearinghouses to adopt uniform transaction standards, code sets, identifiers, and electronic signature standards for electronic transmissions of health care claims. In addition, HIPAA prescribes security standards for the protection of all electronic health information both within and between health care enterprises, and it gave Congress until August 1999 to pass comprehensive health privacy legislation. In the absence of such legislation, the secretary of Health and Human Services is to promulgate regulations to protect the privacy of personal health information (Harman, 1998) (see Chapter 5). Some health care organizations report that they have been slow to implement programs for Internet-based submission of medical claims until the new regulations are in place. They fear that they may need to modify their systems after final regulations are promulgated or that Congress may pass legislation that supersedes them. The Mayo Foundation, for example, uses dial-up connections and leased T1 lines to submit claims electronically to government and commercial payers but will not implement an Internet-based system until security and privacy guidelines have been adopted by the federal government (McCormack, 2000).
A number of obstacles may further delay the widespread use of the Internet for financial and administrative transactions. While many large health care organizations are moving toward electronic billing and have Internet connections, many private practitioners (almost half of all physicians practice independently) lack Internet access and practice management systems for electronic billing. Furthermore, many practice management systems are not interoperable with the Internet, requiring users to download claims information into a separate Internet-based application, a process that adds unnecessary cost and complexity. Organizations large and small that have legacy systems operating for EDI and claims processing may also be slow to replace those systems with Internet-compatible ones, although in the long run they will need to modernize. A lack of standards for electronic claims will continue to impede efforts at Internet-based exchanges, as many practice management systems use different formats for data, and payers often cannot accept data in multiple formats. Standards efforts, including those mandated by HIPAA and those under way in regional collaborations, may ease this concern, but the decentralized nature of the health care industry presents a significant impediment.
Technical Requirements for Financial and Administrative Applications
The bandwidth requirement for most financial and administrative transactions is modest. Most transactions consist of short, text-based messages. In some cases payers may request care providers to transmit large diagnostic images in support of claims for payment, but such images need not be delivered rapidly, so even in these cases, bandwidth is not a significant consideration.
Latency is also not a significant factor in financial and administrative transactions. In some cases, such as checking the terms of a patient's coverage for certain procedures and eligibility for reimbursement, timely responses are desirable. If an organization's servers are undersized and system response times grow long, there could be problems, but those would be problems not with the Internet itself but with individual nodes on the network. For many transactions, such as the submission of claims for payment, latency is not an issue at all, as the payment process tends to be slow. Improvements in technology and processes could eventually allow for near-real-time review of claims and electronic payment for procedures, but even then response times and latency would not be a driving consideration.
The importance of availability in financial and administrative transactions varies with the specific use. The routine transactions of payers require no more than a moderate level of availability; system outages can be compensated for by waiting to send a payment request. However, the preapproval of immediate care for beneficiaries, for example, the approval of pharmacy claims, would demand greater availability, since there is greater need for a rapid response. Care providers and payers are unlikely to use the Internet for such transactions if it is not reliable.
Security of Internet communication of health-related data is a sine qua non of the Internet's greater use for financial and administrative transactions. Many of these transactions contain sensitive, personal information on the types of health care services that were provided to a particular patient or the diagnosis of a condition. Hence, they are almost as sensitive as clinical records. Both care provider organizations and payers also have strong incentives to demand that data integrity be maintained, to ensure that information is not corrupted during transit or when stored in computers attached to the network. As with clinical transactions, providing such security entails both technological mechanisms and confidentiality policies that govern disclosures of information by health organizations. Care providers and payers with established relationships can make use of existing technology for securing information during both transmission and storage, but more advanced technologies for authenticating users would enable transfers of information among a larger number of payers and providers.
Requirements for ubiquity of access would be high if the objective is that all providers, including individual physician offices, routinely submit claims and quality improvement information, eligibility checks, and other information via the Internet. In the short term, claims administrators could use the Internet for communications with institutional providers, reducing the degree of ubiquity required.
Public Health
Public health workers promote health and the quality of life by preventing and controlling the spread of disease, injury, and disability. Public health officials collect statistics on the occurrence of diseases, disseminate guidelines to health care practitioners and the public, fund research on ways to improve public health, and deliver health care to underserved populations. A number of these activities could be enhanced by an Internet that is better attuned to public health needs, that provides sufficient security to protect sensitive medical records, that is accessible to all public health workers and the public at large, and that remains operational even in times of natural or man-made disasters. Public health surveillance, in particular, stands to benefit from Internet-based transactions that assist in collecting data about the health of individuals, personal risk factors, and medical treatments, as well as data about potential sources of disease and injury in the environment and about resources that can be used to take effective action.
In recent years, attention has turned to making certain that public health officials at local, state, and regional levels have adequate connectivity and expertise to use the Internet for their work. Several reports have reflected on the need for new relationships and better collaboration between public health officials and individual health care providers.22 The National Library of Medicine (NLM), in conjunction with several other public health organizations, has initiated a program, Partners in Information Access, designed specifically to help public health officials gain access to the Internet and to relevant health information.23 Since October 1998, 20 awards totaling just under $1 million have been made for programs in 20 states. The goals of this program are fourfold: (1) to increase public health professionals' awareness of the services of the NLM, the Centers for Disease Control and Prevention (CDC), and the National Network of Libraries of Medicine (NN/LM), (2) to assist public health professionals in getting connected to the Internet, (3) to train public health officials in the use of information technology and information services, and (4) to increase awareness of public health information needs and resources among NN/LM members. Individual projects will attempt to provide modems and connections to Internet service providers for public health departments lacking such capabilities; support access to public health information and related biomedical topics via local medical libraries; survey the information needs of public health officials; and train public health officials to use the Internet and specific information resources, such as PubMed and CDC WONDER.
These efforts reflect the growing awareness of the linkages between public health and the care of individuals. Recent changes in the ecology and epidemiology of disease and in the organization of health care delivery systems have led to a convergence of these two components of health care. First, the AIDS pandemic forced many to realize that high socioeconomic status does not confer immunity from epidemic infectious disease. Second, it has become increasingly clear that the major controllable causes of disease involve the traditional interests of public health: smoking; alcohol and drug abuse; injuries; and nutritional problems, including obesity. Finally, the advent of managed care and capitation has made payers responsible for protecting the health of populations. This convergence of public and private health interests represents a historic opportunity to bring public health thinking into the daily practice of medicine, and it makes public health surveillance a more compelling application of the Internet. By some estimates, only about 10 percent of all early deaths in the United States can be prevented by medical intervention; population-based approaches could prevent up to 70 percent of them by targeting underlying risks such as tobacco, drug, and alcohol abuse; diet and sedentary lifestyles; and environmental, occupational, and infectious risk factors (McGinnis and Foege, 1993).
Public Health Surveillance
The public health system in the United States is hierarchically organized around community (city, county, or other local jurisdiction), state, and federal efforts. Each of these jurisdictions is chartered to collect different sorts of data and share them in different ways. The federal public health centers must recognize large-scale trends in the occurrence of disease and allocate resources to minimize the damage to the public health. Community public health offices must process information about individual patients and local outbreaks in order to recognize and respond to the needs of the community. For historical reasons, the three levels of public health monitoring and surveillance have developed very different organizations and communication mechanisms. However, there are fairly well defined communication points where the systems interact with one another. For example, physicians and medical laboratories must report the occurrence of certain conditions to local health departments, depending upon the reporting requirements. Certain conditions must also be reported to state public health offices, which in turn file reports with the federal CDC. Although the sets of data reported to CDC are uniform across the states (and updated regularly), each state and county health department can require that any condition it deems significant must be reported (rural counties, for example, have different interests from urban ones).
An important mechanism for collecting information of great significance to public health, and one that is ripe for the Internet, is automatic reporting by medical laboratories of test results for some communicable diseases, such as tuberculosis. Such systems promise both to improve the reporting of adverse events and to lower the costs of collecting and maintaining such data. Testing laboratories are required to report certain diagnoses to their local health offices so that public health officials may ensure that adequate treatment is delivered and that the spread of disease is contained. Currently, most such reporting is done on paper, with laboratory results being sent by mail or fax to the public health office.24 Officials from the county public health office then follow up with the local physician and/or patient to investigate possible causes of the condition, paths of contagion, and needed interventions. This reporting system is fraught with errors and delays, as reports are transmitted in a range of forms (mail, fax, etc.), following different sets of rules, to different county offices. Not surprisingly, reports are often incomplete or are sent to the wrong county health department because testing laboratories cannot determine in which county the patient resides.
The Internet offers a way to streamline this system, ensuring that reports are sent in a timely manner to the correct local public health office and to the state for analysis. The Washington State Department of Health, for example, has begun to develop its Electronic Laboratory Reporting System to support Internet-based submission and notification of cases of reportable diseases within the state, which total about 100,000 annually. The system uses the Internet to allow testing laboratories to report conditions directly to the Department of Health, which forwards the information to the appropriate local health department. It is intended to hasten the filing of reports, reduce the burden of reporting for laboratories and health agencies, improve the state's ability to track disease outbreaks that cross county lines, and ensure that reports are transmitted to the correct county health office. The system takes advantage of the broad reach of the Internet to establish connectivity among the health departments and private testing laboratories. In preliminary tests with the Group Health Cooperative of Puget Sound, the system improved the rate of reporting of health conditions at both the state and local levels, especially for smaller counties whose paper-based reports were more prone to be lost or misdirected. The time to file a report with the counties improved moderately, to less than one day, while the time to transmit reports to the state improved dramatically, from a mean of 40 days with the paper-based system to just a day with the Internet-based system.25
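The routing step such a system automates, delivering each report to the health department of the county where the patient resides, might look like this in outline. The lookup table, field names, and fallback rule are all hypothetical:

```python
# Sketch of centralized report routing. A state-level system can resolve
# the patient's county, something individual testing laboratories often
# cannot do, which is the failure mode that misdirects paper reports.

COUNTY_BY_ZIP = {"98101": "King", "99201": "Spokane"}  # illustrative lookup

def route_report(report: dict) -> str:
    """Return the destination county health department for a report,
    falling back to the state office when the county cannot be resolved."""
    county = COUNTY_BY_ZIP.get(report.get("patient_zip", ""))
    return county if county else "STATE_OFFICE"

assert route_report(
    {"condition": "tuberculosis", "patient_zip": "98101"}) == "King"
assert route_report({"condition": "tuberculosis"}) == "STATE_OFFICE"
```

Centralizing the lookup is the design point: one maintained table at the state level replaces per-laboratory guesses about county jurisdiction, and the state copy of every report supports cross-county outbreak tracking as a side effect.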
The Internet also offers unprecedented opportunities for planning and resource allocation at the community, state, and federal levels, potentially improving care and reducing costs. Especially in a setting of limited resources, mechanisms for identifying the need for resources and deploying them rapidly to affected populations are of critical importance. Automated systems for tracking the outbreak of diseases both acutely (on the scale of hours) and subacutely (on the scale of days to a week) would allow for dynamic allocation of resources, such as medicine, non-pharmaceutical medical supplies, donated organs, blood products, and even medical personnel, based on needs. Consider an outbreak of illness caused by a pathogenic bacterium contaminating hamburgers sold by a chain restaurant. Early detection of such an outbreak could lead to rapid notification of local and state public health officials so they could begin to track down the source of the infection. At the same time, pharmaceutical suppliers could be notified that an extra supply of certain types of antibiotics or intravenous rehydration fluids would be required in the region. Finally, hospital personnel could be alerted to the fact that these cases were appearing and could be briefed on the signs and symptoms to make them more prepared for emergency room visits related to the outbreak. At the federal level, information about these outbreaks could contribute to decisions on the cost-effectiveness of setting up new regulations, their enforcement, or their propagation and dissemination within the health enterprise. Clearly, application software must be developed to assist decision makers in allocating resources and in identifying and responding to trends in disease, but the Internet would provide the infrastructure necessary to gather the data upon which these decisions will be based.
Integrating Data Sources for Improved Decision Making
By allowing automated queries to disparate databases, the Internet could also help public health officials better integrate the available data to improve data analysis and health monitoring. Currently, a number of political and bureaucratic boundaries impede the use of the Internet for public health purposes. Most importantly, federal and state public health agencies are organized in vertically integrated, disease-specific systems. One rationale for this structure is that vertically integrated data and communications systems best serve the traditional public health functions for a given disease.26 Thus, dozens of systems support individual diseases (such as AIDS) or disease groups (e.g., hospital-acquired, or nosocomial, infections). The result is massive duplication, and a patient's clinical information could reside in several different systems that do not interconnect. The Internet could be a powerful technical tool (and political motivator) to realign these programs and allow better integration of data for monitoring public health. Doing so would require that public health offices and their databases be connected to the Internet and that mechanisms be put in place for protecting the security and confidentiality of data that contain personally identifiable health information.
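The integration described above, querying several disease-specific systems and merging their answers into one view, can be sketched simply. The registry names and their dictionary interface here are hypothetical stand-ins; real surveillance systems would sit behind network APIs with access controls.

```python
# Sketch of a federated query across disease-specific surveillance
# systems. Each registry (names hypothetical) reports a case count for
# a county; the integrator merges them into a single cross-disease view.
def query_all(systems: dict, county: str) -> dict:
    """Collect case counts for one county from every registry."""
    results = {}
    for name, registry in systems.items():
        # In a deployed system this would be a remote, authenticated query.
        results[name] = registry.get(county, 0)
    return results

# Illustrative data for two vertically integrated systems.
aids_registry = {"King": 12, "Pierce": 4}
tb_registry = {"King": 3}
systems = {"AIDS": aids_registry, "TB": tb_registry}
```

A call such as `query_all(systems, "King")` yields one combined picture of a county that today would require consulting each system separately.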
Beyond integrating databases within the public health sector, the Internet offers the opportunity for public health officials to collect data from private sources that might be important in their surveillance efforts. School attendance records and sales of prescription drugs or nonprescription remedies could signal the outbreak of a disease in its early stages, before symptoms reach the level at which people visit a doctor. Indeed, the New York City Department of Public Health arranged to receive such data from one local drugstore chain to improve its surveillance activities, recognizing that abnormal sales of antidiarrheal medicines could indicate a wide-ranging but low-level epidemic of food poisoning or problems with the water supply. Being able to access such information quickly through the Internet could allow health care providers to respond rapidly to disease clusters and reduce the exposure of the population to disease. Much of this information is available today in electronic format, and with proper protections for proprietary and confidential information, it could be made available to public health officials via the Internet.
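The kind of signal the New York City arrangement relies on, a sudden jump in sales of a nonprescription remedy, is a simple anomaly-detection problem. The sketch below is one minimal approach under assumed parameters (window size and threshold are illustrative): a day is flagged when sales exceed the mean of a trailing baseline by more than a few standard deviations.

```python
# Hedged sketch: flagging abnormal daily sales of, say, antidiarrheal
# medicines as an early outbreak signal. A day is anomalous when sales
# exceed the trailing-window mean by more than k standard deviations.
import statistics

def flag_anomalies(daily_sales, window=7, k=3.0):
    flags = []
    for i in range(window, len(daily_sales)):
        baseline = daily_sales[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.pstdev(baseline) or 1.0  # avoid a zero threshold
        flags.append(daily_sales[i] > mean + k * sd)
    return flags
```

Real surveillance systems use more careful baselines (day-of-week and seasonal effects), but the structure, compare today against recent history, is the same.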
Responding to Bioterrorist Attacks
How to detect and respond to a bioterrorist attack (e.g., an intentional release of poisonous gases or tainting of the public water supply) has become a growing concern for the public health community. The use of biological weapons by terrorists, even an individual terrorist acting alone, could inflict life-threatening illnesses on a large scale and, unlike explosions or chemical releases, could easily escape immediate notice. Many biological agents would not produce symptoms in their victims for days, weeks, or longer, and initial reports of illnesses might not appear unusual, delaying recognition of a widespread problem.
In the case of a bioterrorist attack, each phase of the public health process would depend on a capable information infrastructure: recognizing a trend, identifying its cause, formulating a strategy for responding to it, allocating resources for the response, deploying the response, and monitoring its success. Initial clinical reports, which might come from doctors' offices and emergency rooms over a large area, would need to be aggregated at a high enough level for a geographical pattern to emerge and a problem to be detected. Local public health officials in the affected areas would need to confer with one another to plan a coherent response to the attack and allocate resources to address immediate medical needs. Data would need to be provided to public health teams charged with identifying the pathogen and formulating and implementing a response. The ability to keep information from the public in order to avoid panic could also be important, depending on the situation.
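The aggregation step, rolling scattered clinical reports up by area so a geographic cluster becomes visible, is the core of the detection problem. A minimal sketch, with hypothetical area names and an assumed alert threshold:

```python
# Sketch of geographic aggregation: individual reports from many
# emergency rooms are counted by county so that a cluster no single
# site would notice becomes visible at the aggregate level.
from collections import Counter

def aggregate_by_area(reports):
    """reports: iterable of (area, syndrome) tuples."""
    return Counter(area for area, _ in reports)

def clusters(reports, threshold):
    """Areas whose report count reaches the alert threshold."""
    counts = aggregate_by_area(reports)
    return [area for area, n in counts.items() if n >= threshold]
```

In practice the threshold would be set relative to historical baselines for each area, not fixed, but the shape of the computation is the same.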
The CDC found in a 1998 study that most local health departments lacked the capabilities to adequately detect and respond to a report of bioterrorism. Most such departments lacked basic information and communications systems and could not communicate reliably with CDC, state health departments, or emergency response agencies in a crisis. Half lacked Internet access, 20 percent lacked suitable computer capacity, and 70 percent lacked training in the use of electronic information technologies for conventional health purposes (CDC, 1998).
To remedy this problem, CDC is developing a national Health Alert Network that will facilitate the collection of information from testing laboratories, the sharing of information among public health officials, and consultations among them regarding needed responses.27 A total of $28 million was allocated to this task in FY99.28 The network will use desktop personal computers and laptops connected to the Internet with sufficient bandwidth to handle the transfer of laboratory reports, interactive collaboration among public health officials, and multimedia distance training. It will make use of public key encryption for secure communications and authentication. Because of its critical nature and the need for its continuous availability, the network will be designed with sufficient redundancy to provide backup operations in case of a link failure and disaster recovery plans to allow rapid restoration of service in case of other component failures.29 Videoconferencing capabilities are seen as important, for they would allow public health officials to communicate more effectively during a crisis than they could with either text or audio alone. Mechanisms may be needed to accommodate (possibly by diversion) high volumes of traffic in an emergency.
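The sign-on-send, verify-on-receipt exchange that authentication provides can be illustrated compactly. Note the simplification: the Health Alert Network is described as using public key encryption, but Python's standard library ships no public-key primitives, so this sketch substitutes a shared-secret HMAC; the shape of the exchange is the same, and the key is purely illustrative.

```python
# Illustrative only: authenticating a laboratory report in transit.
# A shared-secret HMAC stands in here for the public key signatures a
# real Health Alert Network deployment would use with proper key pairs.
import hmac
import hashlib

SECRET = b"demo-key"  # hypothetical; never hard-code keys in practice

def sign(message: bytes) -> str:
    """Produce an authentication tag to send alongside the report."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Check, in constant time, that the report was not altered."""
    return hmac.compare_digest(sign(message), tag)
```

A receiving health department would reject any report whose tag fails verification, which protects both integrity and authenticity of the data in transit.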
Technical Requirements for Public Health Applications
Use of the Internet for public health surveillance would require technical advances in a number of areas. Of primary interest are ubiquity and security, but availability is also of concern. Other technical parameters, such as bandwidth and latency, are less important in most public health applications, although the desire for videoconferencing in widespread emergencies would increase the need for bandwidth and for low latencies to support real-time, interactive video. Solutions to these technical problems could greatly expand the use of the Internet in support of public health.
In general, the information transmitted for the purposes of public health requires relatively low bandwidth. Public health data rarely involve images or other large data objects, although videoconferencing among public health officials would require higher bandwidth from at least some computers and locations. Of course, there is also the potential for many data objects to be transmitted through the network, raising the bandwidth requirement by virtue of aggregated traffic levels rather than large individual files.
Few of the applications of the Internet in public health are sensitive to small delays (i.e., of seconds to minutes) in the transmission of data, so that latency is less important.
For public health, the availability of the network is of moderate importance. Although short downtimes can normally be tolerated, the minute-to-minute monitoring of outbreaks of acute disease (especially in the case of bioterrorism) would not tolerate extended periods of network failure. If the Internet were to be used for detecting bioterrorist attacks, it would have to be reliable and resistant to hostile attacks (which could accompany a bioterrorist attack). Because data collection and aggregation take place continuously, loss of network connectivity might lead to loss of data and to a failure to respond in a timely fashion.
The security of data on the Internet is of paramount importance to public health applications. Data reported by testing laboratories contain identifying information that is used by public health officials to map diseases and conduct interviews with affected patients. The public health system depends on the public's trust that sensitive health data are being used for the benefit of the public only. Such data must be protected both in transit and while stored in computers in public health offices. Sharing public health information at the community, state, and federal levels requires the development of advanced technologies for intelligently stripping data of identifiers so that personal identities cannot be reconstructed from the data. Although certain local public health functions (treatment and prevention of tuberculosis, for example) require knowledge of the patient and his or her home situation, it becomes less necessary to have identifying information at the state and federal levels, where general trends are of interest. Even with such safeguards, it is critical to have mechanisms for authenticating data and users. Also important is the ability to protect sensitive institutional data and sensitive information relating to bioterrorist attacks. Such protection would require policies to determine who may access which data, as well as technologies to protect the confidentiality of the information and its integrity.
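The identifier-stripping step described above can be sketched as a filter applied before records move from the county level upward. The field names here are hypothetical, and real de-identification is considerably more involved (rare combinations of coarse fields can still re-identify a person), but the basic move, drop direct identifiers and coarsen quasi-identifiers, looks like this:

```python
# Sketch of stripping identifiers before data are shared at the state
# or federal level, where trends rather than individuals are of
# interest. Field names are hypothetical.
DIRECT_IDENTIFIERS = {"name", "ssn", "street_address", "phone"}

def deidentify(record: dict) -> dict:
    # Drop direct identifiers outright.
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Coarsen the date of birth to year only, a simple generalization.
    if "date_of_birth" in out:
        out["birth_year"] = out.pop("date_of_birth")[:4]
    return out
```

The county retains the full record for follow-up, while only the de-identified form travels upward for trend analysis.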
The success of the public health system requires that reporting and surveillance networks have widespread connectivity that includes local (e.g., community) health departments, testing laboratories, and the provider organizations that order the tests. To serve the entire nation in a cost-effective, standardized way, it is critical that the public health information infrastructure extend to every community, state, and federal public health agency. Information gaps would be a great burden on the nation, since they would require the creation of a secondary, mostly redundant mechanism for data collection and dissemination. As in many other areas of information technology, a few unconnected exceptions threaten to make the entire enterprise too expensive. At the same time, the benefits of a ubiquitous network to community, state, and federal agencies would be substantial and would probably improve public health greatly. Public health organizations often run on tight budgets, so the cost of access to networking technologies must be reasonable. In any event, fewer distinct entities would probably need to be connected for public health applications than for consumer health, clinical care, or financial and administrative applications.
Health Professional Education
Despite advances in technology and the Internet, the education of health professionals is practiced much the way it has been for decades. Students of medicine, nursing, pharmacy, and allied health disciplines set out on a course of graduate and postgraduate education, with much of this training occurring in classrooms or lecture halls. The emergence of the Internet and Internet-based technologies has the potential to transform health professional education at all levels. Educational systems that were once teacher-centered and geographically limited can now become learner-centered and unconstrained by geography. If the Internet is to support this transformation, the demands on it will be substantial.
Graduate education is provided by 124 accredited four-year medical schools in the United States in two phases: basic science education and clinical education. Basic science courses, such as anatomy, physiology, and pharmacology, are taught in a traditional lecture format supplemented by reading and hands-on laboratory sessions. Significant challenges exist in providing basic science education, including the large amount of information that needs to be transmitted, the fast pace of change in the information base, and a lack of tools that would allow students to index what they learn and to retrieve it later in their training. In contrast, clinical education uses different methods. Knowledge about the diagnosis, treatment, and care of patients is transmitted mostly using an apprenticeship model, whereby the student learns from taking care of patients under the guidance of more senior clinicians.
Significant efforts have been made over the last decade to make basic science education less didactic and more problem-oriented. These efforts have led to new teaching methods, materials, and courseware, some of which use computers and the Internet. The advent of online textbooks, journals, and interactive courseware shared across institutions could accelerate this trend so that students spend less time reading books and attending lectures and more time researching topics online. Another trend in basic science education is the use of sophisticated simulations to demonstrate anatomical or physiologic concepts. Such simulations are three-dimensional, color representations that can be rotated or otherwise manipulated. The bandwidth requirement for these applications is high, straining local networks, especially local access connections to students' homes. There are other significant barriers to the routine use of computers and the Internet in basic science education. Not all students have computers, and few campuses have network connections that allow them to gain access to the Internet from classrooms, libraries, or other campus facilities. Networking bandwidth and servers within the institution cannot always handle the dozens of students trying to access the same resources at the same time.
The Internet can also reshape clinical education to overcome some limitations of the apprenticeship model. First, supervising clinicians may not themselves be up-to-date on certain issues and thus are not always the best source of information on these issues. Second, the location of clinical education can limit the student's exposure to certain types of patients and diseases. For instance, certain infectious diseases such as tuberculosis and AIDS are seen more frequently in urban hospitals. Students who do their clinical education in a rural setting might not be properly equipped to deal with such patients if they later practice in an urban setting. Structural changes in the health care industry may also serve to limit the diversity of health problems students gain exposure to during their clinical education. The rise of HMOs and specialty clinics makes it far harder for an intern to see a reasonable variety of patients and diseases just by working in a hospital. One effect of the rise of HMOs has been to shift the locus of care away from hospitals and toward local clinics and outpatient facilities. As a result, many of the patients interns see in a hospital setting have already been diagnosed in one of these other facilities, which will tend to limit the interns' experience.
Computer-based tools and the Internet can complement apprenticeship-based clinical education. Perhaps the best examples of such tools are those that allow students and clinicians to search and retrieve the latest medical literature over the Internet and use the evidence retrieved to guide clinical decisions. The process of incorporating knowledge from the medical literature into patient care decisions is referred to as evidence-based practice. Today, students can use the Internet to search MEDLINE, the bibliographic database containing millions of citations to the biomedical literature, to read abstracts of journal articles, and, in some cases, to download electronic versions of the original journal articles. Similarly, new Internet-based systems are emerging that allow search and retrieval of textbooks, drug information, medical news, and patient education material. The impact of this trend on local networks and the Internet could be substantial. Assuming that some 70,000 medical students, 100,000 medical residents, and 150,000 students in allied health sciences (e.g., nursing, dentistry, pharmacy, public health) in the United States will regularly be accessing textbooks, journals, and other educational material from centralized repositories on the Internet, traffic could increase substantially not only on the Internet but on the LANs of medical education institutions.30 Although the bandwidth required to transmit a single item might be low (e.g., 100 kB for an HTML journal article with graphics), the total bandwidth requirements could be much larger owing to the large number of users and the high frequency of usage. This may call for a better understanding of network management and of the trade-offs between increased network capacity and the local caching of data or its replication on other sites to reduce bandwidth needs.
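A back-of-the-envelope calculation shows why small individual transfers still add up. Using the student counts cited above, and an assumed (purely illustrative) rate of five article retrievals per user per day spread over an eight-hour day:

```python
# Rough aggregate-traffic estimate for the student populations cited
# in the text. The retrieval rate and the eight-hour spread are
# assumptions for illustration, not figures from the report.
users = 70_000 + 100_000 + 150_000       # students and residents cited above
articles_per_user_per_day = 5            # assumed, for illustration
article_bytes = 100 * 1024               # ~100 kB per HTML article

daily_bytes = users * articles_per_user_per_day * article_bytes
# Average rate if the load is spread over an 8-hour day, in Mbps:
avg_mbps = daily_bytes * 8 / (8 * 3600) / 1e6
```

Under these assumptions the average load alone is on the order of tens of megabits per second, before accounting for peaks, which is the aggregation effect the passage describes.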
The trend to evidence-based practice will probably continue. Clinical students will increasingly be expected to support their patient care plans with evidence from the medical literature. To do this, they will become even heavier users of online literature retrieval systems such as MEDLINE and electronic journals. Although bandwidth will become an even larger network issue as more users than ever use these resources more frequently, the demands will not generally be as great as those of the real-time video streams required by other applications.
Another trend in clinical education is the move toward community-based education. Students who once trained mainly in academic hospitals will spend more time training in community hospitals and rural clinics. This trend could be accelerated by Internet links between remote areas and academic medical centers. Using such links, students can discuss cases with preceptors (using audio- and videoconferencing), share clinical experiences with fellow students, and download educational material from university and other Web sites. Internet access in all small community and rural health settings would be important for the success of such communication.
The Internet could also allow clinical students to take greater advantage of simulations to learn about diseases and situations they would not otherwise encounter during their training years. These simulations could take the form of interactive, multimedia modules retrieved over the Internet at the time of need. Modules could include high-resolution graphics and images, streaming audio and video, and text. Similar multimedia content would be required for simulations that test student knowledge for purposes such as allowing advancement through the school curriculum and granting a license to practice. Such simulations will require that the Internet and local networks have adequate bandwidth and, for interactive simulations, low latency. Already researchers are working on systems to allow the simulation of surgical techniques. These simulations combine three-dimensional imagery with haptic feedback that recreates the touch and feel of live surgery. Such systems require extremely low latency, on the order of a few hundred milliseconds per round-trip, to prevent users from perceiving an unnatural lag between the time they take an action and the time they sense a response (Table 2.5). They also require the elimination of mismatches between different data sources: visual, audio, and haptic information need to be properly synchronized for a user to properly experience a virtual surgical system.
Once clinicians are in practice, they are essentially on their own to keep their knowledge and skills up-to-date. They do so informally by reading journals and textbooks, by interacting with consultants, and by talking with peers. A formal process, designed to maintain and enhance clinician knowledge and skills, also exists and is referred to as continuing education (CE). CE credits are not a national requirement but are required by some states and subspecialty boards for licensure and board certification. For instance, 28 states require physicians to meet minimum CE requirements for licensure, and 9 specialties require it for board certification (AMA, 1996). The CE requirements vary, but they usually call for completing 150 hours of courses over a three-year period. Three of the eleven states with the largest concentration of physicians have no CE requirements (New York, New Jersey, and Illinois).
Traditional CE consists of a time-based system of credits that are awarded for attending conferences, workshops, or lectures. Typical CE courses are teacher-initiated, use passive educational models such as lecture, and are often sponsored by the health care industry. Systematic reviews of CE interventions have shown that traditional CE (short courses, conferences, and seminars) is largely ineffective in improving knowledge or health care outcomes (Davis et al., 1995). Two newer approaches, academic detailing (targeted visits to physicians by educators such as pharmacists) and computerized reminders, have, on the other hand, been found to have a positive effect on knowledge and outcomes. The general success of interventions such as computerized reminders suggests that knowledge delivered in the context of daily patient care and for the purpose of assisting in problem solving is where CE should focus in the future. If this suggestion is acted on, the Internet and systems that integrate patient data with general medical knowledge will probably play a central role in transforming postgraduate education.
The main trend in postgraduate education will be continuous (as opposed to continuing) education. Instead of being concentrated in a week's worth of off-site conferences, education will be provided using multiple modalities available at different times during daily practice. Although traditional CE classrooms and conferences will still exist, virtual conferences will become more common. Using the Internet, clinicians will be able to choose from libraries of video and audio lectures, interactive courseware, and live discussions among colleagues from around the world. For the educational tools that are not live, clinicians will have great flexibility in where and when they use them. CE credits, once awarded for sitting in lectures, will be awarded based on time spent and information learned using these online resources. To make virtual conferences a reality for all practicing clinicians, the clinicians will need high-speed Internet access from their health care sites and from their homes.
In addition to learning in virtual conferences, clinicians will do much of their learning of new diagnostic and therapeutic measures in the context of daily patient care. This new learning modality will be fueled by two converging trends: (1) the emergence of patient records in electronic form and (2) the availability of medical literature over networks. In this new modality, behind every abnormal test result, unfamiliar diagnosis, or new drug in the electronic medical record will be a link to the best available knowledge on that topic. Instead of having to initiate a search for information when a question arises, the answer will be anticipated and a link to the answer created within the patient record. As they use this up-to-date knowledge at the point of need, clinicians will also be able to fulfill CE requirements, because the time they spend using the resources and the effect the knowledge has on the patient care process will be logged and reported automatically.31 Of course, such capabilities also raise issues of privacy. Will care providers be able to peruse outside information sources and pursue learning opportunities without being monitored? Will the use of such resources be viewed positively (e.g., the provider is trying to expand his or her knowledge) or negatively (e.g., the provider does not understand some new procedure or diagnostic method)? Such issues will need to be addressed in order to ensure acceptance of these technologies (see Chapter 3 for a discussion of technologies to protect online anonymity). In this vision of integrated patient data and knowledge sources, computer networks will play a vital role. Because of the time-critical nature of the knowledge delivery during the patient care process, reliability of the network and information servers will be vital. Because queries posted to knowledge sources will be based on patient characteristics, security of the network will also be important.
Technical Requirements for Health Professional Education
The bandwidth requirements for health professional education are moderately high. Whether large numbers of people frequently use low-bandwidth applications, such as literature searching, or infrequently use high-bandwidth applications, such as teleconferencing or simulations, bandwidth will be important and sometimes a limiting factor. The development of virtual classrooms and interactive surgical simulations could drive bandwidth requirements even higher.
In general, applications to support health professional education do not require instantaneous delivery, and so the latency requirements of the Internet are not great. However, interactive simulations (such as those for teaching surgical techniques) and conferences would suffer from long latencies.
In general, the availability of the network for health professional education is of moderate importance. Many educational activities are not as time-critical as patient care activities and can tolerate low-level data losses or occasional unavailability. However, as the Internet becomes more and more of a tool for education, as students and instructors come to rely on it more as a communication medium, and as education becomes more integrated with patient care activities, the need for availability will increase. As computer-generated reminders and links to external resources become more closely integrated into medical records, availability will become more important.
For the most part, health professional education is based on public domain information, so the security requirements for the network are not great. However, with the interplay between patient data and medical knowledge required to support new modes of education, security will become critical. Tools for protecting anonymity may also become important to the extent that clinicians want to be able to consult online resources anonymously.
For health professional education, the ubiquity of the network is of great importance. Improvements in clinician knowledge and clinical outcomes will partially depend on the extensive deployment of new learning techniques and tools. Without access to the Internet from potential sites of care delivery and from their homes, clinicians practicing in poor or remote areas will not be able to benefit from these new capabilities. As a larger number of medical students do internships in remote locations, the need for access will also increase.
Biomedical Research
Biomedical research attempts to understand the mechanisms underlying human health and disease. It ranges from basic investigations of the molecular details of biological systems to the study of clinical implications of new scientific findings. In basic biology, the work tends to focus on (1) the biological sequencing of DNA and proteins, (2) the three-dimensional structures of anatomical parts and biochemical molecules, and (3) the determination of metabolic pathways. Progress in biomedical research has recently been fueled by an explosion of biological data available for analysis, as evidenced by the growth in the number of DNA bases (chemical units) that have been sequenced, from next to none in 1982 to in excess of 3 billion in 1999. The Internet has been widely accepted within the biomedical community and greatly facilitates the research enterprise by helping integrate disparate databases for improved analysis, allowing linked simulations, and enabling remote control of biomedical research apparatus. Each of these applications poses a range of technological challenges.
The most important reason for the adoption of Internet technologies within the biomedical community has been the development of publicly available databases containing biological information. Many major biological databases are available at no charge on the Web and offer rapid access and query capability (Table 2.6), and research laboratories are beginning to release primary data onto their Web sites so colleagues can use them for reanalysis or testing new hypotheses. Some databases are extremely popular: each day some 600,000 searches are run from 120,000 different addresses against PubMed, a Web-based service hosted by the National Center for Biotechnology Information (NCBI) with abstracts and some full articles from MEDLINE plus additional journals in the life sciences. These figures grew at an annual rate of 50 percent over the last 3 years.32
Such high rates of use create a number of problems for the host sites. The network bandwidth required for any individual database request may be small (a few kilobytes of data per query), but the aggregate effect of this traffic on bandwidth going into the database server can be significant, overwhelming capacity. Because databases such as MEDLINE and GENBANK are intended to serve a multitude of users in a timely fashion, they are designed for individuals to use in an episodic manner; they cannot routinely allow companies or institutions to perform many queries in a short period of time, such as the automated queries of a program searching systematically through the literature as part of some data-mining application. Commercial online databases have similar problems (Box 2.5). For this reason, many users prefer to obtain a local copy of the databases so that they can subject them to high levels of use without monopolizing public resources.33 Doing so also allows companies to use the databases without fear that their searches will be watched by competitors. Knowing the kinds of information that companies are searching for can yield clues about the projects they are working on. Local replication would be unnecessary if trusted security services were available across the Internet that could guarantee that queries and results from a Web site remained anonymous and confidential or if the servers could support individual use as well as heavy automatic use by computer programs employing data-mining techniques.
While increasing the bandwidth into biomedical research databases is one way of alleviating bottlenecks, the rate-limiting factor in some systems is the computational server, not the communications bandwidth. The NCBI, for example, uses a T3 line for connectivity to the Internet, which provides it with 45 Mbps of bandwidth. As of mid-1999, NCBI was utilizing only about one-third of that capacity.34 The computational server has become the bottleneck, because NCBI is receiving more requests to compare large data sets with one another and with data sets provided by users. The volume of requests for large data set comparisons is still small, but NCBI has developed some governors to limit requests from particular sites so that other users can access the system.
Network limitations also pose difficulties for users of biomedical databases. Some algorithms that act upon databases require that every single element of the database be compared with every other element. Thus, if there are 1,000,000 entries in the database, then 1,000,000,000,000 possible comparisons must be computed, requiring very fast computation if solutions are to be found in reasonable amounts of time. In many cases, investigators transfer their data sets to remote supercomputing sites (such as the sites sponsored by the National Science Foundation in San Diego and Illinois) so that the processing will not be slowed by data transfer rates over the Internet. Such remote processing has limitations, especially in providing real-time feedback to the researchers.
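The scale of such all-against-all analyses can be sketched with simple arithmetic. The following illustration (the function names are hypothetical, chosen for this example) shows why a million-entry database implies a trillion comparisons and why even a fast processor needs a long time to complete them:

```python
def all_pairs_comparisons(n_entries: int) -> int:
    """Number of ordered pairwise comparisons in an all-against-all analysis."""
    return n_entries * n_entries

def hours_to_compute(n_entries: int, comparisons_per_second: float) -> float:
    """Wall-clock hours needed at a given comparison throughput."""
    return all_pairs_comparisons(n_entries) / comparisons_per_second / 3600

# A database of 1,000,000 entries implies 10^12 comparisons.
# Even at an assumed 10 million comparisons per second,
# the analysis takes more than a full day of computation.
```

This quadratic growth is why investigators ship their data sets to supercomputing centers rather than attempt such analyses locally.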
A data source commonly requiring extensive computational analysis is the output of high-resolution imaging devices. High-resolution images containing millions of elements cannot be transferred rapidly enough to allow researchers to manipulate them in real time. Thus, accurate visualization requires either that the image be rendered by a local computer with sufficient computational capabilities or that information be transferred efficiently from an image server to a display device on the Internet. Furthermore, the size of the databases makes transfers slow, and replicating them makes it hard to keep them current during a computation.
This process can also be facilitated by improved networking capabilities. Replication of databases would not be necessary if researchers had higher bandwidth networks that could transfer a terabyte (TB, or 10^12 bytes) of data in a few minutes. But doing so requires networking capabilities of tens of gigabits per second (downloading a 1 TB database in 10 minutes demands a network capable of 13 Gbps). Short of such bandwidth, techniques for rapid streaming of data would allow simulations to "pretend" that the data are already local, even though they are being streamed from a remote database. This capability is difficult to achieve routinely today; although possible, it is not generally implemented, and local replication remains the easier option.
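The 13 Gbps figure follows directly from the definition of bandwidth. A minimal sketch of the arithmetic (the function name is illustrative):

```python
def required_gbps(n_bytes: float, seconds: float) -> float:
    """Sustained bandwidth, in gigabits per second, needed to move
    n_bytes within the given time window."""
    return n_bytes * 8 / seconds / 1e9

# Moving a 1 TB (10^12-byte) database in 10 minutes (600 seconds)
# requires roughly 13.3 Gbps of sustained throughput.
```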
Some biomedical investigations require multiple simulations to be run simultaneously. For example, attempts to understand the physiology of vision might require simulations of both macroscopic and microscopic behaviors, including the quantum mechanics of photoreceptors, the molecular dynamics of macromolecules that respond to light, the population dynamics at the cell membrane as it signals the detection of photons, and the neural network of cells that convey these signals to the brain. These models are all highly interactive and need to share data with one another. The output of one model must be fed into another model. The amount of data transferred between simulations may be large or small, but the effects they induce on subsequent levels of simulation can be substantial. Each of these simulations may itself require significant computation.
Technologies that would facilitate distributed simulations include those for creating uniform techniques for accessing disparate data sources (static, preexisting data, as well as dynamically created data) on the Internet. Many important biomedical questions can be answered only by querying multiple databases, extracting subsets of data, and combining them to determine the final answer. A major software innovation that promises to make biomedical researchers more effective and efficient will be the development of intelligent software agents that assist the investigator in understanding what data are available, what they mean, and how to use them to test new hypotheses. As biomedical researchers perform experiments, such technologies could transfer data directly into a database using the Internet, making them available to other collaborating researchers or computational processes simultaneously. Researchers are creating software to monitor the progress of long, complicated experiments and to alert investigators to unanticipated irregularities in the data or the progress of data collection. As technologies are developed for representing the set of interests of a biomedical researcher (an "interest profile"), intelligent agents could scour the Internet for data of interest and relevance to the researcher, based on this profile. These agents could scan newly published biomedical literature, the publicly accessible Web sites of other scientists, and other Internet information resources, bringing the most relevant sources to the attention of the researcher or abstracting and summarizing them. Some early examples of this technology have long been available through various journals and online services that notify users when items matching their personal profiles (based on keywords) are published.
The existence of databases on the Internet enables automated (or semiautomated) data mining to extract new principles from data. Data-mining techniques often use statistical associations between variables to postulate relationships that have not been appreciated previously and then go to the available databases seeking evidence to support or refute the association. For such software agents to be effective and reliable, the databases need to be accessible continuously. Even though each individual software agent might not require very high bandwidth, a network supporting such agents on behalf of millions of individuals will face a large aggregate bandwidth requirement.
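The association-postulating step described above can be illustrated with a toy example. The sketch below (hypothetical variable names and data; real data-mining systems use far more sophisticated statistics) computes Pearson correlations across records and flags strongly correlated pairs as candidate hypotheses to test against other databases:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def candidate_associations(variables: dict, threshold: float = 0.8):
    """Flag variable pairs whose correlation magnitude exceeds the threshold;
    each flagged pair is a hypothesis to check against other databases."""
    names = list(variables)
    return [
        (a, b)
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if abs(pearson(variables[a], variables[b])) >= threshold
    ]
```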
Remote Control of Experimental Apparatus
The Internet provides a means for remotely controlling some of the expensive experimental equipment used in biomedical research, including electron microscopes, DNA sequencing facilities, gene chips for analyzing the expression of nucleic acid or protein sequences, nuclear magnetic resonance spectrometers, and X-ray crystallographic radiation sources.35 In such systems, investigators send samples of interest to device operators, who load the samples and prepare the equipment. The investigators can then run their experiments remotely, specifying the desired magnification, controlling the focus and field of view, and retrieving images as desired. Such systems have proven especially effective in instances (pathology, for example) where the desired information could not be gathered from a set of still images but called for moving the sample and changing the magnification of the microscope (Wolf et al., 1998).
The ability to remotely control experimental equipment offers several benefits. First, it could help make unique or expensive equipment available to a larger number of researchers. Just as networking has opened up the nation's supercomputer resources to the broader research community, so it could open up specialized facilities to the biomedical research community, thereby improving utilization rates. Second, remote access could reduce travel costs associated with experiments. Because sophisticated equipment is scarce, researchers often travel from their home institutions to remote locations to use it, which consumes both time and money. Moreover, the development of appropriate methods for specimen preparation and analysis is often an iterative process that is difficult to complete in a single visit to the laboratory. Remote access to instruments and computation could allow researchers more control over specimen preparation, data collection, and image processing without subjecting them to the time limits of a visit. Long-term studies that require multiple sessions could also be made more practical. Third, the networking of experimental apparatus could allow research results to be more easily
shared among collaborating researchers or displayed to a classroom of students for educational purposes. Most implementations of remotely controlled equipment to date send imagery back to the researcher via a Web site. Any researcher with a password can view the results, and some systems are being developed to allow collaborators to hand off control of the equipment during the course of an experiment.
Simple telemicroscopy systems create images that can be transmitted across the Internet with little difficulty. One system developed in Germany generated full-screen images measuring 1,024 × 768 pixels with 8 bits of gray scale, for a total file size of 786 kB. These images could be transferred uncompressed over a 28.8 kbps modem in less than 4 minutes. Using standard JPEG compression, this same image could be transferred in 20 seconds. In experiments with the system, overall response times were dominated by image compression times rather than by delays across the Internet. Indeed, the researchers in Germany were able to reduce response times to 2.5 to 4 seconds across a local area network; the times did not differ significantly when the microscope was operated through a direct Internet connection from other sites in Europe (Wolf et al., 1998).
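The reported transfer times follow from the image and link parameters. A minimal sketch of the arithmetic (the 10:1 JPEG compression ratio is an assumption consistent with the reported 20-second figure, not a value from the study):

```python
def transfer_seconds(n_bytes: int, link_kbps: float) -> float:
    """Time to move n_bytes over a link of the given speed (kilobits/s)."""
    return n_bytes * 8 / (link_kbps * 1000)

frame_bytes = 1024 * 768 * 1   # 8-bit grayscale: 786,432 bytes (~786 kB)

uncompressed = transfer_seconds(frame_bytes, 28.8)    # ~218 s, under 4 minutes
jpeg = transfer_seconds(frame_bytes // 10, 28.8)      # ~22 s at an assumed 10:1 ratio
```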
Nevertheless, for higher-resolution images the Internet can introduce significant lag times, especially when multiple images must be retrieved. For example, the National Center for Microscopy and Imaging Research (NCMIR) at the University of California at San Diego houses a state-of-the-art 400 kilo-electron-volt (keV) intermediate-high-voltage electron microscope (IVEM) that can be used to create three-dimensional images from multiple two-dimensional images via a technique known as electron tomography.36 The slices required for three-dimensional reconstruction are 1,024 × 1,024 pixels, with 16 bits of precision per pixel, for an image size of 2 MB. A typical data set consists of either 61 or 121 images, depending on experimental requirements (a total of 122 or 242 MB). During peak periods, three to four such tomographic data sets might be acquired in a single day, generating up to about 1 GB of raw data. The intermediate image-processing tasks can easily quadruple that storage requirement, and the final tomographic volumes alone can easily exceed 400 MB. A new, high-resolution camera with an image dimension of 2,560 × 1,960 14-bit pixels has boosted data storage requirements by a factor of nearly five.37
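These data volumes can be reproduced from the stated image dimensions. The sketch below assumes 14-bit pixels are stored in 2 bytes each, which matches the reported "factor of nearly five" (the function name is illustrative):

```python
def image_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Raw image size in megabytes (1 MB = 2**20 bytes)."""
    return width * height * bytes_per_pixel / 2**20

slice_mb = image_mb(1024, 1024, 2)       # 2.0 MB per 16-bit tomographic slice
small_set = 61 * slice_mb                # 122 MB data set
large_set = 121 * slice_mb               # 242 MB data set

# New camera: 2,560 x 1,960 pixels at 14 bits, assumed stored as 2 bytes/pixel
new_slice_mb = image_mb(2560, 1960, 2)   # ~9.6 MB, nearly 5x the old slice size
```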
The Collaboratory for Microscopic Digital Anatomy (CMDA) is building an infrastructure for allowing researchers to use NCMIR's IVEM and other imaging instruments from a remote site for the purposes of investigating their biological specimens and analyzing the three-dimensional structure using tomography.38 Early experiments with remote operation of the NCMIR found that the Internet was too slow to allow visual guidance of the microscope. As a result, researchers were forced to rely on a digital survey of the specimen, consisting of a large mosaic of low-magnification images, to guide the process. Features on this survey calibrate the spatial coordinates for remote image acquisition. Researchers examine the survey with specialized software and issue requests to the microscope to image certain areas, create image mosaics, or collect a series of tilted images for tomographic reconstruction.
Improvements in information infrastructure and the anticipated availability of high-speed networks led researchers involved in CMDA to develop a video-based controller for the IVEM that can run on any Java-enabled Web browser. The video controller displays optical and stage parameters for the microscope, the command being executed, and a live video image of the specimen being examined to allow more natural, interactive control. Researchers can adjust the focus, brightness, stage position, and magnification of the microscope, and they can acquire and view high-resolution images of the specimen. Control can be traded among multiple researchers participating in a session, all of whom can view the images. Users can individually set the size of images transmitted to them and the amount of image compression in order to match the speed of their Internet connections to the frame rate desired. During sessions, video streams are generated for 1 to 4 hours.
In experiments conducted to date, simple commands to the microscope were processed in less than 1 second; automated commands for focus and exposure setting were performed in approximately 30 seconds. For users with conventional network connections, video streams were compressed using JPEG algorithms to create grayscale images varying in size from 3 to 12 kB per frame. With these connections, the system performed at a maximum rate of 8 frames per second (96 kB/sec), but average performance was more often in the range of 3 to 5 frames per second. Higher bandwidth connections can allow the transmission of full-screen digital imagery, which requires approximately 36.5 Mbps of bandwidth before intraframe compression.39 In April 1999, researchers were able to use a combination of the vBNS and other networks to operate the microscope remotely from Osaka, Japan, acquiring and transmitting high-resolution images in as little as 45 seconds.
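The video throughput figures above are the product of frame rate and frame size. A minimal sketch (illustrative function names):

```python
def stream_kb_per_s(frames_per_second: float, frame_kb: float) -> float:
    """Sustained throughput of a video stream in kilobytes per second."""
    return frames_per_second * frame_kb

def stream_mbps(frames_per_second: float, frame_kb: float) -> float:
    """The same throughput expressed in megabits per second."""
    return frames_per_second * frame_kb * 1024 * 8 / 1e6

# 8 frames/s of 12 kB JPEG frames yields the reported 96 kB/s,
# well under 1 Mbps; uncompressed full-screen video is far costlier.
```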
The visually guided system has led to a dramatic improvement in remote use of the microscope. Researchers are now able to scan their specimen, find areas of interest, and capture high-resolution images with ease and great precision. The current network infrastructure is adequate for low-resolution, low-frame-rate video, which affords improved control of the microscope. At times, however, frame rates are still too slow or latencies fluctuate too much (i.e., there is too much jitter) to provide the level of interactivity required for operations such as manual focusing from a distance. Higher speed networks and new transport protocols are needed for high-resolution video at full, constant frame rates. NCMIR is on the vBNS and expects that a growing number of its collaborators will also join that network or other high-speed networks being developed under the Next Generation Internet or Internet 2 initiatives (see Chapter 1). With the higher transmission rates available on these networks, visually guided control may become more feasible. Use of MPEG compression may also allow higher frame rates to be transmitted over more conventional network connections.
Security, availability, and ubiquity of access are of less concern than bandwidth in the remote control of experimental apparatus but are still important. Security is important for ensuring the integrity of data returned to the investigator and, depending on the nature of the experiments, for maintaining the confidentiality of the data once collected. Availability is of interest to the extent that researchers want to ensure that the network is operating at the time they have been assigned for their experiment. Ubiquity of access is of less concern because experimental apparatus will be used by a small number of highly specialized researchers, most of whom have Internet access through their institutions. However, the system would be better for educational purposes if smaller educational institutions could download images or observe ongoing experiments remotely.
Publication on the Internet
Biomedical research depends on first creating a hypothesis about the world, then designing and running an experiment to test this hypothesis, and finally collecting and analyzing data to determine if the hypothesis is supported or refuted. Because hard-copy publication is so expensive, the scientific community has compromised by publishing papers that present the primary data in summary visual form and that describe the methods used to collect the data, as a guarantee that the results are reported accurately. With the growth of the Internet, it now becomes possible to consider publishing all scientific data (in its raw form or after some processing) on the Internet for sharing and analysis by other scientists. This forms the basis of the E-Biomed proposal advanced by Harold Varmus, former director of the National Institutes of Health (NIH), for the NIH to house copies of publications and associated primary data sets for the life sciences (Varmus et al., 1999). The physics community has permitted the submission of primary data sets for many years, and certain types of biological data are being released routinely at the time of manuscript publication (for example, DNA sequencing data (GENBANK) and macromolecular structure data (PDB)). The advent of new experimental
technologies (e.g., gene expression arrays, or "DNA chips," which record the level of expression of a gene product within a cell at a particular moment in time) that produce massive amounts of data makes it attractive to consider large-scale Internet publication of these data sets.
These data sets could be even more useful to the scientific community if they were linked with other data sources to create a grid of related biological information. By having otherwise disconnected data types linked together, computer programs could propose scientific hypotheses based on the data in one set of databases and then test them based on the data in another set of databases. The NCBI has already created a repository of roughly 10 databases that link bibliographic information, data on genetic sequences and structure, and data on human diseases. Other technologies are being developed for the similar linking of data (e.g., SRS, BioKleisli, and KEGG).40
The support of large-scale deposition, storage, and retrieval of primary biomedical data on the Internet calls mainly for availability and security, with moderate emphasis on bandwidth. It is critical that the data be reported accurately on networked resources and that the creators of the data be identified and authenticated; it is also critical that all data be captured and available, in order to avoid losses of valuable scientific data. Latency and ubiquity are less important, since the retrieval of these data is often asynchronous with their collection and is done by specialized researchers.
The success of online publishing of biomedical research findings (both primary data and the conclusions drawn from them), and the much larger audience that such publications may draw, could strain the existing model for scientific peer review. Peer review is essential to ensuring the validity of information published by the scientific community, but too many documents are released for public consumption for them all to be reviewed. Methods will be needed to track which documents have been read, reviewed, and revised by authors in response to critiques and which have not. Many social issues remain to be resolved (e.g., control of publication and dissemination, interaction between peer groups and publishers, and the very definition of peer groups), but technologies are also needed to support the outcome of the social negotiations. For example, methods are needed for (1) providing an enduring stamp of approval for documents on the network so that those that have been reviewed can be identified securely, (2) allowing peer groups to be defined and maintained, (3) searching the Internet to retrieve documents of interest, and (4) validating the authenticity of online documents by, for example, digital watermarking. In this context, it is important to note that the idea of peers can be generalized beyond the current idea that they are a group of scientific investigators from a particular field. Already, other groups have emerged
that may wish to provide a stamp of approval, including disease-specific activist groups, consumer groups, political groups, and others. There is no reason technologies cannot be used by all these groups to label and distinguish documents of interest to their members, using their own criteria.
Collaboration Among Researchers
The Internet could also prove to be a useful medium for enhancing collaboration among biomedical researchers in different locations. The remote control of experimental apparatus is one example of this capability, but others are also possible. For example, envision the following scenario:
Biomedical researchers in three distant cities are interested in the structure and biological function of a new transporter protein whose structure has just been reported in a journal as a result of the Human Genome Project. They believe this newly discovered transporter is expressed in abnormal amounts in a debilitating disease that affects many older individuals within the population. The researchers individually have studied various aspects of the biochemistry associated with this particular disease but think their work could be advanced considerably if they could collaborate with one another. Use of the Internet and specialized network-aware molecular modeling software could enable them to carry on their collaborative research from afar. They could conduct a virtual meeting from their respective offices using the Internet and specialized conferencing and interactive modeling software. Each scientist could display and interactively manipulate three-dimensional molecular models on his or her local workstation as well as the remote workstations of the other collaborators. By using the workstation mouse, one collaborator could, for example, point out the putative binding site on the protein while another suggests a small molecule that he or she thinks might be good at inhibiting the function of this protein. Together, the scientists function as a group and can accomplish much in a short time.
Such scientific collaborations are common and convenient when they take place within an institution, but when the participants are far apart, schedules must be coordinated and travel arranged to a single location. With enhanced Internet services and software, such collaborations could
be performed at a distance as well. If the scenario described above is extended to a larger group, where one of the participants is an instructor and the others are students, the ability of the students to question the instructor interactively (e.g., to use the mouse to point to a portion of the protein in the above example and ask why this portion of the protein doesn't contribute to binding) adds an extremely important quality to the educational experience: the ability to enter into dialog with the instructor.
These kinds of applications would require that the Internet provide sufficient bandwidth to enable real-time multimedia communication among participants. To the extent that participants need to engage in real-time manipulation of biological images, the network would also need to support low latencies. Both distant scientific collaborations and interactive distance learning could benefit substantially from multicast protocols that allow sending network packets to multiple destinations simultaneously and efficiently. In fact, any time multiple recipients are involved, multicast protocols may substantially reduce the impact computer applications have on the network.
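The benefit of multicast can be made concrete with a back-of-envelope comparison; this is an illustration of the bandwidth arithmetic, not an implementation of a multicast protocol:

```python
def sender_mbps(stream_mbps: float, recipients: int, multicast: bool) -> float:
    """Bandwidth leaving the sender: unicast repeats the stream once per
    recipient, while multicast sends a single copy that the network
    replicates toward each destination."""
    return stream_mbps if multicast else stream_mbps * recipients

# A 1.5 Mbps video stream to 20 collaborators costs the sender
# 30 Mbps with unicast but only 1.5 Mbps with multicast.
```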
Another form of remote collaboration is virtual conferences. A critical element of scientific progress is the ability of scientists to gather at conferences to share new ideas, the latest results, and the latest theories. It is widely recognized that in addition to the formal proceedings at such conferences, the conversations that take place in side rooms are often just as critical for ensuring scientific progress. Thus, there would be some advantage in allowing remote participants not only to attend formal presentations but also to make contacts with their colleagues and have private conversations. Whether the cost of enhancing the Internet to provide such capabilities would exceed the benefit is not yet clear. Building in an infrastructure for ubiquitous real-time videoconferencing would be very expensive. Today, a researcher can attend a remote conference using technologies like RealVideo that produce quite good sound and passable video over the Internet, and when this is coupled with a shared whiteboard or shared applications, there is a good approximation to being there, except for the real-time interaction. Latencies across such networks are typically a few seconds, but that should not keep remote participants from listening to the speaker and viewing the slides. Some systems allow questions to be sent by e-mail or an electronic whiteboard, also with some time delay.
Biomedical research is an international enterprise, and language is still a barrier to communication. Although English is recognized as the dominant language for scientific communication, there are still some applications (especially for informal collaboration) where support for multilingual interactions would accelerate progress. Indeed, language
translation capabilities could be of great help in the consumer health and clinical care arenas. One of the main barriers to health care access in this country, as its population becomes increasingly diverse, is the number of non-English-speaking persons encountering an English-only health care system.
Clinical Research

Clinical research involves both clinical trials to establish the efficacy of a drug or a device and the subsequent monitoring of a product's effectiveness in general use (rather than in controlled circumstances) after it has entered widespread usage.41 Additional elements of operations management and organizational policy also have strong clinical research overtones. The Internet can contribute to a number of these activities, most visibly in clinical trials. As computer-based health records become more widely available, health services researchers will likely use them to explore dimensions such as effectiveness and patient satisfaction via the Internet. The Agency for Healthcare Research and Quality, as well as the NIH (and NLM), is likely to become more interested in the potential of the Internet to achieve better quality outcomes and cost management.
Clinical trials are an essential activity in the creation and testing of new drugs and devices for medical diagnosis and therapy. The U.S. Food and Drug Administration requires careful and statistically valid testing by human volunteers before it gives marketing approval. With the mapping of the human genome and the rise of pharmacogenetics, clinical research and clinical trials could become even more prevalent. Knowledge of the availability of clinical trial opportunities, and guidance to conduct them in a timely and accurate fashion, present a significant knowledge distribution and management challenge for which the Internet is a useful infrastructure. Clinical research in human health and disease, such as that supported by the NIH via federal grants and contracts, has similar information management requirements. The Internet provides the capability to enroll patients, validate eligibility, collect data, and disseminate results to and from widely distributed urban and rural sites. Internet-based clinical trials may be extremely important to progress on a number of rare diseases that require large populations of patients in order to make clinical research feasible.
In the area of clinical research and clinical trials of drugs and devices, a growing number of companies and academic centers are using the Internet to recruit volunteer participants. Pursuant to the FDA Modernization Act of 1997, a congressionally mandated national clearinghouse and directory of clinical research studies for serious diseases is being developed as an Internet-accessible resource by the NLM in collaboration
with other federal health and science organizations. There is interesting work at the National Cancer Institute (NCI) on cancer trials using networked information facilities and proposals to mount collaborative national (and international) databases for other clinical trials that might reduce cost or increase effectiveness. Commercial companies are building and making available similar ''one-stop-shopping" information resources for patients interested in participating in clinical studies. Since clinical research requires detailed compliance with complex diagnostic and treatment schedules (called clinical protocols), there are both commercial and academic efforts under way to develop detailed, participant-specific protocol guidelines that can be transmitted from a central data management unit via the Internet to participating clinical investigators. Encounter-specific guidance and secure data capture via wide-area computer networks promise to improve the speed with which clinical trials can be completed, as well as to reduce errors of omission and commission in the conduct of clinical research. Current estimates indicate that each day of delay in introducing a new drug to the marketplace costs pharmaceutical companies $1 million in lost revenues (CyberAtlas, 1999).
Security is an extremely important technological consideration in clinical trials. In addition to concerns about the privacy of patients involved in the trials, there will probably be significant commercial interest in some of the resulting data sets, making security and control of the raw data a serious consideration. Tools will need to be in place to authenticate the source of information, protect the confidentiality of information collected, and protect its integrity. Ubiquity of access is important to the extent that it will allow researchers to draw upon larger population bases for their studies. Depending on the protocol for the trials, access at a physician's office or public kiosk may or may not suffice, and in some situations, access may be needed from the home.
Technical Requirements for Biomedical Research
The bandwidth requirements for many biomedical research applications are high. Teleconferencing and high-resolution, real-time transfer of images (during remote instrument manipulations, for example) have very high requirements for bandwidth. There is also a trend in the research community toward increasing dependence on the Internet for communicating data and scientific models. It is impossible to predict the long-term needs of biomedical research, but it is likely that the needs for bandwidth will increase as researchers invent new methodologies for the large-scale collection of data about entire genomes, organisms, and communities of organisms. These data may be collected at points all over the world at very high rates. Aggregated traffic back to individual research centers could be very high.
In general, biomedical research is not a time-critical enterprise. There are exceptions, of course, such as the use of the Internet to drive biomedical research instruments (as, for instance, in remote telemicroscopy), where feedback is critical for positioning samples or for adjusting the settings of the instruments. Large distributed simulations also require low latency to improve the speed of their calculations.
For biomedical research, the availability of the network is of moderate importance. Research efforts are not often time-critical and can tolerate low-level losses of data or network unavailability. Obviously, long stretches of such poor performance would be unacceptable, but the needs for availability are not as great in this domain as they might be in clinical care or business applications. Nonetheless, as the Internet plays an ever larger role in research (that is, as it becomes the primary means for accessing primary data, publications, and professional colleagues), it is likely that availability will become more important and even mission-critical for the biomedical research enterprise. Most important, reticent adopters of Internet technologies will embrace them fully only if they perceive the Internet to be dependably available.
For the most part, biomedical research deals with public domain information, so the security requirements for the network are not stressed. Since most studies can be done on aggregate data in which no individual patient is identified, issues of privacy are not paramount. If the research deals with patient information (clinical or genomic), however, then security requirements of the Internet jump to the highest levels.
For biomedical research, the ubiquity of the network is not a critical factor. Most major medical centers and research institutions have network connectivity and are motivated to maintain first-class resources to support their investigators, making the issues of universal access less relevant. One exception to this might be an epidemiological study in which data are collected from people over the Internet. In that case, the network would need to be accessible to all patient populations of relevance to the study.
Internet applications promise to improve the quality of, and access to, health care while simultaneously reducing its costs. Realizing these applications requires overcoming a number of technical and nontechnical obstacles. For example, quality of service across the Internet must be improved to provide the bandwidth and latency required for applications such as video consultations and remote surgery. Reliability must be improved to ensure that failures of critical network connections occur only infrequently and impose minimal consequences, especially where human life is at stake. Security capabilities must ensure the confidential transmission of health information across the Internet while vouching for the integrity of the information. Access controls must take into account the different access privileges of different kinds of health care workers. And, for the Internet to achieve its most far-reaching effects, all care providers and patients must have access to it. Additional detail on these needs is provided below. Chapter 3 examines the technical challenges in further detail, while Chapters 4 and 5 provide additional insight into the organizational and policy issues that must be resolved.
High bandwidth is important for a number of health applications, especially those relying on the transmission of real-time video or large medical or biomedical images. Beyond high bandwidth for specific data-intensive applications, there is a need for high aggregate bandwidth to support a high volume of moderately data-intensive applications, such as transfers of large medical records. But bandwidth is not the most important capability for all health care applications. Many consumer health and public health applications, for instance, can currently be supported by the bandwidth available on today's Internet. Bandwidth is particularly important in a number of biomedical research applications, especially in the rendering of three-dimensional images of biomedical structures. It could also be important in professional education, where it would support a virtual reality system for simulated surgeries and other forms of training.
Certain highly specialized health applications, such as remote control of experimental equipment or simulation of surgical procedures for educational purposes, require much lower latency than is available on today's Internet. However, many other health care applications, such as searching for information on the Internet, do not require instantaneous delivery of information and therefore will not be adversely affected even by the latency of today's Internet.
Because health care can be a life-and-death matter, the availability of many Internet applications related to its provision, and of the network across which these applications run, is paramount. If time-critical information is not available for decision making because data have been lost in transfer, then the safety and quality of patient care can be compromised. Although some health care applications might have lower requirements for network reliability, the most demanding applications still require a higher level of availability than most consumer applications. If health care organizations are to use the Internet for important patient care tasks (whether retrieving medical records, accessing decision support tools, or conducting telemedicine sessions), they need to know that the network will be available a large percentage of the time.
Because of the highly personal nature of health information and the detrimental effects inappropriate releases of such information could have on social standing, insurance eligibility, and employment, the level of protection required for some health information is extremely high. Such protection must be afforded by security protocols embedded in the relevant applications and in the computers connected to the Internet, as well as in the network itself. It will be as much a matter of the rules governing appropriate releases of information as it will be of technical security mechanisms, such as encryption. Equally or perhaps more important from a quality-of-care standpoint is the need to protect the integrity of data and software and the availability of critical services.
The continuing trend toward patient empowerment is being fueled by patients' greater access to general and personal health information. The Internet is already playing a large role in improving access to this information, but unfortunately not all Americans are able to benefit. Socioeconomic status and geographic location are still strong determinants of whether a person has access to the Internet. If it is a societal goal to give all persons access to Internet-based health care information and services, then near-ubiquitous access to the Internet will be required.
Use of the Internet in support of health care financial and administrative transactions, public health, professional health education, and biomedical research presents a number of technical challenges that rival those presented by the provision of health care (Table 2.7). Security is of utmost concern in financial and administrative uses, as well as in public health, both of which require access to health records containing patient-specific information. Availability is, in general, of lesser concern than in other health care applications of the Internet, if only because human life is not immediately at stake. Nevertheless, financial and administrative transactions, public health, and biomedical research all require high degrees of system availability, especially public health, where the network would have to continue to function even in the wake of a large-scale disaster. Ubiquity is important in all these applications, although fewer people would need access to the Internet for non-care-related activities than for those directly related to health care.
Beyond these demands for technical capabilities, applications of the Internet in health care financial and administrative transactions, public health, professional education, and biomedical research demand attention to a number of organizational and policy issues. Most importantly, organizations engaged in these health-related activities need to recognize the value of the Internet for their missions. Second, they need to develop standards for information exchange, identifying the data elements of importance and agreeing on a standardized vocabulary for describing data and a standardized format for exchanging data. Third, organizations will need to ensure equitable access to Internet resources. This issue may be of greatest importance in the educational arena, where schools have begun to mandate the purchase of laptops by students but have found that some students lack high-bandwidth connectivity from their homes or off-campus work locations. These issues are explored in greater detail in Chapters 4 and 5 of this report.
Affiliated Health Information Networks of New England. 1999. Leading the Way to Health Information Exchange in the Electronic World. Massachusetts Health Data Consortium, Waltham, Mass., April.
American Medical Association (AMA). 1996. Continuing Medical Education Directory. AMA, Chicago, Ill.
Baker, D.B. 1998. "PCASSO: Providing Secure Internet Access to Patient Information," SAIC Science and Technology Trends II. Science Applications International Corporation, San Diego, Calif.
Biermann, J. Sybil, G.J. Golladay, M.L. Greenfield, and L.H. Baker. 1999. "Evaluation of Cancer Information on the Internet," Cancer 86(3):381-390, August 1.
Boodman, Sandra G. 1999. "Medical Web Sites Can Steer You Wrong," Washington Post, August 10, Health Section, p. 7.
Burton, Thomas M. 2000. "Medtronic to Join Microsoft, IBM in Patient-Monitoring Venture," Wall Street Journal, January 24, p. B12.
Carns, Ann. 1999. "www.doctorsmedicinesdiseasesgalore.com: Today's Cybercraze Is Any Web Site Devoted to Health or Maladies," Wall Street Journal, June 10, p. B1.
Centers for Disease Control and Prevention (CDC). 1998. Strengthening Community Health Protection Through Technology and Training: The Health Alert Network. CDC, Atlanta, Ga.
Chand, G., B.C. Breton, N.H.M. Caldwell, and D.M. Holburn. 1997. "World Wide Web-Controlled Scanning Electron Microscope," Scanning 19:292-296.
Computer Science and Telecommunications Board (CSTB), National Research Council. 1997. For the Record: Protecting Electronic Health Information. National Academy Press, Washington, D.C.
CyberAtlas. 1999. "Online Healthcare Market Looks Energized." Available online at <http://cyberatlas.internet.com/big-picture/demographics/article/0,1323,6061_153701,00.html>.
Davis, D.A., M.A. Thomson, A.D. Oxman, and R.B. Haynes. 1995. "Changing Physician Performance: A Systematic Review of the Effect of Continuing Medical Education Strategies," Journal of the American Medical Association 274(September 6):700-705.
Dolin, R.H., W. Rishel, P.V. Biron, J. Spinosa, and J.E. Mattison. 1998. "SGML and XML as Interchange Formats for HL7 Messages," pp. 720-724 in Proceedings of the AMIA Symposium, Bethesda, Md.
Fridsma, D.B., P. Ford, and R. Altman. 1994. "A Survey of Patient Access to Electronic Mail: Attitudes, Barriers, and Opportunities," paper presented at the Eighteenth Annual Symposium on Computer Applications in Medical Care, Washington, D.C., October 15-19. See <http://smi-web.stanford.edu/pubs/SMI_Abstracts/SMI-94-0524.html>.
Goedert, Joseph. 1999. "Electronic Claims Growth Sputters," Health Data Management (September):84-86.
Harman, J. 1998. "Topics for Our Times: New Health Care Data, New Horizons for Public Health," American Journal of Public Health 88:1019-1021.
Health Care Financing Administration (HCFA). 1999a. HCFA Information System Security Bulletin Handbook, Bulletin 98-01, Baltimore, Md., January.
Health Care Financing Administration (HCFA). 1999b. "Telecommunications Requirements: Migration of Medicare Managed Care Organizations (MCO) to the Medicare Data Communications Network and the Replacement of the RLINK Software," Operational Policy Letter No. 92 (OPL99.092), U.S. Department of Health and Human Services, May 6. Available online at <http://www.hcfa.gov/medicare/op1092.htm>.
Hripcsak, G., P.D. Clayton, T.A. Pryor, P. Haug, O.B. Wigertz, and J. Van der Lei. 1990. "The Arden Syntax for Medical Logic Modules," pp. 200-204 in Proceedings of the Symposium on Computer Applications in Medical Care, R.A. Miller, ed. IEEE Computer Society Press, Los Alamitos, Calif.
Huang, H.K. 1996. "Teleradiology Technologies and Some Service Models," Computerized Medical Imaging and Graphics 20(2):59-68.
Huang, H.K. 1999. PACS: Basic Principles and Applications. Wiley-Liss, New York.
Institute of Medicine (IOM), Committee on the Quality of Health Care in America. 1999. To Err Is Human, Linda Kohn, Janet Corrigan, and Molla Donaldson, eds. National Academy Press, Washington, D.C.
Kohane, I.S., P. Greenspun, J. Fackler, C. Cimino, and P. Szolovits. 1996. "Building National Electronic Medical Record Systems via the World Wide Web," Journal of the American Medical Informatics Association 3(3):191-207.
Lasker, R.D. 1998. "Challenges to Accessing Useful Information in Health Policy and Public Health: An Introduction to a National Forum Held at the New York Academy of Medicine," Journal of Urban Health: Bulletin of the New York Academy of Medicine 75(4):779-784.
Lou, S.L., Edward A. Sickles, H.K. Huang, David Hoogstrate, Fei Cao, Jun Wang, and Mohammad Jahangiri. 1997. "Full-field Direct Digital Telemammography: Technical Components, Study Protocols, and Preliminary Results," IEEE Transactions on Information Technology in Biomedicine 1(4):270-278.
Mandl, Kenneth D., Isaac Kohane, and Allan M. Brandt. 1998. "Electronic Patient-Physician Communication: Problems and Promise," Annals of Internal Medicine 129:495-500.
McCormack, John. 2000. "Group Practices Find Their Way to the Internet," Health Data Management 8(1):46-53.
McGinnis, J.M., and W.H. Foege. 1993. "Actual Causes of Death in the United States," Journal of the American Medical Association 270:2207-2212.
Nash, Sharon. 1999. "The Doctor Is Online," PC Magazine Online, July 14.
Resnick, Paul. 1997. "Filtering Information on the Internet," Scientific American (March):106-108.
Reuters News Service. 1999. "Internet Could Organize Medical Records," July 27.
Rind, D.M., I.S. Kohane, P. Szolovits, C. Safran, H.C. Chueh, and G.O. Barnett. 1997. "Maintaining the Confidentiality of Medical Records Shared over the Internet and World Wide Web," Annals of Internal Medicine 127(2):138-141.
Rybowski, Lise, and Richard Rubin. 1998. Building an Infrastructure for Community Health Information: Lessons from the Frontier. Foundation for Health Care Quality, Seattle.
Science Applications International Corporation (SAIC). 1998. Security and Risk Management for Business-to-Business Health Information Networks, Final Report, Three State Health Information Planning Project. SAIC, San Diego, Calif., June.
Science Panel on Interactive Communication and Health (SCIPICH). 1999. Wired for Health and Well-Being: The Emergence of Interactive Health Communication, Thomas R. Eng and David H. Gustafson, eds. Office of Disease Prevention and Health Promotion, U.S. Department of Health and Human Services, Washington, D.C., April. Available online at <http://www.scipich.org>.
USA Today. 1998. "Health-Related Activities Conducted Online," July 10.
U.S. Department of Health and Human Services. 1998. Healthy People 2010 Objectives. Draft for public comment, September 15, U.S. Department of Health and Human Services, Washington, D.C. Available online at <http://web.health.gov/healthypeople>.
U.S. Public Health Service, Public Health Data Policy Coordinating Committee. 1995. Making a Powerful Connection: The Health of the Public and the National Information Infrastructure. July 6. Available online at <http://www.nlm.nih.gov/pubs/staffpubs/lo/makingpd.html>.
Varmus, Harold, David Lipman, and Pat Brown. 1999. "E-BIOMED: A Proposal for Electronic Publications in the Biomedical Sciences," memorandum dated May 5. Available online at <http://www.nih.gov/welcome/director/pubmedcentral/ebiomedarch.htm>.
Wolf, Guenter, Detlev Petersen, Manfred Dietel, and Iver Petersen. 1998. "Telemicroscopy via the Internet," Nature 391(February 5):613-614.
World Wide Web Consortium. 1998. "Extensible Markup Language (XML) 1.0. W3C Recommendation," Report No. REC-xml-19980210, February.
1. A search using AltaVista on July 29, 1999, returned 40,156 Web pages in response to the query "diabetes mellitus."
2. For an example of the criteria according to which health-related Web sites can be evaluated, see <http://hitiweb.mitretek.org/iq/onlycriteria.html>.
3. Information on PICS is available online at <http://www.w3.org/PICS/>. See also Resnick (1997).
4. For example, a company named PersonalMD.com had stored the health records of 10,000 subscribers online free of charge as of July 1999. The company sends consumers a card with a personal access code that allows them to retrieve their records over the Internet or by a fax-back system (Reuters News Service, 1999). Another group, the Medical Registry, charges $100 to retain medical information online, allowing customers to update it as often as they wish.
5. The Medical Registry, which was started by emergency room physicians, allows doctors to access a patient's record during an emergency by entering their Drug Enforcement Administration (DEA) number. Patients are issued a wallet card and alert bracelet containing the address of the Web site, the patient's password, and the phone number of a fax-back service that can access and download the patient's records.
6. For more information on PCASSO, see Baker (1998).
7. The National Heart Attack Alert Program is a federal effort that may lead to improved techniques for remotely monitoring patients. The program has the overall goals of, first, reducing morbidity and mortality from acute myocardial infarctions (heart attacks) through rapid identification and treatment and, second, heightening the potential for an improved quality of life for patients and family members. Remote monitoring and collection of patient vital signs is seen as one possible avenue for early detection of heart attacks and for getting patients into the health care system quickly. Information about the program is available online at <http://www.nhlbi.nih.gov/about/nhaap/nhaap_pd.htm>.
8. Data from Michael Kiensle, associate dean for Clinical Affairs and BioMedical Communications, University of Iowa College of Medicine, personal communication, July 12, 1999.
9. In-home monitoring with a video link offers benefits to patients, but not for diagnostic reasons. As one reviewer of an early draft of this report noted, the patient needs to see the care provider to address the problem of noncompliance, which often results when patients misunderstand instructions and take medications at the wrong time, in the wrong dosage, and so on. The way to improve compliance is to ensure that the care provider captures the attention of the patient while delivering instructions. Video can help ensure this happens.
10. At present, teleconsultations conducted across networks that use the Internet Protocol (IP) require approximately twice the bandwidth of traditional point-to-point networks. The reasons are twofold: (1) Internet protocols impose some additional overhead functions that require bandwidth, and (2) the devices used to encode video streams into IP packets (coder/decoders, or codecs) are much less efficient than their non-IP counterparts. But IP codecs are less expensive, in part because they carry less hardware compression, and next-generation IP codecs are expected to provide better performance and impose less of a penalty on IP-based systems.
11. East Carolina University recently received a grant from the National Library of Medicine to investigate these requirements.
12. Pending further study of the medical efficacy of higher bandwidth for teleconsultations, an upper limit on bandwidth for video consultations can be estimated by considering the need for broadcast-quality video. A video display with 640 × 480 pixels that is refreshed 30 times per second and has 24-bit color demands 221 Mbps. With standard compression technologies, such as that of the Moving Picture Experts Group (MPEG), reductions of 90 to 1 are common, resulting in a need for 2.5 Mbps. Improved coding may lower this figure further. For transmission quality equal to high-definition television, which is just entering consumer production, 19 Mbps would be required. These figures represent the maximum bandwidth that remote video consultations could be expected to use, but, as the evidence collected by ECU and other practitioners indicates, much less bandwidth is sufficient in many applications.
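The arithmetic behind the 221 Mbps and 2.5 Mbps figures in the preceding footnote can be reproduced with a short calculation; the sketch below simply multiplies out the resolution, color depth, and frame rate quoted there and applies the footnote's representative 90:1 compression ratio (the ratio is illustrative, not a fixed property of MPEG coding).

```python
# Back-of-the-envelope check of the video bandwidth estimate:
# 640 x 480 pixels, 24-bit color, refreshed 30 times per second.
width, height = 640, 480
bits_per_pixel = 24
frames_per_second = 30

# Uncompressed bit rate in bits per second.
uncompressed_bps = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed: {uncompressed_bps / 1e6:.0f} Mbps")  # -> 221 Mbps

# Applying the roughly 90:1 reduction cited for standard compression
# technologies yields the footnote's 2.5 Mbps estimate.
compressed_bps = uncompressed_bps / 90
print(f"Compressed (90:1): {compressed_bps / 1e6:.1f} Mbps")  # -> 2.5 Mbps
```

The same approach scales to other display parameters: doubling the frame rate or color depth doubles the uncompressed rate, which is why modest reductions in resolution or refresh rate translate directly into lower bandwidth requirements.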
13. Information on the National Laboratory for the Study of Rural Telemedicine at the University of Iowa is available online at <http://telemed.medicine.uiowa.edu/index.html>.
14. Anthony Chou, University of California at San Francisco, presentation to the committee, December 16, 1998.
15. As described later in this chapter, attempts are being made to make these specialized instruments available to a larger number of researchers through the Internet.
16. Stentor, Inc., has developed a system that can provide high-resolution images over lower-bandwidth networks by providing only portions of the overall image at any one time.
17. In addition to the lack of standardization of medical data models, there has been no widespread adoption of portable decision-support tools, despite the efforts of many in projects such as the development of the Arden syntax (see Hripcsak et al., 1990). The absence of sound, widely accepted automated decision-support tools that are integrated with each other and with Internet health transactions will undermine the capabilities of such tools to achieve the desired goal of medical error reduction. For example, if one set of Internet transactions attempts to optimize medication orders and another set attempts to optimize the ordering of procedures, several possibly dangerous and/or expensive interactions between the two might occur. In a tightly integrated system, as compared to disparate and separate Internet-based systems, such interactions might be minimized. This situation suggests that a near-term challenge will be to ensure quality control and coordination among the many different Internet-borne clinical transactions and to develop robust medical decision-support tools that can serve a wide range of institutions and patient populations.
18. In a survey of 153 chief information officers conducted by the College of Health Information Management Executives in 1998, 80 percent said they use HL7 and 13.5 percent planned to implement it in the future.
19. All claims data in this paragraph derive from research conducted for Faulkner & Gray's 2000 Health Data Directory, as cited in Goedert (1999).
20. For additional information on these efforts, see Rybowski and Rubin (1998) and Affiliated Health Information Networks of New England (1999).
22. For example, the U.S. Public Health Service released a report in 1995 describing the potential applications of the Internet in public health and identifying technical challenges to be addressed (U.S. Public Health Service, 1995). In 1997, the New York Academy of Medicine and the National Library of Medicine cosponsored a symposium on public health informatics that called for improved structures and assessment mechanisms for public health information (Lasker, 1998). Slide presentations of several symposium speakers are available at <http://www.nlm.nih.gov/nichsr/nyam/nyam.html>. The Department of Health and Human Services' document Healthy People 2010 (U.S. Department of Health and Human Services, 1998) includes a section on objectives for improving the public health infrastructure. They include widespread access to the Internet and real-time, on-site access to public health data for public health workers and individuals. Section 14, objectives 5 and 6, is the most relevant example.
23. Participating organizations include the National Network of Libraries of Medicine, the Centers for Disease Control and Prevention, the Health Resources and Services Administration, the Association of State and Territorial Health Officials, and the National Association of County and City Health Officials.
24. Reports from physicians' offices and hospitals also tend to be reported on paper.
25. Jac Davies, Washington State Department of Health, presentation to the study committee, February 11, 1999, Seattle, Washington.
26. The traditional public health functions are surveillance, case identification, treatment, prevention, research, guidelines, education, and feedback.
27. President Clinton's proposal for this program would also create a network of regional labs to provide rapid analysis and identification of select biological agents.
28. The Health Alert Network is part of a larger antibioterrorism effort that received $158 million in FY99. Another $72 million was proposed for FY2000, which would raise the total to $230 million.
29. This information is derived from "Health Alert Network Architectural Standards," supplement to the Centers for Disease Control and Prevention Program Announcement No. 99051.
30. The Association of American Medical Colleges reports that total enrollment in full-time undergraduate medical programs in the United States was 66,900 in the 1997-1998 academic year. There were 99,099 residents being trained in clinical settings (primarily teaching hospitals). According to the quinquennial survey, approximately 242,000 students were enrolled in all health sciences programs during the 1996-1997 academic year.
31. The SHINE project at Stanford Medical Center is experimenting with providing CME credit to physicians who request point-of-care information during patient interactions. Information on this program is available online at <http://shine.stanford.edu>.
32. These figures were provided by Dennis Benson at the National Library of Medicine in a personal communication dated February 11, 2000.
33. There have been laboratories whose access to NCBI/PubMed was suspended temporarily when usage rates climbed too high. One lab at Stanford lost access after a graduate student wrote programs that were downloading 3,000 abstracts per minute from the Web site. The scientific goals of this student were meritorious, but the resource was not built to sustain this use (Russ Altman, Stanford University, personal communication, December 22, 1999).
34. James Ostell, National Center for Biotechnology Information, presentation to the study committee on March 1, 1999, Washington, D.C.
35. Researchers at the University of Cambridge, the University of California at San Diego (see Box 2.4), and the University Hospital Charité in Berlin have all developed Internet-based systems for controlling experimental apparatus (Chand et al., 1997).
36. Electron tomography is a technique whereby three-dimensional structure is derived from a series of two-dimensional projections using advanced image processing steps. In the most common form, the specimen is tilted around a single axis and imaged at regular intervals. The IVEM at NCMIR is one of a few such instruments in the United States made available to the biological research community. Support for NCMIR is provided by the National Center for Research Resources (NCRR) of the National Institutes of Health (NIH).
37. This information is taken from a paper entitled "NCMIR's Collaboratory for Microscopic Digital Anatomy: A National Science Foundation National Challenge Project," which is available online at <www-ncmir.ucsd.edu/CMDA/>.
38. CMDA has already been used by researchers at Montana State University to collect data on synaptic organization in the sensory ganglia of the insect nervous system and by scientists at the University of Oregon studying neurotransmission (synaptic vesicle release) in vestibular hair cell synapses. Other users are studying the abnormalities in nerve cells in Alzheimer's disease, the structural relationships of protein molecules responding to calcium within nerve cells, and the three-dimensional pattern of branching of the dendrites in neurons that create a highly linked network of cellular communication.
39. In the longer term, it is hoped that digital video standards will give good resolution and smooth motion at 30 frames per second at much lower bandwidth.
40. More information on SRS is available at <http://srs.ebi.ac.uk:5000/>. Information on Biokleisli is available at <http://smi-web.stanford.edu/projects/helix/mis214/bdkowvldb95.pdf> (a paper). Information on KEGG is available at <http://www.genome.ad.jp/dbget/dbget.links.html>.
41. Clinical research lies at the juncture of clinical care, biomedical research, and public health but is somewhat distinct from each of these topics. It is described in the section on biomedical research in this report for reasons of editorial convenience and exposition.