Site Visit Summaries
As part of its data-gathering activities, the Committee on Enhancing the Internet for Health Applications visited eight sites that were either developing health-related applications of the Internet or engaged in health-related activities that could be transferred to the Internet in the future. These visits, conducted between December 1998 and February 1999, spanned half a day to one full day each. They provided committee members with a snapshot of the state of deployment of the Internet in the health community at that point in time and an opportunity to better understand the technical and other challenges associated with use of the Internet in support of health care.
This appendix briefly summarizes the committee's eight site visits, including four in California, two in North Carolina, and two in Washington State. The sites were the Laboratory for Radiological Informatics at the University of California at San Francisco (UCSF); Kaiser-Permanente of Northern California, Oakland; Stanford Center for Professional Development, Stanford University; the National Aeronautics and Space Administration (NASA) Ames Research Center, Mountain View, California; the Center for Health Sciences Communication at East Carolina University (ECU), Greenville, North Carolina; the University of North Carolina (UNC), Chapel Hill; and the University of Washington (UW) and Regence BlueShield in Seattle.
Laboratory for Radiological Informatics
Members of the study committee spent half a day at the UCSF Laboratory for Radiological Informatics (LRI) on December 16, 1998. H.K. (Bernie) Huang and his colleagues at LRI demonstrated systems designed to share three different types of telemedical images: digital mammograms; cardiograms; and neurological images made by magnetic resonance imaging (MRI) and computerized tomography (CT).1 For the most part, these systems make use of a picture archiving and communications system (PACS) at UCSF that stores digital medical images in several terabytes of optical disk storage and an asynchronous transfer mode (ATM) wide-area network (WAN) that connects UCSF with nearby Mount Zion Hospital. Originally, the laboratory used a dedicated WAN, but then the university installed a synchronous optical network (SONET) ring, which now supports the network. A T1 connection, which supports data rates of 1.544 megabits per second (Mbps), links UCSF with Stanford University Medical Center, an hour's drive away.
Steve Frankel of the Breast Imaging Section at UCSF and Andrew Lou of LRI provided an overview of the teleimaging system. UCSF now has two full-field digital telemammography systems that produce images of 40 to 60 megabytes (MB) compressed (using an acceptable lossless compression scheme). A typical study generates four such images, two of each breast, but also requires a historical set of equal size for comparison. Images can be transmitted across the WAN for remote interpretation and diagnosis or real-time reading by expert mammographers. In the demonstration, staff physicians at UCSF sent images to Mount Zion for interpretation. Physicians at both UCSF and Mount Zion used high-resolution (2,000 × 2,000 pixel) monitors to view the images. Using custom software developed at UCSF, the referring and consulting physicians could use an on-screen dual-pointer to identify objects of interest and change the brightness and contrast of the images to aid in interpretation. An electronic magnifying glass enabled mammographers to examine portions of the image in greater detail.
Expert reading of mammograms in real time is not necessarily needed for regular screenings, but it can be useful if potential abnormalities are discovered. In such cases, remote experts can provide faster diagnoses or request additional images before the patient leaves the mammography center. Busy centers may examine 80 to 100 women per day (20 per day is more typical for an average center), with the expectation that images will be read the following day. Longer delays are not uncommon; many mobile mammogram centers have 2-week turnaround times.
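The volumes quoted above translate into substantial data-transfer loads. A back-of-envelope sketch, in which the 50 MB image size and 90 studies per day are assumed midpoints of the quoted ranges rather than figures reported during the visit:

```python
# Rough data-volume estimate for a busy telemammography center,
# based on the figures quoted in the text. The 50 MB per image and
# 90 studies per day are assumed midpoints, not measured values.
T1_MBPS = 1.544  # T1 line rate in megabits per second

def study_size_mb(images=8, image_mb=50):
    # A study: four new images plus a historical set of equal size.
    return images * image_mb

def t1_transfer_minutes(size_mb):
    # Idealized transfer time over a T1 line, ignoring protocol overhead.
    return size_mb * 8 / T1_MBPS / 60

per_study = study_size_mb()          # 400 MB per patient study
per_day_gb = 90 * per_study / 1024   # roughly 35 GB on a busy day
print(f"{per_study} MB per study, ~{t1_transfer_minutes(per_study):.0f} min over T1")
```

Even under these ideal assumptions, a single study would occupy the T1 link for over half an hour, which helps explain why images are read the following day rather than in real time.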
Telemammography is viewed as a means of supporting increased demand for mammograms. The National Cancer Institute (NCI) recommendation that all women over age 40 have an annual mammogram could, if widely heeded, greatly increase the rate of mammography, UCSF system developers noted. They also said the nation has too few expert mammographers to handle the increased volume and, moreover, that many rural areas have no local experts. To provide teleradiology interpretation services for other health care organizations, UCSF has established a consortium with Emory University in Atlanta; Wake Forest University in Winston-Salem, North Carolina; Brigham and Women's Hospital in Boston; and the University of Pennsylvania. The consortium, named Telequest, accepts images over a dedicated T1 line and provides transcribed diagnoses. Participating physicians must be cross-licensed in the examination sites.
Tony Chou, director of the catheterization labs, described a cardiology teleconferencing system linking UCSF and Stanford University. This application, which runs on a T1 line, is used as an educational tool to enable physicians to present cases to colleagues for postmortem reviews. It is not yet used for diagnostic purposes but it could be, once the institutions install digitized catheterization labs. The system has been used to review angiography and intravascular ultrasound images, which are currently captured as analog video and digitized. Digitized angiogram files are roughly 60 MB in size; intravascular ultrasounds are about 50 MB. The size of these files is expected to grow by a factor of 10 once the fully digital systems are installed. Videos are displayed at a rate of 25 frames per second.
In applications tested to date, digitized videos of angiograms and intravascular ultrasound have been transmitted across the T1 network and replicated on both sides of the connection, a process that takes approximately 5 minutes. In one case, images were transferred to a medical center in Germany overnight. Participants in the teleconferences can examine images simultaneously, using on-screen pointers to note items of interest and zooming in on particular portions of the video. Physicians report that the quality of the video is already sufficient for most diagnoses, but the system has been used only for reviewing outcomes and medical decision making.
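The 5-minute figure is consistent with the file sizes and line rate quoted above. A quick arithmetic check (ignoring protocol overhead):

```python
# Sanity check on the quoted transfer times over a T1 line (1.544 Mbps).
def t1_seconds(size_mb, line_mbps=1.544):
    # megabytes -> megabits, divided by the line rate in Mbps.
    return size_mb * 8 / line_mbps

angiogram_min = t1_seconds(60) / 60    # ~5.2 min, matching "approximately 5 minutes"
ultrasound_min = t1_seconds(50) / 60   # ~4.3 min
# With the expected 10x growth in file size once fully digital
# systems are installed, the same study would take nearly an hour.
future_min = t1_seconds(600) / 60      # ~52 min
print(f"{angiogram_min:.1f} {ultrasound_min:.1f} {future_min:.0f}")
```

The last line suggests why the anticipated tenfold growth in file sizes matters: the same T1 link that handles a review session today would become a serious bottleneck for fully digital catheterization labs.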
Bill Dillon, chief of the Neuroimaging Department, provided an overview of the neuroimaging teleconsultation system linking UCSF and Mount Zion. The system is used primarily for MRI and CT scans, which usually are digitally captured and stored in the PACS. Before the PACS was established, some 15 to 20 percent of films were lost and hundreds went unread, meaning that the hospital could not bill for the service. The PACS allows immediate access to images and was accepted rapidly by physicians, who found that the system greatly increased the ease of locating needed studies before meeting with a patient.
The WAN has enabled UCSF to offer image interpretation services. In fact, UCSF now has a resident on call to read neuroimages taken at Mount Zion. Such centralization of interpretation skills helps accommodate the interests of the state of California, which is pressuring California hospitals and medical schools to reduce the number of specialists and trainees in specialty areas. Similar pressures are being felt at the national level as health care providers merge and attempt to consolidate services and eliminate duplicative capabilities across large health care delivery systems.
Kaiser-Permanente of Northern California
The study committee visited Kaiser-Permanente of Northern California on the afternoon of December 16, 1998. Kaiser-Permanente is the nation's largest health maintenance organization (HMO), with more than 9.4 million members in 20 states. It has approximately 100,000 employees, including 10,000 physicians and 30,000 nurses and pharmacists, and it operates some 30 hospitals with more than 8,000 beds. The HMO is not a single entity but, rather, an affiliation of two separate organizations: the Kaiser Foundation Health Plan, which handles administrative and management functions, and Permanente Medical Group, which consists of 12 separate groups of practitioners. Kaiser-Permanente has experimented with network-based telemedicine applications, particularly in teledermatology and teleradiology. Its most notable success was with a system for retinal screening of diabetic patients, which began when a clinic bought a camera and began sending images electronically to an expert reader for interpretation. This simple procedure increased initial screening rates from 30 to 93 percent of all patients who met the criteria in the risk guidelines.
Kaiser-Permanente staff began evaluating different Internet applications at the request of its chief executive officer. The result was a unified, three-pronged strategy consisting of a provider-oriented system; a customer-focused system; and a common, shared database. Development of the provider system and shared database, the Permanente Knowledge Connection (PKC), has proceeded under the auspices of Kaiser-Permanente's Care Management Institute. Development of the consumer component, the KPOnline system, has proceeded separately. The discussions during the site visit focused both on the institute's efforts to develop internal applications for use by care providers and on the efforts behind KPOnline.
Permanente Knowledge Connection
Peter Juhn, executive director, described the activities of the Care Management Institute (CMI), a national entity within Kaiser-Permanente that operates on behalf of both the Kaiser Foundation Health Plan and the Permanente Medical Group. CMI was established in 1997 and employs approximately 80 people, 30 of whom work in Oakland and the rest of whom are distributed throughout the Kaiser-Permanente system. Its principal function is to develop evidence-based approaches to care management. This work has three areas of emphasis: content, measurement, and implementation. Content work includes the development of management programs for conditions such as diabetes, asthma, and depression, as well as an overall compendium of clinical best practices. Measurement work encompasses large-scale national studies of health outcomes and has included studies of 200,000 diabetic patients, 320,000 cardiovascular patients, and 90,000 asthma patients in the Kaiser-Permanente system. The implementation work builds on the content foundation, using the collected information as a basis for clinical systems that can influence care at the point of delivery. The objective of these activities is to change care providers' behavior to coincide with best practices developed throughout the Kaiser-Permanente system. Such systems can have dramatic effects on care. Over the previous year and a half, Kaiser-Permanente found a 10 to 15 percent gain in the use of practice guidelines in some areas where it had developed content.
The PKC is a network-based application that was developed to support the CMI's objective of improving care. Its primary function is to allow care providers to access current CMI content on best practices. The CMI staff realized that each Kaiser-Permanente site was gathering useful information that could benefit other local and national care providers, but little of it was shared. Using Kaiser-Permanente's national intranet, the PKC now has national and regional databases of best-practice information that is vetted before it is entered into the databases. A board of directors that represents all executive and physician groups approves information for inclusion in the national database. Twelve regional
groups establish approval processes for their own regions, and local offices establish processes for local approvals. This structure balances local freedom against the need for greater structure at the highest levels of the organization. The databases are linked together through the national office's intranet Web site, which makes the information searchable and reduces duplication of effort. In addition to linking care management information, the PKC provides a centralized outlet for other information resources of interest to care providers. It contains a section for continuing medical education (CME) that allows users to look up their CME credits. It also provides access to online textbooks and journals and supports threaded discussion groups. Its workgroup functions enable users to post material and conduct national meetings in a virtual manner. Workgroups can be designated for members only, with membership defined by the chair.
The CMI staff plan to build on the PKC's current capabilities to make PKC more useful to care providers. The staff will address topics such as improving search capabilities, tailoring information to provider needs, and expanding access to affiliated care providers. The PKC can support searches across the national intranet using a commercial search engine, but the CMI is looking into ways to improve search capabilities and help different types of users find information they need, perhaps by adding "metadata" capabilities. The staff also plan to use "push" (as opposed to pull) technologies and customization to provide users with information relevant to their immediate and long-term needs. Plans for this enhancement are tied to efforts to place a computer on every physician's desktop and provide access to a clinical information system (more than 40 percent of Kaiser physicians had desktop computers at the time of the site visit). Overall, however, the Kaiser-Permanente staff do not consider technology a limiting factor. They have modest aims for now, and technology is often a distraction from reaching other goals. They prefer to take known technology and find ways of using it to support their business, rather than pushing the technological envelope.
Nevertheless, determining how best to expand the PKC to affiliated providers will entail both technical and policy considerations. Kaiser-Permanente has agreements with approximately 40,000 affiliate providers in physician groups outside of California. How can Kaiser convince the affiliates to care for Kaiser patients according to Kaiser practices? Should the affiliates have access to all the available knowledge, or should some information be considered proprietary? How can information be layered and filtered easily to accommodate restrictions placed on affiliate access? How can firewalls be extended to include specific partners while still providing adequate protection for Kaiser's information systems? At present, the system does not contain patient-level data, which alleviates
some concerns regarding security, but Kaiser needs to evaluate alternative ways of providing security on an extranet and/or the Internet. It also needs to determine how best to blend public and proprietary information to benefit its providers. Kaiser officials do not consider their practice guidelines proprietary, and would even like to make them public, but the tools for implementing these practices are unique and will be kept proprietary. Kaiser would like to translate the guidelines into lay terms and make them available on its consumer-oriented Web site.
The Kaiser staff anticipate three types of outcomes from the PKC. First are the financial benefits. They want to leverage the size of the organization while avoiding duplication of effort, something the PKC can facilitate; the staff will try to determine how much money was saved by not starting new programs, or not sustaining existing ones, because the PKC indicated that similar work was already under way. Second, the PKC may prompt changes in clinical decision making that yield improved clinical outcomes and better use of facilities. Third, benefits will accrue from improved knowledge management, which should succeed in educating care providers about new diagnostic approaches and new techniques. The Kaiser staff hope to make the link between successful patient-provider interactions and the PKC system evident, to demonstrate the system's ability to support corporate objectives.
If the system is to be successful, then usage rates must increase. At the time of the site visit, 2,700 of Kaiser-Permanente's 10,000 physicians used the system, but all were expected to use it by the end of 1999. To attain and maintain that level of usage, the system will need to prove itself capable of educating physicians about the new diagnostic approaches and clinical techniques. The system will also need to demonstrate its capability to enhance the goals of the organization. Management will need to see a direct linkage between improved patient-provider interactions and the tools that support that interaction. Privacy issues will also need to be addressed so that providers know whether information will be collected on their searches of medical literature and whether such searches will be viewed as positive (i.e., the provider is engaging in continuous learning) or negative (i.e., searches indicate gaps in a provider's knowledge).
Anna-Lisa Silvestre and Richard Leopold described Kaiser's consumer-oriented Web site, the primary component of KPOnline. KPOnline is a three-tiered system with a Web page interface that interacts with legacy systems through an intermediate object layer. Ms. Silvestre views the Web site as a service for interacting with Kaiser-Permanente members. It is not a marketing tool or a mechanism for providing content. Rather, it is intended to provide members with an alternative to telephone calls and
office visits. The site provides members with capabilities for messaging, scheduling appointments, and checking prescriptions 24 hours a day, 7 days a week. Eventually, the Kaiser staff would like to integrate KPOnline with the PKC so they can take the information gathered from patients through KPOnline and incorporate it into practice guidelines and, conversely, incorporate practice guidelines into chat rooms and e-mail discussions with patients. The goal is to help patients better understand health information, tailor it to their needs, connect them with providers, and help them make sound, coordinated decisions regarding their care.
At the time of the site visit, approximately 16,000 registered users had logged on to KPOnline more than once. This number is just a small fraction of Kaiser-Permanente's 9.4 million members, but the staff believes usage will increase as more of them gain Internet access. The recent biannual survey reported that 72 percent of members are adults, 53 percent of whom have Internet access at home, work, or both. Ten percent of all members have requested a personal identification number (PIN) to use with the system. Hence, Kaiser expects hundreds of thousands of members to access the system simultaneously in the near future (the target was 350,000 active users in 1999) and is in the process of procuring additional servers to handle the load. The organization is still trying to learn how members use the site and what types of services they seek. Kaiser is starting to collect data on site usage (it does not track an individual's movements within the site) but does not yet have adequate volume to examine usage by demographic category.
Consumers are coming online quickly, and KPOnline is expected to become a basic utility that will benefit both consumers and Kaiser-Permanente. A basic evaluation will be performed to determine who uses the system, their level of satisfaction, and the overall utility of the system. More formal cost-benefit analyses will also be conducted. A tangible cost-benefit analysis will evaluate processes such as online pharmacy refills and automated appointment scheduling and compare them to more traditional, manual processes. The Kaiser staff expect that online pharmaceutical services will have the highest benefits per unit cost because filling prescriptions becomes much less expensive when done in high volumes. Other tangible benefits not associated with cost reductions, such as helping members make good decisions, will also be considered. Some of the benefits will be difficult to quantify, but there may be ways to determine if online material helped to prevent an unnecessary visit to a Kaiser facility, improve the appropriateness of a subsequently scheduled visit, or increase membership retention rates.
Early experience with KPOnline has uncovered numerous issues that need to be resolved. One of the main ones is the need for standards for measuring the quality of online transactions. For example, with respect to
online questions to nurses, it might be desirable to track the time of receipt, the time at which the question was answered, the time at which the member retrieved the response, and whether the answer given was valid. Similar standards are now in place to help train nurses who provide advice over the telephone, but such situations involve near-real-time feedback. In addition, nurses need training in how to provide care based strictly on text input (which creates a record of the interaction), with no voice or personal interaction with patients. At the time of the site visit, all of Kaiser-Permanente's care providers had e-mail accounts, but patients were not yet using them. The use of clinical e-mail raises several questions, as yet unanswered, regarding medical records.
Other issues include the determination of rules for intervening in sponsored chat groups. Kaiser-Permanente had a case in which a member's postings to a chat group suggested suicidal tendencies. The organization had to decide whether and how to intervene. Should it link the user's anonymous login with the medical record database to check the medical history and find contact information? In this case, the Kaiser staff did just that, and an advice nurse called the patient and arranged an appointment, which revealed that the patient was indeed suicidal. Events such as this prompt policy reviews.
Kaiser has established a set of technical measures and administrative processes to provide security on the Web site. The site uses Secure Sockets Layer (SSL) encryption to protect messages between members and Kaiser, and members need a PIN to access chat groups. Members are required to provide their membership number and address to obtain a PIN. In addition, the manager of the business unit is a security trustee and has to ensure that policies and procedures are in place. These policies are reviewed regularly and upgraded as needed, and attempts are made to achieve consistency between online and off-line policies. For example, Kaiser dropped the authentication requirement for scheduling an appointment online because such authentication is not performed when scheduling appointments via the telephone.
Members of the Kaiser-Permanente staff identified several technical capabilities they would like to be able to incorporate into KPOnline:
• Authentication technologies that would allow multiple users within a single household to use the same computer but keep their information separate. In the current KPOnline system, if users forget to log out of a session, then other family members can see what they did and what information they retrieved. The Kaiser staff would like to obtain improved technologies to identify users and authenticate their identities. One possible solution is biometrics, but this approach would be costly to implement across many computers.
• Higher-speed Internet connections for end users. The objective is to enable members to have "a T1 experience" while using a 28.8 kilobit per second (kbps) modem. Kaiser has its own wide-area network that connects all Kaiser facilities. Should the organization become the equivalent of an Internet service provider (ISP) so that it can provide high-speed connections and ensure security? Should care providers act as ISPs themselves so that members can connect to the network using high-speed cable modems or digital subscriber line technologies?
• A standardized Web browser. The Kaiser support staff spends considerable time helping customers configure their browsers to work with KPOnline. Users sometimes confuse the application with the infrastructure and call Kaiser's support desk to report problems that are not associated with KPOnline but rather with their ISP.
• A simple telemedicine terminal. A computer that supports videoconferencing and includes sensors for collecting diagnostic information would enable care providers (such as triage nurses) to interpret and evaluate cases based on more than just text-based information. A pilot program using this technology is ongoing in the mid-Atlantic region to monitor diabetic patients, and there are plans for another pilot program in the Northeast with congestive heart failure patients. Telephone lines are used to transmit information. The main problem experienced to date is not capturing the information but incorporating it into the medical record. Security issues also need to be addressed.
Stanford Center for Professional Development
Committee members visited the Stanford Center for Professional Development on December 16, 1998, to meet with Andrew DiPaolo, its executive director, and members of his staff and learn more about the Stanford Online system. Stanford Online is the current Web-based implementation of the Stanford Instructional Television Network, a system pioneered by Stanford University's engineering school to teach courses at a distance, largely (in the past) to employees in local industry. The system allows students or employees to remain at their company sites while taking courses. Through the Honors Co-op Program, workers can maintain full-time employment while studying to earn a full-fledged master's degree. The program focuses on engineering classes, but members of the University's section on medical informatics have taught courses using the system.
The system started with live broadcasts over private systems between Stanford and the subscribing companies, generally with two-way audio supplementing one-way video so that students could ask questions of the instructors and participate in discussions. Subsequently, workers' need
for flexibility in scheduling of studies led industry to call for videotapes that could be watched at the convenience of students rather than only during live video feeds. The program was broadened into the Stanford Center for Professional Development (SCPD) when it became clear that the distance-education model would be more than simply broadcast television. Anoop Gupta, a computer science faculty member, had developed Vxtreme, a technology for streaming video, which was experimentally adopted by the SCPD as a means of distributing "videotapes" by the Internet rather than by courier. Vxtreme was later acquired by Microsoft Corporation, and the SCPD continues to use the Microsoft streaming video products (which are incompatible with RealVideo).
Today, Stanford Online has become an important means of disseminating video-recorded classes along with slides and photographs of blackboards or projection screens. Videos are up on the Web within 3 hours of the class. Class videos are indexed into segments by undergraduate students, who manually perform this function immediately after the class. Methods are being explored for automating the indexing and matching the video to the master copy of slides. The previous year's videos are discarded because the courses are taught again and the lectures are continually updated.
Stanford Online students see three things in their browser window: a relatively small video of the instructor giving the class; an index of topics covered by the lecture (topics can be selected as the student wishes); and PowerPoint slides or, alternatively, photos of projected slides or blackboards, coordinated with the audiovisual track. Even with a 28.8 kbps connection, students can work through a lecture and skip around based on the indexing outline. The browser has a plug-in for streaming video. The system is intended to be used asynchronously; students cannot ask questions in real-time via the Internet, but they can send e-mail to professors and participate in online discussion groups.
Students can take courses whenever they want if they are not seeking credit. If they want credit, then it is best for them to take the course during the regular semester, at the same pace as on-campus students, to take advantage of resources such as teaching assistants who are available to grade homework. Class videos are also broadcast to dormitories at specific times over the campus television system, enabling Stanford students who live on campus to view lectures they missed or wish to view a second time. Television viewing does not allow user-controlled access to specific portions of the lectures, however, as does the Stanford Online version. With the availability of both scheduled replays of lectures and Stanford Online video on demand, it is now possible for students to take two courses that meet at the same time. Students like the asynchronous method because they can follow courses when they are away from the
campus. However, the online capability can reduce the number of students attending the live lecture; in fact, some students have opted not to attend class at all, especially early in the morning. The smaller number of live participants can reduce class interactions and force changes in the way professors deliver lectures.
File sizes for 10 minutes of video range from 1.6 MB to 38 MB depending on the bit rate at which they are sent (Table A.1). The size of the file is minimally affected by the size of the video image on the computer screen. The image quality of the frame degrades as the number of frames per second (fps) increases, even though the throughput (bit rate) may remain constant. The frame rate depends on the video capture card that converts video imagery into digital format and on the horsepower of the authoring workstation.
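These sizes follow directly from bit rate multiplied by duration. The specific bit rates below are inferred from the quoted 1.6 MB and 38 MB endpoints, not taken from Table A.1:

```python
# File size of a constant-bit-rate stream: rate x duration / 8 bits per byte.
# The 21.3 and 506 kbps rates are inferred from the quoted file sizes.
def stream_size_mb(bitrate_kbps, minutes=10):
    return bitrate_kbps * minutes * 60 / 8 / 1000

low = stream_size_mb(21.3)   # ~1.6 MB; near what a 28.8 kbps modem could carry
high = stream_size_mb(506)   # ~38 MB; a higher-quality stream for LAN viewers
print(f"{low:.1f} MB to {high:.0f} MB")
```

The arithmetic also illustrates the observation in the text that file size is governed by bit rate rather than by the on-screen image dimensions: for a constant-bit-rate stream, only the rate and the duration enter the calculation.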
It is not clear whether there are unique networking requirements for medical education. Some technologies may be more important for the delivery of lectures dealing with medicine than of those dealing with other fields. For example, techniques are needed for tracking laser pointers on slides, which is essential in teaching radiology classes. The SCPD is trying to develop techniques for overlaying the movement of a pointer on the screen and ways of noting changes between or within items on the screen. It is conducting formal studies of performance and has involved the university's education school in the evaluation process. At present, Stanford Online is a technical implementation of current models of teaching; little thought has been given to changing the educational model to fit the technology. This approach has at least one advantage: it allows lecturers to deliver courses online without changing their form or structure. Some minor changes have been made, such as the use of bigger chalk to make writing more visible.
The SCPD makes money for the university, enough to pay for a large studio and the operation of five channels. Net income is roughly $4 million per year, some of which is divided among university departments. Faculty members have financial incentives to teach courses online: They receive a share of the tuition for online courses, as do their departments.
But new financial models may be needed to pay for new services. So far, off-campus services are offered only to "member companies." This approach would need to change if the university decided to allow individual students to pay for their own classes, as they would for CME.
The SCPD program, and the online portion in particular, is very popular. Online participation may cut into the broadcast model. Obviously, this could be a big business for Stanford, a trend that threatens small colleges. SCPD has the capability on existing servers to teach 2,000 simultaneous users; additional capacity could be brought online to expand its scope.
NASA Ames Research Center
The committee visited NASA Ames Research Center on December 16, 1998. Committee members were hosted by Muriel Ross, then director of the Ames Center for Bioinformatics, and her team of biocomputing researchers. NASA's space exploration mission, particularly the possibility of crewed flights over long distances, poses particular challenges in network communication. As space vehicles venture farther from Earth and for longer periods of time, it is unlikely that the crew will have all the needed medical expertise. Accordingly, the NASA team is focusing primarily on how to provide health care using telemedicine over long distances with high latency. The emphasis is on technologies that consume little power (such as personal computers) and that can leverage the power of more costly hardware through network connections. The team is addressing two aspects of this challenge: the need to support remote, collaborative medical assessment and treatment using the Internet (as an approximation of the network that would be available using wireless communication) and the need to support rapid, accurate three-dimensional rendering of organs for assessment at a distance by medical professionals.
As part of its preparation for remote medical consultations, NASA Ames has established a collaboration among in-house staff and Stanford University scientists to experiment with the transmission of images, in particular complex three-dimensional models of hearts, skulls, and other structures. The challenge for NASA networking scientists is to send the images simultaneously to multiple sites (i.e., multicasting, as opposed to point-to-point communication). They have created a testbed network that links NASA Ames with Stanford University Medical Center, the Cleveland Clinic Foundation in Ohio (through NASA's Glenn Research Center), Salinas Valley Memorial Hospital (through the University of California at Santa Cruz), and the Navajo Nation at its Northern Navajo Medical Center in New Mexico. The network is extremely complicated, linking local-area networks (LANs) at the participating institutions to a variety of
high-speed WANs, including the NASA Research and Educational Network; Abilene, a high-speed network backbone project of the University Consortium for Advanced Internet Development; and the very high performance backbone network service (vBNS), a research network launched through a 5-year cooperative agreement between MCI and the National Science Foundation. Connections to the Navajo Nation run through satellite and high-speed ground links.
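The distinction between multicast and point-to-point delivery that motivates NASA's testbed can be illustrated with a small sketch using standard IP multicast sockets. This is a minimal illustration, not NASA's actual configuration; the group address, port, and TTL are arbitrary placeholders.

```python
import socket
import struct

MCAST_GRP = "239.1.2.3"   # illustrative administratively scoped multicast group
MCAST_PORT = 5007

def make_sender(ttl: int = 8) -> socket.socket:
    """One sender reaches every subscribed site with a single transmission,
    rather than opening a separate point-to-point stream per site."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    return s

def make_receiver() -> socket.socket:
    """Each collaborating site joins the group; routers replicate packets
    only where the delivery tree branches."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", MCAST_PORT))
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return s
```

The design choice matters for image distribution: with N remote sites, multicast sends each update once instead of N times, which is what makes simultaneous delivery of large rendered images feasible.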
The three-dimensional medical images used in NASA's tests contain several million polygons and are too complicated for typical computer workstations to render. NASA therefore planned to perform volume rendering on a powerful central graphics computer and then transmit the images (or changes in the images) to peripheral workstations at the remote sites. Because the files would be too large for straightforward transfer on today's Internet, NASA planned to use the Abilene network, which operates at 2.4 gigabits per second (Gbps). Limitations in bandwidth in various parts of the network, however, precluded even this design. Consequently, NASA designed a hybrid system in which high-resolution static images (containing roughly 1.2 million polygons) are transmitted from the centralized site and real-time rendering of object manipulations is handled at lower resolution (approximately 20,000 polygons) by the individual sites. Changes are made visible to all participants, and at the end of a manipulation, the centralized NASA computer sends an updated, full-resolution image back to each of the remote sites using multicast technologies. Users at the collaborating sites wear special glasses to view the three-dimensional images. The images for cranial-facial surgery consist of skin and skull only, but future modeling will include brain tissue and blood vessels.
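The hybrid scheme amounts to a simple resolution switch: sites render a coarse proxy while a user is manipulating the model, and the central computer pushes the full-resolution image when the manipulation ends. A sketch of that logic, using the polygon counts reported above (the function name is illustrative):

```python
FULL_RES_POLYGONS = 1_200_000   # static image pushed from the central graphics computer
PREVIEW_POLYGONS = 20_000       # local real-time manipulation resolution

def polygons_to_render(manipulating: bool) -> int:
    """During interaction, each site renders a low-resolution proxy;
    when the manipulation ends, the central site multicasts a
    full-resolution update to all participants."""
    return PREVIEW_POLYGONS if manipulating else FULL_RES_POLYGONS
```

The trade-off is roughly a 60-fold reduction in geometric complexity during interaction, which is what lets modest workstations keep up with real-time manipulation.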
The testbed provides simulations of remote medical care for Earthbound patients as well as astronauts. If, for instance, it were suspected that a child at the Navajo Nation had a congenital heart defect, a local physician and technician would send three-dimensional images obtained with ultrasound or MRI technology to consultants at Stanford and Salinas for opinions about treatment. The three-way collaborative environment would allow the specialists and caregivers to assess the situation quickly and weigh two options: local management or transfer to a medical center.
East Carolina University
The visit to East Carolina University (ECU) on February 2, 1999, featured extended discussions of the university's Telemedicine Program as well as tours and demonstrations. The study team spent the day with David Balch, director of ECU's Center for Health Sciences Communication and director of the Telemedicine Program, and Gloria Jones, the Telemedicine Program coordinator. Other staff members and affiliated clinicians and educators participated in portions of the meeting.
Between August 1992 and February 1999, the ECU Telemedicine Program conducted about 2,500 real-time consultations in 31 different fields, using about 60 different doctors. The top five areas are dermatology; cardiology; neurology; gastroenterology; and allergy, asthma, and immunology. In addition, the program has conducted numerous radiological consultations, many on a store-and-forward basis. The Telemedicine Program also runs distance learning programs, the first of which began in 1989 over a statewide network that links major universities in North Carolina.
Between 1991 and 1999, ECU spent about $8 million on local facilities and infrastructure for telemedicine and teleeducation. Physical space is provided by the School of Medicine, and most equipment has been provided through grants from the Office of Rural Health Policy (ORHP) and Health Care Financing Administration (HCFA). The site visit committee toured the facilities and observed a telemedicine consultation to a rural site.
East Carolina University has two telemedicine suites (an older one and a newer one) with a total of eight rooms. Each room has a telephone, a personal computer for displaying electronic medical records, a display of the remote site, and a display of the consulting physician. The newer rooms are 6 by 8 feet and have a "sound dome" overhead to direct audio to the physician; the older rooms are slightly larger to accommodate a larger display and are soundproofed. The newer rooms were built for about $8,000, of which $4,500 was spent on equipment (e.g., computers, monitors, stethoscopes) and $3,500 was spent on furnishings and construction. Remote sites feature standard arrays of diagnostic equipment that can plug into the network.
Physicians typically schedule three or four consultations in a 2-hour session. The technical staff makes sure that all the equipment is working and that patients' online records (if there are any) are available. After each consultation, the physician completes a report, which is entered into the clinical database. Cases are presented at remote sites by a nurse or a physicians' assistant. A distant site provides a half-time dedicated employee who is trained by ECU to be a telemedicine presenter. The presenters generally train with the ECU specialists for a week and come in to the center about every other week; in this way, physicians and remote presenters learn to collaborate before being placed at opposite ends of a
telemedicine link. Physician training includes a 1-hour interview with a simulated patient. ECU charges outsiders for this training; the university needs to subsidize training for its own people. Distant-site physicians are not involved in the telemedicine consult unless they wish to be; only rarely do they participate.
ECU centralizes the management of store-and-forward consultation messages, which typically consist of medical images attached to e-mail. This process allows managers to control bandwidth, monitor usage and flow of information, maintain the equipment, and capture the demographic and survey data they need for their own purposes and to satisfy requirements of the HCFA. ECU tried switching from motion Joint Photographic Experts Group (JPEG) to wavelet compression, but physicians who were already trained did not like the resulting interface change.
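Since ECU's store-and-forward consultations are, at bottom, medical images attached to e-mail, the message format can be sketched with the standard library's e-mail machinery. This is an illustrative reconstruction, not ECU's actual software; the addresses, subject line, and filename are placeholders.

```python
from email.message import EmailMessage

def build_consult_message(image_bytes: bytes, referral_note: str) -> EmailMessage:
    """Package a still image and a referring note as a single
    store-and-forward consultation message routed through a central
    server, where managers can log and triage it."""
    msg = EmailMessage()
    msg["Subject"] = "Teleconsultation request: dermatology"  # illustrative
    msg["From"] = "presenter@remote-site.example"             # placeholder address
    msg["To"] = "consults@hub.example"                        # placeholder address
    msg.set_content(referral_note)
    msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                       filename="lesion.jpg")
    return msg
```

Routing every such message through one server, as ECU does, is what makes it possible to monitor bandwidth and capture the usage statistics the HCFA requires.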
At distant sites, ECU pays for all equipment and 3 years of line charges, taking this as a marketing and referral expense. Standard installation costs $150,000 per site, including remote camera control, two-way audio and video, and continuing medical education content. Installation and bandwidth are sized to support echocardiograms and cine-angiograms, the most bandwidth-intensive images. ECU tries to install identical capabilities at all sites, to avoid creating second-class sites with less expensive equipment.
Each telemedicine application supported by ECU has a different business model and a different networking infrastructure. Some of the applications are moneymakers, whereas others are seen as loss leaders for future services and still others may not be sustainable after federal funding expires. For example, the state fully funds Medicaid telemedicine sessions, but this arrangement is in jeopardy because of new national policies. ECU has a waiver from the HCFA and is receiving Medicare reimbursement for consultation sessions, but the paperwork burden is heavy: up to a dozen forms must be filled out for each telemedicine session. Many of these consultations are store and forward, and new applications in primary care (e.g., wellness) are not considered consultation, so there is no reimbursement potential for these sessions.
The following five subsections describe the primary telemedicine services offered by ECU. The sixth subsection outlines how these services differ in terms of networking infrastructure.
ECU has a contract with the state of North Carolina for services to the state's central prison. The most popular applications include dermatology and endocrinology. This was ECU's first telemedicine program, and it is a moneymaker. ECU bought the equipment and charged back for it
over 4 years on a flat-rate basis that included equipment costs and maintenance plus 10 free consultations. The break-even point is 800 consultations per year. The state was able to document a per-prisoner cost saving from avoided transportation (it formerly cost about $800 per year to transport a prisoner to another site for health care). Officials also noticed a decline in the number of medical complaints by prisoners, who can no longer view a visit to the doctor as a chance to leave the prison. ECU probably will expand this program to cover the entire North Carolina prison system.
The hospital-to-hospital telemedicine link was funded initially by the HCFA but now is covered by ECU and the participating hospitals. There are 20 hospitals involved, each with a T1 connection. No more than half of the T1 link is needed for video, so the other half carries data. Originally, the system used so-called triple integrated services digital network (ISDN) at 384 kbps, but after the telephone company switched its billing practices, ECU converted to T1 lines for both conferencing and data. For echocardiography, triple ISDN might work in some situations, but it is not adequate for cine-angiography.
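The split described above, half of each hospital's T1 for video and the rest for data, is simple arithmetic worth making explicit. A sketch, assuming the usual T1 figures (1.544 Mbps gross, 1.536 Mbps of usable payload) and the standard 768-kbps half-T1 videoconferencing rate; the exact rates ECU provisioned are not stated in the source.

```python
T1_GROSS_KBPS = 1544      # gross T1 line rate (1.544 Mbps)
T1_PAYLOAD_KBPS = 1536    # 24 channels x 64 kbps of usable payload
VIDEO_KBPS = 768          # half-T1 video rate, well above the 384-kbps triple ISDN it replaced

def data_headroom_kbps() -> int:
    """Bandwidth left for data traffic after carving out the video channel."""
    return T1_PAYLOAD_KBPS - VIDEO_KBPS
```

Under these assumptions the data side of the link gets 768 kbps, which explains why a single T1 per hospital could carry both conferencing and data once the ISDN billing changed.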
The ECU hospital funds the program at $150,000 per partner. ECU purchases and loans the equipment to the site; sets up the router, computer, and coder-decoders (codecs); orders the T1; installs the equipment; and trains and supervises the half-time staff person (usually a nurse or physician's assistant) who works the equipment at the remote site. The partner has to agree to fund this position, provide space and insurance for the equipment, and pick up the line charges after 3 to 5 years. The service is not a moneymaker for ECU; no one makes money from consultations. It is viewed as a marketing effort that may establish relationships that bring in patients and eventually lead to procedures, which would generate income. For instance, the hospital might get more cardiac referrals as an indirect result of a well-baby telemedicine program. Such benefits would be difficult to track, however. The financial gain would be virtually impossible to calculate because any lost business opportunities would need to be figured in.
The home health program is funded by a grant from the ORHP within the U.S. Department of Health and Human Services. The goal is to determine if hospitalizations can be avoided through better home monitoring of health indicators, such as blood pressure and heart rate. The program
connects 14 homes through ordinary telephone lines (sometimes referred to as "plain old telephone service" or POTS), using two different vendors. At first ECU used two phone lines per home (one for transmitting audio from the stethoscope, the other to carry the patient's voice), but they have since found ways to compress the audio sufficiently to enable the use of a single phone line. Participants are selected on the basis of their frequent use of hospital facilities (e.g., high-risk obstetrics patients or patients who are seen more than twice a week in the emergency room) and their ability to have the system installed in their home for at least 6 months. ECU has approval from Medicaid for the program, but home health is not included in the new regulations promulgated by HCFA regarding government payment of health services delivered over electronic media.
Another ORHP grant is supporting two distributed networks (in contrast to the hub-and-spoke architecture of the other networks) that link mental health centers and schools to ECU. Over one network, ECU provides medical/psychiatric consultations to four mental health centers. The other network links four schools with the county health department, a mental health center, a local pediatrician, and ECU. The networks use 128-kbps ISDN lines (so-called single ISDN, which has proven to be more reliable than the triple-ISDN lines used in other applications). It is hoped that the networks will reduce travel for care providers.
The connection to the schools is used to screen students for mental health problems (25 percent were found to have such problems) and to hold wellness clinics and more general disease screenings. The project has a number of aspects: student health awareness (through focus sessions on topics such as peer relations, nutrition, and physical fitness); clinical consultations; continuing education in cultural sensitivity for teachers; clerkships for students in health professions; and health assessment in the schools. The school-based program sets up live videoconferences between a doctor and school students for discussions of health-related topics.
A store-and-forward system is used for multimedia e-mail consultations (using the VisiTran MD application). The system uses the Internet and POTS dial-up to a central server to connect a number of sites on the Outer Banks of North Carolina with ECU. Remote sites use standard desktop computers (Pentium class) to transmit image files averaging 3 MB in size, a figure driven by technical limitations. Nurses could easily take
more images or add more video, but most gateways cannot handle files larger than 5 MB. The system loses money because of the HCFA's reimbursement schedules, but income is generated by procedures that result from referrals.
The ECU program began in 1988 with a microwave network designed for educational purposes. Since then, the networking infrastructure has evolved into an amalgam of components cobbled together over the years as telecommunications companies have offered different technologies. Old networks have not been upgraded to provide greater uniformity because of the costs involved. Instead, ECU officials have developed a bridge to link their disparate systems. Codecs pass signals from one medium to another in a manner transparent to both ends. For example, the prison contract uses the state microwave system, but the signal is digitized for delivery to ECU's desktop workstations. The digital signal is converted back to microwaves for delivery to the prison. All of ECU's telemedicine activity takes place over leased lines rather than the general Internet, although North Carolina has built a fiber optic superhighway that links many of the state's universities.
The infrastructural variations are not solely technology-driven, because, as David Balch noted, one size does not fit all applications in telemedicine. Different specialties require different amounts of bandwidth. Echocardiography and cine-angiography require the highest bandwidth, 784 kbps; cardiology, emergency room consultations, and neurology require roughly 384 kbps; and psychiatric consultations require 128 kbps. Static image specialties such as radiology, dermatology, and pathology are generally bandwidth-independent because images can be sent in a store-and-forward mode. Home health requires just one telephone line. Hence, various communication media are used, including microwave, T1, single ISDN (128 kbps), triple ISDN (384 kbps), and POTS. Table A.2 summarizes the networking infrastructure for each of the systems ECU currently has in place.
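The bandwidth-per-specialty figures above imply a simple provisioning rule: pick the cheapest medium that meets the specialty's requirement. A sketch of that lookup, using the rates reported in the text (the nominal POTS rate and the function name are my additions):

```python
# Bandwidth requirements as reported by ECU (kbps).
SPECIALTY_KBPS = {
    "echocardiography": 784, "cine-angiography": 784,
    "cardiology": 384, "emergency": 384, "neurology": 384,
    "psychiatry": 128,
}
# Nominal rates of the media ECU uses (kbps); POTS figure assumes a 33.6-kbps modem.
MEDIA_KBPS = {"POTS": 33, "single ISDN": 128, "triple ISDN": 384, "T1": 1544}

def cheapest_medium(specialty: str) -> str:
    """Return the lowest-rate medium that still satisfies the specialty."""
    need = SPECIALTY_KBPS[specialty]
    candidates = (name for name, rate in MEDIA_KBPS.items() if rate >= need)
    return min(candidates, key=MEDIA_KBPS.get)
```

This matches the deployments described in the text: psychiatry runs over single ISDN, cardiology over triple ISDN, and echocardiography demands a T1.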
East Carolina University has eight teleclassrooms on campus. The network was originally built to deliver distance education in engineering and is currently used for nursing classes. (The classrooms are too large for doctor-patient telemedicine applications.) A standard room has all networking modalities available: the North Carolina Information Highway (NCIH) at 45 Mbps; T1 networking at 1.5 Mbps; ISDN dial-up capability;
POTS lines; and CU-SeeMe (a software package for sending low-quality video across the Internet). The rooms hold 10 to 15 people each and have cameras aimed at each chair for video.
There is also a larger lecture hall, similarly equipped, that is used for grand rounds sessions. ECU provides grand rounds three times per week to 20 hospitals. These sessions can be carried out using one 15-person conference room and one small auditorium, both of which can connect to 37 sites on the North Carolina Research and Education Network (NCREN) at T3 speeds (or 45 Mbps), 127 NCIH sites at T1 speeds, and other sites using ISDN. PowerPoint slides are converted to video. Grand rounds are seen as a loss leader designed to acclimate remote users to the equipment and style of interaction, in the hopes that they will later try clinical consultations at the remote sites. Scheduling is complex; ECU uses scheduling software developed in-house.
ECU is using Internet-based streaming media (real-time audio), a chat channel, Web pages, and animations (Shockwave) for a small number of nursing courses. Both asynchronous and synchronous instruction are available (i.e., students can participate in lectures in real time or download the video for later viewing). The network was designed to accommodate 28.8 kbps modem speeds so that it could be accessed by a large number of users. That bandwidth is suitable for transmitting lecturers' slides, but it forces audio to be compressed into 6.8 kbps, making the video slightly choppy. The network determines the speed at which to deliver data based on the receiving student's access bandwidth.
The asynchronous system incorporates 20 kbps video, which consumes most of the available bandwidth. PowerPoint slides are converted to HTML for use by both the nursing instructors and the students at home. Audio is captured and sent in a 6.8 kbps stream to leave room for the graphics. Lecturers have added some small animations; it takes about
1 hour to create 10 seconds of animation. Materials are available online within 24 hours after a class session. The system can handle up to 25 users per class; 5- to 10-second delays are considered acceptable.
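The bitrate budget behind these figures is tight but checkable: 20 kbps of video plus 6.8 kbps of audio must fit under the 28.8-kbps modem rate the courses were designed around. A sketch of the arithmetic (the margin interpretation is mine):

```python
MODEM_KBPS = 28.8   # target access rate for the nursing courses
AUDIO_KBPS = 6.8    # compressed lecture audio stream
VIDEO_KBPS = 20.0   # video rate in the asynchronous system

def fits_modem() -> bool:
    """Audio plus video must fit under the modem rate, leaving a small
    margin for slides and protocol overhead."""
    return AUDIO_KBPS + VIDEO_KBPS <= MODEM_KBPS

def margin_kbps() -> float:
    """Headroom left over for slide graphics and overhead."""
    return round(MODEM_KBPS - AUDIO_KBPS - VIDEO_KBPS, 1)
```

Only 2 kbps of headroom remains, which is why the audio had to be squeezed to 6.8 kbps and why the resulting video is slightly choppy.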
The synchronous system uses Microsoft NetMeeting or ICQ for interactivity. It takes advantage of the chat feature alongside PowerPoint slides for interaction between instructors and students, but little interactivity has been observed. In a large class, the professor might have an assistant answer the chat questions. Problem areas include scheduling, preparing auxiliary materials, marketing, and meeting assessment deadlines. Presenters must act as moderators and must learn to teach to an empty room or to a screen audience and a live audience at the same time.
Next Steps: Switching to the Internet
The Internet, especially the Next Generation Internet (NGI), would offer increased bandwidth and networking capabilities that would in turn enable a range of scenarios for telemedicine. One scenario might be physician-free medical practices in which nurses and physicians' assistants handle many patient complaints and consult with specialists only when needed. In another scenario, centers such as the one at ECU could become brokers between practices and specialists rather than hubs that provide the specialists. The center at ECU also could become a means of providing services to more affluent clients who opt to pay for their own health care rather than relying on their health plan. Real-time treatment planning, with its increased bandwidth requirements, is seen as unlikely. In the future, Internet Protocol (IP) video and Web TV may come close to providing telepresence.
ECU is not currently slated to receive a connection to the NGI, but researchers are trying to find a way to get a connection. In the meantime, telemedicine system developers have been thinking through the challenges inherent in moving their telemedicine network to an all-IP environment, which would provide more of a plug-and-play capability. Mr. Balch has been working with the American Telemedicine Association to sort out which disciplines are most likely to be effective using telemedicine and the bandwidth and throughput required for each. Table A.3 summarizes his conclusions, which divide telemedicine consultations into three broad categories: (1) those in which high-resolution, static images are needed (such as in pathology, dermatology, or radiology), (2) those in which medium-resolution imagery is needed, with little motion (such as in psychiatry or internal medicine); and (3) those in which medium-resolution images are needed, but with a high degree of motion (such as in cardiology).
ECU officials would consider switching from leased lines to the
Internet if there were evidence that it would reduce costs and lead to increased access, which it currently would not in rural eastern North Carolina. They would like to outsource the networking part of the business to a company experienced in dealing with telephone companies (i.e., an ISP). They are moving to IP-based video, in spite of the cost and access issues, to overcome the problem of heterogeneous systems. Beyond these issues, use of the Internet would require attention to a variety of administrative and technological challenges.
One challenge is rapid change in both health care models and technology. Modes of health care delivery and payment are in flux, and provider plans are becoming more integrated and national in scope. This is not currently a problem for ECU because it serves only North Carolina, so it is able to use physicians' assistants on the remote end and a physician on the central end. But working across state lines would require a physician on each end and perhaps some type of cross-licensing arrangement. Meanwhile, as noted earlier, rapid changes in technology have left ECU with a system that is cobbled together rather than designed for optimum overall functioning.
A variety of standards and quality control measures would have to be instituted. A technical service broker would need to document the sessions, keep records, and keep track of the time. For the system to be efficient, participants would need to use standard history-taking procedures, standard clinical protocols, and standard tools for teleconsultation. Triage and quality control would be needed on the store-and-forward systems. A certification system would also have to be created for consultants, presenters, and, perhaps, for overall telemedicine programs.
Security also would need to be improved. Security is an issue for both data transmission and medical records, which must be kept secure from unauthorized viewers, including technicians and hospital employees. Currently, ECU faxes medical records and uses Proshare, which provides encrypted mail, for the store-and-forward system. Each message is checked to be certain that all the parts are there, but this model will not scale up. The NCREN fiber network cannot be used for telemedicine because it passes through the telephone company, creating too many security issues.
Quality of service (QoS) is not an issue for ECU today because the program uses leased lines; if the Internet were used, service-level agreements (pacts between ISPs and users on minimum bandwidth, maximum latency, and so on) would be needed to provide equivalent QoS.
University of North Carolina, Chapel Hill
The site visit to the University of North Carolina (UNC) in Chapel Hill on February 3, 1999, featured two main activities: a visit to the computer graphics laboratory for demonstrations of telepresence systems with potential applications to medicine and a meeting to discuss Internet-based education programs in the School of Medicine.
UNC's Computer Graphics Lab
Guided by Henry Fuchs, the committee saw four demonstrations of computer graphics and virtual reality systems with potential applications to medicine: the Office of the Future, augmented-reality ultrasound, wide-area tracking/walking, and image-based rendering/depth extraction. All of these systems attempt to convey telepresence.
The Office of the Future is an attempt to create an immersive environment that enhances the sense of participation in teleconferences by using multiple cameras and multichannel audio. The remote participant sits at a corner desk, and a 270-degree image of the central conference facility (or operating room) is projected on the walls of the room. Real-time video and audio are transmitted to the remote location. The setup involves about a dozen live video streams, which consume substantial bandwidth (although the demands might be reduced). Persons at the central location see the remote participant on a video monitor and receive an audio feed.
Much attention has been focused on image registration by cameras and projectors and on the mapping of images onto room geometries that differ between sending and receiving locations. Flight simulator data show that occlusions and breaks must be kept under 50 msec to avoid interfering with a person's conscious attention to the task (100-msec breaks create a distraction). The 50-msec figure is used as a point of reference in building smoothness into the collaborative environment. The system has been used with multiple IP video streams over the vBNS. Empirical observations suggest that it does not work very well in spite of apparently adequate bandwidth; at present, little emphasis is placed on the efficient network transport of video and audio.
Latency and time-stamping of different video streams are important, especially during the current transition from a two-dimensional to a three-dimensional system. A true sense of presence can be provided only if latency is imperceptible. The system needs to factor in the distance between viewer and viewed and provide a 360-degree view without breaks. In the demonstration, two or three cameras were added to compensate for distance, and two or three more to ensure a 360-degree view. Then, all the video streams need to be delivered together and reconstructed into a seamless image. Perfect time-stamping, down to microseconds, is needed for reconstruction. The time stamp could be a property of either the network or the packet. There are problems with either approach: a time stamp in the packet helps with synchronization, for example, but it creates a latency problem.
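The synchronization requirement can be sketched as a simple alignment check: a composite frame can be assembled only when every camera stream has delivered a frame within some microsecond tolerance. This is an illustrative sketch, not UNC's implementation; the function name and tolerance are assumptions.

```python
def latest_common_instant(stream_timestamps, tolerance_us=1000):
    """Given the newest frame timestamp (in microseconds) from each camera
    stream, return the instant at which a seamless composite can be built,
    or None if the streams have drifted apart beyond the tolerance."""
    newest, oldest = max(stream_timestamps), min(stream_timestamps)
    return oldest if newest - oldest <= tolerance_us else None
```

The reconstruction is gated on the slowest stream, which is why per-packet time stamps help with synchronization but, as noted above, add latency.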
The augmented-reality ultrasound system consists of a head-mounted display that provides an "X-ray vision" view of a surgical patient's abdomen. The user can visualize hidden structures reconstructed from CT images and/or an ultrasound probe. The system is able to convert sequential two-dimensional ultrasound into three-dimensional objects within the abdominal cavity. It was developed in collaboration with a local UNC surgeon who specializes in minimally invasive procedures. It is not envisioned as a network application.
The wide-area tracking system uses a three-dimensional helmet display of a virtual architectural space. Guided by position sensors in the helmet, the system renders appropriate changes in perspective as the person wearing the headgear walks through a room. It is supplemented by tactile feedback from Styrofoam objects that correspond to three-dimensional images. The system is tied to high-speed position detection and three-dimensional rendering of local objects, but it could be converted to a network application.
The image-based rendering/depth extraction technology is a three-dimensional modeling system in which architectural surfaces are scanned by a laser to acquire depth information, upon which surface information from two-dimensional images is mapped. The system produces photo-realistic rendering on a computer display (requiring tens of millions of image primitives per second) with interactive joystick control. Users can move around the scene. It runs on a Pixel-Planes computer and is bound to a central processing unit and local memory. The system is not envisioned as a network application.
John Loonsk, head of information systems for the School of Medicine and the Division of Medical Informatics in the Department of Biomedical Engineering, discussed the UNC approach to technology-based curriculum support. The goal is to create an Internet-based learning environment for medical education for both resident and nonresident students. The system incorporates standard off-the-shelf software and a browser-based interface. It uses flat HTML for all pages, with URL interlinking for navigation. A staff of 45 handles all of the work; few faculty members are involved directly in creating materials.
All medical students are required to purchase a specific laptop computer, which has standard software and preconfigured ISP accounts for remote access. The staff transfers course notes from the live network environment so that students can download the previous, current, and following week's syllabus materials and use them off-network. A recent study of system use found that 136,000 pages were retrieved and videos were played 4,300 times in a month (Box A.1). Students also are required to take a medical informatics course delivered online. Before each class session, the students receive an e-mail message with HTML links to basic materials for that topic. Then a lab session is held in the five rooms in which student carrels are located; each student has a carrel with network access and power.
The preclinical version of the educational system has five components. One is reference materials based on the university's UNCLE system, which includes links to (1) MEDLINE; (2) products from Ovid Technologies that contain links to 60 full-text journals and textbooks; and (3) some UNC-specific links to content. A selection/editorial committee, with members from various UNC schools and run by the library, chooses and deselects materials.
A second component is teaching materials, beginning with the syllabus, which is distributed to students in both an online and a printed version. Traditional instructional materials (e.g., lecture notes and slide images) are captured and integrated with common sources of published information to support reference / retrieval and problem solving. Search engines allow students to find material easily. Typically, course materials are provided two weeks in advance and left on the system until updated or replaced. Because the full 2 years of the preclinical syllabus are online, the School of Medicine has abandoned its curriculum management database; faculty members simply search the syllabus using keywords.
The third component is an image repository, a collection of medical images with some text descriptors and titles. Staff are trying to convert these into flat HTML format. A directory structure exists, but there is no database structure for generating dynamic HTML pages. Staff are already building an image database and case repository using a standard format; they hope faculty will use the repositories to build instructional materials with reusable components. For legacy applications (PC-based, computer-aided instruction materials), a launcher was built that can be reached from the browser.
The fourth component is a case repository, which contains specific problems that provide an "evocative" presentation of a particular case. Different classes may use the same cases. Most cases are text-based; there are no simulations yet.
The fifth component is communications technology, consisting of a standard e-mail client. Dr. Loonsk wants to make more use of e-mail with embedded HTML and store-and-forward capabilities. There was an unsuccessful attempt to provide threaded discussion lists; the students complained of too many distracting messages.
The clinical learning environment was developed for third-year students. It contains quick references on UNCLE as well as access to full-text articles; clinical support tools so that students can document patient encounters (e.g., problem lists); and learning frames, which provide background and instruction on particular clinical problems. A learning frame is developed for each topic on a problem list that students use as they complete clerkships. The frame keeps the basic didactic information up to date and also includes canned searches against basic references and a number of other custom views of the curriculum resources. Access to the clinical database, a noncommercial application, is provided to both local and remote students.
There are several constraints on the system. One is bandwidth: the so-called last-mile connectivity to students' homes. There is no digital subscriber line (a digital telecommunications protocol for sending data at high rates over copper telephone lines) or cable modem access in the area, and there are many different phone companies. If bandwidth were much higher and universally available, then UNC would use teleconferencing for student mentoring activities in which students start as a group on campus and then disperse to community sites. Another constraint is security, especially for clinical data. The hospital uses audit trails to deter improper access to medical records. A related problem is the absence of unique and secure IDs for user authentication. On the other hand, the use of IP authentication by vendor products limits off-campus access to some reference materials, a growing problem because 50 percent of a student's clinical time is spent outside the hospital.
Other constraints include the demands of managing virtual private network (VPN) and extranet services; administrative structures are needed to establish VPNs among sites and to distribute and revoke encryption keys as needed. There is also inadequate QOS to enable the provision of video and public-network-based telemedicine to academic health education centers. Users cannot be assured of getting the requisite 128 kbps or more consistently across the Internet without some sort of service-level agreement. There is also a need for higher-level standards (e.g., XML) to supplement URLs. Finally, there is some faculty inertia in moving toward the use of new communications methods (e.g., newsgroups, listservs).
University of Washington
The visit to the University of Washington (UW) on February 10, 1999, consisted of a series of briefings on, and demonstrations of, a range of projects related to medical informatics and human-computer interfaces, a key factor in making the NGI more accessible to health care. Since the 1970s, UW has engaged in a series of computing, informatics, and Internet technology projects that have built upon one another. The site visit team heard about many of these projects as well as about changes in the Pacific Northwest brought about by information technology.
Computing Activities at the University of Washington and in the Seattle Area
Ed Lazowska, chair of UW's Computer Science Department and a member of the Computer Science and Telecommunications Board, provided an overview of computing and Internet-related activities at UW and in the Seattle area generally.
Over the past few decades, Seattle has been transformed from a lumber town into a community dominated by Boeing Corporation and then into a much more diversified city. The UW Medical Center has played a key role in that transformation, helping to spur the development of the local biomedical electronics, biotechnology, and software industries. Washington State now has the largest concentration of high-tech employees in the nation (i.e., workers employed in companies with higher than average levels of research and development). Although the aerospace industry has leveled off in both dollar and employment terms, other high-tech areas are growing quickly.
A recent survey by the Washington Software Alliance found that software is a $20 billion industry in the state, with 2,500 firms employing 47,000 permanent workers. Employment has grown by a factor of four over the last decade, and the software industry has an employment multiplier of 5.5 (meaning that each software job creates 5.5 additional jobs in other sectors), which is twice as high as that of the technology sector as a whole. The average annual wage in Washington's packaged-software industry in 1997 was $198,000 (salary plus exercised options), not including the pay of 640 company officers. The average salary before options was $67,000. The software industry has 7,300 current vacancies and anticipates 64,000 new hires over the next 3 years. If these jobs can be filled, then the industry will generate an additional $12.8 billion in revenues over the 3-year period. Seventy percent of these jobs require a bachelor's degree or higher. These data have implications for local universities; namely, universities find it difficult to recruit top people for academic jobs, and there is limited practical value in 2-year academic programs, whose graduates fail to meet the employment criteria of much of local industry.
The ARPANET was brought to the Pacific Northwest in 1979 by the UW Computer Science Department. The NWNet (part of the NSFNet) later formed around UW. As of August 1997, the vBNS was not designed to extend to the Pacific Northwest, even though there were somewhat redundant points of presence (POPs) in regions with supercomputer centers. (A POP is a dial-in site where a backbone network connects to access networks and where Internet service providers house switching hardware and transmission equipment.) Now, the region has a vBNS connection to Denver and San Francisco. Plans call for the development of regional collaborations, which are key to developing regional expertise and will allow for regional peering and traffic aggregation through a single point of connectivity. UW was one of the four original sites for Abilene and worked with Qwest, which is laying the fiber for Abilene as it lays its own national network, to get hooked up early, so the university now has a 2.5 Gbps fiber-optic connection. The Pacific Northwest gigaPOP, which includes Abilene and vBNS, is operated by UW, which received money from the state legislature for equipment. (A gigaPOP is a point of presence for accessing high-speed networks, sometimes called gigabit networks.)
A 10 Gbps link connects UW to the Westin Building, where the gigaPOP is located, in downtown Seattle. UW is working with Microsoft and U.S. West on a regional bandwidth experimentation program. UW also has requested National Science Foundation (NSF) funds for connections to regional universities, including Oregon Health Sciences and Oregon State. The remaining challenge is to convince agencies other than the NSF to connect to the gigaPOP. Private networks (such as those run by the departments of Defense and Energy and NASA) tend to serve the same geographic areas, often missing the Pacific Northwest. If agencies carried each other's traffic, then aggregate demand could be considered in determining the need for network/POP connections. However, shared networks might pose availability and security problems.
Informatics and Internet-related Activities
Brent Stewart provided a brief overview of the School of Medicine's informatics and Internet-related activities. A number of early projects were not discussed in detail during the site visit. For instance, UW was one of the first sites in the Integrated Advanced Information Management Systems program of the National Library of Medicine (NLM). It also participated in the Advanced Communications Technology Satellite program, a NASA effort that started in 1972 with a satellite providing a connection to Alaska. A project on ultrasound telemedicine, sponsored by the Defense Advanced Research Projects Agency's Technology Reinvestment Program, was designed to develop dual-use ultrasound telemedicine technologies that could be used in ambulances in both battlefield and civilian situations.
Other efforts included the Bench-to-Bedside project, which extended the reach of the UW School of Medicine's Internet resources to community hospitals and libraries. It encompassed research, deployment, and testing activities. Bench to Bedside and Beyond (B3): Building and Testing a Regional Telemedicine Testbed was sponsored by the NLM; it expanded the original program to allow the sharing of clinical information among participants in the Washington, Wyoming, Alaska, Montana, and Idaho (WWAMI) Rural Telemedicine Network. It includes a secure Web interface to medical records, secure clinical e-mail, and access to medical library resources.
Projects that were discussed in some detail at the site visit are summarized in the following subsections. They include the distributed radiology oncology network, the MINDSCAPE interface to a clinical data repository, the WWAMI network, medical projects at the Human Interface Technology laboratory, the gigaPOP, NGI projects, Biomedical Library programs, and the Digital Anatomist electronic repository of anatomical images and teaching programs.
Ira Kalet, of the Radiation Oncology Division, discussed radiation planning software that he helped develop. It uses a client-server architecture to enable collaborative planning of radiation treatments by physicians in different locations. The software creates detailed three-dimensional visualizations of cancerous regions of the body by building up sequences of two-dimensional CT images. The images can then be sent across the network to be viewed and manipulated by the collaborating physicians. Radiation oncologists can use the system to identify tumors and plan treatments in collaboration with dosimetrists. The collaborators can view various radiation portals and beam trajectories to aid in planning treatments; the software synchronizes the images shown on each user's screen. Such collaboration formerly was possible only by faxing images to and from collaborating physicians. In its current configuration, the system allows sharing of images across a LAN. In the future, it will allow true remote collaboration.
System development poses a number of technical challenges. The difficult programming task, carried out in collaboration with the Computer Science Department, resulted in several papers published in computer science journals. Latency is a significant concern, as it currently ranges from 1 to 2 seconds and could increase across a wider area network. Obtaining funding for continuing research on such projects is difficult. While health organizations such as the National Cancer Institute understand the need for research on ways to calculate radiation doses, for example, they generally fail to see why they should fund research on collaboration software or new software design tools, according to Dr. Kalet. Industry resists supporting work like this, too, viewing it as too risky and as taking too long to produce results that can be commercialized.
Tom Martin, director of systems development for UW Medical Center Information Systems, demonstrated MINDSCAPE, a Web-based interface for viewing a clinical data repository. MINDSCAPE is based on the Medical Information Networked Database (MIND) data repository developed at the medical center between 1991 and 1994. The system contains about 60 gigabytes of data, including clinical data (e.g., electronic medical records), and offers links to library reference materials (e.g., drug databases, MEDLINE, laboratory tests reference data, and clinical guidelines) as well as other decision support tools. MINDSCAPE generates reminders about exams that patients will need soon, lists of medications, and dynamic reports for several measures, such as hemoglobin levels in diabetic patients. Other reports also can be generated on a clinic- or provider-specific level. For example, the system can generate information about compliance rates of patients in a particular clinic or under a particular physician's care. Users can pull up patient records so that staff can call patients and remind them to come in for appointments.
Terminals and/or computers with access to MINDSCAPE are located in all exam rooms at the medical center. The system is also accessible through dial-up modems. Because it is primarily text-based, MINDSCAPE does not currently generate unusual requirements for bandwidth; however, images are being added to records, and that will increase bandwidth requirements. MINDSCAPE uses commercially available Web-server encryption technologies (128-bit SSL) and server authentication for security, along with access controls. Access to system information is granted on a need-to-know basis, and overrides are in place to allow access in emergencies. All access can be audited to ensure compliance with confidentiality policies. The confidentiality and security policies are derived from policies in place for paper records. All physicians and other staff members with access to the system must sign a confidentiality agreement and complete training on confidentiality policies.
UW serves as the tertiary care center for the five member states of the WWAMI (Washington, Wyoming, Alaska, Montana, Idaho) Regional Medical Program and is the regional medical school for those states. WWAMI links the UW School of Medicine, the UW Medical Center, the Harborview Medical Center, and the Children's Hospital sites and clinical teaching sites throughout WWAMI. The participating states, which together cover about one-fourth of the U.S. land mass but have sparse populations, created an affiliated education and care program. Students complete their first year of medical school in their home states and then come to Washington for their second year. The third year consists of decentralized rotations; about half of these students go out into the field. The residency program places students in member states.
WWAMI has undertaken many telemedicine activities since the 1970s, especially to Alaska, to support its education programs. Funds from the Rural Health Policy Agency allowed the consortium to set up six telemedicine consultation sites in small communities. The local primary care physicians are also preceptors for medical students, and an average of one or two consultations, mainly specialty consultations, are held at each remote site every month. Psychiatry, cardiology, and dermatology are the most common telemedicine specialties, although there is some use of the system with peripheral instruments and for trauma consults. Telemedicine has not worked for rheumatology. The network also is used as an administrative link among the UW sites.
The system uses a frame-relay setup with 56-kbps switched lines to the rural sites, which lack either the capacity or funding to get digital lines. Communications are imperfect at this speed and produce video artifacts, for example. The local Seattle locations have T1 linkages. For teledermatology, the participants use store-and-forward capabilities; other consultations are synchronous. The telephone lines cost, on average, $3,000 to $5,000 monthly, which could not be covered without grant money. The system uses PictureTel equipment.
The local physicians are satisfied with the program; they use the library's resources and learn to like the Internet. But several issues must be resolved before the telemedicine program can be expanded. First, the program crosses state lines and requires licensed physicians at either end, so beginning students cannot present patients. When students do clinical rotations in rural areas (as half of the students must do), they must get licensed in both states, unless they are participating in telemedicine programs operated by the federal government (such as the Veterans Administration), which are exempt from state licensure requirements. Second, insurance is an issue; some patients will not use this medium for consultations because their insurance will not cover it. Third, telemedicine can affect local referral patterns. Finally, reimbursement is a major problem. Consulting doctors are currently paid from the grant money. The HCFA and the Rural Health Policy Agency support only 80 percent of a normal payment, and the doctor has to split the fee with the referrer. Montana approved telemedicine for Medicaid because it saves money normally spent on patient travel.
The Human-Interface Technology (HIT) laboratory is a research unit in the UW College of Engineering. It has a roster of 108 people; 18 are regular staff members and the rest are made up of faculty associates, visiting scholars, graduate students, and so on. The income of the laboratory is about $17 million, two-thirds of which comes from grants and contracts. Forty companies participate as members of the Virtual Worlds Consortium, providing a little less than one-sixth of the revenue. Spinoffs from the laboratory include 13 different companies; 10 of these are still active, including 3 created recently.
Suzanne Weghorst, assistant director, introduced the HIT laboratory, which has a goal of developing and demonstrating mission-transferable technology. Tom Furness, the director, elaborated on the laboratory's history. He worked for many years at Wright-Patterson Air Force Base, designing human-machine interfaces in airplane cockpits. He came to Seattle in 1989 to found the laboratory, through which he hoped to extend the scope of his work, with the general goal of providing better coupling of humans to advanced machines.
Furness emphasizes his view of the future as technology that simulates "being there," improving humans' ability to transport themselves by moving their eyes to different places and times. Such experiences range from teleconferencing to "transport" through an endoscope to view hidden portions of the body. For instance, a videotape made 5 years ago showed an interactive teleconference in which participants in the United States and Japan wore virtual reality (VR) helmets and cooperated in the task of herding virtual creatures across a conference table with paddles. The telecommunications link consisted of four ISDN lines. The images displayed were somewhat flat and cartoonish, but the interaction was successful.
The committee was shown three active laboratory efforts. In the virtual operating room, the participant wears a VR helmet and holds a control stick. The helmet's location and orientation are sensed by a device mounted on a fixed stand over the space, and this information is used to drive the displays to the video helmet (much in the manner of current VR games). The environment is based on photographs and equipment from Harborview Medical Center. There is a patient on an operating table; by using the control stick, the participant can manipulate displays, such as the electrocardiogram output, and instruments, such as an endoscope inserted in the patient's lower abdomen. The displays can be controlled so that they are visible regardless of the participant's perspective, or they can be fixed at various positions in the virtual room. There were noticeable lags in the display's tracking when the participant turned quickly, and the overall precision of the location sensors appeared to be on the order of inches rather than millimeters.
The second active effort involved simulation of surgical suturing through a computer-controlled force-feedback device. The user interacts with the environment through a pair of scissors holding a virtual needle. A standard video monitor displays the position of the needle relative to a wound in a small area of skin. A finite-element model of the skin (it has about 200 nodes, with a relatively higher concentration of nodes near the wound) simulates the restoring force of the skin against the needle and controls the force feedback. If the user inserts the virtual needle orthogonally to the skin surface, for example, then little force is felt, but if he or she holds it at an oblique angle, then considerable force is needed to pierce the skin. The force-feedback device requires updates about 1,000 times per second; the visual display runs at a standard 30-cycles-per-second refresh rate.
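The two update rates quoted above are deliberately decoupled: force feedback must run at roughly 1,000 Hz for the device to feel stable, while the monitor needs only about 30 Hz. A minimal sketch of how such loops can share one timeline (the structure is illustrative, not the laboratory's actual code):

```python
# Sketch of a decoupled haptic/visual update scheme: the haptic loop runs
# at ~1,000 Hz (the rate cited in the text), while the display redraws at
# only ~30 Hz. Here one simulated second of updates is counted per loop.
HAPTIC_HZ = 1000   # force updates per second
VISUAL_HZ = 30     # standard video refresh rate

def simulate(seconds=1):
    haptic_steps = 0
    visual_frames = 0
    # Step time at the finer (haptic) resolution.
    for step in range(seconds * HAPTIC_HZ):
        haptic_steps += 1          # recompute skin forces, drive device
        # Redraw only every (HAPTIC_HZ // VISUAL_HZ)th haptic step.
        if step % (HAPTIC_HZ // VISUAL_HZ) == 0:
            visual_frames += 1     # render needle and skin mesh
    return haptic_steps, visual_frames

print(simulate())  # (1000, 31): about 33 force updates per rendered frame
```

In a real system each loop would run on its own thread or on dedicated hardware, but the ratio of roughly 33 force updates per displayed frame is the essential point.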
The third effort is the Virtual Retinal Display. The idea is to paint an image directly on the retina with photons instead of projecting the display elsewhere and requiring the person to follow it visually. The image appears only on the participant's retina. The lab bench setup included low-power red, green, and blue laser light sources fed through an optical fiber, with the fiber's output scanned mechanically over the retina by a moving mirror. Although the lab setup seemed cumbersome, a company called Microvision was spun off in 1993 to commercialize this technology and evidently has had some success in reducing it to a practical size and weight for portable use. Because the image can be focused so that it passes through only a small part of the user's cornea and can potentially be focused on a specific part of the retina, the technology holds promise for assisting persons with impaired vision and providing bright, high-precision displays for VR applications. A number of surprising results have been found in experimental use. For instance, users do not perceive flicker in the static images even at relatively low refresh rates (e.g., 15 frames per second), meaning that the system offers twice the resolution of other displays at the same bandwidth.
Jim Corbato discussed the evolution, current architecture, and expected future evolution of high-bandwidth IP network infrastructures for UW and its Seattle-area health care partners, which include the Harborview Medical Center, Children's Hospital, and the Fred Hutchinson Cancer Research Center. Network interconnections in Seattle are simplified by the physical proximity of data networks from various interexchange carriers (e.g., Qwest and US West) and ISPs at the gigaPOP facility in downtown Seattle. The network connections have relatively poor site security (because they are in a general-use office building) and present a potential single point of failure for Internet access for the entire Pacific Northwest.
UW received a phase 1 award from the NLM to examine biomedical applications that would benefit from the NGI and has since received a phase 2 award from NLM to further this work. This project is titled Patient-Centric Tools for Regional Collaborative Cancer Care Using the NGI. Brent Stewart outlined the project, in which a high-performance metropolitan area network is being designed to transmit clinical data, including radiology images (with the capacity to deliver eight 10-MB image sets simultaneously over a 622-Mbps channel), to a planned cancer care facility on the south shore of Lake Union. The center is scheduled to be completed within a year. The bandwidth requirement is based on fully digital radiology, with interpretation of images by radiologists at UW and a digital archive at UW rather than at the clinic site.
The development of this center is based on three hypotheses: that health care is becoming highly distributed and differentiated; that health care is operating in a resource-limited environment; and that the NGI will enable more collaborative practice, regardless of where patients are located at a given time. The NGI will enable the formation of the cancer care alliance; facilitate teaching and research; enable a fully integrated team approach to diagnosis, treatment, and management of cases; and accelerate the discovery and dissemination of knowledge.
The underlying technology will consist of the local gigaPOP as well as a virtual, enterprise-wide multimedia electronic medical record based on MINDSCAPE. There will be a backup line in place, perhaps a leased DS-3 line. The system will transmit real-time video (e.g., ultrasound, fluoroscope, and synchronous telemedicine consultations), store-and-forward video, and interactive radiation oncology treatment planning (e.g., graphics, images, and video). Additional multimedia knowledge resources, such as the Digital Anatomist and streaming video for patient education, also will be available.
Technical requirements are based largely on the needs of remote radiological image archiving and display. To allow the simultaneous downloading of eight different 10-MB images within one second, the system needs bandwidth of 640 Mbps. The UW and Harborview radiology departments are all digital now, but they use computed radiography (in which an imaging plate is scanned by a laser) rather than flat-panel digital. The centers will become fully digitized once the technology comes down in price. Stewart envisions that a radiologist covering at a remote site might use the system to perform work that he or she would have done at the home site, downloading images remotely.
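The 640-Mbps figure follows directly from the stated requirement. A quick back-of-the-envelope check (assuming decimal megabytes, as is conventional for link rates):

```python
# Verify the bandwidth requirement quoted above: eight 10-MB image sets
# delivered simultaneously within one second.
MB = 10**6                      # bytes per (decimal) megabyte
image_sets = 8
bytes_per_set = 10 * MB         # one 10-MB radiology image set
bits_to_move = image_sets * bytes_per_set * 8   # 8 bits per byte
required_mbps = bits_to_move / 10**6            # bits per second -> Mbps

print(required_mbps)  # 640.0, matching the 640 Mbps cited in the text
```

This also explains why the 622-Mbps channel mentioned earlier is sized so close to the requirement: it is the standard OC-12 rate, just under the raw 640-Mbps target.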
There are no formal plans for evaluating the technical infrastructure. Because the Internet today provides no QOS, UW will put in place as much bandwidth as possible and use whatever service level results. Dr. Stewart viewed medicine's demand for bandwidth as similar to that of other industries; he compared multisite telemedical collaboration to automobile companies linking together their remote research and development sites.
The UW Health Sciences Library is moving all of its services to Web-based delivery (see <healthlinks.washington.edu>). Geographic issues related to supporting the WWAMI program (i.e., mountains, small towns, long distances between towns) make this transformation necessary. In addition to the WWAMI medical education program, the pharmacy, nursing, public health, and social work programs have distance education programs. Faculty want to deliver digital video to their WWAMI-based students and provide them with access to course materials and the clinical digital library resources. This program will require high bandwidth and many servers. At present, the schools use scanned PDF format to deliver interlibrary loan and course materials (e.g., course notes and reserve materials) to distant students. The documents are an estimated 300 kb each.
The Health Sciences Library offers over 1,400 full-text online electronic journals to UW faculty, staff, and students. A major issue is compliance with licensing agreements. UW librarians negotiate aggressively for licenses allowing digital materials to be available to faculty, staff, and students at any location. They control access with user IDs, passwords, and UW IDs and are increasingly making materials available through a proxy server. Access is also a problem: not all materials are locally loaded, and network latency for materials stored at remote sites (NLM, journal publishers, etc.) is a serious issue for full-text journals and other resources. The plan is to support nomadic computing because students, faculty, and residents move around constantly.
UW was part of the NLM test on access time for PubMed over the Internet that was published recently in the Journal of the American Medical Informatics Association. NLM has tried to track the latency problem and 3 years ago performed a small test of the Utah link for the online journals (because it intended to provide all resources remotely). Users see real degradation of performance from 11 a.m. to 3 p.m. Pacific Standard Time, but that latency is due in part to issues within the UW network (in other words, the latency formerly observed in NLM's MEDLINE for that time period now is observed on the Internet). However, the latency depends on the connection. UW is upgrading the network to 10 Mbps Ethernet (10Base-T) to improve throughput; moving to 100 Mbps Ethernet (100Base-T) on every public library machine is a much more costly venture (about $400 per station). The UW Health Sciences Library has many public stations, including about 150 in the microcomputer lab it manages for the Health Sciences Center and over 100 public workstations in the three Health Sciences Library sites. Remote users may have trouble with commercial connections to library online resources (e.g., through MSN or AOL), which can impose latency problems during certain time periods.
The Digital Anatomist is an NLM-sponsored project to develop an electronic repository of anatomical images. Anatomy is, of course, fundamental to health sciences education, and it provides a framework for organizing other biomedical information. Jim Brinkley presented the work of the UW Structural Informatics Group. The group works in three areas: representations of structural anatomical information, from the level of individual cells to gross anatomy; methods for accessing and using structural information; and practical applications of their tools for research, education, and clinical work. It tries to exploit opportunities for online systems dealing with anatomical information. The key data structure for its work is an ontology of anatomy, developed by Cornelius Rosse, that serves as a common data structure for most of the applications. They call this the foundational model of anatomy.
The system demonstrated for the committee provides authoring tools using both symbolic information (e.g., names, semantics, structures) and spatial images (typically three-dimensional images) of anatomical structures. It consists of a symbolic information database and a separate three-dimensional image database accessed through a single server. A number of intelligent agents have been developed to assist in retrieving and assembling data sets and images. The agents have knowledge of both the information available on the system and the user's level of sophistication. In response to the command ''Show me the structures of the left lung," for example, the system will check the symbolic database to find out what structures are in the left lung, then go to the image database to determine what images are available, and then use a scene-generator to assemble the pieces properly. The user then can highlight particular elements of interest, rotate or zoom in on the image, and remove objects that block the view of other objects of interest. All processing is done on the server.
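The query pipeline just described (symbolic lookup, then image retrieval, then scene assembly) might be sketched as follows. The structure names, file names, and dictionaries here are illustrative stand-ins, not the Digital Anatomist's actual schema or API:

```python
# Hypothetical sketch of the agent pipeline behind a query such as
# "Show me the structures of the left lung".

# Stand-in symbolic database: organ -> contained structures.
SYMBOLIC_DB = {
    "left lung": ["superior lobe", "inferior lobe", "oblique fissure"],
}

# Stand-in image database: structure -> available 3-D model file.
IMAGE_DB = {
    "superior lobe": "sup_lobe.mesh",
    "inferior lobe": "inf_lobe.mesh",
    # no model yet for the oblique fissure
}

def answer_query(organ):
    # 1. Consult the symbolic database for the organ's structures.
    structures = SYMBOLIC_DB.get(organ, [])
    # 2. Consult the image database for models of those structures.
    models = {s: IMAGE_DB[s] for s in structures if s in IMAGE_DB}
    # 3. Hand the available models to a scene generator (stubbed here
    #    as a sorted list) and report anything that could not be drawn.
    return {
        "scene": sorted(models.values()),
        "missing": [s for s in structures if s not in IMAGE_DB],
    }

result = answer_query("left lung")
print(result["scene"])    # ['inf_lobe.mesh', 'sup_lobe.mesh']
print(result["missing"])  # ['oblique fissure']
```

The point of the separation is that the symbolic database answers the "what is in the left lung" question independently of which images happen to exist, so the scene generator can degrade gracefully when a model is missing.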
On the authoring side, the system contains a knowledge-builder for adding information to the symbolic database. It is a relational database containing 25,000 terms describing all structures with dimensions of 1 mm or greater in a particular set of organisms, including humans. For the spatial database, the system can create volumetric, three-dimensional models of anatomical structures from two-dimensional images. It also can create and perform animations. A brain image demonstrated for the committee superimposed vascular structures onto the brain and contained some 100,000 polygons. Images can be annotated for clinical and educational purposes.
On the user side, the system contains an annotated image server. Users can call up images and click on individual structures within the image. The system outlines the selected structure and generates its name. This system is used in anatomy education, but the images are so large and the networks so slow that students tend to use a CD-ROM rather than access the database through either campus or remote networks. The system's quiz mode can test students' knowledge of anatomy. A tutorial system can embed images from the atlas into other documents. The Digital Anatomist and interactive atlas get an estimated 10,000 hits per day.
In a separate effort called the Brain Project, a digital neuroscientist system is being built that will overlay neurological images on images of the brain. The system will allow cutaways of volume-rendered images. These data are collected operatively by neurosurgeons who do real-time
stimulation and mapping of critical regions for various neurological functions, such as speech. NGI technology could streamline this process by enabling real-time superposition of neurological data on an open brain; the system then could sense the surgeon's probes and automatically maintain information about the location of the probe and the result. This would require capabilities similar to telesurgery, but the system would also be linked to databases for documentation and postmortem analysis.
The NGI offers several other opportunities for applying this technology. One is anatomical education, in the form of virtual dissections and intelligent scene generation. Another opportunity is brain mapping for either research (e.g., language mapping to identify correlations between brain structures and language skills/development) or clinical purposes, such as surgical planning. The technology also could provide structure-based visual access to biomedical information. Indeed, some have suggested that the ideal user interface to biomedical information resources is a model of human structure, which can serve as the organizing principle for this information.
The visit to the corporate offices of Regence BlueShield, the largest health care insurer in the Pacific Northwest with annual revenues of approximately $2 billion, took place on February 11, 1999. The committee heard presentations on Internet-related activities within Regence as well as related activities in the Seattle area. The related activities include programs operated by the Foundation for Health Care Quality (FHCQ); the Washington State Department of Public Health; and the Community Health Information Technology Alliance (CHITA), working with a group called Agora.
Regence Web-based Services
Steve Moe, manager of electronic business practices for the Regence Group, presented a Web-based interface application called Network Data Express (NDEX) for determining beneficiary eligibility and making referrals. The Web-based system offers claim status inquiries, provider directories, reference materials (such as the formulary), e-mail, and managed care data and reports. It processes about 20,000 transactions per month (peak times are early in the day and during lunch), doing the work of two or three full-time employees who otherwise would give the same information out by phone (Regence processes millions of claims a month). Regence has deployed 1,500 workstations (including 800 intranet and 700 dial-up systems linked to a private Web server), of which about 30 percent are in use on a regular basis. All users are assigned an ID and password and sign a confidentiality agreement. Patient and customer information is indexed by social security number.
Kirk Bailey, manager of security policy, stated unequivocally that the Internet is considered unsafe and will not be used for Regence's electronic commerce (e-commerce) transactions until it has sufficient security measures and functionality to meet the company's business requirements. The Internet raises concerns about security, privacy, and reliability. Other factors that have slowed the adoption and use of NDEX are a lack of content sponsors, especially among payers; the lack of Web browsers in many provider offices (they will get browsers during their next hardware upgrades, but Regence does not fund such upgrades); and user behavior (e.g., administrative workers are accustomed to using the phone instead of the computer to get information).
Foundation for Health Care Quality
Rick Rubin, president of the FHCQ, gave an overview of the foundation, a not-for-profit entity created in 1988 to meet the shared health information needs of the Seattle region. The foundation serves as a neutral meeting ground for providers, payers, plan purchasers, consumers, and others involved in health care. It participates in or sponsors programs in three areas. One area is e-commerce pilot projects, including a multistate effort funded by the Robert Wood Johnson Foundation to define eligibility and referrals, and CHITA, described in further detail below. The second area is performance measurements for health plans and providers, and the third is consumer affairs.
The FHCQ, which views itself as an economic development agency for the region, has learned a number of lessons about operating in the highly competitive health care marketplace. These lessons emphasize the importance of (1) enabling instead of mandating standards, because mandates may change willingness but do not affect capabilities; (2) making a business case that differentiates needs (i.e., things that stakeholders are willing to pay for) from wants (i.e., things they are not willing to pay for); (3) the Internet, which is widely viewed in the region as a plausible means of achieving long-held visions of seamless integration of information across organizations and which allows organizations to assume that networking capabilities will be in place so they can concentrate on higher order functionality; (4) information security and privacy, which can be either a barrier or an enabler, depending upon the circumstance; (5) the widespread sharing of expertise and information; (6) education as a means of facilitating the migration of information technology into health care, especially through efforts to reengineer the way organizations operate (a
process that can be more important than the technology itself); (7) working to refine national standards and develop implementation manuals; and (8) balancing competition and cooperation (firms can cooperate on some subsets of issues but not on others that are seen as having greater proprietary value).
Current or recent projects include an effort to standardize eligibility information, for which there is agreement on data items but not on presentation. Other regional projects are aimed at exchanging data on pediatric immunizations, referrals, claims, and lab transactions.
Community Health Information Technology Alliance Project with Agora
Peter B. Summerville, director of CHITA, and Kirk Bailey, manager of security policy for the Regence Group and founder of Agora, presented an overview of the Three-State Model Security Prototype. CHITA was chartered in 1997 and has 60 member organizations, including providers, payers, and state agencies. It is part of the FHCQ but has a separate board of directors. Agora, a local group interested in computer security, has about 450 members representing 120 Pacific Northwest region corporations. It was formed by chief information officers and security officers who became increasingly concerned about network vulnerabilities as their companies began to move online.
CHITA's early work focused on eligibility and referral transactions: negotiating agreements on data fields and standards to facilitate the electronic interchange of information. CHITA and the FHCQ worked with organizations in Massachusetts and Minnesota on a three-state project focusing on electronic security. The goals were to determine how electronic security could be implemented affordably and to develop a business case for a community-wide, secure infrastructure for electronic business. The group worked with Science Applications International Corporation (SAIC) to develop a security and risk management plan for business-to-business health information networks.
The plan identifies seven levels of increasing health care security. Together with Agora, CHITA is working to implement health security level 6 (HSL 6) within participating organizations. HSL 6 includes specifications for three network-based information services: authenticated, secure messaging; authenticated, secure file escrow and transfer; and authenticated, role-based access. The security model has been developed and published, and CHITA is in the process of identifying a bridge operator organization that will function as a trusted intermediary to oversee a prototype implementation, followed by a wider pilot project in the region. Issues to be addressed include the identification of a certificate authority,
which might be a nonprofit organization, the government, or a private corporation such as Verisign.
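The third HSL 6 service, authenticated role-based access, can be illustrated with a minimal sketch. The roles, resources, and policy table below are invented examples for illustration only; the published CHITA security model is not reproduced here.

```python
# Illustrative sketch of role-based access control: a user's role, not the
# user's identity, determines which resources are reachable. The policy table
# and role names are hypothetical.

POLICY = {
    "physician": {"clinical_record", "referral"},
    "billing_clerk": {"claim_status", "eligibility"},
}

def authorize(user_role, resource):
    """Allow access only if the role's policy grants the named resource."""
    return resource in POLICY.get(user_role, set())

print(authorize("physician", "referral"))             # → True
print(authorize("billing_clerk", "clinical_record"))  # → False
```

In a deployment such as the one described, the role assertion itself would come from the authentication step (e.g., a certificate issued by the certificate authority), so the access decision depends on a trusted statement of role rather than a local user list.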
CHITA has no plans to try to change the Internet or its direction; rather, it will accommodate whatever weaknesses the Internet exhibits with respect to information security. While asserting that businesses need to move too quickly to wait for the NGI, Mr. Bailey wondered if it would be possible to allocate part of Internet2 (perhaps one or two frequencies) for health care. He also would consider the formation of a separate health information network as a means of avoiding some of the security concerns associated with the Internet.
According to Mr. Bailey, security officers in health care have responsibilities that differ from those of their counterparts in other industries. The applicable state and federal laws are different, the privacy and security concerns are greater, and health care organizations must meet requirements for successful electronic data interchange. At the same time, the health care industry is driven by economics, not privacy.
Washington State Laboratory Reporting Project
Jac Davies, representing the Washington State Department of Health, described the Electronic Laboratory-Based Reporting System (ELBRS) project, which involves the electronic submission and tabulation of reportable events within the state, of which there are fewer than 100,000 every year. (Physicians and testing laboratories are required to report certain conditions to their county health department.)
Such reports generally are sent by regular mail, fax, or voice mail. Public health officials then are required to follow up with the doctor and patient to further investigate possible causes, paths of contagion, and so on. Often, reports are sent to the wrong county and/or are not subsequently forwarded to the state. Furthermore, different states and counties tend to have their own lists of reportable conditions, which are tied closely to local concerns (the conditions vary, for example, between urban and agricultural counties), and they have different rules for where to send the information. As laboratories (and health organizations generally) consolidate into national entities, tracking different reporting requirements has become time-consuming. SmithKline Beecham, for example, operates a number of clinical laboratories and has three or four people dedicated to tracking different reporting requirements.
Under Washington's planned system, lab reports would be sent directly to the state rather than to local health departments. The state then would process the reports and forward information down to local communities and up to the Centers for Disease Control and Prevention (CDC), as necessary. Such centralization would allow the state to better
track incidents across county lines. Planners hope that the system will encourage greater communication between the state and local communities or the CDC and that it will improve compliance with reporting requirements. Several issues have informed the planning for this proposed system. One is the use of the Internet, which is not only a logical choice but also the only viable option. Another issue is the sensitive nature of the data; there is, for example, a state requirement for reporting AIDS cases. A third issue is privacy, which is a major concern of the governor and residents of Washington.
A pilot program is under way with Group Health of Puget Sound. Labs encrypt their test reports and send them to the state health department's file transfer protocol server, which sits outside a firewall. State personnel move the file behind the firewall, check for errors, run it through an HL-7 formatter, put the data on an SQL database server, and send them to the county. They use a public key cryptography system (Pretty Good Privacy) described as minimal. There is no formal program in place for changing keys. According to a preliminary evaluation, the pilot program improved the completion and timeliness of reports. The time required to send information to the local health office improved modestly (to less than 1 day) and the time required to send information to the state improved by an average of 40 days (to about 1 day).
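The pilot's processing steps can be sketched as a simple pipeline. The function names, required fields, and report format below are invented for illustration; the real system used PGP encryption, an FTP drop box outside the firewall, an HL-7 formatter, and an SQL server, none of which is reproduced here.

```python
# Hypothetical sketch of the lab-report pipeline described above: validate each
# incoming report, format it, and route it to the appropriate county queue.

def validate(report):
    """Error check performed behind the firewall: reject incomplete reports."""
    required = {"condition", "county", "lab_id"}
    return required.issubset(report)

def to_hl7_stub(report):
    """Toy stand-in for the HL-7 formatting step (not real HL-7 v2 syntax)."""
    return f"OBR|{report['lab_id']}|{report['condition']}|{report['county']}"

def process(reports):
    """Validate, format, and route reports; return per-county queues."""
    queues = {}
    for r in reports:
        if not validate(r):
            continue  # in practice, errors would be logged and followed up
        queues.setdefault(r["county"], []).append(to_hl7_stub(r))
    return queues

queues = process([
    {"condition": "pertussis", "county": "King", "lab_id": "GH01"},
    {"condition": "measles", "county": "Spokane"},  # missing lab_id, dropped
])
print(queues)  # → {'King': ['OBR|GH01|pertussis|King']}
```

The centralization benefit described in the text comes from this routing step: because all reports pass through one state-level pipeline, incidents that cross county lines are visible in a single place before being forwarded onward.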
1. Bernie H.K. Huang relocated to the Children's Hospital of Los Angeles and the University of Southern California as professor and director of Informatics effective January 1, 2000.