Sources of Medical Technology: Universities and Industry

6
Innovation in Cardiac Imaging

STAN N. FINKELSTEIN, KEVIN NEELS, AND GREGORY K. BELL

Technological innovation has been the lifeblood of many sectors of the American economy and, as a result, managers, policymakers, and academic researchers have long sought to understand what factors encourage technological innovation and how this process can be made more productive. While innovation has been studied intensively in a wide range of contexts,1 there remains a considerable need for a better understanding of medical innovation as it occurs in academic, industrial, and government research and development settings. Innovation in medical technology takes place within a unique environment that raises many complex issues regarding the need for collaboration across disciplinary lines and the moral and ethical implications of working with human subjects. Improving our understanding of this process may help us to identify points of leverage and accelerate the pace of technological innovation.

This chapter presents some preliminary findings and hypotheses drawn from field interviews with key participants who are involved in the innovation process in two important and widely used technologies that provide diagnostic information about the heart: nuclear cardiology and echocardiography. These technologies pose some especially interesting problems for innovators since, in both instances, their development and eventual successful application required collaboration

1. Nathan Rosenberg, for example, has published a number of papers on this topic. See, for example, "The direction of technological change: Inducement mechanisms and focusing devices" (Rosenberg, 1969); "Problems in the economist's conceptualization of technological innovation" (Rosenberg, 1975); and "The influence of market demand upon innovation: A critical review of some recent empirical studies" (Rosenberg and Mowery, 1979).



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




between individuals trained in medicine or the life sciences and those trained in engineering or the physical sciences.

Our approach has been to identify a number of distinct innovations within the overall development of each of the main technologies identified above. Through interviews with engineers, scientists, and clinicians in industry and academia who were involved in or highly knowledgeable about each development, we explored the sequences of events leading up to the innovation, the settings within which the events took place, and the backgrounds and interactions of the participants. (Several case write-ups of component innovations appear as appendixes.) Then, drawing upon the findings yielded by our research, we constructed a model to identify elements of the innovation process that seemed to be common to each of the developments we examined. Analysis of this tentative model of the innovation process helped us to identify some points of leverage for increasing the rate and sharpening the focus of innovation. We discuss how these levers could productively stimulate changes in managerial and public policy.

Our focus upon two limited areas of technology reflects a conscious decision to opt for depth rather than breadth of analysis. With only two data points it is impossible to subject our observations and conclusions to rigorous empirical verification; thus, they should be taken as hypotheses and directions for further research rather than as firmly proven facts. Our hope is that an in-depth exploration of these two areas of innovation will provide greater insight into some of the qualitative and serendipitous aspects of the innovation process and inject some new ideas into the ongoing debate over what can and should be done to foster and support this process.

OVERVIEW OF CARDIAC IMAGING

The cardiovascular field provides an excellent opportunity to study the process of innovation. In the past 20 years especially, a number of technological advances in diagnosis and therapy have significantly changed clinical practice. Cardiology now attracts top medical school graduates, and as its practice has become increasingly interventional, many of these new capabilities have diffused from tertiary medical centers to the community. These developments have contributed to observed reductions in death rates from heart disease and to improvements in the quality of life.

Numerous techniques are available for producing images of the heart that provide valuable information for guiding diagnosis, patient assessment, and therapeutic intervention. From this set of techniques we have selected two areas of technology that in recent decades have seen especially significant advances: nuclear cardiology and echocardiography.2

2. Other imaging technologies that have been used in this therapeutic area include X-ray imaging with contrast agents and magnetic resonance imaging (MRI).

Nuclear medicine techniques are

minimally invasive and have been used, for the most part, in patients with known or suspected coronary artery disease—the progressive blockage of the coronary arteries that can eventually lead to ischemia, angina, and heart attack. The ultrasound technique, or echocardiography, is noninvasive and has figured most prominently in the diagnosis of a variety of heart conditions other than coronary disease, such as valve disease, septal defects, and wall motion abnormalities.

As shown in Table 6-1, our study focused on successive innovations in the development of these imaging modalities. We found the two streams of innovation to be largely nonoverlapping clinically, although this may change in the future as echocardiography evolves toward a more prominent place in the evaluation of coronary artery disease patients.

TABLE 6-1 Selected Innovations in Cardiac Imaging

Echocardiography (uses ultrasound):
  Two-dimensional
  Real time
  Phased array
  Pulse Doppler
  Color flow
  Channel expansion (64-128)
  Transesophageal
  Acoustic quantification

Nuclear Cardiology (uses radioisotope emission scintigraphy):
  Thallium-201
  Technetium-99 sestamibi (Tc-99 sestamibi)
  Single photon emission computerized tomography (SPECT) camera
  Triple-headed camera
  Quantitative image interpretation

Nuclear Cardiology

Although radioisotope tracer techniques have been used sporadically in cardiology since the 1920s, their use for the evaluation of myocardial blood flow and pumping ability has become widespread only within the past 15 years. With this technique, a radioisotope, given intravenously, is rapidly and selectively taken up by healthy cardiac tissue. The tracer agent emits high-energy photons whose spatial location can be detected by a scintillation camera. The pattern of high- and low-photon emission densities produces an image of the heart.
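The image-formation step just described, accumulating detected photon emissions into a spatial pattern of counts, can be illustrated with a small sketch. This is a toy simulation under stated assumptions: the 3x3 grid, the uptake values, and the function name are invented for illustration and stand in for real detector physics and camera electronics.

```python
import random
from collections import Counter

def emission_image(uptake, n_photons=20_000, seed=0):
    """Bin simulated photon detections into a grid of counts.

    `uptake` maps a grid cell to its relative tracer uptake; a cell
    with zero uptake (e.g., scar tissue) emits no photons, so its
    count stays at zero and it appears as a dark spot.
    """
    rng = random.Random(seed)
    cells = list(uptake)
    weights = [uptake[cell] for cell in cells]
    # Each detected photon is attributed to the cell that emitted it;
    # high-uptake regions accumulate proportionally more counts.
    return Counter(rng.choices(cells, weights, k=n_photons))

# A hypothetical 3x3 "heart": healthy tissue everywhere except one scar.
uptake = {(row, col): 1.0 for row in range(3) for col in range(3)}
uptake[(1, 1)] = 0.0  # prior infarction: no tracer uptake here
image = emission_image(uptake)
print(image[(1, 1)])  # the scar registers zero counts, i.e., a dark spot
```

Real cameras must also contend with collimation, scatter, and attenuation, none of which a sketch like this captures.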
The clinical utility of these images arises from the simple fact that the radioisotope is taken up only in regions of normal heart tissue with adequate blood flow. Thus, a region of scar tissue left over from a prior infarction will not take up the radioisotope and will show up on the image as a dark spot. Blockages of the coronary arteries resulting in regions of insufficient blood flow will also create dark spots. Such an image of the heart can provide information that is unavailable through other means.

One of the most important clinical applications of nuclear cardiology involves the detection and evaluation of reversible defects. In patients with coronary artery disease the adequacy of the blood flow provided by

the occluded arteries depends upon the patient's level of exercise. At rest the blood flow may be adequate; thus, images taken at rest may appear normal. With exercise, however, the oxygen demands of the myocardium may grow beyond what the occluded arteries can deliver; thus, images taken after exercise may show defects. Such defects are characterized as reversible lesions.3 They can be treated via therapeutic interventions, such as coronary bypass surgery or angioplasty, aimed at restoring normal blood flow.4 Defects that appear both at rest and after exercise, in contrast, are permanent: they represent scar tissue. Such regions cannot be treated with the interventions noted above.

Nuclear cardiology techniques have found other applications in the management of coronary artery disease. Measurements taken at rest or during stress have been shown to provide important information regarding a cardiac patient's long-term prognosis. Nuclear cardiology techniques have been used extensively in patients recovering from heart attacks to determine how much myocardium and function may have been lost (Kotler and Diamond, 1990). Images taken of the blood pool during the brief period before the radioisotope is taken up by the heart can provide valuable quantitative information about the heart's pumping action.

There are at least three important categories of recent technological innovation in cardiac nuclear medicine imaging: advances in cameras and detectors; development of better isotopic labeling agents; and the wider use of computer techniques for image reconstruction and interpretation.
Our research focused on five innovations within these three categories: thallium-201 imaging (see Appendix A); the Tc-99 sestamibi tracer (see Appendix B); the single photon emission computed tomographic (SPECT) camera; the triple-headed camera; and computer-based quantitative image interpretation (see Appendix C for a review of developments in SPECT camera technology).

3. The use of the term "reversible" grows out of the nature of the imaging protocol used in connection with thallium-201. This agent is administered after the patient has exercised sufficiently to raise his or her heart rate to its maximum level. The agent is then taken up by those regions of normal myocardium that have adequate blood flow. Over the next several hours, the thallium-201 redistributes to areas of normal myocardium that have blood flow at rest. The process is not unlike what occurs when a drop of ink is released into a glass of water; over time the ink redistributes throughout the volume of water. In the standard thallium protocol, a second image is taken several hours after the patient has exercised. The redistribution will then have "reversed" the original lesions. The improved imaging agent, Tc-99 sestamibi, was designed specifically to remain in those portions of the heart into which it was originally absorbed. Because it does not redistribute, its use entails a quite different protocol in which the patient receives two separate injections of the agent, given a day apart.

4. In bypass surgery the blocked coronary arteries are surgically bypassed with open vessels taken from other parts of the body. In angioplasty a catheter with a balloon on the tip is inserted into the blocked artery. Inflation of the balloon tip at the site of the blockage mechanically forces open the artery.
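The rest/stress logic behind reversible and fixed defects reduces to a simple decision rule, sketched below. The function, the normalized uptake values, and the 0.5 threshold are hypothetical illustrations of the classification described in this section, not clinical parameters.

```python
def classify_region(rest_uptake, stress_uptake, threshold=0.5):
    """Classify one myocardial region from normalized tracer uptake
    (1.0 = full uptake, 0.0 = none) on rest and stress images.

    The 0.5 threshold separating adequate from deficient uptake is an
    arbitrary illustration, not a clinical value.
    """
    rest_ok = rest_uptake >= threshold
    stress_ok = stress_uptake >= threshold
    if rest_ok and stress_ok:
        return "normal"             # adequate flow at rest and under stress
    if rest_ok:
        return "reversible defect"  # dark only after exercise; a candidate
                                    # for bypass surgery or angioplasty
    return "fixed defect"           # defect present at rest: scar tissue

print(classify_region(0.9, 0.2))  # reversible defect
print(classify_region(0.1, 0.1))  # fixed defect
```

The therapeutic implication follows directly: only the "reversible defect" branch identifies tissue that revascularization could help.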

Echocardiography

Echocardiography employs high-frequency sound waves to generate visual images of cardiac anatomy and function. The basic principles involved are not unlike those of sonar. The technological basis of echocardiography fundamentally shapes the kinds of clinical information it provides. Because the nature of the signal depends strongly on the attenuation and reflection properties of the structures through which it passes, echocardiography has always had a strong anatomical focus. The high rate of image acquisition it provides has also made it a valuable tool for examining and evaluating the movement of cardiac structures. Over the past 15 years this technology has developed substantially, and there is little evidence that the procedures now in general use exhaust its potential. In the coming years we are likely to see continuing improvements in its accuracy, the breadth of clinical information it generates, and its ease of use. And, over the long term, echocardiography may pose a substantial competitive threat to diagnostic nuclear cardiology.

The recent history of echocardiography illustrates the flexibility and power of this technology. The M-mode machines used in early clinical diagnoses could look only along a single axis, providing what was sometimes called an "ice-pick" view. An ultrasonic signal would be transmitted along this axis and reflected off any anatomical structures it encountered. As these structures moved, differences in the length of the travel path created corresponding differences in the time between emission and detection of the reflection. Originally, these machines were limited to the diagnosis of valvular disorders. A skilled clinician who could orient the machine toward a heart valve could observe how the valve moved with the beating of the heart.

The later addition of scanning and more advanced signal processing capabilities enabled clinicians to generate two-dimensional images of the heart. With this development, users of echocardiography were able to examine large-scale cardiac anatomy. The technology quickly evolved from still pictures to real-time moving images. Moving pictures provided valuable information on cardiac function. Changes in volume over the course of the beat cycle provided a basis for assessing the heart's pumping ability. By revealing wall motion abnormalities, real-time imaging also provided indirect information on the presence of scar tissue and/or ischemia.

The development of Pulse Doppler allowed echocardiography to use changes in frequency caused by the motion of the blood in the heart (i.e., the Doppler effect) to generate quantitative information about velocities. The subsequent development of Color Doppler made it possible to display this information visually through color coding of what had previously been a gray-scale image. These capabilities made it possible for clinicians to use echocardiography to monitor the movement of blood through the heart. Using these techniques, they could

detect the backwash of blood through defective heart valves or see jets of blood generated by perforations in the wall separating the right and left sides of the heart.

Other developments have increased the accuracy and usefulness of this technology. For instance, the development of reliable probes that could be inserted into the esophagus improved the resolution of these images by permitting clinicians to view the heart without the distortions caused by the passage of signals through fat, bone, and lung tissue (see Appendix D). The more recent development of acoustic quantification (AQ), which uses techniques to detect and monitor the boundary between the inner edge of the myocardium and the blood pool that it contains, has provided automated real-time measurements of ventricular function—information that has been shown to have major prognostic significance (see Appendix E).

A number of further efforts to extend the capabilities and increase the accuracy of echocardiography are currently under way. One line of investigation aims at the development of echocardiographic contrast agents: agents that generate microbubbles able to pass through the microvasculature and produce an ultrasonic contrast effect. If successful, these agents would make it possible to determine the amount of blood supplied to different regions of the heart—a development that would place echocardiography in direct competition with nuclear cardiology. A second line of investigation involves the development of miniature probes that can be inserted into the heart through the coronary vasculature. At least one clinician has predicted that intraluminal ultrasound imaging using these probes will eventually displace angiography as the "gold standard" for detecting and localizing coronary artery disease. A third effort is attempting to use subtle aspects of the ultrasound signal to characterize the tissues through which the signal has passed. Investigators have attempted to use this technique to identify tumors and to distinguish between scar tissue and ischemic myocardium. Successful achievement of these goals would also firmly position echocardiography as a direct competitor of nuclear cardiology.

THE INNOVATION PROCESS

To analyze and profit from the lessons our collection of case studies generated, we first describe the innovation lifecycle, which serves as a framework for organizing and interpreting the events that occurred during the development of these two technologies. The innovation lifecycle consists of five distinct phases through which most of the innovations that we studied passed. The length of a stage for a particular innovation can vary depending on the technical or clinical expertise and involvement required, scientific advances, and competitive actions. Table 6-2 outlines the innovation lifecycle. Evidence of its application, based upon examples drawn from our case studies, follows.

TABLE 6-2 Innovation Lifecycle

Concept: A revolutionary new product concept is often demonstrated in academia, although the research may be funded by a manufacturer. The concept proves basic technological feasibility.

Prototype: The prototype is a working model of the product, not necessarily designed to resemble the actual product, but sufficient to demonstrate clinical utility and to obtain initial feedback from clinicians.

Commercialization: Production and marketing of the new product are undertaken. Much of the development effort between prototype and commercialization focuses on product design and manufacturing process issues.

Diffusion: The clinical capabilities of the product become clearer. Published articles appear rapidly as a progressively larger and more diffuse group of clinicians use the innovation.

Refinement: Evolutionary changes in the product are made to accommodate clinician requirements and to lower manufacturing costs. These changes represent incremental changes in the function and use of the innovation inspired by growing clinical and manufacturing experience.

Examples from Echocardiography and Nuclear Cardiology

Concept

An innovation in imaging typically begins with an isolated investigator demonstrating the technological feasibility of a new technique. Commonly, the investigator is operating in an academic setting, and in some instances the research may be funded by an equipment manufacturer, often through the donation of equipment. Proof of feasibility typically involves the use of a jury-rigged setup that is awkward to use and that produces results too crude and/or too unreliable to be useful in ordinary clinical practice. Nonetheless, these early demonstrations prove or disprove the technological feasibility of the technique. Successful demonstrations suggest that, if appropriately developed, the technique could be clinically useful.

Consider Color Doppler echocardiography, first tested at the University of Washington. Investigators there, searching for defects in the wall separating the right and left sides of the heart and using early ultrasound equipment, fed their signals into a color display that showed, through color coding, the direction of movement. Documents we reviewed state that nuclear medicine researcher David Kuhl conducted successful early work at the University of Pennsylvania

on computer-assisted image reconstruction of radioisotope scans before the wide dissemination of X-ray computed tomography.5 In the case of nuclear cardiology's SPECT camera, the demonstration took place at the University of Michigan, where John Keyes attached an early-generation planar gamma camera to a gantry and created the "Humongotron," the first SPECT camera.

Concept development, however, need not always rely on the development of instrumentation. Initial academic work on signal attenuation at soft tissue boundaries was conducted by a bioengineer who was later hired by Hewlett-Packard (HP). His initial investigations laid the foundation for the development of prototypes to test the theories of edge detection and acoustic quantification.6

The concept behind the development of the technetium-based sestamibi radiopharmaceutical was born of the need for improved performance parameters. Academic investigations determined that the existing thallium imaging agent needed improvement along two specific dimensions: brightness, having to do with the number and energy levels of the emitted photons; and the distribution properties of the agent in heart tissue. Thus, the development of a prototype involved a meticulous search for an agent with the required characteristics.

Long intervals sometimes separate the proof of a concept's viability from its commercialization. Academics offered initial proof of the technological feasibility of transesophageal echocardiography (TEE) at an early date, and Diasonics, a company active in the ultrasound equipment market, developed commercial TEE probes in the early 1980s. The company ran into financial difficulties, however, and only a small number of its TEE probes were ever produced. Later, at Hewlett-Packard, an electrical engineer working with "Herman," HP's lab skeleton, was concerned about the signal attenuation problems posed by fatty tissue around the ribs as well as with the narrow window on the heart afforded by the space between the ribs. He noticed that the esophagus led directly behind the heart and surmised, correctly, that it would be an ideal window through which to image the heart. When he presented the idea to clinicians, they were excited and recalled the successful probe Diasonics had developed a few years earlier. In fact, a few clinicians still had an old Diasonics probe. One could argue that the technological feasibility of the concept was proven much earlier, but the commercial success of the innovation was triggered by this engineer's rediscovery of the idea while at Hewlett-Packard.

5. Tomographic imaging involves the acquisition of multiple images, typically by rotating the camera or other image acquisition device around the patient. Computer analysis of these multiple images permits the construction of a three-dimensional representation of the structure under examination. That image can then be displayed at any depth and from any angle. Tomographic imaging stands in contrast to simpler planar or two-dimensional imaging techniques.

6. Acoustic quantification involves the use of echocardiographic techniques to measure the heart's pumping action in real time.

Although, in retrospect, our study respondents seemed to be in substantial

agreement about when and where these breakthroughs occurred, it was frequently noted that the eventual significance of an innovation was not usually apparent at the time of concept development. A substantial technical gulf often separated the jury-rigged setup, which proved technological feasibility, from the prototype, whose performance approximated that of the final commercial system. The genuine clinical capabilities of the commercialized system are visible only in embryonic form at the proof-of-concept phase, and considerable clinical vision and experimentation are required to unveil the innovation's full potential. Indeed, sometimes the companies funding research conceived an application that was far removed from what eventually proved clinically useful and commercially viable.

Consider the case of acoustic quantification. When Hewlett-Packard provided research funding to Washington University in St. Louis, its goal was tissue characterization—identification of ischemic heart tissue by its acoustic signature. Many did not anticipate that this research would play a significant role in the efforts to develop edge detection capabilities—that is, identification of the heart/blood pool boundary—that were eventually to enable real-time measurement of ejection fraction7 or ventricular function.

Prototype

The development of a working prototype of the imaging device or agent marks the second step in our innovation process. The capabilities of this prototype bear some resemblance to those of the unit that is eventually made available commercially. At this stage, different groups—potential users, manufacturing experts, system designers, and so forth—have an opportunity to assess both technical and economic feasibility. Ideally, initial problems are discovered in the sketches and models and corrected, rather than incorporated into the engineering of a full-scale prototype.

Where the prototype is developed is also an important element of the innovation process. In almost all the cases we investigated, this step was carried out in an industrial setting. There was only one major exception: the technetium-99 sestamibi radiopharmaceutical, which, because it did not require the development and/or operation of a complex, novel piece of equipment, could be developed by a collaborating pair of university-based research chemists at their own facilities.8

7. Ejection fraction is defined as the percent reduction in ventricular volume over the course of the beat cycle. A high ejection fraction (i.e., a large reduction in volume) implies strong pumping action.

8. Although thallium-201 is also a radiopharmaceutical, its production required the use of a cyclotron, a substantial piece of equipment. Indeed, when New England Nuclear enjoyed its most commanding position within the thallium market, it began construction of the world's only privately owned linear accelerator.

In contrast, the development of working prototypes of ultrasound systems capable of Color Doppler flow imaging or acoustic quantification

were sizable engineering undertakings whose success required the cooperation of teams of skilled technicians. Efforts of this type are not easily organized or carried out in an academic or clinical research setting.

The prototype for HP's introduction of Color Doppler is an important example (see also Appendix F). HP initially believed that Color Doppler would require a complete redesign of the system architecture. The existing equipment simply was not designed to accommodate so many upgrades (Color Doppler would be the eleventh in the series). Such a system redesign would require enormous time and effort; yet Color Doppler was coming on the heels of a long-overdue Pulse Doppler product, and there were concerns about how long a complete redesign would take. Eventually an engineer at HP devised a prototype that allowed HP to introduce Color Doppler as a simple upgrade rather than as part of a redesigned system, thus preserving the installed-base advantage that HP was building and permitting HP to bring the product to market sooner. Even this simpler solution, however, required a sophisticated understanding of the architecture of the existing systems and input from some of HP's best engineers.

Development of a fully functional prototype is a precondition for clinical research into the properties and capabilities of the new technology. Arguably, the most successful and clinically significant imaging innovations allow physicians to see things they were not able to see before. For this very reason, however, the clinical significance of such innovations may not be initially clear. Acoustic quantification is a good example: physicians are still somewhat unsure of the clinical implications of real-time measurements of ventricular function, since they have never seen such measurements before. Access to prototypes allows leading physician collaborators to explore the new technology's capabilities. Publications based on these early investigations play a significant role in defining the initial market for the new technology, facilitating its diffusion into clinical practice.

Commercialization

From prototype to commercialization, the development effort focuses on design and manufacturing issues as the sponsoring firm strives to lower costs, improve yields, and enhance ease of use. The goal of this effort is commercial success; relatively little attention is paid to clinical concerns beyond those identified during trials with the prototype. At the prototype stage, clinical investigators may accept a degree of unwieldiness and unreliability that would be unthinkable in a product designed to appeal to a broad market. In the process of moving to commercial-scale production, however, these aspects of the devices must be refined.

Ramping up to commercial-scale production and developing a more user-friendly interface are not simple tasks. Consider the somewhat ill-fated "Revision L," the twelfth in HP's series of echocardiography enhancements. This

OCR for page 125
Sources of Medical Technology: Universities and Industry project involved the complete redesign of the system architecture, a task that had been postponed by the successful upgrade for Color Doppler. Much of the impetus for this project was the competitive challenge posed by Accuson, a competitor of HP in the ultrasound market. Accuson developed a 128-channel phased array system, which was a great success in ultrasound radiology, and had plans to introduce this system to the cardiology market. There was no shared agreement among researchers at HP that 128 channels would necessarily lead to better images than the current 64-channel system. There was concern, however, that Accuson's success with a "more powerful" system in radiology would legitimize the success of 128-channel systems for echocardiography. Further, the 128-channel system was technologically challenging and offered a more appropriate platform for future expansion. Hewlett-Packard engineers pushed ahead, but encountered two serious problems. The first was due to a change in manufacturing technology. Plans called for the use of surface mount manufacturing techniques instead of straight pin circuit board design. This new technique promised superior cost and space economies, which were achieved only after a tortuous learning period marked by extreme difficulty in debugging the new circuit boards. The second problem involved the scheduling of developmental milestones. This entire effort posed tremendous engineering challenges, not the least of which was development of an appropriate 128-channel architecture. It simply was not possible to develop all of the requisite new technologies and adhere to the project schedule. The result was cost and time overruns and lab morale problems at HP. When the system was finally introduced, several engineers and clinicians agreed that the image quality the new units yielded was no better than the 64-channel units being replaced. 
Another example of the difficulties of commercializing a new technology is the development, in the early 1980s, of the multi-headed SPECT camera, which proceeded from a University of Texas laboratory to the Technicare subsidiary of Johnson & Johnson, whose engineers created a working prototype. In 1986, however, Johnson & Johnson closed down the operations of Technicare and licensed its work on the multi-headed SPECT camera to two start-up firms to complete the commercialization of the technology. SPECT cameras were eventually brought to market, but only after a number of different organizations had started and then abandoned the effort.

Diffusion

Diffusion of an innovation is typically characterized as a process in which successive categories of users become aware of and come to adopt the product. Early adopters are usually opinion leaders who learn about an innovation from the published scientific literature or from the marketing activities of manufacturers. Later adopters learn from earlier users with whom they are professionally and
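The adoption dynamics sketched above, in which early adopters respond to external information (publications, marketing) while later adopters learn from earlier users, are commonly formalized as a Bass-type diffusion model. The following sketch is a generic illustration of that standard model, not an analysis from this chapter; the parameter values and function name are hypothetical.

```python
# Illustrative Bass diffusion model (a standard formalization, not from this
# chapter). p captures external influence (publications, marketing) on early
# adopters; q captures imitation, i.e., later adopters learning from those
# who have already adopted.
def bass_adopters(p, q, market_size, periods):
    """Return the cumulative number of adopters at each period."""
    cumulative = [0.0]
    for _ in range(periods):
        n = cumulative[-1]
        # adoption hazard = external influence + imitation of prior adopters
        new_adopters = (p + q * n / market_size) * (market_size - n)
        cumulative.append(n + new_adopters)
    return cumulative

# Hypothetical market of 1,000 clinical sites followed over 20 periods.
curve = bass_adopters(p=0.03, q=0.40, market_size=1000, periods=20)
```

The resulting curve is S-shaped: adoption starts slowly with opinion leaders, accelerates as word of mouth dominates, and levels off as the market saturates.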

however, who can act both passionately and objectively while constantly evaluating the allocation of scarce R&D resources among project champions.

There are a number of other ways to integrate the clinical community more effectively into the development process for new medical technologies. One way to improve communication between clinicians and engineers is to recruit more multidisciplinary professionals, such as bioengineers, for the research lab. Some firms have traditionally preferred to hire the best electrical, mechanical, or software engineers and encourage them to learn applications on the job. That practice reflected, in part, the culture of the workplace, but also the belief that those who would choose to be labeled as bioengineers were somehow not the most technically capable.

Clinical consultants recognized as leading practitioners in the field are an invaluable source of insight, especially regarding concept creation and image quality assessment. At least one firm we know of invites clinicians to give presentations before lab employees. These presentations give R&D personnel better insight into coronary function, the phenomena clinicians would like to observe, and the clinical implications of information on such phenomena. As leading practitioners, these consultants may also be more aware of academic work by other clinicians on new uses for imaging technology. Furthermore, a long-term relationship allows the consultant to acquire a greater appreciation of the capabilities of the base technology and may help generate new concepts for internal development. In both the concept and prototype stages, these practitioners represent ideal sanity checks to determine whether an innovation might become the leading edge in clinical practice or be relegated to the fanatical fringe.
Of course, in a highly competitive marketplace, the firm must always be aware of the security risks posed by outside consultants, especially during the prototype stage. In the early years of echocardiography the same clinical experts were consulted by all companies. Later, one of the companies we interviewed decided to disguise innovations before requesting clinical assessments.

The broad trends that have made clinical testing and university collaboration more formal and less flexible highlight the importance for industry of maintaining an environment conducive to innovation. The problem of an ossifying innovation process in an organization that can afford bureaucratic controls is one that arises repeatedly. The key to success is maintaining a spirit of entrepreneurship, which is especially important in the concept and prototype stages.11 One of the more useful tools identified in this regard was HP's apparently unstated policy of ten percent unstructured engineering time. This "under the benches" policy seemed to be acknowledged by a majority of the original department engineers we interviewed, but most noted that such a policy was now honored more in the breach than in the observance. That 10 percent cushion, however, is said to be what permitted HP engineers to explore TEE and to propose a superior design for the upgrade of Color Flow Doppler.

Maintaining a continuing stream of medical innovations must remain a critical objective of the academic, industrial, clinical, and government-based individuals who contribute to medical research and development. We hope that our work serves to stimulate additional research that will eventually accelerate the rate and alter the nature of technological change and, thus, make it more responsive to society's needs.

11. Paul M. Cook, CEO of Raychem Corporation, describes how his company has successfully maintained an innovative corporate culture in an interview with William Taylor (1990).

ACKNOWLEDGMENTS

The authors gratefully acknowledge the Hewlett-Packard Company's Medical Products group for its support of this research. The views expressed are those of the authors and not of the sponsoring organization. We especially thank Ben Holmes and Larry Banks, without whose time, effort, and encouragement this work would not have been possible.

REFERENCES

Blume, S. S. 1992. Insight and Industry: On the Dynamics of Technological Change. Cambridge, Mass.: The MIT Press.

Coste, P. 1989. An Historical Examination of the Strategic Issues Which Influenced Technologically Entrepreneurial Firms Serving the Medical Diagnostic Ultrasound Market. Ph.D. dissertation. Claremont Graduate School.

Katz, R. 1988. Managing Professionals in Innovative Organizations. New York: Harper Collins.

Kotler, S. T., and G. A. Diamond. 1990. Exercise thallium-201 scintigraphy in the diagnosis and prognosis of coronary artery disease. Annals of Internal Medicine 113:684–702.

Rosenberg, N. 1969. The direction of technological change: Inducement mechanisms and focusing devices. Economic Development and Cultural Change 18(October):1–24.

Rosenberg, N. 1975. Problems in the economist's conceptualization of technological innovation. In: History of Political Economy, vol. 7. Durham, N.C.: Duke University Press.

Rosenberg, N., and D. Mowery. 1979. The influence of market demand upon innovation: A critical review of some recent empirical studies. Research Policy 8:102–153.

Taylor, W. 1990. The business of innovation: An interview with Paul Cook. Harvard Business Review (March–April).

von Hippel, E., and S. Finkelstein. 1979. Analysis of innovation in automated clinical chemistry analyzers. Science and Public Policy 6:24–37.

APPENDIX A
Thallium Imaging

The introduction and diffusion into widespread clinical practice of a workable procedure for thallium-201 cardiac perfusion imaging was based on a number of distinct discoveries and developments. The various components of the procedure currently in use were developed piecemeal by different individuals at different institutions.

One of the important preconditions for the use of thallium-201 as a perfusion imaging agent was the development of a functional gamma camera. Development of the Anger camera by Hal Anger met this need, moving camera technology beyond the plateau it had reached in the 1960s. This linked array of photomultiplier tubes permitted higher-resolution pictures of the areas of the myocardium perfused by the radioactive tracer. The basic design was substantially refined by manufacturers, who increased the number of photomultiplier tubes, improved collimators, added tomographic imaging capabilities, and increased the number of scanning heads in an effort to improve resolution.

Some of the most significant early development work on the imaging agent was carried out by Elliot Leibowitz, then a radiochemist at Brookhaven National Laboratory. He described thallium as a potassium analogue and recognized the relationship between blood flow and thallium uptake by the myocardium that is the foundation of thallium's usefulness as a perfusion imaging agent. He also developed a procedure for postirradiation purification of the cyclotron-produced radioisotope that was suitable for use in commercial-scale production.

Another key step in the development of the procedure took place at Massachusetts General Hospital, where time-delayed imaging studies carried out by Jerry Pohost explored the redistribution properties of thallium-201.
These studies showed that over a period of hours following the initial injection of the radioisotope, it would redistribute to those portions of the heart that had initially experienced restricted blood flow. Reversible perfusion defects, it was found, served as markers for ischemic but viable areas of the myocardium. This discovery became the foundation for the development of the exercise-rest double-imaging protocol that is still used today.

A number of regulatory factors facilitated the rapid spread of thallium imaging in clinical practice. The toxicology of thallium was well known from prior applications. In 1974, the Atomic Energy Commission announced that it was turning the regulation of radiopharmaceuticals over to the Food and Drug Administration. The actual transfer occurred 18 months later, just as the cardiac imaging procedures for thallium-201 were being refined. Thus, the FDA, because of its lack of experience, was not in a position to make onerous regulatory demands. New England Nuclear (NEN), the company leading the effort to commercialize thallium-201 for radionuclide scanning, also benefited from its location

in Massachusetts, one of the so-called "agreement" states that had taken over responsibility from the federal government for the regulation of cyclotron-produced products. Thus, NEN enjoyed a somewhat simplified regulatory regime even for its nuclear operations.

By the late 1970s, numerous papers had been published documenting the diagnostic properties of thallium-201 imaging. Its value was well established as a tool for detecting the presence of coronary artery disease (CAD), for measuring the extent of viable myocardium in late-stage CAD patients, in postinfarction patients, and in patients who had undergone either angioplasty or bypass surgery. Publications appearing throughout the 1980s documented the findings of long-term follow-up studies showing that the results of thallium-201 scintigraphy were valuable in establishing prognoses for CAD patients.

As experience with thallium-201 scintigraphy accumulated, its shortcomings also became apparent. Interpretation of the images produced by the test required a fair degree of skill, and interpretations could vary from one observer to another. Constraints on camera positioning made certain areas of the myocardium difficult to image, specifically the areas served by the left circumflex artery. Patients unable to achieve maximal exercise could not be tested reliably. The energy level of the emitted photons was not ideally matched to the detection capabilities of the available gamma cameras. Finally, thallium-201's relatively long half-life, coupled with constraints on the total amount of radiation to which a patient could be subjected, limited the dose that could be administered, which placed a ceiling on the total number of photons emitted during a test and, thus, on the absolute information content of the test.
And although the redistribution properties of thallium were valuable in distinguishing between scar tissue and areas of ischemia, they also placed constraints on the testing protocol: initial imaging had to follow injection within a limited time period or redistribution would render the test results invalid.

In the years following the widespread adoption of thallium-201 scintigraphy, efforts were made to overcome the limitations of the original testing protocol. These efforts, in turn, spawned a series of subsequent innovations. Attempts to make the interpretation process more consistent and more readily available to smaller centers led to the development of quantitative image interpretation software. Efforts to improve the ability of the test to detect perfusion defects in hard-to-image portions of cardiac anatomy led to the development of Single Photon Emission Computed Tomography (SPECT). Concern that patients' failure to achieve maximal stress compromised test accuracy led to the use of pharmacological stress agents. Concerns over the emission profile and half-life of thallium-201 led to a search for an imaging agent based on the technetium-99 isotope. Investigators working in this area also hoped to develop an agent with more favorable redistribution properties; their efforts led eventually to the development of Tc-99 sestamibi.

Use of thallium-201 scintigraphy has been constrained by the widespread

availability of a number of competing testing modalities. The presence of coronary artery disease is often apparent simply from a patient's history and presenting symptoms. An electrocardiogram with stress testing can detect the presence of ischemia at a lower cost than thallium scintigraphy, although with less accuracy. At the other end of the spectrum, angiography is both more expensive and more invasive than thallium scintigraphy, but provides what many cardiologists regard as superior diagnostic information. Angiography is also thought to be a prerequisite for either angioplasty or bypass surgery aimed at eliminating the underlying causes of ischemia, providing, as it does, a "road map" for the surgeon or catheterization expert.

The proper place for thallium-201 scintigraphy in this crowded field of alternatives has been the subject of sometimes spirited debate. Some have argued that the incremental information content of a thallium test, given that the patient has already undergone a stress electrocardiogram, is small and of little value. A number of investigators have attempted to identify specific subsets of patients for whom thallium test results can play a critical role in defining the course of treatment. Despite these systematic efforts to identify the appropriate role for thallium testing, however, the choice of diagnostic tests continues to be strongly influenced by individual physician preferences.

APPENDIX B
Tc-99 Sestamibi Tracer

Tc-99 sestamibi is a technetium-based synthetic imaging agent with properties that enable it to produce images of the heart, allowing for an assessment of the "viability" of cardiac muscle and the identification of the possible presence of coronary artery disease. It is regarded by many as an incremental innovation over the use of thallium as the agent for stress imaging studies.
Observations about the clinical value of technetium-99 (Tc-99) were first made in the late 1950s by Powell Richardson, a nuclear medicine physician doing research at Brookhaven National Laboratory. He used a molybdenum generator to produce substantial amounts of this agent with high isotopic purity. His work was published, but his process was not patented. The later development of the gamma camera, first by Hal Anger at the Lawrence Radiation Laboratory, stimulated this line of investigation, as well as the field of nuclear medicine as a whole. Various commercial firms, including New England Nuclear (NEN), went into the business of producing radioisotopes for medical application. One of the early products was a Tc-based agent for imaging bone. Thallium-201, produced by cyclotron, became the isotope of choice for cardiac imaging work.

As thallium-201 gained acceptance and stress myocardial imaging became an important part of the evaluation of patients for coronary

artery disease, many sought to improve on thallium's properties. First, thallium's rapid redistribution made the nature and quality of the image highly dependent on the time elapsed following injection. Second, the cyclotron-based production process for thallium limited its accessibility.12 Third, the relatively long half-life of thallium-201 limited the dose that could be administered to a patient. Finally, the energy profile of thallium's gamma emissions poorly fit the detection capabilities of the available cameras.

Two firms, NEN and Squibb, were known to be actively pursuing work on a Tc-99 agent for cardiac imaging in the early 1980s. NEN funded or collaborated with Deutsch at the University of Cincinnati to produce an agent in 1982. Its chemical structure was technetium dimethyl phosphine, and it was given the name "Cardiolyte." This agent was quite successful in animal studies, but when tested in humans at the Massachusetts General Hospital, it did not successfully image the human heart.

Since 1980, Alan Davison, an inorganic chemist at the Massachusetts Institute of Technology, and Alun Jones, a nuclear chemist at Harvard, had been actively pursuing research on a synthetic Tc-99 agent that would correct some of thallium's shortcomings in imaging the heart. They had personal relationships with staff at NEN and were working with the expectation that NEN would be interested in licensing their agent once it was developed. The acquisition of NEN by DuPont, however, led to a shift in NEN's research priorities. Under DuPont's management, staff began to screen a large number of liquids for desirable properties in a search for an imaging agent of their own. Their initial lack of interest in Davison and Jones's agent can be seen as an example of the "not invented here" syndrome. DuPont eventually licensed the Davison/Jones agent, Tc-99 sestamibi, and sought FDA marketing approval.
DuPont's inexperience with FDA submissions delayed the approval process by two years. Tc-99 sestamibi was nonetheless approved and introduced to market in 1991, and by 1992 it had achieved worldwide sales of over $100 million.

The Tc-99 agent eventually introduced by Squibb for cardiac imaging has not been quite as successful. Its extremely rapid washout properties make perfusion imaging even more technically challenging than with thallium-201. It has been said, however, that in the hands of a skilled operator these properties permit high patient throughput.

Clinicians have received these Tc-99 agents favorably and have adopted them for use in cardiac imaging. With regard to substitution for thallium, there is still debate over whether these new agents produce significantly more clinical information than their predecessors.

12. This process required that thallium-201 be produced at a small number of cyclotron sites, from which it was shipped to physicians. During shipment the isotope would decay. Producers overfilled vials to guarantee that enough isotope would remain viable upon arrival to permit testing, which drove up costs. It was impossible to store thallium on-site, either at the cyclotron facility or at hospitals.
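The overfilling practice described in footnote 12 follows directly from exponential decay. The sketch below uses the known half-life of thallium-201 (about 73 hours) and a hypothetical transit time; the function name is ours, for illustration only.

```python
# Illustrative decay arithmetic (not from the chapter): how much extra
# thallium-201 a producer must ship so that enough activity survives
# transit. Tl-201 has a half-life of about 73 hours.
def overfill_factor(transit_hours, half_life_hours=73.0):
    """Units of activity shipped per unit of activity needed on arrival."""
    surviving_fraction = 0.5 ** (transit_hours / half_life_hours)
    return 1.0 / surviving_fraction

# Hypothetical 24-hour shipment: roughly a quarter more activity than is
# needed on arrival must be shipped, with the excess cost passed along.
factor = overfill_factor(24)
```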

APPENDIX C
Single Photon Emission Computed Tomography

The early development of Single Photon Emission Computed Tomographic (SPECT) imaging took place between 1958 and the early 1970s at the University of Pennsylvania. Dr. David Kuhl, a physician trained in nuclear medicine, and his collaborators constructed an array of radiation detectors. They were then able to produce a map of radionuclide concentration by taking sequential images of a series of cross-sectional slices. These sequential images were then available for back projection and image reconstruction and could yield information on previously unobservable physiological changes in the body. The original work was driven by the clinical need to image the brain; cardiac imaging came later. Much of Kuhl's work preceded the development and availability of X-ray computed tomography (CT) in the early 1970s, when the use of image reconstruction algorithms became commonplace. Broader application of the early SPECT research, and further progress, was nonetheless facilitated and stimulated by the acceptance of X-ray CT.

From 1970 to 1974, Dr. John Keyes, working at the University of Michigan and the University of Rochester, adapted a gamma camera to perform cross-sectional imaging of the brain using back projection. In 1974, he and his colleagues built a rotating gamma camera, facetiously referred to as the "Humongatron," which was used for early brain imaging by SPECT through 1976.

Initial images produced by Keyes's camera or Kuhl's detector array were crude and needed a great deal of improvement. The term "error" was used to refer to the qualitative difference between the image and the actual clinical condition, but "error" encompasses several dimensions that were actually or potentially measurable: resolution, attenuation, scatter, artifact, collimator error, and uniformity.
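The back projection idea described above can be sketched in miniature. The code below is an illustrative, unfiltered back projection over just two viewing directions (row and column sums), not Kuhl's actual method; real SPECT systems acquire many angles and apply filtered reconstruction.

```python
import numpy as np

# Minimal sketch of unfiltered back projection (illustrative only). Each
# one-dimensional projection is "smeared" back across the image plane along
# its viewing direction; summing the smears concentrates intensity where
# the radionuclide actually was.
def backproject(row_sums, col_sums):
    """Reconstruct an n x n slice from its row and column projections."""
    image = np.zeros((len(row_sums), len(col_sums)))
    image += np.asarray(row_sums)[:, None]  # smear row sums horizontally
    image += np.asarray(col_sums)[None, :]  # smear column sums vertically
    return image

# Hypothetical 4x4 slice with a single concentration of tracer at (1, 2)
truth = np.zeros((4, 4))
truth[1, 2] = 1.0
recon = backproject(truth.sum(axis=1), truth.sum(axis=0))
# The reconstruction peaks at the true location, though it is blurred
# along both viewing directions.
```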
Three independently pursued streams of innovation led to image improvement: better image reconstruction algorithms, better imaging agents, and improvements to the camera itself. Development of better algorithms took place in industry and academia, but most prominently at Lawrence Berkeley Laboratory under the direction of Thomas Budinger. Published articles (1978–1980) document his contributions to the physics and mathematics of SPECT image improvement.

The wide use of SPECT for brain imaging came about only with the availability of the Tc-based imaging agents. Cardiac imaging with SPECT began to be performed in the early 1980s with thallium. Many investigators were not satisfied that the quality of the images produced by thallium SPECT was significantly better than that of images taken with planar cameras. Even so, commercial SPECT systems capable of imaging the heart became available from a number of manufacturers between 1980 and 1983.

Around 1981, Dr. James Willerson, a cardiologist at the University of Texas at Dallas, received a grant to seed the development of a multi-headed SPECT camera. Willerson eventually forged a relationship with the Technicare instruments division of Johnson & Johnson to build a prototype of the new kind of camera. Clinical testing of the prototype was to begin around 1986 but was delayed for several years by Johnson & Johnson's closure of Technicare in that year. The work was continued at two companies licensed by Johnson & Johnson, Ohio Imaging and Trionics. Ohio Imaging was later acquired by Picker. The University of Texas cardiology group finally acquired its three-headed SPECT camera from Picker in 1988. That company and several others now offer three-headed SPECT cameras commercially.

The three-headed SPECT cameras are capable of resolution of about 7-8 mm, compared to the 20-mm resolution of the very early SPECT cameras. Clinical users, including some who had been skeptical of any improvement in accuracy with single-headed SPECT, have been more impressed with the images produced by these multi-detector cameras. And new cardiac imaging agents such as Tc-99 sestamibi are said to produce further improvements in the quality of SPECT images over those generated with thallium.

APPENDIX D
Transesophageal Echocardiography

Transesophageal echocardiography (TEE) uses the esophagus as a window through which to image the heart. An ultrasonic transducer is mounted near the tip of a modified gastroscope, which is manually inserted down the patient's throat. Controls then permit the operator to position the transducer optimally. TEE is used in both outpatient and operating room (OR) settings. In the outpatient market TEE is ideal for imaging otherwise "difficult" patients: obesity, large chests, and narrow spacing between the ribs all make traditional echocardiography difficult and reduce its accuracy.
With TEE, the transducer can be placed close to the heart with little attenuation of the ultrasound signal by air-filled lungs or bony structures. In the OR setting, TEE allows the anesthesiologist to monitor cardiac function continuously once the chest cavity has been opened. The clear image it provides makes it relatively easy to detect the wall motion abnormalities that mark ischemia. Before TEE there was no other way to image cardiac function during such surgical procedures. Although TEE represents an advance in transducer technology and remote positioning, the TEE probe was actually developed as an add-on to established echocardiography systems.

TEE began in the academic environment. In 1976, researchers mounted a transducer on a coaxial cable and used the esophagus to image the left atrium and

mitral valve. In 1982, academicians brought the idea to Diasonics, which initially commercialized the product. Diasonics sold approximately 50 TEE probes but did not pursue the market opportunity.

In 1985 an engineer at Hewlett-Packard (HP) was working with "Herman," the lab skeleton, and considering the problem of signal attenuation due to tissue around the ribs. He noticed the esophagus behind the heart and had the idea of mounting a transducer at the end of a gastroscope. When he approached clinicians with this observation, they expressed considerable interest and produced for his examination a number of the old Diasonics probes. At the time, HP was heavily involved in the development of Color Flow Doppler imaging, so there were few resources to spare to pursue the TEE opportunity. For the next year the engineer shepherded his underground project through the development process.

Once the product was on the market, design deficiencies surfaced that had resulted from a poor appreciation of exactly how the device would actually be used. The engineer had talked to a number of gastroenterologists, who were very concerned with the probe's potential to damage a diseased esophagus. Cardiologists, however, simply wanted to use the probe to obtain good pictures; they were not shy about applying force to the probe to position it correctly. Consequently, a number of the initial probes broke because the forces the cardiologists subjected them to were quite different from those the engineer had anticipated. Eventually the bugs were worked out of the design, and the TEE probe became a small but significant addition to HP's echocardiographic product line.
APPENDIX E
Acoustic Quantification

Acoustic quantification (AQ) is a technology that makes use of recognizable differences in patterns of ultrasonic backscatter to identify the "edge" of the heart, that is, the interface between the cardiac musculature and the blood. This software- and hardware-embodied innovation permits measurement of ventricular function on a beat-to-beat basis. The AQ technology emerged from research, conducted largely by academic engineers and physicians, aimed at the use of ultrasound for the characterization of tissues (i.e., for a "noninvasive biopsy"). AQ represented a successful tangent to a main line of research that had been ongoing for nearly two decades without leading to meaningful products.

Pete Melton, an academic researcher with B.S. and M.S. degrees in electrical engineering, a Ph.D. in biomedical engineering, and some experience running a large teaching hospital's clinical laboratory, began AQ work in 1978, examining the characteristics of the ultrasonic signal coming back from various tissues,

especially the heart. In 1981, he and a collaborator, working at the University of Iowa, documented the differences in backscatter from the heart compared to other tissues. They published an article in 1983 showing real-time images and proposing an algorithm for identifying the edge of the heart and measuring left ventricular volume.

Melton left academia for industry in 1984. After spending nine months at Diasonics, he came to the Hewlett-Packard (HP) Imaging Systems Division, where he pursued this work until 1988. At HP, his assignment was to pursue tissue characterization (TC) rather than AQ. As the development paths of TC and AQ diverged, the bioengineer made informal arrangements to continue his AQ work, making prototypes "quietly." To do so, he had help from two "unassigned" engineers, one specializing in hardware and one in software. Melton did much of his work "outside the system." Clinical trials were done rather informally in collaboration with an anesthesiologist at the University of California, San Francisco, and cardiologists at Washington University (St. Louis) and the University of Iowa.

Melton's work eventually led to a prototype suitable for demonstration before clinicians. Their enthusiastic response convinced HP to "give the effort a project number" and initiate a full-scale development effort. HP introduced AQ as an enhancement to its high-end cardiac ultrasound units. The company's eventual decision to emphasize AQ rather than TC was heavily influenced by the enthusiastic response of clinicians to Melton's prototype.

AQ has, so far, been received quite favorably in the marketplace. HP believes it is gaining market share, or at least solidifying its position as market leader, because of it. Competitors acknowledge that their own products have suffered in comparison.
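The edge identification at the core of AQ can be illustrated with a toy version of the idea: blood returns much weaker ultrasonic backscatter than myocardium, so a threshold crossing along a scan line marks the blood/tissue interface. This sketch is ours, not HP's proprietary algorithm; the threshold and sample values are hypothetical.

```python
# Toy illustration of AQ-style edge detection (not HP's actual algorithm).
# Blood gives weak ultrasonic backscatter; myocardium gives strong
# backscatter. The first sample exceeding a threshold marks the interface.
def find_edge(scan_line, threshold):
    """Return the index of the first sample at or above the threshold."""
    for i, intensity in enumerate(scan_line):
        if intensity >= threshold:
            return i
    return None  # no tissue interface detected on this line

# Hypothetical scan line: low-backscatter blood, then high-backscatter muscle
line = [0.10, 0.12, 0.09, 0.80, 0.95, 0.90]
edge_index = find_edge(line, threshold=0.5)
```

Repeating this on every scan line of every frame traces the cavity outline, from which ventricular volume can be estimated on a beat-to-beat basis.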
Clinical specialists, intrigued by the AQ technology, say its real significance has yet to be established and that it is still too early to say whether it will facilitate the making of noninvasive "statements" about cardiac function.

APPENDIX F
Color Flow Doppler

Color Flow Doppler uses ultrasound to measure blood velocity. Initially, Doppler units were simply glorified stethoscopes: they were blind with respect to the area in which blood velocity was being measured. Ideally, the clinician wanted to place a cursor on the screen and detect blood velocity at that point. Color flow was essentially two-dimensional Doppler, allowing the clinician to measure blood flow velocity, volume, and direction. The great advantage of this technique is that it allows clinicians to detect eddies and backflow, which are evidence of abnormalities that could not be detected otherwise.
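The velocity estimate that underlies both conventional and color flow Doppler comes from the standard Doppler equation; this is textbook ultrasound physics, not a detail specific to HP's implementation:

```latex
% Standard Doppler relation for ultrasound: a scatterer moving at velocity v
% shifts the returned frequency by f_d relative to the transmit frequency f_0.
v = \frac{c \, f_d}{2 f_0 \cos\theta}
```

where c is the speed of sound in tissue (about 1540 m/s), f_0 is the transmit frequency, f_d is the measured frequency shift, and θ is the angle between the ultrasound beam and the direction of flow. A color flow system must form this estimate at every point in the image rather than at a single cursor position.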

Color Flow Doppler required engineering advances in signal processing. A major issue was simply separating the imaging signal from the Doppler signal. Engineers at Hewlett-Packard, Paul Magnin in particular, thought it fairly obvious that physicians would like to measure and display velocity at every point in the image; they just did not know how to achieve it. After struggling to get the Doppler unit out, Magnin started to investigate color flow, developing computer algorithms to process the signal. At the same time, a paper from Aloka Research Labs, published by the American Institute of Ultrasound in Medicine, showed an actual picture of a mechanically swept color flow image. Clearly it could be done, and done commercially. Consequently, development efforts were stepped up, and from eight different algorithms, two were selected that seemed most likely to answer the call. It was not entirely clear which one would be more appropriate, so simulations were set up to test one algorithm against the other. Once clinicians saw the video of the various algorithms, however, it was not clear that color flow would be of clinical significance. Nonetheless, Hewlett-Packard pushed ahead with development because it believed the technology had potential.

The path from concept to prototype was arduous. It seemed that color flow would require a redesign of the system architecture, which had not been designed to be upgradable. Once a redesign was begun, it was nearly impossible to prevent everyone's pet projects from being added to the revision. Introduction of the product seemed far away. Another engineer, again working off the critical path, found a solution that moved the product into the marketplace more quickly: sell it as an upgrade. Although this solution was very well received, the decision to offer Color Flow Doppler as an upgrade postponed some of the redesign problems to the next revision.