waste levels that may approach a third or more of the nation’s $2 trillion in healthcare expenditures (Fisher et al. 2003; McGlynn 2003). In performance on key vital statistics, the United States ranks below at least two dozen other nations, all of which spend far less on health care.
In part, these problems stem from fragmentation of the delivery system, misplaced patient demand, and responsiveness to legal and economic incentives unrelated to health outcomes. To a growing extent, however, they reflect a structural inability of evidence to keep pace with the need for better information to guide clinical decision making. Moreover, if current approaches are already inadequate, future developments are likely to accentuate the problem. These issues take on added urgency in view of the rapidly shifting landscape of available interventions and scientific knowledge, including the increasing complexity of disease management, the development of new medical technologies, the promise of regenerative medicine, and the growing utility of genomics and proteomics in tailoring disease detection and treatment to each individual. Yet currently, for example, the share of health expenditures devoted to determining what works best is only about one-tenth of 1 percent (AcademyHealth September 2005; Moses et al. 2005).
In the face of this changing terrain, the IOM Roundtable on Evidence-Based Medicine (“the Roundtable”) has been convened to marshal senior national leadership from key sectors to explore a wholly different approach to the development and application of evidence for health care. Evidence-based medicine (EBM) emerged in the twentieth century as a methodology for improving care by integrating individual clinical expertise with the best available external evidence (Sackett et al. 1996), and it serves as a necessary and valuable foundation for future progress. EBM has produced many advances in health care by highlighting the importance of a rigorous scientific base for practice and the important role of physician judgment in delivering individual patient care. However, the increased complexity of health care requires a deepened commitment by all stakeholders to develop a healthcare system engaged in producing the kinds of evidence needed at the point of care for the treatment of individual patients.
Many have asserted that beyond determinations of basic efficacy and safety, the dependence on individually designed, serially constructed, prospective studies to establish relative effectiveness and individual variation in efficacy and safety is simply impractical for most interventions (Rosser 1999; Wilson et al. 2000; Kupersmith et al. 2005; Devereaux et al. 2005; Tunis 2005; McCulloch et al. 2002). Information technology will provide valuable tools to confront these issues by expanding the capability to collect and manage data, but more is needed. A reevaluation of how health care is structured to develop and apply evidence—from health professions training, to infrastructure development, patient engagement, payments, and measurement—will be necessary to orient and direct these tools toward the