C
Examples of Federal, State, and Private-Sector Reporting Systems

The To Err Is Human report (Institute of Medicine, 2000) boosted existing patient safety initiatives and stimulated new ones. In the United States, many types of patient safety reporting systems are now in operation or under development at the federal, state, and private-sector levels. The Institute of Medicine Committee on Data Standards for Patient Safety reviewed a large number of these systems during the study. This appendix summarizes a sample of reporting systems. For each sector the following areas are described:

  • Type of system—reporting or surveillance

  • History of reporting/surveillance system

  • Voluntary or mandatory

  • Reportable events

  • Classification system and severity index

  • Reporting time frame

  • Data collected—format and summary

  • Method of reporting

  • Who reports

  • Root-cause analysis trigger

  • Follow-up, including root cause

  • Other information collected

  • Confidentiality issues

  • Relationship with other reporting systems

  • Relationship with JCAHO

I. FEDERAL REPORTING SYSTEMS

Overview

Within the federal government, eight major patient safety reporting and surveillance systems (see Tables C–1a, C–1b, and C–1c for details) were examined. Most of these systems were initiated by the federal agencies that manage them; however, one was mandated in legislation—the Vaccine Adverse Event Reporting System (VAERS). These federal agencies include the Centers for Disease Control and Prevention (CDC), the Food and Drug Administration (FDA), and the Centers for Medicare and Medicaid Services (CMS), which are all part of the Department of Health and Human Services (DHHS); the Department of Defense (DOD); and the Department of Veterans Affairs (VA).

The CDC manages two of the eight systems: the National Nosocomial Infections Surveillance (NNIS) System and the Dialysis Surveillance Network (DSN). The NNIS system has two components—nosocomial infections and antimicrobial use and resistance. The CDC also works jointly with the FDA to manage VAERS.

The FDA manages MedWatch, which handles reporting of medical device, biologic and blood product, drug product, and special nutritionals events. CMS is developing and will manage the Medicare Patient Safety Monitoring Program (MPSMS), and the DOD manages the Military Health System Patient Safety Program (MHS PSP).

The VA manages the National Center for Patient Safety (NCPS) and is working with the National Aeronautics and Space Administration (NASA) to develop a complementary system called the Patient Safety Reporting System (PSRS).

The longest-operating of these systems is NNIS, which the CDC initiated in 1970. The rest began operating after 1990, several of them only in the past few years. The newest systems are the MPSMS, MHS PSP, and PSRS.

Surveillance or Reporting Systems

Two types of systems are used: surveillance and reporting. In general, surveillance systems abstract data from patient and other records and/or health care personnel to determine if an adverse event has occurred and/or to analyze the data in order to monitor trends. Reporting systems are designed for individuals to report specific events and, in some cases, conduct root-cause analyses (RCAs) to determine the causal factors for these events.

Like surveillance systems, reporting systems can be used to monitor trends. The two CDC-managed systems and the CMS MPSMS are considered surveillance systems; the other five are event-reporting systems. Most of these systems are essentially voluntary, with the exception of VAERS and MedWatch, which mandate reporting by certain parties (health professionals, manufacturers, and/or user facilities). Where reporting is mandatory, specific time frames are established within which reports must be received; these time frames vary according to the seriousness of the event.

Reportable Events

The events reported to and monitored by the federal systems vary a great deal from one system to the next (see the “Reportable events/events monitored” and “Classification system and/or severity index” rows of the tables). Some systems include reporting of close calls (i.e., near misses), while others focus solely on adverse events.1 A few general statements can nonetheless be made. The CDC- and FDA-managed systems tend to focus on specific types of adverse events, defined by patient outcome or by what caused the event: nosocomial infections; infections resulting from hemodialysis; vaccine events; and medical device, biologic and blood product, drug product, and special nutritionals events. Although these systems are quite specific in the events they report and monitor, they can be used across numerous health systems. The focus of the other four systems—MPSMS, MHS PSP, NCPS, and PSRS—is essentially the opposite. They are designed for use within the health systems that serve their members: Medicare, the MHS, and the Veterans Health Administration (VHA). The types of events reported to and monitored by these systems are more general and, in some cases, are not categorized at all. Adverse/serious events are included in all of these systems; however, four of them—MHS PSP, NCPS, PSRS, and MedWatch (for device problems only, via MedSun)—also include close calls and/or near misses. Additionally, the MHS PSP includes non-patient-specific events, such as a fire or system failure in the facility. Often, an organization will classify an event or determine whether an RCA is needed based on a risk assessment scale. For example, the NCPS reporting system classifies events and close calls using the Safety Assessment Code (SAC) matrix and requires an RCA when a close call or adverse event has a high SAC score or at the discretion of the patient safety manager.

1 Adverse/serious events and close calls/near misses are defined differently by each system (see tables).
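To make the SAC-style classification concrete, the sketch below shows one way a severity-by-probability matrix and its RCA trigger could be encoded. It is a minimal illustration only: the severity and probability category names and the score assignments are assumptions chosen for the example, not the official NCPS Safety Assessment Code definitions.

```python
# Illustrative SAC-style risk matrix: severity x probability -> score
# (3 = highest risk, 1 = lowest), loosely modeled on the NCPS approach
# described above. Category names and score assignments are assumptions
# for this example, NOT the official VA Safety Assessment Code definitions.

SAC_TABLE = {
    "catastrophic": {"frequent": 3, "occasional": 3, "uncommon": 3, "remote": 3},
    "major":        {"frequent": 3, "occasional": 3, "uncommon": 2, "remote": 2},
    "moderate":     {"frequent": 2, "occasional": 2, "uncommon": 1, "remote": 1},
    "minor":        {"frequent": 2, "occasional": 1, "uncommon": 1, "remote": 1},
}


def sac_score(severity: str, probability: str) -> int:
    """Look up the risk score (1-3) for an event or close call."""
    return SAC_TABLE[severity.lower()][probability.lower()]


def requires_rca(severity: str, probability: str, manager_discretion: bool = False) -> bool:
    """A full RCA is triggered by the highest score, or at the patient safety
    manager's discretion (mirroring the trigger logic described in the text)."""
    return sac_score(severity, probability) == 3 or manager_discretion


if __name__ == "__main__":
    # Example: a close call judged "major" in potential severity, "occasional" in probability.
    print(sac_score("major", "occasional"))     # 3
    print(requires_rca("major", "occasional"))  # True
```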

Format for Reporting

Each system requires different data to be abstracted or reported, and most of them have a standard format for collecting those data (see “Data collected: Format and summary” row in the tables). The MHS PSP and NCPS do not use a standardized format for initially collecting data—they allow facilities to use locally accepted methods for reporting—but then report the data to their central agencies in a standardized manner. Most of the systems include patient-specific data in their reports; however, patient (and health care worker) identifiers are removed when the data are shared across the system or with an outside party. All of the data in these systems are protected from discovery by law or regulation.
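As a concrete illustration of the de-identification step described above, the sketch below removes direct patient and staff identifiers from a locally collected report before it is shared centrally. The field names are hypothetical examples; none of these systems' actual schemas are given in this appendix.

```python
# Minimal sketch: strip direct patient and staff identifiers from a locally
# collected event report before it is transmitted to a central agency.
# Field names are hypothetical examples, not any system's actual schema.

IDENTIFIER_FIELDS = {"patient_name", "patient_id", "reporter_name", "staff_name"}


def deidentify(report: dict) -> dict:
    """Return a copy of the report with direct identifier fields removed."""
    return {key: value for key, value in report.items() if key not in IDENTIFIER_FIELDS}


if __name__ == "__main__":
    local_report = {
        "patient_name": "Jane Doe",        # removed before central submission
        "patient_id": "12345",             # removed
        "reporter_name": "J. Smith",       # removed
        "event_type": "medication error",  # retained
        "event_date": "2003-06-01",        # retained
        "unit": "ICU",                     # retained
    }
    print(deidentify(local_report))
```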

Method of Reporting

Five of the systems allow for electronic transmission (via disk, e-mail, or the Internet) of reports to the central office; the rest require submission of hardcopy reports, which are then entered into databases by agency personnel. In terms of who can report to these systems and who abstracts the data, most are open to all personnel at participating facilities. The NNIS, however, uses trained personnel at participating hospitals to compile the data. The MHS PSP and NCPS allow reporting according to their facilities’ locally accepted methods, but specific personnel are responsible for compiling and transmitting the data to the central offices. Only two of these systems currently allow consumers (patients and their families) to report events—VAERS and MedWatch. The MHS PSP also welcomes reports from patients and families but has not yet developed the mechanisms to facilitate this avenue of reporting.

Analysis of More Serious Events

All of the systems have in place some means of following up on events, although the type and amount of follow-up vary a great deal across the systems. The primary means of follow-up used by the surveillance systems is data analysis and trend monitoring. Most of the systems allow facilities to do this at the local level; overall analyses and comparisons are usually conducted by the central agency and are often shared with the individual facilities. VAERS and MedWatch both involve FDA review of the most serious events. These reviews may result in several actions, ranging from alerts and label/packaging changes to recalls of vaccine batches or products.

MHS PSP and NCPS both involve extensive RCAs and action plans, which must be monitored for effectiveness. NCPS and MHS PSP also require prompt feedback to the reporter, and patients are informed when they have been involved in an adverse event. PSRS involves the least follow-up—it was built as a complement to the NCPS and is used primarily for learning purposes; however, reporters do receive a confirmation by mail that their report has been received.

Tabular Information

All of this information is broken out in more detail in the tables. Table C–1a includes the two CDC-managed systems and the joint FDA- and CDC-managed VAERS. Table C–1b includes the FDA-managed MedWatch system, CMS’s MPSMS, and the MHS PSP. Table C–1c includes the two VA-managed systems.

TABLE C–1a Selected Examples of Federal Patient Safety/Health Care Reporting and Surveillance Systems

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

Type of System

Surveillance.

History of reporting/surveillance system

The NNIS system is a cooperative effort that began in 1970 between CDC and participating hospitals. The system is used to describe the epidemiology of nosocomial infections and antimicrobial resistance trends.

Voluntary or mandatory

Voluntary.

Reportable events/events monitored

The NNIS system has two components: (1) nosocomial infections and (2) antimicrobial use and resistance (AUR).

In two situations, an infection is considered nosocomial: (1) infection that is acquired in the hospital but does not become evident until after hospital discharge and (2) infection in a neonate that results from passing through the birth canal.

There are two special situations when an infection is NOT considered nosocomial: (1) infection associated with a complication or extension of infection already present on admission, unless a change in pathogen or symptoms strongly suggests the acquisition of a new infection, and (2) in an infant an infection known or proved to have been acquired transplacentally and evident 48 hours or less after birth.

aInformation on the NNIS system has been obtained from the following sources: Gaynes (1998), Gaynes and Horan (1999), Gaynes and Solomon (1996), Horan and Emori (1998), Richards et al. (2001), Centers for Disease Control and Prevention (2002).

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

Surveillance.

Reporting.

DSN is a national surveillance system for monitoring bloodstream and vascular infections. It was initiated by CDC in August 1999.

The National Childhood Vaccine Injury Act (NCVIA) of 1986 mandated the reporting of certain adverse events following vaccination to help ensure the safety of vaccines distributed in the United States. This act led to the establishment of VAERS in November 1990 by the U.S. Department of Health and Human Services.

Voluntary.

Mandatory for health professionals and manufacturers to report events listed in the Reportable Events Table. Voluntary for health professionals and consumers to report reactions to other vaccines not listed in the Reportable Events Table.

Only chronic hemodialysis patients are included. Reportable events are significant bacterial infections resulting from hemodialysis. These events are identified because they include either a hospitalization or in-unit intravenous (IV) antimicrobial start.

The NCVIA requires reporting of:

  • Any event set forth in the Reportable Events Table that occurs within a specified time period (these are summarized below).

  • Any event listed in the manufacturer’s package insert as a contraindication to subsequent doses.

Vaccine/toxoid = Tetanus in any combination

  • Anaphylaxis or anaphylactic shock

  • Brachial neuritis

bInformation on DSN has been obtained from the following sources: Centers for Disease Control and Prevention (1999), Centers for Disease Control and Prevention: Hospital Infections Program (2000).

cInformation on VAERS has been obtained from the following sources: Food and Drug Administration (1999, 2001b).

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

 

Two conditions are NOT considered infections: (1) colonization, or the presence of microorganisms that are not causing adverse clinical signs or symptoms, and (2) inflammation that results from tissue response to injury or stimulation by noninfectious agents, such as chemicals.

 

The AUR surveillance system requires, for a range of pathogens, the reporting of antimicrobial resistance. Each pathogen requires data for different antimicrobial agents.

 

The pathogens are Staphylococcus aureus, coagulase-negative staphylococci, Enterococcus species, Streptococcus pneumoniae, Escherichia coli, Klebsiella pneumoniae, Enterobacter species, and Pseudomonas aeruginosa.

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

 

  • Any sequela (including death) of above events

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Pertussis in any combination

  • Anaphylaxis or anaphylactic shock

  • Encephalopathy (or encephalitis)

  • Any sequela (including death) of above events

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Measles, mumps, and rubella in any combination

Same events as pertussis in any combination

Vaccine/toxoid = Rubella in any combination

  • Chronic arthritis

  • Any sequela (including death) of above events

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Inactivated Polio (IPV)

  • Anaphylaxis or anaphylactic shock

  • Any sequela (including death) of above events

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Hepatitis B

Same events as IPV

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

Classification system and/or severity index

All infections are categorized into major and specific infection sites, using standard CDC definitions that include laboratory and clinical criteria.

Surgical site infection rates are stratified by a risk index based on wound classification, duration of operation, and the American Society of Anesthesiologists severity assessment score.

Reporting time frame

Not applicable—surveillance is ongoing.

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

 

Vaccine/toxoid = Hemophilus influenzae type b (polysaccharide)

  • Early-onset Hib disease

  • Any sequela (including death) of above events

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Hemophilus influenzae type b (conjugate)

  • Events described in manufacturer’s package insert as contraindications to additional doses of vaccine

Vaccine/toxoid = Varicella

  • Same events as Hemophilus influenzae type b (conjugate)

Vaccine/toxoid = Rotavirus

  • Same events as Hemophilus influenzae type b (conjugate)

Vaccine/toxoid = Pneumococcal conjugate

  • Same events as Hemophilus influenzae type b (conjugate)

Events are classified initially according to outcome: hospitalization or in-unit IV antimicrobial start.

They are further classified according to the vascular accesses that the patient has, the problems that led to hospitalization or in-unit IV antimicrobial start, and the results of blood cultures done in the hospital or dialysis unit.

No severity index.

Reported adverse events that are listed on the Reportable Events Table are categorized by type of vaccine, to the extent possible.

No severity index, but outcomes are recorded.

Not applicable—surveillance is ongoing.

For consumers: No restriction on the time lapse between the vaccination and the start of the event or between the event and the time the report is made.

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

Data collected: Format and summary

Standard format—data are collected using four standardized protocols called surveillance components: adult and pediatric intensive care unit, high-risk nursery, surgical patient, and antimicrobial use and resistance.

Essential data collected for infections include patient name, age, and sex; hospital identification number; service; ward/intensive care unit (ICU); admission date; infection onset date and site of infection; and laboratory data, including pathogen(s) and antibiogram. AUR surveillance system: Prescribing practices—each hospital must identify its antimicrobial agent prescribing practices. For each antimicrobial agent, the hospital identifies whether it is in the formulary and, if so, whether an automatic stop order exists, whether approval for use is needed outside the ICU(s), and whether approval for use is needed inside the ICU(s).

Microbiology lab data: For the purposes of data collection, a hospital unit is defined to be an individual ICU or the total non-ICU inpatient care area or the total outpatient care area. Each unit must report: (1) the total number of clinical cultures processed for the particular month; (2) for each pathogen the total number of bacterial isolates classified as susceptible, intermediate, and resistant to at least one of the relevant antimicrobial agents; and (3) for each pathogen the total number of isolates processed in the laboratory that month.

Pharmacy data: Each inpatient unit must report the total number of grams or millions of units of each parenteral antimicrobial agent received and the total number of grams of each oral antimicrobial agent received in the particular month.

Method of reporting

Entered into CDC-provided software and transmitted routinely to CDC via dedicated phone line and modem. Reports are provided on a monthly basis.

Who reports

Trained infection control personnel at the 300 participating hospitals. To participate, a hospital must have 100 or more “set up and staffed” acute care beds and meet minimum requirements for infection control staffing.

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

 

For health professionals: Time frame between administration of the vaccine and onset of an adverse event varies according to type of vaccine and event. The onset interval is listed in the Reportable Events Table.

Standard format—information collected for hospitalizations: that patients have been hospitalized; the problem or diagnoses prompting hospital admission, especially whether the patient had signs and symptoms of access infection; and the results of blood cultures done in the hospital soon after admission.

Standard format: Data collected include: description of adverse event; relevant diagnostic tests and/or laboratory data; information about the vaccines administered (e.g., type, manufacturer, lot number, date administered); and patient information, including relevant history.

Information collected for in-unit IV antimicrobial starts: that patients were started on an IV antimicrobial in-unit; the problem or diagnosis prompting use of the IV antimicrobial, especially whether patients had signs and symptoms of access infection; and the results of blood cultures done in the unit.

 

Paper forms that are mailed to CDC or via an Internet-based system.

Form available online or by calling VAERS. It must be printed and mailed back to VAERS.

Dialysis center personnel.

Consumers, health professionals, and manufacturers.

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

RCA trigger

None. However, a hospital compares its data to the aggregate and makes decisions about whether to intervene based on its own prevention targets.

Follow-up (including RCA)

Hospitals may use the data collected to compare their infection rates with those of similar patient populations within the hospital, with external benchmark rates, or with their own rates over time.

Other information collected through the system

Information describing important risk factors for infection can be collected if it will be analyzed and used by the hospital.

Corresponding denominator data are collected so that risk-adjusted infection rates can be calculated.

Information on adverse outcomes of nosocomial infection is also collected (death, secondary bloodstream infection).

Confidentiality issues

The CDC Division of Healthcare Quality Improvement (formerly Division of Hospital Infections) obtained authorization to collect these data under the protection of Section 308(d) of the Public Health Service Act. The legislation stipulates that no information in a project protected by 308(d) can be used for any purpose other than the purpose for which it was supplied, nor be published or released in an identifiable format unless the establishment or person supplying the information or described in it has consented to such release.

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

None.

None.

No direct follow-up on events. However, centers using the Internet-based system can generate and print data analysis reports whenever desired.

The FDA reviews reports of individual serious events (including hospitalizations, life-threatening events, and deaths) weekly.

The FDA also analyzes patterns of reporting associated with vaccine lots, looking for more death reports than would be expected on the basis of factors such as time in use and chance variation and for any unusual patterns in other serious reports within a lot.

If evaluation of reports signaling a safety risk confirms that risk, the batch of vaccine can be recalled.

None.

Health professionals and consumers may report any clinically significant adverse event occurring after the administration of any vaccine licensed in the United States.

The CDC Division of Healthcare Quality Improvement (formerly Division of Hospital Infections) obtained authorization to collect these data under the protection of Section 308(d) of the Public Health Service Act. The legislation stipulates that no information in a project protected by 308(d) can be used for any purpose other than the purpose for which it was supplied, nor be published or released in an identifiable format unless the establishment or person supplying the information or described in it has consented to such release.

The National Childhood Vaccine Injury Act of 1986 provides liability protection through the Vaccine Injury Compensation Program. Therefore, practitioner liability is unaffected by the VAERS reporting requirement.

VAERS data are made available to the public only after removal of patient identification information.

Federal Agency

CDC

Name of System

National Nosocomial Infections Surveillance Systema

Relationship with other reporting systems

None. However, NNIS data are used with hospital discharge data for projections of how many patients had an infection at discharge.

Relationships with Joint Commission on Accreditation of Healthcare Organizations (JCAHO)/ Medicare certification

NNIS central line–associated bloodstream infection rate and device utilization measures are being pilot tested as JCAHO core measures.

CDC

Joint FDA/CDC

Dialysis Surveillance Networkb

Vaccine Adverse Event Report Systemc

None.

None.

None.

None.

TABLE C–1b Selected Examples of Federal Patient Safety/Health Care Reporting and Surveillance Systems

Federal Agency

FDA

Name of System

MedWatcha

Type of system

Reporting.

History of reporting/ surveillance system

The FDA has had a postmarketing surveillance program in place since 1961. The FDA’s system evolved into five separate reporting forms for different products. Then, in 1993, the FDA developed MedWatch to consolidate the forms and eliminate confusion. Three FDA centers are currently responsible for handling reports: The Center for Devices and Radiological Health (CDRH) handles medical device events, the Center for Biologics Evaluation and Research (CBER) handles biologic and blood product events, and the Center for Drug Evaluation and Research (CDER) handles drug product events. In addition to these centers, the Office of Special Nutritionals handles reports of events or product problems associated with special nutritionals, such as dietary supplements, infant formulas, and medical foods. Most recently, in 2002, the CDRH launched a pilot program called MedSun (Medical Product Surveillance Network), which provides a secure, Internet-based data entry system that automates the MedWatch form for reporting medical device problems. MedSun is managed by CODA, a professional research organization.

aInformation on MedWatch has been obtained from the following sources: Henkel (1998), Food and Drug Administration (1996, 2001a, 2002).

bInformation on MPSMS has been obtained from the following sources: personal communication, S. Jencks and S. Kellie, 2002; personal communication, S. Kellie, March 27, 2002.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

Surveillance.

Reporting.

MPSMS is being built under the auspices of DHHS’s Patient Safety Task Force, announced by Secretary Thompson in April 2001. Four federal agencies (AHRQ, CDC, FDA, VHA), which make up the Federal Agency Work Group, are working with CMS to build MPSMS. In addition, CMS has selected Qualidigm, the Connecticut QIO (Quality Improvement Organization), to provide administrative and technical support for the development and maintenance of the MPSMS. The CMS Clinical Data Abstraction Centers will provide data collection support. The MPSMS is being developed to measure and track over time adverse events and their associated patient risk factors among the Medicare population. The goal is to have the system producing national estimates for the initial groups of adverse events by the end of 2002 and to have them included in the National Quality Report in 2003.

Following the release of the IOM report, To Err Is Human (1999), and President Clinton’s Executive Memorandum of December 7, 1999, DOD convened the Patient Safety Working Group, an interdisciplinary group of individuals from the Armed Services, the Uniformed Services University, the Armed Forces Institute of Pathology (AFIP), and the Office of the Assistant Secretary of Defense to review patient safety in the MHS. This group consulted with the VA and implemented a pilot patient safety reporting system from October 2000 to April 2001; in August 2001, DOD Instruction Number 6025.17 “Military Health System Patient Safety Program” was signed. The instruction established a system for identifying and reporting actual and potential problems in medical systems and processes and implementing actions to improve patient safety and health care quality throughout the MHS. The instruction directed that the MHS reporting system would emulate, to the extent that is practical, the reporting system established by the VA. In June 2003, the DOD Working

cInformation on MHS PSP has been obtained from the following sources: personal communications, F. Stewart, February 20 and April 12, 2002; Armed Forces. Confidentiality of Medical Quality Assurance Records: Qualified Immunity for Participants. 10 U.S.C. SS Number 1102 (1986); U.S. Department of Defense 1986; Department of Defense (2001a, b, c).

Federal Agency

FDA

Name of System

MedWatcha

Voluntary or mandatory

Voluntary for consumers and health professionals; mandatory for user facilities, such as hospitals and nursing homes. In addition, MedSun allows for voluntary reporting by user facilities of “close calls” related to medical devices.

Reportable events/events monitored

Serious adverse events and product problems are reported to the FDA directly or via the manufacturer. These include:

  • Death: Report if patient’s death is suspected as being a direct outcome of the adverse event.

  • Life threatening: Report if patient was at substantial risk of dying at the time of the adverse event or it is suspected that the use or continued use of the product would result in the patient’s death.

  • Hospitalization (initial or prolonged): Report if admission to the hospital or prolongation of a hospital stay results because of the adverse event.

  • Disability: Report if the adverse event resulted in a significant, persistent, or permanent damage or disruption in the patient’s body function/structure, physical activities, or quality of life.

  • Congenital anomaly: Report if there are suspicions that exposure to a medical product prior to conception or during pregnancy resulted in an adverse outcome in the child.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

 

Group (now the Patient Safety Planning and Coordination Committee) established requirements on a Web-based Patient Safety Reporting System that will be implemented throughout the MHS. Anticipated deployment is 18 to 24 months. The system will enable voluntary reporting from point of care to the Patient Safety Center located at the AFIP, where deidentified data will be collected, analyzed, and reported. The DOD PSP has also been working with the Agency for Healthcare Research and Quality to integrate with the National Patient Safety Database currently under construction.

Voluntary.

Voluntary.

Adverse events are defined as unintended, measurable harms made more likely by the processes of health care delivery.

The Federal Agency Work Group developed five criteria to select adverse event categories for inclusion in the MPSMS:

  • The adverse event category represents a significant burden to the Medicare population as reflected in the frequency of its occurrence, associated severity of patient harm, morbidity, and/or mortality.

  • The adverse event category falls within the participating agencies’ missions and priorities.

  • The adverse event categories representing outcomes of interest across participating agencies are of higher priority.

Close calls: Defined by DOD Instruction Number 6025.17 as events or situations that may have resulted in harm to a patient but did not, either by chance or through timely intervention; such events also have been referred to as “near-miss” incidents. This definition has since been clarified further to state that near misses are events that did not reach the patient.

Adverse events: Defined by DOD Instruction Number 6025.17 as occurrences or conditions associated with care or services provided that cause unexpected harm to a patient during such care or services. These may be due to acts of commission or omission. Adverse events

Federal Agency

FDA

Name of System

MedWatcha

 

  • Requires intervention to prevent permanent impairment or damage: Report if it is suspected that the use of a medical product may result in a condition that required medical or surgical intervention to preclude permanent impairment or damage to a patient.

In addition, MedSun allows for reporting by user facilities of close calls or the rejection of a device over safety concerns.

Classification system and/or severity index

Adverse events or product problems are classified according to whether they are attributed to medical device, biologic and/or blood product, drug product, or special nutritional product.

No severity index—only serious adverse events or product problems are required to be reported.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

 

  • The adverse event category has been demonstrated to be associated with commonly occurring exposures or hazards.

  • The adverse event category measures may include adverse events themselves, surrogates for adverse events, or modifiable risk factors.

Using these five criteria, the following adverse event categories are currently under development and scheduled for inclusion in the initial version of MPSMS:

  • Adverse events associated with use of central vascular catheters

  • Postoperative pneumonia, urinary tract infection, deep vein thrombosis, and pulmonary embolus.

  • Adverse events associated with joint replacements—specifically hip and knee replacements—and including prosthetic device complications.

  • Bloodstream infections and sepsis syndrome.

  • Adverse drug events.

do not include intentional unsafe acts.

Sentinel events: Defined by DOD Instruction Number 6025.17 as unexpected occurrences involving death or serious physical or psychological injury or risk thereof (as defined by JCAHO).

For each adverse event, three primary elements are precisely defined:

  1. An explicit exposure case definition.

  2. An explicit event case definition, including associated symptoms, physical findings, laboratory values, and treatments particular to that event.

  3. An explicitly defined set of risk factors associated with the event; these risk factors help identify factors contributing to the occurrence of the events. Methodologically, these risk factors may be either confounding or effect-modifying variables.

Events are categorized according to the following types:

Patient suicides/attempts

Wrong site/person/procedure or surgery

Death/injury in restraints

Transfusion errors

Patient falls

Medication errors

Patient elopement

Delay in diagnosis/treatment

Perinatal death

Maternal death

Death associated with transfer

Infant abduction/wrong family

Federal Agency

FDA

Name of System

MedWatcha

Reporting time frame

Mandatory reporting regarding pharmaceuticals:

For each serious or unexpected adverse event, a report must be submitted within 15 working days; all non-15-day reports must be submitted quarterly for the first 3 years after drug approval and annually thereafter; the frequency of reports of (1) serious and unexpected adverse events and (2) therapeutic failures must be periodically monitored, and any significant increase must be reported within 15 days.

Mandatory reporting regarding devices (as outlined by the Safe Medical Devices Act of 1990): User facility: Deaths within 10 working days to the FDA and manufacturer; serious injuries/illnesses within 10 working days to manufacturer or the FDA if manufacturer is unknown; semiannual reports to the FDA and/or manufacturer.

Manufacturer: Deaths, serious injuries, malfunctions to the FDA within 30 calendar days of becoming aware of event; within 5 working days if (1) event necessitates remedial action to prevent an unreasonable risk of substantial harm to the public health or (2) event is one the FDA has requested be reported within 5 days.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

No severity index.

Ventilator death or injury

Anesthesia-related event

Medical equipment event

Fire

Perioperative complication

Other less frequent types

All close calls and adverse events are also classified according to the Safety Assessment Code. The SAC matrix takes into account (1) the actual severity of the event and (2) the probability of occurrence according to specific definitions. The matrix scores are 3 = highest risk, 2 = intermediate risk, and 1 = lowest risk. Events with scores of SAC 3 are put into one of two groups: adverse event or sentinel event. Events with scores of SAC 1 are also put into two groups: no harm and harm.

Not applicable.

Time frame from occurrence of event/close call to filing a report at an individual facility is determined by that facility’s locally accepted method.

Facilities submit a monthly summary of all close calls and events to the Patient Safety Center. If an event requires an RCA, the facility has 45 days from the date the facility’s patient safety manager becomes aware of the event to submit the RCA.

Federal Agency

FDA

Name of System

MedWatcha

 

Distributor: Deaths, serious injuries/illnesses, and malfunctions to the FDA and manufacturer within 10 working days.

Mandatory reporting regarding biologics/blood products: All events must be reported as soon as possible but no later than 45 calendar days from the date of discovery that a reportable event has occurred.

Data collected: Format and summary

Standard format: Data collected include description of event or problem, relevant tests and/or patient history, suspect product information, and reporter name and contact information.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

The MPSMS will secure data from administrative records and medical records that are already being submitted to the CMS Clinical Data Abstraction Centers for the Medicare Payment Error Prevention Program.

The proportion of hospitalized Medicare beneficiaries with central venous catheters (CVCs), for example, who have evidence of an infection can be calculated using the following numerator and denominator:

Numerator: Number of Medicare beneficiaries who have at least one CVC inserted during index hospitalization, who have an infection and (1) who are continuously entitled to Part A of Medicare for 12 months prior to index admission, (2) who are not enrolled in a managed care organization, (3) who are of any age, (4) whose index hospitalization occurs in an acute care hospital, and (5) whose hospital dates of discharge occur during specified time period.

Denominator: Number of Medicare beneficiaries who have at least one CVC inserted during index hospitalization and (1) who are continuously entitled to Part A of Medicare for 12 months prior to index admission, (2) who are not enrolled in a managed care organization, (3) who are of any age, (4) whose index

Data collected remain in an Excel spreadsheet with all the reportable events mentioned in the classification system. Medication errors have been collected by a medication error reporting system (MedMARx) since June 1, 2003. MedMARx data are centrally collected in the Patient Safety Center.

Data collected by the Patient Safety Center at the AFIP are in two forms: (1) a monthly summary report on a standard form, including number of events in each category broken down according to whether it was a near miss, adverse event (SAC 1–3), or sentinel event (SAC 3), and (2) a copy of every RCA on a standard form.

Federal Agency

FDA

Name of System

MedWatcha

Method of reporting

Online (MedWatch directly or via MedSun for device problems) or by phone, mail, or fax (MedWatch only).

Who reports

Consumers, health professionals, and user facilities.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

hospitalization occurs in an acute care hospital, and (5) whose hospital dates of discharge occur during specified time period.

 

Explicit exposure and event case definitions embedded in an electronic medical record abstraction tool are used to identify exposures and any associated adverse events. Analysis of these data is then conducted to determine whether the patient did, in fact, suffer an adverse event.

In addition, to increase the efficiency of identifying medical records likely to include relevant exposures and associated adverse events, claims-based algorithms are used to target medical records for abstraction.

As a component of the beta test, cognitive interviews are being conducted with an interdisciplinary group of professionals, including clinicians and hospital epidemiologists, to validate the exposure and event case definitions as well as the associations between the exposures and adverse events.

MHS personnel can use a facility’s locally accepted method of reporting an adverse event or close call.

Each medical facility’s patient safety manager submits monthly summary reports (as Excel spreadsheets) and RCAs (including the action plans) to the Patient Safety Center at the AFIP.

Trained medical record abstractors.

Any MHS personnel can report. Patients and families are also welcome to report, but mechanisms to facilitate this reporting have not yet been developed. Names of reporting individuals are deleted from all reports. Prompt feedback to reporting individuals is required.

Federal Agency

FDA

Name of System

MedWatcha

RCA trigger

None. However, all reports from health professionals and specific reports from manufacturers are reviewed individually by an FDA health professional safety evaluator, with attention to all serious events that are not due to labeling in the case of pharmaceuticals.

Follow-up (including RCA)

MedWatch: No direct follow-up with reporter.

Based on review of incidents, the FDA can follow up with these actions: a “Dear Health Professional” letter or Safety Alert; labeling, name, or packaging changes; further epidemiologic investigations; requests for manufacturer-sponsored postmarketing studies; inspections of manufacturers’ facilities or records; or work with a manufacturer regarding possible withdrawal of a medical product from the market.

MedSun allows for additional follow-up, including a monthly newsletter reviewing all reports; alerts, advisories, and recall notices; access to special analyses of the MedSun and MAUDEd databases; and an annual conference.

Other information collected through the system

Health professionals may report any adverse event that they judge to be clinically significant, whether it is considered serious by the FDA definition or not.

dMAUDE is the Manufacturer and User Facility Device Experience database, which serves as the reporting system for events involving medical devices.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

None. However, the presence of risk factors is noted during the medical record abstraction process.

All events with an actual SAC score of 3 require a full RCA. Facilities are also encouraged to do an RCA on any event or near miss when they believe it would be helpful.

If an event has an SAC score of 3 AND it is a fall or medication error, more data are collected and aggregated for an analysis done every quarter.

None at this time.

If an event/close call warrants an individual RCA, a team is formed to conduct the RCA. The team facilitator is the performance improvement subject matter expert and the team leader is the content expert. Three to five other members are selected for the team. This team can use several tools: (1) a computer-aided software tool (TapRoot) that leads them through the steps of an RCA and (2) a list of “triage” or “memory jogger” questions. The team then completes an RCA form (a set of Microsoft Word templates that include the RCA and the proposed action plan) for submission to the medical treatment facility, JCAHO if needed, and the Patient Safety Center.

None.

None.

Federal Agency

FDA

Name of System

MedWatcha

Confidentiality issues

Identities of both reporters and patients are protected by FDA regulations. In 1995, an additional regulation went into effect extending this protection against disclosure by preempting state discovery laws regarding voluntary reports held by pharmaceutical, biological, and medical device manufacturers.

Relationship with other reporting systems

Receives medication error reports from the U.S. Pharmacopeia’s (USP’s) Medication Errors Reporting (MER) Program and USP’s MedMARx system.

Receives reports of transfusion errors from the Medical Event Reporting System for Transfusion Medicine (MERS-TM).

Relationships with JCAHO/ Medicare certification

Adverse event monitoring is linked to JCAHO standards. To be accredited, JCAHO requires each hospital to monitor for adverse events involving pharmaceuticals and devices.

CMS

DOD

Medicare Patient Safety Monitoring Systemb

Military Health System Patient Safety Programc

The MPSMS is a QIO quality study, and information collected is protected by federal law against disclosure in a form that identifies individuals or providers, as well as against discovery or subpoena in civil actions.

All records and information of the MHS PSP are considered medical quality assurance records and are confidential under 10 U.S.C. 1102 and DOD Directive 6040.37 (references (d) and (e)). Aggregate statistical information at the DOD-wide or service-wide levels may be provided consistent with references (d) and (e). Except as specifically authorized (e.g., JCAHO sentinel events reporting), MHS PSP records or information are not to be disclosed unless authorized by references (d) and (e) and also by other applicable authority or authorized by the Assistant Secretary of Defense for Health Affairs.

No patient or health care provider identifiers are included in the reports, RCAs, action plans, or aggregate reviews.

None.

No direct relationships.

 

All sentinel events meeting the JCAHO definition of reviewable sentinel event are to be reported to JCAHO. The completed RCA and action plan also should be made available to JCAHO consistent with JCAHO’s policy and time limits.

TABLE C–1c Federal Patient Safety/Health Care Reporting and Surveillance Systems

Federal Agency

VHA

VHA/NASA

Name of System

National Center for Patient Safety Reporting Systema

Patient Safety Reporting Systemb

Type of system

Reporting.

Reporting.

History of reporting/ surveillance system

In 1997, the VA implemented the Patient Safety Improvement (PSI) initiative after identifying patient safety as a high priority within its health care system. The PSI included a Sentinel Event Reporting System, whose purpose was to prevent adverse events through an understanding of systems-level causes and then following up with corrective actions. This system was in place until late 1998 when, based on the recommendations of the External Panel on Patient Safety System Design, the VA established the dedicated National Center for Patient Safety to redesign the PSI in order to increase reporting and enhance the utility of reports. Then, after conducting two pilot studies, full-scale national rollout of the reporting system took place between April and August 2000.

In May 2000, the VHA formalized an agreement with NASA to develop PSRS, which is designed to be a complementary external system to the internal NCPS Reporting System. For the VA, PSRS serves as a “safety valve” for incidents that otherwise may go unreported to the internal NCPS system. Pilot testing of PSRS began in March 2001 at a few selected VA medical centers, and the system became available to all VA medical centers in FY 2002. The VA pays NASA to independently operate PSRS according to the Memorandum of Understanding between the two agencies. PSRS builds on more than 25 years of NASA experience in running the Aviation Safety Reporting System for the Federal Aviation Administration.

aInformation on the NCPS Reporting System has been obtained from the following sources: Agency for Healthcare Research and Quality (2002); Overhage (2003), U.S. Code (1980), Department of Veterans Affairs (2001, 2002).

bInformation on the PSRS has been obtained from the following sources: Agency for Healthcare Research and Quality (2002), U.S. Code (1980), Department of Veterans Affairs (2001), Department of Veterans Affairs and National Aeronautics and Space Administration (2000).

Federal Agency

VHA

VHA/NASA

Name of System

National Center for Patient Safety Reporting Systema

Patient Safety Reporting Systemb

Voluntary or mandatory

Participation in the program is mandatory. Performing RCAs on adverse events that score high on the NCPS Safety Assessment Code is mandatory. Those incidents that are reported locally must be transmitted to the NCPS.

Voluntary.

Reportable events/events monitored

Close calls: Defined as events or situations that could have resulted in an accident, injury, or illness but did not, either by chance or through timely intervention.

Adverse events: Defined as untoward incidents, therapeutic misadventures, iatrogenic injuries, or other adverse occurrences directly associated with care or services provided within the jurisdiction of a medical center, outpatient clinic, or other facility. Adverse events may result from acts of commission or omission.

An event that is believed by a potential reporter to be a result of an “intentionally unsafe act” is NOT to be reported to the NCPS system but should be reported to the facility director or other authorities.

An “intentionally unsafe act” is defined as a criminal act, a purposefully unsafe act, an act related to alcohol or substance abuse by an impaired provider and/or staff, or events involving alleged or suspected patient abuse of any kind.

Adverse events and close calls (as defined by NCPS) and lessons learned or safety ideas. Intentionally unsafe acts (as defined by NCPS) are NOT to be reported to PSRS.


Classification system and/or severity index

All close calls and adverse events are classified to establish their priority for analysis according to the Safety Assessment Code (SAC). The SAC matrix takes into account (1) the actual or potential severity of the event and (2) the probability of occurrence according to specific definitions. The matrix scores are 3 = highest risk, 2 = intermediate risk, and 1 = lowest risk.

When developing root-cause/ contributing factor (RC/CF) statements, the team uses a paper tool called “NCPS Triage Cards” that include several prompting questions. The applicable questions are documented with the RC/CF statements.

Additionally, four types of events—falls, medication errors, missing patients, and parasuicidal behavior—are categorized for aggregate RCA review.

Additional categorization of reports began in late 2002 using an NCPS-developed Primary Analysis and Categorization (PAC) methodology, which includes key attributes of the event such as the location of occurrence within the VA Medical Center, the activity or process under way at the time, and other aspects of the adverse events or close calls.

The reporter enters data that can aid in categorization or sorting of reports, such as staff position, where the event occurred, the time of occurrence, environmental factors that may have contributed, and other factors such as medical devices or medical records that may have been involved.


Reporting time frame

Time frame from occurrence of event/close call to filing a report at an individual facility is determined by that facility’s locally accepted method.

Once an event/close call is entered into the Patient Safety Information System AND if an RCA is required, the facility has 45 days to complete the RCA.

None.

Data collected: Format and summary

Initial report is not in a standard format; each VA facility has its own locally accepted method of reporting an adverse event or close call to the local VAMC patient safety manager.

The Patient Safety Information System is a computer-aided software tool (SPOT) that is used to record a standard set of data to be used to manage and analyze the adverse event or close call reported.

Data collected in the Patient Safety Information System include date of the event/close call; actual and potential SAC score; description of the event/close call; type of event (if it falls into one of the four categories of falls, medication errors, missing patients, and parasuicidal behavior); flowcharts indicating the initial and final understanding of the event; references, resources, and personnel consulted in the investigation; root-cause/contributing factors; lessons learned and corrective actions to be taken; the outcome measures for each action; concurrence and dialogue with leadership; and the time, money, and resources expended for RCAs.

Standard paper form that is mailed to NASA directly by the individual reporting the incident.

Data collected include background information about the reporter’s position and experience, general event characteristics, and a narrative description of the event.

Method of reporting

VA personnel can use a facility’s locally accepted method of reporting an adverse event or close call; a facility’s Patient Safety Manager (PSM) then uses a computer-aided software tool to triage and manage the event. Safety reports and RCAs are sent in a secure electronic fashion to the NCPS database when completed.

Forms can be obtained in paper format from a VA medical facility or by requesting them from NASA or in electronic (PDF) format from the PSRS Internet homepage. Forms then must be filled out by hand and mailed to NASA.

Who reports

Any VA personnel can report to each facility’s PSM.

Any VA personnel.

RCA trigger

  • All events with an actual SAC score of 3 receive a full RCA.

  • If a close call has a potential SAC score of 3 and falls into one of the four categories indicated earlier, more data are collected and aggregated for an analysis conducted every quarter.

  • Any other close calls with a potential SAC score of 3 receive a full RCA.

  • At the discretion of the PSM and the facility director, any event or close call can undergo an RCA.

None (reports are not subject to RCA).


Follow-up (including RCA)

If an event/close call warrants an individual RCA, a team of frontline VA personnel not involved in the event under consideration performs the RCA. This team uses two tools: (1) a computer-aided software tool (SPOT) that leads them through the steps of an RCA and (2) a cognitive aid called “Triage Questions for Root-Cause Analysis.”

Then, based on the results of the RCA for an event/close call, corrective actions are proposed by the RCA team. The facility director can choose to “concur” or “nonconcur” with these proposed actions. If the director issues a “nonconcur” statement, he or she must furnish a written rationale for this decision; the RCA team then proposes an alternative corrective action. The RCA team also outlines the parties responsible for enacting the corrective actions, including a due date and how the effectiveness of these actions will be evaluated to verify that they had the intended effect.

Aggregate RCA review of the four most common events can be done quarterly.

Additionally, VA personnel who submit reports that result in an RCA receive prompt feedback on actions being taken as a result of their report.

NASA will return a portion of the reporting form, called the Reporter Return Receipt, to the reporter as proof that the report has been received. NASA does not retain any of the information on the return receipt; before the receipt is returned, however, that information may be used to contact the reporter for clarification if necessary.

PSRS is designed to identify vulnerabilities but does not provide detailed solutions, except as proposed by the reporter.


 

Informing patients: The VHA requires disclosure to patients who have been injured by adverse events. This disclosure is separate from the RCA process, and the results of RCAs are kept confidential, for use only in efforts to improve the quality and safety of care provided.

 

Other information collected through the system

None.

None.

Confidentiality issues

RCAs of adverse events and close calls are protected from disclosure under 38 U.S.C. 5705, as part of a medical quality assurance program.

Although there is a requirement to disclose adverse events to patients and families, legal restrictions limit disclosures that would violate patient privacy. Specifically, the Privacy Act limits disclosures to families, and 38 U.S.C. 7332 limits disclosures related to a patient’s treatment for substance abuse, sickle cell anemia, and HIV status, even after a patient’s death.

No patient or VA personnel identifiers are included in the reports entered into the Patient Safety Information System, which contains RCA information.

PSRS reports are considered confidential and privileged quality assurance documents under the provisions of 38 U.S.C. 5705.

PSRS removes all personal names, facility names and locations, and other potentially identifying information before entering reports into its database.


 

VA has developed detailed guidance on disclosing adverse events that is available online at http://www.va.gov/publ/direc/health/infolet/10200301.pdf.

 

Relationship with other reporting systems

No direct relationships.

No direct relationships. It is designed to be complementary to the VA’s NCPS Reporting System.

Relationships with JCAHO/ Medicare certification

If an event is an actual adverse event meeting the JCAHO definition of a reviewable sentinel event, the facility determines whether it will report the event to JCAHO. If an event is reported to JCAHO, the results of the RCA are also reported to JCAHO.

VHA policy requiring disclosure to patients who have been injured by adverse events is consistent with JCAHO requirements that hospitalized patients and their families be told of “unanticipated outcomes” of care.

None.


II. STATE REPORTING SYSTEMS

Overview

In April 2000, the National Academy for State Health Policy (NASHP) reported on a survey to determine the extent to which states had developed reporting systems for medical errors and adverse events (Rosenthal et al., 2000). All 50 states and the District of Columbia responded to the survey. The survey found that 15 states required acute care and general hospitals to report adverse events. Twenty-one states now require such reporting (Rosenthal, 2003).

For illustrative purposes, this section of Appendix C gives an overview of the state-based reporting systems in place in New York and Florida (see Table C–2). These systems represent the broad differences in the types of reporting systems that have been developed to date. For a comprehensive review of the reporting systems for all 21 states, refer to the NASHP Web site at http://www.nashp.org.

Reportable Events

The NASHP reports (Rosenthal et al., 2000, 2001) confirmed the lack of a universal definition of the terms “adverse event” and “medical error.” Some states do not have generic definitions and, instead, specify the types of events that must be reported.

New York State provides the following preamble: “For the purpose of the New York Patient Occurrence and Reporting System (NYPORTS) reporting, an occurrence is an unintended adverse and undesirable development in an individual patient’s condition occurring in a hospital.” New York State also provides a detailed list of events that must be reported (see Table C–2a).

Florida provides this preamble: “The term ‘adverse incident’ means an event over which health care personnel could exercise control and which is associated in whole or in part with medical intervention, rather than the condition for which such intervention occurred, and which results in one of the following injuries.” Lists of events for annual report and code 15 report are provided in Table C–2e.

Format for Reporting

Most states have specified formats for reporting. The 2001 NASHP study examined the commonality of data requirements for eight states and discovered the following:

  • All states collected information on the facility name, the date the incident occurred, and the type of incident.

  • A majority of states collected information on patient identification, provider identification, description of the incident, person reporting the incident, action taken by facility, patient outcome, and notification to other parties (e.g., professional bodies).

  • A minority of states collected information on the identity of witnesses.

In New York State, certain categories of occurrences (i.e., codes 201–854 in Table C–2a) require only the submission of a short form (the data collected are shown in Table C–2b). There is no specific time frame for reporting these occurrences. The idea is to aggregate the data for each category and carry out trend analyses to identify areas where a review of the process might yield improvements. A second set of codes (i.e., 901, 902, 914, 931–935, and 939 at the end of Table C–2a) represents more serious events or those that are statutorily required to be reported. These must be reported within 24 hours or one business day of the occurrence of the event and also require only the submission of the short form (see Table C–2b). The final set of codes (i.e., 108–110, 911–913, 915–923, 938, and 961–963 in Table C–2a) represents the most serious occurrences; these require notification to the New York State Patient Safety Center within 24 hours or one business day of the occurrence of the event using the short form (see Table C–2b), together with an RCA carried out by the hospital (see Analysis of More Serious Events below).
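To make this routing explicit, the following minimal sketch (illustrative only; the groupings follow the code ranges described above, and the function name is hypothetical, not part of NYPORTS) maps an occurrence code to its reporting requirement:

```python
# Illustrative sketch only; groupings follow the NYPORTS code ranges described in
# the text, and the function name is hypothetical. Codes within 201-854 that are
# not defined in Table C-2a are treated the same way here for simplicity.
SHORT_FORM_ONLY = set(range(201, 855))                        # codes 201-854
OTHER_24_HOUR = {901, 902, 914, 931, 932, 933, 934, 935, 939}
SERIOUS_24_HOUR_PLUS_RCA = (
    {108, 109, 110, 938, 961, 962, 963}
    | set(range(911, 914))      # 911-913
    | set(range(915, 924))      # 915-923
)

def nyports_requirement(code: int) -> str:
    """Return the reporting requirement for a NYPORTS occurrence code."""
    if code in SERIOUS_24_HOUR_PLUS_RCA:
        return "Short form within 24 hours/one business day, plus hospital RCA"
    if code in OTHER_24_HOUR:
        return "Short form within 24 hours/one business day"
    if code in SHORT_FORM_ONLY:
        return "Short form only; aggregated for trend analysis"
    return "Not a NYPORTS occurrence code"

print(nyports_requirement(912))  # -> Short form within 24 hours/one business day, plus hospital RCA
```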

Florida state law prescribes the data to be collected. Some of the data elements are coded using existing health care data standards. All events listed in Table C–2e must be reported annually, providing the data given in Table C–2f. More serious events must be reported within 15 days (i.e., Code 15 reports—see Table C–2g for an overview of the data collected).

Method of Reporting

The most common method of delivery of information for state reporting systems is by fax. Regular or certified mail is also used (Rosenthal et al., 2001).

Of those included in the 2001 NASHP report, the New York State system has the most sophisticated delivery system—an Internet-based system with secure firewalls. The Florida system uses fax or certified mail.

Analysis of More Serious Events

Most states use the information collected to trigger on-site investigations and corrective action (Rosenthal et al., 2001), although RCAs are not always explicitly required.

As noted earlier, in New York State the most serious offenses (i.e., codes 108–110, 911–913, 915–923, 938, and 961–963 in Table C–2a) require notification to the New York State Patient Safety Center within 24 hours or one business day from occurrence of the event and an RCA carried out by the hospital. The RCA must be completed within 30 days and reported electronically to NYPORTS (an overview of the data required for the RCA is in Table C–2c). Medication errors (i.e., codes 108–110) are recognized as a special category and therefore require the collection of additional data (see Table C–2d).

The Florida reporting system does not explicitly require formal RCAs for any group of reportable events. However, Code 15 reports require an extensive data collection exercise and descriptions of the causes of the incident and the corrective or proactive actions taken (see Table C–2g).
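The two Florida reporting tracks can be illustrated with a rough sketch (the function name and outcome labels are hypothetical paraphrases of Table C–2e, not part of AHCA’s actual system): the incident’s outcome determines whether a Code 15 report is due within 15 days or the event is captured in the annual report.

```python
from datetime import date, timedelta

# Illustrative sketch only; outcome labels paraphrase Table C-2e and the function
# name is hypothetical (not part of AHCA's reporting system).
CODE_15_OUTCOMES = {
    "death",
    "brain or spinal damage",
    "surgery on wrong patient",
    "wrong-site surgery",
    "wrong surgical procedure",
    "procedure unrelated to diagnosis",
    "repair of undisclosed surgical damage",
    "removal of unplanned foreign object",
}

def florida_reporting_track(outcome: str, incident_date: date) -> str:
    """Return the reporting track for a Florida adverse incident."""
    if outcome in CODE_15_OUTCOMES:
        deadline = incident_date + timedelta(days=15)
        return f"Code 15 report due by {deadline.isoformat()}"
    return "Report in the facility's annual report"

print(florida_reporting_track("death", date(2003, 6, 1)))  # -> Code 15 report due by 2003-06-16
```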

Tabular Information
  • Table C–2: General information on the New York State and Florida reporting systems.

  • Table C–2a: New York State reportable events.

  • Table C–2b: Data collected for all New York State reportable events.

  • Table C–2c: Overview of the data required for all New York State reportable events needing an RCA.

  • Table C–2d: Extra data for all New York State reportable medication errors.

  • Table C–2e: Florida State reportable events.

  • Table C–2f: Data collected annually for all Florida State reportable events.

  • Table C–2g: Data collected within 15 days for all serious Florida State reportable events.


TABLE C–2 Selected Examples: The New York and Florida Reporting Systems

State

New York

Name of System

New York Patient Occurrence Reporting and Tracking Systemb

Type of system

Reporting.

History of reporting/surveillance system

Initial regulations requiring incident reporting were promulgated in 1985. Shortly thereafter, a medical malpractice crisis during the mid-1980s led to the enactment of a statutory reporting requirement—New York State Public Health Law Section 2805-1, Incident Reporting, which created the NYPORTS reporting system. The system covers all hospitals (inpatient and outpatient) and extension clinics listed on a hospital’s Article 28 operating certificate. Freestanding diagnostic and treatment centers, including ambulatory surgery centers, report a limited list of incidents, such as patient deaths or transfers to hospitals, to the New York State Department of Health (NYSDOH) by regulation. The system does not cover long-term care (e.g., nursing homes, hospices), private medical practices, retail pharmacies, or home care.

Voluntary or mandatory

Mandatory.

Reportable events/ events monitored

For the purpose of NYPORTS reporting, an occurrence is an unintended adverse and undesirable development in an individual patient’s condition occurring in a hospital.

aInformation on the Florida State reporting system has been obtained from Florida Health and Human Services, Agency for Health Care Administration (2003); personal communication, A. Polk, Florida Agency for Health Care Administration, 2002; Rosenthal et al. (2001).

bInformation on the New York Patient Occurrence Reporting and Tracking System has been obtained from the following sources: New York Patient Occurrence Reporting and Tracking System (2001) and Rosenthal et al. (2001).


Floridaa

Reporting.

The medical malpractice crisis during the mid-1980s led to the promulgation of the Comprehensive Medical Malpractice Act of 1985, with provisions mandating reporting of adverse or untoward incidents to the Agency for Health Care Administration (AHCA), Bureau of Health Facility Regulation. Legislation modified the reporting requirements in 1998, adding a 24-hour reporting provision and narrowing the scope of reportable incidents.

Mandatory.

For purposes of reporting, the term “adverse incident” means an event over which health care personnel could exercise control and which is associated in whole or in part with medical intervention, rather than the condition for which such intervention occurred, and which:

  1. Results in one of the following injuries:

  1. Death;

  2. Brain or spinal damage;

  3. Permanent disfigurement;

  4. Fracture or dislocation of bones or joints;

  5. A resulting limitation of neurological, physical, or sensory function that continues after discharge from the facility;

  6. Any condition that required specialized medical attention or surgical intervention resulting from nonemergency medical intervention, other than an emergency medical condition, to which the patient has not given informed consent; or

  7. Any condition that required the transfer of the patient, within or outside the facility, to a unit providing a more acute level of care due to the adverse incident, rather than the patient’s condition prior to the adverse incident;


State

New York

Name of System

New York Patient Occurrence Reporting and Tracking Systemb

Classification system and/or severity index

Serious occurrences (codes 108–110, 911–913, 915–923, 938, 961–963, below): patient deaths unrelated to the natural course of illness, disease, or proper treatment in accordance with generally accepted medical standards; injuries and impairments of bodily functions in circumstances other than those related to the natural course of illness, disease, or proper treatment in accordance with generally accepted medical standards; equipment malfunction resulting in death or serious injury.

Less serious occurrences (codes 201–854): adverse events with less serious patient outcomes, such as complications of surgery, burns, and falls.

Other occurrences (codes 901, 902, 914, 931–935, and 939): fires or external disasters, strikes, and unscheduled termination of services vital to the continued safe operation of the facility or the safety of its patients and personnel.

See Table C–2a for a detailed list of NYPORTS codes.

Reporting time frame

  • Serious occurrences: 24 hours/one business day.

  • Other occurrences: 24 hours/one business day.

  • Less serious occurrences: Within 30 days.

Data collected: Format and summary

  • Serious occurrences: Short form (see Table C–2b) plus RCA. Extra data collected for medication errors (see Table C–2d).

  • Other occurrences: Short form (see Table C–2b) only.

  • Less serious occurrences: Short form (see Table C–2b) only.


Floridaa

  2. Was the performance of a surgical procedure on the wrong patient, a wrong surgical procedure, a wrong-site surgical procedure, or a surgical procedure otherwise unrelated to the patient’s diagnosis or medical condition;

  3. Required the surgical repair of damage resulting to a patient from a planned surgical procedure, where the damage was not a recognized specific risk, as disclosed to the patient and documented through the informed-consent process; or

  4. Was a procedure to remove unplanned foreign objects remaining from a surgical procedure.

  1. Events that need to be reported within 15 days (code 15 reports—see Table C–2e).

  2. Events that must be reported on an annual basis (annual reports—see Table C–2e).

As identified above, events need to be reported:

  • Within 15 days, or

  • On an annual basis.

Notification to patient: An appropriately trained person designated by each licensed facility shall inform each patient, or the individual identified as the patient’s health care surrogate, in person about adverse incidents that result in serious harm to the patient. Such notice shall be given as soon as possible to allow the patient an opportunity to minimize damage or injury.

Code 15 report.

Annual report.

See Tables C–2f and C–2g for details.


State

New York

Name of System

New York Patient Occurrence Reporting and Tracking Systemb

Method of reporting

Internet-based with secure firewalls.

Who reports

Identified contact within facility; usually risk managers or quality divisions.

RCA trigger

All serious occurrences require an RCA; an RCA may also be performed at the request of the DOH.

Follow-up (including RCA)

The RCA must be completed within 30 days and reported electronically to NYPORTS (see Table C–2c for more information on the RCA form).

Other information collected through the system

None.

Confidentiality issues

Statutory provisions make reports that are submitted in compliance with the reporting requirement confidential and protect individuals making reports from civil lawsuits and monetary damages (Public Health Law 2805–m).

The confidentiality provisions have been challenged under the state’s Freedom of Information law. In a 1997 decision, the court ruled that under this law, incident reports are protected by the confidentiality statute. However, the court ruled that hospital-specific aggregate (annual) data can be released.

Relationship with other reporting systems

Within hospitals/freestanding clinics, there are other relevant New York State reporting systems—cardiac adverse events, perinatal adverse events, and hemolytic transfusion reactions and other types of blood- and tissue-related adverse events. These four systems collect and analyze statistics—RCA is not mandated, but in-depth assessment similar to RCA is undertaken for the hemolytic and radiologic events by their respective systems. None of the reporting systems are managed by the New York State Patient Safety Center.


Floridaa

Fax or certified mail.

Each facility has a risk manager who collects the adverse event information. The term “root-cause analysis” is not used in the statute; however, the statute does require the facility to investigate and analyze adverse incidents and to develop appropriate measures and other innovative approaches to minimize the risk of adverse incidents to patients. The Code 15 report requires an analysis of the cause of the incident and a list of the corrective or proactive actions taken.

As indicated above, a Code 15 report includes some description of the cause of the event and corrective or proactive actions taken. AHCA may require further documentation from the facility about the incident and its corrective action plan (or RCA), and/or can initiate a survey to assess risk management functions related to the adverse incident (patient safety) and patient quality of care.

A biennial risk management survey is required of all licensed hospitals and ambulatory surgical centers. AHCA is collecting data on citations for nonreporting of adverse incidents.

Statutory provisions make reports of adverse and untoward incidents confidential and not subject to discovery or admission into evidence in civil lawsuits. There has been no challenge to this provision to date.

Under the patient notification provision, notification to the patient of outcomes of care that result in harm does not constitute an acknowledgment or admission of liability, nor can it be introduced as evidence.

Information is shared with professional boards. The Commission for Excellence in Health Care is exploring the coordination of data sources. Although AHCA is not responsible for the intake of complaints, the agency does investigate them and store information in a common database with incident reports.


State

New York

Name of System

New York Patient Occurrence Reporting and Tracking Systemb

 

Although there is some degree of overlap among these systems and NYPORTS, efforts were made to reduce duplicative reporting as much as possible.

In addition to the above three systems and NYPORTS, there is a voluntary “complaints” system covering all aspects of health care in the state. Complaints are processed on a case-by-case basis. Some effort is being made to integrate the complaints system and NYPORTS.

Relationships with JCAHO/Medicare certification

The New York State Department of Health does not deem JCAHO accreditation. However, it has a contract with JCAHO to share surveillance information. The contract is based on information sharing of the overall process, which includes complaint and incident investigations and a range of other surveillance activities. Additionally, there is direct overlap between the JCAHO sentinel events and NYPORTS serious events, with the exception of hemolytic transfusion reactions, which are captured in another unit within DOH.


Floridaa

The Florida Agency for Health Care Administration deems JCAHO accreditation as meeting its biennial licensure requirements. The agency performs validation surveys on approximately 5 percent of JCAHO-accredited facilities each year as directed by CMS.


TABLE C–2a NYPORTS Reportable Events

Broad Category

Codes

Medication errors

108.

A medication error occurred that resulted in permanent patient harm (harm that is enduring and cannot be rectified by treatment).

109.

A medication error occurred that resulted in a near-death event (e.g., cardiac or respiratory arrest requiring Basic Life Support (BLS) or Advanced Cardiac Life Support (ACLS)).

110.

A medication error occurred that resulted in a patient death.

Aspiration

201.

Aspiration pneumonitis/pneumonia in a nonintubated patient related to conscious sedation.

Intravascular catheter related

301.

Necrosis or infection requiring repair (incision and drainage, debridement, or other surgical intervention), regardless of the location of the repair.

302.

Volume overload leading to pulmonary edema.

303.

Pneumothorax, regardless of size or treatment.

Embolic and related disorders

401.

New, acute pulmonary embolism, confirmed, or suspected and treated.

402.

New documented deep-vein thrombosis.

Laparoscopic

501.

All unplanned conversions to an open procedure because of an injury and/or bleeding during the laparoscopic procedure.

Perioperative/ periprocedural related

601.

Any new central neurological deficit (e.g., stroke, hypoxic/anoxic encephalopathy).

602.

Any new peripheral neurological deficit (e.g., palsy, paresis) with motor weakness.

603.

Cardiac arrest with successful resuscitation.

604.

Acute myocardial infarction—unrelated to a cardiac procedure.

605.

Death occurring after procedure (specific to list of 10 procedures).

Burns, falls

701.

Second- and/or third-degree burns.

751.

Falls resulting in x-ray-proven fractures, subdural or epidural hematoma, cerebral contusion, traumatic subarachnoid hemorrhage, and/or internal trauma.

Procedure related

801.

Procedure-related injury requiring repair, removal of an organ, or other procedural intervention.

803.

Hemorrhage or hematoma requiring drainage, evacuation, or other procedural intervention.

804.

Anastomotic leakage requiring repair.


 

805.

Wound dehiscence requiring repair.

806.

Displacement, migration, or breakage of an implant, device, graft, or drain, whether repaired, intentionally left in place, or removed.

807.

Thrombosed distal bypass graft requiring repair.

808.

Post-op wound infection following clean or clean/contaminated case, requiring drainage during the hospital stay or inpatient admission within 30 days. ASA class required.

819.

Any unplanned operation or reoperation related to the primary procedure, regardless of setting of primary procedure.

851.

Postpartum hysterectomy.

852.

Inverted uterus.

853.

Ruptured uterus.

854.

Circumcision requiring repair.

RCA required

911.

Wrong patient, wrong site—surgical procedure.

912.

Incorrect procedure or treatment—invasive.

913.

Unintentionally retained foreign body due to inaccurate surgical count or break in procedural technique.

RCA required: Any unexpected adverse occurrence not directly related to the natural course of the patient’s illness or underlying condition resulting in:

915.

Death (e.g., brain death).

916.

Cardiac and/or respiratory arrest requiring BLS/ACLS intervention.

917.

Loss of limb or organ.

918.

Impairment of limb and impairment present at discharge or for at least 2 weeks after occurrence if patient is not discharged.

919.

Loss or impairment of bodily function and present at discharge or for at least 2 weeks after occurrence if patient is not discharged.

920.

Errors of omission/delay resulting in death or serious injury related to the patient’s underlying condition.

921.

Crime resulting in death or serious injury, as defined in 915–919.

922.

Suicides and attempted suicides with serious injury, as defined in 915–919.

923.

Elopement from the hospital resulting in death or serious injury, as defined in 915–919.

938.

Malfunction of equipment during treatment or diagnosis or a defective product that resulted in death or serious injury, as defined in 915–919.

961.

Infant abduction.

962.

Infant discharged to wrong family.

963.

Rape by another patient or staff (including alleged rape with clinical confirmation).


RCA NOT required

901.

Serious occurrence warranting DOH notification, not covered by 911–963.

902.

Patient transferred to the hospital from the diagnostic and treatment center.

914.

Misadministration of radioactive material (as defined by the Bureau of Environmental Radiation Protection, section 16.25, 10 NYCRR).

931.

Strike by hospital staff.

932.

External disaster outside the control of the hospital that affects facility operations.

933.

Termination of services vital to the continued safe operation of the hospital or to the health and safety of its patients or personnel (e.g., electricity, laundry services).

934.

Poisonings occurring within the hospital (water, air, food).

935.

Hospital fire disrupting patient care or causing harm to patients or staff.

937.

Malfunction of equipment during treatment or diagnosis or a defective product that has a potential for adversely affecting patient or hospital personnel or resulting in a retained foreign body.


TABLE C–2b NYPORTS Short Form

The short form collects a limited amount of data items, including the following:

  • Occurrence date

  • Occurrence code (three-digit code above)

  • ICD–9–CM code corresponding to the diagnosis for which patient was admitted

  • ICD–9 procedure code most closely associated with occurrence

  • Hospital medical record number

  • Location in hospital where incident occurred

  • SPARCS number—the Statewide Planning and Research Cooperative System is a comprehensive patient data system

  • The service for which the patient was originally admitted

  • Date of birth

  • Sex

  • Admission date

  • Readmission date

  • Do you believe that this occurrence will likely lead to (check all that apply) no action, change in policy, formal education/reeducation, discipline taken, process improvement, don’t know yet?

  • Brief summary of occurrence

  • Description of any process improvement that others could learn from

  • Any lesson learned that could be globally beneficial to others

  • Report date and reporter—automatically filled in

  • Hospital name

  • Incident ID number


TABLE C–2c NYPORTS Root-Cause Analysis Form

The root-cause analysis form requires a description of the occurrence, answers to several yes/no questions about why the occurrence happened, and the details of a corrective action plan. The “why it happened” section consists of about 30 questions under several headings:

  • Policy or process (system) in which the event occurred

  • Human resource factors and issues

  • Environment of care, including equipment and other related factors

  • Information management and communication issues

  • Standard of care

  • Leadership: Corporate culture

An example question is, “Staff are properly qualified, yes/no?” If the answer to a question is “no,” the respondent must elaborate on the root cause, develop a plan for improvement, and develop measures to assess effectiveness of risk reduction strategies.

Other elements required in the RCA form are:

  • Literature search

  • Executive summary

  • List of participant titles

TABLE C–2d NYPORTS Medication Supplement

For codes 108–110 the following extra information is collected:

  • Type of occurrence (e.g., wrong patient, wrong drug, wrong dose, wrong route, wrong frequency, wrong time, omission, administration after order discontinued/expired, wrong diluent/concentration/dosage form, monitoring error, other)

  • Where in the process (e.g., prescribing, transcription, dispensing, administration, documentation on the medication administration record)

  • Medication given

  • Medication intended to be given

  • Categories of staff involved

  • Discovery date/time

  • How the occurrence was discovered


TABLE C–2e Florida State Reportable Events

Events that need to be reported within 15 days (Code 15 reports):

All the above, plus the following:

  • The death of a patient

  • Brain or spinal damage to a patient

  • The performance of a surgical procedure on the wrong patient

  • The performance of a wrong site surgical procedure

  • The performance of a wrong surgical procedure

  • Surgical procedure that is unnecessary or otherwise unrelated to the patient’s diagnosis or medical condition

  • The surgical repair of damage resulting to a patient from a planned surgical procedure, where the damage is not a recognized specific risk, as disclosed to the patient and documented through the informed-consent process

  • The performance of procedures to remove unplanned foreign objects remaining from a surgical procedure

Events that must be reported on an annual basis (annual reports):

All the above, plus the following:

  • Permanent disfigurement

  • Fracture or dislocation of bones or joints

  • A resulting limitation of neurological, physical, or sensory function that continues after discharge from the facility

  • Any condition that requires specialized medical attention or surgical intervention resulting from nonemergency medical intervention, other than an emergency medical condition, to which the patient has not given informed consent

  • Any condition that required the transfer of the patient, within or outside the facility, to a unit providing a more acute level of care due to the adverse incident, rather than the patient’s condition prior to the adverse incident


TABLE C–2f Florida State Annual Report

  • Facility information

  • Total number of reportable incidents; total number of surgical incidents; total number of diagnostic incidents; total number of other actions causing injury

  • Surgical, diagnostic, or treatment procedure being performed at time of incident (using ICD–9 Codes 01–99.9)

  • Other actions causing medical injuries (using ICD–9 E Codes and Codes 800–999.9)

  • Accident, event, circumstances, or specific agent that caused the injury or event (using ICD–9 E Codes)

  • Resulting injury (using ICD–9 Codes 800–999.9)

  • License numbers of personnel (or social security numbers of unlicensed personnel) directly involved in incident and relationship to facility

  • A description of all malpractice claims filed against the facility, including the nature of the incident, license numbers of persons involved in the claim, and the status or disposition of each claim

  • Total number of new claims

  • Total number of claims pending

  • Total number of claims closed during the reporting year

  • Copy of the facility’s policies and procedures to reduce risk of patient injuries and adverse incidents

  • Copy of each regular summary to the facility governing board from the risk manager for the calendar year


TABLE C–2g Florida State Code 15 Report

  • Facility information

  • Patient information (i.e., name, identification number, address, age, sex, Medicaid/ Medicare, date of admission, admitting diagnosis, ICD–9 code for admit diagnosis)

  • Incident information (date/time/location)

  • Notification of medical examiner (yes/no/name/contact number)

  • Autopsy performed (yes/no)

  • Description of incident

  • Surgical, diagnostic, or treatment procedure being performed at time of incident (using ICD–9 Codes 01–99.9)

  • Accident, event, circumstances, or specific agent that caused the injury or event (using ICD–9 E Codes)

  • Resulting injury (using ICD–9 Codes 800–999.9)

  • List any equipment directly involved in incident

  • Outcome (e.g., death, fetal death, brain damage, spinal damage, surgical procedure performed on the wrong patient, surgical procedure performed on the wrong site, wrong surgical procedure performed, surgical procedure unrelated to patient’s diagnosis, surgical procedure to remove foreign objects remaining from a surgical procedure, surgical repair of injuries from a planned surgical procedure)

  • License numbers of personnel and capacity or social security numbers of unlicensed personnel directly involved in incident

  • License numbers of witnesses (or social security numbers of unlicensed witnesses)

  • Analysis of cause of incident (description)

  • Corrective or proactive actions taken (description)


III. PRIVATE-SECTOR REPORTING SYSTEMS

Overview

In the private sector, a number of initiatives are working to develop patient safety and/or health care reporting and surveillance systems. Some of these systems are being developed by universities and companies for use in multiple health care organizations and settings, whereas others are being developed by hospital systems for their own internal use or by groups with an interest in specific nonhospital-based practice settings (e.g., family practices). This section addresses the first of these system types.

As noted earlier, it is not the intention of this appendix to be comprehensive but instead to review a representative sample of the patient safety reporting and surveillance systems that are being developed in the private sector. The four private-sector systems summarized in the attached tables were all established for reporting purposes. These systems are:

  • The Medical Event Reporting System for Transfusion Medicine (MERS-TM), which is primarily based and managed at Columbia University and is funded under a grant from the National Heart, Lung, and Blood Institute of the National Institutes of Health.

  • The Medication Errors Reporting (MER) Program, which is operated by the United States Pharmacopeia (USP) in cooperation with the Institute for Safe Medication Practices (ISMP).

  • MedMARx, which is owned and managed by USP.

  • The Joint Commission on Accreditation of Healthcare Organizations (JCAHO) Sentinel Event Policy.

All of these systems were initiated in the 1990s. The longest operating of the four is the USP MER Program, which was begun in 1991 through a USP partnership with the ISMP. USP then purchased the MER Program from ISMP in 1994, but the two organizations continue to operate the system jointly.

In addition, all of these are essentially voluntary, nonpunitive systems, with the possible exception of the JCAHO Sentinel Event Policy. JCAHO-accredited organizations are “encouraged, but not required” to report events meeting JCAHO’s criteria for reviewable sentinel events (see the “Reportable events” row of Table C–3b for more detail). However, if the Joint Commission becomes aware of a reviewable sentinel event that occurred at an accredited organization and was not reported, then that organization must prepare and submit an RCA and action plan to JCAHO. If an acceptable RCA and action plan are not submitted to the Joint Commission within a designated time frame, the organization can be placed on Accreditation Watch and risks having its accreditation status changed to Preliminary Non-Accreditation or Not Accredited.

Reportable Events

Because these four systems were developed by private organizations and are essentially voluntary, they tend to be more limited in their scope than the federal patient safety/health care reporting systems. MERS-TM, the MER Program, and MedMARx systems focus on specific types of events based on what is believed to have caused the event—blood components/transfusion services and medication errors. MERS-TM and MedMARx also collect reports of near-miss events. Of these three systems, MERS-TM and the MER Program are the most applicable across multiple health care practice settings, whereas MedMARx is currently limited to hospital reporting of medication errors. In fact, MERS-TM is in the process of expanding the current transfusion medicine-based near-miss system to a hospital-wide application by investing in information technology for handling large amounts of incident data coming from many locations. The input forms feed directly into the database, which can compare incoming reports with those already in the database. However, all three of these systems are employed only by organizations that choose to participate, and therefore the three systems do not collect data on a nationwide level, as do several of the federal reporting and surveillance systems.

The Joint Commission’s Sentinel Event Policy is more general than the other three—the types of events reported to JCAHO are not limited by causality. Any type of event meeting JCAHO’s Sentinel Event definition (which may be interpreted slightly differently by each accredited organization) can be reported to JCAHO; however, the only events that must be reported are those meeting JCAHO’s list of reviewable sentinel events (see the “Reportable events” row of Table C–3b for more detail). This system covers all organizations that are JCAHO accredited or seeking accreditation; approximately 80 percent of U.S. hospitals are currently involved in the JCAHO accreditation process (Joint Commission on Accreditation of Healthcare Organizations, 2002c).

Format for Reporting

Each system requires slightly different data to be reported, and most of them use a standard format for collecting these data (see “Data collected: Format and summary” row in Tables C–3a and C–3b). The only system that does not use a standard format is JCAHO. The Joint Commission does make a form available for organizations to use in self-reporting sentinel events but does not require use of the form. In addition, JCAHO allows RCAs and action plans to be conducted according to each organization’s locally accepted method; however, these RCAs and action plans are required to be thorough and credible before they will be accepted by JCAHO. Most of these systems include patient information and information about the staff that were involved in, discovered, and in some cases reported an event, but no specific identifiers of individuals are used. In terms of classifying and/or coding the data collected, the MERS-TM and MedMARx systems have the most involved data models (see “Classification system and/or severity index” row in the tables).
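The kinds of fields these standard formats collect can be pictured with a simple, de-identified record. The sketch below is a composite for illustration only; no system uses exactly this schema, and all field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

# Composite illustration only; field names are hypothetical and do not match any
# one system's reporting form. Note the absence of individual identifiers.
@dataclass
class EventReport:
    event_type: str                          # e.g., medication error, transfusion event
    narrative: str                           # free-text description of what happened
    discovered_by_role: str                  # staff role only, never a named individual
    involved_roles: list = field(default_factory=list)
    patient_outcome: Optional[str] = None    # harm category, if any
    severity_class: Optional[str] = None     # system-specific severity/risk code
    near_miss: bool = False                  # MERS-TM and MedMARx also accept near misses

report = EventReport(
    event_type="medication error",
    narrative="Wrong concentration dispensed; caught at bedside check.",
    discovered_by_role="nurse",
    near_miss=True,
)
```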

Method of Reporting

Both the MER Program and MedMARx allow for online reporting of data, while the other two systems rely on paper forms transmitted via mail. The only one of the four that currently allows for reporting by patients and their families is JCAHO.

Analysis of More Serious Events

All of these systems have some means for following up on reported events. Three of the four have trigger mechanisms in place to indicate when an RCA should be conducted. Those three systems also provide guidelines for how to conduct the RCAs as well as how to develop subsequent action plans. The exception in this area is the MER Program. Although the MER Program does not include RCAs, event reports submitted to the program are forwarded to the FDA MedWatch system and, where applicable, to the product manufacturer. The FDA and the manufacturer can then follow up on these events as appropriate. In addition, events reported to MedMARx are also forwarded to the FDA.

The managers of the systems discussed in this section, with the exception of the MER Program, maintain a database of their reports. These databases then allow for data analysis, such as monitoring event trends so that alerts can be issued when necessary.
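A trivial sketch of such trend monitoring (the threshold and function name are hypothetical; none of the systems publish their actual alert logic) might count reports by event type and flag any type that crosses a threshold:

```python
from collections import Counter

# Illustrative sketch only; the threshold and function name are hypothetical.
def trend_alerts(reports, threshold=10):
    """Return the event types whose report counts meet or exceed the threshold."""
    counts = Counter(r["event_type"] for r in reports)
    return sorted(etype for etype, n in counts.items() if n >= threshold)

sample = [{"event_type": "wrong drug"}] * 12 + [{"event_type": "fall"}] * 3
print(trend_alerts(sample))  # -> ['wrong drug']
```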

Tabular Information

All of this information is broken out in more detail in the following tables. Table C–3a includes MERS-TM, the MER Program, and MedMARx. Table C–3b includes JCAHO’s Sentinel Event Policy.


TABLE C–3a Selected Examples of Private Patient Safety/Health Care Reporting and Surveillance Systems

Name of System

Medical Event Reporting System for Transfusion Medicinea

System owner or manager

Primarily based at Columbia University (under a grant from the National Heart, Lung, and Blood Institute of the National Institutes of Health).

Type of system

Reporting.

History of reporting/ surveillance system

In 1995, the University of Texas (UT) Southwestern Medical Center at Dallas received a grant from the National Heart, Lung, and Blood Institute to design, develop, and implement an event-reporting system in transfusion medicine. UT Southwestern researchers brought together an interdisciplinary team of experts to design a prototype medical event-reporting system for transfusion medicine. The FDA, American Association of Blood Banks, America’s Blood Centers, American Blood Resources Association, American Red Cross, and Blood Systems, Inc., all participated in the early design of MERS-TM. Initial implementation in hospital Transfusion Services and Blood Centers began in 1997. Management of the system moved to Columbia University in 1998, when the principal investigator relocated. MERS-TM has since grown from a PC-based to a Web-based system and is now in use in 27 transfusion services and one blood center. It is being piloted as the national system for Canada and Ireland (as a near-miss system).

Voluntary or mandatory

Voluntary.

aInformation on MERS-TM has been obtained from the following sources: Battles et al. (1998), Callum et al. (2001), Columbia University (2001), Kaplan et al. (1998), and Westat (2001).

bInformation on the MER Program has been obtained from the following sources: U.S. Pharmacopeia (1997, 2001).

cInformation on MedMARx has been obtained from the following sources: Cousins (2001) and National Coordinating Council for Medication Error Reporting and Prevention (1998).


Name of System

Medication Errors Reporting Programb and MedMARxc

System owner or manager

United States Pharmacopeia (both programs).

Type of system

Reporting (both programs).

History of reporting/ surveillance system

MER Program: In 1991, USP began coordinating the MER Program with the Institute for Safe Medication Practices (ISMP); in 1994, USP purchased the MER Program from ISMP. The USP MER Program is presented in cooperation with ISMP.

MedMARx: In 1998, USP spearheaded the formation of the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP). NCC MERP established a standardized definition of medication error and an Index for Categorizing Medication Errors. The council has issued recommendations on the error-prone aspects of prescription writing, drug dispensing and administering, and on labeling and packaging of drug products. In early 1997, USP began receiving requests for guidance from risk managers, quality assurance staff, pharmacists, and nurses on medication error analysis and reporting. In response to these requests, USP developed MedMARx, an Internet-accessible medication errors database for hospitals to anonymously report to a centralized system that resides at USP.

Voluntary or mandatory

Voluntary (both programs).


Medical Event Reporting System for Transfusion Medicine (continued)

Reportable events/ events monitored

The system monitors all events (error, incident, deviation, variance, discovery, occurrence, or adverse or sentinel event) related to blood components and transfusion services. An event is defined as an occurrence with a potentially negative outcome that most often results from both latent conditions and human/active error. This includes:

Near-miss event: Event in which unwanted consequences were prevented because of recovery by identification and correction of the failure. Such a recovery could come from a planned barrier or critical control point, or it could be unplanned.

No-harm event: Event that has actually occurred (no recovery action was taken), but no actual harm has come to the patient or the organization. Except for “luck” (or in health care, the robust nature of human physiology), these accidents would have become misadventures.

Misadventure: Event in which there was no recovery and in which the patient has been harmed or the mission of the organization has been harmed or compromised.

Classification system and/or severity (risk assessment) index

At the local level, events are coded according to where/when in the work process the event was discovered and where/when the event occurred.

Events are assigned causal codes, which are based on the Eindhoven Classification Model—Medical Version (ECM). MERS-TM has 20 codes for describing causes of both active and latent errors. These codes are divided among three groups of causes: technical factors, organizational factors, and human factors.

Risk is measured as severity (or potential severity) multiplied by the probability of recurrence. Severity is termed the Quantified Estimate of Severity (QES) and the probability of recurrence is called Quantified Estimate of Probability (QEP). QES and QEP have numerical values assigned to them, and these numbers are multiplied to calculate the Risk Assessment Index (RAI) for an event. The RAI is then adjusted based on whether or not a product was issued and the type of recovery, if any.
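To make the arithmetic concrete, the Python sketch below computes a hypothetical RAI from a QES and a QEP. The source defines only the general rule (severity multiplied by probability of recurrence, then adjusted for product issuance and recovery type); the 0-to-1 scales, the adjustment factors, and the function itself are illustrative assumptions, not part of MERS-TM.

    # Hypothetical sketch of the MERS-TM Risk Assessment Index (RAI) arithmetic.
    # The source states only that RAI = QES x QEP, adjusted for product issuance
    # and recovery type; the scales and adjustment factors below are assumed.
    def risk_assessment_index(qes, qep, product_issued=False, planned_recovery=False):
        rai = qes * qep                    # severity x probability of recurrence
        if product_issued:
            rai = min(1.0, rai * 1.25)     # assumed upward adjustment
        if planned_recovery:
            rai *= 0.8                     # assumed downward adjustment
        return rai

    # Example: QES 0.8 and QEP 0.7 give RAI = 0.56 before any adjustment,
    # which would exceed the 0.5 threshold that triggers a root-cause analysis.
    print(round(risk_assessment_index(0.8, 0.7), 2))   # 0.56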


Medication Errors Reporting Program and MedMARx (continued)

Reportable events/ events monitored

MER Program: Medication errors (both actual and potential).

MedMARx: Medication errors, defined by the NCC MERP as any preventable event that may cause or lead to inappropriate medication use or patient harm while the medication is in the control of the health care professional, patient, or consumer. Such events may be related to professional practice, health care products, procedures, and systems, including prescribing; order communications; product labeling, packaging, and nomenclature; compounding; dispensing; distribution; administration; education; monitoring; and use.

Classification system and/or severity (risk assessment) index

Both programs categorize events according to the categorization index developed by the NCC MERP. This index consists of nine categories (A through I):

No error

A. Circumstances or events that have the capacity to cause error.

Error, no harm

B. An error occurred, but the medication did not reach the patient.

C. An error occurred that reached the patient but did not cause patient harm.

D. An error occurred that reached the patient and required monitoring to confirm that it resulted in no harm to the patient and/or required intervention to preclude harm.


Medical Event Reporting System for Transfusion Medicine (continued)

Reporting time frame

Not applicable.

dHarm is defined as impairment of the physical, emotional, or psychological function or structure of the body and/or resulting pain.


Medication Errors Reporting Program and MedMARx (continued)

Error, harmd

E. An error occurred that may have contributed to or resulted in temporary harm to the patient and required intervention.

F. An error occurred that may have contributed to or resulted in temporary harm to the patient and required initial or prolonged care.

G. An error occurred that may have contributed to or resulted in permanent patient harm.

H. An error occurred that required intervention necessary to sustain life.

Error, death

I. An error occurred that may have contributed to or resulted in the patient’s death.

The NCC MERP also developed a standard taxonomy for use in classifying and coding all of the data elements in the reports.

Reporting time frame

MER Program: Not applicable.

MedMARx: Not applicable. However, a hospital may hold a report aside in the database for 45 days in order to ensure that it has collected all of the necessary information and performed necessary follow-up and that the information in the database is as complete and accurate as possible. During this time, other hospitals cannot view that report.
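The NCC MERP index above lends itself to a simple lookup. The Python sketch below groups categories A through I by outcome exactly as listed in the table; the dictionary, the helper function, and their names are illustrative assumptions for demonstration, not an official NCC MERP artifact.

    # Illustrative grouping of the NCC MERP categories (A-I) by outcome,
    # following the index reproduced above; demonstration aid only.
    NCC_MERP_OUTCOME = {
        "A": "no error",
        "B": "error, no harm", "C": "error, no harm", "D": "error, no harm",
        "E": "error, harm", "F": "error, harm", "G": "error, harm", "H": "error, harm",
        "I": "error, death",
    }

    def involves_harm_or_death(category):
        """Categories E through I involve harm or death (the categories MedMARx treats as meriting an RCA)."""
        return NCC_MERP_OUTCOME.get(category.upper(), "") in ("error, harm", "error, death")

    print(involves_harm_or_death("D"), involves_harm_or_death("F"))   # False True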


Medical Event Reporting System for Transfusion Medicine (continued)

Data collected: Format and summary

Event Discovery Report (standard format):

Section A: Data collected include date and time of discovery, where the event was discovered, information about who discovered the event, a description of what was discovered and how it was discovered, when in the work sequence the event was discovered, and the action taken with regard to the product or record.

Section B: Data collected include date and time of occurrence, job classification and name of person involved in the event, where in the process the event first occurred, location of the occurrence, and information about whether the product was issued and administered.

Quality Assurance Systems Operator (QA Sys Op)e Investigation Report (standard format):

First section: Data collected include the report accession number, event codes, an additional description of the event, risk information, follow-up action, preventive action to be taken, and type of investigation the event will receive.

Second section: Cause codes and other information for events undergoing routine investigation. Option to link to a similar event already in the database.

Third section: Used to record notes.

Causal Tree Worksheet (standard format, but boxes can be added or deleted as necessary): Data collected include the consequent event, antecedent events, root causes, and root-cause classification codes.

Root-Cause Analysis Report (standard format): Consequent event code and description, antecedent events codes and descriptions, and system action.

eInformation on the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) Sentinel Event Policy has been obtained from the following sources: Joint Commission on Accreditation of Healthcare Organizations (2002a, c), and Schyve (2002).


Medication Errors Reporting Program and MedMARx (continued)

Data collected: Format and summary

MER Program (standard format): Data collected include description of event (actual or potential), type of staff or health care practitioner who made the initial error, patient outcome, any intervention that prevented the medication from reaching the patient, who discovered the error, when and how the error was discovered, where the error occurred, if another practitioner was involved in the error, if patient counseling was provided, description of product involved, relevant patient information (no patient identifiers included), recommendations by reporter as to how to prevent this error in the future, reporter information, and whether or not the reporter chooses to have his/her information released to the manufacturer, the FDA, or other persons.

MedMARx (standard format): Amount of data collected is related to the category of error; therefore, category A error reports capture significantly less information than reports on category E errors, where the patient is harmed.

Data collected for category E errors and above are as follows: date and time of error, description of event, type of error, possible causes of error, contributing factors, node in the process at which initial error occurred (e.g., prescribing, dispensing), location at which error was made, level of staff who made the initial error, level of staff who were involved in the error, level of staff who discovered the error, actions taken to avoid similar errors of this type, and a summary of the RCA.

Once these elements are completed, if a product is involved, information can be entered about that product. These data include generic and brand names; therapeutic classification; dosage, route, and strength; manufacturer; repacker; compounded ingredients; and container.

Also, for error categories C through I, a patient profile section captures data that include age and gender, outcome, and other relevant information.

Standardized pick lists are used for nearly all of the data entries; however, reporters are not limited to these lists. These pick lists are constructed based on the NCC MERP Taxonomy of Medication Errors.
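As a rough illustration of how the amount of data captured grows with the error category, the Python sketch below selects report sections according to the rules described above (a patient profile for categories C through I, full error detail and an RCA summary for category E and above, and a product section when a product is involved). The section names and the function are invented for demonstration; they are not the MedMARx schema.

    # Hypothetical sketch of category-dependent data capture in the spirit of the
    # MedMARx rules described above; section names are invented for illustration.
    def report_sections(category, product_involved=False):
        category = category.upper()
        sections = ["basic event description"]            # collected for every report
        if category >= "C":                               # categories C-I add a patient profile
            sections.append("patient profile (age, gender, outcome)")
        if category >= "E":                               # categories E-I add full error detail
            sections.append("full error detail (causes, contributing factors, node, staff levels)")
            sections.append("summary of the RCA")
        if product_involved:
            sections.append("product information (names, class, dosage, manufacturer)")
        return sections

    print(report_sections("A"))
    print(report_sections("F", product_involved=True))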


Medical Event Reporting System for Transfusion Medicine (continued)

Method of reporting

MERS-TM is a Web-based system. Paper forms may be downloaded if desired for initial data collection, or information may be entered directly into the hospital’s database. The server resides at Columbia University.

Who reports

Everyone in a participating organization is encouraged to report any and all events that have the potential for having an adverse effect on blood products or patient or donor safety.

RCA trigger

For events that are new or unique or that have an RAI of ≥0.5, the QA Sys Op performs/facilitates RCAs and constructs causal trees to further characterize the event.

In addition, if an event has an RAI of less than 0.5, BUT it represents a significant risk to the organization (i.e., potential for financial loss or damaged reputation), the QA Sys Op may decide to perform an expanded investigation.
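The triggering logic just described reduces to a small decision rule. The Python sketch below is an illustrative restatement only; the function name and its inputs are assumptions and are not part of MERS-TM.

    # Illustrative restatement of the MERS-TM investigation trigger described above.
    def investigation_level(rai, new_or_unique=False, significant_org_risk=False):
        if new_or_unique or rai >= 0.5:
            return "root-cause analysis with causal tree"
        if significant_org_risk:
            return "expanded investigation at QA Sys Op discretion"
        return "routine investigation and monitoring"

    print(investigation_level(0.56))                              # root-cause analysis with causal tree
    print(investigation_level(0.3, significant_org_risk=True))    # expanded investigation at QA Sys Op discretion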

Follow-up (including RCA)

The data are collected and interpreted for three main purposes: modeling, monitoring, and mindfulness.

Modeling the types of events and recovery steps that occur in the transfusion process allows for the identification of factors or system elements that have the potential to cause future errors.

Monitoring the existing areas of concern makes it possible to determine whether the incidence of near misses and accidents is changing and to evaluate the impact of corrective actions.

Mindfulness increases alertness by disseminating information about potential risks and error-producing precursors.

Follow-up decisions are based on the RAI value (<0.5 – monitor; ≥0.5 and ≤0.7 – monitor and consider change; >0.7 – propose change) and the potential for organizational risk. Two MERS-TM software tools allow for database searching and monitoring. “Query by Field” searches for events with exact matches to user-selected fields.


Medication Errors Reporting Program and MedMARx (continued)

Method of reporting

MER Program: Online or by mail or fax.

MedMARx: Online.

Who reports

MER Program: Individuals in hospitals that do not participate in MedMARx and health professionals who practice in other settings.

MedMARx: There is usually a “gatekeeper” or administrator at each hospital who is responsible for releasing records into the system—most often this person is the pharmacist. However, multiple users are permitted at each site and may be given read-only or read-and-write levels of access by the administrator.

RCA trigger

MER Program: None.

MedMARx: All errors that result in harm as defined by the NCC MERP—Category E to Category I errors—merit an RCA.

Follow-up (including RCA)

MER Program: Reporters may be contacted with additional questions for clarification. Reports are forwarded to the FDA MedWatch system and to the manufacturer, and those entities may conduct follow-up.

MedMARx: Hospitals should conduct RCAs on Category E to Category I errors. These RCAs can be conducted according to each hospital’s locally accepted method. However, certain RCA data elements are collected for Category E to I errors in a standardized manner, using the NCC MERP taxonomy (see “Data collected” row above).


Medical Event Reporting System for Transfusion Medicine (continued)

“HAWK” is based on the theories of case-based reasoning (CBR) and searches the database for similar cases based on weighted form fields.

Users may analyze and interpret both their local data and the central aggregate database using preprogrammed online reports or by downloading their sites’ data into Excel or Access. This allows for benchmarking.

The local database is evaluated regularly to assess the effectiveness of the system and impact of corrective actions. After evaluation, regular feedback about the system to all staff and immediate feedback to incident reporters are strongly recommended. The central database is evaluated for trends and analyzed using data mining software.
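To illustrate the kind of weighted-field matching described for “HAWK,” the Python sketch below scores two event records by the weighted fraction of fields that match exactly. The field names, weights, and scoring rule are assumptions made for demonstration; the actual HAWK algorithm is not specified in the source.

    # Illustrative weighted field-similarity search in the spirit of the HAWK tool
    # described above; field names and weights are invented for demonstration.
    def similarity(event_a, event_b, weights):
        total = sum(weights.values())
        matched = sum(w for field, w in weights.items()
                      if event_a.get(field) == event_b.get(field))
        return matched / total if total else 0.0

    weights = {"where_discovered": 2.0, "where_occurred": 2.0, "causal_code": 3.0}
    query = {"where_discovered": "issue", "where_occurred": "labeling", "causal_code": "HF-1"}
    case  = {"where_discovered": "issue", "where_occurred": "labeling", "causal_code": "T-2"}
    print(round(similarity(query, case, weights), 2))   # 0.57: two of three weighted fields match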

Other information collected through the system

None.

Confidentiality issues

Event reporting is completely confidential and not linked to employee performance assessment.

Relationship with other reporting systems

Any events defined by the FDA as reportable are transmitted to the FDA’s Blood Products Deviation (BPD) system.

Relationships with JCAHO/Medicare certification

For JCAHO-accredited organizations: All sentinel events meeting the JCAHO definition of a reviewable sentinel event can be reported to JCAHO (this is determined at the local level).


Medication Errors Reporting Program and MedMARx (continued)

MedMARx follow-up (continued): In addition, the following options are available to participating hospitals:

1. They can track and analyze trends in medication errors through a standardized format that can be inculcated into the hospital’s internal quality improvement activities and pharmacy and therapeutics committee activities.

2. They can do comparative analyses against similar institutions.

3. Eventually, they will be able to use MedMARx for benchmarking.

Other information collected through the system

MER Program: None.

MedMARx: None.

Confidentiality issues

MER Program: Although reporters provide their contact information, they can require that their identities be kept anonymous when the reports are forwarded to ISMP, the FDA, the manufacturer, and other persons requesting a copy of their reports.

MedMARx: Reports are anonymous, but randomly assigned facility IDs (each facility only knows its own ID) are used to group the reports. These IDs are associated with facility profiles, which allow each facility to compare its information with similar facilities without knowing the actual identities of those facilities.

Relationship with other reporting systems

MER Program: All information is forwarded to the FDA MedWatch system.

Relationships with JCAHO/Medicare certification

MedMARx: All sentinel events meeting the JCAHO definition of reviewable sentinel event can be downloaded into a JCAHO template located in MedMARx.


TABLE C–3b Selected Examples of Private Patient Safety/Health Care Reporting and Surveillance Systems

Name of System

Sentinel Event Policya

System Owner or Manager

Joint Commission on Accreditation of Healthcare Organizations

Type of system

Reporting.

History of reporting/ surveillance system

JCAHO has been involved in patient safety reporting systems since 1995. In 1996, the Sentinel Event Policy was implemented. This was followed by the establishment of a Sentinel Event Database and the implementation of sentinel event standards. These standards were first included in the Joint Commission accreditation manual in 1999, and in July 2001 additional patient safety standards went into effect for hospitals.

Voluntary or mandatory

Voluntary; organizations are “encouraged, but not required” to report any sentinel event meeting the JCAHO criteria for reviewable sentinel events (see below). If the Joint Commission becomes aware of a reviewable sentinel event that occurred at an accredited organization, whether self-reported or not, that organization must prepare and submit an RCA and action plan to JCAHO or otherwise provide evidence of having completed a thorough and credible RCA and action plan (see “Method of reporting” below for available alternatives).

Reportable events/ events monitored

A sentinel event is defined as an unexpected occurrence involving death or serious physical or psychological injury, or the risk thereof. Serious injury specifically includes loss of limb or function. The phrase “or the risk thereof” includes any process variation for which a recurrence would carry a significant chance of a serious adverse outcome. Note that the definition does include “near misses.” Such events are called “sentinel” because they signal the need for immediate investigation and response.

The following events are defined as reviewable sentinel events and should be reported to JCAHO:

1. An event that has resulted in an unanticipated death or major permanent loss of function, not related to the natural course of the patient’s illness or underlying condition, or

2. An event that is one of the following (even if the outcome was not death or major permanent loss of function): (a) Suicide of a patient in a setting where the patient receives round-the-clock care (e.g., hospital, residential treatment center, crisis stabilization center), (b) Infant abduction or discharge to the wrong family, (c) Rape, (d) Hemolytic transfusion reaction involving administration of blood or blood products having major blood group incompatibilities, (e) Surgery on the wrong patient or wrong body part. Note: This subset of events excludes “near-miss” sentinel events.

aInformation on the JCAHO Sentinel Event Policy has been obtained from the following sources: Joint Commission on Accreditation of Healthcare Organizations (2002a, b) and personal communication, P. Schyve, JCAHO, 2002.

Classification system and/or severity (risk assessment) index

No standard system. Leadership standard (LD.5.1) requires each accredited organization to define sentinel event for its own purposes in establishing mechanisms to identify, report, and manage these events. At a minimum, an organization’s definition must include those events defined as reviewable sentinel events by JCAHO; however, organizations have latitude in setting more specific parameters to define “unexpected,” “serious,” and “the risk thereof.”

Reporting time frame

If the Joint Commission becomes aware (through voluntary self-reporting or otherwise) of a reviewable sentinel event that occurred at an accredited organization, that organization must prepare and submit an RCA and action plan to JCAHO within 45 calendar days of the event or of becoming aware of the event.

If an organization fails to submit or make available an acceptable RCA and action plan within 45 days, the Accreditation Committee can place the organization on Accreditation Watch.b An organization on Accreditation Watch has an additional 15 days to submit an acceptable RCA and action plan.

Data collected: Format and summary

There is a form that organizations may use when reporting the occurrence of a sentinel event. The information collected on this form includes name and address of organization, date of incident, textual summary of incident (which should not include names of patients, caregivers, or other individuals involved in the event), method for sharing event-related information (via mail or one of the four alternatives—see “Method of reporting” below for more detail on these alternatives), and contact information for the event reporter.

bAccreditation Watch status is considered information that can be publicly disclosed.

There are no standard formats for RCAs and action plans; they may be conducted according to each organization’s locally accepted method. However, JCAHO does require that RCAs be thorough and credible before they will accept them (see “Follow-up” below for more detail on these requirements). In addition, JCAHO does provide a sample framework for an RCA and action plan, which may be used as an aid for organizing the steps in RCAs.

The JCAHO Sentinel Event Database has certain required data elements that are abstracted from RCAs, action plans, and follow-up activities. The three major categories of data elements included are sentinel event data, root-cause data, and risk reduction data.

Method of reporting

The primary means of submitting RCAs and action plans to JCAHO is via the mail. JCAHO then acknowledges receipt of the information and, once it has been processed, will return the original RCA and destroy all remaining copies of the document.

Alternative 1: The organization can schedule an appointment to personally bring the RCA and other sentinel event-related documents to the JCAHO headquarters building for review by JCAHO staff, then leave with all of these documents still in the organization’s possession.

Alternative 2: The organization can request an on-site review of the RCA and other sentinel event–related documents by a JCAHO surveyor. This surveyor can then review these documents and interview staff. No copy of the RCA will be retained by JCAHO.

Alternative 3: The organization can request an on-site visit by a JCAHO surveyor to conduct interviews and review relevant documentation to obtain information about the process and findings of the RCA and action plan, without actually reviewing the RCA documents. No copy of the RCA will be requested or retained by JCAHO.


Alternative 4: The organization can request an on-site review of its process for responding to a sentinel event and the relevant policies and procedures preceding and following the organization’s review of a specific event. This option is to be used in those instances where the organization meets specified criteria respecting the risk of waiving legal protections for RCA information shared with JCAHO.

Who reports

JCAHO-accredited organizations self-report, most often through the quality improvement coordinator and sometimes through the chief executive officer, another senior executive, or the risk manager. JCAHO can also become aware of sentinel events through patients and their families, employees of the accredited organizations, or the media.

RCA trigger

All events defined by the accredited organization as sentinel events, which will, at a minimum, include JCAHO reviewable sentinel events, require an RCA.

Follow-up (including RCA)

Each organization can conduct RCAs and develop action plans according to its own locally accepted methods. JCAHO then determines if the RCA and action plan are acceptable. An RCA will be considered acceptable for accreditation purposes if it has the following characteristics:

  • The analysis focuses primarily on systems and processes, not individual performance.

  • The analysis progresses from special causes in clinical processes to common causes in organizational processes.

  • The analysis repeatedly digs deeper by asking “Why?” and then, once that question is answered, asking “Why?” again, and so on.

  • The analysis identifies changes that could be made in systems and processes—either through redesign or development of new systems or processes—that would reduce the risk of such events occurring in the future.

  • The analysis is thorough and credible.

 

To be thorough, the RCA must include:

  • A determination of the human and other factors most directly associated with the sentinel event and the process(es) and systems related to its occurrence;

  • Analysis of the underlying systems and processes through a series of “Why?” questions to determine where redesign might reduce risk;

  • Inquiry into all areas appropriate to the specific type of event as described in the current edition of Minimum Scope of Review of Root Cause Analysis;

  • Identification of risk points and their potential contributions to this type of event; and

  • A determination of potential improvement in processes or systems that would tend to decrease the likelihood of such events in the future or a determination, after analysis, that no such improvement opportunities exist.

 

To be credible, the RCA must:

  • Include participation by the leadership of the organization and by the individuals most closely involved in the processes and systems under review;

  • Be internally consistent;

  • Provide an explanation for all findings of “not applicable” or “no problem”; and

  • Include consideration of any relevant literature.

 

An action plan will be considered acceptable if it:

  • Identifies changes that can be implemented to reduce risk or formulates a rationale for not undertaking such changes; and

  • Where improvement actions are planned, identifies who is responsible for implementation, when the action will be implemented, and how the effectiveness of the actions will be evaluated.

 

After the RCA and action plan are accepted by JCAHO, an Official Accreditation Decision Report is issued. This report:

  • Reflects JCAHO’s determination to continue or modify the organization’s current accreditation status and to terminate the Accreditation Watch, if previously assigned; and

  • Assigns an appropriate follow-up activity, typically a written progress report or follow-up visit, to be conducted within 6 months.


Follow-up activities are conducted when the organization believes it can demonstrate effective implementation, but no later than 6 months following receipt of the Official Accreditation Decision Report.

Other information collected through the system

None.

Confidentiality issues

Handling of any submitted RCA and action plan is restricted to specially trained staff in accordance with procedures designed to protect the confidentiality of the documents. Upon completing the review of any submitted RCA and action plan and abstracting the required data elements for the Joint Commission’s Sentinel Event Database:

  • The original RCA documents will be returned to the organization and any copies will be shredded.

  • The action plan resulting from the analysis of the sentinel event will be initially retained to serve as the basis for the follow-up activity. Once the action plan has been implemented to the satisfaction of the Joint Commission, as determined through follow-up activities, the Joint Commission will return the action plan to the organization.

Relationship with other reporting systems

No direct relationships, but organizations can use other reporting and surveillance systems to facilitate their reporting to JCAHO. However, aggregate data on event characteristics, root causes, and risk reduction strategies contribute to the evidence base for publication of Sentinel Event Alert and the Joint Commission’s patient safety bulletin and for the annual JCAHO National Patient Safety Goals, which are utilized by other organizations.

Relationships with JCAHO/Medicare certification

Failure by accredited organizations to comply with the JCAHO Sentinel Event Policy can result in placement on Accreditation Watch or in a change of accreditation status to Preliminary Non-accreditation or to nonaccredited status.


REFERENCES

Agency for Healthcare Research and Quality. 2002. Patient Safety Database: Request for Proposal No. AHRQ-02-0015.


Battles, J.B., H. S. Kaplan, T. W. Van der Schaaf, and C. E. Shea. 1998. The attributes of medical event-reporting systems: Experience with a prototype medical event-reporting system for transfusion medicine. Arch Pathol Lab Med 122(3):231–238.


Callum, J. L., H. S. Kaplan, L. L. Merkley, P. H. Pinkerton, B. Rabin Fastman, R. A. Romans, A. S. Coovadia, and M. D. Reis. 2001. Reporting of near-miss events for transfusion medicine: Improving transfusion safety. Transfusion (Paris) 41(10):1204–1211.

Centers for Disease Control and Prevention. 1999. Dialysis Surveillance Network (DSN). [Online]. Available: http://www.cdc.gov/ncidod/hip/Dialysis/dsn.htm [accessed April 10, 2002].

———. 2002. NNIS—National Nosocomial Infections Surveillance System. [Online]. Available: http://www.cdc.gov/ncidod/hip/SURVEILL/NNIS.HTM [accessed April 15, 2002].

———, Hospital Infections Program. 2000. Surveillance for Bloodstream and Vascular Access Infections in Outpatient Hemodialysis Centers: Procedure Manual. Atlanta: Public Health Service, Department of Health and Human Services.

Columbia University. 2001. Medical Event Reporting System—Transfusion Medicine. [Online]. Available: http://www.mers-tm.net/ [accessed March 20, 2002].

Cousins, D. D. 2001. Medication Errors, MedMARx and Hospitals. MedMARx: The National Database to Reduce Hospital Medication Errors (pamphlet). U.S. Pharmacopeia.


Department of Defense. 1986. Department of Defense Directive Number 6040.37.

———. 2001a. Near Miss/Adverse Events/Sentinel Event Reporting Form (unpublished).

———. 2001b. Root Cause Analysis (RCA) Form. Used with permission (as modified) from the VA National Center for Patient Safety (unpublished).

———, Aug. 16, 2001c. Department of Defense Instruction Number 6025.17.

Department of Veterans Affairs. 2001. Excerpt from VA Briefing Book on Major Quality Improvement and Evaluation Programs: The VA’s National Center for Patient Safety (NCPS) (unpublished).

———. 2002. Veterans Health Administration (VHA) National Patient Safety Improvement Handbook. Washington, DC: U.S. Department of Veterans Affairs.

Department of Veterans Affairs and National Aeronautics and Space Administration. 2000. The Patient Safety Reporting System (PSRS) (pamphlet). Moffett Field, CA: National Aeronautics and Space Administration and U.S. Department of Veterans Affairs.


Florida Health and Human Services, Agency for Health Care Administration. 2003. AHCA Risk Management. [Online]. Available: http://www.fdhc.state.fl.us/MCHQ/Health_Facility_Regulation/Risk/index.shtml [accessed July 14, 2003].

Food and Drug Administration. 1996. The Clinical Impact of Adverse Event Reporting: A MedWatch Continuing Education Article. [Online]. Available: http://www.fda.gov/medwatch/articles/medcont/medcont.htm [accessed February 26, 2002].


———. 1999. Vaccine Adverse Event Reporting System: Table of Reportable Events Following Vaccination. [Online]. Available: http://www.fda.gov/cber/vaers/eventtab.htm [accessed March 13, 2002].

———. 2001a. MedWatch: What Is a Serious Adverse Event? [Online]. Available: http://www.fda.gov/medwatch/report/desk/advevnt.htm [accessed September 4, 2001].

———. 2001b. Vaccine Adverse Event Reporting System. [Online]. Available: http://www.fda.gov/cber/vaers/vaers.htm [accessed September 4, 2001].

Food and Drug Administration and CODA. 2002. MedSun: Playing a Vital Role in Ensuring Medical Device Safety. [Online]. Available: https://www.medsun.net/about.html [accessed August 12, 2003].


Gaynes, R. P. 1998. Surveillance of nosocomial infections. In: E. Abrutyn, D. A. Goldmann, and W. E. Scheckler, eds. Saunders Infection Control Reference Service. Philadelphia: W. B. Saunders.

Gaynes, R. P., and T. C. Horan. 1999. Surveillance of nosocomial infections. In: C. G. Mayhall, ed. Hospital Epidemiology and Infection Control. 2nd ed. Philadelphia: Lippincott, Williams and Wilkins. Pp. 1285–1318.

Gaynes, R. P., and S. Solomon. 1996. Improving hospital-acquired infection rates: The CDC Experience. Jt Comm J Qual Improv 22(7):457–467.


Henkel, J. 1998. MedWatch: FDA’s “Heads Up” on Medical Product Safety. FDA Consumer Magazine.

Horan, T. C., and T. G. Emori. 1998. Definitions of nosocomial infections. In: E. Abrutyn, D. A. Goldmann, and W. E. Scheckler, eds. Saunders Infection Control Reference Service. Philadelphia: W. B. Saunders. Pp. 308–316.


Institute of Medicine. 2000. To Err Is Human: Building a Safer Health System. L. T. Kohn, J. M. Corrigan, and M. S. Donaldson, eds. Washington, DC: National Academy Press.


Jencks, S., and S. Kellie. 2002. Personal communication: conference call on the Medicare Patient Safety Monitoring System.

Joint Commission on Accreditation of Healthcare Organizations. 2002a. Sentinel Events Main Page. [Online]. Available: http://www.jcaho.org/sentinel/sentevnt_main.html [accessed April 24, 2002].

———. 2002b. Sentinel Event Policy and Procedures. [Online]. Available: http://www.jcaho.org/sentinel/se_pp.html [accessed April 24, 2002].

———. 2002c. Understanding the 2001 Hospital Performance Report. [Online]. Available: http://www.jcaho.org/lwapps/perfrep/undrstd/hap/2001.htm [accessed May 3, 2002].


Kaplan, H. S., J. B. Battles, T. W. Van der Schaaf, C. E. Shea, and S. Q. Mercer. 1998. Identification and classification of the causes of events in transfusion medicine. Transfusion (Paris) 38(11–12):1071–1081.

Kellie, S. March 27, 2002. Personal communication to IOM Staff. E-mail regarding the Medicare Patient Safety Monitoring System: Overview, Technical Specifications, and Contact List.


National Coordinating Council for Medication Error Reporting and Prevention. 1998. NCC MERP Taxonomy of Medical Errors.

New York Patient Occurrence Reporting and Tracking System. 2001. NYPORTS User’s Manual. Version 2.1.


Overhage, M. 2003. Enhancing Public Health, Healthcare System, and Clinician Preparedness: Strategies to Promote Coordination and Communication. The Indiana Network for Patient Care.


Polk, A. 2002. Personal communication to IOM Staff. Conference call regarding the Agency for Health Care Administration, Florida.


Richards, C., T. G. Emori, J. Edwards, S. Fridkin, J. Tolson, and R. Gaynes. 2001. Characteristics of hospitals and infection control professionals participating in the National Nosocomial Infections Surveillance System, 1999. Am J Infect Control 29(6):400–403.

Rosenthal, J. 2003. List of States with Mandatory Reporting Systems. Personal communication to Institute of Medicine’s Committee on Data Standards for Patient Safety.

Rosenthal, J., T. Riley, and M. Booth. 2000. State Reporting of Medical Errors and Adverse Events: Results of a 50-State Survey. Portland, ME: National Academy for State Health Policy.

Rosenthal, J., M. Booth, L. Flowers, and T. Riley. 2001. Current State Programs Addressing Medical Errors: An Analysis of Mandatory Reporting and Other Initiatives. Portland, ME: National Academy for State Health Policy.


Schyve, P. 2002. Personal communication to IOM Staff. Joint Commission on Accreditation of Healthcare Organizations. Conference call regarding JCAHO Sentinel Event Policy.

Stewart, F. February 20, 2002a. Personal communication to IOM Staff. Conference call regarding the IOM study on Data Standards for Patient Safety.

———. April 12, 2002b. Personal communication to IOM Staff. MHS Patient Safety System. Conference call regarding the DoD patient safety reporting system in development.


The Kevric Company. 2003. National Patient Safety Database Project: Coding & Classification Report. Silver Spring: The Kevric Company, Inc.


U.S. Code. Oct. 7, 1980. Title 38—Veterans’ Benefits. Sec. 5705—Confidentiality of Medical Quality-Assurance Records.

———. Nov. 14, 1986. Title 10—Armed Forces. Sec. 1102—Confidentiality of Medical Quality Assurance Records: Qualified Immunity for Participants.

U.S. Pharmacopeia. 1997. USP Medication Errors Reporting Program (Reporting Form). [Online]. Available: http://www.usp.org/reporting/medform.pdf [accessed January 30, 2002].

———. 2001. Practitioner Reporting: Medication Errors Reporting (MER) Program. [Online]. Available: http://www.usp.org/reporting/mer.htm [accessed October 12, 2001].


Westat. 2001. MERS-TM: Medical Event Reporting System for Transfusion Medicine Reference Manual. In support of Columbia University under a grant from the National Heart, Lung, and Blood Institute, National Institutes of Health (Grant RO1 HL53772, Harold S. Kaplan, M.D., Principal Investigator). Version 3.0. New York: Trustees of Columbia University.

×
Page 407
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 408
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 409
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 410
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 411
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 412
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 413
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 414
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 415
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 416
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 417
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 418
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 419
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 420
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 421
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 422
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 423
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 424
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 425
Suggested Citation:"Appendix C: Examples of Federal, State, and Private Sector Reporting Systems." Institute of Medicine. 2004. Patient Safety: Achieving a New Standard for Care. Washington, DC: The National Academies Press. doi: 10.17226/10863.
×
Page 426
Next: Appendix D: Clinical Domains for Patient Safety »
Patient Safety: Achieving a New Standard for Care Get This Book
×
Buy Hardback | $54.95
MyNAP members save 10% online.
Login or Register to save!
Download Free PDF

Americans should be able to count on receiving health care that is safe.

To achieve this, a new health care delivery system is needed: one that both prevents errors from occurring and learns from them when they do occur. The development of such a system requires a commitment by all stakeholders to a culture of safety and to the development of improved information systems for the delivery of health care. This national health information infrastructure is needed to provide immediate access to complete patient information and decision-support tools for clinicians and their patients. In addition, this infrastructure must capture patient safety information as a by-product of care and use this information to design even safer delivery systems. Health data standards are both a critical and a time-sensitive building block of the national health information infrastructure.

Building on the Institute of Medicine reports To Err Is Human and Crossing the Quality Chasm, Patient Safety puts forward a road map for the development and adoption of key health care data standards to support both information exchange and the reporting and analysis of patient safety data.
