Suggested Citation:"4 Strengthening the Evidence Base." Institute of Medicine. 2015. Measuring the Impact of Interprofessional Education on Collaborative Practice and Patient Outcomes. Washington, DC: The National Academies Press. doi: 10.17226/21726.

4

Strengthening the Evidence Base

Over the past few years, a growing body of work has shown that interprofessional education (IPE) can improve learners’ perceptions of interprofessional practice and enhance collaborative knowledge and skills (IOM, 2010; Paradis and Reeves, 2013; Reeves et al., 2011; Remington et al., 2006; Stone, 2006; Thistlethwaite, 2012; Zwarenstein et al., 2009). In contrast, establishing a direct cause-and-effect relationship between IPE and patient, population, and system outcomes has proven more difficult (Brashers et al., 2001; see also Appendixes A and B). It should be emphasized, however, that the evidence directly linking any health professions education intervention with individual, population, and system outcomes is far from convincing (Chen et al., 2004; Forsetlund et al., 2009; Lowrie et al., 2014; Marinopoulos et al., 2007; Swing, 2007).

The lack of a well-established causal relationship between IPE and health and system outcomes is due in part to the complexity of the environment in which education interventions are conducted. Generating evidence is difficult even in well-resourced settings; it is even more difficult in parts of the world with fewer research and data resources (Price, 2005; Weaver et al., 2011). The lack of alignment between education and practice (see Chapter 2), the lack of a commonly agreed-upon taxonomy and conceptual model linking education interventions to specific outcomes (see Chapter 3), and the relatively long lag time between education interventions and health and system outcomes are major reasons for the paucity of convincing evidence. Other factors include the existence of multiple and often opaque payment structures and a plethora of confounding variables. At the same time, inconsistencies in study designs and methods and a lack of full reporting on the methods employed limit the applicability and generalizability of many research findings (Abu-Rish et al., 2012; Cooper et al., 2001; Olson and Bialocerkowski, 2014; Reeves et al., 2011, 2013; Remington et al., 2006; Salas et al., 2008a; Weaver et al., 2010; Zwarenstein et al., 2009).

With these considerations in mind, the committee commissioned a paper to examine the most current literature linking IPE to health and system outcomes (see Appendix A). Brashers and colleagues explored the challenges of conducting high-quality research in this area, focusing on papers contained in a Cochrane review (Reeves et al., 2013) and studies published between January 2011 and July 2014. After examining more than 2,000 abstracts, they identified 39 studies that met their inclusion criteria, including the 15 studies initially identified in the 2013 Cochrane review. To supplement this work, a group of committee members examined reviews published after a prior article considering the “meta-evidence” for the effects of IPE on patient, population, and system outcomes (Reeves et al., 2010; see Appendix B). They searched PubMed for reviews published from 2010 to 2014 and identified 16 reviews, 8 of which met their inclusion criteria. This chapter draws heavily on the evidence detailed in both of these background papers.

METHODOLOGICAL CHALLENGES

Quantitative, experimental study designs may have limited utility for measuring the effects of IPE on individual, population, and system outcomes. For example, while the committee does not dispute the value of designs such as randomized controlled trials (RCTs) in supporting causal inference, this method has certain limitations for studying the impact of education interventions in general and IPE in particular. Some of these constraints are mentioned by Brashers and colleagues in their background paper (see Appendix A) and are addressed in more detail by Sullivan (2011). In essence, any tightly controlled study design presents challenges for use in studying IPE because the environments in which IPE occurs are highly variable and complex, and the selection of meaningful control groups is problematic (Reeves et al., 2013). Ideally, the control group would receive the same education as the intervention group, but in a uniprofessional manner (Reeves et al., 2009); however, this is rarely feasible.

Table 4-1 contrasts a variety of quantitative study designs with a mixed-methods approach, showing the strengths and limitations of each (Reeves et


TABLE 4-1 Types of Evaluation Design

Qualitative

Ethnography
  Description: This approach entails studying the nature of social interactions, behaviors, and perceptions that occur within teams, organizations, networks, and communities. The central aim of ethnography is to provide rich, holistic insights into people’s views and actions, as well as the nature of the location they inhabit, through the collection of detailed observations and interviews.
  Strengths: Generates detailed accounts of actual interactive processes from observational work
  Limitations: Time-consuming and expensive

Grounded theory
  Description: This approach is used to explore social processes that present within human interactions. Grounded theory differs from other approaches in that its primary purpose is to develop a theory about dominant social processes rather than to describe particular phenomena. Researchers develop explanations of key social processes that are grounded in or derived from the data.
  Strengths: Provides rich data; can generate new theoretical insight
  Limitations: Development of “micro” theories with limited generalizability

Phenomenology
  Description: Phenomenology allows for the exploration and description of phenomena important to the developers of or participants in an activity. The goal is to describe lived experience; phenomenology is therefore the study of “essences.”
  Strengths: Provides rich and detailed descriptions of human lived experience
  Limitations: A focus on a very small number of individuals can generate concerns about limited transferability of findings
Action research
  Description: This approach is known by various names, including “cooperative learning,” “participatory action research,” and “collaborative research.” The research focuses on people involved in a process of change resulting from a professional, organizational, or community activity. It adopts a more collaborative approach than the designs described above, whereby evaluators play a key role with participants in planning, implementing, and evaluating the change linked to an activity.
  Strengths: Empowers research participants to make changes in practice
  Limitations: Difficult and time-consuming; typically smaller-scale methods (single case study)

Quantitative

Randomized controlled trials (RCTs)
  Description: Participants are randomly allocated to either intervention or control groups. RCTs can provide a rigorous understanding of causality.
  Strengths: Randomization of individuals reduces bias related to selection or recruitment
  Limitations: Findings are difficult to generalize to those who do not meet the selection criteria (subjects do not represent the larger population)

Controlled before-and-after studies
  Description: The approach is similar to an RCT design but does not entail randomizing who receives the intervention.
  Strengths: Can robustly measure change, although the absence of randomization reduces rigor
  Limitations: Cannot be used to evaluate whether reported outcomes are sustained over time
Interrupted time series studies
  Description: This nonrandomized design uses multiple measurements before and after an intervention to determine whether it has an effect greater than the underlying trend. It usually requires multiple time points before the intervention to identify any underlying trends or cyclical phenomena, and multiple points after the intervention to determine whether there has been any change in the trend measured previously.
  Strengths: Allows for statistical investigation of potential biases in estimates of the intervention effect; strengthens before-and-after designs by measuring multiple time periods
  Limitations: Does not control for outside influences on outcomes; also difficult to undertake in settings where routine outcome data are not collected

Before-and-after studies
  Description: A nonrandomized design in which the evaluator collects data before and after an intervention through the use of surveys.
  Strengths: Helps detect changes resulting from the intervention, as data are collected at two points in time: before and after the intervention
  Limitations: Difficult to detect accurately whether any change is attributable to the intervention or to another confounding influence
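The interrupted time series design described above amounts to a segmented regression: an underlying trend plus level and slope terms that switch on at the intervention point. The sketch below is purely illustrative, using simulated monthly data; every variable name and value is a hypothetical assumption, not drawn from this report or any cited study.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(48)                 # 48 monthly observations
t0 = 24                           # intervention begins at month 24
post = (t >= t0).astype(float)    # indicator: 1 after the intervention

# Simulated outcome: baseline of 50, a slow upward underlying trend,
# and a level drop of 5 units after the intervention, plus noise.
y = 50 + 0.2 * t - 5.0 * post + rng.normal(0, 0.5, t.size)

# Design matrix: intercept, underlying trend, level change, slope change.
X = np.column_stack([np.ones_like(t, dtype=float), t, post, (t - t0) * post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

level_change = coef[2]            # estimated effect beyond the underlying trend
print(round(level_change, 1))
```

Because the model estimates the pre-intervention trend explicitly, the level-change coefficient separates the intervention effect from secular drift, which a simple before-and-after comparison cannot do.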

Mixed Methods

Mixed methods
  Description: These designs entail gathering different types of quantitative and qualitative data (e.g., from surveys, interviews, documents, observations) to provide a detailed understanding of processes and outcomes. There are two main types: sequential (where data are gathered and analyzed in different stages) and convergent (where data are combined together).
  Strengths: Triangulation of quantitative and qualitative data can help generate more insightful findings
  Limitations: Combining different data sets when using a convergent design is methodologically challenging

SOURCE: Adapted from Reeves et al., 2015. For more information, visit http://tandfonline.com/loi/ijic.


al., 2015). However, relatively few studies of IPE have employed qualitative designs or realist approaches to address important contextual issues and confounding factors or variables. While quantitative outcomes are important, such studies can describe only what has occurred; they cannot provide an empirical account of how or why the outcomes were produced. A mixed-methods approach that combines qualitative and quantitative outcomes (see Chapter 5) can offer much more nuanced explanations of IPE interventions.

Well-designed IPE studies may also be cost-prohibitive (Sullivan, 2011; Swing, 2007). Cost is believed to be the main reason behind the particularly scarce evidence for the effectiveness of IPE in developing countries, although insufficient curriculum integration and a lack of strong leadership may also pose significant challenges (Reed et al., 2005; Sunguya et al., 2014). As a result, the World Health Organization (WHO) has promoted IPE in developing countries based on evidence derived from developed countries; however, the transferability of this evidence may be suspect given the significant differences in their education and health systems. Even in developed countries, moreover, limited resources for studying the impacts of education have affected how IPE studies are conducted.

AREAS OF NEED

The committee identified four major areas of need in which research could begin to establish a more direct, rigorous relationship between IPE and individual, population, and system outcomes: (1) constructing well-designed mixed-methods studies that use robust qualitative data as well as validated tools for evaluating IPE outcomes, (2) developing a consistent framework for reporting the methodological details of IPE studies, (3) examining the cost and cost-effectiveness of IPE interventions, and (4) linking IPE with changes in collaborative behavior.

Constructing Well-Designed Studies

Study designs in IPE research have improved progressively over the past decade. As with many studies in health professions education, however, a considerable number of IPE studies continue to have methodological limitations. All the reviews discussed in Appendix B cite design or methodological weaknesses in the included studies. A number of studies offer only limited or partial descriptions of the interventions. Moreover, many studies provide little discussion of the methodological limitations of this work. Efforts to detect changes in collaborative behavior are particularly poor, often relying on self-reports by learners themselves (Reeves, 2010).

Vocabulary

The inconsistent vocabulary used to describe collaborative work and its associated learning activities and outcomes is a major problem. The use of particular terms is strongly influenced by funding agencies, as grant seekers work to match their words and phrasing with those of the funding organizations; this is one reason for the varied taxonomy currently in use. More than 20 years ago, Leathard (1994) noted the confused terminology in the IPE literature. She pointed out that while the terms “interdisciplinary” and “interprofessional” are conceptually distinct, it was not uncommon for them to be used interchangeably. Similar findings continue to be reported (e.g., Thannhauser et al., 2010). Such inconsistency in terminology confounds the search for standard research instruments and relevant published articles.

More recently, Paradis and Reeves (2013) analyzed the literature to evaluate trends in the use of interprofessional-related language in article titles. Employing the search terms “interprofessional,” “multiprofessional,” “multidisciplinar,” “interdisciplinar,” “transprofessional,” and “transdisciplinary,” their query yielded 100,488 articles published between 1970 and 2010. The authors found decreasing use of the terms “multidisciplinary/multidisciplinarity” and “interdisciplinary/interdisciplinarity” since the 1980s, while “interprofessional” grew in popularity starting in the 1990s and has remained the dominant term. They also found that “multiprofessional,” “transprofessional,” and “transdisciplinary” were never widely used.

Reference Models

The lack of a widely accepted model for describing IPE and its associated learning activities and outcomes is another major problem. Studies rarely are based on an explicit conceptual model, and their design and execution suffer as a result. Moreover, the lack of a standard model hinders comparisons among studies and greatly increases the risk entailed in generalizing results across different environments. This issue is discussed in greater detail in Chapter 3.

Measurement Instruments

In their concept analysis, Olenick and colleagues (2010) explore attributes and characteristics of IPE, which they describe as a “complex concept” that would benefit from greater consistency among educators, professionals, and researchers. Given the numerous IPE studies that have been conducted using instruments that lack documented reliability and validity, it is apparent that much confusion remains over appropriate instruments for measuring IPE. Moreover, poorly defined target endpoints have resulted in an incomplete catalogue of potentially available instruments. The background paper in Appendix A identifies three new RCTs in addition to the seven RCTs described in the 2013 Cochrane review (Reeves et al., 2013), each of which suffers from “difficult-to-measure endpoints” (Hoffman et al., 2014; Nurok et al., 2011; Riley et al., 2011).

In addition, the methods used to study the impact of IPE on health and system outcomes vary greatly. The Canadian Interprofessional Health Collaborative (CIHC, 2009) reviewed the literature for available quantitative tools used to measure outcomes of IPE and collaborative practice and identified 128 tools in 136 articles. They found 119 differently named evaluation instruments or methods reported by 20 IPE and collaborative, patient-centered practice projects. However, many of the included tools had not been validated, and their use in other studies would be problematic. The U.S. National Center for Interprofessional Practice and Education is presently engaged in providing better information on IPE evaluation tools.1
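Validation of the kind these tool inventories call for typically begins with basic psychometrics. As a hedged illustration only, the sketch below estimates internal consistency (Cronbach's alpha) for a hypothetical six-item survey with simulated responses; the item structure and sample size are assumptions and do not describe any instrument catalogued by the CIHC or the National Center.

```python
import numpy as np

rng = np.random.default_rng(1)
n_respondents, n_items = 200, 6

# Simulate six items that all load on one latent attitude, plus item noise.
latent = rng.normal(0, 1, n_respondents)
items = np.column_stack(
    [latent + rng.normal(0, 0.8, n_respondents) for _ in range(n_items)]
)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
k = items.shape[1]
item_var_sum = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1.0 - item_var_sum / total_var)
print(round(alpha, 2))
```

Reporting a statistic like this for each administration, rather than citing a value from a different population, is one small step toward the documented reliability that many published IPE instruments lack.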

Sample Size

IPE studies frequently rely on self-reported data and are often too small and insufficiently powered to evaluate their specified outcomes. For example, Brandt and colleagues (2014) found that approximately 62 percent of the 133 studies they reviewed had sample sizes smaller than 50.
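The consequence of such small samples can be shown with a back-of-the-envelope power calculation. The normal-approximation function below and the assumed medium effect size (d = 0.5) are illustrative choices, not figures taken from Brandt and colleagues.

```python
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    # Approximate power of a two-sided two-sample comparison of means,
    # using the normal approximation to the t distribution.
    z_crit = 1.959963984540054  # two-sided critical value for alpha = 0.05
    shift = d * sqrt(n_per_group / 2.0)
    return norm_cdf(shift - z_crit)

# A total sample of 50 (25 per group) with a medium effect size d = 0.5:
print(round(two_sample_power(0.5, 25), 2))   # well below the conventional 0.8
```

Under these assumptions, a 50-subject study detects a medium effect less than half the time; roughly 64 subjects per group would be needed to reach 80 percent power, which helps explain why so many small IPE studies report null or inconsistent results.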

Control Groups

Most IPE studies are not designed to control for differences between comparison and intervention groups. Others suffer from selective reporting of differences in outcomes. Allocation to groups generally is not concealed, and blinding in the assessment of outcomes is often inadequate.

Intermediate Learning Outcomes

Other methodological limitations include a lack of documentation and measurement of intermediate learning outcomes (see Figure 3-2 in Chapter 3). Without documentation of the application and fidelity of the intervention and of important process variables and proximal outcomes, studies cannot demonstrate clearly that teamwork training actually results in improved teamwork prior to the assessment of health and system outcomes. Similarly, information often is lacking as to whether those trained together actually work collaboratively in the practice setting.

____________

1 See https://nexusipe.org/measurement-instruments (accessed November 6, 2015).

Population-Based Outcomes

Studies examining the impact of IPE all too often ignore important patient and population outcomes. For example, of 39 papers—drawn from the more than 2,000 reviewed abstracts that met the inclusion criteria of Brashers and colleagues (see Appendix A)—none examined population health or community outcomes, and only 4 reported patient or family satisfaction. Rather, the majority focused on organizational or practice processes, with a few addressing a culture of safety. Similar findings have been reported by others (Reeves et al., 2011, 2013; Thistlethwaite, 2012).

Longitudinal Study Design

Studies that span the education continuum and follow trainees over time, encompassing classrooms, simulation laboratories, and practice settings, are generally lacking (Deutschlander et al., 2013). While numerous publications provide examples of brief interprofessional encounters at the learner level, interventions that look at health and system outcomes require longitudinal designs that are more complex and are therefore undertaken less often (Clifton et al., 2006). The imbalance of short- versus long-term studies is exacerbated by a scarcity of coordinating centers at universities for conducting IPE activities, resulting in a large number of “one-off” IPE events that are then evaluated and published. Overcoming the barriers to longitudinal IPE studies would add immeasurably to the evaluation of the effectiveness of IPE.

Developing a Consistent Reporting Framework

The lack of important methodological details in published studies makes analysis suspect, replicability difficult, and generalizability uncertain. The effect of incomplete reporting on the ability to reach general conclusions is evident from the observations on the quality of evidence made by the authors of the reviews summarized in Table B-2 by Reeves and colleagues (see Appendix B). Likewise, Brashers and colleagues (see Appendix A) rate only 4 of the 39 studies they reviewed as “high,” indicating that the researchers used a strong study design that produced consistent, generalizable results.

The lack of methodological details reported in IPE publications may be the result of a weak study design or of incomplete recording of information on the education intervention itself. For example, authors sometimes give inadequate descriptions of the study participants (e.g., how many, which professions, levels of training) or of the type and quantity (“dose”) of the intervention, even though these are significant variables influencing outcomes (Reeves et al., 2009). Incomplete reporting may also reflect the word limits enforced by journal editors (Jha, 2014). The literature would be significantly enhanced by the development of a consistent reporting framework for linking IPE to specific learning, health, and system outcomes.

Examining Cost and Cost-Effectiveness

Efforts increasingly focus on documenting the total cost of health care (e.g., the Health Partners model); however, estimates of the total cost of IPE or education in general are lacking. Of the 39 papers in the review by Brashers and colleagues (Appendix A), only 3 identify efficiencies in care (Banki et al., 2013; Capella et al., 2010; Wolf et al., 2010), and only 1 reports changes in practice costs (Banki et al., 2013). While the latter study notes significant cost reductions, they could not be attributed definitively to the IPE intervention itself.

Thirteen of these 39 papers examine outcomes over many months to several years (Armour Forse et al., 2011; Hanbury et al., 2009; Helitzer et al., 2011; Mayer et al., 2011; Morey et al., 2002; Pettker et al., 2009; Phipps et al., 2012; Pingleton et al., 2013; Rask et al., 2007; Sax et al., 2009; Thompson et al., 2000a,b; Wolf et al., 2010). Although these longer-term studies document effects on provider or patient outcomes, the effects tended to decay over time. Moreover, only 2 of the 39 studies (Hanbury et al., 2009; Pettker et al., 2009) were well designed (interrupted time series methodology), making the collective findings difficult to interpret.

Similar observations are made in the analysis of the 8 IPE reviews, encompassing more than 400 individual studies, summarized by Reeves and colleagues (see Appendix B). Across these studies, most authors report only on short-term impacts on learner attitudes and knowledge following various IPE interventions, and do not provide cost analyses. As a result, understanding of the long-term impact of IPE on both education and health system costs continues to be limited. A PubMed search revealed one study that demonstrated the cost-effectiveness of a Danish interprofessional training unit compared with a conventional ward, with no apparent differences in quality or safety between the two (Hansen et al., 2009).

Likewise, while the U.S.-based Vermont Blueprint for Health2 has linked the introduction of its community-based, patient-centered medical home initiative to cost savings, the relationship between these savings and the training of providers to work in teams is unclear.3 Similar results are emerging from the U.S.-based Veterans Health Administration’s patient-aligned care team initiative, which has documented team-based improvements in system outcomes and costs but has not explicitly examined potential relationships between purposeful training for collaborative practice and these outcomes.

____________

2 Defined as a “program for integrating a system of health care for patients, improving the health of the overall population, and improving control over health care costs by promoting health maintenance, prevention, and care coordination and management” (Vermont Government, 2015).

Without well-designed studies addressing cost-effectiveness, it will be challenging to formulate a strong business case for IPE. Developing a financial justification for IPE will require knowing the adequate “dose” of IPE (as described in Chapter 3) and having competency or performance measures with which to determine proficiency. These elements and thus the financial justification would no doubt vary given the broad range of “IPE programs” worldwide. Optimally, the business case would include evidence on the sustainability of IPE interventions; their impact on system outcomes, including organizational and practice changes and health care costs; and the resulting patient and population benefits. However, it is worth noting that complex analyses of this type typically are not being conducted for any education reform effort and that IPE should not be held to a unique standard.

CONCLUSION

A comprehensive literature search revealed a dearth of robust studies specifically designed to better link IPE with changes in collaborative behavior or answer key questions about the effectiveness of IPE in improving patient, population, and health system outcomes.

Conclusion 3. More purposeful, well-designed, and thoughtfully reported studies are needed to answer key questions about the effectiveness of IPE in improving performance in practice and health and system outcomes.

Linking IPE with Changes in Collaborative Behavior

An essential intermediate step in linking IPE with health and system outcomes is enhanced collaborative behavior and performance in practice (see “Learning Outcomes” in Figure 3-2 in Chapter 3). While considerable attention has been focused on developing measures of interprofessional collaboration (CIHC, 2012; McDonald et al., 2014; National Center for Interprofessional Practice and Education, 2013; Reeves et al., 2010; Schmitz and Cullen, 2015), no such measures have as yet been broadly accepted or adopted (Clifton, 2006; Hammick et al., 2007; Thannhauser et al., 2010). In fact, the strong contextual dependence of presently available measures (Valentine et al., 2015; WHO, 2013) limits their application beyond a single study or small group of studies. Differences in setting and patient population, education programs and health care delivery institutions, health care workforce composition and patterns of collaboration, and national education and health care policies create significant complexities in study design and interpretation. To address this deficiency, the committee makes the following recommendation:

____________

3 Personal communication, C. Jones, Blueprint for Health, Department of Vermont Health Access, 2014.

Recommendation 1: Interprofessional stakeholders, funders, and policy makers should commit resources to a coordinated series of well-designed studies of the association between interprofessional education and collaborative behavior, including teamwork and performance in practice. These studies should be focused on developing broad consensus on how to measure interprofessional collaboration effectively across a range of learning environments, patient populations, and practice settings.

These studies could employ different approaches that might include developing instruments and testing their reliability, validity, and usefulness specific to collaborative practice; conducting head-to-head comparisons of existing instruments within particular contexts; and extending the validation process for an existing “best-in-class” instrument to additional professions, learning environments, patient populations, health care settings, and countries. At a minimum, however, these studies should take into account the intended learner outcomes in the three major components of the education continuum—foundational education, graduate education, and continuing professional development (as noted in the “Learning Continuum” of Figure 3-2). Therefore, each such study should clearly define the intermediate (learner) and more distal (health and system) outcome target(s) of the study—for example, how a particular feature of teamwork might be linked to enhanced performance in practice and how such collaboration might promote a particular health or systems outcome (Baker et al., 2006; Franco et al., 2009; Salas et al., 2008b). This perspective, which is often missing or incompletely specified, is essential to the design of robust evaluations of any education intervention in practice (Marinopoulos et al., 2007; Reeves et al., 2013; Swing, 2007).


Addressing the Areas of Need

Addressing these gaps will entail giving IPE greater priority by forming partnerships among the education, practice, and research communities to design studies that are relevant to patient, population, and health system outcomes. Engaging accreditors, policy makers, and funders in the process could provide additional resources for establishing more robust partnerships. Only by bringing all these constituencies together will a series of well-designed studies emerge.


REFERENCES

Abu-Rish, E., S. Kim, L. Choe, L. Varpio, E. Malik, A. A. White, K. Craddick, K. Blondon, L. Robins, P. Nagasawa, A. Thigpen, L. L. Chen, J. Rich, and B. Zierler. 2012. Current trends in interprofessional education of health sciences students: A literature review. Journal of Interprofessional Care 26(6):444-451.

Armour Forse, R., J. D. Bramble, and R. McQuillan. 2011. Team training can improve operating room performance. Surgery 150(4):771-778.

Baker, D. P., R. Day, and E. Salas. 2006. Teamwork as an essential component of high-reliability organizations. Health Services Research 41(4, Pt. 2):1576-1598.

Banki, F., K. Ochoa, M. E. Carrillo, S. S. Leake, A. L. Estrera, K. Khalil, and H. J. Safi. 2013. A surgical team with focus on staff education in a community hospital improves outcomes, costs and patient satisfaction. American Journal of Surgery 206(6):1007-1014; discussion 1014-1015.

Brandt, B., M. N. Lutfiyya, J. A. King, and C. Chioreso. 2014. A scoping review of interprofessional collaborative practice and education using the lens of the triple aim. Journal of Interprofessional Care 28(5):393-399.

Brashers, V. L., C. E. Curry, D. C. Harper, S. H. McDaniel, G. Pawlson, and J. W. Ball. 2001. Interprofessional health care education: Recommendations of the National Academies of Practice expert panel on health care in the 21st century. Issues in Interdisciplinary Care 3(1):21-31.

Capella, J., S. Smith, A. Philp, T. Putnam, C. Gilbert, W. Fry, E. Harvey, A. Wright, K. Henderson, and D. Baker. 2010. Teamwork training improves the clinical care of trauma patients. Journal of Surgical Education 67(6):439-443.

Chen, F. M., H. Bauchner, and H. Burstin. 2004. A call for outcomes research in medical education. Academic Medicine 79(10):955-960.

CIHC (Canadian Interprofessional Health Collaborative). 2009. Program evaluation for interprofessional initiatives: Evaluation instruments/methods of the 20 IECPCP projects. A report from the evaluation subcommittee. Vancouver, BC: CIHC.

CIHC. 2012. An inventory of quantitative tools measuring interprofessional education and collaborative practice outcomes. Vancouver, BC: CIHC.

Clifton, M., C. Dale, and C. Bradshaw. 2006. The impact and effectiveness of interprofessional education in primary care: An RCN literature review. London, England: Royal College of Nursing. https://www.rcn.org.uk/__data/assets/pdf_file/0004/78718/003091.pdf (accessed March 17, 2015).


Cooper, H., C. Carlisle, T. Gibbs, and C. Watkins. 2001. Developing an evidence base for interdisciplinary learning: A systematic review. Journal of Advanced Nursing 35(2):228-237.

Deutschlander, S., E. Suter, and R. Grymonpre. 2013. Interprofessional practice education: Is the “interprofessional” component relevant to recruiting new graduates to underserved areas? Rural Remote Health 13(4):2489.

Forsetlund, L., A. Bjorndal, A. Rashidian, G. Jamtvedt, M. A. O’Brien, F. Wolf, D. Davis, J. Odgaard-Jensen, and A. D. Oxman. 2009. Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2:CD003030.

Franco, L. M., L. Marquez, K. Ethier, Z. Balsara, and W. Isenhower. 2009. Results of collaborative improvement: Effects on health outcomes and compliance with evidence-based standards in 27 applications in 12 countries. Collaborative Evaluation Series. Prepared by University Research Co., LLC (URC). Bethesda, MD: USAID Health Care Improvement Project.

Hammick, M., D. Freeth, I. Koppel, S. Reeves, and H. Barr. 2007. A best evidence systematic review of interprofessional education: BEME guide no. 9. Medical Teacher 29(8):735-751.

Hanbury, A., L. Wallace, and M. Clark. 2009. Use of a time series design to test effectiveness of a theory-based intervention targeting adherence of health professionals to a clinical guideline. British Journal of Health Psychology 14(Pt. 3):505-518.

Hansen, T. B., F. Jacobsen, and K. Larsen. 2009. Cost effective interprofessional training: An evaluation of a training unit in Denmark. Journal of Interprofessional Care 23(3):234-241.

Helitzer, D. L., M. Lanoue, B. Wilson, B. U. de Hernandez, T. Warner, and D. Roter. 2011. A randomized controlled trial of communication training with primary care providers to improve patient-centeredness and health risk communication. Patient Education and Counseling 82(1):21-29.

Hoffmann, B., V. Muller, J. Rochon, M. Gondan, B. Muller, Z. Albay, K. Weppler, M. Leifermann, C. Miessner, C. Guthlin, D. Parker, G. Hofinger, and F. M. Gerlach. 2014. Effects of a team-based assessment and intervention on patient safety culture in general practice: An open randomised controlled trial. BMJ Quality and Safety 23(1):35-46.

IOM (Institute of Medicine). 2010. The future of nursing: Leading change, advancing health. Washington, DC: The National Academies Press.

Jha, K. N. 2014. How to write articles that get published. Journal of Clinical and Diagnostic Research 8(9):XG01-XG03.

Leathard, A. 1994. Interprofessional developments in Britain. In Going interprofessional: Working together for health and welfare, edited by A. Leathard. London: Brunner-Routledge. Pp. 3-37.

Lowrie, R., S. M. Lloyd, A. McConnachie, and J. Morrison. 2014. A cluster randomised controlled trial of a pharmacist-led collaborative intervention to improve statin prescribing and attainment of cholesterol targets in primary care. PLoS ONE 9(11):e113370.

Marinopoulos, S. S., T. Dorman, N. Ratanawongsa, L. M. Wilson, B. H. Ashar, J. L. Magaziner, R. G. Miller, P. A. Thomas, G. P. Prokopowicz, R. Qayyum, and E. B. Bass. 2007. Effectiveness of continuing medical education. Evidence Report/Technology Assessment No. 149. Rockville, MD: Agency for Healthcare Research and Quality.

Mayer, C. M., L. Cluff, W. T. Lin, T. S. Willis, R. E. Stafford, C. Williams, R. Saunders, K. A. Short, N. Lenfestey, H. L. Kane, and J. B. Amoozegar. 2011. Evaluating efforts to optimize TeamSTEPPS implementation in surgical and pediatric intensive care units. Joint Commission Journal on Quality and Patient Safety 37(8):365-374.


McDonald, K. M., E. Schultz, L. Albin, N. Pineda, J. Lonhart, V. Sundaram, C. Smith-Spangler, J. Brustrom, E. Malcolm, L. Rohn, and S. Davies. 2014. Care coordination measures atlas version 4 (Prepared by Stanford University under subcontract to American Institutes for Research on Contract No. HHSA290-2010-00005I). AHRQ Publication No. 14-0037-EF. Rockville, MD: Agency for Healthcare Research and Quality. http://www.ahrq.gov/professionals/prevention-chronic-care/improve/coordination/atlas2014 (accessed April 9, 2015).

Morey, J. C., R. Simon, G. D. Jay, R. L. Wears, M. Salisbury, K. A. Dukes, and S. D. Berns. 2002. Error reduction and performance improvement in the emergency department through formal teamwork training: Evaluation results of the MedTeams project. Health Services Research 37(6):1553-1581.

National Center for Interprofessional Practice and Education. 2013. Measurement instruments. https://nexusipe.org/measurement-instruments (accessed April 9, 2015).

Nurok, M., L. A. Evans, S. Lipsitz, P. Satwicz, A. Kelly, and A. Frankel. 2011. The relationship of the emotional climate of work and threat to patient outcome in a high-volume thoracic surgery operating room team. BMJ Quality and Safety 20(3):237-242.

Olenick, M., L. R. Allen, and R. A. Smego, Jr. 2010. Interprofessional education: A concept analysis. Advances in Medical Education and Practice 1:75-84.

Olson, R., and A. Bialocerkowski. 2014. Interprofessional education in allied health: A systematic review. Medical Education 48(3):236-246.

Paradis, E., and S. Reeves. 2013. Key trends in interprofessional research: A macrosociological analysis from 1970 to 2010. Journal of Interprofessional Care 27(2):113-122.

Pettker, C. M., S. F. Thung, E. R. Norwitz, C. S. Buhimschi, C. A. Raab, J. A. Copel, E. Kuczynski, C. J. Lockwood, and E. F. Funai. 2009. Impact of a comprehensive patient safety strategy on obstetric adverse events. American Journal of Obstetrics and Gynecology 200(5):492.e1-492.e8.

Phipps, M. G., D. G. Lindquist, E. McConaughey, J. A. O’Brien, C. A. Raker, and M. J. Paglia. 2012. Outcomes from a labor and delivery team training program with simulation component. American Journal of Obstetrics and Gynecology 206(1):3-9.

Pingleton, S. K., E. Carlton, S. Wilkinson, J. Beasley, T. King, C. Wittkopp, M. Moncure, and T. Williamson. 2013. Reduction of venous thromboembolism (VTE) in hospitalized patients: Aligning continuing education with interprofessional team-based quality improvement in an academic medical center. Academic Medicine 88(10):1454-1459.

Price, J. 2005. Complexity and interprofessional education. In The theory-practice relationship in interprofessional education, Ch. 8, edited by H. Colyer, M. Helme, and I. Jones. King’s College, London: Higher Education Academy. Pp. 79-87.

Rask, K., P. A. Parmelee, J. A. Taylor, D. Green, H. Brown, J. Hawley, L. Schild, H. S. Strothers III, and J. G. Ouslander. 2007. Implementation and evaluation of a nursing home fall management program. Journal of the American Geriatrics Society 55(3):342-349.

Reed, D. A., D. E. Kern, R. B. Levine, and S. M. Wright. 2005. Costs and funding for published medical education research. Journal of the American Medical Association 294(9):1052-1057.

Reeves, S. 2010. Ideas for the development of the interprofessional field. Journal of Interprofessional Care 24(3):217-219.

Reeves, S., M. Zwarenstein, J. Goldman, H. Barr, D. Freeth, M. Hammick, and I. Koppel. 2009. Interprofessional education: Effects on professional practice and health care outcomes (review). Cochrane Database of Systematic Reviews (1):1-21.

Reeves, S., J. Goldman, A. Burton, and B. Sawatzky-Girling. 2010. Synthesis of systematic review evidence of interprofessional education. Journal of Allied Health 39(Suppl. 1):198-203.


Reeves, S., J. Goldman, J. Gilbert, J. Tepper, I. Silver, E. Suter, and M. Zwarenstein. 2011. A scoping review to improve conceptual clarity of interprofessional interventions. Journal of Interprofessional Care 25(3):167-174.

Reeves, S., L. Perrier, J. Goldman, D. Freeth, and M. Zwarenstein. 2013. Interprofessional education: Effects on professional practice and healthcare outcomes (update). Cochrane Database of Systematic Reviews 3.

Reeves, S., S. Boet, B. Zierler, and S. Kitto. 2015. Interprofessional education and practice guide no. 3: Evaluating interprofessional education. Journal of Interprofessional Care 29(4):305-312.

Remington, T. L., M. A. Foulk, and B. C. Williams. 2006. Evaluation of evidence for interprofessional education. The American Journal of Pharmaceutical Education 70(3):66.

Riley, W., S. Davis, K. Miller, H. Hansen, F. Sainfort, and R. Sweet. 2011. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Joint Commission Journal on Quality and Patient Safety 37(8):357-364.

Salas, E., D. DiazGranados, C. Klein, C. S. Burke, K. C. Stagl, G. F. Goodwin, and S. M. Halpin. 2008a. Does team training improve team performance? A meta-analysis. Human Factors: The Journal of the Human Factors and Ergonomics Society 50(6):903-933.

Salas, E., N. J. Cooke, and M. A. Rosen. 2008b. On teams, teamwork, and team performance: Discoveries and developments. Human Factors: The Journal of the Human Factors and Ergonomics Society 50(3):540-547.

Sax, H. C., P. Browne, R. J. Mayewski, R. J. Panzer, K. C. Hittner, R. L. Burke, and S. Coletta. 2009. Can aviation-based team training elicit sustainable behavioral change? Archives of Surgery 144(12):1133-1137.

Schmitz, C. C., and M. J. Cullen. 2015. Evaluating interprofessional education and collaborative practice: What should I consider when selecting a measurement tool? https://nexusipe.org/evaluating-ipecp (accessed April 9, 2015).

Stone, N. 2006. Evaluating interprofessional education: The tautological need for interdisciplinary approaches. Journal of Interprofessional Care 20(3):260-275.

Sullivan, G. M. 2011. Getting off the “gold standard”: Randomized controlled trials and education research. Journal of Graduate Medical Education 3(3):285-289.

Sunguya, B. F., M. Jimba, J. Yasuoka, and W. Hinthong. 2014. Interprofessional education for whom?: Challenges and lessons learned from its implementation in developed countries and their application to developing countries: A systematic review. PLoS ONE 9(5):e96724.

Swing, S. R. 2007. The ACGME outcome project: Retrospective and prospective. Medical Teacher 29(7):648-654.

Thannhauser, J., S. Russell-Mayhew, and C. Scott. 2010. Measures of interprofessional education and collaboration. Journal of Interprofessional Care 24(4):336-349.

Thistlethwaite, J. 2012. Interprofessional education: A review of context, learning and the research agenda. Medical Education 46(1):58-70.

Thompson, C., A. L. Kinmonth, L. Stevens, R. C. Peveler, A. Stevens, K. J. Ostler, R. M. Pickering, N. G. Baker, A. Henson, J. Preece, D. Cooper, and M. J. Campbell. 2000a. Effects of a clinical-practice guideline and practice-based education on detection and outcome of depression in primary care: Hampshire depression project randomised controlled trial. Lancet 355(9199):185-191.

Thompson, R. S., F. P. Rivara, D. C. Thompson, W. E. Barlow, N. K. Sugg, R. D. Maiuro, and D. M. Rubanowice. 2000b. Identification and management of domestic violence: A randomized trial. American Journal of Preventive Medicine 19(4):253-263.

Valentine, M. A., I. M. Nembhard, and A. C. Edmondson. 2015. Measuring teamwork in health care settings: A review of survey instruments. Medical Care 53(4):e16-e30.


Vermont Government. 2015. Vermont Government website: New reports show Blueprint is lowering health care costs. http://governor.vermont.gov/node/2223 (accessed March 17, 2015).

Weaver, L., A. McMurtry, J. Conklin, S. Brajtman, and P. Hall. 2011. Harnessing complexity science for interprofessional education development: A case study. Journal of Research in Interprofessional Practice and Education 2(1):100-120.

Weaver, S. J., M. A. Rosen, D. DiazGranados, E. H. Lazzara, R. Lyons, E. Salas, S. A. Knych, M. McKeever, L. Adler, M. Barker, and H. B. King. 2010. Does teamwork improve performance in the operating room? A multilevel evaluation. Joint Commission Journal on Quality and Patient Safety 36(3):133-142.

WHO (World Health Organization). 2013. Interprofessional collaborative practice in primary health care: Nursing and midwifery perspectives. Six case studies. Geneva: WHO.

Wolf, F. A., L. W. Way, and L. Stewart. 2010. The efficacy of medical team training: Improved team performance and decreased operating room delays: A detailed analysis of 4863 cases. Annals of Surgery 252(3):477-483.

Zwarenstein, M., J. Goldman, and S. Reeves. 2009. Interprofessional collaboration: Effects of practice-based interventions on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews 3:CD000072.
