Over the past half century, there have been ebbs and flows of interest in linking what is now called interprofessional education (IPE) with interprofessional collaboration and team-based care. As a result, a commitment to designing, implementing, and evaluating IPE curricula also has come in and out of favor. Since the mid-2000s, concerns about the quality and cost of health care, limited access to care for some groups and populations, and patient safety, together with increasing interest in transforming health professions education, have stimulated a resurgence of interest in IPE as a viable approach to developing interprofessional competencies for effective collaborative practice (IOM, 2000, 2001). Today, however, as contemporary health care has become more outcomes-based, questions have increasingly been raised about the impact and effectiveness of IPE (Cerra and Brandt, 2011; IPEC, 2011). Whereas considerable research has focused on student learning, only recently have researchers begun to look beyond the classroom and beyond learning outcomes for the impact of IPE on such issues as patient safety, patient and provider satisfaction, quality of care, health promotion, population health, and the cost of care (Moore et al., 2009; Walsh et al., 2014).
STUDY CHARGE
In this context, the Institute of Medicine (IOM) convened the Committee on Measuring the Impact of Interprofessional Education on Collaborative Practice and Patient Outcomes. The committee was charged to “analyze the available data and information to determine the best methods for measuring the impact of interprofessional education (IPE) on specific aspects of health care delivery and the functioning of health care systems.” The committee’s charge required moving beyond examining the impact of IPE on learners’ knowledge, skills, and attitudes to focus on the link between IPE and performance in practice, including the impact of IPE on patient and population health and health care delivery system outcomes. Learning has been defined as the act of “developing knowledge, skills or new insights, bringing about a change in understanding, perspective, or the way something is done or acted upon” (Nisbet et al., 2013, p. 469). Therefore, how a professional masters knowledge as an individual or as part of an interprofessional team, group, or network; develops new skills; modifies attitudes and behaviors; and achieves competence and expertise over time all impact these outcomes.
The particular setting within which learning occurs also is vitally important (Bridges et al., 2011; Oandasan and Reeves, 2005; Salas and Rosen, 2013; WHO, 2010). Given the rapidity with which health care around the world is changing, the committee quickly realized the need to reconsider the existing paradigm of how, where, and with whom health professions learning takes place. A central tenet of this shift in perspective is the need to recognize the vital role of the direct involvement of patients, families, and communities in the education-to-practice continuum to help ensure that education, training, and professional development are designed in ways that have a positive impact on health. Therefore, the desired outcome is not just improving learning but improving the health of individuals and populations and enhancing the responsiveness of health systems to such nonhealth dimensions as respect for patients and families, consumer satisfaction, and the affordability of health care for all.
Another question the committee had to confront was whether it is possible to evaluate the impact of any health professions education intervention on improving health or system outcomes, given the degree to which confounding variables can obscure evaluation results. Such variables can take the form of enabling or interfering factors in areas such as professional or institutional culture and workforce or financing policy.
ADDRESSING THE GAPS
In reviewing the IPE literature, the committee found that it is possible to link the learning process with downstream person-, population-, or system-directed outcomes, provided that thoughtful, collaborative, and well-designed studies are intentionally targeted to answering such questions. Despite accumulating data, however, the committee identified numerous gaps in the evidence linking IPE to patient, population, and system outcomes.
In light of these gaps, the committee found it necessary to highlight four areas that, if addressed, would lay a strong foundation for evaluating the impact of IPE on collaborative practice and patient, population, and system outcomes: (1) more closely aligning the education and health care delivery systems, (2) developing a conceptual framework for measuring the impact of IPE, (3) strengthening the evidence base for IPE, and (4) linking IPE with changes in collaborative behavior.
Alignment of Education and Health Care Delivery Systems
Coordinated planning among educators, health system leaders, and policy makers is a prerequisite to creating an optimal learning environment and an effective health workforce (Cox and Naylor, 2013). To this end, educators need to be cognizant of health system redesign efforts, while health system leaders need to recognize the realities of educating and training a competent health workforce. Joint planning is especially important when health systems are undergoing rapid changes, as they are across much of the world today (Coker et al., 2008). IPE is particularly affected by the need for joint planning because the practice environment is where much of the imprinting of such concepts as collaboration and effective teamwork takes place. Despite calls for greater alignment, however, education reform is rarely well integrated with health system redesign (Cox and Naylor, 2013; Earnest and Brandt, 2014; Frenk et al., 2010; Ricketts and Fraher, 2013; WHO, 2010, 2011). Accountability for workforce and health outcomes often is dispersed among academic health centers and health care networks (Ovseiko et al., 2014). Possible exceptions include the rare cases in which ministries of education and health work together on individual initiatives (Booth, 2014; Frenk et al., 2010; MOH, 2014). Even in these cases, however, collaboration tends to be restricted to a single health profession.
Conclusion 1. Without a purposeful and more comprehensive system of engagement between the education and health care delivery systems, evaluating the impact of IPE interventions on health and system outcomes will be difficult.
Such engagement will require the active participation of the major health professions and the health system venues within which their students and practitioners learn together. It would be further enabled if individuals and organizations responsible for overseeing health professions education and health care delivery (including patient, population, and system outcomes) were to align and assume joint accountability for IPE across the lifelong learning continuum.
A Conceptual Framework for Measuring the Impact of IPE
Following an extensive literature search for interprofessional models of learning, the committee determined that no such models sufficiently incorporate all of the components needed to guide future studies effectively. The committee therefore developed a conceptual model that encompasses the education-to-practice continuum, a broad array of learning- and health-related outcomes, and major enabling and interfering factors. The committee puts forth this model with the understanding that it will need to be tested empirically and may need to be adapted to the particular settings in which it is applied. For example, educational structures and terminology differ considerably around the world, and the model may need to be modified to suit local or national conditions. However, the model’s overarching concepts—a learning continuum, learning- and health-related outcomes, and major enabling and interfering factors—would remain.
Adoption of a conceptual model of IPE to guide future study designs would focus related research and evaluations on patient, population, or system outcomes that go beyond learning and testing of team function. Visualizing the entire IPE process illuminates the different environments where IPE exists, as well as the importance of aligning education and practice.
Conclusion 2. Having a comprehensive conceptual model would greatly enhance the description and purpose of IPE interventions and their potential impact. Such a model would provide a consistent taxonomy and framework for strengthening the evidence base linking IPE with health and system outcomes.
Without such a framework, evaluating the impact of IPE on health and system outcomes will be difficult and perhaps impossible. If the individuals and organizations responsible for promoting, overseeing, and evaluating IPE were to address this gap—assuming joint accountability for the development of a consistent taxonomy and comprehensive conceptual framework that accurately describe IPE and all its outcomes—more systematic and robust research would likely be produced.
A Stronger Evidence Base
A comprehensive literature search revealed a dearth of robust studies specifically designed to better link IPE with changes in collaborative behavior or to answer key questions about the effectiveness of IPE in improving health and system outcomes. The lack of a well-defined relationship between IPE and patient and population health and health care delivery system outcomes is due in part to the complexity of the learning and practice environments. It is difficult to generate this evidence in well-resourced settings, but even more difficult in parts of the world with fewer research and data resources (Price, 2005; Weaver et al., 2011).
Efforts to generate this evidence are further hindered by the relatively long lag time between education interventions and patient, population, and system outcomes; the lack of a commonly agreed-upon taxonomy and conceptual model linking education interventions to specific outcomes; and inconsistencies in study designs and methods and a lack of full reporting on the methods employed, which reduce the applicability and generalizability of many IPE study findings (Abu-Rish et al., 2012; Cooper et al., 2001; Olson and Bialocerkowski, 2014; Reeves et al., 2011, 2013; Remington et al., 2006; Salas et al., 2008a; Weaver et al., 2010; Zwarenstein et al., 2009).
There is also a plethora of enabling and interfering factors that directly or indirectly impact outcomes and program evaluation. Diverse and often opaque payment structures and differences in professional and organizational cultures create obstacles to innovative workforce arrangements, thereby impeding interprofessional work. On the other hand, positive changes in workforce and financing policies could enable more effective collaboration and foster more robust evaluation.
Conclusion 3. More purposeful, well-designed, and thoughtfully reported studies are needed to answer key questions about the effectiveness of IPE in improving performance in practice and health and system outcomes.
Linking IPE with Changes in Collaborative Behavior
An essential intermediate step in linking IPE with health and system outcomes is enhanced collaborative behavior and performance in practice. While considerable attention has been focused on developing measures of interprofessional collaboration (CIHC, 2012; McDonald et al., 2014; National Center for Interprofessional Practice and Education, 2013; Reeves et al., 2010; Schmitz and Cullen, 2015), no such measures have as yet been broadly accepted or adopted (Clifton et al., 2006; Hammick et al., 2007; Thannhauser et al., 2010). In fact, the strong contextual dependence of presently available measures (Valentine et al., 2015; WHO, 2013) limits their application beyond a single study or small group of studies. To address this deficiency, the committee makes the following recommendation:
Recommendation 1: Interprofessional stakeholders, funders, and policy makers should commit resources to a coordinated series of well-designed studies of the association between interprofessional education and collaborative behavior, including teamwork and performance in practice. These studies should be focused on developing broad consensus on how to measure interprofessional collaboration effectively across a range of learning environments, patient populations, and practice settings.
These studies could employ different approaches that might include developing instruments and testing their reliability, validity, and usefulness specific to collaborative practice; conducting head-to-head comparisons of existing instruments within particular contexts; and extending the validation process for an existing “best-in-class” instrument to additional professions, learning environments, patient populations, health care settings, and countries. At a minimum, however, these studies should take into account the intended learner outcomes in the three major components of the education continuum—foundational education, graduate education, and continuing professional development. Therefore, each such study should clearly define the intermediate (learner) and more distal (health and system) outcome target(s).
Addressing these four gaps will entail giving IPE greater priority by forming partnerships among the education, practice, and research communities to design studies that are relevant to individual, population, and health system outcomes. Engaging accreditors, policy makers, and funders in the process could provide additional resources for establishing more robust partnerships. Only by bringing all these constituencies together will a series of well-designed studies emerge.
IMPROVING RESEARCH METHODOLOGIES
Understanding the full complexity of IPE and the education and health care delivery systems within which it resides is critical for designing studies to measure the impact of IPE on individual, population, and health system outcomes. Given this complexity, the use of a single type of research method alone may generate findings that fail to provide sufficient detail and context to be informative. IPE research would gain in stature from the adoption of a mixed-methods approach that combines focused quantitative and qualitative data to yield insight into the “what” and “how” of an IPE intervention/activity and its outcomes. Such an approach has been shown to be particularly useful for exploring the perceptions of both individuals and society regarding issues of quality of care and patient safety (Curry et al., 2009; De Lisle, 2011).
The committee recognizes the value of using a variety of data sources and methods for measuring the impact of IPE, including large data sets (i.e., “big data”) for exploring potential relationships among variables. Similarly, the committee acknowledges the reality that demonstrating a return on investment will generally be necessary to spur greater financial investments in IPE. This is where alignment between the education and health care delivery systems becomes critical, so that both the academic partner (creating the IPE intervention) and the health care delivery system partner (hosting the intervention and showcasing its outcomes) are working together. In this regard, policy makers, regulatory agencies, accrediting bodies, and professional organizations that oversee or encourage collaborative practice might provide additional incentives for programs and organizations to better align IPE with collaborative practice so that the potential long-term savings in health care can be evaluated.
Another issue identified by the committee is that a majority of IPE research is conducted by individual educators working alone, who may not have the evaluation expertise, time, or resources needed to conduct the protocols required to address the key questions in the field. In the absence of robust research designs, there is a distinct risk that future studies testing the impact of IPE on health and system outcomes will continue to be unknowingly biased, underpowered to measure true differences, and not generalizable across different systems. These problems could be overcome by teams of individuals with complementary expertise, including an educational evaluator, a health services researcher, and an economist, in addition to educators and others engaged in IPE.
Based on the evidence and the committee’s expert opinion, it is apparent that using either quantitative or qualitative methods alone will limit the ability of investigators in both developed and developing countries to produce high-quality studies linking IPE with patient, population, and health system outcomes. The committee therefore makes the following recommendation:
Recommendation 2: Health professions educators and academic and health system leaders should adopt a mixed-methods research approach for evaluating the impact of interprofessional education (IPE) on health and system outcomes. When possible, such studies should include an economic analysis and be carried out by teams of experts that include educational evaluators, health services researchers, and economists, along with educators and others engaged in IPE.
Once best practices for designing, implementing, and evaluating IPE outcomes have been established, disseminating them widely through detailed reporting or publishing can strengthen the evidence base and help guide future studies linking IPE to outcomes. Such studies should include those focused on eliciting in-depth patient, family, and caregiver experiences of interprofessional collaborative practice. In the meantime, the committee has developed an outline of the key elements of a potential program for research connecting IPE to health and system outcomes for further consideration by educators, health care delivery system leaders, and policy makers.
CLOSING REMARKS
Although there is a widespread and growing belief that IPE may improve interprofessional collaboration, promote team-based health care delivery, and enhance personal and population health, definitive evidence linking IPE to desirable intermediate and final outcomes does not yet exist. This report identifies and analyzes the major challenges to closing this evidence gap and offers a range of strategies for overcoming barriers that limit the establishment of a clear linkage between IPE and improved health and system outcomes.
The committee reached three major conclusions and formulated two recommendations that collectively are aimed at elevating the profile of IPE in a rapidly changing world. The committee hopes this report will shed additional light on the value of collaboration between educators and practitioners and patients, families, and communities, as well as all those who come together in working to improve lives through treatment and palliation, disease prevention, and wellness interventions. As with other forms of health professions education, only through the publication of rigorously designed studies can the potential impact of IPE on health and health care be fully realized.
REFERENCES
Abu-Rish, E., S. Kim, L. Choe, L. Varpio, E. Malik, A. A. White, K. Craddick, K. Blondon, L. Robins, P. Nagasawa, A. Thigpen, L. L. Chen, J. Rich, and B. Zierler. 2012. Current trends in interprofessional education of health sciences students: A literature review. Journal of Interprofessional Care 26(6):444-451.
Booth, D. 2014. Remarks by U.S. Ambassador Donald Booth at the inauguration of the new medical education initiative Ambo University. http://ethiopia.usembassy.gov/latest_embassy_news/remarks/remarks-by-u.s.-ambassador-donald-booth-on-inauguration-of-the-new-medical-education-initiative-ambo-university (accessed January 12, 2015).
Bridges, D. R., R. A. Davidson, P. S. Odegard, I. V. Maki, and J. Tomkowiak. 2011. Interprofessional collaboration: Three best practice models of interprofessional education. Medical Education Online 16(6035):1-10.
Cerra, F., and B. F. Brandt. 2011. Renewed focus in the United States links interprofessional education with redesigning health care. Journal of Interprofessional Care 25(6):394-396.
CIHC (Canadian Interprofessional Health Collaborative). 2012. An inventory of quantitative tools measuring interprofessional education and collaborative practice outcomes. Vancouver, BC: CIHC.
Clifton, M., C. Dale, and C. Bradshaw. 2006. The impact and effectiveness of interprofessional education in primary care: An RCN literature review. London, England: Royal College of Nursing. https://www.rcn.org.uk/__data/assets/pdf_file/0004/78718/003091.pdf (accessed March 17, 2015).
Coker, R., R. A. Atun, and M. McKee. 2008. Health systems and the challenge of communicable diseases: Experiences from Europe and Latin America, European Observatory on Health Systems and Policies Series. Maidenhead and New York: McGraw-Hill Education.
Cooper, H., C. Carlisle, T. Gibbs, and C. Watkins. 2001. Developing an evidence base for interdisciplinary learning: A systematic review. Journal of Advanced Nursing 35(2):228-237.
Cox, M., and M. Naylor. 2013. Transforming patient care: Aligning interprofessional education with clinical practice redesign. Proceedings of a Conference sponsored by the Josiah Macy Jr. Foundation in January 2013. New York: Josiah Macy Jr. Foundation. http://macyfoundation.org/docs/macy_pubs/JMF_TransformingPatientCare_Jan2013Conference_fin_Web.pdf (accessed March 17, 2014).
Curry, L. A., I. M. Nembhard, and E. H. Bradley. 2009. Qualitative and mixed methods provide unique contributions to outcomes research. Circulation 119(10):1442-1452.
De Lisle, J. 2011. The benefits and challenges of mixing methods and methodologies: Lessons learnt from implementing qualitatively led mixed methods research designs in Trinidad and Tobago. Caribbean Curriculum 18:87-120.
Earnest, M., and B. Brandt. 2014. Aligning practice redesign and interprofessional education to advance triple aim outcomes. Journal of Interprofessional Care 28(6):497-500.
Frenk, J., L. Chen, Z. A. Bhutta, J. Cohen, N. Crisp, T. Evans, H. Fineberg, P. Garcia, Y. Ke, P. Kelley, B. Kistnasamy, A. Meleis, D. Naylor, A. Pablos-Mendez, S. Reddy, S. Scrimshaw, J. Sepulveda, D. Serwadda, and H. Zurayk. 2010. Health professionals for a new century: Transforming education to strengthen health systems in an interdependent world. Lancet 376(9756):1923-1958.
Hammick, M., D. Freeth, I. Koppel, S. Reeves, and H. Barr. 2007. A best evidence systematic review of interprofessional education: BEME guide no. 9. Medical Teacher 29(8):735-751.
IOM (Institute of Medicine). 2000. To err is human: Building a safer health system. Washington, DC: National Academy Press.
IOM. 2001. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
IPEC (Interprofessional Education Collaborative). 2011. Core competencies for interprofessional collaborative practice: Report of an expert panel. Washington, DC: IPEC.
McDonald, K. M., E. Schultz, L. Albin, N. Pineda, J. Lonhart, V. Sundaram, C. Smith-Spangler, J. Brustrom, E. Malcolm, L. Rohn, and S. Davies. 2014. Care coordination measures atlas version 4 (Prepared by Stanford University under subcontract to American Institutes for Research on Contract No. HHSA290-2010-00005I). AHRQ Publication No. 14-0037-EF. Rockville, MD: Agency for Healthcare Research and Quality. http://www.ahrq.gov/professionals/prevention-chronic-care/improve/coordination/atlas2014 (accessed April 9, 2015).
MOH (Ministry of Health, Kingdom of Saudi Arabia). 2014. The MOH, in collaboration with the Ministry of Education, evaluates the role of the health affairs directorates in educating on MERS CoronaVirus. http://www.moh.gov.sa/en/Ministry/MediaCenter/News/Pages/News-2014-05-13-002.aspx (accessed March 17, 2015).
Moore, D. E., Jr., J. S. Green, and H. A. Gallis. 2009. Achieving desired results and improved outcomes: Integrating planning and assessment throughout learning activities. Journal of Continuing Education in the Health Professions 29(1):1-15.
National Center for Interprofessional Practice and Education. 2013. Measurement instruments. https://nexusipe.org/measurement-instruments (accessed April 9, 2015).
Nisbet, G., M. Lincoln, and S. Dunn. 2013. Informal interprofessional learning: An untapped opportunity for learning and change within the workplace. Journal of Interprofessional Care 27(6):469-475.
Oandasan, I., and S. Reeves. 2005. Key elements for interprofessional education. Part 1: The learner, the educator and the learning context. Journal of Interprofessional Care 19(Suppl. 1):21-38.
Olson, R., and A. Bialocerkowski. 2014. Interprofessional education in allied health: A systematic review. Medical Education 48(3):236-246.
Ovseiko, P. V., A. Heitmueller, P. Allen, S. M. Davies, G. Wells, G. A. Ford, A. Darzi, and A. M. Buchan. 2014. Improving accountability through alignment: The role of academic health science centres and networks in England. BMC Health Services Research 14:24.
Price, J. 2005. Complexity and interprofessional education. In The theory-practice relationship in interprofessional education, Ch. 8, edited by H. Colyer, M. Helme, and I. Jones. King’s College, London: Higher Education Academy. Pp. 79-87.
Reeves, S., S. Lewin, S. Espin, and M. Zwarenstein. 2010. Interprofessional teamwork for health and social care. London: Wiley-Blackwell.
Reeves, S., J. Goldman, J. Gilbert, J. Tepper, I. Silver, E. Suter, and M. Zwarenstein. 2011. A scoping review to improve conceptual clarity of interprofessional interventions. Journal of Interprofessional Care 25(3):167-174.
Reeves, S., L. Perrier, J. Goldman, D. Freeth, and M. Zwarenstein. 2013. Interprofessional education: Effects on professional practice and healthcare outcomes (update). Cochrane Database of Systematic Reviews 3:CD002213.
Remington, T. L., M. A. Foulk, and B. C. Williams. 2006. Evaluation of evidence for interprofessional education. The American Journal of Pharmaceutical Education 70(3):66.
Ricketts, T. C., and E. P. Fraher. 2013. Reconfiguring health workforce policy so that education, training, and actual delivery of care are closely connected. Health Affairs (Millwood) 32(11):1874-1880.
Salas, E., and M. A. Rosen. 2013. Building high reliability teams: Progress and some reflections on teamwork training. BMJ Quality and Safety 22(5):369-373.
Salas, E., D. DiazGranados, C. Klein, C. S. Burke, K. C. Stagl, G. F. Goodwin, and S. M. Halpin. 2008a. Does team training improve team performance? A meta-analysis. Human Factors: The Journal of the Human Factors and Ergonomics Society 50(6):903-933.
Schmitz, C. C., and M. J. Cullen. 2015. Evaluating interprofessional education and collaborative practice: What should I consider when selecting a measurement tool? https://nexusipe.org/evaluating-ipecp (accessed April 9, 2015).
Thannhauser, J., S. Russell-Mayhew, and C. Scott. 2010. Measures of interprofessional education and collaboration. Journal of Interprofessional Care 24(4):336-349.
Valentine, M. A., I. M. Nembhard, and A. C. Edmondson. 2015. Measuring teamwork in health care settings: A review of survey instruments. Medical Care 53(4):e16-e30.
Walsh, K., S. Reeves, and S. Maloney. 2014. Exploring issues of cost and value in professional and interprofessional education. Journal of Interprofessional Care 28(6):493-494.
Weaver, L., A. McMurtry, J. Conklin, S. Brajtman, and P. Hall. 2011. Harnessing complexity science for interprofessional education development: A case study. Journal of Research in Interprofessional Practice and Education 2(1):100-120.
Weaver, S. J., M. A. Rosen, D. DiazGranados, E. H. Lazzara, R. Lyons, E. Salas, S. A. Knych, M. McKeever, L. Adler, M. Barker, and H. B. King. 2010. Does teamwork improve performance in the operating room? A multilevel evaluation. Joint Commission Journal on Quality and Patient Safety 36(3):133-142.
WHO (World Health Organization). 2010. Framework for action on interprofessional education and collaborative practice. Geneva: WHO.
WHO. 2011. Transformative scale up of health professional education. Geneva: WHO.
WHO. 2013. Interprofessional collaborative practice in primary health care: Nursing and midwifery perspectives. Six case studies. Geneva: WHO.
Zwarenstein, M., J. Goldman, and S. Reeves. 2009. Interprofessional collaboration: Effects of practice-based interventions on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews 3:CD000072.