
4

Metrics

Summary: The importance of measuring the impacts of interprofessional education resonated with Forum member and workshop planning committee co-chair Scott Reeves of the University of California, San Francisco. Reeves, who has devoted much of his career to studying the impact of interprofessional education (IPE), asserted, “If we want to understand culture and begin to develop robust metrics, we need to go in there and we need to study it.” In essence, implementers of IPE need to be clear about the purpose of their work so that researchers can confidently analyze whether or not a program is successful. According to Reeves, having robust measurements of the effectiveness of IPE allows programs to be compared and conclusions to be drawn. This assertion was echoed by other participants at the workshop and forms the foundation for this chapter on developing metrics to advance interprofessional education and collaborative care.

EMBRACING A COMMON PARLANCE

Without clear conceptualizations of what is being investigated and without a common understanding of what various terms mean, researchers studying IPE face a variety of problems, Reeves said. He also noted that throughout the workshop participants had used certain words interchangeably that are not, in fact, synonyms. For example, he said, “assessment” is not the same as “evaluation,” despite how some participants had used the two words. And although “interprofessional” had been defined early in the workshop, participants continued to mix their terms and to offer examples of interdisciplinary and multidisciplinary education and care. Reeves emphasized that one must be clear about the terminology and concepts or the entire research methodology becomes flawed.

Assessment Versus Evaluation

Reeves made a useful distinction between assessment and evaluation. Assessment determines a learner’s level of understanding, while evaluation determines how well a program, or an educator teaching a course, is conveying its intended messages. For assessment, he said, there needs to be a meaningful analysis of how the individual learns, not just in the short term but in the long term as well. For evaluation, thoughtful consideration is needed to determine how well the program is conveying the desired messages and information.

Interprofessional, Interdisciplinary, or Multidisciplinary

The terms “interprofessional” and “interdisciplinary” are often used interchangeably in the literature, but at the workshop most speakers and participants used the word “interprofessional.” This is not surprising, said Reeves, given that the workshop title included the term “interprofessional education.” Both terms imply an integrative, collaborative approach to education or practice, he said. “Multidisciplinary,” on the other hand, simply means several fields, areas of expertise, or disciplines coming together without integrating their services (Reeves et al., 2010).

MEASUREMENT PRACTICES IN IPE

According to Forum member Eric Holmboe of the American Board of Internal Medicine (ABIM), there are two overarching themes that arise when one discusses measurement practices in IPE: the need for competency-based models and the need for a more robust evidence base. Although work is under way to fill the gaps in the evidence base, serious obstacles remain because of uncertainty about what to measure and how to measure it.

Currently, Holmboe said, there are differences of opinion regarding what the unit of analysis should be when measuring various aspects of IPE (e.g., the individual, the program, or the institution) and where such an assessment should start. One Forum member suggested that, regardless of whether the analysis is of the faculty, the curriculum, the patient, or the community, the tools do exist, but the analysis needs to be broken out in a way that allows those tools to be applied.


Analyzing Program Design

A number of workshop participants proposed starting with the desired results and working backward to determine the best ways to educate students. However, Holmboe said, this design goes against most health professional education models, which typically start with the student and work forward. Holmboe added that working this way also means that educators have to predict what future practice will entail and attempt to prepare health professional students to fit within that model. The World Health Organization (WHO) International Classification of Functioning, Disability and Health framework, presented by workshop speaker Stefanus Snyman in Chapter 2, may be a useful tool for envisioning such a practice, he said.

Purposeful IPE Research and Program Design

As the leader of the small group on IPE assessment, Holmboe reported to the wider audience the group’s contention that before initiating any assessment, the purpose of the assessment should be clarified. If the purpose is to drive improvements and feedback, for instance, tools could be built that have catalytic effects, impelling future learning in ways that improve health and drive education. One example of this is the Kaiser Permanente care teams in Colorado, each of which includes physicians, clinical pharmacists, nurses, and medical assistants. A main focus of the teams’ care since 2008 has been hypertension control, and during that time the percentage of members whose hypertension was under control rose from 61 to 83 percent, the latter figure roughly 10 to 30 percent above the national average. As workshop speaker Dennis Helling, executive director of pharmacy operations and therapeutics at Kaiser Permanente, said, “We are a team-based, fully integrated delivery system, with an electronic medical record that is a great site for IPE.” And, he added, the pharmacy operations section is taking full advantage of this IPE opportunity by engaging its students in meaningful work as part of these well-functioning teams.

Despite the accepted benefits of student exposure to well-functioning teams like those at Kaiser Permanente Colorado, it has not been possible to directly measure the effects of interprofessional education on health. As Holmboe said, to assess IPE well, researchers will likely need to embrace more complex measurement strategies that require developmental expertise as well as a knowledge of methodology and program evaluation. It is possible, he said, that a combination of approaches and tools that includes both qualitative and quantitative methods will be required.

Holmboe speculated that the argument against a complex approach to analysis would be that it is easier to use the reductionist model of measuring small pieces of IPE. The problem, as he sees it, is that such a simplification
inevitably leads to a loss of information, bringing into question the meaning and the value of the assessment. Speaker Mark Earnest of the University of Colorado agreed and then elaborated on the issue. To assess collaboration effectively, he said, one needs measurements that are valid and reliable. He added, to be valid and reliable, the data need to be multi-source (that is, not just from a single person), to occur over multiple points in time across multiple settings, and to be measured against a standardized rubric. This is quite difficult to accomplish, Earnest said, although ABIM is working on developing such a model. Holmboe, who is from ABIM, pointed to the realist evaluation strategy by Ray Pawson and Nick Tilley and also to Michael Quinn Patton’s developmental evaluation as approaches that might provide insights into how IPE could be assessed and evaluated (Pawson and Tilley, 1997; Patton, 2011). In thinking through the various models to assess his students’ ability to work collaboratively, Earnest said that he studied the pros and cons of various educational models. More details are provided in Box 4-1.

Self-Directed Assessment

Assessment is something that all health professionals need to do to remain relevant within a field, but, Holmboe said, most often the assessor is not the person who would benefit most from the assessment. He therefore suggested that organizations should increasingly move to self-directed assessments. However, he said, this would have ramifications for the measurement of professional collaborative relationships. “When you ask an audience if they collaborate well, everybody puts their hands up, because nobody wants to say they’re a bad collaborator.” One issue, then, is whether self-assessment is biased and, if it is, how that bias would affect interprofessional assessment.

One participant from the breakout group on assessment suggested using newer technologies to track self-assessments in a more structured manner. This might include portfolios, blogs, or electronic applications installed on mobile devices, such as iPhones and iPads, which could be sources of information for measuring the effectiveness of IPE applications. In fact, the participant said, some IPE programs are already using blogs within portfolios that capture what happens over time, particularly from a developmental perspective.

Forum and planning committee member Jan De Maeseneer of Ghent University in Belgium commented that the IPE instructors at Ghent University require students to maintain a portfolio of written and electronic reflections that begin in their first year and continue throughout their 6 years at the university. The reason for having students include their clinical
experiences in the portfolio, he said, is to encourage them to internalize the need for lifelong continuous professional development.

ASSESSMENT TOOLS

Eric Holmboe, in his presentation about the breakout group he led, talked about the need for faculty who are competent in IPE. “A general problem for all of concept-based education,” he said, “is that we have a faculty workforce across all the health professions who were not trained in the very system we are trying to create.” Based on the discussions of his small group, Holmboe commented that many faculties are struggling, so it will be necessary to offer many co-learning activities around assessment as well as education.

Despite the challenges of measuring competencies among learners, a number of presenters at the workshop did report the existence of fairly robust tools for assessing learners and evaluating programs at their institutions. The tools reported by the presenters are described below, organized by the institutions at which the various IPE measurement methods are used.

Curtin University

At Curtin University in Australia, faculty have developed the Interprofessional Capability Assessment Tool (ICAT), illustrated in Figure 4-1. Drawn from models developed at Sheffield Hallam University and the University of Toronto, the ICAT assesses students within four domains: communication, professionalism, collaborative practice, and client-centered service and care. Students, faculty, and field preceptors all complete the ICAT form to provide students with feedback on the development of their interprofessional capabilities.

University of Colorado

Earnest, the IPE director from the University of Colorado, reported using an assessment program from Purdue University called the Comprehensive Assessment for Team-Member Effectiveness (CATME). With this tool, self- and peer-assessment information is gathered to determine how successfully each member contributed to the team’s performance. No assessments from individuals are provided in the CATME report, only group feedback created by aggregating the data from the individual responses. The eventual goal is to be able to compare these outcomes with team performance scores gathered from other interprofessional activities in order to measure the students’ interprofessional growth over time.
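
A minimal sketch of this kind of aggregation follows; the rating categories, the 1-to-5 scale, and the simple averaging are illustrative assumptions rather than CATME’s actual items or scoring rules.

```python
from statistics import mean

# Hypothetical peer ratings on a 1-to-5 scale. The category names and the
# scale are assumptions for illustration, not CATME's actual instrument.
peer_ratings = {
    "contribution": {
        "student_a": [4, 5, 4],   # ratings received from three teammates
        "student_b": [3, 4, 4],
    },
    "interaction": {
        "student_a": [5, 4, 5],
        "student_b": [4, 4, 3],
    },
}

def team_feedback(ratings):
    """Collapse individual responses into team-level averages so the report
    contains group feedback rather than any one person's scores."""
    report = {}
    for category, by_student in ratings.items():
        all_scores = [s for scores in by_student.values() for s in scores]
        report[category] = round(mean(all_scores), 2)
    return report

print(team_feedback(peer_ratings))
# {'contribution': 4.0, 'interaction': 4.17}
```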


BOX 4-1
Mark Earnest, M.D., Ph.D.
University of Colorado


When designing the interprofessional experience for students at the University of Colorado, Mark Earnest and colleagues studied the pros and cons of various educational models. They were particularly interested in finding a model that could assess student learning. Through their research they considered the following models:

•    Traditional model of the facilitated discussion

•    Group projects

•    Problem-based learning

•    Michaelsen’s team-based learning model

In the traditional model of the facilitated discussion, students participate in a planned “experience” and read literature to more fully understand the experience. They then come back to the university and discuss what they learned, typically with a faculty preceptor who serves as the referee. The goal is to engage all learners in speaking and active listening. In this model, the group does not necessarily have to make a decision, but if they do, the stakes are fairly small.

The group projects model requires students to work together in completing a term paper. For example, each student may write a paragraph, and in the final product all the paragraphs are assembled. But this is not teamwork or collaboration, Earnest said, and, generally, the students do not feel invested in the product in part
because they do not believe the paper is read with sufficient attention. Furthermore, evaluating each student’s contribution to the term paper is difficult.

Problem-based learning has a strong methodological foundation, but measuring the contribution of individual students is still difficult. Measuring or comparing the performance of one student team to another is difficult, as is finding problems that are amenable to this learning method and that all students embrace and are equally ready for.

Michaelsen’s team-based learning model had a number of valuable components, but, as with the other models, much of the student work ultimately cannot be assessed. One team’s outcome can be qualitatively compared with that of another team, but an individual team’s performance is not measurable.

Given the limitations of each of the models, Earnest and his colleagues devised a new model with a set of principles for what they considered optimal conditions for learning about teamwork. One condition was the requirement that the team be the unit of learning and the unit of work. With the method that Earnest and colleagues developed, the team’s goal is important enough to them that they do not need a faculty preceptor. This situation more closely emulates real work environments, where there are no referees and team members need to work out challenges among themselves.

Borrowing from team- and problem-based learning models, Earnest’s model has student teams receive an activity that requires group problem solving and collaboration for successful completion. Unlike the case with the group term paper, this activity cannot be easily or efficiently accomplished by single individuals or by individuals working in parallel. In addition, the team performance is measurable so that at the end of the learning activity, members can compare how they did in a standardized objective way and find out how well their team performed compared to other teams. Those teams with better collaboration receive higher scores.

In this model, an activity begins with roughly eight teams gathering in a room with a single facilitator who keeps time and directs the learning experience. The teams work in parallel to solve a multidimensional clinical puzzle in which they identify potential harms and process errors. Teams are given an hour to complete the task. At the end of that time, their work is done, and each team receives a score that is posted at the front of the room. This is followed by a debriefing that focuses on what each team did to accomplish the activity and how the team got to its answer.

Through this team-based, competitive activity, educators at the University of Colorado hope to create a language and a set of experiences that students can translate into clinical settings that will provide them with a richer and more sophisticated understanding of how to collaborate effectively.



FIGURE 4-1 Interprofessional education capability framework—and the ICAT.

SOURCE: Brewer and Jones, in press.

University of Virginia

Faculty of the University of Virginia (UVA) IPE program are also interested in the longitudinal assessment of student learning, said Valentina Brashers, the UVA presenter at the workshop. Their tool, the Interprofessional Teamwork Objective Structured Clinical Examination, assesses students’ pre- and post-clinical/clerkship outcomes in order to better understand student learning before and after they complete four IPE simulation experiences, all of which take place in the same year. Students are also assessed following each individual simulation experience, she added. Using the Collaborative Behaviors Observational Assessment Tool, faculty can track student achievement of the competencies corresponding to a specific simulation activity. Another assessment tool used at UVA is the Team Skills Scale. According to Brashers, this tool was developed by Hepburn and colleagues (1996) to assess self-perceived team skills in the preclinical education phase.

Brashers also said that researchers from UVA are looking into how well participants of the Continuing Interprofessional Education (CIE) Program
follow through on expressed commitments to change. In this Commitment to Change model, CIE participants are asked to fill out a “commitment to change” form before leaving the premises; UVA staff follow up with each participant 3 and 6 months later to ask whether the participant made the intended change. Although the results from this activity at UVA are still pending, Brashers said, studies have shown that health providers who make such commitments are more likely to change their behavior than those who do not make the commitments (Wakefield et al., 2003; Fjortoft, 2007).

University of Missouri

The University of Missouri’s IPE presenter, Carla Dyer, reported how a measure of safety—decreasing hospital patient falls—has been used as the endpoint for assessing student-based interprofessional interventions in an attempt to link IPE to patient outcomes. Using patient interviews to assess student success, the research group found that despite the lack of evidence demonstrating a significant impact on patient falls—which may have been an artifact of the small sample size—93 percent of patients reported that the students’ interventions had value. Furthermore, through pre- and post-intervention testing of the participating medical and nursing students, faculty did find that the students had significantly greater confidence in assessing and intervening with at-risk patients after participating in the interventions.
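
A paired pre/post comparison of the kind Dyer described can be illustrated with standard statistical tooling. In the sketch below, the confidence scores are invented and the choice of a paired t-test is an assumption; the Missouri team’s actual data and analysis method are not detailed in this summary.

```python
from scipy import stats

# Hypothetical self-reported confidence scores (1-10 scale) for the same
# eight students before and after the fall-prevention activity. The values
# are invented for illustration only.
pre_confidence = [5, 6, 4, 5, 7, 6, 5, 4]
post_confidence = [7, 8, 6, 7, 8, 8, 7, 6]

# Paired comparison: each student serves as his or her own control.
result = stats.ttest_rel(post_confidence, pre_confidence)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```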

Department of Veterans Affairs

One of the evaluation tools used by the Department of Veterans Affairs (VA), reported on by Kathryn Rugen at the workshop, is the VA Learner Perception Survey. According to Rugen, this tool was modified specifically for use in primary care to include attributes of the PACT (Patient Aligned Care Teams) model of patient-centered, team-based interprofessional care. This revised survey was piloted in 2012. Rugen said that preliminary analysis showed that the trainees within the centers of excellence were reporting higher satisfaction rates, although further assessments (which are forthcoming) are needed to confirm these preliminary results.

Linköping University

At Linköping University in Sweden, Margaretha Wilhelmsson and colleagues were interested in knowing whether certain personal attributes indicated a readiness for interprofessional learning. According to Wilhelmsson, who represented the university’s IPE program at the workshop, they studied approximately 700 medical and nursing students from programs across
Sweden. Using the Readiness for Interprofessional Learning Scale, they found that women and those enrolled in nursing programs displayed earlier readiness for interprofessional learning. Although the study included only nursing and medical students, it does indicate that some students may be more ready than others to work collaboratively, and such increased readiness could lead to greater success in interprofessional education and collaboration (Wilhelmsson et al., 2011).

EVALUATING THE LINK FROM IPE TO INTERPROFESSIONAL PRACTICE

In her summary remarks, Forum member Gillian Barclay of the Aetna Foundation said that activities are under way to measure “care coordination” in the United States. For example, she pointed out that in 2010 the National Quality Forum published Preferred Practices and Performance Measures for Measuring and Reporting Care Coordination (NQF, 2010) and that the Agency for Healthcare Research and Quality produced the Care Coordination Measures Atlas (AHRQ, 2012). The National Committee for Quality Assurance has also made its Care Coordination Process Measures available, in addition to similar measurement publications from other organizations. Despite these laudable efforts to measure care coordination activities, however, no organizations are attempting to measure linkages between IPE and interprofessional practice (IPP). As Barclay said, “It is a bit disturbing because the assumption is made that care can be coordinated without really figuring out if people have competencies and skills to work together as a team. It is not as simple as just putting people there and having them coordinate care.” In addition, she added, many of the indicators used to measure outcomes in care coordination come from the clinical environment, such as the 30-day readmission rate and the time spent in a waiting room. Barclay then challenged the audience to go beyond the walls of the clinical environment and to use IPE-to-IPP indicators that measure outcomes in population health.

Although Forum member Brenda Zierler from the University of Washington agreed with Barclay, she added that, from a clinical perspective, there may be difficulties in linking patient outcomes to IPE training events in the simulation lab or classroom for pre-licensure students. One reason for this is that students are trained together in team-based activities and then placed in clinical sites one student at a time. Another issue is the inability of high-functioning clinical teams to articulate team competencies to students. This issue was also raised by Matthew Wynia of the American Medical Association, who found in a study with colleagues that team members do not always see what they do as transferable, teachable, or something that others could adopt and learn (Mitchell et al., 2012). As a result, there are potential teachers and role models of team care who go
untapped because these individuals do not recognize that their activities are teachable.

Key Messages Raised by Individual Speakers

•    Implementers of IPE need to be clear about the purpose of their work so researchers can confidently analyze whether or not a program is successful. (Reeves)

•    Uncertainty over how to measure IPE creates obstacles to developing competency-based models and an evidence base for IPE. (Holmboe)

•    A complex, multi-sourced approach to assessment and evaluation is needed to distill the meaning and value of IPE. (Earnest and Holmboe)

•    Tools for assessing interprofessional learning are being developed and refined. (Brashers, Dyer, Earnest, Forman, and Rugen)

REFERENCES

AHRQ (Agency for Healthcare Research and Quality). 2012. Patient centered medical home resource center. http://www.pcmh.ahrq.gov/portal/server.pt/community/pcmh__home/1483/pcmh_defining_the_pcmh_v2 (accessed March 4, 2013).

Brewer, M., and S. Jones. In press. An interprofessional practice capability framework focusing on safe, high quality client centred health service. Journal of Allied Health.

Fjortoft, N. 2007. The effectiveness of commitment to change statements on improving practice behaviors following continuing pharmacy education. American Journal of Pharmaceutical Education 71(6):112.

Hepburn, K., R. A. Tsukuda, and C. Fasser. 1996. Team skills scale. In G. D. Heinemann and A. M. Zeiss, eds., Team performance in health care: Assessment and development. New York: Kluwer Academic/Plenum Publishers.

Mitchell, P., M. Wynia, R. Golden, B. McNellis, S. Okun, C. E. Webb, V. Rohrbach, and I. von Kohorn. 2012. Core principles and values of effective team-based care. Discussion Paper, Institute of Medicine, Washington, DC. http://iom.edu/Global/Perspectives/2012/TeamBasedCare.aspx (accessed March 12, 2013).

NQF (National Quality Forum). 2010. Preferred practices and performance measures for measuring and reporting care coordination: A consensus report. Washington, DC: NQF.

Patton, M. 2011. Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: Guilford Press.

Pawson, R., and N. Tilley. 1997. Realistic evaluation. London, UK: Sage.

Reeves, S., S. Lewin, S. Espin, and M. Zwarenstein. 2010. Interprofessional teamwork for health and social care. Oxford, UK: Wiley-Blackwell.


Wakefield, J., C. P. Herbert, M. Maclure, C. Dormuth, J. M. Wright, J. Legare, P. Brett-MacLean, and J. Premi. 2003. Commitment to change statements can predict actual change in practice. Journal of Continuing Education in the Health Professions 23(2):81–92.

Wilhelmsson, M., S. Ponzer, L. O. Dahlgren, T. Timpka, and T. Faresjö. 2011. Are female students in general and nursing students more ready for teamwork and interprofessional collaboration in healthcare? BMC Medical Education 11:15.
