2

Practical Examples of Health Professional Education Assessment

Key Messages

  • It is not enough to assess the product of a team process; one needs to also observe how the decision gets made in order to give feedback to the team about how to improve. (Baker, Zierler)
  • Although the design of the scale is important, what really matters is how the assessors are trained to observe. (Baker)
  • Communication is the single most important patient safety issue. (Zierler)
  • Assessing teams and assessing communication are very difficult to do. (Baker, Zierler)
  • There is no one-tool-fits-all for interprofessional education (IPE). The assessment instrument needs to be tailored based on the curriculum objectives, the goals, and the setting in which the interprofessional experience will take place. (Baker, Zierler)

As the moderator of the session on practical examples, Forum and workshop planning committee member Carol Aschenbrener from the Association of American Medical Colleges (AAMC) opened with remarks emphasizing comments made by workshop speaker John Norcini that when health professional learners are tested using real-life situations, they go to the bedside to learn. The following are three examples of existing assessments that would prompt students to go to the bedside to learn because the answers to these questions cannot be found in a textbook.

David Baker, who is the senior vice president for health at IMPAQ International, was the first speaker. He focused on general observational tools for assessing team skills in the clinical setting. The next speaker was Jody Frost, who is the lead academic affairs specialist at the American Physical Therapy Association. She is also the lead on the Interprofessional Professionalism Collaborative (IPC), and she focused on an emerging instrument to assess a special interprofessional skill—interprofessional professionalism. The third speaker was Forum member Brenda Zierler. Zierler is the co-director of the Center for Interprofessional Education, Research and Practice at the University of Washington Health Science Center. She talked about the system of assessments used at the University of Washington to assess both the learners and the program.

TEAM-BASED CARE AND COMMUNICATION

David Baker, IMPAQ International

Baker began his talk by framing the way he thinks about teamwork within four separate categories: (1) the components, (2) the elements, (3) the measures that relate to those components, and (4) the challenges (see Table 2-1).

Components

Baker broke the components of teamwork down into knowledge, skills, attitudes related to team performance, and outcomes.

Knowledge

For a team to reach its goal, members need to know (knowledge) the roles and responsibilities of each team member and how individuals’ roles and job assignments fit in with those of the rest of the team. Accomplishing a shared goal assumes the team has a shared understanding, or shared mental model, of the work of the team. For example, the plan of care and the point at which the goal has been reached both need to be understood by all of the team members in order to accomplish the overall goal.

Skills

In terms of skills, Baker referred to the Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS) curriculum published by the Agency for Healthcare Research and Quality (AHRQ). TeamSTEPPS is an evidence-based system designed for health care professionals to improve safety through better teamwork (AHRQ, n.d.). Its strategies and tools to enhance performance and patient safety can be divided into four areas, which are the basic learnable skills for teamwork. The four skills are leadership, communication, situation monitoring, and mutual support. Leadership, communication, and monitoring can be taught, and knowing the roles and responsibilities of each team member allows for assessment of whether or not individual members are performing up to expectations. Mutual support involves a different cultural context than the other three elements, and it fosters a climate of assistance and support for obtaining a high level of patient safety.

TABLE 2-1 Baker’s Teamwork Framework: Components, Elements, Measures, and Challenges

Knowledge
  Elements: Roles/responsibilities; shared mental model
  Measures: Knowledge test; knowledge structures
  Challenges: Too easy; too complex

Skills
  Elements: Leadership; communication; situation monitoring; mutual support
  Measures: Self-report; observation
  Challenges: Too easy to fake; necessary evil

Attitudes
  Elements: Importance of teamwork; mutual trust
  Measures: Self-report
  Challenges: Too easy to fake

Outcomes
  Elements: Accuracy; timeliness; safety; performance
  Measures: Number correct; time to …; error counts; complication rate; mortality
  Challenges: Neglects the “how”

SOURCE: Baker, 2013.

Attitudes

The importance of teamwork and mutual trust is emphasized under the attitudes component. TeamSTEPPS is designed to influence an individual member’s attitudes toward teamwork by improving skills and increasing knowledge about the effectiveness of teams. Attitudes are assessed through the TeamSTEPPS Teamwork Attitudes Questionnaire (T-TAQ), which was designed to measure individual attitudes related to the core components of teamwork (i.e., leadership, mutual support, situation monitoring, and communication) (Baker et al., 2008).

Outcomes

Outcomes of teamwork made up the last category Baker described. While Baker considers knowledge, skills, and attitudes to be part of the formative assessment, he considers outcomes to be part of the summative assessment. The summative assessment of outcomes could include measuring a team’s accuracy, timeliness, safety, and performance.

Measures and Challenges

Baker then looked at measures that align with these process and outcome elements and some of the challenges to each of these components. In terms of assessing knowledge, one could administer knowledge tests through multiple-choice exams that test how much the learner knows about teamwork in general, about specific teams, and about the roles and responsibilities of individual team members. However, these exams tend to be too easy; most people know how to act within teams and how to communicate. An alternative would be to develop a test that looks at knowledge structures, like how one organizes information and thinks about the roles and responsibilities of those on a team. But, such tests are fairly complex, and Baker is unconvinced about their usefulness.

When looking at skills and attitudes, self-reporting is an area that receives considerable attention. The problem, as pointed out during the introductory session, is that the truth may not always be apparent to oneself. A more accurate approach is to rely on outside observers to assess skills; although the process can be a painful one, the likelihood of an honest assessment is much greater than with self-assessments.

Observational assessments are also a good way to measure outcomes because outcomes are observable and easy to measure, said Baker. The downside is that outcomes do not explain how the result was obtained.

Baker provided an example of an assessment scale known as the Trauma Team Performance Observation Tool (T-TPOT) to pull the entire framework together. Box 2-1 shows the leadership section of the observation tool, which culminates with an overall team performance rating. Looking specifically at the leadership subdomains, he noted that the behaviors are very specific and observable. And although the instrument may be a bit outdated, it is a tool that is available and could be adapted for use in other studies.

BOX 2-1
Trauma Team Performance Observation Tool (T-TPOT)

Leadership—The Team Leader Rating
Conducts a brief prior to patient arrival (e.g., identifies self, assigns members roles and responsibilities, discusses initial plan based on current information, anticipates interventions [chest tube, OR, etc.])
Continually renders plan of care to team
Feedback provided to team members is constructive
Ensures task prioritization (e.g., important tasks performed first, ABCs and survey sequence are being completed)
Asks nonresponse team members to leave when they are distracting
Overall Rating

The T-TPOT was used to assess trauma team performance both in simulation and in the trauma bay.

SOURCE: Capella et al., 2010.

Continuing with the framework and how he views it, Baker then provided some practical guidelines that he characterized in terms of the what, the how, and the where.

Starting with determining which team element to observe, Baker commented on how difficult team observations are. An assessor has to focus on and understand explicit skills and behaviors that can actually be observed, which he thought was extremely difficult to do. The example he used was “mutual trust,” which is not readily observable as a behavior; one would have to be able to see it to be able to assess it. Additionally, Baker said that one needs to think beyond what is being observed and consider why it is being observed. It is not enough to assess the product of a team process; one needs to also observe how the decision gets made in order to give feedback to the team about how to improve.

The tools used to assess through observation, and the ways observers are trained to use them, come in a wide variety of forms. For example, there are different rating scales and different checklists depending on what is valued and what is needed in a given scenario. That scenario could be an on-the-job observation or a simulated experience. Observations capturing on-the-job assessments will likely rely more on generic instruments than observations conducted in a controlled environment, like simulation.

Baker underscored the importance of proper training for observers, saying that although the design of the scale is important, what really matters is how the assessors are trained to observe. Training in how to rate is by far more important than scale design; teaching observers to rate and observe from the standpoint of a common frame of reference is key to the reliability of the assessment, he said. However, the location of the observation also influences the assessment. Baker used the examples of assessing teams in the trauma bay and in simulation. In the real-life scenario, a rater effect was noted because the observers were standing in the trauma bay during the study. Behaviors change, as noted by Forum member Scott Reeves from the University of California, San Francisco; he brought up the Hawthorne effect in assessment using simulation, but it also exists in trauma bay assessments. With simulation, Baker noted an effect because people have tacit knowledge about how to behave and so are often on their best behavior, which may or may not reflect their usual performance.

The positive aspect of simulation is that it allows more control over the test, unlike on-the-job tests, which may not offer an opportunity to express a desired behavior. In simulation, the scenario or the simulator can make sure the behavior is elicited and give people multiple opportunities to perform it. If an opportunity for a formative assessment in a real-life situation is missed, it may not present itself again.

In summary, Baker said there is no escaping observation in team assessments, and properly training the observers significantly improves the value and accuracy of the assessment. For learning purposes, one should focus on process over outcomes. Numerous tools that focus on both formative and summative assessments of teamwork have been developed over the past 10 years and are published in the literature (see Appendix B for a description of the tools that were discussed at this workshop). This rapidly growing body of evidence is available and should be used by health professional educators to more effectively assess teamwork in a variety of education and practice settings.

ASSESSING INTERPROFESSIONAL PROFESSIONALISM

Jody Frost, Interprofessional Professionalism Collaborative (IPC)

The IPC is a collaborative representing 14 different professions that have come together for the purpose of developing a valid and reliable assessment instrument that illustrates the desired elements of professionalism in an interprofessional environment. According to Jody Frost, who leads the IPC, this tool measures behaviors and is intended to be used by educators across all the health professions (IPC, n.d.b). In developing the tool, Frost and her IPC colleagues reached out to professionalism and education experts on four different continents for their input on the content and structure of the tool. The outcome of their efforts is the IPC’s interprofessional professionalism assessment (IPA) tool, which is designed to measure observable behaviors of professionalism in learning and practice environments.

BOX 2-2
Definition of Interprofessional Professionalism

Consistent demonstration of core values evidenced by professionals working together, aspiring to and wisely applying principles of altruism, excellence, caring, ethics, respect, communication, accountability to achieve optimal health and wellness in individuals and communities (Stern, 2006).

This tool identifies 26 observable behaviors that are divided into six categories (communication, respect, altruism and caring, excellence, ethics, and accountability) based on the definition of interprofessional professionalism found in Box 2-2.

Within each of the six categories is a minimum of four observable behaviors. Table 2-2 shows examples of the sorts of interprofessional professionalism behaviors identified in the IPA. The complete list will be published in 2015 following the close of the pilot study.1

The instrument uses a five-point Likert scale that ranges from strongly disagree to strongly agree. There is also a category for “no opportunity,” indicating the behavior could not be observed in the particular environment where the instrument is being used.

Forty-nine academic institutions across the United States are participating in the pilot study, including up to 13 different health professions. To qualify as a pilot site, an institution must be involved in IPE or have its students engaged in collaborative practice. Students completing their final practice experiences prior to earning their professional degree are eligible to participate. In the pilot, the preceptor is asked to watch the students throughout the interprofessional experience and assess them at the end. At the same time, the students receive an email asking them to conduct a self-assessment of their behavior using the same list of behaviors provided to their preceptor.

The goal of this pilot is to collect 750 to 1,000 preceptor-student dyads across these 13 health professions. This final sample will be randomly split into subgroups in order to cross-validate the results. Through exploratory and confirmatory factor analysis, Frost intends to test how well the 26 behaviors fit within their assigned categories. In addition, metric calculations will be performed for convergent and discriminant validity and construct reliability.

___________________

1 See http://interprofessionalprofessionalism.weebly.com/assessment.html for more information (accessed April 18, 2014).

TABLE 2-2 Examples of Interprofessional Professionalism Behaviors Identified in the IPC’s IPA

Communication: Communicates with members of other health professions in a way that they can understand without using profession-specific jargon.
Respect: Demonstrates confidence, without arrogance, while working with members of other health professions.
Altruism and caring: Places patient/client needs above own needs and those of other health professionals.
Excellence: Contributes to decisions about patient care regardless of hierarchy/profession-based boundaries.
Ethics: Reports or addresses unprofessional or unethical behaviors when working with members of other health professions.
Accountability: Accepts consequences for his or her actions without redirecting blame to members of other health professions.

NOTE: IPA = interprofessional professionalism assessment; IPC = Interprofessional Professionalism Collaborative.
SOURCE: IPC, n.d.a.

Frost also intends to look at the variance between the preceptors and the students on the observed and self-assessed interprofessional professionalism behaviors, and how well preceptors feel the students are exhibiting the 26 interprofessional professionalism behaviors. This is intended to provide insight into how well preceptors model certain behaviors.

Once finalized, this instrument is expected to provide multiple benefits because it

  • Measures the interprofessional professionalism construct through observable behaviors in practice situations;
  • Was piloted with students and preceptors from different health professions, in settings ranging from academic institutions offering IPE to practice settings engaged in collaborative practice;
  • Can be used to connect higher education with health care environments;
  • Can be used to connect interprofessional professionalism with quality care, patient safety, and patient/family-centered care; and
  • May improve how students and practitioners are educated and assessed with respect to interprofessional professionalism.

One identified gap in the tool, as noted by Frost, is the lack of input from the patient and caregiver community. The plan is to modify the language so this assessment could be used to gather information from those who access the health care system. These patients, caregivers, and others could provide valuable data for assessing providers’ interprofessional professionalism based on their own personal experiences.

The IPA instrument is expected to be released in 2015 as part of a tool kit being developed by the IPC members. It will provide information about how to use the IPA in education and practice, and its relevance in different environments. Frost directed participants to the IPC website for updates on the development of the IPA tool.2

ASSESSING IPE TEACHING AND LEARNING PERSPECTIVES

Brenda Zierler, University of Washington

In her presentation, Forum member Brenda Zierler described the team training she and her colleagues at the University of Washington developed to teach health professional students how to work together in a clinical environment using simulation. They were also charged with piloting a team-based simulation model that could be scaled up and used by others in similar educational settings.

Throughout the 5-year project, Zierler added to her checklist all the efforts they undertook to assess their team-based training approach (see Box 2-3). This was an iterative process: she and her colleagues developed and adjusted the curriculum, then assessed the effects of these changes on learners and faculty.

The first step in developing a learning environment for a team approach to patient safety was to come up with a conceptual framework (see Box 2-4). Zierler and colleagues based their framework on the work of TeamSTEPPS3 described previously by David Baker in his talk. Zierler and her team adapted the TeamSTEPPS communication strategies to their simulation laboratory.

They elected to use simulation as the IPE learning activity because it provides a safe environment for students and faculty to learn about team-based care and to improve their communication skills.

___________________

2 See www.interprofessionalprofessionalism.weebly.com for more information (accessed April 18, 2014).

3 TeamSTEPPS is a training system designed to maximize institutional collaboration and communication within teams in order to improve patient safety (AHRQ, 2008).

BOX 2-3
IPE Assessment—Checklist Presented by Zierler

☐ Conceptual framework
☐ IPE learning activity (intervention)
☐ Learning objectives and outcomes (mapped to IPE competency statement(s) and associated behavior indicators)
☐ Approach/pedagogy
☐ Participants
☐ Assessment plan (including methods and tools)
☐ Feedback
☐ Other—faculty development

Zierler wrote objectives for their training module and mapped the competency statements to the interprofessional competencies that were available at that time from Canada (Canadian Interprofessional Health Collaborative, 2010). Zierler and her team then spent a year developing a simulated case with students and faculty. The final product was based on an actual situation that occurred in a high-stakes environment. And although their focus was on communication, Zierler felt strongly that each student coming to the simulation lab must have the skills needed to perform his or her job; if not, the entire team would fail. The students were also all provided an orientation to simulation to be sure they had the same level of understanding about simulation.

BOX 2-4
Framework for Simulation Training

Interprofessional collaboration and communication → effective teamwork:

  • Communication
  • Leadership
  • Mutual support
  • Situational monitoring
  • Team structure

Curriculum

Their simulation curriculum included online pre- and post-training about TeamSTEPPS, as well as an in-person team-building exercise with health professional students organized into interprofessional teams. Following a brief introduction and acquaintance period, the students were provided a short but intensive information session on communication and teamwork before being presented with three simulated cases. These cases were brought to life by human patient simulators, a standardized actor, or both. The 4-hour curriculum concluded with closing remarks by the organizers.

Content

Certain skills were the focus of the curriculum and helped form the structure of the case studies, which were designed to force students to practice those skills (see Table 2-3). In this way, it was possible to assess whether or not the students had learned and could demonstrate the various teamwork and communication skills.

TABLE 2-3 TeamSTEPPS Skills Integrated into Simulated Cases

Communication skills: Brief; Callout; Check-back; SBAR*; Handoff
Team skills: Huddle; Sharing the plan; Situational awareness

* SBAR = Situation, background, assessment, recommendation. It is used to communicate information about patients in a structured format.
SOURCE: Zierler, 2013.

Assessment

Although the initial assessment plan was mostly unstructured, Zierler and her colleagues soon developed a strategy, based on previously tested tools, in which students provided feedback to each other and to the faculty, and faculty provided feedback to the students. All of their discussions were about communication.

Through repeated simulation opportunities, students improved their skills as they practiced working as a team. The intermediate outcome of these experiential exercises was to improve knowledge, skills, and attitudes around working together in a team. Given that communication is the single most important patient safety issue, the long-term outcome was to improve communication among and across teams.

Their work was not set up to assess whether the skills acquired in the academic simulation lab transferred into practice, although that is a critical area for assessing the effect of this training.

Assessment Plan

From the outset, Zierler and her colleagues anticipated that their assessment plan would need to be flexible and responsive to their changing curricular needs. For example, the instructors stopped students in the middle of their simulation exercise if something was not working as they had envisioned. They would change the exercise on the spot and get students’ feedback about the alteration before continuing with the simulation. As the curriculum changed, the assessment of learners and faculty also changed in order to keep the assessment relevant to the training.

Both students and faculty benefitted from the assessments that took place halfway through the training. Students were assessed on teamwork and communication, and faculty were observed for how they facilitated the clinical case and communicated with students. Faculty could coach students on the clinical aspects but not on their ability to communicate. This was done so students learned from faculty about providing good care, but could use the “safe environment” to make mistakes in communication in order to learn.

The assessments consisted of self-evaluations and peer evaluations. The selected peer evaluators were given objective questions to impartially determine whether there was an appropriate handoff. This entails accurately and effectively transferring information from one care team to another, which, if done well, can decrease medical errors. Peer evaluators also looked at whether the teams huddled when they encountered a difficult situation, whether there was a briefing to different groups who entered into their exercise, and whether each member felt mutually supported within their team.

Zierler found it interesting that the evaluators who observed their peers in the initial case simulation actually performed better than the other students when they engaged in the third case. Although still analyzing the data, Zierler believed the students’ improved performance was the result of their knowing what the instructors were measuring; they therefore had greater knowledge about which aspects were important and should be focused on.

Not only did students learn from their preceptors and from each other, but faculty also heard from students regarding their level of coaching and learned whether they intervened too much or not enough. There were also surveys completed by faculty and students, followed by a structured debriefing. Although students were eager to talk about all that went wrong during the exercise, they were required to follow a set format in which students and faculty discussed what went well, what could have gone better, what one thing they took away from the exercise, and what each person learned from the entire experience. One additional tool included in the assessment portfolio was a video recording of the case exercises. This was set up by a doctoral student doing her dissertation on the psychometrics of the simulated case tool to see whether it was possible to measure teamwork in individuals who are learning together for the first time.

Lessons Learned

Zierler closed her talk by describing the lessons she learned from their work on developing a patient-safety curriculum using simulated case studies. First, the context is vitally important. There is no one-tool-fits-all for IPE. The assessment instrument needs to be tailored based on the curriculum objectives, the goals, and the setting in which the interprofessional experience will take place. If it is a high-stakes environment that is uncertain and highly complex (like the one Zierler set up), it is going to have different requirements that will need to be adaptable because each experience will be different.

Another discovery was that assessors often want to measure all aspects of IPE, but focusing on what the exercise is set up to teach will better link the assessment to the goals of the educational activity. Also, everyone on the team needs to be clear about the purpose of the team’s work, which often required a discussion about language. Zierler found they needed to talk about communication barriers, such as profession-specific definitions and jargon, to be sure team members were speaking the same language.

Strategies to enhance learning were also important. Because human patient simulators would not always be readily available, Zierler’s group also made use of actors so students could be exposed to both teaching modalities. Regardless of the educational tool, it was the instructional strategies and the design of the unfolding case that were the critical components.

Zierler also talked about the dose and timing of interprofessional training. It is not currently known how much IPE students should receive. For example, is a single exposure to IPE adequate, or does IPE need to be repeated throughout the student’s education? It is similarly not known when students should be trained interprofessionally. Should the exposure take place early in students’ education, or all throughout their curriculum? From Zierler’s perspective, how much IPE a student requires in order to demonstrate proficiency differs with each individual and is based on the individual’s personality; some students naturally collaborate well before even entering their health professional specialty.

For this exercise, it was important that each student came with the same knowledge base, so the didactic session and online training about teamwork and communication were key to ensuring an equal understanding of the issues.

Finally, from doing the different types of assessment, Zierler learned that assessing teams and assessing communication are very difficult to do. The team might perform well, but there may have been one individual who did not communicate well, which complicates the assessment process. But, as Zierler pointed out, that is real life. She and her colleagues are providing a safe environment where students can experience such real-life situations so that when they are confronted with similar scenarios in practice, decisions can be made that decrease the likelihood of medical errors.

THE MESSINESS OF ASSESSING TEAMS

Forum Co-Chair Jordan Cohen from George Washington University began the question-and-answer session by asking about the unit of accountability; his understanding was that it would be the individual’s skills in communication and interprofessional teamwork. The assumption, he said, is that if those skills are learned and adequately assessed, the team will perform its appropriate functions when it comes together, and this would lead to better outcomes—namely, better patient care. He then asked whether or not that assumption has been validated; that is, are there ways to assess team performance in terms of how the team actually produces the desired outcomes?

Baker responded that measuring team skills is clearly more complicated than measuring individual skills. For example, in assessing team leadership, there is an assumption that the physician is the leader, but when raters were trained using the T-TPOT (the assessment tool for their trauma study using simulation to measure patient outcomes), they found that leadership could be evidenced by any team member. For their study, they looked at the team’s plan of care. The plan may change and might even require continuous updating; Baker then asked, is this the responsibility of the team leader, or can any team member update the care plan? He added that in his work, they trained raters to focus on the behavior of the team and not the individual.


Raters of Teamwork

Baker’s point raised the issue of how the rater is trained to interpret all these elements on the assessment scale. Interestingly, said Baker, in TeamSTEPPS, teamwork skills are taught to individuals because the individuals on health care teams are always changing. New sets of skills are required for each team situation. A goal could be to have everyone trained with a common frame of reference so that a common foundation underpins subsequent changes in team responsibilities across different settings.

Raters of teamwork may find it difficult to aggregate a team score when the team contains, for example, one person who communicates well and two others who do not. This is one reason why the training of the raters is critical: they need to understand how to interpret certain observable behaviors. Baker admitted that assessing teams is difficult, particularly when they are assessed in actual care settings. In his opinion, there will need to be some level of acceptance of the “messiness”; these sorts of assessments will not meet the same standards that a written test can meet.

Metrics for Understanding Teams

Carol Aschenbrener picked up on a point raised by Baker that multiple observations of team members will provide more in-depth information about the team because some members may not communicate in one scenario but might be the lead communicator in another situation. This led to a question about the large number of metrics emerging from all the work being done in this area, and whether a consensus might be emerging on a common set of metrics that would have some comparability and transferability across different settings. Such an assessment would lead to a better understanding of teams across institutions and across health care systems. Baker was somewhat apprehensive about the development of one measurement tool for all situations. There is fairly good consensus about the core knowledge, skills, and attitudes that define team performance and the behaviors at a generic level that represent those constructs. However, what one team does in one domain is going to be somewhat different from how those generic behaviors are represented in other situations. But, he speculated, one could create a mapping of teamwork in all the different settings and situations to show commonality, although there would be challenges with slightly different interpretations of the team domain itself. So, he believes, it is possible.

Jody Frost suggested that a team could do a 360-degree assessment. Referring specifically to the topic of her presentation, she said members could assess each other’s individual performance around the interprofessional professionalism behaviors to get a sense of how well they are doing as a team within certain key areas like communication. This would also provide insight into how well the team keeps patient needs at the center of its work. In this regard, Frost suggested that patients could perform the same assessment as the health professionals on the team, which could reveal interesting information as to how well patients believe team members are exhibiting certain desirable behaviors and whether the patients value the care they are receiving.

Uncovering Fundamental Teamwork Skills

Carol Aschenbrener gleaned from the presenters’ responses to the questions that it is one thing to assess a team that is reasonably stable, like an operating room team or a trauma team, but in reality, teams form, then dissolve, and then form again. She wondered whether there was some way to measure an individual’s ability to enter a new institution and join a team and then, 2 days later, join another team.

Forum member Mattie Schmitt from the American Academy of Nursing agreed that there are different kinds of teams. Some teams are relatively stable and work together over a period of time, such as palliative care teams that share a cohort of patients, while other teams come together and then disband. This suggests that regardless of team make-up and structure, there is a set of fundamental teamwork skills that are necessary for all teams to function effectively, and uncovering these skills would provide the basic elements for assessing members’ teamwork skills. Another important element for assessment of teams is identifying how high-functioning teams develop over time. Using the group development model and a measurement framework called SYMLOG (System for the Multiple Level Observation of Groups)4 in her research, Schmitt was able to look empirically at how people move physically over time and assess shared leadership.

Overcoming Power and Hierarchy

Drawing from the sociological literature, said Schmitt, there are some frameworks for understanding what it takes for groups of individuals to come together and work as a high-functioning team. Often, groups reach a high-functioning state when the issues of power and hierarchy are resolved. From her perspective, what is needed is a better understanding of how high-functioning teams have resolved the common obstacles within the context of their work.

Scott Reeves agreed with Schmitt in terms of needing to better assess power and hierarchy within teams, but he then questioned whether a group of clinicians working in the same space can be called a team. In his experience, which spans years of assessments of all types of health care teams in three countries, there is a lot of rhetoric about teams but little evidence that teamwork is actually taking place. Despite groups calling themselves teams, Reeves is finding what he refers to as “parallel play.” In other words, individuals are coming together very briefly over an activity that ends, and then another one begins with new players. Although those involved believe they are engaging in excellent teamwork, it is actually more of a fragmented, transient interaction among different professions than true interprofessional teamwork.

___________________

4 See http://www.symlog.com for more information (accessed April 28, 2014).

The Department of Defense: Examples of Team Assessment

Forum and planning committee member Patricia Hinton Walker from the Uniformed Services University of the Health Sciences commented that the U.S. Department of Defense (DoD) has been using TeamSTEPPS for quite some time. In her experience, it translates well in obstetrics, the operating room, and the emergency room, where there is more consistency in the members, the work, and the decision-making process. Newer situations present challenges in performing high-level assessments; these include, for example, assessment across teams (a major area in patient safety) and measurement of diverse teams, like the DoD’s large medical-surgical units. Other challenging new areas for the DoD, she said, are how to assess teamwork in its patient-centered medical homes and in virtual encounters, where teams may not be speaking face to face.

Walker then talked about two other initiatives that are beginning to be integrated within the DoD. The first involves emulating design principles of highly reliable organizations (HROs)5 to reinforce the roles of team members and work that draws on an established evidence base. The second is the Partnership for Patients. This initiative addresses the role of the patient, family member, or community on that team. Often these three initiatives—TeamSTEPPS, HROs, and Partnership for Patients—are seen as separate, but increasingly the DoD is trying to bring them together so the work of one can inform the others. Walker acknowledged that Schmitt’s point about power and hierarchy is indeed a challenge, which is compounded in the military due to a built-in hierarchical structure outside of health care.

___________________

5HROs are found within industries like airlines and nuclear power that rely heavily on specified design principles to avoid accidents and catastrophes that might be expected due to the complexity of the environment within which they function.


Working Toward the Triple Aim

Walker’s comments were later followed by remarks from Forum member Malcolm Cox of the U.S. Department of Veterans Affairs (VA). He believed all initiatives and interventions should strive to progress toward the triple aim6 as the measurement goal. Though looking at the formation of teams or the effectiveness of teams is very important, it is not the primary goal, he said. Cox stated that the goals are to improve the health of individuals and populations and to bend the cost curve so the savings can be reinvested productively in other enterprises such as education. Cox harkened back to Forum member George Thibault’s comments that education and practice should be thought of as one system so that learning is assessed based on delivery system outcomes.

To illustrate his point, Cox described the transformation in primary care that has taken place over the past 3 years at the VA with the introduction of patient-centered medical homes. Roughly $800 million was initially invested. After 2 years, the VA has recouped about $600 million of the initial investment and is projected to start making a profit in another 1 to 2 years. Those profits could be used for investments in educating the next generation of health workers and health care providers. Cox feels strongly that educators urgently need to figure out how education will be funded in the future. That funding, he said, is going to have to come from the health delivery systems because there is not going to be any new money for this initiative.

___________________

6 The Institute for Healthcare Improvement Triple Aim is a framework for health system performance involving (1) better patient care, (2) improved population health, and (3) reduced health care costs (IHI, 2014).

REFERENCES

AHRQ (Agency for Healthcare Research and Quality). 2008. TeamSTEPPS fundamentals course: Module 6. Communication: Instructor’s slides. http://www.ahrq.gov/professionals/education/curriculum-tools/teamstepps/instructor/fundamentals/module6/igcommunication.html (accessed January 6, 2014).

AHRQ. n.d. About TeamSTEPPS. http://teamstepps.ahrq.gov/about-2cl_3.htm (accessed January 6, 2014).

Baker, D. 2013. Practical guide for assessment: Team-based care and communication. Presented at the IOM workshop Assessing health professional education. Washington, DC, October 9.

Baker, D. P., K. J. Krokos, and A. M. Amodeo. 2008. TeamSTEPPS: Teamwork attitudes questionnaire manual. Washington, DC: American Institutes for Research.

Canadian Interprofessional Health Collaborative. 2010. A national interprofessional competency framework. Vancouver: Canadian Interprofessional Health Collaborative.



Capella, J., S. Smith, A. Philp, T. Putnam, C. Gilbert, W. Fry, E. Harvey, A. Wright, K. Henderson, and D. Baker. 2010. Teamwork training improves the clinical care of trauma patients. Journal of Surgical Education 67(6):439-443.

IHI (Institute for Healthcare Improvement). 2014. The IHI triple aim. http://www.ihi.org/offerings/Initiatives/TripleAIM/Pages/default.aspx (accessed January 6, 2014).

IPC (Interprofessional Professionalism Collaborative). n.d.a. Examples of IPP behaviors. http://interprofessionalprofessionalism.weebly.com/behaviors.html (accessed January 6, 2014).

IPC. n.d.b. Interprofessional Professionalism Collaborative homepage. http://interprofessionalprofessionalism.weebly.com (accessed January 6, 2014).

Stern, D. T. 2006. Measuring medical professionalism. New York: Oxford University Press.

Zierler, B. 2013. Practical guide for assessment of IPE teaching/learning perspective. Presented at the IOM workshop Assessing health professional education. Washington, DC, October 9.
