The workshop described in this summary report was an activity hosted by the Institute of Medicine’s (IOM’s) Global Forum on Innovation in Health Professional Education (IHPE), which is the largest Forum at the National Academies. With 61 members from 8 high-, middle-, and low-income countries, representing multiple sectors across 18 different health professions involved with education and practice, the Forum provides an excellent platform on which to incubate new ideas that could emerge only from such a diverse membership. For this workshop, subject matter experts presented to the Forum members their extensive research and experiences relating to assessment in the context of health professional education. These presentations added significantly to the richness of the discussions.
Like all forums and roundtables at the IOM, IHPE is not designed to provide consensus recommendations, so any advice that may be construed from this report is that of individuals whose views do not necessarily represent those of the IOM. It might also be noted that, as a summary report, this document includes only what was discussed at the workshop and may not be representative of all views on assessment in health professional education; however, the report does provide some interesting examples and highlights some key principles that were expressed during the workshop.
1The planning committee’s role was limited to planning and convening the workshop. The views contained in the report are those of individual workshop participants and do not necessarily represent the views of all workshop participants, the planning committee, or the Institute of Medicine.
Each of the Global Forum workshops is webcast and open to the public. The purpose of these events is to build coalitions with global partners around how to improve health professional education by sharing experiences and ideas with other members. Through this open, online system of information sharing, different types of collaborations are formed that can positively affect local education through learning from global partners. The workshop and its subsequent summary report are one example of the types of activities undertaken by this Forum.
Topics selected for more in-depth exploration are chosen by the Forum members themselves after considerable consultation concerning needs and gaps within the area of health professional education. One identified area of concern is the lack of uniformity among educators and health professionals in the area of assessment. Without greater standardization of practices used to assess learners and educators, spreading best practices becomes a challenge. The same is true in practice environments where assessments are not commonplace, and those that do occur are typically ad hoc events. This issue was touched on at a previous Forum workshop on interprofessional education (IPE). At that workshop, Scott Reeves, the editor of the Journal of Interprofessional Care, emphasized the importance of measuring the impacts of IPE collectively, which would necessitate a common parlance so different IPE experiences could more easily be compared. He began with a distinction between assessment and evaluation, which is also relevant to this report:
Assessment is done to determine the level of understanding by a learner, while evaluation is a tool to determine how well a program or an educator teaching a course is conveying messages. For assessment, he says, there needs to be a meaningful analysis of how the individual learns, not just in the short term but in the long term as well. For evaluation, thoughtful consideration is needed to determine how well the program is conveying the desired messages and information. (IOM, 2013)
Also evident at that workshop was the members’ view that education and practice form a continuous learning cycle with the patient (or person) at the center of the learning process. This perspective was similarly expressed in this 2-day, Forum-sponsored workshop that explored assessment of health professional education. At the event, Forum members shared personal experiences and learned from patients, students, educators, and practicing health care and prevention professionals about the role each could play in assessing the knowledge, skills, and attitudes of all learners and educators across the education-to-practice continuum. Assessment was considered from the perspective of individual performance, team performance, and individuals’ work as team members. In this regard, particular attention was given to assessing IPE, team-based care, and other forms of health professional collaboration that emphasize the health and social needs of communities. These various viewpoints are reflected in the following workshop objectives that were used to design the agenda:
- To examine the current state of assessment competencies in three areas: IPE, team-based care, and patient/person-centeredness
- To discuss challenges and opportunities of assessment within these three areas
- To encourage new linkages among professions that lay the foundation for interprofessional interactions that better engage consumers, communities, and/or business leaders
These objectives were developed by an eight-person planning committee led by co-chairs Darla Coffey, Council on Social Work Education, and Eric Holmboe, American Board of Internal Medicine, who structured the workshop based on the Statement of Task shown in Box B-1. The agenda for this workshop, which took place on October 9–10, 2013, in Washington, DC, is found in Appendix A.
KEY ASPECTS OF THE WORKSHOP
The content covered at the workshop and captured in this summary report involves assessing core competencies, particularly within IPE and health professional collaborations that include patient-centered health care teams. For the purposes of this workshop, it may be noted that competency is not the same as competence because, according to Holmboe, the ultimate goal of a competency-based educational system is expertise, not competence (Talbot, 2004; Holmboe et al., 2010). In this regard, assessment measures whether a learner can demonstrate that competencies have been achieved and is therefore capable of practicing those competencies.
Discussions at the workshop helped describe these competencies and explored the challenges, opportunities, and innovations in assessment across the education-to-practice continuum. Through facilitated discussions and moderated panel presentations, Forum members explored the challenges to effectively assessing individuals and groups while also considering potential opportunities for improving assessments across the education-to-practice continuum. Such opportunities might directly involve patients and other users of the health care system in assessments of health systems and the continuing education of health professionals. It might also involve communities for assessing health professional students’ involvement in wellness activities that benefit the targeted community. Discussions within these content areas led to descriptions of the importance of institutional or organizational
Statement of Task
Better use of existing assessment methods and new innovative tools are needed to assess the kind of competencies health professional students will need to adapt to a “new professionalism” that is interprofessional and that focuses on health improvement and the triple aim of improved patient care and experience, improved population health, and reduced costs. In an era of evolving technology and changing health and health care environments, creative thinking is needed to consider assessment methods and tools that have a positive impact, are affordable, are easily integrated into education, and assess competencies at micro-, meso-, and macro-levels (individual, team, organization). The impact could be measured by (1) improvements in population health outcomes, (2) better patient care, (3) more interprofessional collaboration/understanding, and (4) maximum value of services at lower costs.
To address these issues, an ad hoc committee of the IOM will plan and conduct a 2-day public workshop titled “Assessing Health Professional Education.” The committee will develop a workshop agenda that will attempt to elucidate such challenging issues as noted below, select and invite speakers and discussants, and moderate the discussions:
- What is currently being assessed and how might the outcomes be used (i.e., enhanced patient-centeredness, greater social accountability, promotion by media, learner skills, faculty development)?
- How can different disciplines be assessed such that the data inform a “new professionalism”?
- Which kind of assessment will lead to a new professionalism?
- What is the role of peer assessment?
- What is the role of patients in assessment?
- What is the role of work-based assessments?
- How might learners and practitioners be prepared for a lifetime of assessment?
culture change in the form of faculty development, role modeling, and experiential learning opportunities for promoting new thinking and the development of new competencies. The idea that assessment could help to drive such culture change was key.
Many of these ideas are presented and described within the five chapters of this workshop summary report, but more specifically:
Chapter 1 highlights the goals of assessment that can be viewed somewhat as catalysts for learning. It also discusses criteria for a good assessment and delves more deeply into the value of formative and summative assessments by differentiating assessments of learning and assessments for
learning. The roles of peers, patients, and direct observation in assessment are also considered.
Chapter 2 focuses on the role of education in teamwork, describes methodologies to teach teamwork, and presents some of the approaches to and challenges for assessing teamwork. This chapter also describes a tool to assess professionalism and elements of the interprofessional environment. Finally, this chapter describes education in teamwork using simulation. These three presentations highlighted the challenge, or tension, of evaluating teams versus individual team members, aggregating scores, and evaluating stable teams versus fluid teams.
Chapter 3 presents different challenges to assessing various aspects of IPE and interprofessional practice based on examples that were drawn from around the world. The examples addressed the following:
- How to assess collaborative and transformative leadership;
- Deficiencies in organizational cultures that limit a collaborative atmosphere;
- Strategies for assessment in low-resource settings (e.g., 360-degree evaluations, use of clinical outcomes);
- How to better use faculty development for promoting interprofessional practice and education; and
- Strategies to motivate faculty to embrace interprofessional practice.
Chapter 4 describes three ways in which technology has been leveraged for health education of patients, nursing students, and the general public through the Leading Reach Patient Engagement Mobile Platform, the University of Illinois College of Nursing’s simulation activity, and the Khan Academy’s open platform for medical education, respectively. Emphasis was on how each technology might be used for assessing interprofessional teams, promoting IPE and learning, and engaging patients without worsening disparities among disadvantaged populations.
Chapter 5 focuses on expanding high-quality assessments with strategies focused on the policy (macrolevel), the institution (mesolevel), and the individual (microlevel). Topics included assessments focused on the interprofessional learner, measuring the effectiveness of new technologies and methods for teaching IPE, opportunities for assessing teams and collaborations in and with the community, and strategies for expanding the role of the patient voice in assessment from education to practice.
REFERENCES

Holmboe, E. S., J. Sherbino, D. M. Long, S. R. Swing, and J. R. Frank. 2010. The role of assessment in competency-based medical education. Medical Teacher 32(8):676-682.
IOM (Institute of Medicine). 2013. Interprofessional education for collaboration: Learning how to improve health from interprofessional models across the continuum of education to practice: Workshop summary. Washington, DC: The National Academies Press.
Talbot, M. 2004. Monkey see, monkey do: A critique of the competency model in graduate medical education. Medical Education 38(6):587-592.