Assessing Health Professional Education: Workshop Summary (2014)


1

Setting the Stage

Key Messages

  • Both summative and formative assessments are critical components of a competency-based system. (Holmboe, Norcini)
  • Understanding why the assessment is being conducted and how the purpose aligns with the desired outcomes is key to undertaking an assessment. (Holmboe, Norcini)
  • Assessments that combine a demonstration of knowledge with the acquisition of skills, and that test the ability to apply both in new situations, signal to learners that knowledge, skills, application, and ability are all important elements of their education. (Holmboe, Norcini)
  • Too little time is spent on formative assessment. (Holmboe, Norcini)
  • There is a need for greater faculty development in the area of assessment. (Aschenbrener, Bezuidenhout, Holmboe, Norcini, Sewankambo)
  • Although self-assessment is a useful tool, most individuals are not good at it. (Baker, Holmboe, Norcini, Reeves)
  • Regardless of how well learners are trained, dangerous situations leading to medical errors will persist if the larger organizational structures do not support a culture of safety. (Finnegan, Gaines, Malone, Palsdottir, Talbott)


In setting the stage for the workshop, John Norcini from the Foundation for Advancement of International Medical Education and Research (FAIMER) described assessment as a powerful tool for directing learning by signaling what is important for a learner to know and understand. In this way, he said, assessments can motivate learners to acquire greater knowledge and skills in order to demonstrate that learning has occurred. Summative assessments measure achievement, while formative assessments focus on the learning process and on whether the activities the learners engaged in helped them to better understand and demonstrate competency. As such, both summative and formative assessments are critical components of a competency-based system. A competency-based model directs learning based on the intended outcomes for a learner (Sullivan, 1995; Harris et al., 2010) in the particular context in which the training takes place. Although it is outcome oriented, competency-based education also relies on continuous and frequent assessments of specific competencies (Holmboe et al., 2010).

THE PURPOSE OF ASSESSMENT

According to Norcini, assessment involves testing, measuring, collecting and combining information, and providing feedback (Norcini et al., 2011). Understanding why an assessment is being conducted and how its purpose aligns with the desired outcomes is key to undertaking it. Norcini offered a list of potential purposes of assessment in health professional education, which might include some or all of the following:

  • Enhance learning by pointing out flaws in a skill or errors in knowledge.
  • Ensure safety by demonstrating that learning has occurred.
  • Guide learning in a particular direction outlined by the assessment questions or methods.
  • Motivate learners to seek greater knowledge in a particular area.
  • Provide feedback to the educator or trainer that benchmarks progress of the learner.

Highlighting the fourth bullet, Norcini emphasized that a purpose of assessment is to “create learning.” In order to learn, one needs to be able to retrieve and use the information taken in. To underscore this point, Norcini cited an example in which students who took a test three times ultimately scored better on that test than students who read a relevant article three times (Roediger and Karpicke, 2006). This is known as the “testing effect,” whereby tests are believed to enhance retention even when they are given without any feedback.


Norcini described the testing effect hypothesis that assessments create learning because they force not only retrieval but also application of information, and because they signal to students what is important and what should be emphasized in their studies and experiential learning.

Forum Co-Chair Afaf Meleis from the University of Pennsylvania School of Nursing questioned whether there is a danger in using assessments that direct studying toward the assessment tool rather than opening new ways of critical thinking. Norcini responded in the affirmative, saying that because the risk is always present, the assessment tool must be carefully selected. Historically, tests have been designed around fact memorization. Roughly 20 to 25 years ago, the standardized patient was introduced into assessments, moving them beyond the simple memorization–regurgitation model. Assessments that combine a demonstration of knowledge with the acquisition of skills, and that test the ability to apply both in new situations, signal to learners that knowledge, skills, application, and ability are all important elements of their education.

Assessment Outcomes and Criteria

As might be expected, said Norcini, the most important outcome of an assessment differs based on one’s perspective. Students are concerned about being able to demonstrate their competence, educators and educational institutions are interested in producing competent health professionals who are accountable, and regulatory bodies are mainly focused on accountability and maintenance of professional competence. Users of the health system are also concerned that health professionals are accountable and competent, but in addition, they want to know if providers are being efficient with their resources.

Desired outcomes of an assessment differ not only based on perspective, as noted above, but also based on the context within which the assessment is conducted. And although there are certain characteristics of a good assessment, Norcini emphasized that no single set of criteria applies equally to all assessment situations. Despite all of the diversity in reasons for conducting assessments and in the settings within which they are conducted, Norcini reported that participants at the 2010 Ottawa Conference came together to produce a unified set of seven criteria for a good assessment (Norcini et al., 2011). These conference participants also explored how the criteria might be modified based on the purpose of the assessment and the stakeholder(s) using it. The criteria were presented to the Forum members for discussion at the workshop and can be found in Table 1-1.



TABLE 1-1 Criteria Needed for a Good Assessment, Produced at the Ottawa Conference


Elements of a Good Assessment, with Description and Further Information

  • Validity or coherence: Is there a body of evidence that “hangs together” and supports the use of a test for a particular purpose? Further information: validity is a property of the inferences drawn from a test, not of the test itself; it is a matter of degree; it requires the ongoing collection of data.
  • Reliability or reproducibility: Scores of examinees will be the same if retested. Further information: test–retest reliability; alternate-form reliability; split-half reliability; reliability index.
  • Equivalence: Different versions of an assessment yield equivalent scores or decisions. Further information: a challenge for assessment in the workplace.
  • Educational effect: The test motivates those who take it to prepare in a fashion that has educational benefit. Further information: How do students prepare for the test?
  • Catalytic effect: The assessment provides results and feedback in a fashion that enhances learning. Further information: a requirement for formative assessment.
  • Feasibility: The test is practical, realistic, and sensible, given the circumstances and context.
  • Acceptability: Stakeholders find the assessment process and results to be credible.

SOURCE: Norcini et al., 2011.

In considering the criteria outlined by Norcini, Forum Co-Chair Jordan Cohen from George Washington University asked whether it is possible to use these principles of assessment for assessing how well teams function and work interprofessionally. Norcini responded with a resounding affirmation that the principles apply regardless of the assessment situation, although the challenges increase dramatically. This, he said, is a growing area of research. For example, the 360-degree assessment is one way to measure teams, and there is considerable work under way in using simulation to assess health professional teams.


Assessment as a Catalyst for Learning

Warren Newton, representing the American Board of Family Medicine, asked about Norcini’s use of the term “catalyzing learning.” Norcini responded that it is one thing to tell a student what is important to learn and another thing to provide students with feedback based on the assessment that drives their learning. The latter is a much more specific way of signaling what is important, and it is used to create learning among students. Newton then asked about the costs of assessment relative to other kinds of activities. He pointed out that many of the Forum members manage both faculties and clinical systems, which prompted the question: How much time should be spent on assessment as part of the overall teaching role? Norcini responded by distinguishing between the types of assessment, saying that far too much time is often devoted to summative assessment and too little to formative assessment; he added that formative assessment is the piece that drives learning and the part that is integrated with learning. Furthermore, assessments can be done relatively efficiently, especially if the assessors collaborate with partners across the institution. Norcini believes there could be greater sharing of resources across institutions, which would lead to better and more efficient assessments. Another advantage would be the cost savings achieved by spreading fixed costs, which typically represent the largest expenses associated with assessments, across institutions.

Assessment’s Impact on Patients and Society

Forum member and workshop co-chair Eric Holmboe from the American Board of Internal Medicine (ABIM) moderated the question-and-answer session with John Norcini and brought up assessment from a public perspective. He asked the audience what the return on investment would be if the assessment were not in place, that is, if health professionals who were insufficiently prepared were licensed and allowed to practice throughout a 30-year career. The cost to society would be much lower if time were spent, particularly on the formative side, to make sure health professionals acquire the competence needed to be effective. Holmboe said that assessors often look at the short-term costs and the time costs without recognizing that not putting in sufficient effort comes at a heavy cost over time. In addition, there has not been a strong concerted effort to embed assessment into daily activities, like bedside rounds; this might be a form of observation and assessment that could be more effectively exploited. There are also a number of multisource tools that are relatively low tech and involve a series of observations; what is lacking, however, is a way to make these tools sufficiently reliable so that appropriate judgments and inferences can be drawn.



Forum and workshop planning committee member Patricia Hinton Walker from the Uniformed Services University of the Health Sciences followed Holmboe’s lead and asked about including the public on the health team and how an assessment might be conducted that includes not just patients but students as well. Norcini responded by again emphasizing the value of multisource feedback for team assessments as well as other opportunities, such as ethics panels that can make use of a patient’s competence in a particular area. He went on to say that the assessment process would lack validity if patients were not involved. In follow-up, Walker commented that students are somewhat separated from patients and families. Norcini pointed out that this is an area of keen interest among researchers in the United Kingdom, who are incorporating patients into the education of all health care providers through family interviews. Holmboe also brought up longitudinal integrated clerkships (LICs), in which students are assigned a group of patients and a family to follow over all 4 years of their training. The families play a major role in the assessment and feedback process of the trainees, said Holmboe. Although it is a resource-intensive model, there are data from Australia, Canada, South Africa, and the United States looking into using LICs as an organizing principle (Norris et al., 2009; Hirsh et al., 2012). The Commonwealth Medical School in Scranton has moved to an entirely LIC-based model, so every student at Commonwealth will spend their entire medical education in an LIC.

Walker also wanted to know Holmboe’s and Norcini’s views on “high-stakes assessments.” In Holmboe’s opinion, there needs to be some form of public accountability through a summative assessment, a point with which Norcini agreed. At the ABIM, Holmboe views the certification exam as part of its public accountability as well as an act of professionalism. For him, however, the bigger issue is the inclusion of more formative assessments during training and education rather than relying so heavily on summative examinations. Norcini added that he sees formative assessment as a mechanism for addressing trainee errors at a much earlier stage than waiting until the end for a summative assessment.

Jacob Buck from the University of Maryland School of Social Work, who joined the workshop as a participant, asked what the target of assessment should be: is it to have healthier individuals and populations, or is it to graduate smarter health providers? In response, Norcini unpacked the goal of the assessment. If the goal is to take better care of patients, then the focus would be on the demonstration of skills in a practice environment and likely not a multiple-choice test. In his opinion, the triple aim of improving health and care at lower cost may be the desired outcome of education, so an assessment could be designed to achieve that goal. Forum member Pamela Jeffries from Johns Hopkins University did not disagree, but she asked how one might measure interprofessional education (IPE) in the practice environment while patients are involved.


Holmboe responded that this gets at some of the complexities of assessing a learner’s experiential learning. He also raised the difficulty of finding training sites where high-quality interprofessional care can be experienced so that learners can be assessed against a gold standard. It is not surprising that learners who do not experience high-quality interprofessional care are not well prepared to work in these environments. Jeffries suggested that interprofessional clinical simulations could help bridge the gap for learners who are not trained through an embedded IPE clinical or related work experience.

STRUCTURE AND IMPLEMENTATION OF ASSESSMENT

Looking at assessment through a different lens, Forum member Bjorg Palsdottir, who represents the Belgian organization Training for Health Equity Network (THEnet), wanted to know more about who is doing the assessing and how that person might prepare to undertake this role. Norcini acknowledged the need for greater faculty development in this area because health professionals are not trained in education or assessment. Forum member and workshop planning committee member Carol Aschenbrener from the Association of American Medical Colleges agreed, but she also felt that the shortage of modern clinical practice sites in which to embed the learner is another major impediment. In her opinion, it is the clinical sites that need greater scrutiny and that, if pushed toward modernization through assessment, could be the lever for greater, more relevant faculty development. According to Holmboe, measuring practice characteristics unfortunately remains difficult, although the tools are improving, particularly with the introduction of the Patient-Centered Medical Home (PCMH). For example, the National Committee for Quality Assurance (NCQA) developed the NCQA 2011 Medical Home Assessment Tool, which providers and staff can use to assess how their practice operates compared with PCMH 2011 standards (Ingram and Primary Care Development Corporation, 2011). This tool looks mostly at structure and process, said Holmboe, but researchers are beginning to embed outcomes into the assessment, which might make it a good starting place for measuring practice characteristics that could then be applied in education.

Another example Holmboe described is the Dartmouth Microsystem Improvement Curriculum (DMIC), a set of tools that incorporates success characteristics associated with high-functioning practices (The Dartmouth Institute, 2013). It uses action learning to instruct providers on how to assess and improve a clinical work environment in order to ultimately provide better patient care. The Idealized Design of Clinical Office Practices (IDCOP) from the Institute for Healthcare Improvement is yet another tool (IHI, 2014).


IDCOP attempts to demonstrate that, through appropriate clinical office practice redesign, performance improvements can be achieved that respond to patients’ needs and desires. Goals of the IDCOP model are better clinical outcomes, lower costs, higher satisfaction, and improved efficiency (IHI, 2000). Holmboe acknowledged that these examples are clinically oriented, and he said he would be interested to learn about other models (although no other models were offered by the participants).

Assessing Cultural Competence

Afaf Meleis asked how one might assess the social mission of health professional learners and design a tool that assesses cultural competence. Neither Norcini nor Holmboe knew of any good models for assessing either of these areas, but Holmboe reiterated that social accountability and professionalism can be assessed only if learners actually experience a work environment with role models in these areas, and that it is the responsibility of the professionals to create these opportunities. Norcini agreed with Meleis, saying that cultural competence is a critical issue to assess. He added that it is absolutely essential that assessors scrutinize the methods used and the results obtained to ensure that no one is disadvantaged for cultural reasons. Meleis encouraged Norcini to add a multicultural perspective to his list of criteria needed for a good assessment.

Assessment by Peers

Forum member Beverly Malone from the National League for Nursing questioned the role of peer assessment in formative and summative assessments, given the inherent challenges associated with this type of assessment. Norcini responded that peer assessments are underutilized, particularly when it comes to the assessment of teachers, although a set of measures that includes peer assessment is being developed for assessing teachers. Norcini added that another way to assess teachers is to look at the outcomes of students. Holmboe pointed out that one of the risks of using student outcomes to assess educators arises when the experiences are not well designed, so that interactions with peers, patients, or others are brief or casual. Attempting to assess learners’ knowledge, skills, or ability in these types of brief and casual encounters is simply not useful, said Holmboe.

Assessment by Patients

The next question changed the focus of the conversation from the learner to the patient: a patient encounter is a one-time event, so what methodologies are in place to ensure equivalence when incorporating the patient’s very particular set of experiences?


Norcini admitted that there are biases, so to counter them he samples the patient population of a provider as broadly as possible, including different patients on different occasions. In his opinion, there are at least three reasons for including patients in the assessment of providers:

  1. Patients are reluctant to criticize their provider, so when they do, the provider has a major issue that should be addressed.
  2. Patients can be used to compare providers with their colleagues.
  3. Patient feedback makes a major difference in provider performance.

Time-Efficient Assessments

Another comment made during this question-and-answer session was a personal example from Forum member Joanna Cain, representing the American Congress of Obstetricians and Gynecologists and the American Board of Obstetrics and Gynecology, who described how her colleagues in the operating room (OR) use a time-efficient model of formative assessment. In their model, every operation ends with a “60-second” gathering of the team to discuss what did and did not go well. Holmboe applauded their use of formative assessment, but he cautioned against using time limitations as an excuse for not engaging in a complete assessment process. In his view, assessment is a professional obligation that demonstrates the return on investment. With that caveat, Holmboe reported that multiple 2- to 3-minute shared observations can be a rich source of information, and more opportunities for such assessments would be useful. In fact, as the OR example showed, quick assessments are attractive to many health professionals who keep busy schedules. Quick assessments can drive culture as colleagues observe the value in this form of individual and peer assessment, information sharing, and team building.

Self-Assessment

Having heard the previous discussion, Jordan Cohen commented that self-reflection is a potentially important tool. Norcini partly agreed: although self-assessment is a useful tool, most individuals are not good at it. Holmboe added that self-directed assessment, defined by Eva and Regehr (2011) as a global judgment of one’s ability in a particular domain, has the limitations Norcini described. The real value is found when self-assessors seek comments and feedback from others, especially those outside their own profession or discipline (Sargeant, 2008). But despite the valuable information this form of assessment can provide, it is not used as often as other forms of assessment.


MAKING ASSESSMENT MEANINGFUL

Following the orienting discussion, Forum members engaged in interprofessional table discussions to delve more deeply into the value of formative and summative assessments. Each table in the room included Forum members, a health professional student representative, and a user of the health care system. The purpose of engaging students and patient representatives was to enrich the discussions at each table by infusing different perspectives into the conversations. Students identified by members of the Forum were invited to attend the workshop and represented the fields of social work, public health, medicine, nursing, pharmacy, and speech, language, and hearing. Forum member and workshop co-chair Darla Coffey from the Council on Social Work Education led the session. Coffey suggested that communication might be a focus of the discussions about assessment. One person from each group was designated to present to the entire group the summary of the discussions that took place at his or her table. The results of these discussions can be found in Table 1-2 (value of summative assessments) and Table 1-3 (value of formative assessments). The responses were informed by group discussion and should not be construed as consensus.

The Challenge of Uneven Power Structures

In addition to the points listed in Tables 1-2 and 1-3, Forum member Richard Talbott, representing the Association of Schools of the Allied Health Professions, brought up the challenges associated with assessing supervisors or others who may possess greater power than the assessor, given the fear of reprisal. He believes that the first goal within communication is to dismantle the power structure so anyone can feel comfortable speaking up. In this type of setting, individuals, including patients and caretakers, may feel more comfortable giving honest assessments, and it would create positive role models for learners to emulate. Bjorg Palsdottir then discussed the hidden curriculum and how negative role models can imprint negative experiences on learners regardless of the educational training received in the classroom.

This comment was underscored by yet another Forum member, who cited an example of an aggressive attending physician. The program director confronted the physician about his aggression by emphasizing the risk to safety, saying, “If you are intimidating people, you are not a safe practitioner.” One needs to understand how to navigate the potentially delicate situations created by uneven power structures when challenging the hierarchy, said the Forum member. It takes practice, but it can be done.


TABLE 1-2 Summative Assessment Discussion Question: From the Perspective of Assessment of Learning, What Do You Think Makes a Good Assessment Tool/Measure?a


Underappreciated Elements of a Good Assessment, with Description of Element and Workshop Participant

  • Knowing the context: Who the communication is with, who it is between, and for what purpose. (Carol Aschenbrener)
  • Standardized metrics: Include assessment of mutual respect, empathy, compassion, and professionalism across the different professions. (Patricia Hinton Walker)
  • Standardized tools: Indirect observation assessments. (Nelson Sewankambo)
  • Safety: Use clinical simulation to assess safety, but be cognizant of embedded biases. (Meg Gaines)
  • Hawthorne effect with assessments in simulation: People act differently knowing their performance is being watched. (Scott Reeves)
  • Identify the educational goals: Align assessments with current educational goals. (Carol Aschenbrener)

aThis table presents opportunities discussed by one or more workshop participants. During the workshop, all participants engaged in active discussions about opportunities. In some cases, participants expressed differing opinions. Because this is a summary of workshop comments and not meant to provide consensus recommendations, the workshop rapporteur endeavored to include all opportunities discussed by workshop participants as presented by the group leaders who were informed by the group discussions. This table and its content should be attributed to the rapporteur of this summary as informed by the workshop.

Workshop planning committee member Meg Gaines from the University of Wisconsin Law School took this point a step further, saying that it was an ethical imperative to speak up.

This topic resonated with the Forum’s public health representative John Finnegan from the Association of Schools and Programs of Public Health (ASPPH), who was reminded of the 2005 Joint Commission report that cited communication failures as the leading root cause of medical errors (Joint Commission Resources, Inc., 2005). This does not mean the wrong information was always transmitted; rather, oftentimes nothing was said due to a fear of retribution.


TABLE 1-3 Formative Assessment Discussion Question: From the Perspective of Assessment for Learning, What Do You Think Makes a Good Assessment Tool/Measure?a


Underappreciated Elements of a Good Assessment, with Description of Element and Workshop Participant

  • Role models in the practice environment: The hidden curriculum can undo all education. (Bjorg Palsdottir)
  • Safety: Assess communication for safety rather than personality. (Susan Skochelak)
  • Informed self-reflection: Seek feedback from peers to inform self-reflection. (Eric Holmboe)
  • Feedback: Needs to be clear, directive, and timely, and to assess team and individual contributions. (Cathi Grus)
  • Nonverbal communication: Assess beyond spoken communication. (Cathi Grus)
  • Bedside manner: Assess for empathy. (Connie Mercer)

NOTE: Connie Mercer participated in a table discussion as a user of the health care system.

aThis table presents opportunities discussed by one or more workshop participants. During the workshop, all participants engaged in active discussions about opportunities. In some cases, participants expressed differing opinions. Because this is a summary of workshop comments and not meant to provide consensus recommendations, the workshop rapporteur endeavored to include all opportunities discussed by workshop participants as presented by the group leaders who were informed by the group discussions. This table and its content should be attributed to the rapporteur of this summary as informed by the workshop.

Regardless of how well learners are trained, said Finnegan, dangerous situations leading to medical errors will persist if the larger organizational structures do not support a culture of safety.

Assessment as a Driver for Change

Darla Coffey then asked the members, students, and patient representatives to consider how assessments could be a catalyst for change in the educational and health care systems. Much of the discussion revolved around the idea of better integrating education and practice; Forum member George Thibault from the Josiah Macy Jr. Foundation was a vocal advocate for rethinking health professional education and practice as one system.


Forum member Lucinda Maine, the representative from the American Association of Colleges of Pharmacy, thought this could possibly be accomplished within her field by improving the assessment skills of its volunteer instructors and preceptors. In her view, this would make it easier to suggest changes in practice environments that could strengthen relationships along the continuum from education to practice. But, said Aschenbrener, for there to be any benefit to health professional education, assessments need to be reviewed at least annually for their alignment with the predetermined educational goals and the set level of student achievement.

The representative from the Association of American Veterinary Medical Colleges, Chris Olsen, felt that for assessment to drive change, it would need to be part of the expectation. Too often, assessments are carried out without taking the critical last step of using the information to drive change. Individual participants at the workshop provided their thoughts on how assessments in the context of education could drive changes in the practice environment. For example, workshop planning committee member Lucy Mac Gabhann, a law student at the University of Maryland, suggested that in a community setting, student assessment might influence policy. And Forum member Jan De Maeseneer from Ghent University in Belgium thought that students exposed to resource-constrained neighborhoods would develop a sensitivity to social inequalities in health. However, others expressed doubt that assessments could effect change when the organizational culture is based on hierarchy and on imbalances in power structures that are perpetuated through the hidden curriculum and role modeling. Beverly Malone pointed out that such a culture puts patients at risk when open and honest communication is avoided because of a fear of reprisal. John Finnegan fervently agreed, saying that communication in an organizational setting is strongly influenced by that culture, and no matter how much one tries to educate around it, the larger organizational framework will prevail. That must change, he said; there has to be a safe culture in which communication is not feared in order for assessment to drive change in education and practice.

Yet another view was expressed by George Thibault, who pushed for health professions education and health care delivery to be treated as one unit with one goal. In this way, the impact of assessments is considered on both education and practice simultaneously: educational reforms are informed by delivery changes, and delivery changes are informed by educational changes. If education and practice continue to be dichotomized, he said, valuable learning opportunities across the continuum will be missed. Workshop planning committee member Cathi Grus from the American Psychological Association commented on the opportunity for learning from assessments that are bidirectional. To her, such learning means engaging patients in the design of the feedback that would be provided to students, which could send a powerful message to the learner about what is important to the end user of the health system.


What is important, said Grus, is that all involved have an understanding of the goals of the assessment in order to maximize its impact.

REFERENCES

The Dartmouth Institute. 2013. Dartmouth microsystem improvement curriculum: Microsystem action learning series. http://clinicalmicrosystem.org/materials/curriculum (accessed January 6, 2014).

Eva, K. W., and G. Regehr. 2011. Exploring the divergence between self-assessment and self-monitoring. Advances in Health Sciences Education 16(3):311-329.

Harris, P., L. Snell, M. Talbot, and R. M. Harden. 2010. Competency-based medical education: Implications for undergraduate programs. Medical Teacher 32(8):646-650.

Hirsh, D., E. Gaufberg, B. Ogur, P. Cohen, E. Krupat, M. Cox, S. Pelletier, and D. Bor. 2012. Educational outcomes of the Harvard Medical School-Cambridge Integrated Clerkship: A way forward for medical education. Academic Medicine 87(5):643-650.

Holmboe, E. S., J. Sherbino, D. M. Long, S. R. Swing, and J. R. Frank. 2010. The role of assessment in competency-based medical education. Medical Teacher 32(8):676-682.

IHI (Institute for Healthcare Improvement). 2000. Idealized design of clinical office practices. Boston, MA.

IHI. 2014. Idealized design of the clinical office practice (IDCOP): Overview. http://www.ihi.org/offerings/Initiatives/PastStrategicInitiatives/IDCOP/Pages/default.aspx (accessed January 6, 2014).

Ingram, D. J., and Primary Care Development Corporation. 2011. NCQA 2011 Medical Home Assessment Tool. http://www.pcdc.org/resources/patient-centered-medical-home/pcdc-pcmh/pcdc-pcmh-resources/PCDC-PCMH/ncqa-2011-medical-home.html (accessed January 6, 2014).

Joint Commission Resources, Inc. 2005. The Joint Commission guide to improving staff communication. Oakbrook Terrace, IL: Joint Commission Resources.

Norcini, J., B. B. Anderson, V. Burch, M. J. Costa, R. Duvivier, R. Galbraith, R. Hays, A. Kent, V. Perrott, and T. Roberts. 2011. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 conference. Medical Teacher 33(3):206-214.

Norris, T. E., D. C. Schaad, D. DeWitt, B. Ogur, and D. D. Hunt. 2009. Longitudinal integrated clerkships for medical students: An innovation adopted by medical schools in Australia, Canada, South Africa, and the United States. Academic Medicine 84(7):902-907.

Roediger, H. L., and J. D. Karpicke. 2006. The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science 1(3):181-210.

Sargeant, J. 2008. Toward a common understanding of self-assessment. Journal of Continuing Education in the Health Professions 28(1):1-4.

Sullivan, R. S. 1995. The competency-based approach to training. Washington, DC: U.S. Agency for International Development.
