2 Evaluating Outcomes Based on Thoughtful Program Designs (Step 5)
Pages 7-16



From page 7...
... STARTING WITH THE END IN MIND: DESIGNING AND EVALUATING FACULTY DEVELOPMENT
Anthony Artino, The George Washington University, and Aliki Thomas, McGill University

When one begins with the end in mind, said Anthony Artino, tenured professor at The George Washington University, the fifth step of designing and evaluating faculty development becomes the first step.
From page 8...
... activity
• Using a framework to guide design
• Evaluation and sustainability

Faculty Development as a Knowledge Translation Activity

As part of best practices in virtual education, Thomas began by asking the workshop participants to complete the poll, "Is faculty development a knowledge translation intervention?" While many responded "yes," quite a few respondents indicated they were not sure what knowledge translation (KT) was.
From page 9...
... In the chat box, Warren Newton, president and chief executive officer of the American Board of Family Medicine, worried about knowledge being presented as binary or as gaps, as opposed to higher-order "wisdom"; getting facts, he suggested, is a low bar compared with demonstrating implementation and critical skills. Artino agreed, saying, "Education is about preparing participants for yet-to-be experienced problems." Sherman expounded on the idea, suggesting that faculty development be viewed as continuing professional development, with ongoing assessment of the needs of the faculty both as a group and as individuals.
From page 10...
... There are multiple types of theories, models, and frameworks that can guide the design and evaluation of faculty development programs, including models that describe the process of translating research into practice, theories that explain what influences outcomes, and frameworks for evaluating implementation. Thomas presented a conceptual framework called the knowledge-to-action framework, which is based on a review of more than 30 planned-action theories (Graham et al., 2006).
From page 11...
... TABLE 2-1 Some Individual and Organizational Facilitators and Barriers to Change

Facilitators
  Individual:
  • Motivation to change
  • Readiness to change
  • Belief that feedback will result in better learner outcomes
  Organizational:
  • Protected time to read and discuss evidence on feedback
  • Proximity to university
  • Strong residency training

Barriers
  Individual:
  • Heavy patient caseloads or lack of time to read literature
  • Lack of knowledge on effective feedback strategies
  • Lack of good role models regarding educational issues
  Organizational:
  • Program with few resources to support uptake of new practices
  • No funds for faculty development
  • Lack of recognition of continuous professional development

SOURCE: Presented by Thomas, August 11, 2020.
From page 12...
... A participant shared her personal experience in the chat, saying that input from faculty is not widely sought when designing faculty development programs that truly address the needs of faculty. Bushardt responded that this view "is not all that different from patients' involvement (or lack thereof)
From page 13...
... One strategy for sustainability is having a "champion" on site who can hold regular meetings with the team, check in regularly with team members, and ensure that the program remains a priority.

What and How to Evaluate

Evaluation is often framed as a way to answer the question "Does the faculty development program work?"
From page 14...
... In addition, satisfaction surveys have been shown to be biased against a number of groups, including women, minorities, and short people,

TABLE 2-2 Examples of Various Methods for Measuring Outcomes

Construct                 Method
Knowledge                 Survey, vignette, test
Attitudes                 Survey, standardized questionnaire, interviews
Self-reported practices   Survey, standardized questionnaire, interviews
Self-efficacy             Survey, standardized questionnaire, interviews
Actual practice           Observation, chart audit, video recall
Competence                Simulation, vignettes, observation, videoconference

SOURCE: Presented by Artino, August 11, 2020.
From page 15...
... However, he said, surveys can still provide valuable evaluation data if they are well designed, linked to theory or a framework, and focused on constructs beyond satisfaction. For example, surveys can be used to measure outcomes such as self-reported knowledge, attitudes, or practice patterns.

