Poster Title: Implementation of a Direct Observation Program for Pediatric Trainees using a QR Code Linked Evaluation Tool
Student: Menaka Reddy, Class of 2024
Faculty Mentor and Department: Jeanna Auriemma, MD, Pediatrics
Funding Source: Department of Pediatrics, Wake Forest School of Medicine
ABSTRACT
Background: Direct observation (DO), defined as “the active process of watching learners perform in order to develop an understanding of how they apply their knowledge and skills to clinical practice,” is an integral part of medical training, from its start in medical school through the years of residency and fellowship. Despite the well-documented benefits of DO for both learners and instructors, many medical programs struggle to implement a systematic process that ensures faculty consistently observe trainees and provide associated feedback. This educational lapse often stems from the pressure attending physicians and residents face to maintain high efficiency and a smooth workflow while prioritizing patient care. There is therefore a clear need for research and program development aimed at streamlining direct observation and feedback between instructors and resident physicians. The aim of this study was to evaluate a novel program for increasing direct observation of resident trainees on all rotations, implemented within the Wake Forest Pediatric Residency Program during the 2020-2021 academic year. In the program, interns in the pediatrics department received a QR code to be placed on their ID badge that instructors could scan with their phones, linking them to a REDCap survey for feedback. Results of the study will help determine the impact of such a program on resident training as well as the feasibility of continuing and/or expanding it.
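The badge workflow described above can be sketched as follows. This is a minimal illustration only: the REDCap base URL and the `intern_id` prefill parameter are hypothetical stand-ins, since the poster does not specify the actual survey link or instrument fields.

```python
from urllib.parse import urlencode

# Hypothetical REDCap public-survey link; a real link would be
# generated by the institution's REDCap instance.
SURVEY_BASE = "https://redcap.example.edu/surveys/?s=ABC123"

def badge_survey_url(intern_id: str) -> str:
    """Return the survey URL that one intern's badge QR code would encode.

    The intern_id query parameter (an assumed name) ties each scan
    to the trainee being observed, so feedback lands in the right record.
    """
    return SURVEY_BASE + "&" + urlencode({"intern_id": intern_id})

# One URL per intern; a QR library (e.g. the third-party `qrcode`
# package) would then render each URL as a printable badge sticker.
urls = {i: badge_survey_url(i) for i in ["PGY1-01", "PGY1-02"]}
```

Each observing attending simply scans the badge, and the prefilled survey opens on their phone, which is what keeps the observation-to-feedback loop inside the normal clinical workflow.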
Hypothesis: If QR codes are placed on intern ID badges, then residents will receive a greater quantity of direct observations, and the content of the feedback provided following those observations will change.
Methods: To evaluate the direct observation program, both qualitative and quantitative analyses were used. For the qualitative portion, the research team created two semi-structured interview guides, one for interns and one for faculty. The guides included questions such as “How did the program influence residency training?” and “How did you feel about the program? What did you like or not like?” Interviews lasted approximately 20-25 minutes and were transcribed and later analyzed using qualitative coding software. For the quantitative portion, responses to the REDCap surveys linked to the QR codes were analyzed with descriptive statistics to report which skills were observed most frequently and in which sections of the department.
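The descriptive tally of observed skills and settings can be sketched as below. The record layout and field names (`skill`, `setting`) are illustrative assumptions, not the actual REDCap instrument; the point is only the counting step.

```python
from collections import Counter

# Hypothetical survey responses: each badge scan yields the skill
# observed and the clinical setting where it occurred.
responses = [
    {"skill": "history taking", "setting": "inpatient wards"},
    {"skill": "physical exam", "setting": "inpatient wards"},
    {"skill": "history taking", "setting": "outpatient clinic"},
]

# Frequency of each observed skill and each setting.
skill_counts = Counter(r["skill"] for r in responses)
setting_counts = Counter(r["setting"] for r in responses)

# The most frequently observed skill and the setting with the
# most observations, as reported by descriptive statistics.
top_skill, n_skill = skill_counts.most_common(1)[0]
top_setting, n_setting = setting_counts.most_common(1)[0]
```

With real export data, the same two-line `Counter` pass produces the frequency tables the quantitative analysis describes.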
Results: At the conclusion of the study, fifteen faculty and interns participated in the semi-structured interviews. Key themes that arose during the interviews included commentary on (1) quality of feedback; (2) the role of mentorships and relationships in the program; (3) the ability to individualize or tailor education; (4) the integration of the program into clinical workflow; (5) the technology associated with QR codes; (6) how direct observation and associated feedback impact the learning environment; and (7) logistics and program promotion important to implementation. Participants also reflected on which clinical skills were most useful to have observed, as well as whether their experience with the program was positive, negative, or neutral.
Conclusions: Further analysis of these data will provide more conclusive evidence about the impact of implementing DO programs in residency training. It will also highlight the feasibility of continuing and expanding the use of QR codes as a means of capturing direct observation feedback, as well as the benefits and drawbacks of implementation that would inform future modifications to the program.