The F. Marie Hall SimLife Center
Sharon I. Decker, RN, PhD, ANEF, FAAN
Professor, Director of Clinical Simulations
Covenant Health System Endowed Chair in Simulation and Nursing Education

Objectives
• Identify the goals of debriefing.
• Discuss the elements of debriefing that improve outcomes.
• Identify various approaches to debriefing.
• Discuss the process of debriefing.

Our students
“It isn’t that they can’t see the solution. It is that they can’t see the problem.” – G. K. Chesterton
“They don’t know what they don’t know.”

◦ “Better learning is associated with improved teaching techniques” [simulation] (Dunn, 2004)
◦ Teaching techniques that are evidence based and applied appropriately facilitate successful learning (and better patient outcomes).

Experience alone guarantees neither learning nor clinical competence.
Reflection promotes the transfer of experience into learning and knowledge.
Therefore: learning depends on the integration of experience (simulation-based and patient-centered) with reflection.

Reflection
◦ Conscious consideration of the meaning and implication of an action
◦ Assimilation of knowledge (concepts), skills, and attitudes (values & beliefs) with pre-existing knowledge

Reflexio (Latin)
◦ The act of bending back
◦ As with a wave of motion or energy, reflectivity depends on the angle of incidence, the texture of the reflective surface, and the wavelength.

Reflection – initiated through questioning
“…You lead me on by means of things I know, point to things that resemble them, and persuade me that I know things that I thought I had no knowledge of.” (Socrates, quoted in Xenophon’s “Economics”)

Reflection
• Active, persistent, and careful consideration
• Learning is dependent upon the integration of experience with reflection
• Reflection promotes understanding of relationships

Reflection-on-action
◦ Reflecting after the event – thinking it through
Reflection-in-action
◦ Being aware – reflecting while doing
◦ “They ‘feel’ where the music is going and adjust their playing accordingly.” (p. 30)
Knowing-in-action (expert)
◦ Professional knowledge and skills competence
◦ Applying theory while problem solving
◦ Responding to, or “making new sense” of, uncertain, unique situations
◦ “Think like a nurse”

Learning Cycle
Experiencing (Concrete Experience) → Reflecting (Reflective Observation) → Thinking (Abstract Conceptualization) → Applying (Active Experimentation) → Experiencing …

Barriers
◦ Previous learning
◦ Fixations
◦ Socialization & organizational culture

Outcomes
◦ Heightened self-confidence
◦ Empathy
◦ Understanding (knowledge)
◦ Improved critical thinking
◦ Better patient care

Reflection requires
• Active involvement
• A realistic environment
• An authentic experience
• Assistance (guidance)
• Time to reflect

Reflection can be taught – learners expanded their repertoire of possible solutions. (Boyd & Fales, 1983; Henderson & Johnson, 2002)

Learning from reflection is not automatic – it demands active involvement in the clinical experience, plus guidance.

Facilitator
◦ Learners who make their own discoveries – even if disappointing – are more likely to acknowledge and own these discoveries than if these insights are pointed out to them.
(Dewey, 1938)

Difficulty developing reflection
• The learner may have a distorted “view”
• Could lead to repeating mistakes
• Learners may view only the negative
• Fixations
• The outcome is influenced by the facilitator’s skills
(Boud, Keogh, & Walker, 1985; Boud, 2001; Paget, 2001)

If facilitators interject their feedback prematurely, learners stop reflecting, lose confidence, and become dependent on faculty. (Westberg & Jason, 2001)

The facilitator’s role
• Set expectations (outline the process)
• Facilitate according to the level of engagement
• Include “quiet” learners
• Integrate instructional points
• Reinforce, identify deficiencies, and correct errors
• Summarize & review improvement strategies

Ground rules
• Confidential
• Review objectives and expectations
• Professional courtesy – supportive, not judgmental
• Listen
◦ No interruptions
◦ Respect
◦ Don’t talk about anyone not present
◦ Positive before negative

Debriefing depends on
◦ The objectives
◦ The learner
◦ The facilitator
◦ The experience
◦ The time allowed for the process
◦ The relationships between participants

Levels of reflection
• High – can “debrief themselves” [Critical Reflectors]
• Intermediate – “assistance” needed to analyze the experience [Reflectors]
• Low – learners demonstrate little initiative [Non-Reflectors]

Environment
◦ Safe – non-threatening, trustful
◦ Circle – or modified according to the objectives
◦ Private
◦ Time varies – equal to or longer than the scenario

Using video
• Be proficient with the equipment
• Do not show a segment unless it is to be discussed
• Show only 3 to 4 critical segments
• “This segment occurred … discuss what you were thinking as you…”
• Show the segment
• Pause – allow the learner to self-critique

Debriefing is influenced by
• The participants
• The facilitator’s expertise
• The experience
• The impact
• The recollection
• The timing

Goals of debriefing
• Encourage self/team analysis
• Identify different ways of handling the event next time
• Correct errors
• Promote reflective thinking

Debriefing: a process in which, after an experience, the learner is led through a purposeful discussion related to that experience. (Lederman, 1992; Fanning & Gaba, 2007)

When to debrief
• During – “in-simulation,” with the simulation suspended or “frozen” (Van Heukelom, Begaz, & Treat, 2010)
◦ Triggered by an error in management, 30 seconds without action, or failure to perform a critical action
◦ Used to emphasize teaching, defuse a deteriorating situation, or limit embarrassment
• After (Decker, Gore, & Feken, 2011)

Reflection
• The process that allows practitioners to uncover and expose thoughts, feelings, and behaviors
• An active process of self-monitoring initiated by a state of doubt or puzzlement occurring during or after an experience
• Occurs immediately after the experience and can be integrated with debriefing
• Self-reflection – post experience

Phases of debriefing
• Beginning – reactive phase
◦ Emotional reactions
• Middle – analysis
◦ Analysis and critique
◦ Correct any errors not recognized
• Summary
◦ Summarize the simulation
◦ Translate to practice

Observers – peers
◦ Give explicit instructions
◦ Set the ground rules
◦ Need to be guided
◦ Need a tool while observing
◦ Do not participate in the reactive phase
◦ During analysis – can serve as the third person in circular questioning

Socratic questioning – guided questions with strategically integrated “what ifs” (students find this difficult)
◦ Requires active learning
◦ Encourages logic – making connections
◦ Facilitates critical thinking
◦ Integrated into and/or used during the debrief
(Lambright, 1995; Schoeman, 1997)

Socratic questioning – examples
◦ What did you experience?
◦ Analyze how you performed overall.
◦ How would you change your performance?
◦ How can you apply the knowledge and skills from this simulation to an actual patient care situation?
Alpha – Delta – Gamma (based on standards)
◦ Alpha: examples of good action
◦ Delta: what we would like to change, and how we would change it
◦ Gamma: what was the outcome – “What was the outcome when you initiated CUS?” “What could have occurred if CUS had not been initiated?”
• Recognize the performance gap
• Identify strategies to minimize the performance gap

Gather – Analyze – Summarize (WISER: http://www.wiser.pitt.edu/)
• Gather (about 25% of the time)
◦ Goal: actively listen to participants to understand their perspectives
◦ Actions: request a narrative; clarify (“How did it make you feel?”)
• Analyze (about 50%)
◦ Goal: clarify and facilitate reflection
◦ Actions: review and analyze the events; report observations; correct; ask probing questions (“What were you thinking when…”)
• Summarize (about 25%)
◦ Goal: facilitate identification and planning of strategies
◦ Actions: verify and summarize (“Describe two things you need to work on…”)
In a 20-minute debriefing, for example, these proportions give roughly 5 minutes to gather, 10 to analyze, and 5 to summarize.

Another approach
◦ Uses reflection-in-action, reflection-on-action, and reflection-beyond-action
◦ Six phases – Engaging, Exploring, Explaining, Elaborating, Evaluating, and Extending
◦ Focuses on learning
◦ Uses concept mapping

Circular questioning
• Ask how one participant thinks another participant felt in a situation
◦ Example: “Jackie, how do you think Joe felt when you didn’t listen to his suggestions during the simulation?”
• Or ask a third person to discuss a behavior that occurred between other participants
◦ Example: “Jenny, what did you observe related to the interaction between Jackie and Joe?”

“Debriefing with Good Judgment” (Advocacy-Inquiry) – http://www.harvardmedsim.org
Frames → Actions → Results: debriefing leads to new frames, and new frames change later actions.

Advocacy-Inquiry
◦ “I noticed…” – “you did not double-check the dose of the medication.”
◦ “I’m concerned…” – “that without the double-check the patient is at more risk of getting the wrong dose.”
◦ “I was wondering…” – “what was on your mind at the time?”

Format – depends on the learners and the objectives [examples]
◦ Review recorded performance (A-V)
◦ Peer debrief
◦ Self-debrief (“self-assessment”) – debrief checklist, written journal, web-based

Research needed
◦ How to promote reflection
◦ When, and how often, to debrief
◦ What the most effective approaches are
◦ Who should be included in the process
◦ How to structure the process – what tools/techniques
◦ A measurement tool

Debriefing Assessment for Simulation in Healthcare (DASH)
http://www.harvardmedsim.org/debriefing-assessment-simulation-healthcare.php

Evaluate educational effectiveness
• Assessment of debriefing – DASH
• Meeting of educational objectives
• Scenarios
• Videos of simulations and debriefings

All simulated experiences should include a planned debriefing session aimed toward promoting reflective thinking.
Guidelines and a toolbox are being developed.

Websites
• Simulation Innovation Resource Center, National League for Nursing: http://sirc.nln.org/
• TeamSTEPPS: http://teamstepps.ahrq.gov/
• QSEN teaching strategies incorporating simulation: http://www.qsen.org/view-strategies.php
• World Health Organization Patient Safety Tool Kit: http://www.who.int/patientsafety/education/en/
• Army After-Action Review summary: http://www.au.af.mil/au/awc/awcgate/army/tc_25-20/chap1.htm
• STAR Center debriefing tools: http://www.wpahs.org/education/star-center/course-catalog/star-courses/debriefing-tools
• University of Washington educators’ toolkit: http://collaborate.uw.edu/educators-toolkit/debriefing-tools.html
• Debriefing Guide for Facilitators: furcs.flinders.edu.au/.../CHSA%20sim%20toolkit/...
• DASH handbook: http://www.wpahs.org/sites/default/files/file/D11DASH-handbook2010FinalRev2.pdf
• Objective Structured Assessment of Debriefing (OSAD) tool: http://www1.imperial.ac.uk/resources/CFE7DECB-8FE7-437C-8DAA6AB6C5958D66/debriefingosadtool.pdf