Professional Development in Embedded Instruction

Mary McLean, Ph.D. - University of Wisconsin-Milwaukee
Patricia Snyder, Ph.D. - University of Florida
Susan Sandall, Ph.D. - University of Washington
Mary Louise Hemmeter, Ph.D. - Vanderbilt University

A previous version of this presentation was delivered at the annual meeting of the American Educational Research Association, April 2011, New Orleans, LA.
Funded by the Institute of Education Sciences, R324A070008.

Embedded Instruction
• A multi-component approach to providing intentional and systematic instruction on priority learning targets during typically occurring activities, routines, and transitions to support child engagement and learning
• Key Components (figure)

EC PD “Need” Relevant for Present Study
• Descriptive studies have shown many early childhood practitioners do not feel
  – Competent
  – Confident
• to meet the needs of young children with disabilities in inclusive learning contexts
  – Access
  – Participation

Theory of Change: Abbreviated (figure)
• Intervention: PD tool kit (multimedia materials), high-quality interactive workshops, and coaching (on-site coaching or self-coaching)
• Teachers’ frequent and accurate use of embedded-instruction practices → increased child learning opportunities → child engagement and learning
• Also depicted: contextual variables, instructional “quality,” instructional “effectiveness”

Potential Efficacy Study
• Conducted in FL, WA, and WI
• 36 preschool teachers
  – 3 sites
  – 11 to 13 teachers per site
• 106 children across the 3 sites
  – 2-3 “target” children with disabilities in each teacher’s classroom

Design
• Teachers were randomly assigned to one of three conditions at each site (a minimal sketch of this kind of site-stratified assignment appears after the research questions below)
  – Tools for Teachers workshops plus on-site coaching
  – Tools for Teachers workshops plus self-coaching
  – Wait-list comparison (control)
• Proximal outcome measures: 5 occasions
  – Before and after workshops
  – 2nd month and 4th month of coaching
  – After intervention
• Distal outcome measures: pre and post
  – Before workshops
  – After intervention

Teacher Information
| Characteristic | On-site Coaching (n = 12) | Self-Coaching (n = 12) | Control (n = 12) |
| Gender: Female | 12 | 12 | 11 |
| Race: White/Non-Hispanic | 8 | 10 | 9 |
| Race: African American | 1 | 0 | 2 |
| Race: Hispanic | 1 | 1 | 0 |
| Race: Other^a | 2 | 1 | 1 |
| Education: Bachelor | 6 | 9 | 8 |
| Education: Master | 6 | 3 | 4^b |
| ECSE training^c: Yes | 9 | 8 | 9 |
| ECSE training^c: No | 2 | 4 | 3 |
| Years of experience in EC, M (SD) | 9.3 (6.0) | 6.0 (4.0) | 7.5 (4.2) |

Child Information
All participating children were identified with disabilities that qualified them to receive education and related services under Section 619 of IDEA. All children enrolled in the study had IEPs.
| Characteristic | On-site Coaching (n = 35) | Self-Coaching (n = 36) | Control (n = 35) |
| Gender | 30 males, 5 females | 25 males, 11 females | 27 males, 8 females |
| Mean age in months (SD) | 48.6 (8.7) | 46.8 (8.1) | 52.7 (8.4) |
| Mean ABILITIES Index score (SD) | 1.8 (.5) | 1.7 (.4) | 1.7 (.6) |

Primary Research Questions
• What is the relationship between exposure to the PD intervention and teachers’ frequent and accurate use of embedded-instruction practices?
  – Developing quality learning targets (LTRS)
  – Implementing planned learning opportunities (EIOS)
  – Delivering complete learning trials (EIOS)
• Do scores on standardized measures of key preschool indicators (pre-academic, literacy, language, and social-emotional behavior) differ among children whose teachers were involved in each of the three experimental PD conditions?
• What are teachers’ perspectives about embedded instruction and the professional development they received?
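As a rough illustration of the site-stratified random assignment described on the Design slide, the sketch below shuffles hypothetical teacher rosters within each site and deals them across the three conditions. The teacher IDs, site rosters, seed, and function name are invented for the example; the study’s actual randomization procedure may have differed.

```python
# Minimal sketch of site-stratified random assignment (hypothetical rosters).
import random

CONDITIONS = [
    "workshops + on-site coaching",
    "workshops + self-coaching",
    "wait-list control",
]

def assign_within_site(teachers, seed=2011):
    """Shuffle a site's teachers, then deal them round-robin across the three
    conditions so group sizes stay as balanced as the site's n allows."""
    rng = random.Random(seed)
    roster = list(teachers)
    rng.shuffle(roster)
    return {t: CONDITIONS[i % len(CONDITIONS)] for i, t in enumerate(roster)}

# Hypothetical rosters of 11-13 teachers per site (36 teachers total, as in the study).
sites = {
    "FL": [f"FL-{i:02d}" for i in range(1, 14)],
    "WA": [f"WA-{i:02d}" for i in range(1, 13)],
    "WI": [f"WI-{i:02d}" for i in range(1, 12)],
}

for site, roster in sites.items():
    assignment = assign_within_site(roster)
    counts = {c: sum(v == c for v in assignment.values()) for c in CONDITIONS}
    print(site, counts)
```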
Experimental Intervention
• Teachers in both PD experimental conditions received:
  – 16.5 hours of workshops
  – Implementation guides and materials
  – A digital video camera
• On-site coaching
  – Observation, debrief, and email feedback
  – Mean number of sessions = 16
  – Mean duration of observation = 73.9 min (SD = 19.5)
  – Mean duration of debrief = 39.3 min (SD = 12.1)
• Web-based coaching*
• Wait-list control teachers received the workshops, implementation guides, digital video camera, and access to the web site at the end of the study

Procedural Fidelity: Workshops
• Workshop Implementation Guides
• Workshop Fidelity Checklist
  – 96.8% (range = 93.6%-99.4%)
• Instructional strategies used by trainer
• Time allocated versus time spent

Procedural Fidelity: Coaching
| | Orientation (n = 12) | Early (n = 24) | Latter (n = 65) | Email (n = 76) | Final (n = 12) | All Sessions (n = 189) |
| Coach report: % coaching log indicators, M (SD) | 98.6 (2.1) | 96.7 (3.7) | 98.1 (2.7) | 98.5 (3.7) | 100.0 | 98.2 (3.2) |
| No. of sessions with second observer | 4 | 5 | 15 | 25 | 4 | 53 |
| Second observer: % coaching log indicators, M (SD) | 100 | 91.8 (9.2) | 95.7 (3.4) | 96.3 (4.9) | 97.9 (4.2) | 96.1 (5.0) |

Procedural Fidelity: Self-Coaching
• Fidelity of the self-coaching orientation session
  – 97.2% (range = 91.7%-100%)
• Fidelity of weekly e-mail reminders to teachers in the self-coaching condition
  – 100%

Select Findings

Coaching Strategies: Observation (figure)
Coaching Strategies: Debrief (figure)

Self-Coaching and Website Use
| Teacher | No. of visits every 2 weeks^a | Average time on site per visit (min)^b | No. of action plans submitted^c | No. of forms uploaded to the site | Self-coaching video submitted |
| Teacher A | 1.6 | 36 | 3 | 9 | Yes |
| Teacher B | 1.6 | 19 | 4 | 16 | Yes |
| Teacher C | .6 | 54 | 0 | 4 | No |
| Teacher D | 1.2 | 19 | 1 | 0 | Yes |
| Teacher E | .6 | 42 | 1 | 0 | Yes |
| Teacher F | .4 | 13 | 1 | 0 | Yes |
| Teacher G | .4 | 34 | 2 | 1 | Yes |
| Teacher H | 1.2 | 27 | 0 | 0 | No |
| Teacher I | .2 | 42 | 0 | 6 | No |
| Teacher J | 0 | n/a | 0 | 0 | No |
| Teacher K | 0 | n/a | 0 | 0 | No |
Note. Teachers were grouped as high, moderate, or low users of the website.

Teacher Implementation Data (figure)
Note. LTRS Total Score represents the percentage of quality indicators. EIOS scores are measured as a rate based on the number of trials implemented for a child on one learning target every 15 min (a brief computational sketch of this rate metric appears at the end of this section). On average, teachers implemented trials for 2-3 children, with 2-3 learning targets for each child.
* Refers to a statistically significant main effect at p < .05.

EIOS: Teacher Implementation of “Embedded” Complete Learning Trials (figure)

Child Outcome Data (figure)
Note. TERA-3 = Test of Early Reading Ability-Third Edition; PLS-4 = Preschool Language Scale-Fourth Edition.
* Refers to a significant main effect at p < .05.

Social Validity Data: PD Intervention (figure)

Limitations and Implications
• Limitations
  – A priori power analyses based on alpha = .20
  – Standardized and decontextualized child outcome measures
  – Metrics used to evaluate “dosage” of self-coaching
• Implications
  – High-quality workshops were sufficient for improving the quality of learning targets
  – On-site coaching supported improvements in the frequency and accuracy of embedded-instruction learning trials
  – Different implementation supports may be needed for different components of embedded instruction
  – Social validity data were strong, particularly for workshops plus on-site coaching
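For readers unfamiliar with the two summary metrics referenced in the notes above, the short sketch below shows one plausible way to compute an EIOS-style rate (embedded learning trials per 15 minutes) and a procedural-fidelity percentage (checklist indicators met out of indicators possible). The function names and example numbers are assumptions for illustration only, not the project’s actual scoring procedures.

```python
# Illustrative sketch of the two summary metrics; names and example values are hypothetical.

def trials_per_15_min(n_trials: int, observation_minutes: float) -> float:
    """EIOS-style rate: embedded learning trials observed for one child on one
    learning target, expressed per 15 minutes of observation."""
    return n_trials / (observation_minutes / 15.0)

def fidelity_percent(indicators_met: int, indicators_total: int) -> float:
    """Procedural-fidelity percentage: checklist indicators observed / possible."""
    return 100.0 * indicators_met / indicators_total

# Example: 12 trials observed during a 45-minute activity -> 4.0 trials per 15 min.
print(trials_per_15_min(12, 45))            # 4.0
# Example: 30 of 31 checklist indicators met -> about 96.8% fidelity.
print(round(fidelity_percent(30, 31), 1))   # 96.8
```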