Safe and Supportive Schools Commission
Summer Meeting Retreat
July 22, 2015 – Harvard Law Library, Cambridge
Flip Chart Notes – Suggested Tool/Framework Improvements

SMALL GROUP TOPICS
- Overarching Comments
- Improved Content
- Update Per "Recent / Current" Legislation/Requirements/Initiatives
- Usability of Tool
  o Tool Mechanics (User-friendly)
  o More Embedded/Linked/Companion Guides/Resources

Overarching Comments
Big Room – (DOTS: 7 green, 1 blue, 6 yellow)
- Rather than re-do the Framework now, should we use this year's grantees to gather more feedback about how it could be improved?
- Start with an outline of what will be included, and also include an index that shows the topics/themes covered and where each is addressed in the Framework.
- Does the Framework help inform/prioritize what the crises or "urgencies" really are?
- Accompanying survey(s) aligned to the 6 standards, and/or guidance on how many and what types of stakeholders should fill this out to triangulate data/responses.
- Screening instrument? Does it tie to more diagnostic instruments?
- Look for strategic integration with the work of other state agencies.
- **Clarity of PURPOSE?** Screening? Comprehensive assessment? Provide schools with suggested next steps. Theory of change?
- Positioning: is this currently used when there has been a critical urgency? Do we want it to be used preemptively as well?
- Consider some investment in "the pitch" for this effort to make to districts/schools (which may need to vary in light of local differences), and evaluate whether this initiative is delivering on this asserted value-add.
- Name of the Framework assessment.

Improved Content
Room 1 – (DOTS: 4 green, 3 blue, 4 yellow, 1 yellow, 1 blue)
- Level of priority – "How important is this?"
- 10-question pre-screening?
- Action plan should follow self-report: metrics, then action.
- Postvention should be considered for inclusion.
- Remove Training Request from Assessment (separate Action Plan).
- Remove Action Plan from Assessment.
- Staffing for support positions (internal and external).
- Structures, logistics, etc. to guide schools, meetings, teams, etc.
- "Evidence of . . ." to indicate level of implementation.
- Why is it School and Behavioral Health Team? Who should be participating in this?
- Emergency Management Planning/Response.
- Substance abuse (recommendations from the Governor's Commission).
- Cultural proficiency.
- Cultural competency with parents – is it included everywhere it is needed?
- "Behavioral Health" – SSS?
- *Two things being measured in one question. Simplify language, wording, concepts.
- Indicators – 01: under Family Engagement, the last statement should be a separate indicator.
- Create examples of what each rating would really look like (since people's ideas of "a little" to "fully implemented" really vary).

Update Per "Recent / Current" Legislation/Requirements/Initiatives
Room 2 – (DOTS: 2 green, 5 blue, 1 yellow)
- Cyber health/safety?
- Align the family engagement section with the Parent/Family Engagement Fundamentals.
- Family Resource Centers (FRC).
- CHINS → CRA (Children Requiring Assistance).
- Add updates based on recent legislative changes:
  o LGBTQ
  o Restraints
  o Ch. 222
  o Bullying
  o New information on discipline (Ch. 222)
- Report: make a recommendation to keep the tool (updated) with timely and relevant resources and tools.
- Think strategically about other DESE/state/local initiatives that are related, and consider how to better coordinate to take advantage of synergies and avoid redundancy.
  Ex: Urban Leaders Network, Family and Community Engagement Fundamentals, School Counselors, social/emotional health tool.
- Emergency Planning; Medical Emergency Response Planning.
- Non-academics: infuse MTSS, PBIS, and tiered intervention fidelity questions.
- Connect to other related initiatives: PBIS / MTSS.
- How to easily help people understand these: SEL, SSS, BH, PBIS, MTSS, RTI, MH, truancy, CRA (Children Requiring Assistance).
- Legislation: have a companion document noting related legislation, requirements, etc.

Tool Mechanics (User-friendly)
Room 3 – PART I – (DOTS: 2 green, 1 blue, 1 yellow, 1 green, 1 blue)
Answer Options:
- No option for "not at all."
- No option for "not in place" (Leadership – school).
- R = Recommendations.
- On action plan, "initiate" needs to be an option.
- Clarify – does "no answer" mean "don't know," "not at all," or something else?
Likert Scale:
- Needs clarification (how is "some" different from "moderate"?).
- Need to have an option of "not at all."
- "Some extent" vs. "moderate extent."
- What does "no answer" (blank) mean?
- Implementation needs a "not in place" option (rating scale).
- On Action Plans, "initiate" needs to be a choice.
- Maybe collapse some response categories.
Structure of Tool:
- Re-label the "action steps."
- Simplify: break questions apart so each measures one thing; fewer words.
- Separate the assessment from action planning.
- Use self-assessment science to structure the tool (ex: only take 30 seconds to answer each?).
- Better design/aesthetics.
- Ease of navigation – not intuitive.
- Can an IT person review the tool to determine more sophisticated capacity?
- Can a field for notes be included in the tool? For element sections?
- Can the term "safe and supportive learning environment" be incorporated throughout the tool?
- Separate assessment from action planning.
- Think through the individual vs. group fill-out option.
- School vs. district teams – different tools.
- Usability on Macs vs. PCs – does it work on both?
- Tags to connected elements.
- Simplify: visual clutter – too much on each screen; fewer, more focused items.
Action Steps:
- Action Steps – wrong language in the Framework.
- Action Steps mis-labeled: "Action Steps" not right; "Best Practices"?
Action Plan:
- Maybe change to "how important is this to you?"
Improved Navigation:
- Have an expert in self-assessment science help:
  o Design structure of tool
  o Improve internal navigation
  o Simplify/reduce clutter
- Labels for navigation.
- Tool should make it clear that you can start in any section of the Framework.
- Left-side navigation.
- Color scheme to designate section (or header/footer).
- Top tool bar versus buttons (tabs? toolbar? menu?).
- Pop-up window for Framework reference while completing the tool.

More Embedded/Linked/Companion Guides/Resources
Room 3 – PART II – (DOTS: 1 green, 3 blue, 2 yellow, 2 yellow, 1 blue)
Data:
- DASHBOARD to help understand data.
- Consider aggregate data output (i.e., for review by ESE and the Commission).
- Guidance on how to use data.
- Visual for data to help identify urgencies.
- Is there a way the tool can cross-reference data in fields?
Resources:
- PBIS Org, RTI, CSEFL / Pyramid, MTSS.
- Tool box section – links to resources: CBHI Guide, TLPL website, etc.
- Consider paper resource(s) as a complement to the on-line self-assessment.
- Link results with suggested resources.
- Links to resources (CBHI, TLPL website).
- Resources populate during the Action Planning stage.
- Should there be a Resource section? Hyperlinks?
- Tool and Resources
  o Coordinating function (tour guide)
TA/Guidance:
- Help boxes to provide clear definitions of terms (i.e., crisis plans, emergency plans).
- Some kind of rubric to define response categories – critical features that can be generated.
- When you fill out the team member page, give the option of a school or district assessment.
- Move guidance and training to the Action Plan report.
- "Start Here" button with instructions and how to start.
- Be able to see overall strategies before you dive into the tool.
- Examples of what it should look like.
- Pre-self-assessment guidance: what you need to bring to the table (ex: data) before beginning; who is involved?
- Need a guide on how to use this: how do we determine what the underlying issue is; better framing; where to start; the guide tells us what the purpose of the tool is.
- Guidance/suggestions for starting the tool (i.e., what kind of data to gather beforehand, what people should be included, etc.).
- Suggest who should participate in the evaluation (broad scope/representation).
- Webinars, conferences, workshops.
- Facilitation guide for using the tool.