Moving From Data-Rich to Data-Informed
Johnston County Schools' Program Evaluation Department
March 23, 2016
Who Are We?
• Johnston County Schools – 44 Schools (22 Elementary, 13 Middle, 11 High)
  – Testing (State and Local Benchmark)
  – Accountability
  – School Improvement
  – Program Evaluation
Today’s Objectives
Participants Will Be Able To…
• Understand the role of school-based data teams
• Understand the role of the district in supporting school-based data teams in data-mining efforts, data-driven decision-making, and ultimately increasing student achievement
• Navigate analysis tools and resources for data teams
What is a Data Team?
What words come to mind when you think of a data team?
How about a PLC? Then…
• Head to jcsdatacentral.blogspot.com
• Click on the Conference tab.
• Use the appropriate buttons to submit your words. Please submit
as many as you can think of.
Rationale for Creating Data Teams
In your school or district…
• How do we determine areas of strength and focus?
• Once an area of focus is identified and a strategy is put in place to address it, how do we know if that strategy is working?
• How do we determine what our students know and don't know?
• How do we determine what our students are ready to learn and are not ready to learn right now?
• How do we determine who needs remediation or acceleration?
• How do we identify our academically at-risk students?
• How do we identify those "diamonds in the rough" that just need a little polishing to begin to shine?
  – "The Forgotten Middle"
I know, I know!
We should use the data!
• Assessment Data
  – EOG/EOC Goal Summary
  – NCFE
  – SAT/ACT/AP Data
  – Local Benchmark Data (i-Ready)
  – mClass
  – SchoolNet
• Common Formative Assessments
• Program Data
  – Course Outlines, Unit Plans, Pacing Guides
  – PLC meeting minutes
  – Classroom Walkthroughs
  – PD Offerings
• Perception Data
  – AdvancED Stakeholder Survey
  – Teacher Working Conditions Survey
• Demographics Data
  – Subgroups
  – EC / AIG / MTSS Rosters
  – Suspension data
  – Attendance (Student and Staff)
  – Dropout Rate
  – Graduation Rate
  – Staff Turnover
*Topics of Data Sessions
Think about your school or district…
• What data sources are available?
• How are you using the data that is available?
• How do you know it is being used?
• There's so much data…where do we start?
• What data is best to use?
• What data is available?
• How should we be using it?
• How can it help students?
• How can it help teachers?
• How can it support professional development and growth?
• How can it be used to improve instruction?
• How can it be used to help instructional leaders?
What good is all of this data?
NONE, if it is not used
OR
if it is used the wrong way.
How Are You Using Data?
Rationale
• Provide structure and support that allows school stakeholders to
use data in meaningful ways to inform instructional improvements
for all students.
– Increase knowledge and use of different data sources
– Encourage collaboration and support
– Encourage greater transparency through sharing of data (feeder pattern analysis)
Implementation Steps
• Introduce idea to principals (Summer 2013)
• Principals select data team members (Summer 2013)
• District Level Data Sessions
  – Explain data points
  – Model analysis
  – Provide analysis templates, tools, resources
• Facilitate school-level data trainings, as requested
• Be a resource
• Provide central resource center
• Provide ongoing communication
Data Teams ≠ New Concept
• Fully functioning PLCs
– “We are a PLC, we do Data Teams”
• Grade/Content Alike – Horizontal
  – 5th Grade Team, 8th Grade Math Team, Biology Team, etc.
• Subject Alike – Vertical
  – Middle School Math, K-2 Reading, etc.
  – Taking PLCs to the Next Level
• School Improvement Team
Leaders Make it Happen – An Administrator’s Guide to Data Teams, by Brian McNulty and Laura Besser
“We are a PLC, we do Data Teams.”
“The PLC grounds teams in the process of collaboration and inquiry, and the Data Teams process
enhances PLCs by providing explicit, data-driven structure that leads to results.”
~Brian A. McNulty, Leaders Make It Happen
Data Teams + PLCs…
Recommended Characteristics of a Data
Team Member
• Genuine interest in data / data literate
• Effective problem solver and communicator
• Comfortable using technology
• Strong team player / collaborative mindset
• Ability to see the "Big Picture"
• Effective instructional leader within the school
• Diverse, creative, innovative thinker
• Able to instruct and mentor others in using data for instructional decision making
Implementation Steps
• Introduce idea to principals (Summer 2013)
• Principals select data team members (Summer 2013)
• District Level Data Sessions
– Explain data points
– Model analysis
– Provide analysis templates, tools, resources
• Facilitate school-level data trainings, as requested
• Be a resource
• Provide central resource center
• Provide ongoing communication
District Data Team Support
• District Data Sessions to analyze and discuss "hot" topics throughout the year.
  - 2015-2016 Data Session Calendar
  - 2014-2015 Data Session Calendar
• Goal: To train at least one person from each school who will be able to help facilitate the implementation and training of data teams at the school.
• Provide each school with resident data
experts.
Data Session Process
Model
• Explain data points
• Model data analysis process for district-level data
Do
• Schools use templates provided to analyze school-level data, with
district level facilitation and support
Act
• Discuss best practices
• Create an action plan
Feedback
• Participants provide feedback on session (other resources and
support needed, etc.)
Data-Driven Decision Making: The Process
• Process is cyclical
• Starting point may vary
• No fixed end point
• Prerequisites and Supporting Conditions:
  – Tools for collecting, analyzing, generating, and storing data
  – Support and modeling by school and district leadership for continued improvement and use of data
  – Collaborative environment and protected time for reflection on data
  – Professional development and technical support for data interpretation
Analysis Model
• Here's What! – This column is filled with specific facts. Example: "Hispanic students are consistently performing below the target – proficiency dropped to 25.4% in 2014."
• So What? – An interpretation of the data. Why is it important?
• Now What? – What strategies can be implemented? Who is responsible? How can we monitor progress? What resources are needed? Additional questions that may need answering.
[Analysis template columns: HERE'S WHAT! | SO WHAT? | NOW WHAT?]
In general: What is the data telling us? What are our next steps?
Taken from “Data Driven Dialogue: A Facilitator’s Guide to Collaborative Inquiry”, Bruce Wellman & Laura Lipton
jcsdatacentral.blogspot.com
Data Analysis Protocol
jcsdatacentral.blogspot.com
Activity 1:
EVAAS Value Added
&
Diagnostic Analysis
Achievement vs. Growth
Student Achievement: Where are we?
• Highly correlated with demographic factors
Student Growth: How far have we come?
• Highly dependent on what happens as a result of
schooling rather than on demographic factors – ADULT
IMPACT
How is Growth Different?
• We are not trying to get our students to reach a score on a test.
• Every student can grow, even if they are not proficient.
• If we concentrate on growth, proficiency will come.
• No matter where a student comes into your class, they can still grow.
• Every student matters.
• We have to meet students at their "level" and help them grow from there.
Two Models
Gain Model (Reading/Math EOG):
• Uses NCEs (Normal Curve Equivalents), a scale similar to a percentile.
• Uses up to 5 years of data across grades/subjects; measures average achievement level from one year to the next.
• All students are included, regardless of their testing history, their number of prior test scores, and which test scores they have.
Predictive Model (EOCs, Science, CTE, NC Final Exams):
• Predicted score based on prior test history, compared to students with similar testing histories.
• Uses up to 5 years of data across grades/subjects.
• To be included in the model, a student needs a minimum of 3 prior test scores across grades/subjects.
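To make the distinction concrete, here is a minimal sketch in Python of the two ideas – a cohort's average NCE gain from one year to the next, and an actual-versus-predicted comparison. This is not the EVAAS implementation itself; all numbers and variable names are hypothetical.

# Illustrative sketch only -- not the EVAAS model. Hypothetical data.
from statistics import mean

# Gain-model idea: compare the cohort's average NCE (Normal Curve
# Equivalent, a 1-99 scale similar to a percentile) across two years.
nce_last_year = [38.0, 52.5, 61.0, 45.5, 70.0]   # same students, prior year
nce_this_year = [42.0, 55.0, 59.5, 50.0, 74.5]   # current year
print(f"Average NCE gain: {mean(nce_this_year) - mean(nce_last_year):+.1f}")

# Predictive-model idea: compare each student's actual score to a score
# predicted from prior test history; the average difference suggests the
# impact adults had on measured progress.
predicted = [548, 561, 555, 570]                 # hypothetical predictions
actual = [552, 559, 560, 574]
diffs = [a - p for a, p in zip(actual, predicted)]
print(f"Average of actual minus predicted: {mean(diffs):+.1f}")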
School Value Added Report
A conservative estimate of a school’s or district’s effectiveness. What impact did adults
within the educational community have on the measured progress of students when
compared to school, district or state averages?
Value Added Report
Your Turn:
Guiding Questions
• What accomplishments did you discover?
• What areas for improvement did you discover?
• What patterns, if any, did you discover?
• Based on your understanding of your school and the data you reviewed, what do you think is the root cause of each area of concern?
• What are your next steps in terms of addressing your findings?
Analysis Template
Diagnostic Reports
• Use to identify patterns or trends of progress among
students expected to score at different achievement levels
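As a rough sketch of the kind of view a diagnostic report summarizes (hypothetical data, not the EVAAS report itself), students can be grouped into quintiles by prior achievement and the average gain examined within each group:

# Illustrative sketch: average gain by prior-achievement quintile.
# Hypothetical data; not the EVAAS diagnostic report itself.
import numpy as np

rng = np.random.default_rng(0)
prior = rng.normal(50, 21, size=200)   # prior-year scores (NCE-like)
gain = rng.normal(0, 5, size=200)      # each student's gain this year

order = np.argsort(prior)              # rank students by prior score
for i, idx in enumerate(np.array_split(order, 5), start=1):
    print(f"Quintile {i}: mean gain {gain[idx].mean():+.2f} (n={len(idx)})")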
Diagnostic Reports:
The Whiskers
Diagnostic Report Summary
YOUR TURN:
Guiding Questions
• Which quintiles show the most growth?
• Which quintiles are in most need of improvement/losing
ground?
• Do you notice any patterns in the data?
• What are your next steps in terms of addressing your
findings?
Share Out
• Who needs to be part of the discussion?
• Who should lead?
• What might the conversation look/sound like?
• What additional questions should be asked?
• What are the team's next steps? How is instruction impacted?
Activity 2:
Instructional Impact
Teacher Effectiveness
Goal Summary Data
Teacher Effectiveness
Guiding Questions for Instructional Leaders
and Teachers
• Did each group make enough growth to meet the growth standard?
(bar above line)
• Which level of students did you have the greatest positive impact on?
– Why do you think this occurred?
• Which level of students would you like to see making more progress?
– What factors might have contributed to this?
– What are some strategies/activities you can do/participate in to show more
gains with this level of students this year?
• How can this data inform instructional practices, strategies,
programs?
Goal Summary Reports
• The goal summary report provides the school and teacher with information about student achievement on specific goals from the Common Core/Essential Standards for each EOG or EOC.
• The goal summary report provides the following data:
  – Mean Scale Score for a group of students
  – Number of students tested (number of valid scores)
  – Percentage of items for each goal
  – Weighted Percent Correct – the percentage of questions answered correctly by this group of students. For example, if the percent correct for the Geometry strand is 46.2%, this group of students correctly answered 46.2% of the questions related to Geometry. If there is no data for a goal or objective, either there were not enough students or not all forms were represented. (An illustrative calculation follows this list.)
  – Difference from the State Mean Percent Correct
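A simple way to see the percent-correct arithmetic described above, assuming it is just the share of goal-related items the group answered correctly (a hypothetical example; the state's actual weighting across forms may differ):

# Hypothetical goal-summary arithmetic; actual state weighting may differ.
geometry_items = 13       # Geometry items on the form (hypothetical)
students = 25             # students with valid scores (hypothetical)
correct_responses = 150   # total correct Geometry responses in the group

percent_correct = 100 * correct_responses / (geometry_items * students)
state_mean = 52.7         # hypothetical state mean percent correct

print(f"Group percent correct (Geometry): {percent_correct:.1f}%")   # 46.2%
print(f"Difference from state mean: {percent_correct - state_mean:+.1f}")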
Things to Remember When Using Goal Summaries
• The test forms used from year to year may be different. Tests are equivalent at the total score level, not at the objective level; thus, forms from year to year may have more or less difficult items on a particular goal or objective.
• The Goal Summary Report provides valid data about curriculum implementation only when all forms are administered within the same classroom/school/LEA, there are at least five students per form, and approximately equal numbers of students have taken each form.
• The number of students in a classroom setting
• The type of students in the class (EC vs. AIG)
• How the standard is weighted
[Sample Goal Summary Report columns: Area, Percent of Test, Percent Correct of That Goal, Difference from State Results]
● How do my scores compare to the overall school?
● On what standards did my classes perform higher than the school?
  ○ What instructional practices contributed to this success?
● What are my standards of focus?
  ○ What resources/strategies could be implemented?
  ○ What resources/strategies can my PLC members offer that have shown success?
Take Charge of Your PDP!
Your Turn
EVAAS Teacher Reflection
• Which level of students did you have the greatest positive impact on?
• Did each group make enough growth to meet the growth standard?
• Which level of students would you like to see making more progress?
• How can this data inform instructional practices, strategies, programs?
Goal Summary Teacher Reflection
• How do my scores compare to the overall school?
• On what standards did my classes perform higher than the school? What instructional
practices contributed to this?
• What are my standards of focus?
• What resources/strategies can my PLC members offer?
Share Out
• Who needs to be part of the discussion?
• Who should lead?
• What might the conversation look/sound like?
• What additional questions should be asked?
• What else can be used to analyze curricular impacts in your district/school?
Additional Tools
Activity 3:
Perception Data
Teacher Working Conditions
It’s How You See Things
Perception Data
• Gathered through questionnaires, interviews, and observations
• Helps educators understand what students, parents, teachers, and the community
think about the learning environment.
• People act according to what they believe, so if you want to change a group's
perceptions, you have to know what they believe.
• For example, student perceptions can tell educators what motivates students to learn
and staff perceptions can indicate what kind of change is possible and necessary
within the school.
Perception Data
• Perception data should be used as a guide for school improvement planning to help prioritize needs.
• Analyzing and using this information to improve schools is critical and needs to be a part of continuous improvement efforts at the school and district levels.
• Other data should be used with perception data to triangulate findings and provide a better understanding of stakeholder perceptions.
Teacher Working Conditions Constructs
• Time: Available time to plan, collaborate, and provide instruction, and barriers to maximizing time during the school day.
• Facilities and Resources: Availability of instructional, technology, office, communication, and school resources to teachers.
• Community Support and Involvement: Community and parent/guardian communication and influence in the school.
• Managing Student Conduct: Policies and practices to address student conduct issues and ensure a safe school environment.
• Teacher Leadership: Teacher involvement in decisions that impact classroom and school practices.
• School Leadership: The ability of school leadership to create trusting, supportive environments and address teacher concerns.
• Professional Development: Availability and quality of learning opportunities for educators to enhance their teaching.
• Instructional Practices and Support: Data and supports available to teachers to improve instruction and student learning.
Teacher Working Conditions Analysis
• All Teacher Working Conditions data is publicly available at:
  http://www.ncteachingconditions.org/results
Teacher Working Conditions Analysis
• Analyze each TWC survey using the Construct Reflections
Analysis Sheet – identifying construct positives/challenges
• Narrow your focus even further by prioritizing your Constructs
– Your item of focus does NOT have to be your lowest item score.
– You and your faculty know the context of your school. Use that
knowledge to choose an item of greatest impact to explore further.
Next Steps – TWC
• Celebrate successes! How do we continue?
• For items of focus – conversations with the School Improvement Team – is there alignment with the current plan?
  – What is working?
  – What is not working?
  – What would be ideal?
  – What are the challenges to achieving the ideal?
• Resources: What's working/not working? What's ideal?
What’s Working? What’s Not?
Ideal Example
Implementation Steps
• Introduce idea to principals (Summer 2013)
• Principals select data team members (Summer 2013)
• District Level Data Sessions
– Explain data points
– Model analysis
– Provide analysis templates, tools, resources
• Facilitate school-level data trainings, as requested
• Be a resource
• Provide central resource center
• Provide ongoing communication
District-Wide Best Practice Sharing
Examples
– Data Wall session
– Scheduling practices
– Effective Instructional Practices
• Classroom Example
• School-wide Example
– Use of Stakeholder Data
– Feeder Pattern Action Plans
– Local Benchmark Best Practices (i-Ready)
Central Resource Center & Ongoing
Communication
JCS Data Central blog
– Central location for training, resources, templates, and presentations
– Interactive
– Communicate weekly with subscribers
• Further spotlight best practices
• Relevant and timely guiding questions and topics
• Info-graphics to explain tricky data points
• Additional resources
Data Teams + PLCs…
Thanks to JCS Data Teams, Johnston County Schools has moved from
a data-rich district
to
a data-informed district.
Contact Us
Kristy Stephenson
Executive Director of Program Evaluation
kristystephenson@johnston.k12.nc.us
Jenna McKeel
Data Analysis Specialist
jennamckeel@johnston.k12.nc.us
Katie Wall
Data Analysis Specialist
katiewall@johnston.k12.nc.us
Don’t forget to give us feedback on today’s session.
Feedback form can be found at
jcsdatacentral.blogspot.com.