Data Wise - Tulsa Public Schools

Data Wise
Data-Driven Decision Making
Tulsa Public Schools
September 18, 2012
Objectives
• Develop a common language for and understanding of data analysis
• Examine the 8-step Data Wise Decision-Making Process
• Learn how to use multiple data sources to build a data-driven culture where evidence is used to make all decisions and action is taken for improvement
Agenda
The Data Wise Decision-Making Cycle
Case Study – Mount Baker Middle School
Working with Data
Data Analysis
Break
Data Analysis Continued
Alignment to WISE Process
Next Steps
Implementation Audit
District Data and Recommendations for Future Actions
Fall 2011, Lead and Learn Center

District Initiative (District Average):
Professional Learning Communities: 1.4
Learning Context: 1.1
Instructional and Assessment Strategies/Practices: 0.8
Professional Development: 1.6
Leadership Practices: 1.2
Average Total Score
Implementation Scale (score range for each level of implementation)
Deep Implementation: 3.5 to 4.0
Full Implementation: 2.5 to 3.49
Partial Implementation: 1.5 to 2.49
Emerging Implementation: 0.25 to 1.49
No Implementation: 0 to 0.24
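Mechanically, the scale is just a lookup from a score range to a label. Below is a minimal Python sketch of that lookup, assuming scores on the same 0 to 4 scale; the band boundaries come from the table above, and the example scores echo the district averages on the audit slide.

```python
# Minimal sketch: classify an average implementation score against the
# Implementation Scale above. Band boundaries are taken from the table;
# the example scores echo the district audit averages shown earlier.

BANDS = [
    (3.50, 4.00, "Deep Implementation"),
    (2.50, 3.49, "Full Implementation"),
    (1.50, 2.49, "Partial Implementation"),
    (0.25, 1.49, "Emerging Implementation"),
    (0.00, 0.24, "No Implementation"),
]

def implementation_level(score: float) -> str:
    """Return the implementation band for a 0-4 average score."""
    for low, high, label in BANDS:
        if low <= score <= high:
            return label
    raise ValueError(f"score out of range: {score}")

if __name__ == "__main__":
    district_scores = {
        "Professional Learning Communities": 1.4,
        "Learning Context": 1.1,
        "Instructional and Assessment Strategies/Practices": 0.8,
        "Professional Development": 1.6,
        "Leadership Practices": 1.2,
    }
    for initiative, score in district_scores.items():
        print(f"{initiative}: {score} -> {implementation_level(score)}")
```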
Description/Recommendations:
• Use data through an inquiry process to promote an action orientation and focus on results.
• Staff members work collaboratively in processes that foster continuous improvement in all indicators of student achievement.
• Leadership of school improvement processes is widely dispersed and helps sustain a culture of continuous improvement.
Pair-Share
Introduce yourself to a person near you. Share one way
that you are currently using data at your school.
Then Discuss:
What conditions are most important for your staff to
successfully engage in data analysis?
Three Supports to Using Data
Three factors must be in place to use data to improve instruction:
1. Use of a well-defined school improvement process (Armstrong & Anthes, 2001; Boudett, City, & Murnane, 2005; Holcomb, 2001; Love, Terc, & Regional Alliance for Mathematics and Science Education Reform, 2002)
2. Cultural shift to a learning organization, in which instruction is continually examined and improved (Boudett, 2007)
3. Reform must take place close to the classroom (Holcomb, 2001; Garvin, Edmondson, & Gino, March 2008)
THE DATA WISE
DECISION-MAKING CYCLE
Dr. Kathryn Boudett
Dr. Elizabeth City
Dr. Richard Murnane
Stoplight Protocol
3 = Full Implementation (Really happening)
2 = Partial Implementation (Sort of happening)
1 = Little or No Implementation (Not happening)
1. Organize for Collaborative Work
• Adopt an improvement process (WISE plan)
• Form leadership team
• Make time for collaborative work
• Create meeting structures
• Set norms, acknowledge work styles
• Review data and initiatives
Boudett, City, & Murnane (2005)
2. Build Assessment Literacy
• Review skills tested
• Study how results are reported
• Learn principles of responsible data use (one such principle is sketched below)
Boudett, City, & Murnane (2005)
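One concrete habit of responsible data use is not drawing conclusions from groups too small to be meaningful. The sketch below illustrates that idea with a hypothetical minimum-n rule; the threshold, function name, and sample scores are illustrative assumptions, not part of the Data Wise materials.

```python
# Illustrative sketch of one responsible-data-use convention: don't report a
# subgroup percentage when the group is too small to be meaningful (the
# minimum-n threshold here is an assumption, not a Data Wise requirement).

MIN_N = 10  # assumed reporting threshold

def proficiency_rate(scores: list[int], cut_score: int) -> str:
    """Return the percent at/above the cut score, or a suppression note."""
    if len(scores) < MIN_N:
        return f"n={len(scores)} (suppressed: group smaller than {MIN_N})"
    rate = 100 * sum(s >= cut_score for s in scores) / len(scores)
    return f"{rate:.0f}% proficient (n={len(scores)})"

print(proficiency_rate([62, 71, 80, 55, 90, 77, 68, 74, 81, 66, 59], cut_score=70))
print(proficiency_rate([88, 91, 73], cut_score=70))
```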
3. Create a Data Overview
• Choose a focus
• Display the data
• Allow teachers to make sense of the data
• Analyze data, find the story, and paint a picture (see the overview sketch below)
Boudett, City, & Murnane (2005)
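As a rough illustration of what a data overview can start from, the sketch below aggregates a flat file of results into percent proficient by grade and subject with pandas. The column names and numbers are assumptions for demonstration, not district data.

```python
# Minimal sketch of a data overview, assuming a flat file of test results with
# hypothetical columns (grade, subject, proficient). The goal is the kind of
# one-page summary a leadership team could put in front of staff.
import pandas as pd

scores = pd.DataFrame({
    "grade":      [6, 6, 7, 7, 8, 8, 6, 7, 8],
    "subject":    ["Math", "Reading"] * 4 + ["Math"],
    "proficient": [1, 0, 0, 1, 1, 1, 0, 0, 1],
})

# Percent proficient by grade and subject -- the "story" to discuss with staff.
overview = (
    scores.groupby(["grade", "subject"])["proficient"]
          .mean()
          .mul(100)
          .round(1)
          .unstack("subject")
)
print(overview)
```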
4. Dig into Data
• Assess school indicators
• Examine a wide range of student data (see the drill-down sketch below)
• Use protocols to stick to the evidence
• Identify a “Learner-Centered Problem”
• Develop a shared understanding of the knowledge and skills students need
Boudett, City, & Murnane (2005)
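Digging into the data often means ranking strands or skills by performance, as in the Mount Baker case where statistics surfaced as the weak area. The sketch below shows that kind of drill-down on item-level data; the strand names and values are illustrative assumptions.

```python
# Sketch of "digging into" student data: rank reporting strands by percent
# correct to see where students struggle (mirrors the Mount Baker example,
# where the Statistics strand surfaced as weak). Column names are assumptions.
import pandas as pd

items = pd.DataFrame({
    "strand":  ["Number Sense", "Algebra", "Geometry", "Statistics",
                "Statistics", "Algebra", "Geometry", "Number Sense"],
    "correct": [0.78, 0.66, 0.71, 0.42, 0.47, 0.69, 0.74, 0.81],
})

# Average percent correct per strand, weakest first -- candidates for a
# learner-centered problem.
by_strand = (items.groupby("strand")["correct"].mean().mul(100).round(1)
                  .sort_values())
print(by_strand)
```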
5. Examine Instruction
• Examine a wide range of instructional data
• Get clear about the purpose of observation
• Use protocols to stick to evidence
• Identify a “Problem of Practice”
Boudett, City, & Murnane (2005)
6. Develop an Action Plan
• Brainstorm solutions to the learner-centered problem
• Research and select instructional strategies
• Develop a common vision for implementation
• Agree on what the plan will look like
• Write tasks for implementation
Boudett, City, & Murnane (2005)
7. Plan to Assess Progress
• Set student learning goals
• Identify implementation indicators
• Identify student achievement data (a progress-check sketch follows below)
Boudett, City, & Murnane (2005)
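A progress plan needs a way to tell whether growth toward the student learning goal is on pace, alongside at least one implementation indicator. The sketch below shows a simple straight-line pacing check; the goal numbers, the pacing assumption, and the visit counts are illustrative, not prescribed by Data Wise.

```python
# Sketch of a mid-year progress check, assuming a goal expressed as "raise
# percent proficient from a baseline to a target by year end" plus one
# implementation indicator (classroom visits). All numbers are illustrative.

def on_track(baseline: float, target: float, current: float,
             fraction_of_year_elapsed: float) -> bool:
    """True if growth so far keeps pace with a straight-line path to target."""
    expected = baseline + (target - baseline) * fraction_of_year_elapsed
    return current >= expected

goal = {"baseline": 48.0, "target": 58.0}          # percent proficient
print("Students:", on_track(**goal, current=53.0, fraction_of_year_elapsed=0.5))
print("Students:", on_track(**goal, current=49.0, fraction_of_year_elapsed=0.5))

visits_planned, visits_done = 20, 12               # implementation indicator
print(f"Implementation: {visits_done}/{visits_planned} classroom visits complete")
```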
8. Act and Assess
• Communicate the action plan clearly
• Implement the action plan
• Visit classrooms frequently
• Assess student progress
• Plan PD to meet ongoing needs that emerge
• Promote consistency rather than conformity
• Adjust the action plan
• Celebrate success!
Boudett, City, & Murnane (2005)
DATA WISE IN ACTION
Case: Mount Baker Middle School
Step 1: Organize for Collaborative Work
• Leadership team formed
• Data carousel protocol
• Data selected
• Whole staff meeting
Step 2: Build Assessment Literacy
• Data displayed on charts
• Staff writes statements telling what the data says
• Identify strengths and concerns
• Ask additional questions about the data
Step 3: Create a Data Overview
• Started with global picture
• Narrowed it down
• Focused on flat math scores
Step 4: Dig into Student Data
• Why are math scores low?
• Area of Statistics is weak.
Step 5: Examine Instruction
• Incorporating statistics across the curriculum would support student learning.
Step 6: Develop Action Plan
• Math team collaborated with content teams
• Math mentors served as a resource
• Coaching
• Science, reading, and language arts integration with statistics
Step 7: Plan to Assess Progress
• End-of-year summative assessment
• Anecdotal stories of student learning
Step 8: Act and Assess
• -10 to -5 the first year
• Let’s try it again!
WORKING WITH DATA
The Data Pyramid
Phases of Data Acceptance
When Adopting a Data-Driven Decision Making Model
Challenging the Test
“Question #3 is poorly worded.”
“Answer ‘b’ is a trick answer.”
“The students made silly mistakes.”

Feeling inadequate or distrustful
“How can two questions establish mastery?”
“We don’t teach it in this format.”

Confusion, overload
“This is too much!”
“How can I really use all of this?”

Analytical but surface
“Students do poorly on word problems, so we’ll do more word problems.”
“We need more reading.”

Looking for causes
“The wrong answers show that students can’t tell the difference between a summary and a theme.”

Changing teaching practice and improving student learning
“I need to write lesson plans for re-teaching that differentiate between the different needs of my student groups.”

TIP: Less is more. If teachers take one action on one standard from the data, they begin to see the impact it can make and become bought-in.

Source: “Using Test Score Data to Focus Instruction” by Susan Trimble, Anne Gay, and Jan Matthews, Middle School Journal, March 2005
STEP 3- CREATE DATA OVERVIEW
Using Protocols with Data
• The Comfort Zone is a place where we feel at ease, with no tension; we have a good grip on the topic and know how to navigate occasional rough spots.
• The Risk Zone is the most fertile place for learning. It is where people do not know everything, or sometimes do not know anything at all, but are willing to take some risks.
• Generally it is not a good idea to work from either your own Danger Zone or anyone else’s. That area is so full of defenses, fears, red lights, desire for escape, etc., that it requires too much energy and time to accomplish anything.
National School Reform Faculty
The ORID Protocol
The O questions identify objective facts. The key question is: What do we know about this? What we want are statements starting with terms like “I see…” or “There is evidence for…”. These are documented but not analyzed.

The R questions are about how people feel about the topic. They are about subjective perceptions. The key question is: How do we feel about this? This phase is one of identifying feelings, not of analyzing them.

The I questions have to do with meaning. The key question is: What does it mean for me/you/the organization, etc.? Interpretive questions might include “What if…?”, “What would it mean…?”, “What would that do…?” and so on. This is the analytical phase.

The D questions are the stage at which a decision is produced. The key question is: What are we going to do? Focusing on the future, questions might include “What would be the best course of action?” and “What would be achievable, positive outcomes?”
Whole Group Debrief
• What did you learn by answering these questions collaboratively?
• Would you use these questions with a grade level team or a school improvement team? Why or why not?
• Based on your understanding of the data, what would you identify as your focus area for improvement?
BREAK TIME
STEP 4- DIG INTO STUDENT DATA
Writing the Learner-Centered Problem
Select a problem to focus on, one through which we can make a meaningful difference in student learning for many students.
Rebecca A. Thessin, Ed.D., The George Washington University
Sample Learner-Centered Problems
Elementary
• Students have difficulty drawing text-based inferences when reading.
• Students are unable to read increasingly more sophisticated levels of text.
Middle School
• Students struggle when asked to solve multi-step math problems.
• Students have difficulty isolating key scientific concepts when reading non-fiction text.
High School
• Students have difficulty taking a critical stance, explaining the reasoning behind their stance, and supporting their stance with evidence.
• Students have trouble applying algebraic reasoning skills, particularly on open-ended questions.
Rebecca A. Thessin, Ed.D., The George Washington University
Writing a Learner-Centered Problem
On your template, please draft a
learner-centered problem for the
focus area you selected. Then be
ready to share what you wrote.
Asking Good Questions Helps to Triangulate the Problem
Your goal is to “triangulate” the problem to:
• Define the problem with greater accuracy.
• Move beyond generalizations.
• Avoid making inappropriate inferences from only one source of data (a triangulation sketch follows below).
Trent Kaufman, 2006.
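In practice, triangulation can be as simple as requiring that a candidate learner-centered problem show up in more than one independent data source before the team commits to it. The sketch below illustrates that check; the source names, flags, and two-source minimum are hypothetical assumptions.

```python
# Sketch of "triangulating" a candidate learner-centered problem: only keep it
# if it is corroborated by more than one independent data source. Sources and
# flags here are hypothetical stand-ins for state tests, benchmarks, and
# classroom work samples.

evidence = {
    "state_test":   {"weak_statistics": True,  "weak_vocabulary": False},
    "benchmark":    {"weak_statistics": True,  "weak_vocabulary": True},
    "student_work": {"weak_statistics": True,  "weak_vocabulary": False},
}

def triangulated(problem: str, sources: dict, minimum: int = 2) -> bool:
    """True if at least `minimum` sources independently show the problem."""
    hits = sum(flags.get(problem, False) for flags in sources.values())
    return hits >= minimum

for problem in ("weak_statistics", "weak_vocabulary"):
    label = "triangulated" if triangulated(problem, evidence) else "single-source only"
    print(problem, "->", label)
```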
What Else Do We Need To Know?
Once we have drafted a learner-centered
problem, we need to gather more
information to help us to write an
instructional goal.
With your table, record 5-8 more questions
that you have about students’
performance in this area.
Rebecca A. Thessin, Ed.D., The George Washington University
STEP 5- EXAMINE INSTRUCTION
Writing an Instructional Goal
Instructional goal = a change in teacher practice, designed to improve student learning in a specific way.
Or another way to look at it . . .
An instructional goal is like a “testable hypothesis”: a change in teacher practice that, if made, is likely to lead to improved student outcomes.
Rebecca A. Thessin, Ed.D., The George Washington University
Defining the Problem and Goal
• Learner-Centered Problem: With what are students struggling? This statement might begin with “Students are having difficulty . . .”
• Issue of Teacher Practice: What do teachers want to do better? This statement might begin with “Teachers want to . . .”
• Instructional Goal: What is the instructional practice that teachers will implement to change their instruction? This statement might begin with “Teachers will . . .”
Rebecca A. Thessin, Ed.D., The George Washington University
Sample Problems and Goals
To write an instructional goal, let’s start from a learner-centered problem we can work with for practice:
• Level: Middle
• Scope: Language Arts and Social Studies
• Learner-Centered Problem: Students are having difficulty providing evidence from texts that they read to support a particular point of view.
Rebecca A. Thessin, Ed.D., The George Washington University
Sample Problems and Goals (cont.)
• Issue of Teacher Practice: Teachers want to adopt common teaching strategies and assessment tools to provide students with more opportunities to learn how to pull evidence from texts to support an argument in language arts and social studies classes.
• Instructional Goal: Teachers will engage students in guided persuasive writing activities/tasks that involve selecting evidence and providing a platform for debate in order to develop the habit of substantiating opinion with evidence.
Rebecca A. Thessin, Ed.D., The George Washington University
Draft an Instructional Goal
On your template, please draft:
• an Issue of Teacher Practice
• an Instructional Goal for your learner-centered problem.
Evaluate Your Instructional Goal
The goal your team and your school select to focus on should meet the following criteria:
• It is common to many students.
• If solved, it would meet your larger goals for students.
• It is supported by multiple sources of data and/or research-based practices.
• It is specifically focused on knowledge, skills, and learning dispositions that you want students to have.
• You have an understanding of the reasons behind students’ low performance in this particular area.
Boudett, K., City, E., & Murnane, R. (2006).
Write a SMART Goal
Specific
• Well defined
• Clear to anyone who has a basic knowledge of the project
Measurable
• Know whether the goal is obtainable and how far away completion is
• Know when it has been achieved
Attainable
• Agreement among all the stakeholders about what the goals should be
Results Oriented
• Within the availability of resources, knowledge, and time
Time Bound
• Enough time to achieve the goal
• Not so much time that project performance suffers
Examples of SMART Goals
Students will meet or exceed the district writing expectations as measured by the six-traits writing sample scoring.
SMART GOAL: During the 2006-07 school year, the number of first- through fifth-grade regular education students at Sample School improving their writing skills in targeted traits will increase 5% at each grade level (see chart below) as measured by the Six-Traits scoring rubric on monthly grade-level assessments.

95% of all 11th graders receiving f/r lunch will score above the 40th NPR on ITEDs (an increase from 80%).
During the 2006-07 school year, the number of proficient 11th-grade students in the f/r subgroup (as indicated by the ITED math computation) at Sample School will increase by 15% as measured by the ITED math computation subtest.
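The measurable piece of the first SMART goal above, a 5% increase at each grade level, lends itself to a simple check. The sketch below computes grade-by-grade growth against that target; the baseline and current counts are illustrative placeholders, not Sample School data.

```python
# Sketch of checking the measurable part of the first SMART goal above: did
# each grade level (1-5) increase the number of students improving in targeted
# traits by at least 5%? Baseline/current counts are illustrative.

TARGET_INCREASE = 0.05  # 5% at each grade level, per the goal

baseline = {1: 60, 2: 58, 3: 64, 4: 55, 5: 61}   # students improving (prior year)
current  = {1: 65, 2: 60, 3: 66, 4: 56, 5: 67}   # students improving (this year)

for grade in sorted(baseline):
    growth = (current[grade] - baseline[grade]) / baseline[grade]
    status = "met" if growth >= TARGET_INCREASE else "not met"
    print(f"Grade {grade}: {growth:+.1%} -> {status}")
```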
STEP 6- DEVELOP ACTION PLAN
How Do We Plan the Change?
Once you have examined the data and drafted an instructional goal, you
will select strategies to use in the classroom to help you to reach your goal
and determine the supports needed for full implementation.
Discuss:
How would you determine what new strategies to use to reach the
goal your school or team establishes?
What types of support might teachers need?
STEP 7- PLAN TO ASSESS PROGRESS
Plan to Assess Progress
We won’t know if this change will lead to
improved student outcomes until we actually
implement the change!
How will the school leader, coach, or teacher
leader know which improvement efforts are
yielding positive outcomes?
Gather More Data!
Cause and Effect Data
Cause Data: information based on actions of the adults in the system
Effect Data: student achievement results from various measurements
Reeves, D. Standards Alignment: Tools and Processes.
Leadership and Learning Matrix (a cause/effect pairing sketch follows below)
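Cause and effect data are most informative when read together, which is the idea behind the Leadership and Learning Matrix cited above. The sketch below pairs one cause measure with one effect measure and names the four combinations in plain language; the thresholds, measures, and wording are illustrative assumptions rather than Reeves' own labels.

```python
# Sketch of pairing cause data (adult actions) with effect data (student
# results), in the spirit of the Leadership and Learning Matrix. Thresholds,
# measures, and quadrant wording are illustrative assumptions.

def quadrant(cause_pct: float, effect_pct: float,
             cause_cut: float = 80.0, effect_cut: float = 70.0) -> str:
    high_cause = cause_pct >= cause_cut     # e.g., % of lessons using the strategy
    high_effect = effect_pct >= effect_cut  # e.g., % proficient on the benchmark
    if high_cause and high_effect:
        return "high implementation, high results: sustain and share"
    if high_cause and not high_effect:
        return "high implementation, low results: re-examine the strategy"
    if not high_cause and high_effect:
        return "low implementation, high results: results may not be attributable"
    return "low implementation, low results: restart implementation support"

print(quadrant(cause_pct=85.0, effect_pct=62.0))
```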
STEP 8- ACT AND ASSESS
Act and Assess
Using Cause and Effect Data
Assessing Cause Data
• Walkthroughs
• Observations
• Focus Walks
• Instructional Rounds
• Peer Observations

Assessing Effect Data
• Examining student work
• Common formative assessments
• Benchmark tests
• Universal screener

Return to the Beginning of the Process
ALIGNMENT TO WISE TOOL
Data Wise = The WISE Plan
NEXT STEPS
• All principals attend Data Wise training and receive training materials.
• Principals select appropriate Data Wise training materials to familiarize staffs with the 8-step model.
• Staffs engage in analysis of data to identify root cause(s) of deficiencies.
• Decision point: Can the root cause be addressed by staff? (YES/NO)
• Staff conducts research to determine exemplary practices for the identified deficiency.
• PLCs are formed to support implementation efforts of identified exemplary practices.
• Each PLC completes the PDSA PLC planning form and begins implementation efforts.
• PLCs study implementation results and measure progress toward goals. Standardize or refine practices.
Planning PLCs
EVALUATIONS