Unified Improvement Planning:
Implementation and Progress Monitoring
Hosted by: Colorado Department of Education
Provided by: Center for Transforming Learning and Teaching
Introductions
Center for Transforming Learning and Teaching: Julie Oxenford-O’Brian
Colorado Department of Education: Judy Huddleston, Christina Larson, Lisa Medler
Session Purpose
Ensure planning teams are
prepared to monitor the
progress of the implementation
of their unified improvement
plan.
Introductions
Share:
– Name, Job Title, School/District
– Your role in supporting unified improvement
planning within the district
– Your most burning question about monitoring
the progress of UIP implementation.
Write your question on a sticky note.
Materials
The materials used during this session were developed in partnership with the Center for Transforming Learning and Teaching in the School of Education and Human Development at the University of Colorado Denver.
Capturing Notes Today
• Plan for completing Progress Monitoring using the Planning Progress Monitoring note catcher (Toolkit, p. 1).
• Capture notes for the UIP Target Setting Form and UIP Action Planning Form
– Use your partially completed version
– Blank versions available in the Progress Monitoring Toolkit (p. 5 and p. 9)
Norms
The standards of behavior by
which we agree to operate
while we are engaged in
learning together.
Session Outcomes
Engage in hands-on learning activities and dialogue with colleagues. Access additional resources. Complete follow-up activities.
• Understand the statutory and regulatory requirements for monitoring the progress of UIPs.
• Identify and fully describe at least one interim measure that will be used to interpret progress made towards annual performance targets.
• Develop implementation benchmarks that will be used to determine the level of implementation of each action step associated with at least one major improvement strategy.
• Plan for how remaining interim measures and implementation benchmarks will be identified.
• Develop a calendar for when progress monitoring will occur, what data will be reviewed, and who will participate.
• Plan for engaging in a collaborative data-driven inquiry process as part of regular progress monitoring.
Agenda
• The Role of Progress Monitoring
• Data-Driven Collaborative Inquiry
• Implementation Benchmarks
• Interim Measures
• Planning Progress Monitoring
Unified Improvement Planning Processes
• Preparing to Plan: Gather and Organize Data; Review Performance Summary
• Section III (Data Narrative): Describe Notable Trends; Prioritize Performance Challenges; Identify Root Causes
• Section IV (Target Setting): Set Performance Targets; Identify Interim Measures
• Section IV (Action Planning): Identify Major Improvement Strategies; Identify Implementation Benchmarks
• Ongoing: Progress Monitoring
Colorado Unified Planning Template
Major Sections:
I. Summary Information about the School or District
II. Improvement Plan Information
III. Narrative on Data Analysis and Root Cause Identification
IV. Action Plan(s)
Section I: Summary Information about the School/District
Section II: Improvement Plan Information
 Additional Information about the School/District
 School/District and Contact Information
Section III: Data Narrative
 Description of Process for Data Analysis
 Review Current Performance
 Trend Analysis
 Priority Performance Challenges
 Root Causes
 Data Worksheet: Notable Trends (2 years); Priority Performance Challenges; Root Causes
Section IV: Action Plan(s)
 Progress Monitoring of Prior Year’s Targets
 School Target Setting Form: Priority Performance Challenges; Annual Performance Targets; Interim Measures; Major Improvement Strategies
 Action Planning Form: Major Improvement Strategies; Associated Root Causes; Accountability Provision; Action Steps; Timeline; Key People; Resources; Implementation Benchmarks; Status of Action Steps
Progress Monitoring Terminology
• Consider the UIP Handbook Excerpt: Planning Terminology (Toolkit, p. 13-14).
• Mark each term using the following legend:
– ? I have questions about this term.
– √ I’ve got it.
– + I could explain this term to someone else.
• Answer these questions:
– What is the difference between an interim measure and an
implementation benchmark?
– What is the difference between a measure and a metric?
Where and how is Progress Monitoring
represented in the UIP?
• Consider: UIP Handbook Excerpts (Toolkit, p. 11) and
UIP Quality Criteria Excerpts (Toolkit, p. 15)
– Interim Measures
– Implementation Benchmarks
• Answer the following questions:
– Where in the UIP template will interim measures be captured? What
information should be provided? How many should be included?
– Where in the UIP template will implementation benchmarks be
captured? What information should be provided?
– How should progress monitoring be included in action steps?
Statutory and Regulatory Requirements
Consider the District Accountability Handbook Excerpts (Toolkit, p. 19) to answer the following questions:
– What is the School Accountability Committee (SAC)
role in monitoring the progress of the school’s
implementation of their UIP?
– How frequently must the SAC be involved in progress
monitoring?
– What is the District Accountability Committee role in
monitoring the progress of the district’s
implementation of their UIP?
Data Analysis: UIP Development
vs. Progress Monitoring
• Turn to Annual UIP Development vs. Progress Monitoring (Toolkit, p. 53)
• Work with a partner. Use a flip chart and markers.
• Develop a double-bubble map to describe how Annual UIP Development and Progress Monitoring are the same and how they are different.
• What would you add/change in this table?
UIP Development vs. Progress Monitoring
[Double-bubble map with two hubs: UIP Development and UIP Progress Monitoring]
Agenda
• The Role of Progress Monitoring
• Data-Driven Collaborative Inquiry
• Implementation Benchmarks
• Interim Measures
• Planning Progress Monitoring
Performance Targets and
Interim Measures
• What is the relationship between performance
targets and interim measures?
• With a partner, share a one-sentence
description of how they relate to one another.
• Interim measures should provide data several
times during the school year about the degree to
which progress is being made towards each
performance target.
Effective Feedback
Effective feedback is clear, descriptive, and criterion-based, and indicates:
√ how their response differed from that reflected in the UIP quality criteria, and
√ how they can move forward (what they might do next).
Provide Feedback about
Performance Targets
• Choose a partner group/table. Exchange Target Setting
Forms (with performance targets identified).
• As you review their targets, consider the following
questions:
– To which students do the targets pertain? On what content is the
target focused? For what metric is the target set?
– To what degree do the targets meet the relevant quality criteria?
– How could they improve their annual performance targets?
• Provide feedback to your partner group/table.
Responding to Feedback
• Consider the feedback you received
– How will you respond to the feedback you
received? What will you do next to
incorporate this feedback into your annual
performance targets?
– How did it go providing feedback about
another district’s plan? What did you learn?
• Large-group share out
Currently Available
Performance Data
• Take out your Inventory of Performance
Data Sources.
• Determine which of the identified resources could be used as “interim measures”:
– Administered more than once a year.
– Provide data related to your performance targets.
Describing Interim Measures
• Data Source (assessment name)
• Purpose
• Related Performance Target
• Content Focus
• Which Students
• When Available
• Metrics
• Comparison Point(s)
• Questions
Consider the Describing Interim Measures: Legend (Toolkit, p. 29). Ensure you can explain what information should be captured in each row of this table.
Purposes of Interim Assessment
• Consider excerpts from Benchmark Assessment
for Improved Learning (Toolkit, p. 23)
• Work with a partner to answer these questions:
– What are different purposes of benchmark (or interim)
assessments?
– What best describes our purpose(s) for the
benchmark (interim) assessments we currently
administer?
– What purpose(s) are we serving when we use interim
assessments as part of our school/district
improvement efforts?
How do Performance Targets Shape Interim Measures?
Interim measures should:
• Provide data several times during the school
year about the degree to which progress is
being made towards each performance target.
• Provide data about the same students as the
performance target
• Provide data about the same content focus.
Describing Interim Measures
For at least one performance target, use the Describing Interim Measures worksheet (Toolkit, p. 29) and identify:
• The name of the interim measure.
• The purpose(s) of administering the interim measure.
• The performance target (including on which students and on what
content the target is focused) about which this interim measure will
provide data.
• The content focus (or foci) for interim measure analysis.
• Student group(s) results that will be the focus of the analysis.
• When data will be available (be specific).
Metrics (Levels)
• Two levels of metrics (interim assessment results):
– Individual
– Group
• Examples of groups:
– All students in the school
– All students in a grade level
– Students in a disaggregated group
– Students in a classroom
• Group metrics are aggregates of individual metrics.
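To make that last point concrete, here is a minimal sketch (illustrative only; the ratings and level names are made up) of how individual performance ratings roll up into one kind of group metric, the percent of students at each performance level:

```python
from collections import Counter

# Hypothetical individual metrics: one performance rating per student
ratings = [
    "advanced", "proficient", "proficient",
    "partially proficient", "proficient", "unsatisfactory",
]

# Group metric: percent of students scoring at each performance level
counts = Counter(ratings)
group_metric = {level: 100 * n / len(ratings) for level, n in counts.items()}

for level, pct in group_metric.items():
    print(f"{level}: {pct:.1f}%")
```

The same pattern applies to the other aggregates shown in the table that follows, such as an average scale score for a grade level.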
Levels and Performance Metrics
Individual level:
• Classroom (formal)/individual: individual performance rating, student growth percentile, scale score.
Group level:
• Aggregate school/district overall or grade-level: %/number scoring at each performance level; MGP and AGP; average scale score.
• Standard/strand (grade-level): %/number scoring proficient or better by standard.
• Disaggregated group: %/number (within group) scoring at each performance level; MGP and AGP (overall and by grade-level).
Individual Metrics/Scores
• Raw Scores
– Number correct
– Percent correct
– Number of points earned (may incorporate item difficulty)
• Standardized scores (e.g., T-scores)
• Scaled scores (often incorporate item difficulty via IRT)
• Norm Referenced
– Percentile Ranks
– Grade Equivalent
• Criterion Referenced
– Performance Category Ratings
– Lexile Rating, Estimated Oral Reading Fluency, etc.
Comparison Points
• What is a comparison point? (Consider definition
in UIP Handbook Excerpt, Toolkit, p. 13)
• Comparison points can be norm or criterion
referenced.
• Norm referenced scores often embed the
comparison point in the score itself.
• Each interim assessment metric that is used to evaluate progress towards performance targets needs an associated comparison point.
• What are examples of comparison points?
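As a hedged illustration of that requirement (the metric names, values, and thresholds below are invented, not CDE criteria), pairing each metric with its comparison point can be as simple as:

```python
# Hypothetical interim assessment metrics paired with comparison points
# ("how good is good enough" thresholds); all values are invented.
metrics = {
    "percent scoring proficient or better (grade 4 reading)": (58.0, 65.0),
    "median growth percentile (grade 4 reading)": (52.0, 50.0),
}

for name, (observed, comparison_point) in metrics.items():
    status = "on track" if observed >= comparison_point else "below comparison point"
    print(f"{name}: {observed} vs. {comparison_point} -> {status}")
```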
Metrics and Comparison Points for Assessments Most Used by Colorado Districts
• Consider Typical Interim Assessments: Metrics and Comparison Points (Green Sheets on your table)
• Job aid for identifying metrics and comparison points for interim and early literacy assessments most frequently used by Colorado districts.
• Are any of these “typical” interim assessments
being used in your school/district?
Identifying Metrics and Comparison
Points for Interim Assessments
• Consider Report Examples
• Review several reports.
• Use Identifying Metrics and Comparison Points
(Toolkit, p. 31)
• For each report reviewed, identify:
– What metrics are available on the report?
– To what could performance on each metric be
compared (comparison points)? How good is good
enough?
Identify Your Metrics and Comparison Points
• Use the Describing Interim Measures worksheet (Toolkit, p. 29).
• For at least one interim measure, identify what you will use to track and evaluate progress toward the associated performance target:
– Metrics
– Comparison points
• If you know them, identify which reports will be used to analyze the identified metric(s) and comparison points for your performance target group and content focus.
Questions
Different metrics make it possible to answer different
questions. What metrics would help answer the following
questions:
• How many students in the school are likely to score proficient or
better in mathematics within the next three years?
• Are most of our students meeting minimum expectations in reading?
• How does student achievement in writing compare across the grades in the school?
• Are there differences in the rates of growth in math between ELL
and non-ELL students in the school?
Questions to Guide Analysis
• Work with a partner to identify what
questions can be answered with the
metrics and comparison points you have
identified on the Describing Interim
Measures worksheet.
• Capture those questions in the final row of that worksheet for at least one interim measure.
Planning for Progress Monitoring
• How will remaining interim measures be
identified and fully described?
• Take out: Planning for Progress Monitoring
(Toolkit, p. 1)
• Make notes about:
– Current status
– How you will complete this task
– Who will complete it? When?
Agenda
• The Role of Progress Monitoring
• Data-Driven Collaborative Inquiry
• Implementation Benchmarks
• Interim Measures
• Planning Progress Monitoring
Role of Implementation
Benchmarks
Consider UIP Handbook Excerpts (Toolkit, p. 11)
to answer the following questions:
– What are implementation benchmarks?
– What would not be an example of an
implementation benchmark?
– Where are implementation benchmarks
captured in the UIP Template?
Different Types of
Implementation Benchmarks
• Two basic “types”:
– Outputs (e.g., professional development sessions held)
– Adult outcomes (e.g., a new instructional strategy implemented)
• Can be different “types” of data too:
– Qualitative
– Quantitative
Logic Model
A Logic Model can be used to describe your theory of:
• How major improvement strategies will result in
improved student performance.
• The outputs and adult outcomes that will result from
each action step.
• The outputs and adult outcomes that will result in desired
changes in student outcomes (performance).
• What implementation benchmarks should provide information about.
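Purely as an illustration of the structure (the class and field names below are hypothetical, not part of the UIP template), a logic model chains each action step to the outputs, adult outcomes, and student outcomes it is expected to produce:

```python
from dataclasses import dataclass, field

@dataclass
class ActionStep:
    """One action step in a major improvement strategy (hypothetical structure)."""
    description: str
    outputs: list[str] = field(default_factory=list)         # e.g., sessions held
    adult_outcomes: list[str] = field(default_factory=list)  # e.g., practice changes
    student_outcomes: list[str] = field(default_factory=list)

# Invented example: the evidence chain behind one action step
step = ActionStep(
    description="Provide monthly PD on academic vocabulary instruction",
    outputs=["nine PD sessions held", "attendance logs collected"],
    adult_outcomes=["vocabulary routines observed in most classrooms"],
    student_outcomes=["growth on vocabulary strand of interim assessment"],
)

# Candidate implementation benchmarks come from outputs and adult outcomes
print(step.outputs + step.adult_outcomes)
```

Each item in outputs and adult_outcomes is a candidate implementation benchmark; evidence about student_outcomes comes from interim measures, not implementation benchmarks.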
Example Logic Model
Consider the Example Logic Model (Toolkit, p. 33) Work
with a partner to answer the following questions:
• What outputs are expected for different action steps?
• What adult outcomes are expected?
• What student outcomes are expected?
• What evidence would you collect to determine if the expected
outputs and adult outcomes are being met (implementation
benchmarks)?
Be prepared to share some suggested implementation
benchmarks for this major improvement strategy.
Create a Logic Model
• Select at least one major improvement strategy
on which to focus.
• Use the Logic Model Template (Toolkit, p. 35) to
describe the logic model behind your Major
Improvement Strategy.
• Include:
– Action Steps
– Associated Outputs and Adult Outcomes
– If time allows, student outcomes
Identifying Implementation Benchmarks
• Consider your Logic Model and your Inventory of Data
Sources other than Performance Data.
• Use the Identifying Implementation Benchmarks
worksheet (Toolkit, p. 37).
• Capture:
– Action Steps
– Associated Outputs and Adult Outcomes
• Identify Implementation Benchmarks associated with
each Output and Adult Outcome. What will count as
evidence?
Planning for Progress Monitoring
• How will remaining implementation benchmarks
be identified and fully described?
• Take out: Planning for Progress Monitoring
(Toolkit, p. 1)
• Make notes about:
– Current Status
– How this will be completed
– Who will complete it and when
Agenda
• The Role of Progress Monitoring
• Data-Driven Collaborative Inquiry
• Implementation Benchmarks
• Interim Measures
• Planning Progress Monitoring
Institutionalizing Progress Monitoring
• How can progress monitoring become key to
how we do business?
• Consider the following resources:
– Case study of West Denver Prep: Using Data to Drive
Instruction and Improve Achievement. (Toolkit, p. 39)
– Sample Planning Calendar for Developing and Revising UIPs (Toolkit, p. 47).
• Institutionalizing progress monitoring includes building it into the regular schedule of the school/district.
Develop a Progress Monitoring
Calendar
• Take out the Progress Monitoring Calendar template (Toolkit, p. 49)
• With your team, identify the progress monitoring activities that will occur between December 2012 and February 2013.
• Consider:
– Could this calendar template meet our needs?
– What might we change about this basic template?
Preparing Data for Progress Monitoring
• How will your team track:
– the changes in student performance at different points
during the school year, and
– progress made towards implementing your action
steps?
• Consider Example Tracking Tools (packet)
– What components do you like in these tools?
– What components would you change about these
tools?
– Could you adapt one of these tools for your use?
Planning for Progress Monitoring
• Turn to Planning for Progress Monitoring
(Toolkit, p. 2)
• Make notes about how you will:
– Develop a calendar for progress monitoring
sessions.
– Determine how data will be tracked throughout the year.
– Build capacity to engage in data-driven
collaborative inquiry.
Preparing for Progress Monitoring
Sessions
• Checklist for Preparing for Progress Monitoring Sessions
(Toolkit, p. 3)
• Tasks include:
– Determine what data will be reviewed.
– Generate appropriate interim assessment reports.
– Organize and prepare implementation benchmark data.
– Identify guiding questions.
– Schedule time.
– Determine participants.
• How could you use this checklist?
Give us Feedback!!
• Written: Use sticky notes
– + The aspects of this session that you liked or that worked for you.
– The things you will change in your practice or that you would change about this session.
– ? Questions that you still have or things we didn’t get to today.
– Ideas, ah-has, innovations.
• Oral: Share out one ah-ha!