Data-Driven Instruction Comprehensive
Leadership Workshop
Paul Bambrick-Santoyo
P1
NY State Public School 4th Grade ELA Performance vs. Free/Reduced Lunch Rates
[Scatter plot: Pct. Proficient (0-100%) vs. Pct. Free-Reduced Lunch (0-100%)]
P2
NY State Public School 4th Grade ELA Performance vs. Free/Reduced Lunch Rates
[Scatter plot, repeated: Pct. Proficient (0-100%) vs. Pct. Free-Reduced Lunch (0-100%)]
P3
Case Study: Springsteen Charter School, Part 1
• What did Jones do well in his attempt to improve
mathematics achievement?
• What went wrong in his attempt to do data-driven decision
making?
• As the principal at Springsteen, what would be your FIRST
STEPS in the upcoming year to respond to this situation?
P4
Man on Fire:
• What were the key moments in Creasy’s attempt to help the
girl (Pita)?
• What made Creasy’s analysis effective?
P5
ASSESSMENT ANALYSIS I
PART 1—GLOBAL IMPRESSIONS:
Global conclusions you can draw from the data:
• How well did the class do as a whole?
• What are the strengths and weaknesses in the standards:
where do we need to work the most?
• How did the class do on old vs. new standards? Are they
forgetting or improving on old material?
• How were the results in the different question types
(multiple choice vs. open-ended, reading vs. writing)?
• Who are the strong/weak students?
P6
ASSESSMENT ANALYSIS II
PART 2—DIG IN:
• “Squint” at bombed questions: did students all choose the same
wrong answer? Why or why not?
• Compare similar standards: Do results in one influence the
other?
• Break down each standard: Did they do similarly on every
question or were some questions harder? Why?
• Sort data by students’ scores: Are there questions that
separate proficient / non-proficient students?
• Look horizontally by student: Are there any anomalies
occurring with certain students? (A hypothetical spreadsheet sketch follows.)
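To make the “sort” and “look horizontally” steps concrete, here is a minimal, hypothetical Python/pandas sketch of an item-level results table (the student names and question columns are invented for illustration; this is not a template from the workshop materials):

    # Hypothetical results matrix: one row per student, one column per question
    # (1 = correct, 0 = incorrect).
    import pandas as pd

    results = pd.DataFrame(
        {"Q1": [1, 1, 0, 1], "Q2": [0, 1, 0, 0], "Q3": [1, 1, 1, 1]},
        index=["Ana", "Ben", "Cal", "Dia"],
    )

    # Sort data by students' scores: do certain questions separate
    # proficient from non-proficient students?
    totals = results.sum(axis=1).sort_values(ascending=False)

    # "Squint" at bombed questions: items most students missed sort first.
    hardest_first = results.mean(axis=0).sort_values()

    # Look horizontally by student: scan one row for anomalies
    # (e.g., a strong student missing an easy item).
    print(totals, hardest_first, results.loc["Ana"], sep="\n\n")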
P7
Teacher-Principal Role Play
ROLE-PLAY ANALYSIS:
• What did you learn about the teachers?
• How did the interim assessment and analysis template
change the dynamic of a normal teacher/principal
conversation?
• By using this particular assessment and analysis
template, what decisions did the principal make about
what was important for student learning at his or her
school?
P8
Teacher-Principal Role Play
META-ANALYSIS:
• What are the strengths and limitations of this approach to
data-driven decision making?
• What structures are needed to allow such a process to
happen?
P9
Videos of Teacher-Principal Conference
Videotaped 2005-06
P10
Impact of Data-Driven Decision Making
North Star Academy
State Test & TerraNova Results 2003-2008
P11
Comparison of 02-03 to 03-04:
How one teacher improved
5th Grade 2002-2003 -- Percentage at or above national avg (TerraNova, N=43 students)

            2002 5th Grade Pre-Test    2003 5th Grade    CHANGE
Reading     36.6%                      40.5%             +3.9
Language    34.1%                      40.5%             +6.3

5th Grade 2003-2004 -- Percentage at or above national avg (TerraNova, N=42 students)

            2003 5th Grade Pre-Test    2004 5th Grade    CHANGE
Reading     31.0%                      52.4%             +21.4
Language    21.4%                      47.6%             +26.2
P12
Comparison of 02-03 to 03-04:
How a second teacher improved
6th Grade 2002-2003 -- Percentage at or above grade level (TerraNova, N=43 students)

            2002 6th Grade Pre-Test    2003 6th Grade    CHANGE
Reading     53.7%                      29.3%             -24.4
Language    51.2%                      48.8%             -2.4

6th Grade 2003-2004 -- Percentage at or above grade level (TerraNova, N=42 students)

            2003 5th Grade             2004 6th Grade    CHANGE
Reading     40.5%                      44.2%             +3.7
Language    40.5%                      79.1%             +38.6
P13
North Star Academy:
NJ State Test Results
2009
P14
NJASK 8—DOWNTOWN MS LITERACY
P15
NJASK 8—DOWNTOWN MS MATH
P16
North Star Middle Schools: Setting the Standard
P17
North Star Elementary: Exploding Expectations
2008-09 TerraNova Exam:
Kindergarteners’ Median National Percentile

            K Pre-test    Kindergarten
Reading     42.6%         95.3%
Language    29.3%         96.7%
Math        27.5%         97.4%
P18
HIGH SCHOOL HSPA—ENGLISH
Comparative Data from 2008 HSPA Exam
P19
HIGH SCHOOL HSPA—MATH
Comparative Data from 2008 HSPA Exam
P20
NEW JERSEY HSPA—ENGLISH PROFICIENCY
P21
NEW JERSEY HSPA—MATH PROFICIENCY
P22
Day 1 Conclusions
Data-Driven Instruction & Assessment
Paul Bambrick-Santoyo
P23
Day 2
Data-Driven Instruction & Assessment
Paul Bambrick-Santoyo
P24
Dodge Academy: Turnaround Through Transparency
P25
Ft. Worthington: Turnaround Through Transparency
P26
Monarch Academy: Vision and Practice
P27
P28
Quick-Write Reflection
• From what you know right now, what are the most
important things you would need to launch a data-driven
instructional model in your school?
P29
THE FOUR KEYS:
DATA-DRIVEN INSTRUCTION AT ITS ESSENCE:
ASSESSMENTS
ANALYSIS
ACTION
in a Data-driven CULTURE
P30
1. 50% of 20:
2. 67% of 81:
3. Shawn got 7 correct answers out of 10 possible answers on his science test. What percent of questions did he get correct?
4. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. What percentage of free throws did he make?
5. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. In the first tournament game, Redick missed his first five free throws. How far did his percentage drop from before the tournament game to right after missing those free throws?
6. J.J. Redick and Chris Paul were competing for the best free-throw shooting percentage. Redick made 94% of his first 103 shots, while Paul made 47 out of 51 shots.
• Which one had a better shooting percentage?
• In the next game, Redick made only 2 of 10 shots while Paul made 7 of 10 shots. What are their new overall shooting percentages? Who is the better shooter?
• Jason argued that if Paul and J.J. each made the next ten shots, their shooting percentages would go up the same amount. Is this true? Why or why not? (An arithmetic sketch follows.)
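For facilitators who want a quick answer-key check, here is a minimal Python sketch of the arithmetic behind problems 4-6 (one assumption: in problem 6, 94% of 103 attempts is taken to mean 97 makes, since 0.94 × 103 ≈ 96.8):

    # Free-throw percentage arithmetic for problems 4-6 (answer-key check).
    def pct(made, attempts):
        return 100.0 * made / attempts

    # Problem 4: 97 of 104 before the tournament -> about 93.3%.
    before = pct(97, 104)

    # Problem 5: five straight misses add 5 attempts and 0 makes -> about 89.0%.
    after = pct(97, 104 + 5)
    print(f"Drop: {before - after:.1f} points")          # about 4.3 points

    # Problem 6: assume 97 makes on 103 attempts for Redick.
    print(pct(97, 103), pct(47, 51))                     # ~94.2% vs ~92.2%: Redick leads
    print(pct(97 + 2, 103 + 10), pct(47 + 7, 51 + 10))   # ~87.6% vs ~88.5%: Paul now leads

    # Jason's claim: would ten straight makes raise both by the same amount?
    print(pct(109, 123) - pct(99, 113))   # Redick gains ~1.0 point
    print(pct(64, 71) - pct(54, 61))      # Paul gains ~1.6 points: not equal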
P31
ASSESSMENT BIG IDEAS:
Standards (and objectives) are meaningless until
you define how to assess them.
Because of this, assessments are the starting point
for instruction, not the end.
P32
ASSESSMENTS:
LITTLE RED RIDING HOOD:
1. What is the main idea?
2. This story is mostly about:
A. Two boys fighting
B. A girl playing in the woods
C. Little Red Riding Hood’s adventures with a wolf
D. A wolf in the forest
3. This story is mostly about:
A. Little Red Riding Hood’s journey through the woods
B. The pain of losing your grandmother
C. Everything is not always what it seems
D. Fear of wolves
P33
ASSESSMENTS:
Subject-Verb Agreement
• He _____________ (run) to the store.
• Michael _____________ (be) happy yesterday at the party.
• Find the subject-verb agreement mistake in this sentence:
• Find the grammar mistake in this sentence:
• Find the six grammar and/or punctuation mistakes in this paragraph:
P34
ASSESSMENT BIG IDEAS:
In an open-ended question, the rubric
defines the rigor.
In a multiple choice question, the options
define the rigor.
P35
ASSESSMENTS:
1. Solve the following quadratic equation:
   x² + x - 6 = 0
2. Given the following rectangle with the side lengths shown below, find the value of x:
   [rectangle with sides x and x + 1; Area = 6]
(A worked check of the algebra follows.)
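The two items assess the same algebra at different levels of rigor: the rectangle in question 2 leads back to the equation in question 1. A minimal Python check of the shared equation (a sketch, not part of the workshop materials):

    # Both items reduce to x^2 + x - 6 = 0, i.e., (x + 3)(x - 2) = 0.
    import math

    a, b, c = 1, 1, -6
    disc = math.sqrt(b * b - 4 * a * c)                    # sqrt(25) = 5
    roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
    print(roots)                                           # (2.0, -3.0)

    # Question 2: a rectangle with sides x and (x + 1) and area 6 yields the
    # same equation, but a side length must be positive, so x = 2 (sides 2 and 3).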
P36
ASSESSMENTS:
PRINCIPLES FOR EFFECTIVE ASSESSMENTS:
COMMON INTERIM:
• At least quarterly
• Common across all teachers of the same grade level
DEFINE THE STANDARDS—ALIGNED:
• To the state test (format, content & length)
• To the instructional sequence (curriculum)
• To college-ready expectations
P37
ASSESSMENTS:
PRINCIPLES FOR EFFECTIVE ASSESSMENTS:
REASSESSES:
• Standards that appear on the first interim assessment
appear again on subsequent interim assessments
WRONG ANSWERS:
• Illuminate misunderstanding
TRANSPARENT:
• Teachers see the assessments in advance
P38
THE FOUR KEYS:
DATA-DRIVEN INSTRUCTION AT ITS ESSENCE:
ASSESSMENTS
(Interim, Aligned, Reassess, Transparent)
ANALYSIS
ACTION
in a Data-driven CULTURE
P40
ASSESSMENTS: Reading Decisions
LEVELED VS. SKILLS:
• Will your interim assessment be developed around reading levels
or reading skills?
P41
Leveled Assessment Debate
Grade-Level Assessments
PROS:
• Predict results on external assessments
• Measure student achievement against grade-level standard
• Ensure school maintains high standards and the expectation that all students will reach grade level
CONS:
• If a student is significantly behind in level, offers little information to inform instruction
• Difficult to see incremental (monthly or quarterly) reading gains
• Because text is often inaccessible to students, little data can be gathered on strengths and weaknesses by standard
• Demoralizing for students to constantly fail

Leveled Reading Assessments
PROS:
• Shows growth along the leveled-text continuum—possible to see monthly gains toward grade-level standard
• Because the text is at an accessible level, gives data on individual reading standards
• Motivates students and engenders student ownership of the learning process
• Confirms student reading levels for teachers
• Assessment levels correspond to book levels
CONS:
• Does not predict results on external assessments
• If not supplemented by grade-level assessments, could lower standards and expectations for the school
P42
ASSESSMENTS: Writing
• RUBRIC: Take a good one, tweak it, and stick with it
• ANCHOR PAPERS: Write/acquire model papers for Proficient
and Advanced Proficient that will be published throughout the
school & used by teachers
• GRADING CONSENSUS: Grade MANY student papers together
to build consensus around expectations with the rubric
• DRAFT WRITING VS. ONE-TIME DEAL: Have a balance
P43
ASSESSMENTS: High School
• HIGH SCHOOL PROFICIENCY VS. COLLEGE READINESS:
Preparing for HS state test and ACT/SAT/AP/college-level work
• SOLID SHORT PAPERS VS. RESEARCH PAPER
• MATH: Textbook vs. Application vs. Conceptual understanding
P44
ASSESSMENT ANALYSIS: Exercise
• TASK: Compare State assessment with interim assessment
• USE ASSESSMENT ANALYSIS SHEET TO ANSWER:
• Are they aligned in CONTENT? What is the interim
assessment missing?
• Are they aligned in FORMAT/LENGTH?
• Are they COLLEGE-READY expectations?
P45
Case Study: Douglass Street School
1. Did Krista Brown meet the challenge of 15-point gains? What
percentage of teachers do you think made the gains? Which
teachers did not? Why?
2. Based on your answers, name the biggest stumbling blocks to
the school’s success.
3. Based on your answers, name the most important drivers of
school improvement.
P46
TRADITIONAL SYSTEMS: Principal-centered
HOW TO EVALUATE TEACHER EFFECTIVENESS:
1. How dynamic the lesson appeared to be
2. How well you control the kids
3. How good the curriculum guide / scope & sequence are
(“well-intended fiction”—Jacobs)
4. “What is the real curriculum?” “The textbook.”
5. What the teacher teaches and how “good” their pedagogical
choice was
P47
DATA-DRIVEN CULTURE:
• VISION: Established by leaders and repeated relentlessly
• TRAINED LEADERSHIP TEAM: “real” leaders and formal
leaders involved in process
• CALENDAR: Set the calendar in advance, with built-in time for
assessments, analysis & action
• PROFESSIONAL DEVELOPMENT: Aligned
P48
THE FOUR KEYS:
ASSESSMENTS
(Aligned, Interim, Reassess, Transparent)
ANALYSIS
ACTION
in a Data-driven CULTURE
(Vision, Leadership, Calendar, PD)
P49
Analysis, Revisited
Moving from the “What” to the “Why”
P50
Man on Fire:
• What made Creasy’s analysis effective?
• After a solid analysis, what made Creasy’s action plan
effective?
P51
ANALYSIS:
• IMMEDIATE: Ideally a 48-hour turnaround; one week at most
• BOTTOM LINE: Includes analysis at question level, standards
level and overall—how well did the students do as a whole
• TEST-IN-HAND analysis: Teacher & instructional leader
together
• TEACHER-OWNED analysis
• DEEP: Moves beyond “what” to “why”
P52
THE FOUR KEYS:
ASSESSMENTS
(Aligned, Interim, Reassess, Transparent)
ANALYSIS
(Quick, Bottom line, Teacher-owned, Test-in-hand, Deep)
ACTION
in a Data-driven CULTURE
(Vision, Leadership, Calendar, PD)
P53
Running Effective Analysis Meetings
P54
PRECURSORS TO EFFECTIVE ANALYSIS MTGS
• Did teachers see the assessment in advance?
(TRANSPARENCY)
• Did they mark it up: Confident, Not Sure, No Way?
(TEST-IN-HAND, TEACHER-OWNED)
• Did you train teachers in analysis strategies?
(PROF DEVT, DEEP)
• Did they fill out an analysis sheet? Did they answer the
fundamental question of WHY the students did not learn it?
(TEACHER-OWNED, DEEP)
• Did they have to fill out an action plan? Did you model
how to fill out an action plan using these analysis
questions?
(ACTION PLAN, ACCOUNTABILITY)
P55
PRECURSORS TO EFFECTIVE ANALYSIS MTGS, CONT.
• Did you model a poor and a good conversation so they hear
your expectations?
(PROF DEVT, DEEP)
• Did you analyze their results (above and beyond them
analyzing their own) in preparation for the meeting?
(LEADERSHIP)
• Did you collect their analysis ahead of time and see if it
looked acceptable?
(LEADERSHIP, ACCOUNTABILITY)
• Did you have a plan ready to access content experts if the
problems were beyond your expertise?
(PROF DEVT)
P56
TIPS FOR EFFECTIVE ANALYSIS MEETINGS:
• Let the data do the talking
• Let the teacher do the talking (or get them to!)
• Always go back to the test and back to specific questions
• Don’t fight the battles on ideological lines (you’re going to lose)
• There’s a difference between the first assessment and the third
• You’ve got to know the data yourself to have an effective
meeting
• Make sure it’s connected to a concrete plan that you can verify
P57
ANALYSIS MEETING HELPFUL PHRASES:
HELPFUL STARTERS FOR ANALYSIS MEETINGS:
• “So…what’s the data telling you?”
• “Congratulations on the improvement from last time in x
area! You must be really proud of their growth here.”
• “So the _____ [paraphrase their frustration: the test was
hard, the students were difficult, etc.]? I’m sorry to hear
that. So where should we begin with our action plan
moving forward?”
P58
ANALYSIS MEETING HELPFUL PHRASES:
DATA-FOCUSING FOR ANALYSIS MEETINGS:
• “So let’s look at question 18… Why do you think they got it
wrong?”
• “You know, I thought it might be a silly mistake, but what
surprised me is that they did really well on questions x & y.
Why do you think they did so well on these questions and
yet not on your original question?”
• “Let’s look at question 11. What did the students need to
be able to do to answer that question effectively? Is this
more than they are able to do with you in your class?”
• [When new ideas occur or deeper analysis is done at the
meeting than what the teacher did previously] “So let’s revisit
the action plan you created and see how we can incorporate
these additional ideas.”
P59
The Language of Leadership
“That’s nice, but tell me again:
what’s the point of all this?”
P60
Day 2 Conclusions
Data-Driven Instruction & Assessment
Paul Bambrick-Santoyo
P61
DATA-DRIVEN RESULTS:
Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Advanced Proficient)

Year Tested    Language Arts    Mathematics
GNA 2004       46.3             7.3
P62
DATA-DRIVEN RESULTS:
Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Advanced Proficient)

Year Tested    Language Arts    Mathematics
GNA 2004       46.3             7.3
GNA 2005       63.2             26.3
P63
DATA-DRIVEN RESULTS:
Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Advanced Proficient)

Year Tested    Language Arts    Mathematics
GNA 2004       46.3             7.3
GNA 2005       63.2             26.3
GNA 2006       73.5             73.5
P64
DATA-DRIVEN RESULTS:
Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Advanced Proficient)

Year Tested    Language Arts    Mathematics
GNA 2004       46.3             7.3
GNA 2005       63.2             26.3
GNA 2006       73.5             73.5
GNA 2007       80.1             81.8
P65
DATA-DRIVEN RESULTS:
Greater Newark Academy Charter School
8th Grade GEPA Results (% Proficient / Advanced Proficient)

Year Tested            Language Arts    Mathematics
GNA 2004               46.3             7.3
GNA 2005               63.2             26.3
GNA 2006               73.5             73.5
GNA 2007               80.1             81.8
Difference 2004-07     +33.8            +74.5
Newark Schools 2006    54.5             41.5
NJ Statewide 2006      82.5             71.3
P66
Greater Newark Charter: Achievement by Alignment
P67
Morell Park Elementary School: Triumph in Planning
P68
Holabird Academy: Coaching to Achievement
P69
Chicago International Charter: Winning Converts
P70
Excellence Charter School—3rd Grade Math
2007 New York State 3rd Grade Math Exam
Class of 2016: Percentage Proficient and Advanced

School District 16*        59%
New York City District*    75%
New York Statewide*        81%
Excellence                 100%

*District, NYC, and state results are for 2006.
P71
E.L. Haynes Charter School: Scheduled to Succeed
P72
Capitol Heights Elementary: Data in the Blue Book
P73
P74
The Language of Leadership
“That’s nice, but tell me again:
what’s the point of all this?”
P75
One-Minute Responses—Agenda:
Small Groups/Pairs—Deliver Speeches (10 min):
• Deliver responses to each other
• Give feedback: What message did you hear (verbally and
non-verbally)? What had the biggest impact? What would
you change/improve?
• Write down responses that work
P76
Mr. Holland’s Opus:
• What made the difference? How did Lou Russ finally learn to
play the drum?
• What changed Mr. Holland’s attitude and actions?
P77
ACTION:
• PLAN new lessons based on data analysis
• ACTION PLAN: Implement what you plan (dates, times,
standards & specific strategies)
• LESSON PLANS: Observe changes in lesson plans
• ACCOUNTABILITY: Observe changes through classroom
observations and in-class assessments
• ENGAGED STUDENTS: Know end goal, how they did, and
what actions they’re taking to improve
P78
THE FOUR KEYS:
ASSESSMENTS
(Aligned, Interim, Reassess, Transparent)
ANALYSIS
(Quick, Bottom line, Teacher-owned, Test-in-hand, Deep)
ACTION
(Action Plan, Accountability, Engaged)
in a Data-driven CULTURE
(Vision, Leadership, Calendar, PD)
P79
Results Meeting Protocol
Effective Group Meeting Strategy
P80
ACTION: RESULTS MEETING
50 MIN TOTAL
• IDENTIFY ROLES: Timer, facilitator, recorder (2 min)
• IDENTIFY OBJECTIVE to focus on (2 min or given)
• WHAT WORKED SO FAR (5 min)
• [Or: What teaching strategies did you try so far]
• CHIEF CHALLENGES (5 min)
• BRAINSTORM proposed solutions (10 min)
• [See protocol on next page]
• REFLECTION: Feasibility of each idea (5 min)
• CONSENSUS around best actions (15 min)
• [See protocol on next page]
• PUT IN CALENDAR: When will the tasks happen? When will
the teaching happen? (10 min)
P81
RESULTS MEETING STRUCTURE:
PROTOCOLS FOR BRAINSTORMING/CONSENSUS
PROTOCOL FOR BRAINSTORMING:
• Go in order around the circle: each person has 30 seconds to
share a proposal.
• If you don’t have an idea, say “Pass.”
• No judgments should be made; if you like the idea, when it’s
your turn simply say, “I would like to add to that idea by…”
• Even if 4-5 people pass in a row, keep going for the full
brainstorming time.
PROTOCOL FOR REFLECTION:
• 1 minute—Silent personal/individual reflection on the list: what
is doable and what isn’t for each person.
• Go in order around the circle once: depending on the size of the
group, each person has 30-60 seconds to share their reflections.
• If a person doesn’t have a thought to share, say “Pass” and
come back to that person later.
• No judgments should be made.
P82
RESULTS MEETING STRUCTURE:
PROTOCOLS FOR BRAINSTORMING/CONSENSUS
PROTOCOL FOR CONSENSUS/ACTION PLAN:
• ID key actions from brainstorming that everyone will agree to
implement
• Make actions as specific as possible within the limited time
• ID key student/teacher guides or tasks that need to be done to
be ready to teach—and ID who will do each task
• Spend remaining time developing concrete elements of lesson
plan:
• Do Now’s
• Teacher guides (e.g., what questions to ask the students or
how to structure the activity)
• Student guides
• HW, etc.
NOTE: At least one person (if not two) should be recording
everything electronically to send to the whole group
P83
TOPIC CHOICES FOR RESULTS MEETING:
1. FIRST PD SESSION WITH ENTIRE FACULTY: Design the agenda for the whole-staff meeting introducing the data-driven instructional model you will launch.
• Assume that the school has done very little in this area, and the teachers associate “data-driven” instruction with state testing and test prep.
2. FIRST TEAM MEETING: Design the agenda for the first meeting with the grade-level team that you will lead during your residency.
• Assume that the team has done very little in this area, and the teachers associate “data-driven” instruction with state testing and test prep.
3. COLLEGE READINESS: For high school administrators, design the steps you will take to adapt your city or state assessments to prepare students to succeed at the college level.
4. ADAPT CITY EXAMS: Finally, if your city has mandatory exams, design the steps you will take to bring these exams into alignment with your end-goal tests.
P84
ACTION: RESULTS MEETING
50 MIN TOTAL
• IDENTIFY ROLES: Timer, facilitator, recorder (2 min)
• IDENTIFY OBJECTIVE to focus on (2 min or given)
• WHAT WORKED SO FAR (5 min)
• [Or: What teaching strategies did you try so far]
• CHIEF CHALLENGES (5 min)
• BRAINSTORM proposed solutions (10 min)
• [See protocol on next page]
• REFLECTION: Feasibility of each idea (5 min)
• CONSENSUS around best actions (15 min)
• [See protocol on next page]
• PUT IN CALENDAR: When will the tasks happen? When will
the teaching happen? (10 min)
P85
Start-Up Scenarios
Dealing with Challenging Situations
P86
ENTRY SCENARIOS:
DEALING WITH CHALLENGING SITUATIONS
• YOUR TEACHER TEAM DOES NOT HAVE ANY INTERIM
ASSESSMENTS: You are placed with a teacher team at a
grade level for which there are no citywide interim
assessments, and the school doesn’t have any either. What do
you do?
• YOUR DISTRICT HAS POOR MANDATED INTERIM
ASSESSMENTS: Your district has an interim assessment in
November and April, and your state test is in June. Not only
are the interim assessments too far apart, but as you review
them you realize that they cover only about half of the
standards that will be on the state assessment, and they
don’t include any open-ended responses. What do you do?
P87
ENTRY SCENARIOS:
DEALING WITH CHALLENGING SITUATIONS
• JADED LEAD TEACHER: You are working with a team of
teachers and do your opening PD with them around data-driven
instruction, and the younger teachers seem very interested in
working on this. But the oldest teacher on the team (who has
a very important influence on everyone else) makes very
dismissive comments about how this is a waste of time. You
give your one-minute response about the importance of this
work, but you can see that the newer teachers’ enthusiasm
drops. What do you do?
P88
ENTRY SCENARIOS:
DEALING WITH CHALLENGING SITUATIONS
• ASSESSMENTS ARE UNALIGNED WITH INSTRUCTIONAL
SEQUENCE: You notice that the interim assessments in your
city are not aligned with the instructional sequence that the
teachers are mandated to follow. What do you do?
• NO GOOD ANALYSIS: Your district takes too long to produce
a data report, and you have no analysis templates to use.
What do you do?
P89
Burning Questions
Data-Driven Instruction & Assessment
Paul Bambrick-Santoyo
P90
Conclusions
Data-Driven Instruction & Assessment
Paul Bambrick-Santoyo