The Board
&
Student Achievement
New Jersey School Boards Association
March 2, 2013
presented by
Dr. Tracey Severns
Introductions – Who am I?
Background Check
 Teacher
 Vice Principal & Principal
 Superintendent
 Researcher
 Presenter
 Student
 Chief Academic Officer for the NJDOE
Who are you?

What district do you represent?

Why are you here?
Defining Success
What is your definition of a
great school?
Make it short and measurable.
Milton Chen’s definition is…
The kids run in
faster than they
run out.
(and so do the faculty!)
Consider this…
I think that if we changed
_____________________,
our students’ scores would improve.
Without data,
it’s just an opinion.
Opinions may be your
most important data!
How are we doing compared to standard in Language Arts?
[Bar chart: % Proficient by grade level (Gr 4, Gr 8, Gr 11), District vs. AYP]
How are we doing compared to standard in Mathematics?
[Bar chart: % Proficient by grade level (Gr 4, Gr 8, Gr 11), District vs. AYP]
“I believe that our system of special education is effective.”
[Bar chart: % Agree or Strongly Agree, by Elem, MOMS, MOHS and District]
“I believe if we changed the way we work with special ed. students,
they could achieve at higher levels.”
[Bar chart: % Agree or Strongly Agree, by Elem, MOMS, MOHS and District]
“I believe the majority of special ed. students can achieve proficiency.”
[Bar chart: % Agree or Strongly Agree, by Elem, MOMS, MOHS and District]
What opinions do you suffer?
 Special ed kids are better served in special ed classes.
 Grouping students by ability improves student achievement.
 Having one teacher, all day, is the best way to teach elementary school.
 The students fail because they don’t care.
Leaders must use data to:
 Evaluate progress and performance
 Establish goals and mobilize efforts
 Leverage resources
 Inform practice
 Guide decision-making
 Measure, Monitor & Market results
Today, we’re going to
 Examine the role of BOE members in using data to improve student achievement.
 Learn to ask questions of the data.
Establish a baseline
With regard to student achievement:
 What data do you have?
 What data do you use?
 Who uses the data?
 For what purpose are the data used?
 What data do you need?
Identifying the Data Barriers
 What gets in the way of using data in schools and school districts?
 What are the obstacles?
Data Sources and Key Results
Student Performance (classroom quizzes/tests, lab reports, projects, pre/post tests, GPA, performance assessments, standardized tests (norm-referenced, criterion-referenced), PSAT, SAT, ACT, AP, report card grades, portfolio pieces, writing assessments, promotion/graduation rates, discipline records, college acceptance, G&T, BSI, honors classes, advanced courses, honor/high honor roll, scholarships, awards, record at competitions/championships)
Data Sources and Key Results
 Demographic data (enrollment and performance by race, gender, SES, ELL, special education, migrant)
 Climate (exit/entrance interviews, surveys, attendance, extracurricular participation, passage of referendums/school budgets)
 Resources (personnel, computers, connectivity, time, space, revenues, expenditures)
When working with data,
use three reference points.
 How are we doing compared to standard? (Proficiency)
 How are we doing compared to ourselves? (Progress)
 How are we doing compared to others? (Relative performance)
Adequate Yearly Progress
Benchmarks (% proficient):

                              Starting Point 2003   2005   2008      2011   2014
Language Arts/Literacy 3/4/5  68                    75     59 (73)   86     100
Language Arts/Literacy 6/7/8  58                    66     72        86     100
Language Arts/Literacy HS     73                    79     85        92     100
Math 3/4/5                    53                    62     66 (69)   84     100
Math 6/7/8                    39                    49     61        80     100
Math HS                       55                    64     74        86     100
Performance Targets
According to the ESEA Waiver:
Targets are set in annual equal
increments so that within six years the
percentage of non-proficient students in
the “all students” group and in each
subgroup is reduced by half.
Huh?
If 40% of “all students” are Proficient:
 100 – 40 = 60 (100%P – current %P = gap)
 60 / 2 = 30 (gap divided by 2 = target % increase in 6 yrs)
 30 / 6 = 5 (6-yr target divided by 6 = annual target % increase)
And so…
For this school, the expected performance rates would be:
 Yr 1: 45%P
 Yr 2: 50%P
 Yr 3: 55%P
 Yr 4: 60%P
 Yr 5: 65%P
 Yr 6: 70%P
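For board members who want to check a school’s targets themselves, here is a minimal sketch of the waiver arithmetic described above (Python, for illustration only; the function name and structure are mine, not part of the ESEA waiver):

```python
def annual_targets(current_pct_proficient, years=6):
    """Yearly %-proficient targets that cut the non-proficient
    share in half in equal annual increments over `years` years."""
    gap = 100 - current_pct_proficient      # % not yet proficient
    annual_increase = (gap / 2) / years     # equal annual increments
    return [current_pct_proficient + annual_increase * y
            for y in range(1, years + 1)]

print(annual_targets(40))   # [45.0, 50.0, 55.0, 60.0, 65.0, 70.0]
```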
More on Performance Targets
 Targets were based on 2010-2011 data.
 This process was repeated for each subgroup with an n > 30.
 High-performing groups can meet expectations by achieving 90%P (95%P in 2015).
Question
Does this process affect every subgroup equally?
How are we doing compared to
Standard in 5th grade math?
[Bar chart: School and AYP, Grade 5 Math – % Proficient, Gr 5 School 08 vs. AYP Gr 5]
How are we doing compared to
Standard in 5th grade language arts?
[Bar chart: School and AYP, Grade 5 Language Arts – % Proficient, Gr 5 School 08 vs. AYP Gr 5]
How are we doing compared to
Ourselves in Language Arts?
[Bar chart: NJASK Language Arts cohort comparison – % Proficient, Regular Ed vs. Special Ed, MOMS 06, 07 and 08]
How are we doing compared to
Ourselves in Math?
[Bar chart: NJASK Math cohort comparison – % Proficient, Regular Ed vs. Special Ed, MOMS 06, 07 and 08]
How are we doing compared to Standard
and Ourselves in Language Arts?
[Bar chart: 2008 NJASK Language Arts, Grades 6, 7 and 8 – % Proficient, Regular Ed vs. Special Ed]
How are we doing compared to Standard
and Ourselves in Math?
[Bar chart: 2008 NJASK Math, Grades 6, 7 and 8 – % Proficient, Regular Ed vs. Special Ed]
How are we doing compared to Others?
[Bar chart: NJASK8 LAL – School Mean, DFG Mean and State Mean for Total Students, General Education, Special Education, LEP and Title 1]
How would you define comparable?
 DFG
 % FARMS
 % ELL
 % ELL at home
 % Special needs
 Student mobility
 Teacher mobility
 Class size
 Cost per pupil
 Total enrollment
 Instructional hours
 Student/Faculty ratio
 Student/Admin ratio
Where do you stand?
School Digger
 www.schooldigger.com – ranks all NJ public elementary, middle and high schools by adding each school’s average ASK Math and LA scores.
 Includes a 5-star system to designate schools in the top 10% of the ranking.
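The ranking method is simple enough to reproduce yourself. A minimal sketch of that kind of composite ranking, using made-up schools and scores (not SchoolDigger’s actual data or code):

```python
# Hypothetical average ASK scale scores; SchoolDigger's real data may differ.
schools = {
    "School A": {"ask_math": 231.4, "ask_la": 218.9},
    "School B": {"ask_math": 205.2, "ask_la": 201.7},
    "School C": {"ask_math": 248.0, "ask_la": 240.3},
}

# Rank by the sum of the two average scores, highest first.
ranked = sorted(schools.items(),
                key=lambda item: item[1]["ask_math"] + item[1]["ask_la"],
                reverse=True)

for rank, (name, scores) in enumerate(ranked, start=1):
    print(rank, name, scores["ask_math"] + scores["ask_la"])
```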
Coping with Education Statistics
“There are three kinds of lies: lies,
damned lies and statistics.”
“Sometimes we accept statistics because
we are not in a position to challenge
them. Other times we accept them
because we lack the time to ferret out
the truth.”
- Gerald Bracey
Simpson’s Paradox
 It has nothing to do with Homer.
 Beware of changes in groups over time when the aggregate data show one pattern and the disaggregated data show the opposite.
Consider this…
SAT Scores 2005
Mean = 480
SAT Scores 2011
Mean = 478
At a BOE meeting, people demand to
know, “Why are SAT scores dropping?”
But are they?
Let’s examine the data
SAT Scores 2005: 500, 500, 500, 500, 500, 500, 500, 500, 400, 400 (Mean = 480)
SAT Scores 2011: 510, 510, 510, 510, 510, 510, 430, 430, 430, 430 (Mean = 478)
First, we need to understand that
In 2005, the 500s represent scores of
white students and 400s represent
scores of black students.
In 2011, the 510s represent scores of
white students and 430s represent
scores of black students.
What do you notice?
White students’ scores went up 10 points.
Black students’ scores went up 30 points.
but
In 2005, 80% were white, 20% were black.
In 2011, 60% were white, 40% were black.
And so…
 Although the SAT scores for both groups increased, the overall mean decreased because a higher percentage of minority students took the test.
 Thus, beware of shifts in subgroup proportion and performance over time.
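A quick numerical check of the example above, as a minimal Python sketch (the scores and proportions come straight from the slides; the helper code is mine):

```python
def mean(scores):
    return sum(scores) / len(scores)

sat_2005 = [500] * 8 + [400] * 2   # 80% scoring 500, 20% scoring 400
sat_2011 = [510] * 6 + [430] * 4   # both groups improved, but the mix shifted

print(mean(sat_2005))   # 480.0
print(mean(sat_2011))   # 478.0 -- lower overall, despite gains in each group
```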
Simpson’s Paradox at work…
Ethnic Group    1995   2005   Gain
White           519    529    +10
Black           412    433    +21
Asian           474    511    +37
Mexican         438    453    +15
Puerto Rican    437    460    +23
Am Indian       471    489    +18
All Students    504    508    +4
Why are our scores dropping?
[Line chart: Average SAT Scores by year]
They’re not. We’re doing better!
[Bar chart: SAT Scores by Ethnicity – White, Black, Asian, Mexican, Puerto Rican, Am Indian, All Students]
Imagine this.
 Your superintendent has just presented these results.
 Write down what you are thinking.
Root Cause Analysis
Why are we doing better?
To what do we attribute the results?
Revealing the Root Cause
Root cause analysis is the process
of identifying the underlying cause,
or causes, of positive or negative
outcomes within a system.
Paul Preuss
In other words…
Why did subgroups perform as they did?
Possibilities include:
 Organizational issues (time, availability of
programs, personnel or support services)
 Instructional/implementation issues
(curriculum, instruction, assessment)
 Environmental issues (external forces or
factors that may have influenced results)
Data Analysis
 What trends do you find in the data?
 To what would you attribute the results?
 What questions come to mind when you review the data?
 What recommendations would you make to improve student performance?
A Picture is Worth a Thousand Words
and a Million Statistics
GEPA Comparison 2003-2005
[Bar chart: % Proficient, Special Ed vs. Regular Ed, in Lang Arts, Math and Science, 2003-2005]
What questions need answers?
 Do students gain or lose ground over time? Does this vary by track?
 What patterns exist among teachers?
 What is the effect of levels in math?
 What courses, interventions or programs result in gains? For whom?
 How does question type (multiple choice vs. open-ended) affect performance?
What else do you need to know?
Are students in BSI improving?
Are ELL students progressing?
Are G&T students maintaining high scores?
What are our Eco Disadvantaged students’ areas of weakness?
How does the performance of students with special needs vary by program/placement?
How do you get “buy-in”?
Provide people with the data and invite them to be curious.
Understand that “buy-in” will follow, not precede, results!
Paul Bambrick-Santoyo
What is the most effective way
to motivate people?
a. Public recognition
b. Private recognition
c. Bonuses
d. Threats
e. Data on personal and team progress
f. Annual performance evaluation
As leaders, we must…
Develop the “culture and capacity” to use data to improve student achievement.
 How do you build culture?
 How do you build capacity?
Determine readiness
 Capacity – degree of proficiency (assessment literacy; data knowledge and skills)
 Culture – degree of commitment (emotional climate; group norms)
 Surface predictions and assumptions
Where are you?
 Low Commitment, Low Capacity
 Low Commitment, High Capacity
 High Commitment, Low Capacity
 High Commitment, High Capacity
Where do you want to be?
How do we get there?
“The real methodology for system change
begins and ends with
ongoing, authentic conversations
about important questions.”
Tony Wagner
Michael Fullan suggests…
If a system is striving for both high equity and
excellence, then policy and practice have to
focus on system improvement. This means
that a school head has to be as concerned
about the success of other schools as he/she
is about his/her own school. Sustained
improvement of schools is not possible unless
the whole system is moving forward.
Fullan: Whole System Reform
 Relentless focus on leadership
 Small number of ambitious goals
 Core strategy of capacity building
 Use of evidence/data
 Create units of schools that learn from each other
Find your “Leadership Focus”
(Reeves, 2011)
The Law of Initiative Fatigue
 The key to improving schools is having no more than 6 priorities.
 As the number of initiatives increases, student achievement decreases (law of diminishing returns).
To diagnose “Initiative Fatigue”
 Divide a piece of paper into two columns.
 In the left-hand column, list all new initiatives your school or district has begun in the past 24 months.
 In the right-hand column, list the programs that have been evaluated and terminated.
 Which column is longer?
Prioritize and Pick!
It is practices, not programs, that change schools. Focus on practices that have:
 Impact – the potential to exert a significant effect
 Leverage – the potential to affect multiple outcomes
Refer to Visible Learning (Hattie, 2009) for the effect size of various factors that affect learning.
Establish SMART goals
 Specific
 Measurable
 Attainable
 Results-Oriented
 Time-bound
Remember…
What gets measured
gets
managed.
Address the Key Characteristics
of 90/90/90 Schools
(Reeves)
 A laser-like focus on student achievement
 Clear curriculum choices – spend more time on reading, writing and mathematics
 Frequent assessment and multiple opportunities for improvement
 An emphasis on nonfiction writing
 Collaborative scoring of student work
Role of District Leaders
“Simultaneous loose-tight leadership”
characterized by “defined autonomy”
 Articulate clear, nondiscretionary goals
 Provide strategies for achieving goals
 Identify the indicators that will be used
to monitor the goals
Marzano and Waters
What’s “tight” in your district?
Activity
List 3 things that people throughout the
district understand are “tight” nondiscretionary priorities that must be
observed in every school.
Now, compare your answers!
Monitoring the Implementation Process
Seek evidence regarding:
 How teachers are organized into teams
 How teachers are given time to collaborate
 How the work of teams is monitored
 How the results of common formative
assessments are being used by teams
 How schools are providing systematic
intervention and enrichment.
Effective Monitoring (Reeves)
 Frequent
 Focused on actions of adults (not just student test scores)
 Constructive (Is it a witch hunt or a treasure hunt?)
Communication is Key!
Leaders throughout the district speak with
one voice and listen with both ears.
Communication must be clear, constant,
consistent and congruous (between one’s
actions and professed priorities).
Conduct a “Communication Audit”
 What systems ensure that priorities are addressed in each school?
 What do we monitor in our district?
 What questions do we ask?
 How do we allocate resources?
 What do we celebrate?
 What are we willing to confront?
 What do we model?
What can you do?
Ask questions of the data and seek answers.
Do something with the data. Use it to determine district priorities for professional development and the allocation of resources (time, personnel, funding, space).
Respect the data. Be judicious in the way data is handled and shared. Balance the need for confidentiality with the need for transparency.
Does it really matter?
Marzano and Stronge’s research:
After one year with an ineffective teacher, it takes students three consecutive years with a highly effective teacher to catch up.
Wrap up and Reflections
What are my biggest “takeaways”?
How will I use what I learned to enhance
my service to the district?
What are my next action steps?
The End
Dr. Tracey Severns
Chief Academic Officer
NJDOE
tracey.severns@doe.state.nj.us