THE NATIONAL i3 EVALUATION OF DIPLOMAS NOW
DN Summer Institute – July 9, 2013

Topics for this session

- Refresher on the evaluation
  - Evaluation team
  - i3 context
  - Study design
- Overview of data collection
  - Student and staff surveys, 2013-14
  - Fidelity of implementation: Wave 1 Schools – Spring 2012

Partner Organizations – DN Study

- MDRC
  40-year-old nonprofit, nonpartisan education and social policy research organization dedicated to learning what works to improve programs and policies that affect low-income individuals and communities

- ICF International
  40-year-old research and consulting firm that seeks to provide solutions and services that address challenging policy issues

Goals of the i3 Validation Grant Program

- Identify some of the most promising school improvement initiatives
- Provide support for them to scale up nationally
- Research their effectiveness using the most rigorous methodologies available
- Document lessons learned about implementation during the scale-up process
- Publicize study results to influence national and state policy

Overview of Diplomas Now Study

- Overall goal: Validate the effectiveness of the Diplomas Now model
- Research questions:
  1. What is the impact of Diplomas Now on students’ outcomes, particularly with regard to attendance, in-school behavior, and course performance?
  2. What lessons can be learned about implementation of the model during national expansion?

Study Design: Random Assignment

- A random assignment design uses a lottery to assign participating schools to one of two groups (a minimal sketch of such a lottery follows the list):
  - DN schools (implementing the DN program)
  - Non-DN schools (implementing any other school reform program)
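
The slides describe the lottery only at this high level. As a minimal sketch, assuming a plain unblocked lottery (the study's actual procedure, e.g., any blocking by district, is not described here; the function name, seed, and school names are illustrative):

```python
import random

def lottery_assignment(schools, seed=None):
    """Illustrative lottery: shuffle participating schools,
    then split them into a DN group and a non-DN group."""
    rng = random.Random(seed)   # optional seed makes the draw reproducible
    pool = list(schools)
    rng.shuffle(pool)
    midpoint = len(pool) // 2
    return pool[:midpoint], pool[midpoint:]   # (DN schools, non-DN schools)

# Example with placeholder school names
dn_group, non_dn_group = lottery_assignment(
    ["School A", "School B", "School C", "School D"], seed=7)
```
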
National sample



Currently 62 secondary schools in 11 districts
across the country participating in the study
Study will compare student outcomes in the 32
middle and high schools that implement DN to
those in the 30 schools that do not
Study will document implementation in the 32 DN
schools, and also investigate how it compares to
any school improvement efforts in the 30 Non-DN
schools.
Data Collection

- DN Schools
  - Student records data (will be obtained from district)
  - Student, principal, and teacher surveys
  - Case studies (interviews, focus groups, observations; only at 8 selected schools)
- Non-DN Schools*
  - Student records data (will be obtained from district)
  - Student, principal, and teacher surveys

* For their participation, Non-DN schools will receive $10,000 compensation each year

Surveying: Spring 2014

- Wave 1 Schools
  - Principals and teachers complete online survey
  - 8th / 11th grade students complete survey
- Wave 2 Schools
  - Principals and teachers complete online survey
  - 7th / 10th grade students complete survey

Questions and Answers

DIPLOMAS NOW EVALUATION
Fidelity of Implementation: Spring 2012
DRAFT - NOT FOR DISTRIBUTION

Implementation Fidelity Data Sources

- Fidelity of implementation data come from the following sources:
  - Diplomas Now Implementation Support Team (DNIST) Survey
  - School Transformation Facilitator (STF) Survey
  - City Year Program Manager (CYPM) Survey
  - Communities In Schools (CIS) Site Coordinator (SC) Survey
  - Communities In Schools (CIS) Site Records

DN Fidelity of Implementation

- Fidelity of implementation is based on the DN Logic Model and measured using the Fidelity of Implementation Matrix.
- The matrix is built on 111 separate components, 62 of which were identified as critical to adequate implementation.
- These components sort into 9 inputs, 6 of which were identified as critical to adequate implementation (the nesting is sketched below).
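
To make the matrix structure concrete, here is a minimal sketch of components nesting inside inputs, each carrying a critical flag. This is a hypothetical fragment assuming Python dataclasses; the component wording is a placeholder, not text from the actual matrix (only the counts, 3 components with 2 critical for this input, match a later slide):

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    critical: bool      # one of the 62 components critical to adequate implementation
    met: bool = False   # whether the site met this component's threshold

@dataclass
class Input:
    name: str
    critical: bool      # one of the 6 critical inputs
    components: list = field(default_factory=list)

# Hypothetical fragment; real component wording comes from the DN matrix itself
tiered = Input("Tiered Intervention Model", critical=True, components=[
    Component("Early-warning indicator data reviewed regularly", critical=True),
    Component("Tiered interventions assigned to flagged students", critical=True),
    Component("Intervention follow-up documented", critical=False),
])
```
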
DN Fidelity of Implementation

- That is…
  [Diagram: Components X, Y, and Z roll up into Input 1]

DN Fidelity of Implementation

- And…
  [Diagram: Inputs 1 through 9 roll up into Overall Fidelity]

Fidelity Matrix: Inputs

- Program Staff Training and Professional Development
  - 18 individual components, 15 of which are critical
- Integrated On-Site Support (Critical Input)
  - 11 individual components, 9 of which are critical
- Family and Community Involvement
  - 6 individual components, 1 of which is critical

Fidelity Matrix: Inputs (cont.)

- Tiered Intervention Model (Critical Input)
  - 3 individual components, 2 of which are critical
- Strong Learning Environments (Critical Input)
  - 6 individual components, 4 of which are critical
- Professional Development and Peer Coaching (Critical Input)
  - 5 individual components, 2 of which are critical

Fidelity Matrix: Inputs (cont.)

- Curriculum for College Readiness
  - 24 individual components, 4 of which are critical
- Student Supports (Critical Input)
  - 24 individual items, 19 of which are critical
- Student Case Management (Critical Input)
  - 14 individual items, 5 of which are critical

DN Fidelity of Implementation

- Fidelity is measured in two ways, by a categorical rating and a continuous score:
  1. Implementation Rating (categorical measure): focused on critical components
  2. Implementation Score (continuous measure): inclusive of all components

Implementation Rating

- The rating focuses on “critical” components and “critical” inputs: How well did a site implement the aspects of the model hypothesized to be most important to improving student outcomes?
- Each input (e.g., program staff professional development) of the DN model was rated as either:
  1. “Successful” - met implementation thresholds for all “critical” components
  2. “Developing” - did not meet the threshold for one or more critical components

Implementation Rating (cont.)

- Individual input ratings served as the basis for the site-level fidelity rating, which is broken into four categories (see the sketch after this list):
  1. Low: successful on fewer than 3 critical inputs
  2. Moderate: successful on at least 3 critical inputs
  3. Solid: successful on at least 5 critical inputs
  4. High: successful on 8 or more inputs, including 5 critical inputs
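
Read together, the four categories amount to threshold checks on counts of “Successful” inputs, evaluated from the highest category down (since a site with at least 5 successful critical inputs also has at least 3). A minimal sketch, with function and argument names of our choosing:

```python
def site_rating(successful_inputs: int, successful_critical_inputs: int) -> str:
    """Map counts of 'Successful' inputs to the four site-level
    fidelity categories, checked from highest to lowest."""
    if successful_inputs >= 8 and successful_critical_inputs >= 5:
        return "High"
    if successful_critical_inputs >= 5:
        return "Solid"
    if successful_critical_inputs >= 3:
        return "Moderate"
    return "Low"

# Example: a site meeting 3 critical inputs and 4 inputs overall, like
# Belaire HS in the Year 1 findings, rates "Moderate"
print(site_rating(successful_inputs=4, successful_critical_inputs=3))
```
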
Implementation Score

- The score measures implementation of all aspects of the DN model, going beyond just the “critical” aspects: How well did a site implement the model overall?
- Each input is scored based on how well every one of its components was implemented.
- The average of the 9 input scores provides the site-level implementation score (0-1 scale: the proportion of the entire model implemented by a site); a minimal computation sketch follows.
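
A minimal sketch of the score arithmetic. The slides state the 9-score average and the 0-1 scale, but not the exact component-to-input rule, so the proportion rule below is an assumption, and the per-input scores are hypothetical:

```python
def input_score(components_met: int, components_total: int) -> float:
    # Assumed rule: proportion of the input's components that were implemented
    return components_met / components_total

def implementation_score(input_scores: list[float]) -> float:
    """Site-level score: average of the 9 input scores, on a 0-1 scale."""
    return sum(input_scores) / len(input_scores)

# Hypothetical input scores for one site; 0.59 was the Year 1 cross-site average
scores = [0.6, 0.5, 0.7, 0.4, 0.8, 0.5, 0.6, 0.7, 0.5]
print(round(implementation_score(scores), 2))   # -> 0.59
```
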
Wave 1 Schools - Year 1 Preliminary Findings

- Cohort 1: 12 DN sites
  - 5 high schools
  - 7 middle schools
- Cross-site Implementation Rating
  - % of DN sites with Solid Implementation Rating: 0%
  - % of DN sites with Moderate Implementation Rating: 42%
  - % of DN sites with Low Implementation Rating: 58%
- Cross-site Implementation Score
  - Overall Implementation Score: 0.59

Highs and Lows: Critical Components by Input

                          NUMBER OF CRITICAL COMPONENTS
INPUT                     Total   Met by >75% of sites   Met by <50% of sites
DN Staff Trng./PD           15             10                     4
Integr. On-Site Supp.        9              6                     0
Family & Cmty. Involv.       1              0                     0
Tiered Intervention          2              2                     0
Strong Learning Env.         5              1                     0

Highs and Lows: Critical Components by Input (cont.)

                          NUMBER OF CRITICAL COMPONENTS
INPUT                     Total   Met by >75% of sites   Met by <50% of sites
PD and Peer Coach.           2              0                     0
Curric. for Coll. Rdy.       4              1                     2
Student Supports            19             11                     1
Student Case Mngmt.          5              2                     1

Critical Components Met by < 50% of Sites

Program Staff Training and Professional Development (% of sites meeting):
- CY CORPS MEMBER trained in the use of data to identify interventions: 42%
- CY CORPS MEMBER received ongoing support in the use of data to identify intervention supports: 33%
- DN team, school administrators, and teachers met prior to the start of the school year for joint planning sessions: 25%
- 11-module online course, approximately 1.5-2 hours per module: 25%

Critical Components Met by < 50% of Sites

Curriculum for College Readiness (% of sites meeting):
- Student Team Literature (MS only, n = 7): 29%
- Savvy Readers’ Lab (MS only, n = 7): 29%

Critical Components Met by < 50% of Sites

Student Supports (% of sites meeting):
- % of math classrooms with embedded CY CORPS MEMBERS: 42%

Student Case Management (% of sites meeting):
- All case-managed students identified with academic needs are provided with Academic Assistance interventions: 25%

Cohort 1 - Year 1 Preliminary Findings by H.S.

SCHOOL                    Count of Critical   Count of All   Rating     Score
                          Inputs Met          Inputs Met
Belaire HS                       3                 4         Moderate    .57
Booker T. Washington HS          3                 3         Moderate    .50
English HS                       1                 1         Low         .39
Newtown HS                       1                 2         Low         .54
Sheepshead Bay HS                1                 3         Low         .51

Cohort 1 - Year 1 Preliminary Findings by M.S.

SCHOOL                    Count of Critical   Count of All   Rating     Score
                          Inputs Met          Inputs Met
Broadmoor MS                     3                 4         Moderate    .73
Capitol MS                       2                 3         Low         .63
Clinton MS                       2                 3         Low         .53
Dever-McCormack MS               1                 1         Low         .46
Drew MS                          2                 3         Low         .73
Miami Edison MS                  3                 5         Moderate    .74
Shaw MS                          3                 4         Moderate    .71

National Evaluation Contacts

- MDRC Project Director
  William Corrin
  Deputy Director, K-12 Education
  (212) 340-8840
  william.corrin@mdrc.org

- ICF Project Manager
  Aracelis Gray
  Senior Manager, Health, Education and Social Programs
  (703) 225-2290
  agray@icfi.com