Addressing educational disadvantage, sharing evidence, finding out what works
Camilla Nevill
Evaluation Manager
Overview
1. The aims of the EEF
2. Teaching and Learning Toolkit
3. Grant-making process
4. Approach to evaluation
5. Challenges and reflections
The aims of the EEF
• Building the evidence for what works in schools by identifying and rigorously evaluating evidence-based approaches to teaching and learning
• Sharing the evidence with schools by providing independent and accessible information through the Teaching and Learning Toolkit
• Promoting the use of evidence-based practice through our projects, events and resources, such as the upcoming DIY Evaluation Guide for schools
The EEF by numbers
• 30 topics in the Toolkit
• 1,800 schools participating in projects
• 16 independent evaluation teams
• £200m estimated spend over the lifetime of the EEF
• 300,000 pupils involved in EEF projects
• 9 members of the EEF team
• 3,000 heads presented to since launch
• 56 projects funded to date
The EEF approach
[Cycle diagram: Synthesise evidence (the ST-EEF Toolkit, built from meta-analyses and case studies) → Decide grants (evidence-based grant-making) → Commission evaluations (EEF evaluations: independent evaluations, robust methodologies) → Report results (common outcome metrics), which feed back into the evidence base.]
The Toolkit and grant-making
[The EEF approach cycle diagram repeated, here highlighting the ST-EEF Toolkit and the Decide grants stage.]
The Sutton Trust-EEF Teaching and Learning Toolkit
• Accessible, teacher-friendly summaries of educational research
• Practice focused: giving schools the information they need to make informed decisions and narrow the gap
• Based on meta-analyses provided by Durham University (a minimal pooling sketch follows below)
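To make the synthesis step concrete, here is a minimal sketch of a fixed-effect, inverse-variance-weighted meta-analysis that pools effect sizes from several studies of a single topic into one estimate. This is an illustration only, not the Durham University methodology, and the effect sizes and variances are invented.

```python
# A minimal sketch of fixed-effect, inverse-variance-weighted meta-analysis.
# Illustration only: these effect sizes and variances are invented.
import math

# (effect_size, variance) pairs for hypothetical studies of one Toolkit topic
studies = [(0.25, 0.010), (0.40, 0.020), (0.10, 0.015)]

weights = [1.0 / var for _, var in studies]      # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))               # standard error of the pooled effect

print(f"Pooled effect size: {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```

Real Toolkit syntheses are far richer (random-effects models, moderators, cost estimates); the point here is only the pooling logic that turns many studies into a single headline figure.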
Case study: the Pupil Premium
• Large secondary school in the North West receives £580,000 from the Pupil Premium in 2012-13.
• How does the head teacher decide to use this money?
• Reducing class size, investing in professional development, one-to-one tuition?
• The Toolkit doesn’t tell the head what to do, but we hope that it will help her make a more informed decision.
Grant-making
The EEF assesses proposed projects against two criteria:
1. The extent to which existing evidence suggests that the approach will improve academic attainment (taking the Toolkit as a starting point)
2. Whether the project has the potential to be scaled up cost-effectively if proven to be effective (within the envelope of the Pupil Premium)
• We are looking for disciplined innovation: innovation that builds on what we already know.
EEF portfolio so far
[Charts: number of schools by school type (primary, secondary) and by project type (whole-school improvement, targeted support, parents and community, out-of-hours learning, improving classroom teaching), split into schools already recruited and schools in the recruitment process.]
Note: Figures include only grants made or to be made from the main endowment (Literacy Catch Up projects excluded).
A meta-analytic approach to evaluation
[The EEF approach cycle diagram repeated, here introducing the evaluation stages of the cycle.]
Approach to evaluation: What the EEF is doing differently
A new and robust approach to evaluation in education:
• Independent evaluation: all projects evaluated by a member of our panel of evaluation experts
• Common outcome metrics: effect on attainment and cost, so we can compare and contrast between projects (a minimal effect-size sketch follows below)
• Focus on longitudinal impact: all pupils will be tracked using the National Pupil Database
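As an illustration of the common attainment metric, the sketch below computes a standardised effect size: the difference in mean test scores between a treatment and a control group, divided by the pooled standard deviation. This is an assumed example, not EEF code, and the scores are invented.

```python
# A minimal sketch of the standardised effect size used as a common outcome metric.
# Illustration only: the post-test scores below are invented.
import statistics

treatment = [104, 98, 110, 101, 96, 108, 102, 99]   # hypothetical post-test scores
control   = [100, 95, 103, 97, 92, 101, 98, 94]

def effect_size(t, c):
    """Difference in group means divided by the pooled standard deviation."""
    nt, nc = len(t), len(c)
    pooled_var = ((nt - 1) * statistics.variance(t)
                  + (nc - 1) * statistics.variance(c)) / (nt + nc - 2)
    return (statistics.mean(t) - statistics.mean(c)) / pooled_var ** 0.5

print(f"Effect size: {effect_size(treatment, control):.2f}")
```

Expressing impact on this standardised scale, alongside cost, is what lets projects measured with different tests be compared with one another.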
Approach to evaluation: Robust yet pragmatic evaluations
Types of evaluation:
• Pilot (development)
• Efficacy (validation)
• Effectiveness (scale-up)
Decision based on an assessment of:
• the existing evidence base (takes precedence): its relevance and strength
• expected effect size
• cost
• policy context
• practicalities of delivery
An example of an EEF evaluation
Catch Up Numeracy:
• One-to-one intervention with children in Years 2 to 6 who are struggling with numeracy
• Previous research showed an effect size of 0.3
• Trial in 50 schools with 300 pupils and 100 TAs randomised (a minimal randomisation sketch follows below)
• Effect on attainment measured using standardised maths tests
• Independent evaluation by NFER
• Observations and interviews to inform scale-up
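For illustration, here is a minimal sketch of within-school randomisation of eligible pupils to an intervention or a control group. This is an assumed design, not the actual NFER allocation procedure, and the schools and pupils are invented.

```python
# A minimal sketch of within-school randomisation for a trial of this kind.
# Illustration only: the schools and pupils below are invented.
import random

random.seed(42)  # fixed seed so the allocation can be reproduced and audited

# hypothetical eligible pupils, grouped by school
eligible = {
    "School A": ["pupil01", "pupil02", "pupil03", "pupil04"],
    "School B": ["pupil05", "pupil06", "pupil07", "pupil08"],
}

allocation = {}
for school, pupils in eligible.items():
    shuffled = random.sample(pupils, len(pupils))    # random order within the school
    half = len(shuffled) // 2
    for pupil in shuffled[:half]:
        allocation[pupil] = "intervention"           # receives the one-to-one support
    for pupil in shuffled[half:]:
        allocation[pupil] = "control"                # business-as-usual comparison

for pupil, arm in sorted(allocation.items()):
    print(pupil, arm)
```

Randomising within each school means intervention and control pupils share the same school context, so differences in outcomes can more credibly be attributed to the intervention itself.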
The EEF approach
[The EEF approach cycle diagram repeated in full: synthesise evidence, decide grants, commission evaluations, report results.]
What’s next?
• What Works Centre (ongoing)
• DIY Evaluation (ongoing)
• Digital Technology Round (just closed)
• Project results (Autumn 2013)
• Next General Round (closing October)
• Neuroscience Round (April 2014)
Challenges and reflections
• Resistance from the academic community
How do we strike a balance between bringing a new and robust independent evaluation approach and protecting the rights of project creators?
• Attrition and testing
How do we best incentivise schools to remain engaged and see research through?
• Mobilising what works and taking it to scale
How do we put the evidence to work and ensure that teachers can access and use research effectively?
For further information:
www.educationendowmentfoundation.org.uk
camilla.nevill@eefoundation.org.uk
Thank you!