4. Evaluating STEM initiatives - identifying weaknesses in evaluation

To prove, to improve or to learn? Lessons for evaluating technical, practical and vocational educational initiatives from an analysis of STEM evaluations
Edge Research Conference 2012
Friday November 16th
Aims:
Drawing on an analysis of evaluations in STEM education, we will:
• identify some of the problems with such evaluations;
• examine some potential solutions for the future; and
• examine their applicability to wider TVPL.
Introduction
CSE and CEIR
Science and Innovation Observatory
Priorities
• Research, intelligence, evaluation, polemics
• Informing and influencing
• Independent body
Reports, think-pieces, associates
Background - what is STEM and how does it relate to TVPL?
• In the UK - developed from SET for Success (Roberts, 2002)
• STEM Framework launched 2007-08
• Development of a set of 11 'Action Programmes'
• Each of these contains further projects, many of which were evaluated - these form the bulk of our analysis
• Technical and vocational routes are part of some Action Programmes, but the main focus was academic
What do we know about STEM evaluations?
• Analysis of 20 STEM evaluations:
– 13 Projects/activities or programmes
– 4 Event evaluations
– 2 Evaluations of organisations
– 1 CPD evaluation
Examined:
• Aims
• Timings
• Methods
• Evaluation models
• Use of prior evidence
• Results and outcomes
• Impact on policy and practice
• Limitations
• Contribution to knowledge
Key points from the review
• Evaluation aims were not always explicitly stated.
• Timings do not always appear to match the purposes of the initiative being evaluated.
• Robust counterfactuals were rarely used.
• Explicit evaluation models were used in only a small number of cases.
• Reviews of literature, policy or similar initiatives were not usually presented.
Key points from the review continued
• Negative results were not usually presented in the same depth as positive results.
• Few evaluations looked to make recommendations beyond the project at hand.
• Evaluations tended not to make their limitations explicit.
• Contributing to a developing STEM knowledge base was very rare in the evaluations we looked at.
• Conclusion: the potential for learning from these evaluations is severely limited.
Linked to the key points: the purposes of evaluation
• Controlling - to understand whether the project is going to plan
• Proving - to understand whether the project is achieving what was intended
• Improving - to understand how to modify the initiative to make it work better
• Learning - to provide transferable insights to help build a body of knowledge beyond the project at hand
Responses 1: A single evaluation framework?
• E.g. Stake (1996), Stufflebeam (2002), Cronbach (1982)
• Each of these organises the focuses of evaluation into three broad areas:
– context [antecedent, context, unit of focus/setting];
– process [transaction, input/process, treatment]; and
– outcome [outcome, product, observations/outcomes].
Responses 1: Guskey?
• reactions
• learning
• organisational support and change
• use of new knowledge and skills
• student outcomes
• No support for this idea:
• One approach could not be designed that would be appropriate to the aims of every STEM project or evaluation.
• A multiplicity of approaches allows greater fit, flexibility and creativity, and hence is more likely to lead to transferable learning.
Responses 2: Theory-based approaches
There are a number of well-established 'theory-based' approaches, e.g. Realist Evaluation; Theory of Change.
These develop hypotheses about the social world, and test them out using a variety of means.
Close to the scientific method.
EXAMPLE - Interventions aimed at directly improving students' attitudes to STEM subjects
• EXAMPLE THEORY - using interesting, innovative opportunities to learn improves attitudes to STEM, and hence improves learning outcomes and interest in STEM careers (e.g. After-School Science and Engineering Clubs; Engineering Education Scheme)
Next steps for STEM:
1. Development of effective use of theory-based approaches to evaluation.
2. Systematic mining of current evaluation and research to develop a bedrock of evidence of the theoretical bases for initiatives, and their effectiveness in various contexts.
3. A commitment to using and building the evidence base through evaluation and research.
Next steps for technical, practical and vocational learning (TVPL)?
Questions:
• Is there evidence of a similar lack of impact of evaluations in relation to TVPL?
• What analysis needs to be done to help answer this question?
• What needs to be done in TVPL to improve evaluation - and to what extent do the prescriptions in this paper for STEM evaluation apply to TVPL?
Want to get involved?
Contact us:
Mike Coldwell
m.r.coldwell@shu.ac.uk
0114 225 6054
Ken Mannion
k.mannion@shu.ac.uk
www.scienceobservatory.org.uk