Final Report

South West Spoke
Final Case Study Report Template
IMPORTANT: The SW final report guidance document must be consulted
before completing this report template. Please complete all sections.
Project Title: Demonstrating proven, effective outreach activity through
effective evaluation
Project Leader: Sophie Duncan
Department/School: N/A
Institution: National Co-ordinating Centre for Public Engagement
Other institutions/organisations involved in the project: Beacons for Public
Engagement (Manchester Beacon; CUE East; Beacon for Wales; Edinburgh
Beltane.)
Abstract:
This project created training resources to support HE STEM practitioners to develop their evaluation
work, in particular the evaluation of STEM outreach activities. By developing their understanding and
skills in evaluation, practitioners would be able to improve the effectiveness of their projects, and
demonstrate this to others. This was prompted by a range of drivers: the access agreements universities
provide need to include evidence of effective outreach practice; the need to justify the funding spent on
outreach in terms of effective outcomes from the work; and a desire by many in the sector to evidence
the impact of their work.
Informed by a survey of current practice within the sector (with 161 respondents), three training
courses were developed. The Beginner’s Guide to Evaluation was aimed at those with little
experience of evaluation, and sought to build confidence and skills in developing an evaluation plan.
The Evaluation Masterclass was for those with more evaluation experience, who wanted to better
understand how to make an impact with their evaluation. The Training the Trainer course sought to
support trainers to run their own Beginner’s Guide to Evaluation course within their institution. Finally,
three one hour Plug and Play sessions were developed that could be integrated into conferences and
other meetings. These covered: In-house Evaluators; Use of Logic Models; and Measuring Quality.
The courses were supported through an online Ning site where participants could download all the
resources and ask questions of the course trainers. You can find out more at
www.publicengagement.ac.uk/evaluating-stem-outreach
List of Outputs:
• Getting Started: A Beginner’s Guide to Evaluation (BG2E). A one day training course to
  support participants to develop their evaluation work
  (2 pilot short sessions with 30 participants; 3 one day sessions with 34 participants)
• Evaluation Masterclass: Making use of your evaluation: Informing practice and evidencing
  impact (EM). A 3 hour masterclass to support participants to make use of their evaluation
  (3 Masterclasses with 47 participants)
• Training the Trainer session (TtT): 3 hour sessions to support HE trainers to make use of the
  resources to deliver the BG2E course
  (3 TtT sessions with 19 participants; 1 training session still to run)
• An online Ning site to support people who have participated in the courses and provide
  resources to support their work (http://nccpetraining.ning.com/)
• Three Plug and Play masterclasses (PP): one hour sessions to explore specific aspects of
  evaluation (these will be run by 30 September 2012, with around 30 participants in total
  across the 3 sessions)
• Resources to support each of the courses and course participants to either develop their
  evaluation practice or to run the BG2E training in their own institution. Resources include:
  o PowerPoint of each of the three courses
  o Trainer handbook for BG2E
  o Participant handbook for BG2E
  o Activities for each course
  o Handouts and resource sheets
• A report based on feedback from HE STEM practitioners on their current evaluative practice,
  and the support they would like to have to improve their evaluation work
Project Highlights:
1. The Beginner’s Guide to Evaluation course aimed to build knowledge and confidence for HE STEM
practitioners to develop their evaluation work – the evaluation of the course clearly indicated that the
course delivered this, and much more. It was great to hear people reflecting on the increase in
confidence they now had to develop an evaluation plan for their work.
2. The Evaluation Masterclasses provided a great opportunity for people to discuss more detailed
evaluation practice. The evaluation indicated that people really valued the opportunity to come
together to discuss impact and reporting, and the interactive nature of the course was appreciated.
3. The impact agenda is encouraging more people to evidence the impact of their research – many of
the resources developed as part of this project are equally relevant to a wider audience of people
seeking to develop their evaluation practice. This is particularly true of the Evaluation Masterclass. In
addition we have had interest from HE staff in Europe who are also tackling these questions.
Background and Rationale:
Evaluation and impact are a key part of developing good practice. The project was developed to
support HE STEM practitioners to develop effective engagement with schools and colleges through
science communication and outreach.
In developing effective and appropriate methods of capturing the benefits and impact of STEM
outreach, academics will be able to identify successful activities (and thus target expenditure more
effectively), and demonstrate to senior managers the value of the activities. They will also understand
the importance of reflecting on the purpose of their activities and on assessing how effectively the
activities deliver against that purpose, with a view to improving the activities that they undertake.
The project aimed to support the practical application of evaluation tools within STEM subjects across
a number of different HEIs and, through training and an online forum, to enable staff to develop and
utilise evaluation methodology to: improve their own practice; demonstrate impact; increase
awareness amongst senior managers and enhance recognition for outreach activities. The train the
trainer methodology proved an efficient way of building capacity and ensuring wide uptake of the
resources.
Implementation:
There were several strands to this work:
1. Survey to assess the need for evaluation support and training for HE STEM practitioners. This
survey was used to inform the development of the course materials.
2. BG2E course. This was piloted at two events – and then developed into a one day course. This
was run three times – with improvements being made each time, based on the feedback from the
participants. The most notable changes to the course included: providing all the handouts in a
participant workbook; providing larger evaluation plan templates for groups to fill in together; providing
opportunities to complete the draft evaluation plan before the end of the course; providing online and
telephone support for people wanting to develop their evaluation plan.
3. Evaluation Masterclass. A short version of the masterclass was piloted at an event in Newcastle.
The course was created with content that could be selected from, depending on the interests and
needs of participants. It was run three times; the second and third courses were developed on the basis
of feedback from the first. The main feedback was that delegates wanted a longer course covering more
of the material; however, they also recognised that they would not have prioritised attending the event
if it had been longer than 3 hours.
4. TtT. A three hour session was developed to introduce people to the BG2E materials and to walk
them through the course. Several people would have liked to have attended the BG2E course before
coming to the TtT, but very few had been able to afford the time to do so.
5. The PP modules were developed by the Beacons for Public Engagement, based on specific
expertise that they had. These are planned to be run before 30 September 2012. We are hosting a
further Training the Trainer Day on the 30th August and have emailed all the participants in the
Training the Trainer module to ask them to apply for support to run their first course.
6. The course resources, including PowerPoints, trainer notes, participant workbooks, activity sheets
and additional resource lists, were developed and hosted on an online Ning site. This site provided an
opportunity for people to download the resources and to ask questions of the trainers. Very few of the
participants have chosen to use the online space to ask questions, although most of the participants
in the courses have joined the Ning to download materials. Several people have contacted us by
phone / email for specific support. This support has tended to be project specific, in terms of helping
to develop an evaluation plan relevant to their project. In addition, people have requested information
about specific evaluation methodologies, including sampling and analysing data.
7. BG2E courses run by others. The TtT course aimed to support trainers to run the course
themselves. Many of the participants are now planning to run the course, or adaptations of it, with
their own communities. We look forward to receiving feedback from trainers, to reflect on the
relevance of their course in their institutions. A couple of courses have already been run – and these
were well received, and met the learning outcomes.
Evaluation:
The evaluation plan was quite simple, but reflected the methodology taught in the courses: namely, to
plan the evaluation from the very start; to collect only data that could usefully be used; and to
integrate the evaluation into the project itself.
There were three elements to the overall evaluation of the project as a whole:
1. A survey was done before developing the content for the project to ascertain the nature of need
across the sector. This informed the development of the core courses.
2. Each of the three courses (BG2E, EM and TtT) was evaluated using a mixed-method approach
(baseline; questionnaire post event; online survey two weeks after the event; graffiti walls) to inform
the development of each of the courses and to ensure they were fit for purpose. This iterative
approach ensured that we met the key learning objectives of each course.
3. We also plan to evaluate the BG2E courses run by the trainers who attended the TtT course, using
the same mixed method approach.
Evaluation summary:
1. Survey
161 people responded to the survey with a good spread across subject disciplines and types and
locations of institution. In summary:
a) A wide range of outreach activities is currently provided, and the scope of information about
evaluation methodologies needs to accommodate this range
b) A significant proportion of practitioners already undertake evaluation
c) The two main purposes of the evaluation currently undertaken (for improvement and for
impact) are consistent with the planned twin foci of the training programme to be developed
d) Attitudes towards evaluation are generally very positive
e) There is significant interest in training in evaluation
f) On-line resources are the most popular way of accessing this support
g) Workshops are also a preferred method for two thirds of respondents
h) Current on-line resources for evaluation provided through the National HE STEM Programme
website appear to be under-used
i) There are no major differences in responses between the different subject disciplines
j) Those not interested in receiving further support show a less positive attitude towards
evaluation
2. Evaluation of courses
(a) BG2E course
Summary:
2 Short pilot sessions: 30 participants.
3 Courses: 34 participants
Evaluation headlines:
• Delegates indicated increased confidence in evaluating their current activities having attended
  the BG2E course, and that this confidence is maintained, even after the initial enthusiasm for
  the course.
• Only two participants indicated they were less confident to evaluate their activities following
  the course, but they both indicated this was because they now had a deeper understanding of
  evaluation and that their confidence before the course was based on misunderstandings
  about the nature of evaluation practice.
• Best things about the training course included the content (13 out of 33); discussion between
  delegates (11 out of 33); and the interactive nature of the course (6 out of 33).
• 25 of the 27 delegates who completed the online survey were satisfied or very satisfied with the
  format / design of the workshop; and 27 were satisfied or very satisfied with the content.
Related quotes:
What I learnt...
“A strong framework and resources for evaluation will really help me to feel I’m covering everything I
need to”
“Course made the logic feel very approachable”
The best thing...
“The opening sharing format – non threatening, interactive, lots of group exercises”
“Very friendly and accessible trainers and good resources”
“It was great that you took everything from bottom up and explained thoroughly. Really good training:
workshops, theoretical learning and learning from the group”
“Very warm and positive group facilitators (and funny too!)”
“I now know what the constituents of evaluation should be...”
“Requires much more thought than I had anticipated”
(b) Evaluation Masterclass
Summary:
1 pilot session: 15 participants
3 courses: 47 participants
Evaluation headlines:
Immediately after the course, 42 respondents filled in feedback cards. Delegates were asked to write
three words to describe the course; these have been translated into a wordle, where the size of each
word relates to the number of times it was quoted.
• 39 (of the sample of 42) agreed or strongly agreed that the content was relevant to them
• 40 people agreed or strongly agreed that they enjoyed the course
• 34 people agreed or strongly agreed that the course was pitched at the right level for them (a
  couple of respondents reflected that they would have benefitted from attending the BG2E
  course first; others wanted more detailed content)
• 38 people strongly disagreed or disagreed with the statement ‘I was disappointed with the
  course’
The best thing about the course for the majority of respondents was sharing and listening to other
people’s experiences; the main area for improvement was the time allocated to the course – although
there was also recognition that a longer course might not have been prioritised by delegates.
The online survey, run over two weeks after the course, had 18 respondents:
• 15 were satisfied or very satisfied with the format / design
• 16 were satisfied or very satisfied with the content
• 12 said they were more confident to evaluate their activities, with 6 saying they were neither
  more nor less confident. Of those who said they were more confident, the majority referred to
  new ideas, resources and perspectives as the reason. Of those who said they were neither
  more nor less confident, 2 said they were already very confident, and 2 said they appreciated
  the course content and discussions but that this did not affect their confidence.
• 13 said that they had fresh ideas about how to evaluate their work, and a further 9 had new
  ideas about how to share their evaluation with others.
Related quotes:
What did you learn?
“Useful thinking of the many / various aspects of evaluation. Using pre-evaluation to inform activity
design”
“That you should use your evaluation data!”
“Evaluation from different points of view”
“Interesting 1st discussion in the small group about catering to audiences”
“Different learning styles (e.g. reflective, artistic) call for different evaluation techniques – one size
doesn’t fit all”
The best thing was...
“The course – very strong on clarity of purpose, with good real-life examples”
“Meeting others with similar challenges to face”
“Different exercises, networking and discussion”
“Various nuanced discussions about evaluation – approaches, challenges etc.”
“Highlights areas to think about more so cuts down time needed to gain a start”
“Opportunity to discuss practice with like minded people”
Any other comments...
“Many thanks. I really appreciated the effort that had been put into designing the format and materials
for the training and the way it was being piloted with us for possible improvement.”
(c) Training the trainer
Summary:
3 sessions: 19 participants
Evaluation headlines:
Immediately after the course, 13 respondents filled in feedback cards. Respondents were asked for
three words to describe the course; these have been translated into a wordle.
• The 13 respondents to the questionnaire all agreed or strongly agreed that the content was
  relevant for them
• 12 out of the 13 respondents agreed or strongly agreed that the content was pitched at the
  right level for them, with 1 respondent strongly disagreeing
• All 13 respondents stated they would make use of the Ning site
• The majority of respondents (10) felt the three hour session was long enough, but there were
  2 people who neither agreed nor disagreed and 1 person who disagreed.
What one thing could be improved?
Time was the strongest theme in these answers; for example ‘It would need to be longer to more fully
introduce course material’, ‘A slightly longer time’.
Related quotes:
The best thing:
‘It’s a good structure which covers all the main chestnuts and helpfully emphasizes timing importance’
“Thank you very much for providing me the opportunity to reflect on and improve my practice in
evaluation and the wider public engagement movement. I am a great fan of the programme of
learning and will be recommending the NCCPE and using training materials on a regular basis both
within my employment role and role as a post-graduate student group member.”
The final evaluation is not complete as we are still getting feedback for the TtT course, and from
courses other trainers are running. The final evaluation report will be posted on the Ning site and the
SW Spoke regional website in September 2012.
Discussion, Learning and Impact:
The project has yet to realize its full impact due to the inevitable time delays between creating a great
course; training others to deliver it; and then the trainers setting up their own courses. Many of the
trainers we trained are planning events in the Autumn – so we have yet to determine how effective
these will be.
There was a strong demand for the actual courses – with several of them being over-subscribed.
There is a real appetite in the sector to develop effective evaluation work in order to demonstrate
impact and to improve practice.
The courses were appropriate to the needs of participants, and the resources proved useful in
developing more effective knowledge and understanding. The courses met the learning objectives,
and the iterative nature of the course development meant that we were able to constantly improve the
course over the training period, which was reflected in fewer suggestions for improvements as the
course developed.
A particular challenge has been reaching the right people to get involved in the training the trainer
sessions. A small number of people attended these sessions because they wanted to learn how to
evaluate rather than because they had an opportunity to train others.
A second challenge was the trainers’ own confidence in their evaluative practice, with several
requesting partnerships with us to deliver their first training session, as they felt they needed more
support.
Finally, in the original bid we made the assumption that offering small grants to institutions would help
them to run a training session. It has become apparent that what is really needed is support from a
trainer, rather than money to help fund the event. We have therefore offered this option to trainers
and are currently exploring whether this will enable them to get started. To this end we have arranged
another TtT day in August to support more institutions to get involved.
Further Development and Sustainability
Will the activity continue in the future?
(b) Yes (in a modified form)
The NCCPE plan to run additional training the trainer sessions – these will be offered to trainers for a
small charge to cover the costs of running the course. If there is demand, we also plan to run the
Evaluation Masterclass on a cost-recovery basis.
The plug and play modules will be run as and when required at conferences and events.
The resources will be available to people wanting to run courses for HE STEM practitioners on
evaluation; these will be accessible via the online Ning site and advertised on the NCCPE website:
www.publicengagement.ac.uk/evaluating-stem-outreach
In relation to the approaches to sustainability outlined below, we are very interested in
activities and commitments which have occurred within the timescale of the project.
However, we recognise that some approaches may still be in the development phase at the
official project end date and it would also be valuable to include these examples in the
template.
Approaches to Sustainability | Examples | In relation to your project

Continuance (finding alternative sources of funding)
Examples:
• Commitment from institutions to provide continuation funding
• Network/communities likely to be sustained through inclusion in future funding bids
In relation to your project: There is interest in the training courses, and the training the trainer
courses. We are keen to build on this momentum to ensure the resources are used to support HE
STEM practitioners to develop their evaluation work, to help ensure that their activities are developed
in more effective ways, and the impacts of the activity are better understood. The Ning site will
continue to be supported, as long as participants find it a useful forum to develop their evaluation
practice.

Embedding (within institutional activity)
Examples:
• Identification of institutional strategies that the project has informed
• Uptake which has taken place, or is likely to take place, within own/other HEIs
• Influencing of organisations external to the HE sector which has occurred through partnership
  working
In relation to your project: Trainers trained through this process can make use of the resources in
their own institutions. There have already been several courses run by trainers who have attended the
TtT courses, and these have been really effective at delivering against the learning objectives. We are
currently offering trainers support to run their first course, where necessary.

Mainstreaming (changes in working practices)
Examples:
• Staff development which is planned or has taken place as a result of your project
• Curriculum enhancement that has occurred or is likely to take place as a result of your project
• Influence of senior managers that has arisen as a result of the project
In relation to your project: The main development has been of HE STEM practitioners who have
developed their skills, knowledge and confidence in evaluating their activities. We hope to see the
impact of this through engagement with the Ning site, but it is early days to see how effective the
training is in informing practice.

Legacy (passing on important elements of the project)
Examples:
• Networks/communities likely to be continued
• Dissemination of project outputs
• Evidence of impact of activities
• Creative Learning Journey material made available via the SW Spoke Creative STEM website
  where relevant
In relation to your project: The Ning site will continue, and provides support for people wanting to
develop their evaluation plans and/or deliver training courses. In addition, more resources can be
developed and added to the site by trainers and participants, as well as NCCPE.
References:
All resources are on the Ning site, including lists of articles that are of relevance to HE STEM
practitioners developing their evaluation work.
Quotes:
“I’ve learned lots of new techniques and being able to discuss their use in different situations
has made me quite look forward to evaluating activities, something I never thought I’d say”
Participant BG2E
“I’ve realised how much I already know, my knowledge has been refreshed and I’m also
excited about doing it instead of nervous”
Participant BG2E
“I feel with the frameworks/resources provided I can develop a really good evaluation plan”
Participant BG2E
“The best thing was the practical exercises, especially looking at the evaluation of real
projects”
Participant EM
Many thanks!