Dr Laura Meagher, University of Edinburgh
Turning Practical Challenges of Evaluation into Practical Opportunities
ESRC/DCMS Public Policy Seminar
‘How do we know what works?’
8 October 2010
Laura R. Meagher, PhD
Technology Development Group
Outline
- Introduction
- Twofold Challenges
- Premises & Approach
- Flows & Processes
- Categories of Impacts
- Example: Distribution of Impact Categories
- Example: Timing of Impacts
- Capturing Tacit Knowledge about Processes
- Some Key Factors
- Example: Practical Use of Evaluation Findings
- Conclusions
Laura Meagher
Technology Development Group
Honorary Fellow, ESRC Innogen Centre, University of Edinburgh
- PhD natural sciences; 25 years’ experience in US and UK science, innovation, economic development, strategic change
- Design, catalysis & leadership for novel initiatives and strategic alliances across disciplines, institutions & sectors; knowledge exchange
- Evaluation
Introduction
Why evaluate?
- Accountability
- Learning
- Making evaluations “strategically useful”
- Analogies between evaluations of societal impacts of research and evaluations of impacts of policies

Twofold Challenges
Increasing demand for impacts as (research) ‘Return On Investment’

Challenges of capturing benefits
- “impacts” can be subtle, elusive & diffuse
- many impacts are likely to be long-term, taking place in various forms, at multiple levels and at various times
- attribution of causality is a nightmare

Challenges of generating benefits
- actual (KE) processes can be subtle & elusive
- improvement of (KE) processes requires attitudinal & behavioural changes at multiple levels

Evaluation can meet demands for accountability… and more
Synthesis & Robust Triangulation
- Multiple perspectives
- Multiple methods (e.g. document analysis, surveys, semi-structured interviews, case studies, focus groups)

Premises
- It may be possible & useful to tease out process-based near-term impacts, or indicators of ‘impacts-in-progress’ (proxy indicators)
- If evaluation of impacts recognises the importance of processes (not just formal outputs), evaluation will be more appropriate and processes can be improved to enhance the likelihood of future impacts
Approach
- “Formative” evaluation in the broadest sense
- Learning from multiple evaluations
- Deep evaluation capturing insights & lessons learned
- Deconstructing processes & roles leading toward impacts… learning for future improvement
Conceptual Framework
[Diagram: a conceptual framework to elaborate flows of knowledge and interactions generating impacts from research, set within societal issues, external influences, and national & local research cultures. Actors: Knowledge Beneficiaries (wider publics); Knowledge Users (policymakers, practitioners); Knowledge Brokers & Intermediaries (media, funders, professional associations, individual knowledge intermediaries); Primary Knowledge Producers (researchers). Key: individuals (or sub-disciplines) within wider organisations; organisations and institutions.]
See: Meagher, L., Lyall, C. & Nutley, S. (2008), “Flows of knowledge, expertise and influence: a method for assessing policy and practice impacts from social science research”, Research Evaluation 17(3): 163-173
Categories of Impacts
- Instrumental
- Conceptual
- Capacity-building
- Attitude/Culture Change
- “Enduring Connectivity”
Distribution of Categories of Impacts
- Instrumental impacts: environmental improvement (59%), risk mitigation (35%), societal benefits (29%), service improvement (22%), improved productivity (12%), cash-releasing (3%)
- Conceptual impacts: increasing the evidence base (62%), knowledge exchange/uptake/learning (60%), wide dissemination (22%)
- Capacity-building impacts: training/capacity-building (28%)
- Attitude/culture change: improved reciprocal understanding & willingness to work together (22%)
- Enduring connectivity: continuing communication between researchers and stakeholders (35%), follow-on collaborations (26%)
Timing of Impacts

Impacts                Instrumental   Capacity-building   Conceptual   Attitude/culture   Enduring connectivity
Short-term (0-2 yrs)        35               61               67              42                  66
Mid-term (2-5 yrs)          27                2                9              12                   9
Long-term (5+ yrs)          10                0                4               2                   0
Cannot estimate             29               37               20              44                  26
Capturing Tacit Knowledge about Processes
- Satisfaction, views on effectiveness
- Obstacles
- Positive factors
- Key roles
- Lessons learned, good practice, advice
Some key factors perceived to enhance likelihood of impact
- Value placed upon, and incentives provided for, generation of impact
- Facilitating role(s) of “knowledge intermediaries”
- Two-way interactions between researchers and users
- Communication / increasing accessibility of research
- Injections of financial support, dedicated staff, infrastructure
“Golden Rules for Research Impacts”
www.sniffer.org.uk
Conclusions: Practical Opportunities
- Confronting the accountability conundrum:
  - rigorous qualitative & quantitative evaluations
  - process-embodied impacts & interactions, short-term ‘impacts-in-progress’ & possible proxy indicators of future impacts
- Multi-tasking during evaluations:
  - identifying impacts and impacts-in-progress
  - opportunities to help participants reflect upon & share insights into key issues, processes, roles
Conclusions: Evaluators as “Knowledge Intermediaries”
- Influencing the future:
  - encourage implementers to identify, share and use evaluation learning
  - use the emerging body of tacit knowledge to inform organisational learning, thus enhancing the likelihood of future impacts
Evaluators can play a key brokerage role as “Knowledge Intermediaries” between tacit knowledge and organisational learning
Contact
Laura.Meagher@btinternet.com