 JISC Business and Community Engagement Programme
Embedding Impact Analysis in Research Using BCE
Practitioners (Grant Calls 15/11 and 20/11)
PERO PROJECT CASE STUDY Report - November 2012
This PERO Project Case Study summarises one of a portfolio of nine Embedding Impact
Analysis in Research Using BCE Practitioners projects funded by the JISC BCE programme.
Funding objective: to spread BCE and research information management expertise more widely
across the sector in order to enhance the capabilities of research groups to identify, analyse and
articulate the benefits of their research to the economy and society.
All Case Studies will be analysed and synthesised by the Facilitation and Synthesis project, run
by the National Co-ordinating Centre for Public Engagement (NCCPE), which will produce a
synoptic and publicly-available Compendium of Good Practice in BCE- and technology-enabled
research impact analysis, for the benefit of the wider sector.
Section 1: Project Details
1.) Project Partnership and Contact Details
Project Title: Public Engagement with Research Online
Project Type A or B: Type A
Lead Institution: University of Warwick
Partner institutions: Open University, University of Cambridge, Portland State University
Primary Contact: Dr Eric Jensen
Role: Impact Project Manager and Principal Investigator
BCE Practitioner(s): Prof David Ritchie, Nicola Buckley
Information Management specialists: Dr Trevor Collins, Dr Eric Jensen, Neil Gatty
Section 2: Executive Summary
2.) Executive Summary
Background
Researchers are increasingly engaging with publics online. Yet effective approaches for
capturing and analysing impacts of public engagement through this medium are not fully
developed. This project developed an evaluation framework for analysing the reach and
significance of online public engagement with research. This framework, elaborated through a
case study, documents the spread and impact of research findings as they diffuse within the
online public sphere. In addition to this framework, the project assessed how public perspectives
on research findings could be used to fulfil upstream public engagement or public involvement in
research goals as part of long-term impact generation.
The project team developed an evaluation framework that integrated existing tools such as freely
available web analytics software (i.e. Google Analytics) to draw upon the strengths of this
technology as part of a more in-depth process of online impact analysis than has previously
been articulated.
To address this JISC funding call, a cross-disciplinary team of researchers and practitioners
collaborated to develop a framework for online impact analysis and related resources. The team
brought together a wide range of experience and expertise, including BCE practitioners, impact
analysts, academic researchers from the Warwick economics research group CAGE and
information management specialists. The lead impact analysts were Dr Jensen (Sociology,
University of Warwick), Dr Staniszewska (Royal College of Nursing Research Institute, University
of Warwick) and Dr Collins (Knowledge Media Institute, The Open University), supplemented by
Prof. Ritchie (Communication, Portland State University), and Monae Verbeke (Warwick). The
lead information management expertise came from Dr Collins above, as well as Dr Jensen and
Prof. Ritchie (who are each skilled in information management as well as impact analysis), plus
Mr Gatty (Economics, University of Warwick) who brings expertise about Warwick IT systems. Dr
Jensen and Prof Becker (Economics, University of Warwick) led project co-ordination from the
research group perspective, with a further researcher perspective from Prof. Oswald
(Economics, University of Warwick).
Practical Framework for Analysing Impacts of Online Engagement
In developing a framework for researchers to analyse the online impacts of their research, the
Public Engagement with Research Online (PERO) team sought feedback and suggestions from
both academic researchers and BCE practitioners by holding workshops, primarily within the
University of Warwick. The case study (focusing on the CAGE research group) employed
quantitative and qualitative analysis of web-based public discussion responding to research,
framed within a theoretically and methodologically robust impact evaluation framework.
A few points were reiterated by many of those individuals that PERO worked with in the
development of this tool:
• The tool had to be simple to understand.
• The framework needed to be explicit in how it was to be used.
• If necessary, the framework needed to identify sources of training in particular methods.
Therefore, we have proposed a framework that is straightforward to implement, yet capable of
producing robust and valid impact evaluation results. This framework illustrates the impacts of
seeking to engage publics with an original piece of research online.
The framework consists of four steps (an illustrative sketch of step 3 follows the list):
1. (If appropriate), generate and insert Google Analytics code on relevant (e.g. institutional
or personal) webpages communicating research ideas that are the subject of the impact
evaluation.
2. Gather and validate keywords from Google Analytics, the web, and/or in-person events.
3. Use the acquired, validated keywords to gather online public discussions that reference the
key themes and/or the original research, quality-checking the resulting data.
4. Analyse randomly selected webpages / discussions qualitatively and/or through
quantitative content analysis.
Case Study
This framework of online public engagement impact evaluation has been applied to a specific
instance of online public engagement conducted by Professor Andrew Oswald. Oswald is an
applied economics and quantitative social scientist at the University of Warwick. His research
has focused on the economic and social determinants of happiness and well-being. In 2004,
Oswald released a piece of research "Money, Sex, and Happiness: An Empirical Study" with
David Blanchflower in the Scandinavian Journal of Economics, which was then followed up with a number of
offline and online public engagement events, including:
• Organiser, International Conference on Happiness, Adaptation, and Prediction at Harvard University, 2007
• "Happiness PowerPoint Talk: Esmee Fairbairn Lecture, Lancaster", November 2006
• "How did we get into the crisis, and how will human happiness be affected?" TEDxWarwick, February 2009
• 'Happiness, Lottery Winners and Your Heart', University of Warwick, July 2009
• 'Modern Society and the Economics of Happiness', University of Warwick Podcast, September 2011
Conclusions
Case Study. The PERO project’s proposed online public engagement impact evaluation
framework involved capturing, analysing and generating reports on the reach and significance of
Oswald’s online engagement using established online analytical tools (Google
Analytics) and social research methods. The final component of the framework requires analysis
of randomly selected webpages through qualitative and/or quantitative content analysis.
Quantitative content analysis methods can be used to identify the frequency with which important
concepts are being articulated or used by online discussants, whilst qualitative analysis uncovers
patterns in the reception of research ideas within the online sphere.
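A minimal sketch of the quantitative side of this final step is given below (Python, illustrative only): it counts how often a set of key concepts appears across the collected discussion texts and then draws a random sample of pages for qualitative close reading. The concept list and the validated_pages mapping are hypothetical placeholders for the data gathered in the earlier steps of the framework.

    # Illustrative sketch only: simple quantitative content analysis plus
    # random sampling for qualitative reading. All inputs are hypothetical.
    import random
    import re
    from collections import Counter

    CONCEPTS = ["happiness", "income", "well-being"]       # concepts of interest
    validated_pages = {                                     # url -> page text (from step 3)
        "http://example-forum.org/thread/123": "does more income buy happiness ...",
        "http://example-blog.net/money-and-happiness": "well-being and money ...",
    }

    # Quantitative: frequency of each concept across all collected discussions.
    concept_counts = Counter()
    for text in validated_pages.values():
        for concept in CONCEPTS:
            concept_counts[concept] += len(re.findall(re.escape(concept), text.lower()))
    print(concept_counts.most_common())

    # Qualitative: randomly select a manageable subset of pages for close reading.
    sample_size = min(2, len(validated_pages))
    for url in random.sample(list(validated_pages), sample_size):
        print("Read and code qualitatively:", url)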
Theory. The online impacts of public engagement with research are primarily visible through
informal discussions that occur on a range of web platforms. These discussions are important
due to the crucial role of informal conversation as a key medium through which public
engagement impacts develop, as well as a potential site for ‘pooling’ of public sentiment.
Relevant theoretical perspectives for articulating these impacts (and highlighting relevant impact
evaluation opportunities) include: the online public sphere, social change and diffusion of
innovation theories. Further theoretical and empirical work is needed to flesh out these initial
ideas and subject them to systematic investigation.
Knowledge Exchange. As part of building relationships and contributing to knowledge
exchange across disciplines, four institutions with a robust set of skills and expertise
collaborated to develop this framework. As a result of the project, project partners were able to
connect with a wide range of BCE practitioners and information management experts. This
broad engagement with impact analysis and information management specialists enabled CAGE
to develop an improved evaluation strategy. To enhance the broader impacts of this project and
build better links between researchers, BCE practitioners and information management experts,
we are committed to sharing our learning from this project as widely as possible. We have begun
this process through a joint dissemination event with the TDI project held at Warwick University
and will extend the sharing process through web-based dissemination of project working papers
and presentations at the NCCPE ‘Engage’ conference and other key venues.
Section 3: Issues, Changes and Impact
3.) The impact or benefits analysis problem the research group was trying to solve
This project aimed to enhance the institutional capabilities and processes required for identifying
and analysing public engagement impacts of researchers who are active online. Experts in the
theory and practice of public, business and community engagement, as well as information
management and impact analysis, have collaborated in the development of this framework.
As one of the methods of impact generation recognised by the REF, public engagement has
continued to gain prominence in UK higher education. Online media-based public engagement
has the potential to reach millions worldwide, in contrast to face-to-face public engagement,
which is limited by the researcher’s ability to physically reach the audience. With the expansion
of online public engagement (e.g., through Twitter and many other platforms), the scope for a
more dynamic two-way mode of public engagement has increasingly become a focus for higher
education institutions, with expanding use of blogs, online media and social networking activities
aimed at engaging publics with research. However, online public engagement has outpaced the
development of frameworks for capturing, analysing and accurately representing its impacts. In
particular, the parameters, best practice approaches, and identification of good information
management tools for measuring the reach and significance of online public engagement are yet
to be fully articulated for researchers interested in measuring online impacts. This problem (and
opportunity) was the most important reason for this project.
Our research project has revealed how existing impact analysis tools and insights can be
marshalled to benefit a large and growing number of researchers both in the UK and around the
world who engage publics with their research online. In the first instance, the tool identification
and analytic frameworks have been developed for the specific case of the CAGE research group
at the University of Warwick. However, the approach is applicable to a broader spectrum of
researchers external to Warwick. The case study focuses on how one instance of online research
engagement at CAGE can be evidenced, and describes the tools we employed.
The research group that is the focus of this bid is CAGE, the ESRC-funded Centre for Competitive
Advantage in the Global Economy at the University of Warwick. CAGE has internal Research
Associates drawn primarily from the Department of Economics and external Research
Associates who are renowned economists from university departments worldwide. CAGE fully
subscribes to the ESRC's impact agenda, and actively works to engage non-academics with its
research both online and off-line. Understanding the impact of research performed by CAGE
researchers is of paramount importance in evaluating the return on the ESRC's investment in
CAGE research. CAGE researcher and Warwick Economics Professor Andrew Oswald is widely
known for his research into happiness and will be used as a particular focal point for the case
study. Oswald is highly visible online and in the full array of news media where he has given
interviews to newspapers, magazines, radio and TV, and is actively engaging in public
discussions via op-eds, letters to the editor and public events. While indicators like number of
media mentions and broadcast appearances can be relatively easily measured, understanding
how the research is perceived, used and achieves impact in the public sphere is of particular
interest to us. Specifically, the PERO project’s aim was to understand, based on this case study
of Professor Oswald’s work, how research findings can spread online and with what kinds of
impact.
4.) The as-is impact analysis capability and process within the research group
Prior to the start of the project, the primary component of impact analysis for the CAGE research
group consisted of relatively unsystematically gathered quantitative indicators of reach, such as
the number of media mentions and broadcast appearances. These indicators have allowed
CAGE to measure their website’s performance and
gather rudimentary intelligence about the effectiveness of their online dissemination activities
within the bounds of the University of Warwick official webpages featuring Oswald’s work.
However, these indicators fall short in understanding how the research is perceived, used and
what (if any) impact is achieved in the broader online public sphere beyond the institutional
webpages and official media mentions.
Dr Sascha O. Becker is the Deputy Director of CAGE with particular responsibility for research
impact. He is involved in CAGE's public events series and social media activities. He brought a
wide-range of analytical skills and was, therefore, equipped to assist in developing a framework
for measuring impacts of research engagement online. Additionally, Dr Eric Jensen is an
Assistant Professor at the University of Warwick and the Impact Project Manager for the CAGE
research group. He is a leading expert in evaluation and media research, with numerous peer-reviewed
publications on public engagement in top journals such as Public Understanding
of Science, and he is co-editor of the recently published book Culture & Social Change: Transforming
Society through the Power of Ideas. He has experience using web-based methods to enhance
impact and undertake impact analysis, especially through his prior role as research fellow on the
ISOTOPE (Informing Science Outreach and Public Engagement) action research project (funded
by NESTA – isotope.open.ac.uk).
5.) The expertise provided by the BCE impact analyst(s)
An element of this project called for the involvement of three BCE impact analysts. These BCE
impact analysts contributed a number of business and community engagement skills to the
combined project team competency and to the development of the PERO evaluation framework.
As Head of Public Engagement, Nicola Buckley manages the public engagement team at the
University of Cambridge and is a member of the Impact Working Group for REF at University of
Cambridge. As part of her role, Ms. Buckley conducts or commissions a number of evaluations
and web-based surveys each year of various public engagement activities. Ms. Buckley has
managed Cambridge’s Festivals and Outreach team since 2004, delivering and evaluating seven
annual Cambridge Science Festivals and four annual Cambridge Festivals of Ideas. Additionally,
Ms. Buckley researched and wrote toolkits and case studies for the National Co-ordinating
Centre for Public Engagement on ‘festival-based public engagement’, and organised a linked
national seminar in July 2011. As a member of the Impact and Quality Working Group of the
European Science Communication Events Association, she provided input into the public
engagement aspects of the REF pilot exercise. Although all of Ms. Buckley’s experience has
informed the development of the framework, her skills in evaluating learning outcomes among
members of the public, using frameworks such as Generic Learning Outcomes and other
evaluation methodologies, were particularly useful in developing best practice for evaluating
online engagement with research.
Dr. Sophie Staniszewska is also participating as a BCE practitioner. The research team is
fortunate to have Dr. Staniszewska on the team, as she is a leading expert in incorporating
public views and voices in research to promote impact. These specialties translated well to the
major aspect of the case study by identifying how analysing online discussions could feed into
the research and impact process. Having led the research programme at the NHS Centre for
Involvement on secondment from the RCN Research Institute, Warwick University, Dr.
Staniszewska has extensive experience and expertise in conducting research on patient
experiences, evaluations and promoting public involvement in research.
Also very skilled in information management, the next BCE impact analyst partner was Dr.
Trevor Collins, a Research Fellow in the Knowledge Media Institute, The Open University. Dr.
Collins is particularly concerned with iterative, participatory approaches to designing and
developing support for learning. As a Research Fellow for the last thirteen years, working on
technology-enhanced learning, semantic web and online community research projects, funded
by industry, the UK research councils and the European Commission (EC), he has developed a
number of digital learning tools. His work uses impact analysis to engage users and other
stakeholders in the co-design and development of technologies to support their practical tasks or
learning processes. Dr. Collins develops a series of prototypes as part of an iterative
participatory development process to facilitate and analyse direct user feedback and refinement
(referred to as agile methods). He has a broad range of experience in analysing people’s
understanding and use of technology to support learning and engagement. The business and
community digital engagement skillset Dr. Collins possesses contributed to the judicious use of
existing digital technologies in the PERO evaluation framework.
6.) The expertise provided, and lessons learned, by the information management
specialist
The research team consisted of three information management specialists: Dr David Ritchie, Dr
Eric Jensen and Neil Gatty. These experts contributed to the development of the evaluation
through their expertise in varying forms of digital and public communication management. Dr
David Ritchie is a Professor of Communication at Portland State University. His expertise in
language (including authoring numerous books on the topic) extends to prior research using
linguistic corpus analysis, with an emphasis on discourse in urban communities and
cultural institutions. Dr. Ritchie’s specialty, research on the use of language in social interaction,
was of particular value, as the evaluation framework stemmed from the team’s
discussion of the significance of discussion in the digital public sphere for evaluating the impacts
of online research dissemination. Drawing on his primary research focus of metaphor use, storytelling,
and humor in naturally occurring discourse, the team was able to construct an ideal impact
evaluation scenario.
Dr. Eric Jensen’s expertise in information management for impact analysis is of particular
importance, as he has applied it to his work in evaluation and media research. Dr. Jensen is a
leader in rigorous research methods for impact evaluation. As part of his methodology, he has
employed a number of information management technologies in quantitative, qualitative and
mixed-methods research. Dr. Jensen has set high standards of thinking in evaluating the
dissemination of data and information. His expertise has been advantageous in the development
of the evaluation framework, as well as in theorising methods of articulating online impacts.
Drawing on Dr. Jensen’s wide network in evaluation methodology has benefited the
dissemination of this evaluation framework, as a number of research institutions (31 to date)
have taken an interest in implementing the evaluation framework within their online engagement.
Finally, Neil Gatty, Computer Support Assistant in the Economics Department, University of Warwick,
has contributed his expertise significantly to the project. Mr. Gatty ensured the necessary
information management software was incorporated within the existing relevant online sites.
Through Mr. Gatty’s participation the team has been able to access and use the appropriate
online data and information.
7.) The technologies and business intelligence practices and resources which were tested
and deployed, and their origin
The online technologies deployed in this project were carefully selected and researched by the
project team. The research team recognised, through extensive discussion with communication
experts and academic researchers, that the natural starting point for the online impact
evaluation framework was web analytics. Web analytics refers to
the study of user data from websites. Web analytic services operate by providing a tracking code
that is added to each web page. Every time the page is viewed the web browser executes this
code, which sends the page access data to a server. A number of web analytic software
packages were analysed by the research team for effectiveness, reliability and ease-of-use.
Google Analytics was selected as the best tool for web analytics for a wide research audience.
Google Analytics uses tracking codes to send the web browser’s configuration settings, the
device’s Internet address and the requested page address to Google. If the user has agreed to
permit cookies to be accepted by their browser, then additional information stored in the browser
can also be sent to the server, such as whether the current page is being accessed for the first
time or not. The page access data held by the analytics service (i.e. Google) is used to generate
a set of interactive reports that can be accessed on the Google Analytics website. Currently, five
reporting areas are included, namely: audience, advertising, traffic source, content and
commerce. This project particularly uses the traffic source reports, which show the sources
from which visitors access the website, grouped by search engine, referral links and direct
addresses. These reports provide the basis for locating public conversations in online public
spheres. In adopting a web analytics service, it is critical to allocate the resources needed to
ensure all of the pages of interest are monitored (i.e. that they include the correct tracking code)
and to rigorously interrogate and validate the reported findings, so that impact of engagement
activities can be monitored and improved.
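As a rough illustration of how the traffic source reports can feed the rest of the framework, the Python sketch below reads a CSV export of a traffic-sources report and totals visits by source; the file name and column headings ("Source / Medium", "Visits") are assumptions about the export format rather than a documented Google Analytics schema, so they would need checking against an actual export.

    # Illustrative sketch only: summarise a traffic-sources CSV export by source.
    # The file name and column names are assumptions, not a documented schema.
    import csv
    from collections import Counter

    visits_by_source = Counter()
    with open("traffic_sources_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            source = row.get("Source / Medium", "unknown")
            try:
                visits = int(row.get("Visits", "0").replace(",", ""))
            except ValueError:
                visits = 0
            visits_by_source[source] += visits

    # Referral sources suggest external sites where the research is being discussed;
    # these can seed the keyword validation and discussion-gathering steps.
    for source, visits in visits_by_source.most_common(10):
        print(f"{source}: {visits}")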
8.) The particular competencies developed and awareness raised
Through the development of this framework, the impact analysis capabilities and competences of
the research group have undergone significant change. One of the objectives of this study was
based on the premise that, to develop the research group’s impact analysis skills, it would be
beneficial for the group to interact with a number of BCE and information management
specialists. By collaborating with these experts, the research group could maximise its ability to
implement impact evaluation theories. Knowledge of forms of engagement, evaluation
methodologies, reaching target audiences and reporting impacts has formed the foundation for
effectively developing a group-wide evaluation strategy. The
partnerships developed as a part of this project will continue to enhance the capabilities of the
research group through further collaborative project development, training for research group
members and BCE networking opportunities.
9.) How the research group has benefited from the project, including lessons learnt by the
researchers
The learning from this project has been instrumental to the development of quality online
engagement impact analysis. The research group has particularly benefited from this project by
identifying where its previous evaluation procedures were inadequate for producing concrete
impact statements. CAGE has benefited from investment of both time and expertise
into their evaluation infrastructure. The project is not an absolute solution for the issues
surrounding the group’s deficit in public engagement impact evaluation. The group realises how
necessary it is to increase measurement and demonstration of their impacts. Participation in this
project has presented an opportunity for CAGE to designate impact evaluation as a priority and
contemplate techniques to enhance engagement with the public.
CAGE is committed to creating an infrastructure to improve its evaluation of the impacts of its
online public engagement activities. Drawing on this insight, the CAGE research group, using the
results of Professor Oswald’s case study, is developing an evaluation strategy based on the
recommendations of the BCE practitioners in the research team. Explicitly, CAGE realises there
are a number of factors that need to be addressed to conduct online impact analysis, such as
identifying the stakeholders who are, or will be, engaging with its research online, and employing
evaluation tools before the end of each individual research project or dissemination event.
As a result of the project, CAGE was able to connect with a wide range of BCE practitioners.
Participation in this project, primarily through working with BCE practitioners across the field of
public engagement, enabled CAGE to commit to an improved evaluation strategy. One outcome
of the research project is a potential evaluation study: a collaboration between two of the BCE
practitioners and CAGE to produce an impact study focusing on CAGE’s outreach component.
CAGE is motivated to encourage collaborative work with BCE and
information management experts within the University of Warwick and promote the project
outputs to similar research teams at Warwick and beyond.
10.) Lessons learnt by the BCE practitioners
The BCE practitioners indicated a number of ‘lessons learned’ in developing the evaluation
framework, as well as in the best practices of online public engagement. The BCE practitioners
see online public engagement as a potential tool for reaching and impacting a wider range of
publics than may be tangibly possible through face-to-face public engagement interactions
alone. However, online methods may be less effective than traditional engagement strategies if
they are not evaluated and monitored.
Nicola Buckley noted that online dissemination of research spans a spectrum of knowledge
content in which more or less is at stake for public participants in online dialogue. Individuals
engaging with online research features (such as news articles, journals and blogs) tend to show
a wide degree of interest, but little emotional or cognitive involvement. By contrast, participation
in online forums or discussion groups may lead people to reflect on aspects of their own lives
and to enter into debates with others about the relative value people place on commodities or
ideas. Conceivably, there is also an online public sphere in which still more is at stake for the
people participating. This context may lend itself best to researchers creating greater
opportunities to engage actively in online dialogues, feasibly over extended periods of time.
From this engagement, the researcher then has greater scope to create measurable impacts.
The project has made it apparent that guidance is needed to train researchers in public
engagement so that they can make the most of this opportunity and meet the increasing
demands to demonstrate the impacts of online public engagement. Researchers need to be
specific about their public engagement objectives, which will help them identify suitable
evaluation strategies.
Dr. Sophie Staniszewska stated that she “…found the project fascinating as it has enabled [her]
to explore the potential of online forms of involvement and engagement, which is completely new
in the field of health services research. Through this work we hope to make a conceptual and
methodological contribution to thinking in this field. It has been great working with colleagues
within the University, at JISC and across other Universities and organisations and I look forward
to future collaborations.” Currently, Dr. Staniszewska is collaborating with project partners, from
both the research group and other BCE practitioners, in developing an outline for a paper
concerning online public engagement practices and impact evaluation.
11.) Process, technology approaches and other changes agreed as a result of the
experience, and future implementation plans
Using the project team’s evaluation framework, CAGE and the project team have evidenced
changes in their approaches to using technology to evaluate impacts. At the project’s outset the
team envisioned working with the CAGE research group to change or adjust a number of factors.
A significant objective was to improve the in-house capabilities for impact analysis. Prior to the
beginning of the study, the CAGE research team simply possessed an initial draft impact case
study, developed for REF 2014, with no instruments or frameworks in place for measuring online
public engagement impacts. Following the publication of the PERO impact framework, CAGE
now possesses a variety of new capabilities for impact analysis. CAGE participated in several
workshops with the project team, developing and refining theory and skills in impact analysis.
Through this process, CAGE has developed a stronger link to BCE practitioners. This will
facilitate the development of future evaluation strategies.
An additional objective was for the CAGE team to increase their knowledge and confidence in
using impact analysis. Previously, CAGE used relatively crude forms of impact analysis.
Discussion arising from the workshops and project meetings demonstrates the research group’s
improved knowledge of impact evaluation strategies. The research group has become more
aware of these analytical tools and has agreed to participate in further evaluation training and
collaboration with BCE impact analysis experts.
12.) How learning has been shared from the project within and beyond the institution.
To enhance the project experience and to create a stronger research network amongst
researchers, BCE practitioners and information management experts, it is imperative to the
project team that our learning is shared beyond the primary research institution. The team’s
emphasis is both to communicate our findings and seek further input within our individual fields
and institutions, and to engage researchers and practitioners across disciplines. Each of the
team members is participating in a number of events aimed at reaching a wide and diverse
audience. These events include:
• A collaborative dissemination workshop at the University of Warwick, conducted with the
Tracking Digital Impacts team. This workshop succeeded in attracting individuals from 31
different universities across Britain and included BCE practitioners, researchers,
information management specialists, and university administrators. This workshop
offered an introduction to the project and its outcomes, as well as an opportunity for
feedback and further discussion about the possible future utility of the tools and
frameworks we have employed.
• A number of meetings with researchers and BCE practitioners at universities in the UK,
Europe and America, aimed at increasing the worldwide knowledge exchange relating to
the theory and practice of evaluating online public engagement impacts.
• The Engage 2012 Conference held in Bristol and sponsored by the NCCPE.
• The team will disseminate our work extensively online through our project website and
existing university, research and BCE networks.
• We will also disseminate the work completed during this project through a number of
publications, of which two are currently in development.
In addition, each of the project team members has committed themselves to proactively
embedding the framework within their host institution and future research activities. For example,
both Nicola Buckley (Cambridge) and Dr Sophie Staniszewska (Warwick) have indicated
potential for this project to be carried forward into external impact evaluation projects.
Section 4: Additional Observations
13.) Any additional findings, reflections or implications relevant to a wider audience?
As a part of the final project report, the project team has developed a series of ‘working papers’.
Each of these papers discusses separate themes:
• Theory of online public engagement with research impacts
• A practical framework for evaluating online public engagement with research
• Comparing traditional forms of patient and public involvement in health and social care
research with online forms of involvement and engagement
• Web Analytics: Their role in demonstrating the impacts of online public engagement with
research
These papers, combined with videos and reports from dissemination events, will form the lasting
legacy of this research project. Currently, these resources can be found at Dr. Eric Jensen’s
University of Warwick homepage.
Section 5: Acknowledgements and references
14.) Acknowledgements and references
The project team would like to thank the CAGE researchers, Warwick communications, impact
and related staff, as well as the attendees at the NCCPE and PERO/TDI events, who generously
contributed their ideas to this project. We would also like to thank the Tracking Digital Impacts
(TDI) team and JISC representative Simon Whittemore for their contributions and collaboration
on the dissemination event held at the University of Warwick in November 2012.