Research Information Literacy and Digital Scholarship (RILADS) Apr 2013

Project manager: Stéphane Goldstein, Research Information Network
(stephane.goldstein@researchinfonet.org)
Author: Dr Charlie Inskip
Location: http://rilads.wordpress.com/
Date: Friday April 12, 2013
Revised: Friday April 19, 2013
Final: Tuesday June 4, 2013
Contents
1. Executive summary
2. Project Aims
3. Methodology
a. Sample
b. Shortlisting
c. Analysis
4. Promotion of project
5. Findings and discussion
a. Who is the course or resource designed for, and why?
b. What knowledge, skills and competencies is the course or resource
intended to provide?
c. How is the course or resource delivered?
d. Criteria for assessing courses or resources
6. Revisions to criteria
7. Summary and conclusion
8. Next steps
9. Appendices
a. Final shortlist
b. Long-list
c. Returned forms
d. Press release
e. CILIP Update news story
f. Evaluation form
g. Skills list
h. Short listing process
1. EXECUTIVE SUMMARY
This document summarises the work to date (Oct 2012 – Apr 2013) on the RIN /
SCONUL information literacy and digital scholarship project known as RILADS
(http://rilads.wordpress.com/). The aim of the project is to deliver a small number
of key outputs contributing to a wider investigation into the support available to
students, staff and researchers to enhance digital literacy. This report focuses on
the first strand (RIN) looking at the identification and promotion of good practice
in information training in UK HE. The promotion strategy, using social networks,
print media and personal contact led to the gathering of a long list of 42 potential
examples. Questionnaires, informed by the RIDLs criteria for describing and
evaluating courses and resources, were sent to named people, predominantly
from the area of Academic Library services, involved in delivering and developing
these resources. 27 completed forms were returned.
The questions covered three main areas:
• Who is the course or resource designed for, and why?
• What knowledge, skills and competencies is the course or resource intended to provide?
• How is the course or resource delivered?
A brief initial analysis was used to identify key themes
and patterns in the data. The questionnaires were then analysed in more detail
and a number of resources shortlisted and contacted for information relating to
the evaluation of their resources.
The results confirmed that the sample focused on post-graduate
delivery. Generally, resources had an introductory and flexible multi-session
multi-disciplinary focus, followed established pedagogic models, and
concentrated on the learners’ current academic practice. A range of internal and
external sources were used to assess learners’ demand for the resource,
including student feedback, attendance statistics and national debate. Internal
policy on researcher development is a strong driver. The current debate on OER
and sharable resources is widely acknowledged, although not always practical.
The knowledge, skills and competencies raised in the SCONUL 7 Pillars of
Information Literacy and Vitae’s Researcher Development Framework inform
much of this development. Referencing, source evaluation, plagiarism, searching
and dissemination are key areas, although much wider coverage is evident
across the sample.
The courses and resources can be categorised into two discrete types,
Classroom and Online, and these can take a blended learning approach. They
are primarily directed and delivered by Library Services staff, with varying levels
of input from other professional service departments (Graduate Schools, ISS,
Teaching and Learning Development) and faculty. It is notable that the (Library
Staff) respondents offered a wide range of additional skills they required
(teaching, research, technical) in order to successfully deliver these resources.
These skills were either gained through CPD or outsourced internally or
externally. It was widely agreed that time is required to develop and deliver
effective resources, although costs can also be an issue, reinforcing the culture of
sharing materials.
In terms of assessing the resources, statistical evaluations and qualitative
feedback are used to spot trends and iteratively develop resources to meet
changing participant needs. The lack of an assessment element in these types of
resources means it is difficult to determine changes in learners’ levels of skills /
knowledge / competences. Additionally, because many of the resources are
relatively new there is often insufficient data for detailed evaluation.
A number of self-selected information literacy resources have been evaluated
using the RIDLs criteria, leading to a shortlist of 15 good
practice examples. This is not to say that every aspect of each of the shortlisted
examples is perfect – this project is not about finding ‘the best’ information
literacy resource - but the benefit of this selection is that those charged with
developing resources to serve a similar need may efficiently access some
examples – and ultimately, perhaps, that ‘good practice’ may become ‘common
practice’. Various recommendations are made within the report, which may be of
value to those planning to develop good practice resources. The value of the
RIDLs criteria in this research has been to provide an analytical framework for
such evaluations (for the researcher) and act as a reflective tool (for the
developers/deliverers). Hopefully some of the recommendations and comments
within the report, combined with a reflective look at the examples – and contact
with their helpful representatives – may assist those attempting to deliver good
practice information literacy in UK HE in 2013 and beyond.
2. PROJECT AIMS
The Research Information Literacy and Digital Scholarship (RILADS) project aims to
deliver a small number of key outputs contributing to a wider investigation into the
support available to students, staff and researchers to enhance digital literacy. There
are two strands to the project. One is co-ordinated by the Research Information
Network (RIN) on behalf of the Research Information and Digital Literacies Coalition
(RIDLs), the
other by SCONUL under the JISC Developing Digital Literacies (DDL) programme.
The RIN strand focuses on the identification and promotion of good practice in
information handling and data management training and development across the HE
and research sectors. Its aim is to identify a representative sample of case studies to
illustrate information and data management training in Higher Education (including
those already documented in earlier research). The scope of these case studies, of
which 15 have been identified in the course of the project, will relate specifically to
HE researchers from postgraduate students to senior researchers (including
supervisors). It is essentially this first strand that is the subject of this report.
The SCONUL strand aims to identify, harvest, and use materials to progress the
development of digital professional expertise. To ensure that both strands retain
clear foci, while minimising duplication of effort, the emphasis for the RIDLs
programme will be on the identification and promotion of good practice in information
literacy in HE, and, for the SCONUL/JISC funded activity, on enhancing the digital
scholarship skills of information professionals, using the SCONUL baseline survey
definition: “Digital scholarship: the ability to participate in emerging academic,
professional and research practices that depend on digital systems. For example,
use of digital content (including digitised collections of primary and secondary
material as well as open content) in teaching, learning and research, use of virtual
learning and research environments, use of emergent technologies in research
contexts, open publication and the awareness of issues around content discovery,
authority, reliability, provenance, licence restrictions, adaption/repurposing and
assessment of sources." It is anticipated that the SCONUL strand will identify gaps in
provision and efforts will be made to make proposals on how these might best be
filled. These proposals will be targeted towards SCONUL members and other
information professional stakeholders in an effort to guide them in developing and
maintaining services and resources which enable digital scholarship.
This document reports on activities in the RIN strand between October 2012 and
April 2013.
3. METHODOLOGY
a. Sample
The plan was to collect a long list of examples from recommendations by the
community and then reduce this list to a manageable short list of 10-12, using the
RIDLs criteria (Appendix) as evaluators.
As a starting point, the RIN Information Handling Working Group (2010) list of 13
examples of good practice in information handling was used to identify resources
that may be appropriate for the planned long list. Emails inviting participation or
updated information were sent to contact names for each resource. Other resources
were identified by researching online for ‘information literacy’ materials, resources in
JORUM grouped under ‘information literacy’, Vitae’s Database of Practice, and
other resources identified or highlighted in social media (information literacy blogs
and #infolit Twitter). Various stakeholders were also identified from attendees,
presenters and award nominees at events such as LILAC and CILIP. Each relevant
stakeholder was sent a brief email introducing the project, including a hyperlink to
the blog, and inviting them to submit an example of information literacy good practice
for postgraduates and beyond in UK HE. The press release was also circulated to
professional private email lists by members of RIDLs and SCONUL. Members of the
JISC Developing Digital Literacies programme were approached by the researcher
at the programme meeting on 16th October 2012 to introduce the project and were
personally emailed with information about the project and requests for
recommendations for inclusion.
A long list of 42 resources was compiled from this exhaustive approach. Although
every nominated resource was included, not every information literacy resource
identified by the researcher was added to the list: those that came up in search
results that were not functioning/running, or were minimal or outdated in their content
were discarded.
The long list was uploaded, with hyperlinks, to the project blog. As intended, this
milestone has already proven to be a valuable resource, with a small number of
participants later mentioning how the list usefully shows examples of good practice.
Although this was not stated, it is likely that the published list also acted as a
motivator for participation in the project in order to aid participants in the
dissemination of their work.
Contacts responsible for the development and delivery of each resource on the long
list were then individually emailed an ‘Information Literacy Evaluation Form’ (word
doc) for completion. This form was derived from the first section of the RIDLs
evaluation criteria (‘Criteria for describing and reviewing courses or resources’),
focusing on the description of the resources. Each question was numbered and laid
out in table format to make it plain where answers should be added. The form was
also uploaded to the project blog in downloadable format for those who wished to
submit a resource but had not been in direct contact with the researcher. A deadline
of 2 ½ weeks was given, which was subsequently extended to 3 weeks. A small
number of forms were returned after the extended deadline and are also included in
the analysis. In total 27 completed forms were returned from the institutions below,
some of which were additional to the previous long list entries; a more detailed list is
at appendix A. Each submitted form was acknowledged with a personal email
thanking the participant for their contribution and advising them they would be
contacted at a later date.
1. Bath
2. Birmingham
3. Cardiff
4. City_1
5. City_2
6. City_3
7. Cork
8. Cranfield
9. Durham
10. EdgeHill
11. Edinburgh
12. Glasgow_Pilot
13. Glasgow_Smile
14. Imperial
15. Loughborough_Elevenses
16. Loughborough_EMSRG
17. Loughborough_PGR
18. Loughborough_staff
19. LSE
20. Manchester
21. Nottingham
22. Open University
23. Oxford
24. Portsmouth
25. Salford
26. UWE
27. Warwick
b. Shortlisting
The shortlisting process followed the initial analysis. The sample was split into two
distinct groups: course/workshop-based and online. Resources which gave positive
responses, illustrating a considered approach to the RIDLs criteria, were ranked for
each question according to the breadth and depth of their provision. In order to give
a broad view of the resources, the two groups were also evaluated in terms of the
type and style of the resources available. Institutions on the draft shortlist were then
sent additional
evaluation questionnaires, based on the RIDLs ‘criteria for assessing courses or
resources’, which sought to gather data on the assessment and evaluation process
followed by each of the 16 resources on this list. The draft list was subsequently
reduced to 15:
Institution                     Resource name
Cardiff University              Embedded information literacy
Cranfield University            Online information literacy tutorial
Glasgow Caledonian University   PG IL module (‘Pilot’)
Loughborough University         eMRSG: East Midlands Research Support Group
LSE                             MY592
Open University                 Ready to research
Oxford University               Research Skills Toolkit
University of Bath              Information Skills for Research Postgraduates
University of Birmingham        Raising your research profile
University of Durham            Training Resources 1213
University of Edinburgh         Research Data MANTRA course
University of Manchester        Media & Information resource
University of Nottingham        Effective Literature Searching
University of Salford           Salford Postgraduate Research Training (SPoRT)
University of Warwick           Digital Researcher
Table 1 Final shortlist of good practice resources (alphabetised)
It is important that the limitations of this selection process are recognised. Although
the described approach includes some quantitative elements, the final shortlist
should be considered in the light of various subjective factors. The original long list
was derived from a combination of subjective researcher selection, from a range of
resources found by online search, and recommendations from participants.
Additionally, the self-selected nature of the project means that the resources can
only be considered as a ‘snapshot’ of examples of current (Sep 2012 – Dec 2012)
practice. The rationale for the final shortlist is not to present a ranked list of ‘best
practice’. This is not a competition. The final shortlist is designed to offer a range of
examples of different types of resource which may be used to inform future practice.
The RIDLs criteria, which were used to derive the data, performed the essential
function of giving a framework to the data collection and analysis, helping to
mitigate the subjective nature of this type of research.
c. Analysis
The completed evaluation forms were imported into NVivo 8 software for coding. The
answer to each question was coded, allowing flexible interrogation of the data and
comparisons between answers.
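The kind of cross-question interrogation that coding enables can be illustrated with a small sketch. This is a hypothetical illustration only (the project itself used NVivo 8, not custom code); the form data and theme labels below are invented for the example.

```python
# Hypothetical sketch of interrogating coded questionnaire answers.
# The project used NVivo 8; this merely illustrates the principle of
# tallying coded themes per question across returned forms.
from collections import Counter

# Each returned form is represented as a dict mapping a question code
# to the theme coded for that answer (invented example data).
forms = [
    {"audience": "PGR", "delivery": "classroom"},
    {"audience": "PGR", "delivery": "online"},
    {"audience": "staff", "delivery": "online"},
]

def tally(forms, question):
    """Count how often each coded theme appears for one question."""
    return Counter(f[question] for f in forms if question in f)

print(tally(forms, "audience"))  # e.g. Counter({'PGR': 2, 'staff': 1})
```

Coding each answer separately is what makes such comparisons between questions and between respondents straightforward.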
The questions (see Appendix) covered three main areas:
• Who is the course or resource designed for, and why?
• What knowledge, skills and competencies is the course or resource intended to provide?
• How is the course or resource delivered?
These three areas are discussed below. This is an in-depth analysis which develops
themes arising from the previously reported preliminary review of the data
(Interim Report Oct – Dec 2012). Here, all quotations are from the 27 evaluation
forms, randomly anonymised numerically.
4. PROMOTION OF PROJECT
In order to give an accessible focus to the project a blog was set up at
http://rilads.wordpress.com/. The aim of the project blog is to house information and
findings related to the activities and outcomes of the project. It is not intended as a
researcher diary. Examples of contents include the initial press release, an initial list
of 13 good practice examples, a downloadable questionnaire, a long list (with links)
of good practice examples, a deadline extension message to participants, slides
from UKCGE conference. The home page also has a selection of related links (JISC,
RIN, SCONUL, RIDLs). It is possible to follow blog updates by means of automatic
email announcements. Twitter updates (@RILADS) are also listed.
An approved press release was sent to various key publications for information. A
news story ran in CILIP’s Update magazine (Dec 2012).
The press release (Appendix) was widely circulated to subscribed email lists:
LIS-INFOLITERACY, LIS-E-RESOURCES, LIS-LINK, PROFDOC,
POSTGRAD, EVALUATING-IMPACT, SUP-DEVELOPMENT.
This led to some early Twitter and blog followers.
The blog has been invaluable as a focus of the research and there is some evidence
of use of the links list. As dissemination continues through 2013 the blog will remain
the central point and will continue to be updated regularly and promoted via Twitter
and other outlets.
5. FINDINGS AND DISCUSSION
A. Who is the course or resource designed for, and why?
1. Who are the learners that the course or resource is designed for?
a. By career stage (research students, research fellows, tenured researchers…)
By far the majority of resources in this sample are aimed at post-graduate
researchers. This is unsurprising given the scope of the project (PGR and beyond).
At a more detailed level, the responses identified All, Staff, Research Students
and PhD as target groups. Some included UG in this range, while those exclusively devoted to UG
were disregarded. Although Researchers and PostDocs were mentioned, they were
only specifically the focus of these resources in one case. This focus on PGR
indicates that the sample included appropriate resources relating to the scope of this
research project. Generally the responses indicated that the resources were
designed to cover an inclusive range of researcher types (PG, PhD, staff).
b. By discipline
The focus of the respondents was generally on delivering a resource which was
appropriate for all disciplines, with some examples of specialising in broader areas
(Social Science in particular, but also Science and Humanities were mentioned)
relating to the specialities of the institution. Some one-off mentions were made of
more specific disciplines, which their resources were specifically designed to
support, such as Business, Psychology, Geoscience, Law, Engineering and English
Literature.
2. What steps have you taken to assess learners’ need for the course or resource?
The main approach to assessing learners’ need for the resource was through student
feedback, before designing and developing the resource as well as after students
had used the resource. This feedback was both anecdotal and derived from more
formal sources such as questionnaires, interviews and focus groups, taking “into
account learners’ feedback on their training needs, and wherever possible develop
new workshops where requested”. Discussion with internal practitioners and other
stakeholders, such as Faculty academics and Graduate School, and supervisors
also informs a number of resources. This assessment is also likely to be informed by
day-to-day experience of staff delivering information literacy as part of their duties,
awareness of national debate and reports from RIN and Vitae’s RDF initiative or
recommendations from “external researchers, who last year ran interviews and a
focus group in relation to this initiative noted that all staff interviewed regard
information literacy as very important, especially at the early stage where research
students are first starting their project”. Data from development needs analysis
forms completed by students with their supervisors (or online) may also be used,
although often students are self-selecting and come from outside this process,
reflecting the ‘all-comers’ nature of many of the resources. For one resource, students
were not consulted; instead, academics selected the materials that were incorporated
into the collection. A follow-up question (‘If such steps have not been taken, what is
the reason for this?’) was only answered by one respondent, indicating that almost
unanimously the respondents had performed what they believed to be adequate and
appropriate analyses of user needs.
It is recommended that when developing such resources learners’ needs are
assessed using a variety of channels:
• Internal discussion – it is important not to rely on one perspective when developing such resources;
• National debate – extensive research is being done in this area and offers valuable insights;
• Development needs analysis performed at a one-to-one or self-assessed level;
• Existing frameworks such as NSS and other feedback gathering exercises;
• Research Development office;
• Student feedback;
• Staff experiences in teaching and drop-in sessions can provide valuable insights;
• Formal research into needs and demands for the resource within the institution.
3. Given that the course or resource relates to information literacy, how does it fit the broader professional development needs of the learners?
The outcomes of such resources are generally focussed on existing needs for the
students to successfully partake in their studies. There is also acknowledgement that
these skills are likely to be useful in future careers, “thereby sitting within a broader –
yet connected - professional development context” although the transferable nature
of the skills developed is not always recognised, with the focus being
predominantly on the learner’s current research practice. Those participants referring
to the RDF discussed how the skills would inform the development of professional
researchers, “addressing employability and transferable skills, as well as the need
for high-quality information” and “identifying the target skill areas key to the
development of professional researchers”. Highlighted professional skills included
dissemination, data management, digital skills and teaching skills.
It is recommended that when developing such resources, current as well as future
transferable skills are considered, and that resources are mapped to the RDF
information lens to frame skills in terms of professional development.
4. To what extent is the course or resource a response to demand from learners, and if so, how have you identified this?
Participant feedback from previous iterations of similar modules / courses is primarily
used to assess demand for such resources in terms of spotting trends and filling
gaps in delivery. A ‘top down’ approach is also applied, where demand is set by
institutional stakeholders from formal student feedback to course committees.
However a more ad hoc approach is also apparent, with needs being anticipated.
The experience of the subject librarian, development needs analysis forms, RDF,
internal research, existing demand for popular courses, and requests from Grad
School and academic staff also support the decisions to develop resources.
It is recommended that a combination of some or all of these factors is used to
establish demand:
• Participant feedback
• Tutor feedback
• Grad School feedback
• Development needs analysis
• Top down
• External influence
• Formal internal research
• Staff request
• Existing demand
5. Is participation by learners in previous similar training activities a factor in helping you to determine demand?
6. Is such participation in previous activities analysed, in terms of range of learners (for instance, by discipline or career stage)?
Many of the participants mentioned they assessed previous participation in similar
activities: “the numbers attending training sessions are also used to determine future
demand – where waiting lists develop, additional offerings are scheduled and where
attendance is low, sessions are reviewed and modified or dropped from the
programme as learner needs change”, although this was not always the case.
It was noticeable that most responses stated that they did not analyse participation in
previous activities by discipline or career stage.
It is recommended that, where the information is available, attendance statistics are
analysed when developing and launching new resources.
7. How is the course or resource made appropriate to learners, for instance with regards to their current level of skill, years of experience, disciplinary areas?
Generally, courses are organised so students can self-select elements which they
feel will benefit them. Many sessions are introductory although some resources offer
a choice between ‘introduction’ and ‘intermediate/advanced’. The course rubric is
likely to clearly define the level of the content and, where appropriate, the disciplinary
content: classes may be “open to all research students but where they are targeted
towards a particular broad discipline group, this is indicated in the title or
description”. Flexibility within a workshop session may allow specific information
needs to be met, using “workshop time to allow participants supported hands-on
experience, this gives them support at the right level.” It seems that across-the-board
resources are aimed at a wide range of disciplines, and act predominantly as
introductions to topics, while being sufficiently flexible to respond to learner demand
on-the-fly “at the beginning of each session”.
It may be appropriate for those developing online resources to incorporate this
flexibility in some way, for example via moderation and one-to-one follow ups. Some
resources may target a broad discipline group, focussing on specific databases or
issues such as impact (for Sciences) and e-resources (for Humanities).
8. How accessible is the course or resource, particularly for learners with diverse needs?
Accessibility was either interpreted as meaning ‘students can access the material
24/7’ (“The resource is universally accessible. Google Analytics data from the past 5
years of operation show that we have users around the world”) or in terms of disability
(“Accessibility tools (e.g. adaptive peripherals and software) are made available as
required”). In future use of the criteria the meaning of ‘accessibility’ needs to be more
clearly stated.
By their nature, the online materials were deemed to be accessible widely and “open
to anyone who can use a computer”, while special tools such as hearing loops are
cited in accessible workshop sessions. There is recognition that making resources
accessible to students with particular needs due to disability or nature of study
(especially part time and distance learners) is important and support may be offered
where available, for example, in a situation where “at least two members of Library
staff run each session; this means there is more scope for individual assistance for
any participant who may need it”. This may also be planned for in advance from
communication with Graduate Centre administration. This is not always the case and
sophisticated tools do not appear to be always widely available.
It is recommended that the accessibility of both online and face-to-face resources is
considered carefully in their design in order to ensure their inclusivity.
9. What do learners need to know already in order to benefit from the course or resource?
Nearly all of the resources in this research required only basic prior knowledge from
users. A baseline could be set where some introductory knowledge was required to
access more advanced resources, “students must have attended the introductory
workshop or be familiar with the functions described in that workshop’s description”.
Although technical and subject knowledge was not generally required, the context of
the research environment was mentioned, attendees being “expected to understand
the academic environment” and “they do need to be involved in some form of
research to get the benefit out of most of the workshops”. When prior knowledge is
required it is stated that this is made clear multiple times in the rubric and booking
process.
10. On the basis of the assessment of need and demand, what have you done to communicate clear learning objectives to those who attend the course or use the resource?
The learning objectives are widely situated within the rubric of the resource, “each
workshop also has clear learning objectives that are regularly reviewed” and in
online resources they may be “detailed at the beginning of each unit and reiterated at
the end”. Participant feedback may be used to evaluate learning outcomes.
Outcomes are also re-iterated at the beginning of workshop / class sessions, where
“each presentation begins with learning objectives”.
It is recommended that this practice is followed, with Learning Outcomes being
clearly stated in the rubric, at the introduction of each session, and evaluated at the
end of each session.
11. How does the course or resource fit with your institutional and/or departmental policy and practice on researcher development?
University research development strategy informs good practice examples, through
formal structures “as part of the University’s broader skills programmes for
researchers”, aligning “with the university’s aims to support early career researcher
and PGRs, and to enhance research and transferable skills”. Library policy may also
inform development, if “it fits within the library’s strategic plan”. Vitae’s RDF is also
mentioned as an influence, when “the skills developed through the course fit with
Vitae’s Researcher Development Framework and supports the university’s ambition”.
However it is notable that not all resources sit within a policy framework, perhaps
because this is not explicit within an institution. One respondent noted that “the
closest we have to an institutional “policy” around researcher development would be
the University’s signing up to the Vitae concordat”, while the nature of the institution
may be that although “it is one of a number of courses offered to PhD students [and]
in some departments supervisors strongly encourage their PhD students to attend.
But [here] nothing is mandatory”.
It is recommended that wherever possible resources are clearly linked to institutional
and departmental policy on researcher development.
12. Can the course or resource be transferred or adapted to suit needs or contexts other than the one for which it is designed?
In keeping with the current spirit of sharing such resources, many of these examples are transferable or adaptable by outside users and “can be adapted to ensure information literacy development is fully embedded into provision and
presented to researchers as an integrated whole”. They may also be available
internally and “adapted easily for use in other contexts or for other user group”. A
small number are downloadable or available via Jorum. However, not all are available to others, for reasons of specificity: “some of the content could be used [as] a
foundation for some online courses, but would need a lot of reworking to make it an
effective learning tool in such a different environment”.
For many reasons it is becoming considered good practice to make such resources transferable and adaptable, and it is recommended that this be considered when developing new resources.
B. What knowledge, skills and competencies is the course or resource intended to provide?
1. What areas of information literacy does the course or resource cover?
[Table 2 is a Y/N matrix recording, for each of the 27 resources (Birmingham, City_2, EdgeHill, Cardiff, Cork, Cranfield, Glasgow_Pilot, LSE, Oxford, Bath, Edinburgh, Glasgow_Smile, Open University, Portsmouth, Salford, UWE, City_1, City_3, Imperial, Loughborough_PGR, Manchester, Warwick, Durham, Loughborough_staff, Nottingham, Loughborough_Elevenses, Loughborough_EMSRG), its coverage of: assessment and analysis of information sources; information searching and discovery; citation and referencing (inc software); data management and curation; plagiarism, fraud, copyright etc; data protection &/or FOI; publishing and dissemination including OA; and other areas (including social media, digital identity / Web 2.0, current awareness, bibliometrics, measuring impact, subject-specific resources, independent learning and academic writing skills).]
Table 2 Resources ranked by range of IL coverage
Figure 1 Resources / coverage
The above lists (Table 2, Figure 1) are ranked in order of the number of areas of information literacy covered. Please note that some resources are highly specific and are not designed to cover the full range of topics, so this ranking should not in any way be taken to imply that the highest in the list is the best in the sample.
This data is summarised in Figure 2, which shows that there is an emphasis on
Citation and referencing, Publishing and dissemination, Plagiarism, fraud and
copyright, and assessment and analysis of information sources. This is followed by
Information searching and discovery and data protection and FOI, with Data
management and curation being least covered.
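As a sketch of how this ranking and the Figure 2 summary can be derived, the short script below counts areas per resource and resources per area; the coverage data it uses is invented for illustration and is not taken from the survey returns.

```python
from collections import Counter

# Hypothetical coverage data, for illustration only (not the survey results):
# each resource maps to the set of IL areas it covers.
coverage = {
    "ExampleUni_A": {"assessment", "searching", "citation", "data management", "plagiarism"},
    "ExampleUni_B": {"searching", "citation", "plagiarism"},
    "ExampleUni_C": {"citation", "publishing"},
}

# Rank resources by breadth of coverage, widest first (as in Table 2).
ranked = sorted(coverage, key=lambda name: len(coverage[name]), reverse=True)

# Count how many resources cover each area (as summarised in Figure 2).
area_counts = Counter(area for areas in coverage.values() for area in areas)

print(ranked)                   # ['ExampleUni_A', 'ExampleUni_B', 'ExampleUni_C']
print(area_counts["citation"])  # 3
```

As in the report, the breadth count says nothing about quality: a two-area resource may simply be more specialised.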
Figure 2 Coverage of IL provision over 27 resources
There are other areas of coverage identified, including notably: subject specific
resources, social media literacy, bibliometrics, evaluation of materials, general study
skills/research methods, IT skills. These additional categories indicate that the
criteria would benefit from being revisited to incorporate more possible areas of
coverage.
Figure 3 Use of RDF and 7 Pillars
2. Is the course or resource informed by models or frameworks such as the RDF and the Seven Pillars?
a. If so, how?
The Seven Pillars and RDF lens are widely used. There appears to be a leaning
towards the use of RDF over the Pillars in more recently developed materials.
3. Have you sought to make use of the information lens of the RDF?
Very few resources use the recent RDF information literacy lens, generally stating
that the resource was developed prior to the lens.
C. How is the course or resource delivered?
1. What form does the course or resource take?
a. Classroom-based courses (lecture or workshop)
b. Individual tuition
c. Online courses
d. Training material (printed or digital)
e. Other
[Table: for each of the 27 resources, Y/N values for whether it is freely available and for its delivery modes (classroom-based courses, individual tuition, online courses, training material (printed or digital), other). Freely available: City_2, Cork, Cranfield, Glasgow_Pilot, Edinburgh, Glasgow_Smile, Open University, Portsmouth, UWE, City_1, City_3, Imperial, Manchester, Warwick, Loughborough_EMSRG. Not freely available: Birmingham, EdgeHill, Cardiff, LSE, Oxford, Bath, Salford, Loughborough_PGR, Durham, Loughborough_staff, Nottingham, Loughborough_Elevenses. ‘Other’ entries noted include supplementary lectures or face-to-face training, openly available online resources, and podcasting.]
Figure 4 Form of delivery
Figure 4 shows the spread of online / classroom-based resources. Although the term
‘blended learning’ is rarely used by the participants, this appears to be the most
widely used approach in the delivery of this type of information, as illustrated in the
pie chart below (Fig 5), where a combination of classes and VLE or freely accessible
online resources are employed.
Figure 5 Form of delivery (max 27)
2. What would you describe as the main features of the course or resource?
a. Mode of instruction
b. Length of course
c. Use of assignments
d. Assessed/non-assessed
e. Other
The brief responses identified ‘mode of instruction’ as the most important aspect, though without explanation. Key themes here include workshop sessions, blended learning approaches, online content and modular design. Mostly the resources / courses are multi-session, requiring regular commitment. Assignments are rarely used and only one course is assessed. Additional responses included evaluations of the excellence of the resource within institutional and professional frameworks, the benefit of cake in creating a relaxed atmosphere, the value of delivering a wide variety of topics and the opportunity for learners to choose how they engage with materials.
3. Who designs and delivers the course or resource?
a. Library
b. Graduate school
c. IS department
d. Other (who?)
All the resources / courses included in the survey are developed and delivered by the Library, with some contributions from the Graduate School in administration and organisation, and input from other service departments and faculty when available and appropriate. Other contributors included Learning and Technology, Research
Office, Education Innovation. It should be noted that the gathering of this data was
heavily focused on library networks. Although supervisor and PG student networks
were also approached, all of the responses to the research call came from the library
sector.
4. What are the different roles and responsibilities of these various players with regards to the design and delivery of the course and resource?
It is likely that it is partly because of the sampling approach, which sourced
resources predominantly through library email networks, that the majority of
respondents stated that the library was the main player in developing and designing
information literacy resources. However, this is mitigated by the fact that generally “the content and delivery is the Library’s responsibility”. That there is
substantial evidence of strong liaison across departments is clearly indicative of
partnership-working within institutions: “in consultation with Academic Department
staff/students and the University’s Research Development Office staff” and effective
liaison “with the Graduate School, Planning Office and Research and Innovation
Services”, courses being “designed jointly”. Administration is an important example
of conjoint working where the “programme is organized by a small team of
professional and clerical staff in the UGC”. In terms of technical support this may be
“provided by the Graduate School’s learning technologist”. In a small number of
cases, academics design the programme and “an external consultant actually put the
web site together, and also contributed to the structure and general design of the
course”. A team may be assembled from a wide range of “instructional designers,
graphic designers, library staff for content, a project manager for the project phase
and product manager”. Learning Technology and IT departments are mentioned, but
rarely so, indicating a possible gap in effective use of available resources.
Academics are very rarely involved in delivery. In terms of institutional strategic
engagement, it appears that where policy supports inter-departmental networks in working together, effective resources are more likely to be designed and delivered.
It is recommended that an appropriate range of services within the institution are
involved in the design and delivery of these resources wherever possible in order to
maximise the value that can be brought to these projects from staff with experience
outside of the library setting.
5. What skills and know-how are required by those devising, running or managing the courses and resources?
Teaching skills are most frequently highlighted, such as “good oral written and oral
communications skills, plus flexibility to adapt the differing needs of attendees –
range of experiences, disciplines etc”. In support of delivering a professional service
“many of the tutors have completed a PGCert in teaching in HE although it is not
required”. In this approach, “knowledge of Information Literacy Skills pedagogy, teaching skills, current teaching practices and developments” is matched by a need for an understanding of the research process. It is widely agreed that “it is obvious,
but essential, that there be an understanding of the research experience more
generally – not only to ensure that the offerings are appropriate to the stage of the
research but also to effectively communicate the benefits of participation to the
researchers”. IT skills are equally important, mainly in terms of developing online materials, but also in terms of the tools being taught. A good knowledge of digital and information literacy was briefly mentioned, followed by numerous one-off mentions of specific skills (Appendix g).
The key skills noted here (teaching, research and technical) seem to be paramount in developing good practice resources. A combination of all of those listed would support optimum resource development and delivery and should be considered when planning good practice.
a. How do these skills and know-how relate to the different roles and responsibilities?
Some respondents affirmed that “these skills and expert knowledge are core skills
for the Library staff running individual sessions and also necessary for those
planning and putting into place the combined programme”, while others had
developed special skills for this purpose, and recognised that “we will need to
update our skills on a more sustainable basis in future”.
b. How were these skills and know-how acquired?
The skills required are developed through a combination of day-to-day experience as a librarian and CPD, encouraged by the institution, which has emphasised “the importance of developing subject librarians’ teaching skills over recent years, through workshops, conference attendance”. The importance of library staff taking a PGCert is notable, along with professional library qualifications. Knowledge sharing “through
sharing good practice and materials among Library staff and through shared
teaching of individual sessions”, peer-review, liaison with faculty and student
feedback all inform the development of these skills, and “deepens knowledge every
time”. This knowledge may be shared “through joint meetings with the teaching team
each term”.
A combination of experience, CPD and iterative evaluation is appropriate in
developing good practice resources.
6. What support is required to run the course or resource (personnel, facilities, financial)?
This work takes time. “Time for preparation/delivery. Time for advertising/marketing
– administration of courses”. Online courses may be more time-consuming than
face-to-face courses, because “each time the [online] course runs, it requires 40
hours of academic librarian time to be timetabled so that a tutor is constantly
available to respond to participant queries and to steer, as well as moderate, the
online discussions”. Time is also required for administration of the resource once it
has been developed. Budgeting time is very important, recognising the ebbs and
flows of the academic year. While “the Summer vacation allows for Library staff time
to be given to the course, … the Autumn term requires outside support to be bought
in”. Physical space for face-to-face programmes and some funding for external input
and other costs need to be available, as “there is a considerable administrative
overhead (advertising, course booking, room hire and set up, printing materials,
uploading materials to the web)”. Although there were numerous online services
surveyed, there was minimal mention of resources required for server space and
maintenance.
It is recommended that budgets are carefully drawn up when developing new
resources, and that time, the major resource required, is clearly allocated to those
responsible.
a. If the courses and resources take the form of digital/online resources, are they free for others to use or can they be readily purchased?
The vast majority of respondents stated that their resource was freely available, often under a Creative Commons licence and via Jorum. The culture of sharing was highlighted: some may “borrow ideas as regularly as I create my own, so open source and open access is important”. However, resources run within a VLE are somewhat restricted in this regard because they require password or guest access, so potential users from outside the institution are unable to access them freely. There is a similar problem for face-to-face workshop courses, which require co-operation with deliverers if resources are to be sharable.
D. Criteria for assessing courses or resources
This section discusses responses to the follow-up evaluation forms which were sent
to the draft shortlist. It uses the RIDLs criteria for assessing courses or resources to
gather data to determine the extent of evaluations performed on the cited resources.
It is therefore based on a smaller number of responses (8) than the previous sections.
1. How many learners, by career stage and discipline have taken part in the
course or used the resource?
Respondents report that numbers of learners accessing the courses are kept, split by career stage more often than by discipline. The information provided was not sufficiently detailed
to perform any statistical analysis. There is certainly an awareness amongst the
respondents that statistics are valuable in terms of evaluation. Numbers of attendees
at courses, and online viewing statistics were provided at varying levels of detail.
These varied from “We have at least 60 participants signed up for our next series of
sessions, some may have signed up to more than one session” to the detailed table
provided below:
Part-time / Full-time: PT 16; FT 96
Type of PGR: MRes 9; Doctorate 103
Stage: Yr 1: 68; Yr 2: 28; Yr 3: 12; Yr 4+: 4
Faculty: Arts 8; SocSci 23; MedHea 34; SciEng 47
(each breakdown totals 112 learners)
2. If the course has been run previously, or if the resource has been
previously used, what is the trend in terms of number of learners?
The data and discussion thereof provided in response to this question indicates the
value of attendance / online viewing statistics in terms of provision and promotion.
Trends are recognized, analysed, and used to inform delivery and scheduling.
However new courses and limitations in software can cause difficulties: “the units
were launched in mid-2012 and [it is] too early to spot trends and unfortunately our project website does not track usage”.
It is recommended that wherever possible, detailed statistics are gathered during
each iteration of such courses as their analysis can inform decisions on timings,
content, and gaps in uptake.
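As an illustration of the kind of simple trend analysis recommended here, the sketch below computes the percentage change in attendance between successive iterations of a course; the figures are hypothetical, not survey data.

```python
# Hypothetical attendance figures for four iterations of a course
# (not taken from the survey returns).
attendance = [18, 24, 31, 29]

# Percentage change between consecutive iterations: one simple way
# to express the trend in learner numbers.
changes = [round(100 * (b - a) / a, 1) for a, b in zip(attendance, attendance[1:])]

print(changes)  # [33.3, 29.2, -6.5]
```

A dip such as the final value might, for example, prompt a review of scheduling or promotion for that iteration.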
3. What have been the reactions and feedback from learners, notably on
whether learning objectives have been met, and on quality, originality and
attractiveness of the course or resource?
Users’ qualitative feedback comments are quoted extensively by the respondents, and these comments strongly support the resources. Comments may be in the words of the learners or in Likert scale-derived analyses. Such supportive comments not only
motivate those charged with developing and delivering the resources and inform
revision of future iterations, but may also be used to publicise the service to potential
learners. Notably, it appears from the comments supplied that there are two key
areas where learners feel they benefit – the regular and prescribed nature of the
courses helps to give the learners a focus for their studies, and specific content
within the courses is given relevant context to the learners, enabling them to
appreciate the value of various tools and approaches. Whether the learning takes
place in a physical classroom or online, the benefit of working with others seems to
be much appreciated: “it is the forum with students' participation and experience
sharing that gives me motivation to learn more and more”. Statistical analysis of
Likert scale comments can be used to create targets for future feedback evaluations:
“Our very ambitious target is for all workshops to achieve a mean of 4.0 or above on
all of these items”.
Pre- and post-course questionnaires can be used to identify progress achieved by
attending the course and can be combined with needs analysis to identify areas
which individual learners may benefit from covering.
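As a minimal sketch of the Likert-scale and pre-/post-course analyses described above, the following uses invented ratings (not survey data) to compute a mean score against the quoted 4.0 target and an average pre-to-post gain.

```python
# Hypothetical Likert-scale (1-5) ratings for one workshop feedback item.
scores = [5, 4, 4, 3, 5, 4]
mean_score = sum(scores) / len(scores)
target_met = mean_score >= 4.0  # the 4.0 target quoted above

# Hypothetical pre- and post-course self-ratings for the same four learners.
pre = [2, 3, 2, 4]
post = [4, 4, 3, 5]
mean_gain = sum(q - p for p, q in zip(pre, post)) / len(pre)

print(round(mean_score, 2), target_met, mean_gain)  # 4.17 True 1.25
```

The per-learner pairing of pre and post ratings is what allows progress to be attributed to individuals rather than to cohort turnover.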
4. What is shown by any evaluation and analysis of such feedback?
This feedback is taken very seriously by most of those who reported its use:
“feedback forms are compiled by the UGC admin support staff and forwarded to the
presenter within one week of the session. The compiled feedback is also reviewed
by a named officer in the University Graduate College, who notes any items for
action and follows up with the presenters”, although the time required for its analysis may cause difficulties, and new resources need to be established long enough to garner sufficient data for the analysis to be of value.
Insights from participants may provide useful information which had not been picked up otherwise, allowing institutions to “review the content itself to keep it fresh, up to date and relevant to its users”, and courses may be directly influenced, leading to “the development of new courses, adaptation of content in existing sessions and changes in the length of a session”. Unfortunately, feedback may be difficult to gather in sufficient quantity as “the drawback to relying on questionnaires is that not everyone will complete them”.
5. What are the changes in learners’ knowledge, skills and competencies
resulting from the course or resource?
It is difficult to evaluate changes in knowledge, skills and competencies which are directly attributable to taking a course or using a resource without pre- and post-course assessment. Very little appears to have been done by the respondents in this regard,
“because there is such a broad range of courses and participants (and due to a lack
of staff resource), the UGC has not attempted to track any individual changes in
learners as a result of any of our workshops”. Achieving the course/resource
Learning Outcomes, which are frequently cited in rubric, could be one way of
informing the measurement of these changes, although this type of very detailed
analysis, possibly leading to some kind of assessment, requires large amounts of
staff time. As has been discussed earlier, very few of the resources include an assessment element; most are designed to support studies rather than sit alongside them as assessed modules.
However, feedback indicates that there is a change; learners may “suggest their intention to change the way they work as a result of their new skills and knowledge”.
A more rigorous evaluation of changes in learners’ knowledge, skills and competencies might be more easily incorporated into the assessed element of curriculum-embedded modules.
It is recommended that these difficulties are considered in terms of evaluation and that steps are taken to measure achievement of Learning Outcomes by summative and formative assessment during and following courses. These assessments do not have to contribute towards student degree marks and could readily be built in to VLE resources as instant feedback quizzes, for example.
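As an illustration of the instant-feedback quiz idea, the sketch below shows a minimal quiz item of the kind that could be built into a VLE; the question, options and helper function are invented for illustration.

```python
# A single quiz item; the content here is an invented example, not
# drawn from any of the surveyed resources.
question = {
    "text": "Which of these must a full citation include?",
    "options": ["Author and date", "Page colour", "Font size"],
    "answer": 0,  # index of the correct option
}

def check_answer(choice: int) -> str:
    """Return instant feedback for a learner's chosen option."""
    if choice == question["answer"]:
        return "Correct"
    return "Not quite, try again"

print(check_answer(0))  # Correct
print(check_answer(2))  # Not quite, try again
```

Because the feedback is immediate and unassessed, such items can measure Learning Outcomes without contributing to degree marks.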
6. How has this been ascertained?
While feedback forms and anecdotal evidence can provide some useful information
regarding student achievement of learning outcomes, an exemplary approach lists
self-assessment, peer-review, and tutor feedback in the pursuit of these findings
where “the course uses self – assessment, peer learning during in class group
activities and feedback from teachers though observation and conversation on a one
to one basis”.
This approach is not widely taken and it is strongly recommended that in terms of
evaluation a rigorous process is used to determine whether or not the resource is of
any value to the participants.
7. What are the improvements in researcher attitude, confidence, behaviour,
performance and practice that might be attributable to the
activity/resource?
Attitude, confidence, behaviour, performance and practice may appear in course aims and objectives, and some respondents appear to quote from these in answering this question. Attributing improvements in these areas to the activity / resource is a tricky process, however, and often “this hasn’t been collected in any systematic way, only anecdotally”. Again, examples of anecdotal comments via feedback forms are given,
indicating substantial impact: “Previously I was in darkness. I see the dawn now. And I never feel scared about doing research anymore”.
8. How has this been ascertained?
These learning objectives are again collected anecdotally and interpreted from comments on feedback forms, but it appears that they are not prioritised in the gathering
of data relating to the outcomes of the resources, most likely owing to their intangible
nature and the difficulty in assessing these higher level skills and attributing them
directly to participation in the course / resource.
9. What has been the broader impact of the activity/resource, i.e. the extent to
which recipients have become better researchers, and the way in which this
has benefitted the institution?
Although “it is currently difficult to draw direct correlations from the feedback we have
gained” because “we have been running the series of courses for less than a year –
so our next evaluation effort will take a longer range view of participants and ask
about their perceptions of improved performance” there has been some external
evaluation of these resources. Also “in focus groups run by external researchers in
2011, both research students and research supervisors reported they were greatly
impressed by the information literacy courses offered by the UGC programme”
showing there are efforts made to evaluate course outcomes. Notable comment is
made that there is much value to be derived from the process for the developers /
deliverers and other stakeholders in terms of CPD and other skills development. Also
as an example, when students who attended the courses started to teach, they
requested input from the (library) deliverers in research methods courses, indicating
there is additional value to the institution from participation and connecting academic
practice with library services.
10. What has been the feedback from the departments or other units in which
the learners work?
In addition to the expected recommendations from other participating units (faculty, graduate schools) that their students take part in courses / resources, there is evidence of positive recommendations and support above and beyond professional expectations: “some departments go further and advise students to take the course”, “staff in the Graduate School regularly recommend the course to doctoral students as part of their researcher development programmes”, and a “department incorporates an adapted version of the course in its own timetabled PhD seminars taught by the academic liaison librarian”. Word-of-mouth within departments and faculty also increases uptake and widens participation; it is recognised that “word of mouth has increased attendance levels”.
11. What challenges/barriers have been encountered in implementing the
development intervention (including lack of resources), and how are these
managed and/or overcome?
Unsurprisingly, lack of time, but also lack of resources (staffing, financial, software and VLE restrictions, teaching space), are the key barriers, for example to plans to “develop an online iteration of the course”. “Resourcing in people is limited due to
pressure on time from other responsibilities, and the appetite for generic skills
training from learners”. On occasion this is compounded by the fact that “the
designing and running of cross-university sessions is not specifically stated within
their job descriptions”.
These issues are partially dealt with by using quiet time over the summer to develop
courses, using OER and the cloud, and using statistics and feedback evaluations to
gain support from management to extend and expand services.
12. What steps were taken to improve the course or resource as a result of any
evaluation?
All of the respondents stated they respond positively to evaluations and use
feedback to continually develop the course/resource. This may be in terms of format
or content: increasing the number of sessions, making them more interactive, changing the delivery from classroom to online, developing relevant and up-to-date content, and rebranding module names, amongst others. It is widely agreed that evaluation and
reflection should inform curriculum development and the many specific examples
offered indicate this is an important element of the process of continuous iterative
development.
6. REVISIONS TO CRITERIA
Clarification is needed in the following criteria:
“What steps have you taken to assess learners’ need for the course or resource?”
The answers to this question varied according to whether the participant understood
it to mean the individual learner or learners in general. This should be clarified in
future revisions of the criteria.
“How accessible is the course or resource, particularly for learners with diverse
needs?” – not all respondents relate the term ‘accessibility’ in this question to
disability, rather focusing on the availability of their online resource. This should be
clarified.
“What areas of information literacy does the course or resource cover?” – the various
‘Other’ categories identified (subject specific resources, social media literacy,
bibliometrics, evaluation of materials, general study skills/research methods, IT
skills) indicate this criterion could be refined.
“What would you describe as the main features of the course or resource?” – it is not
immediately clear whether this question requires a tick box answer by category, or
elaboration within each category. However all of the participants chose the latter
interpretation, summarising their resource in a few words under each category. This
criterion may require some attention in terms of explanatory detail.
7. DISSEMINATION AND PROMOTION
The short list will be announced via the project blog and Twitter at the same time as
the final approved version of this report. Results of the analysis will also be
disseminated via social networks, relevant print publications (eg CILIP Update) and
conference presentations.
Charlie Inskip will be presenting at UKCGE International Conference on
Developments in Doctoral Education (Apr 11/12) and CILIP Umbrella (Jul 2/3). Other
relevant conferences will be targeted during the course of the year.
It is recommended that key findings of the report be identified and used to generate
targeted interest – for example the range of skills required by librarians to
successfully develop and deliver the resources would be an interesting angle for
Update, while THE are likely to find more value in the findings around interdepartmental collaboration or the importance of technology in delivering these
resources.
An accessible ‘how to build a good practice information literacy resource’ guide
could also usefully summarise the recommendations and may be more likely to
engage practitioners considering work in this area.
Input from the RIDLs steering group would be valuable here in terms of
dissemination and promotion possibilities.
8. CONCLUSION
A number of self-selected information literacy resources have been evaluated using
the RIDLs criteria, leading to a shortlisting of a selection of 15 good practice
examples. This is not to say that every aspect of each of the shortlisted examples is
perfect; this project is not about finding ‘the best’ information literacy resource. But the benefit of this selection is that those charged with developing resources to serve a similar need may efficiently access some examples, and ultimately, perhaps, that ‘good practice’ may become ‘common practice’. The value of the criteria in this
research has been to provide an analytical framework for such evaluations (for the
researcher) and act as a reflective tool (for the developers/deliverers). Hopefully
some of the recommendations and comments within the report, combined with a
reflective look at the examples – and contact with their helpful representatives – may
assist those attempting to deliver good practice information literacy in UK HE in 2013
and beyond.
9. APPENDICES
a. Final shortlist (15) (alphabetical order)
Cardiff University – Embedded information literacy
Audience: Postgraduate students
Coverage: Integration of information and digital literacies into the University
Graduate College skills development programme.

Cranfield University – Online information literacy tutorial
Audience: Undergraduate / postgraduate students
Coverage: Highly interactive online tutorials on a wide range of IL issues;
attractively and imaginatively packaged.

Glasgow Caledonian University – PG IL module (‘Pilot’)
Audience: Postdoc researchers
Coverage: Online tutorials on wide range of IL issues (developed for postdocs,
but seems suitable for graduate students too).

Loughborough University – eMRSG: East Midlands Research Support Group
Audience: Early career researchers
Coverage: Online, interactive tutorials on disseminating research outputs and
reference management. Resource developed jointly by four East Midlands HEIs.

LSE – MY592
Audience: Postgraduate students
Coverage: Structured 6-week course on many aspects of IL.

Open University – Ready to research
Audience: Postgraduate students
Coverage: A set of online tutorials, structured within a broad range of IL topics.

Oxford University – Research Skills Toolkit
Audience: Postgraduate students
Coverage: A set of interactive online resources.

University of Bath – Information Skills for Research Postgraduates
Audience: Postgraduate students
Coverage: Extensive programme of courses throughout the academic year, mostly on
literature searching, but also on copyright, plagiarism, use of databases… The
only programme on this list which has some discipline-specific resources.

University of Birmingham – Raising your research profile
Coverage: Workshops on publishing, bibliometrics and social media.

University of Durham – Training Resources 1213
Audience: Postgraduate students
Coverage: Range of autumn term IL courses.

University of Edinburgh – Research Data MANTRA course
Audience: Postgraduate students
Coverage: Online tutorials on all aspects of research data management.

University of Manchester – Media & Information resource
Audience: Postgraduate students, researchers
Coverage: Podcast-based online resource covering wide range of IL issues.

University of Nottingham – Effective Literature Searching
Audience: Postgraduate students (early stage)
Coverage: 5-day course on literature searching.

University of Salford – Salford Postgraduate Research Training (SPoRT)
Audience: Postgraduate researchers
Coverage: Wide-ranging programme of workshops reflecting the structure of the
RDF; selected sessions available on aspects of IL.

University of Warwick – Digital Researcher
Audience: Early career researchers
Coverage: Module-based, 18-week online learning programme on social media in
the research lifecycle.
b. Long-List
Cardiff University – Embedded information literacy
Cranfield University – Online information literacy tutorial
Glasgow Caledonian University – PG IL module (‘Pilot’)
Glasgow Caledonian University – SMILE
Loughborough University – Academic and research staff workshops
Loughborough University – eMRSG: East Midlands Research Support Group
Loughborough University – PGR workshops
LSE – Copyright
LSE – MY592
LSE – Research support
Newcastle University – Information Literacy Toolkit
Nottingham University – Effective Literature Searching
Open University – Digital scholarship
Open University – Ready to research
University College Cork – PG IL module (‘PG6009’)
University of Birmingham – 4 Ways to raise your research profile
University of Birmingham – Disseminating your research
University of Birmingham – Raising your research profile
University of Cumbria – Skills@Cumbria
University of Dundee – Advance@Dundee
University of East London – Infoskills
University of Edinburgh – Research data management guidance
University of Edinburgh – Research Data MANTRA course
University of Exeter – Cascade
University of Exeter – OpenExeter
University of Leeds – Researcher@Library
University of Leicester – Library training
University of Manchester – Media & Information resource
University of Manchester – Researcher Development
University of Portsmouth – UPLIFT
University of Salford – Salford Postgraduate Research Training (SPoRT)
University of Sheffield – MA Information Literacy
University of Sheffield – Information Literacy
University of Surrey – For Postgraduate Researchers
University of Surrey – Learning Skills Portal
University of Surrey – Taught Postgraduates
University of Sussex – Research Hive
University of Warwick – Research Exchange
University of Warwick – Digital Researcher
University of Warwick – PhD Information Literacy workshops
University of West of England – Research Observatory
Various – Skills Forge
c. Returned Forms
1.
Bath
2.
Birmingham
3.
Cardiff
4.
City (3 forms)
5.
City (3 forms)
6.
City (3 forms)
7.
Cork
8.
Cranfield
9.
Durham
10.
11.
12.
Edge Hill University
Edinburgh
Glasgow Caledonian
(2 forms)
Glasgow Caledonian
(2 forms)
Imperial
Loughborough (4
forms)
Loughborough (4
forms)
13.
14.
15.
16.
17.
18.
19.
Loughborough (4
forms)
Loughborough (4
forms)
LSE
Library PGSkills
Training
Programme
Raising your
research profile
Integration of
information and
digital literacies
into the University
Graduate College
skills development
programme
Upgrade
Hannah South
7/12/12
Judith Hegenbarth
3/12/12
Cathie Jackson
3/12/12
Rowena Macrae-Gibson
Rowena Macrae-Gibson
Rowena Macrae-Gibson
Margot Conrick
5/12/12
Mandy Smith
5/12/12
James Bissett
30/11/12
MANTRA
PILOT
Rachel Bury
Robin Rice
Marion Kelt
13/12/12
4/12/12
19/11/12
SMILE
Marion Kelt
19/11/12
Research@Imperial
Elevenses
programme
Dissemination of your research
eMRSG project
Workshops for PGR
students
Academic and
Research Staff
workshops
MY592 Workshop
in Information
Literacy
Ruth Harrison
Helen Young
17/12/12
29/11/12
Helen Young
29/11/12
Helen Young
29/11/12
Helen Young
29/11/12
Maria Bell / Jane
Secker
29/11/12
Library Guide for
Researchers
Social Media guide
Graduate
Information
Literacy Skills
Information
Literacy Tutorial
Doctoral Training
Programme
5/12/12
5/12/12
6/12/12
20.
Manchester
21.
Nottingham
22.
Open University
23.
Oxford
24.
Portsmouth
25.
Salford
26.
UWE
27.
Warwick
Media &
Information
Literacy for
Postgraduates and
Researchers
Effective Literature
Searching
Ready to Research
/ Digital
Scholarship
Research Skills
Toolkit
UPLift: Information
Tips
Salford
Postgraduate
Research Training
(SPoRT)
programme
The Research
Observatory
Digital tools for
research
Drew Whitworth
3/12/12
Elizabeth Newall
9/12/12
Robin Goodfellow
30/11/12
Angela Carritt
29/11/12
Greta Friggens
4/12/12
Victoria Sheppard
22/11/12
Liz Falconer
2/12/12
Jenny Delasalle
6/12/12
d. Press release
RIN and SCONUL announce RILADS project into Information Literacy and Digital
Scholarship
Wednesday, 24 October 2012
SCONUL and Research Information Network (RIN) have announced that they are co-funding a year-long research project into the delivery of Information Literacy and Digital
Scholarship. The project aims to deliver a small number of key outputs contributing
to a wider investigation into the support available to students, staff and researchers
to enhance digital literacy and will actively seek input from the community in
nominating examples of good practice.
There are two strands to the project. One is co-ordinated by Research Information
Network (RIN) on behalf of Research Information and Digital Literacies Coalition
(RIDLs), the other by SCONUL under the JISC Developing Digital Literacies (DDL)
programme. The RIN strand focuses on the identification and promotion of good
practice in information handling and data management training and development
across the HE and research sectors. The scope will relate specifically to HE
researchers from postgraduate students to senior researchers (including
supervisors). The SCONUL strand aims to identify, harvest, and use materials to
progress the development of digital professional expertise.
It is anticipated that the SCONUL strand will identify gaps in provision and efforts will
be made to make proposals on how these might best be filled. These proposals will
be targeted towards SCONUL members and other information professional
stakeholders in an effort to guide them in developing and maintaining services and
resources which enable digital scholarship.
The project is led by Stéphane Goldstein of RIN and Alison Mackenzie of SCONUL,
and the project officer is consultant Charlie Inskip. Project updates and details can be
found at Twitter (@RILADS) and http://rilads.wordpress.com. Charlie will be actively
gathering examples of good practice over the next few months with a view to
disseminating results in the New Year.
For more information and to give recommendations of good practice examples,
contact:
Dr Charlie Inskip
Gmail: inskiprilads@gmail.com
Twitter: @RILADS / Blog: http://rilads.wordpress.com
RIN: Research Information Network: http://www.researchinfonet.org/
RIDLs: Research Information and Digital Literacies Coalition
http://www.researchinfonet.org/infolit/ridls/
SCONUL: Society of College, National and University Libraries
http://www.sconul.ac.uk/
JISC Design Studio – Developing Digital Literacies:
http://jiscdesignstudio.pbworks.com/w/page/46421608/Developing%20digital%20liter
acies
e. CILIP Update news story
Figure 6 CILIP Update, Dec 2012
f. Evaluation form
This project aims to deliver a small number of key outputs contributing to a wider investigation into the support
available to students, staff and researchers to enhance digital literacy. There are two strands to the project. One
is co-ordinated by Research Information Network (RIN) on behalf of Research Information and Digital
Literacies Coalition (RIDLs), the other by SCONUL under the JISC Developing Digital Literacies (DDL)
programme.
The RIN strand focuses on the identification and promotion of good practice in information handling and data
management training and development across the HE and research sectors. Its aim is to identify a representative
sample of case studies to illustrate information and data management training in Higher Education (including
those already documented in earlier research). The scope of these case studies will relate specifically to HE
researchers from postgraduate students to senior researchers (including supervisors).
Your resource has been nominated as a good practice example of information literacy for our research project,
Research Information Literacy and Digital Scholarship (RILADS). The information you provide in this form
will be used to help us assess and evaluate your resource. Hopefully it will also help you. Please feel free to
comment on the questions we are asking, should you wish.
We would be grateful if you could complete this form and return it to inskiprilads@gmail.com by 3rd
December 2012. We will be in touch if we require more information. Many thanks for your assistance. For more
information on the project please email us or see the project blog: http://rilads.wordpress.com/
The information you provide in this form will be used in reporting to the wider community. It will be
anonymised but should not be regarded as being confidential. If you wish to take part in this research but do not
wish your comments to be circulated and made available to others please make it clear on this form.
A. Details about the resource
Name of resource
URL
Hosting organization
Contact name
Email address
Phone number
B. Who is the course or resource designed for, and why?
Individual learners
1. Who are the learners that the course or resource is designed for?
a. By career stage (research students, research fellows, tenured
researchers…)
b. By discipline
2. What steps have you taken to assess learners’ need for the course or
resource?
a. If such steps have not been taken, what is the reason for this?
3. Given that the course or resource relates to information literacy, how does
it fit the broader professional development needs of the learners?
4. To what extent is the course or resource a response to demand from
learners, and if so, how have you identified this?
5. Is participation by learners in previous similar training activities a factor in
helping you to determine demand?
6. Is such participation in previous activities analysed, in terms of range of
learners (for instance, by discipline or career stage)?
7. How is the course or resource made appropriate to learners, for instance
with regards to their current level of skill, years of experience, disciplinary
areas?
8. How accessible is the course or resource, particularly for learners with
diverse needs?
9. What do learners need to know already in order to benefit from the course
or resource?
a. Have you set a baseline to reflect this?
10. On the basis of the assessment of need and demand, what have you done
to communicate clear learning objectives to those who attend the course or
use the resource?
The broader context
11. How does the course or resource fit with your institutional and/or
departmental policy and practice on researcher development?
12. Can the course or resource be transferred or adapted to suit needs or
contexts other than the one for which it is designed?
C. What knowledge, skills and competencies is the
course or resource intended to provide?
1. What areas of information literacy does the course or resource cover?
a. Information searching
and discovery
b. Assessment and
analysis of information
sources
c. Citation and referencing
d. Data management and
curation
e. Plagiarism, fraud,
copyright and other
relevant legal issues
f. Data protection and/or
freedom of information
g. Publishing and
dissemination of
research results
(including open access)
h. Other
2. Is the course or resource informed by models or frameworks such as
the RDF and the Seven Pillars?
a. If so, how?
3. Have you sought to make use of the information lens of the RDF?
D. How is the course or resource delivered?
1. What form does the course or resource take?
a. Classroom-based
courses (lecture or
workshop)
b. Individual tuition
c. Online courses
d. Training material (printed
or digital)
e. Other
2. What would you describe as the main features of the course or
resource?
a. Mode of instruction
b. Length of course
c. Use of assignments
d. Assessed/non-assessed
e. Other
3. Who designs and delivers the course or resource?
a. Library
b. Graduate school
c. IS department
d. Other (who?)
4. What are the different roles and responsibilities of these various
players with regards to the design and delivery of the course and
resource?
5. What skills and know-how are required by those devising, running or
managing the courses and resources?
a. How do these skills and know-how relate to the different roles
and responsibilities?
b. How were these skills and know-how acquired?
6. What support is required to run the course or resource (personnel,
facilities, financial)?
a. If the courses and resources take the form of digital/online resources,
are they free for others to use or can they be readily purchased?
E. Do you have any further comments or questions?
Please return this completed form to inskiprilads@gmail.com
Are you prepared for us to follow up your reply with additional questions? Y/N
May we take direct anonymised quotations from your form to use in reporting, journal
and other publications? Y/N
g. Follow-up assessment criteria
Criteria for assessing courses or resources
1. How many learners, by career stage and discipline have taken part in the
course or used the resource?
2. If the course has been run previously, or if the resource has been previously
used, what is the trend in terms of number of learners?
3. What have been the reactions and feedback from learners, notably on
whether learning objectives have been met, and on quality, originality and
attractiveness of the course or resource?
4. What is shown by any evaluation and analysis of such feedback?
5. What are the changes in learners’ knowledge, skills and competencies
resulting from the course or resource?
6. How has this been ascertained?
7. What are the improvements in researcher attitude, confidence, behaviour,
performance and practice that might be attributable to the activity/resource?
8. How has this been ascertained?
9. What has been the broader impact of the activity/resource, i.e. the extent to
which recipients have become better researchers, and the way in which this
has benefitted the institution?
10. What has been the feedback from the departments or other units in which the
learners work?
11. What challenges/barriers have been encountered in implementing the
development intervention (including lack of resources), and how are these
managed and/or overcome?
12. What steps were taken to improve the course or resource as a result of any
evaluation?
And, finally:
13. Do you have any further comments or questions?
14. Are you prepared for us to follow up your reply with additional questions?
15. May we take direct anonymised quotations from your form to use in reporting,
journal and other publications?
h. Skills list
Teaching skills:
“Knowledge of Information Literacy Skills pedagogy, teaching skills,
current teaching practices and developments also appropriate teaching skills. In
addition to knowledge of e-learning.”
“Teaching ability / Presentation Skills /???”
“Pro-active in supporting participants through their blogs, as they progress”
“Presenters need good oral written and oral communications skills, plus flexibility
to adapt the differing needs of attendees – range of experiences, disciplines etc.”
“Many of the tutors have completed a PGCert in teaching in HE although it is not
required.”
Librarian skills
“expertise in the practice of literatures searching and evaluation and expert
knowledge of subject resources and databases “
“Background knowledge, technical knowledge (bibliometrics etc.).”
“good knowledge of information and digital literacy”
“Understanding of the width of the information landscape and the research life
cycle.”
“Skills in the tools and resources covered by the activities”
University skills
“contextual understanding of university and HE, “
“Understanding of the specific needs of academic and staff, compared with those
of students; e.g. time frame of research, specificity of subject areas, time
pressures; wide variety of experiences and depth of knowledge of topics and
information resources.”
“Understanding of the specific needs of PGR students, compared with those of
UG/PG(T) students”
“Ability to liaise effectively with faculty and Skills Officers to promote the
programme”
Management skills
“Project Management”
Marketing skills
“creative skills for marketing posters; “
Life / office skills
“CPD, “
“keeping up to date”
“excellent organisational skills,”
“respect for the others’ role and expertise “
“an understanding of the foundation frameworks. “
“Able to manage time & be flexible when supporting participants”
“Collaborative approach in designing/promoting the course “
“Reflective when re-designing different iterations of the course”
“Presenters need good oral written and oral communications skills, plus flexibility
to adapt the differing needs of attendees”
“clerical skills for analyzing feedback forms, timetabling etc.”
“Ability to produce clear instructional materials (in MS Word)”
Researcher skills
“Understanding of the research experience”
“A thorough knowledge of the principles of research data management;”
“Understanding of researchers’ needs & the research process”
“Understanding of the width of the information landscape and the research life
cycle.”
“Understanding of research and understanding of effective online resource
design.”
“Understanding of postgraduate research “
Technical skills
“Dreamweaver editing, “
“uploading files to Blackboard”
“Ability to use site content management system (menu-driven)”
“IT Skills – various / “
“Technical skills, about the tools being described and taught “
“Ability to write for the web”
“maintaining the database.”
“Web skills for uploading materials to the VLE etc “
“Powerpoint skills at present”
i. Short listing process
CLASSROOM
Bath
Birmingham
Cardiff
Durham
Loughborough
LSE
Oxford
Salford
Warwick
ONLINE
Cranfield
Edinburgh
Glasgow Caledonian (Pilot)
Loughborough
Manchester
Nottingham
Open University
**Alphabetical order by institution
1. The original long list of 42 was reduced to 27 by circulating evaluation
questionnaires to named representatives of the original 42 resources; 27
questionnaires were returned by the deadline of Dec 7th 2012.
2. The 27 questionnaires were coded by question. Coding of the qualitative
questions gave rise to a range of themes for each question, while coding of the
quantitative questions gave some insight into the breadth and depth of
provision offered by each resource.
3. The sample was split into two groups, one offering a course- or workshop-based
approach (15), the other predominantly online (12).
4. The evaluation questionnaires provided by each member of these groups were
examined, focusing on the three main questions (Who, What, How). Coded
responses were considered in detail. Resources which gave positive
responses, illustrating a considered approach to the RIDLs criteria, were
ranked for each question according to the breadth and depth of their
provision. This ranking was performed for each of the three main question
groups.
5. The three ranked lists were then combined to determine which of the
Classroom-based and which of the Online-based resources consistently rose
to the top of the ranked list.
6. In order to give a broad view of the resources being considered, the two lists
(Classroom / Online) were also evaluated in terms of the type and style of the
resources available. This involved detailed analysis of each returned
questionnaire in association with the actual content and style of the resource
available to the researcher.
7. The resources primarily designed for UG were removed from the sample.
8. The draft shortlist (above), with the subsequent addition of Warwick, was
then sent additional evaluation questionnaires, which sought to gather data on
the assessment and evaluation processes followed by each of the 16 resources
on this list.
9. Those who returned their completed questionnaires (or provided similar feedback
in a different format, such as their own evaluation reports) were then
assessed according to their responses. This enabled the draft short list of 16
to be reduced to the originally planned 12 resources, 6 each of Classroom and
Online.
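The rank-combination described in steps 4 and 5 can be sketched in code. This is an illustrative reconstruction only, not the project's actual tooling: the resource names and rank values below are hypothetical, and the project combined its rankings with substantial subjective judgement rather than a purely mechanical rule.

```python
# Illustrative sketch (hypothetical data): combining the three per-question
# rankings ("Who", "What", "How") so that resources which consistently rank
# near the top of each list rise to the head of the combined list.
from statistics import mean

# Hypothetical ranks (1 = best) awarded to each resource on the three
# main question groups.
rankings = {
    "Resource A": [1, 3, 2],
    "Resource B": [2, 1, 1],
    "Resource C": [3, 2, 3],
}

# Sort by mean rank across the three question groups, lowest first.
combined = sorted(rankings, key=lambda name: mean(rankings[name]))
print(combined)
```

A mean-rank sort is only one way to operationalise "consistently rose to the top"; a median or Borda-count combination would serve equally well for a sample of this size.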
Limitations:
1. It should be noted that although the approach described includes some
quantitative elements, the final draft list should be considered in the light
of various subjective factors.
2. The original long list was derived from a combination of subjective researcher
selection, drawing on resources found by online search, and
recommendations from participants.
3. The analysed list of 27 was derived exclusively from the returned
questionnaires. If a questionnaire was not returned, the resource was not
included in the analysis. The self-selected nature of the project means that
the resources can only be considered as a ‘snapshot’ of examples of current
(Sep 2012 – Dec 2012) practice.
4. Subjective researcher input played a substantial part in interpreting and coding
the questionnaire responses.
5. The draft shortlisting ranking process was informed by a combination of
qualitative data and subjective researcher input.
6. The rationale for the final shortlist is not to present a ranked list of ‘best
practice’. This is not a competition. The final shortlist is designed to offer a
range of examples of different types of resource which may be used to inform
future practice.
7. The RIDLs criteria, which were used to derive the data, performed the
essential function of providing a framework for the data collection and
analysis, helping to mitigate the subjective nature of this type of
research.