Library Services Assessment
Isla Jordan, Carleton University
Julie McKenna, University of Regina
February 2, 2007
OLA Super Conference 2007: Session 1408
Outline
1. Definition and Purpose
2. Survey of Assessment Practices
3. Types of Assessment
4. Benchmarks, Standards and EBL
5. Drivers of Assessment
6. Tools and Techniques
7. Assessment Strategy
8. Questions
Assessment
“… a critical tool for understanding library customers and offering services, spaces, collections, and tools that best meet their needs. Without good assessment, libraries could lose touch with users’ desires and needs and even become irrelevant.”
Nardini (2001)
Assessment
“…any activities that seek to measure the library’s impact on teaching, learning and research as well as initiatives that seek to identify user needs or gauge user satisfaction or perceptions with the overall goal being the data-based and user-centered continuous improvement of our collections and services.”
Pam Ryan, libraryassessment.info
The purpose of assessment in libraries
1. To understand user interaction with library resources and services; and
2. To capture data that inform the planning, management and implementation of library resources and services.
Bertot, 2004
Survey of Assessment Practices in Canadian University Libraries
Winter 2007
Survey of Assessment Practices - Purpose
1. Benchmark services assessment practice
2. Capture some measures about the culture of assessment
Survey Sections
• Demographic Information
• Assessment Planning
• Involvement in Assessment in Organization
• Collection and Use of Data to Inform Decision-Making
• Final Comments
Survey Participants
• Invitation to complete a web-based survey to all University Librarians of:
  • Council of Prairie and Pacific University Libraries (COPPUL)
  • Ontario Council of University Libraries (OCUL)
  • Council of Atlantic University Libraries (CAUL/AUBO)
• Invitation (February 12, 2007) to complete a French edition of the web-based survey:
  • members of Conférence des recteurs et des principaux des universités du Québec (CREPUQ)
Survey of Assessment Practices
• English Survey
  • 60 invitations; 39 respondents
  • 65% response rate
• French Survey
  • To launch February 12, 2007
Thank you to …
• University of Toronto
• UWO
• Queen’s University
• McMaster
• University of Windsor
• York University
• Guelph University
• Nipissing University
• University of Waterloo
• Carleton University
• Brock University
• Memorial University
• University of Saskatchewan
• UBC
• University of Alberta
And many more….
Types of Assessment
1. Input & Output
2. Service Quality
3. Performance Measures
4. Outcomes or Impact
1. Input & Output
• Input measures: expenditures & resources
  • Funding allocations, # of registered students, print holdings, etc.
• Output measures: activities & service traffic
  • Reference transactions, lending and borrowing transactions, # of instruction sessions, program attendance, etc.
• Ratios
  • Students/librarians, print volume holdings/student, reference transactions/student, etc.
Survey Results – how output data is used
Type of data:
• Gate count
• Body counts
• Reference transactions
• Circulation statistics
Decision-making:
• Hours
• Staffing & scheduling
• Service points
• Collection decisions
2. Service Quality
• Services defined as all programs, activities, facilities, events, …
• Measures capture results from interactions with services
• Subjective evaluation of “customer service”
• Measure of the affective relationship

“The only criteria that count in evaluating service quality are defined by customers. Only customers judge quality; all other judgments are essentially irrelevant.”
(Zeithaml, Parasuraman and Berry 1990)
LibQUAL+
• Association of Research Libraries
  • Standard for service quality assessment (2003)
• Total market survey
• Based in Gap Analysis Theory (see the sketch below)
  • User perceptions and expectations of services
• Measures outcomes and impacts
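LibQUAL+ scores each item three ways on a 1-9 scale: minimum acceptable service level, desired service level, and perceived service level; the gap analysis is the differences between them. A minimal sketch of that arithmetic (the function name and sample scores are our own, purely for illustration):

```python
# Sketch of LibQUAL+-style gap scoring. Respondents rate each item on a
# 1-9 scale for minimum acceptable, desired, and perceived service levels.
# The sample numbers below are invented for illustration.

def gap_scores(minimum: float, desired: float, perceived: float) -> dict:
    """Return the two standard gap measures for one survey item."""
    return {
        # Adequacy gap: positive means service exceeds the minimum.
        "adequacy": perceived - minimum,
        # Superiority gap: usually negative; positive means service
        # exceeds even the desired level.
        "superiority": perceived - desired,
    }

# Hypothetical mean scores for one item:
print(gap_scores(minimum=6.1, desired=7.8, perceived=6.9))
# {'adequacy': 0.79..., 'superiority': -0.90...}
```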
3. Performance Measures
Involves the use of efficiency and effectiveness measures:
• Availability of resources
• Usability of programs, resources and services
• Web page analysis
  • Content analysis
  • Functionality analysis
• Cost analysis
4. Outcomes or Impacts
“the ways in which library users are changed as a result of their interaction with the Library’s resources and programs”
Association of College & Research Libraries Task Force on Academic Library Outcomes Assessment Report, 1998
Examples
• The electronic journals were used by 65 scholars in the successful pursuit of a total of $1.7 million in research grants in 2004.
• In a 2003 study, eighty-five percent of new faculty reported that library collections were a key factor in their recruitment.
LibQUAL+ Measures Outcomes
• The library helps me stay abreast of developments in my field(s) of interest.
• The library aids my advancement in my academic discipline.
• The library enables me to be more efficient in my academic pursuits.
• The library helps me distinguish between trustworthy and untrustworthy information.
Benchmarks, standards and EBL
• Standards: “Measures that tie the value of libraries more closely to the benefits they create for their users” (NISO 2001, National Information Standards Organization)
• Benchmarking: improving ourselves by learning from others (UK Public Sector Benchmarking Service)
Benchmarks, standards and EBL
• EBL (Evidence Based Librarianship): “attempts to integrate user reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making.” (Booth, “Counting What Counts” 2006)
Example of a Standard
Information Literacy Standards for Science and Engineering Technology (ACRL 2006)
• Standard #1: The information literate student determines the nature and extent of the information needed.
• Performance Indicator #3: The information literate student has a working knowledge of the literature of the field and how it is produced.
• Outcome #a: ... student knows how scientific, technical, and related information is formally and informally produced, organized, and disseminated.
CACUL Standards Committee
Goals:
• Add Canadian context to existing standards in college and university libraries, e.g. ACRL
• Prepare report for CACUL AGM at CLA 2007
• Form new team in summer 2007
Contact Jennifer Soutter jsoutter@uwindsor.ca
Survey Results: Drivers of Assessment
• University Library Administration: 92%
• Need for evidence to inform planning: 87%
• University Administration: 62%
• CARL, ARL or regional library consortium: 54%
Multiple Methods of Listening to Customers
• Transactional surveys
• Mystery shopping
• New, declining, and lost-customer surveys
• Focus group interviews
• Customer advisory panels
• Service reviews
• Customer complaint, comment, and inquiry capture
• Total market surveys
• Employee field reporting
• Employee surveys
• Service operating data capture
Note. A. Parasuraman. The SERVQUAL Model: Its Evolution and Current Status. (2000). Paper presented at ARL Symposium on Measuring Service Quality, Washington, D.C.
Canadian Adoption of LibQUAL+: Benefits
• Quick, inexpensive
• Standardized and tested instrument and practice
• Data set of comparables for Canada
• Insight into best practices at peer institutions
• Build staff expertise and encourage evidence-based practice and practitioners
• Opportunity to introduce Canadian changes to instrument
User Surveys: LibSAT, LibPAS
• Continuous customer feedback
• LibSAT measures satisfaction
• LibPAS (beta) measures performance
http://www.countingopinions.com/
Usability testing
• Gives user perspective
• Often for website design:
  • e.g. “user driven web portal design” (U Toronto 2006)
• Also for physical space:
  • e.g. “wayfinding” in library: http://www.arl.org/arldocs/stats/statsevents/laconf/2006/Kress.ppt
Instruction Program Example - Assessment Methods
• Learning outcomes
  • Student performance on examinations, assignments
  • Pre- and post-test results
  • Level of "information literacy"
  • Qualitative and quantitative
• Program service measures (outputs)
  • # of instruction sessions offered, requests for course-specific support, # of session attendees (by discipline, by faculty member, by course), logins to library-created online tutorials, # of course pages created within university’s learning portal, etc.
• Student course evaluations & peer evaluations
• Service quality assessment
  • LibQUAL+ (gap between expectations and perceptions)
Examples
• Use patterns
  • laptop loans, GIS over paper maps, eBooks…
• Space usage studies
  • e.g. Learning Commons study (University of Massachusetts Amherst)
• Instruction and Information Literacy
  • e.g. use of online learning modules
Electronic resources assessment
• Statistics not being systematically captured for digital collections or services
• Need for standard measures for use of digital collections is increasingly important:
  • to justify huge expenses of electronic collections
  • decline in use of traditional services (reference, ILL)
Electronic resources assessment
COUNTER: real-time acquisition of usage statistics
• Imports usage statistics from content vendors in a uniform format (COUNTER: Counting Online Usage of Networked Electronic Resources)
• Reduces need to retrieve statistical data on a resource-by-resource basis
• Can compare usage statistics with cost information to evaluate service benefits of e-resources (a sketch of that comparison follows)
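The usage-to-cost comparison is simple arithmetic once the COUNTER numbers are in hand. A minimal sketch, assuming annual full-text request totals have already been extracted from JR1 reports; the journal names and all figures are invented:

```python
# Sketch of cost-per-use calculation from COUNTER-style usage data.
# Assumes annual full-text request totals per journal (from JR1 reports)
# and subscription costs; all names and figures are invented.

annual_requests = {"Journal of Examples": 1240, "Acta Hypothetica": 87}
annual_cost = {"Journal of Examples": 3100.00, "Acta Hypothetica": 2600.00}

for journal, uses in sorted(annual_requests.items()):
    cost = annual_cost[journal]
    # Cost per use: the usual figure of merit in an e-resource review.
    print(f"{journal}: {uses} uses, ${cost / uses:.2f} per use")
```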
Electronic resources assessment
• Output statistics for ScholarsPortal databases and e-journals, e.g.
  • the number of requests for articles
  • holdings of different aggregators, to see overlap
  • Web logs, to see patterns of use (a parsing sketch follows)
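Web-log analysis reduces to extracting a timestamp and a requested resource from each log line and tallying. A minimal sketch over Apache-style access-log lines; the log entries are invented and real ScholarsPortal logs will differ in format:

```python
# Sketch of mining patterns of use from an Apache-style access log.
# The sample lines and monthly tally are illustrative assumptions.
import re
from collections import Counter

sample_log = """\
10.0.0.1 - - [03/Feb/2007:10:12:01 -0500] "GET /journals/article123 HTTP/1.1" 200 5120
10.0.0.2 - - [03/Feb/2007:10:14:44 -0500] "GET /journals/article456 HTTP/1.1" 200 2048
10.0.0.1 - - [07/Mar/2007:09:01:13 -0500] "GET /journals/article123 HTTP/1.1" 200 5120
"""

# Pull out the month/year and the requested path from each line.
pattern = re.compile(r'\[(\d{2})/(\w{3})/(\d{4}):.*?\] "GET (\S+)')

by_month = Counter()
for line in sample_log.splitlines():
    m = pattern.search(line)
    if m:
        _, month, year, path = m.groups()
        by_month[f"{month} {year}"] += 1

print(by_month)  # Counter({'Feb 2007': 2, 'Mar 2007': 1})
```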
Survey Results: Statistics - electronic resources
[Bar chart: number of libraries (0 to ~25) that Gather and that Analyze e-resource statistics, grouped by frequency: Never, Sometimes, Frequently, Always]
Survey Results: Electronic resources assessment
• "we are gathering e-resources stats as part of an overall journal review"
• “The Library is currently reviewing Scholarly Statistics, a product designed to gather and present for analysis e-resource statistics. Also under consideration is an ERM which, along with its other capabilities, will provide statistic analysis.”
Electronic resources assessment
“I have been busy this week with the compilation of electronic journal usage statistics for ARL. To complete Section 15 (Number of successful full-text article requests) in the Supplementary Statistics section, I am limiting myself to COUNTER-compliant JR1 statistics provided by the publisher. Still, I am encountering unexpected complexities. … The JR1 format is based on the calendar year, but the ARL statistics are reported on the budget year. This means for every publisher I have to compile two years’ worth of data and manipulate it.”
http://www.libraryassessment.info/
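The calendar-year to budget-year mismatch described in the quote comes down to re-aggregating monthly counts across two JR1 reports. A minimal sketch, assuming a May-April budget year and monthly full-text request counts keyed by (year, month); the fiscal-year start and all counts are our own illustrative assumptions:

```python
# Sketch of mapping calendar-year JR1 monthly counts onto a budget year.
# Assumes a May-April fiscal year (FY_START_MONTH) and monthly full-text
# request counts keyed by (year, month); the counts are invented.

FY_START_MONTH = 5  # e.g. budget year = May 2006 through April 2007

monthly_requests = {}
for m in range(1, 13):
    monthly_requests[(2006, m)] = 100 + m  # placeholder 2006 counts
    monthly_requests[(2007, m)] = 120 + m  # placeholder 2007 counts

def budget_year_total(counts, fy_start_year):
    """Sum the twelve budget-year months spanning two calendar years."""
    months = [(fy_start_year, m) for m in range(FY_START_MONTH, 13)]
    months += [(fy_start_year + 1, m) for m in range(1, FY_START_MONTH)]
    return sum(counts[ym] for ym in months)

print(budget_year_total(monthly_requests, 2006))  # FY 2006/07 total
```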
Surveys, Interviews, Focus Groups
• Surveys
  • quick to implement, difficult to design
  • identify issues, pick up anomalies
  • wording is critical
  • test, test, test ….
  • users over-surveyed
• Interviews and focus groups
  • more scope for follow-up, explanation
  • subjective, time-consuming
Survey Results: Top 5 planned assessment studies
1. User satisfaction survey / LibQUAL
2. Gate traffic study
3. Electronic database use
4. Electronic journal use
5. Usability of the website
Survey Results: Staff Abilities
Strengths:
• Formal presentations
• Formal reports
• Draw conclusions
• Make recommendations
• Project management
• Facilitate focus groups
Weaknesses:
• Sampling
• Research design
• Focus group research
• Survey design
• Qualitative analysis
Challenges of assessment
• Gathering meaningful data
• Acquiring methodological skills
• Managing assessment data
• Organizing assessment as a core activity
• Interpreting data within the context of user behaviours and constraints
(Troll Covey, 2002)
Survey Results: Where is assessment placed?
• Assessment Librarian (2 institutions)
• Assessment Coordinator
• Libraries Assessment and Statistics Coordinator
• Library Assessment and Information Technology Projects Coordinator
• Librarian, Evaluation & Analysis
• Manager, Evaluation & Analysis
Survey Results: Who else is assigned assessment responsibility?
• Distributed to all unit heads or team leaders (4)
• AULs have responsibility (6)
• UL or Director (3)
• Administrative or executive officer (4)
• Access services or circulation (3)
• Other positions (12)
Survey Results: Committees
• Assessment Committee
• Priorities and Resources Committee
• Statistics Committee
• LibQual Committee
• LibQUAL+ Working Group
• Library Services Assessment Committee
• Community Needs Assessment Committee
• PR/Communications Committee
• Accreditation Self-Study Steering Committee
• Senior Management Group
• Cooperative Planning Team
Q. 10 Does your library have an assessment plan?
[Pie chart] Yes: 33%; No: 57%; Not yet but one is in progress: 10%
Q. 11 At your university, do you feel that there is greater emphasis on assessment than in previous years?
[Pie chart] Yes: 74%; No: 21%; Undecided: 5%
Q. 12 Do you anticipate that the impetus for assessment practice will be greater next year than this year?
[Pie chart] Yes: 82%; No: 10%; Undecided: 8%
Q. 13 Some see evidence-based assessment practice as a trend. Do you believe that the increasing interest in this assessment practice will continue?
[Pie chart] Yes: 81%; No: 3%; Undecided: 3%; Don’t agree it is a trend: 13%
Services Assessment Strategy
“The evaluation environment is increasingly complex, and requires knowledge of multiple evaluation frameworks, methodologies, data analysis techniques, and communication skills”
Note. J.T. Snead et al. Developing Best-Fit Evaluation Strategies. (2006). Paper presented at Library Assessment Conference, Virginia.
Assessment – Continuing Commitment
[Cycle diagram] Research Question → Methodology → Analysis → Reporting → (back to Research Question)
Services Assessment Strategy
• Decide what you need to know and why
  • Assign priorities
  • Confirm timelines
• Commit to and carry out methodologies for discovery
• Analysis and reporting
• Continuous assessment and reporting commitment
Culture of Assessment
• is an organizational environment in which decisions are based on facts, research and analysis
• where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders
• exists in organizations where staff care to know what results they produce and how those results relate to customers’ expectations
• organizational mission, values, structures, and systems support behavior that is performance and learning focused
(Lakos, Phipps and Wilson, 1998-2002)
Resources
ARL:
• ARL New Measures website (background info)
• Canadian LibQUAL consortium
  • summer 2007 workshop
  • Sam Kalb kalbs@post.queensu.ca
• Service Quality Evaluation Academy (“boot camp”)
Resources
ARL (cont’d):
• ARL visit: “Making Library Assessment Work”
  • 1½ day visit from Steve Hiller and Jim Self
  • pre-visit survey, presentation to staff, interviews, meetings, written report
  • UWO participated; for more information, contact Margaret Martin Gardiner mgardine@uwo.ca
• 2006 Library Assessment Conference: http://new.arl.org/stats/statsevents/laconf/index.shtml
Resources
Assessment blog: libraryassessment.info
Journals, conferences:
• Performance Measurement and Metrics
• Evidence Based Library and Information Practice
• Northumbria International Conference on Performance Measures
Resources
Books & Papers:
• Blecic, D.D., Fiscella, J.B. and Wiberley, S.E. Jr. (2007) Measurement of Use of Electronic Resources: Advances in Use Statistics and Innovations in Resource Functionality. College & Research Libraries, 68 (1), 26-44.
• Booth, A. (2006) Counting what counts: performance measurement and evidence-based practice. Performance Measurement and Metrics, 7 (2), 63-74.
• Brophy, P. (2006) Measuring Library Performance: principles and techniques. London, Facet Publishing.
Resources
Books & Papers:
• Bertot, J.C. et al. (2004) Functionality, usability, and accessibility: Iterative user-centered evaluation strategies for digital libraries. Performance Measurement and Metrics, 7 (1), 17-28.
• Brekke, E. (1994) User surveys in ARL libraries. SPEC Kit 205. Chicago, American Library Association.
• Covey, D.T. (2002) Academic library assessment: new duties and dilemmas. New Library World, 103 (1175/1176), 156-164.
Resources
Books & Papers:
• Lakos, A., Phipps, S. and Wilson, B. (1998-2000) Defining a “Culture of Assessment”. http://personal.anderson.ucla.edu/amos.lakos/Present/North2001/Culture-of-Assessment-2001-4.pdf
• Nardini, H.G. (2001) Building a Culture of Assessment. ARL Bimonthly Report, 218 (Oct 2001). http://www.arl.org/resources/pubs/br/index.shtml
Resources
Books & Papers:
• Snead, J.T. et al. (2006) Developing Best-Fit Evaluation Strategies. Library Assessment Conference, Virginia. <http://www.arl.org/stats/statsevents/laconf/06schedule.shtml>
• Zeithaml, V.A., Parasuraman, A. and Berry, L.L. (1990) Delivering Quality Service: balancing customer perceptions and expectations. London, Collier Macmillan.
Thank you!
Questions or comments are welcome.
Contact us:
Isla Jordan, Carleton University
Isla_Jordan@carleton.ca
Julie McKenna, University of Regina
Julie.McKenna@uregina.ca