PowerPoint slides from Opening Session

Effective, Sustainable and
Practical Library Assessment
Steve Hiller
Director, Assessment & Planning, University of Washington Libraries
ARL Visiting Program Officer
Jim Self
Director, Management Information Services, University of Virginia Library
ARL Visiting Program Officer
University of South Carolina University Libraries
December 3, 2007
Library Assessment
More than Numbers
Library assessment is a structured process:
• To learn about our communities
• To respond to the needs of our users
• To improve our programs and services
• To support the goals of the communities
Why Assess?
• Accountability and justification
• Improvement of services
• Comparisons with others
• Identification of changing patterns
• Marketing and promotion
• Opportunity to tell our own story
• Using data, not assumptions, to make decisions
  – Assumicide!
The Challenge for Libraries
• Traditional statistics are no longer sufficient
– Emphasize inputs – how big and how many
– Do not tell the library's story
– May not align with organizational goals and plans
– Do not measure service quality
• Need measurements from the user’s perspective
• Need the organizational culture and the skills to
answer a basic question:
What difference do we make to our communities?
ARL Sponsored Assessment
• Tools
– ARL Statistics
– LibQUAL+®
– MINES for Libraries
• Building a Community of Practice
– Library Assessment Conferences
– Service Quality Evaluation Academy
– Library Assessment blog
– Workshops
• Individual Library Consultation (Jim and Steve)
– Making Library Assessment Work (24 libraries in 2005-06)
– Effective, Sustainable, Practical Library Assessment (6 in 2007)
What We Found in Our Visits
• Strong interest in using assessment to improve customer
service and demonstrate value
• Uncertainty about how to establish and sustain assessment
• Lack of assessment knowledge among staff
• More data collection than data utilization
• Effectiveness not dependent on library size or budget
• Each library has a unique culture and mission
Effective Assessment
• Focuses on the customer
• Is aligned with library and university goals
• Assesses what is important
• Is outcomes oriented
• Develops criteria for success
• Uses appropriate and multiple assessment methods
• Uses corroboration from other sources
• Provides results that can be used
Sustainable Assessment Needs . . .
• Organizational leadership
• Sufficient resources
• Supportive organizational culture
• Identifiable organizational responsibility
• Connection to strategic planning and priorities
• Iterative process of data collection, analysis, and use
• Involvement of customers, staff, and stakeholders
Practical Assessment
• Keep it simple and focused – "less is more"
• Know when enough is enough
• Use assessment that adds value for customers
• Present results that are understandable
• Organize to act on results
Customer-Centered Library
and the Culture of Assessment
Customer-Centered Library
• All services and activities are viewed through the eyes of the customers
• Customers determine quality
• Library services and resources add value to the customer

Culture of Assessment
• Organizational environment in which decisions are based on facts, research, and analysis
• Services are planned and delivered to maximize positive customer outcomes
It’s not about us! It’s about the customer.
Understanding our Customers: What We
Need to Know to Support Our Communities
• What are their teaching, learning, and research interests?
• How do they work? What's important to them?
• How do they find information needed for their work?
• How do they currently use library/information services? How would they prefer to do so?
• How do they differ from each other in library use/needs?
• How does the library add value to their work? How does the library contribute to their success?
If It Were Only This Easy!
Good Assessment Starts Before You
Begin . . . Some Questions to Ask
• Define the question
  – What do you need to know, and why?
• How will you use the information/results?
• Where/how will you get the information?
  – Methods used
  – Existing data
  – New data (where or from whom will you get it?)
• How will you analyze the information?
• Who will act upon the findings?
University of Washington
(Site of the 2008 Library Assessment Conference!)
• Located in beautiful Seattle (metro population 3.2 million)
• Comprehensive public research
university
– 27,000 undergraduate students
– 12,000 graduate and professional
students (80 doctoral programs)
– 4,000 research and teaching
faculty
• $800 million annually in federal
research funds (2nd in U.S.)
• Large research library system
– $40 million annual budget
– 150 librarians on 3 campuses
University of Washington Libraries
Assessment Methods Used
• Large-scale user surveys every 3 years ("triennial survey"): 1992, 1995, 1998, 2001, 2004, 2007
• In-library use surveys every 3 years beginning 1993
• Focus groups/Interviews (annually since 1998)
• Observation (guided and non-obtrusive)
• Usability
• Usage statistics/data mining
Information about assessment program available at:
http://www.lib.washington.edu/assessment/
Case Study: UW Libraries Review of
Support of Bioscience Programs
Reasons for review
• Better understand how bioscientists work
• Understand significance and value of bioscience and
research enterprise to University
• Gauge extent and impact of interdisciplinary research
• Understand implications of changes in library use patterns
• Review viability of the Libraries' organizational structure/footprint
• Strengthen library connection in support of bioscience
programs and the research enterprise
The Importance of the Research Enterprise
University of Washington Operating Revenues
$2.4 Billion in 2005-06
• Research Grants 43%
• Tuition 16%
• State Appropriation 15%
• Investment Income 13%
• Gifts 9%
• Other 4%

Research Grants ($1 Billion):
• Health and Human Services: $510 million
• National Science Foundation: $95 million
• Other federal agencies: $190 million
• Industry/Foundations: $100 million
• Other non-federal: $110 million
More Than Surveys and Statistics
The Qualitative Often Provides the Key
• Increased use and importance of such qualitative methods as comments, interviews, focus groups, usability, and observation
• Statistical data often can’t tell us
– Who, how, why
– Value, impact, outcomes
• Qualitative provides information directly from users
– Their language
– Their issues
– Their work
• Qualitative provides context and understanding
Biosciences Review Process (2006)
• Define scope (e.g. what is “bioscience”?)
• Identify and mine existing data sources
– Extensive library assessment data
• Including usage information
– Institutional and external data
• Including bibliometric information
• Acquire new information through a customer-centered qualitative approach
  – Environmental scan
  – Interviews (12 faculty)
  – Focus groups (6 total – 3 faculty, 2 grad, 1 undergrad)
  – Peer library surveys
• NO NEW USER SURVEYS
Faculty Interview Key Findings
• First stop: Google or PubMed Central; also Web of Science (WOS)
• Those with grant support tend to buy books from Amazon
• The transaction cost from discovery to delivery is too high
• Need to integrate fragmented library systems and processes
• Graduate students are self-sufficient in finding information
• Faculty who teach undergraduates use libraries differently
• Had difficulty coming up with "new services" unprompted
Focus Group Themes
• Content is primary link to the library
– Identify library with e-journals; want more titles & backfiles
• Print is dead, really dead
– If not online want it delivered online
• Provide library-related services and resources in
our space not yours
– Discovery begins outside of library space with Google and PubMed; lesser use of library bibliographic databases
– Faculty/many grads go to physical library as last resort; too
many physical libraries
• Lack understanding of many library services and
resources
Biosciences Task Force Recommendations
• Integrate search/discovery tools into users' workflows
• Expand/improve information/service delivery options
• Make physical libraries more inviting/easier to use
– Consolidate libraries, collections and service points
– Reduce print holdings; focus on services and work space
• Use an integrated approach to collection allocations
• Get librarians to work outside library space
• Lead/partner in scholarly communications and E-science
• Provide more targeted communication and marketing
In God We Trust:
All Others Must Bring Data
Did themes raised in the interviews/focus groups reflect
the bioscience population? The campus community?
The 2007 Triennial Survey as corroborating source
Related questions:
• Mode of access (in-person, remote)
• Resource type importance
• Sources consulted for research
• Primary reasons for using Libraries Web sites
• Libraries' contribution to work and academic success
• Useful library services (new and/or expanded)
UW Triennial Library Survey
Number of Respondents and Response Rate 1992-2007
Group       2007         2004         2001         1998         1995         1992
Faculty     1455 (36%)   1560 (40%)   1345 (36%)   1503 (40%)   1359 (31%)   1108 (28%)
Grad        580 (33%)    627 (40%)    597 (40%)    457 (46%)    409 (41%)    560 (56%)
Undergrad   467 (20%)    502 (25%)    497 (25%)    787 (39%)    463 (23%)    407 (41%)
Mode of Library Use by Group 2007
(weekly or more often)
Group       Remote & Visit   Visit Only   Remote Only   Non-Weekly
Undergrad   39%              27%          14%           20%
Grad        45%              2%           47%           6%
Faculty     19%              1%           72%           8%
Library as Place
Change In Frequency of In-Person Visits 1998-2007 (weekly+)
[Line chart, 1998–2007: percentage visiting in person weekly or more; in-person visits declined for all three groups, with grads and undergrads visiting most often and faculty least.]
Change in Off-Campus Remote Use 1998-2007
(Percentage using library services/collections at least 2x week)
[Line chart, 1998–2007: off-campus remote use at least twice a week rose for all three groups, with grads highest, faculty next, and undergrads lowest.]
Graduate Student Mode of Access
by Academic Area (% using at least 2x week)
[Bar chart comparing three access modes for graduate students: visit in person, connect from campus, and connect off-campus, across Health Sciences, BioSciences, Phys Sci-Engineering, and Hum-Soc Sciences.]
Where Do They Go? Sources Consulted for
Information on Research Topics
(Scale of 1 "Not at All" to 5 "Usually")
[Bar chart: mean ratings by Undergrad, Grad, and Faculty (roughly 2.5–4.5) for Open Internet Search, Open Internet Reference Source, and Bibliographic Databases.]
“If it’s not on the Internet, it doesn’t exist.” My students at all levels behave this way.
They also all rely on Wikipedia almost exclusively for basic information.
Associate Professor, English
Faculty: Resource Type Importance
by Academic Area
(Scale of 1 “not important” to 5 “very important”)
[Bar chart: mean importance for faculty in Health Sciences, BioSci, Phy Sci-Engin, and Hum-Soc Science of journals published after 1985, books, journals published before 1985, and bibliographic databases.]
Reasons for Faculty Use of Libraries Web
Sites by Academic Area (Use at least 2x per week)
[Bar chart: percentage of faculty using the library catalog, bibliographic databases, and online journal articles at least twice a week, by area: Health Sci, BioSci, Phy Sci-Engin, and Hum-Soc Sci.]
Usefulness of New/Expanded Services
for Faculty & Grads
[Bar chart, 0–80% scale: percentage of faculty and grads rating each new/expanded service useful: scan on demand, digitize specialized collections, office delivery of books, integrate services into campus Web sites, and manage your info and data.]
Libraries' Contribution to:
(Scale of 1 "Minor" to 5 "Major")
[Bar chart: faculty and grad mean ratings, roughly 3.0–4.75, for being a more productive researcher, keeping current in your field, finding info in new or related areas, efficient use of time, and academic success.]
Survey Follow-Up Actions
• Probe deeper on specific library contributions to research
and student academic success using qualitative methods
– Interviews/focus groups beginning Winter 2008
– Review scope and effectiveness of information literacy programs
• Develop plan to deliver “print” content to faculty & grad
students in their format of choice and in their space
– Pilot test “scan on demand” begins January 2008
• Strengthen our subject librarian liaison efforts to better
understand and support research in their areas
– Develop standardized toolkit for assessing library connection to
research enterprise. Revisit scholarly communications policy
• Integrate library services & resources into user workflows
How UW Libraries Has Used Assessment
• Extend hours in Undergraduate Library (24/5.5)
• Create more diversified student learning spaces
• Eliminate print copies of journals
• Enhance usability of discovery tools and website
• Provide standardized service training for all staff
• Stop activities that do not add value
• Change/reallocate budgets and staffing
• Inform the strategic planning process
• Support budget requests to University
How Are We Doing?
Overall Satisfaction by Group 1995-2007
[Line chart, overall satisfaction on a 5-point scale, 1995–2007: Faculty rose from 4.25 to 4.56 and Grads from 4.18 to 4.36; UW Seattle undergrad scores ran lowest, in roughly the 3.8–4.2 range.]
You guys and gals rock!!!!!! We need to invest in our library system to keep it the best system in America. The tops! My reputation is in large part due to you.
Professor, Forest Resources
UVA MIS Consult
Collecting the Data at U.Va.
• Customer Surveys
• Staff Surveys
• Mining Existing Records
• Comparisons with peers
• Qualitative techniques
Corroboration
• Data are more credible if they are supported by
other information
• John Le Carre’s two proofs
UVa Customer Surveys
• Faculty
– 1993, 1996, 2000, 2004
– Separate analysis for each academic unit
– Response rates 59% to 70%
• Students
– 1994, 1998, 2001, 2005
– Separate analysis for grads and undergrads
– Undergrad response rates 43% to 50%
– Grad response rates 54% to 63%
• LibQUAL+™ in 2006
Analyzing U.Va. Survey Results
• Two Scores for Resources, Services, Facilities
– Satisfaction = Mean Rating (1 to 5)
– Visibility = Percentage Answering the Question (computation sketched below)
• Permits comparison over time and among groups
• Identifies areas that need more attention
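A minimal sketch of how these two scores could be computed from raw responses, assuming each question arrives as one entry per respondent with None marking a skipped question (the data here are illustrative, not actual U.Va. results):

```python
# Sketch: compute the two U.Va. survey scores for a single question.
# Satisfaction = mean rating (1-5) among respondents who answered;
# Visibility  = percentage of all respondents who answered at all.

def score_question(responses):
    """responses: one entry per respondent, a 1-5 rating or None if skipped."""
    answered = [r for r in responses if r is not None]
    satisfaction = sum(answered) / len(answered) if answered else None
    visibility = 100.0 * len(answered) / len(responses)
    return satisfaction, visibility

# Illustrative data: six respondents, two of whom skipped the question.
sat, vis = score_question([5, 4, None, 3, 5, None])
print(f"Satisfaction {sat:.2f}, Visibility {vis:.0f}%")  # Satisfaction 4.25, Visibility 67%
```

Tracking both numbers together is what lets a low-visibility service (few respondents even rate it) stand out even when its satisfaction score looks healthy.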
Reference Activity and Visibility in
Student Surveys
[Chart, 1993–2004: reference questions recorded per week in the annual sample fell from 6,008 to 1,756; reference visibility in the student surveys declined over the same period (values shown: 75%, 64%, 39%, 34%, 10%).]
Making the Most of LibQUAL
• Scan the results by user category
• Use thermometer charts
• Identify high and low desire areas
• Identify the 'red zones' (see the sketch after this list)
• Examine the comments
• Compare satisfaction scores with peers
• Follow up on problem areas
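The 'red zone' step can be made concrete with the standard LibQUAL+ gap logic: adequacy gap = perceived minus minimum, superiority gap = perceived minus desired, and a negative adequacy gap puts an item in the red zone. A minimal sketch, with illustrative scores rather than actual UVa results:

```python
# Sketch: flag LibQUAL+ 'red zone' items for one user category.
# Each item carries minimum, desired, and perceived means on the 9-point scale.

items = {
    "IC-1 Remote Access":   {"minimum": 7.1, "desired": 8.3, "perceived": 6.9},
    "IC-2 Library Website": {"minimum": 6.8, "desired": 8.0, "perceived": 7.2},
}

for name, s in items.items():
    adequacy = s["perceived"] - s["minimum"]     # negative => below minimum
    superiority = s["perceived"] - s["desired"]  # usually negative
    zone = "RED ZONE" if adequacy < 0 else "ok"
    print(f"{name}: adequacy {adequacy:+.1f}, superiority {superiority:+.1f} [{zone}]")
```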
LibQUAL+ 2006
University of Virginia Faculty
[Thermometer chart, 9-point scale: for each LibQUAL+ item (AS-1 to AS-9, IC-1 to IC-8, LP-1 to LP-5), the top of the blue bar marks the desired level of service, the bottom the minimum level, and the red square the perceived performance.]
LibQUAL+ 2006
University of Virginia Library
Areas Needing Attention
[Thermometer chart highlighting four items where perceived performance approaches or falls below the minimum level of service:]
• IC-1 Remote Access – Grads
• IC-2 Website – Faculty
• IC-8 Journals – Faculty
• IC-8 Journals – Grads
LibQUAL+ 2006
University of Virginia
Faculty Ratings of Journal Collections
[Thermometer chart, 9-point scale: desired, minimum, and perceived ratings of the journal collections for UVA faculty overall and for Architecture, Education, Engineering, Humanities, Science/Math, and Social Science faculty.]
LibQUAL Follow Up on Journals
• Examining the comments
• Drilling into data
• Talking to faculty and grad students
• Corroborating with other data
• Comparing with other libraries
2006 LibQUAL+™ Results
UVa and ARL Overall Satisfaction
             Undergrad Overall   Grad Overall   Faculty Overall
UVa          7.52                7.48           7.87
ARL Range    6.61 to 7.63        6.51 to 7.63   5.87 to 7.87
ARL Mean     7.18                7.16           7.24
The Balanced Scorecard
Managing and Assessing Data
• The Balanced Scorecard is a layered and
categorized instrument that
– Identifies the important statistics
– Ensures a proper balance
– Organizes multiple statistics into an intelligible
framework
Metrics
• Specific targets indicating full success, partial success, and failure
• At the end of the year we know whether we have met our target for each metric (a minimal evaluation sketch follows)
• The metric may be a complex measure encompassing several elements
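A minimal sketch of how such a tiered metric could be checked at year end; the evaluate() helper and its structure are illustrative, not the Library's actual scorecard tooling:

```python
# Sketch: year-end check of a Balanced Scorecard metric against tiered targets.
# Target1 = full success, Target2 = partial success, below both = target not met.

def evaluate(result, target1, target2):
    """Assumes higher results are better, as in the rating/percentage metrics here."""
    if result >= target1:
        return "Target1 met (full success)"
    if result >= target2:
        return "Target2 met (partial success)"
    return "Target not met"

# Metric U.1.A, FY05: overall survey ratings against Target1 = 4.00, Target2 = 3.90.
for group, score in [("Undergraduates", 4.08), ("Graduate Students", 4.13)]:
    print(group, "->", evaluate(score, 4.00, 3.90))
```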
What Do We Measure?
• Customer survey ratings
• Staff survey ratings
• Timeliness and cost of service
• Usability testing of web resources
• Success in fundraising
• Comparisons with peers
Metric U.1.A:
Overall Rating in Student and
Faculty Surveys.
• Target1: A score of at least 4.00 (out of 5.00)
from each of the major constituencies.
• Target2: A score of at least 3.90 from each of
the major constituencies.
• FY05 Result: Target1
– Undergraduates 4.08
– Graduate Students 4.13
Metric U.4.B: Turnaround
time for user requests
• Target1: 75% of user requests for new books
should be filled within 7 days.
• Target2: 50% of user requests for new books
should be filled within 7 days.
• Result FY06: Target1.
– 79% filled within 7 days.
Metric U.3.A:
Circulation of New Monographs
• Target1: 60% of newly cataloged monographs
should circulate within two years.
• Target2: 50% of new monographs should
circulate within two years.
• Result FY06: Target1.
– 61% circulated.
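Reusing the illustrative evaluate() helper from the sketch above, the FY06 results reported for these two metrics check out the same way:

```python
# Metric U.4.B, FY06: 79% of new-book requests filled within 7 days (targets 75% / 50%).
print(evaluate(79, 75, 50))   # -> Target1 met (full success)

# Metric U.3.A, FY06: 61% of newly cataloged monographs circulated within
# two years (targets 60% / 50%).
print(evaluate(61, 60, 50))   # -> Target1 met (full success)
```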
Using Data for Results at UVa
• Additional resources for the science libraries (1994+)
• Redefinition of collection development (1996)
• Initiative to improve shelving (1999)
• Undergraduate library open 24 hours (2000)
• Additional resources for the Fine Arts Library (2000)
• Support for transition from print to e-journals (2004)
• New and improved study space (2005-06)
In Conclusion
Assessment is not…
• Free and easy
• A one-time effort
• A complete diagnosis
• A roadmap to the future
Assessment is…
• A way to improve
• An opportunity to know our customers
• A chance to tell our own story
• A positive experience
Moving Forward
• Keep expectations reasonable and achievable
• Strive for accuracy and honesty, not perfection
• Assess what is important
• Use the data to improve
• Keep everyone involved and informed
• Focus on the customer
For more information…
• Steve Hiller
  – hiller@u.washington.edu
  – www.lib.washington.edu/assessment/
• Jim Self
  – self@virginia.edu
  – www.lib.virginia.edu/mis
  – www.lib.virginia.edu/bsc
• ARL Assessment Service
  – www.arl.org/stats/initiatives/esp/