Proving the Value of Your Library When “Everything’s Available on the Internet”

Sharjah International Book Fair / ALA Library Conference
Expo Centre, Sharjah, United Arab Emirates
November 11–13, 2014

Lynn Silipigni Connaway, Ph.D.
Senior Research Scientist
OCLC Research
Chair, ACRL Value of Academic Libraries Committee
The Road Travelled
Value of Academic Libraries Reports
Freely available http://acrl.org/value
Themes from Summits
• Accountability
• Unified approach
• Student learning/success
• Evidence-based
Value of Academic Libraries Initiative
Keep Up-to-Date
• Value of Academic Libraries Blog
• Valueography
Outreach & Collaboration
• Presentations (e.g., CNI, LAC, & Northumbria)
• ACRL Liaisons Assembly
Assessment Management Systems
Under Discussion
• Librarian Competencies
• Research agenda
In Process
• Library Poster
ACRL Plan for Excellence
Value of Academic Libraries
Goal: Academic libraries demonstrate alignment with and impact on institutional outcomes.
Objectives:
• Leverage existing research to articulate and promote the value of academic and research libraries.
• Undertake and support new research that builds on the research agenda in The Value of Academic Libraries: A Comprehensive Review and Report and the summit white paper, Connect, Collaborate, and Communicate: A Report from the Value of Academic Libraries Summits.
• Influence national conversations and activities focused on the value of higher education.
• Develop and deliver responsive professional development programs that build the skills and capacity for leadership and local data-informed and evidence-based advocacy.
Cycle of Assessment focused on Library Value
1. Defining Outcome(s) and 2. Setting Criteria (Planning, June–July 2013)
3. Performing Action(s) and 4. Gathering Evidence (Acting, August–December 2013)
5. Analyzing Evidence (Reflecting, January–February 2014)
6. Planning Change (Sharing, March–May 2014)
Recommendations
• Define outcomes
• Create or adopt systems for assessment management
• Determine what libraries enable students, faculty, student affairs professionals, administrators, and staff to do
• Develop systems to collect data on individual library user behavior, while maintaining privacy
• Record and increase library impact on student enrollment
• Link libraries to improved student retention and graduation rates
• Review course content, readings, reserves, and assignments
• Document and augment library advancement of student experiences, attitudes, and perceptions of quality
• Track and increase library contributions to faculty research productivity
• Investigate library impact on faculty grant proposals and funding, a means of generating institutional income
• Demonstrate and improve library support of faculty teaching
• Create library assessment plans
• Promote and participate in professional development
• Mobilize library administrators
• Leverage library professional associations
Recommendations
1. Increase the profession’s understanding of library
value in relation to various dimensions of student
learning and success.
2. Articulate and promote the importance of
assessment competencies necessary for
documenting and communicating library impact on
student learning and success.
3. Create professional development opportunities for
librarians to learn how to initiate and design
assessment that demonstrates the library’s
contributions to advancing institutional mission and
strategic goals.
Recommendations cont.
4. Expand partnerships for assessment
activities with higher education constituent
groups and related stakeholders.
5. Integrate the use of existing ACRL
resources with library value initiatives.
Assessment in Action Goals
Professional Competencies
Collaborative Relationships
Approaches, Strategies, Practices
Team Approach
• Faculty Member
• Institutional Researcher/Assessment Officer
• Librarian (team leader)
AiA 2014 Institutional Teams
Library Factors Examined
• instruction: games, single/multiple session,
course embedded, tutorials
• reference
• physical space
• discovery: institutional web, resource guides
• collections
• personnel
Variety of Tools/Methods
• survey
• interviews
• focus group(s)
• observation
• pre/post test
• rubric
• student portfolio
• research paper/project
• other class assignments
• test scores
• GPA
• degree completion rate
• retention rate
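Several of the measures above (test scores, GPA, completion and retention rates) lend themselves to simple group comparison. A minimal sketch in Python, with invented student records and field names, comparing mean GPA for students who did and did not attend library instruction:

```python
# Sketch: compare mean GPA for students who did and did not attend
# library instruction. Records and field names are invented.
records = [
    {"gpa": 3.4, "attended_instruction": True},
    {"gpa": 2.9, "attended_instruction": False},
    {"gpa": 3.7, "attended_instruction": True},
    {"gpa": 3.1, "attended_instruction": False},
    {"gpa": 3.5, "attended_instruction": True},
]

def mean_gpa(rows):
    """Average GPA over a list of student records."""
    return sum(r["gpa"] for r in rows) / len(rows)

attended = [r for r in records if r["attended_instruction"]]
others = [r for r in records if not r["attended_instruction"]]

print(f"Attended instruction: mean GPA {mean_gpa(attended):.2f} (n={len(attended)})")
print(f"No instruction:       mean GPA {mean_gpa(others):.2f} (n={len(others)})")
```

A raw difference in means does not by itself demonstrate library impact; confounds such as prior achievement and course load still have to be accounted for.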
Some Initial Questions
• What is your definition of assessment?
• What comes to mind when you hear the term
“assessment”?
• What benefits do you see for assessment?
• What are your concerns?
Assessment Defined
Process of defining, selecting, designing, collecting, analyzing, interpreting, and using information to increase service/program effectiveness.
Why Assessment?
• Answers questions:
– What do users/stakeholders
want & need?
– How can services/programs
better meet needs?
– Is what we do working?
– Could we do better?
– What are problem areas?
• Traditional statistics don’t tell
whole story
Importance of Assessment
“Librarians are increasingly called
upon to document and articulate the
value of academic and research
libraries and their contribution to
institutional mission and goals.”
(ACRL Value of Academic Libraries, 2010, p. 6)
Formal vs. Informal Assessment
• Formal Assessment
  – Data driven
  – Evidence-based
  – Accepted methods
  – Recognized as rigorous
• Informal Assessment
  – Anecdotes & casual observation
  – Used to be norm
  – No longer acceptable
What We Know About Assessment
• Ongoing process to understand & improve
service
• Librarians are busy with day-to-day work &
assessment can become another burden
• Can build on what has already been
done or is known
Outcomes Assessment Basics
• Outcomes: “The ways in which library users are
changed as a result of their contact with the library’s
resources and programs” (ALA, 1998).
• “Libraries cannot demonstrate institutional value to
maximum effect until they define outcomes of
institutional relevance and then measure the degree to
which they attain them” (Kaufman & Watstein, 2008, p. 227).
Outputs & Inputs
• Outputs
– Quantify the work
done
– Don’t relate factors to
overall effectiveness
• Inputs
– Raw materials
– Measured against
standards
– Insufficient for
overall assessment
Principles for Applying Outcomes Assessment
• Center on users
  – Assess changes in service/resource use
• Relate to inputs; identify “best practices”
• Use a variety of methods to corroborate conclusions
  – Choose a small number of outcomes
  – Need not address every aspect of service
• Adopt a continuous process
Examples of Outcomes
• User matches information need to
information resources
• User can organize an effective search
strategy
• User effectively searches online catalog &
retrieves relevant resources
• User can find appropriate resources
“One size fits none!”
(My Mom)
Steps in Assessment Process
• Why? Identify purpose
• Who? Identify team
• How? Choose model/approach/method
• Commit
• Training/planning
Survey Research
“…to look at or to see over or
beyond…allows one to generalize from a
smaller group to a larger group”
(Connaway & Powell, 2010, p. 107)
Survey Research: Advantages
• Explores many aspects of service
• Demographic information
• Online surveys (e.g., Survey Monkey) provide statistical
analysis
• Controlled sampling
• High response rates possible
• Data reflect characteristics & opinions of respondents
• Cost effective
• Can be self-administered
• Survey large numbers
(Hernon & Altman, 1998)
Survey Research: Disadvantages
• Produces a snapshot of situation
• May be time consuming to analyze &
interpret results
• Produces self-reported data
• Data lack depth of interviewing
• High return rate can be difficult
(Hernon & Altman, 1998)
Design Issues
• Paper or online (e.g., Survey Monkey)
• Consider order of questions
• Demographic questions first
• Instructions
  – Be specific
  – Introduce sections
• Keep it simple
• Pre-test!
Survey Research: Interpreting Results
• Objectively analyze all data
• Interpret results with appropriate level of
precision
• Express proper degree of caution about
conclusions
• Use data as input in outcome measures
• Consider longitudinal study; compare results over time (see the sketch below)
• Qualitative data requires special attention
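To illustrate the longitudinal comparison suggested above, a minimal sketch assuming two invented waves of responses to the same 5-point satisfaction item:

```python
# Sketch: compare two survey waves on the same 5-point satisfaction
# item. Response lists are invented.
wave_2013 = [4, 3, 5, 2, 4, 4, 3]
wave_2014 = [5, 4, 4, 3, 5, 4, 4]

def summarize(responses):
    """Return the mean rating and the share of 4-or-5 ('satisfied') ratings."""
    mean = sum(responses) / len(responses)
    satisfied = sum(1 for r in responses if r >= 4) / len(responses)
    return mean, satisfied

for label, wave in (("2013", wave_2013), ("2014", wave_2014)):
    mean, satisfied = summarize(wave)
    print(f"{label}: mean {mean:.2f}, satisfied {satisfied:.0%} (n={len(wave)})")
```

Report the sample sizes alongside any such comparison and, per the caution above, avoid over-precise conclusions from small or unrepresentative samples.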
Example: Seeking Synchronicity CIT:
VRS Potential User Online Survey Questions
a. Think about one experience in which you felt you did (or did not) achieve a positive result after seeking library reference services in any format.
b. Describe each interaction.
c. Identify the factors that made these interactions positive or negative.
(Connaway & Radford, 2011)
Interviews
Conversation involving two or more people
guided by a predetermined purpose
(Lederman, 1996)
Types of Interviews
• Structured
• Semi-structured
• Formats:
– Individual
• Face-to-face
• Telephone
• Skype
– Focus Group Interviews
Types of Questions
• OPEN: “What is it like when you visit the library?”
• DIRECTIVE: “What happened when you asked for help at the reference desk?”
• REFLECTIVE: “It sounds like you had trouble with the mobile app?”
• CLOSED: “Have I covered everything you wanted to say?”
Interviews: Advantages
• Face-to-face interaction
• In-depth information
• Understand experiences & meanings
• Highlight individual’s voice
• Preliminary information to “triangulate”
• Control sampling
  – Include underrepresented groups
• Greater range of topics
Interviews: Disadvantages
• Time factors
  – Vary by number & depth of interviews
  – Staff intensive
• Cost factors
  – The higher the number, the higher the cost
• Additional factors
  – Self-reported data
  – Errors in note taking possible
Example: Digital Visitors & Residents
Participant Questions
1. Describe the things you enjoy
doing with technology and the
web each week.
2. Think of the ways you have
used technology and the web
for your studies. Describe a
typical week.
3. Think about the next stage of
your education. Tell me what
you think this will be like.
(White & Connaway, 2011-2012)
Focus Group Interviews
“…interview of a group of 8 to 12 people
representing some target group and
centered on a single topic.”
(Zweizig, Johnson, Robbins, & Besant, 1996)
Conducting Focus Group Interviews
• Obtain permission to use the information & to record
  – For report and/or publication
• Enlist a note-taker or, if recording, check equipment & bring back-up
• Begin by creating a safe climate
WorldCat.org Study Recruitment
• Difficult
  – Little data on the user base
  – Participants across 3 continents
  – Hard-to-reach populations (historians, antiquarian booksellers)
• Non-probabilistic methods
  – Convenience sampling
  – Snowball sampling
(Connaway & Wakeling, 2012)
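Snowball sampling grows a sample outward from a few known contacts by following their referrals. A minimal sketch, with a hard-coded referral map standing in for the real recruitment step (all names invented):

```python
# Sketch: snowball sampling from seed participants. The referral map is
# invented; in a real study each participant suggests further contacts.
from collections import deque

referrals = {
    "historian_a": ["historian_b", "bookseller_a"],
    "historian_b": ["historian_c"],
    "bookseller_a": ["bookseller_b"],
}

def snowball(seeds, max_sample=10):
    """Breadth-first expansion of the sample through referral chains."""
    sample = set(seeds)
    queue = deque(seeds)
    while queue:
        person = queue.popleft()
        for peer in referrals.get(person, []):
            if peer not in sample and len(sample) < max_sample:
                sample.add(peer)
                queue.append(peer)
    return sample

print(sorted(snowball(["historian_a"])))
```

Convenience sampling simply takes whoever is reachable; neither method supports statistical generalization to a full population, which is why the slide labels them non-probabilistic.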
Example: WorldCat.org
Focus Group Interview Questions
“Tell us about your experiences with WorldCat.org.”
A broad introductory question to reveal the extent to which users have engaged with WorldCat.org, and the information-seeking contexts within which they use the system.
(Connaway & Wakeling, 2012, p. 7)
Structured Observations
Systematic description focusing on
designated aspects of behavior to test
causal hypotheses
(Connaway & Powell, 2010)
Structured Observations: A Guide
• Develop observational
categories
– Define appropriate,
measurable acts
– Establish time length of
observation
– Anticipate patterns of
phenomena
– Decide on frame of
reference
(Connaway & Powell, 2010)
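The counting side of structured observation can be illustrated in a few lines. A sketch with invented observational categories and an invented log of timestamped acts:

```python
# Sketch: tally observed acts by predefined category within a fixed
# observation window. Categories and the event log are invented.
from collections import Counter

WINDOW = (0, 60)  # observe minutes 0-60 of the session
categories = {"asks_staff", "uses_catalog", "uses_phone"}

# (minute, category) pairs recorded by the observer.
events = [(5, "uses_catalog"), (12, "asks_staff"), (75, "uses_phone"),
          (40, "uses_catalog"), (58, "asks_staff")]

counts = Counter(cat for minute, cat in events
                 if WINDOW[0] <= minute <= WINDOW[1] and cat in categories)
print(counts)  # acts per category inside the window
```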
Ethnographic Research
Rich description
(Connaway & Powell, 2010)
Ethnographic Research
• Incredibly detailed data
• Time consuming
– Establishing rapport
– Selecting research
participants
– Transcribing
observations &
conversations
– Keeping diaries
(Connaway & Powell, 2010, p. 175)
(Khoo, Rozaklis, & Hall, 2012)
Analysis
“summary of observations or data in such a
manner that they provide answers to the
hypothesis or research questions”
(Connaway & Powell, 2010)
Analysis
• Collection of data
affects analysis of data
• Ongoing process
• Feeds back into
research design
• Theory, model, or
hypothesis must grow
from data analysis
Data Analysis:
Digital Visitors & Residents
Tools: codebook; NVivo 10
I. Place
  A. Internet
    1. Search engine
      a. Google
      b. Yahoo
    2. Social media
      a. Facebook
      b. Twitter
      c. YouTube
      d. Flickr/image sharing
      e. Blogging
  B. Library
    1. Academic
    2. Public
    3. School (K-12)
  C. Home
  D. School, classroom, computer lab
  E. Other
(White & Connaway, 2011-2012)
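NVivo handles the coding scheme internally; purely as an illustration of the tallying it supports, a sketch with an abridged version of the codebook above and invented coded segments:

```python
# Sketch: tally coded interview segments against an abridged version of
# the Place codebook above. Segments are invented.
from collections import Counter

codebook = {
    "Place/Internet/Search engine": ["Google", "Yahoo"],
    "Place/Internet/Social media": ["Facebook", "Twitter", "YouTube"],
    "Place/Library": ["Academic", "Public", "School (K-12)"],
}

# Each coded segment pairs a codebook path with a leaf code.
segments = [
    ("Place/Internet/Search engine", "Google"),
    ("Place/Internet/Social media", "Facebook"),
    ("Place/Internet/Search engine", "Google"),
    ("Place/Library", "Academic"),
]

# Reject segments that use codes not defined in the scheme.
for path, leaf in segments:
    assert leaf in codebook[path], f"unknown code: {path}/{leaf}"

counts = Counter(f"{path}/{leaf}" for path, leaf in segments)
for code, n in counts.most_common():
    print(f"{code}: {n}")
```

This covers only the counting step; assigning codes to transcript segments remains interpretive, manual work.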
Local Considerations
• Available resources
  – Budget
  – Human resources
    • In house?
    • Should you hire a consultant?
    • Faculty resources? (university/college)
  – Other institutional resources
• Parent institution needs/demands
What to Consider?
• What do you already
know?
(Previous assessments)
• What have others done?
(Literature review)
• What do you want to
know?
(Problems/questions)
Get the Right Fit!
• What do we know?
• Where do we go from
here?
Use tools & research
design to customize
project to fit your
assessment needs
infoKit
What is it?
• Contains advice on assessing digital/online
services within the broader context of traditional
services.
Why did we create it?
• To understand the contexts surrounding
individual engagement with digital resources,
spaces and tools.
Who will use it?
• Librarians and information technology staff
References
ALA/ACRL. 1998. Task force on academic library outcomes assessment report. Available: http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm
Brown, Karen, & Malenfant, Kara J. 2012. Connect, collaborate, and communicate: A report from the Value of Academic Libraries Summits. Chicago, IL: Association of College & Research Libraries. http://www.acrl.ala.org/value
Connaway, Lynn S., Johnson, Debra W., & Searing, Susan. 1997. Online catalogs from the users’ perspective: The use of focus group interviews. College and Research Libraries, 58(5), 403-420.
Connaway, Lynn S., & Powell, Ronald R. 2010. Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
Connaway, Lynn S., & Radford, Marie L. 2011. Seeking synchronicity: Revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/full.pdf
Connaway, Lynn S., & Wakeling, Simon. 2012. To use or not to use WorldCat.org: An international perspective from different user groups. OCLC Internal Report.
Dervin, Brenda, Connaway, Lynn S., & Prabha, Chandra. 2003-2006. Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS). http://www.oclc.org/research/activities/past/orprojects/imls/default.htm
Flanagan, John C. 1954. The critical incident technique. Washington, DC: American Psychological Association.
Geertz, Clifford. 1973. The interpretation of cultures: Selected essays. New York: Basic Books.
Hernon, Peter, & Altman, Ellen. 1998. Assessing service quality: Satisfying the expectations of library customers. Chicago, IL: American Library Association.
Kaufman, Paula, & Watstein, Sarah Barbara. 2008. Library value (return on investment, ROI) and the challenge of placing a value on public services. Reference Services Review, 36(3), 226-231.
Khoo, Michael, Rozaklis, Lily, & Hall, Catherine. 2012. A survey of the use of ethnographic methods in the study of libraries and library users. Library and Information Science Research, 34(2), 82-91.
Lederman, Linda C. 1996. Asking questions and listening to answers: A guide to using individual, focus group, and debriefing interviews. Dubuque, IA: Kendall/Hunt.
Oakleaf, Megan J. 2010. The value of academic libraries: A comprehensive research review and report. Chicago, IL: Association of College and Research Libraries, American Library Association.
QSR International. 2011. NVivo 9: Getting started. Retrieved from http://download.qsrinternational.com/Document/NVivo9/NVivo9-Getting-Started-Guide.pdf
White, David S., & Connaway, Lynn S. 2011-2012. Visitors and residents: What motivates engagement with the digital information environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/
White, David, Connaway, Lynn S., Lanclos, Donna, Hood, Erin M., & Vass, Carrie. 2013. Evaluating digital services: A visitors and residents approach. A JISC infoKit. http://www.jiscinfonet.ac.uk/infokits/evaluating-services/
Zweizig, Douglas, Johnson, Debra W., Robbins, Jane, & Besant, Michele. 1996. The TELL IT! manual. Chicago: ALA.
Thank You!
Lynn Silipigni Connaway, Ph.D.
Senior Research Scientist
OCLC Research
Vice-chair, ACRL Value of Academic Libraries Committee
@LynnConnaway
connawal@oclc.org
©2014 OCLC. This work is licensed under a Creative Commons Attribution 3.0 Unported License. Suggested attribution: “This
work uses content from [presentation title] © OCLC, used under a Creative Commons Attribution license:
http://creativecommons.org/licenses/by/3.0/”
Discussion and Questions
Lynn Silipigni Connaway, Ph.D.
connawal@oclc.org