Effective, Sustainable & Practical Assessment

Steve Hiller, Director, Assessment and Planning, University of Washington
Martha Kyrillidou, Statistics and Service Quality Programs, ARL
Jim Self, Director, Management Information Services, University of Virginia

Boston University, 8 September 2008

Monday Morning Metrics with Steve
Baseball Presentation #535, 8 September 2008
Steve Hiller, UW Libraries

Case Study: Worst Trade Ever Made by the Seattle Mariners
• When
• Who
• Why was it the worst
• Evaluating performance
• Defining success and value
  – Statistics
  – "Intangibles"

Library Assessment: More than Numbers
Library assessment is a structured process:
• To learn about our communities
• To respond to the needs of our users
• To improve our programs and services
• To support the goals of our communities

Why Assess?
• Accountability and justification
• Improvement of services
• Comparisons with others
• Identification of changing patterns
• Marketing and promotion
• Opportunity to tell our own story
• Using data, not assumptions, to make decisions
  – Assumicide!

The Challenge for Libraries
• Traditional statistics are no longer sufficient
  – They emphasize inputs: how big and how many
  – They do not tell the library's story
  – They may not align with organizational goals and plans
  – They do not measure service quality
• We need measurements from the user's perspective
• We need the organizational culture and the skills to answer a basic question: what difference do we make to our communities?

"If you think you're too small to have an impact, try going to bed with a mosquito."
– Anita Roddick

Charting User Change
• User behavior
• User expectations
• Wide array of user studies now available

Findings
• Students start with Google
• Format agnostic
• Seek convenience
• Born digital
• Customer service (qualified and helpful staff)
• Library as a place, symbol, refuge
• Self-sufficiency and control of the information-seeking process
• Ready access to a wide range of content (e.g., complete runs of journals)

Re-conceptualizing Library Facilities
• Changing nature of library usage
• Re-configuring library facilities as a social and intellectual center
• Library as physical place, intellectual space, and community center

ARL-Sponsored Assessment
• Tools: StatsQUAL
  – ARL Statistics™: descriptive, longitudinal, comparative
  – LibQUAL+®, ClimateQUAL™, DigiQUAL™
  – MINES for Libraries®, E-Metrics … Google Analytics
• Building a community of practice
  – Library Assessment Conferences
  – Service Quality Evaluation Academy (training events)
  – Library Assessment blog
• Individual library consultation (Jim and Steve)
  – Making Library Assessment Work (24 libraries in 2005-06)
  – Effective, Sustainable, Practical Library Assessment (14 libraries in 2007-08)

ARL Tools for Library Assessment
As a result of the work of the New Measures and Assessment Initiative (1999):
• ARL Statistics™ – since 1907-08
• LibQUAL+® – since 2000
• MINES for Libraries® – since 2003
• DigiQUAL® – since 2003
• ClimateQUAL™ – since 2007

Survey Structure (Detail View)
"22 Items and The Box …"

Why the Box Is So Important
• About 40% of participants provide open-ended comments, and these are linked to demographics and quantitative data.
• Users elaborate the details of their concerns.
• Users feel the need to be constructive in their criticisms, and offer specific suggestions for action.

MINES for Libraries®
• MINES is a transaction-based research methodology consisting of a web-based survey form and a random-moments sampling plan.
• MINES typically measures who is using electronic resources, where users are located at the time of use, and their purpose of use, in the least obtrusive way.
• MINES was adopted by the Association of Research Libraries (ARL) as part of the "New Measures" toolkit in May 2003.
• MINES differs from other electronic-resource usage measures, which quantify total usage (e.g., Project COUNTER, E-Metrics) or measure how well a library makes electronic resources accessible (LibQUAL+®).
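The random-moments sampling plan is what keeps MINES unobtrusive: the point-of-use survey is presented only during randomly drawn time windows, not on every transaction. Below is a minimal sketch of the idea in Python; the window length, monthly window count, and function name are illustrative assumptions, not the official MINES protocol.

```python
import random
from datetime import datetime, timedelta

def pick_survey_moments(year: int, month: int, n_moments: int = 4) -> list[datetime]:
    """Draw random 'moments' (start times of 2-hour windows) in a month.

    Hypothetical sketch: during each chosen window, every request for an
    electronic resource is intercepted with the point-of-use survey;
    outside the windows, users are never interrupted.
    """
    start = datetime(year, month, 1)
    next_month = datetime(year + month // 12, month % 12 + 1, 1)
    days_in_month = (next_month - start).days
    moments = [
        start + timedelta(days=random.randrange(days_in_month),
                          hours=random.randrange(24))
        for _ in range(n_moments)
    ]
    return sorted(moments)

if __name__ == "__main__":
    random.seed(2008)  # reproducible example
    for m in pick_survey_moments(2008, 9):
        print(m.strftime("%Y-%m-%d %H:00"), "- survey active for 2 hours")
```

Because the windows are drawn at random, the intercepted transactions form a probability sample of all use, so the who/where/why answers can be generalized to total electronic-resource usage.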
ESP Insights
• Strong interest in using assessment to improve
• Uncertainty about how to establish and sustain assessment
• Lack of assessment knowledge among staff
• More data collection than data utilization
• Effectiveness not dependent on library size or budget
• Each library has a unique culture and mission

Effective Assessment
• Focuses on the customer
• Is aligned with library and university goals
• Assesses what is important
• Is outcomes oriented
• Develops criteria for success
• Uses appropriate and multiple assessment methods
• Uses corroboration from other sources
• Provides results that can be used

Sustainable Assessment Needs …
• Organizational leadership
• Sufficient resources
• A supportive organizational culture
• Identifiable organizational responsibility
• Connection to strategic planning and priorities
• An iterative process of data collection, analysis, and use
• Involvement of customers, staff, and stakeholders

Practical Assessment
• Keep it simple and focused – "less is more"
• Know when enough is enough
• Use assessment that adds value for customers
• Present results that are understandable
• Organize to act on results

The University of Virginia
• 14,000 undergraduates
  – 66% in-state, 34% out-of-state
  – Most notable for liberal arts
  – Highly ranked by U.S. News
• 6,000 graduate students
  – Prominent in humanities, law, business
  – Plans expansion in the sciences
• Located in Charlottesville
  – Metro population of 160,000

Collecting the Data at the U.Va. Library
• Customer surveys
• Staff surveys
• Mining existing records
• Comparisons with peers
• Qualitative techniques
• Long involvement with ARL statistics

Management Information Services
• MIS committee formed in 1992
• Evolved into a department, 1996-2000
• Currently three staff
• Coordinates collection of statistics
• Publishes annual statistical report
• Coordinates assessment
• Resource for management and staff

"… but to suppose that the facts, once established in all their fullness, will 'speak for themselves' is an illusion."
– Carl Becker, Annual Address of the President of the American Historical Association, 1931

U.Va. Customer Surveys
• Faculty: 1993, 1996, 2000, 2004
  – Response rates 59% to 70%
• Students: 1994, 1998, 2001, 2005
  – Separate analysis for grads and undergrads
  – Response rates 43% to 63%
• LibQUAL+® 2006
  – Response rates 14% to 24%
• Annual surveys 2008
  – Student samples
  – One third of faculty
  – Response rates 29% to 47%

Corroboration
• Data are more credible if they are supported by other information
• John le Carré's two proofs

Analyzing U.Va. Survey Results
• Two scores for resources, services, facilities
  – Satisfaction = mean rating (1 to 5)
  – Visibility = percentage answering the question
• Permits comparison over time and among groups
• Identifies areas that need more attention
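The satisfaction/visibility pair is easy to compute from raw responses. A minimal sketch, assuming a hypothetical response layout in which skipped questions are recorded as None (this is not U.Va.'s actual survey code):

```python
import statistics

def score_item(ratings: list[int | None]) -> tuple[float, float]:
    """Score one survey item the U.Va. way.

    ratings holds one entry per respondent; None means the respondent
    skipped the question. Returns (satisfaction, visibility):
      satisfaction = mean of the 1-5 ratings actually given
      visibility   = share of respondents who answered at all
    """
    answered = [r for r in ratings if r is not None]
    satisfaction = statistics.mean(answered)
    visibility = len(answered) / len(ratings)
    return satisfaction, visibility

# Hypothetical item: 6 of 8 respondents rated a service.
ratings = [5, 4, None, 3, 5, None, 4, 4]
sat, vis = score_item(ratings)
print(f"satisfaction {sat:.2f} / 5, visibility {vis:.0%}")
# -> satisfaction 4.17 / 5, visibility 75%
```

Reading the two numbers together is the point: a service can be highly rated by the few who use it (high satisfaction, low visibility), which is a different management problem from a widely used service that rates poorly.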
U.Va. Reference Activity and Reference Visibility in Student Surveys
[Chart, 1993-2004: reference questions recorded per week in the annual sample fell from 6,008 to 1,756, while reference visibility among undergraduates fell across successive student surveys from 75% to 64%, 39%, and 34%.]

Data Mining
• Acquisitions
• Circulation
• Finance
• University records

Investment and Customer Activity, University of Virginia Library, 1993-2006
[Paired charts, FY93-FY06: acquisitions expenditures by format (e-journals, electronic resources, print monographs, print serials; $0 to $3,000,000) alongside customer activities (circulation and reference; 0 to 2,400,000).]

The Balanced Scorecard at the U.Va. Library
• Implemented in 2001
• Results tallied FY02 through FY07
• Tallying results for FY08
• Completing metrics for FY09
• Builds upon a rich history of collecting data
• A work in progress

The Balanced Scorecard: Managing and Assessing Data
• The Balanced Scorecard is a layered and categorized instrument that
  – Identifies the important statistics
  – Ensures a proper balance
  – Organizes multiple statistics into an intelligible framework

Metrics
• Specific targets indicating full success, partial success, and failure
• At the end of the year we know if we have met our target for each metric
• The metric may be a complex measure encompassing several elements
(A sketch of this two-tier target logic follows the example metrics below.)

What Do We Measure?
• Customer survey ratings
• Staff survey ratings
• Timeliness and cost of service
• Usability testing of web resources
• Success in fundraising
• Comparisons with peers

Metric L.2.B: Retention Rate of Employees
• Target 1: Retain 95% of employees.
• Target 2: Retain 90% of employees.
• Result FY08: Target 1 met – 95% of employees retained.

Metric U.4.B: Turnaround Time for User Requests
• Target 1: 75% of user requests for new books filled within 7 days.
• Target 2: 50% of user requests for new books filled within 7 days.
• Result FY07: Target 1 met – 77% filled within 7 days.

Metric U.3.A: Circulation of New Monographs
• Target 1: 60% of newly cataloged monographs circulate within two years.
• Target 2: 50% of new monographs circulate within two years.
• Result FY07: Target 1 met – 63% circulated.

Balanced Scorecard, U.Va. Library, FY2007
[Chart: distribution of FY2007 metrics across Target 1 met, Target 2 met, and not met.]
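A minimal sketch of that two-tier target logic, using the three example metrics above (the Metric class, evaluate function, and fractional encoding are illustrative assumptions, not the Library's actual scorecard software):

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    target1: float  # threshold for full success
    target2: float  # threshold for partial success

def evaluate(metric: Metric, result: float) -> str:
    """Grade a year-end result against the two-tier targets."""
    if result >= metric.target1:
        return "Target 1 met"
    if result >= metric.target2:
        return "Target 2 met"
    return "Not met"

# The three example metrics from the slides above, as (metric, result).
scorecard = [
    (Metric("L.2.B employee retention", 0.95, 0.90), 0.95),
    (Metric("U.4.B requests filled within 7 days", 0.75, 0.50), 0.77),
    (Metric("U.3.A new monographs circulating in 2 years", 0.60, 0.50), 0.63),
]
for metric, result in scorecard:
    print(f"{metric.name}: {result:.0%} -> {evaluate(metric, result)}")
```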
Using Data for Results at UVa
• Additional resources for the science libraries (1994+)
• Redefinition of collection development (1996)
• Initiative to improve shelving (1999)
• Undergraduate library open 24 hours (2000)
• Additional resources for the Fine Arts Library (2000)
• Support for the transition from print to e-journals (2004)
• New and improved study space (2005-06)
• Increased appreciation of the role of journals (2007)

University of Washington
• Located in Seattle; metro population 3.2 million
• Comprehensive public research university
  – 27,000 undergraduate students
  – 12,000 graduate and professional students (80 doctoral programs)
  – 4,000 research and teaching faculty
• $800 million yearly in U.S. research funds (#2 nationally)
• Large research library system
  – $40 million annual budget
  – 150 librarians on 3 campuses

The Basic Question
How does the Library contribute to the success of our researchers and students?

Our assessment priorities:
• Information-seeking behavior and use
• Patterns of library use
• User needs
• Library contribution to learning and research
• User satisfaction with services, collections, and the Library overall

What have we learned (short version):
• Faculty perceive success through collections
• Grad students, through timely access to resources and services
• Undergrads, through the library as a place for work and community

University of Washington Libraries: Assessment Methods Used
• Large-scale user surveys every 3 years ("triennial survey"): 1992, 1995, 1998, 2001, 2004, 2007
• In-library use surveys every 3 years beginning 1993
  – 4,000 surveys returned in 2008
• Focus groups/interviews (annually since 1998)
• Observation (guided and non-obtrusive)
• Usability
• Usage statistics/data mining
Information about the assessment program is available at: http://www.lib.washington.edu/assessment/

UW Triennial Library Survey: Number of Respondents and Response Rate, 1992-2007

                 2007       2004       2001       1998       1995       1992
Faculty          1455 36%   1560 40%   1345 36%   1503 40%   1359 31%   1108 28%
Grad Student      580 33%    627 40%    597 40%    457 46%    409 41%    560 56%
Undergrad         467 20%    502 25%    497 25%    787 39%    463 23%    407 41%

Library as Place: Change in Frequency of In-Person Visits, 1998-2007 (weekly or more)
[Chart, 1998-2007: percentage visiting in person at least weekly, by group (undergraduate, graduate, faculty); y-axis 20%-80%, with faculty declining over the period.]

What Did You Do in the Library Today? (In-Library Use Surveys, 2008 and 2005)
[Chart comparing 2008 and 2005, y-axis 0%-70%: ask for help; look for material; work alone; work in groups; use a library computer; use own computer.]

Activity in the Library by Group, 2008 (users: 73% undergrad, 22% grad, 5% faculty)
[Chart, 2008, by group (undergrad, grad, faculty/staff), y-axis 0%-70%: the same six activities.]

Usefulness of New and/or Expanded Services for Undergrads: Library as Place (2007 Triennial Survey)
[Chart, 0%-60%: quiet work/study areas; increased weekend hours; more library computers; integrating services into campus web sites; book self-checkout; group/presentation spaces.]

Library as Place: Using the Results
• Libraries are student places
  – 350-computer lab installed in the Undergraduate Library, autumn 1998
  – Hours extended to 24/5.5 in the Undergraduate Library, 2002
  – Collection footprint reduced
  – Diversified user spaces provided (group, quiet, presentation)
  – Student advisory committee provides ongoing feedback
  – Other collaborative student support services added to the library
• Upgrade/renovate facilities to meet student needs
  – Furniture that encourages collaboration
  – More electrical outlets
  – Better lighting and noise control
• Plan for major renovation of the Undergraduate Library

What Faculty/Grad Students Told Us: Bioscience Interviews/Focus Groups (2006)
• Content is the primary link to the library
  – Identify the library with e-journals; want more titles and backfiles
• Provide library-related services and resources in our space, not yours
  – Discovery begins primarily outside library space, with Google and PubMed; Web of Science also important
  – Library services/tools seen as overly complex and fragmented
• Print is dead, really dead
  – If not online, want digital delivery; too many libraries
  – Go to the physical library only as a last resort
• Uneven awareness of library resources and services

Sources Consulted for Information on Research Topics (scale of 1 "not at all" to 5 "usually")
[Chart by group (undergrad, grad, faculty): mean ratings, 2.5-4.5, for open Internet search, open Internet reference source, library catalog, and bibliographic databases.]
Off-Campus Remote Use, 1998-2007 (percentage using library services/collections at least 2x week)
[Chart, 1998-2007, y-axis 0%-70%, by group: by 2007, 76% of faculty and 72% of graduate students used library services online at least twice a week; undergraduates trailed.]

Primary Reasons for Faculty Use of Library Web Sites, 2007 (at least 2x per week)
[Chart, 15%-75%, by area (health sciences, science-engineering, humanities-social sciences): use of the library catalog, bibliographic databases, and online journal articles.]

[Pull quote from a neurobiology grad student on the ability to access full-text journals online through library subscriptions; text garbled in the source.]

E-Journal Usage at UW
ScholarlyStats 2007: 5 million article requests from 19 vendors/platforms; 7 accounted for nearly 75%:

HighWire Press       1,200,000   Biomedical
Science Direct         975,000   Science-Medical
JSTOR                  750,000   Multidisciplinary
Nature                 300,000   Science-Medical
MetaPress              250,000   Science-Medical
Blackwell Synergy      225,000   Science-Medical
Ovid                   225,000   Biomedical

Importance of Books, Journals, Databases by Academic Area (2007, faculty; scale of 1 "not important" to 5 "very important")
[Chart, 3.2-4.8, by area (health sciences, science-engineering, humanities-social sciences): mean importance of books, journals published after 1985, journals published before 1985, and bibliographic databases.]

Bibliographic Database Use: Login Sessions

Database             2005      2006      2007      Change
Art Abstracts        4,120     2,887     2,330     -43%
Geobase              4,504     4,435     3,933     -13%
MLA                  23,246    22,189    20,696    -11%
ERIC                 17,063    16,078    15,263    -11%
Expanded Academic    191,095   176,483   141,887   -26%
Web of Science       158,451   162,600   154,546   -2%

Librarian Liaison Satisfaction and Visibility by Selected School (2007 Triennial Survey; satisfaction on a 1-5 scale; visibility = % who rated)
[Scatter chart: visibility 30%-100% against satisfaction 3.8-4.8, plotting fine arts, humanities, science, engineering, public health, medicine, and business.]

Usefulness of New/Expanded Services for Faculty and Grads: Integrate into My Space
[Chart, 0%-80%, faculty vs. grads: scan on demand; digitize specialized collections; office delivery of books; integrate services into campus web sites; manage your information and data.]

Integrate Library Services and Resources into User Workflows: Follow-Up Actions
• WorldCat Local became primary catalog+ in 2007
• "Scan on demand" pilot began 2008
• Increase chat and remote services staffing
• Increase ILL staffing (due to WorldCat Local)
• Integrate course reserves and services into the "My UW" portal
• Redesign UW Libraries homepage
• Use qualitative methods to gain deeper understanding of user work and behavior
• Strengthen librarian liaison efforts to academic programs

How UW Libraries Has Used Assessment
• Extend hours in the Undergraduate Library (24/5.5)
• Create more diversified student learning spaces
• Enhance usability of discovery tools and website
• Provide standardized service training for all staff
• Review and restructure the librarian liaison program
• Consolidate and merge branch libraries
• Change/reallocate the collections budget
• Change/reallocate staffing
• Support budget requests to the University

Overall Satisfaction by Group, 1995-2007
[Chart, 1995-2007, scale 3.8-4.6: faculty satisfaction rose from 4.33 to 4.56; graduate students from 4.11 to 4.44; undergraduates from 3.97 to 4.36.]

Four Useful Assessment Assumptions
• Your problem/issue is not as unique as you think
• You have more data/information than you think
• You need less data/information than you think
• There are useful methods that are much simpler than you think
Adapted from Douglas Hubbard, How to Measure Anything (2007)
In Conclusion

Assessment is not …
• Free and easy
• A one-time effort
• A complete diagnosis
• A roadmap to the future

Assessment is …
• A way to improve
• An opportunity to know our customers
• A chance to tell our own story
• A positive experience

Moving Forward
• Keep expectations reasonable and achievable
• Strive for accuracy and honesty – not perfection
• Assess what is important
• Use the data to improve
• Keep everyone involved and informed
• Focus on the customer

For More Information …
• Steve Hiller
  – hiller@u.washington.edu
  – www.lib.washington.edu/assessment/
• Jim Self
  – self@virginia.edu
  – www.lib.virginia.edu/mis
  – www.lib.virginia.edu/bsc
• ARL Assessment Service
  – www.arl.org/stats/initiatives/esp/