Assessing Library Contributions to University Outcomes
9th Northumbria International Conference
University of York, England
Joe Matthews
August 2011

Indirect Measures

National Survey of Student Engagement
• Academic challenge
• Opportunities for collaborative learning
• Interactions with faculty
• Enriching extra-curricular experiences
• Supportive environment for learning

NSSE & Libraries
• Library use & educationally purposeful activities are correlated at small liberal arts colleges
• Larger universities – no correlation
• Students who use the library are more likely to work harder to meet faculty expectations

Library Experiences
• Do not lead to gains in information literacy
• Do not lead to gains in student satisfaction
• Do not contribute to what students gain overall from college

Goodall & Pattern (2011)
[Charts of book use, eResource use, and library visits]

Direct Measures

Student Learning
The contribution of the university in assessing student learning is indirect, at best.

Assess Learning
• The Collegiate Learning Assessment (CLA)
• The Collegiate Assessment of Academic Proficiency (CAAP)
• The Measure of Academic Proficiency and Progress (MAPP)

Collegiate Learning Assessment
• Critical thinking
• Judgment
• Analytical reasoning
• Problem solving
• Writing skills

Astin's IEO Model
[Diagram: entering student characteristics (inputs); institutional characteristics, programs, classes, fellow students, faculty, place of residence, library services, and campus environment (environment); graduating student characteristics (outcomes)]

Shavelson's Student Learning Outcomes Model
[Diagram: total collegiate experience]

Time Spent Studying
[Chart comparing 1964 and 2004, scale 0–40]

Disengagement Compact

Areas of Impact
Student: enrollment; retention & graduation; success; achievement; learning; experiences, attitudes & perceptions of quality
Faculty: research productivity; grants; teaching
University: institutional reputation & prestige

Limitations
• Micro-level studies
• Inward looking
• Small sample sizes

Need – Demonstrations of Value

One Model
• School libraries & standardized test scores
• Controlled for school & community differences and found high correlations between library use & test scores
• 20 studies in different states

Broad-based Data Analysis

Library Data Farm Processes
• Load
• Clean
• Normalize
• Anonymize
• Analysis
• Export
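The slides name the data-farm steps without showing them in sequence, so the fragment below is a minimal sketch of one way such a pipeline could look, not a description of any particular system. The file names (circulation.csv, registrar.csv), column names (student_id, checkouts, gpa), the salted-hash anonymization, and the checkout/GPA correlation are all illustrative assumptions; the join with registrar grades simply previews the "Combine the Data" and "Library Use & GPA" ideas discussed later in the deck.

    # Illustrative sketch of a library "data farm" pipeline:
    # load -> clean -> normalize -> anonymize -> analyze -> export.
    # File names, column names, and the GPA correlation are assumptions,
    # not part of the original presentation.
    import csv
    import hashlib
    import statistics

    SALT = "local-secret-salt"  # assumed institutional secret, never exported


    def anonymize(student_id: str) -> str:
        """Replace a student ID with a salted one-way hash."""
        return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:16]


    def load(path: str) -> list[dict]:
        """Load raw rows from a CSV export (e.g., ILS circulation data)."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))


    def clean_and_normalize(rows: list[dict]) -> dict[str, int]:
        """Drop incomplete rows and normalize to one checkout count per student."""
        checkouts: dict[str, int] = {}
        for row in rows:
            sid = (row.get("student_id") or "").strip()
            if not sid:
                continue  # clean: skip rows without a usable ID
            checkouts[sid] = checkouts.get(sid, 0) + 1
        return checkouts


    def run_pipeline(circulation_path: str, registrar_path: str, out_path: str) -> None:
        # Load and aggregate library use, then anonymize the keys.
        use = clean_and_normalize(load(circulation_path))
        use_anon = {anonymize(sid): n for sid, n in use.items()}

        # Combine with registrar data (assumed columns: student_id, gpa),
        # matching on the same anonymized key.
        gpa_anon = {
            anonymize(row["student_id"]): float(row["gpa"])
            for row in load(registrar_path)
            if row.get("gpa")
        }
        combined = [(key, use_anon.get(key, 0), gpa_anon[key]) for key in gpa_anon]

        # Analysis: Pearson correlation between checkouts and GPA (Python 3.10+).
        checkouts = [c for _, c, _ in combined]
        gpas = [g for _, _, g in combined]
        r = statistics.correlation(checkouts, gpas)

        # Export an anonymized analysis file for further study.
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["anon_id", "checkouts", "gpa"])
            writer.writerows(combined)
        print(f"students matched: {len(combined)}, checkout/GPA correlation r = {r:.2f}")


    if __name__ == "__main__":
        run_pipeline("circulation.csv", "registrar.csv", "library_use_gpa.csv")

A salted one-way hash is used here so that the library and the Office of Institutional Research could match records on the same derived key without exchanging identifiable student IDs, which is one way to act on the deck's point that anonymity and privacy concerns need not block this kind of analysis.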
Assessment Management Systems

Expand Data Sets
• In addition to library data
• Partner with the Office of Institutional Research
  – NCES
  – IPEDS
  – NSSE
  – CLA
  – Campus surveys
  – Student registrar data (enrollment, grades)

Anonymity & privacy are not incompatible

Library Needs to Support Assessment
• Collections & Services Space
• Virtual Space
• Community Space

Collections & Services Space
• ILS data
• In-library use data
• ILL data
• Use of IT services
• Reference services
• Instructional services
• Other

Library Use & GPA

Virtual Space

Community Space

Combine the Data
David Shulenburger, Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, Baltimore, Maryland, 2010

Partnering
Privacy
Institutional Review Board

Broad-based Data Analysis
Enables a library to prepare a credible analysis of the library's impact in the lives of students, faculty, and researchers.

The Goal
"until libraries know that student #5 with major A has downloaded B number of articles from database C, checked out D number of books, participated in E workshops and online tutorials, and completed courses F, G, and H, libraries cannot correlate any of those student information behaviors with attainment of other outcomes. Until librarians do that, they will be blocked in many of their efforts to demonstrate value." – Megan Oakleaf

Library Impact Model
[Diagram: use of books, print journals, special collections, eJournals, eBooks, and eResources; intangible outcomes (intellectual development) and tangible outcomes (assessment = grade); product; success]

The Goal
Get a better handle on:
• Who is using the library?
• Why are they using the library?
• What impact does library use have in their lives?

Questions?
www.joematthews.org
joe@joematthews.org