Annual Assessment Report to the College 2008-2009

College:
University Library
Department:
All
Program:
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30,
2009. You may submit a separate report for each program which conducted assessment activities.
Liaison:
Katherine Dabbour
Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
Our plan for 2008-09 was to continue our information competence (IC) SLO assessment for University 100 and to assess IC SLOs for COMS 151 students, comparing student performance in traditional vs. hybrid instruction. We also planned to assess student and faculty perceptions of the quality of library services, collections, and facilities. Kathy Dabbour is the Library Assessment Coordinator and chair of the Library Assessment Committee, composed of library faculty and supervisory staff, which serves as an advisory group and assists with assessment projects.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
N/A
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
Information Competence SLOs:
 Determine the nature and extent of the information needed
 Access needed information effectively and efficiently
 Evaluate information and its sources critically
Prepared by Kathy Dabbour, 6/23/2016.
2a. Which Student Learning Outcome was measured this year? (Continued)
 Demonstrate understanding of the ethical use of information, i.e., plagiarism
2b. What assessment instrument(s) were used to measure this SLO?
Web-based pretest/posttest before and after library instruction for all U100 students and COMS 151 traditional and hybrid students.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what
type of students, which courses, how decisions were made to include certain participants.
 100% sample of U100 students in fall 2008—part of an ongoing project since the inception of U100; the pretest provides a snapshot of freshman IC skills, and the posttest provides an assessment of library instruction methods.
 100% sample of all hybrid sections of COMS 151 in fall 2008 (4 sections) and spring 2009 (6 sections), plus the same number of randomly selected traditional sections in both semesters. These participants offered a unique opportunity to compare student IC skills under hybrid (self-guided Web-based tutorials) vs. traditional (in-person lecture) library instruction among students taking the same course with a standardized curriculum.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or
was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Both projects used a Web-based pretest/posttest design administered before and after in-person library instruction for the U100 and COMS 151 traditional students. The COMS 151 hybrid instruction consisted of self-guided Web pages and tutorials. There was no control group for any of the populations.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
 U100 pretest/posttest results from 2003 to the present are being analyzed, with comparisons between the 2003 and 2008 data.
 COMS 151 traditional vs. hybrid pretest/posttest results from fall 2008 and spring 2009 are being analyzed.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
 Results may be used to revise the current pretest/posttest instrument used to assess IC skills.
 Comparison of the 2003 and 2008 U100 data will show whether incoming freshman IC skills have changed after five years.
 Results of the COMS 151 assessment will be used to determine whether there are any differences in effectiveness between traditional and online library instruction, and to inform efforts such as embedding library instruction into Moodle.
Some programs assess multiple SLOs each year. If your program assessed an additional SLO, report the process for that individual SLO below.
If you need additional SLO charts, please cut & paste the empty chart as many times as needed. If you did NOT assess another SLO, skip this
section.
2a. Which Library Outcomes were measured this year?
Collections that provide access to a variety of resources to support the curricular and research information needs of students and faculty.
Services that support learning, teaching, and research, including:
 Courteous, capable, and responsive service
 A physical environment conducive to study and research.
 Open hours that take into consideration both the schedules of our students and faculty and the realities of budgetary constraints.
 Skillful and engaging individualized point-of-use instruction, both in-person and virtual, that helps students develop their information competence
skills.
 Outreach to collaborate with faculty to develop assignments and instructional experiences that develop students' information competence skills.
 Skillful and engaging classroom instruction, which helps students develop their information competence skills.
 Efficient organization of and timely user access to the Library’s print, electronic, and media collections.
 Equipment (e.g., computers, printers, and copiers) and software to extract needed information from either online or print formats.
 Timely access to resources of other libraries via interlibrary loan and/or document delivery, as appropriate, to supplement Library collections.
2b. What assessment instrument(s) were used to measure this SLO?
LibQUAL+®, a Web-based survey offered by the Association of Research Libraries (ARL) that libraries use to solicit, track, understand, and act upon users' opinions of service quality; it also provides normed data from peer institutions against which a library can compare itself.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what
type of students, which courses, how decisions were made to include certain participants.
A random sample of 3,000 undergraduates and 600 graduates (10% of fall 2008 headcount respectively), and 1,000 full- and part-time faculty (50% of fall
2008 headcount).
2d. Describe the assessment design methodology: Was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Participants were emailed invitations to participate in the LibQUAL+® survey, which was available via the Web from mid-March to early April 2009.
Respondents answered 27 questions that asked them to rate the Library on three “dimensions” (customer service, access to needed resources, and user-friendliness of the library facility) by rating their minimum, desired, and perceived service levels on a 9-point scale. Eight additional questions asked about
perceptions of overall services, including satisfaction with the library’s support of information competence skills. Respondents also provided demographic
data and frequency of library vs. Internet use.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from
the data collected.
Gap analysis, or the mean differences between the minimum and perceived scores and between the desired and perceived scores, is used to determine “service adequacy,” the extent to which we are meeting the minimum expectations of our users, and “service superiority,” the extent to which we are exceeding their desired expectations. In general, we fall between the two gaps, in the “zone of tolerance,” which indicates that we are meeting the minimum expectations of our users but not exceeding their desires. We also performed a content analysis of the open-ended comments to look for common themes.
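The gap-analysis arithmetic described above can be sketched as follows. This is an illustrative example only: the item names and scores are invented, not actual CSUN LibQUAL+® data.

```python
# Illustrative LibQUAL+-style gap analysis with hypothetical scores.
# Each item maps to (minimum, desired, perceived) mean ratings on the 9-point scale.
ratings = {
    "courteous service": (6.0, 8.0, 7.1),
    "access to e-journals": (6.5, 8.5, 6.2),
}

for item, (minimum, desired, perceived) in ratings.items():
    adequacy = perceived - minimum     # positive: minimum expectations are met
    superiority = perceived - desired  # typically negative; >= 0 means desires exceeded
    # The "zone of tolerance" lies between the two gaps.
    in_zone = adequacy >= 0 and superiority <= 0
    print(f"{item}: adequacy={adequacy:+.1f}, superiority={superiority:+.1f}, "
          f"in zone of tolerance={in_zone}")
```

In this hypothetical data, the first item falls in the zone of tolerance, while the second falls below even the minimum expectation, flagging it for attention.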
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were (or could be) used. For example, to
recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support
services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please
provide a clear and detailed description of each.
Data from LibQUAL+® were or are being used as follows, with source of funding in parentheses:
 Re-indexed the periodicals titles list in the Library’s online catalog to make access to electronic versions of journals more transparent to users (no
additional costs except existing library personnel time).
 Re-modeled the area behind the reference services desk on the main floor to provide a study lounge area for students; currently planning to do the same
on the 4th floor East Wing (Student Quality Fee).
 Replaced most chairs to make the Library more comfortable for study (Student Quality Fee).
 Provided access to the library 24/7 during the week before finals in spring and fall 2009 (Student Quality Fee).
 Began a focus group study with library personnel to ascertain where library users ask for assistance and if they can self-direct to appropriate service
points (no additional costs except existing library personnel time).
 Partnered with several campus departments to purchase online access to Science and Nature, two heavily used and expensive journals.
 Changed the policy on group study rooms to facilitate their equitable use (no additional costs except existing library personnel time).
 Piloting a program to purchase and place course textbooks on reserve (Student Quality Fee).
 Replacing older videos with DVDs (Student Quality Fee).
How do your assessment activities connect with your program’s strategic plan?
The Library’s assessment plan (http://library.csun.edu/kdabbour/assessment.html#plan) aligns with its strategic plan (http://library.csun.edu/About_the_Library/stratplan.html).
Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
If it were not for the Student Quality Fee funding we received, we would not have been able to implement many needed changes based on assessment data gleaned from LibQUAL+®. Patron-driven changes under the current budget climate may not be possible without this funding, which is awarded through a campus-wide competition. Furthermore, additions to our collections, particularly e-resources, will not be possible under our current base budget and next year’s budget cuts, which is a major source of concern for students and faculty.
Other information, assessment or reflective activities not captured above.
 Engaging in curriculum mapping to identify research and/or information competence outcomes for the CSUN courses for which we regularly provide library instruction, to establish baseline outcomes for G.E. courses at the 100 and 300 levels, and to identify gaps and overlaps in our instruction program.
 Performed usability and card-sort studies on the current Library Web site to determine changes needed for the revised site.
 Conducted a library valuation study to put a dollar amount on the Library's collections and services, demonstrating its value to the university from an economic point of view.

Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No