Annual Assessment Report to the College 2009-2010


College: University Library

Departments: All

Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2010.

You may submit a separate report for each program that conducted assessment activities.

Liaison: Kathy Dabbour

1. Overview of Annual Assessment Project(s)

1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the oversight of one person or a committee?

Kathy Dabbour is the Library’s Assessment Coordinator. Due to furloughs, and because Kathy Dabbour would also be serving as Acting Chair of the Reference & Instructional Services Dept. from March 2010 through December 2010, we did not take on any new assessment activities during 2009/10. Therefore, the plan was to finish writing reports and analyses from the previous year’s assessment projects.

1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any modification to your assessment process and why it occurred.

Library services assessment reports were shared with Library administration, faculty, and staff. Reports related to information competence assessment from the previous year are still in progress.

2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an additional SLO, report in the next chart below.

2a. Which Student Learning Outcome was measured this year?

Our SLOs are generally assessed as a group, i.e., Information Competence: http://library.csun.edu/kdabbour/assessment.html#infocomp

2b. What assessment instrument(s) were used to measure this SLO?

N/A

2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of students, which courses, how decisions were made to include certain participants.

N/A

2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

N/A

2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the data collected.

The analysis of information competency SLOs, which is in process, includes the following:

U100 pretest/posttest results from 2003 and 2008 are being used to determine whether changes in pedagogy had an impact on the IC skills of freshmen over five years.

COMS 151 traditional vs. hybrid pretest/posttest results from fall 2008 and spring 2009 are being analyzed to determine whether there are differences in student scores between traditional and hybrid forms of library instruction.

2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed description of how the assessment results were or will be used.

Results from the pretest/posttest for the U100 courses have shown us that a more authentic, direct form of assessment needs to be developed. Therefore, as part of her Simplifying Assessment sub-committee work, Kathy Dabbour is beginning to test a rubric to look for evidence of information competence skills in an annotated bibliography assignment in two sections of JOUR 372, an upper-division GE/IC course.

Results from the pretest/posttest study of COMS 151 traditional vs. hybrid library instruction will be used to determine whether we can rely more on online, self-paced library instruction.

2a. Which Library services/collections outcomes were measured this year?

Our outcomes for library services and/or collections are generally assessed as a group, see: http://library.csun.edu/kdabbour/assessment.html#services http://library.csun.edu/kdabbour/assessment.html#collections

2b. What assessment instrument(s) were used to measure this SLO?

N/A

2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of students, which courses, how decisions were made to include certain participants.

N/A

2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

N/A

2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the data collected.

A random-sample LibQUAL survey was used to determine the extent to which we are meeting the minimum expectations of our users versus what they perceive to be our current quality. Out of a possible high score of 9, the mean score for perceived quality of service was 7.23; 7.30 for usefulness of the collections; and 7.06 for the “library as place.” Overall, respondents gave the Library a 7.45.

A content analysis of open-ended comments revealed that undergraduates were most interested in the “library as place” and were mostly negative about it; graduate students were negative about both the “library as place” and the collection; and faculty were most positive about service but mostly negative about collections.

“Questions around the Library” focus group interviews were held with library employees at all levels to determine where our students are asking questions beyond the relevant service desks.

2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed description of how the assessment results were or will be used.

LibQUAL results have been used to formulate Campus Quality Fee proposals, including the installation of student lounge areas and the purchase of textbooks for some large, lower-division GE courses. In addition, LibQUAL data and an environmental scan led to the revision of policies and procedures for group study rooms, the installation of a coffee cart, the replacement of aging videos with DVDs, the purchase of Video Furnace, the addition of an automated “map it” feature to the library catalog to help users quickly locate books in the stacks, and planned customer service training.

“Questions around the Library” results have been shared with all library staff and faculty for consideration of how best to meet our users’ needs.

3. How do your assessment activities connect with your program’s strategic plan?

Our assessment plan is based on outcomes for students’ information competence, service quality, and the resources that support our users’ information needs. These outcomes, in turn, dovetail with the Library’s strategic plan, which states: “We are committed to meeting the information needs of our academic community, to providing effective, caring, and responsive service, to partnering with faculty in the education of our students, to developing the information competence skills of our students and to fostering a love of reading and learning.”

4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student learning, please discuss here.

With 14 positions eliminated or frozen, cuts to the collection that have reduced access to scholarly knowledge, reduced service hours, and a reduced operational budget, all of which have a direct impact on students and faculty, there is a substantial need for restoration through the general fund, as the Library has fewer alternatives for revenue enhancement than other areas on campus.

5. Other information, assessment or reflective activities not captured above.

Reference and Instruction librarians are experimenting with traditional and online methods of classroom assessment in order to conduct a pre-instruction needs assessment, as well as a post-assessment of what students learned from their session.

In preparation for the WASC visit, librarians are working on ensuring that the Library’s educational mission, outcomes, and assessment projects are in sync with the campus’s student learning outcomes by creating a Web site that summarizes and links these areas.

6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your program? Please provide citation or discuss.

Dabbour, K. S., and James David Ballard. “Information Literacy and Latino College Students: A Cross-Cultural Analysis.” Submitted for review to New Library World, a peer-reviewed journal, 10/26/2010.
