Student Services Department SSLO/AUO Assessment Analysis Form

Use the form below to summarize the results of the department meeting in which you discussed the results of your SSLO/AUO assessment.

Department Assessment Meeting Date: Fall 2012 and Spring 2013, plus ongoing discussions among assessment staff

Number of Staff Participating: 4 total: Dean of Matriculation/CESS, Student Assessment Coordinator, and two (2) Assessment Program Specialists

% of Department: 100%

SSLO Measured: Critically analyze and identify options that maximize accurate course placement based upon assessment score, prior knowledge, skill, and experience.

Assessment Tool (Briefly describe assessment tool): We created an online Google Docs pre- and post-test quiz/survey taken prior to the math, reading, and English assessments. Students were asked to answer eight (8) questions about the assessment process. Students receive the information needed to answer the questions by reading the "Facts about the Assessment Test" poster in the lobby and hallway while waiting in line before assessment sessions; once seated in the lab, staff direct each student to read the Student Assessment Directions sheet and to read and sign the Assessment Policy Page. The pre-test was taken after the students had read the documentation about the assessment process. The post-test was taken after the verbal instructions about the assessment process were given. Before the assessment begins, the staff reads aloud the assessment introduction and instructions script. This script, along with the poster, directions sheet, and policy page, contains all the information needed to answer the questions correctly. The information is also on our website and included in the online orientation. We conducted our SSLO evaluation with the quiz tool in April-May 2013, with 205 students completing the pre- and post-survey/quiz.
All of the questions included "I have no idea" as a response option because we did not want students to guess; if they did not know, we wanted them to tell us that. This was very helpful in determining what they truly learned: we saw significant decreases in this choice across all post-test answers. The data show that, by and large, students learned about the assessment process and the impact that accurate course placement can have on their overall Cabrillo experience. We noted that questions #6 and #7, about the challenge process, had lower correct-answer rates in both the pre- and post-test than all the other questions. Our observation is that students are not focused on this seemingly unrelated topic at the time of assessment; they "tune in" when the information directly relates to the assessment and "tune out" when the instructions shift away from it. Questions #1 and #7 have "all of the above" as the correct answer, each with a 70+% response rate; taken as a whole, the data show that over 92% of students give a correct answer on both of these questions. It appears that students may be confused; we need to rethink these questions so that we can drill down to what we really want students to learn. Also, while question #5, about the math retest policy, did show a significant increase in correct responses from the pre-test to the post-test, it was still only at 88% on the post-test; we would like this to be above 90%.

Assessment Analysis (Summarize the assessment results; discuss what student needs and issues were revealed): This SSLO evaluation period revealed significant changes from our first SSLO evaluation (2009-2012). Three (3) new questions were added to the quiz this year. The other five (5) questions remained conceptually the same, but we revised the text to be clearer and more direct.
For those five (5) questions, the percentages of correct answers increased by as much as 7% (the last SSLO evaluation included only a post-test, so we compared post-test to post-test). We attribute this to improved documentation, clearer questions, the pre-test itself, and more specific verbal instructions. See table below for data.

Next Steps (How will you address the needs and issues revealed by the assessment?): We are generally pleased with the students' correct-answer response rates, especially on the post-test. We are also encouraged that the data reflect that students come in with a significant amount of prior knowledge. Our main concern is that correct responses about the math retest policy are not above 90% on the post-test. Our overall conclusion is that by the time a student completes the assessment test and we explain the results, the student is clear about this policy. Anecdotally, in the past year we have seen fewer students returning to assessment requesting to re-assess in order to skip a class that they have received a "W" for or failed. Also, even though the percentage of correct answers is lower than we would like, introducing the challenge process during the assessment is appropriate because it plants the seed that the challenge process exists; students can seek out the information at the time they need it. For questions #1 and #7: consider revising the questions so that students are not confused, possibly eliminating "all of the above" as the correct response.

Timeline for Implementation (Make a timeline for how you will implement the next steps outlined above): We do not have an SSLO in place for students taking the ESL assessment. While the SSLO itself could be the same, the questions we ask would need to be different because the assessment and curriculum are different. Ongoing: we will continue to find ways to clearly communicate the math retest policy, because if students fail to understand it, their academic progress will be severely hindered.
We will also continue to be a resource to assist students through the challenge process (referring them to the challenge form online and to the CESS Division Office). In the next year, we will meet and develop an SSLO for students taking the ESL assessment. We will be ready to launch the evaluation in 2014-15.