Assessing Transfer-Level English
Strengthening Student Success Conference, October 3, 2007
Sandra Stefani Comerford, Professor, English Assessment Coordinator
College of San Mateo
comerford@smccd.edu

Some Background . . .
College of San Mateo: part of a three-campus district
Total student headcount: 10,634 (Fall 2006)
English at CSM:
13 full-time instructors, teaching 34 classes (Fall 06)
22 part-time instructors, teaching 42 classes (Fall 06)
30 sections of English 100 (1A) (Fall 06)
Two levels of pre-100 (1A) English
82% of students place into our developmental level

Course-Based Department-wide English Assessment at CSM: Challenges . . .
English departmental structure
Lacking a history of holistic scoring
Norming sessions rarely held
Number of part-time instructors

. . . And Advantages
English discipline culture (group commitment to high standards and consistency)
A shared value of meaningful assessment leading to positive change

CSM Assessment History
Formal efforts in student services began in Fall 2003.
Efforts in instruction began in Fall 2004 with the formation of the College Assessment Committee (CAC), which began to address the development of CSM’s assessment plan.
The CAC supports the assessment work of disciplines in various ways, including a professional development grant, district-wide workshops, college-wide workshops, assessment updates, and a resource page on the college’s assessment website.

CSM Assessment History, Continued
In Fall 2006, the CSM Committee on Instruction began requiring that official course outlines contain SLOs.
Also in Fall 2006, a report of SLO assessment became part of our annual Program Review.
SLOs are now required on syllabi.
The college’s assessment website gives information about CSM’s assessment processes: http://www.collegeofsanmateo.edu/assessment

Overview of Outcomes Assessment in English at CSM
SLOs for all English composition courses and many literature courses were established between 2004 and 2007.
First course-based department-wide assessment in composition: English 100 (English 1A).

Outcomes Assessment in English at CSM, Continued
Course-embedded summative assessment of student writing in the English 100 composition course, not using common prompts.
Representative samples of student writing read against an analytic rubric after a norming session.
Consistent effort to use assessment results to improve teaching and learning.

English 100 Assessment: Fall 2006
Distributed a memo in September to all English 100 instructors, indicating submission of 5 randomly selected unmarked essays, along with the writing assignment, at the end of the semester.
Distributed a second memo in November to all English 100 instructors with detailed instructions.
Reached agreement as a department on an analytic rubric for scoring.

English 100 Assessment, Continued: Spring 2007
Chose to assess five SLOs for English 100.
Completed the rubric, with two categories and a design for two readers to respond.
Met in January 2007 to read and score randomly selected essays from the 140 sample essays submitted (about 4% of those actually written in all the English 100 courses).
28 of the 30 sections submitted essays.
Readers (N = 12), after a brief norming session, received an essay packet (essay assignment and 5 student essays).
Readers were paired.
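The paired-reader, two-category scoring design described above lends itself to a simple tally of agreements and disagreements. The sketch below is not part of the original presentation; it is a minimal Python illustration that assumes a "discrepancy" means the two paired readers disagreed on a criterion, and all names in it (PairedRating, tally, the sample data) are hypothetical.

```python
from dataclasses import dataclass

# Rubric criteria as described in the presentation (SLO 1 split into two subheadings).
CRITERIA = [
    "Thesis",
    "Development/Support",
    "Organization/Focus",
    "Purpose and Audience",
    "Sentence Fluency & Editing/Proofreading",
    "MLA Format",
]

@dataclass
class PairedRating:
    """One criterion on one essay, rated independently by two paired readers."""
    essay_id: str
    criterion: str
    reader_a: str  # "Adequate" or "Needs Work"
    reader_b: str

def tally(ratings):
    """Count agreed ratings and discrepancies (reader disagreements) per criterion."""
    counts = {c: {"Adequate": 0, "Needs Work": 0, "Discrepancy": 0} for c in CRITERIA}
    for r in ratings:
        if r.reader_a == r.reader_b:
            counts[r.criterion][r.reader_a] += 1
        else:
            counts[r.criterion]["Discrepancy"] += 1
    return counts

# Tiny made-up example: two essays rated on the Thesis criterion.
sample = [
    PairedRating("essay-01", "Thesis", "Adequate", "Adequate"),
    PairedRating("essay-02", "Thesis", "Adequate", "Needs Work"),
]
print(tally(sample)["Thesis"])  # {'Adequate': 1, 'Needs Work': 0, 'Discrepancy': 1}
```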
Outcomes Assessed:
SLO 1: Ability to analyze and critically respond to college-level texts (thesis)
SLO 1: Development/Support
SLO 2: Organization/Focus
SLO 3: Purpose and Audience
SLO 4: Sentence fluency and editing/proofreading
SLO 5: Effective incorporation of textual material using standard MLA format

English 100 Assessment Results (N = 120)
Criterion                                    Adequate   Needs Work
Respond to college-level texts - Thesis          86          33
Development/Support                              80          34
Organization/Focus                               66          50
Purpose and Audience                             97          20
Sentence Fluency & Editing/Proofreading          53          66
Integrating textual material - MLA Format        61          57

Assessment Results (Graph)
[Bar chart of Adequate vs. Needs Work counts for Thesis, Development/Support, Organization/Focus, Purpose/Audience, Fluency/Editing, and MLA Format]

Assessment Results
Percentage of sample essays demonstrating evidence of SLO achievement, and number of discrepancies (a worked check of these percentages follows the Results slides below):
Criteria                                    Adequate (%)   Needs Work (%)   Discrepancies (#)
Respond to text - Thesis                         72              28                 6
Development/Support                              70              30                14
Organization/Focus                               57              43                 6
Purpose & Audience                               83              17                 4
Fluency & Proofreading                           45              55                16
MLA Format                                       52              48                11

Interpretation of Results
The two subheadings under SLO 1 address two separate issues and were difficult to evaluate as one SLO; they were separated into two subheadings on the rubric.
Some essay assignments required summaries or a specific number of paragraphs per essay--both a problem at the end of English 100.
Some assignments were not appropriate for the English 100 level and did not seem to elicit writing that could be judged with the rubric.
It is impossible to say that papers “failed” to meet a requirement that was not specified on the prompt.

Results
SLO 1 Respond critically to college-level texts - Thesis (first subheading):
A low discrepancy rate of 6.
The 72% success rate was deemed acceptable at this time.

Results
SLO 1 Respond critically to college-level texts - Development/Support (second subheading):
A discrepancy rate of 14 caused concern (perhaps due to the last-minute change in the rubric with the division of subheadings).
The 70% success rate was deemed acceptable at this time.

Results
SLO 2 Organization/Focus:
A low discrepancy rate of 6.
The 57% success rate is disquieting; discussion during and after the reading suggested that this area needs more attention.

Results
SLO 3 Purpose and Audience:
A low discrepancy rate of 4.
Students demonstrate competency with this SLO, with an 83% success rate.
Discussion at the reading speculated that awareness of academic audience was too difficult to evaluate when readers were not familiar with the assignment; perhaps these good results stemmed from an inability to judge the outcome.

Results
SLO 4 Sentence Fluency & Editing/Proofreading:
A discrepancy rate of 16 caused concern.
Fewer than half of the essays demonstrated competency in this area, with a success rate of 45%.
With two subheadings rated together, participants questioned whether they could evaluate these as one SLO.

Results
SLO 5 MLA Format:
A discrepancy rate of 11 caused concern.
Barely half of the essays demonstrated competency in this area, with a success rate of 52%.
Students unable to demonstrate competency with this SLO had recurring problems with providing correct in-text citations as well as formatting Works Cited pages correctly.
Discussion at the reading speculated that we aren’t spending enough time teaching MLA conventions and quotation methods--or holding students to sufficient standards in our grading practices.
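As an arithmetic check (not part of the original presentation), the success percentages in the Assessment Results table follow from the Adequate/Needs Work counts when the success rate is computed as Adequate divided by the number of essays rated on that criterion (Adequate + Needs Work); note that these per-criterion totals fall slightly below N = 120 in the reported counts. A minimal Python sketch:

```python
# Adequate / Needs Work counts as reported in the English 100 Assessment Results table.
counts = {
    "Respond to text - Thesis":                (86, 33),
    "Development/Support":                     (80, 34),
    "Organization/Focus":                      (66, 50),
    "Purpose & Audience":                      (97, 20),
    "Sentence Fluency & Editing/Proofreading": (53, 66),
    "MLA Format":                              (61, 57),
}

for criterion, (adequate, needs_work) in counts.items():
    success_rate = round(100 * adequate / (adequate + needs_work))
    print(f"{criterion}: {success_rate}% adequate")
# Prints 72, 70, 57, 83, 45, and 52 -- matching the percentages reported above.
```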
Changes Resulting from Assessment: Part 1
Revision of the rubric:
Division of subheadings in SLO 1 and SLO 4.
Because SLO 4 had the most discrepancies, it needs to be more specific, i.e., for sentence fluency, are there specific signs? For editing/proofreading, is there an acceptable number/type of errors?
Elimination of “academic audience” in SLO 3, with a focus on understanding the texts incorporated in the essay (thus with an emphasis on reading comprehension).

Changes Resulting from Assessment: Part 2
Development of a course handbook for English 100 consisting of the official course outline, guidelines, and sample essay assignments with corresponding student papers appropriate for the skill level needed by the end of English 100 and for the task of assessment, thereby making expectations clearer and providing pedagogical advice to all instructors.
An all-day, off-campus English Department retreat for all English faculty to discuss and review best teaching practices (including issues about grammar).

Using the Results to Improve
As a model for doing course-based department-wide assessment, this approach will be modified to assess learning in English 165 (1B) during Fall 2007.
English 100 assessment results were tabulated and distributed department-wide, along with discussion notes from the SLO essay reading, to all English instructors, underscoring evidence that we need to teach and assess based on agreed-upon rubric standards.
Discussion in discipline meetings on how to implement best teaching practices and on how to teach effectively to these SLOs.