Assessment Committee – Nov 19, 2014
Guidelines For Completing Course-Level Assessment Reports

List the faculty who assessed or discussed assessments for sections of this course, as well as other individuals who may have been involved in the dialogue. Faculty may want to consider sending both their results and their ideas for improvement to colleagues in related disciplines and asking for feedback; those colleagues should then be listed.

The artifacts are representative examples (not every assessment). They can be either paper or digital copies. Consult your Dean or Director for clarification if you intend to store these in a central divisional area. If paper copies are stored, indicate the specific office/storeroom location. For digital copies, indicate the resource location (preferably an online resource such as Canvas or myCR). Artifacts for performance-based assessments may be the assignment, rubric, and grading, but not an actual recording of a performance.

List the course section(s) and the semester of the assessment. If multiple sections of a single course were assessed, indicate the different sections here. Courses with two sections should include combined results from both sections assessed; courses with three or more sections should include combined results from at least three of the sections. Include the semester of the assessment; this is helpful because the reports are listed by academic year rather than by semester.

A variety of assessment tools may be used. These include portions of regular class assignments or independent assessment activities that are not part of the class grade. Authentic assessment is often described as students applying the course outcome to real-world situations: for example, can a student analyze a published study using the skills they learned in the course?

List the criteria (rubric) used to rank the assessments into the categories listed below. The rubric is decided by the faculty experts and varies by discipline. An example rubric:
Exceeded Expectations: the student demonstrated the outcome without any errors or omissions in their assessment.
Met Expectations: the student demonstrated achievement of the outcome, but their response included minor errors or omissions.
Did Not Meet Expectations: the student did not demonstrate achievement of the outcome; the response included significant errors and/or omitted significant details.
Some faculty use a "percent correct" method to establish these levels: >90% exceeds; 70-90% meets; <70% does not meet. The rubric is often specific to the discipline and to the method of assessment.

The numbers in this section should be the combined numbers for all students in the sections of the course(s) assessed. If only one section was assessed, the numbers will reflect the students in that single section. Courses with two sections should include combined results from both sections assessed, and courses with three or more sections taught in a single semester should include results from at least three of those sections.
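As one way to picture how combined counts might be tallied, the short sketch below applies the example "percent correct" cutoffs described above to scores from several sections of a course. This is a minimal illustration only: the section labels, scores, and use of Python are assumptions, and disciplines using a different rubric would categorize differently.

# Minimal sketch, assuming percent-correct scores for each assessed section.
# Section labels and scores are hypothetical examples.
sections = {
    "Section E1": [95, 88, 72, 64, 91],
    "Section E2": [81, 77, 93, 58],
    "Section E3": [70, 90, 85, 66, 99, 74],
}

def categorize(score):
    # Example cutoffs from above: >90% exceeds, 70-90% meets, <70% does not meet.
    if score > 90:
        return "Exceeded Expectations"
    if score >= 70:
        return "Met Expectations"
    return "Did Not Meet Expectations"

# Combine all assessed sections into a single set of counts for the report.
combined = {"Exceeded Expectations": 0, "Met Expectations": 0, "Did Not Meet Expectations": 0}
for scores in sections.values():
    for score in scores:
        combined[categorize(score)] += 1

for level, count in combined.items():
    print(level, count)

Reporting one combined set of counts in this way keeps the report consistent whether one section or several sections were assessed.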
This section is the most important part of the report: it should provide evidence that assessment is being used to reflect on achievement and as the basis for course improvements, and that faculty are analyzing the results and using that analysis to determine the strengths and weaknesses of learning within the course. Even satisfactory results should be analyzed, and ideas for further improvement should be considered and documented. If many students are exceeding expectations, consideration should be given to the level of achievement and the suitability of the outcome. This is also an appropriate place to comment on the effectiveness of the outcome and on any need to modify outcomes.

This box should be checked only if the actions/changes are significant enough that they can be tracked and evaluated in a future assessment. This includes actions such as writing new outcomes, changing the format or content of the course, or using substantially different teaching methods (e.g., flipping the course or changing it to a more activity-based format). Do not check this box if the changes are minor and cannot be easily evaluated during the next regular assessment.

Examples of thorough assessment reports

Example reports -- combining multiple sections of a course
English 1A (SLO#1): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24293
Math 15 (SLO#3): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24570

Example reports -- single section courses
Geology 10 (SLO#3): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24866
Anthropology 1 (SLO#1): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=22337
Biology 20 (SLO#4): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24347

Example reports -- professional programs courses
Drafting Technology 23 (SLO#3) – single faculty report: http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24411
Auto Tech 14 (SLO#3) – single faculty with a faculty member from a related discipline: http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24427
Health Occupations 170B (SLO#5): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=25419

Example reports -- performance-based courses
Speech 1 (SLO#1): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24487
Art 23 (SLO#3): http://webapps.redwoods.edu/assessment/outcomesource/stoplightreport.aspx?ID=24738