Assessment Literacy Series - Module 4: Scoring Keys and Rubrics

Objectives
Participants will:
1. Develop scoring keys for all multiple-choice items outlined within the blueprint.
2. Develop scoring rubrics for constructed-response items/tasks that reflect a performance continuum.

Helpful Tools
Participants may wish to reference the following:
Guides
• Handout #6 – Scoring Key Example
• Handout #7 – Rubric Examples
Templates
• Template #5 – Scoring Key and Rubric
• Performance Task Framework

Outline of Module 4
Module 4: Scoring Keys and Rubrics covers:
• Scoring Key for MC: Process Steps
• Rubrics for Constructed Response: Sample Answers and Scoring Criteria

Scoring Keys

Scoring Keys
Scoring keys typically contain elements such as the following:
• Performance measure name or unique identifier
• Grade/Course
• Administration timeframe (e.g., fall, mid-term, final examination, spring)
• Item tag and type
• Maximum points possible
• Correct answer (MC) or rubric with sample answers or anchor papers (SCR & ECR)

Scoring Key Example [Handout #6]
Assessment Name: Algebra I End-of-Course
Grade/Course: Algebra I
Administration: Post-test (Spring)
UAI (Unique Assessment Identifier): 01.003.1112

Item # | Item Tag | Item Type | Point Value | Answer
1 | 0001.MTH.ALGI.POST.MC-LV1-8EE2 | MC | 1 | C
2 | 0002.MTH.ALGI.POST.MC-LV1-ACED1 | MC | 1 | B
3 | 0003.MTH.ALGI.POST.MC-LV1-8EE1 | MC | 1 | C
4 | 0004.MTH.ALGI.POST.MC-LV1-ASSE1 | MC | 1 | D
5 | 0005.MTH.ALGI.POST.MC-LV2-ASSE1 | MC | 1 | A
6 | 0006.MTH.ALGI.POST.MC-LV2-7RP3 | MC | 1 | C
7 | 0007.MTH.ALGI.POST.MC-LV2-ACED1 | MC | 1 | D
8 | 0008.MTH.ALGI.POST.MC-LV2-ACED1 | MC | 1 | D
9 | 0009.MTH.ALGI.POST.MC-LV1-ACED1 | MC | 1 | B
10 | 0010.MTH.ALGI.POST.MC-LV1-AREI3 | MC | 1 | A
11 | 0011.MTH.ALGI.POST.MC-LV1-FIF1 | MC | 1 | D
12 | 0012.MTH.ALGI.POST.MC-LV2-ACED1 | MC | 1 | A
13 | 0013.MTH.ALGI.POST.ECR-LV2-NRN2 | SCR | 2 | See Scoring Rubric
14 | 0014.MTH.ALGI.POST.ECR-LV2-FIF1 | ECR | 4 | See Scoring Rubric

Scoring Keys
Scoring keys for MC items:
• Answers within the key must represent a single, correct response.
• Answers should be validated once the key is developed to avoid human error.
• Validation of answers should occur prior to form review.
• Items changed during the review stage must be revalidated to ensure the scoring key is correct.

Process Steps [Template #4]
1. Enter the assessment information at the top of the Scoring Key.
2. Record the single, correct answer during item development. For SCR and ECR items/tasks, reference the scoring rubrics in the answer column and place them in the correct rubric table on the Rubric Template.
3. Record the item number, item tag, item type, and point value.
4. Record the MC answers in the answer column. For each CR item, include the general scoring rubric and a sample response for each point value.
5. Repeat Steps 1–4 until all items/tasks on the blueprint are reflected within the Scoring Key.

QA Checklist
• All items/tasks articulated on the blueprint are represented within the Scoring Key.
• MC items have been validated to ensure that only one correct answer exists among the possible options provided.
• MC answers do not create a discernible pattern.
• MC answers are "balanced" among the possible options (a sketch of an automated balance/pattern check follows this list).
• Scoring Key answers are revalidated after the final operational form reviews are complete.
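Several of these checks lend themselves to automation once the key exists in electronic form. The Python sketch below is a minimal illustration, assuming the key is available as a simple list of (item number, item type, keyed answer) records; the SCORING_KEY data, the OPTIONS tuple, and the run-length threshold are illustrative assumptions rather than part of the template.

    from collections import Counter

    # Illustrative scoring-key records: (item number, item type, keyed answer).
    # CR items point to the rubric rather than a single letter.
    SCORING_KEY = [
        (1, "MC", "C"), (2, "MC", "B"), (3, "MC", "C"), (4, "MC", "D"),
        (5, "MC", "A"), (6, "MC", "C"), (7, "MC", "D"), (8, "MC", "D"),
        (9, "MC", "B"), (10, "MC", "A"), (11, "MC", "D"), (12, "MC", "A"),
        (13, "SCR", "See Scoring Rubric"), (14, "ECR", "See Scoring Rubric"),
    ]

    OPTIONS = ("A", "B", "C", "D")

    def check_mc_key(key, max_run=3):
        """Flag unused options, heavy imbalance, and long runs of one letter."""
        answers = [ans for _, item_type, ans in key if item_type == "MC"]
        counts = Counter(answers)
        warnings = []

        # Balance: every option should appear, and no option should dominate.
        for option in OPTIONS:
            if counts[option] == 0:
                warnings.append(f"Option {option} is never the correct answer.")
        if counts and max(counts.values()) > 2 * (len(answers) / len(OPTIONS)):
            warnings.append("One option is keyed far more often than the others.")

        # Pattern: look for runs of the same keyed letter longer than max_run.
        run = 1
        for prev, curr in zip(answers, answers[1:]):
            run = run + 1 if curr == prev else 1
            if run > max_run:
                warnings.append(f"More than {max_run} consecutive items keyed '{curr}'.")
                break

        return warnings

    for message in check_mc_key(SCORING_KEY):
        print("WARNING:", message)

Checks like these supplement, but do not replace, human validation of each keyed answer against the item itself.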
Scoring Rubrics

Holistic vs. Analytic Rubric Scoring
Holistic Scoring
• Provides a single score based on an overall determination of the student's performance
• Assesses a student's response as a whole for its overall quality
• Most difficult approach for calibrating different raters
Analytic Scoring
• Identifies and assesses specific aspects of a response
• Assigns scores on multiple dimensions
• Combines the subscores logically into the overall assigned score

Rubric Scoring Considerations
• Describe whether spelling and/or grammar will impact the final score.
• Avoid using words like "many", "some", and "few" without adding numeric descriptors to quantify these terms.
• Avoid using words that are subjective, such as "creativity" or "effort".
• Avoid subjective adjectives such as "excellent" or "inadequate".

SCR Rubric Example [Handout #7]
General Scoring Rubric
2 points: The response gives evidence of a complete understanding of the problem. It is fully developed and clearly communicated. All parts of the problem are complete. There are no errors.
1 point: The response gives evidence of a reasonable approach but also indicates gaps in conceptual understanding. Parts of the problem may be missing. The explanation may be incomplete.
0 points: There is no response, or the work is completely incorrect or irrelevant.

SCR Rubric Example [Handout #7]
Sample Response: "In two complete sentences, explain why people should help save the rainforests."
2 points: The student's response is written in complete sentences and contains two valid reasons for saving the rainforest. "People must save the rainforest to save the animals' homes. People need to save the rainforest because we get ingredients for many medicines from there."
1 point: The student's response contains only one reason. "People should save the rainforest because it is important and because people and animals need it."

Rubrics for ECR Tasks
• Create content-based descriptions of the expected answer for each level of performance on the rubric.
• Provide an example of a fully complete/correct response along with examples of partially correct responses.
• Reference the item expectations in the rubric.
• Make the rubric as clear and concise as possible so that other scorers would assign exact/adjacent scores to the performance/work under observation (a quick check is sketched below).
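The last bullet above, that independent scorers should assign exact or adjacent scores, can be verified during a rubric tryout by comparing two raters' scores on the same set of responses. The sketch below is one minimal way to do this; the rater score lists are hypothetical and assume the 0–4 ECR scale.

    def agreement_rates(scores_a, scores_b):
        """Exact and adjacent (within one point) agreement between two scorers."""
        if len(scores_a) != len(scores_b):
            raise ValueError("Both scorers must rate the same set of responses.")
        pairs = list(zip(scores_a, scores_b))
        exact = sum(a == b for a, b in pairs) / len(pairs)
        adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
        return exact, adjacent

    # Hypothetical scores two raters assigned to ten ECR responses (0-4 scale).
    rater_1 = [4, 3, 2, 4, 1, 0, 3, 2, 4, 3]
    rater_2 = [4, 3, 3, 4, 1, 0, 2, 2, 3, 3]

    exact, adjacent = agreement_rates(rater_1, rater_2)
    print(f"Exact agreement:    {exact:.0%}")
    print(f"Adjacent agreement: {adjacent:.0%}")

Low exact or adjacent agreement usually signals that the rubric language needs tightening or that sample responses/anchor papers are needed.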
ECR Rubric Example [Handout #7]
General Scoring Rubric
4 points: The response provides all aspects of a complete interpretation and/or a correct solution. The response thoroughly addresses the points relevant to the concept or task. It provides strong evidence that information, reasoning, and conclusions have a definite logical relationship. It is clearly focused and organized, showing relevance to the concept, task, or solution process.
3 points: The response provides the essential elements of an interpretation and/or a solution. It addresses the points relevant to the concept or task. It provides ample evidence that information, reasoning, and conclusions have a logical relationship. It is focused and organized, showing relevance to the concept, task, or solution process.
2 points: The response provides a partial interpretation and/or solution. It somewhat addresses the points relevant to the concept or task. It provides some evidence that information, reasoning, and conclusions have a relationship. It is relevant to the concept and/or task, but there are gaps in focus and organization.
1 point: The response provides an unclear, inaccurate interpretation and/or solution. It fails to address or omits significant aspects of the concept or task. It provides unrelated or unclear evidence that information, reasoning, and conclusions have a relationship. There is little evidence of focus or organization relevant to the concept, task, and/or solution process.
0 points: The response does not meet the criteria required to earn one point. The student may have written on a different topic or written "I don't know."

ECR Rubric Example [Handout #7]
Sample Response: "List the steps of the Scientific Method. Briefly explain each one."
4 points:
1. Ask a Question: Ask a question about something that you observe: How, What, When, Who, Which, Why, or Where?
2. Do Background Research: Use library and Internet research to help you find the best way to do things.
3. Construct a Hypothesis: Make an educated guess about how things work.
4. Test Your Hypothesis: Do an experiment.
5. Analyze Your Data and Draw a Conclusion: Collect your measurements and analyze them to see if your hypothesis is true or false.
6. Communicate Your Results: Publish a final report in a scientific journal or present the results on a poster.
3 points:
1. Ask a Question
2. Do Background Research: Use library and Internet research.
3. Construct a Hypothesis: An educated guess about how things work.
4. Test Your Hypothesis: Do an experiment.
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results
2 points:
1. Ask a Question
2. Do Background Research
3. Construct a Hypothesis
4. Test Your Hypothesis
5. Analyze Your Data and Draw a Conclusion
6. Communicate Your Results
1 point: Ask a Question, Hypothesis, Do an Experiment, Analyze Your Data
0 points: "I don't know."

Process Steps [Template #4]
1. Create the item/task description for the student.
2. Using a "generic" rubric, begin by modifying the language with the specific criteria expected in a response that earns the maximum number of points.
3. Next, determine how much the response can deviate from "fully correct" in order to earn the next (lower) point value. [Continue until the full range of possible scores is described.]
4. Using the "sample" rubric, create an example of a correct or possible answer for each level in the rubric.
5. In review, ensure the item/task description for the student, the scoring rubric, and the sample rubric are aligned.

QA Checklist
• CR items/tasks have scoring rubrics that reflect a performance continuum.
• CR items/tasks include sample responses for each level of performance.
• CR scoring rubrics are clear and concise.
• CR scoring rubrics include all dimensions (aspects) of the tasks presented to the students.
• CR scoring rubrics avoid non-cognitive (motivation, timeliness, etc.) or content-irrelevant attributes.

Summary & Next Steps
Summary – Module 4: Scoring Keys & Rubrics
• Developed a scoring key and rubrics for all items.
Next Steps – Module 5: Operational Forms & Administrative Guidelines
• Given the items/tasks developed, create an operational form with applicable administrative guidelines.
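As a closing illustration of how the products of this module come together at scoring time, the sketch below totals one student's results from a scoring key and rater-assigned rubric scores. It is a minimal example under stated assumptions: the KEY slice, the student responses, and the rubric scores shown are hypothetical, and operational scoring would be carried out on the form developed in Module 5.

    # Hypothetical slice of a scoring key: item number -> (type, points, keyed answer).
    KEY = {
        1: ("MC", 1, "C"),
        2: ("MC", 1, "B"),
        13: ("SCR", 2, None),   # scored with the 0-2 SCR rubric
        14: ("ECR", 4, None),   # scored with the 0-4 ECR rubric
    }

    # Hypothetical student record: MC selections plus rubric scores assigned by a rater.
    mc_responses = {1: "C", 2: "A"}
    rubric_scores = {13: 2, 14: 3}

    def score_student(key, mc_responses, rubric_scores):
        """Total the MC items against the key and add rater-assigned rubric scores."""
        earned = 0
        possible = 0
        for item, (item_type, points, answer) in key.items():
            possible += points
            if item_type == "MC":
                earned += points if mc_responses.get(item) == answer else 0
            else:
                # Rubric scores are capped at the item's maximum point value.
                earned += min(rubric_scores.get(item, 0), points)
        return earned, possible

    earned, possible = score_student(KEY, mc_responses, rubric_scores)
    print(f"Score: {earned}/{possible}")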