Troy University eTROY Colloquium, April 17-18, 2012
Assessment Roles of Faculty and Adjuncts in eTROY
Dr. Wendy Bailey – Sorrell College of Business
Dr. Christina Martin – College of Health & Human Services
Dr. Isabelle L. Warren – College of Education

• Review the purpose and goals of assessment of student learning
• Clear up concerns about assessment
• Provide an introduction to the assessment cycle
  ◦ University and program missions, linkages, outcomes, data collection/reporting, action plans
• Define the assessment role of faculty and adjunct professors
• Explain how Blackboard Outcomes can help, and the timeline for implementation

Assessment is the ongoing process of:
1. Establishing clear, measurable expected outcomes of student learning tied to program and university missions
2. Ensuring that students have sufficient opportunities to achieve those outcomes in the curriculum
3. Systematically gathering, analyzing, and interpreting evidence to determine how well student learning matches our expectations
4. Using the resulting information to make changes that improve student learning
5. Continuing to measure, evaluate, and make further changes as part of a process of continuous improvement
– adapted from Linda Suskie, Assessing Student Learning, 2009

• Programs designed curricula they thought would give students the background they need in a particular field.
• Instructors taught courses in the curriculum and gave grades based on students' overall performance in each course.
• If students did well in the course, we assumed they knew what we wanted them to know.
… But did they?

Learning is a function of the curriculum, not the course.
• Curricula need to be more than a collection of courses. They need to ensure that every student has ample opportunity to achieve key institutional and program learning goals.
• Institutional and program goals are broader than the course goals themselves and should be reinforced throughout the curriculum. Students need to see the connections among their courses and other learning experiences, as this makes learning deeper and longer lasting.

The goal of assessment is to improve our academic programs. Good assessment programs:
1. Help us to stay focused on and to continually evaluate our stated program goals
2. Bring faculty and staff together to discuss important issues related to teaching and the standards used
3. Help faculty, staff, and students see how courses link together to achieve program goals
4. Identify issues that may impede student learning
5. Allow us to make better decisions that are based on data, rather than hunches, anecdotes, or intuition

When a professor assesses the work done in a course, assessment has a different purpose than program-level assessment.
• In a class, every student is assessed, the professor sets their own criteria and standards (explicit or not), the professor evaluates the work, and the student gets the result.
• In program-level assessment, sampling is acceptable, the faculty (not one professor) set explicit criteria and standards, the faculty or outside evaluators do the evaluation, and the feedback goes to the program faculty.

Program-level assessment checks whether students are on track to achieve, or have achieved, important program goals.
• Since these goals are reinforced throughout the curriculum, an assessment in one course doesn't measure what that professor did or did not do, but what the program has achieved up to that point.
• Evaluation of individual faculty should never be the goal of program-level assessment, and the results should never be used that way.
• Assessment is a team sport! We celebrate our victories together, and we figure out how to change things together if results are disappointing.

“I have assessed students! I assigned them a course grade.”
• Course grades are not sufficient for program-level assessment.
• An overall course grade of a B doesn't tell us which skills and concepts a student has mastered.
• Sally may have earned a C average on exams and an A on her project, while John may have earned an A average on his exams and a C on his project. Both earned a B, but the grade alone won't tell us what each has mastered and which skills need more work.

“Mandating program-level assessment violates the principle of academic freedom.”
• “Academic freedom does not absolve instructors of their responsibility to ensure that all students in their program… have sufficient opportunity to achieve those goals that the faculty collectively agree are essential…”
• “Academic freedom also does not relieve faculty of the obligation to assess student learning of their subject…”
– Linda Suskie, Assessing Student Learning: A Common Sense Guide, 2009

“I do not have time to conduct assessments, as it is not within my job description.”
• Assessment is everyone's responsibility.
• Accrediting agencies want to see that faculty know the program's goals and that they are involved in assessment activities.
• To save time, be smart about assessment. Course-embedded assessments can serve double duty.
  ◦ Example: a capstone research project. The instructor evaluates it for the course grade (content, etc.). Samples of the same projects can then be evaluated by faculty, using two different rubrics for writing and critical thinking skills, to evaluate the achievement of program-level goals.

[Assessment cycle diagram, centered on the Mission: set goals, objectives, and outcomes → align curriculum with outcomes → choose how outcomes will be assessed & set criteria → gather the data → evaluate, report, & share the data → use the data to make meaningful changes]

Program goals are broad, conceptual statements that show the long-term aim or purpose of the entire course of study. Goals should be related to the college's mission, as well as the institution's.
Program objectives are more specific than goals and are more short-term. They indicate the intended consequences of instruction within a timeframe.
Program student learning outcomes (SLOs) describe significant and essential learning that students should achieve or reliably demonstrate at the end of a program. SLOs must be specific, observable, and measurable.

Next, examine your curriculum.
• Where are concepts related to each outcome introduced, reinforced, or mastered?
• Students should be given multiple opportunities to master important program goals.
• Are there gaps in your curriculum?

[Sample curriculum map: courses ENG1101, ENG1102, ENG3341, ENG4430, and ENG4442 mapped against Program SLOs 1-4, with each cell marked I = introduced, R = reinforced, or M = mastered (e.g., ENG1101 introduces SLOs 1 and 2; ENG3341 reinforces them)]

Assessments measure whether students have achieved the program learning outcomes we've set.
◦ Direct assessment measures provide the strongest evidence of student learning because they are based on actual student performance on a task.
◦ Indirect assessment measures supplement direct measures. They provide information about students' perceptions of their learning experiences and attitudes toward the learning process, as well as program quality.
No assessment of learning outcomes should be based on indirect measures of learning alone!

Criteria are standards of performance required to meet the objective/outcome, that is, the quality to be judged in the assessment task:
• Quality words often used in criteria: clarity, accuracy, depth, legibility, impact, relevance, etc.
  ◦ Example: “Clarity of explanation” is a criterion for “Students will be able to explain how concepts in the subject interrelate.”
• Criteria may be expressed as a percentage, a target number of accomplishments, a rate, an increase over a previous criterion, completion of a task or event, etc.
  ◦ Example: 85% of the students will be able to analyze … using the correct statistical procedures.

Assessment data must be collected before it can be analyzed. Accrediting agencies will often allow sampling, rather than a census, provided the sample is representative and data is reported by program and location. This is one place where products such as Blackboard Outcomes can be very valuable.

Assessment data needs to be evaluated, and the results communicated to others, in order for it to inform decisions about programs.
• Assessment itself doesn't bring improvements in student learning; analysis and use of the results do.
• If your assessment data is lying in a corner gathering dust, ask yourself whether the information gathered is useful. If not, figure out why.
• Assessment results also need to be communicated to others (faculty, students, stakeholders) who can use them to make decisions.

Assessment is only valuable if the results of our analyses are used to make meaningful changes.
• “Closing the loop” simply means using the data to make changes. These changes need not be huge, but they should be meaningful.
• Examples of such changes include new or modified courses, better coordination among courses or sections, modifications to concentrations, curriculum development grants, new course sequencing or prerequisites, opportunities for remedial work, new common assignments to address weaknesses, etc.

Be knowledgeable about your program's student learning outcomes.
1. Know the courses that are selected for assessment activity.
2. If you are the instructor of record for an assessment course, know what measures are being utilized for assessment activity… and USE them.
3. Ensure completion of the assessment activity. This can be a graded or non-graded assignment.
4. Ensure that the assessment activity was evaluated and documented!
5. Maintain this documentation; report findings to your assessment point person (program/department chairs).

In maintaining your assessment documents, specify the raw number of students who completed the identified assessments “satisfactorily” and/or “unsatisfactorily.”
• Example: 7/10 students scored 80% or higher (satisfactorily) on the diversity project.
• This information will help us make informed decisions and enhance program quality.
• Troy University HOMER Report sample.

Blackboard Outcomes can help with:
• Curriculum maps
• Tagging of questions or assignments to particular learning objectives
• Gathering and sampling of assessment data
• Analysis of assessment results
• And more!
Timeline for implementation at Troy

Questions?
Wendy Bailey – wcbailey@troy.edu
Christina Martin – cllmartin@troy.edu
Isabelle Warren – iwarren@troy.edu
Kang Bai – bkang@troy.edu
Wendy Broyles – whuckabee@troy.edu