Danielle Duffourc, Ph.D.
Director for Institutional Effectiveness and Assessment
Xavier University

Today's workshop will focus on:
◦ Understanding the assessment results reported annually and how they can be used to improve your program/department.

Think of assessment planning like planning a road trip:
◦ Who is going with us?
◦ What time are we leaving and how long will we be gone?
◦ How much will the trip cost?
◦ What kind of activities will we do on the trip?
◦ Is there any car maintenance to be done before leaving?

Remember that measurement is not judgment. When the gas gauge tells you the tank is empty, it's not judging you; it's just stating a fact. It is your job to make the best decisions possible with the information from that measure. You need gas to make the trip, but you would never have known that if you didn't look at the gauge. Find your measures, and don't look at them as failure indicators. Look at them as opportunities to improve the program.

Know what you are looking for:
◦ Why are we assessing X? Are we aiming to improve our program or validate it? What decisions will this assessment help us and our audiences make?
◦ Do we have a clear strategy in place to ensure that we can achieve each of our major goals?
◦ Do the tools and techniques we're using to measure progress clearly correspond to our goals?
◦ Have we set targets against which to compare our results?

What about "Inadequate" findings?
◦ Are we certain that the results are inadequate?
◦ Does other evidence indicate a problem? (This is why we use two means of assessment per outcome.)
◦ What is the scope of the problem? (If it affects only a small percentage of the total, it may not be worth redesigning the entire process or curriculum.)

In the academic setting, disappointing results can usually be attributed to:
◦ Learning Goals
◦ Curriculum
◦ Teaching Methods
◦ Assessment Strategy and Tools

In the administrative setting, disappointing results can usually be attributed to:
◦ Operational Goals
◦ Implementation Strategy
◦ Assessment Strategy

Two ground rules:
◦ 1. Don't try to brush disappointing results under the carpet. The primary purpose of assessment is improvement.
◦ 2. Don't be punitive. Punishing the faculty/staff responsible for an area with disappointing results will generally impede the assessment process.

Focus on making a limited number of changes.
◦ Departments that try to engage in too many initiatives wind up accomplishing none of them. Focus on specific elements that relate to your chosen outcomes.

Create sustained conversations about assessment data and engage in sense-making activities. This is akin to a campaign, not a series of reports posted on a website.
◦ List the individuals, constituencies, and governance structures that need to be engaged in discussions of assessment evidence, and then develop plans to engage these constituencies in conversations for both making sense of and developing responses to the data.
◦ Even before these groups receive data, it is important to consider engaging them in planning for what different findings might imply.

Communicate assessment results.
◦ Can faculty, staff, and students readily identify the outcomes, measures, and recent findings of their department's assessment program?
◦ Would all faculty members in a department be able to cite the same two or three things that their department is doing well and the same two or three areas for improvement, along with evidence that supports their assertions?
◦ If the answer to these questions is "no," then it is time for the department or program to revise how it communicates about assessment.
Get evidence into the hands of people who are able and interested in using it to improve student learning and the student experience, and then support their efforts to understand and use the data.
◦ Hiding data because they are too controversial, sending out a report via email, or posting information on a website without creating opportunities for people to come together to reflect on and make sense of the findings will ensure that assessment evidence has little long-term impact.

Sometimes assessment results suggest fairly simple, low-cost, quick fixes:
◦ Focus only on certain goals in certain courses/settings.
◦ Incorporate more exercises involving desired skills.
◦ Revise surveys and assessment tools for the next year.

Sometimes assessment results point to a more significant problem:
◦ You may need to find professional development and/or campus resources that will help students/faculty/staff learn needed skills outside the classroom (software, expertise, etc.). This requires considerable planning and resource development. Campus leaders must be engaged.

TracDat has built-in modules for Actions Taken and Follow-up.

It is also important to honor results that are positive.
◦ Ongoing incentive and reward programs can be developed. Appropriate rewards will depend on the culture of the department.

Assessment is a perpetual work in progress. It is important to implement and refine as you go. There is no point in continuing assessment strategies that are time-consuming and not providing useful information. Seek to fine-tune rather than dramatically overhaul your assessment process.