Winning Hearts and Minds: Successful Strategies for Service Area Assessment

Mary McLean-Scanlon, Director of Institutional Effectiveness
Michael Dunn, Coordinator of Data and Administrative Assessment
Finger Lakes Community College

A little about us…
• Mary:
  • Director of IE at FLCC since August 2013; prior roles as Director of IE and Assistant Director of Research and Planning for two community colleges in Chicago
  • Doctoral candidate in executive leadership; master's in political science
  • Instructor for the SUNY CPD Institutional Effectiveness Program
  • Adjunct political science instructor
• Mike:
  • Coordinator since January 2015
  • M.P.A. graduate
  • Data nerd

IE is Cyclical and Interdependent
• Cyclical, documented process of continuous improvement
• Institutional Effectiveness cannot be achieved without connection, consistency, and commitment among all four areas
• [Diagram: Institutional Effectiveness at the center, connecting Strategic Planning, Master Planning, Academic Assessment, and Service Area Assessment, supported by Institutional Research and Accreditation]

Assessment
• "Assessment is any effort to gather, analyze, and interpret evidence which describes institutional, departmental, divisional or agency effectiveness" (Upcraft and Schuh, 1996)
• Assessment is formative: ongoing, process oriented, with the intent to improve
• It is NOT summative
• Assessment is NOT evaluation
• Evaluation is summative: final, arriving at a judgment or overall score

Central Themes
• Collaboration
• Leadership support
• Assessment is NOT evaluative
• Connection to budget
• Data-driven decisions

Communicate the Value of Assessment
• Gets staff across departments talking about their goals for their respective departments and students
• Increases our confidence that we are putting our time and resources into activities that we value as an institution
• Gather and use data that enable us to make decisions leading to improved processes, informed decision making, and efficient policies
• Have ready access to data that will meet accrediting agency requirements
• Gather and use data that strengthen arguments for increased funding and/or resource allocations to areas that are meeting outcomes or need resources to meet them

Who is involved?
• The entire college or university community
• It is vital that every level of the organization participates in continuous improvement and that it is modeled from the top
• Support from senior administrators is essential
• Institutional Effectiveness is sometimes one office; it can also be several offices serving an overall purpose
• If the areas are spread out, you still need someone responsible for bringing them together

Best Practices
• Assessment Plan
• Program Review
• External Review
• Discovery Report
• Annual momentum
• Governance

Assessment Plan
• Assessment plan VERSUS operational plan
• Goals need to align to something stable, such as learning values or Institutional Learning Outcomes
  • Aligning with the strategic plan is okay, but poses challenges
• If technology is possible, buy it!
  • Trying to manage these plans in Word or Excel is horrible
  • Technology creates a collaborative environment
• Start with assessment plans for all non-academic areas
  • May require areas to put processes in place to collect data
• Rubric

Assessment Plan
• Components (see the sketch below):
  • Mission
  • Vision
  • SMART Goals
  • Implementation Strategy
  • Measurement Tool
  • Data Collection
  • Process Evidence
• Data are used to improve processes and drive decisions
• Determine what is working well and where the unit can improve
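The components above are what make a dedicated platform pay off: once each plan is a structured record rather than a Word or Excel document, progress can be queried across every unit at once. The Python sketch below is a loose, hypothetical illustration of that idea; the class and field names are our own inventions for this example, not the presenters' actual assessment platform.

```python
# Minimal sketch (hypothetical, not FLCC's platform) of an assessment
# plan stored as structured data instead of a Word/Excel document.
from dataclasses import dataclass, field


@dataclass
class Goal:
    statement: str                # the SMART goal itself
    implementation: str           # implementation strategy
    measurement_tool: str         # e.g., rubric, survey, visit counts
    evidence: list[str] = field(default_factory=list)  # process evidence


@dataclass
class AssessmentPlan:
    unit: str
    mission: str
    vision: str
    goals: list[Goal] = field(default_factory=list)

    def unmeasured_goals(self) -> list[Goal]:
        """Goals with no evidence collected yet -- flaggable across units."""
        return [g for g in self.goals if not g.evidence]


# Example: one non-academic unit's plan.
plan = AssessmentPlan(
    unit="Advising",
    mission="Support student completion.",
    vision="Every student has an advising plan.",
    goals=[Goal(
        statement="Raise fall-to-spring advising visits 10% by May 2026.",
        implementation="Outreach campaign to first-year students.",
        measurement_tool="Visit counts from the scheduling system.",
    )],
)
print([g.statement for g in plan.unmeasured_goals()])
```

Because evidence lives in a field rather than in prose, finding every unit that has not yet collected data becomes a one-line query instead of a document-by-document read; this is the collaborative, reportable environment the technology bullet is arguing for.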
Program Review
• Every five years, a comprehensive review of the unit
• Critical review of resources and processes
• Goals:
  • Determine what is working well and provide evidence of this
  • Identify areas that could be improved and create a plan to address them
  • Serve as evidence for funding, staffing, technological, and space needs
• THIS IS WHERE YOU TIE TO THE BUDGET!!

Program Review Report
• Create a handbook
• Chapters:
  • One: History
  • Two: Mission, Vision, and Goals
  • Three: Department Resources – Personnel
  • Four: Department Resources – Budget
  • Five: Department Resources – Facilities, Technology, and Equipment
  • Six: Outcomes

Administrative Support
• Both the assessment plans and the program review reports need to be created collaboratively, with senior administrators approving the reports
  • We use an escalation feature in our assessment platform to do this
• Lesson learned: some cabinet members have been more supportive than others. In divisions without cabinet support, some staff remained disengaged, or became disengaged, because they did not feel supported by their cabinet member

External Review
• Meetings: PRM, department, cabinet member, and stakeholders
• Process for choosing reviewers
• Provide:
  • Manual
  • Honorarium
  • Template for the report
• One- or two-day visit
• Take-home report writing versus afternoon writing

Discovery Report
• AKA "closing the loop" (hate that name!)
• Addresses what was learned in the process:
  • What areas are a strength?
  • What areas could be improved?
  • What resources are needed to improve?
• Use the external review report as EVIDENCE of needed resources

Annual Summary Report
• Annual report completed in bulleted format, following the same outline as the program review
• The department fills in what went on that year in each area
• This report is used when the actual program review is written
• It guards against the major pitfalls of program reviews:
  • Reviews that do not cover a full five years
  • Someone leaving and taking institutional memory with them

Annually…
• Annual Summary Report
• Update Assessment Plan
• After program review, update on recommendations and changes in a special section of the assessment plan

Response Letter and Presentations
• The cabinet member writes a letter to the unit addressing the discovery report
• The cabinet member meets with the unit
• The unit has the option to present its findings to the cabinet and the Board of Trustees
  • This is highly recommended, as it strengthens arguments for funding

Governance
• Both academic and non-academic assessment should have committees that are part of the governance process
• The chair of each committee is faculty (academic) or staff (non-academic)
• Assessment plans and program reviews are endorsed/approved by either the Senate or the College Council

Overall Challenges
• Flying the plane while building it
• No assessment capacity on the non-academic side of the house
• Not having institutional learning outcomes
• No technology to support the process
• Not enough staff
• A brand-new committee with no experience
• Length of the process

FLCC's Story, Cliff Notes Version

Positives of this Experience
• The committee was very involved in the creation of the process
• Group One became a pilot group where we modeled "assessing the assessment process"
• We didn't fall into the planning rut that many colleges fall into
• Purchase of assessment software and creation of a new staff position
• A very robust process has been created that many colleges (and Middle States) are impressed with!

Questions?