MSCHE Standards: Institutional Effectiveness (7) and Student Learning (14)
Dr. Jo Allen, Senior Vice President & Provost
Widener University

Overview of Presentation
• Operational terms
• Drivers of assessment
• Assessment of institutional effectiveness
• Assessment of student learning outcomes
• Questions and concerns

Assessment is …
the process of asking and answering questions that seek to align our stated intentions with documentable realities. As such, in higher education, it deals with courses, programs, policies, procedures, and operations.

What or who is driving assessment?
Accreditors…
• who distinguish reputable from non-reputable institutions and programs
• who ensure that institutional practices support the viability and sustainability of the institution and its offerings
• who represent disciplinary and institutional interests

Assessment drivers (cont’d.)
• The public: “Ivory Tower,” liberal bias, ratings/rankings?
• Legislators: responsive to citizens’ concerns about quality, costs, biases… or?
• Prospective faculty: quality and meaningful contributions to students’ lives?
• Prospective parents: real learning and preparation for careers (worth the money?)
• Prospective students: How will I measure up? And what kind of job can I get when I graduate?
• Funding agencies/foundations: evidence of an institution’s or faculty’s commitment to learning and knowledge, and evidence of [prior] success?

Make no mistake….

Assessment of Institutional Effectiveness vs. Student Learning
• Institutional effectiveness = the results of operational processes, policies, duties, and sites (and their success in working together) to support the management of the academy [Standard 7]
• Student learning = the results of curricular and co-curricular experiences designed to provide students with knowledge and skills [Standard 14]

Institutional Effectiveness: What Accreditors Want to Know
• Can you verify the effectiveness of operational contributors to a sustainable educational experience?
• Do you use data and other findings to improve the quality of your educational and operational responsibilities?
• Do you use those findings to align resources (financial, staff, curricular, co-curricular) to enhance desired outcomes?

What sensibilities point to institutional effectiveness?
• A well-articulated set of processes for critical functions
• A clear line of responsibility and accountability for critical functions
• An alignment of the importance of the function and sufficient resources (staff, budget, training, etc.) to support the function
• Evidence of institution-wide knowledge of those critical functions, processes, and lines of responsibility

What kinds of evidence point to institutional effectiveness?
• Well-managed budgets
• Accreditation and governmental compliance
• Clearly defined and supported shared governance (board, president, administration, faculty, staff, and students)
• Articulated communication pathways and strategies [transparency]
• Consensus on mission, strategic plan, goals, priorities, etc.
• Student (and other constituencies’) satisfaction

Sites of Institutional Effectiveness
• Processes [existence and transparency] (samples)
– Enrollment: admissions, financial aid, registration
– Curricular: advising, progress toward degree completion
– Budgeting: operations/salaries; capital; bond ratings and ratios; endowment management; benefits; etc.
– Planning: strategic planning, compact planning, curricular planning, etc.
– Judicial: education/training, communication, sanctions, etc.
– Residence Life: housing selection, training for RAs, conflict resolution/mediation
– Advancement: fund-raising, alumni relations, public relations, government/corporate relations, community relations, etc.

Sites of Institutional Effectiveness
• Units/offices of operations (samples)
– Advancement
– Admissions
– Bursar
– Registrar
– Athletics
– Deans (school/college)
– Center for Advising, Academic Support, etc.
– Campus Safety
– Institutional Research
– IT
– Maintenance

Measures of Institutional Effectiveness
How do we measure institutional effectiveness?
• Tangible data: audited budget statements, handbooks, enrollment data, institutional data
• Records/reports of activities and/or compliance
• Self-studies pointing to documented evidence
• Surveys of satisfaction, usage, attitudes, confidence, etc.
• Disciplinary accreditation reports

Student Learning Outcomes: What Accreditors Want to Know
• Have you articulated your institutional, general education, and disciplinary/course-based learning objectives?
– Are the objectives documented? Where?
– Are the objectives measurable?
• Have you actually conducted the assessment to see if students have learned what you expect them to learn?
• Did you use your results to maintain or improve your educational offerings?
• Did the changes make a difference?

Learning Outcomes?
• Civic engagement
• Diversity appreciation
• Communication skills
• Professional responsibility
• Ethics
• Critical thinking
• Collaborative learning
• Leadership
• Mathematical or quantitative competence
• Technological competence
• Scientific competence
• Research skills
• Cultural competence
• Interdisciplinary competence
• Civic responsibility
• Global competence
• Economic/financial competence
• Social justice

Measurable Objectives/Outcomes?
• Yes or no evidence of…
• The degree to which…
• Alignment evidence…

Sites of Evidence?
• Essays/theses
• Portfolios (evaluated by faculty or external readers)
• Quizzes
• Oral presentations
• Homework assignments
• Lab experiments
• Tests
• Journal entries
• Projects
• Demonstrations

With Assessment of both Institutional Effectiveness and Student Learning Outcomes…
• Conducted the assessment?
• Analyze, interpret, reflect: What does it all mean?
• Make decisions

Sample Decisions for Institutional Effectiveness
• Reallocate staff positions
• Re-engineer a process
• Cross-train employees
• Institute a new policy/practice

Sample Decisions for Learning Outcomes
• Alter the curriculum content
• Alter the teaching methodology
• Alter the assignments
• Alter the schedule
• Alter the course rotation
• Alter the students

Reassess: Did the alterations help?
• Better?
• Smarter?
• Clearer?
• Faster?
• Safer?
• More involvement?
• More effective?
• More efficient?
• More sustainable?
• More replicable?
Middle States…
• No prescription for your operational objectives or learning objectives
• No prescription for how you measure
• No prescription for what you do as a result

Middle States
• Evidence of operational objectives and learning outcomes
• Evidence of measures
• Evidence of analysis and action

Assessment
• Standard 7: How is the institution doing?
• Standard 14: What and how much are the students learning?

Assessment of Institutional Effectiveness & Student Learning Outcomes: What is similar?
• A commitment to doing the very best job possible under whatever conditions exist
• A commitment to recognizing ways that altering those conditions can affect the outcomes (e.g., labs, field placements, time of meeting, style of teaching)
• A commitment to recognizing that altering the outcomes can affect the conditions (e.g., student success in particular studies attracts more students of certain kinds)

Ultimately….
We hold ourselves and our colleagues accountable for articulating the intentions of our work and then measuring the realities, resulting in designing and implementing strategies for improvement over time.
• How are we doing?
• How can we do better?

QUESTIONS? Comments?