From Compliance to Improvement: Accountability and Assessment for California’s Community Colleges
Higher Education Evaluation and Research Group
Spring 2005
PLEASE COMPLETE THE PRE-SURVEY (2.15.05)

Introduction: Choosing Improvement over Compliance
[Diagram: possible relationships between Compliance and Improvement: compliance alone, compliance leading to improvement, improvement leading to compliance]

Saying YES to Assessment and Accountability
• acknowledges community colleges’ appropriate roles in equity, upgrade training, lifelong learning, and other unconventional missions
• gives faculty an appropriate voice in running their institutions
• promotes a form of research in teaching and the creation of improvements in teaching
• provides a foundation for widespread institutional improvement
• helps colleges become more effective learning environments

Accountability is NOT new in CA
California Community Colleges already operate under at least four accountability systems:
1) Partnership for Excellence (PFE), which uses system-level goals
2) The State Report Card, which assesses the performance of all publicly funded workforce preparation programs
3) The federal Vocational and Technical Education Act
4) The Workforce Investment Act

… and now WASC
• WASC is the last regional accreditation commission to require colleges to develop mechanisms for assessing and using student learning outcomes
• WASC lets LOCALITIES choose which aspects of SLOs to measure and how to measure them

A well-developed system of internal accountability → the ability to respond to external accountability requirements

Module I: Taking Stock
[Diagram: Cycle of Assessment and Improvement: Choose Improvement → Take Stock → Build Institutional Capacity → Norm Student Learning Outcomes → Norm Assessments → Identify Improvement Strategies → Make Improvements and Evaluate; this module highlights CHOOSE IMPROVEMENT, then TAKE STOCK]

THE CRUCIBLE OF THE CLASSROOM
[Diagram: Instructors, Students, Content/Curriculum]

TAKING STOCK: STUDENTS
• What do you know about your students’ attitudes and beliefs about learning?

What We’ve Learned About Student Values and Attitudes
• Students are credentialists, wanting credit/credentials but not necessarily the learning the credential signifies (grades matter more than content)
• Students are highly vocationalist, using college as a route to employment (relevance matters more than intellectualism; students continuously make cost-benefit calculations)

What We’ve Learned (cont’d)
• Students often undermine their own learning outcomes. They are frequently fearful: afraid of being caught unprepared, isolated, or intimidated by professors. They manage their fear in unproductive ways (keeping quiet in class, avoiding hard classes, scaling down ambitions, failing to submit work even when it is completed, dropping or stopping out)
• Students define learning as the accumulation of facts

Activity: Students — 1”
• Into what key groups do you subcategorize your students?
• How do you identify the changing needs of your student groups?
• What are your students’ beliefs and values about learning? How do you know?
• What is the forum in which you discuss categories, changing needs, and attitudes of students?

TAKING STOCK: INSTRUCTORS
• What do you know about instructors’ attitudes, beliefs, and knowledge about teaching and learning?
• Is teaching “community property”?
ACTIVITY: INSTRUCTORS — 1”
• What are faculty attitudes and knowledge about learning, teaching, assessment, “teaching as community property”, and continuous improvement?
• Is there a forum for discussing examples of and reasons for student success or lack of success, and for discussing teaching ideas and methods?
• How have faculty previously developed, shared, and implemented student learning outcomes and assessments?

TAKING STOCK: CURRICULUM
• What are the external influences on curriculum?
• How consistent is the curriculum?
• What is the role of employer feedback?

ACTIVITY: CURRICULUM — 1”
• Which elements of your critical curriculum are set by external agencies?
• How consistent are expectations across sections of a course?
• When and where do instructors norm content and assessment?
• Do instructors collaborate on and/or peer-review learning outcomes for critical courses?

TAKING STOCK: INSTITUTIONAL SUPPORT
• What regulations impact student learning outcomes?
• How do local practices and policies impact student learning outcomes and assessment?

ACTIVITY: INSTITUTION — 1”
• What federal, state, district, and/or professional-trade-industry regulations impact student learning outcomes, their assessment, and improvement on your campus?
• How do local practices and policies (faculty time; professional development policies; hiring, promotion, and tenure policies; the “teaching credential”) impact student learning outcomes and assessment?

TAKING STOCK: CONSISTENCY/ALIGNMENT
• How consistent are faculty’s expectations of student outcomes across sections of a course, across courses, and across general education alternatives, certificates, and degrees?
• What are the forums for discussing expectations?

ACTIVITY: CONSISTENCY — 1”
• How consistent are the values of students and instructors?
• How consistent is each instructor’s use of the curriculum with her/his own values and beliefs about teaching and learning?
• How well articulated are the sections of a course and/or the courses in a sequence?

TAKING STOCK: EXISTING ASSESSMENTS
• What assessments are in place now (placement tests, capstone projects, portfolios, paper-and-pencil tests, etc.)?
• How do those assessments contribute to improving student learning outcomes?

ACTIVITY: ASSESSMENTS — 1”
[Worksheet with columns: Name of assessment | Purpose of assessment | Effectiveness of assessment]

TAKING STOCK: GOVERNANCE OF ASSESSMENT
• Who coordinates student learning outcomes, assessments, improvement strategies, and continuous improvement?

Taking Stock of Assessment Governance
If assessment is to be continuous, ongoing, and stable, then it must be overseen by a group that takes responsibility for all aspects of assessment. In a self-reforming institution focused on instruction, the Assessment Committee would be the central committee of the college, so that concern over the nature and effectiveness of instruction drives all other aspects of the college. The Assessment Committee should therefore have responsibility not only for creating a series of assessments but also for overseeing the subsequent stages in the assessment system.

ACTIVITY: GOVERNANCE — 1”
Student Assessment Leadership Team
[Worksheet with columns: Membership now | Add? | Rationale]
Reporting Out
“When we take stock [student attitudes & beliefs; instructor attitudes & knowledge; consistency of pedagogy & curriculum; local practices & policies; existing assessments; local governance of SLOACs], we believe we have strength from _____ and we want to build __________.”

A Primer: Setting and Assessing Student Learning Outcomes

Definition: SLO¹
“Robust” student learning outcomes incorporate:
• a behavioral objective — what a student should know, value, and be able to demonstrate/perform
• the conditions under which performance will be assessed — simulation, lab, portfolio, writing task
• criteria/performance standards/primary traits for assessing student performance
• a rubric for scoring student performance
¹Adapted from Scroggins, B. (2003, 2004). Targeting Student Learning. Modesto Junior College. <http:cai.cc.ca.us/workshops/SLOFocusOnResults.doc>

Confusion: Terminology??
Course objectives versus student learning outcomes: generally, in California, a course objective states what a student will demonstrate, represent, or produce at the end of a course. An SLO also incorporates the conditions under which assessment will occur (test, portfolio, demonstration, etc.) as well as the evidence/criteria for judging performance.

SLO — SLOA — SLOAC
• Among departmental/institutional faculty, NORM:
 - objectives for student performance (what students should know, be able to do, and value)
 - conditions under which performance will be assessed (simulation, portfolio, lab experiment, writing assignment)
 - traits/criteria for assessing student performance and rubrics for scoring
• Implement the assessment plan
• Compile and analyze the pattern of results from scoring student performance
• COLLECTIVELY SET AN IMPROVEMENT PLAN
• IMPLEMENT THE IMPROVEMENT PLAN
• CONTINUE THE CYCLE

Norming¹
• “Nested” discussions, decisions, and actions
• Collaboratively authored and collectively accepted expectations for student learning and assessment
• Norming does NOT mean identical learning activities, emphases, or pedagogy — it means C&C
¹Maki, P. L. (2004). Assessing for Learning. Sterling, VA: Stylus, in association with the American Association for Higher Education.

Norming ~ Higher Ed Culture
• SLOs, criteria/primary traits, and rubrics are set collaboratively by full-time and adjunct faculty
• Outcomes and examples of student work are shared and peer-reviewed
• Improvement alternatives are agreed upon
• Autonomy, academic freedom, and professional discretion and expertise are respected

THE BASICS
• Validity: the instrument/procedure measures what it is intended to measure
• Reliability: inter-rater or test-retest consistency

Types of Data
• Quantitative
• Qualitative
Either way, data collection must be SYSTEMATIC.

Referencing: Norm & Criterion
• Norm-referenced assessments measure individual outcomes relative to the sample of people taking the test (grading on a curve)
• Criterion-referenced assessments measure individual outcomes against set criteria or standards (mastery, licensure)
** Criterion-referenced assessments are appropriate for measuring improvement in SLOs. (An illustrative scoring sketch follows below.)

Direct vs. Indirect Measures
• Direct measures are reasonable replications of real-world tasks (authentic assessment: students DO IT)
• Indirect measures are proxies for demonstrated performance: grades, persistence, transfer (legislated measures are often proxies)

External vs. Internal Accountability
• External accountability is used to meet the requirements of funding/regulatory agencies
• Internal accountability is used to improve student learning within courses, programs, or degrees
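Illustrative sketch: norm- vs. criterion-referencing and inter-rater agreement
The following sketch is not part of the original workshop materials. It is a minimal illustration in Python, assuming a 0-4 rubric scale, an assumed mastery cut score of 3, and hypothetical student names, scores, and rater data. It shows why criterion-referenced results (percent of students meeting a fixed standard) speak directly to SLO improvement, why norm-referenced results only rank students against one another, and what the simplest check of inter-rater reliability (percent agreement between two scorers) looks like.

# Hypothetical rubric scores (0-4) for one SLO; names, scores, and the
# mastery threshold are assumptions for illustration only.
from statistics import mean

scores = {"Ana": 4, "Ben": 3, "Chu": 2, "Dee": 3, "Eli": 1}
MASTERY = 3  # assumed criterion: a rubric score of 3 or higher meets the SLO

# Criterion-referenced: each student is compared to the fixed standard.
meets = {name: s >= MASTERY for name, s in scores.items()}
pct_meeting = 100 * sum(meets.values()) / len(meets)
print(f"Criterion-referenced: {pct_meeting:.0f}% of students meet the SLO")

# Norm-referenced: each student is compared to the group (here, the class
# mean), which ranks students but says nothing about mastery.
avg = mean(scores.values())
above = sum(s > avg for s in scores.values())
print(f"Norm-referenced: class mean {avg:.1f}; "
      f"{above} of {len(scores)} students score above it")

# Inter-rater reliability, simplest form: percent agreement between two
# raters scoring the same student work with the same rubric.
rater_a = [4, 3, 2, 3, 1]  # hypothetical ratings
rater_b = [4, 3, 3, 3, 1]
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Inter-rater agreement: {agreement:.0%}")

In practice, a department would substitute its own rubric scale and collectively agreed mastery criterion, and might replace simple percent agreement with a chance-corrected statistic such as Cohen's kappa.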
A well-developed cycle of internal accountability → the ability to respond to external accountability requirements

FORMS OF ASSESSMENT

Assessments
• Authentic assessments: capstone projects, demonstrations, simulations, portfolios
• Criterion-referenced tests (licensure exams)
• Norm-referenced tests (graded on a curve)

Embedding Assessments
• Assessment is woven into existing courses
• Identify SLO demonstration points
• Retain and analyze results
• SLOs are aligned with certificate or degree goals

EXAMPLE: Embedding Assessments in a Program
Course: Eng 101
 Assessment: Write for audience
 Criteria: Writes a research paper appropriate to a specific audience
 Rubric: A = uses language and concepts appropriate for a professional, technical, or literary audience
Course: Social Issues
 Criteria: Designs a Valuing Diversity workshop for a specific setting
 Rubric: A = incorporates ethnic, lifestyle, and gender diversity; incorporates activities appropriate for that specific setting; identifies potential conflicts among diverse groups

Levels at which Outcomes Can Be Measured
• Targeted population of students
• Lesson/unit of study
• Program
 – Occupational certificate
 – Major
 – Department/division
• Associate degree (A.A./A.S./A.A.S.)
• Institutional

Targeted Populations
• For VTEA and some special grants, a college may wish to focus on retention, persistence, and/or achievement for special populations of students
 – CalWorks
 – First generation
 – Limited English proficient

Lesson/Unit-Level Assessment
• Classroom assessment techniques (Cross & Angelo): systematic but informal, frequent gathering of information about content and pedagogy:
 - What was hard to understand today?
 - How did this teaching method work for you?

Course-Level SLOs
• Most campuses are emphasizing course-level assessments as part of program-level SLOACs.¹
¹Friedlander, J., & Serban, A. (2004). Meeting the Challenges of Assessing Student Learning Outcomes. In J. Friedlander & A. Serban (Eds.), Developing and Implementing Assessment of Student Learning Outcomes. New Directions for Community Colleges, No. 126. San Francisco: Jossey-Bass.

Program and Institutional Level SLOs
• Courses are aligned to meet program goals and expectations.
• Program, major, or general education goals are aligned to meet institutional goals and expectations.

Norena Norton Badway, Ph.D., Principal
Phone: 209-951-7477
Home office: 209-946-2168
University office: 209-601-7121
Email: badway@aol.com, nbadway@pacific.edu