Cabrillo College Participatory Governance Handbook

Cabrillo College Participatory Governance Committee Bi-Annual Report
Submitted to CPC
Committee Name: ARC
Date of Approved Report: 5/20/15

Please list the committee goals (milestones) for the semester and indicate which are:
1. Completed
2. In Progress
3. Continued to Next Semester
4. Deleted
(Please just put the number next to the goal.)

If completed, please answer the following question and add it to the goal summary: "If the goal/milestone is completed, please give three to five sentences on how the completion of the goal/milestone has helped to make Cabrillo College a more effective learning institution."

If in progress, please list the items completed and what needs to be completed next term (with a tentative timeline for completion if possible).

If continued to next semester, please itemize the steps to be taken to complete the goal/milestone and give a tentative timeline for completion.

If deleted, please give a one-sentence response as to why the goal has been deleted.

Thank you to the Committee from CPC for your diligence this semester. You can either send this summary to the committee as part of its final information review, or any member or all of your committee can attend the CPC meeting to make a formal presentation of the report. Please contact Cheryl Romer for deadline submittal dates and possible oral report timelines.

I. Goal One: Fulfill ACCJC Recommendation #2 – 2 (in process)
Action Items Completed:
1. Create a plan with a timeline to fulfill Recommendation #2.
2. Take the plan through the shared governance process for approval by the Faculty Senate and CPC.
3. Complete items on the timeline for Spring 2015.
Action Items to be Completed:
1. Complete items on the timeline for Fall 2015.
2. Complete items on the timeline for Spring 2016.

II. Goal Two: Prepare ARC 2014 Annual Report – 2 (in process)
Action Items Completed:
1. Analyze assessment data and processes in Instruction, the Library, Student Services and Administration.
2. Write an annual report detailing assessment trends and themes; make recommendations to improve student learning and campus assessment processes based on the analysis.
3. Share the report with the Faculty Senate and CPC.
Action Items to be Completed:
1. Share the report with the Governing Board by July 2015.

III. Goal Three: Prepare Best Assessment Practices Manuals – 2 (in process)
Action Items Completed:
1. Select best practices to feature in the manuals by the end of Spring 2015.
Action Items to be Completed:
1. Complete the manuals by December 2015.
2. Publicize the manuals in Spring 2016.
3. Post the manuals on the SLO web site in Spring 2016.

IV. Goal Four: Assess Implementation of the CurricUNET SLO Module – 2 (in process)
Action Items Completed:
1. Assess departmental training and make any needed improvements.
Action Items to be Completed:
1. Brainstorm solutions to any developing problems with the SLO module itself and work with CurricUNET to implement them by June 2015.

V. Goal Five: Analyze Emerging Issues/Recommendations in Previous ARC Annual Reports – 1 (completed)
ARC analyzed all its recommendations and the emerging student learning issues it noted in the past seven reports (2007-2013). The analysis documents how Cabrillo has become a better learning institution: student academic issues (especially lack of preparedness) have remained the same, but the approach to solving them has changed, becoming more sophisticated, interdepartmental and college-wide.
In addition, ARC found that of the 63 recommendations it has made to improve student learning and campus assessment processes, 44 have been completed, 13 are in process, 5 are on-going and only one was abandoned since it was found to be impossible. Finally, ARC found that the college has moved from the creation of assessment processes to quality assurance: how we talk about assessment has changed, shifting from a focus on the instructors to a focus on what students do.

VI. Goal Six: Assess Institutional Effectiveness of ARC – 2 (in process)
Action Items Completed:
1. Work with the PRO Office to determine metrics for assessment.
2. Undertake a committee survey.
Action Items to be Completed:
1. Analyze survey results in Fall 2016.

Outcomes Assessment Review Committee 2014 Annual Report

Introduction
At Cabrillo, we assess student learning outcomes and administrative unit outcomes in a regular, on-going rotation; their results are analyzed and included in departmental Annual Updates, fueling yearly planning. The six-year Program Planning process provides an opportunity for departments to analyze these results in depth, guiding their long-term plans and recommendations. Each year, the Outcomes Assessment Review Committee (ARC) reviews the Program Plans submitted to Instruction, Student Services and Administration, looking for common themes and broad trends in each department's analysis of its assessment results. This report reflects on the insights discovered by the following programs:

College Sector – Program Plans Submitted
• Instruction: Transfer and Basic Skills – Communication; Geography and Meteorology; Chemistry; Geology/Oceanography/Environmental Sciences
• Instruction: Career Technical Education – Medical Assisting
• Instruction: Library – Not scheduled this year
• Student Services – Accessibility Support Center
• Administrative Units – President's Office; Student Success and Support Programs; Tutorials; Warehouse

2014 Program Plan Assessment Results: Emerging Needs & Issues

Instruction
Successes
The faculty in Instruction learned that most students were successful at mastering the course and certificate SLOs as well as the skills required for each of the college core competencies. Some successes were directly linked to classroom activities and student support:
• Peer tutoring and in-class hands-on activities in several departments were shown to positively correlate with improved student learning. This observation has led some departments to focus on creating more hands-on activities in big lecture classes.
• The length of time spent in science lab sessions positively correlated with higher student success.
• Students who received individual attention from student assistants also performed better.

Challenges
When students struggled with SLOs, some common themes appeared. Departments noted the following areas of concern and proposed innovative and often inter-departmental solutions to address them:
• Problems with writing and grammar, especially the writing of lab reports. Proposed solutions included giving more writing assignments in science courses (especially lab reports or research papers), requiring drafts of papers, creating a possible 1-unit writing course for the sciences, working with the English department to design science-oriented English courses, and more referrals to the Writing Center.
• Lack of basic math and other numerical skills in non-math classes. Proposed solutions included working more closely with the Math department.
• Issues in critical thinking, especially applying concepts to problems and situations. Proposed solutions included creating class activities that require more hands-on learning, using clicker response systems to monitor student thinking and comprehension, and going out into the field, where students would be required to apply what they have learned in the classroom.
• A number of departments perceived potential student benefit from establishing dedicated, subject-specific study/learning spaces. The college will need to examine this idea as part of its facilities planning process.

Student Services
Successes
Student Services professionals learned that their services made a great difference in student success.
• SLO assessment results showed that students greatly benefited from in-person information provided by the Accessibility Support Center.
• A pre- and post-test revealed that students knew how to challenge a pre-requisite on their own after being assisted in completing the process the first time.

Challenges
Some challenges emerged as a result of SLO assessment:
• Students revealed that faculty and staff were not uniformly aware of the steps involved in challenging a pre-requisite. Proposed solution: Post more materials on the web and distribute them to Division offices to help faculty, staff and students become more aware of the steps required.
• Students with disabilities need more support classes to help them succeed. Proposed solution: The Accessibility Support Center proposes to work with Counseling to create a section of Counseling and Guidance 51 for this population.

Administrative Units
Successes
Administrative Units, new to AUO assessment, found some revealing successes from their assessment efforts:
• An assessment of Assessment, Orientation and Counseling by the Counseling and Educational Support Services Division revealed great success from the new BYMA (Before You Make An Appointment) efforts, facilitated by the CESS office.
• The Warehouse discovered that those who used its services were very satisfied with them, but many campus faculty and staff were unaware of what the department had to offer.

Challenges
• The CESS Dean identified a need for regular direct communication with all members of some small departments, to improve awareness of and involvement in new campus issues and changing requirements.
• Students coming to Tutorials benefit less from their sessions if they come unprepared (without an assignment, etc.). Proposed solution: Tutorials will develop pre-semester communications and increase in-class visits to better inform students of their role in the tutoring process.

Commendations
ARC salutes the Geography Department for an excellent and sophisticated analysis of its assessment results. The department detailed specific issues faced by students and its creative solutions to meet them, which led to concrete goals and recommendations for the program plan overall.

ARC lauds the Chemistry Department for a superior analysis of assessment results and creative solutions to student learning issues. The work done for the Chemistry 12 class in assessment and analysis was particularly outstanding.

ARC commends the Accessibility Support Center for implementing a new and highly successful process (a pre- and post-test) to assess its outcomes.

ARC applauds the Counseling and Education Support Services Division for an excellent use of assessment tools to get actionable results.
Assessment Facts and Figures The college tracks departmental participation in the assessment process, encouraging robust contribution from all members. The tables below show what was accomplished and who participated. Because assessment in Instruction takes place over a number of years, an average participation rate was calculated. Instruction: Basic Skills/Transfer Department Assessment Task Communication Geography and Meteorology Chemistry Geology/Oceanography/ Environmental Sciences Medical Assisting Core 4 100% 100% Course SLOs 50% 100% Assessment Participation Full Time Adjunct Faculty Faculty 100% 9% 100% 92% Discussion Participation Full Time Adjunct Faculty Faculty 31% 26% 100% 75% 100% 100% 100% 100% 100% 100% 60% 100% 100% 100% 40% 50% 100% 100% 100% 50% 100% 100% ARC is pleased to note that the assessment of every course SLO and each of the Core 4 was accomplished by all transfer departments; all full time faculty participated in the process in all but one. Several departments are still struggling to involve adjuncts in both the assessment process and the discussion of its results. This has been an on-going issue at the college from the onset of the SLO process, but the percentages of adjunct participation are improving. The one department which lagged in assessment activities and participation believed it was accomplishing its SLO tasks properly, but now understands the process better and has developed a plan to complete the necessary work and involve everyone in the department in SLO assessment. 4 Instruction: Career Technical Education Department Assessment Task Assessment Participation Certificate Course Full Time Adjunct SLOs SLOs Faculty Faculty Medical Assisting 100% 100% 100% 50% Discussion Participation Full Adjunct Time Faculty Faculty 100% 100% The Medical Assisting program mirrors the successful rate of completion in Transfer departments. It should be noted, however, that the MA program plan revealed that their first attempt at certificate and course SLO assessment resulted in all SLOs being rewritten, as they were deemed inappropriate or impossible to assess. The new SLOs are now in the process of being assessed, and have been found to be much more functional and illuminating about student needs and issues. Student Services Department Accessibility Support Center CESS Division Office (Matriculation) SLO # of SLOs % of SLOs assessed 1 1 100% 100% % of department personnel discussing results 100% 100% ARC is delighted that assessment in Student Services continues to generate robust departmental participation and accomplishment. Since the Student Success and Support Programs department has both an SLO and an AUO, it is listed under both charts. Administrative Units Administrative Unit/ Department President’s Office CESS Division Office (Matriculation) AUO Tutorials Warehouse # of AUOs % of AUOs assessed % of department personnel discussing results 3 1 100% 100% 100% 65% 2 1 100% 100% 100% unknown ARC is pleased that departments in Administration continue to assess AUOs for the first time and also involve the entire department in discussing those results. Analysis of Cabrillo’s Outcomes Assessment Process One of ARC’s charges is to analyze the campus’ assessment process and to make recommendations for how to improve it. This year the committee noted the following: 5 Instruction The committee observed three primary findings about the SLO process in Instruction: 1. 
Quality Assurance: All departments are robustly participating in assessment activities, but some are accomplishing it with greater sophistication and depth. Some departments showed an outstanding ability to analyze assessment results and propose solutions to student learning issues that were creative and student-centered. Other departments do not use their SLO assessment analyses as effectively, and are still primarily proposing teacher-centered interventions rather than ones that focus on students.

2. Departmental Leadership: Some departments are to be commended for meeting the challenges of past leadership gaps and past lack of participation in the SLO assessment process by taking charge and making the process meaningful for the entire department. Other departments still need to own the process.

3. Assessment Analysis: When their analysis determined that students were mastering their course SLOs, some departments chose to reexamine the efficacy of their assessment instruments, embracing the process to make it more useful. Other departments inferred that their students were completely successful, and thus no further analysis or effort was needed on their part.

ARC recommends the following to build on the gains that have been made and to strengthen those departments that are less experienced with SLO analysis:

A. Revise the program planning instructions so that the SLO section is student-focused rather than faculty-focused. Ask the Deans and program planning mentors on the Council of Instructional Planning to point out this change as they work with their department chairs to write program plans.

B. Undertake an informational campaign for departments in need to help them better analyze their assessment results. This would include:
• Using the SLO training for department chairs that will occur in spring 2015 as part of the Year of Instruction, and the CurricUNET SLO module training that will be occurring over the next four semesters, as venues for sharing more sophisticated methods of SLO analysis.
• Revising the program chair materials on the SLO web site.
• Sending a copy of ARC's annual report directly to program chairs, to give them a better idea of what is viewed as exemplary SLO analysis.
• Assembling a best practice guide for SLO assessment from some of the work done by exemplary departments in Instruction.

Student Services and Administrative Units
ARC found that Student Services and Administrative Units had two similar issues with their assessment processes.
1. Departments are still struggling to find the right assessment tools. They are working with the PRO office to create better metrics that will provide more useful information.
2. Departments could use some help with analyzing the data that they generate.

ARC recommends the following to help strengthen the progress that is being made in both sectors:
A. ARC should provide a best practice guide for Student Services SLO assessment and Administrative AUO assessment that shows the available tools and how to use the data, identify conclusions, generate goals and recommendations based on the data, implement them, and then look at the new data.

Recommendations

2014 Recommendations for Teaching and Learning
The recommendations below should be pursued by the Faculty Senate, IAC, Student Services Council, or college shared governance committees, as appropriate.

Recommendation: A critical issue in all courses (transfer, CTE and basic skills) is a deficiency of basic reading, writing, and numerical skills.
This on-going, critical issue, which ARC has noted for the last seven years, negatively impacts student success across the campus. Efforts to address this issue (including the Equity Plan, the Student Success and Support Program and campus professional development) should be coordinated, with on-going dialogue between all involved parties, including transfer and CTE faculty.
Responsible Party: ARC, Faculty Senate, CPC, Equity Committee, Student Success Committee, Basic Skills Committee, PETL Committee
Time Line: Spring 2015

2014 Recommendations for SLO Assessment Processes

Recommendation: In Instruction, revise the program planning instructions so that the SLO section focuses on the specific plans departments have developed to help students, rather than on professional development. Ask the Deans and CIP program planning mentors to reinforce this change as they work with their departments to write program plans.
Responsible Party: CIP and SLO Coordinator
Time Line: Discuss in spring 2015

Recommendation: Undertake an informational campaign to help departments better analyze their assessment results.
Responsible Party: SLO Coordinator
Time Line: Spring and fall 2015

Recommendation: Informational Campaign: Use the SLO training for department chairs in spring 2015 and the CurricUNET SLO module training over the next four semesters as venues for teaching more sophisticated SLO analysis methods.
Responsible Party: SLO Coordinator
Time Line: Spring 2015

Recommendation: Informational Campaign: Revise the program chair materials on the SLO web site to stress this approach to analysis.
Responsible Party: SLO Coordinator
Time Line: Spring 2015

Recommendation: Informational Campaign: Send a copy of ARC's annual report to program chairs, to give them a better idea of what is viewed as exemplary SLO analysis.
Responsible Party: SLO Coordinator
Time Line: Spring 2015

Recommendation: Informational Campaign: Assemble a best practice guide for SLO assessment from some of the extraordinary work done by exemplary departments in Instruction.
Responsible Party: ARC
Time Line: Fall 2015

Recommendation: Provide a best practice guide for Student Services SLO assessment and AUO assessment that shows the available tools and how to use the data, identify conclusions, generate goals and recommendations based on the data, implement them, and then look at the new data.
Responsible Party: ARC
Time Line: Fall 2015

Progress on Last Year's Recommendations
ARC made several recommendations last year to improve the campus' assessment processes. Six of the twelve recommendations were completed. The rest are in process.

Recommendation: Utilize projects from the Student Success Committee as opportunities for faculty to share best practices across departments.
Responsible Party: Student Success committee; Staff Development committee
Notes: The Faculty Consultation Network continues to lead this work.

Recommendation: In Instruction, create a formal program chair training that includes how to lead the department in SLO assessment.
Responsible Party: Deans, VP of Instruction, Program Chairs
Notes: Part of the Year of Instruction training in spring 2015.

Recommendation: In Instruction, hold a FLEX workshop for program chairs to get help on specific issues from the SLO Coordinator.
Responsible Party: SLO Coordinator & Staff Development Committee
Notes: Was not well attended. Workshops may not be the best venue.

Recommendation: In Instruction, revise instructions for the SLO section in program plans.
Responsible Party: SLO Coordinator and CIP

Recommendation: In Instruction, revise Core 4 Assessment Analysis forms to include separate discussions of student success in departmental degrees, the GE program and the college's institutional outcomes.
Responsible Party: SLO Coordinator, CIP, Faculty Senate
Recommendation: In Instruction, create a chart to better map what is assessed by transfer and CTE departments.
Responsible Party: ARC

Recommendation: Create an informational campaign for how the college defines a program.
Responsible Party: ARC and the Faculty Senate
Notes: See Response to ACCJC Recommendation #2.

Recommendation: Create an information campaign about the services available from the Planning and Research Office.
Responsible Party: PRO
Notes: This is an on-going effort.

Recommendation: In Student Services, fill out assessment analysis forms in addition to the annual updates to record SLO assessment results.
Responsible Party: Student Services and the SLO Coordinator

Recommendation: In Administration, after an AUO is assessed, continue to refine assessment instruments.
Responsible Party: Administration and the SLO Coordinator

Recommendation: Analyze all ARC annual reports, noting trends that have emerged over time and the college's progress in addressing them.
Responsible Party: ARC
Notes: See Analysis of ARC's Past Recommendations.

Recommendation: Undertake a self-evaluation of the committee.
Responsible Party: ARC

Analysis of ARC's Past Recommendations
One of ARC's recommendations last year was to undertake an analysis of all its previous recommendations, assessing the college's progress in addressing them. Sixty-three recommendations were scrutinized. The committee found that of those, 44 had been completed, 13 were in progress, 5 were on-going and one was abandoned since it was found to be impossible. ARC also noted the following trends:
1. Academic issues noted as a result of SLO assessment have remained constant. Students are underprepared academically, especially in reading and writing, and do not possess the habits of mind to enable them to be successful students.
2. Efforts to improve these academic issues have changed, moving from a focus purely on professional development to some solutions across disciplines. ARC noted that a college-wide approach to these issues is emerging through projects that will be developed through the Equity Plan and the focus on Habits of Mind by the Professional Engagement committee for the next academic year.
3. Recommendations demonstrate a constant effort to build an infrastructure to sustain campus-wide assessment efforts. That effort has been a success, but as noted earlier in this report, there is still an uneven quality in assessment across departments. In Instruction, department chair leadership is the key. In Student Services and Administrative Units, more experience with assessment will lead to improved metrics and measures.
4. The college has moved from the creation of the assessment process to quality assurance. How we talk about assessment efforts has changed, shifting from a focus on the instructors to a focus on what students do. This is a significant paradigm shift. Richer and more sophisticated analysis of assessment results is occurring, especially in some departments in Instruction. Interdisciplinary solutions are now more often sought to solve student academic issues.

The detailed analysis and a list of all the recommendations appear in Appendix B of the report.

Response to ACCJC Recommendation #2
After our accreditation site visit last fall, ARC was assigned to develop a plan to respond to Recommendation #2: In order to improve effectiveness, the team recommends that the college clarify and document its definition of a program and include the evaluation and improvement of all degree offerings in the program review and planning process. ARC has developed an action plan that has been approved by the Faculty Senate and other campus governing bodies to fulfill all aspects of this recommendation.
Many of last year's ARC recommendations were part of that plan. The time line for implementation of the plan will be finalized in spring 2015, although many aspects of it are already in effect. See Appendix C for the full plan and timeline.

Institutional Effectiveness
ARC's role in assessing the college's Institutional Effectiveness ended at the close of spring 2014 with the creation of the Institutional Effectiveness Committee. ARC was one of the partners in helping to create that committee and define its charge. Its recommendation last year to assess its own effectiveness has now become one of the duties of the Institutional Effectiveness committee, although ARC has worked with the PRO office to define the most appropriate metrics to determine its efficacy. The assessment will occur in spring 2015.

Respectfully Submitted by Marcy Alancraig, Dale Attias, Ivan Ayala (student representative), Justina Buller, Carter Frost (student representative), Jean Gallagher-Heil, Paul Harvell, Denise Lim, Victoria Lewis, Isabel O'Connor, Margery Regalado, Georg Romero, Kathie Welch and Terrence Willett

Appendices:
A. ARC's History and Charge
B. Analysis of ARC's Past Recommendations
C. Action Plan to Meet ACCJC Recommendation #2

Appendix A: ARC's History and Charge

In response to the change in accreditation standards in 2002, two shared governance committees, the Learner Outcomes and Accreditation Committee (2001-2003) and the Accreditation Planning Committee (2003-2005), along with the Cabrillo Faculty Senate, designed a comprehensive SLO assessment plan: assessment of student learning outcomes occurs in all sectors of the college as part of on-going Program Planning (departmental review) processes. The college was divided into five assessment sectors -- Transfer and Basic Skills Instruction, Career Technical Education, Library, Student Services, and Administration -- which were each to measure their contributions to students' mastery of the college's core competencies. In 2012, after years of grappling with how to measure their contribution to student learning, administrative departments switched to writing and assessing administrative unit outcomes. Each sector of the college creates its own method to assess SLOs and/or AUOs. See the SLO website for a detailed description of the methods used in each area (https://sites.google.com/a/cabrillo.edu/student-learningoutcomes/home).

Programs and services undergo Program Planning on a rotating basis; only a few departments complete the process each year. For example, in Instruction, approximately twelve of its fifty-six programs write a program plan in a given year. Because of the number of programs within its purview, the Instructional component began by phasing in SLO assessment, starting with the college core competencies. When this set-up phase was completed, Instruction moved to institutionalizing the process, asking that departments measure student mastery of every course, certificate and degree SLO within the six-year program planning cycle. This staggered schedule of assessment is called the Revolving Wheel of Assessment; every department is currently embarked on some stage of its repeating cycle (see the SLO website for a detailed description of the Wheel). Now Instruction has focused its efforts on quality assurance, creating processes and tools to ensure excellence and full compliance with its SLO procedures.
Student Services also phased in SLO assessment, beginning by writing and then revising their departmental SLOs, and now assessing them. A grant received by the college, the Bridging Research Information and Culture Technical Assistance Project (sponsored by the Research and Planning Group and funded by the Hewlett Foundation), provided needed training in Student Services assessment methods during Spring 2011. By the next year, all Student Services departments had assessed each of their SLOs, leading to new insights and ways to improve services.

Administration (composed of departments or administrative offices in the President's component, Administrative Services, Student Services and Instruction) spent five years discussing and identifying how their departments contribute to student mastery of the college core competencies and how to measure it. Because they provide a wide range of services that enable teaching and learning to occur, but are not directly involved in the formal learning process, their role in assessing SLOs has been difficult to define. In Spring 2012, Administration switched to measuring Administrative Unit Outcomes. Cabrillo defines an Administrative Unit Outcome as a description of what all the users of an Administrative service can do, know, or understand after interacting with that office or department – it is user centered, a description of what the service provides for others. Unlike some schools across the state, a Cabrillo AUO is not an administrative unit goal. Almost all administrative departments have written AUOs and are beginning to assess them.

No matter the assessment sector, all college departments that write a Program Plan by June in a given year forward their assessment reports to the Outcomes Assessment Review Committee. This committee, a subcommittee of the College Planning Council, is chaired by the Student Learning Outcomes Assessment Coordinator and is designed to include representatives from the Student Senate, Faculty Senate, CCEU, CCFT, and a manager, along with representatives from Administration, Student Services, Library, and Instruction (both Transfer & Basic Skills and CTE). The Campus Researcher and Accreditation Liaison Officer serve as ex officio members of the committee.

The Outcomes Assessment Review Committee (ARC) oversees, analyzes and evaluates all campus SLO and AUO assessment activities. It reviews the yearly assessment results in Instruction, Student Services, the Library and Administration, looking for common themes and broad trends. In addition to analyzing the collective contents of the assessments submitted each year, ARC evaluates its own function and all assessment processes on campus. ARC writes a report about its analysis, submitting it to campus governing bodies authorized to act upon ARC's recommendations, including the Governing Board, the College Planning Council, the Faculty and Student Senates and both unions, CCFT and CCEU. When needed, the committee is empowered to initiate a college-wide dialogue process to analyze and solve broad issues about student learning that have been revealed by SLO assessment results across the campus. For more detailed information on ARC's charge, membership and duties, please see the SLO website.

Appendix B: Analysis of ARC's Past Recommendations (2007-2013)

Trends
1. Academic issues noted as a result of SLO assessment have remained constant. They are:
a. Students are underprepared academically, especially in reading and writing.
Math is not mentioned as frequently, though affected departments have made note of student lack of ability in this area as well.
b. Students do not know how to be students.
2. Efforts to improve the above academic issues have changed, moving from a focus purely on professional development within departments to some solutions across disciplines. For example:
a. Flex workshops on plagiarism and the creation of a campus honor code.
b. Making use of projects from the Student Success Committee and the Faculty Consultation Network to share best practices and pedagogical approaches.
3. Recommendations demonstrate a constant effort to build an infrastructure to sustain campus-wide assessment efforts.
a. Every year, several recommendations focused on improving the assessment process, including: departmental, program chair and individual training in assessment methods, professional development on assessment, creation of the SLO web site and the workbooks it features, revision of reporting forms, and the quantitative data reporting pilot.
b. Developing the CurricUNET SLO module.
c. Instruction led this effort, with additional progress made first in Student Services and now in Administration.
4. The college has moved from the creation of the assessment process to quality assurance.
a. How we talk about assessment efforts has changed, shifting from a focus on the instructors to a focus on what students do. This is a significant paradigm shift in campus culture.
b. Richer and more sophisticated analysis of assessment results is occurring, especially in some departments in Instruction.
c. Assessment data includes quantitative measures.
d. Interdisciplinary solutions are now more often sought to solve student academic issues.
e. SLO assessment is now part of the Faculty Handbook, along with new full-time and adjunct faculty orientations.
f. Materials and trainings have been geared toward Program Chairs, including a page on the SLO web site, several new tools, a workbook and flex sessions.
g. Recommendations addressed issues noted during our Accreditation Site visit before those recommendations were even finalized by the ACCJC.
5. Almost all of ARC's recommendations have been completed or are in process.
Total Recommendations: 63
Total Completed: 44
Total In-progress: 13
Total that are now on-going: 5
Total Abandoned: 1 (the task was deemed impossible)

The Council of Instructional Planning, in particular, has followed almost all of ARC's suggestions and worked with the SLO Coordinator to implement them. The other most frequent groups that have labored to make the recommendations come into being are ARC itself, PRO and the SLO Coordinator. Many of the assessment process recommendations also required and received the approval of the Faculty Senate.

Issues that Disappeared and an Analysis of Why
1. The focus on increasing adjunct participation during the first three years morphed into program chair training when ARC realized that PCs now do the bulk of the training in SLO assessment.
2. Recommendations to share Core 4 results across the campus stopped when the professional development workshops held during Flex week had poor attendance.

Most Pressing Concerns
1. Academic Issues: Many students are still underprepared and not aware of what it is to be a student. A college-wide approach to these issues is emerging through projects that will be developed through the Equity Plan and the focus on Habits of Mind by the Professional Engagement committee for the next academic year.
2.
Assessment Quality: There is still an uneven quality in assessment across departments. In Instruction, size and subject matter of the department were not found to be issues. PC leadership is the key. In Student Services and Administration, more experience with assessment will lead to improved metrics and measures. 3. Professional Development: Flex activities have not been well attended, particularly those designed for program chairs or for sharing Core 4 results across campus. A new format for dialogue and sharing needs to be found. Sustained professional engagement that goes beyond a one-time workshop may be a more effective way to accomplish assessment trainings and any departmental approaches to improve student learning, providing those involved with time to apply what has been presented. 4. Quality Assurance: There are only minor consequences if a department member or department as a whole does not assess outcomes in a quality way. Should the college consider creating more meaningful consequences or rewards? 5. Closing the Loop: The six-year assessment cycle takes so long that it is hard to see and measure if improvements in student performance have occurred from previous interventions. Should the college consider creating a faster time frame? A list of all recommendations is provided below: 15 ARC Recommendations Year Recommendations Responsible Party In process? Completed? 2007 1 2 The Administration and CCFT, along with program chairs and Deans, need to find ways to increase adjunct participation in SLO assessment in Instruction Revise the Instructional Assessment Analysis form to include number of faculty participating in assessment (full-time and adjunct), options for improvement and develop Scantron form for quantitative data. Administration, CCFT, PCs, Deans ü SLO Coordinator ü 3 Student Services and Administrative Services Departmental Reviews should include a section about departmental discussions of survey results. Student Services, Administrative Services ü 4 Continue to educate the Cabrillo community about the paradigm shift from evaluating individuals to evaluating departments. Use flex activities, and campus governing bodies such as CPC and Faculty Senate as vehicles for this education Make the SLO website, particularly the “best practices” examples posted there, clearly and easily accessible from the campus web page and on the college’s P-drive archives. President, CPC, Staff Development Cte, Faculty Senate ü SLO Coordinator and PRO ü Institute an SLO workshop for programs two years in advance of Instructional Planning and for non instructional programs; Provide similar workshops for noninstructional departments and components Develop system of succession and dissemination of expertise in SLO Coordinator and PRO ü SLO Coordinator ü Administration and Faculty Senate ü 5 2008 6 7 8 16 SLOAC across campus 2009 9 10 11 12 13 14 15 Provide sustained faculty development for addressing student learning needs in reading, research and documentation, and writing. Provide support for faculty as they confront challenges to academic ethics, such as plagiarism and other forms of cheating. Staff Development Cte On-going Faculty Senate, Dean of Student Services ü Support ongoing, sustained staff development in the assessment of student learning, including rubric development Share effective practices and strategies for modeling assignments. Encourage greater adjunct involvement. 
Assist the Administrative Services to develop a program review model that includes relevant student learning outcomes and that can be utilized consistently across noninstructional departments. SLO Coordinator ü Departments ü ü President, Cabinet, SLO Coordinator Communicate to the college at President large the importance of maintaining and documenting a college-wide planning process that systematically considers student learning, including noninstructional areas. ü ü 2010 16 Provide sustained faculty development for addressing student learning needs, particularly those of basic skills students, through new pedagogies, technology, the Faculty Inquiry Network and contextualized instruction Staff Development committee; Basic Skills Learning Community Advisory Council, and the Faculty Senate ü 18 Provide support for the teaching of college survival skills across the Staff Development committee and the Faculty ü 17 curriculum. Senate 19 Develop recommendations for making the SLO reporting process electronic ARC and Faculty Senate ü 20 Explore adding a quantitative component to SLO reports Survey adjunct faculty to assess their awareness of Cabrillo’s SLO process and barriers to their participation in it ARC and Faculty Senate ü ARC with the help of the PRO office ü 22 If necessary, create avenues for campus dialogue to discuss the survey results and brainstorm solutions to the issues it reveals. ARC ü 23 Inform potential hires of Cabrillo’s SLO process and participation expectations in new faculty and staff trainings, mentorships and in the Faculty Handbook Human Resources, Office of Instruction, Deans, Student Services, Administration and SLO Coordinator ü 24 Use the template created by the Instruction Office for program planning for all Administration departments Create a venue or reporting mechanism for Administration’s Program Plans Administration ü Administration ü Adopt the name “Program Planning” to describe the departmental review process for all components Cabinet and Administration Council ü Provide faculty development to improve pedagogy and sharing of best practices. Use flex hours, but not necessarily during flex week, to do this. Provide training about organizing and facilitating departmental SLO assessment to Program Chairs and Deans. Convene a meeting of Program Staff Development committee and Program Chairs 21 25 26 2011 27 28 29 On-going SLO Coordinator ü SLO Coordinator ü 18 30 31 32 33 34 35 36 37 Chairs of smaller departments to brainstorm organizational strategies for SLO assessment Create a web tool that lists the calendar for every Instructional department’s SLO assessment schedule. Post examples of a full assessment cycle on the SLO web site for transfer/ basic skills, CTE and Student Services Facilitate interdisciplinary discussions about student mastery of each of the four core competencies Write Administrative Unit Outcomes for each department in Administration Develop an Assessment Instrument and reporting format for Administration Program Planning that can be used by all the departments in this area Provide on-going training and workshops in SLO assessment to Student Services staff Serve as readers for the chapter on the college’s SLO efforts that will be included in the 2012 Accreditation Self-Evaluation. 
Revise and update SLO web site SLO Coordinator and PRO office SLO Coordinator and PRO office Not found to be possible ü ARC members and SLO Coordinator ü Administration ü Administration & PRO ü SLO Coordinator ü ARC ü SLO Coordinator with the help of the PRO office ü 2012 38 39 40 41 42 Provide faculty development to improve pedagogy and sharing of best practices. In Student Services, provide more than one-time trainings and/or follow up with students over time. Create a forum for Student Services and Instructional departments to share SLO assessment results that occurs outside of traditional flex week Find ways to showcase SLO and AUO assessment results across the campus Create an electronic means of Staff Development committee and Program Chairs Individual departments in Student Services On-going ü Staff Development Committee, the Faculty Senate, CCFT and Administration ü ARC ü ARC ad hoc committee 19 ü 43 44 45 46 47 48 49 50 51 2013 52 53 54 55 organizing, scheduling and reporting SLO assessment results Provide web based and flex training in how to assess course and certificate SLOs Departments struggling with SLO assessment should work individually with the SLO Coordinator to find solutions for any issues Facilitate a dialogue with Deans, the Council for Instructional Planning and the Faculty Senate to brainstorm other methods to ensure full compliance with college SLO standards. SLO Coordinator ü SLO Coordinator ü ARC, Deans, CIP. Faculty Senate On-going Create a special section of the SLO web site for Program Chairs Revise Assessment Analysis forms to gather better adjunct and curriculum data; add a line to report on previous interventions the last time an SLO was assessed and the progress made. Disseminate results of ARC survey; administer survey again in 2013 Gather feedback on pilot on quantitative reporting of SLO results Adopt the new template for nonInstructional program planning created by ARC Change ARC’s name SLO Coordinator ü SLO Coordinator ü SLO Coordinator ü SLO Coordinator ü Administration ü Faculty Senate ü Utilize projects from the Student Success Committee as opportunities for faculty to share best practices across departments In Instruction, create a formal program chair training that includes how to lead the department in SLO assessment. In Instruction, hold a flex workshop for Program chairs to get help on specific issues from the SLO Coordinator In Instruction, revise instructions for Student Success committee and Staff Development committee ü Deans, Vice President of Instruction, Program Chairs ü SLO Coordinator and Staff Development Committee ü SLO Coordinator and CIP ü 20 56 57 58 59 60 61 62 63. the SLO section in program plans In Instruction, revise Core 4 Assessment Analysis forms to include separate discussions of student success in departmental degrees, the GE program and the college’s institutional outcomes. In Instruction, create a chart to better map what is assessed by transfer and CTE departments Create an informational campaign for how the college defines a program Create an information campaign about the services available from the Planning and Research Office. In Student Services, fill out assessment analysis forms in addition to the annual updates to record SLO assessment results In Administration, after an AUO is assessed, continue to refine assessment instruments Analyze all ARC annual reports, noting trends that have emerged over time and the college’s progress in addressing them. 
Undertake a self-evaluation of the committee SLO Coordinator, CIP, Faculty Senate ü ARC ü ARC and the Faculty Senate ü PRO On-going Student Services and the SLO Coordinator ü Administration and the SLO Coordinator ü ü ARC ü ARC 21 Appendix C: Action Plan to Meet ACCJC Recommendation #2 ACCJC Recommendation #2: In order to improve effectiveness, the team recommends that the college clarify and document its definition of a program and include the evaluation and improvement of all degree offerings in the program review and planning process. ACTION PLAN I. Definition of Program ARC recommends that the college adopt the following definitions to avoid confusion and to be consistent across campus: a. Academic Program: A collection or series of courses that lead to a degree, certificate, or transfer to another institution of higher education (Title 5, ACCJC program definitions, Title 5 TOP code specifications1). For purposes of college organization, a program is composed of all the degrees and certificates offered by a specific academic department. CTE departments that offer separate programs accredited by different outside accrediting agencies are considered one department that offers multiple programs. For example, the ECE Program includes an AS and AS-T degree and all certificates offered by the ECE Department. The Spanish Program is the AA degree offered by the World Languages department. The Medical Assistant department offers two programs: Medical Assistance and Phlebotomy Technician because they are accredited by two separate outside accrediting agencies. Using this definition, there are a few programs that do not belong to a specific academic department (such as the General Science degree, Liberal Arts and Sciences degree or General Education). b. Academic Department: A group of faculty in a related field of study or a discipline that offers an academic program. Departments are people. Programs are courses of study, certificates and degrees. ARC also recommends the following name changes to be consistent with the definitions offered above and to avoid confusion to accreditors: 1 The Taxonomy of Program (TOP) is a system of numerical codes used at the state level to collect and report information on programs and courses, in different colleges throughout the state, that have similar outcomes. 22 • The name “Program Chair” should be changed to “Department Chair” or “Department Cluster Chair.” ARC recognizes that these definitions imply the following: • We need to inventory how many programs exist outside of an academic department, and we need to create a way to assess them. II. Documentation of the Definition of a Program ARC recommends that these definitions be documented in the following places: • College Catalog • Schedule of Classes • Program Planning Instructions • Appropriate BPs and ARs • Governance Handbook • Maps of campus processes on the SLO and PRO websites. • Appropriate documents that include programs (such as new PIE matrix, the New Faculty Handbook etc.) • Appropriate places on the college web site (including each academic department’s web site and listing under Majors/Programs) III. Publicity Campaign on the Definition of a Program ARC recommends that once the new definitions have been adopted and placed in all appropriate documents, websites and ARs and BPs, that the college conducts a publicity campaign to further alert members of the campus community. The campaign could include: • Presentations at Division Meetings and/or All College Day. • Faculty Senate meetings. 
• Inclusion in an "Accreditation Info Sheet" before our next visit.
• Other venues that arise from campus activities.

IV. Demonstrating the Link between the Core 4 and Transfer Degrees
ARC recognizes that one of the reasons we received this recommendation was that the visiting team did not understand that the assessment of the Core 4 is used to assess transfer degrees, the GE program and the college's institutional outcomes. To clarify that, ARC recommends:
• The creation of new maps of our SLO assessment process, to be posted on the SLO website and College website in appropriate places.
• Changes in the college Catalog that make this clearer, including listing the Core 4 as program outcomes for all transfer degrees.
• The creation of maps that show which courses in a transfer degree teach the specific skills contained in the Core 4.
ARC recommends that we start this work with all AA-T and AS-T degrees and that we use a special flex SLO "Get It Done" day to do so.

V. Evaluation of All Degree Offerings in the Program Review and Planning Process
ARC recommends that the college take the following steps to meet this part of the recommendation:
• Change Core 4 Assessment Forms to include separate discussions of what the results indicate about student mastery of skills for a degree, the GE program and Core 4 in general.
• Change the requirements for the SLO section of program plans to include separate discussions of what the Core 4 assessment results indicate about student mastery of skills for a degree, the GE program and Core 4 in general. These discussions can also include averages from the numerical results of Core 4 assessment (a data chart).
• Work with CurricUNET to design reports that gather Core 4 assessment results for all courses that are included as part of a degree, so that a separate report is generated and evaluated by the department. Reports can be archived in CurricUNET for accreditation.
Timeline for Action Plan Action Item Maps of Assessment Process List Core 4 as transfer degree outcomes in College Catalog Post maps of assessment process on SLO website Revise Core 4 Assessment Forms Revise SLO Section of Program Plans CurricUNET report design Post this action plan in the Smart Sheet planning agenda Present Action Plan to Deans for feedback Present Action Plan to PCs for feedback Present Action Plan to Faculty Senate for feedback and then approval Present Plan to CPC/IE Inventory programs without departments and create a plan to assess them Document Program Definition in: • Catalog • Schedule of Classes • BP and ARs • Program Planning Instructions • Appropriate Documents Responsible Party PRO Curriculum Specialist Date to be Completed Completed Completed SLO Coordinator Completed SLO Coordinator SLO Coordinator Completed Completed SLO Coordinator SLO Coordinator and PRO Director SLO Coordinator Completed Completed SLO Coordinator Completed SLO Coordinator Fall 2014 SLO Coordinator ARC Fall 2014 In process SLO Coordinator with: Fall 2014-Spring 2015 Curriculum Specialist Curriculum Specialist Chair of IC Instruction Office 24 Completed • College Website Conduct Publicity Campaign on new definitions Mapping of AA-T and AS-T degrees to Core 4 Accreditation Info Sheet Instruction Office Spring Andrews & PCs for Departmental Websites ARC SLO Coordinator and Department Chairs IE Committee 25 Fall 2015 Spring 2016 Just before our next visit Summary of 2014 ARC Report Participation: College Sector Program Plans Submitted Instruction: Transfer and Basic Skills Communication Geography and Meteorology Chemistry Geology/Oceanography/Environmental Sciences Medical Assisting Not scheduled this year Accessibility Support Center President’s Office Student Success and Support Programs Tutorials Warehouse Instruction: Career Technical Education Instruction: Library Student Services Administrative Units Commendations: ARC salutes the Geography Department for an excellent and sophisticated analysis of their assessment results. They detailed specific issues faced by students and the department’s creative solutions to meet them, which led to concrete goals and recommendations for the program plan overall. ARC lauds the Chemistry Department for a superior analysis of assessment results and creative solutions to student learning issues. ARC commends the Accessibility Support Center for implementing a new and highly successful process (a pre and post test) to assess their outcomes. ARC applauds the Counseling and Education Support Services Division for an excellent use of assessment tools to get actionable results. Successes: Instruction: The faculty learned that most students were successful at mastering the course and certificate SLOs as well as each of the college core competencies. Some successes were directly linked to classroom activities and student support: Peer tutoring and hands-on activities were shown to positively correlate with improved student learning. The length of time spent in lab sessions positively correlated with higher student success. 1 Students who received individual attention from student assistants performed better. Student Services: Students greatly benefited from in-person information provided by the Accessibility Support Center. A pre and post test revealed that students knew how to challenge a pre-requisite on their own after being assisted in completing the process the first time. 
Administrative Units: An assessment of Assessment, Orientation and Counseling by the Counseling and Educational Support Services Division revealed great success from the new BYMA (Before You Make An Appointment) efforts. The Warehouse discovered that users were very satisfied with its services, but many campus faculty and staff were unaware of what the department had to offer.

Challenges/Solutions
Instruction: When students struggled with SLOs, some common themes appeared.
• Problems with writing and grammar, especially the writing of lab reports. Proposed solutions included giving more writing assignments in science courses, requiring drafts of papers, creating a one-unit writing course for the sciences, working with the English department, and more referrals to the Writing Center.
• Lack of basic math and other numerical skills in non-math classes. Proposed solutions included working more closely with the Math department.
• Issues in critical thinking, especially applying concepts to problems and situations. Proposed solutions included creating activities that require more hands-on learning, using clicker response systems to monitor student comprehension, and more field work where students would be required to apply what they have learned in the classroom.

Student Services:
• Students revealed that faculty and staff were not uniformly aware of the steps involved in challenging a pre-requisite. Proposed solution: Post more materials on the web and in Division offices to help faculty, staff and students be more aware of the steps required.
• Students with disabilities need more support classes to help them succeed. Proposed solution: The Accessibility Support Center proposes to work with Counseling to create a section of CG 51 for this population.

Administration:
• The CESS Dean identified a need for regular direct communication with all members of some small departments, to improve awareness of and involvement in new campus issues and changing requirements.
• Students coming to Tutorials benefit less from their sessions if they come unprepared (without an assignment, etc.). Proposed solution: Tutorials will develop pre-semester communications and increase in-class visits to better inform students of their role in the tutoring process.

ARC's Analysis of Campus Assessment Processes
In Instruction, all departments are robustly participating in assessment activities, but some are analyzing the results with greater sophistication and depth. Some departments could use additional support in undertaking assessment activities. In Student Services and Administrative Units, departments are struggling to find the right assessment tools and could use some help with analyzing the data that they generate.

ARC's Recommendations
For teaching and learning: A critical issue in all courses (transfer, CTE and basic skills) is a deficiency of basic reading, writing, and numerical skills. This on-going, critical issue, which ARC has noted for the last seven years, negatively impacts student success across the campus. Efforts to address this issue (including the Equity Plan, the Student Success and Support Program and campus professional development) should be coordinated, with on-going dialogue between all involved parties, including transfer and CTE faculty.

For Improving Assessment Processes:
• Create Best Practice guides for Instruction, Student Services and Administration.
• Undertake an informational campaign to help Instructional departments better analyze assessment results.
Analysis of Past ARC Recommendations
One of ARC's recommendations last year was to undertake an analysis of all its previous recommendations. The committee found that of 63 total recommendations over the last seven years, 44 had been completed, 13 were in progress, 5 were on-going and one was abandoned since it was found to be impossible. ARC also noted the following trends:
• Academic issues noted as a result of SLO assessment have remained constant. Students are underprepared academically, and do not possess the habits of mind to enable them to be successful students.
• Efforts to improve these academic issues have changed, moving from a focus purely on professional development to some solutions across disciplines. A college-wide approach is emerging through the Equity Plan and the focus on Habits of Mind by the Professional Engagement committee for the next academic year.
• Recommendations demonstrate a constant effort to build an infrastructure for campus-wide assessment. That effort has been a success, but there is still an uneven quality in assessment across departments.
• How we talk about assessment efforts has changed, shifting from a focus on the instructors to a focus on what students do. This is a significant paradigm shift.