Service Areas Program Review Update 2012/13 

Service Areas Program Review Update 2012/13 (fields will expand as you type)
(1.AdminServiceAreasTemplate9 14 12.docx, 10/8/2012)

Section 1 – Program Information
1.0 Name of Program: / Date:
1.1 Program Review Authors:
1.2 Program Director Signature: / Date:
1.3 Vice President Signature: / Date:
1.4 Primary Function:
1.4.1 State briefly how the program functions support the college mission:
1.4.2 Program highlights/accomplishments:
1.4.3 Program Data (2010-2011 and 2011-12): # of Full Time Employees; # of Part Time Employees; Personnel Budget; Discretionary Budget

Section 2 – Data Analysis
2.0 List Service Area Metrics/Indicators and provide information on changes over time (steady/increasing/decreasing, etc.).
2.1 Metrics/Indicators (columns: 2010/11; 2011/12; Observations (steady/increasing/decreasing))
2.2 Describe how these changes affect students and/or the program:
2.3 Provide any other relevant information, or recent changes, that affect the program:

Section 3 – Critical Reflection of Assessment Activities (2011/2012)
3.0 Describe Service Area Outcomes assessed or reviewed in the current cycle:
3.1 Summarize the conclusions drawn from the data and the experience of staff working to achieve the outcomes:
3.2 Summarize how assessments have led to improvement in Service Area Outcomes (top three):
3.3 (Optional) Describe unusual assessment findings/observations that may require further research or institutional support:

Section 4 – Evaluation of Previous Plans
4.1 Describe plans/actions identified in the last program review and their current status. What measurable outcomes were achieved due to actions completed? (Table columns: Actions; Current Status; Outcomes)
4.2 (If applicable) Describe how funds provided in support of the plan(s) contributed to program improvement:

Section 5 – Planning
5.0 Program Plans (2012/2013). Based on data analysis, service area outcomes and indicators, assessment and review, and your critical reflections, describe the program's Action Plan for the 2012/13 academic year. If more than one plan, add rows. Include necessary resources. (Only a list of resources is needed here. Provide detailed line-item budgets, timelines, supporting data, or other justifications in the Resource Request.)
5.1 Program Plans (columns: Action to be taken; Relationship to Institutional Plans; Relationship to Assessment; Expected Impact on Service Area Outcomes; Resources Needed)
5.2 Provide any additional information, brief definitions, descriptions, comments, or explanations, if necessary.

Section 6 – Resource Requests
6.0 Planning Related, Operational, and Personnel Resource Requests. Requests must be submitted with rationale, plan linkage, and estimated costs. (Table columns: Request; Check One: Planning/Operational/Personnel; Amount $; Recurring Cost Y/N; Rationale; Linkage)

Section 7 – PRC Response
7.0 The response will be forwarded to the author and the supervising Director and Vice President:
S.1. Program Information
S.2. Data Analysis
S.3. Critical Reflection of Assessment Activities
S.4. Evaluation of Previous Plans
S.5. Planning
S.6. Resource Requests

Instructional Program Review Update 2012/13 (fields will expand as you type)
(2.InstructionalTemplate r9 14 12.docx, 10/8/2012)

Section 1 – Program Information
1.0 Name of Program: / Date:
1.1 Program Review Authors:
1.2 Dean's Signature: / Date:
1.3 Individual Program Information (2010-2011 and 2011-12): # of Degrees; # of Certificates; # of Courses; # of GE Courses; # of Full Time Faculty; # of Part Time Faculty; # of Staff FTE; Personnel Budget; Discretionary Budget
1.3.1 State briefly how the program functions support the college mission:
1.3.2 Program highlights/accomplishments:

Section 2 – Data Analysis
2.0 Provide information on changes over time (steady/increasing/decreasing, etc.).
2.1 Indicators (columns: Value; Change; Comments). Rows: FTES; FTEF; Enrollment (Headcount); Sections Offered; Fill Rate; Retention Rate; Persistence Rate; Success Rate; Completion Rate; FTES/TLU; Cost per FTES
Overall, what has been the impact of the change in indicators on student achievement and learning:
Comment on any adverse trend and whether it requires action:
Provide any other relevant information, or recent changes, that affect the program: [CTE programs must provide labor market analysis]
Equity Measures (columns: Value; Change; Comments). Rows: Course Completion; Degree Completion; Certificate Completion; ESOL/Basic Skills Completion; Transfers; Retention; Labor Market Data (CTE only)
Provide narrative on the factors that may have contributed to the improvement or decline in the identified population:

Section 3 – Critical Reflection of Assessment Activities
Curriculum & Assessment Data: # of SLO Assessments Reported; # of SLOs Scheduled to be Assessed; # of PLOs Assessed and Reported; % of Course Outlines of Record Updated; Assessment Reporting Completed? (Y/N); Program Advisory Committee Met? (Y/N)
3.0 How has assessment of course-level SLOs led to improvement in student learning (top three):
3.1 How has assessment of program-level outcomes led to degree/certificate improvement (top three):
3.2 (Optional) Describe unusual assessment findings/observations that may require further research or institutional support:

Section 4 – Evaluation of Previous Plans
4.1 Describe plans/actions identified in the last program review and their current status. What measurable outcomes were achieved due to actions completed? (Table columns: Actions; Current Status; Outcomes)
4.2 (If applicable) Describe how funds provided in support of the plan(s) contributed to program improvement:

Section 5 – Planning
5.0 Program Plans (2012/2013). Based on data analysis, student learning outcomes and program indicators, assessment and review, and your critical reflections, describe the program's Action Plan for the 2012/13 academic year. If more than one plan, add rows. Include necessary resources. (Only a list of resources is needed here. Provide detailed line-item budgets, supporting data, or other justifications in the Resource Request.)
5.1 Program Plans (columns: Action to be taken; Relationship to Institutional Plans; Relationship to Assessment; Expected Impact on Program/Student Learning; Resources Needed)
5.2 Provide any additional information, brief definitions, descriptions, comments, or explanations, if necessary.

Section 6 – Resource Requests
6.0 Planning Related, Operational, and Personnel Resource Requests. Requests must be submitted with rationale, plan linkage, and estimated costs. (Table columns: Request; Check One: Planning/Operational/Personnel; Amount $; Recurring Cost Y/N; Rationale; Linkage)

Section 7 – PRC Response
7.0 The response will be forwarded to the author and the supervising Director and Vice President:
S.1. Program Information
S.2. Data Analysis
S.3. Critical Reflection of Assessment Activities
S.4. Evaluation of Previous Plans
S.5. Planning
S.6. Resource Requests

Student Development Division Annual Program Review Update 2012/13 (fields will expand as you type)
(R9/14/12, 10/8/2012)

Section 1 – Program Information
1.0 Name of Program:
1.1 Program Review Authors: / Date:
1.2 Program Director Signature: / Date:
1.3 Vice President Signature: / Date:
1.4 Program mission:
1.4.1 State briefly how the program mission supports the college mission:
1.4.2 Program goals:
1.4.3 Describe how the program goals support institutional planning and goals (Strategic Plan, Education Master Plan, Enrollment Management Plan, etc.):

Section 2 – Data Analysis
2.0 Program Staffing/Budget Data and Indicators (past years). Provide information to show changes over time (steady, increasing, decreasing, etc.). Insert additional rows as needed for your key performance indicators (KPIs), such as program services, functions, student contacts, etc.
2.1 Staffing/Budget (columns: 2010/11; 2011/12; Observations (steady/increasing/decreasing)). Rows: FTE Faculty and Staff; FTE Additional Workforce; Personnel (Dollars); Discretionary (Dollars); Other
2.2 Program Indicators: TBD (add rows as needed)
2.3 Describe how these changes affect students and/or the program:
2.4 Provide any other relevant information, or recent changes, that affect the program:

Section 3 – Critical Reflection of Assessment Activities
3.0 Student Learning Outcomes & Program Outcomes assessed in the current cycle (2011/2012):
3.1 Summarize the conclusions drawn from the data and the experience of staff working to achieve the outcomes:
3.2 Summarize how assessments have led to improvement in Student Learning and Service Area Outcomes:
3.3 (Optional) Describe unusual assessment findings/observations that may require further research or institutional support:

Section 4 – Evaluation of Previous Plans
4.1 Describe plans/actions identified in the last program review and their current status. What measurable outcomes were achieved due to actions completed? (Table columns: Actions; Current Status; Outcomes)
4.2 (If applicable) Describe how funds provided in support of the plan(s) contributed to program improvement:

Section 5 – Planning
5.0 Plan for Future Improvements (2012/2013). Based on institutional plans, data analysis, student learning outcomes, program outcomes, the assessment of those outcomes, and your critical reflections, describe the program's Action Plan for the 2012/13 academic year. If more than one plan, add rows. Include necessary resources. (Only a list of resources is needed here. Provide detailed line-item budgets, timelines, supporting data, or other justifications in the Resource Request.) (Link to two-year assessment plan.)
5.1 Program Plans (columns: Action to be taken; Relationship to Institutional Plans; Relationship to Assessment; Expected Impact on Student Learning or Service Area Outcomes; Resources Needed)
5.2 Provide any additional information, brief definitions, descriptions, comments, or explanations, if necessary.

Section 6 – Resource Requests
6.0 Planning Related, Operational, and Personnel Resource Requests. Requests must be submitted with rationale, plan linkage, and estimated costs. (Table columns: Request; Check One: Planning/Operational/Personnel; Amount $; Recurring Cost Y/N; Rationale; Linkage)

Section 7 – PRC Response
7.0 The response will be forwarded to the author and the supervising Director and Vice President:
S.1. Program Information
S.2. Data Analysis
S.3. Critical Reflection of Assessment Activities
S.4. Evaluation of Previous Plans
S.5. Planning
S.6. Resource Requests

General suggestions:
Be brief. Most of your program's evidence, and the documentation of your internal department or program thinking and planning processes, should live in your regular meeting minutes and other artifacts, such as spreadsheets of data collected, survey instruments used, and details of assessments. Your Comprehensive Program Review will pull together previous annual updates and provide more detail. This annual update should primarily serve as a snapshot of progress for the previous year, current plans and actions, and a preview of the next year. Include descriptions and explanations only as necessary for persons outside your program; for example, spell out all acronyms at first use (such as SARS) and define program-specific words (such as Koha). The form fields can be expanded to accommodate more text; however, the completed form should be less than five pages.

Plan and schedule activities. The annual program review is just one station on the journey through the academic year. Activities are ongoing, documentation is ongoing, and surveys and assessments should be planned in advance for best results. Each program should keep a calendar of program review and assessment activities where all employees can see and access it. Your calendar can be customized to your program. Last year's annual update should inform this year's processes; the comprehensive review will cover all of the processes for the entire cycle. Accreditation team members do not want this to be a process of just filling out a form. The form is simply a convenient way to provide evidence of a continuous and sustained process of quality improvement for everyone: accreditation team visitors, internal college constituent groups, employees in the program being reviewed, students, and community members.

Involve everyone. Accreditation expects that this process will involve all employees, and it is most effective when it does. Each person's work contributes differently to our collective pursuit of student success.
Branch campuses or sites must be included, and any special circumstances or needs must be incorporated. Involving everyone ensures that goals, outcomes, targets, assessment methods, etc., are realistic, doable, authentic, and meaningful.

Suggestions for line items of the form:
1.0 Program mission, goals, link to college mission. Although you may find it necessary to fine-tune or revise your program's mission statement every five years or so, the mission and goals normally will not change from year to year. Likewise, college-wide strategic and other plans should not change frequently, so your linking elements will most likely be the same year after year.

2.0 Program Staffing/Budget Data & Indicators. Provide a data "snapshot" so reviewers and readers can quickly see potential trends based on observed changes. Extensive narrative is not necessary. Section 2.1 is required data from all programs and may be provided by the Business Office. Section 2.2 can be used for any relevant data, specific to the program, that illuminates the current picture; remember, however, that five years of continuous and consistent data will be required in the Comprehensive Review. Rows can be added to the table to accommodate additional data elements, and columns will be added as more annual data accumulates, up to the five-year Comprehensive Review. In short: 2.1 data is required of all departments; 2.2 is for additional KPIs defined as specific to the department.

3.0 Critical Reflection of Assessment Activities. Consulting your assessment plan, each year you will assess three or four SLOs and/or POs, depending on the size of your program and the variety of services offered to students. Use Section 3.1 to summarize improvements based on assessment and Section 3.2 to report on planning and/or improvement that resulted from assessment.

4.0 Evaluation of Previous Plans. Describe plans/actions identified in the last program review and their current status. What measurable outcomes were achieved due to actions completed?

5.0 Plan for Future Improvements.
"Close the loop": look at your data and your outcomes, determine whether expectations or targets were met, and decide what can be done about it, either to continue the improvement or to reduce the losses before the next assessment cycle for those outcomes. This is where we learn from experience, document what we learned, and plan how to do better in future. For this section, set out generally what you plan to work on in the next cycle for improvement. Based on student learning outcomes, program outcomes, the assessment of those outcomes, your critical reflections, or an institutional plan initiative, describe the program's Action Plan for the 2012/13 academic year. If more than one plan, add rows. Include necessary resources.

6.0 Resource Requests. Only a list of resources is needed here. Provide detailed line-item budgets, timelines, supporting data, or other justifications in the Resource Request.

Terms and Definitions

ACCJC Standards & guidelines for evaluating institutions. Sets the standards that all programs must meet so that, collectively, the college as a whole maintains its accreditation. All student services programs and functional areas should be familiar with the sections of the standards that apply, and be prepared to build their cycles of continuous quality improvement on those standards. The ACCJC Guide to Evaluating Institutions (link below) is the manual for visiting teams; it specifies exactly what the expectations are and what kind of documentation or evidence to provide.

College Mission, Vision, Values. Responds to local needs and interests of the community. What the college as a whole seeks to accomplish; future-oriented; aspirational; based on broad ideals and fundamental principles. Answers the question, "Who are we, why are we here, and what do we hope to be in future?"

Strategic Plan goals, Educational and other Master Plan goals, and plan objectives.
Targets that must be met in order for the college to fulfill its mission, respond to challenges, and identify opportunities to maximize success. Answer the question, "What do we need to achieve in order to fulfill our mission, given the current conditions and potential future conditions?" These planning documents also include some objectives: the specific actions that will be taken in order to achieve the goals. Plan objectives answer the question, "What do we need to DO in order to achieve our goals?"

Program mission. What the program seeks to accomplish; its definition, purpose, and function within the college. Responds to local and internal needs and interests; future-oriented; aspirational; based on ideals and principles. Must support the college mission, goals, and objectives, and must be focused on student success. Answers the questions, "How does the program mission specifically help the college fulfill its mission? How does this program help students achieve their goals?"

Program goals. As with the broader college goals, program goals are what the program wants to achieve. Answer the question, "What do we need to achieve in order to provide for student success and to support the college mission and goals?" Program goals are closely related to, and are often derived directly from, the program mission statement.

Program objectives. Break down the program mission and goals into specific tasks, steps, projects, initiatives, or action plans. Use the program goals to develop more specific, short-term objectives to work on for the academic year that will support your long-term goals (they are like mini-goals for the year). Answer the questions, "What actions can we take that will achieve our stated goal or goals? What services are useful or helpful for students, and how can we improve them? What do we do? How do we prove that students are learning something worthwhile? What do we want students to learn?
What are the most important things that they need to know? What are the things that we can impact, control, or measure? What is the actual work that employees will need to do, who will do it, and with what support, in order to achieve the goal or goals?" Mission and goals statements are revised on a longer cycle than program objectives, which can vary from year to year. Program objectives need to be responsive to more immediate challenges and opportunities. Objectives are employee-centered; they are what we do.

Program & Student outcomes. The expected result of the actions, work, projects, or initiatives that were started or completed by the program within the time frame of the program review or update. An outcome is the result of your program's actions toward achieving your objectives. Answers the questions, "What specifically do we expect this action plan to accomplish? What improvements do we hope to see?" Outcomes should be SMART: specific, measurable, attainable, relevant, and timely (or time-bound). Outcomes are student-centered; they are what students will learn or do. Program outcomes should demonstrate program improvement on key performance indicators, answering the question, "If we achieve the program objective, what will be the result or outcome for the program?" Student outcomes should demonstrate improvement in student learning, behaviors, attitudes, and activities: what students think, know, or do. "Student learning outcomes are generic abilities that can be developed, improved, and assessed." (CR Assessment Handbook, p. B1; link to source provided below) Student outcomes answer the question, "If we achieve the program objective, what will students learn or do?" Outcomes lead to the next phase, assessment & critical reflection, in that they raise the questions, "How do we prove we have achieved our objective? What can we measure, and how can we measure it?" Bloom's Taxonomy can be helpful in articulating good student or program outcomes.

Assessment & critical reflection.
Every outcome should be measurable by some criteria for quality. Answers the questions, "How can we prove that students are now better able to accomplish, to learn, to understand, to appreciate, to behave appropriately, etc.? How can we prove that the program's services are efficient, helpful, useful, accurate, etc.? What can we measure that will accurately show improvement on these outcomes?" Assessment is the measurement part; critical reflection is when employees examine, discuss, and analyze the measurements and try to determine whether the objectives were carried out, whether the outcomes were met, and whether the assessment itself was accurate or useful. Critical reflection answers the questions, "So what now? Did it work or not? If not, what might work better? If it did work, should we continue, or can we improve it even more?" This phase helps everyone recognize successes and learn from failures.

Comprehensive program reviews. Provide the detailed, comprehensive, and long-term overview of the program, on a three- or five-year cycle: an in-depth recent history. Can use extensive narratives but should also rely on accurate and consistent data, cumulated from the annual updates. The comprehensive review should reflect on all outcomes assessed over the time period and can draw on data and narratives provided in the annual updates.

Annual program review updates. Provide a snapshot, a brief update, of any recent changes or challenges, and document what outcomes and assessments were accomplished in that year. Should include only brief narratives of things that others outside the program might not fully understand without an explanation. The annual update should reflect on outcomes assessed in the previous year and list any support required to make the improvements suggested by the outcomes.
It should also provide short descriptions of the outcomes to be assessed in the upcoming year, and the expectations or targets for how completion or success on those outcomes will be measured.

Assessment plan. Incorporates the annual and comprehensive reviews, considers the academic-year cycle, and provides a reliable guide to what will be assessed and when, depending on the workload and other needs of the program. Assessment of student and program outcomes can be calendared for appropriate times in the academic year, and not all outcomes need to be assessed every year. A plan allows for spreading out the assessment of outcomes over several semesters.

Indicators, or Key Performance Indicators (KPIs). Here is the definition from the BRIC manual: "The actual KPIs adopted should be ones that help determine how important or how in demand, efficient, and effective the program, service, or operation is in meeting the mission of the college and promoting institutional effectiveness. … [S]tandard data definitions and sources of data should be agreed upon among program faculty and staff … [and] should include evidence of student learning, or … increased student success … ." (p. 7; source link provided below) If the program does not have clearly defined and agreed-on KPIs, establishing them can itself be a program goal.

Quality Improvement Plan (aka Action Plan). Can be as simple as "improve our rate of return of documents to department A" or "increase the number of students attending orientations." Set out your objectives for the next cycle: what you want to improve, or new work to accomplish. Some QIPs may have multiple tasks, involve many staff, cross departments, etc., while others may be just one person working on one project. All QIPs should relate to at least one student or program outcome. Your QIPs can be drawn from your assessment plan and cycle if there is no data or recent change that requires revision.
Program review, outcomes, assessments, & evaluation suggestions
Top-to-bottom approach: Start by looking at the big picture and then work down, getting more specific to define the details of how to achieve the mission and goals. Move from broad and long-term to more specific and immediate: from Mission to Goals to Objectives to Outcomes (SLOs/PLOs) to Action Plans to Assessments to Critical Reflection and revision.

Bottom-to-top approach: Start by examining what you already do, what you already measure, what data you already have, and what you are already assessing. Define what those outcomes are, based on the data you collect. What question does the data answer? Move from the specific and short-term to the broader and more long-term, deriving objectives and goals from what you already do and then linking them to your mission. If you can't make the link, reconsider whether you need to do that work at all, or whether your mission statement fails to reflect what you do.

Of course, you can work top to bottom and bottom to top at the same time. The idea is to get your cycle established, determine what you will measure and when, and decide what you will do with your data. Then you can run the program cycle year after year with minor revisions based on the feedback loop.

Plan for the whole cycle: No element stands alone; everything fits together in a cycle. Plan ahead based on the academic year or the comprehensive program review cycle. Plan to accommodate the phases of the process, and the documentation, in your normal work cycle and program or department meetings. Every department has ups and downs: times when one type of work activity is prevalent and times when a different one is. Customize your calendar cycle to your program, with input from all staff and even external constituents. Program review, setting outcomes, and evaluating and assessing outcomes are not separate activities; they are all parts of one cycle, with the same or similar activities recurring regularly.
[Cycle diagram: College Mission & Goals (planning committees) → Program Mission & planning goals → Objectives for the year or cycle, action plans → Program & Student Outcomes → Assessment & Critical reflection or evaluation → Feedback, Closing the Loop → back to College Mission & Goals]
Documentation and Evidence: Forms exist for comprehensive program reviews, annual updates, and assessment plans. Some of the feedback travels through the program review process via the parts of the forms that go to the planning committees: Furniture & Equipment, Enrollment Management, Facilities, and Technology. Maintain meeting notes, summary reports, presentations, workshop notes, and any other products for posting to the "Artifacts" page, and use the Forums to discuss. Forms and Forums may change; the key is to maintain continuity of the cycle and evidence of your program's diligence and attention to it.

Dialog: None of these activities should be done by one or two people in isolation from other staff. Accreditation guidelines are very clear that this process of continuous quality improvement must, at every phase, involve every employee at every level, with consideration of how each person's work affects student success and furthers the college mission. The ACCJC Guide (link below) spends four pages on dialog, its importance, and how to engage in it. The BRIC guide says that "passionate, thoughtful inquiry" must be discovered or recaptured. So this must be an authentic and collaborative process within each program, communicated appropriately campus-wide to all constituent groups.

Supporting documents
ACCJC Guide to Evaluating Institutions
http://www.redwoods.edu/Accreditation/documents/GuidetoEvaluatingInstitutions.pdf
This guide describes exactly what is expected of us and of the evaluation team, and what kinds of evidence are acceptable. The introductory section on dialog and the general themes of accreditation is on pages 6-10. The Student Support Services (IIB) section begins on page 33 and continues through page 36, where the Library and Learning Support (IIC) section begins, running through page 38. Each section quotes the standards and then provides questions as prompts for the evaluation team. Each student support service program should consider how it might best answer these questions, and provide the necessary evidence, in its program review-outcomes-assessments-evaluation-feedback cycle. Sources of evidence for Standards IIB & IIC follow on pages 42-45.

Accreditation Reference Handbook
http://www.redwoods.edu/Accreditation/Accred_Ref_Handbook.pdf
The official statement of the ACCJC standards, with other policy statements relevant to the work of the commission. Standards IIB and IIC are included on pages 21-24. Pages 47-53 list and explain the possible commission actions on institutions' accreditation status. Also of interest to some departments are the policy on distance learning, pages 67-68, and the policy on diversity, page 69.

Inquiry Guide: Maximizing the Program Review Process, BRIC Technical Assistance Program
http://redwoods.edu/assessment/documents/INQUIRYGUIDE‐MaximizingtheProgramReviewProcess.pdf
Everyone should read this entire document; it is only 20 pages and provides solid explanations of the purpose and function of program review, assessments, and key performance indicators. Pages 18-19 summarize the levels expected by accreditation for awareness, development, proficiency, and sustainability in program review and related processes.

College of the Redwoods Assessment Handbook
http://redwoods.edu/assessment/documents/Assessmentsv5c_000.pdf
For a fuller understanding of assessments and how they fit into the program review cycle, read the entire document. For targeted sections of value to support services, read the General Philosophy, pages A1-A2; the details on the proficiency levels, pages A5-A7; Guidelines for Assessment Activities, pages A9-A11; Tips for Developing SLOs, pages B25-B26; and Student Support Services Assessment, pages D1-D5 and D7-D9. Although student support services have fewer faculty than instructional divisions and programs, the accreditation documents and other supporting documents and guidelines are very clear: program review and related processes are primarily faculty-driven, and the primary focus is on student learning. So support services need to consider how to prioritize faculty and student interests.

Assessment FOR Student Learning Handbook: It's All About Helping Students Learn (2011), Columbus State Community College
http://redwoods.edu/assessment/documents/CSCC_AssessmentHandbook.pdf

Program Review: Setting a Standard, Academic Senate for California Community Colleges
http://www.asccc.org/sites/default/files/Program‐review‐spring09.pdf

These two documents focus primarily, almost solely, on instructional programs. However, because they both provide extensive treatments of outcomes and assessments, they may help employees in student support services programs develop a deeper understanding.