Student Development Division Annual Program Review Update 2012/13

(fields will expand as you type)
Section 1 - Program Information
1.0 Name of Program:
1.1 Program Review Authors:
Date:
1.2 Program Director Signature:
Date:
1.3 Vice President Signature:
Date:
1.4 Program mission:
1.4.1 State briefly how the program mission supports the college mission:
1.4.2 Program goals:
1.4.3 Describe how the program goals support institutional planning and goals (Strategic Plan, Education Master Plan, Enrollment
Management Plan, etc.):
Section 2 - Data Analysis
2.0 Program Staffing/Budget Data and Indicators
(Past years)
Provide information to show changes over time (steady, increasing, decreasing, etc.). Insert additional rows as needed for your key performance indicators (KPIs), such as program services, functions, student contacts, etc.
2.1 Staffing/Budget
                              2010/11    2011/12    Observations (steady/increasing/decreasing)
FTE Faculty and Staff
FTE Additional Workforce
Personnel (Dollars)
Discretionary (Dollars)
Other

10/3/2012
2.2 Program Indicators
TBD
TBD
TBD
2.3 Describe how these changes affect students and/or the program:
2.4 Provide any other relevant information, or recent changes, that affect the program:
Section 3 - Critical Reflection of Assessment Activities
3.0 Student Learning Outcomes & Program Outcomes Assessed in the Current Cycle
(2011/2012)
3.1 Summarize the conclusions drawn from the data and the experience of staff working to achieve the outcomes:
3.2 Summarize how assessments have led to improvement in Student Learning and Service Area Outcomes:
3.3 (Optional) Describe unusual assessment findings/observations that may require further research or institutional support:
Section 4 - Evaluation of Previous Plans
4.1 Describe plans/actions identified in past program reviews and their current status. What measurable outcomes were achieved due to actions completed?
Actions    Current Status    Outcomes
4.2 (If applicable) Describe how funds provided in support of the plan(s) contributed to program improvement:
Section 5 - Planning
5.0 Plan for Future Improvements
(2012/2013)
Based on institutional plans, data analysis, student learning outcomes, program outcomes, the assessment of those outcomes, and your critical
reflections, describe the program’s Action Plan for the 2012/13 academic year. If more than one plan, add rows. Include necessary resources.
(Only a list of resources is needed here. Provide detailed line item budgets, timelines, supporting data or other justifications in the Resource
Request).
(link to two-year assessment plan)
5.1 Program Plans
Action to be taken    Relationship to Institutional Plans    Relationship to Assessment    Expected Impact on Student Learning or Service Area Outcomes    Resources Needed
5.2 Provide any additional information, brief definitions, descriptions, comments, or explanations, if necessary.
Section 6 - Resource Requests
6.0 Planning-Related, Operational, and Personnel Resource Requests. Requests must be submitted with rationale, plan linkage, and estimated costs.
Request    Check One (Planning / Operational / Personnel)    Amount ($)    Recurring Cost (Y/N)    Rationale    Linkage
Section 7 - PRC Response
7.0 (The response will be forwarded to the author and the supervising Director and Vice President)
PRC:
General suggestions:
Be brief. Most of your program’s evidence or documentation of internal department or program thinking and planning processes should be in your regular meeting minutes and other artifacts, such as spreadsheets of collected data, survey instruments used, details of assessments, etc. Your Comprehensive Program Review will pull together previous annual updates and provide more detail. This annual update should primarily serve as a snapshot of progress for the previous year, current plans and actions, and a preview of the next year. Include descriptions and explanations only as necessary for persons outside your program; for example, spell out all acronyms at first use, such as SARS, and explain program-specific words such as Koha. The form fields can expand to accommodate more text; however, the finished update should be less than five pages.
Plan and schedule activities. The annual program review is just one station on the journey through the academic year. Activities and documentation are ongoing, and surveys and assessments should be planned in advance for best results. Each program should keep a calendar of program review and assessment activities where all employees can see and access it. Your calendar can be customized to your program. Last year’s annual update should inform this year’s processes; the comprehensive review will cover all of the processes for the entire cycle. Accreditation
team members do not want this to be a process of just filling out a form. The form is simply a convenient way to provide evidence of a continuous and sustained process of quality improvement for everyone: accreditation team visitors, internal college constituent groups, employees in the program being reviewed, students, and community members.
Involve everyone. Accreditation expects that this process will involve all employees, and it is most effective when it does. Each person’s work contributes differently to our collective pursuit of student success. Branch campuses or sites must be included, and any special circumstances or needs must be incorporated. Involving everyone ensures that goals, outcomes, targets, assessment methods, etc., are realistic, doable, authentic, and meaningful.
Suggestions for line items of the form:
1.0 Program Information: Mission, goals, link to college mission. Although you may find it necessary to fine-tune or revise your program’s mission statement every five years or so, the mission and goals normally will not change from year to year. Likewise, college-wide strategic and other plans should not change too frequently, so the text you enter in this section will be very similar year after year.
2.0 Data Analysis: Program Staffing/Budget Data & Indicators. Provide a data “snapshot” so reviewers and readers can quickly see potential trends based on observed changes. Extensive narrative is not necessary.
Section 2.1 is for the program’s required Key Performance Indicators (KPIs) and supporting data. The data may be provided by the Business Office or collected by the program. Use the “Other” line item for any data trends specific to your program in the category of Staffing/Budget; for example, you could itemize your student worker data here. Add more rows if necessary.
Section 2.2 can be used for any additional KPIs or other data unique to the program; however, remember that five years of continuous and consistent data will be required in the Comprehensive Review. Rows can be added to the table to accommodate additional data elements, and columns can be added as more annual data accumulates, up to the five-year Comprehensive Review.
In Section 2.3, describe changes only if necessary to clarify specific circumstances for readers not familiar with your program.
Section 2.4 is provided for any other information that might help clarify the program’s issues for outside readers, or that is specific to that year’s cycle: new challenges, new legislation, unplanned events.
3.0 Critical Reflection of Assessment Activities. Consulting your assessment plan, each year you will assess three or four SLOs and/or POs,
depending on the size of your program and the variety of services offered to students. You may collect the data in the previous year or semester,
or in the current year or semester, but here is where you document what was learned from the data. Use the prompts to answer questions such
as: What does your assessment data tell you? What did staff in the program learn from the process? Were targets met? What were your
successes or failures and why?
4.0 Evaluation of Previous Plans. Describe plans/actions identified in the last program review and their current status. What measurable
outcomes were achieved due to actions completed? Some departments may choose to draw forward plans from several years ago; it depends
on how you collect your data, the number and workload of available staff, size and scope of the program and number of students served, and
what the normal work cycle is like throughout the year. You are not required to fill in all the rows, but you can add rows as necessary.
Use Section 4.2 to link your assessments and improvement plans (program plans, work plans, action plans, Quality Improvement Plans or QIPs)
to funding and your budget. Were you able to find ways to improve efficiency? Did some plans fail or succeed due to funding? Explain why and
how.
5.0 Planning: Plan for Future Improvements. “Close the loop” by looking at your outcomes, your data, and your assessments (the information provided in Sections 1-4) to determine whether expectations or targets were met, and what can be done about it: continue the improvement, increase it, or reduce the losses before the next assessment cycle for those outcomes. This is where we learn from experience, document what we learned, and plan how to do better in the future.
Section 5.1, Program Plans, is where you list what you plan to work on for improvement in the next cycle. Based on student learning outcomes, program outcomes, the assessment of those outcomes, and your critical reflections, or on an institutional plan initiative, describe the program’s Action Plan for the 2012/13 academic year. If there is more than one plan, add rows. Include necessary resources.
Section 5.2 is for clarification of program-specific needs, mandates, terminology, explanations, etc.
6.0 Resource Requests. Only a list of resources is needed here, but any resources requested should match the program plan(s) described in Section 5. Requested amounts can be close estimates. What’s important is to show a good rationale and linkage to your outcomes, targets, and goals.
7.0 PRC Response. The Program Review Committee will provide their response.
Terms and Definitions
ACCJC. Standards and guidelines for evaluating institutions. Sets the standards that all programs must meet so that, collectively, the college as a whole maintains its accreditation. All student services programs and functional areas should be familiar with the sections of the standards that apply, and be prepared to build their cycles of continuous quality improvement on those standards. The ACCJC Guide to Evaluating Institutions (for link, see “Supporting Documents” section below) is the manual for visiting teams that specifies exactly what the expectations are and what kind of documentation or evidence to provide. This manual gives us a practical guide to meeting those expectations.
College Mission, Vision, Values. Responds to local needs and interests of the community. What the college as a whole seeks to accomplish;
future-oriented; aspirational; based on broad ideals and fundamental principles. Answers the question, “Who are we and why are we here and
what do we hope to be in future?” (See http://www.redwoods.edu/about/mission.asp)
Strategic Plan goals, Educational and other Master Plan goals, and plan objectives. Targets that must be met for the college to fulfill its mission, respond to challenges, and identify opportunities to maximize success. Answers the question, “What do we need to achieve in order to fulfill our mission, given the current conditions and potential future conditions?” These planning documents also include some objectives, the specific actions that will be taken in order to achieve the goals. Plan objectives answer the question, “What do we need to DO in order to achieve our goals?” (See http://inside.redwoods.edu/IPM/plans.asp)
Program mission. What your program seeks to accomplish; its definition, purpose, and function within the college. Responds to local and internal needs and interests; future-oriented; aspirational; based on ideals and principles. Must support the college mission, goals, and objectives. Must be focused on student success. Answers the questions, “How does the program mission specifically help the college fulfill its mission? How does this program help students achieve their goals?”
Program goals. As with the broader college goals, program goals are what the program wants to achieve. Answers the question, “What do we
need to achieve in order to provide for student success and to support the college mission and goals?” Program goals are closely related to, and
are often derived directly from, the program mission statement.
Program & Student outcomes. The expected result of the actions, work, projects, or initiatives that were started or completed by the program within the time frame of the program review or update. An outcome is the result of your program’s actions toward achieving your goals. Answers the question, “What specifically do we expect this action plan to accomplish? What improvements do we hope to see?” Outcomes should be SMART: specific, measurable, attainable, relevant, and timely (or time-bound). Outcomes are student-centered; they describe what students will learn or do.
Program outcomes should demonstrate program improvement on key performance indicators. Answers the question, “If we achieve the
program objective, what will be the result or outcome for the program?”
Student outcomes should demonstrate improvement on student learning, behaviors, attitudes, activities; what students think, know, or do.
“Student learning outcomes are generic abilities that can be developed, improved, and assessed.” (CR Assessment Handbook, p. B1, link to
source provided below) Answers the question, “If we achieve the program objective, what will students learn or do?” Outcomes lead to the next
phase, assessment & critical reflection, in that they lead to the question, “How do we prove we have achieved our objective? What can we
measure, and how can we measure it?”
Bloom’s Taxonomy can be helpful in articulating good student or program outcomes.
Assessment & critical reflection. Every outcome should be measurable by some criteria for quality. Answers the questions, “How can we prove that students are now better able to accomplish, to learn, to understand, to appreciate, to behave appropriately, etc.? How can we prove that the program’s services are efficient, helpful, useful, accurate, etc.? What can we measure that will accurately show improvement on these outcomes?” Assessment is the measurement part; critical reflection is when employees examine, discuss, and analyze the measurements and try to determine whether the objectives were carried out, whether the outcomes were met, and whether the assessment itself was accurate or useful. Critical reflection answers the question, “So what now? Did it work or not? If not, what might work better? If it did work, should we continue, or can we improve it even more?” This phase helps everyone recognize successes and learn from failures.
Comprehensive program reviews. Provide the detailed, comprehensive, and long-term overview of the program on a three- or five-year cycle; an in-depth recent history. A comprehensive review can use extensive narratives but should also rely on accurate and consistent data, accumulated from the annual updates. It should reflect on all outcomes assessed over the time period and can draw on data and narratives provided in the annuals.
Annual program review updates. Provide a snapshot, or brief update, of any recent changes or challenges, and document what outcomes and assessments were accomplished in that year; include brief narratives only for things that others outside the program might not fully understand without an explanation. The annual should reflect on outcomes assessed for the previous year and list any support required to make the improvements suggested by the outcomes. It should also provide short descriptions of outcomes to be assessed in the upcoming year, and the expectations or targets for how completion or success on those outcomes will be measured.
Assessment plan. Incorporates the annual and comprehensive reviews, considers the academic year cycle, provides a reliable guide as to what
will be assessed and when, depending on the workload and other needs of the program. Assessment of student and program outcomes can be
‘calendared’ for appropriate times in the academic year, and not all outcomes need to be assessed every year. A plan allows for spreading out
the assessment of outcomes over a span of several semesters.
Indicators, or Key Performance Indicators (KPIs). Here’s the definition from the BRIC manual: “The actual KPIs adopted should be ones that help determine how important or how in demand, efficient, and effective the program, service, or operation is in meeting the mission of the college and promoting institutional effectiveness. … [S]tandard data definitions and sources of data should be agreed upon among program faculty and staff … [and] should include evidence of student learning, or … increased student success … .” (p. 7, source link provided below) If the program does not have clearly defined and agreed-on KPIs, that can be a program goal.
Quality Improvement Plan (QIP), aka Action Plan, program plan, or work plan. Can be as simple as “improve our rate of return of documents to department A” or “increase the number of students attending orientations.” Involve all staff in identifying issues, problems, and opportunities for improvement, and in prioritizing them. Set out your objectives for the next cycle: what you want to improve or what new work you want to accomplish. Some QIPs may have multiple tasks, involve many staff, cross departments, etc., while others may be just one person working on one project. All QIPs should relate to at least one student or program outcome. Your QIPs can be drawn from your assessment plan and cycle if there is no data or recent change that requires revision.
Program review, outcomes, assessments, & evaluation suggestions
Top to bottom approach: Start by looking at the big picture and then work down, getting more specific to define the details on how to achieve
the Mission/Goals. Move from broad and long term to more specific and immediate: from Mission to Goals to Objectives to Outcomes
(SLOs/PLOs) to Action Plans to Assessments to Critical reflection and revision.
Bottom to top approach: Start by examining what you already do, what you already measure, what data you already have, and what you are already assessing. Define what those outcomes are, based on the data you collect. What question does the data answer? Move from that specific and short-term view to a broader, longer-term one, deriving objectives and goals from what you already do and then linking them to your mission. If you can’t make the link, reconsider whether you need to do that activity at all, or whether your mission statement fails to reflect what you actually do.
Of course, you can do both top to bottom, and bottom to top, at the same time. The idea is to get your cycle established, determine what you
will measure and when, and what you will do with your data. Then, you can run the program cycle year after year with minor revisions based on
the feedback loop.
Plan for the whole cycle: Each element does not stand alone; everything fits together in a cycle. Plan ahead based on the academic year or the comprehensive program review cycle. Plan to accommodate the phases of the process, and the documentation, in your normal work cycle and program or department meetings. Every department has ups and downs: times when one type of work activity is prevalent and times when a different one is prevalent. Customize your calendar cycle to your program, with input from all staff and even external constituents. Program review, setting outcomes, and evaluating and assessing outcomes are not separate activities; they are all part of one cycle, with the same or similar activities recurring regularly.
[Cycle diagram: College Mission & Goals (planning committees) → Program Mission & planning goals → Objectives for the year or cycle (action plans) → Program & Student Outcomes → Assessment & critical reflection or evaluation → Feedback, closing the loop → back to College Mission & Goals]
Documentation and Evidence: Forms exist for comprehensive program reviews, annual updates, and assessment plans. Some of the feedback travels through the program review process via the parts of the forms that go to the planning committees: Furniture & Equipment, Enrollment Management, Facilities, and Technology. Maintain meeting notes, summary reports, presentations, workshop notes, and any other products for posting to the “Artifacts” page. Use the Forums to discuss. Forms and Forums may change; the key is to maintain continuity of the cycle and evidence of your program’s diligence and attention to it.
Dialog: None of these activities should be done by one or two people in isolation from other staff. Accreditation guidelines are very clear that this process of continuous quality improvement must, at every phase, involve every employee at every level, with consideration for how each person’s work affects student success and furthers the college mission. The ACCJC Guide (link below) spends four pages on dialog, its importance, and how to engage in it. The BRIC guide says that “passionate, thoughtful inquiry” must be discovered or recaptured. So this must be an authentic and collaborative process within each program, communicated appropriately campus-wide to all constituent groups. Additionally, some exciting benefits accrue from inter-departmental, cross-departmental, or cross-disciplinary discussions. How does your program affect other programs? How do other programs affect your program?
Supporting documents
ACCJC Guide to Evaluating Institutions
http://www.redwoods.edu/Accreditation/documents/GuidetoEvaluatingInstitutions.pdf
This guide describes exactly what is expected of us and of the evaluation team, and what kinds of evidence are acceptable. The introductory section on dialog and general themes of accreditation is on pages 6-10. The Student Support Services (IIB) section begins on page 33 and continues through page 36, where the Library and Learning Support (IIC) section begins, running through page 38. Each section quotes the standards and then provides questions as prompts for the evaluation team. Each student support service program should consider how it might best answer these questions, and provide the necessary evidence, in its program review-outcomes-assessments-evaluation-feedback cycle. Sources of evidence for Standards IIB & IIC follow on pages 42-45.
Accreditation Reference Handbook
http://www.redwoods.edu/Accreditation/Accred_Ref_Handbook.pdf
The official statement of the ACCJC standards, with other policy statements relevant to the work of the commission. Standards IIB and IIC are included on
pages 21-24. Pages 47-53 list and explain the possible commission actions on institutions’ accreditation status. Also of interest to some departments is the
policy on distance learning, pages 67-68; and on diversity, page 69.
Inquiry Guide: Maximizing the program review process, BRIC Technical Assistance Program.
http://redwoods.edu/assessment/documents/INQUIRYGUIDE-MaximizingtheProgramReviewProcess.pdf
Everyone should read this entire document; at only 20 pages, it provides solid explanations of the purpose and function of program review, assessments, and key performance indicators. Pages 18-19 summarize the accreditation levels expected for awareness, development, proficiency, and sustainability in program review and related processes.
College of the Redwoods Assessment Handbook.
http://redwoods.edu/assessment/documents/Assessmentsv5c_000.pdf
For a fuller understanding of assessments and how they fit into the program review cycle, read the entire document. For targeted sections of value to support services, read the General Philosophy, pages A1-A2; the details on the proficiency levels, pages A5-A7; Guidelines for Assessment Activities, pages A9-A11; Tips for developing SLOs, pages B25-B26; and Student Support Services Assessment, pages D1-D5 and D7-D9. Although student support services have fewer faculty than instructional divisions and programs, the accreditation documents and other supporting documents and guidelines are very clear: program review and related processes are primarily faculty-driven, and the primary focus is on student learning. So support services need to consider how to prioritize faculty and student interests.
2011 Assessment FOR student learning, handbook: it’s all about helping students learn. Columbus State Community College.
http://redwoods.edu/assessment/documents/CSCC_AssessmentHandbook.pdf
Program Review: Setting a Standard. Academic Senate for California Community Colleges. http://www.asccc.org/sites/default/files/Programreview-spring09.pdf
These two documents primarily, almost solely, focus on instructional programs. However, because they both provide extensive treatments of outcomes and
assessments, they may be helpful in providing deeper understanding for employees in student support services programs.