Program Review Committee
Friday, October 12, 2012, 9–11 a.m., Old Boardroom
Present: Utpal Goswami, Keith Snow-Flamer, Tanya Smart, Brie Day, Barbara Jaffari, Angelina Hill, Jon Pedicino, Dana Mayer, Steve Stratton, Jeff Cummings, Rachel Anderson, Bob Cox, Hillary Reed, Crislyn Parker (notes)

1. Review Instructional Template
a. Data Included (not yet in template)
IR Director Angelina Hill reviewed the revised data sets and prompts. Data sets are posted on the IR web site under Program Review. Each data set includes a definition and prompts to trigger a potential response in the template's corresponding area. Templates will be revised to include links and prompts.
There was discussion at the September 28 meeting on the definition of transfer. The Chancellor's Office has several definitions, and IR is using the one that works best for CR reporting purposes. IR will asterisk these definitions in the prompts to lead authors to the definition. Transfer data sets include transfers from all sites.
Director Hill requested that committee members review the data sets and email her by Friday, October 19, with any concerns or questions.
A prompt will be added to the curriculum section asking authors to note, if curriculum has not been and will not be updated, why, and/or when it will be removed from the curriculum.
FTES: Authors should compare their data to district data and comment on faculty load or class size if the ratio is very high or very low. Authors needing assistance should talk to their dean, IR, or their VP.
Retention is determined at census. Fill rate is counted regardless of why a student is taking the class. Prompts will be included in the FTES and Curriculum sections.
The committee discussed that language and terms should be defined in a way the college understands, not because a definition is technically perfect; authors need to hear and understand what the committee is asking for.
b. Discuss Prompts
Included in the discussion above.
2. Review "Common" PRC Response Prompts (all templates)
Clarification was added to Section 7: the program review committee provides comments on all sections of the program review, to be sent to authors and deans/directors. The format might change slightly once the BPC's requirements are known. Send template changes to Student Services (Keith) as soon as possible.

3. Discussion on Areas which may Require Comprehensive Program Reviews
The PRC agrees the comprehensive review process can be streamlined. A comprehensive program review should be triggered by annual reviews; the committee will decide which triggers would require a comprehensive. It was suggested to draw from program revitalization concept factors (enrollment history, retention, etc.). For example, one option is to choose 20% of programs for analysis, include five years of data, determine four or five data elements, and review. Another option is to note problem areas each year and track them. However, if one goal of the PRC is to make the process less onerous while maintaining quality, it was suggested to keep the cycle as is and, in the year a comprehensive is due, add an extra section to the template: for example, a Section 8 in which authors compare data trends for five indicators, with the program either meeting the threshold or not.
A comprehensive review is done to be sure a program is not a drag on the institution. It is the job of the PRC to note both upward and downward trends. The committee agreed to inform authors and deans where a program is lacking before requesting a comprehensive. If the problem is not corrected, it will lead to a comprehensive; and if a review is taken out of sequence, the trigger point must be clearly and specifically documented. Program development falls under BP 4020, and the CIO and senate will determine processes for bringing to the college's attention any program that is rapidly expanding and needs additional resources.
It was suggested the PRC include trigger points, such as insufficient faculty or facility issues, in its summary notations to reinforce resource requests and district dialog. Communication should come from the PRC when it sees a program in trouble: put the program on a watch list and offer assistance. Success, retention, and enrollment are triggers. The PRC is not a "stick" for red-flagging programs, but an evaluating and assisting body.
Hybrid program trends may cause some problems. There was discussion of whether it is better to look at the validity of the certificate or degree rather than always at the number of students. A comprehensive review looks at the total package to see if it is right.
Another option is to analyze program-level outcomes as the comprehensive, which would be meaningful and easy. Authors would coordinate assessment and reporting documents, consolidating them into one comprehensive document for the PRC. Courses that aren't core to a program won't need a comprehensive.
Members of the PRC will work with areas on a comprehensive review template as follows:
o Steve Stratton will work with Administrative Services
o Keith Snow-Flamer will work with Student Services
o Rachel Anderson, Utpal Goswami, Angelina Hill, and an assessment committee member (possibly Erik Kramer) will work with Instruction

4. Community Ed Program Review
The committee discussed the review and concluded it should include the number of courses offered and students served. Community Ed also needs to develop indicators that relate to its program function, as well as outcomes either for the courses offered or for the overall program. The committee would like to see numbers in the Quality SLO section. In the future, CED will use either the administrative or the student services template and determine its indicators for the current year (2012–13). At present the committee has little to which it can respond; converting to the new template will provide that opportunity.
The deadline for service area heads (not instructional areas) to submit metrics is coming up. It is noted that CED is under Instruction through the Director of Human Resources.

5. Future meeting agenda item: Process Review, 10/26/12