Notes: Program Review Committee
Friday, October 8, 2010, 9:30 – 10:30 am, Lakeville Room

Present: Paul Hidy, Hillary Reed, Cheryl Tucker, Vinnie Peloso, Mike Peterson, Cindy Hooper, Rachel Anderson, Melody Pope, Toby Green, Todd Olsen, Karen Nelson, Crislyn Parker (recording), Justine Shaw (guest)
Absent: Zach DeLoach, Keith Snow-Flamer, Utpal Goswami, Bill Honsal, Brady Reed

1. Where to archive the Program Level Outcomes assessment results/data and the follow-up reports for course-level assessment conclusions:
- Course-level assessments are archived with area coordinators.
- It was suggested that program-level assessments be archived with IR, Curriculum, and/or Public Folders. (It was noted that QIPs and Assessment should be under Program Review in Public Folders; alternatively, create a separate Assessment Committee webpage, add a link to the Program Review internal site, or both.)
- Action item: The deans should request that all completed assessment forms be sent for archival in Public Folders and on the website.
- Action item: Send assessment forms to all deans and area coordinators to facilitate consistent assessment activities.
- Regarding the discussion about customizing assessment forms (some divisions have specific assessments required for outside accreditation and asked whether these can be incorporated): per Justine, Section 1 is okay to customize, but Course Level 1 needs to be consistent. She believes that as long as the information is transferred to the standardized form for program review, customization should be acceptable.
- Per Justine, the assessment documents that are not part of the program review templates are intended mainly for internal reporting and can be revised to work for program review documentation. Justine will talk to Zach and Karen to be sure all documents coordinate and the information is consistent.

2. Do the ("loop-closing") follow-up reports for assessment conclusions need a separate form, or is the current Program Review Template an adequate place to house this data?
- Based on the ACCJC workshop last year, accessibility of documents and assessment conclusions after closing the loop is important for comparing success rates and determining whether changes made based on assessment actually improved student success.
- Per Justine, the current template allows narrative about what changes are going to be made, but a simple form is needed to capture, in narrative form, what assessments were done, what changes were made, the impact of those changes, and subsequent student success data. As yet, CR hasn't pushed faculty to close the loop.
- One suggestion is to add another tear-out to the Excel template (what was done last year and how it worked), which can be pulled out and posted on the PRC site as Program Assessment Outcomes. For this year, create an addendum to cover this situation.
- It was questioned whether Zach can download that portion into the Program Review templates each year, since the authors are accountable for what is included in program reviews and all decisions are supposed to be related to student success data.

3. Definition of "program"
Options discussed:
- Organize definitions around areas of influence (there was concern about how that would work).
- Be organic; let area coordinators define programs. Outlier courses that have no oversight (such as Addiction Studies) can be identified and assigned by deans to an area coordinator or author.
- Define programs based on how the program is conducted and by whom, in each area as needed (i.e., career and technical education, student services, academic, etc.).
- Define which "groups" need programs reviewed.
- Recommend that deans and area experts identify programs and vet them through the PRC, which would mostly serve to identify the problem areas, as the academic side is already fairly well identified.
- Action item: The committee recommended consulting with deans and area/program coordinators to formulate programs in an organic manner, according to past practice, in order to ensure all courses and programs have area coordinator/program/author representation and oversight. Authors need to have input and be primarily responsible for identifying program needs across the district. The Program Review Committee will oversee the viability of the author and review; authors are charged with contacting all individuals, constituents, and advisory committees involved in that program throughout the district, including administrative office assistants. Authors will work with deans to clarify the definitions. (This will not be a generic definition, but documented by course and function.) The deans and IR will create a master spreadsheet listing programs and present it to the PRC for approval.
- Action item: Determine where comprehensive programs fit into the definition. A simple solution: change the terminology to "annual discipline review" and "comprehensive program review."

4. BPC ranking strategy
Moved to next meeting.

5. Other:
- Karen will see if Zach can post the Comprehensive templates soon.
- Concern was voiced that the addendum information was not sent out on 9/27 as promised. Integrated planning committee functions need to be coordinated and emails sent out according to the calendar.
- Send out another addendum email with due dates and a reminder to fill out the top of the page with division, author, and year. Email title: "Program Review - Final call for resource requests."