
College of the Redwoods

Program Review Committee

2013-14 Program Review Committee Executive Summary

Section IA: Program Review Committee Data

Number of Programs Reviewed (% of each area) Total: 54

Area                Number   % of Area Reviewed
Instruction           32          100%
Student Services      10          100%
Service Areas         12          100%

I. Instructional Program Reviews: Number of Program Reviews Deemed to Have Been Completed According to the Rubric as (E) Exemplary, (A) Acceptable, or (D) Developing

The PRC developed a rubric by which to evaluate instructional and student services program reviews and to determine whether programs are evaluating, and linking planning and resource requests to, institutional goals and assessment. Reviews were rated exemplary, acceptable, or developing (sometimes a combination) in Section 2, Data; Section 3, Assessment; Section 4, Previous Plans; and Sections 5 and 6, Planning and Resource Requests. One or two areas were not acceptable.

(Note: not all programs had previous plans to evaluate, because they were new or run every two years; not all programs requested resources; and a few programs had combined acceptable/developing and/or exemplary ratings.)

Section                        E         A         D        N/A
Section 2, Data              19.35%    83.87%     0%        --
Section 3, Assessment        22.33%    60%       30%        --
Section 4, Previous Plans    30%       56.67%    13.33%    3.33%
Section 5, Planning          32.26%    45.16%    22.58%    3.23%
Section 6, Resource Requests 41.38%    41.38%    17.24%     --
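A quick consistency check (an inference from the note above, not something stated in the summary): the denominators appear to vary by section because not every program had previous plans to evaluate or resources to request. The Section 6 figures, for instance, match 29 reviews that included resource requests:

\[
\frac{12}{29} \approx 41.38\%, \qquad \frac{5}{29} \approx 17.24\%
\]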

The Forestry, Math, and Physical Science program reviews were tagged as model program reviews, meeting the exemplary criteria in all sections: analyzing their data and assessments and linking planning and resource requests to institutional goals and assessments. The Automotive Technology review was also very well done.

The Construction Technology program review was very minimal. The program includes all aspects of construction except the historical preservation courses, and historically the PRC has had difficulty determining which programs are reflected in the review; from an outside perspective it is confusing. Assessment is not tied to planning. The PRC strongly recommends changing the title, providing more detailed assessment and planning, and linking to institutional documents. The dean vetting this review should also be more engaged with it.

II. Student Services Program Reviews: Number of Program Reviews Deemed to Have Been Completed According to the Rubric as (E) Exemplary, (A) Acceptable, or (D) Developing

Section                        E            A            D           N/A
Section 2, Data              3 (30%)      6 (60%)      1 (10%)       --
Section 3, Assessment        2 (22.22%)   5 (55.56%)   2 (22.22%)    --
Section 4, Previous Plans    2 (20%)      8 (80%)      0 (0%)       0 (0%)
Section 5, Planning          3 (30%)      7 (70%)      0 (0%)       0 (0%)
Section 6, Resource Requests 1 (10%)      8 (80%)      1 (10%)       --
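The percentages follow directly from the raw counts shown in the table. Section 3, for example, is consistent with nine rated reviews out of the ten student services programs, which suggests (though the summary does not say so) that one review was not rated on assessment:

\[
\frac{2}{9} \approx 22.22\%, \qquad \frac{5}{9} \approx 55.56\%
\]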

Analysis of Rubric for Instructional and Student Services Program Reviews:

Acceptable reviews, based on the rubric, are in the majority, which shows that program review analysis is improving and that authors are advancing in tying data and assessments to planning and resource requests.


Service Area Reviews:

Section II: Highlights of How the Use of Data Analysis Impacted Student Learning

Section III: Impact of Assessment (Instructional)

Measure                                            Number   % of Total
Programs on track regarding assessment*
Advisory committees met during assessment period
Curriculum updates current/completed**

* of programs had completed mapping and had evaluated, or were on track to evaluate, SLOs and PLOs. Many noted plans to complete this semester.

Program Reviews were due at the beginning of the semester.

**Most programs were on track to update curriculum; many were in the process of inactivating and updating courses, so the numbers were not fully reflected.

How Assessment Activities Have Led to Improved Student Learning

Summary Statement on Planning and Relationship to Institutional Documents

Section IV: Resource Requests Reviewed and Ranked

Category                   Number   % of Total
Professional Development      8         5%
Personnel: Staffing          17        11%
Personnel: Faculty           17        11%
Total Requests              157       100%
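As a rounding check on the percentages (the itemized categories account for 42 of the 157 total requests; the remaining categories are not itemized above):

\[
\frac{8}{157} \approx 5\%, \qquad \frac{17}{157} \approx 11\%
\]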

Evaluation of Program Review Process


Statement on How Well the Process Worked

While the process went smoothly, the committee noted possible process improvements throughout the year. Both the committee's efficiency and the ease of use of the templates improve every year. The committee was able to streamline the work required to evaluate all reviews.


Recommendations for Improvement

Some areas discussed for improvement include:

Fall and Spring Program Review Process and Timeline

Evaluation of the author feedback process

PRC members should check the accuracy of the data evaluation if the author provides no comment.

Program review documents should be useful for the institution in determining the standing of a program, and they should allow the program to make improvements internally. The committee discussed whether the process should be more like the curriculum process, where reviews are posted online and comments can be fed back to authors to meet internal standards.

Statement on the Instruments (Template)

Overall, the templates worked well; they are refined each year as needs and issues are identified.

An online template is being created that will give authors easier access.

Recommendations for Improvement

The committee discussed:

Instructional Template:

Revising and clarifying the instructions in the advisory committee section. (CTE authors need to note whether the advisory committee meeting requirement was met or explain why it was not.)

Clarifying and revising the instructions for completing Section 2, Data, and adding check boxes to indicate whether the program is 3% above or below the district average; if so, the author provides a narrative.

Creating a program review template for Distance Ed; IR will work with the Distance Ed committee to determine how the final document should look.

Including a prompt for achieved outcomes for multiple delivery modes (DE vs. face-to-face).

Limiting the length of narratives; they should be succinct and to the point.

Prepopulating the template with the PRC's comments from the previous year, so authors can improve their reviews based on feedback.

Service Area and Student Success Templates:

Service area and student services reviews should use the same indicators over a period of at least three years to provide consistency.

Indicators need to be included and analyzed.

General Themes

The prevalent theme is that program review is getting better each year!

Programs are improving at tying their assessments to planning and making changes based on assessments, as well as at tying planning and resource requests to institutional goals.
