Program Prioritization Task Force Minutes November 19, 2012

Present: Angi Brenton, Vicki Szabo, Brian Railsback, Tim Carstens, Dave Hudson, Mary
Jean Herzog, Laura Cruz, Chip Ferguson, Georgia Hambrecht, Jason Lavigne, John
Baley, Bruce Henderson, Joan Byrd, Dave Kinner
1. Minutes from 12 November meeting – on Blackboard for Information
2. Discuss feedback from comparative institution interviews ('Comparative Reports' - 16 pp.;
don't need to print this; can just read / review).
UNCG
Brian – concern was voiced regarding the complexity of some of the categories.
Tim – feedback was that the process did not go all that well. They saw problems with the same
items mentioned in the Dickeson book. They may have followed the book's criteria too closely
and did not follow the rest of the process as much. Reiterated that the process needs to be
aligned with mission and needs to have administration behind it. One problem they
encountered was that during unit analysis there was no point person. They had to backpedal
when they discovered criteria could not be obtained, were too complex, etc., and this did not
go over well since people had done the work.
Joan – a big problem was that institutional research had to dump the data quickly,
without explanation to faculty and department heads. In the end the chancellor
decided not to use all the data. It would have been helpful if the faculty could have
understood the data (the efficiency data) in time to utilize it - it would have been an essential
part of the review. The chancellor and provost kept changing their minds about things, which
did not go over well. They also changed the message as to why they were completing program
prioritization. Originally the plan was to involve faculty; then they decided not to include
faculty.
Mary Jean – they wish they had been more careful about what they published - it ended up in
the newspaper. The administration said the financial crisis was the main motivator. The
process was to have been completed in a few months, but the time was extended. Quite a
number of programs were eliminated or put under scrutiny – many were inactive programs.
Some weak programs were eliminated. The list is public somewhere. She did not think it
saved any money in the long run. The same data were not always used by different areas of
the university; the data were not reliable and up to date.
Laura – there was a big split between faculty and administration, a breach of trust. They now
have a small number of faculty set against the process.
ECU
Chip – the most significant comment was that we must ensure confidentiality and trust among
the task force or be doomed to failure. Talked about committee makeup – thought it was based
on finance. They asked each department head to write up a SWOT analysis of programs, and
deans did this as well. Faculty input went to department heads and then to the dean. They used
a program data template that was sent to deans. It is important that everyone agree the data is
accurate and all sign off on it. Once the reports were turned in, they did a first cut. Colleges
were allowed to do two self-studies, one after the first cut. They ranked programs high,
medium, and low. They did perform some statistical analysis that was used in cuts. This was a
good way to allow the pressure cooker to blow off steam. The chair assigned three committee
members per program – programs were ranked, then sent out to the colleges for discussion and
feedback, then given a second evaluation. This was deemed a favorable exercise. They
referenced the UNCG debacle and intended not to go there. They had colleges rank their
programs using the same criteria as the task force. They would have had more meetings with
the colleges if they could. They split BS and BA degrees. The chancellor did follow through –
using the model and data collected to look at future funding, so when funds come in, they go
right to the program prioritization lists. If a position is freed up, they go back to this list to
consider it. They are now in phase two, moving around departments and colleges in their
model.
Laura – they went after low-hanging fruit, but there were positive repercussions from
prioritizing programs, which helped them to understand the need for good institutional data.
They now have dashboards that are utilized on a regular basis. There is an article about it in
the November issue of Change Magazine.
Mary Jean – they talked about consolidating colleges but in the end did not do that. They
talked about the time frame and in-depth involvement with different stakeholders. This was
not a department thing but a university-wide initiative; task force members were not
representing their own constituencies. Transparency was very important. They talked about
merging colleges and are now in the second phase – in the fine arts and communication
colleges, they now have rotating deans. They got rid of programs that had been inactive.
They are now concerned about faculty retention.
NC State
Angi – they had two processes. The first was about effectiveness and efficiency, not focused
on quality, just cost and enrollment data. That process has just been completed. The second
process is more about approval of new programs now that GA has limited each university to
3 new programs at a time – they created a more organized process to address this. New
programs have to be approved by the task force and provost before being approved to move
forward. WCU has more than three programs in the pipeline right now: 1) Global Foreign
Languages, 2) BFA in Musical Theatre, 3) doctorate in Nurse Practitioner; under
consideration are 2-3 more (Master's in Criminal Justice, Ph.D. in Psychology, and Master's
in Forensic Science).
Sacramento
– deans were asked to use the recommendations and then did nothing with them. The
committee was called the Four Thousand Page committee. The committee had to stay a month
into summer to complete the work. They had two phases – one committee met for 1.5 years to
decide on the criteria. Another committee ranked the programs. They provided an opportunity
for rebuttal – one committee member thought it was great, another said it was horrible to
figure out the new data. They had five people read each criterion – lots of qualitative data.
Goals were based on baccalaureate learning goals and accreditation. They had four criteria
that were highly ranked. They thought it was good that the committee spoke with one voice. It
took them a long time. They did not do test cases and wish they had done so. They did talk to
the data office, department heads, and program directors all the time. Those groups helped
figure out what criteria mattered most.
Themes from these conversations - timeline, criteria, transparency, assessment, the rebuttal
issue, simplification, follow-through
If you have not completed an interview and hear from them, please share that.
Angi spoke with Dianne Lynch about their process – their task force is looking at
reorganization outside of academics. Their first focus was Advancement. Clifton is retiring
at the end of January; therefore the Chancellor wanted guidance on reorganization there. The
task force has submitted recommendations. They are now working with Administration and
Finance. They still plan to meet the January timeline. They did not have criteria per se, but
did visit other campuses whose reviews included non-academic units (NC State, ECU, and
UNC-Greensboro). They mainly have been proceeding with surveys and focus groups, met
with each of the deans, and surveyed various campus units - they are less data driven. These
groups are so variable that there was no way to come up with a common set of metrics. WCU
is way ahead in instructional positions as compared to our peers and way below in
non-academic units (this leaves many offices one person deep, which has resulted in some of
our processes being fairly inefficient). Angi will share this report.
3. Ongoing discussion of criteria and indicators
 Proposal for two-tiered process? ('Possible core metrics,' attached)
Bruce kicked off this discussion last week in the data workshop about indicators that
would provide a good threshold without reviewing every program. Melissa thinks there are
such indicators and suggested six criteria that would give us our first cut. There will be
other programs that we clearly will not cut because they are growing so fast and are so
productive – why would we make them do this (like nursing)? It is possible to do a two-
step process, first looking at a limited amount of data (number of majors, number of
graduates, generated FTE, program retention and graduation rates, and uniqueness - is
this a program whose degree area or specific niche is not duplicated in other
programs?). Melissa suggested we could set threshold numbers, or we could look at a
percentage out of the indicators: we could see how many programs met 4 or 5 of them and
decide what the cutoff would be. For any given program there might be one indicator they
don't meet. This might leave us reviewing half the programs we are looking at in the more
intensive phase. For programs that made it through the initial screening, they would still have
the option of submitting a case statement for increased funding. Those going through full
program review could also submit this request. Discussion ensued regarding pros and cons of
this possible process. The task force discussed a hybrid – quantitative data based on selected
indicators and a 500-word qualitative statement as to why they are a vibrant program.
The majority of the group wants a two-stage process.
If we are going to do stage one screening, what are we going to look at?
Melissa can easily provide us with most of this data. Discussion ensued.
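For illustration, below is a minimal Python sketch of the indicator-count screen being
proposed. The threshold values, the cutoff of 4, and the helper passes_stage_one are
hypothetical placeholders, not anything the task force has set; only the indicator names come
from the discussion above.

# Minimal sketch of the proposed stage-one screen. All thresholds and the
# cutoff below are hypothetical placeholders, not values the task force set.

# Indicators named in the discussion; the numeric thresholds are invented.
THRESHOLDS = {
    "majors": 25,             # number of majors (hypothetical)
    "graduates": 10,          # graduates per year (hypothetical)
    "generated_fte": 100,     # generated FTE (hypothetical)
    "retention_rate": 0.70,   # program retention rate (hypothetical)
    "graduation_rate": 0.50,  # program graduation rate (hypothetical)
}

CUTOFF = 4  # e.g., meet at least 4 indicators to skip the full review


def passes_stage_one(program: dict) -> bool:
    """Return True if the program clears the initial screen."""
    met = sum(
        1 for key, threshold in THRESHOLDS.items()
        if program.get(key, 0) >= threshold
    )
    # Uniqueness is a yes/no judgment: is the degree area or specific
    # niche not duplicated in other programs?
    if program.get("unique", False):
        met += 1
    return met >= CUTOFF


# A fast-growing, productive program (e.g., nursing) would pass and
# skip the more intensive second phase.
nursing = {"majors": 300, "graduates": 80, "generated_fte": 450,
           "retention_rate": 0.85, "graduation_rate": 0.75, "unique": True}
print(passes_stage_one(nursing))  # True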
 Working task force draft ('Working draft prioritization criteria indicators table')
 Proposed 'case statement' from B. Railsback ('Alternate prioritization categories')
 Finalize and disseminate prior to 27 Nov. forum
o COD / FS / SS / SGA / CLC.
o Others? Campus? Website?
Postponement of the forum from November 27 was suggested. It was agreed we will
wait until January.
4. Program / unit analysis – ('Master list programs' – 9 pp.)
5. Preliminary preparation for Forum
 Feedback on criteria – open to change / present for questions only?
 Discussion of program report / assessment / other issues that may come up?
6. Moving forward
 Survey results
 Next meeting – Monday, 26 November, 12:30-2:00, UC Dogwood
o Format of program reports
o Program master list / complete narratives and program selections
For the next meeting, a broad item to be thinking about – assessing the data. We have
basically three models to choose from for assessment: 1) points on each criterion, maybe
weighted in some way (advantage: it is measurable and honors the criteria); 2) a more
holistic analysis – a score out of 100 over the whole program profile; or 3) a midrange
option - a point value range for each section, with the total at the end.
Please be thinking about this. This would be for the second stage.
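For illustration, a minimal Python sketch of model 1 (weighted points per criterion) is
below; the criterion names, weights, and scores are hypothetical placeholders, not anything
the task force has chosen.

# Minimal sketch of model 1: points on each criterion, weighted.
# Criterion names, weights, and scores are hypothetical placeholders.

WEIGHTS = {
    "demand": 2.0,           # hypothetical weights
    "quality": 3.0,
    "cost_efficiency": 1.5,
    "mission_fit": 2.5,
}


def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-10 each) into a weighted total."""
    return sum(WEIGHTS[c] * scores.get(c, 0) for c in WEIGHTS)


# Model 2 would instead be a single holistic score out of 100; model 3
# (midrange) would assign a point range to each section and sum the totals.
example = {"demand": 7, "quality": 8, "cost_efficiency": 5, "mission_fit": 9}
print(weighted_score(example))  # 2*7 + 3*8 + 1.5*5 + 2.5*9 = 68.0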