NCCBC Admin. Survey

The National Benchmarking of an
Assessment Tool to Measure
Outcomes in Administrative Areas
National Community College
Benchmarking Conference
Terri M. Manning, EdD
Central Piedmont Community College
Charlotte, North Carolina
Some history….
• CPCC is the largest community college in the Carolinas, with approximately 60,000 enrolled students.
• We have over 100 instructional program areas, including for-credit, continuing education and literacy.
• We have over 30 administrative units.

What’s Going on Nationally

The Spellings Commission Draft Report calls for
these things in higher education (across the
country).
• the complete overhaul of the entire student aid system, together with substantial increases in need-based aid;
• creation of an overall measurement of an
institution's "bottom line," including measures of
institutional costs and performance that let parents
and policy makers view institutional results;
• a mandate that institutions measure student learning
outcomes, disseminate the results to students, and
report them publicly in the aggregate;
Spellings Commission Report
• development of a national student unit-record
database to follow the progress of each
student;
• establishing a national accreditation
framework that includes comparable
performance measures, and making the
findings of reviews easily accessible to the
public;
• removal of barriers to the transfer of student
credits; and
• creation of financial incentives for institutions
to lower costs through technological
advances.
http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/0714-draft.pdf
What May Happen
• More emphasis on outcomes for all, which must be transparent and available to the public.
• Overhaul of financial systems and funding formulas, tied to student success and outcome-based criteria.
• Our spending justification under the microscope (especially administrative overhead, which is huge for most).
• A complete overhaul of accreditation.
What We Have and Don’t Have

Have:
• Good national tools to assess student satisfaction and engagement

Don’t have:
• Good tools to measure administrative units or “all-college functions”
• No one has really thought about it
Institutional Effectiveness and the Southern Association of Colleges and Schools

• SACS is the oldest and strictest accrediting agency, with North Central coming right along behind it, but the others are not yet quite as strict.
• At the heart of SACS’ philosophy is the concept of institutional effectiveness (IE).
• IE involves a process of planning, evaluation, assessment and use of results.
• Under the new core requirements and comprehensive standards, there are two “mandates” that specifically address program or unit reviews and the assessment of outcomes.
Under the New Core Requirements and
Comprehensive Standards
• Core Requirement 2.5 states: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement, and (b) demonstrates that the institution is effectively accomplishing its mission.”
• Comprehensive Standard 3.3.1 states: “The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”
What We Need in a Nutshell

A planning and evaluation process for instructional programs, student services and administrative areas that is:
• Ongoing
• Integrated
• Institution-wide
• Systematic
• Research-based
• Shows continuous improvement
• Proves effectiveness (meeting mission)
The Pieces We Need

• Planning and evaluation process
• Program review process
• Definition and measurement of outcomes
• Use of results (with evidence) for improvement
What is Evaluation?

• Not research, but research-based.
• A process of defining purpose, identifying process, outcome or impact objectives, and measuring whether those were met.
• Using the results to:
  • Improve what we are doing
  • Make changes in program/purpose
Why this Process?

• The evaluation of teaching and educational effectiveness is a top priority for most institutions.
• However, the evaluation of business and administrative units is often undervalued and overlooked.
• Evaluating the effectiveness of services (not just customer satisfaction) makes good business sense.
Why This is Important

We want to answer a few questions through our program evaluation and assessment process:
• Is this unit effectively serving the college?
• What needs does the college have that this unit can fulfill?
• Is this unit working efficiently?
• What are the administrative outcomes for this unit, and is it meeting them?
• How can we demonstrate continuous improvement as a result of planning and evaluation?
IE at Central Piedmont

• CPCC established an IE committee in 1998 and an IE website in 2000 (http://inside.cpcc.edu/IE). The website gets an average of 60 hits a day, most from outside the college.
• Committee members were from various instructional departments, student services and administrative/business offices.
• The committee established an IE plan for the college.
Visual of the IE Plan
(diagram of the four overlapping pieces, listed below)

Four Major Overlapping Pieces of the IE Plan:
• Annual Goal Setting Cycle – annual goal setting and follow-up required by all units (Strategic Plan)
• Program/Unit Review – program/unit review on a 3-5 year cycle
• College Assessment Process – a college assessment plan involving surveys, assessment and data analysis
• General Education Assessment – the evaluation of general education
Developing the Process

• We wanted to create a meaningful program/unit review process.
• We wanted programs to complete the process having learned something valuable (not a document to sit on a shelf).
• We wanted the process to be “outcome-based” so it would stand the test of time.
Developing the Process

• The first review process was created for instructional programs.
• During the 1998-99 year, 18 programs were reviewed (approximately 100 programs over five years).
• The perception at the beginning was that “this is just another academic exercise.”
• But the results were very different from any review process previously completed.
Identifying Outcomes

We focused on three types of outcome measures:
• Program Outcomes
• Student Learning Outcomes
• Administrative or Managerial Outcomes

"Outcomes" are benefits for people: changes in knowledge, values, position, skills, behavior or status. More simply stated, outcomes are typically what service providers hope recipients achieve once they complete a program or receive services. This is not the “what” but the “why” of education.
Outcomes

• Student learning outcomes are outcomes related to the learning that takes place in the classroom. We measure improvements in writing, speaking, understanding the scientific method, etc.
• Program outcomes are the benefits to a student who receives an associate degree in Nursing or completes a certificate in Network Administration. Typical outcomes might be licensure exam scores, job placement, etc.
• Administrative outcomes are the benefits to the college as a result of the direct services provided or the function of a unit.
• Outcome objectives are simply objectives that relate to the identified outcomes.
Program Outcome Model

INPUTS (Resources):
• Staff
• Buildings
• Facilities
• State funds
• FTE
• Constraints: laws, state regulations

ACTIVITIES (Services):
• Education (classes)
• Services
• Counseling
• Student activities

OUTPUTS (Products or Results of Activities):
• Numbers served
• FTE (input next year)
• # Classes taught
• # Students recruited
• # Who participated
Program Outcomes Model

INPUTS > ACTIVITIES > OUTPUTS > OUTCOMES

OUTCOMES (Benefits for People):
• New knowledge
• Increased skills
• Changes in values
• Modified behavior
• Improved condition
• Altered status
• New opportunities
Organizational Stages of Outcome Evaluation

• Stage 1: Disbelief & denial; paralysis - passive resistance
• Stage 2: Anger and antagonism; resistant & reactive
• Stage 3: Bargaining - no time/no money; seek outside sources
• Stage 4: Depression - exhaustion; compliance - passive reactive
• Stage 5: Acceptance & adaptation; challenge & competition; catalyst - proactive
Instructional Program Review

The review process contained five sections:
• I. The Program Profile
• II. Program Content
• III. Student Learning Outcomes
• IV. Need for Change
• V. Future Issues
Instructional Program Review

Tasks accomplished through review:
• Defined their program/unit responsibilities
• Detailed faculty and staff (credentials, professional development, accomplishments)
• Stated the unit mission or purpose
• Linked the unit mission to the college mission
• Defined the content of their function for the college
• Defined the population served
• Set outcome objectives
• Performed some means of assessment
• Analyzed results
• Determined strengths and weaknesses
• Created strategies for improvement
• Determined future needs (including curricular change, needed resources, staff and space)
• Wrote a one-year follow-up to make sure they closed the loop
An Example from Instruction

Workplace Basic Skills
• This program is a literacy initiative that goes directly into the worksite and teaches ESL classes, GED prep and GED classes.
• During their review, they surveyed both employers and students.
• This was the first time they had ever done this.
What They Learned

Employers said:
• 43.8% of employers reported increases in employee performance as a result of participation in the program.
• 31.3% reported a reduction in absenteeism by participants.
• 87.5% said classes improved the morale of their employees.
• 37.5% said participants received raises.
• 50% said communication had improved.
What Students Said

• 70.2% reported being able to fill out job forms better.
• 35.5% said they could now help their children with their homework.
• 91.1% said they felt better about themselves.
• 44.4% said they had received a raise, promotion or opportunity as a result of the courses.
• 86.3% said their ability to communicate in the workplace had improved.
What Has Happened Since

• Their assessment data has shown up in their marketing brochures to employers.
• Their enrollment has grown dramatically.
• They have received funding and marketing support from Charlotte Reads (the program is considered a model adult literacy program).
What We Learned From the Entire Process

• The first group was dragged kicking and screaming through the process.
• Faculty had very little time for the details.
• Faculty had trouble identifying “student learning outcomes.”
• The results produced great marketing materials.
• Once they saw the results, faculty embraced the process.
Assistance from the Website

The IE website (http://inside.cpcc.edu/IE) provides:
• An explanation of the IE process
• Templates and forms for review
• “Perfect” examples for clarification
• A schedule for review
What Happened

• Deans used the results to make a case for resources.
• The administration became interested in what was learned through the review process.
• The college began to fold other activities into program review (retention assessments, program revitalization, etc.).
• Faculty knew their programs were working.
Unit Review – Lessons Learned

To do this well you need several critical pieces:
1. Support from the top (President or Chancellor, Vice Presidents or Vice Chancellors)
2. Buy-in from the grassroots level (participation in the development of the process)
3. Across-the-college participation (no one is exempt)
4. Technology to make it easy (web page and review templates)
5. Technical support from institutional research
6. Effective tools or measures (assessments)
IE Committee Decision

• Administrative units helped create an environment conducive to learning at the college and supported the learning process.
• Administrative units should go through a similar process.
• We created a committee to draft a similar process for the administrative areas of the college.
• Two representatives from each VP area participated in drafting the new review process.
• We spent approximately six months creating a workable administrative review process.
The Administrative Unit Review Design

I. The Unit/Program Profile
  A. The Mission/Purpose
    1. Role the unit plays in the college mission
    2. Unit/program goals as they relate to the college’s mission
  B. The Staff
    1. Professional and administrative staff (since the last review)
      a. Position description/duties
      b. Credentials (full and part-time, if any)
      c. Accomplishments (if applicable)
      d. Service to college, community and nation
The Administrative Unit Review (continued)

  B. The Staff (continued)
      e. Professional development activities
    2. Classified Staff
      a. List of names and positions
      b. List of required credentials (if any)
  C. The Customer/Client Served
    1. Breakdown of students, faculty or staff by type or demographic information (thorough explanation of who is served)
The Administrative Unit Review (continued)

II. Definition of Services or Program
  A. Definition of the day-to-day duties of the unit
  B. Innovations, new projects, new initiatives, local, state-wide or national efforts
  C. Required functions of the unit (description and status of compliance)
    1. SACS "must" statements
    2. State mandates
    3. Federal mandates
    4. Other
The Administrative Unit Review (continued)

III. Administrative Objectives and Student Outcomes (where appropriate)
  A. Administrative objectives (2-3 objectives)
  B. Outcomes (or status if incomplete) of innovations, new projects, new initiatives, local, state or national efforts
  C. Assessment explanation (what was assessed, who, when, how many)
  D. Results of administrative objectives based on assessment
The Administrative Unit Review (continued)

IV. Need for Change
  A. Strengths identified by external sources, faculty, staff and students
  B. Weaknesses identified by external sources, faculty, staff and students
  C. Recommendations by faculty, staff, external sources and students to improve the unit's services and programs
  D. Strategies for change (based on input from Sections A, B & C) - closing the loop
  E. A brief one-year follow-up report to the unit VP on the progress of D above (due at the end of the year following review)
The Administrative Unit Review (continued)

V. Future Issues - Resources needed for future efforts
  A. Market trends within the broad service unit or program area (based on best practices, the literature or training received)
  B. Anticipated future changes and needs (based on market trends)
  C. Resources, equipment, space, staffing and workload changes needed for future growth or continuation
  D. Future plans of the unit
But How Do You Assess the Outcomes of Administrative Units?

• We created an evolving and revolving faculty/staff survey on a three-year cycle (a sketch of one possible rotation appears below).
• The survey is an online product and easy to administer.
• We spent time training units on “what outcomes were” rather than just satisfaction questions.
• Units could also use the survey for needs assessment questions.
• Satisfaction questions can still answer administrative outcome objectives, though.
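The slides do not spell out how the three-year rotation was scheduled, so the following is only a hypothetical Python sketch of one way to spread units across a revolving survey cycle; the unit names, years, and the `build_rotation` helper are all invented for illustration.

```python
# Hypothetical sketch: assign administrative units to a three-year rotation
# so that each year's faculty/staff survey only carries questions from the
# units under review that year. Unit names and years are illustrative only.
from itertools import cycle

units = [
    "Facilities Maintenance", "Security", "Planning and Research",
    "Information Technology", "Purchasing", "Book Store",
]

def build_rotation(units, cycle_length=3, start_year=2004):
    """Spread units evenly across a revolving cycle of survey years."""
    rotation = {start_year + offset: [] for offset in range(cycle_length)}
    year_slots = cycle(sorted(rotation))          # round-robin over the years
    for unit in units:
        rotation[next(year_slots)].append(unit)
    return rotation

if __name__ == "__main__":
    for year, reviewed in build_rotation(units).items():
        print(year, "survey includes questions from:", ", ".join(reviewed))
```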
Assessment of Administrative Units

• During the year of program review, the “Annual Faculty/Staff Survey” contained questions from those units being reviewed.
• Results were given to each unit, broken out by campus and job type (see the sketch after this list).
• Survey results are linked from http://www.cpcc.edu/planning.
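As a rough illustration of the breakout described above (results by campus and job type), here is a minimal, hypothetical Python/pandas sketch; the column names, the 1-4 agreement scale, and the sample responses are assumptions for illustration, not CPCC's actual data or reporting code.

```python
# Minimal sketch (assumed data and column names, not CPCC's actual code):
# percent agreement on a survey item, broken out by campus and job type.
import pandas as pd

responses = pd.DataFrame({
    "campus":   ["Central", "Central", "Levine", "Levine", "Central"],
    "job_type": ["Faculty", "Staff",   "Faculty", "Staff",  "Faculty"],
    "item":     ["Classrooms are clean"] * 5,
    "agree":    [4, 3, 4, 2, 3],   # 1 = strongly disagree ... 4 = strongly agree
})

# Percent of respondents agreeing (3 or 4), by item, campus and job type
breakout = (
    responses
    .assign(agrees=responses["agree"] >= 3)
    .groupby(["item", "campus", "job_type"])["agrees"]
    .mean()
    .mul(100)
    .round(1)
)
print(breakout)
```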
Administrative Units at CPCC

• Administrative Technology Services
• Instructional Technology Services
• Compliance & Audit
• Inventory Control
• Basic Skills Compliance & Reporting
• Facilities Maintenance
• Facilities Design and Construction
• Institutional Advancement/Foundation
• Planning and Research (IR/IE)
• Resource Development
• Security
Administrative Units at CPCC (continued)

• Equal Opportunity Employment
• Human Resources
• Occupational Health and Safety
• Technology Help-desk
• Email Services
• The Book Store
• Vending/Food Services
• Parking Services
• Campus Printing
• Distribution Services (mail/moving)
Administrative Units at CPCC (continued)

• Payroll
• Financial Reporting/Budgeting
• Purchasing/Procurement
• Accounts Payable
• Cashiering
• Community Relations
• Marketing Services
• Graphic Design
• Professional Development
• Faculty Development (CTL)
New Unit (entrepreneurial)

The Services Corporation:
• The Center for Applied Research
• The CPCC Press
• The Harris Conference Center
• The Performing Arts Center

We will have to add new survey questions for these units.
Where Do All These Units Fall?

INPUTS (Resources):
• Staff
• Buildings
• Facilities
• State funds
• FTE
• Constraints: laws, state regulations

ACTIVITIES (Services):
• Education (classes)
• Services
• Counseling
• Student activities

OUTPUTS (Products or Results of Activities):
• Numbers served
• FTE (input next year)
• # Classes taught
• # Students recruited
• # Who participated
When You Are an Input – What Are Your Outcomes?

INPUTS > ACTIVITIES > OUTPUTS > OUTCOMES

OUTCOMES (Benefits for the College):
• New efficient approaches
• Enhanced services
• Motivating environment
• Supportive behavior
• Improved conditions
• Altered expectations
• New opportunities
Facilities Services

Outcomes:
• Classrooms conducive to learning
• Students to be safe and secure
• Faculty to be able to work in assigned space

Survey questions were developed and asked around these themes.
Facilities Services

Survey respondents gave high rankings (in the 80-90% range) for:
• Newly constructed or newly renovated facilities being of high quality and meeting needs
• Classrooms and common areas being generally clean
• Lighting levels being adequate
• A safe and secure environment being provided for all members of the College community
• Knowing the procedures to follow if a medical emergency occurs on campus
Areas Needing Improvement

Only 73% of respondents were pleased with new office spaces.
• To address this concern, the Facilities Standards Subcommittee of Facilities Partners will review the adopted standards for offices and furnishings and make recommendations for revisions.

Only 50% of faculty and staff were pleased with the temperatures in classrooms (which affected the teaching-learning process).
• The Central Energy Plant began operating in the summer of 2003; when connected to the main body of Central Campus, the back-up capabilities of this facility should minimize the disruption of cooling. The College is also developing a plan for the future pairing of boilers to provide back-up heating for Central Campus buildings. At the suburban campuses, central plant approaches are being used to link buildings, thereby providing back-up capability.
Areas Needing Improvement

Only 57% of the respondents expressed satisfaction with the cleanliness of bathrooms.
• The College entered into a new housekeeping contract as of July 1, 2003, which employs a new vendor and has enforceable cleanliness standards that must be met. Additionally, Facilities Management is addressing the appearance of some of the older restroom facilities on Central Campus as part of a cyclical rehabilitation program.
Planning and Research

Outcomes:
• Be perceived as an effective and efficient unit of the college
• Be perceived as contributing to the effectiveness of the college
• Provide adequate assistance with survey design
• Provide helpful guidance through the program review process
• Have websites that are used and contain appropriate information
Planning and Research

Survey respondents gave high rankings (in the 80-90% range) for:
• Responding quickly to their need for data/information
• Producing enough reports annually to meet the planning and information needs of faculty and staff
• Making a significant contribution to the College
• Contributing to the effectiveness of CPCC
• Assisting in the area of survey development/design
• Assisting with program/unit review
• Having a quality Institutional Effectiveness website
• Having a quality Institutional Research website
• Having a quality Fact Book (available online)
Security

Outcomes:
• For students and staff to be able to work and learn. In order to do that, they must:
  • Feel safe on campus
  • Handle emergencies correctly
  • Gather safely in groups (students)
  • Perceive safety at their jobsite (employees)
Questions Asked (level of agreement)

• If I see suspicious behavior, I always contact security/campus police.
• Insufficient lighting is a significant safety concern within my work environment.
• Personal security is a significant safety concern within my work environment.
• Safety in gathering places (crowd control) is a significant safety concern within my work environment.
Information Technology

Outcomes:
• Faculty and staff to be able to use reliable technology to get their jobs done more effectively and efficiently.
• For the college not to be late on any state or federal reporting generated out of IT or assisted by IT.

Needs assessment:
• How many are using the remote email service?
Information Technology

Survey respondents gave high rankings (in the 80-90% range) for:
• Courteous and helpful staff in answering questions and resolving issues with administrative systems access
• Providing regularly scheduled reports accurately and on time
• Reliability and availability of systems, especially during heavy registration
• Not experiencing significant downtime as a result of office computers not working
Purchasing Department

Outcomes:
• Provide adequate training so departments can purchase what they need.
• The purchasing card program will save time and money.
• Manuals and information will provide adequate information in a readable format.
Vending

Needs assessment:
• Do you like Coke or Pepsi products?
• Do you like 12 oz. cans or 20 oz. bottles?

Outcomes:
• Faculty and staff can find healthy food and drink choices in the vending machines and through food services.
Book Store

Outcomes:
• To enhance learning, students should have books by the first day of class.
• Communication between the book store and the faculty should be adequate and timely.
• Faculty will understand the process for requesting special products for classes.
Need for Senior Administration Response

• Units spend a lot of time working on the program review.
• If we want them to take it seriously, we have to take it seriously.
• Once reviews are reported at the end of the year, their Dean or Vice President/Chancellor needs to:
  1. Read them
  2. Respond to them
From Personal Experience

• Sit down with the unit (group meeting)
• Give them attention
• Discuss what was learned
• Discuss their major issues
• Discuss what they need to make improvements
• Actually attempt to direct resources to them to make those improvements
• Then they do not see it as an academic exercise
Overall Benefits

• #1: No recommendations in the area of Institutional Effectiveness from SACS during the October 2002 re-accreditation visit
• The college became change-oriented
  • Units had to define strategies for change
  • Units didn’t have to be perfect, but rather had to make continuous progress
  • Strategies for change helped identify needed resources for units
  • Units had to “close the loop” with the one-year follow-up (they couldn’t promise and not deliver)
Overall Benefits

• Units became empowered to perform their functions in an optimal manner and to ask for what they needed (no one noticed them before; now they do)
  • Created data to support needs
  • Accomplishments were reported to major administrative groups across the college (President’s Cabinet, Planning Council, etc.)
Overall Benefits

• The college community understood the purpose and function of every unit
• Senior administration realized the benefits and became strong supporters
  • Data are reported annually from program/unit review
  • VPs became stronger advocates for their units making changes to improve services
“Best” Results of “Best” Practice

• Better use of data across the college
• We have become more client/customer focused
• Review gives direction for goals and needed changes
• Departments are empowered to do their jobs (they can’t slip through the cracks and go unnoticed)
• Problems must be resolved (there is no hiding and no excuses)
• Surveys provide needs assessment data as well as evaluation data, which gives departments direction
What We Want to Do

• Work with 10-20 colleges this year.
• Collect outcomes from their administrative units.
• Fine-tune the assessment tool (it needs to match the outcomes).
• Distribute it to all the colleges in pieces (probably 3).
• Nationally norm it.
• Potentially use it for benchmarking (a sketch of that kind of comparison appears below).
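To illustrate what national norming and benchmarking could look like, here is a small, hypothetical Python sketch that compares one college's score on a survey item against the participating group; the scores and college labels are made up, and this is not the project's actual methodology.

```python
# Hedged sketch: compare one college's percent agreement on an item with the
# benchmarking group. All numbers and college labels are invented.
from statistics import mean, pstdev

group_scores = {                       # percent agreement on one survey item
    "College A": 78.0, "College B": 84.5, "College C": 71.2,
    "College D": 90.1, "College E": 80.3,
}

def benchmark(college, scores):
    """Return the college's score, group mean, group SD, and percentile rank."""
    values = list(scores.values())
    score = scores[college]
    percentile = 100 * sum(s <= score for s in values) / len(values)
    return score, mean(values), pstdev(values), percentile

score, grp_mean, grp_sd, pct = benchmark("College B", group_scores)
print(f"College B: {score:.1f}% agreement "
      f"(group mean {grp_mean:.1f}%, SD {grp_sd:.1f}, percentile {pct:.0f})")
```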
What We Want to Do

• Give the colleges their individual reports.
• Give them comparison data for the entire group.
• Please sign up if you are interested.
For a copy of this presentation

• Contact Terri Manning: terri.manning@cpcc.edu
• Download or print the presentation at http://www.cpcc.edu/planning
  • Click on “studies and reports”
  • NCCBC Admin Survey Download