Assessment Toolbox: Multiple Tools for Assessing Effectiveness in Higher Education
Dr. Andrew L. Luna
Institutional Research, Planning, and Assessment
University of North Alabama
© 2012, Andrew L. Luna Consulting
Contact Information
Dr. Andrew L. Luna
Director
Institutional Research, Planning, and Assessment
University of North Alabama
256-765-4221
alluna@una.edu
Assessment?
x1  x2
t
S x1  x2
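The slide's formula is the two-sample t statistic. A minimal sketch of how such a comparison runs in practice, using Python's scipy; the scores below are made up for illustration, not drawn from any assessment in this presentation:

```python
from scipy import stats

# Hypothetical scores for two groups; illustrative only.
group_1 = [72, 75, 81, 69, 77, 74]
group_2 = [68, 70, 73, 66, 71, 69]

# Two-sample t-test: is the difference in means larger than
# the sampling variability would suggest?
t_stat, p_value = stats.ttest_ind(group_1, group_2)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```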
Gosset, Beer, and Statistics…
William S. Gosset (1876–1937) was a famous statistician who worked for Guinness. He was a friend and colleague of Karl Pearson, and the two wrote many statistical papers together. Statistics at that time involved very large samples, and Gosset needed a way to test differences between smaller samples.
[Portrait: William Gosset]
Sir Francis Bacon
Francis Bacon (1561–1626) was an English philosopher, statesman, lawyer, jurist, and author. Although not primarily known as a scientist, Bacon wrote a philosophical work called Novum Organum (1620). In it, he detailed a new system of logic called the Baconian method. This book was foundational to empirical thinking and inductive reasoning.
Note: A statistical test called the chi-square was later used to examine whether Bacon wrote plays attributed to William Shakespeare.
Points of Discussion
• The origins of outcomes measures and continuous improvement
• What is improvement?
• Why surveys and nationally normed tests are not the only methods of successful assessment
• Examples of how outcomes measures are used in the improvement process within higher education
• Questions
Connections to IE?
Telephones and IE?
Walter Shewhart
W. Edwards Deming
Joseph Juran
Hawthorne Works, Bell Laboratories
Shewhart Cycle
Plan → Do → Check → Act (Continuous Improvement)
FACT: The Shewhart Cycle is the foundation for all quality and continuous improvement processes that we use today.
More on the Shewhart Cycle
• Plan – Create a strategy for what you want to do and how you will measure success
• Do – Follow the plan and do what you say you will do
• Check – Assess the effectiveness of the current plan by looking at the success outcomes measures
• Act – Make changes to the strategies to improve the measured outcomes
• Repeat the cycle! (see the sketch below)
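A minimal sketch of the cycle as a loop in Python; the function names and the stopping rule are illustrative assumptions, not part of the original presentation:

```python
# Minimal sketch of the Shewhart (PDCA) cycle as a loop.
# plan/do/check/act are supplied by the caller; everything here
# is illustrative, not the presentation's own method.

def pdca(plan, do, check, act, max_iterations=5):
    """Run Plan-Do-Check-Act until outcomes meet the target."""
    strategy = plan()                      # Plan: strategy + success measures
    for _ in range(max_iterations):
        results = do(strategy)             # Do: follow the plan
        met_target, gaps = check(results)  # Check: assess outcomes measures
        if met_target:
            break
        strategy = act(strategy, gaps)     # Act: adjust strategies, repeat
    return strategy
```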
Why is the Shewhart Cycle Important?
• If you can't measure something, you can't understand it…
• If you can't understand it, you can't control it…
• If you can't control it, you can't improve it…
• If you can't improve it… then why the heck are you doing it?
Shewhart Cycle and SACS
Core Requirement 2.5: The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes [Planning]; (2) result in continuing improvement in institutional quality [Check and Act]; (3) demonstrate the institution is effectively accomplishing its mission [Do].
Shewhart and SACS, Cont.
Comprehensive Standard 3.3.1: The institution identifies expected outcomes for its educational programs … and its administrative and educational support services [Plan]; assesses whether it achieves those outcomes [Check]; and provides evidence of improvement based on analysis of those results [Do and Act].
Shewhart Modified… DMAIC
Define → Measure → Analyze → Improve → Control
DMAIC expands upon Shewhart and is the method used to improve the current capabilities of an existing process. It is by far the most commonly used methodology of continuous improvement teams.
DMAIC Methodology
• Define – Identify, prioritize, and select the right project(s)
• Measure – Identify key outcomes measures and process parameters, and measure performance
• Analyze – Identify the key (causative) process determinants
• Improve – Establish a prediction model, DOE, or other improvement tool and optimize performance
• Control – Hold the gains
Outcome Focus
Define – the right outcome(s), the right team(s) (a QEP?)
Measure and Analyze – process characterization:
• The process, its problems and symptoms
• Process outputs: the response variable, Y
Improve and Control – process optimization:
• Process inputs: the independent variables, Xi
• The Vital Few determinants (causes)
• The mathematical relationship
Goal: Y = f(x)
Inputs, Processes, and Outcomes
[Diagram: the X's, grouped under Measurement, Materials, Methods, Machines, Environment, and People, feed the process inputs; the process produces the outcome Y's]
Customers: A Traditional Model
[Diagram: Suppliers (high schools, businesses) provide students as inputs to the academic, educational support, and administrative systems and processes; the outputs (graduates) flow to customers (employers, graduate schools, licensure, the knowledge bank), with feedback loops on both sides]
In Higher Education, Students Are Pseudo-Customers
Data and Assessment Examples
• All examples were taken from assessment projects within actual colleges and universities
• These examples (from simple to complex) show only a fraction of the assessment tools available
• Most of these examples require little to no statistical training
Trend Analysis for English
[Chart: Quality point average in English 100- and 200-level courses, Fall 2003 through Fall 2007, with linear trend lines for each level; values run between roughly 2.8 and 3.3 on a 2.5–3.5 scale]
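A trend line like those on the chart can be fit by ordinary least squares. A minimal Python sketch, with hypothetical QPA values standing in for the chart's data:

```python
import numpy as np

# Hypothetical fall-term QPA values standing in for the chart's data.
years = np.array([2003, 2004, 2005, 2006, 2007])
qpa_200_level = np.array([2.94, 2.94, 2.88, 2.84, 2.92])

# Fit a first-degree polynomial (a straight trend line) by least squares.
slope, intercept = np.polyfit(years, qpa_200_level, deg=1)

print(f"Trend: QPA changes by {slope:+.3f} points per year")
```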
Financial Aid Process Charting
[Fishbone diagram: input = submitted application; Y = time to completed application. Causes by category:]
• Measurement – bottleneck of applications to be reviewed
• Materials – too many mistakes by applicant
• Methods – pick up application in a box, with very little assistance
• Environment/People – too few staff and too many applications
• Machines – computer entry errors; the US Government form: 'nuff said
In-House Satisfaction Survey: Importance and Satisfaction
• Availability of Major
• National Reputation
• State-Wide Reputation
• Quality of Undergrad Ed.
• Cost to Attend
• Scholarship Availability
• Warm/Friendly Campus
• Access to Faculty
• Small Student/Faculty Ratio
• Safe Campus
• Selective Admissions Standards
• Athletic Programs
• Quality of Faculty
• Job Placement
• Social Events
• Leadership Development
• Honors Program
• Distance from Home
• Distance from Metro Area
• Quality of Facilities
Satisfaction Survey Results: Gap Analysis
• Measures both the Importance and the Satisfaction of a particular variable or attribute
• The Performance Gap is the difference between the Importance measure and the Satisfaction measure
• The higher the Performance Gap, the less the institution is meeting customer expectations (see the sketch below)
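A minimal Python sketch of the gap calculation, assuming importance and satisfaction are mean ratings on the same scale; the attributes and numbers are illustrative, not survey results:

```python
# (importance, satisfaction) pairs on a shared 1-7 scale; illustrative only.
ratings = {
    "Availability of Major": (6.5, 5.9),
    "Cost to Attend":        (6.2, 4.1),
    "Social Events":         (4.0, 4.8),
}

# Performance Gap = Importance - Satisfaction.
gaps = {attr: imp - sat for attr, (imp, sat) in ratings.items()}

# Larger gaps flag attributes where expectations are not being met;
# negative gaps suggest expectations are exceeded (possible overkill).
for attr, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{attr:25s} gap = {gap:+.1f}")
```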
Alumni Marketing Survey Results: Interpreting the Radar Chart
[Radar chart with six numbered axes plotting Importance against Satisfaction: where the two traces coincide, the institution is meeting expectations; a larger gap between them means unmet expectations; where Satisfaction exceeds Importance, the institution is exceeding expectations (possible overkill)]
Student Satisfaction and Radar Charts
[Two radar charts, 2005 and 2007, plotting Importance and Satisfaction on a 1–7 scale across the twenty survey attributes, from Availability of Major through Quality of Facilities]
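A radar chart like these can be drawn with matplotlib's polar projection. A minimal sketch, with three illustrative attributes standing in for the survey's twenty:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative attributes and ratings, not values from the deck's charts.
attributes = ["Availability of Major", "Cost to Attend", "Safe Campus"]
importance = [6.5, 6.2, 6.8]
satisfaction = [5.9, 4.1, 6.7]

# One axis angle per attribute; repeat the first to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(attributes), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, values in [("Importance", importance), ("Satisfaction", satisfaction)]:
    vals = values + values[:1]
    ax.plot(angles, vals, label=label)
    ax.fill(angles, vals, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(attributes)
ax.set_ylim(0, 7)
ax.legend()
plt.show()
```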
ANOVA – Example
• The College of Education used an evaluation questionnaire to determine the level of competence a student teacher exhibited toward required dispositions/characteristics
• The evaluation was completed by students, faculty members, and teachers in the classroom
• ANOVA was used to determine differences between the three groups (a sketch follows)
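A minimal sketch of the one-way ANOVA in Python; the ratings below are illustrative, not the study's data:

```python
from scipy import stats

# Hypothetical disposition ratings from the three rater groups.
student_ratings = [2.8, 2.6, 2.9, 2.7, 2.8]
faculty_ratings = [2.4, 2.5, 2.3, 2.6, 2.4]
teacher_ratings = [2.7, 2.6, 2.8, 2.5, 2.7]

# One-way ANOVA: do the group means differ more than chance allows?
f_stat, p_value = stats.f_oneway(student_ratings, faculty_ratings, teacher_ratings)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one group's mean rating differs significantly.")
```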
Measurement of Differences for Teacher Education
[Bar chart: Dispositions Comparisons – Spring 2007. Mean ratings on a 1–3 scale by Student, Faculty, and Teacher raters for Dispositions 1 through 7]
Note: * indicates a significant difference at p < .05 (Dispositions 1, 4, 5, and 7)
Pareto Chart in Residence Halls
[Pareto chart: Residential Life incident reports over two years. Bars show counts by category (Theft, Harassment, Medical, Mechanical, Safety, Alcohol, Vandalism, Fire, Visitation, Noise, Misc.), with a cumulative-percentage line and an 80% reference line marking the vital few]
Pareto Chart with Trend Analysis
[Chart: the Pareto categories tracked over time with trend analysis]
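A Pareto chart pairs sorted category counts with a cumulative-percentage line. A minimal matplotlib sketch, with illustrative incident counts:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative incident counts, sorted from most to least frequent;
# not the residence halls' actual two-year figures.
counts = {"Alcohol": 210, "Noise": 120, "Visitation": 80, "Vandalism": 45,
          "Theft": 30, "Medical": 20, "Misc.": 15}

labels = list(counts)
values = np.array(list(counts.values()))
cumulative = np.cumsum(values) / values.sum() * 100

fig, ax = plt.subplots()
ax.bar(labels, values)
ax.set_ylabel("Count")

# Second y-axis for the cumulative percentage line and the 80% rule.
ax2 = ax.twinx()
ax2.plot(labels, cumulative, marker="o", color="black")
ax2.axhline(80, linestyle="--", color="gray")
ax2.set_ylabel("Cumulative Percentage")
ax2.set_ylim(0, 100)

plt.show()
```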
Learning Outcomes
• Outcome: Students completing the main-sequence general chemistry courses CHEM 1110/1120 will be able to demonstrate a thorough knowledge of general chemistry, as evidenced by an average score exceeding the national average on the national ACS General Chemistry Exam
• Assessment: ACS General Chemistry Form
Historical Performance of ACS Score (Y)
[Line chart: Historical Comparison of the institution's average ACS score versus the national norm across five administrations; scores range from roughly 32 to 44]
What X Variables Contribute to ACS Exam Score?
• SAT Math
• HSGPA
• Classroom content
• Laboratory work
• Homework
• Newly designed workshop
Regression Analysis
• Run two regression analyses, with final grade and ACS score as the Y's
• Observe the relationships between the X variables and the Y's
• Develop a path analysis to better understand both the direct and indirect relationships (a sketch follows)
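A minimal sketch of the two regressions in Python with statsmodels; the file name and column names (SATM, HSGPA, WS for the workshop, GRADE, ACS) are assumptions chosen to match the path diagram that follows:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical student-level dataset; file and columns are assumptions.
df = pd.read_csv("chemistry_students.csv")

X = sm.add_constant(df[["SATM", "HSGPA", "WS"]])

# Regression 1: final course grade as the response.
grade_model = sm.OLS(df["GRADE"], X).fit()

# Regression 2: ACS exam score as the response; for the path analysis,
# GRADE also enters as a mediator between the X's and ACS.
acs_model = sm.OLS(
    df["ACS"], sm.add_constant(df[["SATM", "HSGPA", "WS", "GRADE"]])
).fit()

print(grade_model.summary())
print(acs_model.summary())
```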
Path Analysis for Chemistry
[Path diagram: HSGPA, SATM, and WS (the workshop) lead to GRADE, which in turn leads to ACS; path coefficients shown on the diagram are .1105, .2024, .3919, .5430, .6665, and .7675, with UG marking the unexplained (residual) variance terms]
Hot Water Problem in Residence Halls
• A survey found that most residents in a female hall were unhappy with the bathrooms
• Subsequent focus groups found that residents were upset over the quantity and quality of hot water
• Define – Hot water variability in a high-rise residence hall
• Measure – Record the temperature of the hot water on high, medium, and low floors, three times a day for two weeks
• Analyze – Plot the hot water temperatures on an X-Bar/R control chart
Control Chart for Hot Water
[X-Bar/R control chart of 41 samples. The means chart (roughly 100–140) shows hugging of the mean and periodicity; the ranges chart (roughly 0–45) shows a point exceeding a control limit, a trend, and a run]
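A minimal sketch of the X-Bar/R calculations, assuming subgroups of size three (one reading per floor level); the temperatures are illustrative, though A2, D3, and D4 are the standard control-chart constants for n = 3:

```python
import numpy as np

# Hypothetical temperature readings; each row is one subgroup of 3.
samples = np.array([
    [118, 124, 131],
    [120, 122, 128],
    [115, 127, 135],
    [119, 121, 126],
])

xbar = samples.mean(axis=1)                       # subgroup means
rng = samples.max(axis=1) - samples.min(axis=1)   # subgroup ranges

# Standard control-chart constants for subgroup size n = 3.
A2, D3, D4 = 1.023, 0.0, 2.574

xbar_bar, r_bar = xbar.mean(), rng.mean()
print(f"X-Bar chart: CL={xbar_bar:.1f}, "
      f"UCL={xbar_bar + A2 * r_bar:.1f}, LCL={xbar_bar - A2 * r_bar:.1f}")
print(f"R chart:     CL={r_bar:.1f}, "
      f"UCL={D4 * r_bar:.1f}, LCL={D3 * r_bar:.1f}")
```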
Control Chart for Hot Water in Residence Hall, Cont.
• Improve – After understanding the process and the control chart, the team offered suggestions to control variability
• Control – A new control chart was run after changes to the system, and the process was found to be in control
• Money – The changes decreased utility costs and increased student retention in the hall
Regression Models
A regression model was used to explain the relationship between academic departmental budget allocations and those variables that could contribute to the variance in these allocations:
$$\text{Bud. Alloc.} = f(x_1, x_2, x_3, \ldots, x_i)$$
Simple Regression Model
[Scatterplot: actual values of y plotted against x with a fitted line (slope); predicted values lie on the line, and each residual is the vertical gap between an actual and a predicted value]
$$r_i = Y_i - \hat{Y}_i$$
Multiple Regression Model
[Diagram: two predictors, X1 and X2, jointly explaining Y]
The First Model – Variables
• Number of full-time professors (dependent)
• Number of majors
• Total degrees conferred
• Total credit hours generated
• Total credit hours generated by majors
• Total lower-level credit hours generated
• Total upper-level/graduate hours generated
• Delaware cost per credit hour
• Market value of the discipline
(a fitting sketch follows)
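A minimal sketch of fitting such a model with statsmodels; the file name and column names (ALLOC, PROF, CHRS, DEGREES, MARKET) are illustrative assumptions chosen to match the parameter table that follows:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical department-level dataset; file and columns are assumptions.
df = pd.read_csv("department_budgets.csv")

# Interaction terms (CHRS*DEGREES, CHRS*PROF) enter via the formula.
model = smf.ols(
    "ALLOC ~ PROF + CHRS + DEGREES + MARKET + CHRS:DEGREES + CHRS:PROF",
    data=df,
).fit()

print(model.summary())  # F statistic, R-squared, Root MSE, estimates
```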
Results of Selected Model
• F statistic = 66.04 (p = .0001)
• R-squared = .9497
• Standard deviation of the unexplained budget allocation (Root MSE) = $96,942
Parameter Estimates of Selected Model

Variable         Parameter Estimate   Pr > |t|
Intercept        -544842              0.0027
PROF             43586                <.0001
CHRS             9.4536               0.1474
DEGREES          -1346.6              0.2206
MARKET           842966               <.0001
CHRS*DEGREES     0.20361              0.0104
CHRS*PROF        -1.48669             0.0020
So, What Happened?
• The VPAA reviewed the model along with the departmental budget requests
• He used the model to identify those departments that were either significantly underfunded or overfunded
• The VPAA made adjustments to budget allocations based in part upon the model
Success with Assessment…
A spectrum of continuous improvement tools will lead you down the road to assessment success.
Questions or Comments?
Dr. Andrew L. Luna
Director, Institutional Research, Planning, and Assessment
University of North Alabama
alluna@una.edu