Got Benchmarking Data?
Now what?
NIRSA Pre-conference
April 1, 2009
Overview
• Introductions: Take note of the colleges/universities
represented that might be most like your campus.
• General overview of benchmarking/intended outcomes
• Definition of and types of benchmarking
• Considerations and challenges of benchmarking
• NIRSA/StudentVoice Benchmarking Study
• Making sense of benchmarking data
• Assessment reporting
• Panel of campus recreation professionals
• Discussion/questions throughout!
Intended Outcomes
• Participants will leave with an increased understanding of
the value of benchmarking
• Participants will get hands-on experience viewing and
understanding benchmarking data
• Participants will learn how to use benchmarking data for
public relations and marketing, and will leave with ideas on
how to report data in meaningful ways to their various
audiences
• Armed with new confidence around understanding and
interpreting benchmarking data, participants will leave
with an increased love of assessment and data!
Session Warning Labels
• Three hours is a long time… especially if you are talking assessment!
(We have one planned break around 9:45, but take breaks as needed.)
• Warning: High doses of assessment are likely.
• Some of the information is not going to make sense until your
college/university actually participates in benchmarking.
• Getting the most out of the group exercise (interpreting
benchmarking data) is going to depend on what emotions ‘data’ and
assessment evoke in you!
• Set some of your own goals/outcomes for the session: Do you want to
show others on your campus how to use the data? Do you want to
learn best practices from other campuses? Do you want to be able to
go back to your campus and produce a report? What else?
• It’s ok if you don’t get ‘it’ all! StudentVoice benchmarking webinars
are available each May and December for free. Or I can do a webinar
for you and others on your campus specific to your data.
Benchmarking
• Comparing performance and other measures across
organizations/units.
• Benchmarking is done primarily to benefit from the wisdom
and success of others (best practices) so that you don’t
have to “reinvent the wheel.”
• Allows institutions or departments to document what they
are doing well (in comparison to national or peer data), as
well as document opportunities for improvement.
Uses of Benchmarking
• Improvement; learning best practices
• Short- and long-term planning
• Forecasting and predicting trends
• Generating new ideas
• Providing data for analyzing and determining affordability and cost effectiveness
• Exposing areas in need of follow-up or more in-depth assessments
Types of Benchmarking
Types (typical in higher education)
1) Internal: departmental comparisons
2) External: comparisons to peer institutions
• Operational benchmarking (comparing staffing,
facilities, programs, budgets) – may occur both
internally and externally
• Student data benchmarking (comparing student
responses) – may occur both internally and externally
Benchmarking Considerations
• Common questions asked across institutions
• Timing of the data collection
• Sampling methods
• Selecting comparable institutions for comparisons
• Many benchmarking opportunities have very strict guidelines for participating campuses (rightly so, to ensure that the data is valid and reliable)
History of the NIRSA/StudentVoice Benchmarking Project
• Inspired by assessment efforts of Juliette Moore, Director of
Recreational Sports at the University of Arizona and NIRSA Past
President, the Campus Recreation Student Outcomes Benchmarking
Project provides a benchmarking mechanism to capture ways in
which campus recreation enhances the lives of students.
• Online assessment administered to a random sample of students
(sometimes faculty, staff, and other stakeholders).
• Over 85 colleges and universities have participated in the last five
years.
• Data from more than 70,000 students, faculty, staff, and
community members.
Campus Use of the Data
• Data on the impact of campus recreation programs and
services on recruitment and retention
• Document student learning outcomes
• Justify new facilities, expansion, new programs
• Change and enhance existing programs and services
• Determine support for fee increases
• Gauge interest in new programs and services
• Implications for staff training and professional development
• Spotlight areas of achievement and areas for improvement
• Guide future/follow-up assessments
Benchmarking Topics
• Importance of recreation to recruitment and retention
• Usage of facilities, services, programs, and offerings
• Barriers to participation
• Satisfaction
• Outcomes
• Needs assessment
• Marketing/promotions
• Feasibility questions (if appropriate)
• Demographics
Are we really measuring outcomes?
• Program Outcomes examine what a program or process
is supposed to do, achieve, or accomplish for its own
improvement; generally needs/satisfaction driven.
• Learning Outcomes examine cognitive skills that
students develop through department interactions;
measurable, transferable skill development; what students
are expected to demonstrate in terms of knowledge, skills,
and attitudes upon completion of a program, course,
interaction, or activity.
Direct versus Indirect Assessment Methods
• Direct Methods - Any process employed to gather data
which requires subjects to display their knowledge,
behavior, or thought processes.
• Indirect Methods - Any process employed to gather data
which asks subjects to reflect upon their knowledge,
behaviors, or thought processes.
Process
• Sign-up for the NIRSA/StudentVoice benchmarking project.
• Customize instrument as appropriate – keeping in mind that any
changes to the instrument may impact the ability to benchmark.
• Obtain institutional approval/IRB – since your data will become
part of an aggregate dataset.
• Launch the project; all data is collected online, preferably with a
random sample of students.
• Institution-specific data is available in real-time for immediate
use.
• Benchmarking data and reports are available once all campuses
are done collecting data.
• No college or university names are revealed alongside any
college or university data.
Viewing Benchmarking Data
Data Reporting Tools
Benchmarking Wizard
Making sense of multiple layers of data
• Institutional data including filtered views to understand
sub-populations or respondents with certain
characteristics
• Longitudinal data
• National results
• Peer comparison data (who do you want to be able to
compare with?)
• Filtered views by sub-populations across national and
peer comparison data (see the sketch below)
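
For campuses that export their raw responses, here is a minimal Python sketch of reproducing a filtered view locally. The file name and column names ("benchmarking_export.csv", "class_year", "q_satisfaction") are hypothetical placeholders, not the actual StudentVoice export schema:

# Sketch: filtering benchmarking responses by sub-population.
# File and column names are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("benchmarking_export.csv")

# Filtered view: first-year students only
first_years = responses[responses["class_year"] == "First year"]

# Compare a scaled item for the full sample vs. the sub-population
print(responses["q_satisfaction"].mean())
print(first_years["q_satisfaction"].mean())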
What’s in a StudentVoice benchmarking report?
• Frequency tables
• For scaled items:
• Means and standard deviations
• Whether the difference in means is statistically significant (green = ‘your’
mean is statistically higher at p < .05; red = ‘your’ mean is statistically
lower). NOTE: Statistical significance is reached more easily with large
sample sizes, so the significance indicators need to be considered with
this in mind. Advanced statistical analysis should be conducted in a stats
package such as SAS or SPSS.
• Total N (number of responses), if this number is not ‘private’
• Top 2 and Bottom 2 percentages (Top 2 = Strongly agree plus somewhat
agree)
• Ranking of ‘your’ mean score
• ‘No basis to judge’ / ‘Not applicable’ responses are excluded from all mean
calculations. All benchmarking questions utilize Likert scales of 1-5, 1-4,
or 1-3. (A computational sketch of these summary statistics follows this list.)
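
As promised above, a minimal Python sketch of how these summary statistics are computed, assuming a 1-5 Likert item where ‘No basis to judge’ has been coded as missing; the ratings are illustrative values, not real benchmarking data:

# Sketch: report statistics for one 1-5 Likert item.
# Ratings are illustrative; np.nan marks "No basis to judge".
import numpy as np

ratings = np.array([5, 4, 4, 3, np.nan, 2, 5, 1, 4, np.nan])

valid = ratings[~np.isnan(ratings)]          # N/A excluded from all calculations
mean, sd = valid.mean(), valid.std(ddof=1)   # mean and standard deviation
top2 = (valid >= 4).mean() * 100             # strongly agree + somewhat agree
bottom2 = (valid <= 2).mean() * 100
print(f"N={valid.size}, M={mean:.2f}, SD={sd:.2f}, "
      f"Top2={top2:.0f}%, Bottom2={bottom2:.0f}%")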
Stats 101
What basic stats knowledge do you need (or at least need to be aware of) to
interpret a benchmarking report?
• Sample size and response rates
(http://www.touchpoll.com/calculator.htm)
• Generalizability
• Descriptive stats: Basic frequency tables
• Likert scales versus categorical data
• Means and standard deviations
• T-tests: Do two mean scores differ? (see the sketch after this list)
• Statistical significance
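
A minimal sketch of the t-test behind those green/red significance indicators, using SciPy on two illustrative samples (not real benchmarking data):

# Sketch: do two mean scores differ? Welch's t-test on illustrative data.
from scipy import stats

your_campus = [4, 5, 3, 4, 4, 5, 4, 3, 5, 4]   # illustrative Likert responses
comparison  = [3, 4, 3, 3, 4, 2, 4, 3, 3, 4]

t, p = stats.ttest_ind(your_campus, comparison, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")  # p < .05 -> statistically significant

# Caution (as noted above): with very large samples, even tiny
# differences reach significance, so weigh practical importance too.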
More advanced uses of assessment data:
• Factor analysis
• Correlation studies (sketched below)
• Regression
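
For the advanced analyses, one illustration: a Pearson correlation between two scaled items. The values are illustrative placeholders; factor analysis and regression would likewise be run in a stats package or in Python/R:

# Sketch: correlation between two scaled items (illustrative values).
from scipy import stats

visits_per_week = [1, 3, 2, 5, 4, 2, 5, 3]   # e.g., facility visits
satisfaction    = [2, 3, 3, 5, 4, 3, 4, 3]   # e.g., 1-5 satisfaction rating

r, p = stats.pearsonr(visits_per_week, satisfaction)
print(f"r = {r:.2f}, p = {p:.3f}")  # r near +1 or -1 = strong relationship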
Small Group Work
• Find a work group with similar types of campuses or
similar types of recreation facilities and programs
• A maximum of four to five people per group will be best
• Knowing that each small group might approach the
exercise differently, if you are attending with another
colleague from the same institution, it might make sense
to split up and then share experiences afterwards.
Group Exercise Scenario
Let’s look at page 1 of the handout labeled
Benchmarking Data Questions 2 and 3 together in a
large group to make sure all is clear about how the
data is presented and what is in the report.
Let’s also look at page 1 of the handout labeled
Benchmarking Data Question 4 together.
Reporting out your findings
• Create manageable sections/chunks of data.
• Assign/delegate to staff or teams and come up with an
appropriate strategy to make the data analysis feasible
and manageable.
• Explore possibility of hiring a graduate assistant to assist
with data analysis or find other campus partners.
Assessment Reporting
1) Determine the ‘client,’ purpose, and audiences
for the report
2) Determine what data and information are
available or must be gathered
3) Select the type of report and report format
4) Decide how to depict data and information
5) Produce report
6) Disseminate the report
Findings vs. Results vs. Implications
• Findings: Data represented as value-free facts. What is
the data?
• Results: Interpretations of the findings. What do the
findings mean?
• Implications: Uses of the results. Who should care about
these results and what should they do with the results?
Knowing your Audience
• What do my audiences need to know about this subject?
• What do my audiences want to know about this subject?
• What do I want them to know about this subject?
• What decisions will or might my audience make based on this report?
• What other audiences might my primary audiences send this report to, even if I haven’t intended for them to receive the material?
• What other audiences might be interested in the same
subject?
(Bers & Seybert, 1999)
Report Components
1) Meaningful title
2) Executive summary
3) Table of contents
4) Introduction and purpose
5) Methodology
6) Findings
7) Summary, conclusion, implications, and recommendations
8) References
9) Glossary
10) Appendices
Graphical Representations of Data
Graphs are depictions of numerical relationships to
demonstrate the following:
1) Size (e.g., largest to smallest)
2) How things change over time
3) What is typical or exceptional
4) How one thing predicts or is related to another
Avoid chart clutter/chart junk….
Huh?
[Example of a cluttered chart: “Working - Studying & Grades, Transfers - Fall 2006.” A dual-axis chart plotting, by weekly work-hour category (0 through >40 hours), the percentage of students studying more than 10 hours per week against average first-semester GPA.]
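
A minimal matplotlib sketch of how the same kind of dual-axis chart might be redrawn without the clutter; the values below are illustrative placeholders, not the Fall 2006 transfer data:

# Sketch: an uncluttered dual-axis chart in the spirit of the example.
# Values are illustrative placeholders, not the Fall 2006 data.
import matplotlib.pyplot as plt

categories = ["0", "1-5", "6-10", "11-15", "16-20", "21-25", ">25"]
pct_studying = [55, 60, 62, 58, 50, 42, 35]        # % studying 10+ hrs/week
gpa = [3.00, 3.10, 3.15, 3.05, 2.90, 2.70, 2.55]   # avg first-semester GPA

fig, ax1 = plt.subplots()
ax1.bar(categories, pct_studying, color="lightgray")
ax1.set_xlabel("Work Hour Category")
ax1.set_ylabel("% Studying 10+ Hours per Week")

ax2 = ax1.twinx()                        # second y-axis for GPA
ax2.plot(categories, gpa, marker="o", color="black")
ax2.set_ylabel("Average First-Semester GPA")

ax1.set_title("Working, Studying, and Grades")
plt.tight_layout()
plt.show()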
Presenting Clear Visual Representations of Data
Panelists
• Amy Davenport: University of Oklahoma
• Mary Chappell: University of Kansas
• Tina Clawson: Oregon State University
Opportunities to share best practices
after today:
https://www.studentvoice.com/app/views/community/
Online Resource Center
Additional Resources
Bender, B. E., & Schuh, J. H. (Eds.). (2002). Using
benchmarking to inform practice in higher education.
New Directions for Higher Education, no. 118. San
Francisco: Jossey-Bass.
Taylor, B. E., & Massy, W. F. (1996). Strategic
indicators for higher education. Princeton, NJ:
Peterson’s.
Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs.
San Francisco: Jossey-Bass.
Contact Information
Kim VanDerLinden
www.studentvoice.com
Email: kvanderlinden@studentvoice.com
716.652.9400 and then press 1