Off-the-Shelf or Homegrown? Selecting the Appropriate Type of Survey for Your Assessment Needs

Jennifer R. Keup, Director
National Resource Center for The First-Year Experience and Students in Transition
keupj@mailbox.sc.edu
- "Institutional data are meaningless without a comparison group."
- "My institution is unique in its programs and goals."
- "The main outcome of interest on my campus is student development."
GOALS FOR TODAY
- Introduce and discuss the Ory (1994) model for comparing and contrasting local vs. commercially developed instruments
- Identify elements of your institutional culture and structure that would influence the decision
- Discuss myths with respect to survey administration
- Share examples of:
  - the most prominent national surveys for first-year assessment
  - software and services available to facilitate institutional assessment
WHAT DO WE MEAN?
"Off-the-Shelf"
- Often commercially developed
- Scope includes multiple institutions
- Primarily pre-set content
- Examples: CIRP, NSSE, EBI

"Homegrown"
- Developed locally
- Focused on the institution
- Content developed and adapted by the campus/unit
- Examples: program review, utilization/satisfaction surveys for specific programs

The two types are best thought of as ends of a continuum rather than a strict dichotomy.
QUESTIONS TO ASK
- What is my budget?
- Who needs to see these data?
- What are my analytical capabilities?
- Who needs to make decisions with these data?
- How will this fit with my other responsibilities?
- What is my timeline?
ORY (1994) MODEL FOR COMPARING AND CONTRASTING LOCAL VS. COMMERCIALLY DEVELOPED INSTRUMENTS
SIX-FACTOR COMPARISON
- Purpose
- Match
- Logistics (ten considerations; detailed below)
- Institutional Acceptance
- Quality
- Respondent Motivation to Return the Instrument
PURPOSE
Why are we doing this study, and how will the results be used?

"Off-the-Shelf"
- Allows for comparison to a national norm group
- Examples: comparison to a peer or aspirant group; benchmarking; contextualizing a broad higher education issue or mandate

"Homegrown"
- Allows for thorough diagnostic coverage of local goals and interests
- Examples: satisfaction with a campus program; achievement of departmental goals; program review
MATCH
- What are the program/institutional goals, outcomes, and areas of interest?
- Does an existing instrument meet my needs?
- Does the survey address my purpose?
- Can the existing instrument be adapted to meet my needs?

"Off-the-Shelf"
- May provide incomplete coverage of local goals and content
- Local questions: some off-the-shelf instruments allow a limited set of locally written items, which can improve the match

"Homegrown"
- Tailored to local goals and content
INSTITUTIONAL ACCEPTANCE
- How will the results be received by the intended audience?
- Who needs to make decisions with these data?
- What is the assessment culture?
- Also consider IRB requirements and campus politics.

"Off-the-Shelf"
- Professional quality and national use may enhance acceptance
- Failure to completely cover local goals and content may inhibit acceptance

"Homegrown"
- Local development can encourage local ownership and acceptance
- Concerns about quality may interfere with acceptance
QUALITY
- What is the track record of quality?
- What is the psychometric soundness of the instrument?

"Off-the-Shelf"
- Tends to have better psychometric properties
- Professional quality may compensate for incomplete coverage of local goals and objectives

"Homegrown"
- Must fully test psychometric properties
- Must create a professional appearance
- Lack of professional quality may affect results and institutional acceptance
RESPONDENT MOTIVATION TO RETURN THE INSTRUMENT
- What will yield the highest response rate?

"Off-the-Shelf"
- Can create instant credibility
- Sometimes provides institutional or individual incentives

"Homegrown"
- Local specificity may yield greater respondent "buy-in"
- Local instruments may not "impress" people
- Can create a student perception of immediate impact
LOGISTICS (10 CONSIDERATIONS)
- Availability
- Preparation time
- Expertise
- Cost
- Scoring
- Testing time
- Test and question types
- Ease of administration
- Availability of norms
- Reporting

The devil is in the details!
LOGISTICS (CONTINUED)
Availability
- Off-the-shelf: Does a survey currently exist for our needs? If you can afford it, the survey is available.
- Homegrown: "If you build it, they (i.e., data) will come." Takes time and resources to develop.

Preparation time
- Off-the-shelf: short
- Homegrown: can take considerable time

What is the survey timeline? Is it feasible? Have you considered administration planning?
LOGISTICS (CONTINUED)
Expertise
- Off-the-shelf: a fully developed protocol allows one to administer after reading the manual
- Homegrown: takes content, measurement, and administrative experience (psychometrics!)

Scoring (related to expertise)
- Off-the-shelf: can be delayed if scoring is done off campus; need to adhere to the administration cycle
- Homegrown: can be immediate
LOGISTICS (CONTINUED)
Testing time
- Off-the-shelf: fixed, based on content and administration protocol
- Homegrown: flexible, as long as the survey meets institutional and programmatic needs
- If administering in class, do you have faculty buy-in?

Test type
- Off-the-shelf: type of test and questions are predetermined
- Homegrown: allows for flexibility in type of test (objective/open-ended) and type of question (multiple choice, rank ordering, etc.)
LOGISTICS (CONTINUED)
Ease of administration
- Off-the-shelf: requires standardized administration; special training for testers
- Homegrown: allows for greater flexibility (but is still subject to IRB review)

Norms
- Off-the-shelf: national and inter-institutional comparison
- Homegrown: intra-institutional comparison

Reporting
- Off-the-shelf: standard formats that don't always relate to the institution
- Homegrown: institutional tailoring of results and reporting
LOGISTICS (CONTINUED)
Cost
Off-the-shelf
- Primary costs associated with the purchase price
- Other costs: scoring, data, specialized reporting, and human resources to coordinate campus administration
- Recurring cost

Homegrown
- Primary costs associated with development: instrument development, ensuring psychometric properties, scoring and recording data, and reporting findings
- Other costs: software/hardware, training
- Primarily a one-time investment
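To weigh recurring off-the-shelf fees against largely one-time homegrown development, a rough break-even calculation can help. The sketch below uses entirely hypothetical figures (license fee, development effort, per-cycle staff time), not vendor pricing; substitute your own campus estimates.

```python
# Rough break-even sketch: recurring off-the-shelf costs vs. mostly
# one-time homegrown development. All dollar figures are hypothetical.

def off_the_shelf_cost(cycles, license_fee=8_000, coordination=2_000):
    """Recurring license/scoring fee plus local coordination per administration."""
    return cycles * (license_fee + coordination)

def homegrown_cost(cycles, development=15_000, per_cycle=3_500):
    """One-time development (item writing, psychometric testing, software)
    plus local administration, scoring, and reporting per cycle."""
    return development + cycles * per_cycle

for n in range(1, 6):
    print(f"{n} administration(s): "
          f"off-the-shelf ${off_the_shelf_cost(n):,} vs. homegrown ${homegrown_cost(n):,}")
```

With these made-up numbers the homegrown option pulls ahead only after several administrations, and the comparison still ignores staff expertise and the value of national norms, which is part of why "it is cheaper to build your own" appears on the myths list below.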
Recap of the six factors: Purpose, Match, Logistics, Institutional Acceptance, Quality, and Respondent Motivation.
O-T-S VS. HG MYTHS
- You can only gather comparison data from national (OTS) surveys
- It is cheaper to develop and administer a homegrown survey
- Off-the-shelf surveys don't require any work
- Homegrown surveys are hard
- You don't need IRB approval for local assessment
- Off-the-shelf surveys study all the important topics
FYE ASSESSMENT EXAMPLES
"Off-the-Shelf"
- CIRP: Freshman Survey; Your First College Year (YFCY) Survey
- NSSE
- Educational Benchmarking Incorporated (EBI)

"Homegrown"
- Services: Eduventures; Student Voice
- Software: Zoomerang; Survey Monkey
CONTINUUM OF ASSESSMENT
[Continuum graphic arraying the example instruments and services (CIRP, NSSE, EBI, Eduventures, Student Voice, Zoomerang, Survey Monkey) along the off-the-shelf-to-homegrown continuum.]