Collecting Assessment Data

Lance C. Kennedy-Phillips, PhD
Director, Student Life Research and Assessment
The Ohio State University
Our Purpose
Why are we here?
• How to collect assessment data
• Where to collect assessment data
• Share ways in which we have used “outside” data to strengthen our work
• Share considerations to keep in mind when seeking and using “external” data
Poll Question: Time spent on assessment
What percentage of your time (position description) is designated for assessment and research?
A. Less than 20%
B. Between 21% and 40%
C. Between 41% and 60%
D. Between 61% and 80%
E. More than 80%
Poll Question: Years in the field
How many years have you been in the assessment and/or research field?
A. Less than 1 year
B. 1–2 years
C. 3–5 years
D. 6–8 years
E. More than 8 years
The Assessment Loop
Assessment Loop
The loop cycles through four stages:
• Identify Outcomes
• Gather Evidence
• Interpret Evidence
• Implement Change
Adapted from: Maki, P. L. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, VA: Stylus.
Collecting Data
• Driven by a question
• Selection of method should align with your question and the type of information that you need
• May need more than one method to fully understand an issue
• Qualitative and quantitative are not dichotomous; they are really a continuum
• Use of institutional data
• Surveys
• Rubrics
• Interviews/Focus groups
• Document analysis
• Participant observation/Observation
• Photo elicitation
• Journaling
Strengths and limitations
• Sample sizes
• Level of detail and depth versus breadth
• Direct versus indirect measures
• Numbers or stories
• Validity
• Reliability
• Credibility
• Transferability
• Dependability
• Confirmability
Qualitative Assessment
Why Qualitative Assessment?
• Qualitative inquiry allows us to ask different types of questions that surveys alone might not be appropriate for.
• It is important to remember that these data are just as good as quantitative data; they are simply another kind of information.
• An accessible introduction to this type of assessment might allow your offices to expand their thinking and more deeply explore student needs.
For example…
• A survey question might present a statement such as: “Attending [event] enhanced my understanding of diversity.” The respondent would then answer on a scale (strongly agree to strongly disagree).
• However, a qualitative question might ask: “How was your understanding of diversity affected by attending [event]?”
Expanding thinking:
• The purpose of exploring questions in a qualitative manner is to give us data that we might not be able to get from quantitative methods alone.
Professionals…
• Need to get past the initial hesitations with this type of research.
Quantitative Assessment
• “If you torture numbers long enough, they'll confess to anything.”
• “If you want a green suit, turn on a green light.”
• “You don't fatten the pig by weighing it.”
• “Not everything that counts can be counted, and not everything that can be counted counts.”
• “Statistics are no substitute for judgment… and vice versa.”
• “People don't plan to fail; they fail to plan.”
• “The only people who really welcome change are wet babies.”
• “Change isn't about solving problems, it's about living in a better future.”
• “Data don't speak to strangers.”
• “Without data, you're just another person with an opinion.”
• “Statistics are like bikinis. What they reveal is suggestive, but what they conceal is vital.”
• “Data don’t lie… people do!”
Mixed Methods Assessment
• Be careful: there is a good way to do this, and a background to the design
• Yes, mixed methods research (MMR) does seem particularly applicable to student affairs (SA) work
• Make sure you understand what you want in your outcomes; look for balance
Creswell’s strategies & models
Sequential Explanatory
• I think it's the most popular
• Can seem to make the most sense in SA
• Weight is given to the quantitative side; qualitative data are the “support”
• Particularly helpful when you want to know “why” something is the way it is
• Weaknesses are time and ensuring that bias doesn't interfere in the second stage
Sequential Exploratory…
• Similar to the previous design, but the weight changes
• More popular with qualitative researchers who need to build on a foundation
• Explores a phenomenon
• Evaluates the dispersion of something among a population (e.g., feelings about AIDS), then develops an instrument that can be used across a population
• Requires a substantial amount of time to do well, and you must make decisions at several points about the “most pertinent” results
Concurrent Triangulation
• The most familiar design; criticized because of the lack of integration
• Compares two sets of data (one qualitative, one quantitative) and then analyzes them for corroboration or disconfirmation (see the sketch below)
• Can offset a particular weakness in one area
• Takes great effort to view the two sides equally, and it is hard to resolve discrepancies
• Example: look at your student data file. What would be a good study to complement the hard data?
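To make the data-handling side of concurrent triangulation concrete, here is a minimal pandas sketch that merges a hypothetical quantitative strand (survey engagement scores) with a hypothetical qualitative strand (coded focus-group themes) by student ID and flags cases where the two sources disagree. The column names (student_id, engagement_score, theme), the theme codes, and the 3.5 cutoff are all assumptions for illustration, not part of the original presentation.

```python
import pandas as pd

# Hypothetical quantitative strand: mean engagement scores (1-5 Likert).
quant = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "engagement_score": [4.5, 2.0, 3.8, 1.5],
})

# Hypothetical qualitative strand: themes coded from focus-group transcripts.
qual = pd.DataFrame({
    "student_id": [101, 102, 102, 103, 104],
    "theme": ["connected", "isolated", "overwhelmed", "connected", "isolated"],
})

# Collapse the qualitative codes to one flag per student, then merge strands.
qual_flags = (
    qual.assign(negative=qual["theme"].isin(["isolated", "overwhelmed"]))
        .groupby("student_id", as_index=False)["negative"]
        .any()
)
merged = quant.merge(qual_flags, on="student_id", how="outer")

# Corroboration check: a high survey score paired with negative qualitative
# themes (or vice versa) is a discrepancy worth following up on.
merged["discrepant"] = (merged["engagement_score"] >= 3.5) & merged["negative"]
print(merged)
```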
Where are the data?
Institutional Data
Questions to consider:
• What type of institutional data would strengthen your current work?
• What types of data are readily available on your campus?
• What other data exist, where are they housed, and who are the “gatekeepers”?
Types of institutional data
• Institutional Indicators
• Fact Books or Fact Files
• Enrollment Data
• Institutional Research Reports: Retention and Graduation Rates
Institutional Surveys – Things to consider
• Psychometrics
  • Instrument development
  • Validity
  • Reliability (see the reliability sketch below)
  • Conceptual framework
• Sampling
• Administration
• Research team
• Quality control
• IRB
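As one concrete reliability check, the sketch below computes Cronbach's alpha, a standard internal-consistency statistic, for a set of survey items: α = k/(k−1) × (1 − Σ item variances / variance of the total score). The response matrix is made up for illustration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 students x 4 Likert items (1-5).
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```

Values around 0.70 or above are commonly treated as acceptable for scale-level decisions, though the right threshold depends on the stakes of the decision.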
National Surveys – Things to consider
• Psychometrics
  • http://nsse.iub.edu/html/about.cfm
• Cost
• Utility: driving decision-making vs. good to know
• Benchmark group
  • Should be at least remotely representative of your campus
• Consortia
• Nationally normed
External Data
• Integrated Postsecondary Education Data System
• Common Data Set
• Other sources
• National Consortium
Integrated Postsecondary Education Data System (IPEDS)
What is IPEDS?
• Integrated Postsecondary Education Data System
• A series of surveys sent out by NCES (the National Center for Education Statistics)
• Gathers information from colleges and universities across the country that participate in federal student aid programs
• Any institution that participates in federal student aid programs must publicly report its data (over 6,700 schools report annually)
• Colleges, universities, and federal agencies have a shared interest and investment in this information
What data are available in IPEDS?
• Institutional characteristics
• Enrollment
• Student financial aid
• Degrees conferred
• Student persistence
• Human resources
Additional Resources
• College Navigator (college search)
• IPEDS Data Center (comparing institutional data, creating reports; see the sketch below)
• IPEDS Table Library (national and state-level data)
• IPEDS Resources (general resources, FAQs, etc.)
• http://nces.ed.gov/programs/digest/d08/
• Association for Institutional Research (IPEDS webinars): www.airweb.org
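If you export IPEDS data for local analysis (the IPEDS Data Center can produce CSV files), a few lines of pandas are enough to start comparing institutions. The file name, the enrollment column, and the peer UNITIDs below are assumptions for the example; actual IPEDS variable names vary by survey component and year.

```python
import pandas as pd

# Hypothetical CSV exported from the IPEDS Data Center. UNITID is the
# stable IPEDS institution identifier; "total_enrollment" is an assumed
# column name for this sketch.
df = pd.read_csv("ipeds_enrollment.csv")

# Compare a hypothetical peer set against your own institution.
peer_ids = [204796, 171100, 170976]  # illustrative UNITIDs
peers = df[df["UNITID"].isin(peer_ids)]
print(peers[["UNITID", "total_enrollment"]].describe())
```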
Common Data Set (CDS)
• Available on most IR websites
• Originally developed to provide a central source for publications to get institutional data
• Cleaner and simpler than IPEDS
• Lacks flexibility
• Consistent definition of terms
• http://oaa.osu.edu/irp/irosu_cds.php
What data are available in the CDS?
• General Information
• Enrollment and Persistence
• Admissions data (Transfer and First-Year)
• Academic Offerings and Policies
• Student Life
• Annual Expenses
• Financial Aid
• Instructional Faculty and Class Size
• Degrees conferred
Other Sources of Data
• Specialized Consortia/Associations
  • The Association of American Universities Data Exchange: http://aaude.org/
  • Council of Independent Colleges (CIC): http://www.cic.edu/makingthecase/index.asp
• Accountability tools
  • Voluntary System of Accountability (VSA)
  • University and College Accountability Network (UCAN)
• State Coordinating Boards
  • http://www.flbog.org/resources/quickfacts/
  • http://regents.ohio.gov/perfrpt/index.php