
Open-Ended Survey Questions:
Non-Response Nightmare or Qualitative Data Dream?
American Educational Research Association
April 2013
Angie L. Miller, Ph.D.
Amber D. Lambert, Ph.D.
Center for Postsecondary Research, Indiana University
Literature Review
 There is an increasing trend toward requiring colleges and universities to show measures of their effectiveness (Kuh & Ewell, 2010)
 Driven by a combination of the struggling economy, funding cuts to higher education, and the evolution of the traditional higher education model (e.g., distance education, MOOCs)
 Alumni surveys are an important tool for assessment, but often have lower response rates (Smith & Bers, 1987)
 Due to outdated contact information, suspicion of solicitation for donations, and decreased loyalty after graduation
Literature Review
 Despite lower response rates, qualitative data
from open-ended questions can still provide
rich information from relatively few
respondents (Geer, 1991; Krosnick, 1999)
 Disadvantages of open-ended questions:
 Heavy burden on respondents (Dillman, 2007)
 Some personal characteristics, such as language fluency
and positive affect, can impact likelihood of responding
to open-ended questions (Wallis, 2012)
Research Questions
 Do open-ended responses represent the opinions of the
entire sample? Are some types of respondents more
likely to complete these questions? Does question
placement on the survey impact responses?
The purpose of this study is to explore whether those
with certain demographic and personal
characteristics, including gender, age, cohort, number
of children, marital status, citizenship, race, current
employment status, income, and institutional
satisfaction level, are more or less likely to respond to
open-ended questions placed at the beginning,
middle, and end of an online alumni survey.
Method: Participants
 Data from the 2011 administration of the Strategic
National Arts Alumni Project (SNAAP)
 Participants were 33,801 alumni from 57 different
arts high schools, arts colleges, or arts programs
within larger universities
 Sample consisted of 8% high school level, 70%
undergraduate level, and 22% graduate level alumni
 38% male, 62% female, 0.2% transgender
 Majority (87%) reported ethnicity as Caucasian
 Average institutional response rate: 21%
 Analyses included only those who completed the entire survey (i.e., did not drop out before the end): N = 27,212
What is SNAAP?
 On-line annual survey designed to assess and improve
various aspects of arts-school education
 Investigates the educational experiences and career
paths of arts graduates nationally
 Questionnaire topics include:
 Formal education and degrees
 Institutional experience and satisfaction
 Postgraduate resources for artists
 Career
 Arts engagement
 Income and debt
 Demographics
Method: Open-Ended Measures
 From beginning of survey (Question 17 of 82)
 “Is there anything that [your institution] could have done
better to prepare you for further education or for your
career? Please describe.”
 From middle (Question 44 of 82)
 “Please describe how your arts training is or is not relevant
to your current work.”
 From end (Question 80 of 82)
 “If there are additional things you would like to tell us
about your education, life, and/or career that were not
adequately covered on the survey, please do so here.”
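
For context on how these items feed the later analyses, here is a minimal coding sketch (Python/pandas) that flags whether each open-ended item received any non-blank text; the column names and toy records are assumptions for illustration, not SNAAP's actual data layout.

import pandas as pd

# Toy records standing in for raw open-ended text (illustrative only)
df = pd.DataFrame({
    "q17_text": ["More career advising", "", None],
    "q44_text": ["My training is central to my work", "Not relevant", ""],
    "q80_text": [None, "", "Thanks for asking"],
})

# 1 = respondent left any non-blank text, 0 = item skipped
for item in ["q17_text", "q44_text", "q80_text"]:
    df[item + "_answered"] = df[item].fillna("").str.strip().ne("").astype(int)

print(df.filter(like="_answered").mean())  # proportion responding to each item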
Method: Demographic Measures
 Demographic information collected for:
 Gender (Categorical – 3)
 Age group (Ordinal ranges)
 Graduation cohort (Ordinal ranges)
 Number of children (Ordinal ranges)
 Marital status (Categorical – 4)
 Citizenship (Binary)
 Race/ethnicity “check all” (Binary – 7 total)
 Current employment status (Categorical – 7)
 Income (Ordinal midpoints of ranges; a coding sketch follows this list)
 Institutional satisfaction level (Ordinal – 4-point Likert)
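
The deck does not list SNAAP's actual income brackets, so the ranges and dollar values below are hypothetical; the sketch only illustrates the "ordinal midpoints of ranges" coding named above.

import pandas as pd

# Hypothetical income brackets mapped to their midpoints (assumed values)
income_midpoints = {
    "Less than $20,000": 10000,
    "$20,000 to $39,999": 30000,
    "$40,000 to $59,999": 50000,
    "$60,000 or more": 70000,  # open-ended top bracket: its midpoint is a convention
}

incomes = pd.Series(["Less than $20,000", "$40,000 to $59,999", "$60,000 or more"])
print(incomes.map(income_midpoints))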
Analyses
 A series of 14 chi-squared analyses was conducted for each of the 3 open-ended question binary variables
 For gender, age group, graduation cohort, number of children, marital status, citizenship, each race/ethnicity option, and current employment status
 3 independent-samples t-tests compared institutional satisfaction level between respondents and non-respondents for each of the 3 open-ended variables
 3 non-parametric Mann-Whitney U tests were completed for the income comparisons (see the analysis sketch after this list)
 The skewed income distribution violated t-test assumptions
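
A minimal sketch of the three test families above, using scipy on a toy data frame; the column names (answered_q17, gender, satisfaction, income_midpoint) are assumptions for illustration, and this is not SNAAP's actual analysis code.

import pandas as pd
from scipy import stats

# Toy analysis file: one row per respondent (illustrative values only)
df = pd.DataFrame({
    "answered_q17":    [1, 0, 1, 1, 0, 1, 0, 1],
    "gender":          ["F", "M", "F", "F", "M", "F", "M", "M"],
    "satisfaction":    [3, 4, 2, 3, 4, 3, 4, 4],          # 4-point Likert
    "income_midpoint": [30000, 50000, 10000, 30000, 70000, 30000, 50000, 70000],
})

# Chi-squared: does likelihood of answering differ across a categorical characteristic?
chi2, p_chi, dof, expected = stats.chi2_contingency(
    pd.crosstab(df["gender"], df["answered_q17"])
)

# Welch (unequal-variances) t-test: satisfaction for those who answered vs. skipped
answered = df.loc[df["answered_q17"] == 1, "satisfaction"]
skipped = df.loc[df["answered_q17"] == 0, "satisfaction"]
t, p_t = stats.ttest_ind(answered, skipped, equal_var=False)

# Mann-Whitney U: income comparison that tolerates the skewed distribution
u, p_u = stats.mannwhitneyu(
    df.loc[df["answered_q17"] == 1, "income_midpoint"],
    df.loc[df["answered_q17"] == 0, "income_midpoint"],
)

print(p_chi, p_t, p_u)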
Results: Descriptive Statistics
Much higher percentages of responses for beginning
and middle questions than for the end question:
[Chart: percentage responding – Beginning: 79%, Middle: 68%, End: 24%]
Results: Chi-Squared Analyses
 Females more likely to respond
 Over 50 years of age more likely to respond
 Graduating in or before 1990 more likely to respond
 Singles less likely to respond
 Those with no dependent children more likely to
respond
 Unemployed, retired, or with “other” employment
status more likely to respond
 U.S. citizens more likely to respond
 Race: Asians less likely to respond, but those with
“other” race/ethnicity more likely to respond
Results:
Means and Other Ordinal Comparisons
Those answering beginning and end questions were
significantly less satisfied with overall institutional
experience
Item                   Didn't Answer: Mean   Answered: Mean   t value    df         Effect Size (d)
Near-Beginning Item    3.57                  3.40             20.33***   19914.16   .26
Middle Item            3.44                  3.45             -.907      27082      .01
Near-end Item          3.46                  3.42             4.20***    9767.14    .06
*p<.05; **p<.01; ***p<.001
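
For reference, the fractional df values above are consistent with Welch's unequal-variances t-test; a standard form of that statistic and of Cohen's d is sketched below as an assumption, since the deck does not state the exact variants used.

t = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{s_1^2/n_1 + s_2^2/n_2}},
\qquad
d = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}}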
Results:
Means and Other Ordinal Comparisons
Those answering the open-ended questions had significantly
lower income than those who did not, across all questions
Item                   Didn't Answer: Mean Rank   Answered: Mean Rank   Mann-Whitney U value
Near-Beginning Item    13860.60                   13173.39              74282028.0***
Middle Item            12056.14                   11307.10              54347479.5***
Near-end Item          12099.86                   11409.52              40097544.5***
*p<.05; **p<.01; ***p<.001
Discussion
 Many patterns of results are consistent with
previous literature
 English language fluency is a factor in responding
to questions that require language production
(Wallis, 2012)
 U.S. citizens more likely to respond
 Open-ended responses require greater time and
mental effort
 Those with no dependents, unemployed, retired, and
from older graduation cohorts more likely to respond
Discussion
 Those with negative feelings may be more likely to
voice their opinions in the open-ended items
 Negativity bias found in research with workplace
environments (Poncheri et al., 2008)
 Explains why unemployed more likely to respond
 Also explains why those who provided responses have
significantly lower income and are less satisfied with
institutional experience
 An analysis of the content of the comments might
shed more light on this interpretation
Discussion
 Also found an interesting pattern of an “other” response
style
 For current employment status and race/ethnicity, those who select “other” are more likely to respond to open-ended questions
 Cursory review of the text boxes that accompany the
“other” options shows that many actually do fall into a
listed categorization
 Example: writing in “Caucasian/American Indian” even though both of these were represented in the race/ethnicity “check all” options
 Are these people just more verbose? Or do they have a
disposition that resists the confinement of
categorization?
Conclusions
 Limitations of study: sample may not be completely
representative of all arts alumni (response rates and
selective participation)
 Assessing alumni can provide important information on institutional effectiveness, but researchers must be cognizant that some groups are more likely to provide open-ended responses
 More research is needed on the influence of question
placement for particular groups, as well as personal
and environmental influences contributing to the
“other” survey response style
References
Dillman, D. (2007). Mail and internet surveys: The tailored design method (2nd ed.). New
York: Wiley.
Geer, J. G. (1991). Do open-ended questions measure “salient” issues? Public Opinion
Quarterly, 55, 360-370.
Krosnick, J. (1999). Survey research. Annual Review of Psychology, 50, 537-567.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United
States. Higher Education Management and Policy, 22(1), 1-20.
Poncheri, R. M., Lindberg, J. T., Thompson, L. F., & Surface, E. A. (2008). A comment on
employee surveys: Negativity bias in open-ended responses. Organizational Research
Methods, 11(3), 614-630.
Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and
cost-benefit analysis. Research in Higher Education, 27(3), 218-225.
Wallis, P. (2012, April). Profiling college students who skip open-ended items in
questionnaires with varied item formats. Paper presented at the Annual Meeting of the
American Educational Research Association, Vancouver, British Columbia.
Questions or Comments?
 Contact Information:
 Angie L. Miller anglmill@indiana.edu
 Amber D. Lambert adlamber@indiana.edu
Strategic National Arts Alumni Project (SNAAP)
www.snaap.indiana.edu
(812) 856-5824
snaap@indiana.edu