
Assessment with Alumni Surveys:
Administration Tips and Data Sharing Suggestions
Assessment Institute
October 2013
Amber D. Lambert, Ph.D.
Angie L. Miller, Ph.D.
Center for Postsecondary Research, Indiana University
Abstract
Alumni surveys can be important sources of information
for institutions, yet many obstacles are involved in their
execution. The Strategic National Arts Alumni Project
(SNAAP) is a multi-institution online survey of arts
graduates from secondary and postsecondary institutions.
This presentation shares lessons learned during the first
five years of SNAAP, including issues related to accurate
alumni contact information, response rates, design factors,
and implementation of results. Actual examples from
previous SNAAP-participating institutions are shared to
promote idea generation among participants for the use of
alumni surveys at their own institutions.
Literature Review and
Introduction to SNAAP
Literature Review
 As funding to higher education institutions
continues to be cut, colleges and universities
are often required to show measures of their
effectiveness (Kuh & Ewell, 2010)
 Surveys are used in many areas of higher
education (Kuh & Ikenberry, 2009; Porter,
2004)
 Alumni surveys can provide valuable information on
student satisfaction, acquired skills, strengths and
weaknesses of the institution, and current career
attainment
Literature Review
 A major concern with all surveys, and alumni
surveys in particular, is low response rates
 Over the last decade, survey response rates have
been falling (Atrostic, Bates, Burt, & Silberstein,
2001; Porter, 2004)
 Alumni surveys often have lower response rates
than other types of surveys (Smith & Bers, 1987)
due to:
 Bad contact information
 Suspicion of money solicitation
 Decreased loyalty after graduation
SNAAP
 As an example, we present some best practices for
survey administration and share results from the
Strategic National Arts Alumni Project (SNAAP)
 What is SNAAP?
 Online annual survey designed to assess and
improve various aspects of arts-school education
 Investigates the educational experiences and career
paths of arts graduates nationally
 Findings are provided to educators, policymakers,
and philanthropic organizations to improve arts
training, inform cultural policy, and support artists
SNAAP Basics: Who is surveyed?
 Participants drawn from…
 Arts high schools
 Independent arts colleges
 Arts schools or departments in
comprehensive colleges & universities
 Broad definition of “arts”
 All arts alumni, all years (since 2011)
 2008-2010: surveyed selected cohorts
Increasing Numbers…
 2010 Field Test
 Over 13,000 respondents
 154 institutions
 2011 Administration
 More than 36,000 respondents
 66 institutions
 2012 Administration
 More than 33,000 respondents
 70 institutions
 2013 Administration
 Currently underway
 Combined 2011 and 2012 respondents to create a “SNAAP Database”
with over 68,000 respondents – plan to add 2013 data after this year!
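As a rough illustration of how such a cumulative database might be assembled (hypothetical file names and columns; not SNAAP's actual data pipeline), a minimal pandas sketch:

import pandas as pd

# Hypothetical per-administration files; actual SNAAP data files and columns differ.
files = {2011: "snaap_2011_respondents.csv", 2012: "snaap_2012_respondents.csv"}

frames = []
for year, path in files.items():
    df = pd.read_csv(path)
    df["admin_year"] = year  # tag each record with its administration year
    frames.append(df)

# Stack the administrations into one cumulative file; a 2013 file could be
# appended the same way once that administration closes.
snaap_db = pd.concat(frames, ignore_index=True)
print(len(snaap_db), "combined respondent records")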
Questionnaire Topics
1. Formal education and degrees
2. Institutional experience and satisfaction
3. Postgraduate resources for artists
4. Career
5. Arts engagement
6. Income and debt
7. Demographics
Participating institutions receive...
 Institutional Reports: Customized, Confidential
 Separate reports for undergraduate and graduate
alumni
 Both quantitative and qualitative data
 Special report on “Recent Grads”
 Comparative data with other schools
 Complete data file of responses
 Workshops/webinars on how to use data
Survey Administration
Challenges
Survey Administration Challenges:
Locating the Lost
 Important that contact
information is accurate and
up-to-date
 Encourage proactive efforts
 Newsletters
 Websites, social networking
 Alumni tracking
 Contracted with Harris
Connect, a direct marketing
firm
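To illustrate the kind of contact-list upkeep this implies, a hedged sketch (hypothetical file and column names; not the actual vendor process) that keeps the most recently verified email address for each alum:

import pandas as pd

# Hypothetical files: the institution's alumni roster and an updated contact
# file returned by an alumni-tracking vendor (column names are made up).
roster = pd.read_csv("alumni_roster.csv")    # alum_id, email, last_verified
updates = pd.read_csv("vendor_updates.csv")  # same columns, newer verification dates

# Keep whichever record was verified most recently for each alum.
combined = pd.concat([roster, updates], ignore_index=True)
combined["last_verified"] = pd.to_datetime(combined["last_verified"])
current = (combined.sort_values("last_verified")
                   .drop_duplicates(subset="alum_id", keep="last"))

current.to_csv("alumni_roster_updated.csv", index=False)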
Survey Administration Challenges:
Response Rates
 Response rates are directly
related to the accuracy of
contact information
 Incentives: only minimally
effective
 “Open enrollment” features can
increase the number of responses
 Social networking sites
 Need to verify respondents
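As a rough sketch of these last two points, the hypothetical example below computes a response rate that excludes undeliverable addresses and flags open-enrollment respondents who still need verification (illustrative file and column names only):

import pandas as pd

# Hypothetical files and columns, for illustration only.
invited = pd.read_csv("invited_alumni.csv")  # alum_id, email, bounced (0/1)
responses = pd.read_csv("responses.csv")     # email, source

# Adjusted response rate: exclude addresses known to be undeliverable.
deliverable = invited[invited["bounced"] == 0]
responded = responses["email"].isin(deliverable["email"]).sum()
print("Adjusted response rate:", round(responded / len(deliverable), 3))

# Open-enrollment respondents (e.g., recruited through social networking sites)
# must be checked against the alumni roster before they are counted.
open_enroll = responses[responses["source"] == "open_enrollment"]
unverified = open_enroll[~open_enroll["email"].isin(invited["email"])]
print(len(unverified), "open-enrollment responses need manual verification")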
Survey Administration Challenges:
Response Rates
 Email invitations to
participate in the
survey
 Is it better to have
HTML or plain text?
 For the 2011
administration, we
created visually
appealing email
invitations in HTML
format
Survey Administration Challenges:
Response Rates
 For the 2012 administration, we systematically
compared the effectiveness of HTML invites to
plain text invites across the 5 email contacts sent to
participants
 Results of this experiment suggested that a
combination of message types gets the highest
response rates
 Plain text was more effective for the initial contact
 HTML was more effective for follow-up contacts
 Potential reasons: plain text may reach a larger
number of recipients, while HTML may lend the project legitimacy
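A minimal sketch of how response rates for the two formats could be compared within a single contact wave, using illustrative counts and a two-proportion z-test (this is not SNAAP's actual analysis):

from math import sqrt
from statistics import NormalDist

def compare_response_rates(resp_a, sent_a, resp_b, sent_b):
    """Two-proportion z-test comparing response rates for two invitation formats."""
    p_a, p_b = resp_a / sent_a, resp_b / sent_b
    pooled = (resp_a + resp_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Illustrative counts for one contact wave: plain text vs. HTML invitations.
print(compare_response_rates(resp_a=312, sent_a=5000, resp_b=268, sent_b=5000))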
Examples: Sharing on Campus
 Miami University of Ohio: “Assessment Brief”
published by the Institutional Research office after
receiving its 2012 SNAAP Institutional Report
 Purdue University: Office of Institutional Research
published a 4-page report summarizing data from the
2011 Institutional Report, integrating both
quantitative and qualitative data
Assessment Brief #62
October 12, 2011
Miami University
Assessment
Using Feedback from Miami Alumni
to Improve Educational Effectiveness
Surveying Alumni
Miami students are frequently surveyed throughout
their college experiences. However, assessing the
long-term impact of students’ Miami education can
also require reaching out to students after they
graduate. Feedback from alumni, who are now using
the skills they developed at Miami, can greatly
improve educational effectiveness.
The Strategic National Arts Alumni Project
(SNAAP) survey gathers information about fine arts
alumni to better understand the relationship between
arts education and arts-related occupations. The
SNAAP participants from Miami University
consisted of 220 undergraduate fine arts alumni who
graduated in the following years: 1990, 1995, 2000,
and 2005-2009.
 Respondents were least satisfied with
opportunities to network with alumni and others,
advice about further education, career advising,
and work experience.
 The vast majority of respondents reported
developing critical and creative thinking skills
while at Miami and found these skills important
in their future careers.
 Fine arts alumni were less likely to report that
Miami helped them to develop business and
technological skills related to their field.
Student Satisfaction
The survey included questions about institutional
experiences and career choices. To capture
institutional experiences, the survey prompted alumni
to report their overall satisfaction with their education
as well as their satisfaction with specific areas (e.g.,
academic advising, freedom to take risks). In the
career section, alumni reported their current and
previous occupations, their satisfaction with these
jobs, and their current level of fine arts engagement.
To explore the intersection between institutional
experience and careers, the survey asked alumni
about the skills and competencies they developed at
Miami University as well as which skills were most
important in their current job.
By reviewing these results, faculty and staff can
better understand how students’ experiences at
Miami prepare them for their careers.
Key Findings
 Fine arts alumni were satisfied with their
experiences at Miami University; 94% of
undergraduate arts alumni rated their overall
experience as good or excellent.
 Arts alumni were especially satisfied with their
sense of belonging at Miami and with their
instructors.
Recommendations
The SNAAP survey highlights the importance of
gathering alumni feedback. Such feedback is a
valuable resource for assessing educational impact
and improving educational effectiveness across the
university. The SNAAP survey helps faculty and
staff in fine arts by identifying the following:
 Common occupations and post-secondary
degrees among graduates
 Skills and competencies that students will
frequently use in their careers
 Levels of student satisfaction with various
aspects of their Miami experience
These results can help the division improve retention
and graduation rates and better prepare students for
their future careers.
If you have comments or questions, please contact the Center for the Enhancement of Learning, Teaching and University
Assessment at celt@muohio.edu or 513-529-9266. Previous Briefs are available online at:
http://www.units.muohio.edu/celt/assessment/briefs/.
Examples: Alumni and Donor Outreach
 The University of Texas at Austin: College of Fine
Arts promoted SNAAP results in 2011 with the
Dean’s letter in its quarterly print publication,
thanking alumni for participating and sharing
selected findings
 In 2012, UT Austin also used its SNAAP data,
integrated with other information sources such as
IPEDS, to develop a web page that illustrates some
of its findings with infographics
Examples: Alumni and Donor Outreach
 Herron School of Art + Design (IUPUI): created a
website to share selected findings, and integrated
SNAAP data with “alumni profiles” on the website
 Virginia Commonwealth University: mentioned
SNAAP participation in its VCU Alumni
publication and incorporated an option for alumni
to update their contact info for future surveys
Examples: Recruitment
 Kent State University: created a flier for those
considering majoring in visual arts using aggregate
SNAAP findings
 Herron School of Art + Design (IUPUI): created a
recruitment brochure based on its alumni
achievements, potential careers, and comments
from the SNAAP survey
Examples: Program & Curricular Change
 Virginia Commonwealth University: found
discrepancies when comparing the business skills
alumni needed for their work to the business skills
learned at VCU
 Introduced a new “Creative Entrepreneurship” minor to
address this
 University of Utah: found that many alumni were
dissatisfied with the career advising their school offered
 College of Fine Arts developed an Emerging Leaders
Program that offers “high-stakes internships,” mini-grants,
and peer-mentoring opportunities designed to prepare
students for their transition into the world of work
New VCU Course Catalog Listings…
References
Atrostic, B. K., Bates, N., Burt, G., & Silberstein, A. (2001). Nonresponse in U.S.
government household surveys: Consistent measure, recent trends, and new insights.
Journal of Official Statistics, 17(2), 209-226.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the
United States. Higher Education Management and Policy, 22(1), 1-20.
Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning
outcomes assessment in American higher education. Urbana, IL: University of Illinois
and Indiana University, National Institute for Learning Outcomes Assessment.
Porter, S. R. (2004). Raising response rates: What works? New Directions for
Institutional Research, 121, 5-21.
Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment
and cost-benefit analysis. Research in Higher Education, 27(3), 218-225.
*Special thanks to Miami, Purdue, IUPUI, UT Austin, VCU, Kent State, and U
of Utah for sharing their examples with SNAAP!