ALUMNI RELATIONS

Recommend to a Friend? ☐ YES ☐ NO

Cornell University uses new tool to measure effectiveness of events

By JENNIFER LYNHAM CUNNINGHAM
“So, Jane, how was your event last night?”
“It was fabulous! A lot of alumni showed up, and everyone seemed to have a great
time. They loved the speaker too.”
“Great. Congratulations!”
“Thanks! Gotta run. I’ve got three more events next week!”
Walk the halls of the alumni affairs office at New York’s Cornell University, and this
is the type of conversation you’ll overhear a few times each week. We spend millions of
dollars and thousands of staff and volunteer hours to produce more than 1,400 events
around the world each year. That’s one event every six hours.
Is it worth it? Do the 40,000 alumni, parents, and friends who attend feel closer
to Cornell after these events? Do they disengage because we didn’t deliver the experience they expected? To find out, we are using a new tool to quantify the quality of
“customer” relationships.
MEANINGFUL METRICS
I put customer in quotes because the early adopters of the tool we use, the Net
Promoter system, are businesses. NPS users believe building long-term relationships
with customers leads to higher and more sustainable profits. The thousands of businesses that use the tool include major corporations such as American Express,
Apple, General Electric, Home Depot, JetBlue, LEGO, Procter & Gamble, and
Zappos. NPS is both a metric (or score) and an approach to conducting business
that focuses on the quality of customer relations.
Businesses that accept this premise calculate the metric using some form of this
question: “On a scale of 0 to 10, how likely are you to recommend X to a colleague
or friend?” Answers to a second question, “Why did you answer that way?” initiate a
customer feedback loop and help employees identify and fix problems.
In this system, people who answer 9 or 10 are considered Promoters. These are
the organization’s best friends. They’re the people who not only like the product or
service but will sing about it to others. For most businesses, they account for more than 80 percent of referrals.

People who answer 7 or 8 are called Passives. They liked the experience but not so much that they’re willing to risk their social capital for it. Their repurchase and referral rates are as much as 50 percent lower than those of Promoters.

People who answer 0 to 6 are called Detractors. They may actively tell people about a bad experience; in fact, 80 percent of negative word-of-mouth reviews come from this group.

The Net Promoter score is equal to the percentage of Promoters minus the percentage of Detractors:

Net Promoter Score = percentage of Promoters (9s and 10s) - percentage of Detractors (0 through 6s)
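For readers who like to see the arithmetic spelled out, here is a minimal sketch of that calculation in Python. The function and sample data are illustrative only, not part of any official NPS tooling; the thresholds simply follow the definitions above.

def net_promoter_score(responses):
    """Compute an NPS from 0-10 'likelihood to recommend' answers.

    Promoters answer 9 or 10, Passives 7 or 8, Detractors 0 through 6;
    the score is the percentage of Promoters minus the percentage of
    Detractors.
    """
    if not responses:
        raise ValueError("need at least one response")
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

# Six Promoters, three Passives, and one Detractor out of ten
# responses yields 60% - 10%, a score of 50.
print(net_promoter_score([10, 9, 9, 10, 9, 9, 8, 7, 8, 4]))  # 50.0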
ORIGINS OF NPS
Fred Reichheld, called the “godfather of customer
loyalty” by eBay’s president and CEO—another NPS
advocate—developed Net Promoter in the early 2000s
with his team at Bain & Company, an international
consulting firm headquartered in Boston. As the
founder of the firm’s Loyalty Practice, Reichheld
wanted to know how accurately traditional customer
satisfaction surveys predicted actual customer behavior.
The team analyzed dozens of satisfaction questions and
found that responses to questions about a customer’s
likelihood to recommend a product or service correlated most strongly with repurchases, referrals, and
other actions that contribute to a company’s growth.
But, you may say, we’re not a business! True, in alumni relations for educational institutions we don’t think in terms of customers or profits, but we do aspire to nurture loyalty from alumni as measured in gifts of time, talent, and treasure. Customer relationships built on loyalty lead businesses to higher growth and profitability. They lead our institutions to more alumni engagement, the currency of our industry.

When asked about the crossover potential of NPS for nonprofits, Reichheld responded in an email, “Net Promoter is particularly relevant for nonprofits because it offers a practical and systematic process for creating accountability for excellence across the wide range of constituents that comprise a nonprofit.”

To explore how nonprofits are using NPS, Bain & Company started a branch of its established NPS Loyalty Forum, a semi-annual conference for senior executives from several of the companies mentioned earlier and dozens more. The first two of these new Social Impact Forums have assembled about 20 nonprofit organizations, including Cornell, Dartmouth College in New Hampshire, Babson College in Massachusetts, DonorsChoose.org, Big Brothers Big Sisters, City Year, and Ascension Health. Rob Markey, co-author with Reichheld of the book The Ultimate Question 2.0: How Net Promoter Companies Thrive in a Customer-Driven World, is a driving force behind the Social Impact Forum. He stresses that NPS can be used to better comprehend and meet the needs of various constituencies, including donors, alumni, and service providers.

CORNELL’S EXPERIENCE

For about three years, Cornell has been using NPS primarily for event and meeting feedback. The practice started following conversations with Barbara Talbott, a friend of Cornell and former chief marketing officer for Four Seasons Hotels and Resorts, headquartered in Toronto. Our alumni affairs staff had already embraced the idea that delivering “wow” customer service at events would lead to more loyal alumni and ultimately more volunteerism and gifts, but we were struggling with how to quantify that goal and measure it consistently across our program areas. Talbott pointed us toward NPS. We read
the books, held discussions, and then started incorporating the system into our surveys.
We now have a pretty well-oiled system in place
whereby we send alumni the same five-question survey following all staff-driven events. Typically our
response rate is about 30 percent, but for special
gatherings such as reunion and affinity networking
events it can jump to 50 percent. We haven’t scaled
up to include the 1,100 annual volunteer-driven
events just yet, but it’s on the table to possibly start
later this year.
We have one SurveyMonkey account so we can
easily download survey data from across program areas
into a spreadsheet with formulas for calculating the
Net Promoter score and the number of Promoters,
Passives, and Detractors.
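As a rough illustration of what those spreadsheet formulas do, the sketch below tallies a survey export in Python. The file layout and the recommend_score column name are assumptions made for the example; an actual SurveyMonkey export is organized differently.

import csv
from collections import Counter

def tally_event(csv_path):
    """Tally Promoters, Passives, and Detractors from a survey export.

    Assumes a hypothetical 'recommend_score' column holding the 0-10
    answer; adjust the field name to match the real export.
    """
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            score = int(row["recommend_score"])
            if score >= 9:
                counts["promoters"] += 1
            elif score >= 7:
                counts["passives"] += 1
            else:
                counts["detractors"] += 1
    total = sum(counts.values())
    nps = 100.0 * (counts["promoters"] - counts["detractors"]) / total
    return counts, nps

# Hypothetical file name for a single event's responses.
counts, score = tally_event("reunion_survey.csv")
print(counts, round(score, 1))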
About a week after an event, we send the spreadsheet, along with suggestions for addressing attendee feedback, to the program director responsible for the event. The spreadsheet also matches respondents’ contact information with their gift officer’s name, so if tracked prospects respond, we can send their comments along to our fundraisers.
We have established annual average scores for
each program area. If an event’s score comes in way
below or above the average, we consider it a yellow
flag that we should try to fix the problem for next
time or study what went so well and replicate it.
Our average score for staff-driven events was 52 percent for fiscal year 2011 and 60 percent for fiscal
year 2012. About 70 percent of respondents are
Promoters, 22 percent are Passives, and 8 percent
are Detractors.
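The article gives no numeric threshold for “way below or above,” so the band in this sketch is an invented placeholder; the logic, though, mirrors the yellow-flag check just described.

def yellow_flag(event_score, program_average, band=15.0):
    """Flag events that land well below or above the program average.

    The 15-point band is a placeholder assumption; the trigger is
    described only as "way below or above" the annual average.
    """
    if event_score < program_average - band:
        return "fix the problem for next time"
    if event_score > program_average + band:
        return "study what went well and replicate it"
    return None

print(yellow_flag(30, 52))  # far below the FY2011 average of 52
print(yellow_flag(80, 60))  # far above the FY2012 average of 60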
Why do we use Net Promoter scores and not
another customer-satisfaction metric? There are three
reasons:
NPS is simple. We don’t need fancy software to
implement it or statisticians to interpret it because we
use a popular web-based survey tool and a spreadsheet
we developed with a few formulas built in.
It’s consistent. We can compare a career networking event year to year, and we can meaningfully compare it to a completely different kind of program. We
could even compare our events with similar ones at
other institutions.
We’re confident it’s a solid metric. Thousands of
successful and long-standing organizations, including
Enterprise Rent-A-Car, Southwest Airlines, Logitech,
and Progressive Insurance, have vetted its validity
and usefulness all the way up through their CEOs.
So far, we haven’t heard any compelling arguments
against it.
DATA STORIES
We recently analyzed 30 events to see if there were
any patterns in the feedback among Detractors. The
happiest takeaway is that we have talented program
directors and event planners. It turns out that when
people are unhappy at events, it’s usually because
they expected something different, not because the
event itself was executed poorly. Reasons for low
scores include things such as the recommended parking garage wasn’t close enough, the people an attendee
wanted to meet didn’t show up, or someone didn’t
realize the event had a cash bar. Armed with this type
of specific feedback, it’s fairly easy for staff to make
improvements.
Only about 8 to 10 percent of our survey respondents are Detractors. However, among all the alumni
who have attended an event in the last four years,
25 percent have come to only one. Why aren’t those
people returning?
We also analyzed Promoters vs. Detractors in
terms of giving and found that donors at every level
are more likely to recommend Cornell events to fellow alumni than nondonors. Alas, this is a chicken-and-egg scenario: Do they donate because they enjoy
events, or as donors, are they prone to view Cornell
events favorably? Does the answer matter?
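Splitting the same survey responses by donor status is a one-step segmentation. In this sketch the is_donor flag is a hypothetical field that would come from matching respondents to giving records, much as our spreadsheet matches respondents to gift officers.

def nps_by_donor_status(records):
    """Compare NPS for donors vs. nondonors.

    `records` holds (score, is_donor) pairs; the is_donor flag is an
    assumed field joined in from giving records.
    """
    results = {}
    for label, flag in (("donors", True), ("nondonors", False)):
        scores = [s for s, d in records if d == flag]
        if scores:
            promoters = sum(1 for s in scores if s >= 9)
            detractors = sum(1 for s in scores if s <= 6)
            results[label] = 100.0 * (promoters - detractors) / len(scores)
    return results

print(nps_by_donor_status([(10, True), (9, True), (5, False), (8, False)]))
# {'donors': 100.0, 'nondonors': -50.0}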
Finally, we find that affinity-based events generally
receive higher scores than come-one, come-all events.
Women’s lunches, athletic team reunions, and niche
networking events often get scores in the range of
70 to 90 percent. Through the alumni feedback, we
find that it’s not necessarily the programming that
generates these scores; rather it’s interaction with the
other alumni, parents, and friends. Conversely, and
somewhat logically, webinars often score in the range
of 30 to 55 percent.
WE ARE PROMOTERS OF NPS
We’ve done several presentations on NPS for peers
at other institutions, and a few now use it, including
Vanderbilt University, Santa Clara University, and
Johns Hopkins University.
James Stofan, associate vice chancellor of alumni
relations at Vanderbilt in Tennessee, says his alumni
association appreciates the consistency of the metric,
which informs decisions about programs to invest in
and repeat. In addition, the Vanderbilt Alumni Association recently started using NPS for board member
exit surveys. Responses will help shape a new initiative
to retain, recommend, and recruit volunteer leaders.
Santa Clara University in California started using
NPS about a year ago with the idea that the feedback
loop itself could be an engagement tool for volunteers.
“We were looking for ways to engage a particular
committee of our national alumni board, and around
the same time we saw a compelling presentation on
NPS,” says Kathy Kale, executive director of the
alumni association. “We thought ‘What if we gave
the survey results to our board members and let them
contact the responders?’”
The tactic worked. “The volunteers derive great
satisfaction from knowing they helped turn around
Detractors or further engaged Promoters, and of
course, the alumni appreciate the outreach,” Kale
says. “Our next step is to analyze how it’s been
going, which direction to grow it, and which processes we can tweak to make it even more effective
and useful.”
Ridia Anderson and Elena Thompson at Johns
Hopkins in Maryland are using NPS to segment
target audiences for affinity-group programs. As they
expand offerings, they are building a website. “To
help us shape the message, we’re analyzing post-event
survey data, then reaching out to Promoters for testimonials and information that we can use to attract other like-minded participants,” Thompson says.

LESSONS LEARNED

We’re a couple of years into our NPS journey at Cornell, and the three most important lessons we’ve learned are these:

Get buy-in from senior leaders first. Chris Marshall, associate vice president of alumni affairs, is a strong advocate of NPS. He has repeatedly stressed
to staff the importance of sending post-event surveys
and follow-up. He requests being copied on every
post-event survey report. He’s also spoken about the
metric and the feedback loop at board of trustee
meetings and gatherings of advancement professionals outside Cornell.
Without this kind of top-down push to change the
thinking from quantity of events to quality of engagement and follow-through, the system would not have
launched.
Don’t hype the number. Until you have a benchmark average for each type of program, it’s not helpful
to talk too much about the event scores themselves.
Instead, emphasize the importance of closing the
feedback loop and acting on the comments. As that
becomes standard operating procedure, you’ll gather
more data and the scores will inevitably increase.
Start small. We began the process with a very
small bang, using it just for events. We’ve given colleagues ample time to understand and embrace it.
It took about a year to figure out the mechanics, get
a critical mass of staff using it, and gather enough
compelling data to make an argument for why everyone in the division and across the colleges and
units—not just in advancement—should use NPS.
We’re not completely there, but we recently had a few colleges and units come on board, and last fall a couple of development colleagues used NPS for a high-level stewardship event.
NEXT UP
At Bain & Company they talk a lot about the NPS
“journey.” That’s because it really is a mindset as
much as it is a process. As your organization starts to
focus on alumni loyalty as a key driver, all kinds of
possibilities—and challenges—start to emerge.
Now that we’ve cut our teeth on post-event surveys,
we’re going to do a few more things. First, we want to
dig even deeper into the study of loyalty economics—
the correlation between Promoters and their donations of talent and money. Second, we want to find
out why that 25 percent of alumni who went to an
event didn’t attend a second. Finally, it’s time to start
talking about broadening the use of NPS beyond
events. We hope to send an email survey this fall to
all alumni for whom we have an email address. We
want to ask questions about their decision to attend
Cornell, their likelihood to recommend Cornell to
others, and their participation in alumni activities.
We’ll continue to participate in Bain & Company’s
Social Impact Forum, which has been an important
learning resource. It’s also been career-changing to
spend quality time with nonprofits in other industries,
sharing donor and volunteer strategies. The challenges
are very similar, but the ways other organizations
approach solutions present a learning opportunity.
Of course, in keeping with the NPS approach, I
have to end with the ultimate question: After reading
this article, how likely are you to recommend to your
colleagues that you start using NPS?
Jennifer Lynham Cunningham is the senior director of metrics and marketing in the Office of Alumni Affairs at Cornell University in New York. She blogs about Cornell data weekly at blogs.cornell.edu/fridayta.

COPYRIGHT ©2012 BY THE COUNCIL FOR ADVANCEMENT AND SUPPORT OF EDUCATION; REPRINTED WITH PERMISSION FROM THE OCTOBER 2012 ISSUE OF CURRENTS. ALL RIGHTS RESERVED.