Faster, Cheaper, Better
Online Research: A Focus on Quality in Realizing the Promise of the Internet
Kurt Knapton, Executive Vice President
e-Rewards Market Research
214-743-5429
kknapton@e-rewards.com
The concepts contained in this presentation are the property of e-Rewards, Inc.
Duplication or dissemination of this information without the express written
consent of e-Rewards, Inc. is prohibited.
November 7, 2005
e-Rewards Market Research is the online quality leader
www.e-rewards.com/researchers
e-Rewards Market Research serves its clients with high-quality online
sample from its B2C, B2B, and specialty panels.
e-Rewards Consumer™ Panel
• Over 2,000,000 members
• 300+ profile dimensions
• Geographically balanced
• Mean Age: 44
• Men: 53% / Women: 47%

e-Rewards Business™ Panel
• Over 1,000,000 members
• 40+ firmagraphic dimensions
• C-Level Executives: 7%
• Professional/Managerial: 72%
• College Degree or More: 72%
• Post Graduate Study/Degree: 32%
Specialty Consumer Panels
• Ailments Panel
• Life Events Panel
• Affluent Panel
• Traveler Panel
• 8 Other Consumer Specialty Panels

Specialty Business Panels
• CxO Panel
• IT Decision Maker Panel
• Business Owners Panel
• Physician Panel
• 6 Other Business Specialty Panels
Session Topic: Online Research Quality
The dramatic shift in recent years towards online
research has been driven largely by the desire
for reduced field time and cost savings.
But has the promise of “faster and cheaper”
desensitized the research community to potential
quality issues when deploying online
methodologies?
Online Research
Question to Researchers:
“Which of the following describes Online Research
versus other research modes?”
Faster (Not in Debate – A Key Strength)
Cheaper (Typically the Most Cost Effective)
Better (It Depends on How You Conduct it!)
What is Better About Online Research?
Question to Researchers:
“What aspects of the Online Research mode are better?”

The Survey Instrument
• Preferred by Respondents (More Convenient)
• Reduces Interviewer Bias
• Doesn’t “Call in the Middle of Dinner” (Polite)
• Media Richness (Graphics/Video/Audio)
• Question-Level Validation (No Data Gaps)
• Ease of Operation (Self-Paced, Start/Stop)
• Perceived as Anonymous (Candid Honesty)
• Quiet/Non-Intrusive Interview Context
• Longer Attention Spans (vs. Phone)
• Eliminates Data “Re-Keying” Error
What is Better About Online Research?
Question to Researchers:
“What aspects of the Online Research mode are better?”

Respondent Control
• Targeted Population Frames
• Respondent Screening
• National / International Reach
• Storage/Retrieval of Profile Data History
• Past Participation Tracking
• Automated “Time to Complete” Capture (see the sketch below)
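To make the last two bullets concrete, here is a minimal sketch, assuming a simple server-side data model (all class and field names are illustrative, not e-Rewards’ implementation), of how a survey platform can capture “time to complete” automatically and keep a per-panelist participation history:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List


@dataclass
class Participation:
    survey_id: str
    started_at: datetime
    finished_at: datetime

    @property
    def seconds_to_complete(self) -> float:
        # Automated "time to complete" capture, derived from server-side timestamps.
        return (self.finished_at - self.started_at).total_seconds()


@dataclass
class Panelist:
    panelist_id: str
    profile: Dict[str, str] = field(default_factory=dict)       # stored profile data history
    history: List[Participation] = field(default_factory=list)  # past-participation tracking

    def record_complete(self, survey_id: str, started_at: datetime) -> Participation:
        # Every complete is appended to the panelist's history with its duration.
        p = Participation(survey_id, started_at, datetime.utcnow())
        self.history.append(p)
        return p

    def has_taken(self, survey_id: str) -> bool:
        return any(p.survey_id == survey_id for p in self.history)
```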
What is Better About Online Research?
Question to Researchers:
“What aspects of the Online Research mode are better?”

Sample Quality (the key issue)
• Methodological Purity?
• How Representative?
• Respondent Validation – “Survey Gamers”?
• Respondent Duplication?
• Response Rates?
• Overusage?

“Better” (or even “As Good”) depends on Sample Quality
What is the Impact of Low Sample Quality?
“Quality Research Requires Quality Respondents”
GARBAGE IN
GARBAGE OUT
Results integrity is ultimately at stake. Of course,
this is true across all research modes.
What Defines Online Research Sample Quality?
We Believe There are 5 Key Areas of Sample Quality
 SAMPLE RECRUITMENT
 SAMPLE SCREENING
 SAMPLE COMPOSITION
 SAMPLE MAINTENANCE
 SAMPLE INCENTIVES
How Can Online Sample Quality be Measured?
We think in terms of a 15 Point Checklist
 SAMPLE RECRUITMENT
1. Control of Sample Sources / Pre-Validation (Where it all starts!)
2. Recruitment Method (“Closed” vs. “Open” – Pros/Duplication)
3. Recruitment Mode Diversity (Mix of Online & Offline)
 SAMPLE SCREENING
4. “Double Blind” Screening Technique (Enforced Internal Validity)
Source: http://company.e-rewards.com/15points
How Can Online Sample Quality be Measured?
15 Point Checklist for Sample Quality (Continued)
 SAMPLE COMPOSITION
5. Fraud Prevention/Detection (An Ounce of Prevention…; see the enrollment-screening sketch after this checklist)
6. Sample Verification (Multiple Checks)
7. Normalization (Example: Gender Composition is Telling)
8. Segmentation (Deeper Profiling = More Sample Control)
9. Participation Rules by Topic (Track and Enforce)
Source: http://company.e-rewards.com/15points
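As an illustration of checklist points 5 and 6 (a sketch under assumed field names and rules, not e-Rewards’ actual screening logic), an enrollment pipeline might hash identifying fields to catch duplicate sign-ups and turn away applicants who arrive without a pre-validated invitation:

```python
import hashlib


def _key(*parts: str) -> str:
    """Case- and whitespace-insensitive hash used to compare enrollments."""
    normalized = "|".join(p.strip().lower() for p in parts)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def screen_enrollment(applicant: dict, seen_keys: set) -> tuple:
    """Return (accept, reason). `seen_keys` holds hashes of prior enrollments."""
    email_key = _key(applicant["email"])
    address_key = _key(applicant["name"], applicant["postal_address"])

    if email_key in seen_keys or address_key in seen_keys:
        return False, "duplicate of an existing panelist"          # duplication / fraud check
    if not applicant.get("invitation_code"):
        return False, "no invitation code: open, self-selected sign-up"  # "closed" recruitment rule

    seen_keys.update({email_key, address_key})                     # remember this enrollment
    return True, "accepted"
```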
What Defines Online Research Sample Quality?
15 Point Checklist for Sample Quality (Continued)
 SAMPLE MAINTENANCE
10. Sample Profile Data Recency (Old Data is Less Reliable)
11. Survey Frequency Controls (Over-Surveying Hurts Everyone; see the throttle sketch after this checklist)
12. High Response Rates (Lessens Non-Response Bias Risk)
13. High Retention Rates (Key for Longitudinal Observations)
14. Respect for Respondent Privacy (Promotes Honesty & Trust)
 SAMPLE INCENTIVES
15. A “Fair Value Exchange” with Respondents (Value Their Time)
Source: http://company.e-rewards.com/15points
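Point 11, survey frequency control, reduces to a simple throttle. A minimal sketch, assuming a rolling 30-day window and an illustrative cap of four invitations (the deck does not state the real limits):

```python
from datetime import datetime, timedelta

MAX_INVITES_PER_WINDOW = 4   # illustrative cap, not e-Rewards' actual rule
WINDOW = timedelta(days=30)  # rolling 30-day window


def may_invite(invite_log, now: datetime) -> bool:
    """invite_log: datetimes of invitations already sent to one panelist."""
    recent = [t for t in invite_log if t >= now - WINDOW]
    return len(recent) < MAX_INVITES_PER_WINDOW

# Example: a panelist invited three times in the last 30 days may receive one more survey.
```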
Other Voices in the Industry are Speaking About
Online Research Quality Concerns
comScore
In a recent study, comScore used observation-based research to track its panel
members and quantify the concentration levels and activity of “professional
survey respondents”.
Sigma Validation
Mary Beth Weber at the AMA 2005 Marketing Research Conference, Boston, MA
September 25-28, 2005, “Why Validate Internet Research?”
20/20 Research-Online
September 1, 2004 White Paper by Rachael Krupek entitled: “Handling Paid
Survey Sites.”
American Sports Data
Harvey Lauer, “You Say Evolution, I Say Devolution: Has Data Collection Improved
or Gotten Worse?” Quirk’s Marketing Research Review, July/August 2005
What comScore Sees:
“More Than 30% Of All Online Surveys Are Completed By Less Than
0.25% Of The Population
A recent study by comScore has confirmed the dawn of the "professional
survey respondent," and validated the growing concern that such consumers
do not represent the broader population. Further, panelists in this small group
take an average of 80 surveys over a 90-day period — with some taking
several surveys per day!
comScore research also shows that members of the panels offered by most of
the leading online survey suppliers are, on average, members of as many as
seven other panels! It goes without saying that these levels of saturation are
unacceptable and can be expected to have a significant negative impact on
the quality and accuracy of panelists' survey responses.
The industry is at a crossroad — market factors render a complete return to RDD
impractical, but there is clearly a need to identify methods to improve the quality
of online samples and associated responses.”
Source: http://www.comscore.com/custom-research/sample.asp Also referenced by Mary Beth Weber, Sigma Validation at
AMA 2005 Marketing Research Conference, Boston, MA September 25-28, 2005, “Why Validate Internet Research?”
What 20/20 Research–Online Says:
“Almost all paid survey sites encourage members to join all 250-450+ research
companies’ panels on their list so their members can have the chance to
participate in hundreds of surveys, focus groups and mystery shops.”
“Some sites even offer software to ‘help you fill out your surveys up to 300%
faster.’ In essence, these sites teach people how to be professional
respondents.”
“Because these sites are legal as long as they deliver the list/report/database
consumers are paying for, market research professionals have to be proactive
in protecting the integrity of their databases by using some of the
following practices:
…Decide if you will accept respondents from paid survey sites into your
database at all or conditionally
…Contact paid survey sites and ask them to remove your company from
their list/report/database”
Source: September 1, 2004 White Paper by Rachael Krupek entitled “Handling Paid Survey Sites.”
http://www.qualtalk.com/news/wp040901.htm Also referenced by Mary Beth Weber, Sigma Validation at AMA 2005
Marketing Research Conference, Boston, MA September 25-28, 2005, “Why Validate Internet Research?”
Why Should I Listen? Does it Matter?
We asked these questions ourselves.
As a result, we ran a 6-month observational test during
the first half of 2004 to examine one of the 15 points of
online sample quality differentiation:
Online Panel Recruitment Method
(“Open” vs. “Closed” Approach)
What is Meant By “Open” vs. “Closed” Online Panel
Recruitment / Sample Sourcing?
“Open” Online Panel Recruitment may be defined as a
method of allowing any person who has access to the
Internet to “self select” and enroll into a market
research panel.
By contrast, “Closed” or “By Invitation Only” Online
Panel Recruitment may be defined as a method of
inviting only pre-validated individuals or individuals who
share known characteristics to enroll into a market
research panel.
The “Open” Recruitment Problem:
Self-Selecting Professional Survey Takers
The Open Door: literally scores of pages of links to “open” online panel recruitment sites
“Open” Panel Recruitment Example #1
“Open” Panel Recruitment Example #2
Dozens of panels sourcing members from the same place
“Open” Panel Recruitment Example #3
“Open” Panel Recruitment Example #3 (Cont’d.)
Over 300 more panels listed
CASE STUDY: A Comparison of “Open” vs. “By
Invitation Only” (or “Closed”) Sample Sourcing
e-Rewards Market Research, an online sample provider
based in Dallas, Texas, conducted a 6-month
observational test during the first half of 2004 to
compare and contrast “open” sample/respondent
recruitment quality vs. e-Rewards’ controlled “by
invitation only” or “closed” sample/respondent recruitment
approach.
We wanted to know if we were “missing the bus” by not
accepting members from paid survey sites and online panel
aggregator sites.
The following pages present the key findings of our study.
CASE STUDY: Experiment Methodology Overview
Between January and June 2004, e-Rewards Market Research
enrolled 38,162 panel members (representing less than 2% of all
currently enrolled panel members) into its Consumer Panel using an
“open” enrollment methodology.
In a parallel tracking experiment that was conducted for an
additional year of study (July 2004 – July 2005), observations were
made to compare these “open-sourced” members with e-Rewards’
“by invitation only” sourced panelists.
The following areas were compared for the two groups:
Demographics, Motivations, Average Survey Complete Times, Fraud
Detection, Response Rates, and Outside Panel Participation
Duplication and Frequency.
CASE STUDY: Demographic Comparison
[Charts: “Open” Sourced vs. “By Invitation Only” demographics]
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Demographic Comparison (continued)
[Charts: “Open” Sourced vs. “By Invitation Only” demographics, continued]
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Motivational Profile Comparison
“Open” Sourced: 2.19 Answers per Respondent
vs.
“By Invitation Only”: 1.77 Answers per Respondent
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Motivational Profile Comparison (continued)
[Charts: “Open” Sourced vs. “By Invitation Only” preferred survey e-mail frequency]
Note: Panelists are asked during panel enrollment to provide a maximum number of survey e-mails preferred.
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Professional Survey Taking Behavior Comparison: “Mean Survey Time”
“Open” Sourced: Mean Time to Complete = 8 minutes, 22 seconds (14% faster)
vs.
“By Invitation Only”: Mean Time to Complete = 9 minutes, 45 seconds
Source: e-Rewards Market Research panel statistics, 2005
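As a quick check on the “14% faster” call-out: 8 minutes 22 seconds is 502 seconds and 9 minutes 45 seconds is 585 seconds, so the mean “open” complete runs at 502 / 585 ≈ 0.86 of the “by invitation only” mean, i.e. roughly 14% faster.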
CASE STUDY: Professional Survey Taking Behavior Comparison: “Survey Time Outliers”
“Open” Sourced: % of “Too Fast” Outliers, Average per Study = 2.1%
vs.
“By Invitation Only”: % of “Too Fast” Outliers, Average per Study = 0.6%
Source: e-Rewards Market Research panel statistics, 2005
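One common way to flag “too fast” outliers such as these (a sketch of a typical speeder check, not necessarily the rule behind the figures above) is to compare each complete against the study’s median completion time; the one-third cutoff is an assumption:

```python
from statistics import median


def too_fast_outliers(durations_sec, cutoff_ratio=1 / 3):
    """Return indices of completes faster than cutoff_ratio * the study's median time."""
    med = median(durations_sec)
    return [i for i, d in enumerate(durations_sec) if d < cutoff_ratio * med]

# Example: with a 585-second median, any complete under 195 seconds would be flagged.
```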
CASE STUDY: Response Rate Comparison
“Open” Sourced: Average Response Rate = 21.8%
vs.
“By Invitation Only”: Average Response Rate = 22.7%
(Comparable)
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Outside Panel Membership (e.g. Cross Panel Duplication)
[Charts: “Open” Sourced vs. “By Invitation Only” outside panel membership]
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Outside Panel Membership (e.g. Cross Panel Duplication) (continued)
[Charts: “Open” Sourced vs. “By Invitation Only” outside panel membership, continued]
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Outside Panel Survey Frequency (e.g. Cross Panel Duplication)
[Charts: “Open” Sourced vs. “By Invitation Only” outside panel survey frequency]
Source: e-Rewards Market Research panel statistics, 2005
CASE STUDY: Outside Panel Survey Frequency (continued)
[Charts: “Open” Sourced vs. “By Invitation Only” outside panel survey frequency, continued]
Source: e-Rewards Market Research panel statistics, 2005
This is a Global Market Research Industry Issue
Market Research Trends Affecting our Industry in the Next 5-10 Years?
 35% answered that “The Internet will continue to revolutionise the business”
Most Serious Threats to the Industry in the next 5-10 Years?
 54% said, “Clients lacking skills to recognise the difference between good and
poor quality of research by clients”
Key Factors for the Market Research Industry in the Future?
 58% responded, “Standards of Performance/Quality Standards”
Source: CASRO’s 30th Annual Conference, September 28-30, 2005, at which Gunilla Broadbent (ESOMAR
Council Member & Treasurer) presented worldwide ESOMAR research findings from the Vision 2010 Survey
In Summary
The many inherent advantages of online research
will likely continue to propel its adoption as a
leading source of research responses.
However, the delivery of “faster and cheaper”
research should not stop the research community
from addressing online quality issues and
establishing meaningful quality metrics.
QUESTIONS AND ANSWERS
THANK YOU!
Kurt Knapton, Executive Vice President
e-Rewards Market Research
214-743-5429
kknapton@e-rewards.com
The concepts contained in this presentation are the property of e-Rewards, Inc.
Duplication or dissemination of this information without the express written
consent of e-Rewards, Inc. is prohibited.
November 7, 2005