Survey Completion Rates and Resource Use
at Each Step of a Dillman-Style Multi-Modal Survey
Submission for Public Opinion Quarterly
Authors’ Names and contact information:
Andrea Hassol, Abt Associates Incorporated, e-mail: andrea_hassol@abtassoc.com
Holly Harrison, Abt Associates Incorporated, e-mail: holly_harrison@abtassoc.com
Brenda Rodriguez, Abt Associates Incorporated, e-mail: brenda_rodriguez@abtassoc.com
Ricki Jarmon, Abt Associates Incorporated, e-mail: ricki_jarmon@abtassoc.com
Austin Frakt, Abt Associates Incorporated, e-mail: frakt@bu.edu
55 Wheeler St., Cambridge MA 02138
Phone: 617-492-7100
Fax: 617-492-5219
Supported by contract #CMS-500-00-0032/TO2
Abstract
Considerable research has been conducted on methods for improving response rates to
surveys. Several key factors are known to affect response rates, including: salience of the
survey, survey mode, monetary incentives, and multiple contacts. A response-maximizing approach to multi-modal surveys, as best articulated by Dillman (2000),
includes up to five contacts with survey recipients, stamped return envelopes,
personalized correspondence and a token financial incentive.
In designing data collection strategies, survey researchers must weigh available resources
against expected returns. Before beginning our survey of Medicare beneficiaries, we
searched the literature for information on large surveys where researchers measured the
additional increment in completion rates achieved by each step in the process, as well as
the costs of each sequential step. We found minimal research that tangibly demonstrated
the effects on completion rates of reminder postcards, additional mailings, and the use of
first class versus priority mail.
This article describes the results of a Dillman-style multi-modal approach, questionnaire
completion rates, and costs associated with each step in the approach. The survey was
conducted as part of an evaluation of beneficiaries’ experiences with a new Medicare
insurance option. This article does not describe the effects of the option; it focuses on
survey methodology and completion rates rather than on the substance of the survey
itself. This work validates many of the axioms of survey research, in a much-studied
population, and offers survey researchers benchmarks for each additional step in the
process.
Considerable research has been conducted on methods for improving response
rates to surveys. Several key factors are known to affect response rates, including
salience of the survey (Heberlein and Baumgartner, 1978), form of mailing and
monetary incentives (Dillman, 1991), and multiple contacts (Linsky, 1975; Dillman,
1991). A response-maximizing approach to multi-modal surveys, as best articulated by
Dillman (1978), includes:

- A respondent-friendly questionnaire;
- Up to five contacts with recipients of the survey (a brief pre-notice letter sent a few days prior to the arrival of the questionnaire; a questionnaire mailing that includes a detailed cover letter; a thank you/reminder postcard; a replacement questionnaire; and a final contact, possibly by telephone);
- Inclusion of stamped return envelopes;
- Personalized correspondence; and
- A token financial incentive.
When designing cost-efficient data collection efforts, researchers must usually balance
available resources against expected returns. For example, telephone follow-up can be
extremely costly and might yield fewer completes than would an additional mailing of a
questionnaire. To maximize response rates, we sought to utilize as many of Dillman’s
recommendations as were feasible in the context of the study.
The findings in this paper are from a survey funded by the Centers for Medicare
and Medicaid Services (CMS). CMS specified a response rate of over 70% for the
survey, using a mail survey with telephone follow-up. A literature review was conducted
to identify efficient strategies for maximizing response in this type of multi-modal
survey. We found no published studies that specifically documented the incremental, step-by-step results of the
“Total Design Method” (TDM) approach made famous by Dillman (1978). Other
researchers have focused on the effects of questionnaire length on response rates
(Dillman, 1993; Axford, 1997); ways to increase mail response rates (Fink, 1985;
House, 1977; Nederhof, 1988); and the effects of multiple mailings on external validity in
mail surveys (Dalecki, 1983).
A common methodology in survey research is repeated mailings with telephone
follow-up, but there is a dearth of information about expected results for each step of the
process. Such information could guide researchers in tailoring their survey strategies to
maximize efficient use of resources for a given field period. This article seeks to fill a gap
in the literature by describing the questionnaire completion rates yielded by a TDM
approach over a 21-week field period. These data serve to validate many of the axioms
of survey research, in a much-studied population, and offer survey researchers
benchmarks for each additional step in a Dillman survey process.
Although this survey was implemented in a manner consistent with Dillman’s
strategy of repeated contacts, advance letters, and so on, we were forced to deviate from
some of Dillman's recommendations. For example, the questionnaire, although
attractive, was not as respondent-friendly as we would have chosen, and it had many ‘skip
patterns’ that respondents were required to follow. This complexity was necessary
because certain sections of the questionnaire were appropriate only for respondents with
particular insurance coverage. In addition, the topic of this survey was more salient to
some groups of respondents than to others, and we hypothesized that completion rates
would differ accordingly. The survey was entirely voluntary (as explained in the advance
letter and all cover letters), and no token financial incentives were allowed for this
Federally sponsored survey. Because this survey was implemented without many of the
advantages other surveys employ to enhance response rates, implementation methods
were the primary tools to maximize response rates and reduce non-response bias.
Methodology
The eligible sample was defined as community-dwelling Medicare beneficiaries,
with valid addresses, who were either aged or disabled (but not individuals with End
Stage Renal Disease). The initial sample of 7,730 was drawn from Medicare’s
Enrollment DataBase (EDB), which contains: name and address, date of birth, race, sex,
language preference (either English or Spanish),1 and disability status of all Medicare
beneficiaries.
The total sample, spread across 25 states, included beneficiaries who had enrolled
in a new Medicare-approved insurance plan (Private Fee-For-Service), as well as other,
similar Medicare beneficiaries who either remained in traditional Medicare with an
indemnity-style Medigap supplemental policy or who were in managed care plans
approved by Medicare. The demographics of the sample are shown in Table 1. We
hypothesized the questionnaire would be more salient to respondents with PFFS
experience, and that completion rates would be highest in this group.
1 Few of the 7,730 were listed in the sample database as preferring Spanish rather than English (64 of 7,730), and indeed only 11 respondents completed the survey in Spanish.
Table 1: Demographics of eligible Medicare Beneficiaries based on EDB data

                                     PFFS   Medicare (with or    Medicare Managed   Total
                                            without Medigap      Care Plan          Sample
                                            supplement)
Age
  % <65 (disabled beneficiaries)      24           16                  14              19
  % 65-74                             41           47                  51              45
  % 75-84                             29           31                  30              30
  % 85+                                6            6                   5               6
Sex
  % male                              44           44                  46              44
  % female                            56           56                  54              56
Race*
  % white                             92           89                  89              90
  % black                              5            8                   7               6
  % other                              3            2                   4               3

* EDB data do not distinguish white Hispanics from black Hispanics; many black Hispanics may be listed as ‘black’ but many other black Hispanics could be listed as ‘other’.
Individuals living in nursing homes were ineligible, as were persons who were
impaired in ways that made response impossible. Proxies were permitted for this latter
group, including spouses, life-partners, or adult children. We also learned of some deaths
of respondents that occurred after the sample was drawn. Some of the sampled
beneficiaries had only guardians or conservators/trustees listed as their addresses; these
individuals were assumed to be incompetent or institutionalized and therefore ineligible.
The final eligible sample was 7,457.²
Despite repeated attempts by mail and phone, no contact was ever made with 1,252 sample members to verify eligibility. We do not know what portion of this group was ineligible (deceased, institutionalized, and so on). We followed standard survey procedures and estimated which members of this ‘non-locatable’ group would be eligible based on known characteristics. This estimation process subtracted 221 cases, or approximately 17 percent of the 1,252 cases where eligibility could not be verified. Our final sample was 7,236 eligible and ‘assumed eligible’ Medicare beneficiaries.

2 Eliminated from the sample were 56 deceased individuals, 143 individuals dwelling in nursing homes or other institutions, and 74 who were too impaired to respond (and had no proxies).
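For readers who wish to trace the sample accounting described above, the short Python sketch below (ours, for illustration only; it is not part of the study's data systems) reproduces the arithmetic using only the counts reported in the text and in footnote 2.

```python
# Back-of-the-envelope reproduction of the eligible-sample accounting described
# above; all counts are taken from the text, nothing is recomputed from the EDB.

initial_sample = 7730              # drawn from Medicare's Enrollment DataBase
deceased = 56                      # removed: deceased after the sample was drawn
institutionalized = 143            # removed: nursing homes or other institutions
too_impaired = 74                  # removed: too impaired to respond, no proxy

eligible_pool = initial_sample - (deceased + institutionalized + too_impaired)
assert eligible_pool == 7457       # matches footnote 2

never_contacted = 1252             # eligibility never verified by mail or phone
estimated_ineligible = 221         # estimated from known characteristics

final_sample = eligible_pool - estimated_ineligible
assert final_sample == 7236        # eligible + 'assumed eligible' denominator

# Share of the never-contacted group estimated to be ineligible (about 17-18%)
print(f"{estimated_ineligible / never_contacted:.1%}")
```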
Questionnaire: The 13-page questionnaire was attractively printed in a booklet with a
blue cover and clear instructions. The questionnaire included questions from the
Consumer Assessment of Health Plans Study (CAHPS) instrument, the SF-36, and
from other Medicare-sponsored surveys, as well as new questions relevant to the PFFS
plan. The questionnaire was most salient to respondents enrolled in the new insurance
plan and probably less salient to others who could have enrolled in the new plan but did
not. The questionnaire was not particularly respondent-friendly and contained several
skip patterns. Respondents were guided through these skips with visual cues (arrows)
and written directions, but they had to be vigilant about observing the indicated skip
patterns. Although approximately half of the returned mail questionnaires contained at least one skip-pattern error, virtually all of these errors involved providing more information than was needed (failing to skip rather than skipping incorrectly), so they did not damage the data.
The cover of the questionnaire contained a text box with an instruction in Spanish advising anyone who wanted a Spanish version of the questionnaire to call a toll-free number to request one. Respondents who preferred to complete the interview by phone, in English or Spanish, could call the same toll-free number, which was staffed by a bilingual interviewer. Throughout the 21-week field period, there were only 15 inquiries to this toll-free line. Most of these
requests were for Spanish versions of the survey or to complete telephone interviews in
Spanish.
Survey Implementation: The field period was scheduled to last 21 weeks, between July
and December 2002. It began with an advance letter (sent to all sampled beneficiaries),
and included three mailings of the survey to all remaining eligible non-responders before
beginning phone follow-up. All eligible non-respondents were also sent two reminder
postcards (one between each mailing of the questionnaire). Each time a questionnaire
was mailed, a business reply envelope was enclosed to minimize burden on the sampled
beneficiary and to help increase completion rates. All response envelopes were returned
via postage-paid, first-class mail.
When the first mailing of the survey was sent, telephone interviewers were trained
and ready to accommodate any incoming calls from respondents who chose to complete
the survey over the telephone. In week 12 of data collection, we used an automated
directory assistance (tele-matching) service to locate telephone numbers for individuals
who did not respond by mail. The search used surname and street address as listed in
Medicare’s EDB. Less than half of the cases produced a matching telephone number.
For the third mailing, in week 13, we sent the survey packages by Priority Mail rather than the first-class mail used in the previous two mailings, because the distinctive Priority Mail envelope conveys the special importance of the contents and might attract greater attention from recipients. In addition, Priority Mail costs much less than other attention-getting carriers such as Federal Express.
In weeks 14-21, we conducted telephone follow-up using computer-assisted telephone interviewing (CATI), in English and Spanish, with all eligible non-respondents for whom we could match a telephone number to the address on file. Table 2 summarizes the steps and timeline of survey implementation.
Table 2: Timeline of Data Collection Steps

1. Advance letter explaining importance of survey. Sent first class, July 23, 2002 (week 1).
2. First questionnaire mailing with return envelope; phone center ready for inbound calls to complete the survey or answer questions. Sent first class, July 25, 2002 (week 1).
3. Thank you/reminder postcard. Sent first class, August 16, 2002 (week 5).
4. Second (replacement) questionnaire with return envelope. Sent first class, August 28, 2002 (week 6).
5. Second thank you/reminder postcard. Sent first class, September 17, 2002 (week 9).
6. Third (replacement) questionnaire with return envelope. Sent Priority Mail, October 10, 2002 (week 13).
7. Telephone follow-up (CATI) for the 47% of the remaining eligible sample with listed phone numbers. October 21 through December 5, 2002 (weeks 14-21).
8. Continued receipt of mail surveys and prompt responses to all re-mail requests in English and Spanish (re-mailed questionnaires sent by Federal Express). October 21 through December 5, 2002 (weeks 14-21).
Throughout the 21-week field period, it was imperative to monitor not only the overall sample but also completed cases by sample type.
Sample Management: Our monitoring system enabled us to mail questionnaires only to
eligible non-responders and to pull any cases from the mail file that were completed in
the time between drawing the initial file for the mail house and the mailing date. Once
phone follow-up began in week 14, this database was updated daily: case dispositions were downloaded from the CATI system, and reports were provided to the telephone center identifying cases that, based on mail returns, needed to be recorded in CATI as either complete or ineligible.
This database was an essential component of our sample management system and
minimized burden on the respondent population by avoiding re-mailings to ineligible
sample members or to individuals who completed the survey by mail or phone. In
addition, we carefully assigned interviewer labor only to non-responding cases. The
success of this management system was demonstrated most clearly in the fact that only
21 cases completed the survey by both phone and mail over the course of the entire field
period.
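Purely as an illustration of the reconciliation logic described above, the following Python sketch shows one way the daily mail/CATI disposition checks might be expressed in code. The case fields and status values are hypothetical and are not drawn from the actual sample management system used in this study.

```python
# Hypothetical sketch of the daily sample-reconciliation logic described above.
# Field names and status values are illustrative only.

from dataclasses import dataclass

@dataclass
class Case:
    case_id: str
    eligible: bool = True
    mail_complete: bool = False
    cati_disposition: str = "pending"   # e.g., "pending", "complete", "ineligible"

def cases_to_mail(cases):
    """Cases that should receive the next questionnaire mailing:
    eligible non-respondents with no completed interview in either mode."""
    return [c for c in cases
            if c.eligible
            and not c.mail_complete
            and c.cati_disposition not in ("complete", "ineligible")]

def cases_to_update_in_cati(cases):
    """Cases the telephone center should mark in CATI as complete or ineligible,
    based on mail returns, so interviewers do not call them."""
    return [c for c in cases
            if (c.mail_complete or not c.eligible)
            and c.cati_disposition == "pending"]

# Tiny usage example
sample = [
    Case("A001", mail_complete=True),
    Case("A002"),
    Case("A003", eligible=False),
    Case("A004", cati_disposition="complete"),
]
print([c.case_id for c in cases_to_mail(sample)])            # ['A002']
print([c.case_id for c in cases_to_update_in_cati(sample)])  # ['A001', 'A003']
```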
Results
Mailed questionnaires were considered complete if they contained responses to at least 40% of the questions (which all did). In addition, there were 21 sampled beneficiaries who firmly stated that they were “not on Medicare,” although the EDB indicated otherwise.³

3 These 21 individuals could arguably have been removed from the denominator as ‘ineligible,’ but we know from our considerable experience with this population that a small percentage of sample members state, in error, that they are not on Medicare. For some surveys, including this one, analysts want to know which respondents believe they are not on Medicare, because those in managed care plans tend to be more likely to hold this erroneous view. Because this survey concerned experiences with different forms of Medicare insurance, a misperception about not being on Medicare is valuable information, and these responses were treated as completes.
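A minimal sketch of the item-completeness rule just described is shown below; the representation of a returned questionnaire (a mapping of items to answers, with None for blanks) is hypothetical and is shown only to make the 40% threshold concrete.

```python
# Illustrative check of the 40%-answered completeness rule described above.
# The dictionary representation of a returned questionnaire is hypothetical.

def is_complete(responses, threshold=0.40):
    """Return True if at least `threshold` of the items have a response."""
    answered = sum(1 for value in responses.values() if value is not None)
    return answered / len(responses) >= threshold

returned = {"q1": "yes", "q2": None, "q3": 4, "q4": "no", "q5": None}
print(is_complete(returned))   # True: 3 of 5 items (60%) answered
```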
Table 3 describes the final disposition of cases in the eligible sample.
Table 3: Final Disposition of Cases at Week 21 of Field Period

Final Case Disposition                             Number    % of Eligible + Assumed Eligible Sample
Completes by mail                                   4,595      63.5%
Completes by telephone                                690       9.5%
Total Completes                                     5,285      73.0%
Not on Medicare (self-reported)                        21       0.3%
Confirmed eligible but nonresponsive                  771      10.7%
Unknown eligibility; estimated to be eligible       1,159      16.0%
Total Non-Completes                                 1,951      27.0%
TOTAL ELIGIBLE SAMPLE                               7,236     100%
Figure 1 shows the cumulative completion rate over the weeks of data collection. Approximately 1 to 2 weeks after each questionnaire mailing, the number of returned questionnaires increased dramatically, leveling off again until the next questionnaire mailing. The reminder postcards, particularly the second one, generated little increase in returned questionnaires.
Figure 1: Completion Rates by Week of Data Collection
[Line chart: cumulative completion rate (0-100%) on the vertical axis, by week of data collection (weeks 0-21) on the horizontal axis.]
For this survey, 87% of completes were mailed questionnaires, and 13% were computer-assisted telephone interviews. We anticipated that the questionnaire would be more salient to respondents in PFFS, and, indeed, completion rates were highest for this group: 79% of eligible PFFS beneficiaries completed the survey, compared with 69% in the Traditional Medicare group (with or without a Medigap supplement) and 67% in the Medicare Managed Care group. Salience of the questionnaire thus does appear to have influenced completion rates.
Costs: Fixed costs of the survey included: the design of the questionnaire, OMB
processing, programming the questionnaire into CATI, translating the questionnaire into
Spanish, coordinating with mail house and print vendors, managing the data collection
process, and training interviewers. These costs did not vary with the number of
completed questionnaires. Incremental costs (postage, printing, interviewer time, coding)
varied depending upon the number of completed surveys.
Figure 2 shows total costs as they accrued through each successive month of the 21-week field period. By the end of the sixth month, approximately 87% of the budget had been spent; the remaining 13% of costs were for data file preparation.
Figure 2: Percent of Total Budget Spent
[Cumulative share of total budget spent, by month of data collection: month 1, 5.7%; month 2, 19.6%; month 3, 43.4%; month 4, 55.9%; month 5, 71.0%; month 6, 86.9%.]
In terms of response mode, CATI accounted for 23% of total spending (from programming the instrument through interviewing and coding), while the mail/paper effort accounted for 64% of spending (data not shown).
Discussion
Based on these data, we find that it is possible to attain a response rate of 73%
with an aged and disabled Medicare population, using an enhanced Dillman multi-modal
approach, in approximately 21 weeks of data collection. Each re-mailing of the
questionnaire resulted in a noticeable increase in response during the field period. There
was less indication of a similar ‘spike’ following the thank you/reminder postcards. It is not clear that this finding means the reminder postcards could be eliminated without damaging response; the reminders are part of a series of contacts which may, in their entirety, have generated a sense of obligation on the part of potential respondents. The
Dillman approach includes two mailings of the questionnaire itself; we added a third (via Priority Mail) to determine whether it would yield substantial additional response, which it did. Because CATI follow-up is more costly than a Priority Mail package, we can recommend this third ‘special’ mailing as a cost-effective addition to the standard Dillman approach.
Telephone follow-up yielded only 690 additional completed responses. This result
could be due to many factors, including a general and widespread disinclination to
respond to telephone surveys (Kessler, 1995; Allen, 2002; Massey, 1997; Cox, 2002);
active scam-avoidance campaigns aimed at protecting seniors in particular (AARP
Bulletin, 2002; NYC Department of Consumer Affairs, U.S. Department of Justice); and
the fact that the telephone effort targeted the hardest-to-reach remainder of the sample
population; the easy-to-reach respondents had already replied by mail. As the telephone follow-up yielded only a small additional response (13% of all completed surveys) but consumed 23% of total costs, it is a very costly method for boosting survey completion rates by a relatively small final increment. These findings may be helpful to researchers planning large-scale surveys of the Medicare population, as well as other large surveys.
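As a rough back-of-the-envelope comparison, using only the budget shares and completed-case counts reported above (absolute dollar amounts are not reported here, so only the ratio between modes is meaningful), the relative cost per complete by mode can be approximated as follows.

```python
# Rough relative cost-per-complete comparison, using only figures reported
# above: budget shares by mode and counts of completed cases.

mail_share_of_budget = 0.64      # mail/paper effort share of total spending
cati_share_of_budget = 0.23      # CATI share, programming through coding

mail_completes = 4595
cati_completes = 690

mail_cost_index = mail_share_of_budget / mail_completes
cati_cost_index = cati_share_of_budget / cati_completes

print(f"CATI vs. mail cost per complete: {cati_cost_index / mail_cost_index:.1f}x")
```

On these figures, a telephone complete consumed roughly 2.4 times the share of the budget per case that a mail complete did, which is consistent with the conclusion above.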
Works Cited
American Association for Retired Persons. Consumer Alert: 9 Warning Signs of a Scam and
Scam Prevention Worksheet; AARP bulletin, May 2002.
American Association for Retired Persons (AARP). Telemarketing Fraud: The Dos and Don'ts;
http://www.aarp.org/fraud/2fraud.htm
Axford, Rita; Carter, Barbara; Grunwald, Gary. “Enhancing Dillman's Total Design Method for Mailed/Telephone Surveys Using Current Technology to Maximize Cost Benefit Ratios.” Australian and New Zealand Journal of Sociology; 1997, 33, 3, Nov, 387-393.
Cox, Brenda; O'Connor, Daniel; Chandler, Kathryn; Tucker, Clyde (Discussant). “An Investigation of Response Rates in Random Digit Dialed Telephone Surveys.” Presented January 23, 2002, at the Bureau of Labor Statistics.
Dalecki, Michael G.; Ilvento, Thomas W.; Moore, Dan E., “The Effects of Multi Wave Mailings
on External Validity of Mail Surveys.” Presentation at the Annual Meeting of Rural Sociological
Society, Lexington, KY, August, 1983.
Dillman, DA; Reynolds RQ & Rockwood, TH (1991). “Focus Group Tests of Two Simplified
Decennial Census Forms I.” (technical report #91-39). Pullman, WA.
Dillman, DA. Mail and Internet Surveys: The Tailored Design Method. New York: Wiley & Sons, 2000.
Dillman, Don A.; Sinclair, Michael D.; Clark, Jon R. “Effects of Questionnaire Length, Respondent Friendly Design, and a Difficult Question on Response Rates for Occupant Addressed Census Mail Surveys.” Public Opinion Quarterly; 1993, 57, 3, fall, 289-304.
Dillman, Don A. “The Design and Administration of Mail Surveys.” Annual Review of Sociology; 1991, 17, 225-249.
Dillman, Don A. Mail and Telephone Surveys: The Total Design Method. New York: Wiley, 1978.
Fink, A. and Kosecoff, J. How to Conduct Surveys: A Step-by-Step Guide. Beverly Hills, CA:
Sage Publications, 1985.
Health Care Financing Administration. Medicare Pamphlet: Protect Yourself Against Health Care Fraud; Publication No. HCFA-10111.
Heberlein, TA and Baumgartner, R (1978). “Factors affecting response rates to mailed questionnaires: a quantitative analysis of the published literature.” American Sociological Review, 43, 447-462.
House, James S.; Gerber, Wayne; McMichael, Anthony J. “Increasing Mail Questionnaire
Response: A Controlled Replication and Extension.” Public Opinion Quarterly. 1977, 41, 1,
spring, 95-99.
Kessler, R.C., Little, R.J.A., Groves, R.M. (1995). Advances in strategies for minimizing and
adjusting for survey non-response. Epidemiological Reviews, 17:192-204.
Linsky, AS (1975). “Stimulating responses to mailed questionnaires: a review.” Public Opinion
Quarterly, 39, 82-101.
Marillene Allen, Chair, Response Rate Committee, Professional Marketing Research Society,
2002; http://www.pmrs-aprm.com/SpecialResponse
Massey, JT; O'Connor, D; Krotki, K. “Response Rates in Random Digit Dialing (RDD) Telephone Surveys.” Education Statistics Services Institute; presented at American Statistical Association meetings, 1997.
National Sheriffs Association, Operation Fraud stop: A Partnership to Reduce Telemarketing
Fraud and Assist Victims.
Nederhof, Anton J. “Effects of a Final Telephone Reminder and Questionnaire Cover Design in Mail Surveys.” Social Science Research; 1988, 17, 4, Dec, 353-361.
Nederhof, Anton J. “A Survey on Suicide: Using a Mail Survey to Study a Highly Threatening Topic.” Quality and Quantity; 1985, 19, 3, June, 293-302.
NYC Dept. of Consumer Affairs. Telephone Scams & Senior Consumers: A Word to Seniors
from NYC Dept. of Consumer Affairs; http://www.nyc.gov/html/dca/pdf/telscam.pdf
Schaefer, David R.; Dillman, Don A. “Development of a Standard E-Mail Methodology: Results of an Experiment.” Public Opinion Quarterly; 1998, 62, 3, fall, 378-397.
Telephone Scams and Older Consumers: Protect Yourself Against Telemarketing Fraud;
http://www.discoveringmontana.com/doa/consumerProtection/Downloads/TelemarkPrevention.pdf
U.S. Dept. of Justice, Elder Financial Exploitation Prevention Program