CHS / Epi M218
Fall 2010
Page 1
COMMUNITY HEALTH SCIENCES/EPIDEMIOLOGY M218
Questionnaire Design and Administration
Course web site: http://ccle.ucla.edu
Day & Time: Mon & Wed 8-10 A.M.
Room: CHS 41-235
ID#: 840 108 200 (CHS); 844 110 200 (EPI)
Instructor: Linda B. Bourque
Office: 41-230 CHS
Office Hrs: Mon & Wed 10:00-11:30; sign up for appointments on sheet outside office.
TEXTBOOKS:

A.	Required books available for purchase in the Health Sciences Bookstore:

1.	LuAnn Aday, Designing and Conducting Health Surveys, 3rd edition, Jossey-Bass, 2006.
2.	Linda Bourque and Eve Fielder, How to Conduct Telephone Surveys, The Survey Kit, 2nd edition, Sage Publications, 2003.
3.	Materials available on the course website and other UCLA web sites.

B.	Recommended books available for purchase in the Health Sciences Bookstore:

1.	Linda Bourque and Eve Fielder, How to Conduct Self-Administered and Mail Surveys, 2nd edition, The Survey Kit, Sage Publications, 2003.
2.	Linda Bourque and Virginia Clark, Processing Data: The Survey Example, Sage Publications, 1992.
3.	Jean M. Converse and Stanley Presser, Survey Questions, Sage, 1986.
4.	Orlando Behling and Kenneth S. Law, Translating Questionnaires and Other Research Instruments: Problems and Solutions, Sage Publications, 2000.

C.	Recommended books available in the UCLA libraries:

1.	Arlene Fink, How to Ask Survey Questions, The Survey Kit, Sage Publications, 1995; 2nd edition, 2003.
2.	Arlene Fink, How to Design Surveys, The Survey Kit, Sage Publications, 1995; 2nd edition, 2003.
3.	Eleanor Singer and Stanley Presser, eds., Survey Research Methods: A Reader, The University of Chicago Press, 1989.
4.	Donald Dillman, Mail & Telephone Surveys, Wiley-Interscience, 1978.
5.	Peter H. Rossi, James D. Wright, Andy B. Anderson, Handbook of Survey Research, Academic Press, 1983.
6.	Seymour Sudman & Norman M. Bradburn, Asking Questions, Jossey-Bass, 1982.
7.	Robert M. Groves & Robert L. Kahn, Surveys by Telephone, Academic Press, 1979.
8.	Norman M. Bradburn & Seymour Sudman, Polls & Surveys, Jossey-Bass, 1988.
9.	Jean M. Converse, Survey Research in the United States, University of California Press, 1987.
10.	Hubert O'Gorman, ed., Surveying Social Life, Wesleyan University Press, 1988.
11.	Herbert H. Hyman, Taking Society's Measure, Russell Sage Foundation, 1991.
12.	Judith M. Tanur, ed., Questions About Questions, Russell Sage Foundation, 1992.

D.	Supplementary Materials
All of the following articles are available on the class website at http://ccle.ucla.edu.
When you use information from articles, please remember that they are under copyright.
Articles on the Web Site:
1. Adua L, JS Sharp. Examining survey participation and response quality: The significance
of topic salience and incentives. Survey Methodology 2010; 36: 95-109.
2. The American Association for Public Opinion Research, 2009. Standard Definitions:
Final Dispositions of Case Codes and Outcome Rates for Surveys. 6th edition. Lenexa,
Kansas: AAPOR.
3. Ansolabehere S, BF Schaffner. Residential mobility, family structure, and the cell-only
population. Public Opinion Quarterly 2010; 74:244-259.
4. Barón JD, RV Breunig, D Cobb-Clark, T Gørgens, A Sarbayeva. Does the effect of
incentive payments on survey response rates differ by income support history? Journal of
Official Statistics 2009; 25:483-507.
5. Barton, AH. Asking the Embarrassing Question. The Public Opinion Quarterly 22:67-68, 1958.
6. Bhopal, Raj & Liam Donaldson, “White, European, Western, Caucasian, or What?
Inappropriate Labeling in Research on Race, Ethnicity, and Health,” American Journal of
Public Health 88(9):1303-1307, 1998.
7. Binson, D., J.A. Canchola, J.A. Catania, “Random Selection in a National Telephone
Survey: A Comparison of the Kish, Next-Birthday, and Last-Birthday Methods,” Journal
of Official Statistics 16(1):53-59, 2000.
8. Bischoping, K., J. Dykema, “Toward a Social Psychological Programme for Improving
Focus Group Methods of Developing Questionnaires,” Journal of Official Statistics
15(4):495-516, 1999.
9. Blair, E.A., G.K. Ganesh, “Characteristics of Interval-based Estimates of
Autobiographical Frequencies,” Applied Cognitive Psychology 5:237-250, 1991
10. Bourque, L.B. “Coding.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors, The Sage
Encyclopedia of Social Science Research Methods, Volume 1, Thousand Oaks, Ca: Sage
Publications, 2003, pp. 132-136.
11. Bourque, L.B. “Coding Frame.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors, The
Sage Encyclopedia of Social Science Research Methods, Volume 1, Thousand Oaks, Ca:
Sage Publications, 2003, pp. 136-137.
12. Bourque, L.B. “Cross-Sectional Design.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao,
Editors, The Sage Encyclopedia of Social Science Research Methods, Volume 1,
Thousand Oaks, Ca: Sage Publications, 2003, pp. 229-230.
13. Bourque, L.B. “Self-Administered Questionnaire.” In M.S. Lewis-Beck, A. Bryman, T.F.
Liao, Editors, The Sage Encyclopedia of Social Science Research Methods, Volume 3,
Thousand Oaks, Ca: Sage Publications, 2003, pp. 1012-1013.
14. Bourque, L.B. “Transformations.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors,
The Sage Encyclopedia of Social Science Research Methods, Volume 3, Thousand Oaks,
CA: Sage Publications, 2003, pp. 1137-1138.
15. Norman Bradburn, The Seventh Morris Hansen Lecture on “The Future of Federal
Statistics in the Information Age,” with commentary by TerriAnn Lowenthal, Journal of
Official Statistics 15(3):351-372, 1999.
16. Bradburn, N.M. “Understanding the Question-Answer Process,” Survey Methodology
30:5-15, 2004.
17. Bradburn, N.M., L.J. Rips, S.K. Shevell, “Answering Autobiographical Questions: The
Impact of Memory and Inference on Surveys,” Science 236:157-161, 1987.
18. Brick, J.M., J. Waksberg, S. Keeter, “Using Data on Interruptions in Telephone Service
as Coverage Adjustments,” Survey Methodology 22(2):185-197, 1996.
19. Brick JM, PD Brick, S Dipko, S Presser, C Tucker, Y Yuan. Cell phone survey feasibility
in the U.S.: Sampling and calling cell numbers versus landline numbers. Public Opinion
Quarterly 2007; 71: 23-39.
20. Brick JM, WS Edwards, S Lee. Sampling telephone numbers and adults, interview
length, and weighting in the California Health Interview Survey cell phone pilot study.
Public Opinion Quarterly 2007; 71:793-813.
21. Caplow, T., H.M. Bahr, V.R.A. Call. “The Polls--Trends, The Middletown Replications:
75 Years of Change in Adolescent Attitudes, 1924-1999,” Public Opinion Quarterly
68:287-313, 2004.
22. Chang L, JA Krosnick. Comparing oral interviewing with self-administered computerized
questionnaires: An experiment. Public Opinion Quarterly 2010; 74: 154-167.
23. Chang L, JA Krosnick. National surveys via RDD telephone interviewing versus the
internet: Comparing sample representativeness and response quality. Public Opinion
Quarterly 2009; 73: 641-678.
24. Christian, L.M., D.A. Dillman. “The Influence of Graphical and Symbolic Language
Manipulations on Response to Self-Administered Questions,” Public Opinion Quarterly
68:57-80, 2004.
25. Conrad, FG, MF Schober. Promoting Uniform Question Understanding in Today’s and
Tomorrow’s Surveys, Journal of Official Statistics 21: 215-231, 2005.
26. Conrad FG, J Blair. Sources of error in cognitive interviews. Public Opinion Quarterly
2009; 73: 32-55.
27. Converse, Philip E. & Michael W. Traugott, “Assessing the Accuracy of Polls &
Surveys,” Science 234:1094-1098, November 28, 1986.
28. Couper, M.P., “Survey Introductions and Data Quality,” Public Opinion Quarterly
61:317-338, 1997.
29. Couper, Mick P., Johnny Blair & Timothy Triplett, “A Comparison of Mail & E-mail for
a Survey of Employees in U.S. Statistical Agencies,” Journal of Official Statistics
15(1):39-56, 1999.
30. Couper, Mick P., “Web Surveys: A Review of Issues and Approaches,” Public Opinion
Quarterly 64:464-494, 2000.
31. Couper, M.P., R. Tourangeau. “Picture This! Exploring Visual Effects in Web Surveys,”
Public Opinion Quarterly 68:255-266, 2004.
32. Couper MP, E Singer, FG Conrad, RM Groves. Experimental studies of disclosure risk,
disclosure harm, topic sensitivity, and survey participation. Journal of Official Statistics
2010; 26:287-300.
33. Curtin, R, S Presser, E Singer. “Changes in Telephone Survey Nonresponse Over the Past
Quarter Century,” Public Opinion Quarterly 69:87-98, 2005.
34. de Leeuw, ED. “To Mix or Not to Mix Data Collection Modes in Surveys,” Journal of
Official Statistics 21: 233-255, 2005.
35. Dengler, R., H. Roberts, L. Rushton, “Lifestyle Surveys--The Complete Answer?”
Journal of Epidemiology and Community Health 51:46-51, 1997.
36. Dillman, DA, A Gertseva, T Mahon-Haft. “Achieving Usability in Establishment Surveys
Through the Application of Visual Design Principles,” Journal of Official Statistics 21:
183-214, 2005.
37. Durrant GB, RM Groves, L Staetsky, F Steele. Effects of interviewer attitudes and
behaviors on refusal in household surveys. Public Opinion Quarterly 2010; 74:1-36.
38. Dykema, Jennifer, Nora Cate Schaeffer. “Events, Instruments, and Reporting Errors,”
American Sociological Review 65:619-629, 2000.
39. Erosheva EA, TA White. Issues in survey measurement of chronic disability: An example
from the national long term care survey. Journal of Official Statistics 2010; 26:317-339.
40. Frankenberg, E, NR Jones. “Self-Rated Health and Mortality: Does the Relationship
Extend to a Low Income Setting?” Journal of Health and Social Behavior 45: 441-452,
2004.
41. Fricker, S, M Galesic, R Tourangeau, T Yan. “An Experimental Comparison of Web and
Telephone Surveys,” Public Opinion Quarterly 69:370-392, 2005.
42. Fullilove, Mindy Thompson, “Comment: Abandoning 'Race' as a Variable in Public
Health Research--An Idea Whose Time Has Come,” American Journal of Public Health
88(9): 1297-1298, 1998.
43. Galesic M, M Bosnjak. Effects of questionnaire length on participation and indicators of
response quality in a web survey. Public Opinion Quarterly 2009; 73: 349-360.
44. Gardner, W., B.L. Wilcox, “Political Intervention in Scientific Peer Review,” American
Psychologist 48:972-983, 1993.
45. Gaziano, C. “Comparative Analysis of Within-Household Respondent Selection
Techniques,” Public Opinion Quarterly 69:124-157, 2005.
46. Groves, R.M., M.P. Couper, “Contact-Level Influences on Cooperation in Face-to-Face
Surveys,” Journal of Official Statistics 12(1):63-83, 1996.
47. Groves RM, E Peytcheva. The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly 2008; 72: 167-189.
48. Israel GD. Effects of answer space size on responses to open-ended questions in mail
surveys. Journal of Official Statistics 2010; 26: 271-285.
49. Keeter S, C Miller, A Kohut, RM Groves, S Presser. Consequences of reducing
nonresponse in a national telephone survey. Public Opinion Quarterly 2000; 64:125-148.
50. Krenzke T, L Li, K Rust. Evaluating within household selection rules under a multi-stage
design. Survey Methodology 2010; 36:111-119
51. Krosnick, Jon A., Allyson L. Holbrook, Matthew K. Berent, Richard T. Carson, W.
Michael Hanemann, Raymond J. Kopp, Robert Cameron Mitchell, Stanley Presser, Paul
A. Ruud, V. Kerry Smith, Windy R. Moody, Melanie C. Green, Michael Conaway, “The
Impact of ‘No Opinion’ Response Options on Data Quality, Non-Attitude Reduction or
an Invitation to Satisfice?” Public Opinion Quarterly 66:371-403, 2002.
52. Kornhauser, Arthur, and Paul B. Sheatsley, “Questionnaire Construction and Interview
Procedure,” Appendix B, in Claire Selltiz, Lawrence S. Wrightsman, Stuart W. Cook
(eds.), Research Methods in Social Relations, 3rd Edition, Holt, Rinehart and Winston,
1976, pp. 541-573.
53. Krosnick, Jon A., “Survey Research,” Annual Review of Psychology 50:537-67, 1999.
54. Lavin, Daniele, Douglas W. Maynard. “Standardization vs. Rapport: Respondent
Laughter and Interviewer Reaction During Telephone Surveys,” American Sociological
Review 66:453-479, 2001.
55. Lee S, HA Nguyen, M Jawad, J Kurata. Linguistic minorities in a health survey. Public
Opinion Quarterly 2008; 72:470-486.
56. Macera, Caroline, Sandra Ham, Deborah A. Jones, Dexter Kinsey, Barbara Ainsworth,
Linda J. Neff. “Limitations on the Use of a Single Screening Question to Measure
Sedentary Behavior,” American Journal of Public Health 91:2010-2012, 2001.
57. Martin, E., T.J. DeMaio, P.C. Campanelli, “Context Effects for Census Measures of Race
and Hispanic Origin,” Public Opinion Quarterly 54:551-566, 1990.
58. Mokdad, Ali H., Donna F. Stroup, Wayne H. Giles, “Public Health Surveillance for
Behavioral Risk Factors in a Changing Environment, Recommendations from the
Behavioral Risk Factor Surveillance Team,” Morbidity and Mortality Weekly Report 52
(RR-9), Centers for Disease Control, May 23, 2003.
59. Morrison RL, DA Dillman, LM Christian. Questionnaire design guidelines for
establishment surveys. Journal of Official Statistics 2010; 26:43-85.
60. Olsen, Jørn on behalf of the IEA European Questionnaire Group, “Epidemiology
Deserves Better Questionnaires,” International Journal of Epidemiology 27:935, 1998.
61. Petrolia DR, S Bhattacharjee. Revisiting incentive effects: Evidence from a random-sample mail survey on consumer preferences for fuel ethanol. Public Opinion Quarterly
2009; 73: 537-550.
62. Peytchev A. Survey breakoff. Public Opinion Quarterly 2009; 73: 74-97.
63. Peytchev A, RK Baxter, LR Carley-Baxter. Not all survey effort is equal: Reduction of
nonresponse bias and nonresponse error. Public Opinion Quarterly 2009; 73: 785-806.
64. Presser, S., M.P. Couper, J.T. Lessler, E. Martin, J. Martin, J.M. Rothgeb, E. Singer,
“Methods for Testing and Evaluating Survey Questions,” Public Opinion Quarterly
68:109-130, 2004.
65. Rizzo, L., J. M. Brick, I. Park, “A Minimally Intrusive Method for Sampling Persons in
Random Digit Dial Surveys,” Public Opinion Quarterly 68:267-274, 2004.
66. Sayles H, RF Belli, E Serrano. Interviewer variance between event history calendar and
conventional questionnaire interviews. Public Opinion Quarterly 2010; 74: 140-153.
67. Scheuren F. What is a Survey? American Statistical Association, 2004.
68. Schräpler JP, J Schupp, GG Wagner. Changing from PAPI to CAPI: Introducing CAPI in
a longitudinal study. Journal of Official Statistics 2010; 233-269.
69. Shaeffer, EM, JA Krosnick, GE Langer, DM Merkle. “Comparing the Quality of Data
Obtained by Minimally Balanced and Fully Balanced Attitude Questions,” Public
Opinion Quarterly 69: 417-428, 2005.
70. Sigelman, L, S.A. Tuck, JK Martin. “What’s In a Name? Preference for ‘Black’ Versus
‘African-American’ Among Americans of African Descent,” Public Opinion Quarterly
69: 429-438, 2005.
71. Singer E, J Van Hoewyk, MP Maher. Experiments with incentives in telephone surveys.
Public Opinion Quarterly 2000; 64:171-188.
72. Singer E, editor. Special Issue: Nonresponse bias in household surveys. Public Opinion
Quarterly 2006; 70 (5).
a. Groves RM. Nonresponse rates and nonresponse bias in household surveys, 646-675.
b. Abraham KG, A Maitland, SM Bianchi. Nonresponse in the American time use
survey: Who is missing from the data and how much does it matter? 676-703.
c. Johnson TP, YI Cho, RT Campbell, AL Holbrook. Using community-level
correlates to evaluate nonresponse. 704-719.
d. Groves RM, MP Couper, S Presser, E Singer, R Tourangeau, GP Acosta, L
Nelson. Experiments in producing nonresponse bias. 720-736.
e. Olson K. Survey participation, nonresponse bias, measurement error bias and total
bias. 737-758.
f. Keeter S, C Kennedy, M Dimock, J Best, P Craighill. Gauging the impact of
growing nonresponse on estimates from a national RDD telephone survey. 759-779.
g. Brick JM, S Dipko, S Presser, C Tucker, Y Yuan. Nonresponse bias in a dual
frame sample of cell and landline numbers. 780-793.
h. Link MW, AH Mokdad, D Kulp, A Hyon. Has the national do not call registry
helped or hurt state-level response rates? A time series analysis. 794-809.
73. Smyth JD, DA Dillman, LM Christian, M McBride. Open-ended questions in web
surveys: Can increasing the size of answer boxes and providing extra verbal instructions
improve response quality? Public Opinion Quarterly 2009; 73: 325-337.
74. Stevens, Gillian & David L. Featherman, “A Revised Socioeconomic Index of
Occupational Status,” Social Science Research 10:364-395, 1981.
75. Stevens, Gillian & Joo Hyun Cho, “Socioeconomic Indexes and the New 1980 Census
Occupational Classification Scheme,” Social Science Research 14:142-168, 1985.
76. Suchman, L., B. Jordan, “Interactional Troubles in Face-to-Face Survey Interviews,”
Journal of the American Statistical Association 85(409):232-253, 1990, with
Commentary by Stephen E. Fienberg, Mary Grace Kovar and Patricia Royston, Emanuel
A. Schegloff, and Roger Tourangeau, and Rejoinder by Lucy Suchman and Brigitte
Jordan.
77. Tambor, E.S., G.A. Chase, R.R. Faden et al, “Improving Response Rates Through
Incentive and Follow-up: The Effect on a Survey of Physicians' Knowledge of Genetics,”
American Journal of Public Health 83:1599-1603, 1993.
78. Todorov, A., C. Kirchner, “Bias in Proxies’ Reports of Disability: Data from the National
Health Interview Survey on Disability,” American Journal of Public Health 90(8):1248-1253, 2000.
79. Toepoel V, M Das, A van Soest. Design of web questionnaires: The effect of layout in
rating scales. Journal of Official Statistics 2009; 25:509-528.
80. Tourangeau R, MP Couper, F Conrad. Color, labels, and interpretive heuristics for
response scales. Public Opinion Quarterly 2007; 71: 91-112.
81. Tourangeau R, RM Groves, C Kennedy, T Yan. The presentation of a web survey,
nonresponse and measurement error among members of web panel. Journal of Official
Statistics 2009; 25:299-321.
82. Tucker, C., J. M. Brick, B. Meekins, Household Telephone Service and Usage Patterns in
the United States in 2004: Implications for Telephone Samples. Public Opinion Quarterly
2007; 71: 3-22.
83. van Tuinen HK. Innovative statistics to improve our notion of reality. Journal of Official
Statistics 2009; 25: 431-465.
84. Wang, J.J., P. Mitchell, W. Smith, “Vision and Low Self-Rated Health: The Blue
Mountains Eye Study,” Investigative Ophthalmology and Visual Science 41(1):49-54,
2000.
85. Willis, G.B., P. Royston, D. Bercini, “The Use of Verbal Report Methods in the
Development and Testing of Survey Questionnaires,” Applied Cognitive Psychology
5:251-267, 1991.
86. Yan T, R Curtin, M Jans. Trends in income nonresponse over two decades. Journal of
Official Statistics 2010; 26: 145-164.
Course Materials Available on Course Web Site
Information about Institutional Review Boards
1. OPRR Reports, Protection of Human Subjects, Title 45, Code of Federal
Regulations, Part 46, Revised June 18, 1991, Reprinted March 15, 1994.
2. Siegel, Judith, Linda Bourque, Example of Submission, Questions Raised by the
IRB and Responses, 2002.
Materials developed at the UCLA Institute for Social Science Research
Engelhart, Rita, “The Kish Selection Procedure”
Codebooks
Example of a Codebook, December 1, 2002.
Also on earthquake web site:
http://www.sscnet.ucla.edu/issr/da/earthquake/erthqkstudies2.index.htm
The construction of scales and indexes
1. Inkelas, Moira, Laurie A. Loux, Linda B. Bourque, Mel Widawski, Loc H.
Nguyen, “Dimensionality and Reliability of the Civilian Mississippi Scale for
PTSD in a Postearthquake Community,” Journal of Traumatic Stress 13, 149-167,
2000.
2. McKennel, A.C., Chapter 7, “Attitude Scale Construction,” in C.A.
O'Muircheataugh & C. Payne (eds.), Exploring Data Structures, Vol. 1, The
Analysis of Survey Data, John Wiley & Sons, 1977, pp. 183-220.
3. Bourque, L.B, H. Shen. “Psychometric Characteristics of Spanish and English
Versions of the Civilian Mississippi Scale,” Journal of Traumatic Stress 2005;
18:719-728.
Materials related to the administration and analysis of data collected with
questionnaires
1. Questionnaire for Assignment #1
2. Record for Non-respondents
3. Enlistment Letters
4. Call Record
5. Formatting Questionnaires
6. Income Questions
7. Calculating Response Rates
8. Examples of Grids
9. Codebook and Specifications
10. Constructing a Code Frame
11. Scale Construction Example
Questionnaires, Specifications and Codebooks are also available at:
http://www.sscnet.ucla.edu/issr/da/earthquake/erthqkstudies2.index.htm and
http://www.ph.ucla.edu/sciprc/3_projects.htm under Disasters.
The books and articles listed above will give you a background on and an introduction to
surveys and questionnaires. Each book has different strengths and weaknesses. They should be
considered resources. The required books are available in the Health Sciences Bookstore. The
Recommended books are available in the various UCLA libraries. The decision as to which
books you buy and the order in which you read them is yours. I recommend reading all the
material you buy or check out as soon as possible. It will then be available to you as a resource
as we go through the quarter.
The articles on the web site provide you with examples of some of the journals where
research about questionnaires, their administration, and surveys can be found. They also provide
information about some of the “cutting-edge” issues of concern. Currently, a major focus is on
response rates, particularly for telephone interviews, and web-based administration of
questionnaires.
COURSE REQUIREMENTS AND GRADING
Subjects and Site:
Each student selects a topic on which s/he wants to design questionnaires, and the site(s)
at which s/he will conduct the interviews needed in pretesting the questionnaire. You are free to
select any site and any sample of persons with the following exceptions:
1.	All respondents MUST be at least 18 years of age.
2.	DO NOT collect information from respondents such as name, address, and phone number which would enable them to be identified.
3.	DO NOT interview persons in the Center for Health Sciences or persons connected with the Center for Health Sciences.
4.	DO NOT interview your fellow students, your roommates, your friends, your relatives, or persons with whom you interact within another role (e.g., employees, patients).
5.	DO NOT ask about topics which would require the administration of a formal Human Consent Form.
Should you violate these requirements, the data collected will not fulfill the requirements
for an assignment in this class. Only interviews, not self-administered questionnaires, can be
used for pretesting the questionnaires developed in this class.
Course Objectives and Assignments:
The objective of this course is to learn how to design respectable questionnaires.
Research data can be collected in many ways. Questionnaires represent one way data is
collected. Although usually found in descriptive, cross-sectional surveys, questionnaires can be
used in almost any kind of research setting. Questionnaires can be administered in different ways
and the questions within a particular questionnaire can assume an infinite variety of formats.
As is true of any research endeavor, there are no absolutes in questionnaire design. There
are no recipes and no cookbooks. The context of the research problem you set for yourself will
determine the variety of questionnaire strategies that are appropriate in trying to reach your
research objective; the context will not tell you the absolutely “right” way to do it.
The final “product” for the quarter is a questionnaire designed in segments and pretested
at least three times. The questionnaire will be designed to collect data to test a research objective
specified by you during the second week of the quarter. The final version of the questionnaire is
due Wednesday, December 8th at 5:00 PM. All assignments must be typed; handwritten
materials are not accepted. Every version of your questionnaire must be typed, but final versions
should be as close to “picture-ready” copy as you can manage. For Assignment 5, due on
December 8th, you will provide the final copy of your questionnaire, a full copy of
Interviewer/Administrator Specifications, a Codebook and/or coding instructions, a summary of
data collected in your last pretest, a tentative protocol that could be used to analyze data collected
with your questionnaire, and what, if anything, further you would like to do if time allowed.
The following five assignments will move you toward the final product.
ASSIGNMENTS
Assignment 1: Practice Interviewing (5% of Final Grade) Due October 4
This assignment is designed to expose you to the process of interviewing. Questionnaires
will be handed out on the first day of class (September 27th). You are to conduct 9 interviews.
On October 5, turn in both the completed interviews and a brief write-up describing where you
went, what happened and a brief description of the data you collected. These materials are also
on the course web site.
In selecting respondents, go to a central public location such as a shopping area, the
beach, or a park. In conducting your interviews, try to obtain a range of ages, sexes, and ethnic
groups. You will be given identification letters to carry in case anybody asks who you are.
DO NOT INTERVIEW ON PRIVATE PROPERTY UNLESS YOU HAVE PERMISSION.
THIS AFFECTS MANY SHOPPING CENTERS.
Keep track of the characteristics of refusals on the “Record for Non-respondents.” A
refusal is a person you approach for an interview who turns you down.
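The tallies you keep of completed interviews and refusals feed directly into response-rate calculations (the AAPOR Standard Definitions reading on the web site gives the full set of formulas). As a simplified sketch only, not AAPOR's exact method, the most basic rate divides completed interviews by all eligible persons approached; the function and parameter names here are invented for illustration:

```python
def simple_response_rate(completes, refusals, other_noninterviews=0):
    """Completed interviews divided by all eligible persons approached.

    A simplified illustration; AAPOR's Standard Definitions distinguish
    several response rates (RR1-RR6) using finer disposition categories.
    """
    approached = completes + refusals + other_noninterviews
    return completes / approached if approached else 0.0

# Example: 9 completed interviews and 6 refusals recorded
print(round(simple_response_rate(9, 6), 2))  # → 0.6
```

This is why the "Record for Non-respondents" matters: without a count of refusals, the denominator, and hence the rate, cannot be computed.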
Assignment 2: Statement of Your Research Question (5% of Final Grade)
Due October 6
Questionnaires are designed to get data that can be used to answer one or more research
questions. To help you get started, state a research question. Remember it should be relevant to
the interviewing sites available to this class. Is your research question, as written, testable?
What concepts are included in or implied by your question? Can your concepts be
operationalized into working definitions and variables for which a questionnaire is a viable data
collection procedure?
Assignment 3: “Mini-Questionnaire” #1. (25% of Final Grade) Due October 20
Part 1
Prepare and test “Mini-Questionnaire” #1. This represents your first attempt at designing
a questionnaire to test your “Research Question.” The substantive content of the questionnaire
should focus on current status, behaviors or knowledge. You can choose any topic that interests
you, but since our focus is on “health,” you may want to consider asking about: 1) Current acute
and chronic diseases, accidents, injuries, disabilities, and impairments; and 2) Knowledge and
use of health services.
In addition to substantive content, all questionnaires must collect demographic
information on such things as:
1.	Respondent age
2.	Respondent education
3.	Individual, family or household income
4.	Occupation
5.	Respondent marital status
There is no limit to the number of questions you may include. However, you must
provide a minimum of 6 questions in addition to the demographic questions discussed above. I
expect your questionnaire to include a mixture of open-ended and closed-ended questions.
Open-ended questions are particularly useful when you are in the process of exploring an area of
research or in the initial stages of designing a questionnaire.
In preparing the questions in your questionnaire, keep in mind the problems of survey
research design which have been discussed in class and in the readings. Pay particular attention
to the following:
1.	Respondent frame of reference--will it be the same as yours?
2.	Level of concreteness/abstraction.
3.	Question referent--is it clear, and is it what you intend?
4.	Tone of question--will it stimulate yea-saying?
5.	Balance--within the question and in the set of questions.
6.	Problems of bias induced by wording--watch out for leading, loaded terms, etc.
7.	Screening questions to reduce noise due to non-attitudes.
Indicate explicitly the format of the questions. How will it look? Present the questions in
the order you want them to appear in the questionnaire. Pay particular attention to the following:
1.	Problems of perseveration due to fatigue.
2.	Problems of bias induced by contamination of response due to ordering of questions.
3.	Problems of threatening material/invasion of privacy.
4.	Skip patterns to tailor questionnaire for various respondent types.
In sum, your questionnaire should look as much as possible like a finished product, ready
to be fielded or at least pre-tested.
Part 2
In addition to your questionnaire you must provide a justification for each question. This
is the beginning of writing Specifications. For each question or set of related questions there
should be a brief statement as to why the question is included/necessary, and the rationale behind
the format selected. IT IS NOT SUFFICIENT TO SAY “IT'S SELF-EVIDENT.” It is
NEVER self-evident to someone else--like me! Specifications should also include the research
question being tested and information about how your sample was selected and from where.
Part 3
Test your questionnaire by interviewing a convenience sample of at least five
respondents.
Part 4
You must also complete two applications for Human Subjects Protection specific to an
“Application for the Involvement of Human Participants in Social Behavioral and Educational
Research (SBER) and Health Services Research (HSR).” Go to
http://ohrpp.research.ucla.edu. Click on “Forms,”
(http://ohrpp.research.ucla.edu/pages/forms-main). Click on Application for Social Behavioral &
Educational Research (SBER) & Health Services Research (HSR) (HS-1). This gives you
“(HS-1)Application for the Involvement of Human Participants in Social Behavioral &
Educational Research (SBER) & Health Services Research (HSR).” This is the first form that
you must complete; we sometimes call it the long form.
Now go to Exempt Certifications (HS-7)
(http://www.oprs.ucla.edu/human/forms/exempt-certifications). Click on Exempt Categories.
Read through the categories so that you understand them. Then click on Certification of
Exemption Form - Categories 1, 2, 3, 5 and 6. This takes you to the second form that you must
fill out: “Application for Certification of Exemption from IRB Review for Exemption Categories
1, 2, 3, 5, and 6.”
Further Information About the Protection of Human Subjects
If you are currently involved in any research activities, as a UCLA student you must be
certified. If you were certified in the past but did NOT complete the new certification process by
September 1, 2009, you cannot be involved in any UCLA research projects that involve human
subjects until you are certified. To find out about and complete the certification process, go to
http://ohrpp.research.ucla.edu/pages/certification.
REMEMBER!
All research that involves human subjects that is conducted by UCLA faculty, staff or
students MUST be cleared by the Office for the Protection of Research Subjects (OPRS).
On October 20, turn in:
1.	All the completed interviews you did.
2.	A copy of your Human Subjects materials: HS-1 which is required for the full consent process; and HS-7 as if you qualified for and were requesting an exemption.
3.	A copy of your specifications.
4.	One copy of the blank questionnaire for me.
5.	Fifteen copies of the blank questionnaire; these are to share with your classmates.
6.	A brief report (5-7 typed pages) describing the instrument you constructed, the data collected with it, the respondents from whom the data was collected, what you think worked well and what you think did not, and how you would change it.
Assignment 4: “Mini-Questionnaire” #2. (25% of Final Grade) Due November 17
Revise the questionnaire you designed in Assignment 3 in accordance with your accrued
wisdom and the succinct observations from me and your classmates.
Add a new set of questions that collects at least one of the following: sensitive behaviors,
retrospective data, or attitudes and opinions. For example, you might design questions that will
elicit information about substance abuse (e.g., use of alcohol), the use of non-traditional health
practices (e.g., faith healers, curanderos, over-the-counter drugs, other people's drugs, etc.),
or threatening behaviors (e.g., abortion). Retrospective data might be collected about past
health care experienced by the respondent over his/her lifetime. Finally, you might find out the
respondents' opinions of their current or past health care. If you have a good reason, you could
adopt or adapt sets of questions from other studies if they help you get to your objective.
Explicitly indicate the format of the questions. Will there be a checklist? How should it
look if presented to the respondent? Do you need a card to cue the respondent? What should be
on the card? Are other visual aids needed?
Start designing a codebook that can be used with your questionnaire. The codebook
should include information on how verbal answers are converted to numbers, where the variables
are located, and what the variables are named. I recommend using your questionnaire as the
basis for your codebook.
Whenever you write a question, you should have in mind the probable responses--if you
cannot think of the responses, then you have not thought about the question enough!! The
process of setting up categories for expected (and finally actual) responses is called code
construction. Closed-ended, pre-coded questions have already had codes constructed for them;
the respondent is presented with a specified set of alternatives which are the codes used later in
data analysis. The only additional coding problem presented by pre-codes is how to handle
residuals. For the code construction assignment, you must consider each of your pre-coded
questions, assign numbers to the alternatives following the procedures outlined in class
discussions and readings, and solve the residual problem.
For open-ended questions, you have to consider all possible responses and list these along
with code numbers. Include instructions for the coder to follow regarding how many responses
are to be coded, any precedence rules to follow and any other problems you think might arise.
Remember in this case also to provide a way of handling residual categories.
Remember to include codes for the required questions on age, education, income and
marital status. Do not attempt to set up a code for occupation; do write a paragraph outlining
your thoughts about how one would go about coding occupational data.
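The code-construction process described above can be sketched in code. This is a minimal illustration, not part of the course materials: the variable name `marital`, the column location, the answer categories, and the residual codes 8 (Don't know) and 9 (Refused/unclassifiable) are all hypothetical choices made for the example.

```python
# A minimal sketch of a machine-readable codebook entry for one pre-coded
# question. Variable name, column location, and residual codes are illustrative.

CODEBOOK = {
    "marital": {
        "question": "What is your current marital status?",
        "columns": (12, 12),  # where the variable is located in the data record
        "codes": {            # verbal answers converted to numbers
            "Married": 1,
            "Widowed": 2,
            "Divorced": 3,
            "Separated": 4,
            "Never married": 5,
        },
        "residuals": {        # residual categories handled explicitly
            "Don't know": 8,
            "Refused": 9,
        },
    },
}

def code_response(variable: str, answer: str) -> int:
    """Convert a verbal answer to its numeric code, falling back to residuals."""
    entry = CODEBOOK[variable]
    if answer in entry["codes"]:
        return entry["codes"][answer]
    if answer in entry["residuals"]:
        return entry["residuals"][answer]
    return entry["residuals"]["Refused"]  # treat unclassifiable answers as missing

print(code_response("marital", "Divorced"))    # 3
print(code_response("marital", "Don't know"))  # 8
```

The point of the sketch is that every expected answer, including the residuals, has a numeric code decided before data collection, so the coder never has to improvise.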
You do not have to write specifications for this assignment. You may want to start
revising your old ones and writing new ones in anticipation of Assignment 5.
Test your questionnaire by interviewing at least five respondents.
On November 17, turn in:
1. A report (7-10 typed pages) describing the development of the instrument--why items were selected, how and why they were revised; the data collected with this instrument; the sample of respondents from whom the data were collected.
2. Sixteen copies of the blank questionnaire; one for me and 15 to share with your classmates.
3. The codebook.
Assignment 5: Your Magnum Opus! (40% of the final grade) Due December 8 by 5:00 PM
This is the culmination of all your work! Revise your earlier questionnaires consistent
with your vastly increased wisdom. Remember that you should have a “final product” that is as
close to “picture-ready copy” as you can manage. This questionnaire should include variable
names for coding. Turned in with the questionnaire are a final set of Specifications and a final
Codebook, along with a write-up that summarizes your pretest interviews of this version of the
questionnaire with 8-10 respondents, a proposed analysis plan, and discussion of any further
changes that might be considered were you to actually use this instrument in a study.
On December 8, turn in:
1. A 7-10 page report that summarizes your pretest interviews, a proposed analysis plan, and a discussion of any further changes that should be considered were you to actually use this instrument in a study.
2. One blank questionnaire.
3. One set of final specifications.
4. One final codebook.
GENERAL STATEMENTS ON GRADING AND
PRESENTATION OF ASSIGNMENTS
When you enter M218, it is assumed that you will exit with a grade of “B.” A “B” is a
good, respectable grade. I write lots of letters of recommendation for people who get “B’s” in
M218. An “A” grade is earned by doing a really exceptional job. If you end up with a “C”
grade, it probably will be because you did not make a serious effort in this class: you did not do
the reading, you never came to class, you left all the assignments for the night before, etc. In
other words, it is hard to get a “C” in this class, BUT if that is what you earn, then that is what
you will get.
It is expected that all assignments will be turned in on the date due. There are no
extensions. Incompletes are not given in this course.
CLASS SCHEDULE
WEEK/DATE, ASSIGNMENTS
TOPIC, RELEVANT READINGS
WEEK 1:
CONTEXT OF THE QUESTIONNAIRE
1. Study Objective
2. Sample and Unit of Analysis
3. Types of Data to be Collected
4. Surveys
5. Funding Sources: Contracts and Grants
September 27
Relevant Readings: Aday, Chapters 1-5, 6-7; Bourque &
Fielder, Chapters 1, 5; Bourque in Lewis-Beck, Bryman,
Liao, pp. 229-230.
September 29
STARTING A RESEARCH STUDY
1. Study Objective
2. Research Questions
3. Hypotheses, Concepts, and Working Definitions
4. Variables: Independent, dependent, control
5. Levels of Measurement
Relevant Readings: Aday, Chapters 1-5; Bourque &
Fielder, Chapter 1.
WEEK 2:
October 4
ASSIGNMENT 1 DUE
TYPES OF QUESTIONNAIRES
1. Administrative Types
2. Question Types: Open/Closed
3. Information Obtainable by Questionnaire: Facts, Behaviors, Attitudes
Relevant Readings: Aday, Chapters 1-5; Bourque &
Fielder, Chapter 1; Curtin, Presser, Singer; Fricker et al.
October 6
ASSIGNMENT 2 DUE
HUMAN SUBJECTS PROTECTION AND FORMS
Download the HS-1 form, the HS-7 form, and the list of
Exemption categories before class from
http://ohrpp.research.ucla.edu. See instructions on pages
13-14.
Relevant Readings: OPRS web site and materials on class
web site.
WEEK 3:
October 11
QUESTIONS TO OBTAIN DEMOGRAPHIC
INFORMATION
1. Why?
2. How much?
3. How?
4. Location?
5. Household Roster
6. Selecting Questions from Other Studies
Relevant Readings: Aday, Chapters 8, 10; Bourque &
Fielder, Chapters 2, 3; Sigelman, Tuck, Martin; examples
on course web site and earthquake web site.
October 13
“BEGINNINGS” AND “ENDS” OF QUESTIONNAIRES
1. Call Record Sheet
2. Enlistment Letters
3. Questions to Interviewer
Relevant Readings: Bourque & Fielder, Chapter 6;
examples on websites.
WEEK 4:
October 18 & 20
QUESTIONNAIRE SPECIFICATIONS
1. Functions
2. Format
Relevant Readings: Bourque and Fielder, Chapter 3.
ASSIGNMENT 3 DUE October 20
WEEK 5:
October 25
ASCERTAINING INFORMATION ABOUT
RETROSPECTIVE BEHAVIORS
1. Grids
2. Histories
3. Aided Recall
4. Use of Records
Relevant Readings: Aday, Chapter 11;
WEEK 5, continued
October 27
ASCERTAINING INFORMATION ABOUT
THREATENING BEHAVIORS
1. Approaches
2. Closed vs. Open
3. Using Informants
4. Location
5. Validation
Relevant Readings: Barton.
WEEK 6:
November 1 & 3
CODEBOOKS AND CODE CONSTRUCTION
1. Objective
2. Types
3. Content Analysis
Relevant Readings: Aday, Chapter 13; Bourque & Fielder,
Chapter 3; Bourque and Clark; Bourque, Coding, Code
Frames; examples on web sites.
WEEK 7:
November 8 & 10
APHA
Guest Lecturer: Tonya Hays, Former Field Director, UCLA
Survey Research Center
Tips on conducting interviews and monitoring the quality
of data collection.
WEEK 8:
November 15
FORMATTING QUESTIONNAIRES
1. Order/Location
2. Grouping
3. Spacing
Relevant Readings: Aday, Chapter 12; Bourque & Fielder,
Chapter 4; Couper, Tourangeau; Krosnick et al.; Schaeffer
et al.
November 17
MEASURING ATTITUDES
1. Beginning
2. Developing Composite Measures
3. Use of Existing Measures
Relevant Readings: Aday, Chapter 11; examples on
websites.
ASSIGNMENT 4 DUE November 17
WEEK 9:
November 22 & 24
MEASURING ATTITUDES, CONTINUED
1. Reliability
2. Validity
WEEK 10:
November 29
DATA PROCESSING AND ANALYSIS OF
QUESTIONNAIRE DATA
1. Raw Data vs. Processed File
2. Coding
3. Data Entry/Keypunching
4. Cleaning
5. Raw vs. Actual Variables
6. Data Quality, Missing Data, etc.
Relevant Readings: Aday, Chapters 14, 15; Bourque &
Fielder, Chapter 6.
December 1
ADMINISTRATION OF SURVEYS
Relevant Readings: Aday, Chapter 13; Bourque & Fielder,
Chapter 6.
WEEK 11:
December 8
ASSIGNMENT 5 DUE AT 5:00 PM
OBJECTIVES, ASPH COMPETENCIES, AND RELEVANT MATERIALS

Upon completing this course…

Objective: Know how to design, develop, administer and document questionnaires.
ASPH Competencies:
E.2. Identify the causes of social and behavioral factors that affect health of individuals and populations.
E.5. Describe steps and procedures for the planning, implementation and evaluation of public health programs, policies and interventions.
E.6. Describe the role of social and community factors in both the onset and solution of public health problems.
E.8. Apply evidence-based approaches in the development and evaluation of social and behavioral science interventions.
C.1. Identify key sources of data for epidemiologic purposes.
C.10. Evaluate the strengths and limitations of epidemiologic reports.
Communication and Informatics: The ability to collect, manage and organize data to produce information and meaning that is exchanged by use of signs and symbols; to gather, process, and present information to different audiences in-person, through information technologies, or through media channels; and to strategically design the information and knowledge exchange process to achieve specific objectives.
Program Planning: The ability to plan for the design, development, implementation, and evaluation of strategies to improve individual and community health.
Relevant Materials: All textbooks, readings, lectures and assignments.

Objective: Know how to write questionnaire specifications.
ASPH Competencies:
K.3. Explain how the findings of a program evaluation can be used.
K.7. Differentiate among goals, measurable objectives, related activities, and expected outcomes for a public health program.
Communication and Informatics (as defined above).
Relevant Materials: Assignments 3 and 5; lectures on 10/18 & 10/20.

Objective: Know how to develop codebooks.
ASPH Competencies:
Communication and Informatics (as defined above).
Relevant Materials: Assignments 4 and 5; lectures on 11/1 & 11/3.

Objective: Know how to submit research proposals for review by Institutional Review Boards.
ASPH Competencies:
E.9. Apply ethical principles to public health program planning, implementation and evaluation.
J.2. Apply basic principles of ethical analysis (e.g. the Public Health Code of Ethics, human rights framework, other moral theories) to issues of public health practice and policy.
Relevant Materials: Assignment 3; lecture on 10/6.