Evaluation of Health Promotion (CS 652)
Sarah N. Keller

Overview
- Terms
- Methods
- Approaches
- Future Research

Terms
- Formative evaluation = needs assessment; baseline data collection; audience research; results feed into message/campaign design
- Process evaluation = monitoring; tracking progress & activities; short-term evaluation; in-process impact
- Impact evaluation = outcomes evaluation; end-user changes in knowledge, attitudes & practices (KAP)

Methods: Focus Groups
- Develop a clearer understanding of perceptions, values, tastes, and psychosocial factors
- N = 8-12
- Data = qualitative; recorded & transcribed
- Procedure = group interview, moderated by a single facilitator and/or note-taker
- Analysis = qualitative analysis for key phrases, research questions, language, etc.
- Purpose = formative research; message design; pretesting; monitoring research; audience reactions & feedback

Methods: Surveys
- Written, telephone, in-person, or online
- Questions = closed-ended or open-ended
- Data = quantitative & qualitative
- N = 200 (informal market sample) to 1,500 (minimum for national representation of any subgroup)
- Analysis = bivariate correlations; cross-sectional comparisons; multivariate regression
- Purposes = compare baseline to outcome indicators; pre-/post-test comparison; descriptive information; correlations of two or more variables with each other

Methods: In-Depth Interviews
- Process = face-to-face; one-on-one; structured or unstructured
- Data = qualitative; recorded & transcribed
- Analysis = qualitative
- Purpose = to uncover personal, sensitive, or in-depth information; psychosocial factors, anecdotes, personal histories

Online Methods: Online Surveys
- Samples = convenience sample of site visitors; Internet users at large; or a specific demographic or health-risk category
- Experimental design = one group can be directed to a Web site and another not (treatment and control)
- Process = online forms attached to Web sites can feed responses automatically into a database
- Data = quantitative or qualitative
- Analysis = same as with other surveys

Online Methods: Usability Studies
- Purpose = find out how users search for a topic or interact with a site; revise or pre-test a Web site
- Process = convene a group of users in a PC room & ask them to conduct "scavenger hunts"
- Data = audio, video, observation notes, log data, etc.

Online Methods: Online Focus Groups
- Can be conducted in chat rooms, by email, or by recruiting volunteers via a Web site form
- Purpose = assess users' opinions of a topic, a Web site, or a health area
- Data = qualitative
- Limits = invasion of privacy, reliability of self-reports, and the informed consent process

Approaches: CPP (Evaluating Community Planning)
- Define the evaluation goals
- Document that CPP has taken place
- Document whether program goals are being met
- Identify strengths & weaknesses
- Source: Holtgrave, D. R., et al. (1996). Methodological issues in evaluating HIV prevention community planning. Public Health Reports, 111(Suppl. 4).

Approaches: CPP (Draw a Logic Model of the CPP Process)
- Logic model = graphic representation of an intervention; its purpose is to show the logical connections between conditions, activities & outcomes
- Conditions/problems = what CPP is designed to change
- Activities = components of CPP undertaken
- Outcomes = short-term program goals

Approaches: CPP (Develop an Evaluation Plan)
- Goal #1 (process objective) = document that the activities have taken place as planned
- Goal #2 (outcomes) = document whether short-term program goals are being met; specific, measurable outcomes are derived from the general goals
- Goal #3 (lessons learned) = identify strengths & weaknesses (e.g., an objective might not be reasonable, or more technical assistance may be required)
- See the sketch after this slide for one way to check these goals against a logic model.
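Illustrative sketch (not from the Holtgrave et al. paper): one way to represent a CPP logic model and check evaluation Goals #1 and #2 against it, flagging gaps as lessons learned (Goal #3). The activity names, outcome labels, and monitoring records below are hypothetical placeholders, and Python is used only for illustration.

# A minimal sketch, assuming a CPP logic model stored as a plain dictionary.
logic_model = {
    "conditions": ["HIV risk in priority populations"],             # what CPP is designed to change
    "activities": ["convene planning group",                        # components of CPP undertaken
                   "conduct needs assessment",
                   "prioritize interventions"],
    "outcomes": {"plan submitted on time": True,                    # short-term, measurable goals
                 "priority populations identified": True,
                 "interventions matched to epidemiologic profile": False},
}

# Hypothetical monitoring records showing which planned activities have been documented.
documented_activities = {"convene planning group", "conduct needs assessment"}

# Goal #1 (process objective): were the planned activities carried out and documented?
undocumented = [a for a in logic_model["activities"] if a not in documented_activities]

# Goal #2 (outcomes): are the short-term program goals being met?
unmet = [name for name, met in logic_model["outcomes"].items() if not met]

# Goal #3 (lessons learned): gaps may mean an objective is unreasonable or technical assistance is needed.
print("Activities lacking documentation:", undocumented)
print("Short-term goals not yet met:", unmet)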
Approaches: CPP (Translating the Plan into Action)
- Core objectives = a national assessment of community planning does not require data on all of the process & outcome variables that might be useful at a local level
- Budgetary reporting
- Case studies

Example: Hospital Marketing (Goals)
- Determine which attributes physicians consider important in deciding to join a hospital
- Determine which factors most affect physicians' evaluations of hospital quality
- Source: Cronin, J. J., & Joyce, M. L. (1987). Medical staff perceptions: Implications for the design of hospital marketing programs. JHCM, 7(3).

Example: Hospital Marketing (Literature Review)
- Shows that physicians are important influential agents in determining consumers' hospital selection

Example: Hospital Marketing (Methods: Research Variables)
- Identify the research variables:
  - Hospital performance factors
  - Attributes that define the hospital's "offering" to physicians

Approaches: Hospitals (Methods: Focus Group)
- Focus group (n = 8 physicians) to discuss: "What actually defines what a hospital has to offer to you in order to attract you to use a facility?" and "What determines your evaluation of a particular hospital?"
- Analysis = attributes categorized into 11 indicators; evaluative dimensions categorized into 4 groups; results used for survey design

Approaches: Hospitals (Methods: Mail Survey)
- Mail survey; n = 169 physicians responded
- Setting = 4 hospitals in Florida
- 3-part instrument = questions to capture the main research questions; demographics

Approaches: Hospitals (Instrument)
- Respondents asked to recall their decision to join a hospital and rate the importance of each of the 11 attributes on a 1-5 scale
- Respondents asked to rate the perceived performance of the area's 4 hospitals on the 11 attributes
- Respondents asked to evaluate the 4 hospitals on the 4 evaluative dimensions from the focus group

Example: Hospitals (Quantitative Data Analysis)
- Means = mean response for each of the 11 attributes
- Correlation = relationship between physicians' perceptions of hospital performance (11 attributes as IVs) and their evaluations of the hospitals (4 dimensions as DVs)
- Multiple regression = identify which factors (clusters of attributes) explained the most variance in evaluations of the hospitals (an illustrative sketch appears at the end of this deck)

Limitations
- Research settings & samples need to be expanded for generalizability
- Privacy and informed consent are issues
- Cost of research
- Process vs. outcome evaluation
- Use of theory
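Illustrative sketch (not the authors' actual analysis): how the three quantitative analyses from the hospital example, attribute means, attribute-by-dimension correlations, and a multiple regression of one evaluative dimension on the 11 performance attributes, might look in Python. The column names and the randomly generated ratings are hypothetical placeholders standing in for the survey data, and the regression uses the 11 attributes directly rather than the factor clusters described in the study.

# A minimal sketch, assuming 169 physician responses rated on 1-5 scales.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
attributes = [f"attr_{i}" for i in range(1, 12)]   # 11 performance attributes (hypothetical labels)
dimensions = [f"dim_{j}" for j in range(1, 5)]     # 4 evaluative dimensions (hypothetical labels)

# Placeholder survey data standing in for the mail-survey responses.
df = pd.DataFrame(rng.integers(1, 6, size=(169, 15)).astype(float),
                  columns=attributes + dimensions)

# Means: average rating for each of the 11 attributes.
print(df[attributes].mean().round(2))

# Correlation: attribute ratings (IVs) against each evaluative dimension (DV).
print(df[attributes + dimensions].corr().loc[attributes, dimensions].round(2))

# Multiple regression: how much variance in one dimension the attributes explain.
X = sm.add_constant(df[attributes])
model = sm.OLS(df["dim_1"], X).fit()
print(round(model.rsquared, 3), model.params.round(2))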