Bolivian Case (Chapters 11-13)
Campaign (video, materials) · Study Design & Data Collection · Campaign Exposure Results · Correlation between Exposure and Outcomes · Correlation between Exposure and Outcomes with Controls · Subgroup Analysis · ANOVA (regress option) · Multiple (Logistic) Regression · Panel Data · Difference Scores · Lagged Variables

Chapter 11: Measuring Program or Campaign Exposure
• Program exposure is the degree to which the audience (recipients) recalls and recognizes the program.
• Exposure is measured to determine whether the intended audience received the message and how they interpreted it.
• Four aspects are measured: 1) reach, 2) frequency, 3) understanding, and 4) impact.

Measuring Exposure: Recall vs. Recognition and Spot vs. Message (measured for each medium)

Table 11-1. Two dimensions of campaign exposure.
                        Spot                          Message
Recall (spontaneous)    Did you hear/see something?   What was the message?
Recognition (prompted)  Pictorial/video/aural cues    Themes are read

Bolivia Campaign
• CCP started work in Bolivia in 1986
• Intensive lobbying with policymakers
• Microcassettes for autobuses
• IEC Subcommittee of the NRHP Committee
• Materials development, 1991-1993
• Mass media campaigns, 1994, 1995, 1996 (Phases I, I.5, II)

Box 11-1. Other Campaign Elements of the National Reproductive Health Program
In addition to the mass media campaign, other promotional items were produced and disseminated during Phase I.

Material            Description
Counselor training  Over 500 health counselors were trained in interpersonal communication and counseling via a train-the-trainer methodology
Manual              A counselor manual was developed and 400 copies were disseminated
Clinic poster       One thousand copies of a family planning poster were produced and disseminated
Flyers              Twelve different flyers on reproductive health were created and 30,000 copies of each were produced and disseminated
Booklet             A family planning booklet was produced and 127,000 copies were disseminated

Box 11-1 (cont.)
Material             Description
Flipchart            One thousand copies of a family planning flipchart, to be used in counseling sessions, were produced and disseminated
Clinic videos        Three different clinic videos were created and 500 copies of each were distributed to health centers
Advertising poster   An advertising poster was produced and 6,000 copies were disseminated; it carried the slogans "Get information and services here" and "Reproductive health is in your hands"
Billboard            An advertising billboard was created and 500 copies were disseminated

Box 11-2. Description of Mass Media Spots for National Reproductive Health Promotion
Eleven mass media spots were created for the Bolivia NRHP campaign. A videotape of these spots with English subtitles is available from the Johns Hopkins University Center for Communication Programs.

Spot                 Description
Ministry of Health   The Bolivian Minister of Health stated the value of reproductive health and the need to reduce maternal mortality
Introduction /       Explained the four major components of reproductive
Reproductive Health  health: family planning, prenatal care, childbirth, and breastfeeding
Family Planning      Presented different contraceptive methods: calendar, condoms, pills, and IUD

Box 11-2 (cont.)
Spot                          Description
Prenatal care                 Promoted attendance at health services during pregnancy
Breastfeeding                 Promoted the benefits of breastfeeding
Childbirth                    Promoted being attended by a provider during delivery
Abortion                      Presented family planning as a means to avoid unwanted pregnancy and being faced with an abortion decision
Family planning testimonial   Presented a satisfied user of family planning
Prenatal care testimonial     Presented a satisfied recipient of prenatal care
Childbirth testimonial        Presented a mother pleased with having a provider attend her childbirth

Box 11-3. Campaign Broadcast Schedule for Phase 1
[A week-by-week grid, May through November, marking the weeks in which each of the eleven spots — from the Minister of Health spot through the testimonials — was broadcast; the grid is not reproduced here.]
Figure 11-2. Spontaneous campaign recall: percent reporting recall of TV spots, TV messages, radio spots, radio messages, and any campaign material. [Chart not reproduced; bar values in the source are 80.2, 73.1, 58.3, 45.8, and 34.6.]

Figure 11-3. Campaign cost per exposure by audience reach for levels of theta (cost per exposure, 0 to 0.5, plotted against exposure from 0.01 to 1.0). [Chart not reproduced.]

Figure 11-4. Images shown to respondents to assess TV spot recognition with prompting. [Images not reproduced.]

Figure 11-5. TV spot recognition by time of broadcast. Panel a: recognition (%) for each TV spot (testimonials, abortion, postnatal, breastfeeding, prenatal, family planning, Ministry of Health, clinic, and ringer images; values run from 67% down to 10%). Panel b: actual vs. predicted recognition for the same spots. [Charts not reproduced.]

Ringers
• Used to measure demand bias or measurement error
• Images masquerade as true program products
• Affirmative responses measure false positives
• There is no agreed-upon way to include ringers in exposure scales; one can:
  – Divide the exposure scale by the ringers
  – Subtract ringers from the exposure scale
  – Include ringers as a separate variable

In the Bolivia Data
• The same image from the same commercial had 57% recognition
• Clinic video images had 18% and 24% recognition (also associated with attendance at clinics)
• Two different images from the same spot: 63% and 53%

Figure 11-6. Reported source of RH information (N=7,016): percent naming TV, radio, friends, other media, or "don't know" at baseline (Feb. 1994), follow-up (Nov. 1994), and second follow-up (Aug. 1996). [Chart not reproduced.]
Figure 11-7. Message source by respondent's language: percent naming TV and radio among Spanish, Quechua, Aymara, and other-language respondents. [Chart not reproduced; bar values in the source are 66, 61, 55, 48 (TV) and 42, 36, 39, 26 (radio).]

Table 11-2. Multiple regression coefficients (betas) for campaign exposure (TV ad recognition) on socio-demographic characteristics, the Bolivia NRHP mass media campaigns. Columns cover the urban samples (Nov. 1994, N=2,354; Aug. 1996, N=2,396) and the panel sample (Sept. 1995, N=798; Feb. 1996, N=545; Aug. 1996, N=419). Predictors were education, income, age, number of children, female, married, city size, TV ownership, and speaking Spanish; education, being female, TV ownership, and speaking Spanish were significant in most waves, and adjusted R² ranged from .10 to .22. [Column alignment is not recoverable from the source; the full table is not reproduced.]

Figure 11-8. NRHP campaign recall for panel data: percent with TV spot recall and TV message recall in Oct. 1995, Feb. 1996, and Aug. 1996. [Chart not reproduced; bar values in the source are 87, 78, 73, 66, 54, and 41.]

Figure 11-9. Recognition of TV spots by time of broadcast for the panel sample (N=419), baseline vs. follow-up, across the spots from the childbirth testimonial through the clinic and ringer images. [Chart not reproduced.]

Figure 11-10. Conceptual model of selectivity types: SES, literacy, and access feed campaign exposure; exposure leads through decoding to knowledge, attitudes, and practices, and then to behavior change (or no change). The selectivity types distinguished are motivational, cognitive, behavioral, conditional, and methodological (including cuing, predisposition, and response bias). [Diagram not reproduced.]
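The three options for handling ringers listed earlier (divide, subtract, or include as a separate variable) can be sketched in Python. This is a minimal illustration, not the study's actual scoring: the function name, the 0/1 item coding, and the divide-by-(1 + ringers) convention (used here to avoid dividing by zero) are all assumptions.

```python
# Sketch of the three ringer adjustments. true_items are responses to
# genuine campaign images; ringer_items are responses to fake images,
# so affirmative answers to them count false positives.

def exposure_scores(true_items, ringer_items):
    """true_items / ringer_items: lists of 0/1 recognition responses."""
    raw = sum(true_items)            # unadjusted exposure scale
    ringers = sum(ringer_items)      # false-positive count
    divided = raw / (1 + ringers)    # option 1: divide by ringers (assumed form)
    subtracted = raw - ringers       # option 2: subtract ringers
    # option 3: carry `ringers` as its own covariate in the regression
    return {"raw": raw, "divided": divided,
            "subtracted": subtracted, "ringer_covariate": ringers}

scores = exposure_scores(true_items=[1, 1, 1, 0, 1], ringer_items=[1, 0])
print(scores)  # raw=4, divided=2.0, subtracted=3, ringer_covariate=1
```

Option 3 is often the most transparent, since it lets the ringer count enter the model with its own coefficient instead of silently rescaling exposure.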
Chapter 12: Measuring Outcomes
• Determine the outcomes. For the Bolivia study:
  – Knowledge: awareness of FP methods
  – Positive attitude toward FP
  – Current use of a modern FP method

Data Collection
• Survey designed and pilot tested by JHU/CCP, the IEC Subcommittee of the NRHP, and E&E, a private public-opinion firm
• Two urban probability samples, before and after the campaign
• A probability sample from one small city (Potosí) treated as a panel
• Middle- and lower-middle-income residents
• Equal percentages of men and women, matched to interviewers

Table 12-1. Study Design for Phase 1
(O = observation; X1 = initial broadcast of the mass media campaign; X2 = re-broadcast)
Group   Feb. '94   Mar.-Oct. '94   Nov. '94   Sep. '95   Oct. '95-Jan. '96   Feb. '96      N
1       O1         X1                                    (X2)                           2,256
2                  X1              O2                    (X2)                           2,354
3                  X1                         O3         X2                  O4           545

Questionnaire
1) Demographics (Q1-Q15)
2) Attitudes (Q16-Q18)
3) FP awareness and use (Q19-Q28)
4) RH detailed knowledge (Q29-Q31)
5) RH service access (Q34-Q47)
6) Breastfeeding (Q48-Q54)
7) Campaign exposure and media use (Q56-Q75)
8) Personal networks (Q76-Q89)

Data Analysis
• Created the variable "wave"
• KAP outcomes created:
  – Knowledge: created variables for "know" and "don't know"
  – Attitude: constructed by summing 10 attitude items
  – Practice: dummy indicator for "use any modern method"

Figure 12-1. KAP scores at baseline, 9-month, and 28-month follow-up (percent):
                         Mar. '94   Nov. '94   Aug. '96
Knowledge (awareness)      56.4       60.8       63.4
Attitude (scale score)     87.0       88.2       88.8
Practice (method use)      30.2       35.5       35.7

Control Variables
• Education (1 to 6)
• Income (1 to 6)
• Age
• Number of children
• City prevalence rank
• Own TV
• Speak Spanish
Table 12-2. Sample Characteristics for Married Women in Urban Bolivia (N=2,818), by cross-sectional survey wave.

Factor                            Mar. 1994 (N=915)   Nov. 1994 (N=1,014)   Aug. 1996 (N=889)
Family planning awareness (%)          56                   61                    63
Attitude score (%)                     87                   88                    89
Current modern use (%)                 30                   36                    36
Age, mean (SD)                         31.0 (7.12)          31.4 (7.38)           30.8 (7.41)
No. of children, mean (SD)             2.72 (1.82)          2.71 (1.72)           2.65 (1.67)

Education (%)*
  None                                 5.7                  4.4                   2.6
  Primary                              24.9                 23.2                  20.9
  Middle                               18.8                 17.4                  17.2
  Secondary                            33.8                 35.8                  38.7
  Technical                            6.7                  8.0                   7.4
  Some post-secondary                  10.2                 11.2                  13.2

Income (%)**
  None                                 0.98                 0.7                   0.2
  80-140 Bs                            10.4                 5.4                   4.3
  141-500 Bs                           54.2                 48.7                  40.9
  501-800 Bs                           19.2                 25.4                  26.6
  801-1,100 Bs                         8.2                  10.6                  15.6
  1,100+ Bs                            7.0                  9.1                   12.4

City prevalence (%)
  El Alto                              12                   26                    14
  La Paz                               21                   35                    28
  Cochabamba                           33                   27                    32
  Sucre                                24                   25                    39
  Oruro                                31                   32                    42
  Tarija                               36                   38                    49
  Santa Cruz                           48                   46                    54

Own TV*, mean (SD)                     89.3 (30.9)          92.5 (26.3)           91.1 (28.5)
Speak Spanish**, mean (SD)             NA                   86.6 (34.1)           81.6 (38.8)
Campaign exposure**, mean (SD)         NA                   41.7 (26.6)           56.4 (28.7)

* p < 0.05; ** p < 0.01

Table 12-3. Regression Coefficients for KAP on Controls and Campaign Exposure.
(Awareness and Attitude columns are regression betas; the FP Use column reports odds ratios from logistic regression.)
               Awareness   Attitude   FP Use
Education        0.34**      0.08      1.27*
Income           0.14*       0.06      1.26*
Age             -0.02       -0.11      0.99
No. children     0.07        0.09      1.14*
City prev.       0.03        0.11*     1.16*
Own TV           0.07        0.06      1.55
Spanish         -0.02       -0.07      2.59
Exposure         0.10*       0.06      1.89*
R-squared        0.19*       0.04*     0.09*

Panel Data
• 798 residents randomly selected in Sept. 1995
• 545 re-interviewed in Jan./Feb. 1996
• 419 re-interviewed in Aug. 1996 (over 50% retention)

Attrition Analysis
• Respondents are lost to: 1) refusal, 2) inability to locate them, 3) no longer qualifying
• Create a variable in the baseline dataset indicating status (e.g., refused, not found, doesn't qualify, participated)
• Conduct bivariate and multivariate analyses comparing interview status with control and outcome variables
• The appendix of Valente & Saba (1998) provides results

Figure 12-2. Knowledge, attitude, and practice scores at baseline, 6-month, and 11-month follow-up for the panel sample of married women in Potosí, Bolivia (N=212) (percent):
                         Baseline   6-month   11-month
Knowledge (awareness)      46.8       57.5      65.7
Attitude (scale score)     87.9       88.0      89.9
Practice (method use)      13.5       21        21

Table 12-4. Panel Data: Regression Coefficients for KAP on Controls and Campaign Exposure.
Change in:     Awareness   Attitude   FP Use
Education        0.39**      0.06      1.39
Income          -0.05       -0.14      1.02
Age              0.26        0.15      0.98
No. children    -0.11        0.03      1.23
Own TV          -0.04        0.01      0.57
Spanish         -0.43*      -0.22      1.31
Exposure         0.11        0.07      3.84
R-squared        0.17*       0.03      0.06*

Change Scores Criticized
• Regression to the mean: is there a correlation between change and baseline? For example, r(awareness, change in awareness) = -0.56
• Difference scores assume a perfect correlation between baseline and follow-up
• Equations 12-1 to 12-4 illustrate

Correlation between the Baseline and the Change in Outcome Indicators
                 Difference in:
Baseline score   Awareness   RH Know.   Attitude   Intention   IPC     Cur. Use
Awareness          -0.56       -0.29      0.04       -0.08      -0.21    0.12
RH Know.           -0.18       -0.65      0.02       -0.04      -0.25    0.18
RH Attit.          -0.03       -0.08     -0.70       -0.07      -0.11    0.03
Intention          -0.17       -0.12      0.01       -0.59      -0.24    0.17
IPC                -0.11       -0.19     -0.11       -0.02      -0.75    0.18
Cur. Use           -0.15       -0.14      0.04       -0.29      -0.22    0.20

Table 12-5. Lagged Analysis.
                 Awareness   Attitude   FP Use
Baseline score     0.23*       0.25      6.84*
Education          0.47**     -0.04      1.36
Income             0.00       -0.03      0.87
Age                0.22        0.03      0.96
No. children      -0.15        0.11      1.34
Own TV             0.04       -0.02      0.54
Spanish           -0.26*       0.02      1.43
Exposure           0.24*       0.15      1.91
R-squared          0.38*       0.06      0.13*

Reporting Results
• Standardized vs. unstandardized coefficients
• Unstandardized: the expected change in Y for a one-unit change in X
• Standardized: varies between -1 and 1
• Unstandardized coefficients are useful for discussing changes and the magnitude of impact, whereas standardized coefficients are useful for comparing magnitudes between variables
• Report significance as p < .05, .01, .001, for example

Box 12-3. Massachusetts Anti-Tobacco Campaign
• Siegel & Biener (2002) interviewed 592 randomly selected 12-15-year-olds
• The original follow-up survey was used as the baseline, and a new follow-up was conducted 4 years later
• 12-13-year-olds exposed to the campaign were less likely to become established smokers
• Those who recalled the campaign were more likely to report lower smoking prevalence rates

Chapter 13: Advanced Design & Statistical Topics
• Eight advanced statistical topics
• Some require special data (multi-wave analysis, meta-analysis); some require special software (structural equation modeling)
• Advanced statistical techniques are no substitute for methodological (and logical) rigor

Bolivia 3-Wave Analysis (Phase II Evaluation)
• Data collected in July-August 1996
• Surveyed a new urban sample and re-interviewed the panel
• Same questionnaire, with new color images for spot recall and some new campaign messages

Stepwise Regression
• There is a tradeoff between Type I and Type II error: "The investigator can neither afford to make spurious positive claims (Type I) nor fail to find important relationships (Type II)."
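Returning to the change-score critique: the uniformly negative diagonal in the baseline-by-change correlation table (e.g., -0.56 for awareness) is exactly what regression to the mean produces whenever measures contain error. A small pure-Python simulation (illustrative only, not the Bolivia data) shows the effect:

```python
# Simulate a stable trait measured twice with error and show that the
# change score correlates negatively with the baseline score, even
# though nothing real changed.
import random

random.seed(42)
true = [random.gauss(50, 10) for _ in range(5000)]
t1 = [t + random.gauss(0, 5) for t in true]   # baseline = truth + error
t2 = [t + random.gauss(0, 5) for t in true]   # follow-up = truth + fresh error
change = [b - a for a, b in zip(t1, t2)]

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

# In theory r = -var(error) / sqrt(var(change) * var(t1)), about -0.3 here.
print(round(corr(t1, change), 2))
```

Respondents who score high at baseline (partly by chance) tend to "decline," and low scorers tend to "improve," which is why lagged models that condition on the baseline score are often preferred to raw difference scores.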
• Stepwise regression guards against this tradeoff by including only those variables, or blocks of variables, that contribute significantly to the explained variance in the outcome.

Table 13-1. Stepwise multiple regression analysis for three-wave data (a)
Cross-sectional data (N=2,818); Knowledge and Attitude: multiple regression; Use: logistic regression
                         Knowledge   Attitude   Use
Past modern use            0.18*       0.13*    1480*
Education                  0.32*       0.01     Dropped
Income                     0.08*       0.04     Dropped
Age                       -0.02       -0.07†    Dropped
No. children               0.05        0.05††   Dropped
City method use rank      -0.01        0.01     Dropped
Campaign recall            0.09*       Dropped  Dropped
Campaign recognition       0.07†       0.05†    2.19†
Adjusted R²                0.24*       0.03*    0.51*
(a) Education, income, age, and number of children were entered as a block.
* p < 0.001; † p < 0.01; †† p < 0.05

Table 13-1 (cont.). Panel data (N=171)
                         Knowledge   Attitude   Use
Outcome at time 1          0.32*       0.37*    15.8
Outcome at time 2          Dropped     Dropped  4.9†
Past modern use            0.11*       Dropped  Dropped
Education                  0.25†       Dropped  Dropped
Income                    -0.01        Dropped  Dropped
Age                       -0.03        Dropped  Dropped
No. children               0.00        Dropped  Dropped
City method use rank       --          --       --
Campaign recall            Dropped     Dropped  Dropped
Campaign recognition       0.28*       0.15††   Dropped
Adjusted R²                0.49*       0.15*    0.32*
(a) Education, income, age, and number of children were entered as a block.
* p < 0.001; † p < 0.01; †† p < 0.05

Structural Equation Modeling
• SEM is a technique used to estimate simultaneous linear models
• SEM is conducted with specialty software such as LISREL (Linear Structural Relations) or EQS

Conceptual Model of Campaign Impact
SES (gender, income, education), psychographic characteristics, family characteristics, and interpersonal contacts feed campaign exposure, which in turn affects knowledge, attitude, and practice. [Diagram not reproduced.]

SEM Advantages
• Models multiple dependent variables
• Assesses variable measurement at the same time as theoretical model fit
• Provides direct and indirect effect estimates
• Can test models with mediating variables

SEM Disadvantages
• Can be difficult to use, since it involves new software
• Difficulty testing large models (i.e., those with numerous variables)
• Difficulty testing measurement and theoretical models simultaneously

Figure 13-1. Conceptual model for the impact of the Bolivia reproductive health communication campaign: a path diagram linking education, income, age, number of children, and city prevalence rank to campaign exposure, knowledge, attitude, and practice. Direct effect of exposure on practice = 0.07; indirect effect = 0.02; D = 0.28; chi-square = 7.3 (p = NS); AGFI = .996; RMSR = 0.007. [Path diagram with individual coefficients not reproduced.]

SEM Terms
• Theoretical and measurement model
• Inner and outer model
• Endogenous and exogenous variables
• Beta refers to parameters linking dependent variables
• Gamma refers to parameters linking independent variables to dependent variables
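In LISREL notation, the beta and gamma parameters just defined sit in the structural (inner) model, whose standard general form (not the Bolivia-specific model) is:

```latex
\eta = B\eta + \Gamma\xi + \zeta
```

where \(\eta\) is the vector of endogenous (dependent) latent variables, \(\xi\) the vector of exogenous (independent) latent variables, \(B\) the matrix of beta coefficients among endogenous variables, \(\Gamma\) the matrix of gamma coefficients from exogenous to endogenous variables, and \(\zeta\) the vector of disturbances.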
• The Coefficient of Determination (D) is the model's R-squared

Figure 13-2. Three-wave model (Times 1-3): education, income, marital status, age, and number of children predict exposure, method awareness, and method use at each wave, with lagged paths across waves. D = 31%; AGFI = 94%; RMSR = 0.06. [Path diagram with individual coefficients not reproduced.]

Event History Analysis
• The study of events over time
• Appropriate when the researcher has a longitudinal record for individuals
• It addresses two issues: censoring and time-varying covariates

Censoring
• Censoring occurs when the longitudinal record is incomplete
• Right-censoring occurs when the data end prematurely (data are collected before everyone has had the event)
• Left-censoring occurs when the data start after the event has already occurred for some

Time-Varying Covariates
• Variables that change over time
• Examples: product price or perceived price; the proportion of adopters in one's network

Data Are Reshaped
• Data contain information on each respondent's ID and the time period of the interview
• Data are reshaped so that each person-time period constitutes one observation
• A person who was interviewed 5 times will contribute five observations to the data set
• Time-constant covariates stay constant across a person's rows, while time-varying covariates take on varying values

Dependent Variable: the Hazard Rate
• The proportion of events that occur at each time period
• The outcome is binary (0/1)
• Use logistic regression to estimate log(p(t)/(1-p(t))) as a function of an intercept and covariates

Table 13-2. Schematic Diagram of Data Reconfiguration for Event History Analysis
Original data                        Reconfigured data
Obs. No.   ID   Year of adoption     Obs. No.   ID   Year   Indicator
1          1    6                    1          1    1      0
2          2    4                    2          1    2      0
3          3    2                    3          1    3      0
                                     4          1    4      0
                                     5          1    5      0
                                     6          1    6      1
                                     7          2    1      0
                                     8          2    2      0
                                     9          2    3      0
                                     10         2    4      1
                                     11         3    1      0
                                     12         3    2      1

Time Series
• Data on behavior over a long period of time, typically 30 or more time points
• Data should be: periodic, accurate, reliable, consistent, sufficient, and diverse

Meta-Analysis: Literature
1. Gather all published and unpublished studies on a topic
2. Construct a table to tally relationships
3. Compute the proportion of support found for various relationships
4. Categorize the literature in terms of which concepts have been explored, to what degree, and how they relate to one another

Meta-Analysis: Statistical
1. Gather studies
2. Record the sample sizes, means, and variances of the dependent and independent variables under investigation
3. Record the relationship found between the two variables in each of the studies
4. Calculate the meta-level relationship

Weighting Data
• Bias in data collection sometimes needs to be corrected
• Samples can be adjusted to one another or to some external standard (e.g., census data)
• To weight data, the researcher creates a variable that signifies the proportion each case should represent in the analysis
• For example, a sample with 60% women and 40% men, weighted to equal representation, would have a weighting variable of 0.83 for women and 1.25 for men (for women: 60x = 50, so x = 0.83; for men: 40x = 50, so x = 1.25)

Cost-Effectiveness & Cost-Benefit
• Cost-effectiveness: the ratio of cost to effect, computed for a program and usually used to compare different interventions/programs with each other or with the status quo
• Cost-benefit: $benefits minus $costs, to determine a quantifiable amount of benefit derived from an investment; generally calculated at the individual level
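The person-period reshaping in Table 13-2 can be sketched in pure Python. The function name and tuple layout are illustrative assumptions, but the output reproduces the table's twelve person-period rows (respondents adopting in years 6, 4, and 2); a logistic regression of the 0/1 indicator on period and covariates then estimates the discrete-time hazard.

```python
# Person-period ("long") reshaping for event history analysis: each
# respondent contributes one row per year up to and including the year
# of adoption, with a 0/1 adoption indicator in the final column.

def to_person_period(records):
    """records: list of (person_id, event_year) -> (person_id, year, indicator) rows."""
    rows = []
    for pid, event_year in records:
        for year in range(1, event_year + 1):
            rows.append((pid, year, 1 if year == event_year else 0))
    return rows

# The three respondents from Table 13-2, adopting in years 6, 4, and 2:
rows = to_person_period([(1, 6), (2, 4), (3, 2)])
print(len(rows))   # 12 person-period observations, as in the table
print(rows[:3])    # [(1, 1, 0), (1, 2, 0), (1, 3, 0)]
```

A right-censored respondent would contribute rows with an all-zero indicator through the last observed year; time-varying covariates would simply be appended to each row for its year.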
Chapter 14: Dissemination of Evaluation Findings
• The most important yet most neglected aspect of evaluation
• Neglected because evaluators don't know what to do until the findings are known
• Dissemination can be controversial
• Dissemination should follow a framework
• Findings are communicated in a number of stages
• Findings need to be tailored to their intended audience
• Communication of findings is a process, not a one-time event, and not a product

Plan for Dissemination
• Projects that are not documented do not exist
• Plan for documentation of the program and its evaluation
• Lessons are learned in all evaluations, whether about program implementation or about evaluation itself

Dissemination Conflict
Three reasons for conflict:
1) Evaluators and designers disagree on the appropriate audience and methods
2) Pressure to report positive results
3) Resources

Four Techniques for Dissemination
1. Meet with designers and other stakeholders
2. Presentations at meetings, conferences, workshops, and professional societies
3. Key findings or preliminary report
4. Academic journal article

1. Meet with Designers
1. So they can respond to stakeholders
2. So they can plan future activities
3. So they can help with interpretation of the findings
4. To test your own interpretations

2. Conference Presentations
• A valuable opportunity to share results and get advice
• An opportunity to test results and interpretation
• An opportunity to identify colleagues who can help with further analysis and interpretation
• Numerous meetings exist

3. Key Findings Report
• Executive summary
• Program description
• Study design
• Sample description
• Estimate of effect
• Discussion

4. Academic Journal Publication
• Most evaluations are not published
• Academic publishing is time consuming
• Rewarding to academics, not designers
• Journals are sponsored by professional associations
• Editors, styles, formats, and emphasis vary considerably

Steps in the Academic Journal Publication Process (Table 14-1)
1. Analysis completed, findings agreed upon
2. Draft the article
3. Circulate to colleagues for internal and external review
4. Present the paper at a conference
5. Submit to an appropriate journal
6. The editor (editorial board) decides whether to send it out for review (1-2 months)
7. Two to four blind reviews are conducted and communicated to the editor (2-3 months), and a decision is made (reject; revise and resubmit; revise and resubmit, then accept; accept as is)

Academic Journal Steps (cont.)
8. The paper is revised and resubmitted (2-4 months)
9. The editor sends it back to reviewers (usually just the negative ones) (2-3 months)
10. A final decision is made; the author receives an acceptance notice and further instructions
11. The author receives a publication date and reviews galleys about 2 months prior to publication
12. The paper appears in print

Policy Process
• It takes a long time for evaluation findings to be translated into policy changes
• Figure 14-1 depicts the actors involved
• The process is messy and non-linear, and may be driven by emotions and feelings more than by data
• There is a lot of interest in evidence-based practices

Figure 14-1. Research findings inform many stages in the social change process.
[The figure links research findings to policy feedback, technology transfer, and agency utilization.]

Technology Transfer
• The conversion of sponsored research into commercial application
• TT is designed to get some return on federal sponsorship of research
• Example: Teflon, developed by NASA for the space program

Dissemination
• The communication of information to constituents or audiences
• Diffusion is the spread of new ideas and practices
• Evaluation findings are not used nearly as much as hoped or as we would like

Utilization
• Unfortunately, utilization of evaluation research findings is generally less than optimal
• Programmatic/intervention decisions are often made for political, public-opinion, or personal reasons rather than based on data
• As evaluators, we need to (be prepared to) provide accurate and reliable data for health/policy issues
• Utilization is influenced by cues or triggers (the public agenda)

Agenda-Setting Process
[Diagram relating the political, public, and media agendas.]

Table 14-2. Bolivia Dissemination Timeline
Activity                               Date
First campaign ends                    November 1994
Meeting with programmers               March 1995
Debrief NRHP/AID programmer/funder     May 1995
Follow-up meeting                      October 1995
Release preliminary findings report    February 1996
Second campaign begins                 March 1996
Submit academic paper                  March 1997
CR paper appears                       February 1998
Submit 2nd academic paper              April 1999

Graphics
• Graphic display of data, results, and concepts is a very important component of evaluation communication
• Graphics grab the attention of the reader
• Individuals often process information more easily through visual displays than through text
• Graphics can communicate complex relationships

Interpretation
• One of the hardest and least appreciated activities is the time it takes to interpret findings
• Evaluation findings are often under-utilized because they are not completely interpreted
• What does the study mean? What are the next steps? Who changed? Can this be replicated?
• What are the barriers to change?
• How much time will it take?

Ethical Considerations
• Data collection (informed consent)
• Data sharing
• Data reporting
• Our own personal conduct
• IRBs

Budgets
• Among the most important aspects of research
• Appendix D provides a sample budget
• Budgets are estimates; they provide the opportunity to think through what will happen, who will do what, and how long it will take
• 10-15% of the program budget should go to evaluation

Book Summary
• The text has attempted to summarize the steps, procedures, and techniques for evaluating health promotion programs
• It is intended to help evaluators, those who fund evaluation, and those who want to understand it
• Every evaluation will be different, but lessons and experience accumulate
• The material was presented in three sections: theory, methods, and results
• The Bolivia program was a prominent example, and may not be typical
• It is hoped that this material will facilitate the science of program evaluation

Study Design (Box 14-1)
1. Set goals and objectives (ch. 1 & 2)
2. Use theory to describe the behavior (ch. 3)
3. Design the study (ch. 6)
4. Determine sample size and sample selection (ch. 7)
5. Determine threats to validity (ch. 6)
6. Plan formative research to understand the setting (ch. 3)
7. Plan process research to monitor implementation (ch. 5)
8. Specify the analysis plan (ch. 9-13)
9. Specify the budget (ch. 14 / Appendix D)
10. Draft the instruments to be used (ch. 8 / Appendix B)
11. Estimate the timeline
12. Create a dissemination plan

Steps in an Evaluation Project (Box 14-2)
1. Project is initiated
2. Develop the evaluation design (Box 14-1)
3. Conduct formative research
4. Monitor the program
5. Collect post-program data
6. Analyze the data to measure impact
7. Present findings to program planners
8. Write the preliminary findings report
9. Interpret the data to determine the overall impacts of the program
10. Make recommendations for future programs

Future of Program Evaluation
• Evaluations should be consistent conceptually, operationally, and empirically
• Do the results tell a valid and consistent story of what happened?
• Do the findings make sense, or are there inconsistencies to be resolved?
• Can they be replicated?

Future of Program Evaluation (cont.)
• Growing demand for evaluators
• Increasing pressure and need for talented and experienced evaluators
• Need for capable qualitative and quantitative analytical skills
• The field is becoming more challenging