Conducting systematic reviews of public health and health promotion interventions
Nicki Jackson, Senior Training and Support Officer, Cochrane Health Promotion and Public Health Field

Overview
- Background to systematic reviews
- International systematic review initiatives
- Resources required
- Setting the scope of your review
- Asking an answerable question
- Searching for studies
- Data abstraction
- Principles of critical appraisal
- Synthesis of evidence
- Interpretation of results
- Writing the systematic review

Objectives
This workshop will enable you to:
1. Be familiar with some of the key challenges of conducting systematic reviews of health promotion and public health interventions
2. Formulate an answerable question about the effectiveness of interventions
3. Identify primary studies, including developing strategies for searching electronic databases
4. Evaluate the quality of an individual health promotion or public health study
5. Synthesise the evidence from primary studies
6. Formulate conclusions and recommendations from the body of evidence
7. Evaluate the quality of a systematic review

Acknowledgement
The Public Health Education and Research Program (PHERP), "Promoting and facilitating evidence-based policy and practice in Public Health and Health Promotion":
- Sydney Health Projects Group, School of Public Health, University of Sydney
- School of Public Health, La Trobe University
- Cochrane Health Promotion and Public Health Field

Background to systematic reviews

Types of reviews
- Reviews (narrative/literature/traditional)
- Systematic reviews
- Meta-analysis

Narrative reviews
- Usually written by experts in the field
- Use informal and subjective methods to collect and interpret information
- Usually narrative summaries of the evidence
Read: Klassen et al. Guides for Reading and Interpreting Systematic Reviews. Arch Pediatr Adolesc Med 1998;152:700-704.

What is a systematic review?
A review of the evidence on a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant primary research, and to extract and analyse data from the studies that are included in the review.*
*Undertaking Systematic Reviews of Research on Effectiveness. CRD's Guidance for those Carrying Out or Commissioning Reviews. CRD Report Number 4 (2nd Edition). NHS Centre for Reviews and Dissemination, University of York. March 2001.

High quality
A structured, systematic process involving several steps:
1. Plan the review
2. Formulate the question
3. Comprehensive search
4. Unbiased selection and abstraction process
5. Critical appraisal of data
6. Synthesis of data (may include meta-analysis)
7. Interpretation of results
All steps are described explicitly in the review.

Systematic vs. narrative reviews
Systematic reviews:
- Scientific approach to a review article
- Criteria determined at outset
- Comprehensive search for relevant articles
- Explicit methods of appraisal and synthesis
- Meta-analysis may be used to combine data
Narrative reviews:
- Depend on authors' inclination (bias)
- Author gets to pick any criteria
- Search any databases
- Methods not usually specified
- Vote count or narrative summary
- Review cannot be replicated

Advantages of systematic reviews
- Reduce bias
- Replicability
- Resolve controversy between conflicting studies
- Identify gaps in current research
- Provide a reliable basis for decision making

Increased interest in systematic reviews
- Government interest in health costs
- Variations in practice
- Public want information
- Facilitated by computer developments

Competing factors and pressures
Expectations, evidence, experience, opinions, financial pressures, time pressures.

Who benefits?
- Practitioners: current knowledge to assist with decision making
- Researchers: reduced duplication, identification of research gaps
- Community: recipients of evidence-based interventions
- Funders: identify research gaps/priorities
- Policy makers: current knowledge to assist with policy formulation

Limitations
- Results may still be inconclusive
- There may be no trials/evidence
- The trials may be of poor quality
- The intervention may be too complex to be tested by a trial
- Practice does not change just because you have the evidence of effect/effectiveness

Clinical vs. public health interventions
Clinical:
- Individuals
- Single interventions
- Outcomes only (generally)
- Often limited consumer input
- Quantitative approaches to research and evaluation
Public health:
- Populations and communities
- Combinations of strategies
- Processes as well as outcomes
- Involve community members in design and evaluation
- Qualitative and quantitative approaches
- Health promotion theories and beliefs

International systematic review initiatives

Sources of systematic reviews in HP/PH
- Cochrane Collaboration
- Guide to Community Preventive Services (The Guide), US
- The Effective Public Health Practice Project, Canada
- Health Development Agency, UK
- The Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre), UK
- Centre for Reviews and Dissemination, UK
- The Campbell Collaboration

Cochrane Collaboration
Named in honour of Archie Cochrane, a British researcher, who wrote in 1979: "It is surely a great criticism of our profession that we have not organised a critical summary, by specialty or subspecialty, adapted periodically, of all relevant randomised controlled trials."
An international non-profit organisation that prepares, maintains, and disseminates systematic up-to-date reviews of health care interventions.

The Cochrane Library
www.thecochranelibrary.com
- Cochrane systematic reviews: Cochrane reviews and protocols
- Database of Reviews of Effects: other systematic reviews appraised by the Centre for Reviews and Dissemination
- Cochrane Central Register of Controlled Trials: bibliography of controlled trials (some not indexed in MEDLINE)
- Cochrane Database of Methodology Reviews: Cochrane reviews of methodological studies
- The Cochrane Methodology Register: bibliography of studies relating to methodological aspects of research synthesis
- About the Cochrane Collaboration: information about review groups, Fields, Centres, etc., with contact details
- Health Technology Assessment Database: HTA reports
- NHS Economic Evaluation Database: economic evaluations of health care interventions
Organisation
- Review groups
- Fields
- Steering group
- Centres
- Methods groups
- Consumer network

Collaborative Review Groups
- Focused around health problems (50)
- Produce reviews
- Editorial base facilitates the review process
- International and multidisciplinary
e.g. Airways Group, Heart Group, Skin Group, Drugs and Alcohol Group, Injuries Group, Breast Cancer Group

Cochrane Centres
- Support review groups and reviewers within their area (13)
- Promote the Cochrane Collaboration
- Link to government and other agencies
- Not a production house for reviews
e.g. Australasian Cochrane Centre, South African Cochrane Centre, Italian Cochrane Centre

Cochrane Health Promotion and Public Health Field
- Registered in 1996. Administered from Melbourne. Funded by VicHealth (Co-directors at EPPI-Centre, UK)
- 330 members on the contact database across 33 countries
Aims:
- Promoting the conduct of reviews on HP/PH topics
- Educating HP/PH practitioners about the Cochrane Collaboration and encouraging use of systematic reviews
- Referring professionals to other databases

Reviews in HP/PH
- Primary prevention of alcohol misuse in young people
- Parent-training programmes for improving maternal psychosocial health
- Interventions for preventing childhood obesity
- Interventions for preventing eating disorders in children and adolescents
- Supported housing for people with severe mental disabilities

For further information
- The Cochrane Collaboration: http://www.cochrane.org
- The Cochrane Health Promotion and Public Health Field: http://www.vichealth.vic.gov.au/cochrane/
- The Australasian Cochrane Centre: http://www.cochrane.org.au
- The Cochrane Library: http://www.thecochranelibrary.com

Other sources
- The Guide to Community Preventive Services: http://www.thecommunityguide.org/
- Effective Public Health Practice Project (EPHPP): http://www.city.hamilton.on.ca/PHCS/EPHPP/default.asp
- Health Development Agency: http://www.hda-online.org.uk/
- Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre): http://eppi.ioe.ac.uk
- Centre for Reviews and Dissemination: http://www.york.ac.uk/inst/crd
- The Campbell Collaboration: http://www.campbellcollaboration.org/

Resources required

Conduct of systematic reviews
- Topic of relevance or interest
- Team of co-authors
- Training and support
- Access to, and understanding of, stakeholders or likely users
- Funding and time (at least 6 months)
- Access to databases of published and unpublished literature
- Statistical software, if appropriate
- Bibliographic software

Review manuals
- Cochrane Collaboration Reviewers' Handbook
- Cochrane Collaboration Open Learning Materials
- NHS Centre for Reviews and Dissemination Guidance for those Carrying Out or Commissioning Reviews
- The Methods of the Community Guide
- A Schema for Evaluating Evidence on Public Health Interventions
- EPPI-Centre Reviewers' Manual

Guidelines for HP/PH reviews
Cochrane Health Promotion and Public Health Field website: http://www.vichealth.vic.gov.au/cochrane/activities/guidelines.htm

Setting the scope of your review

Advisory Groups
- Policy makers, funders, practitioners, recipients/consumers
- Make or refine the review question
- Provide background material
- Help interpret the findings
- Assist with dissemination
- Formal (role descriptions) or informal

Lump or split?
Users' needs
- Policy makers: broad reviews to answer questions when there is a range of options
- Practitioners: more specific interventions or approaches
- Lump: informs which interventions to implement; more time-consuming
- Split: yes/no answer on whether to implement; less time

Lumping or splitting
- Interventions to modify drug-related behaviours for preventing HIV infection in drug users
- Interventions to modify sexual risk behaviours for preventing HIV infection
- Interventions to modify sexual risk behaviours for preventing HIV infection in men who have sex with men
- Interventions for preventing HIV infection in street youth
- Interventions for preventing HIV infection in young people in developing countries
- Counselling and testing for preventing HIV infection

Writing your protocol
1) Background
- Why is it important? How important is the problem? Is there uncertainty?
- What is the reasoning as to why the intervention(s) might work? (include theoretical frameworks)
- Other similar reviews?
2) Objectives
- What are the questions/hypotheses?
3) Selection criteria: PICO(T)
- Population(s)
- Intervention(s)
- Comparison(s)
- Outcomes (primary/secondary)
- Types of studies
4) Planned search strategy
- Databases and terms
5) Planned data extraction
- Processes and outcomes? More than one reviewer?
- Planned quality appraisal (incl. checklists)
6) Method of synthesis
- Tabulate
- Narrative/qualitative synthesis or meta-analysis

Asking an answerable question

Importance
A clearly framed question will guide:
- the reader in their initial assessment of relevance
- the reviewer on how to collect studies, how to check whether studies are eligible, and how to conduct the analysis

Questions of interest
Effectiveness:
- Does the intervention work/not work?
- Who does it work/not work for?
Other important questions:
- How does the intervention work?
- Is the intervention appropriate?
- Is the intervention feasible?
- Are the intervention and comparison relevant?

Answerable questions (effectiveness)
- A description of the population(s): P
- An identified intervention: I
- An explicit comparison: C
- Relevant outcomes: O

A PICO question
Time-consuming question: What is the best strategy to prevent smoking in young people?
An answerable question: Are mass media (or school-based or community-based) interventions effective in preventing smoking in young people?
Choose to look at mass media interventions…

The PICO(T) chart
- Problem/population: young people under 25 years of age
- Intervention: a) television b) radio c) newspapers d) billboards e) posters f) leaflets g) booklets
- Comparison: a) school-based interventions b) no intervention
- Outcomes: a) objective measures of smoking (saliva thiocyanate levels, alveolar CO) b) self-reported smoking behaviour c) intermediate measures (intentions, attitudes, knowledge, skills) d) media reach
- Types of studies: a) RCTs b) controlled before-and-after studies c) time series designs

Types of study designs
- Randomised controlled trial
- Quasi-randomised/pseudo-randomised controlled trial/controlled clinical trial
- Controlled before and after study/cohort analytic (pre- and post-test)/concurrently controlled comparative study
- Uncontrolled before and after study/cohort study
- Interrupted time series
- Qualitative research
See handbook.

Inequalities as an outcome
Health inequalities: "the gap in health status, and in access to health services, between different social classes and ethnic groups and between populations in different geographical areas."1
Other factors used in classifying health inequalities (PROGRESS)2: place of residence, race/ethnicity, occupation, gender, religion, education, socio-economic status, social capital.
1 Public Health Electronic Library. http://www.phel.gov.uk/glossary/glossaryAZ.asp?getletter=H.
2 Evans T, Brown H. Injury Control and Safety Promotion 2003;10(2):11-12.

Inequalities reviews
First Cochrane review: Effectiveness of school feeding programs for improving the physical, psychological, and social health of disadvantaged children and for reducing socio-economic inequalities in health.
Defining effectiveness:
- More effective for disadvantaged than advantaged groups
- Potentially effective: equally effective for both groups (the prevalence of health problems is greater in disadvantaged groups)
- Effective when the intervention is allocated only to the disadvantaged group

Incorporating inequalities into a review
- Reviews rarely present information on differential effects of interventions
- Systematic review methods for distilling and using information on relative effectiveness are underdeveloped
- Difficult to locate studies – a broad search is needed
- May need original data from authors
- Low power to detect subgroup differences
- Complexity and variety of study designs

Finding the evidence

Systematic review process
1. Well formulated question
2. Comprehensive data search
3. Unbiased selection and abstraction process
4. Critical appraisal of data
5. Synthesis of data
6. Interpretation of results

A good search
- Clear research question
- Comprehensive search: all domains, no language restriction, unpublished and published literature, up to date
- Document the search (replicability)

1. Electronic searching
Database choice should match the area of interest:
- Medical: Medline, EMBASE, CINAHL
- Social science: PsycINFO, Social Science Citation Index, Sociological Abstracts
- Educational: ERIC
- Other: AGRIS (agricultural), SPORTSDiscus (sports), EconLit (economics)
- Other registers: CENTRAL (Cochrane), BiblioMap (EPPI-Centre), HealthPromis (HDA)
Search using both MeSH/subject headings and textwords.

Components of electronic searching
1. Describe each PICO component
2. Start with the primary concept
3. Find synonyms: a) identify MeSH/descriptors/subject headings b) add textwords
4. Add other components of the PICO question to narrow the citations (a study filter may be used)
5. Examine abstracts
6. Use the search strategy in other databases (it may need adapting)

Example: Mass media interventions to prevent smoking in young people
P = young people
STEP ONE: Find MeSH and textwords to describe young people (tick 'map term' to find MeSH; click on the information icon to find other suitable terms)
- MeSH: Adolescent; Child; Minors
- Textwords: adolescent, child, juvenile, young people, student, girl, boy, teenager, young adult, youth

BOOLEAN OPERATORS – OR
"Or is more!" Combine similar terms – the MeSH and textwords for each PICO element – to give broader results (e.g. Adolescent OR Student).

Textwords: truncation ($)
To pick up various forms of a word:
- teen$.tw retrieves teenage, teenager, teenagers, teens, teen
- smok$.tw retrieves smoke, smoking, smokes, smoker, smokers

Textwords: wild cards (? and #)
To pick up different spellings:
- colo?r.tw (? can be substituted for one or no characters): colour, color
- wom#n.tw (# substitutes for one character): woman, women

Textwords: adjacency (ADJn)
Retrieves two or more query terms within n words of each other, in any order – useful when you are not sure of the phraseology:
- sport adj1 policy: "sport policy", "policy for sport"
- mental adj2 health: "mental health", "mental and physical health"

Example continued
I = mass media interventions
STEP TWO: Find MeSH and textwords to describe mass media interventions
- MeSH: Mass media; Audiovisual aids; Television; Motion pictures; Radio; Telecommunications; Newspapers; Videotape recording; Advertising

Example continued
O = prevention of smoking
STEP THREE: Find MeSH and textwords to describe prevention of smoking

BOOLEAN OPERATORS – AND
Combine the different concepts – each element of PICO – with AND to give fewer records and focused results (e.g. smoking AND adolescent). A small illustrative sketch appears at the end of this searching section.

Example of search
P = YOUNG PEOPLE: MeSH ……… and textwords ……… combined with OR
I = MASS MEDIA: MeSH ……… and textwords ……… combined with OR
C = (if required): MeSH ……… and textwords ……… combined with OR
O = PREVENTION OF SMOKING: MeSH ……… and textwords ……… combined with OR
Then combine P AND I AND C AND O.

Different bibliographic databases
- Databases use different types of controlled vocabulary
- The same citations are indexed differently in different databases
- Medline and EMBASE use different indexing systems for study type
- PsycINFO and ERIC do not have specific terms to identify study types
- A search strategy needs to be developed for each database (e.g. CINAHL, Medline, PsycINFO)

Compare subject headings
The same concept is indexed under different headings, for example:
- MEDLINE: Adolescent; Child; Mass media; Pamphlets; Radio; Television
- CINAHL: Adolescence; Child; Communications; Radio; Television
- PsycINFO: Adolescent attitudes; Mass media; Radio; Television
(a dedicated "media" subject heading is not available in every database)

Study design filters
- RCTs and non-RCTs: filters for non-RCTs are not yet developed, research is in progress
- Qualitative research: see the Cochrane Reviewers' Handbook; specific subject headings are used in CINAHL, and 'qualitative research' is used in Medline; CINAHL filter: Edward Miner Library http://www.urmc.rochester.edu/hslt/miner/digital_library/tip_sheets/Cinahl_eb_filters.pdf
- Systematic reviews/meta-analyses: CINAHL as above; Medline http://www.urmc.rochester.edu/hslt/miner/digital_library/tip_sheets/OVID_eb_filters.pdf; Medline and Embase http://www.sign.ac.uk/methodology/filters.html; PubMed

Other sources of primary research

2. Unpublished literature
- Only 30-80% of all known published trials are identifiable in Medline (depending on topic)
- Only 25% of all medical journals are in Medline
- Non-English language articles (and articles from developing countries) are under-represented in Medline
- Publication bias: the tendency for investigators to submit manuscripts, and of editors to accept them, based on the strength and direction of the results (Olsen 2001)
To address this:
- Hand search key journals and conference proceedings
- Scan bibliographies/reference lists of primary studies and reviews
- Contact individuals/agencies/academic institutions
Neglecting certain sources may result in biased reviews.

Examples of search strategies
HEALTH: Cochrane Injuries Group Specialised Register; Cochrane Library databases; MEDLINE; EMBASE; National Research Register
EDUCATIONAL/PSYCHOLOGICAL: PsycInfo; ERIC (Educational Resources Information Center); SPECTR (The Campbell Collaboration's Social, Psychological, Educational and Criminological Trials Register)
TRANSPORT: NTIS; TRIS; ITRD; RANSDOC; Road Res (ARRB); ATRI (Australian Transport Index)
GENERAL: Zetoc (the British Library conference proceedings database); SIGLE (System for Information on Grey Literature in Europe); Science (and Social Science) Citation Index
There was no language restriction. In addition we undertook a general Internet search focusing on the websites of relevant road safety organisations. Reference lists of all potentially eligible studies were examined for other relevant articles and experts in the field were contacted for additional information. The database and website searches were performed during the early months of 2002.
Ker K, Roberts I, Collier T, Beyer F, Bunn F, Frost C. Post-licence driver education for the prevention of road traffic crashes. The Cochrane Database of Systematic Reviews 2003, Issue 3. Art. No.: CD003734. DOI: 10.1002/14651858.CD003734.

Examples of search strategies
Project CORK; BIDS ISI (Bath Information and Data Services); Conference proceedings on BIDS; Current contents on BIDS; PSYCLIT; ERIC (U.S.A.); ASSIA; MEDLINE; FAMILY RESOURCES DATABASE; EMBASE; Health Periodicals Database; Dissertation Abstracts; SIGLE; DRUG INFO; SOMED (Social Medicine); Social Work Abstracts; National Clearinghouse on Alcohol and Drug Information; Mental Health Abstracts; DRUG database; Alcohol and Alcohol Problems Science Database - ETOH
Foxcroft DR, Ireland D, Lister-Sharp DJ, Lowe G, Breen R. Primary prevention for alcohol misuse in young people. The Cochrane Database of Systematic Reviews 2002, Issue 3. Art. No.: CD003024. DOI: 10.1002/14651858.CD003024.

Librarians are your friends!
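To make the OR/AND logic above concrete, here is a minimal, purely illustrative Python sketch. It is not a real database query: the record IDs and term names are made up, and sets simply stand in for the citations each search line would retrieve.

```python
# Illustrative only: Boolean OR/AND behaviour using sets of made-up record IDs.
# "OR is more" (union of synonyms within one PICO element);
# "AND is fewer" (intersection across the different PICO elements).

adolescent_mesh = {101, 102, 103, 104}   # hypothetical hits for MeSH term: Adolescent
teen_textword = {103, 104, 105, 106}     # hypothetical hits for textword: teen$
mass_media_mesh = {104, 106, 107, 108}   # hypothetical hits for MeSH term: Mass Media
smoking_terms = {102, 104, 108, 109}     # hypothetical hits for smoking prevention terms

population = adolescent_mesh | teen_textword    # OR: broader result for one element
intervention = mass_media_mesh
outcome = smoking_terms

final_hits = population & intervention & outcome   # AND: focused result across elements

print(f"Population (OR of synonyms): {len(population)} records")
print(f"Final search (P AND I AND O): {sorted(final_hits)}")
```

The same idea scales to a real strategy: build each PICO element by OR-ing its MeSH terms and textwords, then AND the elements together, adapting the vocabulary for each database.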
Data abstraction

Data abstraction forms
- Effective Public Health Practice Project reviews
- The Community Guide: http://www.thecommunityguide.org/methods/abstractionform.pdf
- Effective Practice and Organisation of Care Review Group: http://www.epoc.uottawa.ca/tools.htm
- NHS CRD Report Number 4: http://www.york.ac.uk/inst/crd/crd4_app3.pdf

Details to collect
- Publication details
- Study design
- Population details (n, characteristics)
- Intervention details
- Theoretical framework
- Provider
- Setting
- Target group
- Study details (date, follow-up)
- Consumer involvement
- Process measures (adherence, exposure, training, etc.)
- Context details
- Outcomes and findings
Pilot test the form on a sub-sample of studies.

Example 1 - RCT
Table 3. Weighted mean difference in BMI standard deviation score and vegetable intake between the five intervention schools and their control schools (weighted mean difference (95% CI) and % weight of school)
- School 1: BMI 0 (-0.2 to 0.1), weight 25.8%; vegetable intake 0.2 (-0.1 to 0.4), weight 25.5%
- School 2: BMI 0.1 (0 to 0.2), weight 18.0%; vegetable intake 0.4 (0.2 to 0.7), weight 18.2%
- School 3: BMI 0.1 (-0.1 to 0.2), weight 22.5%; vegetable intake 0.3 (0.1 to 0.5), weight 23.0%
- School 4: BMI -0.1 (-0.3 to 0), weight 19.8%; vegetable intake 0.4 (0.1 to 0.7), weight 16.0%
- School 5: BMI -0.2 (-0.3 to 0), weight 13.9%; vegetable intake 0.1 (-0.1 to 0.4), weight 17.4%
- Overall: BMI 0 (-0.1 to 0.1); vegetable intake 0.3 (0.2 to 0.4)

Example 2 - CBA
Table 2. Estimated differences in daily dietary intake based on repeated 24-hour recalls at follow-up for children in intervention (n=173) vs control (n=163) schools, controlling for baseline measures
- Number of fruits and vegetables per 4184 kJ: control 1.41, intervention 1.78
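As an illustration of the "Details to collect" list above, the sketch below shows one hypothetical way to hold an extraction record in a structured form. The field names and the example values are invented for illustration; they are not a standard abstraction form, and whatever form you adopt should still be pilot tested.

```python
# Illustrative sketch of a data-extraction record covering the fields listed above.
# Field names and example values are hypothetical, not a standard abstraction form.
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    citation: str                 # publication details
    study_design: str             # e.g. RCT, CBA, interrupted time series
    population: str               # n and characteristics
    intervention: str             # content, provider, setting, target group
    theoretical_framework: str
    follow_up: str                # study dates and length of follow-up
    consumer_involvement: str
    process_measures: dict = field(default_factory=dict)   # adherence, exposure, training
    context: str = ""
    outcomes: dict = field(default_factory=dict)            # outcome -> reported result

# Hypothetical example entry (all details invented)
record = ExtractionRecord(
    citation="Example et al. 2004",
    study_design="Cluster RCT",
    population="600 students in 6 schools (hypothetical)",
    intervention="School-based curriculum delivered by teachers",
    theoretical_framework="Social Cognitive Theory",
    follow_up="12 months",
    consumer_involvement="Student advisory group",
    process_measures={"sessions delivered": "8 of 10"},
    outcomes={"self-reported smoking": "OR 0.8 (95% CI 0.6 to 1.1)"},
)
print(record.study_design, "-", record.citation)
```

Using a fixed structure like this (on paper or electronically) makes it easier for two reviewers to extract the same items and to spot missing process and context information early.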
Principles of critical appraisal

Systematic review process
1. Well formulated question
2. Comprehensive data search
3. Unbiased selection and abstraction process
4. Critical appraisal of data
5. Synthesis of data
6. Interpretation of results

Critical appraisal
The process of systematically examining research evidence to assess its validity, results and relevance before using it to inform a decision.
Alison Hill, Critical Appraisal Skills Programme, Institute of Health Sciences, Oxford. http://www.evidence-based-medicine.co.uk

Critical appraisal I: Quantitative studies

Why appraise validity?
- Not all published and unpublished literature is of satisfactory methodological rigour
- Just because it is in a journal does not mean it is sound! The onus is on you to assess validity
- Quality may be used as an explanation for differences in study results
- Quality guides the interpretation of findings and aids in determining the strength of inferences
Poor quality affects trial results by exaggerating the intervention effect:
- Inadequate allocation concealment exaggerated treatment effects by 35-41% (Moher 1998, Schulz 1995)
- Lack of blinding of subjects exaggerated the treatment effect by 17% (Schulz 1995)
- Open outcome assessment exaggerated the treatment effect by 35% (Juni 1999, Moher 1998)

"The medical literature can be compared to a jungle. It is fast growing, full of dead wood, sprinkled with hidden treasure and infested with spiders and snakes." Peter Morgan, Scientific Editor, Canadian Medical Association

Bias
1. Selection bias
2. Allocation bias
3. Confounding
4. Blinding (detection bias)
5. Data collection methods
6. Withdrawals and drop-outs
7. Statistical analysis
8. Intervention integrity

Where bias arises in a trial
Recruit participants (selection bias); allocation (allocation concealment, allocation bias); intervention and control groups (confounding); exposed / not exposed to the intervention (integrity of intervention); follow-up (withdrawals, intention-to-treat); outcome assessment (blinding of outcome assessment, data collection methods); analysis (statistical analysis).

Selection bias
- Arises when recruiting the study population: differences in the way participants are accepted or rejected for a trial, and in the way interventions are assigned to individuals
- Difficult to avoid in public health studies
Question One: checklist
a) Are the individuals selected to participate in the study likely to be representative of the target population?
b) What percentage of the selected individuals/schools, etc. agreed to participate?

Allocation bias
- Randomisation (coin-toss, computer) versus alternation, days of the week, record number
- Comparable groups are needed: randomisation gives similar groups at baseline
- The allocation schedule should not be administered by the person responsible for the study, to prevent manipulation (concealed allocation? Lancet 2002;359:614-18)
- Allocation bias is reduced by: centralised randomisation; an on-site computer system with group assignments in a locked file; sequentially numbered, sealed, opaque envelopes; any statement that provides reassurance that the person who generated the allocation scheme did not administer it
- Not adequate: alternation, dates of birth, day of the week
Question Two: checklist
Allocation bias: type of study design (RCT, quasi-experimental, uncontrolled study)

Confounding
- Similar groups are needed at baseline
- Determine which factors could confound the association of the intervention and outcome
- Non-randomised studies can never adjust for unknown confounding factors (and there are difficulties in measuring known confounding factors)
- If confounding is present, it should be adjusted for in the analysis
Question Three: checklist
- Prior to the intervention, were differences for important confounders reported?
- Were the confounders adequately managed in the analysis?
- Were there important confounders that were not reported?

Blinding of outcome assessors
Detection bias: blinding of outcome assessors prevents systematic differences between groups in the outcome assessment.
Question Four: checklist
Were the outcome assessors blind to the intervention status of participants? Yes / No / Not applicable (if self-reported) / Not reported

Data collection methods
Outcomes in health promotion are more often subjective, so valid and reliable tools are required.
Question Five: checklist
Were data collection methods shown or known to be valid and reliable for the outcome of interest?

Withdrawals from the study
Attrition bias: systematic differences between groups in losses of participants from the study. Look at withdrawals and drop-outs.
Question Six: checklist
What percentage of participants completed the study?

Statistical analysis
- Power/sample size calculation
- Intention-to-treat analysis
- Cluster studies: allocation is by school/community etc., but analysis is generally at the individual level, leading to unit of analysis errors; appropriate sample size determination is needed
Question Seven: checklist
- Is there a sample size calculation?
- Is there a statistically significant difference between groups?
- Are the statistical methods appropriate?
- What was the unit of allocation and analysis? Was a cluster analysis done?
- Was an intention to treat analysis performed?
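To make the unit-of-analysis point in Question Seven concrete, the sketch below applies the standard design-effect formula, DEFF = 1 + (m - 1) x ICC, to hypothetical numbers. The school sizes and intracluster correlation are invented for illustration only.

```python
# Minimal sketch of why unit-of-analysis errors matter in cluster-allocated studies:
# the design effect inflates the variance, so the effective sample size is smaller
# than the number of individuals analysed. All numbers are hypothetical.

def design_effect(cluster_size: float, icc: float) -> float:
    """Standard design effect: DEFF = 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

def effective_sample_size(n_individuals: int, cluster_size: float, icc: float) -> float:
    """Number of individuals divided by the design effect."""
    return n_individuals / design_effect(cluster_size, icc)

# e.g. 1000 pupils in 20 schools of 50 pupils each, intracluster correlation 0.02
n, m, icc = 1000, 50, 0.02
print(f"Design effect: {design_effect(m, icc):.2f}")                      # 1.98
print(f"Effective sample size: {effective_sample_size(n, m, icc):.0f}")   # about 505
```

In this hypothetical example an analysis that treats the 1000 pupils as independent roughly doubles the apparent precision, which is exactly the kind of error the checklist question is probing for.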
Integrity of intervention
Integrity = fidelity = implementation = delivery of the intervention as planned.
- PH/HP interventions are complex, with multiple components
- Integrity covers: adherence to the specified program; exposure (number of sessions, length, frequency); quality of delivery; participant responsiveness; potential for contamination
- Related to feasibility

Example: school-based AIDS program
A 19-lesson comprehensive school-based program:
- Unit 1: Basic information on HIV/AIDS
- Unit 2: Responsible behaviour: delaying sex
- Unit 3: Responsible behaviour: protected sex
- Unit 4: Caring for people with AIDS
The program had no effect on knowledge, attitudes or intended behaviour, and there was no contamination. Focus groups showed the program was not implemented: role plays and condoms were not covered, teachers only taught the topics they preferred, there was a shortage of class time, condoms were controversial, and teachers left or died.

Example: Gimme 5 fruit, juice and vegetables
A school-based intervention: the curriculum included components to be delivered at the school and newsletters with family activities and instructions for intervention at home. Small changes in fruit, juice and vegetable consumption were observed. All teachers were observed at least once during the 6-week intervention: only 51% and 46% of the curriculum activities were completed in the 4th and 5th grade years. In contrast, teacher self-reported delivery was 90%.
Davis M, Baranowski T, Resnicow K, Baranowski J, Doyle C, Smith M, Wang DT, Yaroch A, Hebert D. Gimme 5 fruit and vegetables for fun and health: process evaluation. Health Educ Behav. 2000 Apr;27(2):167-76.

Question Eight: checklist
- What percentage of participants received the allocated intervention?
- Was the consistency of the intervention measured?
- Is contamination likely?

Different study designs
- Non-randomised studies: allocation concealment bias; confounding (uneven baseline characteristics)
- Uncontrolled studies: cannot determine the size of the effect relative to that which might have occurred in the absence of any intervention

Example - allocation bias in a non-randomised study
"Randomisation was not possible because of the interests of the initial participating schools in rapidly receiving intervention materials."

Bias (cont.)
Rivalry bias, 'I owe him one' bias, personal habit bias, moral bias, clinical practice bias, territory bias, complementary medicine bias, 'do something' bias, 'do nothing' bias, favoured/disfavoured design bias, resource allocation bias, prestigious journal bias, non-prestigious journal bias, printed word bias, 'lack of peer-review' bias, prominent author bias, unknown or non-prominent author bias, famous institution bias, large trial bias, multicentre trial bias, small trial bias, 'flashy title' bias, substituted question bias, esteemed professor bias, geography bias, bankbook bias, belligerence bias, technology bias, 'I am an epidemiologist' bias.

Quality of reporting ≠ quality of study
It may be necessary to contact the authors for further information about aspects of the study or to collect raw data.

Schema for Evaluating Evidence on Public Health Interventions
- Record the scope of the review and the review question
- Appraise each article or evaluation report
- Formulate a summary statement on the body of evidence
http://www.nphp.gov.au/publications/rd/schemaV4.pdf
The schema has five sections:
1. Recording the purpose and scope of your review
2. Evaluating each article in the review
3. Describing the results
4. Interpreting each paper
5. Summarising the body of evidence
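A hedged sketch of how the component ratings from the eight checklist questions above might be recorded and rolled up into a global quality rating. The domains simply mirror the checklist; the "no weak / one weak / two or more weak" rule follows a convention similar to that used by the Effective Public Health Practice Project tool, but the exact domains and thresholds here are illustrative, so check the tool's own dictionary before relying on it.

```python
# Illustrative sketch: component ratings for the eight appraisal questions and a
# simple global rating rule. The rule is an assumption modelled loosely on the
# EPHPP convention (strong = no weak ratings, moderate = one, weak = two or more).

DOMAINS = [
    "selection bias", "allocation (study design)", "confounders",
    "blinding of outcome assessors", "data collection methods",
    "withdrawals and drop-outs", "statistical analysis", "intervention integrity",
]

def global_rating(component_ratings: dict) -> str:
    """Roll component ratings up into a single study-level rating."""
    weak = sum(1 for rating in component_ratings.values() if rating == "weak")
    if weak == 0:
        return "strong"
    return "moderate" if weak == 1 else "weak"

# Hypothetical appraisal of a single study
ratings = {domain: "strong" for domain in DOMAINS}
ratings["blinding of outcome assessors"] = "weak"   # unblinded, self-reported outcomes
ratings["withdrawals and drop-outs"] = "moderate"   # 72% followed up

print(global_rating(ratings))   # -> moderate
```

Recording the component ratings (not just the global rating) keeps the appraisal transparent and lets quality be used later as a possible explanation for heterogeneity.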
Critical appraisal II: Qualitative studies

Qualitative research
…explores the subjective world. It attempts to understand why people behave the way they do and what meaning experiences have for people.
Undertaking Systematic Reviews of Research on Effectiveness. CRD's Guidance for those Carrying Out or Commissioning Reviews. CRD Report Number 4 (2nd Edition). NHS Centre for Reviews and Dissemination, University of York. March 2001.

Uses of qualitative research
Qualitative research relevant to systematic reviews of effectiveness may include:
- Qualitative studies of experience
- Process evaluation
- Identifying relevant outcomes for reviews
- Helping to frame the review question

Fundamentals
- Accepts that there are different ways of making sense of the world
- The study is 'led' by the subjects' experiences, not researcher led (open)
- There is no one qualitative approach: different questions may require different methods or combinations of methods
- Findings may translate to a similar situation, but are not usually generalisable or totally replicable

CASP appraisal checklist
1. Clear aims of research (goals, why it is important, relevance)
2. Appropriate methodology
3. Sampling strategy
4. Data collection
5. Relationship between researcher and participants
6. Ethical issues
7. Data analysis
8. Findings
9. Value of research (context dependent)

1. Aim of research
Describes why the research is being carried out: goal, importance, relevance.

2. Appropriate methodology
Does it address any of the following: What is happening? How does it happen? Why does it happen? E.g. why women choose to breastfeed, why there is miscommunication, how the intervention worked.

3. Qualitative research methods
- Observation: non-verbal and verbal behaviour recorded by notes, audio, video
- Interviews: semi-structured or unstructured
- Text: diaries, case notes, letters
- Focus groups: semi-structured or unstructured

4. Recruitment method
How were participants selected (e.g. a maximum variation approach)? Why was this method chosen?

5. Data collection
How were the data collected? Did the researchers discuss saturation of data? Are the methods explicit?

6. Reflexivity
The meaning given to the data, the types of interview questions asked, the researcher, the area being studied, the venue.

7. Ethical issues
Consent, confidentiality, professional responsibility, advice, reporting.

8. Data analysis
Interpretations are made by the researcher, often using thematic analysis: transcribing the data, re-reading it and coding it into themes/categories. Is there a description of the analysis? How were the themes derived?
Credibility of the analysis: method of analysis, clarity of approach, use of all of the data, triangulation, respondent validation.

9. Statement of findings
Findings are explicit; quality of the argument; replicability by another researcher; alternative explanations explored.

10. Value of research
Contribution to knowledge, potential new areas of research or interventions, applicability of results.

Other qualitative checklists
- NHS CRD Report Number 4: http://www.york.ac.uk/inst/crd/report4.htm
- Quality framework, Government Chief Social Researcher's Office, UK: http://www.strategy.gov.uk/files/pdf/Quality_framework.pdf

Synthesising the evidence

Steps
1. Table of study data
2. Check for heterogeneity
   a. No - meta-analysis
   b. Yes - identify factors, subgroup analysis or narrative synthesis
3. Sensitivity analyses
4. Explore publication bias

Step One: Table of study data
Include: year, setting, population details (including any baseline differences), study design, intervention details (including theory), control group details, results, study quality.
Example rows (from DiCenso A, Guyatt G, Willan A, Griffith L. Interventions to reduce unintended pregnancies among adolescents: systematic review of randomised controlled trials. BMJ 2002;324:1426-34):
- Aarons et al 2000. Setting: 6 junior high schools, Washington DC. Sample: 582 grade 7 students, mean age 12.8 years, 52% female, 84% African-American, 13% low socio-economic status. Unit of randomisation: school; unit of analysis: individual. Theory: Social Cognitive Theory. Intervention: 3 reproductive health lessons taught by health professionals, 5 sessions of a postponing sexual involvement curriculum; control group: conventional programme. Follow-up: 3 months, 96.4% followed. Outcomes: intercourse; use of birth control at last intercourse. Result: favoured intervention.
- Coyle et al 2001. Setting: 20 urban high schools, Texas and California. Sample: 3869 grade 9 students, mean age 15 years, 53% female, 31% white, 27% Hispanic, 16% African American. Unit of randomisation: school; analysis: adjusted. Theory: Social Learning Theory. Intervention: Safer Choices, 10 lessons for grade 9 and 10 lessons for grade 10 on knowledge and skills, led by trained peers and teachers. Follow-up: 31 months, 79% followed. Outcomes: intercourse; use of birth control at last intercourse. Result: favoured control.

Step Two: Check for heterogeneity
Are the results consistent?
- Yes: meta-analysis
- No: narrative synthesis or subgroup analysis; explain the causes of heterogeneity

Terminology
Homogeneity = similar. Studies are homogeneous if their results vary no more than might be expected by the play of chance (the opposite is heterogeneity).

Investigating heterogeneity
Graphically (the eyeball test) - if the studies are homogeneous:
- Point estimates lie on the same side of the line of unity
- Confidence intervals overlap to a large extent
- There is a lack of outliers
Statistically:
- p < 0.1 would indicate heterogeneity (a small worked sketch appears at the end of this step)
- But the test has low power when there are few studies
- Lack of statistical significance does not imply homogeneity

Sources of heterogeneity
Populations, interventions, outcomes, study designs, study quality. The factors that contribute to heterogeneity need to be identified.

Identifying and dealing with heterogeneity
Subgroup analyses by gender, age group, quality, type of intervention… but keep analyses to a minimum!
Not all systematic reviews are meta-analyses:
"…it is always appropriate and desirable to systematically review a body of data, but it may sometimes be inappropriate, or even misleading, to statistically pool results from separate studies. Indeed, it is our impression that reviewers often find it hard to resist the temptation of combining studies even when such meta-analysis is questionable or clearly inappropriate." Egger et al. Systematic reviews in health care. London: BMJ Books, 2001:5.

Yes - heterogeneity present: narrative synthesis
- Describes the studies
- Assesses whether quality is adequate in the primary studies to trust their results
- Demonstrates the absence of data for planned comparisons
- Demonstrates the degree of heterogeneity
- Stratify by populations, interventions, settings, context, outcomes, validity, etc.
Adapted from NHS CRD Report No. 4, 2nd Ed. www.york.ac.uk/inst/crd/report4.htm

No - heterogeneity not present: meta-analysis
- What comparisons should be made?
- What study results should be used in each comparison?
- Are the results of studies similar within each comparison?
- What is the best summary of effect for each comparison?
See the Cochrane Reviewers' Handbook.
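To complement the eyeball test and the p < 0.1 rule of thumb above, the sketch below computes the standard Cochran's Q and I-squared statistics under a fixed-effect (inverse-variance) model. The three study results (log odds ratios and standard errors) are made up for illustration, and scipy is assumed to be available for the chi-squared p-value.

```python
# Sketch of standard heterogeneity statistics (Cochran's Q and I-squared)
# for a handful of hypothetical study results. Effect sizes are log odds
# ratios with their standard errors; all numbers are invented.
import math
from scipy.stats import chi2

# (log OR, standard error) for three hypothetical studies
studies = [(-0.40, 0.20), (-0.10, 0.25), (-0.70, 0.30)]

weights = [1 / se**2 for _, se in studies]                       # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)

Q = sum(w * (es - pooled) ** 2 for (es, _), w in zip(studies, weights))
df = len(studies) - 1
p_value = chi2.sf(Q, df)                 # p < 0.1 is the conventional flag for heterogeneity
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"Pooled log OR: {pooled:.2f} (OR {math.exp(pooled):.2f})")
print(f"Q = {Q:.2f} on {df} df, p = {p_value:.2f}, I^2 = {I2:.0f}%")
```

As the slides note, with only a few studies this test has low power, so a non-significant Q should never be read as proof of homogeneity; the decision to pool still rests on clinical and methodological judgement.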
Meta-analysis
A weighted average of effect sizes, weighted by study size and events.

Study results
For dichotomous/binary outcomes (yes/no) use the relative risk or odds ratio:
- Risk = number of events / total number of observations
- Odds = number of events / number without the event
- Relative risk (RR): the risk of the event in one group divided by the risk in the other group
- Odds ratio (OR): the odds of the event occurring in one group divided by the odds of it occurring in the other group

Let's try!
                      Intervention group   Control group   Total
ABC                    2 (a)                4 (c)            6
No ABC                62 (b)               59 (d)          121
Total                 64                   63              127

Calculation: relative risk
RR = [a/(a+b)] / [c/(c+d)] = [2/(2+62)] / [4/(4+59)] = 0.49
The risk of developing ABC in the intervention group was 49% of the risk in the control group; that is, the intervention reduced the risk by 51% of what it was in the control group.

Calculation: odds ratio
OR = (a/b) / (c/d) = (2/62) / (4/59) = 0.48
The intervention reduced the odds of having ABC to about half of what they were. (A short worked sketch of these calculations, with confidence intervals, appears at the end of this step.)

Odds ratio graphs
[Forest plot figures: each study is shown as a best/point estimate with its confidence interval on a scale running from less than 1, through 1 (the line of no effect), to more than 1; a pooled effect size can be shown at the bottom.]

Confidence interval (CI)
- The range within which the true size of effect (never exactly known) lies, with a given degree of assurance (usually 95%)
- How 'confident' are we that the results are a true reflection of the actual effect/phenomena?
- The shorter the CI, the more certain we can be about the results
- If it crosses the line of unity (no treatment effect), the intervention might not be doing any good and could be doing harm

The p-value in a nutshell
Could the result have occurred by chance?
- p < 0.05: a statistically significant result; the result is unlikely to be due to chance
- p > 0.05: not a statistically significant result; the result is likely to be due to chance
- p = 0.05 is 1/20, i.e. the result is fairly unlikely to be due to chance
- p = 0.5 is 1/2, i.e. the result is quite likely to be due to chance

Continuous data
- Data normally presented with means and SDs (e.g. height, BMI)
- For each study you need the means and SDs to calculate the difference
- Difficult if continuous data arise from different scales
- Statisticians are your best friend!

Statistical software for meta-analysis

Step Three: Sensitivity analysis
How sensitive are the results of the analysis to changes in the way it was done?
- Changing the inclusion criteria for types of studies
- Including or excluding studies where there is ambiguity
- Reanalysing the data, imputing a reasonable range of values for missing data
- Reanalysing the data using different statistical approaches

Step Four: Explore publication bias
Is there a possibility I have missed some studies?
Studies with significant results are more likely to be published, to be published in English, and to be cited by others.

Funnel plots
Plot effect size against sample size. No publication bias gives a symmetrical inverted funnel; if smaller studies without statistically significant effects remain unpublished, there is a gap in the bottom corner of the graph.
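Returning to the worked ABC example above, this sketch reproduces the relative risk and odds ratio and adds approximate 95% confidence intervals using the usual large-sample log-scale formulas (the CI formulas are standard approximations, not something given in the slides). Both intervals cross 1, which matches the confidence interval and p-value discussion: the point estimates suggest roughly a halving of risk, but chance cannot be ruled out.

```python
# Sketch reproducing the ABC 2x2 example: relative risk and odds ratio with
# approximate 95% confidence intervals (standard log-scale formulas).
import math

a, b = 2, 62    # intervention group: events, non-events
c, d = 4, 59    # control group: events, non-events

rr = (a / (a + b)) / (c / (c + d))     # relative risk
or_ = (a / b) / (c / d)                # odds ratio

def ci_95(estimate: float, se_log: float) -> tuple:
    """95% CI for a ratio measure, calculated on the log scale."""
    lo = math.exp(math.log(estimate) - 1.96 * se_log)
    hi = math.exp(math.log(estimate) + 1.96 * se_log)
    return lo, hi

se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)

rr_lo, rr_hi = ci_95(rr, se_log_rr)
or_lo, or_hi = ci_95(or_, se_log_or)
print(f"RR = {rr:.2f}, 95% CI {rr_lo:.2f} to {rr_hi:.2f}")   # about 0.49 (0.09 to 2.6)
print(f"OR = {or_:.2f}, 95% CI {or_lo:.2f} to {or_hi:.2f}")  # about 0.48 (0.08 to 2.7)
```

In a meta-analysis, study-level estimates like these (on the log scale) are what get combined into the weighted average described above, with larger, more precise studies receiving more weight.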
Synthesis of qualitative research
- In its infancy, with widely varying theoretical perspectives
- The unit of analysis is the concept/theme
- A secondary summary of research
- The most developed method is meta-ethnography
- Help is around the corner - many research projects are in progress!

Interpretation of results

Formulating conclusions and recommendations
VERY IMPORTANT! Many people prefer to go directly to the conclusions before looking at the rest of the review. Conclusions must reflect the findings of the review and be consistent with its objectives and the method of review.

Issues to consider
Conclusions should be based on:
- Strength of evidence
- Biases/limitations of the review
- Applicability and sustainability of results
- Trade-offs between benefits and harms
- Implications for public health and future research

Strength and biases
Strength: How good is the quality of evidence? How large are the effects? Are the results consistent?
Biases/limitations of the review: Comprehensive search? Quality assessment? Appropriate analysis? Publication bias?

Applicability
Applicability relates to:
- Study population characteristics
- Validity of the studies
- Relevant outcomes (incl. efficiency), interventions, comparisons
- Integrity of the intervention - details of the intervention (provider, adherence, medium, setting, access, infrastructure)
- Maintenance of the intervention/sustainability

Factors relating to the interpretation of effectiveness
Theoretical frameworks, integrity of the intervention, and the influence of context.

Theory
- Change in behaviour may be at the individual, community, organisational or policy level
- Examine the impact of the theoretical framework on effectiveness; group studies according to theory
- Assists in determining implementation (integrity of interventions)
- Discuss the theoretical frameworks used (all single level?)

Context
VERY IMPORTANT IN HP/PH REVIEWS. Context (social/cultural, political, organisational) influences the effectiveness of the intervention, affects the ability to pool results, and affects applicability. It includes:
- Time and place of the intervention
- Local policy environment, incl. management support for the intervention
- Broader political and social environment, concurrent social changes
- Structural, organisational (aspects of the system) and physical environment
- Training, skills and experience of those implementing the intervention
- Characteristics of the target populations (e.g. culture, literacy, SES)
These data are often not provided!

Trade-offs
Adverse effects/potential for harm; costs of the intervention.

Sustainability
Sustainability of outcomes and/or interventions - consider:
- Economic and political variables
- Strength of the institution
- Full integration of activities into existing programs/curricula/services, etc.
- Whether the program involves a strong training component
- Community involvement/participation

Implications for PH/HP
It is not good enough to simply say "more research is needed". State what type of research should be done and why. What specific study design or quality issue should be addressed in future research?

Writing the systematic review

Writing your systematic review
Useful review manuals and guidelines for publication:
- Cochrane Reviewers' Handbook
- Cochrane Open-Learning materials: http://www.cochrane.org/resources/revpro.htm
- NHS CRD Report: http://www.york.ac.uk/inst/crd/report4.htm
- QUORUM statement (Lancet 1999;354(9193):1896-900)
- MOOSE guidelines (JAMA 2000;283:2008-2012)

Appraisal of a systematic review: 10 questions
1. Clearly-focused question
2. The right type of study included
3. Identifying all relevant studies
4. Assessment of the quality of studies
5. Reasonable to combine studies
6. What were the results?
7. Preciseness of results
8. Application of results to the local population
9. Consideration of all outcomes
10. Policy or practice change as a result of the evidence