New Trend in Actualizing Evidence-Based Practice in Social Work Setting

Daniel TL Shek, PhD, FHKPS, BBS, JP
Chair Professor of Applied Social Sciences
Department of Applied Social Sciences
The Hong Kong Polytechnic University

Knowledge Claims – Believe It Or Not?
• "The cranes send the babies to town. When there were more cranes in the town last winter, the number of babies born in the next year will increase."
• "Acid in the stomach is the major cause of stomach ulcers."
• "After the Big Bang, the universe has expanded at a slower rate."
• "Chinese fathers are stricter and less kind than Chinese mothers" ("yan fu ci mu").
• Adventure-based counseling is an effective intervention approach in changing the behavior of delinquents.
• After-school care service is helpful to students.

The Counting Horse – Believe It Or Not?
• The owner of a horse (Clever Hans) claimed that it could count and do simple addition and subtraction (including fractions).
• Hans gave his answers by tapping his forefoot or by pointing his nose at different answers displayed to him.

What is Evidence-Based Practice?
• How much do you know about EBP?
• EBP is "the integration of best research evidence with clinical experience and client values." (Sackett et al., 2000, p. 1)
Sackett, D.L., Straus, S.E., Richardson, W.C., Rosenberg, W., & Haynes, R.M. (2000). Evidence-based medicine: How to practice and teach EBM. New York: Churchill Livingstone.
• "Use of best current knowledge as a basis for decisions about groups of patients or populations." (Gray, 2001, p. 20)
Gray, J.A.M. (2001). Evidence-based medicine for professionals. In A. Edwards & G. Elwyn (Eds.), Evidence-based patient choice: Inevitable or impossible? (pp. 19-33). New York: Oxford University Press.

An Updated Model for Evidence-Based Decisions
[Figure: three overlapping circles – research evidence, client preferences and actions, and clinical characteristics and circumstances – integrated through clinical expertise]
Gambrill, E. (2006). Evidence-based practice and policy: Choices ahead. Research on Social Work Practice, 16, 338-357.

Steps of EBP (Sackett et al., 2000, pp. 3-4)
1. Converting information needs related to practice decisions into well-structured, answerable questions.
2. Tracking down, with maximum efficiency, the best evidence with which to answer them.
3. Critically appraising that evidence for its validity, impact (size of effect), and applicability (usefulness in practice).
4. Applying the results of this appraisal to practice and policy decisions (decisions on the generalizability of findings).
5. Evaluating effectiveness and efficiency in carrying out Steps 1 to 4 and seeking ways to improve them in the future.

Gambrill (2006): Critical Thinking Values as the Foundation of EBP
Values
• Courage
• Curiosity
• Intellectual Empathy
• Humility
• Integrity
• Persistence
Honor Ethical Obligations
• Transparency
• Avoiding harm
• Informed consent
• Maximizing autonomy
• Self-determination
• Empowerment

• Evidence-based practice as an empowerment process
• Attempts to maximize the likelihood that clients will receive the most effective intervention
• Associated with a higher probability of treatment success, but not a guarantee (which program has a higher probability of preventing adolescent drug abuse – a carnival, a singing concert, or a curricular-based intervention?)
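Step 3 of the EBP steps listed earlier calls for appraising evidence for its impact (size of effect). As a minimal, hypothetical illustration of what that involves, the sketch below computes a standardized mean difference (Cohen's d) from summary statistics; the means, standard deviations and sample sizes are invented for illustration and are not taken from any study cited in this presentation.

```python
import math

def cohens_d(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference between a treatment and a control group,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_tx - 1) * sd_tx ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                          / (n_tx + n_ctrl - 2))
    return (mean_tx - mean_ctrl) / pooled_sd

# Hypothetical post-test scores on some outcome measure (illustrative only).
d = cohens_d(mean_tx=4.40, sd_tx=0.60, n_tx=120,
             mean_ctrl=4.25, sd_ctrl=0.65, n_ctrl=115)
print(f"Cohen's d = {d:.2f}")  # about 0.24, a small effect by common benchmarks
```

By Cohen's commonly used benchmarks, d values of about 0.2, 0.5 and 0.8 correspond to small, medium and large effects; whether a given effect is meaningful for a particular client group remains a matter of appraisal and judgment (Steps 3 and 4).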
Rubin and Parrish (2007): Common Myths Against EBP
• Myth 1: Ignores clinical expertise and client values
• Myth 2: Inconsistent with client empowerment
• Myth 3: Inadequate studies on various populations
• Myth 4: Lag time in demonstrating effectiveness
• Myth 5: Evidence-based practice is the concern of academics only
• Your myths and understanding about EBP?
Rubin, A., & Parrish, D. (2007). Challenges to the future of evidence-based practice in social work education. Journal of Social Work Education, 43, 405-428.

Choices in EBP (Gambrill, 2006)
• How systemic to be (educators, researchers, practitioners)?
• Who will select the practice?
• What outcomes and indicators will be focused on?
• What style of EBP will be used (e.g., knowledge manager)?
• How and in what ways to involve the clients?
• Whether to implement needed organizational change?
• How rigorous to be in reviewing the evidentiary status of services?
• How transparent to be regarding the evidentiary status of services?

How to Get the Best Available Evidence?
• How often do you read social work journals? Professional and academic journals are a MUST.
• Journals: Evidence-Based Social Work; Research on Social Work Practice
• Databases: Social Work Abstracts (social work and allied disciplines); PsycINFO (psychology, social work and allied disciplines); Medline (medicine and allied disciplines); Sociological Abstracts (sociology and allied disciplines); ERIC (education and allied disciplines); CINAHL Plus (clinical professions and allied disciplines)
• Government databases: SAMHSA and NREPP

The Campbell Collaboration (C2)
• Helps people make well-informed decisions by preparing, maintaining and disseminating systematic reviews in education, crime and justice, and social welfare.
• An international research network that produces systematic reviews of the effects of social interventions.
• Voluntary cooperation among researchers of a variety of backgrounds.
• Campbell currently has five Coordinating Groups: Social Welfare, Crime and Justice, Education, Methods, and the Users Group.
• The Coordinating Groups are responsible for the production, scientific merit, and relevance of its systematic reviews.
• Campbell's International Secretariat is now located in Oslo and hosted by the Norwegian Knowledge Centre for the Health Services.
• The Campbell Collaboration (C2) grew out of a meeting in London in 1999; 80 people from four countries attended, many from its sibling organization, the Cochrane Collaboration (C1).
• Cochrane had been producing systematic reviews in healthcare since 1994. Many of its members saw the need for an organization that would produce systematic reviews of research evidence on the effectiveness of social interventions.
• The Campbell Collaboration was created in 2000, with an inaugural meeting in Philadelphia, USA, attracting 85 participants from 13 countries.
• With partnerships developing in a number of countries, Campbell has held eight Annual Colloquia to date.
• A Nordic Campbell Centre was added to the Collaboration in 2001, supported by the Danish Government and the Nordic Council of Ministers.
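The bibliographic databases listed above (for example, Medline via PubMed) can also be queried programmatically when screening large bodies of literature. Below is a minimal sketch that assumes the publicly documented NCBI E-utilities esearch endpoint; the search term is invented for illustration, and subscription databases such as PsycINFO or Social Work Abstracts have their own vendor-specific interfaces.

```python
import requests

# Minimal sketch: search PubMed/Medline through the public NCBI E-utilities API.
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",
    "term": "adolescent drug prevention AND randomized controlled trial",
    "retmode": "json",
    "retmax": 20,  # number of record IDs to return
}
response = requests.get(ESEARCH_URL, params=params, timeout=30)
response.raise_for_status()
result = response.json()["esearchresult"]

print("Records found:", result["count"])
print("First PubMed IDs:", result["idlist"])
```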
10 Key Principles
• Collaboration: internally and externally, fostering good communication.
• Building on the enthusiasm of individuals.
• Avoiding duplication, by good management and coordination to ensure economy of effort.
• Minimizing bias, through a variety of approaches such as scientific rigor.
• Keeping up to date: incorporation of new evidence.
• Striving for relevance, by promoting the assessment of policies and practices using outcomes that matter to people.
• Promoting access: wide dissemination of the outputs.
• Ensuring quality, by being open and responsive to criticism and applying advances in methodology.
• Continuity: ensuring that key functions are maintained and renewed.
• Enabling wide participation, by reducing barriers to contributing and by encouraging diversity.

Overseas Experiences of Evidence-Based Practice
• Numerous examples! Do you know them?
• Campbell Collaboration Library
• Mental health, substance abuse, and suicide prevention programs
• Durlak, J. A., & Wells, A. M. (1997). Primary prevention mental health programs for children and adolescents: A meta-analytic review. American Journal of Community Psychology, 25, 115-152.
• Catalano, R. F., Berglund, M. L., Ryan, J. A. M., Lonczak, H. S., & Hawkins, J. D. (2002, June 24). Positive youth development in the United States: Research findings on evaluations of positive youth development programs. Prevention & Treatment, 5, Article 15. Retrieved November 17, 2004, from http://www.journals.apa.org/prevention/volume5/pre0050015a.html

• Zwi, K., Woolfenden, S., Wheeler, D., O'Brien, T., Tait, P., & Williams, K. (2007). School-based education programmes for the prevention of child sexual abuse. Campbell Systematic Reviews, 2007:5. DOI: 10.4073/csr.2007.5
• Studies evaluated in this review report significant improvements in knowledge measures and protective behaviours. Several studies reported harms, suggesting a need to monitor the impact of similar interventions. Retention of knowledge should be measured beyond 3-12 months. Further investigation of the best forms of presentation and the optimal age of programme delivery is required.

• Fisher, H., Montgomery, P., & Gardner, F. (2008). Opportunities provision for preventing youth gang involvement for children and young people (7-16). Campbell Systematic Reviews, 2008:8. DOI: 10.4073/csr.2008.8
• No evidence from randomised controlled trials or quasi-randomised controlled trials currently exists regarding the effectiveness of opportunities provision for gang prevention. Only two studies addressed opportunities provision as a gang prevention strategy, a case study and a qualitative study, both of which had such substantial methodological limitations that even speculative conclusions as to the impact of opportunities provision were impossible. Rigorous primary evaluations of gang prevention strategies are crucial to develop this research field, justify funding of existing interventions, and guide future gang prevention programmes and policies.

• Smedslund, G., Hagen, K. B., Steiro, A., Johme, T., Dalsbø, T. K., & Rud, M. G. (2006). Work programmes for welfare recipients. Campbell Systematic Reviews, 2006:9. DOI: 10.4073/csr.2006.9
• Welfare-to-work programmes in the USA have shown small but consistent effects in moving welfare recipients into work, increasing earnings, and lowering welfare payments. The results are not clear for reducing the proportion of recipients receiving welfare. Little is known about the impacts of welfare-to-work programmes outside of the USA.
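Meta-analytic reviews such as those above combine effect sizes from individual studies into a pooled estimate. The sketch below shows inverse-variance (fixed-effect) pooling; the study names, effect sizes and standard errors are invented purely for illustration and are not drawn from the reviews cited here.

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of study effect sizes.
studies = [
    {"name": "Study A", "effect": 0.30, "se": 0.10},
    {"name": "Study B", "effect": 0.10, "se": 0.15},
    {"name": "Study C", "effect": 0.25, "se": 0.08},
]

weights = [1 / s["se"] ** 2 for s in studies]  # inverse-variance weights
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect = {pooled:.2f} (SE = {pooled_se:.2f})")
```

A random-effects model would additionally allow for heterogeneity between studies, which is the more common choice when interventions and populations differ across studies.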
Rosenbaum, D. P., & Hanson, G. S. (1998). Assessing the effects of school-based drug education: A six-year multi-level analysis of Project D.A.R.E. Journal of Research in Crime and Delinquency, 35(4), 381-412.

Question for Reflection and Discussion
It is commonly believed that organizing visits for high-risk youths will lower juvenile crime rates. Based on such a belief, many District Fight Crime Committees, the Correctional Services Department and NGOs in Hong Kong organize visits for high-risk young people. Do you think such programs are really effective in reducing juvenile crime rates?

• The Drug Abuse Resistance Education Program (D.A.R.E.) reaches 25 million students in the US and has been adopted in 44 foreign countries.
• The evaluation covered 18 pairs of elementary schools in urban, suburban and rural areas of Illinois.
• Schools were matched for school characteristics, with random assignment to conditions.
• Pretest and posttest measurement (over 6 years), with data collected from students and teachers.
• Outcome measures: drug use behavior; onset of alcohol use; general attitudes toward drugs; attitudes toward the use of specific drugs; perceived benefits and costs of using drugs; self-esteem; attitudes toward police; peer resistance skills; school performance; delinquent and violent behavior.
• Multi-level analyses with seven waves of post-treatment data showed that D.A.R.E. had no long-term effect on a wide range of drug use measures.

Cost-Benefit Analysis via a Longitudinal Experimental Study
• Wortman, P.D. (1995). An exemplary evaluation of a program that worked: The High/Scope Perry Preschool Project. Evaluation Practice, 16(3), 257-265.
• Is there a high-quality, randomized evaluative study that shows that preschool education provides these benefits or prevents these negative social consequences?
• What is the real solution to poverty?
• The goal was to provide a successful initial educational experience that would continue into the regular (K-12) school years.
• The program consisted of a 2.5-hour morning session on weekdays and one 1.5-hour "home visit" session per week with the mother and child.
• The program's teaching curriculum followed the principles of the developmental psychologist Jean Piaget and used "active learning", in which the children plan and carry out learning activities (what did Piaget say?).
• Hypothesis: In the long run, such students would do better.
• Hypothesis: They would have fewer placements in special education programs, fewer would be retained in grade (or "held back"), fewer would commit crimes, and more would graduate from high school.

• How much will the program cost, and how much does the American public benefit?
• The first cost-benefit analysis, reported by Gramlich (1986), a professor of economics at the University of Michigan, concluded that "the program is a winner".
• Equally "remarkable", according to Gramlich, nearly 80 percent of the benefits went to the American public, with the program participants getting the remaining 20 percent.
• The second cost-benefit analysis, conducted by Barnett in Schweinhart, Barnes, and Weikart (1993), also provides much more accurate estimates of the program's benefits, especially on crime reduction and income.
• $95,646 net benefit ($108,002 total benefits minus $12,356 in total costs): $19,750 went to the program participants, while the remaining $76,076 was realized by the American public (roughly 20% vs. 80% for the participants and the public).
• For every dollar (of the $12,356) invested in each preschool child, the public will receive $7.16 in return.
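A brief restatement of the arithmetic behind the figures just quoted (only the numbers reported above are used; as stated above, the $7.16 figure describes the return to the public per dollar of program cost):

```latex
\begin{align*}
\text{Net benefit} &= \text{total benefits} - \text{total costs}
                    = \$108{,}002 - \$12{,}356 = \$95{,}646 \\
\text{Public return per dollar} &= \frac{\text{benefits accruing to the public}}{\text{cost per child } (\$12{,}356)}
                    \approx \$7.16 \text{ per } \$1 \text{ invested}
\end{align*}
```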
Methods and Results of the Perry Preschool Evaluation
• Method: Randomized experiment – Findings: immediate, short-term IQ gains; achievement gains; fewer placements in special education programs; higher high school graduation rate.
• Method: Long-term, 22-year follow-up using data archives and home interviews – Findings: fewer lifetime arrests, including adult and drug crimes; shorter probation or parole; higher monthly earnings and home ownership; lower use of social services such as welfare; fewer out-of-wedlock births and abortions.
• Method: Cost-benefit analysis – Findings: $7.16 in benefits for every $1 of cost; 80 percent are societal benefits.
• Method: Causal model – Findings: supports a theory of expectation-achievement-motivation for the benefits.
• Method: Case study – Findings: brief qualitative descriptions of 8 study participants at ages 20-24 and at age 27.

Local Experiences of Evidence-Based Practice
• Few examples – why?
• Project Astro: a drug prevention program for adolescents at high risk for substance abuse.
• Objective, subjective, and qualitative evaluation; quasi-experimental approach.
• Project P.A.T.H.S.
• Multiple evaluation strategies – objective, subjective, qualitative, process and interim evaluation; randomized group trial.

Project P.A.T.H.S.: Randomized Group Trial Involving Experimental and Control Schools
• 24 pairs of experimental and control schools were randomly chosen from the schools participating in the project.
[Figure: all schools joining the project → random assignment → experimental schools and control schools]

Experimental schools
• Participants: Secondary 1 student cohort admitted in Sept. 2006.
• Data collection: Sept. 2006 to August 2011 (2 years' follow-up): 8 waves.
• Program duration: Sept. 2006 to August 2009.
• Content of evaluation: objective outcome evaluation; subjective outcome evaluation; qualitative evaluation; process evaluation.

Control schools
• Participants: Secondary 1 student cohort admitted in Sept. 2006; they WILL NOT join the P.A.T.H.S. Project.
• Data collection: Sept. 2006 to August 2011: 8 waves.
• Program duration: Sept. 2007 to August 2010 (for the Secondary 1 students admitted in Sept. 2007).
• Content of evaluation: objective outcome evaluation.

Example: Chinese Positive Youth Development Scale (CPYDS, 12 Domains), posttest comparison of the experimental and control groups (posttest means: experimental group = 4.319; control group = 4.279)
• 2-way ANOVA (pretest vs. posttest as the within-subjects factor; experimental vs. control as the between-subjects factor; removing the effect of age, school banding and initial positive development): F(1, 5505) = 9.04, p < .005.
• 1-way ANCOVA (DV = posttest scores, adjusting for age, school banding, initial positive development and pretest scores): F(1, 5504) = 8.21, p < .005.
• Linear mixed model (DV = posttest scores, adjusting for age, school banding, initial positive development, pretest scores and the random effect of schools): fixed effect of group, F = 3.36, p < .05 (one-tailed); estimates: experimental = 0, control = -0.041.
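The ANCOVA and linear mixed model summarized above can be expressed compactly in standard statistical software. The sketch below uses Python's statsmodels formula interface; the data file, column names and variable coding are assumptions made for illustration (the covariate for initial positive development is omitted to keep the formula short), not the project's actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per student, with columns
# posttest, pretest, group (0 = control, 1 = experimental), age, banding, school.
df = pd.read_csv("cpyds_scores.csv")  # hypothetical file name

# ANCOVA-style model: posttest scores adjusted for pretest scores and covariates.
ancova = smf.ols("posttest ~ group + pretest + age + banding", data=df).fit()
print(ancova.summary())

# Linear mixed model: the same fixed effects plus a random intercept for schools.
mixed = smf.mixedlm("posttest ~ group + pretest + age + banding",
                    data=df, groups=df["school"]).fit()
print(mixed.summary())
```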
[Figures 1-4: estimated marginal means across Waves 2 to 4 for the experimental and control groups – Figure 1: Psychosocial Competence; Figure 2: Academic and School Competence; Figure 3: Clear and Positive Identity; Figure 4: Delinquent Behavior]

Obstacles to EBP (Gambrill, 2006; Shek, Lam & Tsoi, 2004)
• Preference for authority-based practice
• Non-availability of relevant databases
• Self-deception
• Justificatory approach to knowledge
• Time constraints
• Chinese culture (face, interpersonal harmony)
• Professional culture of social work (non-critical thinking; non-existence of continuing professional education)
Shek, D.T.L., Lam, M.C., & Tsoi, K.W. (2004). Evidence-based practice in Hong Kong. In B. Thyer & M.A.F. Kazi (Eds.), International perspectives on evidence-based practice in social work (pp. 167-181). London: Venture Press.

Rubin and Parrish (2007): Three Challenges
• Apparent disparities in how EBP is being defined
• Maximizing the feasibility of implementing the EBP process after graduation, and not treating authoritative publications as evidence-based materials
• Preventing evidentiary standards from being softened

Rubin and Parrish (2007): Unresolved Challenges
1. Inadequate support by agency superiors and policies
2. Time constraints
3. Reliance on weak studies as evidence-based support
4. Generalizability issues
5. Need for long-term follow-up
6. Erosion of the evidentiary hierarchy
7. Rejection of the evidentiary hierarchy

Hierarchy of Evidence
1. Grade 2 is met, plus evidence of effectiveness when the preventive intervention is implemented in its intended setting with adequate training of personnel and monitoring of implementation and outcomes.
2. Evidence based on multiple well-designed, randomized, controlled trials or multiple well-designed, interrupted time-series experiments that were conducted by two or more independent research teams.
3. Evidence based on multiple well-designed, randomized, controlled trials or multiple well-designed, interrupted time-series experiments that were conducted by a single research team.
4. Evidence based on at least one well-designed, randomized, controlled trial or an interrupted time-series design that was replicated across three cases.
5. Evidence based on a non-equivalent group design (i.e., comparisons between groups that were not effectively randomized to conditions).
6. Evidence based on a pre-post evaluation with no comparison group or multiple posttests (i.e., a pre-experimental design).
7. Evidence based on clinical experience by respected authorities (researchers and practitioners), descriptions of programs, and case reports.
SOURCE: Biglan, A., Mrazek, P.J., Carnine, D., & Flay, B.R. (2003). The integration of research and practice in the prevention of youth problem behaviors. American Psychologist, 58, 433-440.

Rubin and Parrish (2007): Recommendations
• Social work education: defining and strengthening the related EBP components
• Removing obstacles to EBP
• Improving colleagues' understanding of EBP
• Avoiding the softening of evidentiary standards
• Critical appraisal of research studies
• Guidelines for describing the conclusions of research studies

The Quest for Effective Social Work Intervention Programs
Six possible types of programs:
1. Ineffective or harmful intervention
2. Intervention unlikely to be beneficial
3. Intervention with unknown effectiveness
4. Intervention with both benefits and adverse effects
5. Intervention likely to be beneficial
6. Intervention whose effectiveness is supported by clear evidence

Shek, D.T.L. (2008). Enthusiasm-based or evidence-based charities: Personal reflections based on the Project P.A.T.H.S. in Hong Kong. TheScientificWorldJOURNAL, 8, 802-810.