2. Selecting and Implementing an EBP

DECISION TREE

[Decision tree graphic]
EVIDENCE-BASED PRACTICE REGISTRIES
For this project, criteria determined by two evidence-based practice registries were used to identify evidence-based practices.

National Registry of Evidence-based Programs & Practices (NREPP) (www.nrepp.samhsa.gov/)
• Must demonstrate one or more positive outcomes (p ≤ .05) in mental health (which includes suicide) and/or substance use behavior among individuals, communities, or populations (an illustrative example of this statistical criterion follows the two lists below).
• Intervention results must have been published in a peer-reviewed publication or documented in a comprehensive evaluation report.
• Documentation of the intervention and its proper implementation (e.g., manuals, process guides, tools, training materials) must be available to the public to facilitate dissemination.
More information about the review process can be found at www.nrepp.samhsa.gov/review.htm.

Suicide Prevention Resource Center (SPRC) Best Practices Registry (BPR) for Suicide Prevention (www.sprc.org)
• Three expert reviewers conducted the SPRC BPR review process, rating the quality of each intervention based on 10 criteria: theory, intervention fidelity, design, attrition, psychometric properties of measures, analysis, threats to validity, safety, integrity, and utility.
• Programs meeting standards of evidence were classified as Effective or Promising.
  - Effective programs utilized superior evaluation methods to demonstrate a strong causal link between the program and appropriate outcomes.
  - Promising programs were evaluated using less rigorous methods, or demonstrated a moderate causal link between the program and appropriate outcomes.
More information about the process used to evaluate effectiveness can be found at: www.sprc.org/featured_resources/bpr/PDF/ebpp_proj_descrip.pdf.
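To make the p ≤ .05 criterion concrete, the sketch below shows one common way an evaluator might test whether an outcome differs between an intervention group and a comparison group. The data and the choice of a two-sample t-test are purely illustrative assumptions; neither registry prescribes a particular study design or analysis.

```python
# Illustrative only: a two-sample t-test on hypothetical outcome data, showing
# what "one or more positive outcomes (p <= .05)" can look like in practice.
# Neither registry prescribes this (or any particular) analysis.
from scipy import stats

# Hypothetical post-program depression screening scores (lower is better)
intervention = [11, 9, 14, 8, 10, 12, 7, 9, 13, 10]
comparison = [15, 13, 17, 12, 16, 14, 18, 13, 15, 16]

t_stat, p_value = stats.ttest_ind(intervention, comparison)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Meets the p <= .05 criterion:", p_value <= 0.05)
```

In practice, the appropriate test depends on the study design, and the registries also weigh design quality and other criteria, not the p-value alone.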
EVIDENCE
The two checklists may be used to evaluate the quality of the research. They are intended as a guide to evaluating the level of evidence provided. A randomized controlled trial (RCT) is regarded as the gold standard for research findings. All findings, however, should be rigorously evaluated before being accepted as valid.

Often it is not possible to conduct an RCT; in that case, findings from other research designs may also provide evidence of effectiveness. You may need to make a "judgment call" about the quality of the research and the strength of the research outcomes, taking into consideration the questions asked below and the evaluators' own knowledge and experience.

Social work research takes place in varied settings and often involves very complex situations. Therefore, no simple formula exists. Decisions about what makes an EBP should be based on a combination of experience, cultural competency, education, training, and information gathered by using the following questions.
A list of references used to create the checklists—along
with some further reading—appears below.
Resources
Coalition for Evidence-Based Policy. (2006). Which study designs can produce rigorous evidence of program effectiveness? A brief overview. Available online at: www.evidencebasedpolicy.org/docs/RCTs_first_then_match_c-g_studies-FINAL.pdf

Coalition for Evidence-Based Policy. (2007). Checklist for reviewing a randomized controlled trial of a social program or project, to assess whether it produced valid evidence. Available online at: www.evidencebasedpolicy.org/docs/Checklist-For-Reviewing-RCTs-Final.pdf

Institute for the Advancement of Social Work Research [IASWR]. (2007). Partnerships to integrate evidence based mental health practices into social work education and research. Available online at: www.charityadvantage.com/iaswr/EvidenceBasedPracticeFinal.pdf

National Center for Education Evaluation and Regional Assistance. (2003). Identifying and implementing educational practices supported by rigorous evidence: A user friendly guide. Available at: www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf

What Works for Children. (2006). Evidence guide: An introduction to finding, judging and using research findings on what works for children and young people. Available at: www.whatworksforchildren.org.uk/docs/tools/evidenceguide%20june2006.pdf
CORE COMPONENTS
What are Core Intervention Components?
• Specific to the individual program
• Identification should involve careful research and results from replicated implementation
• Program philosophy and values, including guidance on direct services and strategies for fully integrating the program philosophy into actual program operations and service delivery
• Service delivery model and activities, including structure, service duration, setting and staff skills and protocols
• Treatment or service components that promote consistency in service delivery across staff
What are Core Implementation Components?
• Cost of program
• Staff qualifications, experience (selection criteria), and recruitment
• How much training, who delivers it, and in what setting or format
• Coaching and mentoring
• Evaluation
• Administrative structures and processes that facilitate implementation of the program by practitioners and supervisors, such as ensuring adequate time is set aside for staff training, and that trainers and supervisors receive the training and coaching they need
• System-level interventions, such as strategies to ensure the availability of the financial, organizational, and human resources required to support practitioners' work
References
Fixsen, D., Naoom, S. F., Blase, K., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. National Implementation Research Network. Available at: http://nirn.fmhi.usf.edu/resources/publications/Monograph/

Gorman-Smith, D. (2006). How to successfully implement evidence-based social programs: A brief overview for policymakers and program providers. [Working paper.] Coalition for Evidence-Based Policy. Available at: www.evidencebasedpolicy.org/docs/How_to_successfully_implement_eb_progs-final.pdf

Metz, A., Bowie, L., & Blase, K. (2007). Seven activities for enhancing the replicability of evidence-based practices. Child Trends. Available at: www.childtrends.org/files/child_trends-2007_10_01_RB_Replicability.pdf
CORE COMPONENTS CHECKLIST

QUESTIONS TO ASK IN IDENTIFYING CORE COMPONENTS (with space to record responses)

1. Program philosophy and values
• Will this intervention's philosophy or values conflict with the implementation site's existing philosophy and values?
• Will this intervention's philosophy or values conflict with the philosophy or values of concurrent programs at this site?

2. Service delivery
Participants
• Was this intervention developed for use with a specific demographic group?
• Can it be adapted appropriately for use with other groups (e.g., cultural, gender, sexual orientation, age)?
Structure
• What materials are needed?
• How many people can be served during each session or intervention?
Duration
• Does the intervention require a set number of sessions?
• Does it require a specific amount of time per session?
Setting
• Will the intervention or program take place at a particular location?
Community
• Does the intervention fit with the cultural norms of the community?
Staff
• Are staff who provide the intervention required to have specific qualifications (such as a degree or amount of experience)?
• Are these staff members required to have specific training?
• Is a certain number of staff required?
• Do recommended staffing ratios exist (practitioner-to-client, supervisor-to-practitioner)?
Protocols
• Are staff required to go over specific materials or to adhere to a specific script?
Consistency
• What components have been identified that promote consistency across staff?

3. Cost
• How much does it cost to secure the developer's services?
• What is included in that cost?
• If the intervention is not affordable, is there a way to implement only part of the intervention?
• Do costs include salaried positions? In-kind costs? Special equipment?

4. Training
• How many hours of training will be needed?
• Who will deliver the training?
• What format will be used to deliver training (online, in person, manual)?
• Will it utilize a “train the trainers” model?
• Where will training be delivered?
• Is training ongoing?

5. Adaptable or discretionary program components
• Can elements of the program be adapted?
• What elements of the program are considered adaptable or discretionary?

6. Fidelity
• What instruments are available to assess practitioners’ adherence to—and competence in—the intervention’s core components?
• What tests have been done to ensure the fidelity instruments’ validity and reliability?
References
The above checklist was created based upon the
following publications:
Metz, A., Bowie, L., & Blase, K. (2007). Seven activities for enhancing the replicability of evidence-based practices. Child Trends. Available at: http://www.childtrends.org/files/child_trends-2007_10_01_RB_Replicability.pdf

Education Development Center. Designing and implementing programs based on research. Available at: http://notes.edc.org/HHD/MSC/mscres.nsf/e8607b6263a685788525686d005eedf2/7de12bfe43d4a655852568e80054284a/$FILE/Topic6-Research.doc

Gorman-Smith, D. (2006). How to successfully implement evidence-based social programs: A brief overview for policymakers and program providers. Coalition for Evidence-Based Policy (working paper). Available at: http://www.evidencebasedpolicy.org/docs/How_to_successfully_implement_eb_progs-final.pdf

Fixsen, D., Naoom, S. F., Blase, K., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. National Implementation Research Network. Available at: http://nirn.fmhi.usf.edu/resources/publications/Monograph/
ORGANIZATIONAL CLIMATE
The following checklists are intended to gauge the climate of an agency.
The organizational climate checklist and community climate checklist examine how well the selected EBP will fit with programs the agency is already implementing, and help determine whether key staff members and community members are likely to buy into and accept the program.

These scales are adapted from a feasibility measure created for the field of education. The original measure provided a weighted scale for assessing each of these variables (an illustrative sketch of such a weighted scale follows this note). Again, because both these suicide prevention programs and the settings in which they may be implemented vary widely, the checklist does not provide instruction in how to weigh each of these factors. Instead, each agency should use the following questions as prompts to discover areas in which adjustments are necessary before moving forward. The questions will also highlight areas of strength that should be capitalized upon. In certain instances, some of the questions may not apply.
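As noted above, the original education feasibility measure weighted each factor, while the checklist in this toolkit deliberately does not. For readers who want a sense of what such a weighted scale might look like, here is a minimal sketch. The factor names, weights, and 1-5 ratings are hypothetical assumptions, not part of the original measure or of this toolkit's checklist.

```python
# Hypothetical sketch of a weighted feasibility scale of the kind adapted here.
# Factors, weights, and the 1-5 rating scale are illustrative assumptions only;
# the checklist in this toolkit intentionally does not prescribe weights.

# Agency-assigned importance weights (should sum to 1.0)
weights = {
    "fit_with_existing_programs": 0.30,
    "staff_buy_in": 0.25,
    "community_acceptance": 0.25,
    "resource_availability": 0.20,
}

# Ratings from the planning team: 1 (poor fit) to 5 (excellent fit)
ratings = {
    "fit_with_existing_programs": 4,
    "staff_buy_in": 3,
    "community_acceptance": 5,
    "resource_availability": 2,
}

feasibility_score = sum(weights[factor] * ratings[factor] for factor in weights)
print(f"Weighted feasibility score: {feasibility_score:.2f} out of 5")
```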
Resources
The checklist was adapted from:
Education Development Center. Designing and implementing programs based on research. Available at: http://notes.edc.org/HHD/MSC/mscres.nsf/e8607b6263a685788525686d005eedf2/7de12bfe43d4a655852568e80054284a/$FILE/Topic6-Research.doc
SITE READINESS
EVIDENCE-BASED PRACTICE ATTITUDE SCALE
EBPAS © Gregory A. Aarons, PhD

The following questions ask about your feelings about using new types of therapy, interventions, or treatments. Manualized therapy refers to any intervention that has specific guidelines and/or components that are outlined in a manual and/or that are to be followed in a structured/predetermined way. Fill in the circle indicating the extent to which you agree with each item, using the following scale:

[Response scale and scale items appear in the original instrument.]

Source: Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale. Mental Health Services Research, 6(2), 61–74.
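The sketch below shows one way an agency might tabulate responses to an attitude scale of this kind. The 0-4 response range and the grouping of items into domains are assumptions made for illustration only; scoring of the actual EBPAS should follow Aarons (2004), cited above.

```python
# Illustrative only: tabulating Likert-type responses to an attitude scale.
# The 0-4 response range and the item-to-domain grouping below are assumptions
# for demonstration; consult Aarons (2004) for the published EBPAS scoring.
from statistics import mean

# One provider's responses, keyed by item number (hypothetical data)
responses = {1: 3, 2: 4, 3: 2, 4: 1, 5: 3, 6: 4, 7: 2, 8: 3}

# Hypothetical grouping of items into attitude domains
domains = {
    "openness_to_new_practices": [1, 2, 5, 6],
    "perceived_divergence_from_usual_care": [3, 4, 7, 8],
}

for domain, items in domains.items():
    score = mean(responses[i] for i in items)
    print(f"{domain}: mean = {score:.2f} (0 = not at all, 4 = very great extent)")
```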
• Innovators are those who have curiosity and keen interest in new technologies.
• Early adopters are more cautious than innovators but are still willing to experiment when a new technology shows promise.
• The early majority is still more cautious, waiting until some promise of a technology is shown through the experiences of early adopters.
• The late majority is even more cautious and risk averse, waiting until there is ample evidence that the risks of adopting the technology are low and benefits are evident.
• Laggards are unlikely to adopt a new technology unless it is absolutely necessary or an extremely convincing case has been made for adoption.
PROGRAM EVALUATION
Selecting and Implementing Evidence-Based Practices
References and Resources
Aarons, G. A. (2004). Mental Health Provider Attitudes Toward Adoption of Evidence-Based Practice: The Evidence-Based Practice Attitude Scale. Mental Health Services Research, 6(2), 61–74.
Aarons, G. A. (2005). Measuring Provider Attitudes Toward Evidence-Based Practice: Consideration of Organizational Context and Individual Differences. Child and Adolescent Psychiatric Clinics of North America, 14(2), 255–viii.
Coalition for Evidence-Based Policy (2006). Which Study Designs Can Produce Rigorous Evidence of
Program Effectiveness? A Brief Overview. Retrieved May 21, 2008, from
http://www.evidencebasedpolicy.org/docs/RCTs_first_then_match_c-g_studies-FINAL.pdf
Coalition for Evidence-Based Policy (2007). Checklist for Reviewing a Randomized Controlled Trial of a
Social Program or Project, to Assess Whether It Produced Valid Evidence. Retrieved May 21, 2008,
from
http://www.evidencebasedpolicy.org/docs/Checklist-For-Reviewing-RCTs-Final.pdf
Drake, R. E., Goldman, H. H., Leff, H. S., Lehman, A. F., Dixon, L., Mueser, K. T., & Torrey, W. C. (2001). Implementing evidence-based practices in routine mental health service settings. Psychiatric Services, 52, 179–182.
Education Development Center. Designing and Implementing Programs Based on Research. Retrieved May 21, 2008, from http://notes.edc.org/HHD/MSC/mscres.nsf/e8607b6263a685788525686d005eedf2/7de12bfe43d4a655852568e80054284a/$FILE/Topic6-Research.doc
Fixsen, D., Naoom, S. F., Blase, K., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. National Implementation Research Network.
Gorman-Smith, D. (2006). How to Successfully Implement Evidence-Based Social Programs: A Brief Overview for Policymakers and Program Providers. Coalition for Evidence-Based Policy (working paper).
Metz, A., Bowie, L., & Blase, K. (2007). Seven Activities for Enhancing the Replicability of Evidence-Based Practices. Child Trends.
National Registry of Evidence-based Programs and Practices. Questions You Might Want To Ask a
Developer As You Explore the Possible Use of an Intervention. Retrieved June 18, 2008, from
http://www.nrepp.samhsa.gov/pdfs/questions-for-developers.pdf
IASWR (2007). Partnerships to Integrate Evidence Based Mental Health Practices into Social Work
Education and Research. Retrieved May 28, 2008, from
http://www.charityadvantage.com/iaswr/EvidenceBasedPracticeFinal.pdf
U.S. Department of Education Institute of Education Sciences National Center for Education Evaluation
and Regional Assistance (2003). Identifying and Implementing Educational Practices Supported By
Rigorous Evidence: A User Friendly Guide. Retrieved May 21, 2008, from
http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf
What Works for Children (2006). Evidence Guide: An introduction to finding, judging and using research
findings on what works for children and young people. Retrieved May 21, 2008, from
http://www.whatworksforchildren.org.uk/docs/tools/evidenceguide%20june2006.pdf
Additional Resources
Evidence-Based Practice: Registries and Databases
California Evidence-Based Clearinghouse for Child Welfare
(http://www.cachildwelfareclearinghouse.org/)
The Campbell Collaboration: C2-Ripe Library
(http://www.campbellcollaboration.org/frontend.aspx)
Center for the Study and Prevention of Violence
(http://www.colorado.edu/cspv/blueprints)
The Cochrane Library
(http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME)
The Evaluation Center’s EBP Metabase
(http://www.tecathsri.org/ebp_search.asp?stmode=start)
Evidence-Based Behavioral Practice (EBBP)
(www.ebbp.org)
Matrix of Children’s Evidence-Based Interventions
(http://www.nri-inc.org/reports_pubs/2006/EBPChildrensMatrix2006.pdf)
National Cancer Institute: Research-Tested Intervention Programs
(http://rtips.cancer.gov/rtips/index.do)
Preventing Drug Abuse among Children and Adolescents: Examples of Research-Based Drug Abuse
Prevention Programs
(http://www.nida.nih.gov/Prevention/examples.html)
SAMHSA’s National Registry of Evidence-Based Programs and Practices
(http://www.nrepp.samhsa.gov/)
Social Programs That Work
(http://www.evidencebasedprograms.org/)
State of Hawaii: Effective Psychosocial Interventions for Youth with Behavioral and Emotional Needs
(http://hawaii.gov/health/mental-health/camhd/library/pdf/ebs/ebs012.pdf)
Western CAPT Best and Promising Practices Database
(http://casat.unr.edu/bestpractices/search.php)
Online Resources and Research
Articles for the Symposium: Improving the Teaching of Evidence-Based Practice
http://www.utexas.edu/ssw/ceu/practice/articles.html
Center for Evidence-Based Practice: Young Children with Challenging Behavior
(http://challengingbehavior.fmhi.usf.edu)
Child Trends
(http://www.childtrends.org)
ClinicalTrials.gov
http://www.nlm.nih.gov/pubs/factsheets/clintrial.html
EBP Exchange - UMB School of Social Work
http://ebpexchange.wordpress.com/
The e-Community Forums at The Evaluation Center @ Human Services Research Institute
http://www.tecathsri.org/bb/index.php
Evidence-Based Practice and Cultural Competence in Child Welfare
http://ssw.cehd.umn.edu/EBP-CulturalCompetence.html
Evidence-Based Practice Implementation Resource Kits for Mental Health
http://mentalhealth.samhsa.gov/cmhs/communitysupport/toolkits/about.asp
MedlinePlus
http://medlineplus.gov/
The National Implementation Research Network
http://nirn.fmhi.usf.edu/aboutus/01_whatisnirn.cfm
The Need for an Evidence-Based Culture: Lessons Learned From Evidence-Based Practices
Implementation Initiatives
Vijay Ganju, Ph.D.
http://www.nri-inc.org/reports_pubs/2006/EBPCultureImpLessons2006.pdf
Organizational Readiness for Change and Opinions Toward Treatment Innovations
http://www.oregon.gov/DHS/mentalhealth/ebp/org-readiness4change.pdf
Planning and Budgeting Evidence-Based Practices for Mental Health and Substance Abuse Disorders
http://www.nri-inc.org/reports_pubs/2004/EBPPlanBudgetBroskowski2004.pdf
Position Paper on Native American Treatment Programs and Evidence-Based Practices
http://www.oregon.gov/DHS/mentalhealth/ebp/native-american-trtmntpro-ebp.pdf
PubMed: Medline
http://www.nlm.nih.gov/pubs/factsheets/pubmed.html
The Role Of Leadership In Mental Health System Transformation
Jeanne C. Rivard, Ph.D.
http://www.nri-inc.org/reports_pubs/2006/EBPLeadershipMHRivard2006.pdf
SAMHSA: A Guide To Evidence-Based Practices (EBP) on The Web
http://www.samhsa.gov/ebpwebguide/index.asp