PowerPoint - Childhood Development Initiative

Implementing What Works
Daniel Perkins & Brian Bumbarger
Pennsylvania State University
Importance of Research Standards
There could be no wiser investment in our
country than a commitment to foster the
prevention of mental disorders [or problem
behaviors] and the promotion of mental
health through rigorous research with the
highest of methodological standards. Such a
commitment would yield the potential for
healthier lives for countless individuals and
the general advancement of the nation's
well-being.
Institute of Medicine, 1994
What are Evidence-based Programs?
The Gold Standard
• Strong evidence of effectiveness
– Randomized controlled trials, well designed
and implemented
– Trials showing effectiveness in two or more
settings, including a setting similar to that of the
school/classroom implementing the program
(at least 300 students or 50-60 classrooms)
Quality + Quantity = “Strong” Evidence
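To make the rule above concrete, here is a minimal sketch (not part of the original slides) that encodes the quality-plus-quantity criteria as a simple check; the field names and thresholds are assumptions drawn from the bullets above.

```python
# Illustrative sketch: the "quality + quantity = strong evidence" rule as a
# boolean check. Field names and thresholds are assumptions based on the
# criteria listed on this slide.
from dataclasses import dataclass

@dataclass
class EvidenceRecord:
    well_designed_rcts: int   # count of well-designed, well-implemented RCTs
    effective_settings: int   # distinct settings showing effectiveness
    similar_setting: bool     # includes a setting like the adopting school/classroom
    total_students: int       # combined sample size across trials
    total_classrooms: int

def meets_gold_standard(e: EvidenceRecord) -> bool:
    quality = e.well_designed_rcts >= 1
    quantity = (
        e.effective_settings >= 2
        and e.similar_setting
        and (e.total_students >= 300 or e.total_classrooms >= 50)
    )
    return quality and quantity

print(meets_gold_standard(EvidenceRecord(2, 3, True, 450, 20)))  # True
```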
Why is a Randomized Clinical Trial (RCT)
Convincing?
• We know unequivocally if a program is effective
– Not due to pre-test differences
– Not due to other changes that might explain effects
• Replication of effects using RCTs greatly increases
confidence that the program causes the changes
• Examples: Nurse Family Partnership Program; PATHS;
Life Skills Training; SFP 10-14; Early Head Start
What is Convincing?
The choice of research methodologies is a
major issue in examining preventive
interventions and research trials designed to
determine their outcomes. It determines
whether evidence is compelling. The ideal
design is a randomized controlled trial.
Institute of Medicine, 1994
RCTs, Service, & Ethics
• Tension between rigorous science and providing
services
– Without evidence of positive impacts, we cannot be certain that
what we are doing is working or, worse, that it is not causing harm.
– Not providing services to all feels like we are not being truthful to the
community.
• Agencies that provide innovative programs almost
uniformly believe that their programs work
• There is a need for a clear framework of
accountability
• The answer is a compromise: provide the best
service while doing the most rigorous science
possible.
Why is a Randomized Clinical Trial Not
Sufficient by Itself?
• There is a need for replication
• There is a need to show effects across different
populations
– Ethnicity, urban/rural, levels of education, types of
communities
• There is a need for a carefully developed set of training
procedures to ensure fidelity when disseminated
• There is a need to learn how to flexibly adapt some
aspects of the model to the “culture” of different
communities
In the past…
• 20 years ago, there were NO empirically validated prevention programs
• Efforts were guided primarily by “good
intentions” and “gut instinct”
• Hundreds of millions of dollars were spent
without any accountability
• Prevention was considered more “art” than
“science”
Now…
• Two decades of rigorous scientific research have informed
our knowledge of epidemiology, etiology, methodology, and
prevention practice
• We have learned more about what causes youth problem
behaviors, what works to prevent them, and how to promote
positive youth development in the last 20 years than we did
in the previous 200 years
• We have tested theories of change (the public health model)
that guide our programs
• Today, there are many programs that have been proven
effective in well-designed studies and have been
independently replicated
• There is clearly a “science” of prevention!
Why Evidence-based Programs?
• Required use of “scientifically-based
research” to decide which
interventions to use and which
will be funded
• Accountability
• To ensure the smart use of LIMITED
resources
When are Evidence-based
Programs Needed?
• When you want to increase the likelihood that
your program will have the expected (long-term) impacts.
• When there is support from collaborators to implement an
evidence-based program with rigor. (Evidence-based programs
take a lot of time to implement if done right.)
The Impact of Programs that Work
modelprograms.samhsa.gov
• Life Skills Training cut tobacco, alcohol, and marijuana use
50% - 75%
• Nurse Home Visitation reduced alcohol use by 56% in
children 15 years after the intervention
• Project TND found a 26% reduction in regular hard drug
use
• All-Stars reduced poly-drug use 40-60% at immediate posttest
• Project Alert reduced marijuana use initiation by 30% and
regular marijuana use by 60%
Prevention is Cost-effective
www.wa.gov/wsipp
(measured benefits and costs per youth)
Program                          Benefits    Costs     B - C
Nurse Home Visitation*           $26,298     $9,118    $17,180
Guiding Good Choices*            $7,605      $687      $6,918
Strengthening Families 10-14*    $6,656      $851      $5,805
Project Northland*               $1,575      $152      $1,423
LifeSkills Training*             $746        $29       $717
Project TND*                     $279        $5        $274
All-Stars*                       $169        $49       $120
Functional Family Therapy*       $16,455     $2,140    $14,315
Multisystemic Therapy*           $14,996     $5,681    $9,316
DARE                             $0          $99       -$99
Intensive Supervision Probation  $0          $1,482    -$1,482
Scared Straight                  -$11,002    $54       -$11,056
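The net-benefit column is simply benefits minus costs per youth. The short Python sketch below reproduces that arithmetic and ranks the programs; the dollar figures are the per-youth estimates from the table above, and the variable names are my own.

```python
# Net benefit per youth = benefits - costs, using the WSIPP per-youth
# estimates shown in the table above (www.wa.gov/wsipp).
programs = {
    "Nurse Home Visitation": (26_298, 9_118),
    "Guiding Good Choices": (7_605, 687),
    "Strengthening Families 10-14": (6_656, 851),
    "Project Northland": (1_575, 152),
    "LifeSkills Training": (746, 29),
    "Project TND": (279, 5),
    "All-Stars": (169, 49),
    "Functional Family Therapy": (16_455, 2_140),
    "Multisystemic Therapy": (14_996, 5_681),
    "DARE": (0, 99),
    "Intensive Supervision Probation": (0, 1_482),
    "Scared Straight": (-11_002, 54),
}

# Rank programs by net benefit (benefits - costs) per youth.
for name, (benefits, costs) in sorted(
    programs.items(), key=lambda kv: kv[1][0] - kv[1][1], reverse=True
):
    print(f"{name:32s} ${benefits - costs:>8,}")
```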
…Still more work to do
• Most prevention programs being used are not
evidence-based interventions (EBIs)
• Research has shown that most are not being
implemented with fidelity
• There is tension between rigorous research designs
and providing services to all
Next Steps With Research-Based Programs:
Ensure Implementation Quality
When Communities Adopt Research-Based Programs
the Central Concerns Are:
• Maintaining High Fidelity
• Understanding What Factors Influence Implementation
Quality
This Leads to a New Generation of Research Questions:
– What factors influence the quality of
implementation?
– How does implementation quality affect outcomes?
“Our work must
emphasize deliberate
investment in positive factors
that research has shown to
be closely tied to reduced
levels of negative behaviors
as well as
increased levels of
thriving [resiliency] attitudes
and behaviors.”
(Blyth, 2000)
Replication of Effective Programs:
PA’s Blueprints Initiative
Year    Sites   Programs
1998    12      17
1999    8       9
2000    21      24
2001    19      21
2002    21      24
2005    14      24
2006    14      15
2007    15      15
Total   124     149
Risk-focused Prevention Planning
(the CTC model)
1. Collect local data on risk and protective factors
2. Use the data to identify priorities
3. Select and implement evidence-based programs
that target those factors
4. Re-assess the prevalence of risk and protective factors
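As a rough illustration of this cycle, the sketch below walks through steps 1-3 with hypothetical local data: prioritize the most prevalent risk factors, then select programs that target them. The factor names, prevalence figures, and program-to-factor mappings are invented for illustration and are not CTC data.

```python
# Hypothetical sketch of the CTC planning cycle: local risk-factor data in,
# prioritized factors and candidate evidence-based programs out.
local_risk_data = {          # step 1: local prevalence estimates (illustrative)
    "family conflict": 0.42,
    "early initiation of drug use": 0.35,
    "low school commitment": 0.28,
}

program_targets = {          # illustrative program-to-factor mapping
    "Strengthening Families 10-14": {"family conflict"},
    "LifeSkills Training": {"early initiation of drug use"},
    "PATHS": {"low school commitment"},
}

# Step 2: prioritize the two most prevalent risk factors.
priorities = set(sorted(local_risk_data, key=local_risk_data.get, reverse=True)[:2])

# Step 3: select programs whose targets overlap with the priorities.
selected = [p for p, targets in program_targets.items() if targets & priorities]
print("Priority factors:", priorities)
print("Candidate programs:", selected)
# Step 4 (re-assessment) would repeat the data collection after implementation.
```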
Issues & Challenges
• Readiness/Program Selection
• Understanding of program and what is required
• Buy-in of key stakeholders
• Training availability/access
• Cost, timeliness and turnover
• Fidelity
• Ongoing technical assistance (TA)
• Monitoring/measurement
• Measurement of program impact
• Sustainability
Why does fidelity matter?
• Research has clearly linked fidelity with
positive outcomes
• Higher fidelity is associated with better
outcomes across a wide range of programs
and practices (PATHS, MST, FFT, TND,
LST and others)
• Fidelity enables us to attribute outcomes
to the intervention, and provides
information about program feasibility
The reality….
• While possible, fidelity is not a naturally
occurring phenomenon – adaptation (more
accurately program drift) is the default
• Most adaptation is reactive rather than
proactive
• Most adaptation weakens rather than
strengthens the likelihood of positive
outcomes
Adaptation happens…
• Between 23% and 81% of program
activities may be omitted during
implementation (Durlak, 1998)
• Only 19% of schools implement research-based
curricula with fidelity (Hallfors & Godette, 2002)
• Only about 75% of students received
60% or more of the Life Skills Training
program (Botvin et al., 1995)
Adaptation as a Function of Training
(formal training by the developer)
[Bar chart: percent of the program delivered with fidelity versus adapted,
compared for implementers who received formal developer training and those
who did not.]
Is adaptation
inevitable/necessary?
• Research shows that a high degree of
fidelity is attainable (Project TND,
PROSPER, Blueprints)
• There is little empirical support for
cultural adaptation of EBPs
– Most have shown similar effects across
gender, ethnicity/race, SES
– Studies of prospective cultural
adaptations have failed to yield positive
outcomes
Improving fidelity locally
• What gets measured matters
• Improve practitioner knowledge of
prevention science
• Use adaptation discussion as a tool for
training on the logic model of an
intervention
• Build a sustainable infrastructure for
monitoring implementation fidelity and
quality
• Build internal capacity AND desire
Building internal capacity and
motivation
• Approach fidelity from a practical,
accountability perspective – don’t make it a
research issue
• The goal is to develop local intrinsic
motivation for monitoring fidelity and quality
of program delivery – it must be tied to
outcomes
• Involve local practitioners/implementers in
the development and conduct of evaluation
– Process evaluation is fidelity monitoring
Practical strategies
• Peer coaching, peer observation
• Schedule regular opportunities for
reflective practice and de-briefing
• Never let the initial training be the only
training
• Data in must ALWAYS require data out – create
feedback loops and safe environments for reflection
(see the monitoring sketch after this list)
• Foster internal competition
• Emphasize the importance of a clear
understanding of a program’s logic model
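One way to make "data in, data out" concrete is a simple local record of observations with a per-implementer feedback summary. The sketch below is a minimal hypothetical structure, not a prescribed tool; all names and values are illustrative.

```python
# Hypothetical fidelity-monitoring sketch: each observation records which
# lesson was delivered, the fraction of core activities completed, and a
# quality rating; the summary is the "data out" that closes the feedback loop.
from collections import defaultdict
from statistics import mean

observations = [
    {"implementer": "Teacher A", "lesson": 3, "core_activities_done": 0.9, "quality": 4},
    {"implementer": "Teacher A", "lesson": 4, "core_activities_done": 0.7, "quality": 3},
    {"implementer": "Teacher B", "lesson": 3, "core_activities_done": 0.5, "quality": 2},
]

by_implementer = defaultdict(list)
for obs in observations:
    by_implementer[obs["implementer"]].append(obs)

# Feedback report: average adherence and quality per implementer.
for name, obs_list in by_implementer.items():
    adherence = mean(o["core_activities_done"] for o in obs_list)
    quality = mean(o["quality"] for o in obs_list)
    print(f"{name}: adherence {adherence:.0%}, quality {quality:.1f}/5")
```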
Where to find evidence-based interventions
• The What Works Clearinghouse (http://www.w-w-c.org/)
• SAMHSA National Registry of Effective Prevention Programs
(http://www.modelprograms.samhsa.gov)
• The Promising Practices Network (http://www.promisingpractices.net/)
• Blueprints for Violence Prevention
(http://www.colorado.edu/cspv/blueprints/index.html)
• The International Campbell Collaboration
(http://www.campbellcollaboration.org/Fralibrary.html)
• Safe and Sound: An Educational Leader’s Guide to Evidence-Based Social and Emotional
Learning Programs (http://www.CASEL.org)
• Social Programs that Work
(http://www.excel.gov.org/displayContent.asp?Keyword=prppcSocial)
• Centers for Disease Control Effective Programs
(http://www.cdc.gov/healthyYouth/partners/registries.htm)
When one has no stake in the way things
are, when one’s needs and opinions are
provided no forum, when one sees oneself
as the object of unilateral actions, it
takes no particular wisdom to suggest
that one would rather be elsewhere.
-S. Sarason, 1990
Successful Community Engagement
• Use data about the strengths and needs of the community to inform your
selection strategies
• Agency and staff buy-in is critical
– Participation in decision-making and understanding of the overall logic model
• Support for community member engagement on the
management teams
– An advisory board that engages the support of local champions and
community leaders
– Opportunities for community members to complete meaningful tasks
• Ongoing communication among staff and agencies
– Learning communities
– Recognize successes
• Social marketing strategy to obtain the support of
citizens (timing matters)
Critical Elements of Youth Involvement on
Management Teams
• Adult support
• Youth-friendly environment
• Opportunities to complete meaningful tasks
• Opportunities to learn and use new skills.
We are guilty of many errors and many faults,
but our worst crime is abandoning the children,
neglecting the fountain of life.
Many of the things we need can wait.
The child cannot. Right now is the time his bones
are being formed, his blood is being developed.
To him we cannot answer ‘Tomorrow.’
His name is ‘Today.’
Gabriela Mistral, Nobel Prize-winning Poet