jjdpc presentation - EPISCenter

Evidence-Based Prevention and
Intervention Support Center
(EPISCenter)
Training and Technical Assistance Supporting
Evidence-based Prevention and Intervention,
and Juvenile Justice System Improvement
September 9, 2014
Presentation Outline
1. PCCD’s Approach to Prevention
2. Implementation Quality & Monitoring
3. The Role of Technical Assistance
   • Strategic Coordination
   • PAYS, CTC, EBPs, EBIs
   • Systems for Data Collection & Reporting
   • Virtual Technical Assistance
4. Outcomes of a State Agency & University Partnership
Key Aspects and Activities
1. Using data to identify community risk and needs (CTC)
2. Identifying local services to match those needs (CTC)
3. Assessing additional programmatic needs: CTC, EBPs, fit & feasibility
4. High-fidelity, quality implementation of EBPs
5. Roll-up and tracking of implementation and outcomes data
6. Estimating return-on-investment/cost-benefit analysis
7. Supporting sustainability from seed grants to post-funding
8. Developing statewide capacity for prevention across all levels
EPISCenter: Initiatives and Goals
Multi-Agency Steering Committee (Justice, Welfare, Education, Health)
Intermediary and State-level Prevention Support System
• Support to Community Prevention Coalitions
• Support to Evidence-based Prevention & Intervention Programs
• Improve Quality of Local Innovative Programs and Practices
Goals: Implementation Quality, Broad-scale Dissemination, Long-term Sustainability
Resource Center Mission: To support the proliferation of quality prevention and intervention programs aimed at promoting positive youth development and preventing violence, delinquency, substance abuse, and other problem behaviors in children and adolescents.
PART I
PCCD’S APPROACH TO
PREVENTION:
A RESEARCH-INFORMED
STRATEGY
Moving From Prevention Science . . .
Problem → Response:
Define the Problem → Identify Risk & Protective Factors → Develop & Test Interventions → Implement & Evaluate Programs
Prevention Science Background…
We know a great deal about how youth problems develop,
and how to effectively prevent them (& reduce prevalence)
• Known risk & protective factors
• Multiple domains of influence
– Community, family, school, peer, individual
• Multifinality and equifinality
– A public health approach to public safety
• Different trajectories (early vs. late starters)
• Criminogenic impact of intervention
The Continuum of Confidence
Programs/services can be placed along a continuum of confidence based on their evidence or theory (Bumbarger & Rhoades, 2012).
How confident are we that this program or practice is a good use of resources AND improves outcomes for children and families?
From HARMFUL (very confident) to EFFECTIVE (very confident):
• Iatrogenic (Harmful): “This program has been rigorously evaluated and shown to be harmful”
• Ineffective: “This program has been evaluated and shown to have no positive or negative effect”
• Best Practices (confidence unknown): “We’ve done it and we like it”
• Research-based: “This program is based on sound theory informed by research”
• Promising Approaches: “We really think this will work… but we need time to prove it”
• Evidence-based: “This program has been rigorously evaluated and shown to work”
…To Prevention Service
Response:
Provide Technical Assistance → Set & Collect Performance Measures → Monitor Quality of Program Implementation → Assess Public Health Impact
Implementation Science Background…
We also know a great deal about factors that influence
quality of program implementation
• Champions
• Stakeholder buy-in
• Program credibility
• Site’s capacity for implementation
• Strong site coordinators
• Training in the program model
• Implementer confidence in delivery
• Proactivity and responsiveness of technical assistance
Pennsylvania’s EBP Dissemination Model
Building blocks:
• PA Youth Survey (1989) +
• CTC prevention infrastructure & prioritized risk/protective factors (1994) +
• Targeted support for selected EBPs (1998/2001) +
• Technical assistance to promote quality, dissemination, and sustainability (2001/2008) =
• Population-level impact (2003/2005)
Goals:
• Prevent dependency, delinquency, and ATOD use to the greatest degree possible (primary prevention)
• Intervene effectively with youth for whom primary prevention is not sufficient
• Allow communities the flexibility to select strategies that best meet local needs
• Create community-level infrastructure for strategic prevention planning and coordination
• Provide accountability and use scarce resources efficiently
ULTIMATELY… to “move the needle” on key indicators of (behavioral) health at the POPULATION level
Multifinality Key Point: When we target risk factors (the underlying causes of behavior), we address more than one problem behavior.
Equifinality Key Point: There are multiple routes to address a targeted behavior problem.
[Diagram: Risk Factors (Causes) → Adolescent Problem Behaviors (Outcomes)]
Implications: State and local agencies interested in efficiently and effectively addressing youth problem behaviors should collectively focus on the underlying risk and protective factors that drive common and shared problems.
[Diagram: Risk Factors (Causes) → Adolescent Problem Behaviors (Outcomes)]
The Pennsylvania Youth Survey (PAYS):
PA’s Essential Tool for Prevention Planning
• Measures risk and protective factors across multiple domains.
• A voluntary survey conducted in schools every other year for
youth in 6th, 8th, 10th, and 12th grades.
• Adapted from the Communities That Care Youth Survey, with additional
questions on gambling, prescription drug abuse, other antisocial behaviors, and experience of trauma and grief.
• All CTC Sites are essentially required to use it, and many additional
schools volunteer to participate.
• 2013 PAYS: 200,000+ youth, 335 school districts, 70 other schools
Creating Fertile Ground for EBPs:
The Role of PAYS in Data-Informed Prevention Planning
(The Communities That Care model)
Form local coalition of key stakeholders → Collect local data on risk and protective factors → Use data to identify priorities → Select and implement evidence-based programs that target those factors → Re-assess risk and protective factors → (repeat)
This cycle leads to community synergy and focused resource allocation.
A Review of Research Findings on PA Coalitions
• Connection to coalitions increases the likelihood of program
sustainability two years after funding ends; CTC-connected programs are more
likely than non-CTC programs to be sustained.
• Connection to CTC coalitions reduces the likelihood of making
program adaptations over time (4-year longitudinal study).
• Knowledge of the CTC model significantly predicts a coalition
providing downstream support for evidence-based
programming.
• EBP implementers report CTC assistance in:
• Mobilizing the community
• Supporting actual prevention efforts
• Promoting evidence-based programming
• Assisting in impact evaluation
5-Year Longitudinal Study of PA Youth
• 419 age-grade cohorts over a 5-year period
Youth in CTC communities with EBPs showed:
• Lower rates of delinquency
• Greater resistance to negative peer influence
• Stronger school engagement
• Better academic achievement
[Chart: % change for CTC/EBP youth relative to the comparison group — Academic Performance +33.2, School Engagement +16.4, Delinquency −10.8, Negative Peer Influence −10.8]
Feinberg, M. E., Jones, D., Greenberg, M. T., Osgood, W. D., & Bontempo, D. (2010). Effects of the Communities That Care model in Pennsylvania on change in adolescent risk and problem behaviors. Prevention Science, 11, 163-171.
Impact on Juvenile Court Placement Rates:
Comparison of Placement Rates for Counties* With and Without an EBI
[Chart: juvenile court placement rates by year, 2007-2010, for counties with no EBI versus counties that adopted an EBI; counties that adopted an EBI show lower placement rates over the period.]
Bumbarger, B. K., Moore, J., & Rhoades, B. (2010). Impact of evidence-based interventions on delinquency placement rates. Presentation at the 2011 Society for Prevention Research annual meeting, Washington, DC.
The Big Picture:
Evidence-based Programs are a wise
investment of state resources
• Communities with EBPs embedded in the context of CTC
have lower levels of delinquency and youth drug use*
• EBPs in PA produce an overall return of $5 for every $1
invested – a statewide return measured in hundreds of
millions**
• Conservative estimates of the CTC model demonstrate a $5 return per
$1 invested; more realistic estimates indicate a $10 return***
* Feinberg, M.E., Jones, D., Greenberg, M. T., Osgood, W. D., & Bontempo, D. (2010). Effects of the Communities that Care
model in Pennsylvania on change in adolescent risk and problem behaviors. Prevention Science, 11, 163-171.
** Jones, D., Bumbarger, B., Greenberg, M., Greenwood, P., and Kyler, S. (2008). The Economic Return on PCCD’s
Investment in Research-based Programs: A cost-benefit assessment of delinquency prevention in Pennsylvania. Prevention
Research Center, Penn State University.
*** Kuklinski, M. R., Briney, J. S., Hawkins, J. D., & Catalano, R. (2012). Cost-benefit analysis of Communities that Care
outcomes at eighth grade. Prevention Science, 13, 150-61.
PART II
IMPLEMENTATION
QUALITY & MONITORING
Basis for “Evidence” in EBPs
• Deterrent effect with a strong research design
• Sustained effect
• Multiple site replication
Deviating from Program Design
• “When communities ‘tweak’ a program to suit their own
preferences or circumstances, they wind up with a
different program whose effectiveness is unknown.”
Blueprints for Healthy Youth Development
www.colorado.edu/cspv/blueprints
The Connection Between Implementation & Outcomes
Berkel, C., Mauricio, A. M., Schoenfelder, E., & Sandler, I. N. (2011). Putting the pieces together: An integrated model of program implementation. Prevention Science, 12(1), 23-33.
Barriers to Fidelity
• Unforeseen challenges (time, resources, population access)
• Lack of understanding of the program’s underlying theory
• Implementers lack necessary skills
• Programs that are not “user friendly”
• Lack of administrator support
• Status quo
Elements of Implementation Monitoring
• Based on an understanding of the program’s theory of change/logic model
• Data collection
– Process measures (e.g., # of lessons delivered)
– Outcome measures (e.g., reduced antisocial attitudes)
– Implementer & participant feedback (e.g., how well did Session 6 go?)
– Use of valid, reliable survey tools (developmentally appropriate, actually measures
program targets)
• Observation of program delivery
– By developer, or certified trainer/implementer
• Implementer & administrator review and reflection on data
• Adjustments to implementation made accordingly
Implementation Monitoring Example
PART III
THE ROLE OF
TECHNICAL
ASSISTANCE
EPISCenter: Initiatives and Goals
Multi-Agency Steering Committee (Justice, Welfare, Education, Health)
Intermediary and State-level Prevention Support System
• Support to Community Prevention Coalitions
• Support to Evidence-based Prevention & Intervention Programs
• Improve Quality of Local Innovative Programs and Practices
Goals: Implementation Quality, Broad-scale Dissemination, Long-term Sustainability
Mission: To support the proliferation of quality prevention and intervention programs aimed at promoting positive youth development and preventing violence, delinquency, substance abuse, and other problem behaviors in children and adolescents.
Methods of Technical Assistance
• Strategic Plan & Outreach
• Resource development, distribution
• Fact sheets, how-to guides, webpages
• Trainings
• In-person, webinars, YouTube videos
• Networking meetings (quarterly)
• CTC & program-specific, cross-over meetings
• Connect with, learn from, and problem-solve with peers
• On-site consultations and implementation plan development
• Training, fidelity, outcome measurement, implementation,
connection to coalition
STRATEGIC
COORDINATION
Cross-Systems Engagement
State Agencies
• Cross-agency coordination
• System-level barriers
• Gap analysis
Program Developer
• Programmatic barriers to implementation/sustainability
• PA-based trainings
• “Group” discounts
Implementing Site
• Recruitment
• Data collection
• Local stakeholder buy-in
Technical Assistance • Policy Recommendations • Research
Strategic Connections
• DDAP trainings in CTC needs assessment
• For Single County Authorities to assess local program and
service needs
• Connecting with Intermediate Units to discuss programmatic needs
and availability of PCCD funding for EBPs
• Virtual web-meetings with PA stakeholders to review evidence-based intervention utilization and outcomes data, including JCJC,
OMHSAS, CJJT&R, JJSES Stage 3
• Discussions with developers of evidence-based programs that may
meet PA needs identified by gap analysis; fit and feasibility
• EPISCenter prominently featured in the Prevention Research
Center’s strategic plan
• Consultation to other states on development of state-level
infrastructure for prevention (PA is a model!)
Strategic Connections
PA Infrastructure for Efficiencies:
SFP 10-14 Training & Quality Assurance
• Trainer cadre – trained to train up SFP facilitators and observers
• Improved turn-around time for scheduling training
• Decreased PA funds going to out-of-state trainer travel expenses
• Quality assurance “designees”
• Improved turn-around time for scheduling QA visit, and for
receiving QA feedback and certification letter
IYS Training & Curriculum Discounts
• In-state IYS planned training dates
• Improved ability to plan for training due to availability
• Increased access to timely training
• Decreased out-of-state travel expense for grantees
• Curriculum discounts to PCCD grantees
• EPIS as “ordering & invoicing hub”
• Scaled discounts according to quantity of curriculum ordered
• Decreased PCCD cost for materials
Strategic Development
Gap Analysis for PCCD:
✓ Phase I: Review state-level PAYS and Juvenile Justice data to identify
risk & needs (PAYS 2011, 2013; Disposition 2012; Recidivism 2007-09).
✓ Phase II: Identify programs that address PA needs, determine the level of
evidence for program effectiveness, and recommend programs for PA to
support (what remains are “true” gaps).
• Phase III: Drill down to county-level indicators of risk using PAYS and JJ
data, draw in additional data from other sources (Dept. of Health, PDE,
etc.), identify strengths (positive indicators, coalition presence, prior
funding), and recommend programs and strategies to address youth needs.
A six-person team composed of EPISCenter staff representing CTC, prevention,
intervention, and juvenile justice.
Phase I and II presentations and program recommendations are posted
online: http://episcenter.psu.edu/gaps
Strategic Development
Cross-pollinating across CTC and SPEP:
• Raising awareness in community prevention coalitions of the JJSES
activities, SPEP
• SPEP focus on quality improvement has implications for all youth-serving
programs
• More widespread use of the YLS will enable coalitions to use that
data in conjunction with PAYS to determine their youths’ risks and needs
• Strengthening the role and connection between local prevention efforts
and local juvenile justice efforts
SUPPORT FOR
THE PAYS
Support for Informed Use of the PAYS
• Presentations & Trainings
• PAYS 101 & 2013 Webinars
• PA Safe Schools Conference
• Commonwealth Prevention Alliance (CPA)
• CTC Regional Meetings
• Resources
• Short, online tutorial videos
• Guide for interpreting data
• Templates for sharing community data
• PDE Guide & Workbook
• PAYS 2013
• Community/district recruitment
• Advisory Group
• Using public health model
• Prevention planning
• Grant writing
• Connecting to coalitions/community
• Sustainability
• Beyond 101: Advanced topics
SUPPORT FOR
CTC
EPISCenter Initiative Areas
Support to
Community Prevention
Coalitions
Support to
Evidence-based
Prevention & Intervention
Programs
Improve Quality of
Local Innovative Programs
and Practices
• Communities That Care (CTC)
• Drug-Free Communities (DFC)
• Strategic Prevention Framework (SPF)
• Integrated Services Plan
• Hybrid models
Creating Fertile Ground for EBPs
Risk-focused Prevention Planning
(the Communities That Care model)
Form local coalition of key stakeholders → Collect local data on risk and protective factors → Use data to identify priorities → Select and implement evidence-based programs that target those factors → Re-assess risk and protective factors → (repeat)
This cycle leads to community synergy and focused resource allocation.
Support for the Communities That Care Process
The PCCD CTC Grantee: TA & Milestones
• New grantee orientation
• Mobilizer training
• Milestones & Benchmarks assessment with the Board (grant start,
middle, end)
• Co-development of implementation plan
• On-site trainings of mobilizer, board, and workgroups in CTC process
• Quarterly networking meetings (regional NW, SW, C, NE, SE)
• Quarterly site visits
• Monthly phone consults between TA provider and mobilizer
Assessing & Supporting Community Coalitions
• Web-based data collection from CTC board members
– Board membership, leadership, relationships, work style
– CTC process
– Programs implemented
– Barriers experienced
– Technical assistance accessed
• Provide feedback to sites on coalition functioning
• Summary report to TA consultant
• Report presented to CTC site
• Used for strategic planning
Virtual Technical Assistance: CTC Resources
SUPPORT FOR
EBPs
EPISCenter’s Four Focal Initiatives
Support to
Community Prevention
Coalitions
Support to
Evidence-based
Prevention & Intervention
Programs
Improve Quality of
Local Innovative Programs
and Practices
• ART - Aggression Replacement Training
• BBBS - Big Brothers Big Sisters
• IYS - Incredible Years (Parenting; Basic & Advanced)
• IYS – Incredible Years (Youth; Classroom & Small Group)
• LST - Life Skills Training
• OBPP – Olweus Bullying Prevention Program
• PATHS - Promoting Alternative Thinking Strategies
• PTNDA - Project Toward No Drug Abuse
• SFP 10-14 – Strengthening Families Program 10-14
EPISCenter’s Four Focal Initiatives
Support to
Community Prevention
Coalitions
Support to
Evidence-based
Prevention & Intervention
Programs
Improve Quality of
Local Innovative Programs
and Practices
• Functional Family Therapy (FFT)
• Multisystemic Therapy (MST)
• Multi-dimensional Treatment Foster Care
(MTFC)
The PCCD EBP Grantee: TA & Milestones
• New grantee orientation
• Initial site visit by Prevention Coordinator
• Co-development of implementation plan
• Spring and fall site visits, and site ratings
• Quarterly networking meetings
• Quality assurance visit, rating, and feedback by program developer
• Outcomes Report near the end of the grant
• Ongoing quarterly reporting of process & outcomes data
Content and Skill Areas for Technical Assistance
General and program-specific capacity for evidence-based programming
• Using data-informed decision-making strategies
• Understanding the program components and underlying theory
• Coaching in implementation fidelity (general, program-specific, lesson- or component-specific)
• Identifying participant recruitment and retention strategies for site coordinators
• Supporting implementation and outcomes monitoring; data collection and evaluation
Content and Skill Areas for Technical Assistance
General and program-specific capacity for evidence-based programming
• Developing sustainability strategies; education and support in other funding streams, needs-based budget process/planning, medical assistance funding, program transfer to local infrastructure (e.g., school)
• Connecting with other “learning community” members
• Facilitating communication with developers and researchers
• Building stakeholder buy-in; meaningful reporting on outcomes
Virtual Technical Assistance: EBP Resources
SYSTEMS FOR DATA
COLLECTION AND
OUTCOMES REPORTING
Standardized Data Collection and Support
• Spreadsheet Tools
– PCCD-funded prevention programs (ART, BBBS, IYS,
LST, OBPP, PATHS, SFP 10-14, PTNDA)
• INSPIRE
– Intervention Programs (FFT, MST, MTFC)
• Same goals:
– Generating process & outcome data, monitoring
– Increasing reliability & validity of data
– Reducing data reporting burden
– Increasing usability of data by variety of stakeholders
The Next Generation of EBP Spreadsheets
Quarterly and Annual INSPIRE Reporting
VIRTUAL
TECHNICAL
ASSISTANCE
Virtual Technical Assistance: YouTube & Webinars
Virtual Technical Assistance: Outreach
62
Virtual Technical Assistance: Outreach
63
PART IV
OUTCOMES OF STATE
AGENCY & UNIVERSITY
PARTNERSHIP
History of Research-Based Prevention in Pennsylvania
1994: Key state leaders introduce Communities That Care (CTC) in PA
– Spearheaded by the Pennsylvania Commission on Crime & Delinquency (PCCD) and the Juvenile Court Judges’ Commission
1994-2002: Initiation of CTC funding by PCCD
– 16 cycles of the CTC model introduced in ~120 communities
1996: PCCD co-funding of research for Blueprints programs
1998: Process study of CTC conducted by the Prevention Research Center
– Resulted in creation of a statewide TA infrastructure to support CTC
– Formalized the connection between CTC and the EBP Initiative
1998: Initiation of the Evidence-based Program Initiative by PCCD
– 10 cycles of EBPs funded over 13 years, resulting in ~200 EBPs
2001: Narrowed list of supported EBPs, aka “PA Blueprints”
2008: PCCD created the Resource Center for Evidence-Based Prevention and Intervention Programs and Practices
– Multi-agency Steering Committee representing Justice, Welfare, Education, and Health
Policy and Practice Innovations
• Development and support of communities of practice
• Including common public health language in RFAs
• Statewide surveillance system (PAYS)
– Focus on underlying causal mechanisms vs. narrowly defined
behavioral outcomes
• Community coalitions as local prevention infrastructure
• Ongoing monitoring of implementation
– Requires tools, skills, and motivation
Bumbarger, B. K., & Campbell, E. M. (2011). A state agency-university partnership for translational research and the dissemination of evidence-based prevention and intervention. Administration and Policy in Mental Health and Mental Health Services Research, 39(4), 268-277.
From Lists to Improved Public Health…
• Synthesis and translation of research to practice (and practice to research)
• EBP selection, dissemination, and uptake
• Ensuring sufficient implementation quality and fidelity
• Understanding adaptation and preventing program drift
• Measuring and monitoring implementation and outcomes
• Policy, systems, and infrastructure barriers
• Coordination across multiple programs and developmental
stages
• Sustainability in the absence of a prevention infrastructure
Bumbarger, B., & Perkins, D. (2008). After randomized trials: Issues related to dissemination of evidence-based interventions. Journal of Children’s Services, 3(2), 53-61.
Bumbarger, B., Perkins, D., & Greenberg, M. (2009). Taking effective prevention to scale. In B. Doll, W. Pfohl, & J. Yoon (Eds.), Handbook of Youth Prevention Science. New York: Routledge.
From the Field to the Research Journals
Collaborative Policy Innovators:
James Anderson
Mike Pennington
Linda Rosenberg
Keith Snyder
Clay Yeager
Investigators and Authors:
Brian Bumbarger
Mark Feinberg
Louis Brown
Michael Cleveland
Jennifer Sartorious
Brendan Gomez
Stephanie Bradley
Mark Greenberg
Brittany Rhoades
Wayne Osgood
Damon Jones
Julia Moore
Richard Puddy
Elizabeth Campbell
The EPISCenter and research described here are supported by grants from the Pennsylvania
Commission on Crime and Delinquency. Special thanks to the staff of the Office of Juvenile
Justice and Delinquency Prevention (OJJDP).