Developing and Implementing
Multi-Level Program Evaluation Plans
for SAT-ED Grants
3/11/2013
Michael L. Dennis,
Chestnut Health Systems, Normal, IL
Available from www.gaincc.org/presentations
Created for: Substance Abuse and Mental Health
Services Administration’s (SAMHSA) Center for
Substance Abuse Treatment (CSAT) under contract
number HHSS283200700003I, Task Order
HHSS28300002T
Goals for the Presentation
1. Summarize the key problems in our field that SAT-ED
is attempting to address
2. Review objectives, key questions and sources of data
to be addressed in the evaluation
3. Identify key steps in designing, implementing and
using evaluation to help manage and improve
programs
4. Discuss strategies for reliable, valid, and efficient
collection and analysis of state (including
commonwealth), site, and client-level data
5. Provide links to further resources and training
Objectives of SAT-ED
To improve treatment for adolescents through the:
• Development of a Learning Laboratory with collaborating local community-based treatment provider sites
• Improvements in State-level infrastructure through workforce development, financial planning, licensure and certification
• Improvement of Site-level infrastructure through implementation of Evidence-Based Practices (EBP) related to assessment and treatment
• Assessment, treatment and monitoring of change at the client level
Feds have to be able to describe what was done with the money; states and sites have to decide what to try and sustain.
Typical Components of a
Multi-Level Evaluation Plan
1. Needs assessment
2. Description of program activities, Theory of
Change and/or Logic Model
3. Approach to stakeholders
4. Evaluation questions, data sources and
methodology
5. Performance monitoring and reporting
TIP: Labels and the order of components can vary to fit your situation; the point is to make sure you have them covered, or that your team makes an informed decision not to address them.
1. Needs Assessment
• Description of infrastructure and site-level needs and what information is still needed
• Local system and/or cultural considerations
• Articulate the rationale for the selection of the targeted:
  – Infrastructure activities
  – Site selection
  – Evidence-based assessment selection
  – Evidence-based treatment selection
Structural Challenges to Delivery of Quality Care
in Behavioral Health Systems
1. High turnover workforce with variable education background
related to diagnosis, placement, treatment planning and
referral to other services
2. Heterogeneous needs and severity characterized by multiple
problems, chronic relapse, and multiple episodes of care over
several years
3. Lack of access to or use of data at the program level to guide
immediate clinical decisions, billing and program planning
4. Missing, bad or misrepresented data that needs to be
minimized and incorporated into interpretations
5. Lack of infrastructure that is needed to support
implementation and fidelity of evidence-based practices that
have been shown to work better on average
Substance Use Disorder & Treatment by Age
[Figure: past-year rates of substance use disorder, treatment receipt, and unmet need (% past year and % unmet need) by age group]

  Age     Substance Use Disorder   Treatment   Unmet Need
  <18     7%                       0.5%        92%
  18-25   20%                      1.6%        92%
  26+     7%                       1.0%        86%

Key points: rates of need are higher for young adults; rates of unmet need are higher for adolescents and young adults.
Source: SAMHSA 2009 National Survey on Drug Use and Health
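One way to read the unmet-need figures above (an assumption about how the chart was constructed, not something stated on the slide) is as the share of people with a past-year disorder who did not receive treatment. A minimal sketch of that arithmetic in Python:

```python
# Hedged sketch: assumes unmet need = (prevalence - treated) / prevalence,
# using the past-year percentages from the SAMHSA 2009 NSDUH figures above.
rates = {            # age group: (% with SUD, % receiving treatment)
    "<18":   (7.0, 0.5),
    "18-25": (20.0, 1.6),
    "26+":   (7.0, 1.0),
}

for age, (sud, treated) in rates.items():
    unmet = (sud - treated) / sud * 100
    print(f"{age:>5}: unmet need ~ {unmet:.0f}%")

# Prints roughly 93%, 92%, and 86%; small differences from the slide's
# 92%/92%/86% reflect rounding in the published percentages.
```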
Substance Use Disorder & Treatment by Age
[Figure: treatment completion/transfer and length of stay by age group, as % of admissions]

  Age     Compl. or Trans.   45+ days   90+ days
  <18     56%                67%        43%
  18-25   56%                51%        33%
  26+     62%                45%        30%

Key points: completion rates are lower for adolescents and young adults; lengths of stay are shorter for young adults and adults.
Source: SAMHSA 2009 Treatment Episode Data Set – Discharges (TEDS-D)
No Self-Help Group Participation in the First 3 Months of Treatment
[Figure: rates by age group; higher for adolescents and young adults (* p<.05)]
Source: SAMHSA 2011 GAIN SA Data Set, subset to cases with a 3-month follow-up (n=21,228)
Unmet Need for Mental Health Treatment by 3 Months
[Figure: rates by age group; higher for adolescents and young adults (* p<.05)]
Source: SAMHSA 2011 GAIN SA Data Set, subset to cases with a 3-month follow-up (n=14,358)
Unmet Need for Medical Treatment by 3 Months
[Figure: rates by age group; higher for young adults (* p<.05)]
Source: SAMHSA 2011 GAIN SA Data Set, subset to cases with a 3-month follow-up (n=8,517)
2. Description of Program Activities,
Theory of Change and/or Logic Model
• Describe infrastructure and site-level activities to be conducted and any specific programs or evidence-based practices you plan to use
• Theory or logic model for each need, how it will be addressed by the activity, and the expected outcome
• Discuss the relationship between various needs, activities or components, including how state- and site-level activities support each other
Expected State-Level
Infrastructure Activity
1. Interagency workgroup to improve the statewide infrastructure
for adolescent substance abuse treatment and recovery
2. Memoranda of understanding between SAT-ED awardee
agency and other child-serving agencies
3. Multi-year workforce training plan for specialty adolescent
behavioral health (substance use disorder/co-occurring
substance use and mental disorder) treatment/recovery sector
and other child-serving agencies
4. Comprehensive and integrated continuum of care for
adolescents with substance use and mental health disorders
in terms of both funding and services
Expected State-Level
Infrastructure Activity (continued)
5. Financial mapping to understand current funding and
coverage
6. Coordination of funding to make the system more
efficient, expand coverage and shift towards more
effective practices
7. Facilitation of a learning laboratory to use above to identify
target areas of need, attempt change, evaluate the change,
and if necessary adjust strategies to improve the quality of
care
TIP: These can relate to and/or build on activities already under way; you just want to be sure that you will be prepared to address each area in your annual and final progress reports.
Other Allowable State-Level Infrastructure Activity
8. Workforce mapping to understand qualifications of staff across
the continuum of care and the adequacy of the initial training/
continuing education infrastructure already in place
9. College, university and continuing education staff and
programs/faculty infrastructure improvements/expansions and
number of new/existing staff trained
10. Other statewide events to provide continuing or community education/training
11. Reviewing/revising PROGRAM standards for licensure,
certification, and/or accreditation of programs that provide
substance use and co-occurring mental disorders services for
adolescents and their families
12. Reviewing/revising CLINICIAN standards for licensure, certification, and/or credentialing of clinicians that provide substance use and co-occurring mental disorders services for adolescents and their families
Other Allowable State-Level Activity (continued)
13. Family / youth support organization creation, expansion,
continuation, or enhancement
14. People newly credentialed/certified to provide substance use and co-occurring substance use & mental health disorder services
15. Policy changes made as a result of the cooperative agreement
16. Financing policy changes completed as a result of the
cooperative agreement
TIP: Choose what makes sense for your needs and proposed activities. Invest more in measuring those areas where you are focusing your resources and attention. There is less interest in the average than in identifying and understanding one or more areas where grantees have done something they found useful.
Expected Site-Level Infrastructure Activity
17. Collaborating sites you have contracted with to provide evidence-based practices (EBP)
18. EBP related to a) assessment and b) treatment for which you have contracted to obtain training and technical support to implement
19. EBP training type, date, and number of staff attending each
20. EBP-proficient staff capacity in terms of the number of employed staff who are certified, by level and type of EBP
21. EBP local trainer or supervisory capacity in terms of the number of employed staff who are certified, by level and type of EBP, to train and supervise new staff
Other Optional Site-Level Infrastructure Activity
22. Implementation of EBP related to assessment in terms of the number completed, linkage to medical records, use of clinical decision support, and use for program planning (aka meaningful use)
23. Expansion of coverage based on the number and percent of assessed youth receiving any services billed to insurance (Medicaid, CHIP, other federal/state, other private) instead of the block grant
24. Implementation of EBP related to treatment in terms of the number of clients receiving it and receiving the target dosage
TIP: Be sure to think about how to describe, measure and demonstrate a relationship between state- and site-level activities related to the chosen EBP. Collaborate with other states using the same EBP.
Comparison of Site EBP for Assessment
[Table: assessment-related EBPs in use by state/commonwealth (IA, IL, IN, KY, LA, MA, ME, MT, NY, OK, PR, SC, WA), covering the Comprehensive Adolescent Severity Index (CASI), the Global Appraisal of Individual Needs (GAIN; versions SS, Q3, Lite, Core, Full), follow-up measures, and the Government Performance and Result Act (GPRA) tool; some entries still to be determined]
TIP: Most states/sites have other electronic or hard copy records and have mentioned additional measures in their proposal or preliminary evaluation plans; also, most sites are still in the process of deciding whether to conduct follow-up with EBP or other measures beyond GPRA.
Comparison of Site EBP for Treatment
[Table: treatment EBPs in use by state/commonwealth (IA, IL, IN, KY, LA, MA, ME, MT, NY, OK, PR, SC, WA), covering the Adolescent Community Reinforcement Approach (A-CRA), Intensive Community Treatment (ICT), Multidimensional Family Therapy (MDFT), Multi-Systemic Therapy (MST), and Seven Challenges (7C); some entries still to be determined]
TIP: Several states have talked about comparing to other EBP within their state, comparing to the same EBP in other sites, and/or expanding EBP to other sites.
3. Approach to Stakeholders
• Identification of state, site, community and individual (youth, family) level stakeholders
• Coordination with or creation of strategic planning groups or interagency councils
• Coordination with electronic medical and billing records
• Involvement of program directors, information technology staff, clinical directors, supervisors, line staff
• Coordination with or creation of community, family and/or youth advisory groups or partnerships
Questions for Stakeholders
• Key needs or problems with the current system that might be addressed
• Critical timelines, measures and products that would make it more useful to them
• Recognizing how they define & measure things and where multiple definitions or measures may be needed across stakeholders
• What will it take for them to support sustainability beyond the grant?
Identifying & Addressing Key Subgroups That May Have Concerns or Barriers to Accessing Services
• Demographic groups (e.g., by gender, race, ethnicity, age, sexual orientation)
• Abilities (e.g., hearing, sight, mobility, IQ)
• Clinical subgroups such as:
  – Primary substance
  – Co-occurring mental health/trauma/suicide
  – Crime/violence or justice involvement
  – Degree of family support and use
  – Insurance, transportation or economic barriers
TIP: Health disparities need to be treated like safety issues – best practice is to diligently look for them and work toward reducing them wherever possible, in order to improve effectiveness and reduce liability (see the sketch below for one way to check for them).
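As a concrete way to "diligently look" for disparities, the sketch below compares service-receipt rates across subgroups among clients with an identified need. The field names, sample data, and the 10-point gap rule are illustrative assumptions, not SAT-ED requirements.

```python
# Hedged sketch: flag subgroups whose service-receipt rate (among those in need)
# falls well below the overall rate. All field names are hypothetical.
from collections import defaultdict

def disparity_check(clients, group_field, need_field, service_field, gap=0.10):
    """Return (group, group_rate, overall_rate) for groups >= `gap` below overall."""
    in_need = [c for c in clients if c[need_field]]
    if not in_need:
        return []
    overall = sum(c[service_field] for c in in_need) / len(in_need)
    by_group = defaultdict(list)
    for c in in_need:
        by_group[c[group_field]].append(c[service_field])
    return [(g, sum(v) / len(v), overall)
            for g, v in by_group.items()
            if overall - sum(v) / len(v) >= gap]

# Illustrative data: among youth with a mental health need, are girls
# receiving mental health services at a lower rate than the overall rate?
sample = [
    {"gender": "F", "mh_need": 1, "mh_service": 0},
    {"gender": "F", "mh_need": 1, "mh_service": 1},
    {"gender": "M", "mh_need": 1, "mh_service": 1},
    {"gender": "M", "mh_need": 1, "mh_service": 1},
]
print(disparity_check(sample, "gender", "mh_need", "mh_service"))
# [('F', 0.5, 0.75)]  -> girls' rate is 25 points below the overall rate
```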
4. Evaluation Questions, Data Sources & Methodology
• Operationalizing the program objectives/questions into activities, measures of implementation/outputs and outcomes, including the frequency of collection and data sources
• Working backwards to make sure that the above crosswalk maps onto actual contracts, memos of understanding, and/or expectations of all stakeholders (many of which are developed at different points in the proposal and start-up process); one possible crosswalk format is sketched below
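One illustrative way to keep that crosswalk (the field names and entries below are assumptions, not a SAMHSA template) is as a simple structured list, so each objective traces to an activity, a measure, a data source, and a collection frequency, and gaps are easy to spot:

```python
# Hedged sketch of an evaluation crosswalk; all entries are illustrative.
crosswalk = [
    {
        "objective": "Implement the chosen EBP assessment at collaborating sites",
        "activity": "Train and certify site staff on the assessment",
        "measures": "Number of certified staff; number of assessments completed",
        "data_source": "Training records; site assessment database",
        "frequency": "Quarterly",
    },
    {
        "objective": "Improve treatment initiation and engagement",
        "activity": "Feed initiation/engagement rates back to sites",
        "measures": "% initiating within 14 days; % engaged at 6 weeks",
        "data_source": "",   # flagged below as still needing a source
        "frequency": "Monthly",
    },
]

# Working backwards: list objectives that do not yet name a data source,
# so they can be tied to actual contracts or memoranda of understanding.
missing = [row["objective"] for row in crosswalk if not row["data_source"]]
print("Objectives still needing a data source:", missing or "none")
```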
State/Site Level Infrastructure
• Often a matter of documenting what has been done, including dates, types of events, number of staff, and degree of completion/certification
• Dual Diagnosis Capability in Addiction Treatment (DDCAT) and Dual Diagnosis Capability in Youth Treatment (DDCYT) measures of the availability and quality of co-occurring services
• Identifying how things differ from what was expected, including:
  – Unexpected problems and how they were addressed
  – Unexpected opportunities and how they were seized
  – Things that still need to be or might be done
Common Client Level Questions
• What are the characteristics and needs of who was served?
• What services did they receive?
• To what extent are services targeted at the most appropriate or severe clients?
• To what extent are services effective?
• Are the services cost effective?
TIP: Not every evaluation will address each of these questions or each question equally well. The point here is to think about how, and how well, you will be able to answer each.
Characteristics and needs of who was served?
[Table: coverage of each measure below by the GPRA, GAIN, and/or CASI, with partial coverage noted (*, **, ***)]
a. Demographics, Veterans, Housing, Justice & Vocational Status
b. Sexual Orientation
c. Current substance use, mental health, health, & HIV risk behavior
d. Withdrawal, substance use disorder history & diagnosis
e. Internalizing and externalizing psychiatric history and diagnosis
f. Physical health history, disabilities, infectious disease
g. History of HIV risk behaviors & victimization
h. Strengths, Family, & Environment
i. Current arrest, school, employment
j. Incarceration, arrest and illegal activity history
k. Cost to society of health care utilization and crime
l. Treatment planning and level of care placement
Notes: * No Veteran status; ** Only 1 question on trauma; *** No strengths
What services did they receive?
[Table: coverage of each measure below by GPRA, GAIN*, CASI*, and/or accessible records**]
a. Initiating treatment within 2 weeks of diagnosis
b. Engagement in treatment for at least 6 weeks
c. Continuing care more than 90 days past intake
d. Level of care & type of evidence-based practice
e. Range of services received
f. Early working alliance or satisfaction
g. Satisfaction with services received
h. Urine test results
i. Health disparities on need and targeted services
Notes: * Only if follow-up version is used; ** Only if accessible
TIP: Without GAIN/CASI follow-up, you will be very dependent on the quality of and access to records. With them, you need to cover the first 3 months to describe most of treatment.
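Rows a–c above reduce to date-based flags that can be computed from intake and service-contact dates in whatever records are accessible. The record layout and cut-offs below are a minimal sketch under assumed definitions, not the GPRA, GAIN, or CASI specification; exact operational definitions should follow your consensus standards.

```python
# Hedged sketch: initiation, engagement, and continuing-care flags from
# an intake date and a list of subsequent service-contact dates.
from datetime import date

def retention_flags(intake, service_dates):
    days = sorted((d - intake).days for d in service_dates if d >= intake)
    return {
        "initiated_14d": any(d <= 14 for d in days),   # a. service within 2 weeks
        "engaged_6wk": any(d >= 42 for d in days),     # b. still in services at 6 weeks
        "continuing_90d": any(d > 90 for d in days),   # c. service beyond 90 days
    }

flags = retention_flags(date(2013, 1, 7),
                        [date(2013, 1, 10), date(2013, 2, 25), date(2013, 4, 20)])
print(flags)
# {'initiated_14d': True, 'engaged_6wk': True, 'continuing_90d': True}
```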
To what extent are services targeted at the most appropriate or severe clients?
a. Implementation of reliable, valid and efficient measures of need and severity
b. Consensus standards on the definition of need, link to services, and/or evidence-based practices associated with better outcomes on average
c. Implementation of clinical decision support and meaningful use to drive actual treatment planning and services
d. Evaluation of treatment need profiles, gaps and health disparities at the program level and monitoring of change over time (a minimal monitoring sketch follows)
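For items c and d, one simple program-level check (the threshold, field names, and data below are assumptions for illustration, not consensus standards) is whether clients who score above an agreed severity cut-off actually received the linked service:

```python
# Hedged sketch: share of high-severity clients who received the indicated service.
def targeting_rate(clients, threshold=3):
    high = [c for c in clients if c["severity"] >= threshold]
    if not high:
        return None
    return sum(c["got_indicated_service"] for c in high) / len(high)

sample = [
    {"severity": 4, "got_indicated_service": 1},
    {"severity": 5, "got_indicated_service": 0},
    {"severity": 1, "got_indicated_service": 0},   # below threshold; not counted
]
rate = targeting_rate(sample)
print(f"High-severity clients receiving the indicated service: {rate:.0%}")
# 50% -- a figure worth trending over time and comparing across subgroups.
```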
To what extent are services effective?
a. Improvements in administrative outcomes (e.g., initiation,
engagement, continuing care, evidence-based practices*)
associated with better outcomes on average
b. Participation in self-help and recovery support services
c. Among those in need, receipt of services related to co-occurring mental health & physical health problems*
d. Pre-post change in percent of past month abstinence, no
substance related problems, no justice involvement, being
housed, vocational engagement and social connectedness
e. Comparison of the same program over time, across sites, to
other programs, national norms, or standards (ideally matched
programs or clients)*
TIP: * These have to come from records or supplemental data such as
follow-up data.
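For item d, the pre-post comparison is simply the difference in the share of clients meeting an outcome criterion at intake versus follow-up. A minimal sketch, assuming paired intake/follow-up indicators per client (the field names and data are illustrative):

```python
# Hedged sketch: pre-post change in % abstinent in the past month, paired by client.
def pre_post_change(records, pre="abstinent_intake", post="abstinent_followup"):
    n = len(records)
    pre_rate = sum(r[pre] for r in records) / n
    post_rate = sum(r[post] for r in records) / n
    return pre_rate, post_rate, post_rate - pre_rate

records = [
    {"abstinent_intake": 0, "abstinent_followup": 1},
    {"abstinent_intake": 0, "abstinent_followup": 0},
    {"abstinent_intake": 1, "abstinent_followup": 1},
    {"abstinent_intake": 0, "abstinent_followup": 1},
]
before, after, change = pre_post_change(records)
print(f"Abstinent: {before:.0%} at intake, {after:.0%} at follow-up ({change:+.0%})")
# Abstinent: 25% at intake, 75% at follow-up (+50%)
```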
Are the services cost effective?
a. Estimate costs of average services and evidence-based
practices using accounting data*
b. Compare costs to state-wide, federal or published normative
costs overall or adjusting for improved retention*
c. Putting costs in context relative to baseline costs to society of
health care utilization or crime and the extent to which the
program is targeting a high cost subgroup
d. Pre-post change in the cost to society of health care
utilization or crime *
TIP: * These have to come from records, follow-up, or other supplemental data.
5. Performance Monitoring and Reporting
• Early indicators of implementation, fidelity and steps of the theory of change or logic model
• Important for infrastructure measures to include necessary steps (e.g., selection, contracting, events, people, evaluations)
• Client-level measures related to:
  – Recruitment and data collection rate/target, being on time (see the sketch after this list)
  – Casemix of who is served
  – Treatment initiation, engagement, continuing care, satisfaction
  – Fidelity of EBP
  – Services targeted at needs
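For the recruitment and data-collection item in the list above, a running completion and on-time rate against the target is usually enough for monthly monitoring. The sketch below assumes a simple per-case record of whether a follow-up was due, completed, and how many days late it was (the layout and the 90% target are assumptions):

```python
# Hedged sketch: follow-up completion and on-time rates against a target.
def follow_up_monitor(cases, target_rate=0.90, window_days=14):
    due = [c for c in cases if c["due"]]
    done = [c for c in due if c["completed"]]
    on_time = [c for c in done if c["days_late"] <= window_days]
    return {
        "completion_rate": round(len(done) / len(due), 2) if due else None,
        "on_time_rate": round(len(on_time) / len(due), 2) if due else None,
        "meets_target": bool(due) and len(done) / len(due) >= target_rate,
    }

cases = [
    {"due": True,  "completed": True,  "days_late": 0},
    {"due": True,  "completed": True,  "days_late": 20},
    {"due": True,  "completed": False, "days_late": None},
    {"due": False, "completed": False, "days_late": None},  # not yet in the window
]
print(follow_up_monitor(cases))
# {'completion_rate': 0.67, 'on_time_rate': 0.33, 'meets_target': False}
```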
Implementation is Essential
(Reduction in Recidivism from .50 Control Group Rate)
[Figure: reduction in recidivism by program strength and implementation quality]
Key points: the best case is a strong program implemented well; the effect of a well-implemented weak program is as big as that of a strong program implemented poorly. Thus, one should optimally pick the strongest intervention that one can implement well.
Source: Adapted from Lipsey, 1997, 2005 meta-analysis of 509 juvenile justice programs
What gets measured, gets done
What gets fed back, gets done better
What gets incentivized, gets done more often
[Figure: performance relative to average practice based on TEDS*]
* Based on a count of initiation within 14 days, evidence-based practice, engagement for at least 6 weeks, and any continuing care.
Source: CSAT 2011 AT SA Data Set subset to 1+ follow-ups (n=17,202)
Selected NOMS Outcomes Over Time
[Figure: variation in outcomes over time; most effects occur in the first 90 days, so it is important to measure outcomes and services received by then]
* Interpolated  ** Past month
Source: CSAT 2011 AT SA Data Set subset to 1+ follow-ups
NOMS Outcome Status at Last Wave
[Figure: outcome status at last wave; the measure favors people who come in the door without problems]
* This variable measures the last 30 days. All others measure the past 90 days.
** The blue bar represents an increase of 50% or no problem.
Source: CSAT 2011 AT SA Data Set subset to 1+ follow-ups
NOMS Outcomes: Count of Positive Outcomes*
(Status at Last Follow-up – Status at Intake)
[Figure: 78% have one or more improved areas]
* Based on a count of a reduction in the following variables: Substance use frequency, Abuse/Dependence Sx (past 30d), Physical Health (past 90d), Mental Health (past 90d), Nights of Psychiatric Inpatient (past 90d), Illegal Activity (past 90d), Arrests (past 90d), Housed in Community (past 90d), Family/Home Problems (past 90d), Vocational Problems (past 30d), Social Support/Engagement (past 90d), Recovery Environment Risk (past 90d), Quarterly Cost to Society (past 90d), In Work/School (past 90d); domains with no problems at intake are excluded.
Source: CSAT 2011 AT SA Data Set subset to 1+ follow-ups (n=17,722)
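The count on this slide is the number of domains that improved from intake to last follow-up, excluding domains with no problem at intake. A hedged sketch of that counting rule, with made-up domain names and values (not the actual NOMS variable definitions):

```python
# Hedged sketch: count domains improved from intake to last follow-up,
# skipping domains with no problem at intake (per the slide's note).
def count_improved(intake, followup):
    improved = 0
    for domain, baseline in intake.items():
        if baseline <= 0:                      # no problem at intake; not countable
            continue
        if followup.get(domain, baseline) < baseline:
            improved += 1
    return improved

intake   = {"use_days": 20, "illegal_activity_days": 5, "health_problem_days": 0}
followup = {"use_days": 4,  "illegal_activity_days": 5, "health_problem_days": 0}
print(count_improved(intake, followup))        # 1 domain improved (substance use)
```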
Health Care Utilization Cost
[Figure: 11% of youth account for 76% of health care costs]
Source: CSAT 2011 AT Summary Analytic Data Set (n=19,148)
Cost of Crime
[Figure: 21% of youth account for 97% of the cost of crime]
Source: CSAT 2011 AT Summary Analytic Data Set (n=17,878)
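Concentration figures like those on the last two slides can be reproduced from client-level cost data by sorting and accumulating. A minimal sketch with illustrative numbers (not the CSAT data set):

```python
# Hedged sketch: what share of total cost is accounted for by the costliest youth?
def cost_concentration(costs, top_share=0.10):
    ordered = sorted(costs, reverse=True)
    k = max(1, int(len(ordered) * top_share))   # size of the top group
    return sum(ordered[:k]) / sum(ordered)

costs = [50_000, 20_000, 5_000, 2_000, 1_000, 800, 500, 300, 200, 100]
share = cost_concentration(costs)
print(f"Top 10% of youth account for {share:.0%} of total cost")
# With these illustrative numbers, about 63% of total cost.
```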
Reduction in Health Care Utilization Offsets the Cost of SUD Treatment within 12 Months

  Adolescent Level of Care      Year before intake   Year One after intake a   Savings b
  Outpatient                    $10,993              $10,433                   $560
  Intensive Outpatient          $20,745              $15,064                   $5,682
  Outpatient Continuing Care    $34,323              $17,000                   $17,323
  Long Term Residential         $27,489              $26,656                   $833
  Short Term Residential        $25,255              $21,900                   $3,355
  Total                         $15,633              $13,642                   $1,992

a Includes the cost of treatment
b Year before intake minus year after intake (which includes treatment)
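The savings column above is simply the year-before cost minus the year-after cost (with the year after including treatment itself). A quick check of that arithmetic using the published totals:

```python
# Hedged sketch: reproduce the savings column as (year before) - (year after, incl. treatment).
levels = {
    "Outpatient":                 (10_993, 10_433),
    "Intensive Outpatient":       (20_745, 15_064),
    "Outpatient Continuing Care": (34_323, 17_000),
    "Long Term Residential":      (27_489, 26_656),
    "Short Term Residential":     (25_255, 21_900),
    "Total":                      (15_633, 13_642),
}
for level, (before, after) in levels.items():
    print(f"{level:<28} savings ~ ${before - after:,}")
# Small $1-$2 differences from the slide (e.g., $5,681 vs. $5,682) reflect
# rounding in the published per-level figures.
```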
EBPs like A-CRA Cost More but Produce Greater Savings Too
[Figure: cost and savings comparison by level of care and EBP]
a Includes the cost of treatment
b Year before intake minus year after intake (which includes treatment)
Impact of Reclaiming Futures Infrastructure Enhancements to Juvenile Treatment Drug Court on Cost of Crime to Society
[Figure: cost of crime to society for RF-JTDC vs. JTDC]
a RF-JTDC is significantly lower at follow-up than JTDC.
Source: Dennis et al., 2012
Other Evaluation Training Resources
• ACYF's The Program Manager's Guide to Evaluation http://www.acf.hhs.gov/programs/opre/research/project/the-program-managers-guide-to-evaluation
• American Evaluation Association http://www.eval.org/
• BJA's Program Evaluation Manual https://www.bja.gov/evaluation/guide/bja-guide-programevaluation.pdf
• CDC's resource page on program evaluation and logic model development http://www.cdc.gov/eval/resources/index.htm
• CSAP Pathways Course Evaluation 101 http://pathwayscourses.samhsa.gov/eval102/eval102_1_pg2.htm
• Evaluator's Institute http://tei.gwu.edu/
• GAO's Designing Evaluations http://www.gao.gov/products/GAO-12-208G
• GAIN Program Management and Evaluation Training (PMET) http://www.gaincc.org/productsservices/training/gain-program-management-and-evaluation-training/
• NIAAA's State-of-the-art methodologies in alcohol-related health services research http://onlinelibrary.wiley.com/doi/10.1111/add.2000.95.issue-11s3/issuetoc
• NIDA's Blue Ribbon Task Force on Health Services Research www.drugabuse.gov/sites/default/files/files/HSRReport.pdf
• NSF's User Friendly Handbook http://www.nsf.gov/pubs/2002/nsf02057/start.htm
• SAMHSA Center for Behavioral Health Statistics and Quality (CBHSQ) national data sets with information on need http://www.samhsa.gov/data/
• SAMHSA NREPP's Non-Researcher's Guide to Evidence-Based Program Evaluation http://nrepp.samhsa.gov/Courses/ProgramEvaluation/NREPP_0401_0010.html
• SAMHSA TIP 14: State Outcomes-Monitoring Systems for Alcohol and Other Drug Abuse Treatment http://store.samhsa.gov/product/TIP-14-State-OutcomesMonitoring-Systems-for-Alcohol-and-Other-Drug-Abuse-Treatment/BKD162
Key Points for Breakout
• For each area of required activity and the allowable ones you have chosen to target:
  – What is your rationale/evidence of need?
  – What activity will address it?
  – What are the expected outcomes of doing this?
  – How will you document and monitor implementation in real time?
• Prioritize the areas in terms of those that:
  – Are your focus or key for sustainability
  – Are likely to be difficult or require help
• What needs to be done to finalize the plan before it is submitted later in March?