The Evidence-based Practice of Applied Behavior Analysis

Ronnie Detrich
Wing Institute
Acknowledgments
Significant contributions from:
Tim Slocum, Utah State University
Susan Wilczynski, Ball State University
Trina Spencer, Northern Arizona University
Katie Wolfe, University of South Carolina
Teri Lewis, Oregon PBIS Network
Goals for Today
Have a conversation about a comprehensive view of evidence-based practice.
Discussion Questions
• What does the term evidence-based practice mean
to you?
• What counts as evidence?
• What is a practice?
The Research to Practice Gap and Evidence-based Practice
• Long-standing concern that decisions about
interventions are influenced by sources other than
evidence.
 In medicine, it has been estimated that 10-25% of
decisions are based on high-quality evidence.
• If evidence is not informing decisions, then the question
becomes what is:
 appeals to philosophy
 anecdotes
 opinions of experts
Evidence-based Practice and the Research-Practice Gap
• Across disciplines, great concern about the
discrepancy between what is known from research
about effective treatments and the interventions
practitioners routinely employ.
• EBP is basis for closing the gap.
Why Do We Need Evidence-based Practice?
Figure: 550 named interventions for children and adolescents (Kazdin, 2000); of these, the empirically evaluated interventions are largely behavioral and cognitive-behavioral.
Evidence-based interventions are less likely to be used than interventions for
which there is no evidence or there is evidence about lack of impact.
Research to Practice Issues
• The lag time from efficacy research to
effectiveness research to dissemination is 10-20 years. (Hoagwood, Burns & Weisz, 2002)
• Only 4 of 10 Blueprint Violence Prevention
programs had the capacity to disseminate to
10+ sites in a year. (Elliott & Mihalic, 2004)
How Does EBP Narrow the Research to Practice
Gap?
• Practitioners must make decisions “now” even when
research evidence is absent or incomplete.
 What is to be the basis for decisions?
• Decisions informed by best available evidence allow
practitioners to:
 Select
 Adapt to fit local circumstances
 Modify on the basis of progress monitoring data
• These decisions require professional judgment.
Evidence-based Practice as Federal Policy
“Where evidence is strong, we should act on it. Where
evidence is suggestive, we should consider it. Where
evidence is weak, we should build the knowledge to
support better decisions in the future.”
(Zients, 2012)
Roots of Evidence-based Practice
• Evidence-based practice has its roots in medicine.
 Movement has spread across major disciplines in human
services.
 Psychology
 School Psychology
 Social Work
 Speech Pathology
 Occupational Therapy
Roots of Evidence-based Practice
• Professional organizations began validating
interventions as evidence-based.
 Mid 1990’s
 Society for the Study of School Psychology
 American Psychological Association
 More recently
 What Works Clearinghouse (Institute for Education Science)
 Campbell Collaboration
 Coalition for Evidence-based Policy
 National Autism Center
Roots of Evidence-based
• Most professional organizations have ethical
guidelines emphasizing services are based on
scientific knowledge.
Ethical Basis for Evidence-based Practice
• American Psychological Association
 Psychologists’ work is based on the established scientific
and professional knowledge of the discipline.
Ethical Basis for Evidence-based Practice
• National Association of School Psychologists
 School psychologists use assessment techniques,
counseling and therapy procedures, consultation
techniques, and other direct and indirect service methods
that the profession considers to be responsible, research-based practice.
Ethical Basis for Evidence-based Practice
• The Behavior Analyst Certification Board
 The behavior analyst always has the responsibility to
recommend scientifically supported, most effective
treatment procedures. Effective treatment procedures
have been validated as having both long-term and short-term benefits to clients and society.
What is Evidence-based Practice?
• At its core, the EBP movement is a consumer-protection
movement.
 It is not about science per se.
 It is a policy to use science for the benefit of consumers.
 “The ultimate goal of the ‘evidence-based movement’ is to
make better use of research findings in typical service
settings, to benefit consumers and society….”
(Fixsen, 2008)
Evidence-based Practice as a Decision-making
Framework
What Is Evidence-based Practice?
Figure: EBP as the intersection of best available evidence, professional judgment, and client values (Sackett et al., 2000).
• EBP is a decision-making approach that places
emphasis on evidence to:
 guide decisions about which interventions to use;
 evaluate the effects of an intervention.
Two Ways of Thinking about EBP
• An intervention found to have strong research
support. (Cook, Tankersley, & Landrum, 2009)
• Decision making process that informs all professional
decisions. (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000)
One Term, Two Meanings = Terminological Confusion
• Using the same term (EBP) to describe two different
constructs creates confusion:
 Empirically supported treatments (ESTs) = interventions that
meet defined standards of quality and quantity.
 Evidence-based practice = a decision-making process.
Limitations of Lists of EBP
Benefits
• Source for effective interventions.
Limitations
• Must define what counts as a practice.
 Manualized multi-component interventions?
 For many clients there are no validated packages.
 In such instances, evidence cannot be a guide.
 Micro-practices such as social reinforcement?
 So broad that there is too much evidence, and much of it is not
likely to be relevant to the current clinical problem.
 Must be operationalized to be meaningful.
• Any appraisal of evidence must begin with a clear
definition of the unit of analysis.
 Otherwise we are comparing unequal units.
 It is impossible to make coherent statements about the evidence base.
Advantages of Decision-making Framework
• Supports a cogent and transparent description of
 Evidence considered, including direct and frequent
measurement of client behavior
 Why this evidence was identified as “best available.”
 How client values influenced the decision-making process.
 The ways in which clinical expertise was used to
conceptualize the case and integrate the various
considerations.
• Provides a common framework for collaborating with
professionals from other disciplines.
Evidence-based Practice and Applied Behavior
Analysis
• EBP is framework for guiding decisions of practitioners.
• Decisions are based on the integration of:
 Best available evidence
 Client values and context
 Professional judgment
• Consistent with foundational principles of applied
behavior analysis:
 Data-based decision making
 Consideration of client values
 Considerations of contextual fit
 Commitment to research-based treatment
Evidence-based Practice as Decision-making
Framework
• EBP is an effort to improve decision-making in
applied settings:
 by explicitly articulating the central role of evidence in
these decisions thereby improving outcomes.
The Challenge For Practitioners
• Solve a specific problem for a specific client in a
specific context.
 Research base will be more or less relevant.
 Research base can vary from abundant to nonexistent.
The Practical Problem
• Even with insufficient evidence, decisions must be
made.
 What are practitioners to do?
 Make the best possible inferences from imperfect evidence?
(While recognizing that uncertainty is inevitable.)
 Make decisions without relying on evidence?
Evidence-based Practice as Framework for
Decision-making
If evidence-based practice is to be a pervasive model
for decision making…
then we need ways to identify and integrate the best
available evidence when the evidence is imperfect.
Dilemma for Practitioners
• Practitioners must make many decisions daily.
• EBP assumes there is evidence for all decisions and
that the relevant evidence is accessible.
• How do practitioners incorporate evidence into all
decisions?
THE CHALLENGE OF
BEST AVAILABLE EVIDENCE
Best Available Evidence
• What do we mean by “best”?
 Quality: research methods and outcomes
 Relevance: close match with our applied question in terms
of:
 Participants
 Treatment
 Outcomes
 Context
 Amount: number of participants, studies, investigators
Best Available Evidence
• What do we mean by “best available”?
 We should use the best of what is available.
 This may mean using extremely high-quality evidence,
 or it may mean using evidence that is less certain (a scoring sketch follows below).
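The three dimensions above (quality, relevance, amount) can be read as a simple ranking problem. A minimal Python sketch, assuming hypothetical 0-1 scores a reviewer might assign; the Study fields, the equal weighting, and the example entries are illustrative, not part of any published standard.

from dataclasses import dataclass

@dataclass
class Study:
    name: str
    quality: float    # 0-1: rigor of the design and measurement
    relevance: float  # 0-1: match on participants, treatment, outcomes, context
    n: int            # number of participants (a rough proxy for amount)

def best_available(studies, quality_floor=0.0):
    # "Best available" is relative to what exists, so by default nothing is excluded;
    # raising the floor trades availability for certainty.
    usable = [s for s in studies if s.quality >= quality_floor]
    # Weight quality and relevance equally; amount of evidence breaks ties.
    return sorted(usable, key=lambda s: (s.quality * s.relevance, s.n), reverse=True)

evidence = [
    Study("RCT, different population", quality=0.9, relevance=0.4, n=120),
    Study("Single-case replication, same problem", quality=0.7, relevance=0.9, n=6),
    Study("Uncontrolled report, same context", quality=0.3, relevance=0.95, n=1),
]

for study in best_available(evidence):
    print(study.name)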
“Unlimited skepticism is equally the child of
imbecility as implicit credulity.”
Dugald Stewart
Goal
• The best available evidence should pervade the
practice of Applied Behavior Analysis.
 What is the “Best Available Evidence” for ABA practice?
 How do we systematically identify it?
Best available evidence
Figure: evidence plotted by quality (low to high) and relevance to the case (low to high, across participants, treatment, outcomes, and context). The best evidence sits in the high-quality, high-relevance corner.
Best available evidence
Figure: in the same quality × relevance space, empirically supported treatments occupy the high-quality, high-relevance corner; high-quality but less-relevant evidence requires generalization (uncertainty); highly relevant but lower-quality evidence has uncertain validity.
Best available evidence: the unit of analysis dilemma
Figure series: how specifically the unit is defined changes the conclusion about the evidence.
• Highly specific (these students, this form of treatment, this target outcome, this context): conclusion is that there is no reliable evidence.
• Broader: there is some reliable evidence, but relevance is a little less clear.
• Very broad: there is plentiful reliable evidence, but relevance is unclear.
Example: “Is Reading Mastery 1 an effective reading program for beginning readers?” versus “Are Direct Instruction reading programs effective for beginning readers?”
• Reading Mastery 1: little research available specific to Reading Mastery 1.
• Reading Mastery 1-3: more research available.
• All DI reading curricula (e.g., Reading Mastery, Soar to Success, Rewards): even more research available.
Expansion has changed the initial question.
Continua of Evidence
Figure (Janet Twyman, 2007): evidence varies along two continua, each with a threshold of evidence.
• Quantity of the evidence: from a single study and various investigations, through single case replication (direct and parametric) and repeated systematic measures, to convergent evidence and meta-analysis (systematic review).
• Quality of the evidence: from personal observation, general consensus, and expert opinion, through uncontrolled studies, well-conducted clinical studies, semi-randomized trials, and single case designs, to the high-quality randomized controlled trial (the current “gold standard”).
Accessing the Best Available Evidence
1. Systematic reviews – The foundation
 Identifies empirically supported treatments
2. Alternative types of review
 Improve our ability to glean recommendations from imperfect
literature
3. Other units of practice - beyond the package
 Using what we know about intervention elements and principles
4. Progress Monitoring
 The best evidence about what works for this particular case (a decision-rule sketch follows this list)
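Item 4 above, progress monitoring, amounts to a data-based decision rule applied to the individual case. A minimal sketch, assuming a hypothetical linear aim line and a "three consecutive points below the aim line" rule; both the rule and the numbers are illustrative rather than drawn from any published standard.

def aim_line(baseline, goal, total_sessions):
    # Expected value at each session if progress toward the goal were linear.
    step = (goal - baseline) / (total_sessions - 1)
    return [baseline + step * i for i in range(total_sessions)]

def decision(observed, expected, window=3):
    # Continue if recent data meet or beat the aim line; otherwise modify.
    recent = list(zip(observed, expected))[-window:]
    if len(recent) < window:
        return "keep collecting data"
    below = sum(1 for obs, exp in recent if obs < exp)
    return "modify intervention" if below == window else "continue intervention"

# Example: correct responses per session, aiming for 20 by session 10.
expected = aim_line(baseline=4, goal=20, total_sessions=10)
observed = [4, 5, 7, 8, 8, 9]          # first six sessions of data
print(decision(observed, expected))    # -> modify intervention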
1. SYSTEMATIC REVIEWS
What Counts as Evidence?
• The term “evidence-based” has become ubiquitous
in the last decade.
 There is no consensus about what it means.
 At issue is what counts as evidence.
 Federal definition places emphasis on experimental
methods.
 Preference for randomized trials.
 Definition has been criticized as being positivistic.
What Counts as Evidence?
• Ultimately depends on the question being asked.
 Even behavior analysis allows for qualitative evidence
(social validity measures).
• In evidence-based practice the goal is to identify
causal relations between interventions and
outcomes.
 Experimental methods do this best.
What Counts as Evidence?
• Even if we accept causal demonstrations to be
evidence, we have no consensus.
 Randomized Clinical Trials have become the “gold
standard.”
 There is controversy about the status of single participant
designs.
 Most frequently criticized on the basis of external validity.
 What Works Clearinghouse has established standards for single
participant designs.
Identifying Empirically-supported Interventions
• Identification is more than finding a study to support
an intervention.
• Identification involves distilling a body of knowledge
to determine the strength of evidence.
Systematic Reviews
• Systematic EBP review (e.g., WWC, BEE, NAC)
 Establish standards for:
 Identifying research base
o Participants
o Interventions
o Comparisons
o Outcomes
o Settings
 Quality of evidence
 Quantity of evidence
 Unit typically limited to “programs” (treatment packages)
Identifying Empirically-supported Interventions
 There are no agreed upon standards.
 It is possible for an intervention to be evidence-based using one
set of standards and to fail to meet evidence standards using an
alternative set.
• Two approaches to establishing standards
 Threshold approach:
 Evidence must be of a specific quantity and quality before
intervention is considered evidence-based.
 Hierarchy of evidence approach:
 The best available evidence approach; strength of evidence falls along
a continuum, with each level having differential standards (both approaches are sketched below).
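The contrast between the two approaches can be made concrete as two ways of appraising the same body of evidence. A minimal sketch with hypothetical cutoffs and tier labels; these are illustrative and not drawn from WWC, NAC, or any other review standard.

def threshold_appraisal(high_quality_studies, min_studies=2):
    # Threshold approach: "evidence-based" only if quantity/quality criteria are met.
    return "evidence-based" if high_quality_studies >= min_studies else "not evidence-based"

def hierarchy_appraisal(high_quality_studies, moderate_quality_studies):
    # Hierarchy approach: every body of evidence lands somewhere on a continuum.
    if high_quality_studies >= 2:
        return "strong evidence"
    if high_quality_studies == 1 or moderate_quality_studies >= 2:
        return "moderate evidence"
    if moderate_quality_studies >= 1:
        return "weak evidence"
    return "no evidence identified"

# One high-quality study plus three moderate ones:
print(threshold_appraisal(1))        # -> not evidence-based (all-or-nothing verdict)
print(hierarchy_appraisal(1, 3))     # -> moderate evidence (graded verdict)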
Assessed vs. actual effectiveness of an empirically supported intervention:
• Assessed effective, actually effective: true positive.
• Assessed effective, actually ineffective: false positive (most likely with the hierarchy approach).
• Assessed ineffective, actually effective: false negative (most likely with the threshold approach).
• Assessed ineffective, actually ineffective: true negative.
Choosing Between False Positives and False Negatives
• At this stage of developing empirically-supported
interventions, it is better to have more false
positives than false negatives.
 With false negatives, actually effective interventions will not be
selected for implementation; as a consequence, we are less likely
to ever determine that they are effective.
 With false positives, progress monitoring will identify interventions
that are not effective (see the tally sketch below).
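The outcomes in the table above can be tallied once an intervention's actual effectiveness is eventually observed, in practice through progress monitoring. A minimal sketch with hypothetical verdicts.

from collections import Counter

def classify(assessed_effective, actually_effective):
    # Map a review verdict plus the eventual outcome onto the 2x2 table.
    if assessed_effective and actually_effective:
        return "true positive"
    if assessed_effective and not actually_effective:
        return "false positive"   # progress monitoring can catch these
    if actually_effective:
        return "false negative"   # never implemented, so rarely discovered
    return "true negative"

# Hypothetical (assessed, actual) pairs for a handful of interventions:
verdicts = [(True, True), (True, False), (False, True), (False, False), (True, True)]
print(Counter(classify(a, b) for a, b in verdicts))
# Counter({'true positive': 2, 'false positive': 1, 'false negative': 1, 'true negative': 1})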
“Best Available Evidence”
Systematic Reviews
Strengths
• Transparency & objectivity of process
• Rigorous review methods
• Reduced risk of false positives
• Rely on very high quality evidence
Limitations
• Often fail to identify any problem-relevant intervention
• Unit = packages
• Not informed by lower quality evidence
• Higher risk of false negatives
“Best Available Evidence”
Narrative Reviews
Strengths
• Allows for broad
generalization from
specific studies to
implications for
practice.
• Can incorporate many
sources of evidence and
logic/theory
Limitations
• Lack of transparency
regarding:
 Identifying relevant
research base.
 Selecting practices.
 Selecting experts.
• No strength of evidence
rating
“Best Available Evidence”
Best Practice Panels
Strengths
• May include diverse
perspectives:
 Researcher
 Practitioner
 Consumer
• Allows for broad
recommendations
• Tend to include factors
other than research
Limitations
• Diversity may make
consensus difficult
• Process may be more
political than scientific
• Standards for identifying
“best practice” may be
unclear
• May lack transparency at
all levels
• Tend to include factors
other than research
“Best Available Evidence”
What Works Practice Guides
Strengths
• Allows for expert
interpretation of
research
• Interpretation linked to
systematic review
• Includes “level of
evidence” ratings –
rating system is
transparent.
Limitations
• Requires greater
generalization from
specific research
• Depends on particular
experts employed
• Process of generating
recommendations is not
transparent
“Best Available Evidence”
What Works Practice Guides
14 Practice Guides, 78 Total Recommendations
Level of support (percent of recommendations):
• Minimal: 45%
• Moderate: 33%
• Strong: 22%
Other Sources Of Evidence
Practice Elements/Kernels
• Interventions are comprised of component parts.
 Many interventions for a similar problem contain the same
elements (e.g., praise is common to interventions for non-compliance).
 Practice element = commonly occurring component of
interventions for a specific problem.
 Determined by frequency count (see the sketch below).
 Kernels = components of multi-component interventions.
 Must be experimentally verified as effective.
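Because practice elements are determined by frequency count, the identification step itself is easy to sketch. A minimal example, assuming hypothetical treatment packages for non-compliance listed as sets of components; the package names, components, and the "appears in at least two packages" cutoff are all illustrative.

from collections import Counter

# Hypothetical treatment packages for non-compliance, broken into components.
packages = {
    "Package A": {"praise", "time-out", "effective instructions"},
    "Package B": {"praise", "token economy", "effective instructions"},
    "Package C": {"praise", "response cost", "planned ignoring"},
}

# Practice element = component that recurs across packages for this problem.
counts = Counter(part for parts in packages.values() for part in parts)
elements = [component for component, n in counts.most_common() if n >= 2]

print(counts["praise"])   # -> 3 (appears in every package)
print(elements)           # -> ['praise', 'effective instructions']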
Practice Elements/Kernels and Uncertainty
• Elements/kernels have never been evaluated in the
current intervention.
 How do the current elements interact with each other?
 What is the proper dosage of each element?
Principles of Behavior and Instruction
• Fundamental units of our science.
• Have resulted in significant improvement for many
social problems.
Principles and Uncertainty
• Principles are broad statements, decontextualized from
any specific context.
• For intervention, principles must be applied in a
specific way in a specific context.
• The effectiveness of principles-based intervention is
uncertain because the existing literature may not be
relevant to the current situation.
Client Values and Context
• 4.02 Involving Clients in Planning and Consent.
Behavior analysts involve the client in the planning of and consent for behavior-change programs.
• 4.03 Individualized Behavior-Change Programs.
(a) Behavior analysts must tailor behavior-change programs to the unique
behaviors, environmental variables, assessment results, and goals of each
client.
• 4.06 Describing Conditions for Behavior-Change Program Success.
Behavior analysts describe to the client the environmental conditions that are
necessary for the behavior-change program to be effective.
• 4.07 Environmental Conditions that Interfere with Implementation.
(a) If environmental conditions prevent implementation of a behavior-change
program, behavior analysts recommend that other professional assistance
(e.g., assessment, consultation or therapeutic intervention by other
professionals) be sought.
(b) If environmental conditions hinder implementation of the behavior-change
program, behavior analysts seek to eliminate the environmental constraints,
or identify in writing the obstacles to doing so.
Client Values
• Goals for intervention
• Acceptability of interventions
• Evaluation of impact of intervention
All reflect dimensions of social validity
Social Validity to Guide Intervention
Professionals’ Notions
• Families most interested in
children displaying specific
developmental skills
associated with routines.
• Focus on communication
skills.
Parents’ Preferences
• Completing routines in a
timely manner.
Strain, Barton, & Dunlap, 2012
Basis for Choosing Treatment
Szatmari (2004)
Treatment | Values | Evidence
Do Nothing | Unethical (clinical paralysis) | None
Toss a Coin | Unethical in light of evidence | None
Training | Outdated / Current | None / Lots
Etiology | Difficult | Limited
ABA | “Not very humane” | Lots (effective)
Developmental/social-cognitive | “Love it” | Not yet
To be Ethical: Inform Parents of Options
Context
2.09
(c) In those instances where more than one
scientifically supported treatment has been established,
additional factors may be considered in selecting
interventions, including, but not limited to, efficiency
and cost-effectiveness, risks and side-effects of the
interventions, client preference, and practitioner
experience and training.
Contextual Fit
• Not all settings can support all interventions.
• Ability to implement is a function of:
 Training
 Resources
 Acceptability of intervention.
 Environmental constraints.
• Making interventions a good contextual fit increases
the likelihood of high-quality implementation.
 Environments can be assessed for their current practices
and routines.
Clinical Expertise/Professional Judgment
Questions
• How many decisions do you make in a day?
• Of those decisions, how often do you consult the
research literature?
• If not consulting the research literature, what is the basis
for the decision?
The Clinical Problem
• Practitioners must make many decisions every day
about services for clients.
• There is an ethical responsibility to make decisions in
a way that most likely improves outcomes for clients.
• What is to be the basis for those decisions?
 Expertise is necessary and inevitable.
Definition of Evidence-based Practice
• Evidence-based practice:
 a framework for decision making
 designed to improve outcomes for clients
• Evidence-based practice is the integration of:
 best available evidence,
 clinical expertise,
 client values and context,
as a basis for decision-making.
Figure: the clinical decision emerges from the integration of best available evidence, client values & context, and clinical expertise.
Clinical Expertise
• To date, most of the attention has been given to best
available evidence.
• Goal today:
 understand what clinical expertise is,
 the necessity and inevitability of it,
 and the limitations of it.
What is Clinical Expertise?
Clinical expertise: competence attained by
psychologists through education, training, and
experience that results in effective practice.
(APA Task Force, 2006)
Why Clinical Expertise is Necessary
“Evidence doesn’t make decisions, people do.”
(Haynes, Devereaux, Guyatt, 2002)
“The formalized experience of science, added to the
practical experience of the individual in a complex set
of circumstances, offers the best basis for effective
action.”
Skinner (S&HB, 1953)
Why Clinical Expertise is Necessary
“When we do not know, we guess. Science does not
eliminate guessing, but by narrowing the field of
alternative courses of action it helps us to guess
more effectively.”
Skinner (S&HB, 1953)
Clinical expertise is not the enemy;
It is the means by which evidence contacts clients.
Why is Clinical Expertise Inevitable?
• All clinical problems are contextual.
“Clinical expertise is used to integrate the best research
evidence with clinical data (e.g., information about
the patient obtained over the course of treatment) in
the context of the patient’s characteristics and
preferences to deliver services that have a high
probability of achieving the goals of treatment.”
APA Task Force, 2006
Why Clinical Expertise is Inevitable
“In those instances where more than one scientifically
supported treatment has been established, additional
factors may be considered in selecting interventions,
including, but not limited to, efficiency and cost-effectiveness, risks and side-effects of the
interventions, client preference, and practitioner
experience and training.”
BACB Professional and Ethical Compliance Code 2.09c
Why Clinical Expertise is Inevitable
“Clients have the right to effective treatment (i.e.,
based on the research literature and adapted to the
individual client).”
BACB Professional and Ethical Compliance Code 2.09a
Why Clinical Expertise is Inevitable
“The type of assessment used is determined by clients’
needs and consent, environmental parameters, and
other contextual variables.”
BACB Professional and Ethical Compliance Code 3.01a
Why Is Clinical Expertise Inevitable?
• Practitioners always work under conditions of
uncertainty.
 No outcomes are certain.
“The application of research evidence to a given
patient always involves probabilistic inferences.”
(APA Task Force, 2006)
Components of Clinical Expertise
• Ethical practice
• Knowledge of the research literature and its
applicability to particular clients
• Incorporation of the conceptual system of ABA
• Breadth and depth of clinical and interpersonal
skills
• Integration of client values and context
• Recognition of the need for outside consultation
• Data-based decision making
• Ongoing professional development
Slocum, et al., (2014)
Limits to Clinical Expertise
• Humans are flawed decision makers.
“Whenever psychologists involved in research or
practice move from observations to inferences
and generalizations, there are inherent risks of
idiosyncratic interpretations, overgeneralizations,
confirmatory biases, and similar errors in judgment.”
APA Task Force, 2006
• “Biases” are efficient and often correct.
 Trouble comes when they are not critically examined.
Variables Influencing Clinical Expertise
Variables That Influence Clinical Expertise
Clinician History
• History of reinforcement and punishment for clinical
behaviors
• Professional values (i.e., outcomes that function as
reinforcers)
Experience alone is not sufficient for establishing clinical
expertise (Dawes, 1994; Tracey et al., 2014)
Deliberate practice (Ericsson, 2006)
Availability of accurate feedback (Shanteau, 1992)
Variables That Influence Clinical Expertise
Client Outcomes
Client outcomes may reinforce or punish particular clinical
behaviors
 e.g., relying on parental report to identify reinforcers vs.
conducting a preference assessment
But, clinicians may vary in sensitivity to client outcomes as
consequences
 Personal history, clinician history, rule-following (Hayes et al.,
1986)
And, consequences are not optimal for shaping behavior
 Client outcomes are often delayed
 Often not a direct, causal relation between one clinician behavior
and a client outcome
Variables That Influence Clinical Expertise
Organizational Context
• Establishes motivating operations
 What outcomes are reinforcing?
• Sets rules and contingencies that can support or hinder
clinical expertise and ethical practice
 “One size fits all” assessments and curricula
 Number of clients on caseload
 Available resources, access to research
 Use of decision-making flowcharts
 Data shares and feedback on decision-making
Geiger, Carr, and LeBlanc (2000)
Variables That Influence Clinical Expertise
Personal History
• History of reinforcement and punishment that shapes
personal values (i.e., reinforcing outcomes)
• Influences receptiveness to principles of behavior and ethical
standards of ABA
“Spare the rod, spoil the child”
“You can catch more flies with
honey than vinegar”
4.08a: Behavior analysts recommend reinforcement rather than
punishment whenever possible.
Professional and Ethical Compliance Code for Behavior Analysts (2015)
Variables That Influence Clinical Expertise
Training Program
• Preservice Training
 Coursework
 Quality and quantity
 Rules (e.g., definitions and prescriptive recommendations)
 Contingencies (case-based learning, decision-making practice)
 Supervised Experience
 Implementation of assessments and interventions
 Repeated application of decision-making in context
 Explicit feedback
• Continuing Education
 Contact with research and evolving ethical standards
 *Most do not have elements of deliberate practice
Variables That Influence Clinical Expertise
Behavior Analyst Certification Board
Direct and indirect influences on clinical expertise at
various levels
• Content standards: Task list, exam
• Approval of course sequences and university
supervision practica
• Ethical and disciplinary standards
• Supervision standards
• Continuing education requirements
Variables That Influence Clinical Expertise
State and National Organizations
• Additional content standards and ethical guidelines
for personnel preparation
• Conferences for ongoing professional development
 *Most do not have elements of deliberate practice
• Advocacy
Variables That Influence Clinical Expertise
Laws, Policies & Insurance Requirements
May constrain, or set the occasion for, organizations to
support clinical expertise
 Funding and reimbursement rates
 Mandated or prohibited assessments and practices
 Number of hours prescribed/reimbursed
Developing Clinical Expertise
Developing Clinical Expertise
• Minimum of 10 years of intense practice to develop
expert performance.
• Deliberate practice:
 Immediate informative feedback
 Knowledge of results
 Repeatedly perform the same or similar tasks
(Ericsson, Krampe, & Tesch-Romer, 1993)
Developing Clinical Expertise
• Create an organizational culture that supports decisions
incorporating best available evidence and client values and
context
• Create opportunities for deliberate practice in
 Specific clinical skills
 Decision-making
• Training programs/organizational contexts
 Set the occasion for explicit decision-making (“think-aloud”)
 Give immediate feedback
 Evaluate effects of the decision
 Incorporate decision aids
Developing Clinical Expertise
• BACB
 Recent changes: Task list update, ethics course, ethics
CEUs, enhanced standards for supervision
 Possible considerations: Incorporate deliberate practice
• Laws and policies
 Provide adequate reimbursement to enable reasonable
caseloads
 Require person-centered planning or family priorities in
treatment plans for reimbursement
Putting It All Together
Evidence-based Practice of ABA:
Decision-making Framework
• Decisions are based on the integration of:
 Best available evidence
 Client values and context
 Professional judgment
Spencer, Detrich, & Slocum, 2012
Professional Judgment
Figure: within professional judgment, the problem to be solved is defined in light of client values and context; evidence is drawn from empirically supported treatments, other literature reviews, and kernels, elements, and principles; progress monitoring surrounds every decision.
The End
Can We Get There From Here?
Sidman, The Behavior Analyst, 2006:
“To make the general contributions of which our
science is capable, behavior analysts will have to use
methods of wider generality, in the sense they affect
many people at the same time- or within a short time,
without our being concerned about any particular
members of the relevant population.”
Systematic Reviews
Strengths
• Reduced bias
 Transparency
 Objectivity
 Rigorous methods
• Reduced risk of false
positives
 Exclusive reliance on
high quality evidence
Limitations
• Often fail to identify
sufficient evidence
• Not informed by lower
quality evidence
• Higher risk of false negatives.
Similarities and Differences Between Behavior
Analysis and Evidence-based Practice
Shared: data-based decision making; the assumption that science produces the best outcomes for consumers.
Evidence-based Practice: unit of analysis is populations; evidence is derived from systematic reviews; the practitioner must know how to implement effectively.
Behavior Analysis: unit of analysis is the individual; evidence is derived from experiments; the practitioner must know the laws of behavior and how to apply them.
What Counts as Evidence?
• Most established standards for validating
interventions as evidence-based give advantage to
randomized trials for establishing the strength of the
evidence.
 Often single participant designs have no standing or
significantly lower standing than RCT.
Best available evidence
Figure (repeated): the quality × relevance space, with empirically supported treatments in the high-quality, high-relevance corner; high-quality but less-relevant evidence requires generalization (uncertainty), and highly relevant but lower-quality evidence carries uncertain validity.
Outcomes for Workshop
• Participants will be able to describe the two perspectives
on evidence-based practice.
• Participants will be able to describe different types of
evidence that can be used in decision-making.
• Participants will be able to describe why professional
judgment is necessary and list at least two potential
problems with judgment.
• Participants will be able to describe how client values
shape decisions by professionals.
• Participants will be able to describe how context can
influence decisions made by practitioners.
The practical problem
• Practitioners must often make decisions with
insufficient empirical support.
• What are they to do?
 Make the best possible inferences from imperfect
evidence?
 Make decisions without using systematic evidence?
The practical problem
• If Evidence-Based Practice of ABA is to be a pervasive
model for professional decision-making…
then we need ways to identify the best available
evidence when the evidence is imperfect.