Evidence-Based Practice in
Augmentative & Alternative
Communication: How do you do it?
What does it mean for Individuals who
use AAC?
Pammi Raghavendra, Ph.D.
Senior Lecturer, Disability & Community Inclusion, School of Health Sciences
Flinders University, Australia
Parimala.raghavendra@flinders.edu.au
ISAAC-Israel Annual National AAC Conference, Tel Aviv
8 June 2014
All day workshop
Workshop Outline
1. What is EBP?
2. What does EBP mean for individuals who use AAC and other stakeholders?
3. Steps involved in EBP – 7 steps
   1. Asking a clinically relevant & answerable question
   2. Searching for the evidence
   3. Critically appraising the evidence
   4. Collating & synthesising the evidence
   5. Implementing the evidence into practice
   6. Evaluating the use of the evidence
   7. Disseminating the EBP process & findings
4. Facilitators and barriers to EBP
5. Practical suggestions to implement EBP
6. Nature and extent of evidence in AAC
7. What can you do to add evidence to the AAC field?
Evidence Based Practice
(EBP)
What is it?
Background to Evidence-based Medicine (EBM)
• Archie Cochrane (1909 - 1988)
– The Cochrane Collaboration
• David Sackett (1934- )
– Definition of EBM, Father of EBM
Evidence-based medicine
“the conscientious, explicit and judicious
use of current best evidence in making
decisions about the care of individual
patients.”
(Sackett et al., 1997, p. 2)
What is Evidence Based
Practice?
Definition:
“the integration of best research evidence with clinical expertise and patient values.”
(Sackett et al., 2000)
What is Evidence Based
Practice in AAC?
Proposed Definition: (Schlosser & Raghavendra, 2004)
EBP is defined as the integration of best and current research
evidence with clinical/educational expertise and relevant
stakeholder perspectives to facilitate decisions for
assessment and intervention that are deemed effective and
efficient for a given direct stakeholder.
EBP in AAC
(Schlosser & Raghavendra, 2004)
[Figure: three overlapping components – Research Evidence, Clinical/Educational Expertise, and Relevant Stakeholder Perspectives – come together to inform decisions: vocabulary (organization), means to represent, means to select, means to transmit, intervention strategies, and so forth.]
What is Evidence Based
Practice? – Key concepts
• Integration = joining / synthesis of:
  – best research evidence
  – clinical expertise
  – stakeholder perspectives (patient values)
(Schlosser & Raghavendra, 2004)
Key concepts
continued…
• Best research evidence =
– Data: current, verified and replicated
– High internal validity
– Highest level on hierarchy of evidence
– Adequate external validity and social
validity
(Schlosser & Raghavendra, 2004)
Key concepts continued…
• Clinical expertise = reasoning, intuition, knowledge and skills related to clinical practice
• Educational expertise = reasoning, intuition, knowledge and skills related to educational practice
(Schlosser & Raghavendra, 2004)
Key concepts
continued…
• Relevant stakeholder perspectives/values = viewpoints,
preferences, concerns and expectations relative to the
assessment or intervention
• Patient/client = direct stakeholder, i.e., the direct recipient of any decision arising from the EBP process
(Schlosser & Raghavendra, 2004)
What EBP is not: Myths about EBP
• Myth: EBP is impossible to implement because we do not have enough evidence. Reality: best and most current research evidence is relative.
• Myth: EBP already exists. Reality: some practitioners do implement EBP.
• Myth: EBP declares research evidence the authority. Reality: the definition of EBP in medicine/AAC rests on all 3 components.
• Myth: EBP is a cost-cutting mechanism. Reality: not always true.
• Myth: EBP is cook-book practice. Reality: EBP requires not only extensive clinical expertise but also skillful integration of all 3 aspects of EBP.
• Myth: EBP is impossible to put in place. Reality: some degree of EBP is achievable by all.
(Sackett et al., 1997; Schlosser, 2004)
What does EBP mean for an individual with complex communication needs?
• Individuals with CCN (and their families) are:
  – central to EBP outcomes, promoting the adoption of effective interventions & preventing the adoption of ineffective ones
  – active participants in decision making
(ASHA, 2004)
What does EBP mean for a
practitioner/educator?
Best Practice
• Use of EBP as a framework for best practice
  – Promotes development of skills in finding, appraising and implementing evidence
• Emphasises the need for high-level clinical & communication skills
• Provides a framework for self-directed, lifelong learning
• Ethical principle: to provide the best available services (assessments & interventions) to consumers & families
• Practice-research gap
What does EBP mean for a service provider/an organisation?
Best Practice at Organisational level
e.g., What do families think of using iPads/tablet technology for communication and participation?
What are the most effective post-school options for students with CCN using AAC?
• Use of EBP as a framework for effective and efficient services
• Facilitate/support staff to implement EBP
What does EBP mean for
researchers?
• Excellent opportunity to bridge the research-practice gap
• High-priority clinical questions in diagnostics, screening, prognosis, intervention
• Conduct high-quality research
  – Disseminate research in a way that can be used in practice
How do you do EBP?
Adapted from Schlosser & Raghavendra (2003)
1. Asking the clinically relevant & answerable
question
2. Searching for the evidence
3. Critically appraising the evidence
4. Collating & synthesising the evidence
5. Implementing the evidence into practice
6. Evaluating the use of the evidence
7. Disseminating the EBP process & findings
Step 1: Ask a clinically relevant & answerable question
Broad or general questions:
• provide background information
• suit a systematic review
e.g.,
• What are the potential barriers and facilitators to high-technology AAC provision and its ongoing use?
• What are the attitudes toward individuals who use Augmentative and Alternative Communication?
• What is the effectiveness of using iPads/iPods with individuals with disabilities?
PICO
(Richardson, Wilson, Nishikawa & Hayward, 1995)
• Patient or Population/Problem
• Intervention
• Comparison or Control
• Outcome
PICO
• To clarify the questions related to specific clients
• To develop questions for Systematic Reviews
• To identify the information needed to answer the
question
• To translate the question into searchable terms
• To develop and refine your search strategy
Patient / Population/Problem
• Characteristics of the population, e.g., age, gender, diagnosis, ethnic group, etc.
• How would you describe your
Patient/Population group?
– Balance precision with brevity
Intervention
• Defining the intervention
• What intervention are you interested in
(be specific)?
• Therapy, prevention, diagnostic test,
exposure/aetiology
Comparison or Control
• What alternative or other option are you comparing your intervention/assessment to? Be specific.
• You may be comparing your intervention to another intervention, or to no intervention.
Outcome
• What measurable outcome(s) are you
interested in? What are you hoping to
achieve for the client?
– Be specific
ELEMENT: P
SUBJECT: School-aged children with cerebral palsy, complex communication needs
KEY WORDS: child, children, school aged, 12-13, primary school, transition, high school, paediatric/pediatric, cerebral palsy, physical disability, spasticity, hemiplegic, CCN, AAC, severe communication/speech impairment

ELEMENT: I
SUBJECT: Group/peer training of high school students (workshops)
KEY WORDS: peers, classmates, training, communication, support

ELEMENT: C
SUBJECT: Visit by student with CCN plus training
KEY WORDS: –

ELEMENT: O
SUBJECT: Increased acceptance, welcoming classroom, social networks
KEY WORDS: social networks, friends, friendships
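To make the link between these PICO elements and an answerable, searchable question concrete, here is a minimal Python sketch (illustrative only; the class and field names are assumptions of mine, not part of the PICO framework), populated with the example from the table above:

```python
# Illustrative sketch: hold the PICO elements in one structured record so the
# same elements can drive both the written question and the later search.
from dataclasses import dataclass, field

@dataclass
class PicoQuestion:
    population: str
    intervention: str
    comparison: str
    outcome: str
    keywords: dict = field(default_factory=dict)  # element letter -> search terms

    def as_question(self) -> str:
        return (f"For {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, "
                f"lead to {self.outcome}?")

q = PicoQuestion(
    population="school-aged children with cerebral palsy and complex communication needs",
    intervention="group/peer training of high school students (workshops)",
    comparison="a visit by the student with CCN plus training",
    outcome="increased acceptance, a welcoming classroom and wider social networks",
    keywords={"P": ["cerebral palsy", "CCN", "AAC", "child*"],
              "I": ["peers", "classmates", "training"],
              "O": ["social networks", "friendships"]},
)
print(q.as_question())
```

Keeping the keyword lists beside the question means the same record can later feed the search strategy in Step 2.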
Group Work Activity 1
Ask a clinically/service-relevant, answerable question using PICO
PESICO template
(Schlosser, Koul & Costello, 2007)
Person/Problem (P): person & problem
Environment (E): current/future environment & partners' knowledge/skills, etc.
Stakeholders (S): stakeholders' perspectives, attitudes towards the problem
Intervention (I): steps to change persons, environments, interaction, events, procedures
Comparison/Control (C): comparison between interventions, or intervention & no intervention
Outcome(s) (O): desired outcomes
An example
(Schlosser, Koul & Costello, 2007)
Person/Problem: In a 7-year-old boy with profound ID who exhibits self-injurious behaviour,
Environment: who is currently in a self-contained classroom,
Stakeholders: and whose teacher and aides suspect that his behaviour is communication based,
Intervention: is it sufficient to rely on informant-based assessment methods,
Comparison: or is it necessary to also conduct descriptive and experimental assessments,
Outcomes: in order to identify the communicative functions that maintain his problem behaviour in a valid and reliable manner?
Group Work Activity 2
Ask a clinically/service-relevant, answerable question using PESICO
Step 2: Search for the
evidence
• To start, use the PICO/PESICO
question components to identify the
search terms
Search for the best evidence
• Where to start searching depends on a number of
factors:
– Available time
– Available databases
– Subject matter and domain of the question
– Currency and level of evidence required
Where do I search? Finding the
evidence
• Courses, proceedings, books
• Journals (two million journal articles are published annually in some 20,000 ‘biomedical’ journals)
  – Secondary research – meta-analyses, systematic reviews
  – Primary research – individual research studies
• Grey literature, e.g., unpublished research, theses
• Indexes and databases
  – General, e.g., CINAHL, Medline, PubMed, ERIC
  – Specialist, e.g., Cochrane Library, DARE, PEDro
Where do I search? Finding
the evidence
Hand searches
• Table of contents
• Ancestry search – use reference lists to identify other studies
• Forward citation search – who has referred to a particular article?
Finding evidence on the Internet
• General search engines, e.g., Google
• Specialised search engines, e.g., Google Scholar
Types of Databases
• Bibliographic/General
  – e.g., Medline, PubMed
• Specialist databases
  – ERIC
  – CINAHL
  – PsycINFO
Databases – Full
Text/Specialised
• DARE – Database of Abstracts of Reviews of Effects (UK)
• Cochrane – Cochrane Database of Systematic Reviews (worldwide)
• Expanded Academic ASAP
• Science Direct
• SCOPUS
Strategies for searching
databases
• Plan the search for efficiency
• Create a search plan
• Select and access the right databases
• Develop a search statement
• Limit, refine and evaluate
• Locate the source publication
Strategies for Searching
1) look for pre-filtered evidence (e.g., EBM reviews,
Systematic Reviews, Practice guidelines)
2) look for reviews before individual studies
3) look for peer-reviewed before non-peer reviewed
(Schlosser, Wendt, Angermeier & Shetty, 2005)
Create a Search Statement
• Search commands – Boolean operators/connectors (OR, AND, NOT)
  example: labour NOT pregnancy
• Truncation – symbols used to substitute for characters at the end of a word, e.g., Ovid uses “$”
  example: child$ will give children, childlike
• Wildcards – symbols used to substitute for a letter within a word, e.g., Ovid uses #
  example: wom#n will give woman and women
• Check the Help screen for each database
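As a rough sketch of how the commands above can be combined into a single search statement (illustrative only: the helper function, the “*” truncation symbol and the sample term groups are assumptions, and the exact symbols must be adapted to each database's own syntax, as noted above):

```python
# Illustrative sketch: OR the synonyms within each concept, AND the concepts
# together, quote phrases, and pass truncated terms through as supplied.
# Adapt the operators/symbols to the target database (e.g., Ovid uses "$").

def build_search_statement(concept_groups):
    blocks = []
    for terms in concept_groups:
        formatted = []
        for term in terms:
            if " " in term:
                formatted.append(f'"{term}"')   # phrase search with quotation marks
            else:
                formatted.append(term)          # single word, possibly truncated (child*)
        blocks.append("(" + " OR ".join(formatted) + ")")
    return " AND ".join(blocks)

# Example using the P and I keywords from the PICO table earlier
population = ["cerebral palsy", "child*", "complex communication needs", "AAC"]
intervention = ["peer training", "classmates", "communication support"]
print(build_search_statement([population, intervention]))
# ("cerebral palsy" OR child* OR "complex communication needs" OR AAC)
#   AND ("peer training" OR classmates OR "communication support")
```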
Keyword Searching
• Keywords – important words that represent a topic; you have no control over how a word is used in the document
  – Quotation marks are useful for phrases, for example “intellectual disability”
• Fields – e.g., author, title, abstract, journal
• Limits – by date, by language, to full text, to abstracts, to systematic reviews
Hierarchy of Evidence
http://gollum.lib.uic.edu/applied_health/?q=node/7, University of Illinois at Chicago
Proposed hierarchy of evidence to inform intervention development & selection:
Participants with disabilities
(Schlosser & Raghavendra, 2004)
1. Meta-analysis of: a) RCTs*, b) SSEDs, c) quasi-experimental group designs
2a. One well-designed non-RCT
2b. One SSED – one intervention
2c. One SSED – multiple interventions ...
3. Quantitative reviews that are non-meta-analytic
4. Narrative reviews
5. Pre-experimental group designs & single case studies
6. Respectable opinion
Levels of Evidence
• Based on the idea that different grades of evidence (study designs) vary in their ability to predict the effectiveness of health practices
  – Reducing biases: sample, measurement/detection, intervention/performance
• Higher grades of evidence are more likely to reliably predict outcomes than lower grades
• A system for making sure that you are aware of the strengths and weaknesses of different study types
• Several evidence grading scales, e.g., Sackett's Hierarchy of Evidence, NHMRC, Cochrane
What are Systematic
Reviews?
• A synthesis of original research
• Pre-filtered evidence
An SR aims to synthesise the results of multiple original studies by using strategies to reduce bias
(Cook et al., 1997; Schlosser, 2003, cited in Schlosser et al., 2005)
Systematic Review
(Adapted from the Cochrane Database of Systematic Reviews – www.cochrane.org – and the Centre for Reviews & Dissemination – www.york.ac.uk)
Transparent process to facilitate replication:
• Pre-defined, explicit methodology is used
• Strict protocol to include as much relevant research as possible
• Original studies appraised and synthesised in a valid way
• Minimise risk of bias
Systematic Review
Meta-analysis is a mathematical synthesis of two or more primary studies that addressed the same hypothesis in the same way.
(Greenhalgh, 1997, BMJ, 315:672-675)
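As an illustration of what such a mathematical synthesis usually looks like (a standard fixed-effect, inverse-variance formulation, not taken from the slide itself), each study's effect estimate is weighted by the inverse of its variance and the weighted estimates are averaged:

\[
\hat{\theta}_{\text{pooled}} \;=\; \frac{\sum_{i=1}^{k} w_i\,\hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad w_i \;=\; \frac{1}{\mathrm{SE}(\hat{\theta}_i)^2}
\]

where \(\hat{\theta}_i\) is the effect estimate from study \(i\) of the \(k\) included studies; larger, more precise studies therefore contribute more to the pooled effect.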
Where to find systematic reviews
• N-CEP's Compendium of Guidelines and Systematic Reviews (ASHA)
• Cochrane Collaboration
• Campbell Collaboration
• What Works Clearinghouse (US Department of Education)
• Psychological Database for Brain Impairment Treatment Efficacy
• National Electronic Library for Health (National Health Service of the UK)
• Evidence-based Communication Assessment and Intervention (EBCAI) journal
• speechBITE http://speechbite.com/
Step 3: Critically appraising the evidence
What is the evidence telling me?
Validity (truth) and usefulness (clinical
relevance)
What is Critical Appraisal?
“Appraisal is a technique which offers a discipline for increasing the effectiveness of your reading, by enabling you to quickly exclude papers that are of too poor quality to inform practice, and to systematically evaluate those that pass muster to extract their salient points.”
Adapted from Miser, W. F. (1999). Critical appraisal of the literature. Journal of the American Board of Family Practice, 12, 315-333. Taken from www.shef.ac.uk
• Are the findings applicable in my setting?
• Is the quality of the study good enough to use the results?
• What do the results mean for my clients?
Difference between reading for content
vs. reading for critical appraisal
• Abstract
• Introduction (background,
literature review), aims of the
study
• Methodology
• Results
• Discussion
Type of Study
• The next step is to work out what study
design will best answer your question
• Levels of Evidence – reflect the
methodological rigour of the study
Type of Question
• Different types of questions are best answered by
different types of studies
• Your question may be:
- Intervention or therapy
- Diagnosis/screening
- Prognosis
- Aetiology or risk factors
Questions -> Research
Designs
• Therapy/intervention effectiveness … → Experimental
• Test an association between … → Observational – cohort/case-control
• Descriptive information about a relationship in one participant → Case study
• Explore what factors influenced outcomes at one point in time → Cross-sectional
• Describe experiences … → Qualitative
What do you need to look for
in studies?
• Why was the study done?
• What type of study design was used?
• What are the study characteristics (PICO/PESICO)?
• RELIABILITY – test-retest, intra-rater & inter-rater
• VALIDITY (internal validity – what biases exist?)
  – Participant selection, comparable groups at baseline, blinding, follow-up, drop-outs, outcomes, procedural reliability/treatment integrity?
  – What are the results (size and precision of the effect)?
• External validity – are the results relevant in my clinical situation?
• Social validity
Critical Appraisal Tools
Systematic Reviews
• EVIDAAC Systematic Review Scale
(Schlosser, Raghavendra, Sigafoos, Eysenbach, Blackstone, & Dowden, 2008)
• Protocol
– Source selection bias
– Trial selection, criteria for pooling
– Study quality
• Data extraction
• Statistical analysis
– Clinical Impact
Appraisal of Systematic Review
Paper
Group Work Activity 3
Barriers and facilitators to the use of high-technology augmentative
and alternative communication devices: a systematic review and
qualitative synthesis
Susan Baxter, Pam Enderby, Philippa Evans and Simon Judge,
INT J LANG COMMUN DISORD, MARCH–APRIL 2012,
VOL. 47, NO. 2, 115–129
Critical Appraisal Tools
Randomised Controlled Trials (RCTs & non-RCTs)
PEDro Scale (Moseley et al., 1999; Maher et al., 2003)
Physiotherapy Evidence Database, 1999
www.pedro.fhs.usyd.edu.au/scale_item.html
• To rate RCTs and non-RCTs; not for SRs, case series or SSEDs
• 11-item scale; each item scored 1 or 0 (based on information found in the paper)
• 1st criterion – external validity
• Items 2-11 – internal validity
• Max. 10, min. 0 (RCT = 10, non-RCT = 8)
• Rated for methodological quality – sources of bias
• A high rating does not reflect relevance to practice
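As a rough illustration of how the ratings above tally (a sketch only, not the official PEDro rating tool), item 1 is reported separately for external validity while items 2-11 sum to the 0-10 score:

```python
# Illustrative sketch of tallying PEDro-style ratings (not the official tool).
# Each of the 11 items is rated 1 (criterion clearly satisfied in the paper) or 0.
# Item 1 relates to external validity and is not counted in the total;
# items 2-11 give the 0-10 methodological quality score.

def pedro_score(item_ratings):
    if len(item_ratings) != 11 or any(r not in (0, 1) for r in item_ratings):
        raise ValueError("Expected 11 ratings, each 0 or 1")
    external_validity_met = bool(item_ratings[0])  # item 1
    total = sum(item_ratings[1:])                  # items 2-11, max 10
    return external_validity_met, total

# Hypothetical ratings for a trial report
ratings = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1]
ext_ok, score = pedro_score(ratings)
print(f"Item 1 (external validity) satisfied: {ext_ok}; PEDro score: {score}/10")
```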
Critical Appraisal Tools
McMaster Forms – developed by Law et al., 1998; Letts et al., 2007
http://www.fhs.mcmaster.ca/rehab/ebp/
Quantitative Review Form:
– Was the sample described in detail?
– Were the outcome measures reliable?
– Was the intervention described in detail?
– Were the analysis methods appropriate?
Qualitative Review Form: (ver 2.0, Letts et al., 2007)
– Was sampling done until redundancy in data was reached?
– Was the process of transforming data into themes described adequately?
– Was member checking used to verify findings?
Appraisal of a Qualitative Study
Group Work Activity 4
A Qualitative Analysis of Email Interactions of Children who use
Augmentative and Alternative Communication
ANETT SUNDQVIST and JERKER RÖNNBERG
Augmentative and Alternative Communication, December 2010 VOL. 26 (4), pp. 255–266
References
Auperin, A., Pignon, J.-P., & Poynard, T. (1997). Review article: Critical review of meta-analyses of randomized clinical trials in hepatogastroenterology. Alimentary Pharmacology & Therapeutics, 11, 215.
Maher, C. G., Sherrington, C., Herbert, R. D., Moseley, A. M., & Elkins, M. (2003). Reliability of the PEDro scale for rating methodological quality of randomised controlled trials. Physical Therapy, 83, 713-721.
Moseley, A. M., Maher, C., Herbert, R. D., & Sherrington, C. (1999). Reliability of a scale for measuring the methodological quality of clinical trials. Proceedings of the VIIth Cochrane Colloquium. Rome, Italy: Cochrane Centre, p. 39.
Richardson, W., Wilson, M., Nishikawa, J., & Hayward, R. (1995). The well-built question: A key to evidence-based decisions. ACP Journal Club, 123, A12-A13.
Schlosser, R. W., & Raghavendra, P. (2004). Evidence-based practice in augmentative and alternative communication. Augmentative and Alternative Communication, 20(1), 1-21.
Schlosser, R., Wendt, O., Angermeier, K., & Shetty, M. (2005). Searching for evidence in augmentative and alternative communication: Navigating a scattered literature. Augmentative & Alternative Communication, 21, 233-254.
Schlosser, R., & O'Neil-Pirozzi, T. (2006). Problem formulation in EBP and systematic reviews. Contemporary Issues in Communication Science and Disorders, 33, 5-10.
Schlosser, R., Koul, R., & Costello, J. (2007). Asking well-built questions for evidence-based practice in augmentative and alternative communication. Journal of Communication Disorders, 40(3), 225-238.
www.cochrane.org - Cochrane Collaboration
www.york.ac.uk - DARE