Clin Soc Work J (2014) 42:123–133
DOI 10.1007/s10615-013-0459-9
ORIGINAL PAPER
Research Evidence and Social Work Practice: The Place
of Evidence-Based Practice
James Drisko
Published online: 20 August 2013
© Springer Science+Business Media New York 2013
Abstract This article will examine what evidence-based
practice (EBP) is and is not. EBP will be defined and
distinguished from different, but related, terms and
concepts. The steps of EBP as a practice decision-making
process will be detailed and illustrated. The kinds of evidence EBP values and devalues will be discussed in several
contexts. The administrative uses of EBP and its larger
societal context will also be examined. A family therapy
based example of doing EBP in clinical practice will show
how it can be a real benefit to practice, but will also point
out some of its challenges and limitations.
Keywords Evidence-based practice · Clinical
social work practice · Practice research · Evidence
Evidence-based practice (EBP) is a complex social movement (Drisko and Grady 2012; Tanenbaum 2003). The core
of EBP is to promote the routine incorporation of the best
available research evidence into practice efforts. The
influence of this movement on various professional efforts
has been fast and wide-ranging. The rapid implementation
of EBP has also brought forth a number of challenges.
First, EBP has become so successful and is applied to so
many uses that both professionals and lay people are often
confused about the definition of EBP and how to apply it in
practice. A large-scale survey by Rubin and Parrish (2007)
documents that even social work educators are unclear
about the definition of EBP and its components. A large-scale
survey by Simmons (2013) demonstrates that social
work practitioners do not understand the core EBP practice
decision-making model. This practice decision-making
model has also become jumbled with ‘best practices’ and
‘empirically supported treatments’ that are based on very
different definitions and standards.

J. Drisko
Smith College School for Social Work, Northampton, MA 01060, USA
e-mail: jdrisko@smith.edu
Second, EBP encompasses a hierarchy of research
designs that is used to promote some research approaches
and types of evidence while simultaneously devaluing
others with long histories and considerable utility (Black
1994; Cheek 2011; Flyvberg 2011; Popay and Williams
1998; Trinder 2000). The application of this quantitative
research design hierarchy to determine and restrict funding
and academic priorities constitutes a social movement within
research, education, economics and politics (Cheek 2011;
Trinder 2000). Third, EBP has been used administratively
as a rationale for making policy and funding changes to
mental health and social work services (Tanenbaum 2003;
Trinder 2000). Finding ways to limit health care costs
while improving outcomes for clients is certainly a worthy
effort social workers would support. However, this macro
level application of EBP is very different from the core
micro level practice decision-making process that is the
most widely used and discussed application of EBP in the
health and mental health professions. Under the ‘flag’ of
EBP, public and private payers are generating lists of
approved practices that are the only interventions they will
fund for use with clients having certain diagnoses and
needs (Drisko and Grady 2012). This administrative
restriction on client choice and professional expertise
directly conflicts with two key components of EBP as a
practice decision-making process. Such administrative
restrictions on payment may also be based on very different
appraisals of outcome research quality than those published
by rigorous volunteer professional organizations.
What is Contemporary EBP?
The current model of evidence-based practice has its origins in the work of Scottish physician Dr. Archie Cochrane. Cochrane (1972) advocated for the use of
experimental outcome research to determine if treatments
were effective, benign, or harmful. Cochrane believed that
use of demonstrated effective treatments would both
improve individual patient outcomes and, in the aggregate,
reduce overall healthcare costs. In recognition of his pivotal contributions, researchers named the Cochrane Collaboration, an international organization promoting
evidence-based practice, in his honor. A group of Canadian
physicians at McMaster University coined the term ‘‘evidence-based medicine’’ (EBM) (Guyatt et al. 2008, p. xx).
This group, led by Dr. David Sackett, has refined and
promoted internationally the contemporary EBM/EBP
practice decision-making model.
The contemporary definition of EBP is simply ‘‘the
integration of the best research evidence with clinical
expertise and patient values’’ (Sackett et al. 2000, p. 1).
Haynes et al. (2002) note that the contemporary EBM/EBP
practice decision-making model actually has four parts
(See Fig. 1). These are (1) the clinical state and circumstances of the client, (2) the best available relevant research
evidence, (3) the client’s own values and preferences, and
(4) the clinical expertise of the clinician. Note carefully
that the expertise of the clinician is the ‘glue’ that combines and integrates all of the other elements in the EBP
process. Note, too, that the EBP model weighs equally the
preferences of the client, clinical expertise and the best
research evidence. Research is vital to EBP, but is just one
component of it. Social workers have widely adopted this
process-oriented definition of EBP (Drisko and Grady
2012; Gilgun 2005; Manuel et al. 2009; Wharton and
Bolland 2012).
Just what client or patient values actually mean,
however, has not yet been fully defined (Gilgun 2005).
Nor have scholars fully defined the meaning of clinical
expertise. However, the American Psychological Association (2006) states that such expertise includes standard
competencies to perform assessments, develop treatment
plans, to implement treatments and to evaluate treatments. Clinical expertise also has an interpersonal component encompassing the ability to form and maintain
therapeutically useful alliances with a range of clients,
including those from different cultures and with different
demographic backgrounds. Still, Gilgun (2005) argues
that further elaboration of the meaning of clinical
expertise in EBP is warranted so that clinicians,
researchers, agency administrators, policy makers, and
the public all share a common understanding of this
term.
Fig. 1 The contemporary evidence-based practice model from
Haynes et al. (2002)
The Varied Applications of EBP
Professionals may apply EBP to a variety of professional
endeavors. These include treatment planning, selecting
among diagnostic tests and procedures, selecting among
preventive interventions or examining the etiology or origins of disorders. EBP may also be applied to examining
the prognosis of an illness and patient survival rates over
time. At the macro level, the EBP model has been applied
to economic decision-making (Oxford Center for Evidence-based Medicine 2009, 2011). It is unclear, however,
just how patient values and preferences, and clinical
expertise are understood in the macro level applications of
EBP. Even within professional circles, macro level applications of EBP seem to stretch and perhaps challenge the
core practice-decision making model offered by the
McMaster group. Micro and macro applications of EBP
may need better differentiation. In social work, by far the
most common application of EBP is to practice decision-making in treatment planning. For this reason, micro level
treatment planning will be the focus of this paper.
What EBP is Not
EBP is often confused with empirically supported treatments (ESTs), empirically supported interventions (ESIs)
and ‘best practices’ (Simmons 2013). EBP is a practice
decision-making process involving several steps between
client and clinician. It is intentionally an interactive process
promoting client input and feedback. ESTs, in contrast,
identify treatments that have some form of research
supporting their effectiveness. That is, ESTs designate
treatments as meeting some minimal standards for
effectiveness. A working group of the American Psychological Association set standards for ESTs requiring that a
treatment have demonstrated effectiveness compared to
untreated controls or another already proven effective
treatment in at least two experimental trials. Further, the
treatment must be defined using a treatment manual, and at
least one of the studies showing the treatment is effective
must be done by persons other than the people who
developed the treatment model (Chambless and Hollon
1998). This is the technical definition of an EST. Chambless and Ollendick (2001) extended this definition to
‘empirically supported interventions’ (ESIs). They also use
the term ‘empirically validated treatments’ (EVTs) based
on the same criteria. However, the terms EST, ESI and
EVT are often used much more loosely in the wider literature and in policy advocacy. It is important to check how
any of these terms are specifically defined when reviewing
practice and policy literature. ESTs and ESIs address
specific treatments; in contrast, EBP is a practice decision-making process addressing the needs of a specific client.
‘Best practices’ has no standard definition. Best practices may refer to ESTs or EVTs, or only to an author’s
favored approach that lacks any research support. Critical
review of any claim of best practice is strongly
recommended.
The Steps of the EBP Practice Decision-Making Process
Given the limited space available for this article, an overview of the steps of EBP follows. More detail and
more resources may be found in texts like Drisko and
Grady (2012), Grinnell and Unrau (2010), Norcross et al.
(2008), Rubin (2008) or Sands and Gellis (2011).
Different authors use slightly different language to
describe the steps of the EBP process, but major distinctions among them are few. In this version, attention is paid
to using terminology that reinforces all four components of
the EBP model. The six steps of the EBP practice decision
making process are:
1. Drawing on client needs and circumstances learned in
a thorough assessment, identify answerable practice
questions and related research information needs;
2. Efficiently locate relevant research knowledge;
3. Critically appraise the quality and applicability of this
knowledge to the client’s needs and situation;
4. Discuss the research results with the client to determine how likely effective options fit with the client’s
values and goals;
5. Synthesizing the client’s clinical needs and circumstances with the relevant research, develop a shared
plan of intervention collaboratively with the client; and
6. Implement the intervention. (Drisko and Grady 2012,
p. 32)
The EBP process must be preceded by a thorough
assessment of the client’s clinical state and needs (Drisko
and Grady, in press). Clearly formulating the needs and
strengths of the client and situation will aid in applying the
EBP process. It will also be useful at the point of assessment to begin to learn the client’s values and preferences.
In this initial assessment process, considerable clinical
expertise is required. The focus of the assessment will also
be shaped by the mission and purposes of the agency (or
funding source) and the social worker’s role within it. From
the assessment, an answerable practice question is identified to begin the formal EBP process.
Sackett et al. (1997) developed a model to help clinicians frame practice questions clearly and efficiently. It is
called the P.I.C.O., or P.I.C.O.T., model. Each letter
addresses one aspect of framing a practice question in EBP.
The ‘P’ is to identify the patient or population: the ‘who’
you need to know about. The goal is to
clearly identify the characteristics of your client and the
client’s strengths. ‘I’ stands for intervention. Given this
specific client and situation, what are the key service
needs? Do you wish to know about what treatments ‘work’
for a specific diagnosis, or about possible preventative
measures? Here the goal is to clarify the kinds of interventions about which you wish to learn. This is very
important in orienting the search for information that is the
next full step of EBP. ‘C’ stands for comparison. Are there
alternative treatments or interventions that would fit the
client’s needs? This is important to help identify a range of
likely effective options to present to the client for review
and consideration. ‘O’ stands for outcomes. What specific
outcomes or goals do you and your client seek? Does the
client seek symptomatic improvement or full remission of
the disorder as a whole? Do certain symptoms or issues
(such as personal safety or risk to others) have immediate
priority? Are there important social circumstances to consider? Finally, ‘T’ stands for type of problem. It is a
reminder that EBP can be applied to questions about
diagnosis, prevention, incidence of disorders and even
economics. As noted, the most common application of EBP
in social work is to differential treatment selection and
planning. Using the P.I.C.O. model helps frame an answerable
question to guide the next step of EBP: searching for the
best available research evidence.
Step 1: Identify Answerable Practice Questions
and Research Information Needs
The P.I.C.O. framework helps clinicians develop practice
questions about a specific client and situation. A practice
question might be: ‘‘What treatments are likely to be
effective for a 10 year old girl who has Reactive Attachment Disorder and has lived for 3 months in a supportive
but new potential adoptive family after 6 foster placements?’’ This is a straightforward ‘what works’ treatment
selection question.
Answerable practice questions are not, however, always
this clear cut. Suppose the client was African-American
and the adoptive family was of a different race. Or the
client strongly valued her Roman Catholic religion but the
adoptive family was Methodist. There will likely be fewer
available studies, and any located studies are less likely to
be of high quality, as additional qualifiers are included in a
practice question. In this case, racial and religious differences do deserve close and careful attention in treatment
planning. However, the clinician may be left with little
specific guidance on the detailed practice question, but
much clearer guidance on the broader issue. It is wise to
define practice questions in detail, but also to understand
very specific questions may not, as yet, yield many quality
research results. This is why clinical expertise, client values and critical thinking are key parts of the EBP process
along with research results.
Step 2: Efficiently Locate Research Results
Using the practice question to orient a search of available
research resources, the second step of EBP is to efficiently
locate research results. While Step 1 of EBP requires strong
clinical skills, Step 2 requires expertise in literature searches. The help of a research librarian or a trained staff
member in an agency is often a real asset when conducting
EBP literature searches.
Many excellent sources of research results are available
online. The Cochrane Library (http://www.thecochranelibrary.com/view/0/index.html) and the Campbell Collaboration Library (www.campbellcollaboration.org/library.php)
are international volunteer organizations that assess
and summarize practice research. The Cochrane Library
offers systematic reviews of high quality, quantitative,
medical and psychiatric research organized by DSM and
ICD criteria. Depression, anxiety or reactive attachment
disorder would be appropriate to search in the Cochrane
Libraries. The Campbell Library focuses on social service,
educational and criminal justice programs. The Campbell
Library includes research on psychosocial treatments like
family therapy and mentoring programs, as well as research
on issues like juvenile delinquency and substance abuse.
Another useful online resource is the US government’s
treatment guidelines website (www.guidelines.gov). This
site, like the Cochrane Libraries, is organized by disorder,
but offers more specific guidance for doing practice. Yet
another useful site is the US government’s SAMHSA
Registry of Evidence-based Programs and Practices (www.
nrepp.samhsa.gov/). Note carefully, however, that the
standards for evidence review used in these US government
web sites differ from the more rigorous international
standards developed by the Cochrane and Campbell
Collaborations.
Unfortunately, resources to answer practice questions
are not always available in a summarized form. In such
cases, clinicians must look for individual research articles
that address their practice information needs. The US
government’s PubMed website is the largest source of
abstracts for medical and psychiatric research articles
(www.ncbi.nlm.nih.gov/pubmed). Many public libraries
offer access to online and print databases that are useful to
practice searches for articles. These may include JSTOR for
social work articles, and databases like the Expanded
Academic ASAP. All agencies that seek to deliver EBP
should make several databases available to their professional staff members. Additional resources can be located
in EBP texts or through online searches.
Newcomers to EBP are often surprised by the amount of
helpful research that is available. Many quality outcome
research summaries on high incidence disorders are widely
available. On the other hand, newcomers are also struck by
how many disorders and social issues have not yet been
systematically researched and summarized. Many topics of
interest to clinical social workers are in need of both more
exploratory research and more experimental outcome
research: not all literature searches will reveal many
helpful resources.
Step 3: Critically Appraise the Quality
and Applicability of the Located Knowledge
to the Client’s Needs and Situation
If Step 1 of EBP requires strong clinical skills, and Step 2
requires strong literature search skills, Step 3 requires the
skills of a knowledgeable and critical researcher. The first
part of this step is to critically analyze the quality of the
research results you have located. From its origins with
Archie Cochrane, EBM and EBP have privileged the
results of large-scale, quantitative, experimental research
over all other types of evidence. This is clear in the Oxford
University Centre for Evidence-based Medicine (2009,
2011) hierarchy of evidence and in a similar hierarchy
prepared by the GRADE (no date) organization. The
Oxford Hierarchy (www.cebm.net/index.aspx?o=1025)
sets as the ‘best’ or ‘Level 1’ research systematic reviews
that aggregate the results of multiple quantitative experimental studies. Systematic reviews require a group of
researchers to thoroughly identify all relevant studies
internationally, followed by careful quality and bias vetting
of studies prior to inclusion in the review. Meta-analysis
statistics allow aggregation and comparison of results
across studies. For a full discussion of systematic reviews,
see Littell et al. (2008) or Drisko and Grady (2012).
The standard EBM/EBP hierarchies rate studies based
on a typology of ideal research designs (Greenhalgh 2010;
Tanenbaum 2003). ‘Level 1a’ is based on a systematic
review of multiple experimental research studies. ‘Level
1b’ research draws upon consistent results from several
experiments that have not been aggregated using a systematic review process. ‘Level 1c’ is the result of just a
single experiment. Note that the results of experiments are
privileged, and systematic reviews combining multiple
experimental outcomes are considered the ‘best’ source of
knowledge when available. ‘Level 2a’ research is based on
a systematic review of multiple quasi-experiments. ‘Level
2b’ is based on the results of a few, or a single, quasi-experimental study. Still lower on the research hierarchy are
‘Level 3’ studies based on case-controlled studies, a
research design not widely used in mental health outcome
research. In Level 2 and Level 3 research designs, random
assignment of participants is not used. Level 3 research
designs are based on multiple case comparisons using
replication logic, which is different from the sampling
logic more commonly used in experiments (Anastas 1999).
‘Level 4’ results are based on informal case studies, a
widely used design in traditional mental health research.
‘Level 5’ is applied to practice wisdom or other unstructured approaches to knowledge development. More information on evidence hierarchies and rating research designs
can be found in Drisko and Grady (2012), Norcross et al.
(2008), or Rubin (2008).
Note that this hierarchy is based on the best possible or
ideal forms of research design, which often are very different from the best available completed research. Many
mental health topics lack systematic reviews or even high
quality experimental research. This only means that no such research
has been done as yet. It does not mean treatments lacking
an experimental research base are not (possibly) effective.
Clinicians are encouraged to seek high quality experimental research and systematic reviews. Clinicians are also
encouraged to think critically about the definitions of disorders and treatments studied, size and nature of samples,
and the nature, quality and completeness of measures used.
Clinicians should keep in mind that simply using an
experimental research design does not mean the research
was conceptualized satisfactorily and completed rigorously. Further, not all experimental research may prove
relevant to your client and the client’s situation.
It is worth noting that there are important critiques of the
use of experimental research results, common statistical
methods and the systematic reviews based upon such evidence. In-depth exploration of this issue is beyond the
scope of this paper. Interested readers may wish to read
Ioannidis’ (2005) ‘‘Why most published research findings
are false’’ which summarizes the many technical and social
challenges to doing and interpreting quantitative research.
Drisko and Grady (2012) address both the merits and the
limitations of EBP research approaches in social work.
Doing research, like doing practice, is difficult.
The second part of EBP Step 3 is to appraise how well
the rigorous research knowledge one finds fits with the
specific needs of the client and the client’s situation. Strong
research evidence must both be valid and credible as well
as directly relevant to the unique client. In the example of
the 10 year old who has Reactive Attachment Disorder
described above, it is not at all clear that outcome research
on toddlers is relevant, though it might be. Research on
school-age children, especially children who have been in
foster care, would seem much more relevant and informative for this client. In Step 3 of EBP, the clinician
appraises both the quality of the best available research
evidence and its fit to the client and situation.
Step 4: Discuss the Research Results with the Client
to Determine How Likely Effective Options Fit
with the Client’s Values and Goals
The EBP model gives equal weight to the client’s clinical
situation, the best available research, and the client’s values
and preferences. Clinical expertise is applied to integrate
all these elements. In practice, this means that once the best
available research is identified and found relevant to the
client’s needs, a summary of the research should be
brought back to clients for collaborative discussion. This
gives clients an opportunity to clarify how willing and
motivated they are to engage in the treatment options, if
there are aspects of the treatment options they find contrary
to personal values and beliefs, and to state preferences
among the treatment options. This requires that the clinician understands, synthesizes and then summarizes for
clients the best available research in clear language. The
Comparison component of the P.I.C.O. model is a useful
source of alternative treatment options (assuming research
supported alternatives are located). The purpose of this
discussion is to allow clients to voice any concerns about
the treatment and to select among treatment options when
they are available.
Such a collaborative discussion allows clients to raise
concerns or value differences that will shape how they will
view and participate in the treatment. The most clear-cut
issues of value differences arise when clients have religious
or principled objections to specific procedures (such as
blood transfusions) that conflict with their core beliefs and
culture. In biopsychosocial treatments, value differences
may arise around use of medications or specific procedures
and techniques. Step 4 helps to make clients active
participants in treatment decision-making. It empowers
clients and helps maximize motivation and ‘buy-in’ to
treatment. The Step 4 discussion can also be a valuable step
in developing the therapeutic alliance between client and
clinician.
Step 5: Synthesizing the Client’s Clinical Needs
and Circumstances with the Relevant Research,
Develop a Shared Plan of Intervention Collaboratively
with the Client
Step 5 builds on Step 4 and often overlaps with it. The key
purpose of Step 5 is to fully involve the client in treatment
decision-making and to empower the client as an active
part of the treatment process. What is added in Step 5 is the
clinician’s expert wisdom regarding the feasibility of the
plan for this specific client and situation, and how to begin
the treatment. Given the available treatment options, which
available and realistic approach does the client find most
appropriate? Treatment planning in EBP is a collaborative
process.
At the end of this discussion, the clinician should write
out and formally document the agreed upon treatment plan.
The clinician should also fully document in the client’s
record any concerns the client raises and any treatment
options the client refuses due to value differences. Further,
choices made due to the clinician’s expertise, or the lack of
the ability to provide a specific treatment, should also be
documented in the client’s record. Treatment goals should
be clearly stated and documented in the client’s record.
Note carefully that Steps 4 and 5 provide opportunities
for the client to have input into the treatment planning
process and to tailor it in ways they believe are positive.
Where the expertise of the clinician identifies potential
limitations to the client’s goals, it is important that these
concerns also be voiced to the client, discussed, and documented. The EBP treatment planning process is intended
to be collaborative in order to develop the best possible
plan, which requires clear communication from both
parties.
Step 6: Implement the Intervention
This step of EBP draws on clinical expertise and available
resources. It may seem quite straightforward but, in practice, may pose some serious obstacles. Overlapping with
Step 5, any practical limitations to providing an ideal
treatment plan must be identified. Beginning treatment
requires that the clinician is appropriately trained, competent and qualified to deliver the kinds of treatments the plan
requires. In agency practice, appropriate support and
resources must also be made available. If the plan centers
on specific treatments, for instance Dialectical Behavior
Therapy (DBT), the clinician should be trained and qualified to deliver the treatment competently. For DBT, this
usually requires agency-based access to groups and planned coverage for times the clinician is not available to the
client. Where the clinician is not qualified to deliver a
specific treatment, or the agency cannot support it, referral
is often indicated.
Practice Evaluation and EBP
Practice evaluation, an important and routine part of all
good practice, should be undertaken along with treatment.
The nature of such practice evaluation will be shaped by
professional standards, agency expectations, and the needs
and abilities of the client.
Some authors make evaluation of practice an additional
seventh step in EBP (Gibbs 2002). Practice evaluation,
done formally or informally, quantitatively or qualitatively,
should always be a part of sound professional practice.
EBP, however, emphasizes large-scale, epidemiological,
experimental research studies (Greenhalgh 2010). The
value of single case evaluations, even those using single
system evaluation designs, is very low on the EBP research
hierarchy. For this reason, I argue that evaluation of
practice should be viewed as a necessary component of
high quality routine practice. It is not, however, actually
part of EBP. Evaluation of case outcomes must be part of
quality practice, but it is not a formal part of the EBP
practice decision-making model (Drisko and Grady 2012).
This six-step EBP model of practice decision-making is
intended to guide treatment planning. Next, the steps of
EBP are applied to a composite family case (done to protect privacy and confidentiality).
A Case Example of the EBP Practice Decision-Making
Process
Jax, a 13 year old African American, was brought for
treatment by his employed working class parents who were
concerned about his ‘‘smoking weed’’ and his declining
grades. The family resided in a tough neighborhood they
called ‘‘transitional’’ and ‘‘showing signs of gang activity.’’
Jax felt his parents were over-reacting and that he ‘‘had
plenty of opportunities to hang with kids you really
wouldn’t like’’ but that he chose not to do so. Jax admitted
he was smoking weed almost daily and often before school.
His father said they had already tried taking a hard line
with Jax, but this had not made any difference. Both parents said it was a hard choice to come to a mental health
center but they did not know what else to do. Some people
at their church had told them not to seek mental health
services. Yet they were concerned Jax ‘‘was on a down-hill
slide.’’ All this was said with true concern for each other
and recognition of each person’s different point of view
and motives. Each family member’s shame at seeking
mental health help was palpable. The parents’ concern for
Jax, and his future as a young Black male, was clear. It
seemed Jax knew he was cared for. Many personal and
family strengths were evident, as well as some clear
concerns.
Jax ‘‘didn’t know why’’ he was smoking pot now; ‘‘we
all do it;’’ ‘‘it’s no big deal.’’ When we met privately, he
said the same thing. ‘‘I enjoy it, most times.’’ Talking to
adults did not appear helpful. At the same time, Jax was
perceptive. He felt peer pressure to smoke pot, and from
friends at church not to smoke pot. He felt caught. Friends
in the neighborhood ridiculed people who did well in
school; yet Jax knew he should do well in school and
seemed to want to do so. He struggled with what it was to
be a man and to be his own man and how his parents might
see him as more grown up.
The whole family was clear they would like to work on
this problem together, as a family group. They set an
informal goal of negotiating some increased space for Jax
to be independent, so long as he met the real goals of
‘‘keeping out of trouble,’’ ‘‘stopping the pot smoking’’ and
improving his grades.
The parents privately expressed concern that ‘‘no treatments we know of helps people really stop using weed.’’ I
agreed to explore what the available research shows about
the effectiveness of treatments for pot smoking using a
family modality. I, too, had concerns that researchers had
not documented effective treatments for teens like Jax. I
also imagined that some potentially useful treatments
might not have been researched using rigorous and thorough methods. On the other hand, the family showed
commitment to each other, openness and motivation to try
to change.
Step 1: Using the P.I.C.O. model, the Person or population is African-American families including a teenage
cannabis abuser with many strengths and many supports.
The Intervention the family sought is family therapy
models or programs. The Comparison would be individual,
group or residential models of treatment for cannabis
abuse. While the family did not ask for such interventions,
they may be useful comparisons for consideration if family
treatments have not been demonstrated effective but other
interventions do have demonstrated effectiveness. The
Outcomes sought are cessation or reduced cannabis use and
improved school grades. Reduced cannabis use would be
an expectable outcome measure in research, but information on school grades might not be so easy to find. The
Type of search is to identify treatment options. This is the
main focus of EBP decisions in the social work literature
currently. The P.I.C.O. helped frame Step 1 of EBP.
One limitation of a P.I.C.O. summary is that it may
seem only an approximate fit to the specific details revealed
during the assessment. My more complete formulation of
the case revealed a wider set of strengths and challenges.
Jax displayed good judgment, personal boundaries, affect
regulation and self-control. He could be reflective within
expectable teenage limits. Jax said he smoked pot 4–5
times a week (if this is accurate), but withdrew and seemed
offended when I asked how he paid for it. He said he’s
good at basketball, but ‘‘street’’ not varsity level; he works
hard; he knows he’s smart; school is pretty easy. The
‘‘boys’’ are important to him, but often he knows that they
point him in directions different from the hopes of his
parents and his own values. He likes girls but has no
girlfriend and ‘‘no one special.’’ He goes to church weekly
but ‘‘isn’t into it.’’ He volunteers that his Dad sets him up
with work, but it is landscaping labor, ‘‘boring,’’ and
‘‘pays nickels and dimes.’’ He isn’t sure what he wants to
do for work, but wants to make ‘‘reliable money.’’ His
parents are worried that he might now shift to a
path that may make his life hard. Yet they seem to know
Jax will have to make some decisions on his own. He is
less in their control. They do want him to know they
support him and he does seem to know this deeply. However, family love doesn’t earn much street ‘‘cred’’ and
personal respect with the boys. This you have to earn on
your own, independently. At a more abstract level, the
family faces a routine life course developmental dilemma
that will last some years and will have inevitable ups and
downs. Still, keeping good grades and reducing or ending
the pot smoking may reduce the risks for Jax. These are
important to attend to, but there is much more going on that
matters too. Clinicians must hold both the details and the
larger picture.
Steps 2 and 3: A search of the Campbell Library
revealed three currently in-process research protocols very
close to this topic (family treatment for cannabis use), but
no completed studies. These protocols each addressed
different forms of family therapy for cannabis use in teens,
each with a slightly different age range. None of the protocols specifically identified African-American participants. All addressed reducing cannabis use, but none
appeared to address improving grades. The major problem
was that these protocols were still in progress and no
conclusions were yet available.
A search of the Cochrane Library revealed some useful
outcome information but not about family therapy specifically. A systematic review by Denis et al. (2008) found
that both cognitive-behavioral therapy (CBT) and motivational therapy, done individually and in group settings,
reduced cannabis use among adults. A single large-scale
trial showed extended, individual CBT to be more effective
than was brief motivational interviewing. There was also
research support for the use of contingency management
(behavioral interventions) as a useful adjunct to extended
CBT. These studies, however, were done on adults and did
not make any mention of racial or ethnic differences, or
even the inclusion of varied races in the study samples.
Family therapy was not mentioned. Another systematic
review by Smith et al. (2008) found little evidence that
therapeutic communities of several kinds were effective at
reducing or stopping cannabis use. As a comparison
treatment, residential intervention appeared no better than
individual outpatient treatment. Again, the population was
adults, family treatments were not mentioned, improving
grades was not measured and no information about the
inclusion of racial or ethnic variation in the samples was
provided.
A search for individual articles in the PubMed and PsycINFO databases revealed little useful information after
many hours of searching. Searches for terms such as
‘‘treatments + marijuana + adolescents’’ and ‘‘treatments + marijuana + adolescents + experiments’’ pointed mainly to plans for studies but very few completed
outcome studies. This was a surprisingly limited search
result from these comprehensive databases.
In stark contrast, a search at the US Government’s
SAMHSA National Registry of Evidence-based Programs
and Practices ultimately revealed several highly rated
treatment outcome studies. Still, at first, using the site’s
‘‘Find an Intervention—Advanced Search’’ tool, searching
for ‘‘marijuana abuse’’ and checking boxes for ‘‘Adolescents,’’ ‘‘African-American,’’ ‘‘Outpatient,’’ and ‘‘Urban’’
returned no results (www.nrepp.samhsa.gov/AdvancedSearch.aspx). Yet a simple search for ‘‘marijuana
abuse’’ revealed 11 interventions with some empirical
support (www.nrepp.samhsa.gov/SearchResultsNew.aspx?s=b&q=marijuana%20abuse). Many of these interventions
were school-based prevention programs and not immediately
relevant. Some others were more on target.
One treatment model, multidimensional family therapy
(MDFT; Liddle 1992), seemed appropriate. The SAMHSA
site reported two experimental outcome studies on MDFT
showing medium to large effect sizes for reducing marijuana use and improving school grade point averages
(Liddle et al. 2001, 2008). This is ‘Level 1c’ research in the
EBP hierarchy. However, the studies addressed alcohol and
substance abusing adolescents with conduct problems,
which is a bit unlike Jax who reports no alcohol use and has
exhibited no serious conduct problems at home or school
(beyond smoking pot). The studies did include African-American adolescents. Another study showed MDFT’s
positive results were stable at a 1-year follow up (Liddle
et al. 2009). Applying some critical thinking, I noted that
the creator of MDFT also completed both of the outcome studies,
creating a serious risk of attribution bias that the SAMHSA
site does not point out or address. Still, the model has
empirical support from experimental research and appears
close to what the family is seeking. Adapting even manualized treatments is often necessary due to client needs,
provider availability and changes that occur within the
treatment (Kendall et al. 2008).
Steps 4 and 5: This outcome research information was
provided to the family pretty much as described above.
They were all very surprised so little was known about
treatments to stop pot smoking; they had thought ‘‘lots
more’’ research had been done but that stopping was just
very difficult, ‘‘like stopping smoking.’’ Discussing the
research and options did provide a stimulus for developing
a treatment plan. The family’s values and preferences were
to work on the issue as a family unit. They decided to try
adapted MDFT. They understood that there was some
evidence such a combined individual and family treatment
model would likely be effective. They seemed to like
having major input into the treatment planning process, and
to have ‘‘choices.’’
The focus of the EBP practice decision-making process
in Steps 4 and 5 shifts from finding the best available
research to collaborating with the client to ensure the
treatment plan fits with their values and preferences. The
process is collaborative both in that the client has the power
to refuse a treatment plan with strong research support and
that the clinician shows an understanding of the client’s
problems and strengths. In this way, the clinician demonstrates clinical expertise and considered judgment, but also
checks to be sure the client truly does agree, and has a real,
active, part in the treatment planning process. EBP frames
treatment planning as collaborative, not as a top-down,
expert process determined by the clinician alone.
A true dilemma was that I was not trained or certified in
MDFT. Further, several phone calls showed no MDFT
certified clinicians were available in any nearby agencies.
The model, with its core components of enhancing motivation, building pro-social skills and family boundaries, and
working on individual, family and peer issues (Liddle 1992),
shared features with treatments in which I had been trained
and supervised in practice. However, it is not clear that an
adapted treatment would be as effective as those
formally delivered and studied in the EBP research. On the
other hand, adaptations of treatments with research support
are common in real-world practice (Drisko and Grady
2012). Geographic location, lack of funding for training,
limited supervision, and payment restrictions frequently
put clinicians in situations where fully delivering a specific
treatment is not possible. In such cases, open discussion
with the client, the application of clinical expertise and
efforts to comply with agency policies are needed. In Jax’s
case, both he and his family wanted to continue with me.
They were not open to referrals (had there been such
options). After discussion, the family agreed to work with
me on an adaptation of the core MDFT components,
knowing that this was not exactly MDFT and that what we
would undertake did not have full research support.
Step 6: We began 16 weekly individual and/or family
sessions using the adapted MDFT model. Specific efforts
addressed Jax’s own goals and motivation. In addition,
family resources were brought to bear to address Jax’s
goals more directly. Jax reported enjoying this time with
his parents, ‘‘especially going out to eat after we come
here.’’ It was my sense that positive and productive time
spent together—relationship and clear boundaries—was a
key cause of any improvements to follow. We spent a good
deal of time discussing peer issues and what it took to be
independent. Jax seemed recognized and affirmed in a new
way simply by having such discussions. His father said he
felt he had to let Jax ‘‘be a man.’’ Still, this was difficult to
do at times. Both parents’ protectiveness of their son
changed in quality but remained clearly present.
About week 10 of our work together, Jax’s report card
showed much improved grades. He and his parents were
pleased; they praised Jax for doing this hard work. How to
assess Jax’s pot smoking was much less clear. He reported
that his friends gave him a hard time when he tried to
‘‘back away’’ from smoking weed. As a teen seeking
independence, it seemed hardly fair, or likely to be effective, to ask Jax to talk truthfully about his pot use. He did
agree to give me a weekly log of his pot smoking. At the
end of each third session, we met without his parents to
discuss it briefly in private. Jax’s log showed a solid drop
off in pot smoking, a reduction of about 50 %. He had
previously admitted that he mostly enjoyed how pot made
him feel, but that it made him sleepy in school. His log now
showed that he occasionally smoked, but now clearly
marked ‘‘after school’’ or ‘‘at night.’’ I remained unclear whether
this self-report measure was valid, but his perspective on
risks (legal and academic) became more thoughtful and
less reactive. While the treatment approach was adapted
MDFT, the early stages had been very smooth given the
family’s motivation and strengths. Many of the changes in
family equilibrium could have been understood alternatively through a structural family therapy lens. Also clear
was that our alliance and relationship mattered to Jax and
his family.
Practice evaluation was formal and informal, and
ongoing throughout the treatment. At the end of 16 sessions
each family member felt progress had been made. The
outcome ‘evidence’ was Jax’s improved grades and his
self-reports of reduced pot smoking. His mother said with
warmth and humor that ‘‘this mental health stuff isn’t so
bad.’’ I felt very much a catalyst for a family of able people
who had made a good but difficult judgment to seek services when the family as a whole needed to change its
pattern of interaction. (This, however, was not a specific
goal of therapy.) A follow-up phone call 8 weeks after
termination showed Jax’s grades were still solid and family
harmony had improved in other unexpected ways as well.
Conclusion
The case of Jax illustrates how applying EBP in clinical
practice requires the use of clinical expertise on an almost
moment-to-moment basis. Clinical expertise and many different forms of evidence shape engagement, assessment,
goal setting, defining EBP practice questions, doing EBP
literature searches and bringing research knowledge into
treatment planning—all before treatment had formally
begun! (Or had it perhaps actually begun with the engagement and relationship building that took place during
assessment?) Time and strong research knowledge are also
needed to appraise relevant research studies. Both areas
appear challenging for many clinical social workers
(Wharton and Bolland 2012). Applying EBP further involves
understanding and respecting the views, values and preferences of clients at all times. Clinicians must discuss,
understand, value and collaboratively include client values
and preferences in treatment planning. This is an explicit part
of EBP. EBP is never a top-down, ‘clinician as expert’
approach.
Using the best available research knowledge can point to
treatment options and approaches that the clinician might not
have identified otherwise. It is a true asset in treatment
planning. Still the best available research may often be
partial, incomplete or not very relevant to a specific client.
Many therapies remain un-studied. Other therapies may have
to be adapted to specific settings and available provider skill
sets. EBP may address questions of outcome well, but it does
not address the process-oriented, micro practice questions
clinical social workers also wish to have answered. This is a
serious limitation of EBP as a guide to treatment. Yet this
limitation can be resolved with time and further research, and
an expanded vision of meaningful research in the EBP
hierarchy. Perhaps the EBM/EBP research hierarchies
should be expanded to include more process-outcome
research and greater attention to studies of micro processes
within treatments. The sole EBP focus on broad outcomes is
very useful, but it ignores many questions of interest to clinicians that are also likely to influence client outcomes.
Clinicians need to apply core clinical competencies to
engage, assess, plan, treat and evaluate in order to make EBP
useful and practical. Nonetheless, research can be one
valuable part of treatment planning.
Clinicians must help researchers identify questions, other
than end-point outcome questions, that can guide their
work. Choosing treatments is one concern of clinical
social workers, but may arise most often in the process of
doing assessment and treatment planning. Defining in
greater detail just what causes clinical change
would also be useful to clinicians. Such questions are
a core aspect of the common factors approach (Cameron
and Keenan 2010, 2012; Drisko 2004; in press; Lambert
2013; Norcross and Lambert 2011). The common factors
approach conceptualizes the sources of change in practice
as including client factors, the quality of the therapeutic
relationship, specific treatment techniques and the attributes of the clinicians. Common factors researchers have
empirically documented that client factors and the therapeutic relationship are actually greater influences on client
outcomes than are specific techniques or therapies (Norcross and Lambert 2011). Yet detailed attention to client
factors, the therapeutic relationship and the attributes of
clinicians seems to be cut off from EBP at this time. The
result is a simplistic vision of what causes change in
practice (Drisko, in press). Technique alone is the focus of
EBP definitions of treatments. Knowing what treatments
‘work’ is indeed helpful, but knowing what aspects of
treatments are more, or less, effective would also be very
useful in practice. Clinicians make hundreds of decisions in
each clinical session based on many types of evidence.
More attention to process as well as to outcome could
improve the utility of practice research. In this area, clinicians need to inform researchers regarding what are
important practice questions. Researchers must also stay
close to actual practice.
It is not clear that experimental approaches to studying
practice outcomes are a sufficient basis to guide practice
without other kinds of information. Large-scale quantitative research—as privileged in EBP—can be very helpful
in identifying events that are not apparent in individual
cases or small samples. Such research can also identify
differences in outcomes based on demographic variables
such as gender, class and ethnicity that might otherwise be
invisible in smaller scale studies (Greenhalgh 2010). In
medicine, such knowledge has already saved many lives.
On the other hand, researchers still need case-based
exploratory research using qualitative data to identify new
disorders and changes in the expression of recognized
problems. Researchers still need descriptive survey and
incidence research to identify how widespread new disorders are among larger samples. Researchers still need
micro-scale research, including case studies, to identify and
to describe the details of actual clinical practice. Attention
to affect, to process, to relationship, to hunches, and to
complaints may all yield useful formative information to
guide future research. The prominence of EBP, and its
impact on research funding, sends a message that knowledge based on non-quantitative and non-experimental
designs is not valuable (Flyvbjerg 2011). This is
intellectually simplistic and ethically unfortunate. Social
workers have long honored ‘‘many ways of knowing’’
(Hartman 1994). EBP should help guide practice, but it
should not become a tool to devalue other important
aspects of practice or of research. Clinicians best guide
practice by applying many diverse forms of knowledge,
derived from many different kinds of ‘evidence’. EBP has
many merits but cannot alone guide clinical practice.
References
American Psychological Association. (2006). Evidence-based practice in psychology (A report of the APA task force on evidence-based practice). American Psychologist, 61, 271–285.
Anastas, J. (1999). Research design for social work and the human
services (2nd ed.). New York: Columbia University Press.
Black, N. (1994). Why we need qualitative research. Journal of
Epidemiological Community Health, 48, 425–426.
Cameron, M., & Keenan, E. K. (2010). The common factors model:
Implications for transtheoretical clinical social work practice.
Social Work, 55(1), 63–73.
Cameron, M., & Keenan, E. K. (2012). The common factors model for
generalist practice. New York, NY: Pearson.
Chambless, D., & Hollon, S. (1998). Defining empirically supported
therapies. Journal of Clinical and Consulting Psychology, 66,
7–18.
Chambless, D., & Ollendick, T. (2001). Empirically supported
psychological interventions: Controversies and evidence. Annual
Review of Psychology, 52, 685–716.
Cheek, J. (2011). The politics and practice of funding qualitative
inquiry: Messages about messages about messages. In N. Denzin
& Y. Lincoln (Eds.), The handbook of qualitative research (pp.
251–268). Thousand Oaks, CA: Sage.
Cochrane, A. (1972). Effectiveness and efficiency: Random reflections
on health services. London: Nuffield Provincial Hospitals Trust.
Excerpts are also available online at www.cochrane.org/aboutus/history/archie-cochrane.
Denis, C., Lavie, E., Fatseas, M., & Auriacombe, M. (2008).
Psychotherapeutic interventions for cannabis abuse and/or
dependence in outpatient settings. Cochrane Database of
Systematic Reviews, 2006, Issue 3. Art. No.: CD005336. doi:
10.1002/14651858.CD005336.pub2.
Drisko, J. (2004). Common factors in psychotherapy effectiveness:
Meta-analytic findings and their implications for practice and
research. Families in Society, 85(1), 81–90.
Drisko, J. (in press). Common factors in psychotherapy. The
encyclopedia of social work online (pages tbd). New York:
Oxford University Press.
Drisko, J., & Grady, M. (2012). Evidence-based practice for clinical
social workers. New York: Springer.
Drisko, J., & Grady, M. (in press). Thorough clinical assessment: The
hidden foundation of evidence-based practice. Families in
Society.
Flyvbjerg, B. (2011). Case study. In N. Denzin & Y. Lincoln (Eds.),
The handbook of qualitative research (pp. 305–316). Thousand
Oaks, CA: Sage.
Gibbs, L. (2002). Evidence-based practice for the helping professions: A practical guide. Belmont, CA: Brooks-Cole.
Gilgun, J. (2005). The four cornerstones of evidence-based practice.
Research on Social Work Practice, 15(1), 52–61.
Grading of Recommendations Assessment, Development and Evaluation (GRADE) Working Group. (no date). Grading the quality
of evidence and the strength of recommendations. Retrieved
from http://www.gradeworkinggroup.org/intro.htm.
Greenhalgh, T. (2010). How to read a paper: The basics of evidence-based medicine (4th ed.). Hoboken, NJ: Wiley-Blackwell; BMJ Books.
Grinnell, R., & Unrau, Y. (2010). Social work research and
evaluation: Foundations of evidence-based practice. New York:
Oxford University Press.
Guyatt, G., Rennie, D., Meade, M., & Cook, D. (2008). Preface to
users’ guides to the medical literature. Essentials of evidence-based clinical practice (2nd ed.). New York: McGraw-Hill. Also
available online at http://jamaevidence.com/resource/preface/
520.
Hartman, A. (1994). Many ways of knowing. In W. Reid & E.
Sherman (Eds.), Qualitative research in social work (pp.
459–463). Washington, DC: NASW Press.
Haynes, R., Devereaux, P., & Guyatt, G. (2002). Clinical expertise in
the era of evidence-based medicine and patient choice. Evidence-Based Medicine, 7, 36–38.
Ioannidis, J. (2005). Why most published research findings are false.
PLoS (Public Library of Science) Med, 2(8), e124. Retrieved from
www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0020124.
Kendall, P., Gosch, E., Furr, J., & Sood, E. (2008). Flexibility within
fidelity. Journal of the American Academy of Child and
Adolescent Psychiatry, 47(9), 987–993.
Lambert, M. (2013). The efficacy and effectiveness of psychotherapy.
In M. Lambert (Ed.), Bergin and Garfield’s handbook of
psychotherapy and behavior change (6th ed., pp. 169–218).
Hoboken, NJ: Wiley.
Liddle, H. (1992). A multidimensional model for treating the
adolescent drug abuser. In W. Snyder & T. Ooms (Eds.),
Empowering families, helping adolescents: Family-centered
treatment of adolescents with mental health and substance
abuse problems (pp. 91–100). Washington, DC: U.S. Government Printing Office.
Liddle, H., Dakof, G., Parker, K., Diamond, G., Barrett, K., & Tejeda,
M. (2001). Multidimensional family therapy for adolescent drug
abuse: Results of a randomized clinical trial. American Journal
of Drug and Alcohol Abuse, 27(4), 651.
Liddle, H., Dakof, G., Turner, R., Henderson, C., & Greenbaum, P.
(2008). Treating adolescent drug abuse: A randomized trial
comparing multidimensional family therapy and cognitive
behavior therapy. Addiction, 103(10), 1660–1670.
Liddle, H., Rowe, C., Dakof, G., Henderson, C., & Greenbaum, P.
(2009). Multidimensional family therapy for young adolescent
substance abuse: Twelve-month outcomes of a randomized
controlled trial. Journal of Consulting and Clinical Psychology,
77(1), 12–25.
Littell, J., Corcoran, J., & Pillai, V. (2008). Systematic reviews and
meta-analysis. New York: Oxford University Press.
Manuel, J., Mullen, E., Fang, L., Bellamy, J., & Bledsoe, S. (2009).
Preparing social work practitioners to use evidence-based
practice: A comparison of experiences from an implementation
project. Research on Social Work Practice, 19(5), 613–629.
Norcross, J., Hogan, T., & Koocher, G. (2008). Clinicians’ guide to
evidence-based practices. New York: Oxford University Press.
Norcross, J., & Lambert, M. (2011). Evidence-based therapy
relationships. In J. Norcross (Ed.), Psychotherapy relationships
that work (pp. 3–21). New York, NY: Oxford University Press.
Oxford University Centre for Evidence-based Medicine. (2009,
March). Levels of evidence. Retrieved from www.cebm.net/
index.aspx?o=1025.
Oxford University Centre for Evidence-based Medicine. (2011). The
Oxford levels of evidence 2011. Retrieved from www.cebm.net/
mod_product/design/files/CEBM-Levels-of-Evidence-2.1.pdf.
Popay, J., & Williams, G. (1998). Qualitative research and evidence-based healthcare. Journal of the Royal Society of Medicine,
91(Supp 35), 32–37.
Rubin, A. (2008). Practitioner’s guide to using research for evidence-based practice. Hoboken, NJ: Wiley.
Rubin, A., & Parrish, D. (2007). Views of evidence-based practice
among faculty in master of social work programs: A national
survey. Research on Social Work Practice, 17, 110–122.
Sackett, D., Richardson, W., Rosenberg, W., & Haynes, R. (1997).
Evidence-based medicine: How to practice and teach EBM. New
York: Churchill Livingston.
Sackett, D., Straus, S., Richardson, W., Rosenberg, W., & Haynes, R.
(2000). Evidence-based medicine: How to practice and teach
EBM (2nd ed). Edinburgh: Churchill Livingstone.
Sands, R., & Gellis, Z. (2011). Clinical social work practice in
behavioral mental health: Toward evidence-based practice (3rd
ed.). New York: Pearson.
Simmons, B. (2013). Individual, agency and environmental factors
impacting clinical social workers’ involvement with evidence-based practice. Unpublished doctoral dissertation, Smith College
School for Social Work, Northampton, Massachusetts 01060.
Smith, L., Gates, S., & Foxcroft, D. (2008). Therapeutic communities
for substance related disorder. Cochrane Database of Systematic
Reviews, 2006, Issue 1. Art. No.: CD005338. doi:10.1002/
14651858.CD005338.pub2.
Tanenbaum, S. (2003). Evidence-based practice in mental health:
Practical weaknesses meet political strengths. Journal of Evaluation
in Clinical Practice, 9, 287–301.
Trinder, L. (2000). A critical appraisal of evidence-based practice. In
L. Trinder & S. Reynolds (Eds.), Evidence-based practice: A
critical appraisal (pp. 212–241). Ames, IA: Blackwell Science.
Wharton, T., & Bolland, K. (2012). Practitioner perspectives of
evidence-based practice. Families in Society, 93(3), 157–164.
Author Biography
Dr. James Drisko is author of ‘‘Evidence-based Practice in Clinical
Social Work’’ with Melissa D. Grady. He has long experience in child
clinical practice, and in both practice process and practice outcome
research. Dr. Drisko is professor at the Smith College School
for Social Work in Northampton, Massachusetts.