Running head: SCIENTIST-PRACTITIONER PAPER

Definition of the Scientist-Practitioner Framework

Despite the relative youth of psychology as a science, the training and practice of clinical psychologists have evolved rapidly. As acknowledged by Challman et al. (1949), the discipline of clinical psychology, born in the early 20th century, grew to be known for its strong emphasis on clinical assessment and relative indifference towards research methodologies. With the aim of identifying training needs for clinical psychology students, the scientist-practitioner model emerged as a framework for training students who would be well prepared in both scientific research and clinical practice. The APA established the scientist-practitioner model on the basis of integrating research and practice, allowing each of these two facets of training to inform the other (Jones & Mehr, 2007). Graduate program educators sought to use the model to “train psychologists who are capable of applying psychological knowledge to their work… as well as possessing the ability to move the field forward and generate fresh knowledge in the form of new empirical findings” (Jones & Mehr, 2007, p. 767). The most current applications of the scientist-practitioner model in psychology tend to favour an integrated approach (Healy, 2017; Jones & Mehr, 2007). However, without deeper reflection on the literature, it is unclear what this integration actually looks like for psychologists operating within this model. Jones and Mehr (2007) began to concretely illustrate what the integrated model looks like in practice, stating that “a scientist-practitioner is someone who applies critical thought to practice, uses proven treatments, evaluates treatment programs and procedures, and applies techniques based on supportive literature” (p. 770).
This quote is helpful for conceptualizing the real-world applications of the scientist-practitioner framework beyond graduate training; it allows clinicians to look ahead to the activities in which they might engage in their future practice, and it facilitates further exploration into what it means to maintain a scientist-practitioner framework.

Evidence-Based Practice: Maintaining a Scientist-Practitioner Framework

The use of evidence-based practice (EBP) is at the core of maintaining a scientist-practitioner framework. The Canadian Psychological Association’s Task Force on Evidence-Based Practice of Psychological Treatments published an outline of guidelines and standards for implementing EBP with the aim to “develop a position statement regarding the optimal integration of research evidence into practice” (Dozois et al., 2014, p. 154). Namely, “EBP of psychological treatments involves the conscientious, explicit, and judicious use of the best available research evidence to inform each stage of clinical decision-making and service delivery” (Dozois et al., 2014). Upon reflection, it is clear that the utility of EBP aligns closely with the activities that define scientist-practitioners, as Jones and Mehr (2007) described. In 2006, Vespia authored an article advocating for the benefits of maintaining a scientist-practitioner framework; however, Overholser (2010) identified Vespia’s position as less credible given that she had not engaged in any service delivery in over three years. Dozois et al. (2014) nonetheless provided some clarity on how Vespia’s position could still be considered to follow the scientist-practitioner model, indicating that “research should be informed by practice to ensure that the discipline and profession are providing evidence for treatments that respond to the kinds of problems that clients bring to psychology practitioners” (p. 154).
Although Overholser’s (2010) concern that researchers may lose touch with practice after periods of clinical inactivity is warranted, Dozois et al.’s (2014) aim to integrate science and practice across roles speaks more to the scientist-practitioner framework as a mindset. Using the best available evidence to inform treatments ensures that our database of scientific knowledge is put to use responsibly, in alignment with the goal of providing the best possible service to clients. Similarly, researchers conducting treatment outcome and process studies should be mindful of the concerns brought forth by clinical populations seeking help.

Scientist-Practitioners in Practice

Dozois et al.’s (2014) hierarchy of research evidence provides a helpful framework for weighing evidence in terms of its strength. As a practitioner delivering psychological interventions, one might strive to gather as much background information as possible to best understand the reason for referral to services. This would orient clinicians towards the clients’ goals for treatment and provide a framework for collaboratively evaluating the effectiveness of a given intervention based on progress monitoring (Dozois et al., 2014). It may be most helpful to consider the scientist-practitioner framework as a mindset for approaching practice; using the best available evidence provides the best chance for a client to succeed. However, one potential barrier to this approach might emerge when “perhaps… equally strong studies reach conflicting conclusions” (Rubin, 2007). In this case, it may not be clear whether an intervention’s efficacy is supported or refuted by the “evidence.” Based on the literature, the solution to this barrier might lie in developing a deeper understanding of client variables such as “specific client characteristics, cultural backgrounds, and treatment preferences” (Dozois et al., 2014, p. 155).
It also speaks to the importance of working collaboratively with the client, an important facet of Dozois et al.’s (2014) definition of “evidence” in EBP: “the process of evidence-based treatments is one of collaboration with a client” (p. 155). Thus, given a situation where an intervention was both supported and refuted by the literature, one might introduce the intervention by describing to a client, in accessible language, what the intervention entails and the fact that there is evidence both for and against its efficacy. This invites the client to take an active role in their treatment plan with the reassurance that the intention is to continually monitor personal progress. In the event that the intervention turned out not to meet the client’s needs, Rubin (2007) suggests moving down the hierarchy to studies in which the methods were less rigorous but the results were still supportive of the intervention, as an alternative to the best available evidence.

Scientist-Practitioners in Research

From the researcher’s perspective, a central challenge to maintaining the scientist-practitioner framework seems to be knowing what is “worth” researching. Reflections in the field of industrial-organizational (IO) psychology point to this as a pervasive issue outside of the clinical field. Delmhorst (2018) describes a trend in IO psychology whereby there is a lag between what is being studied and the questions that need to be answered by employees in the field. This brings to mind the concept of “trendy” intervention studies, in which certain clinical populations, such as those with autism spectrum disorder (ASD), receive a great deal of attention and funding given the disorder’s prevalence. However, can the same be said for smaller clinical populations?
The ideal of high internal validity in randomized controlled trials is sensible from a research perspective; this is the surest way to have confidence that a study’s results reflect the effects of the intervention itself (Dozois et al., 2014). After browsing some of the Cochrane Reviews, however, it became clear that comorbidity in study samples made it difficult to summarize the effectiveness of interventions (Macdonald et al., 2012; Olthuis, Watt, Bailey, Hayden, & Stewart, 2016). Similarly, practitioners encounter a broad range of clients who do not present with perfectly distinct, isolated symptoms to address in intervention (Weisz, Krumholz, Santucci, Thomassin, & Ng, 2015). In keeping with the scientist-practitioner model, a solution mentioned several times in the literature was to elicit calls for research proposals via networks such as Practice Research Networks (Rotolo et al., 2018; Tasca, n.d.). Maintaining involvement in such networks would allow researchers to determine whether there was value in studying a smaller clinical population, such as those with comorbid diagnoses. Referring to these networks provides guidance that supports the development of a scientific database while addressing concerns brought forth in practice by populations seeking help.

References

Albee, G. W., & Loeffler, E. (1971). Role conflicts in psychology and their implications for a reevaluation of training models. The Canadian Psychologist, 12(4), 465–481. doi:10.1037/h0082154

Belar, C. D., & Perry, N. W. (1992). National conference on scientist-practitioner education and training for the professional practice of psychology. American Psychologist, 47, 71–75. doi:10.1037/0003-066X.47.1.71

Challman, R. C., Irwin, F. W., Kelly, E. L., Luckey, B. M., Margaret, A., Mowrer, O. H., … Heiser, K. F. (1949). Doctoral training programs in clinical psychology: 1949. American Psychologist, 4(8), 331–341. doi:10.1037/h0057831

Dawson, G. (2013). Dramatic increase in autism prevalence parallels explosion of research into its biology and causes. JAMA Psychiatry, 70(1), 9–10. doi:10.1001/jamapsychiatry.2013.488

Delmhorst, F. (2018). What if any science will do? Industrial and Organizational Psychology, 11(2), 236–240. doi:10.1017/iop.2018.11

Dozois, D. J. A., Mikail, S. F., Alden, L. E., Bieling, P. J., Bourgon, G., Clark, D. A., … Johnston, C. (2014). The CPA Presidential Task Force on evidence-based practice of psychological treatments. Canadian Psychology/Psychologie canadienne, 55(3), 153–160. doi:10.1037/a0035767

Healy, P. (2017). Rethinking the scientist-practitioner model: On the necessary complementarity of the natural and human science dimensions. European Journal of Psychotherapy & Counselling, 19(3), 231–251. doi:10.1080/13642537.2017.1348376

Jones, J. L., & Mehr, S. L. (2007). Foundations and assumptions of the scientist-practitioner model. American Behavioral Scientist, 50(6), 766–771. doi:10.1177/0002764206296454

Macdonald, G., Higgins, J. P. T., Ramchandani, P., Valentine, J. C., Bronger, L. P., Klein, P., … Taylor, M. (2012). Cognitive-behavioural interventions for children who have been sexually abused. Cochrane Database of Systematic Reviews, 2012(5), 1–70. doi:10.1002/14651858.CD001930.pub3

Olthuis, J. V., Watt, M. C., Bailey, K., Hayden, J. A., & Stewart, S. H. (2016). Therapist-supported Internet cognitive behavioural therapy for anxiety disorders in adults. Cochrane Database of Systematic Reviews, 2016(3), 1–205. doi:10.1002/14651858.CD011565.pub2

Overholser, J. C. (2010). Ten criteria to qualify as a scientist-practitioner in clinical psychology: An immodest proposal for objective standards. Journal of Contemporary Psychotherapy, 40, 51–59. doi:10.1007/s10879-009-9127-3

Rotolo, C. T., Church, A. H., Adler, S., Smither, J. W., Colquitt, A. L., Shull, A. C., … Foster, G. (2018). Putting an end to bad talent management: A call to action for the field of industrial and organizational psychology. Industrial and Organizational Psychology, 11(2), 176–219. doi:10.1017/iop.2018.6

Rubin, A. (2007). Practitioner’s guide to using research for evidence-based practice [ProQuest Ebook version]. Retrieved from https://ebookcentral.proquest.com/lib/ualberta/detail.action?docID=353396

Tasca, G. (n.d.). Research, practice, and PRNs: What is a practice research network (PRN)? Retrieved from https://cpa.ca/sections/clinicalpsychology/prns/

Vespia, K. M. (2006). Integrating professional identities: Counselling psychologist, scientist-practitioner and undergraduate educator. Counselling Psychology Quarterly, 19(3), 265–280. doi:10.1080/09515070600960555

Weisz, J. R., Krumholz, L. S., Santucci, L., Thomassin, K., & Ng, M. N. (2015). Shrinking the gap between research and practice: Tailoring and testing youth psychotherapies in clinical care contexts. Annual Review of Clinical Psychology, 11, 139–163. doi:10.1146/annurev-clinpsy-032814-112820