by Manuela Benichou, Célie Gérard and Hélène Jalin

Introduction

This scientific paper was written by Dattilio, Edwards and Fishman. Frank M. Dattilio is a professor in the Department of Psychiatry at Harvard Medical School and at the University of Pennsylvania School of Medicine; he also practices clinical psychology and criminology, as well as family therapy. David J. A. Edwards is a former university professor who now practices as a clinical psychologist; he supervises research, teaches at postgraduate level and supervises training in psychotherapy, schema therapy and the treatment of post-traumatic stress disorder. Daniel B. Fishman is the editor-in-chief of the online journal Pragmatic Case Studies in Psychotherapy. He is also a psychologist whose interests include cognitive-behavioral therapies, the systematic case study method, pragmatic psychology and organizational psychology, among others.

This article addresses the “long-standing divide between researchers and practitioners in the field of psychotherapy, regarding what really works in treatment and the extent to which interventions should be governed by outcomes generated in a laboratory atmosphere” (Dattilio, Edwards & Fishman, 2010, p. 427). The paper is intended for researchers and practitioners who struggle to agree on what works in clinical practice and what produces significant results in research. The authors raise the problem that clinicians tend to use tools lacking scientific validity, while practice recommendations derived from scientific advances are sometimes difficult to apply. They promote a mixed-methods paradigm that combines quantitative and qualitative methods (e.g. systematic case studies).

Content of the article

For the authors, the source of this divide is that science focuses on the development of general laws and leaves no room for contextualized knowledge.

When it comes to evaluating the efficacy of a treatment, randomized controlled trials (RCTs) are still considered the gold standard of science. The authors of this article criticize this vision, which they consider too restrictive. For them, case studies, if carried out with scientific methods such as systematic case studies (SCS), should always complement RCTs, because an RCT may neglect or underestimate factors such as the client’s history, their commitment to treatment or the therapeutic alliance.

Furthermore, even though RCT research is a genuine source of knowledge, this method may overlook the fact that clinicians need more contextual information than general laws can provide, and that reducing findings to general laws typically entails a considerable loss of information. For example, the use of standardized questionnaires in RCT research does not capture the wealth of information concerning patients’ subjectivity and how they have lived through difficult events. We can assume that the results obtained may be excessively “purified” of external factors and therefore of little use to practitioners.

However, case studies have traditionally been criticized for their lack of rigor: no objective account based on a recording, data selected at the therapist’s discretion, interpretations that are too narrow and do not take other explanations into account, and finally the absence of contextual information. Case studies are also criticized because findings from a single person (treated as the study population) should not be generalized to a group of people who merely seem to share similar characteristics.

There is therefore a real interest in defining rigorous criteria for obtaining reliable clinical knowledge: the systematic collection of elements allowing a global understanding of the case, the choice of a coherent and recognized theoretical orientation, the formulation of clinical hypotheses and, finally, a detailed description of the course of therapy based on recordings.

As a complement to RCTs, these case studies could focus on patients for whom the therapy has not led to clinical improvement and highlight the factors that could explain this failure. Another approach is the “case comparison method”: comparing good-outcome and poor-outcome cases can provide insight into which factors may explain differences in treatment outcome.

Burckell and McMain (in press) were thus able to explain, thanks to this method, the very different progression of two borderline patients, Marie and Dean, by two main factors: the therapeutic alliance and the quality of the therapist’s supervision. A single case study provides only a small body of evidence, but repeated observation of the relationships between interventions and responses across a series of cases builds a database of evidence on which clinical theory can be constructed.

The authors therefore propose that systematic case studies become a mandatory part of the scientific reporting of studies evaluating the effectiveness of psychotherapies.

Critical evaluation

In this article, the authors make a case against the supposed superiority of either idiographic or nomothetic research. The article’s specific contribution is its willingness to move beyond the limits of each method (qualitative and quantitative) and to argue that their strengths should be combined into a methodology that could be rewarding for both researchers and practitioners. For example, hypotheses identified in case studies could then be tested in RCTs.

Indeed, the authors’ aim is to create a research paradigm including mixed methods, that is to say both qualitative and quantitative, and what they propose, the systematic case study, is rather convincing. However, reconciling field psychologists and research psychologists seems ambitious; many more articles like this one will probably have to be published before the situation changes. As far as research is concerned, though, this paradigm is interesting. Combining different methods nevertheless represents a significant additional workload, and research committees tend to favor short-term studies, which can be an obstacle to applying the mixed-methods paradigm. This pitfall appears in the examples cited by the authors: while the case comparison method brings interesting elements for understanding the failure of a therapeutic intervention, or at least the absence of improvement in the patient’s condition, we have no information on a longer follow-up for Marie, Dean, Tom and Eloïse, the patients mentioned in this article. In addition, we do not have much information on their individual feelings regarding the therapeutic treatment. Longer-term studies would be richer and more interesting.

Conclusion 

To conclude, the authors propose an integrative vision of research methods in clinical psychology which can be very enriching. The practice of psychology in America is undoubtedly different from that of France; we believe that if such an article were written in French, it would have to address the current conflicts between psychoanalysis and cognitive-behavioral therapies, and it would be interesting for a psychoanalyst to give his opinion on this article. Also, as psychologists we must keep in mind that there will always be a gap between research and clinical practice, and that this gap must exist: the clinical encounter cannot be totally systematized; there will always be an unattainable part, and this is the essence of our profession. Subjectivity and the encounter with the patient cannot be reduced to standardized methods, however meticulous they may be. The use of technical tools can hamper the bond between a psychologist and his patient, and, as specified in this article, the best predictor of therapeutic results is the quality of the therapeutic alliance. However, in the field of research, the combination of methods can only be a scientific advance, even if its application may take time and part of the human encounter will always remain elusive.

Words we have learned:

  • Long-standing: de longue date
  • To leave no room for: ne pas laisser de place à
  • A wealth of information: une mine d’informations
  • A pitfall: un écueil
  • To make a case: plaider, faire valoir ses arguments