Introduction

Artificial intelligence (AI) is a topic that is currently (in the spring of 2023) producing headlines in media worldwide. One update follows the next, new areas of application are being tried out and developed, and there are innumerable warnings about the dangers of artificial intelligence, which has already arrived in the field of psychotherapy (e.g., Aktan et al., 2022; Cioffi et al., 2022; Sedlakova and Trachsel, 2022). Some psychotherapists offer AI-assisted psychotherapy (Helgadottir and Menzies, 2023), and a frequently asked question is whether AI chatbots can become the psychotherapists of the future (Wei, 2023). Commonly cited examples of psychotherapeutic applications in which AI might be useful relate to manualized psychotherapeutic treatments (e.g., Creed et al., 2022; Piette et al., 2022). AI can design a treatment plan adapted to a patient’s answers to certain diagnostic questions, which the patient (with or without therapeutic support) can then follow. For example, a group of researchers in China developed a chatbot to help treat patients with anxiety disorders who take anxiolytics. The chatbot includes 15 treatment modules drawing on cognitive behavioral therapy (CBT), psychoeducation, mindfulness therapy, problem solving, positive emotions, coping with insomnia, etc., twelve of which are presented to patients based on their input (Su et al., 2022).

Chatbots belong to a relatively new class of technology called conversational artificial intelligence (CAI). One of these chatbots, ChatGPT by the AI developer OpenAI, is known worldwide. The big difference, besides the media attention ChatGPT receives, is that it is available to and can be used by everyone with an internet connection, including psychotherapists and people with psychological disorders. For example, one of its most frequently used functions is generating texts and summaries on various topics for school and university. The software is so good that its output cannot be detected by conventional plagiarism software. Such use can only be detected when part of a text comes from the chatbot and another part is self-written, because the self-written part usually contains typos and differs in style. How does the popular chatbot work, and is it really applicable to psychotherapeutic treatment?

This question had already been asked in another paper, by Eshghie and Eshghie (2023), which was not yet published when this manuscript was written but was accessible by the time this paper was revised. In that article, the authors state that ChatGPT can engage in positive conversations, actively listen, and provide validation and coping strategies (Eshghie and Eshghie, 2023). This paper, in contrast, critically evaluates ChatGPT’s capabilities from the perspective of approach pluralistic psychotherapy science. The special characteristics of psychotherapy and approach pluralistic psychotherapy science are presented first. The following chapter introduces the chatbot ChatGPT and the current literature on ChatGPT and psychotherapy. Three fields of AI application are then analyzed by means of case examples, namely the suitability of AI as an assistant for psychotherapists, as a support for patients who are already undergoing psychotherapy, and for patients who are not yet undergoing psychotherapy. These case examples are real chats with the bot, which can be read in full in the Supplement. In the final chapters, conclusions are drawn based on ChatGPT’s responses to the author’s requests and discussed in the context of psychotherapy science.

Approach pluralistic psychotherapy science

Psychotherapy has many definitions and is a wide field containing numerous approaches, such as psychoanalysis, psychosynthesis, Gestalt therapy, countless methods within cognitive behavioral therapy, provocative therapy, logotherapy, Daseinsanalysis, bioanalysis, ecotherapy, existential psychotherapy, Morita therapy, and many more (Corsini, 1973). Some researchers and clinicians claim that psychotherapy is a special field of clinical psychology or medicine; others argue that psychotherapy is a science and profession in its own right. Psychoanalysts and psychotherapists applying different approaches have advocated the scientific and professional independence of psychotherapy ever since the so-called question of lay analysis arose in the 1920s (Pritz, 1996). Following the radical-constructivist method called treatment-possibilities-expanding psychotherapy science, there are nearly as many psychotherapy approaches as there are psychotherapists (Raile, 2024). This is due to psychotherapists’ integrative ambitions: once they have completed their psychotherapeutic training in a specific method, they look beyond this approach and discover many further ways to interpret and treat patients (Norcross and Alexander, 2019; Crameri et al., 2018). These pathways can be integrated into practitioners’ psychotherapeutic schemata and expand their treatment possibilities. The numerous diverse approaches result in each psychotherapist developing his or her own therapeutic style.

If two patients with the same symptoms went to two different physicians or psychologists, both would ideally be diagnosed and treated according to current scientific standards. This is not possible in psychotherapy, because it is not only the symptoms that matter, but the person displaying them, his or her biography, social environment, etc. Psychotherapeutic treatment and research are also significantly shaped by the individual psychotherapist’s personality and treatment approach. This is one of the reasons why randomized controlled trials (RCTs) are not very useful for practical psychotherapeutic treatment in individual cases (Köhlke, 1992; Kriz, 2000, 2008, 2019; Raile, 2024). Therapists must adapt to every situation if they want to treat patients successfully. This characterizes psychotherapy and distinguishes it from experimental psychology. In psychotherapy, there are many ways of working with patients/clients that are no better or worse than others, but one may be more suitable for the individual psychotherapist and patient-therapist constellation. From this perspective, it is reasonable and necessary for psychotherapy to maintain its plurality of approaches. The question remains whether and how ChatGPT can be integrated into this multifaceted psychotherapeutic field.

ChatGPT

ChatGPT introduces itself:

“ChatGPT is a chatbot based on a large language model developed by OpenAI called GPT (Generative Pre-trained Transformer). It is designed to simulate human-like conversation by generating text in response to user inputs. ChatGPT is trained on a vast amount of data, allowing it to understand and generate text in a wide range of topics and styles.” (ChatGPT, personal conversation on April 9, 2023)

In practice, it is easy to use. One only needs to create an account and log in; then one can ask ChatGPT any question one wants answered. The chatbot writes whole paragraphs and texts at the desired length and level of detail within a few seconds. This simplicity of use is one reason why chatbots like ChatGPT can and probably will be used in the future for questions about any life situation, including difficult life challenges as well as psychological issues.
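Beyond the chat interface, the same model can also be queried programmatically. The following is a minimal sketch, assuming the openai Python package as distributed in spring 2023 (pre-1.0 interface); the API key placeholder, model name, and question are illustrative assumptions, not part of the case material in this paper.

```python
# Minimal sketch of querying ChatGPT programmatically, assuming the
# `openai` Python package as available in spring 2023 (pre-1.0 API).
# The model name and question are illustrative examples.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; supply your own key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model behind the free ChatGPT at the time
    messages=[
        {"role": "user",
         "content": "What treatment approaches exist for social anxiety disorder?"},
    ],
)

# The reply text is returned inside the first choice of the response object.
print(response["choices"][0]["message"]["content"])
```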

According to its own description, ChatGPT is presently the most advanced language model for responding adequately to a user’s input text (ChatGPT, personal conversation on April 18, 2023). ChatGPT has an answer to every question. When asked how it can be useful for psychotherapy, it offers the following four aspects: 1) it can assist in screening and diagnosis, 2) provide information, reminders, and guidance to patients during treatment, 3) provide emotional support to patients through listening and empathy, and 4) assist with skills training (ChatGPT, personal conversation on April 14, 2023). The fact that it provides answers, however, does not mean they are correct.

Although ChatGPT is a very young phenomenon, the number of scientific texts about it is growing rapidly. In the short time span between the writing of this paper and its revision, about 500 new articles on ChatGPT were published according to Scopus. These papers cover topics such as ChatGPT in orthopedics and sports medicine (Fayed et al., 2023), ChatGPT in academia (Chan, 2023), ChatGPT as a psychiatrist (Das and Ghoshal, 2023), ChatGPT as a scientific reviewer (Qureshi et al., 2023), and the moral influence of ChatGPT’s answers on humans (Krügel et al., 2023). Significantly more technical literature exists on AI in general. The spectrum includes works on information technology, the impacts of AI application in various areas (economy, industry, etc.), as well as cultural, psychological, philosophical, and particularly ethical questions (e.g., Chen, 2023; Gupta et al., 2023; Lapidus, 2023; Rakowski et al., 2023; Ullmann, 1965).

Unfortunately, scientific literature on ChatGPT and psychology/psychotherapy is still lacking (Uludag, 2023), but some papers have been published on related issues. For example, Javaid et al. (2023) cite application areas for ChatGPT in healthcare such as patient education, clinical trial support, access to health-related information, counseling, symptom and syndrome recognition support, treatment program development, and individual health counseling. Bubeck et al. (2023) state that ChatGPT version 4 has a deep understanding of theory of mind. Sharma et al. (2023) find that AI-human collaborations lead to significantly more empathetic conversations than conversations among humans alone. They do not refer to ChatGPT but to another chatbot they developed, although the result is interesting with regard to OpenAI’s chatbot as well. Carlbring et al. (2023) point out limitations of ChatGPT, such as its lack of understanding of irony, which leads to unfavorable reactions, or its limited ability to carry on long conversations, which may hinder therapeutic bonding. Ethical concerns are also discussed in the context of psychotherapy. Biron (2023) finds that posts from chatbots on a peer-based mental-health platform receive higher ratings than posts by humans, which is why transparency concerning authorship is important. Wang et al. (2023) note ethical concerns about healthcare: legal-ethical questions about responsibility in the case of patient harm are raised, as are questions of algorithmic ethics regarding bias, transparency, validation, and evaluation. New applications of ChatGPT in healthcare and psychotherapy are continuously being developed and explored, for example an as yet unpublished tool for the psychoanalytic interpretation of patient cases (personal conversation with Karl Golling on September 27th, 2023). In summary, ChatGPT is considered a potentially useful tool in healthcare, but concerns remain about accuracy, reliability, and jurisdictional impact. Further research evaluating ChatGPT’s clinical outcomes and comparing its effectiveness to that of other AI chatbots in healthcare is necessary and important (Temsah et al., 2023).

Besides scientific papers, there are reports from users who, for example, write about successful trauma recovery through externalization with the assistance of the chatbot, or who claim that “ChatGPT is better than my therapist” because they only had to mention things once and it responded (unlike the therapist) to the whole question (source: user posts on Reddit). There are now even instructions from bloggers on how best to use ChatGPT as a psychotherapist substitute. They suggest starting by defining the problem and emphasize that it is important to be honest (a chatbot does not judge and is not indiscreet) and to use open-ended questions and positive affirmations. To use ChatGPT’s resources efficiently, it may be necessary to take breaks, to process what has been read, and to know the strengths and weaknesses of the chatbot (Martinez, 2023).

ChatGPT and psychotherapy—three case studies

Methodology

The research question to be answered in the following empirical section is whether and how ChatGPT can provide a useful tool for patients and/or psychotherapists within a psychotherapeutic setting, or even serve as a substitute for psychotherapy. Three case studies are presented and analyzed here to answer this question. The underlying method is a modified (Footnote 1) psychotherapeutic case study, which is interpreted with the scientific method of hermeneutics (Chessick, 1990). Artificial intelligence is an intentional system developed by intentional human beings, and Zhu and Harrell (2009) argue that statements made by AI can be interpreted hermeneutically. In a hermeneutic approach to psychotherapeutic case studies, an underlying theory of interpretation is important, as is a research question (Edwards, 1998; Greenwood and Loewenthal, 2007). The case studies reported in this paper are not interpreted from a single psychotherapeutic approach, such as psychoanalysis, but from the perspective of approach pluralistic psychotherapy science as described in the section ‘Approach pluralistic psychotherapy science’ and especially in Raile (2024). The three case studies were selected according to their practical relevance, comprehensibility, and usefulness in answering the research question. Examples from the author’s own psychotherapeutic practice (Footnote 2) are cited to ensure practical relevance. Three maximally different application scenarios are presented to illustrate the broad field of possible applications of ChatGPT in psychotherapy.

ChatGPT can be used by anyone with internet access. In the context of psychotherapy, psychotherapists, patients, interested laypersons, relatives (of patients or of people suffering from mental illness), care professionals, etc. could apply the chatbot. Two primary groups of participants in psychotherapy are introduced here: psychotherapists and people suffering from mental disorders, whether or not they are already psychotherapy patients. At least three situations can be differentiated in which ChatGPT can be useful in the context of psychotherapy: 1) a mentally suffering person does not consult a psychotherapist; there is no ongoing psychotherapy; 2) a person with mental-health problems regularly consults a psychotherapist; there is ongoing psychotherapy; and 3) a person with mental-health problems regularly consults a psychotherapist, but, due to a vacation or other reasons, the next session is not scheduled until after a longer period of time or has not yet been scheduled at all, i.e., there is an interruption in the psychotherapy sessions. Combined, the two groups and three situations result in six different scenarios for using ChatGPT in psychotherapy (Footnote 3).

If only the useful options are considered, two of these positions drop out. If there were no psychotherapy, psychotherapists would only ask ChatGPT about fictional cases or general issues. A central question of this study, however, is the potential use of ChatGPT in the context of psychotherapy, i.e., the treatment of individuals with mental disorders. Therefore, the option of using ChatGPT as a theoretical or technical information tool for psychotherapists is not analyzed or discussed in detail. The same argument applies to psychotherapists during a long pause between psychotherapy sessions. Two further options are combined in the following case studies, namely those of patients in active psychotherapy and those experiencing long pauses in psychotherapy. In both cases, ChatGPT can be used as a supplement or short-term substitute, whether during a longer pause or between two sessions.

Three ways of using ChatGPT in psychotherapy are analyzed in this paper: 1) by psychotherapists during active psychotherapy, as an AI assistant or supervision tool; 2) by patients who are in active or interrupted psychotherapeutic treatment and use the chatbot for additional support between sessions; and 3) by people who do not go to psychotherapists (for numerous possible reasons) and use ChatGPT as a psychotherapy substitute for self-treatment. The case studies are presented in the next three sections.

ChatGPT for psychotherapists

The author of this paper is a psychotherapist trained in individual psychology. A few years ago, I had a 16-year-old patient, Sarah (pseudonymized; she agreed that her case could be used as an example), who suffered from social phobia. Taking on my role as her therapist at that time, I asked ChatGPT for assistance on two points. First, I sought help in diagnosing her, and second, I sought suggestions on how to treat her adequately. My goal was to expand my individual psychological treatment repertoire with the logotherapeutic technique of paradoxical intention, because my individual psychological approach had not been successful in treating her.

The complete conversation can be read in the Supplement. ChatGPT received a detailed description of the patient and her symptoms from the author. The chatbot promptly responded and suggested three diagnoses, each with a brief explanation: depression, generalized anxiety disorder, and social anxiety disorder. The diagnoses were accurate and corresponded to the patient’s behavior at the time, and the suggestions were helpful as a second opinion, as ChatGPT has a good understanding of theory of mind, including psychopathology (Bubeck et al., 2023). However, the descriptions of the individual diagnoses were minimal and not very helpful. At the end of the response, it was added that only a mental-health professional can make a diagnosis and that the questioner should consult one.

No treatment options were mentioned, so I asked what treatment options and therapeutic approaches exist for treating social anxiety disorder. The response included a few common options: cognitive behavioral therapy, mindfulness-based therapy, psychodynamic therapy, and medication. The four options were described so generally that they provided a trained psychotherapist with no new information (e.g., psychodynamic therapy was described as focusing on unconscious conflicts and working through past experiences in the present). The response concluded that it is important to find an effective treatment path together with the patient, one that also fits his/her individual wishes and needs.

Since the response did not provide satisfactory results, a detailed question was asked about whether logotherapy and existential analysis offer helpful treatment paths for social anxiety disorder and how a logotherapist would treat the patient. The AI responded that logotherapy and existential analysis can be effective in the treatment of social anxiety disorder. After a brief description of what logotherapy is, five treatment paths were listed: the search for meaning, reframing negative thoughts, encouraging actions, emphasizing the patient’s personal responsibility for his/her own well-being, and strengthening existential mindfulness. All points were briefly described. One treatment option was missing, namely Frankl’s primary technique for social anxiety: paradoxical intention. If someone were to write a paper on social anxiety and logotherapy and not mention paradoxical intention, the paper would immediately be rejected. Frankl’s technique would have to have a prominent place in the essay, even if one wanted to distance him/herself from it. Since the AI did not mention this technique, it had to be explicitly asked whether the technique makes sense for social anxiety disorder and how it can be applied. Only then did ChatGPT provide a usable, but still rather brief, answer describing the technique and its possible application to the patient. At the end of the response, it was again mentioned that one should be careful when applying the technique because not every patient feels comfortable facing his/her fears in this way. If the psychotherapist has some prior knowledge and if the questions are asked appropriately, the chatbot’s answers are useful in suggesting treatment options or explaining them in more detail.

From the approach pluralistic psychotherapy perspective, the responses show a lack of diversity in psychotherapeutic approaches. The suggestions include only cognitive behavioral therapy, mindfulness-based therapy (part of CBT since its third wave), and psychodynamic therapy. Body psychotherapies, Gestalt therapy, systemic therapy, and existential analysis were not mentioned. From the author’s point of view, this is problematic because the broad field of psychotherapy is already dominated by CBT (David et al., 2018). ChatGPT, which has no experience with psychotherapy and no knowledge of the diversity of approaches, is biased and passes this dominance on to the psychotherapists who use it. It is also not helpful if core techniques of a psychotherapeutic approach are not mentioned even when psychotherapists ask, and only (too) general information is provided. Search engines and psychotherapeutic information websites have a clear advantage over chatbots in this respect, because they return a variety of results that often offer far more comprehensive information about psychotherapeutic approaches and techniques. However, Google search, for example, is not interactive and cannot paraphrase results in a more easily understandable way if needed. ChatGPT’s answers would need to be tailored to the target group for it to become a supportive tool for psychotherapists. Psychotherapists, at least in German-speaking countries and many others (Raile, 2024), have specialized knowledge of the approach in which they were trained, as well as basic knowledge of other approaches. ChatGPT has another useful feature that psychotherapists can also use, namely the ability to interpret patients’ material from the perspective of different psychotherapeutic approaches. In the following chapter, such an example is presented from the patient’s point of view, but psychotherapists can realize it as well.

ChatGPT for patients in psychotherapeutic treatment

In this case study, a patient who had been in psychoanalytic therapy for several months turned to ChatGPT to bridge the time between sessions, as well as during the therapist’s vacation. Before describing the case study, I would like to share an interesting observation. While researching this paper, I asked the chatbot a wide variety of questions, among them why Sigmund Freud would recommend ChatGPT. One answer was that ChatGPT can serve as a tool for discovering the unconscious. A few days later, I asked it what unconscious desires were behind this question. It answered that it cannot identify or interpret the unconscious. The conversation can be read in the Supplement (Conversation 2). This self-contradiction is almost characteristic of how the psychodynamic perspective is integrated in ChatGPT.

In the following case study, a patient’s dream is described anonymously with his consent, and ChatGPT is asked for an interpretation. This could occur, for example, if a person wants to continue therapeutic work while the psychodynamic psychotherapist is on vacation, or wants input on his/her dreams after completing therapy. In the first step, the chatbot was simply told the dream in detail. It responded that dreams can be mysterious, but that it could try to interpret the symbols and meanings. It broke the story down into several symbols and sequences and offered an interpretation for each one. The conclusion was that the dream may suggest a mix of excitement, fear, and uncertainty. After this interesting interpretation, the chatbot was asked whether it could also provide a psychodynamic interpretation, and was then asked for a systemic interpretation, a psychoanalytic interpretation according to Wilfred Bion, a Daseinsanalytic interpretation, and an individual psychological interpretation according to Alfred Adler. Each of these interpretations began with a paragraph on how dreams are viewed in the particular approach, followed by several similar paragraphs. The dream was always broken down into the same symbols and sequences, and the interpretations themselves were quite similar, although not identical. One example was the dreamer’s refusal of an offered massage due to concerns about who was offering it (the dreamer was male, and the person offering the massage in the dream was a former female classmate). In most of the interpretations, this was read as a fear of being dependent on others, as a sense of mistrust or caution in one’s relationships, and as a fear of intimacy and vulnerability. It is notable that in the individual psychological interpretation, for example, not one of Adler’s technical terms (feelings of inferiority, striving for power, social interest, etc.) was mentioned. There are phrases, such as striving to master and belong, which can be read in the light of individual psychology.

ChatGPT was finally asked what the conclusion of all the interpretations was. The answer was that it may help to reflect on the different interpretations offered and to consider which one resonates most with one’s own experience. That could, indeed, be helpful. From an approach-pluralistic point of view, the interpretations were not bad, but also not outstanding. To put this in perspective: compared to search engines like Google Search, scientific psychotherapy websites, and other AI software, ChatGPT’s interpretations are exceptional; however, they are certainly not appropriate for publication in professional journals. For patients who want a new perspective on their experiences, such as a dream, the chatbot can be quite helpful. In the author’s opinion, it offers a good opportunity for people who already have psychotherapy experience to work alone with their material, such as dreams or memories. This requires the ability to self-reflect as well as the willingness to deal with discomforting interpretations. Individuals who have no experience with professional interpretations of dreams or memories may be scared off by the answers or believe that they represent some kind of “objective truth”.

ChatGPT for patients without psychotherapeutic treatment

One strength of ChatGPT is its ability to provide information. The author tested this by pretending to be a person suffering from a mental illness. The case example represents a real case from the author’s psychotherapeutic practice, presented here pseudonymized and with informed consent. I took on the role of this person (Footnote 4), who turned to ChatGPT instead of a psychotherapist for help. The conversation can be read in the Supplement (Conversation 4).

One first notices that ChatGPT answers empathically (Footnote 5) and mentions that it is not a mental-health professional, but just an AI language model that provides a safe, confidential space to talk about one’s problems. Despite being given several reasons for not seeing a professional (lack of money or energy), it keeps suggesting alternatives, such as free clinics, support groups, and online counseling, while staying in the conversation and making further suggestions about what to do to feel better. The first suggestions are to prioritize self-care activities and to break tasks into smaller, more achievable steps in order to improve the motivation to complete them. To the reply that going out into nature is not possible due to a lack of energy, it responds that one can bring nature into the house by adding plants, opening the window, or watching nature documentaries. Again, it stresses that small steps are important, even if one goes outside for only a few minutes each day.

ChatGPT was also asked how to deal with traumatic images. The chatbot suggests two techniques: mindfulness and cognitive restructuring. To the feedback that “I can’t just observe painful images impassively”, it replies that not everyone can use every technique and promptly suggests another one, namely self-compassion, adding that it can take time to find the strategy that works best for coping with trauma. The author’s final response was that simply accepting the pain sounded too easy. The chatbot responded that acceptance alone would not make the pain go away, but that it could be a first step. It finally pointed out once more that it is important to get support from loved ones or mental-health professionals, as well as to engage in self-care activities, practice relaxation techniques, and identify and challenge negative thoughts and beliefs.

From the viewpoint of approach pluralistic psychotherapy, ChatGPT is strongly biased in favor of CBT. Other approaches are not mentioned, but at least some techniques and suggestions that patients can implement themselves are. A major weakness of ChatGPT, despite regular reminders to see an expert, is its failure to ask for more information, such as biography, presence of suicidal thoughts, symptoms, and other important data. Compared to search engines like Google, the information is useful, but not as broad and detailed. The strength of ChatGPT is the interactivity with the user, who can ask questions at any time and promptly receive adequate answers in a safe environment, unlike openly readable psychotherapy forums, for example. In the author’s opinion, ChatGPT can be a supportive tool, especially because it provides a safe framework and “empathy”, but it must not replace psychotherapy as a stand-alone tool until important adjustments have been made. I strongly agree with the persistent reminders that ChatGPT cannot replace a professional psychotherapist and that the user should consult one.

Discussion

A total of four conversations (three case studies and an interesting anecdote) are presented in the three chapters and can be read in their entirety in the Supplement. Conversations are held with ChatGPT in all three case studies: one from the point of view of a psychotherapist who wants a second (professional) opinion and new input on his case, one from the point of view of a patient who wants to continue working on his material (in this case a dream) during active or interrupted psychotherapy, and one from the point of view of a person with mental impairments who cannot or does not want to see a psychotherapist.

From the perspective of approach pluralistic psychotherapy science (Raile, 2024), the one-sidedness of ChatGPT is particularly striking with regard to its therapy proposals, which generally describe CBT methods. Besides CBT and psychodynamic psychotherapy, no other psychotherapy methods are mentioned unless explicitly requested, and when an approach like logotherapy is explicitly asked about, the answers are very general and incomplete. While ChatGPT’s answers would need to be more balanced and in-depth to serve psychotherapists well, its understanding of theory of mind is profound (Bubeck et al., 2023), and its dream interpretations can be genuinely useful for patients who already have experience with this kind of interpretation. People with mental disorders who turn to ChatGPT as a first point of contact receive empathic responses and a private, safe environment in which to open up (Sharma et al., 2023); this is also visible in the last case study. It is a positive point that ChatGPT regularly notes that it is not a psychotherapist and that patients should consult one. A neutral aspect is that therapy suggestions are made that can be either helpful or harmful depending on the situation and considering the limitations of ChatGPT, e.g., its inability to understand irony. On the negative side is the bias in favor of CBT, which can lead people without knowledge of psychotherapy to equate psychotherapy with CBT, without knowing that there are numerous other helpful approaches that may be better suited to the individual person seeking help. This bias is an aspect of algorithmic ethics (Wang et al., 2023) that should either be corrected or at least clearly flagged. In addition, it needs to be clarified who is liable for chatbot errors if people with mental issues are harmed. As Wang et al. (2023) state, ensuring the accuracy, reliability, and validity of ChatGPT-generated content requires strong validation and ongoing updates based on clinical practice. At least as OpenAI currently operates, the content is not continuously validated, and the database is not updated. From the psychotherapeutic-ethical point of view, the use of ChatGPT is therefore at least questionable.

The three case studies do not represent all possible cases; they are selected examples of the chatbot’s responses in the version of March 23, 2023. The responses may be completely different in just three months, so the results are only a snapshot. Furthermore, the usefulness of ChatGPT for therapists and patients cannot be verified or generalized solely on the basis of these three cases, the current statements, and the conclusions. Numerous aspects must be considered in determining whether ChatGPT is useful for a patient or psychotherapist. For example, patients and psychotherapists with training or experience in CBT may find the chatbot’s responses more useful than those with a background in client-centered psychotherapy. Psychotherapists and patients who already know different psychotherapeutic approaches can ask ChatGPT more detailed questions and will receive different answers than those with no knowledge of psychotherapy. Without knowledge in the field, it is hard to determine whether the chatbot’s answers are correct, comprehensive, and ultimately satisfactory. It is the author’s opinion that ChatGPT’s responses should be regulated based on the professional and scientifically grounded recommendations of a panel of psychotherapists. As long as ChatGPT is unregulated, the author’s advice is to question the chatbot’s answers critically.

Three examples are presented here that can and will occur in similar form in practice. ChatGPT is freely accessible and is being used more and more. Alternative tools also exist, such as Google search, psychotherapy forums, scientific psychotherapy websites, and other AI software, but these can hardly compete with ChatGPT at present, for several reasons. ChatGPT, unlike most other chatbots, is free and accessible at any time. It is interactive, unlike Google search or scientific psychotherapy sites, and provides more detailed information on demand, in simpler language if needed. It answers in real time, unlike psychotherapy forums, and offers a private, protected setting in which no other person can read what one writes. Unlike most alternatives, ChatGPT is “empathetic”, which can help individuals with psychological disorders who are not seeking psychotherapy, or who have issues with trust, to speak more openly about their problems. That is why it is important to explore it in detail. This paper is intended as a small ray of inspiration, not a systematic exploration of the possible use of AI as a psychotherapist.

Summary

One of ChatGPT’s strengths is its ability to consider the whole conversation in context, which enables it to react adequately to the user’s input and to provide more detailed information. This sort of psychoeducation can be very helpful for some people suffering from mental issues, but it has two problems. First, the answers are very brief and do not take the whole patient context into account, because this is not what was requested; the chatbot only provides general information about mental issues, diagnoses, and ways to treat them at home or with a professional therapist. The second problem is the one-sidedness of the suggestions: they mainly cover cognitive behavioral therapy, mindfulness-based therapy, skills training, etc. ChatGPT seems less able to deal with other approaches, such as humanistic, systemic, and psychodynamic ones, although the dream interpretations do sound reasonably well-founded. Nevertheless, the subjective factor, i.e., biography, social context, and so on, must always be considered. ChatGPT does not do that.
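At the API level, this consideration of the whole conversation is realized by resending the accumulated message history with every turn, so that the model always sees the full dialogue. The following minimal sketch illustrates the mechanism, again assuming the pre-1.0 openai Python package from spring 2023; the system prompt, questions, and model name are illustrative assumptions, not material from the case studies.

```python
# Minimal sketch of conversational context at the API level: the client
# resends the accumulated message history on every turn, so the model can
# refer back to earlier parts of the dialogue. Assumes the pre-1.0 `openai`
# package (spring 2023); all strings below are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

history = [{"role": "system",
            "content": "You are a supportive conversation partner."}]

def ask(user_text):
    """Append the user's turn, query the model with the full history,
    and store the reply so that later turns keep the context."""
    history.append({"role": "user", "content": user_text})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=history,  # the whole conversation so far, every turn
    )
    reply = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

# The second question only makes sense because the first exchange is
# still part of the transmitted history.
print(ask("I have trouble sleeping and keep brooding at night."))
print(ask("Which of the techniques you just mentioned is easiest to try?"))
```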

People suffering from mental illnesses, especially those with no knowledge of psychotherapy, may think that these are the only treatment possibilities. If we assume that people respond differently to different approaches, this is a clear weakness of the chatbot. Nor can it adopt a genuinely empathic, appreciative, and congruent attitude toward the patient, which is a central value in person-centered psychotherapy, or use the creative methods of Gestalt therapy in a meaningful way. It also cannot recognize nonverbal signals, which are important in the therapeutic process.

Overall, the three case studies illustrate that ChatGPT can be an interesting complement to psychotherapy and an easily accessible, good (and currently free) place to turn for people with mental-health problems who have not yet sought professional help and have no psychotherapeutic experience. Despite the ethical concerns and the bias in favor of CBT, the practical importance and value of ChatGPT in psychotherapy should not be underestimated. For psychotherapists, ChatGPT can be a useful tool for getting a second opinion on diagnoses, having appropriate treatment techniques suggested, learning about other psychotherapeutic approaches, and obtaining a second interpretation of patient material. No comparable technology is as easily accessible as ChatGPT, which is equipped with comprehensive information about mental-health problems, gives lifelike answers, and can even provide interpretations. As long as there are no regulations and access to the chatbot remains this easy, it will be used in the field of psychotherapy. It must be kept in mind that its information is one-sided, and any future regulation of AI must make clear that its proposals are not only insufficient as a psychotherapy substitute but also biased in favor of certain methods, while other approaches that may be more helpful for some people are not even mentioned.