- Independent Researcher, Athens, Greece
Artificial intelligence-based psychotherapy applications have been evolving rapidly in recent years. They seem to offer solutions to a complex world with enormous mental health needs. Easy access, immediacy and low cost are their enduring advantages, while anonymity attracts people who are isolated by the stigma of mental illness. These applications borrow and incorporate elements of the already proven common psychotherapeutic factors, as described in the so-called ‘contextual model’. As decades of practice have shown, these ‘common factors’ appear to operate in every type of in-person psychotherapy; they are key elements of its successful outcome and the main reason why no one type of psychotherapy has proven superior to another. A central area here is the therapeutic alliance, characterized by the therapist's empathy, the patient's expectations, and shared therapeutic goals. Could artificial intelligence create opportunities for these factors to become even more useful in AI psychotherapy? Improvements in the development of an empathetic therapeutic relationship environment, based on the ‘theory of common factors’, are expected to facilitate the adaptation of interventions and further increase the reliability and effectiveness of AI psychotherapy.
Introduction
As early as 1936, the American psychologist Saul Rosenzweig (1) observed that all types of psychotherapy seemed to be equally therapeutic, invoking the famous saying of the Dodo bird from the story "Alice in Wonderland": "Everybody has won, and all must have prizes", to characterize the results of psychotherapy. He then proposed some common therapeutic factors as a possible explanation, including psychological interpretation, catharsis and the therapist's personality. In 1940, John Watson reported the results of a scientific meeting held to determine areas of agreement between psychotherapeutic systems. Participants, including figures as diverse as Saul Rosenzweig, Alexandra Adler, Frederick Allen, and Carl Rogers, agreed that support, interpretation, insight, behavior change, a good therapeutic relationship, and certain therapist characteristics were common features of successful psychotherapy approaches (2).
After several decades of application, it has not been possible to prove that one psychotherapeutic approach is clearly superior to another. Specific psychotherapies such as Cognitive Behavioral Therapy (CBT), Mindfulness-Based Cognitive Therapy (MBCT), Interpersonal Therapy (IPT), Acceptance and Commitment Therapy (ACT), and others generally do not differ in effectiveness. This evidence suggests that factors common to all psychological therapies (i.e., “common factors”) contribute significantly to the therapeutic mechanism (3–6). For example, the use of ‘transference’, a common element of the interpersonal relationship between patient and therapist, represents an important point of action in the psychotherapeutic process. It is an intriguing phenomenon in which the patient redirects onto the therapist feelings or desires originally addressed to other people in their life. It is both a product of the psychotherapeutic encounter and a mechanism through which the therapy takes place. Could transference, or any other common psychotherapeutic factor, be harnessed by programmers and engineers to design therapeutic methods based on artificial intelligence applications? More generally, how could artificial intelligence create opportunities for these factors to be useful in AI psychotherapies as well? (7).
We suggest that greater integration of the common psychotherapeutic factors into AI-based psychotherapy will enhance its effectiveness and therefore the benefit to patients. The purpose of this article is to review the factors that are common to all types of in-person psychotherapy, as a component of their successful outcome, and to detect these factors, through recent research, in Artificial Intelligence-based psychotherapy to date, including AI chatbots. Artificial intelligence (AI) is defined as the ability of a system to interpret external data, learn from it, and accomplish specific goals through adaptation (8). However, since the term ‘AI’ is used loosely, often simply to describe a classification model (9), the relative immaturity of the field is evident in the absence of consensus on the definition of AI and generative AI in the studies screened in this article.
The common factors of successful psychotherapy
It is now widely accepted that the so-called ‘common factors’ contribute significantly to the success of psychotherapy. The common factors have a long history in the field of psychotherapy theory, research and practice and concern therapeutic alliance, empathy, expectations, cultural adaptation, and therapist differences. These factors are more than a set of therapeutic elements and are common to almost all psychotherapies that have been developed. Collectively, they form a theoretical model regarding the mechanisms of patient change during psychotherapy. The common factors model has been called the contextual model and argues that there are at least three mechanisms through which psychotherapy produces benefits: (a) the real relationship, (b) the creation of expectations, through the explanation of the disorder and the treatment involved, and (c) the implementation of health promoting actions. (10).
Before the work of psychotherapy begins, an initial bond must be created between therapist and patient, i.e. a therapeutic relationship. Some basic level of trust certainly characterizes all varieties of therapeutic relationships, although when attention is directed to more internal experiences, deeper bonds of trust and attachment are required and developed. The initial encounter between patient and therapist is essentially a meeting of two strangers, with the patient deciding whether the therapist is trustworthy, has the necessary experience, and is willing to devote the time and effort to understand both the problem and the context in which the patient and the problem are situated. People make very quick judgments about whether they can trust their therapist, based on the therapist's attire, the layout and decor of the office, and other features of the therapeutic environment. Patients also come to therapy with expectations about the nature of psychotherapy, based on past experiences, recommendations from close associates or influential individuals, and cultural beliefs. The initial interaction between patient and therapist is critical, as more patients appear to terminate therapy prematurely after the first session than at any other point during therapy (10–12).
The real relationship, as defined psychodynamically, is the personal relationship between therapist and patient, characterized by the degree to which each is genuine and perceives or experiences the other in ways that are appropriate to the other. The therapeutic relationship, or alliance, encompasses three central ideas: a collaborative relationship, an affective bond between the therapist and patient, and the ability of the therapist and patient to agree on treatment goals. Although the psychotherapeutic relationship is influenced by general social processes, the interaction is confidential, with some legal restrictions (e.g., reporting child abuse), and the disclosure of difficult material (e.g., spousal infidelity, etc.) does not disrupt the social bond. Indeed, in psychotherapy, the patient can talk about difficult material without the threat that the therapist will end the relationship. The importance of human connection has been discussed for decades, with concepts such as attachment, belongingness, or social support. Psychotherapy provides the patient with a human connection with an empathic and caring person, which promotes health, especially for patients who have poor or chaotic social relationships (10).
Research also shows that expectations have a strong influence on experience (13). Critical to the course of expectations is that patients believe that the provided explanation and the subsequent therapeutic actions will correct their problems. Consequently, the patient and therapist should agree on the goals of the treatment as well as the tasks, which are two critical components of the therapeutic alliance. The creation of expectations in psychotherapy depends on a convincing theoretical explanation, provided to the patient and accepted by him, as well as on therapeutic activities that are consistent with the explanation and that are believed to lead to control of his problems. A strong therapeutic alliance indicates that the patient accepts the treatment and cooperates with the therapist, creating confidence in the patient that the treatment will be successful. Empathy, a complex process through which an individual can be influenced and share the emotional state of another, is considered essential for cooperation, goal sharing and regulation of social interaction, while at the same time reinforcing the influence of expectations (14).
Box 1 lists the common factors that have been recorded in the literature over time, concerning the pre-treatment period, the characteristics of the client and the therapist, the therapeutic relationship, and the structure and development of psychotherapy (2).
Box 1. Common factors in psychotherapy
• Client attributes: Positive expectation, hope or faith / Distressed or incongruent client / Patient actively seeks help / Mental illness perceptions.
• Pretherapy preparation: Expectation of therapeutic success / Perceptions of treatment or outcome / Expectation for length of treatment.
• Therapeutic relationship: Development of alliance or relationship / Engagement / Transference.
• Treatment structure: Use of techniques or rituals / Focus on exploration of the inner world and emotional issues / A healing setting / Interaction / Communication (verbal and nonverbal) / Explanation of therapy and participants' roles.
• Psychotherapist characteristics: General positive descriptors / Cultivates hope and enhances expectancies / Warmth or positive regard / Empathic understanding / Socially sanctioned healer / Acceptance.
• Psychotherapy processes: Opportunity for catharsis or ventilation / Acquisition and practice of new behaviors / Provision of rationale / Foster insight or awareness / Emotional and interpersonal learning / Reality testing / Success and mastery experiences / Persuasion / Placebo effect / Identification with the therapist / Contingency management / Tension reduction / Desensitization / Education and information provision.
On a more general and theoretical basis, Bordin (15) suggested that a strong ‘alliance’ between client and therapist is crucial for a successful outcome. The ‘therapeutic alliance’ rests on three key elements: agreement on goals, agreement on tasks, and the bond between client and therapist. Bordin argued that the alliance is a component of all therapies, although the specifics of the required alliance vary depending on the therapeutic approach. Furthermore, he added: “Strength, rather than the kind of working alliances, will prove to be the major factor in change achieved through psychotherapy”.
It is also worth mentioning the degree of follow-up, continuation, and completion of psychotherapy, as a parameter that indicates both client motivation and satisfaction. Premature termination of treatment hinders the effective delivery of mental health services across various settings, consumer populations, and treatment modalities. Dropout after the first session is estimated at 50% across various settings. Attrition research is complicated by differing therapist and client perceptions of treatment or outcome. Therapists expect treatment to last significantly longer than do clients. Clients prematurely ending treatment may recognize a lack of improvement and believe that additional sessions will not be helpful, a fact often missed by therapists. External factors, such as difficulty in finding mental health services, greater travel distance, placement on a waiting list, and a longer wait from intake to the first treatment session, have repeatedly been linked with treatment dropout. Higher rates of attrition have been found for patients with more severe and more complex diagnostic pictures (i.e., psychosis or Axis II comorbidity). The type of treatment a patient receives also influences rates of dropout. For example, treatments involving both medications and therapy have consistently shown lower rates of attrition than either medication or therapy alone. Demographics, environment, psychological need, and perceptions of illness and mental health treatments influence engagement and retention in treatment. Perceptions of mental health are also likely to influence the utilization of services. Mental illness is often perceived negatively by many ethnic minority groups (16).
One of the earliest strategies for reducing dropout was based on the idea that preparing clients for what would happen in therapy would improve attendance and reduce early dropout. Pretherapy preparation, consisting of education about the nature and process of therapy, offers clients an expectation of therapeutic success, dispels misconceptions about therapy, and has been shown to improve client attendance. Thus, the use of a brief pretherapy training video, a motivational interview, or both could dispel many misconceptions and increase the likelihood of retention. Another factor influencing dropout is the often-differing expectations about treatment duration. The expected length of treatment seems a critical factor to address in conducting effective treatment. Indeed, significant reductions in attrition may be seen if the duration of treatment is clearly articulated and adapted to be more in line with consumers’ actual use of services (17).
Detecting common factors in AI-based psychotherapy
The application of AI to online mental health care is still in its infancy. However, the impact of AI is proving to be impressive. Such tools are increasingly being integrated into practice, offering virtual psychotherapy services, assisting with diagnosis, facilitating consultations, providing psychoeducation, and offering treatment options (18–20). Natural Language Processing (NLP) helps analyze patient language in conversations, chats, emails, and social media posts. It can detect patterns related to mental health issues, such as depression or anxiety, and is a vital component of chatbots (21, 22). Following the machine learning approach, chatbots extract content from user input using NLP and can learn from conversations. They consider the entire context of the dialogue, not just the current line, and do not require a pre-defined response for every possible user input. Often, Artificial Neural Networks (ANN) are used to implement chatbots. Retrieval-based models use a neural network to assign scores and select the most likely response from a set of responses. In contrast, generative models synthesize the response, typically using deep learning techniques (18).
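To make the retrieval-based approach concrete, the following is a minimal sketch in Python. A toy bag-of-words scorer stands in for the neural network scorer described above, and all dialogue text is invented for illustration; this is not the architecture of any specific chatbot.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real retrieval chatbot would use
    # a trained neural encoder over the whole dialogue context.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def select_response(dialogue_context, candidate_responses):
    # Retrieval-based model: score each pre-written response against
    # the full dialogue context and return the highest-scoring one.
    ctx = embed(" ".join(dialogue_context))
    return max(candidate_responses, key=lambda r: cosine(ctx, embed(r)))

context = ["I have been feeling anxious", "the anxiety is worse at night"]
candidates = [
    "Tell me more about what happens at night.",
    "Have you tried keeping a gratitude journal?",
    "What medications are you taking?",
]
print(select_response(context, candidates))
# → Tell me more about what happens at night.
```

A generative model would instead produce a new sentence token by token; the retrieval approach trades that flexibility for the safety of a curated response set.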
The future of mental health care seems to involve a hybrid approach, combining the strengths of artificial intelligence and human therapists (23). AI-powered therapy chatbots offer virtual psychotherapy services and have shown promising results in reducing symptoms of depression and anxiety and helping to address mental health issues in various populations, including the elderly. An AI assessment tool was shown to be 89% accurate in identifying and classifying patients’ mental health disorders from just 28 questions, without human input (24). Graham et al. (25) stated characteristically: “As AI techniques continue to be refined and improved, it will be possible to help mental health practitioners re-define mental illnesses more objectively than currently done in the DSM-5, identify these illnesses at an earlier or prodromal stage when interventions may be more effective, and personalize treatments based on an individual’s unique characteristics”.
In a review of research on mental health chatbots, Li et al. (21) noted that chatbots have the potential to effectively alleviate psychological distress and even result in the creation of therapeutic relationships with AI. Recent studies have shown promising results for AI applications, including mental health monitoring, psychoeducation, suicide risk assessment and prediction, identification of predictors of mental illness, delivery of psychotherapy, therapist training, personalization of online mental health care, mental health triage and decision-making, and promotion of therapeutic engagement (26). A recent study found that the performance of ChatGPT (4.0) in facial emotion recognition is in line with human performance (27), while Hwang et al. (28) found that ChatGPT (4.0) can generate psychodynamic formulations from a case history, and that adding psychoanalytic material can improve the results. Types of psychotherapy applied by AI include Cognitive Behavioral Therapy (CBT), Acceptance & Commitment Therapy (ACT), Dialectical Behavior Therapy (DBT), Mindfulness-Based Cognitive Behavioral Therapy, and Supportive Psychotherapy. However, due to the novelty of this technology, there are several unanswered questions and issues, such as limitations in language interpretation, biases in interacting with patients from different backgrounds, and open issues of ethics, patient safety, and health policy (9, 29).
Online mental health care has several advantages over its in-person counterpart, primarily due to the added privacy and the ability to access healthcare from anywhere with an internet connection. Furthermore, studies show that despite some concerns about the strength of the therapeutic relationship, online mental health care has similar effectiveness to in-person options for managing mental illness. For example, a study by Alavi et al. (30) showed that an online cognitive behavioral therapy (eCBT) program for depression had similar effectiveness and dropout rates to its in-person counterpart, with a medium to large effect size on managing depressive symptoms. Regarding user satisfaction with online psychotherapy tools provided through Artificial Intelligence, several studies reported that their study tool was warmly received and considered useful and encouraging by approximately 60% to 90% of users. These studies highlighted the number of interactions with the tool, the feeling of empathy and understanding, and the appropriateness of the dialogue as important positive factors determining treatment outcomes and satisfaction (31). Friesem (32) describes the characteristics of digital empathy as follows: “digital empathy explores the ability to analyze and evaluate another’s internal state (empathy accuracy), have a sense of identity and agency (self-empathy), recognize, understand and predict other’s thoughts and emotions (cognitive empathy), feel what others feel (affective empathy), role play (imaginative empathy), and be compassionate to others (empathic concern) via digital media”.
Artificial intelligence enables more personalized and adaptive responses using multiple modes of interaction, such as text and voice. Monitoring treatment progress, assessing risk, personalizing the treatment experience, and training new therapists are challenges for online mental health delivery. However, especially in the case of fully self-guided online psychotherapy, the lack of monitoring, risk assessment, and personalization can put the patient at increased risk of dropping out of treatment or of worsening psychiatric symptoms (33). Ewbank et al. (34, 35) applied a deep learning approach to large patient datasets obtained from a variety of eCBT programs for mental health symptoms. They found that time spent on cognitive and behavioral techniques was associated with higher odds of improvement and treatment engagement. Although the authors acknowledge that some non-treatment-related content, such as informative greetings, can be important to the session, too much of it can be disruptive and ultimately detrimental. They also found that patient statements indicating a desire or commitment to change were associated with increased odds of symptom improvement and therapeutic engagement.
More detailed exploration of AI psychotherapies sheds light on the complex internal structure of the psychotherapeutic process. Sperandeo et al. (36) evaluated the possibility of describing the complexity of therapeutic relationships using the methods of machine learning and complex networks. They concluded that the use of graphs is a valid tool for the analysis of both the psychotherapeutic sessions and the evolution of the care relationship over time. Numerous suggestions on the dynamics within the patient–therapist system also emerged from the construction of a complex network useful for describing the trend of psychotherapy. Chen et al. (37) proposed a hierarchical framework to automatically evaluate the quality of an eCBT interaction. Their experimental results suggest that incorporating a local quality estimator leads to better segment representations and to consistent improvements in assessing overall session quality. Chien et al. (38) categorized participants in an eCBT program for depression into five treatment engagement categories, considering treatment platform usage (i.e., time spent on the platform, access to sessions and treatment tools, and treatment session completion) and treatment disengagement rate. They found that lower platform usage was associated with lower symptom improvement rates, while higher platform usage and lower disengagement rates were associated with greater reduction of depression and anxiety symptoms. Gonzalez Salas Duhne et al. (39) applied a supervised machine learning (ML) approach to analyze data from an in-person and an online (eCBT) program for depression and identified five common variables that predict a higher likelihood of early dropout from eCBT: younger age, ethnic minority membership, lower socioeconomic status, medication use, and higher baseline severity of depressive symptoms.
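The kind of supervised dropout prediction just described can be sketched as follows. The five feature names mirror the predictors reported by Gonzalez Salas Duhne et al. (39), but the logistic regression model, the synthetic data, and every numeric value below are illustrative assumptions, not the study's actual method or coefficients.

```python
import math
import random

# Illustrative features mirroring the five reported predictors of
# eCBT dropout: age, ethnic minority membership, socioeconomic
# status, medication use, baseline depression severity.
FEATURES = ["age", "minority", "ses", "medication", "baseline_severity"]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    # Plain gradient-descent logistic regression (no libraries),
    # standing in for the supervised ML model used in such studies.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict_dropout(w, b, x):
    # Returns an estimated probability of early dropout for one patient.
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Synthetic, standardized data: by construction, lower age and higher
# baseline severity push toward dropout (label 1).
random.seed(0)
X, y = [], []
for _ in range(200):
    x = [random.gauss(0, 1) for _ in FEATURES]
    logit = -1.5 * x[0] + 1.0 * x[4]  # lower age, higher severity -> dropout
    y.append(1 if sigmoid(logit) > random.random() else 0)
    X.append(x)

w, b = train_logistic(X, y)
young_severe = [-1.0, 0.0, 0.0, 0.0, 1.5]
older_mild = [1.0, 0.0, 0.0, 0.0, -1.0]
print(predict_dropout(w, b, young_severe) > predict_dropout(w, b, older_mild))
```

In practice such a model would be trained on real clinical records and validated out-of-sample; the point of the sketch is only that dropout risk can be expressed as a learned function of a handful of patient-level variables.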
Box 2 lists some examples of common factors investigated in Artificial Intelligence-assisted Psychotherapy. For example, Schiepek et al. (42) suggested that common factors of psychotherapeutic change and psychological hypotheses on motivation, emotion regulation and information processing of the client’s functioning can be integrated into a comprehensive non-linear model of human change processes. Their model contributes to the development of an integrative conceptualization of psychotherapy, which is consistent with the state of scientific knowledge of common factors, as well as other psychological topics, such as motivation, emotion regulation and cognitive processing. Also, motivational interviewing shows promise in increasing clients' commitment to and involvement with therapy (44). Hadar-Shoval et al. (45) studied, for therapeutic purposes, whether advanced artificial intelligence can adapt its emotional awareness to the personality characteristics of individuals with borderline personality disorder and schizoid personality disorder. ChatGPT demonstrated cognitive abilities, in terms of emotional richness and intensity, adapted to specific personality disorders. Several studies in the field of artificial intelligence-assisted psychotherapy, aimed at predicting treatment response, have identified demographic characteristics associated with more prosperous and less marginalized populations as predictors of better treatment response, highlighting a potential bias in the data. Education level is a frequently cited predictor of treatment response in patients participating in eCBT (40, 41).
Box 2. Examples of common factors investigated in AI Psychotherapy
• Client attributes: Psychological distress (21) / Personality characteristics (45) / Age, ethnic minority, socioeconomic status (39) / Education level (40, 59).
• Pretherapy preparation: Informative greetings - desire or commitment to change (34, 35)
• Therapeutic relationship: Evolution of the care relationship over time (36)
• Treatment structure: Motivation - emotion regulation (42)
• Psychotherapist characteristics: Empathy and understanding (31) / Cost-effective - works continuously - does not wear out or get sick - interacts in different languages (43).
• Psychotherapy processes: Quality of interaction (37) / Time spent on the platform - completion of therapy sessions - dropout rates (38) / Early treatment dropout (30, 39)
The systematic review of Cruz-Gonzalez et al. (46) presented the application of AI in mental health in the domains of diagnosis, monitoring, and intervention. The authors found that the AI methods most frequently used were support vector machines and random forests for diagnosis, machine learning for monitoring, and AI chatbots for intervention. The AI chatbot Fido focuses on dialogue to recognize and modify cognitive biases using Socratic questioning. It identifies suicidal ideation, guiding users to emergency hotlines. It utilizes the ABC technique from CBT, provides psychoeducation on mental health, and offers gratitude practice exercises (47). The chatbot Emohaa, rooted in CBT principles, uses interactive exercises, such as automatic thoughts training and guided expressive writing, to address irrational thoughts and enhance mental well-being in healthy adults (n = 301) (19). Danieli et al. (48) found that mixed treatment combining in-person components with the AI chatbot TEO was most effective in reducing stress and anxiety in active workers (n = 60) over 55 with stress symptoms and mild-to-moderate anxiety. The viability, acceptability, and potential impact of Tess, an AI-based chatbot that delivers brief text conversations as comprehensive mental health support, were evaluated in university students (n = 181) with symptoms of anxiety and depression; an electronic psychoeducation book on depression was used in a control group. Tess was found effective in addressing anxiety but not depressive symptoms (49).
Perspectives
The purpose of this article was to highlight the common psychotherapeutic factors as tools for further improving AI-based psychotherapies. As discussed above, the core of these factors is the so-called ‘therapeutic alliance’, as described by numerous psychotherapists. It even seems that AI-based psychotherapies have the potential to implement Bordin’s ‘strength’ of the working alliance, since they have the advantage of at least immediacy and availability.
Although AI does not replace therapists, many AI applications and tools show that they can provide a reasonable degree of therapeutic support (18). Artificial intelligence is increasingly being used in healthcare, supporting both mental health professionals, in diagnosing and finding the best treatments, and mentally ill patients, by providing information and psychotherapeutic interventions. Advanced technologies such as large language models, which became popular with the launch of ChatGPT in 2022, are being explored for their potential in mental health care to generate sophisticated responses and interactions, supporting the mental health of those in need (50). Artificial intelligence provides support to clients in an overstretched mental health system, bridging the gap where traditional services struggle to meet the growing demand for treatment (52). A chatbot providing therapy can indeed make the mental health system more accessible for people who hesitate to talk to a doctor because they feel uncomfortable revealing their feelings. In some cases, chatbots may be better suited to meeting patients’ wishes than human doctors, because they are not biased against patients, while patients are not biased against chatbots on the basis of gender, age, or race. It is possible that patients worried about social stigma would feel more comfortable asking an AI for help rather than a GP or a human psychotherapist (23, 43).
Chatbots also do not wear out or get sick. They are cost-effective and can operate continuously throughout the day, which is especially useful for people who may have health problems outside of their doctors’ working hours. Chatbots could become surrogates for nonmedical caregivers. They can also interact in different languages to help respond to specific patient needs (43). Although online mental health care aims to promote greater accessibility to services, especially for those who live far from in-person services, systemic barriers such as limited access to internet-enabled devices and low technological literacy can limit the intended benefit of these interventions. Future studies should acknowledge this factor and support accessibility to services via internet-enabled devices and technological literacy for marginalized communities (31).
A major obstacle is the lack of valid real-world databases required to feed data-intensive AI algorithms (33). Moreover, although diagnostic and therapeutic issues are relatively settled in formal psychiatry, there is considerable confusion among the public. In the real world, there is notable misunderstanding of the terminology and concepts of diagnoses and treatments (52). As a result, both accurate diagnosis and effective treatment of mental disorders remain unfulfilled goals (53).
Another obstacle is the small number of studies that have investigated the issues under discussion so far. Data related to eCBT, ACT, predicting treatment dropout, identifying useful treatment aspects, identifying predictors of symptom remission, matching patients to appropriate treatments, and predicting treatment adherence are extracted from very few studies. Thus, without a defined standard or guidelines for the study and application of AI tools in online mental health care, most research groups choose to study and develop their own AI tools, comparison interventions, outcome measures, and intervention designs. Also, several of the available studies lack an appropriate comparator or control group, which may directly affect the observed effect of AI interventions and tools reported in the literature. In summary, the factors that may affect the generalizability of conclusions in AI psychotherapy research are: gender distribution (most studies include mostly women), ethnicity distribution (most studies are conducted in the US, Sweden, and the UK), race (the population mainly identifies as white), and the language of the articles (exclusion of studies not written in English) (31).
Several Large Language Models (LLMs) have “passed” the US medical licensing examination. However, passing a written examination of medical knowledge does not imply the ability to provide safe and effective clinical services. There is a discordance between what today’s models can do and what may be expected of them in real-world clinical workflows (54). The transition from a large language model used to answer medical questions to a tool that can be used by healthcare providers, administrators, and consumers will require significant additional research to ensure the safety, reliability, effectiveness, and privacy of the technology (54, 55). To support the credibility of future studies using ML algorithms in the health sector, the WHO (56) suggests six key factors: justification of the need to use ML, adequacy of data, description of the algorithm used, results including model accuracy and calibration, availability of programming code, and discussion of internal and external validation of the model.
Box 3 provides some examples of such proposed factors and related research directions. Strengthening the therapeutic alliance and an empathic therapeutic relationship are important examples of common psychotherapeutic factors that will enhance the effectiveness of AI psychotherapies. In fact, research into their specific components is expected to reveal hidden aspects of the therapist-patient relationship or even enrich our neuroscientific knowledge. For example, reviewing the literature to determine how the brain perceives human and artificial intelligence as a “presence”, in examples of social perception and decision-making, Harris (51) wondered how much, and in what way, the brain’s response to artificial intelligences will change as people gain more experience with them and as AI becomes more integrated into human life.
Box 3. Suggestions for further integrating common factors into AI Psychotherapy - Suggestions for future research
• Strengthening the empathic therapeutic relationship (10, 32)
Positive expectations / Warmth or positive regard / Empathic understanding / Acceptance / Foster insight or awareness / Education and information provision
• Strengthening the therapeutic alliance – the strength of the working alliance (15)
• Strengthening research in existing fields
Pretherapy preparation (34, 35) / Expectation of therapeutic success / Perceptions of treatment or outcome / Acquisition and practice of new behaviors / Rates of satisfaction and improvement / Rates of dropout (30)
• Promoting new fields of research
On balancing the demographic characteristics of samples (age, gender, educational level, economic status) (39)
On detecting and utilizing those common factors with the greatest therapeutic power (28)
On identifying and analyzing individuals who prefer AI psychotherapy, whether because of the stigma of mental illness or for economic reasons (23, 43)
On monitoring treatment compliance and therapy dropout, and identifying their causes (30, 38)
On detecting possible new therapeutic factors emerging with AI psychotherapy (36)
On investigating how the brain perceives human and artificial intelligence as a ‘presence’ (51)
In closing, the development of artificial intelligence in psychotherapy requires the use of state-of-the-art technologies to avoid over-reliance on algorithmic counseling and to minimize errors. Blindly following algorithmic counsel can lead to unintended consequences, such as oversimplifying human complexity. While machine learning can provide valuable insights and support in psychotherapy settings, it is imperative to maintain a balanced approach that preserves human contact and recognizes the limitations of algorithmic decision-making (57). Continuous training and development for professionals in the field are recommended to ensure a balanced integration of technology and human expertise. Further applications of computational methods need to be identified to improve outcomes in AI psychotherapy, and further collaborations are needed to develop specific algorithms for different psychotherapies and patient groups (58). At the same time, the fullest possible integration of ‘common therapeutic factors’ into AI psychotherapies, by creating a ‘therapeutic alliance environment’, will further enhance their immediacy, reliability and effectiveness (59, 60).
Conclusion
With nearly 100 years of psychotherapy practice, the main characteristics of both the patient and the therapist, the dynamics of the therapeutic relationship, and the intermediate functions that serve the patient's progress have already been studied extensively. The pre-treatment period, with the development of appropriate expectations and the provision of information, plays an important role, while the individualized application of specific psychotherapeutic techniques can serve the needs of certain special patient populations. So far, the various AI psychotherapy programs use a variety of psychotherapeutic techniques while incorporating various common psychotherapeutic factors. Immediacy and accessibility will always be the strong point of AI psychotherapy, in a world with complex relationships and enormous therapeutic needs. The ready analysis of AI psychotherapy data can now provide more accurate explanations of client behavior during treatment, including dropout. In the future, balancing the demographic characteristics of study samples, and improvements in the development of an empathetic therapeutic relationship environment based on common psychotherapeutic factors, are expected to facilitate the tailoring of interventions and to further increase the reliability and effectiveness of AI psychotherapy.
Data availability statement
The original contributions presented in the study are included in the article/supplementary material. Further inquiries can be directed to the corresponding author.
Author contributions
OG: Writing – original draft, Writing – review & editing.
Funding
The author(s) declare that no financial support was received for the research, and/or publication of this article.
Conflict of interest
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declare that no Generative AI was used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
1. Rosenzweig S. Some implicit common factors in diverse methods of psychotherapy. Am J Orthopsychiatry. (1936) 6:412–5. doi: 10.1111/j.1939-0025.1936.tb05248.x
2. Grencavage LM and Norcross JC. Where are the commonalities among the therapeutic common factors? Prof Psychology: Res Pract. (1990) 21:372–8. doi: 10.1037/0735-7028.21.5.372
3. Arch JJ, Eifert GH, Davies C, Vilardaga JCP, Rose RD, Craske MG, et al. Randomized clinical trial of cognitive behavioral therapy (CBT) versus acceptance and commitment therapy (ACT) for mixed anxiety disorders. J Consulting Clin Psychol. (2012) 80:750–65. doi: 10.1037/a0028310
4. Bögels SM, Wijts P, Oort FJ, and Sallaerts SJM. Psychodynamic psychotherapy versus cognitive behavior therapy for social anxiety disorder: An efficacy and partial effectiveness trial. Depression Anxiety. (2014) 31:363–73. doi: 10.1002/da.22246
5. Barth J, Munder T, Gerger H, Nüesch E, Trelle S, Znoj H, et al. Comparative efficacy of seven psychotherapeutic interventions for patients with depression: A network meta-analysis. PloS Med. (2013) 10:e1001454. doi: 10.1371/journal.pmed.1001454
6. Lambert MJ, Shapiro DA, and Bergin AE. The effectiveness of psychotherapy. In: Garfield SL and Bergin AE, editors. Handbook of psychotherapy and behavior change. Wiley, New York (1986). p. 157–212.
7. Gabrielli S, Rizzi S, Carbone S, and Donisi V. A chatbot-based coaching intervention for adolescents to promote life skills: pilot study. JMIR Hum Factors. (2020) 7:e16762. doi: 10.2196/16762
8. Haenlein M and Kaplan A. A brief history of artificial intelligence: on the past, present, and future of artificial intelligence. California Manage Rev. (2019) 61:5–14. doi: 10.1177/0008125619864925
9. Kolding S, Lundin RM, Hansen L, and Østergaard SD. Use of generative artificial intelligence (AI) in psychiatry and mental health care: a systematic review. Acta Neuropsychiatr. (2024) 37:e37. doi: 10.1017/neu.2024.50
10. Wampold BE. How important are the common factors in psychotherapy? update. World Psychiatry. (2015) 14:270–7. doi: 10.1002/wps.20238
11. Willis J and Todorov A. First impressions: making up your mind after a 100-ms exposure to a face. Psychol Sci. (2006) 17:592–8. doi: 10.1111/j.1467-9280.2006.01750.x
12. Connell J, Grant S, and Mullin T. Client initiated termination of therapy at NHS primary care counselling services. Couns Psychother Res. (2006) 6:60–7. doi: 10.1080/14733140600581507
13. Horvath AO. The alliance in context: Accomplishments, challenges, and future directions. Psychother (Chic). (2006) 43:258–63. doi: 10.1037/0033-3204.43.3.258
14. Elliott R, Bohart AC, Watson JC, and Greenberg LS. Empathy. Psychotherapy. (2011) 48:43–9. doi: 10.1037/a0022187
15. Bordin ES. The generalizability of the psychoanalytic concept of the working alliance. Psychotherapy: Theory Res Pract. (1979) 16:252–60. doi: 10.1037/h0085885
16. Barrett MS, Chua W-J, Crits-Christoph P, Gibbons MB, and Thompson D. Early withdrawal from mental health treatment: Implications for psychotherapy practice. Psychotherapy: Theory Research Practice Training. (2008) 45:247–67. doi: 10.1037/0033-3204.45.2.247
17. Messer SB. What makes brief psychodynamic therapy time efficient. Clin Psychology: Sci Pract. (2001) 8:5–22. doi: 10.1093/clipsy.8.1.5
18. Adamopoulou E and Moussiades L. Chatbots: History, technology, and applications. Mach Learn Appl. (2020) 2:100006. doi: 10.1016/j.mlwa.2020.100006
19. Sabour S, Zhang W, Xiao X, Zhang Y, Zheng Y, Wen J, et al. A chatbot for mental health support: exploring the impact of Emohaa on reducing mental distress in China. Front Digit Health. (2023) 4:1133987. doi: 10.3389/fdgth.2023.1133987
20. Alanezi F. Assessing the effectiveness of chatGPT in delivering mental health support: A qualitative study. J Multidiscip Health. (2024) 31:461–71. doi: 10.2147/JMDH.S447368
21. Li H, Zhang R, Lee Y-C, Kraut RE, and Mohr DC. Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and wellbeing. NPJ Digital Med. (2023) 6:236. doi: 10.1038/s41746-023-00979-5
22. Holohan M and Fiske A. “Like I’m talking to a real person”: Exploring the meaning of transference for the use and design of AI-based applications in psychotherapy. Front Psychol. (2021) 12:720476. doi: 10.3389/fpsyg.2021.720476
23. Minerva F and Giubilini A. Is AI the future of mental healthcare? Topoi. (2023) 42:809–17. doi: 10.1007/s11245-023-09932-3
24. Tutun S, Johnson ME, Ahmed A, Albizri A, Irgil S, Yesilkaya I, et al. An AI-based decision support system for predicting mental health disorders. Inf Syst Front. (2023) 25:1261–76. doi: 10.1007/s10796-022-10282-5
25. Graham S, Depp C, Lee EE, Nebeker C, Tu X, Kim HC, et al. Artificial intelligence for mental health and mental illnesses: an overview. Curr Psychiatry Rep. (2019) 21:116. doi: 10.1007/s11920-019-1094-0
26. Boucher EM, Harake NR, Ward HE, Stoeckl SE, Vargas J, Minkel J, et al. Artificially intelligent chatbots in digital mental health interventions: a review. Expert Rev Med Devices. (2021) 18:37–49. doi: 10.1080/17434440.2021.2013200
27. Elyoseph Z, Refoua E, Asraf K, Lvovsky M, Shimoni Y, and Hadar-Shoval D. Capacity of generative AI to interpret human emotions from visual and textual data: pilot evaluation study. JMIR Ment Health. (2024) 6:e54369. doi: 10.2196/54369
28. Hwang G, Lee DY, Seol S, Jung J, Choi Y, Her ES, et al. Assessing the potential of ChatGPT for psychodynamic formulations in psychiatry: An exploratory study. Psychiatry Res. (2024) 331:115655. doi: 10.1016/j.psychres.2023.115655
29. Beg MJ, Verma M, Vishvak Chanthar KMM, and Verma MK. Artificial intelligence for psychotherapy: A review of the current state and future directions. Indian J Psychol Med. (2025) 47:314–25. doi: 10.1177/02537176241260819
30. Alavi N, Moghimi E, Stephenson C, Gutierrez G, Jagayat J, Kumar A, et al. Comparison of online and in-person cognitive behavioral therapy in individuals diagnosed with major depressive disorder: a non-randomized controlled trial. Front Psychiatry. (2023) 14:1113956. doi: 10.3389/fpsyt.2023.1113956
31. Gutierrez G, Stephenson C, Eadie J, Asadpour K, and Alavi N. Examining the role of AI technology in online mental healthcare: opportunities, challenges, and implications, a mixed-methods review. Front Psychiatry. (2024) 15:1356773. doi: 10.3389/fpsyt.2024.1356773
32. Friesem Y. Developing digital empathy: a holistic approach to media literacy research methods. In: Yildiz MN and Keengwe J, editors. Handbook of research on media literacy in the digital age, vol. 1 . IGI Global, Hershey, PA (2016). p. 145–60. doi: 10.4018/978-1-4666-9667-9.ch007
33. Koutsouleris N, Hauser TU, Skvortsova V, and De Choudhury M. From promise to practice: Towards the realisation of AI-informed mental health care. Lancet Digit Health. (2022) 4:e829–40. doi: 10.1016/S2589-7500(22)00153-4
34. Ewbank MP, Cummins R, Tablan V, Bateup S, Catarino A, Martin AJ, et al. Quantifying the association between psychotherapy content and clinical outcomes using deep learning. JAMA Psychiatry. (2020) 77:35–43. doi: 10.1001/jamapsychiatry.2019.2664
35. Ewbank MP, Cummins R, Tablan V, Catarino A, Buchholz S, and Blackwell AD. Understanding the relationship between patient language and outcomes in internet enabled cognitive behavioural therapy: A deep learning approach to automatic coding of session transcripts. Psychother Res. (2021) 31:326–38. doi: 10.1080/10503307.2020.1788740
36. Sperandeo R, Mosca LL, Galchenko A, Moretto E, Di Sarno AD, Longobardi, et al. The nodes of treatment: A pilot study of the patient-therapist relationship through the theory of complex systems. In: Progresses in artificial intelligence and neural systems. Springer, Singapore (2020). p. 585–93.
37. Chen Z, Flemotomos N, Singla K, Creed TA, Atkins DC, and Narayanan S. An automated quality evaluation framework of psychotherapy conversations with local quality estimates. arXiv. (2022) 2106:07922. doi: 10.1016/j.csl.2022.101380
38. Chien I, Enrique A, Palacios J, Regan T, Keegan D, Carter D, et al. A machine learning approach to understanding patterns of engagement with internet-delivered mental health interventions. JAMA Netw Open. (2020) 3:e2010791. doi: 10.1001/jamanetworkopen.2020.10791
39. Gonzalez Salas Duhne P, Delgadillo J, and Lutz W. Predicting early dropout in online versus face-to-face guided self-help: A machine learning approach. Behav Res Ther. (2022) 159:104200. doi: 10.1016/j.brat.2022.104200
40. Wallert J, Boberg J, Kaldo V, Mataix-Cols D, Flygare O, Crowley JJ, et al. Predicting remission after internet-delivered psychotherapy in patients with depression using machine learning and multi-modal data. Transl Psychiatry. (2022) 12:1–10. doi: 10.1038/s41398-022-02133-3
41. Rodrigo H, Beukes EW, Andersson G, and Manchaiah V. Exploratory data mining techniques (decision tree models) for examining the impact of internet-based cognitive behavioral therapy for tinnitus: Machine learning approach. J Med Internet Res. (2021) 23:1–13. doi: 10.2196/preprints.28999
42. Schiepek GK, Viol K, Aichhorn W, Hütt M-T, Sungler K, Pincus D, et al. Psychotherapy is chaotic (Not only) in a computational world. Front Psychol. (2017) 8:379. doi: 10.3389/fpsyg.2017.00379
43. Palanica A, Flaschner P, Thommandram A, Li M, and Fossat Y. Physicians’ Perceptions of chatbots in health care: cross-sectional web-based survey. J Med Internet Res. (2019) 21:e12887. doi: 10.2196/12887
44. Walitzer KS, Dermen KH, and Connors GJ. Strategies for preparing clients for treatment: A review. Behav Modif. (1999) 23:129–51. doi: 10.1177/0145445599231006
45. Hadar-Shoval D, Elyoseph Z, and Lvovsky M. The plasticity of ChatGPT’s mentalizing abilities: personalization for personality structures. Front Psychiatry. (2023) 14:1234397. doi: 10.3389/fpsyt.2023.1234397
46. Cruz-Gonzalez P, He AW-J, Lam EP, Ng IMC, Li MW, Hou R, et al. Artificial intelligence in mental health care: a systematic review of diagnosis, monitoring, and intervention applications. psychol Med. (2025) 55:1–52. doi: 10.1017/S0033291724003295
47. Karkosz S, Szymański R, Sanna K, and Michałowski J. Effectiveness of a web-based and mobile therapy chatbot on anxiety and depressive symptoms in subclinical young adults: randomized controlled trial. JMIR Formative Res. (2024) 8:e47960. doi: 10.2196/47960
48. Danieli M, Ciulli T, Mousavi SM, Silvestri G, Barbato S, Natale D, et al. Assessing the impact of conversational artificial intelligence in the treatment of stress and anxiety in aging adults: randomized controlled trial. JMIR Ment Health. (2022) 9:e38067. doi: 10.2196/38067
49. Klos MC, Escoredo M, Joerin A, Lemos VN, Rauws M, and Bunge EL. Artificial intelligence–based chatbot for anxiety and depression in university students: pilot randomized controlled trial. JMIR Formative Res. (2021) 5:e20678. doi: 10.2196/20678
50. Raile P. The usefulness of ChatGPT for psychotherapists and patients. Humanities Soc Sci Commun. (2024) 11. doi: 10.1057/s41599-023-02567-0
51. Harris LT. The neuroscience of human and artificial intelligence presence. Annu Rev Psychol. (2024) 75:433–66. doi: 10.1146/annurev-psych-013123-123421
52. Giotakos O. Psychiatry in the real world. Front Psychiatry. (2025) 16:1616276. doi: 10.3389/fpsyt.2025.1616276
53. Stein DJ, Shoptaw SJ, Vigo D, Lund C, Cuijpers P, Bantjes J, et al. Psychiatric diagnosis and treatment in the 21st century: paradigm shifts versus incremental integration. World Psychiatry. (2022) 21:393–414. doi: 10.1002/wps.20998
54. Singhal K, Azizi S, Tu T, Mahdavi SS, Wei J, Chung HW, et al. Large language models encode clinical knowledge. Nature. (2023) 620:172–80. doi: 10.1038/s41586-023-06291-2
55. Su S, Wang Y, Jiang W, Zhao W, Gao R, Wu Y, et al. Efficacy of artificial intelligence-assisted psychotherapy in patients with anxiety disorders: A prospective, national multicenter randomized controlled trial protocol. Front Psychiatry. (2022) 12:799917. doi: 10.3389/fpsyt.2021.799917
56. WHO. Ethics and governance of artificial intelligence for health. Guidance on large multi-modal models (2024). Geneva: World Health Organization. Available online at: https://www.who.int/publications/i/item/9789240084759 (Accessed October 11, 2025).
57. Richards D. Artificial intelligence and psychotherapy: A counterpoint. Couns Psychother Res. (2025) 25:e12758. doi: 10.1002/capr.12758
58. Cioffi V, Mosca LL, Moretto E, Ragozzino O, Stanzione R, and Bottone M. Computational methods in psychotherapy: A scoping review. Int J Environ Res Public Health. (2022) 19:12358. doi: 10.3390/ijerph191912358
59. Hammelrath L, Hilbert K, Heinrich M, Zagorscak P, Knaevelsrud C, Crowley JJ, et al. Select or adjust? How information from early treatment stages boosts the prediction of non-response in internet-based depression treatment. Psychol Med. (2024) 54:1641–50. doi: 10.1017/S0033291723003537
Keywords: psychotherapy, AI psychotherapy, common factors, therapeutic alliance, eCBT, artificial intelligence–based psychotherapy
Citation: Giotakos O (2025) Artificial intelligence-based psychotherapy: focusing on common psychotherapeutic factors. Front. Psychiatry 16:1710715. doi: 10.3389/fpsyt.2025.1710715
Received: 22 September 2025; Accepted: 28 October 2025;
Published: 11 November 2025.
Edited by:
Heleen Riper, VU Amsterdam, Netherlands
Reviewed by:
Kim Mathiasen, Aarhus University, Denmark
Copyright © 2025 Giotakos. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Orestis Giotakos, info@obrela.gr