The use of artificial intelligence in psychotherapy: development of intelligent therapeutic systems

Abstract

Background

The increasing demand for psychotherapy and limited access to specialists underscore the potential of artificial intelligence (AI) in mental health care. This study evaluates the effectiveness of the AI-powered Friend chatbot in providing psychological support during crisis situations, compared to traditional psychotherapy.

Methods

A randomized controlled trial was conducted with 104 women diagnosed with anxiety disorders in active war zones. Participants were randomly assigned to two groups: the experimental group used the Friend chatbot for daily support, while the control group received 60-minute psychotherapy sessions three times a week. Anxiety levels were assessed using the Hamilton Anxiety Rating Scale and Beck Anxiety Inventory. T-tests were used to analyze the results.

Results

Both groups showed significant reductions in anxiety levels. The control group receiving traditional therapy had a 45% reduction on the Hamilton scale and a 50% reduction on the Beck scale, compared to 30% and 35% reductions in the chatbot group. While the chatbot provided accessible, immediate support, traditional therapy proved more effective due to the emotional depth and adaptability provided by human therapists. The chatbot was particularly beneficial in crisis settings where access to therapists was limited, proving its value in scalability and availability. However, its emotional engagement was notably lower compared to in-person therapy.

Conclusions

The Friend chatbot offers a scalable, cost-effective solution for psychological support, particularly in crisis situations where traditional therapy may not be accessible. Although traditional therapy remains more effective in reducing anxiety, a hybrid model combining AI support with human interaction could optimize mental health care, especially in underserved areas or during emergencies. Further research is needed to improve AI’s emotional responsiveness and adaptability.

Introduction

In today’s world, the growing demand for psychotherapy services and limited access to qualified specialists require the search for new treatments for mental disorders. The use of artificial intelligence (AI) in psychotherapy offers the potential to address these challenges, particularly through the development of intelligent therapy systems that can adapt to individual patient needs and provide continuous support. AI is increasingly integrated into various aspects of human lives, including the fields of psychology and psychotherapy. The use of AI in this area opens up new opportunities for expanding access to psychotherapy services and improving their effectiveness [1]. One of the promising areas is the creation of smart therapeutic systems that can provide more individualised treatment and solve the problem of lack of qualified specialists.

Modern studies confirm that the use of AI is effective for the diagnosis and treatment of mental disorders [2, 3]. Some studies highlight that chatbots can effectively help manage symptoms of depression and anxiety [4]. Zheng et al. [5] demonstrated that AI-based systems, specifically an attention-based multi-modal MRI fusion model, can effectively aid in diagnosing and managing major depressive disorder. Their study highlights how continuous monitoring through AI can improve the accuracy of diagnosis and provide real-time interventions, ultimately enhancing the treatment of depression and anxiety. The use of chatbots in psychotherapy has seen significant growth over the past five years. Li et al. [6] conducted a systematic review and meta-analysis, demonstrating that AI-based conversational agents can effectively promote mental health and well-being. Their study highlights that chatbots provide timely support and personalised recommendations, which can significantly reduce anxiety levels and improve psychoemotional well-being, especially among students.

Another important area of research is the integration of AI to manage symptoms of depression and anxiety. Wang et al. [7] developed a high- and low-frequency feature fusion framework for the automatic diagnosis of major depressive disorder, further emphasising AI’s potential to offer more accessible and personalised support to patients. These studies underscore the growing role of AI in improving the accuracy and effectiveness of mental health interventions, particularly in the management of depression and anxiety. Poalelungi et al. [8] note that AI can perform functions such as monitoring patients’ condition and providing treatment recommendations, which opens up new opportunities for psychotherapy practice. However, significant gaps remain in the research, especially regarding the integration of these systems into clinical environments and the evaluation of their long-term effectiveness. The study by Ooi and Wilkinson [9] likewise highlights the need for a more detailed examination of the ethical aspects of AI use in psychotherapy, considering the psychological and social factors that may influence the therapy process. These challenges require a clear regulatory framework and assessment methods that are poorly developed in existing research.

Haber et al. [10] introduced the concept of the “Artificial Third” in psychotherapy, exploring how AI’s presence affects the dynamics between therapists and patients. This study provides a broader perspective on the role of AI in therapy, suggesting that while AI can assist in diagnosis and support, it may also impact the traditional therapist-patient relationship. Ronneberg et al. [11] conducted the SPEAC-2 study, which investigates the potential of a PST-trained, voice-enabled AI counselor for adults experiencing emotional distress. This study provides valuable insights into how AI-driven interventions can offer real-time, personalised emotional support, addressing the growing demand for mental health services where traditional access may be limited. However, while the SPEAC-2 system shows promise, more research is needed to explore its long-term effectiveness in comparison to human therapists.

Alimour et al. [12] explored the quality traits of AI operations in mental health care, focusing on how AI can predict mental health professionals’ perceptions and contribute to enhanced decision-making in psychotherapy. Their study reveals AI’s potential in improving the operational efficiency of therapeutic practices, yet it also points to the need for deeper research into how AI can be fully integrated into clinical environments. Plakun [13] examined the role of AI in psychotherapy, noting that while AI can assist with patient monitoring and provide personalized treatment recommendations, it cannot replace the human element in therapy. This study emphasizes the need for a balanced approach to integrating AI into psychotherapy, where AI serves as a complement rather than a substitute for human therapists. This highlights the necessity for further exploration of how AI can maintain the empathy and personal connection critical to therapeutic success.

The current research addresses a critical gap in the application of AI technologies within mental health care, where such tools remain underutilised compared to their use in physical health. Unlike previous studies, this research not only explores the potential of AI in enhancing diagnostic tools and identifying objective biomarkers for mental illnesses but also evaluates the effectiveness of AI-driven chatbots in real-world crisis scenarios, such as during the war in Ukraine. The significance of this study lies in its comprehensive analysis of AI applications, demonstrating how these technologies can provide timely psychological support when access to traditional therapy is limited. By focusing on user interactions with the Friend chatbot, the research reveals its effectiveness in reducing anxiety among users in high-stress environments, highlighting AI’s role as a crucial adjunct to conventional mental health care.

This study aimed to evaluate an intelligent therapeutic system that integrates AI to support psychotherapy processes, especially where access to psychotherapists is limited. Through an empirical study, it was evaluated how such a system can improve clinical outcomes and optimise the therapeutic process. The study also helped to identify potential problems and the impact of AI technology on psychotherapy practice. In addition, the study aimed to evaluate the effectiveness of the Friend chatbot compared with conventional psychotherapy sessions in reducing anxiety levels in women in crisis situations. Several hypotheses on the usefulness of AI technology, notably chatbots, in mental health treatment were developed:

H1

Using the Friend chatbot will considerably lower participants’ anxiety levels compared to pre-intervention assessments, demonstrating its usefulness as a psychological support aid.

H2

The Friend chatbot will offer timely and effective emotional support, resulting in a significant improvement in participants’ self-reported emotional states and ways of coping.

H3

Integrating AI technology into mental health care would improve the accessibility and availability of psychological support services, especially in crisis situations where traditional treatment is restricted.

H4

Participants who use the Friend chatbot will report high levels of satisfaction and perceived utility, demonstrating that AI-powered solutions can support mental health treatment.

H5

There will be a positive relationship between the number of interactions with the Friend chatbot and the amount of anxiety reduction experienced by participants, suggesting that greater involvement with AI technologies may improve their efficacy.

Materials and methods

The participants for this randomised controlled trial (RCT) were chosen in a multi-step procedure to guarantee that the sample was representative and the results were accurate. Recruitment was carried out through outreach initiatives aimed at women living in regions of active military combat in Ukraine, including social media campaigns. The study included women aged 18 to 60 who had been diagnosed with an anxiety disorder by a qualified mental health practitioner during the preceding six months. Participants were also required to live in areas directly affected by military activities and to have access to a mobile device with an Internet connection. Exclusion criteria included the presence of major mental illnesses (e.g., schizophrenia, bipolar disorder), current substance abuse, pregnancy or breastfeeding, or any condition that would preclude the participant from interacting with the chatbot.

Clinical psychologists performed virtual interviews to confirm the anxiety diagnosis and determine eligibility, with 104 women chosen from among 150 screened. Participants were then assigned to either the experimental group (52 women utilising the AI-based chatbot Friend for daily psychological support) or the control group (52 women having three 60-minute conventional therapy sessions with certified psychologists each week). During the eight-week study, the chatbot gave tailored help via natural language processing and machine learning, adapting its replies to the user’s emotional state.

This study is classified as a clinical trial because participants were randomly assigned to either an experimental group using the Friend chatbot or a control group receiving traditional psychotherapy; random assignment lowers bias and ensures comparability between groups. The AI system uses a natural language processing framework and machine learning methods to communicate with users in real time. It is designed to analyse the user’s emotional state from text inputs and then deliver tailored replies based on therapeutic approaches such as CBT and motivational interviewing. The chatbot employs deep learning models to adjust its replies as conversations progress, imitating an empathic relationship to alleviate anxiety symptoms. Data acquired from these conversations is evaluated by the system’s backend, which performs sentiment analysis to identify distress and offer appropriate coping measures. The randomised design allows a thorough evaluation of the chatbot’s effectiveness in comparison with traditional psychotherapy and supports robust statistical analysis, strengthening the validity of the findings in the context of psychiatric care during crisis situations.
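The Friend chatbot’s internal pipeline is not publicly documented, so the following is only a minimal illustrative sketch of the “detect distress, then respond” loop described above, using a crude keyword lexicon in place of a real sentiment model; the word lists and reply texts are hypothetical.

```python
# Illustrative only: the Friend chatbot's real NLP pipeline is not public.
# A toy lexicon-based sentiment check that routes a user's message to a
# CBT-style coping prompt when distress is detected.

DISTRESS_WORDS = {"afraid", "panic", "anxious", "scared", "hopeless"}
CALM_WORDS = {"better", "calm", "relieved", "safe", "hopeful"}

def sentiment_score(message: str) -> int:
    """Crude polarity score: a negative value signals distress."""
    words = message.lower().split()
    return sum(w in CALM_WORDS for w in words) - sum(w in DISTRESS_WORDS for w in words)

def pick_reply(message: str) -> str:
    """Select a grounding exercise when the message reads as distressed."""
    if sentiment_score(message) < 0:
        return "Let's try a grounding exercise: name five things you can see."
    return "I'm glad to hear that. Would you like to talk about your day?"
```

A production system would replace the lexicon with a trained sentiment classifier and draw replies from a curated therapeutic protocol, but the control flow — score the input, branch on detected distress — is the same shape.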

An RCT was required to establish causality between the intervention and anxiety-reduction outcomes, allowing a complete assessment of the chatbot’s effectiveness. The structured RCT design permits thorough statistical analysis, which improves the validity of the findings in crisis settings. Standardised outcome measures, specifically the Hamilton Anxiety Rating Scale and the Beck Anxiety Inventory, were used to assess anxiety levels before and after the intervention, consistent with clinical trial objectives of evaluating treatment efficacy and safety and distinguishing the study from quasi-experimental designs. The data were analysed using descriptive statistics and t-tests to compare results before and after applying the system. A t-test rather than an ANOVA was used because only two groups were compared: experimental (using the Friend chatbot) and control (receiving conventional psychotherapy); a t-test is appropriate for comparing two groups, while an ANOVA is typically used for three or more. Statistical Package for the Social Sciences (SPSS) version 27 ensured the accuracy and objectivity of results processing. The system used specialised computer equipment with sufficient computing capacity for processing large amounts of data, and secure server equipment for storing and processing patients’ personal data. SPSS was used in accordance with the General Data Protection Regulation (GDPR), ensuring that participants’ personal data was securely recorded and processed and that strict privacy and data-protection standards were observed throughout the study. GDPR compliance preserves participant confidentiality and guarantees the ethical management of sensitive data.

In addition to quantitative measurements, the research looked at other AI apps, such as Tess, Wysa, and Woebot, to get qualitative insights. This study examined their methods, features, and operating environments, as well as how they leverage natural language processing and other AI technologies to increase user engagement. The study’s holistic approach seeks to thoroughly examine the potential benefits of AI technologies in improving the diagnosis and treatment of mental illnesses, as well as to suggest topics for future research and usage in psychotherapy.

Results

The use of AI technology in medicine is growing more popular, particularly for physical health, although its use in mental health remains limited [14]. Mental health care relies on the capacity to build mutual understanding, form connections with patients, and observe their emotions and behaviour, making it challenging to automate these activities with AI [15]. However, AI can dramatically increase our understanding of mental illness and its diagnostic methods. It has the potential to aid in the identification of biomarkers and in the development of more accurate diagnostic tools and risk models for mental illness. For example, electroencephalography (EEG) is used to investigate depression, and deep learning techniques such as neural networks make it possible to accurately discriminate individuals with depression from healthy individuals [16]. Studies have also shown that a three-dimensional neural network can accurately predict depression from video clips.

In addition to diagnostic and prognostic aims, AI-enabled therapeutic tools are quickly evolving. Technological innovation has had a tremendous impact on the field of psychology, particularly with the development of the first chatbot by psychologist Weizenbaum [17]. The software, named ELIZA after the character Eliza Doolittle in George Bernard Shaw’s play Pygmalion, replicated a conversation with a therapist, despite its limited intellectual ability. ELIZA relied on the premise of reformulating the user’s comments into questions, creating the appearance of comprehension and empathy. Using the “Doctor” scenario, ELIZA successfully mimicked the technique of prominent psychologist Carl Rogers, creating the sense of active listening and interaction. This early chatbot highlighted the possibility of employing AI for psychotherapy, setting the framework for future innovations in the field [18]. Modern technologies have advanced significantly, improving the capacity to handle complicated issues and produce more intelligent interactive systems capable of more realistic and in-depth interactions with users in the field of psychology [19]. Tess and other chatbots (Sara, Woebot, and Wysa) offer assistance through text messages and interactive displays [20, 21]. Woebot, for example, uses virtual psychoanalysis to assist users in identifying their emotions and thinking patterns, as well as teaching them resilience and anxiety-reduction strategies [22]. Studies have found that Woebot successfully lessens depressive symptoms. Tess is used to diagnose emotional disorders and to alleviate depression and anxiety. Intelligent psychotherapy systems are built on AI and machine learning, which encompass a wide spectrum of technologies [23].

Tess, a psychotherapeutic chatbot, uses natural language processing, machine learning, and deep learning to assess patients’ language, emotions, and behaviour, modifying replies to give personalised therapy [20]. Tess uses a variety of treatment modalities, including CBT and motivational interviewing, and has demonstrated success in lowering anxiety and depression, particularly among students. Tess is not a replacement for expert care but rather an auxiliary tool that improves access to mental health assistance and provides instructional resources for self-help skill development [24].

Ellie, a virtual therapist created by the Institute of Creative Technologies, uses voice recognition and computer vision to evaluate mental health during virtual consultations [25]. This artificial intelligence system recognises small changes in facial expressions, voice intonation, gestures, and posture, which are critical for accurately assessing mental states. Ellie evaluates semantic and emotional subtleties in speech, while computer vision algorithms analyse microexpressions and gestures, which may disclose more than spoken communication [26]. The system is used to evaluate individuals’ mental health, diagnose signs of post-traumatic stress disorder (PTSD), depression, and anxiety, and develop communication skills. Ellie is also a helpful research tool, helping to gain a greater understanding of the relationship between nonverbal conduct and mental health. This AI application in psychotherapy has demonstrated the ability to improve mental illness diagnosis and treatment, providing critical assistance to individuals who have limited access to traditional healthcare facilities.

SimSensei is an advanced system that uses deep machine learning and behavioural analysis to detect signs of depression and PTSD [27, 28]. It supports realistic user interactions by combining several data sources, such as text messages and audio-video recordings. SimSensei uses natural language processing and video analysis to identify emotional states and behavioural indications of psychiatric problems. Its main feature is its capacity to engage users via an empathic chatbot or virtual agent, providing individualised instruction and psychological support depending on their emotional requirements [29].

Replika is an AI-powered chatbot that functions as a “virtual friend” for tailored communication and emotional support [30]. It uses neural networks and natural language processing to assess user input and provide contextually suitable replies. Replika uses interactive dialogues to help users improve their social skills, give psychological support, and entertain them. It is a tool for personal and emotional growth, especially for people experiencing social isolation or seeking assistance with personal concerns.

Elomia is a mobile app that applies science and technology to improve mental wellness. It uses cognitive behavioural therapy, self-help approaches, and artificial intelligence to provide personalised care. Users complete psychometric assessments and activity tracking, which result in individualised CBT plans that include psychological exercises. AI optimises the therapies, and the app provides visualisation and meditation aids. Elomia has a community platform for user engagement. It was developed by psychology, artificial intelligence, and marketing professionals and now serves thousands of people worldwide, with a free version available to Ukrainians. The software works with OpenAI, using powerful models such as GPT-3 to deliver effective and accessible therapeutic help.

Faino Bot Psy is a stress-management chatbot that communicates anonymously and in five languages. It measures stress levels and provides recovery exercises but emphasises that it does not replace real therapist conversations or treat underlying psychological disorders, focusing instead on rapid help. Similarly, Wysa, an international chatbot therapist, offers a variety of mental health activities and voice conversation without requesting any personal information from users, allowing them to stay anonymous. These tools demonstrate chatbots’ ability to provide psychological assistance while maintaining continuity of care. Despite their limitations in human emotional connection and deep diagnosis, these technologies efficiently address acute needs and complement traditional therapeutic procedures.

Ukraine has introduced a Telegram chatbot called Friend First Aid, which aims to give psychological support in difficult situations using contemporary techniques and scientific research. It is especially useful for individuals who have recently experienced stress and do not have access to an expert. The chatbot engages users by asking questions, offering relaxation techniques, and making suggestions for overcoming obstacles. If necessary, it can refer users to the Tell Me platform for specialised consultations. According to the Centre for Strategic Communications and Information Security, the chatbot has been clinically proven to provide immediate psychological help when direct communication with an expert is not available.

The Friend chatbot uses AI to serve as a virtual companion, asking questions, providing assistance, suggesting relaxing activities, and advising users on stress management. It uses natural language processing to assess user content, produce answers, and customise interactions, successfully recognising and responding to emotional states and learning from each encounter to tailor its assistance. This adaptability is especially useful in crisis settings, where access to standard therapy is restricted. The Friend chatbot is designed for Ukrainian language support and crisis intervention in military settings, while Tess and Woebot focus on regular psychotherapy encounters, and Wysa provides comprehensive mental health services to a global audience. Table 1 presents a comparison of chatbots.

Table 1 Comparison of chatbots

In light of the need for psychological support in Ukraine, the Friend chatbot was identified as the best tool for conducting scientific research. This app was chosen because of its accessibility, ease of use, and support for the Ukrainian language. The purpose of the study was to evaluate the effectiveness of the Friend chatbot in comparison with conventional psychotherapy sessions in times of crisis, in particular during the war in Ukraine. Participants in the experiment were 104 women who were in active combat zones and had experience of being in dangerous conditions. All participants were diagnosed with an anxiety disorder prior to the study, ensuring high initial anxiety levels in the sample. For the study, participants were randomly divided into two groups. The experimental group used the Friend chatbot for daily communication and psychological support. Participants could contact the chatbot at any time of the day, receiving immediate responses and recommendations. The control group, in contrast, received 60-minute sessions three times a week with qualified volunteer psychologists, conducted in person or online depending on safety conditions. Participants’ anxiety levels were assessed at the start of the study and after it was completed using two tools: the Hamilton Anxiety Rating Scale and the Beck Anxiety Inventory (Fig. 1). These tools were chosen because of their widespread use in clinical practice and their ability to accurately measure changes in anxiety levels.

Fig. 1
figure 1

Reduction of the average score on anxiety scales. Note: data from the control group is indicated in green, and data from the experimental group is indicated in blue. Source: compiled by the author

The experiment revealed a substantial reduction in anxiety symptoms in both groups, demonstrating the efficacy of both methods in relieving anxiety. However, the control group showed a greater improvement, most likely owing to the personal interaction with therapists, which may be especially important for some patients in crisis situations. The experimental group, which used the Friend chatbot, demonstrated a 30% drop on the Hamilton Anxiety Rating Scale and a 35% reduction on the Beck Anxiety Inventory, whereas the control group achieved 45% and 50% reductions on these measures, respectively. These findings, displayed in Fig. 1, show that, while the Friend chatbot is an effective tool for lowering anxiety, traditional psychotherapy sessions deliver greater anxiety reduction.
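The reported percentages describe the relative drop in mean scale scores from baseline. As a small sketch of that computation, the helper below reproduces the 30% chatbot-group figure from hypothetical pre/post means (the study does not report the underlying raw means, so the example values are invented for illustration).

```python
def percent_reduction(pre: float, post: float) -> float:
    """Relative drop in a mean anxiety score, as a percentage of baseline."""
    return round((pre - post) / pre * 100, 1)

# Hypothetical group means chosen to reproduce the reported 30% figure
chatbot_hamilton = percent_reduction(28.0, 19.6)  # -> 30.0
```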

A t-test on the Hamilton Anxiety Rating Scale scores indicated a t-value of 2.85 and a p-value of 0.007, showing a statistically significant difference between the two groups, with the therapist group experiencing greater anxiety reduction. Similarly, the t-test for the Beck Anxiety Inventory yielded a t-value of 3.12 and a p-value of 0.003, demonstrating a significant difference between the treatments. These findings imply that, while the chatbot successfully decreases anxiety, the personal engagement and emotional support offered by a therapist produce more profound therapeutic effects, especially in high-stress or crisis circumstances. The greater reduction in anxiety symptoms in the therapist group is likely due to the benefits of direct human connection and a therapeutic relationship, which may be difficult for AI-based systems to fully imitate. Nonetheless, the potential of AI in mental health treatment is clear, particularly in places where traditional therapy may be limited.

The small p-values from both t-tests demonstrate that conventional treatment was more effective at lowering anxiety than the chatbot intervention. However, it is worth noting that the chatbot still produced a significant reduction in anxiety symptoms. This shows that AI-powered treatments, such as the Friend chatbot, can be useful tools, particularly when access to traditional therapy is limited or rapid assistance is required. The chatbot’s accessibility and ability to offer fast replies make it a valuable tool in mental health care.

The gap in outcomes highlights the specific advantages of human therapeutic interaction, in which the therapeutic relationship, characterised by empathy and personalised care, provides benefits that AI technologies may not fully reproduce. Therapists offer a sense of safety and validation while addressing the emotional intricacies of anxiety in ways that existing AI systems cannot. Nonetheless, the chatbot’s significant reductions highlight its potential as a valuable supplement in mental health therapy, particularly in situations where traditional services may be unavailable.

The t-test results highlight the importance of a hybrid model of care that integrates AI technology with human engagement. While AI solutions like the Friend chatbot provide immediate, scalable assistance, especially during emergencies such as the war in Ukraine, they cannot fully replace the empathetic, human care offered by therapists (Table 2). The chatbot can be especially beneficial for patients who might otherwise skip therapy owing to logistical difficulties or stigma, bridging gaps in treatment. However, the greater anxiety reductions observed in the therapist group highlight the particular benefits of human connection, such as emotional validation and customised care, which are challenging for AI to imitate. These findings show the potential of incorporating AI technology, such as the Friend chatbot, into larger mental health efforts in areas where traditional therapy may be less feasible. A hybrid approach could combine the benefits of both modalities, serving immediate demands with technology while establishing deeper emotional relationships with therapists. The chatbot may help with first contact, coping mechanisms, and keeping patients engaged between therapy sessions, all of which contribute to broader therapeutic goals.

Table 2 Comparison of traditional therapy and AI therapy

AI has significant potential in improving the diagnosis and treatment of mental disorders. Technologies such as deep learning and neural networks are effectively used to analyse EEG data and video materials, helping to accurately identify biomarkers of depression and other mental states. Virtual psychotherapists and chatbots that use natural language processing and other AI algorithms demonstrate the ability to effectively interact with patients, providing support and facilitating psychotherapy processes [31]. These technologies not only provide care in a clinical setting, but also expand access to psychological care, reducing barriers associated with remoteness and stigmatisation of mental illness. AI plays a key role in transforming the field of mental health, offering the latest approaches to improve the quality of life of patients [32].

However, it should be noted that, while the results indicate that AI-powered treatments such as the Friend chatbot provide scalable, accessible, and cost-effective solutions, some major challenges arise when applying them to psychotherapy. One key problem is the lack of emotional depth and empathy, which are necessary components of human-based treatment but difficult to reproduce with AI. Human therapists can adapt their approach depending on real-time emotional cues, providing personalised treatment that AI chatbots, which rely on programmed replies, fail to match. Furthermore, privacy and data protection are major concerns with AI systems, which require sensitive patient data to work properly. The long-term usefulness of AI-powered mental health tools is unknown, particularly their potential to produce profound, long-lasting therapeutic change when compared with human therapists. Concerns about algorithmic bias and the inability to form therapeutic relationships also limit their broad usage in mental health treatment. As a result, while AI can fill some gaps in mental health service delivery, especially in crisis situations, these essential challenges demand a hybrid strategy that blends AI with human-centred care to provide more comprehensive support.

The results have important practical implications for the future use of chatbots in psychological support, especially where access to psychological services is limited, offering a low-cost, accessible alternative that can be applied in a wide range of scenarios. These data help clarify the potential benefits of chatbots as a means of psychological support in high-stress situations. The results also highlight the importance of further studying the role of AI in psychotherapy as a complement to traditional therapeutic methods and a way to improve the overall effectiveness of treating mental disorders.

While both AI and traditional psychotherapy were successful in reducing anxiety, the long-term benefit of human interaction remains critical for comprehensive treatment, especially in high-stress settings. Ongoing research will be required to refine these tools and maximise their application for better mental health outcomes. Future research should focus on qualitative aspects of mental health treatment, such as empathy and validation, to optimise mental health assistance. Longitudinal studies and investigation of demographic and contextual characteristics are needed to determine the long-term effectiveness of AI treatments and to understand which treatment outcomes they best support.

Discussion

The present study contributes to the expanding body of evidence supporting the efficacy of AI-powered tools in mental health therapy, particularly in crisis settings. The Friend chatbot significantly reduced anxiety symptoms among participants, especially those in active combat zones. However, the reduction was more pronounced among participants receiving conventional therapy, underscoring the idea that human therapists offer a level of emotional support and flexibility that AI cannot match. This research emphasises the need for hybrid mental health care models that combine AI tools with human interaction to optimise treatment effects.

Tahan and Saleem [22] carried out a systematic review and found that AI-based solutions are effective at delivering timely and personalised mental health care. Similarly, Hoffman et al. [33] found that AI-driven treatments such as Woebot boosted user engagement while providing scalable mental health solutions. However, as demonstrated in the current study, AI tools, while helpful, may not have the same impact as traditional therapy, as emphasised by Sedlakova and Trachsel [34]. Their findings revealed that human qualities such as empathy and trust, which are essential to therapeutic effectiveness, are difficult for AI to imitate.

In terms of diagnostic accuracy, Verma et al. [35] investigated AI's role in speech signal analysis for depression diagnosis and concluded that AI improves precision in mental health diagnosis. However, similar to the current study, they found that AI's potential to completely replace human contact is limited, particularly in emotional and relational circumstances. The current findings indicate that, while the Friend chatbot is effective for quick, scalable assistance, it lacks the personalised and nuanced responses of a therapist, which might explain the larger anxiety reductions reported in the conventional therapy group.

The potential for incorporating AI into broader mental health strategies is apparent, particularly in crisis situations where access to human therapists is limited [36]. Green et al. [37] revealed that AI systems can dramatically increase access to mental health treatment in resource-constrained settings, similar to what the Friend chatbot did in this study for participants in conflict zones. Li et al. [6] also emphasised the effectiveness of conversational AI bots in delivering real-time support, complementing rather than replacing traditional treatment. This is consistent with the outcomes of the current study, which highlight the Friend chatbot's potential as a beneficial supplement to conventional therapy, particularly in areas where access to professional mental health services is restricted.

However, limitations of the current investigation should be acknowledged. The brief duration of the intervention may not capture the chatbot's long-term efficacy. Schaeuffele et al. [38] found that, while AI-driven therapies are useful in the short term, they may not provide the same long-term benefits as transdiagnostic CBT, which has been shown to sustain emotional stability over time. Furthermore, the sample size in this study, while adequate for preliminary findings, may limit the generalisability of the data across diverse communities and cultural contexts.

Building on the present study's findings, which demonstrated the usefulness of AI-powered chatbots such as Friend in providing psychological support, it is important to contextualise these results within the larger body of previous research. The American Psychological Association [39] emphasises the need for personalised care in treating depression across age groups, a premise that AI-driven solutions only partially meet. The present findings support this, since traditional therapy outperformed the chatbot in anxiety reduction, emphasising the critical importance of human empathy and adaptability in psychotherapy, particularly for people with more complex emotional needs.

In contrast, the chatbot’s accessibility and scalability in crisis situations, such as the war in Ukraine, provide a realistic alternative when traditional treatment is unavailable. Studies such as Lindqvist Bagge et al. [40] and Dingle et al. [41], which investigate the mental health effects of COVID-19, highlight the increasing need for mental health services during times of crisis. As demonstrated in the current study, AI-based solutions can play a critical role in alleviating the immediate mental health burden. Similarly, Lee et al. [42] and Adepoju and Valdez [43] discuss how access to mental health treatment became critical during the COVID-19 pandemic, with AI-powered platforms providing a beneficial addition to overcrowded healthcare institutions. However, while AI can fill some gaps in mental health care, Richards [44] raises concerns about its limits in psychotherapy, specifically its inability to build true therapeutic connections. The current study's findings reinforce these concerns, as human therapists provided far greater emotional support and anxiety reduction than the chatbot, demonstrating AI's limits in replicating the subtle, empathetic interactions required for effective treatment.

The present study emphasises the potential of AI-powered systems in psychotherapy, especially in crisis settings where access to human therapists is restricted. This conclusion is consistent with Clark et al. [45], who demonstrated that online cognitive therapy can greatly expand therapists' reach, boosting treatment outcomes in disorders such as social anxiety disorder. While AI systems such as chatbots can improve accessibility, they lack the nuanced, empathetic interaction that human therapists provide, a key limitation observed in the current study and discussed by Sufyan et al. [46], who compared AI models and psychologists in delivering social intelligence.

In terms of practical applicability, the current study's findings are consistent with those of Upadhyay et al. [47], who applied machine learning algorithms for early detection of persistent depressive disorder, suggesting that AI can successfully support early diagnosis and monitoring. However, consistent with Wang [48], the current study found that, while AI systems such as the Friend chatbot provide accessibility and immediate psychological support, they are insufficient replacements for human therapists, especially when dealing with complex mental health issues that require a higher level of engagement.

Furthermore, the hybrid paradigm proposed by this study, in which AI supports human treatment, is supported by studies such as Aafjes-van Doorn [49], who investigated the viability of AI in psychotherapy from both patient and clinician viewpoints. While AI can improve therapeutic interventions and assist clinicians, Suso-Ribera et al. [50] emphasise the importance of human intervention in managing more complex aspects of mental health, such as chronic pain management, which necessitates personalised care beyond the capabilities of current AI systems.

The findings also support Bhatt’s [51] assertion that digital mental health technologies will play an increasingly important role in the future of psychotherapy. However, both Bhatt and the current study recognise AI's limits in replicating the emotional depth and therapeutic relationships that human therapists provide, underscoring the notion that AI should be seen as a complementary tool rather than a standalone solution. According to these findings, the future of psychotherapy is likely to follow a hybrid strategy in which AI extends the reach and efficiency of human therapists without completely replacing them.

Incorporating AI into psychotherapy offers substantial prospects for enhancing access to mental health treatment, particularly in crisis settings where traditional therapy may be limited. The present study's findings are consistent with earlier research, emphasising the significance of AI in extending therapy's reach, assisting early diagnosis, and offering immediate support. However, the limitations identified, such as the lack of emotional depth and nuanced interaction inherent in human treatment, underscore the need for a hybrid approach. While AI systems can supplement and improve the efficiency of therapists, they cannot replace the empathic and adaptable care offered by human specialists, particularly in complicated emotional situations. Future research should examine the long-term efficacy of AI-based mental health therapies, particularly across different crisis circumstances and demographics. Future studies could also examine the integration of AI systems into hybrid models of care that include both human therapists and AI assistance, and assess the influence on overall treatment outcomes. Further development of AI systems that increase emotional depth and responsiveness may improve the quality of psychological assistance provided by chatbots.

Conclusions

This study investigated the effectiveness of the AI-based chatbot Friend in providing psychological assistance to women in active combat zones in Ukraine, comparing it to standard psychotherapy techniques. The study aimed to determine whether AI-driven assistance can be an effective substitute for traditional treatment in high-stress settings with limited resources. In a randomised controlled trial (RCT), 104 participants were split into two groups: those who used the chatbot and those who received traditional therapy from qualified psychologists.

While both interventions were helpful in lowering anxiety symptoms, the study revealed that conventional therapy resulted in a more significant drop in anxiety levels, with a 45% reduction on the Hamilton Anxiety Scale compared to 30% for the chatbot. These findings emphasise the importance of human interaction in mental health therapies, especially in crisis situations where personal connection and empathy are essential for emotional recovery. Despite the better outcomes associated with traditional treatment, the Friend chatbot has shown great potential as a supplementary aid in mental health care. Its accessibility, cost-effectiveness, and instant availability make it particularly useful in settings where access to skilled therapists is limited, such as war zones or regions with scarce healthcare resources. The chatbot's capacity to provide individualised psychological guidance through natural language processing and machine learning highlights AI's growing role in broadening the reach of mental health services, making treatment more scalable and adaptable.
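As a rough illustration of how such group-level outcomes can be computed, the sketch below uses entirely synthetic Hamilton-scale scores; the numbers, group sizes, and the choice of Welch's t-test on change scores are illustrative assumptions, not the study's actual data or exact analysis (the real data are available from the author on request).

```python
import statistics as stats
from math import sqrt

# Synthetic pre/post Hamilton anxiety scores, chosen only so the group
# means echo the reported ~45% vs ~30% reductions. NOT the study's data.
therapy_pre  = [28, 30, 26, 32, 29, 31, 27, 30]
therapy_post = [15, 17, 14, 18, 16, 17, 15, 16]
chatbot_pre  = [29, 27, 31, 30, 28, 32, 29, 30]
chatbot_post = [20, 19, 22, 21, 20, 23, 20, 21]

def pct_reduction(pre, post):
    """Mean percentage drop from baseline, as reported per group."""
    return 100 * (stats.mean(pre) - stats.mean(post)) / stats.mean(pre)

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    va, vb = stats.variance(a), stats.variance(b)
    return (stats.mean(a) - stats.mean(b)) / sqrt(va / len(a) + vb / len(b))

# Per-participant improvement (change scores), then compare the groups.
therapy_drop = [pre - post for pre, post in zip(therapy_pre, therapy_post)]
chatbot_drop = [pre - post for pre, post in zip(chatbot_pre, chatbot_post)]

print(f"therapy reduction: {pct_reduction(therapy_pre, therapy_post):.0f}%")
print(f"chatbot reduction: {pct_reduction(chatbot_pre, chatbot_post):.0f}%")
print(f"Welch t on change scores: {welch_t(therapy_drop, chatbot_drop):.2f}")
```

Comparing change scores rather than raw post-treatment scores is one common way to account for baseline differences between groups; the study's own t-test procedure may have differed.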

The significance of this study stems from its evidence of AI's ability to bridge gaps in mental health care provision, particularly in crisis or resource-constrained contexts. While the chatbot performed less well than traditional treatment, its advantages in accessibility and scalability cannot be overlooked. This study adds to the growing body of evidence that AI-powered chatbots such as Friend can be valuable adjuncts to traditional treatment, providing instant psychological support and aiding anxiety management during critical periods, especially when traditional treatment options are restricted. The findings indicate that such technologies may be successfully integrated into mental health care systems to supplement human therapists, providing immediate assistance in high-stress situations. Future research should focus on improving AI chatbots' emotional responsiveness and investigating the long-term impact of these interventions on mental health outcomes. A hybrid strategy that combines AI technology with traditional therapy approaches may be the most effective answer for addressing mental health issues in both crisis and everyday settings.

Data availability

The data that support the findings of this study are available on request from the corresponding author.

References

  1. Mosquera FEC, Guevara-Montoya MC, Serna-Ramirez V, Liscano Y. Neuroinflammation and schizophrenia: new therapeutic strategies through psychobiotics, nanotechnology, and artificial intelligence (AI). J Pers Med. 2024;14(4):391. https://doi.org/10.3390/jpm14040391

  2. Prescott J, Barnes S. Artificial intelligence positive psychology and therapy. Couns Psychother Res. 2024;24:843–5. https://doi.org/10.1002/capr.12784

  3. Pandey A, Misra M. Artificial intelligence in mental health care. In: Artificial Intelligence and Machine Learning in Healthcare. Singapore: Springer; 2023. p. 117–28. https://doi.org/10.1007/978-981-99-6472-7_8

  4. Kang C, Novak D, Urbanova K, Cheng Y, Hu Y. Domain-specific improvement on psychotherapy chatbot using assistant. In: IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops. Seoul: IEEE; 2024. p. 351–5. https://doi.org/10.1109/ICASSPW62465.2024.10626529

  5. Zheng G, Zheng W, Zhang Y, Wang J, Chen M, Wang Y, Cai T, Yao Z, Hu B. An attention-based multi-modal MRI fusion model for major depressive disorder diagnosis. J Neural Eng. 2023;20(6):066005. https://doi.org/10.1088/1741-2552/ad038c

  6. Li H, Zhang R, Lee YC, Kraut RE, Mohr DC. Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. Npj Digit Med. 2023;6:236. https://doi.org/10.1038/s41746-023-00979-5

  7. Wang J, Li T, Sun Q, Guo Y, Yu J, Yao Z, Hou N, Hu B. Automatic diagnosis of major depressive disorder using a high- and low-frequency feature fusion framework. Brain Sci. 2023;13(11):1590. https://doi.org/10.3390/brainsci13111590

  8. Poalelungi DG, Musat CL, Fulga A, Neagu M, Neagu AI, Piraianu AI, Fulga I. Advancing patient care: how artificial intelligence is transforming healthcare. J Pers Med. 2023;13(8):1214. https://doi.org/10.3390/jpm13081214

  9. Ooi PB, Wilkinson G. Enhancing ethical codes with artificial intelligence governance: a growing necessity for the adoption of generative AI in counselling. Br J Guid Couns. 2024;1–15. https://doi.org/10.1080/03069885.2024.2373180

  10. Haber Y, Levkovich I, Hadar-Shoval D, Elyoseph Z. The artificial third: a broad view of the effects of introducing generative artificial intelligence on psychotherapy. JMIR Ment Health. 2024;11:e54781. https://doi.org/10.2196/54781

  11. Ronneberg CR, Lv N, Ajilore OA, Kannampallil T, Smyth J, Kumar V, Barve A, Garcia C, Dosala S, Wittels N, Xiao L, Aborisade G, Zhang A, Tang Z, Johnson J, Ma J. Study of a PST-trained voice-enabled artificial intelligence counselor for adults with emotional distress (SPEAC-2): design and methods. Contemp Clin Trials. 2024;142:107574. https://doi.org/10.1016/j.cct.2024.107574

  12. Alimour SA, Alnono E, Aljasmi S, El Farran H, Alqawasmi AA, Alrabeei MM, Shwedeh F, Aburayya A. The quality traits of artificial intelligence operations in predicting mental healthcare professionals’ perceptions: a case study in the psychotherapy division. J Auton Intell. 2024;7(4):1–17. https://doi.org/10.32629/jai.v7i4.1438

  13. Plakun EM. Psychotherapy and artificial intelligence. J Psychiatr Pract. 2023;29(6):476–9. https://doi.org/10.1097/PRA.0000000000000748

  14. Beg MJ, Verma M, Vishvak Chanthar KMM, Verma MK. Artificial intelligence for psychotherapy: a review of the current state and future directions. Indian J Psychol Med. 2024. https://doi.org/10.1177/02537176241260819

  15. Huțul TD, Popescu A, Karner-Huțuleac A, Holman AC, Huțul A. Who’s willing to lay on the virtual couch? Attitudes, anthropomorphism and need for human interaction as factors of intentions to use chatbots for psychotherapy. Couns Psychother Res. 2024;00:1–10. https://doi.org/10.1002/capr.12794

  16. Klooster D. Evaluating robustness of brain stimulation biomarkers for depression: a systematic review of magnetic resonance imaging and electroencephalography studies. Biol Psychiatry. 2024;95(6):553–63. https://doi.org/10.1016/j.biopsych.2023.09.009

  17. Weizenbaum J. ELIZA: a computer program for the study of natural language communication between man and machine. Commun ACM. 1966;9(1):36–45. https://doi.org/10.1145/365153.365168

  18. Alhuwaydi AM. Exploring the role of artificial intelligence in mental healthcare: current trends and future directions: a narrative review for a comprehensive insight. Risk Manag Healthc Policy. 2024;17:1339–48. https://doi.org/10.2147/RMHP.S461562

  19. Bocheliuk V, Panov M, Nechyporenko V, Pozdniakova-Kyrbiatieva E. Formation of mental set of subjects of higher education institution for management by the correction game method. Astra Salvens. 2019;7(13):275–88.

  20. Borna S, Maniaci MJ, Haider CR, Gomez-Cabello CA, Pressman SM, Haider SA, Demaerschalk BM, Cowart JB, Forte AJ. Artificial intelligence support for informal patient caregivers: a systematic review. Bioengineering. 2024;11(5):483. https://doi.org/10.3390/bioengineering11050483

  21. Durden E, Pirner MC, Rapoport SJ, Williams A, Robinson A, Forman-Hoffman VL. Changes in stress, burnout, and resilience associated with an 8-week intervention with relational agent Woebot. Internet Interv. 2023;33:100637. https://doi.org/10.1016/j.invent.2023.100637

  22. Tahan M, Saleem T. An overview of the application of artificial intelligence in psychotherapy: a systematic review. Neuropsychiatr Neuropsychol. 2024;19(1):28–39. https://doi.org/10.5114/nan.2024.141887

  23. Kryshtanovych M, Kryshtanovych S, Stepanenko L, Brodiuk Y, Fast A. Methodological approach to determining the main factors for the development of creative thinking in students of creative professions. Creat Stud. 2021;14(2):391–404. https://doi.org/10.3846/cs.2021.14806

  24. Bocheliuk V, Shevtsov A, Pozdniakova O, Panov M, Zhadlenko I. Effectiveness of psycho-correctional methods and technologies in work with children who have autism: systematic review. J Intellect Disabil Diagn Treat. 2023;11(1):10–20. https://doi.org/10.6000/2292-2598.2023.11.01.2

  25. Sawalha J, Yousefnezhad M, Shah Z, Brown MRG, Greenshaw AJ, Greiner R. Detecting presence of PTSD using sentiment analysis from text data. Front Psychiatry. 2022;12:811392. https://doi.org/10.3389/fpsyt.2021.811392

  26. Efremov A. The fear primacy hypothesis in the structure of emotional states: a systematic literature review. Psychol Rep. 2025. https://doi.org/10.1177/00332941241313106

  27. Romaniuk O. Expression and interpretation of attraction and interpersonal intimacy: a comparative study of female nonverbal behaviour. Anal Univer Craiova - Ser Stiinte Filolog Ling. 2021;43(1–2):220–37.

  28. Romaniuk O, Yavorska L. Complimenting behaviour in young adults’ first impression scripts. Anal Univer Craiova - Ser Stiinte Filolog Ling. 2022;44(1–2):168–87. https://doi.org/10.52846/aucssflingv.v44i1-2.58

  29. DeVault D, Artstein R, Benn G, Dey T, Fast E, Gainer A, Georgila K, Gratch J, Hartholt A, Lhommet M, Lucas G, Marsella S, Morbini F, Nazarian A, Scherer S, Stratou G, Suri A, Traum D, Wood R, Xu Y, Rizzo A, Morency LP. SimSensei Kiosk: a virtual human interviewer for healthcare decision support. In: Proceedings of the 2014 International Conference on Autonomous Agents and Multi-Agent Systems. Richland: International Foundation for Autonomous Agents and Multiagent Systems; 2014. p. 1061–8. https://doi.org/10.5555/2615731.2617415

  30. Shaituro O, Holodnyk Y, Pevko S, Khan O. Legal awareness as a factor in preventing illegal (deviant) behavior. Dialog Hum Soc Sci. 2025;3(1):9–16. https://doi.org/10.71261/dhss/3.1.9.16

  31. Shamne A, Dotsevych T, Akimova A. Psychosemantic peculiarities of promotional videos perception. Psycholing. 2019;25(1):384–408. https://doi.org/10.31470/2309-1797-2019-25-1-384-408

  32. Sokól-Szawlowska M. Paternal perinatal depression: cases. Psychol Polska. 2020;54(6):1123–35. https://doi.org/10.12740/PP/110610

  33. Hoffman V, Flom M, Mariano TY, Chiauzzi E, Williams A, Kirvin-Quamme A, Pajarito S, Durden E, Perski O. User engagement clusters of an 8-week digital mental health intervention guided by a relational agent (Woebot): exploratory study. J Med Internet Res. 2023;25:e47198. https://doi.org/10.2196/47198

  34. Sedlakova J, Trachsel M. Conversational artificial intelligence in psychotherapy: a new therapeutic tool or agent? Am J Bioeth. 2022;23(5):4–13. https://doi.org/10.1080/15265161.2022.2048739

  35. Verma A, Jain P, Kumar T. An effective depression diagnostic system using speech signal analysis through deep learning methods. Int J Artif Intell Tools. 2023;32(2):2340004. https://doi.org/10.1142/S0218213023400043

  36. Wahid N, Sajid IA. Factors contributing to on-campus drug use among students: a study of the boys’ hostels in the University of Peshawar. Rev Law Soc Sci. 2025;3(1):23–38. https://doi.org/10.71261/rlss/3.1.23.38

  37. Green EP, Lai Y, Pearson N, Rajasekharan S, Rauws M, Joerin A, Kwobah E, Musyimi C, Jones RM, Bhat C, Mulinge A, Puffer ES. Expanding access to perinatal depression treatment in Kenya through automated psychological support: development and usability study. JMIR Form Res. 2020;4(10):e17895. https://doi.org/10.2196/17895

  38. Schaeuffele C, Meine LE, Schulz A, Weber MC, Moser A, Paersch C, Recher D, Boettcher J, Renneberg B, Flückiger C, Kleim B. A systematic review and meta-analysis of transdiagnostic cognitive behavioural therapies for emotional disorders. Nat Hum Behav. 2024;8:493–509. https://doi.org/10.1038/s41562-023-01787-3

  39. American Psychological Association. Summary of the clinical practice guideline for the treatment of depression across three age cohorts. Am Psychol. 2022;77(6):770–80. https://doi.org/10.1037/amp0000904

  40. Lindqvist Bagge AS, Lekander M, Olofsson Bagge R, Carlander A. Mental health, stress, and well-being measured before (2019) and during (2020) COVID-19: a Swedish socioeconomic population-based study. Psychol Health. 2023;1–18. https://doi.org/10.1080/08870446.2023.2257747

  41. Dingle GA, Han R, Alhadad SS, Beckman E, Bentley SV, Gomersall SR, Hides L, Maccallum F, McKimmie BM, Rossa K, Smith SS, Walter ZC, Williams E, Wright O. Data from four consecutive cohorts of students in Australia (2019–2022) show the impact of the COVID-19 pandemic on domestic and international university students’ mental health. Aust N Z J Psychiatry. 2024;58(6):528–36. https://doi.org/10.1177/00048674241233111

  42. Lee M, Jeong S, Kim CS, Yang YJ. Analysis of health behavior, mental health, and nutritional status among Korean adolescents before and after COVID-19 outbreak: based on the 2019–2020 Korea National Health and Nutrition Examination Survey. J Nutr Health. 2023;56(6):667–82. https://doi.org/10.4163/jnh.2023.56.6.667

  43. Adepoju OE, Valdez MR. Trends in mental health utilization before and during the COVID-19 pandemic: federally qualified health centers as a case study. Popul Health Manag. 2023;26(3):143–8. https://doi.org/10.1089/pop.2023.0006

  44. Richards D. Artificial intelligence and psychotherapy: a counterpoint. Couns Psychother Res. 2024;1–6. https://doi.org/10.1002/capr.12758

  45. Clark DM, Wild J, Warnock-Parkes E, Stott R, Grey N, Thew G, Ehlers A. More than doubling the clinical benefit of each hour of therapist time: a randomised controlled trial of internet cognitive therapy for social anxiety disorder. Psychol Med. 2023;53(11):5022–32. https://doi.org/10.1017/S0033291722002008

  46. Sufyan NS, Fadhel FH, Alkhathami SS, Mukhadi JYA. Artificial intelligence and social intelligence: preliminary comparison study between AI models and psychologists. Front Psychol. 2024;15:1353022. https://doi.org/10.3389/fpsyg.2024.1353022

  47. Upadhyay DK, Mohapatra S, Singh NK. An early assessment of persistent depression disorder using machine learning algorithm. Multimed Tools Appl. 2023;83:49149–71. https://doi.org/10.1007/s11042-023-17369-4

  48. Wang C. Application of MPP database and artificial intelligence system in online evaluation of college students’ mental health. Prev Med. 2023;173:107580. https://doi.org/10.1016/j.ypmed.2023.107580

  49. Aafjes-van Doorn K. Feasibility of artificial intelligence-based measurement in psychotherapy practice: patients’ and clinicians’ perspectives. Couns Psychother Res. 2024;1–11. https://doi.org/10.1002/capr.12800

  50. Suso-Ribera C, Castilla D, Martínez-Borba V, Jaén I, Botella C, Baños RM, García-Palacios A. Technological interventions for pain management. Compr Clin Psychol. 2022;10:219–38. https://doi.org/10.1016/b978-0-12-818697-8.00009-1

  51. Bhatt S. Digital mental health: role of artificial intelligence in psychotherapy. Ann Neurosci. 2024. https://doi.org/10.1177/09727531231221612


Acknowledgements

Not applicable.

Funding

No funding was received for conducting this study.

Author information

Authors and Affiliations

Authors

Contributions

L.S.: designed the study, data collection and analysis, results interpretation, manuscript writing, and editing.

Corresponding author

Correspondence to Liana Spytska.

Ethics declarations

Ethics approval

All procedures were conducted in accordance with the principles outlined in the Declaration of Helsinki. The study was approved by the Ethics Commission of Kyiv International University, No. 11293.

Informed consent

Informed consent was obtained from all individuals included in this study.

Consent for publication

Not applicable.

Competing interests

The author declares no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Spytska, L. The use of artificial intelligence in psychotherapy: development of intelligent therapeutic systems. BMC Psychol 13, 175 (2025). https://doi.org/10.1186/s40359-025-02491-9


  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1186/s40359-025-02491-9

Keywords