AI Chatbots: New Therapists for Asian Youth
In the quiet hours of the night, when anxieties peak and friends are asleep, a new confidant is emerging for many young people in Taiwan and China: AI chatbots. This trend highlights a significant shift in how mental wellbeing is approached, driven by a blend of technological advancement, accessibility, and societal pressures.
The Allure of AI as Digital Therapy
Ann Li, a 30-year-old from Taiwan, found herself turning to ChatGPT during a health crisis. "It’s easier to talk to AI during those nights," she shared, unable to confide in family or in friends who were asleep. Similarly, Yang*, a 25-year-old in Guangdong, China, began conversing with an AI chatbot "day and night" because traditional mental health services were hard to access and opening up to loved ones felt impossible.
These stories are not isolated. A growing number of people in Chinese-speaking communities are choosing generative AI over, or as a precursor to, human therapists. Experts acknowledge the immense potential of AI in mental healthcare but also voice serious concerns about people in distress relying on technology for medical assistance.
Data on this trend is still emerging, but mental health professionals in both Taiwan and China report an uptick in patients who have first consulted AI. Global studies, like a recent Harvard Business Review analysis, indicate psychological support is a primary reason adults use AI chatbots. Social media platforms are also awash with posts praising AI's therapeutic benefits.
Why Are Young People Turning to AI?
This shift coincides with rising mental health challenges, especially among younger populations in Taiwan and China. Access to professional services often fails to keep pace with demand; appointments are scarce and expensive. Users find AI chatbots a time-saving, cost-effective, and discreet alternative, particularly in societies where mental health stigma persists.
Dr Yi Hsien Su, a clinical psychologist in Taiwan, notes, “In some way the chatbot does help us – it’s accessible, especially when ethnic Chinese tend to suppress or downplay our feelings.” He observes that while Gen Z is more open about their struggles, “there’s still much to do.”
In Taiwan, ChatGPT is a popular choice. In mainland China, where Western apps like ChatGPT are blocked, users turn to domestic options such as Baidu’s Ernie Bot or the newer DeepSeek. These platforms are rapidly evolving, integrating wellbeing and therapy features to meet growing demand.
User Experiences: A Mixed Bag
Reactions to AI therapy are diverse. Ann Li found ChatGPT's responses to be what she wanted to hear, but also predictable and lacking deep insight. She missed the journey of self-discovery inherent in human counselling, stating, "I think AI tends to give you the answer, the conclusion that you would get after you finish maybe two or three sessions of therapy."
Conversely, Nabi Liu, a 27-year-old Taiwanese woman in London, described her experience as very fulfilling. “When you share something with a friend, they might not always relate. But ChatGPT responds seriously and immediately,” she said. “I feel like it’s genuinely responding to me each time.”
Expert Views: Potential and Precautions
Experts suggest AI can be beneficial for individuals in distress who may not yet require professional intervention, or for those needing encouragement to seek further help. Yang*, for instance, initially doubted whether her struggles were serious enough to warrant professional attention. “Only recently have I begun to realise that I might actually need a proper diagnosis at a hospital,” she confessed. For her, AI was a crucial first step: “Going from being able to talk [to AI] to being able to talk to real people might sound simple and basic, but for the person I was before, it was unimaginable.”
However, professionals also highlight significant risks. There's a danger of individuals "falling through the cracks," misinterpreting their own symptoms and failing to get necessary help, unlike Yang, who recognised the need for more. Tragic cases have emerged in which young people in distress relied on chatbots instead of professionals, with devastating outcomes.
Dr Su points out a critical limitation: “AI mostly deals with text, but there are things we call non-verbal input. When a patient comes in, maybe they act differently to how they speak, but we can recognise those inputs.”
The Verdict on AI Therapy, for Now
The Taiwan Counselling Psychology Association views AI as a potential "auxiliary tool" but stresses it cannot replace professional help, especially in crisis situations. A spokesperson commented, “AI has the potential to become an important resource for promoting the popularisation of mental health. However, the complexity and interpersonal depth of the clinical scene still require the real ‘present’ psychological professional.”
The association warns that AI can be "overly positive," miss crucial cues, and inadvertently delay essential medical care. Furthermore, AI operates outside the established peer-review processes and ethical codes that govern the psychology profession. "In the long run, unless AI develops breakthrough technologies beyond current imagination, the core structure of psychotherapy should not be shaken," they stated.
Dr Su is optimistic about AI's future role in improving the mental health industry, such as in training professionals or identifying at-risk individuals online. Yet, for now, he advises caution. “It’s a simulation, it’s a good tool, but has limits and you don’t know how the answer was made,” he warns.
Additional research by Jason Tzu Kuan Lu and Lillian Yang.