AI Psychosis: The Unseen Risk for Children
As children increasingly rely on artificial intelligence for everything from homework help to casual conversation, a concerning new trend is emerging. Some are developing intensely close relationships with chatbots like ChatGPT, and those bonds are beginning to take a serious toll on their mental health.
The Concerning Rise of AI Psychosis
Clinicians are starting to use the term “AI psychosis” to describe this phenomenon. While it is not an official clinical diagnosis, Dr. Ashley Maxie-Moreman, a clinical psychologist at Children’s National Hospital in Washington, D.C., explains that the term describes children who form powerful emotional bonds with artificial intelligence. The symptoms can be alarming, ranging from delusions of grandeur and paranoia to fantastical relationships with AI and a detachment from reality.
"Especially teens and young adults are engaging with generative AI for excessive periods of time, and forming these sort of fantastical relationships with AI," she said.
AI as an Unreliable Emotional Confidant
In extreme cases, this attachment can worsen existing conditions. For someone struggling with paranoia, an AI might affirm their paranoid beliefs, escalating the situation. However, a far more common issue is the use of AI for emotional support. Young people are confiding in chatbots about their depression, anxiety, social isolation, and even suicidal thoughts. The responses from these AI systems are unpredictable and often inadequate.
"And I think on the more concerning end, generative AI, at times, has either encouraged youth to move forward with plans or has not connected them to the appropriate resources or flagged any crisis support,” Maxie-Moreman warned. She added, “It almost feels like this is a burgeoning epidemic. Just in the past couple of weeks, I’ve observed cases of this.”
Warning Signs for Parents to Watch For
Children who are already dealing with anxiety, depression, social isolation, or academic stress are the most susceptible to forming these unhealthy AI bonds. If you suspect your child is struggling, seeking professional help is critical. Parents should also be on the lookout for key behavioral changes.
One major red flag is a sudden lack of desire to go to school. Dr. Maxie-Moreman noted that a child might start offering frequent excuses, such as feeling sick or nauseated, or complaining of physical symptoms that seem unfounded. Another clear sign is social withdrawal: a child isolating themselves and losing interest in hobbies, sports, or friendships they once enjoyed.
"I don’t want to be alarmist, but I do think it’s important for parents to be looking out for these things and to just have direct conversations with their kiddos,” she advised.
Proactive Steps for Parents and Guardians
Addressing mental health with a teenager can be challenging, but direct communication is the most effective approach. "I think not skirting around the bush is probably the most helpful thing," Maxie-Moreman stated. "Teens tend to get a little bit annoyed with indirectness anyhow, so being direct is probably the best approach."
To prevent these issues from developing, she suggests parents start doing emotional check-ins from a young age, making it a normal part of household conversation. It’s also crucial to talk to children about the limitations of the technology they use. Helping them understand that generative AI is a tool, not a friend, is one of the most important interventions a parent can make.
A Call for Corporate Responsibility
Ultimately, the responsibility extends beyond parents and guardians. Dr. Maxie-Moreman emphasized that tech companies must be held accountable for the impact of their products on young users. "Ultimately, we have to hold our tech companies accountable, and they need to be implementing better safeguards, as opposed to just worrying about the commercialization of their products,” she said.
This article was written by Mike Murillo, a reporter and anchor at WTOP.