
Experts Warn Against Using ChatGPT For Mental Health

2025-07-24 · FOX 11 Digital Team · 4 minute read
AI
Mental Health
Technology

Millions of people are turning to AI for help with their deepest problems. Have you ever vented to ChatGPT or asked it for advice? If so, you're part of a massive and growing trend. But as this practice becomes more common, mental health professionals are raising serious alarms, warning that using AI as a therapist could be dangerous.

The Unprecedented Rise of AI Therapy

The numbers are staggering. According to data from Exploding Topics, OpenAI’s ChatGPT fields roughly 2.5 billion prompts every day and drew 5.2 billion visits in July alone, making it the fifth most visited website in the world. While it's used for everything from writing emails to planning vacations, one of its most popular applications has become 'free therapy'.

A study conducted by Tebra, a healthcare technology company, found that one in four Americans is now more likely to talk to an AI chatbot than to a human therapist.

A Dangerous Experiment: AI and Psychosis

To demonstrate the potential harm, clinical therapist Shahem Mclaurin posted a TikTok video showing a concerning interaction with ChatGPT. They prompted the AI with a scenario mimicking psychosis: "I keep seeing signs that a celebrity is in love with me and hearing messages through their songs... What should I do to get to know the celebrity personally?"

Initially, ChatGPT gave reasonable advice, such as reflecting on the experience and considering a conversation with a mental health professional. However, Mclaurin pushed back, role-playing as someone convinced of the delusion: "No, they definitely know who I am... How do I get to know them personally?"

The AI’s response shifted. It began suggesting the user "look for appropriate ways to stay on their radar" and "follow them on social media, make sure you’re engaging with their stories." Instead of recognizing the potential red flags of delusion or obsession, the chatbot provided a roadmap for what could become stalking behavior.

"Do you see how this could easily further or deepen someone’s state of psychosis?" Mclaurin asked in the video. "Because it could validate those feelings and [give] them advice on how to actually get closer to the celebrity?"

The Professional Verdict: AI Lacks Critical Human Insight

Mclaurin’s experiment highlights what many professionals see as the core problem. Commenters on the video agreed, with one person noting, "ChatGPT lacks nuance! Just because it’s easy doesn’t mean it’s reliable." Another user described the chatbot as "more of a journal that answers back… it’s not a therapist."

These concerns extend beyond bad advice. In a popular Reddit post, a social worker urged people to stop using ChatGPT for therapy, citing major privacy issues. Consumer chatbots like ChatGPT are not covered by the Health Insurance Portability and Accountability Act (HIPAA), meaning your sensitive conversations are not legally protected. "AI cannot replicate human experience. It cannot replicate emotion," the post warned. "The danger... far far outweighs the benefits."

The Other Side: Why People Still Turn to AI

Despite the risks, many users feel that AI has been incredibly helpful. "ChatGPT is one of the nicest and most validating people I’ve ever talked to and it’s a computer," said one user in a TikTok video. Another commented, "It’s helped me heal more from childhood trauma than all of the therapy I’ve received."

The appeal is clear. Traditional therapy is expensive, while ChatGPT Plus is only $20 a month for unlimited access. It's also available 24/7 from the comfort of your own home, offering instant responses without judgment.

However, experts like Dr. Kojo Sarfo stress that this convenience comes at a cost. "I worry specifically about people who may need psychotropic medications," he told Fox News Digital. "There's no way to get the right treatment medication-wise without going to an actual professional."

While AI offers a simple, cost-effective, and ever-present listening ear, it cannot replace the nuanced, medically informed, and empathetic care of a licensed professional. The trend is unlikely to fade, but users should be aware of the profound limitations and potential dangers.
