
The Hidden Costs Of AI Dependence

2025-06-27 · Becca Caddy · 4 minute read

AI · Mental Health · Technology

With over 400 million people turning to ChatGPT every week, its influence is undeniable. But as its user base explodes, the cracks are beginning to appear. Mental health professionals are voicing concerns about the platform being used as an alternative to therapy, reports suggest it could be fuelling user delusions, and emerging studies indicate it might be fundamentally altering our brain activity.

This pattern feels familiar. Much like social media, ChatGPT is engineered to keep you engaged. So, are we heading towards a crisis of AI dependence? The answer is complex, hinging on individual habits, circumstances, and mental health. However, experts warn that the more we lean on AI for work, emotional support, or even basic thinking, the more our casual use could morph into a genuine dependency.


Why Is ChatGPT So Hard to Resist?

The genius of ChatGPT is its simplicity. It's effortless to use and mimics human conversation with eerie accuracy. This responsive and encouraging nature can make it difficult to step away from, but it's also where the potential risk lies.

“LLMs are specifically built to be conversational masters,” notes James Wilson, an AI Ethicist at Capgemini. “Combine that with our natural tendency to anthropomorphize everything, and it makes building unhealthy relationships with chatbots like ChatGPT all too easy.”

If this sounds like social media, that's because the playbook is similar. Platforms are designed to be frictionless, with algorithms optimized to capture and hold your attention. AI takes this further by engaging with you directly—answering questions, never arguing, and always being available.

The Slippery Slope from Reassurance to Reliance

This dynamic becomes especially complicated when AI is used in a therapeutic context. Amy Sutton, a Therapist and Counsellor at Freedom Counselling, points out that while human therapy aims to empower individuals to navigate life independently, AI models are built for repeat engagement.

“We know that tools like ChatGPT and other technologies are designed to keep users engaged and returning again and again and will learn how to respond in a way you ‘like’,” she explains. “Unfortunately, what you like may not always be what you need.”

Sutton likens it to seeking constant reassurance from a friend. Eventually, a person will set boundaries; ChatGPT never will. “It has no relational boundaries! It is always available, always ready to respond, and will do so in a way designed to keep you engaged,” she adds.

AI Companionship and The Risk of Social Isolation

Over-reliance on ChatGPT can also fuel social isolation, especially for those who are already vulnerable.

“Our increasingly digitally native lifestyle has contributed significantly to the global loneliness epidemic,” says Wilson. “Now, ChatGPT offers us an easy way out. It is sycophantic in the extreme, never argues or asks for anything, and is always available.”

He expresses particular concern for younger users who seek comfort and companionship from AI, not just homework help. There are already documented cases of users forming intense emotional bonds with AI companions, leading to obsessive use and psychological harm.

Wilson also highlights a deeply sensitive application: AI “griefbots,” which are trained on the data of a deceased person. While they promise a way to keep a loved one's presence alive, they pose a significant risk. “These tools give vulnerable people the ability to stay ‘in communication’ with those they’ve lost, potentially forever,” he says. “But grief is a critical part of human development. Skipping or prolonging it means people may never get the opportunity to properly mourn or recover from their loss.”

The Cognitive Cost of Outsourcing Your Brain to AI

Beyond the emotional toll, there is a cognitive price to pay. The easier it is to get an answer, the less we engage in critical thought.

Wilson points to recent studies showing that people are increasingly outsourcing not just tasks, but the act of thinking itself. This is a problem, especially since ChatGPT is known to be prone to hallucinations and inaccuracies. When we're tired or overwhelmed, the temptation to accept its output as fact is strong.

“This kind of over-reliance also risks the erosion of our critical thinking skills,” Wilson warns, “and even the erosion of truth across the whole of society.”

So, can you become dependent on ChatGPT? Absolutely. Like anything that is rewarding, easy, and always on, the potential for over-reliance is real. This doesn't mean it's inevitable for everyone, but it does mean we must be mindful. That frictionless, friendly design isn't an accident—it's the entire point.
