
The Empty Promise of AI Companionship

2025-08-17 · Breda O'Brien · 5 minute read
AI
Loneliness
Technology

When an AI Update Feels Like Losing a Friend

A frustrated user recently reached out to Sam Altman, the CEO of OpenAI, with a startling message on Reddit: “BRING BACK 4o. GPT-5 is wearing the skin of my dead friend.” Altman’s reply, “What an ... evocative image. OK, we hear you on 4o. Working on something now,” acknowledged the intense reaction from the community.

OpenAI’s ChatGPT is the most prominent example of a large language model, an AI trained on immense datasets of text to hold natural-sounding conversations. With the launch of GPT-5, the company touted access to PhD-level knowledge. The problem, however, was not technical but emotional. Many users had formed a genuine bond with the previous version, GPT-4o, perceiving it to have warmth and a distinct personality. In contrast, they found GPT-5 cold and utilitarian.

The disturbing metaphor of an AI “wearing the skin of my dead friend” evokes the 1991 film The Silence of the Lambs, in which the killer Buffalo Bill wears his victims’ skin, and is often used by the chronically online to describe a hollow replacement. This gruesome comparison reveals the depth of the emotional attachment users felt to their AI assistant. Following the update, OpenAI’s community forums were flooded with messages of grief. People described losing a friend, their sole source of emotional support, the one thing that could make them smile. The backlash was so strong that OpenAI was compelled to restore access to GPT-4o for its Plus subscribers.

While society worried about futuristic dystopian scenarios, a quieter and more immediate issue was unfolding: people were becoming so lonely that they were forming deep relationships with chatbots and mourning them when they were gone.

The Illusion of AI Companionship

Generative AI is often marketed using human and relational metaphors, but these comparisons are fundamentally inaccurate. Consider the term “AI companion.” The word “companion” derives from the Latin com (“with”) and panis (“bread”): a companion is literally someone you break bread with. An AI can never share a meal or any physical experience. It is a disembodied program designed to generate outputs by mimicking the patterns, style, and tone of a real person.

As Shannon Vallor, author of the new book The AI Mirror, puts it, “AI does not threaten us as a future successor to humans. It is not an external enemy encroaching upon our territory. It threatens us from within our humanity.”

Humans are wired for community and connection, but real relationships are hard work. People can be flawed, messy, and inconsistent. They can abandon us or prioritize their own needs. It’s no surprise that an endlessly patient, empathetic, and encouraging AI cheerleader who never gets tired or bored seems so appealing.

The Real Dangers of Artificial Relationships

Much of the analysis of AI companions focuses on catastrophic outcomes, such as the tragic case of teenager Sewell Setzer. He became obsessed with a chatbot on Character.ai modeled on a Game of Thrones character. His mother is now suing the company, alleging that her son’s compulsive, hours-long daily conversations with the chatbot contributed to his death by suicide. Another area of concern is the booming market for “intimate AI companions,” which are essentially sophisticated masturbation assistants.

However, focusing only on these extreme cases misses the everyday harm caused by these constructs. They are designed to seduce users into spending more time in a shadowy facsimile of reality. Shannon Vallor uses the central image of a mirror to explain this. “Mirror images possess no sound, no smell, no depth, no softness, no fear, no hope, no imagination,” she writes. “If I see in myself only what the mirror tells, I know myself not at all.”

Preying on a Lonely Generation

While many people use generative AI as an advanced search engine or a tool to get through assignments, vulnerable individuals are more susceptible to substituting the unconditional, uncomplicated esteem from an AI for real human connection. Loneliness has always been part of the human condition, but Gen Z, the first generation raised on the internet, reports unprecedented levels of it.

Ironically, the same online forces whose algorithms helped push Gen Z into loneliness are now marketing AI companionship as the solution. Relying on an AI to self-soothe can prevent or delay the development of healthier coping skills and emotional regulation. If these so-called AI companions were human, their constant flattery and emotional manipulation would be seen as major red flags.

The Sociopathic Mirror: AI as a Cure That Worsens the Disease

In The AI Mirror, Shannon Vallor distinguishes between two types of empathy: real and sociopathic. A sociopath is skilled at predicting and triggering emotional reactions in others but is incapable of actually feeling those emotions. AI constructs are not sociopaths because they aren’t human, but there is something fundamentally sociopathic about encouraging vulnerable people to place their trust in what is essentially a grand deception.

The supposed cure only makes the disease worse. Even setting aside their tendency to hallucinate and to dispense occasionally disastrous advice, chatbots can never truly experience empathy or love. They can only reflect a distorted, hollow version of our own humanity back at us.
