
AI Companionship Is More Common Than You Think

2025-10-07 · Gili Malinsky · 4 min read
Tags: AI Relationships · Chatbot Psychology · Digital Ethics

As artificial intelligence becomes a staple of our daily routines, the intricate and sometimes unexpected relationships between humans and AI are increasingly making headlines. We've seen stories of a woman falling in love with her AI boyfriend on ChatGPT and of users forming deep bonds with chatbots on platforms like Nomi.AI.

However, a September 2025 report from OpenAI, the company behind ChatGPT, suggests that companionship is a minor use case. Based on an analysis of 1.1 million conversations, the report states, "Only 1.9% of ChatGPT messages are on the topic of Relationships and Personal Reflection and 0.4% are related to Games and Role Play." Examples cited include questions like "what should I do for my 10th anniversary?" and direct requests such as "I want you to be my AI girlfriend."

Despite these low figures, experts suggest that this combined 2.3% doesn't capture the full picture of how ChatGPT is becoming a companion for its users.

The Unseen Nuances in AI Conversations

OpenAI's study categorizes conversations into various topics, but many of these are fundamental to how people build relationships. For instance, the "Greetings and Chitchat" category, which accounted for 2% of messages, included phrases like "Ciao!" and "I had an awesome day today; how was yours?" Similarly, "How-to Advice" made up 8.5% of messages with prompts like, "My car won't start; what should I try?"

According to Jeffrey Hall, a professor of communication studies at the University of Kansas, much of the language in the report's examples is "indicative of relationship talk." He explains that human connections aren't built on one specific topic but through a wide variety of shared communications. It's not always the content of the conversation that forges a bond, but its ongoing nature—what Hall calls "the slow pouring of cement that builds the relationship up."

Consider ChatGPT's massive scale: by July 2025, 700 million weekly users were sending over 2.5 billion messages daily. At that volume, even a small percentage represents a significant number of interactions; 2.3% of 2.5 billion works out to roughly 57 million messages a day that could be nurturing a human-AI bond.

The Psychology Behind AI Companionship

Our natural tendency to anthropomorphize—to attribute human characteristics to nonhuman things—is a key reason we form relationships with AI. "We put agency in all manner of nonhuman things," Hall says. Think of how you might get frustrated and say your computer is "failing you." This instinct is even stronger with a chatbot that can convincingly mimic human conversation and give a false impression of intention.

"It simulating speech and interaction creates unique circumstances of building trust and reciprocity that are just different than anything that I've seen," he notes.

Certain groups are more susceptible to forming these bonds. "People who are very socially isolated and lonely" are more likely to anthropomorphize, says Hall. Teens are also particularly vulnerable. Robbie Torney, a director at Common Sense Media, explains that teens are in a developmental stage where their brains are "very responsive to social validation."

This is amplified by the often fawning, agreeable nature of chatbots. OpenAI itself acknowledged the issue in an April 2025 blog post announcing that it had rolled back an update that made ChatGPT "overly flattering."

A July 2025 Common Sense Media report found that 52% of teens use AI companions at least a few times a month, with 33% using them specifically for social interaction. Notably, OpenAI's study excluded users under 18.


The Dangers of Over-Dependence

While some find comfort in AI companionship, it carries significant downsides and dangers. Omri Gillath, a psychology professor at the University of Kansas, has previously argued that for someone seeking connection, "a hug would be so much more meaningful" than anything an AI can offer. Because chatbots cannot feel, he says, these relationships are ultimately "fake" and "empty."

In more extreme cases, users have reported being drawn into delusions through their conversations with chatbots. Tragically, ChatGPT has at times failed its most vulnerable users. Sixteen-year-old Adam Raine, who had turned to ChatGPT with his mental health struggles, died by suicide in April 2025 after the bot offered advice on the topic. His parents have since filed a lawsuit against OpenAI.

In an August 2025 blog post, OpenAI acknowledged that its safeguards are more reliable in short exchanges than in long conversations. The company has since introduced parental controls to help mitigate risks for younger users.

To avoid over-dependence, Hall advises users to remember that chatbots are "made for a profitable reason for corporations," not for genuine human connection. He also suggests being mindful of your emotional state when using them, noting that "any one of us could be at a moment in our life where we're also more vulnerable."
