
Even AI Pioneers Aren't Safe From Chatbot Breakups

2025-09-08 · Lakshmi Varanasi · 2-minute read
AI
ChatGPT
Relationships

Geoffrey Hinton, a central figure in the AI race, has long warned of the technology's dangers. Now, he says a girlfriend used it to break up with him. Credit: Mark Blinch/REUTERS

An AI-Powered Breakup for the AI Godfather

Geoffrey Hinton has long been a pioneering force in the development of artificial intelligence. Yet, the AI godfather likely never anticipated that the technology he championed would one day be used by his girlfriend to end their relationship.

In a candid interview with the Financial Times, Hinton shared that his former partner enlisted a chatbot to articulate why he had been "a rat" and presented the AI-generated critique directly to him.

"She got the chatbot to explain how awful my behaviour was and gave it to me," he told the FT. "I didn't think I had been a rat, so it didn't make me feel too bad. I met somebody I liked more, you know how it goes."

The Growing Role of AI in Human Connection

Beneath the humorous anecdote lies a new reality: AI is becoming a major player in everyday human interactions. Whether it's drafting an email, solving a household problem, or, as in Hinton's case, delivering a breakup speech, artificial intelligence is increasingly shaping how we communicate.

This shift from professional applications to mediating personal matters raises important questions about the future of human relationships and emotional authenticity.

The Risks of AI Companionship and Official Guidance

Researchers are already exploring the potential downsides of this trend. A study published in March by OpenAI and MIT Media Lab analyzed millions of interactions with ChatGPT. The findings revealed that the chatbot may worsen feelings of loneliness among a core group of "power users."

The researchers noted that a small number of users were responsible for a disproportionate share of conversations containing "affective cues"—signs of an individual's emotional state—covering themes of vulnerability, loneliness, and dependence.

Recognizing this, OpenAI has released guidance against relying on its technology for major life choices. The company stated that for questions like "Should I break up with my boyfriend?" ChatGPT should not give a direct answer. Instead, it is rolling out changes so the AI helps users think through the issue, weighing pros and cons rather than making the decision for them.
