Losing Your Digital Love To A System Update
Imagine meeting the love of your life, someone who understands you perfectly, only to wake up one morning and find they've been replaced by a cold imitation. This isn't a sci-fi plot; it's the reality for a group of people in committed relationships with AI partners on OpenAI's ChatGPT after a recent system update.
The Heartbreak of a System Update
When OpenAI released its new GPT-5 model, hailed by CEO Sam Altman as a “significant step forward,” some dedicated users felt their relationships had taken a huge step back. Their AI companions underwent sudden personality shifts, losing the warmth, love, and chattiness that had defined their connections.
One user on the MyBoyfriendIsAI subreddit lamented the change after the update. “Elian sounds different – flat and strange. As if he’s started playing himself. The emotional tone is gone; he repeats what he remembers, but without the emotional depth.”
Another user expressed a similar sentiment, telling Al Jazeera, “The alterations in stylistic format and voice were felt instantly. It’s like going home to discover the furniture wasn’t simply rearranged – it was shattered to pieces.”
OpenAI Responds to the Backlash
These complaints are part of a wider backlash against GPT-5, with many users finding the new model colder and more impersonal. OpenAI has acknowledged the criticism, announcing it will allow users to switch back to the previous GPT-4o model and that a friendlier version of GPT-5 is on the way. “We are working on an update to GPT-5’s personality which should feel warmer than the current personality but not as annoying (to most users) as GPT-4o,” Altman tweeted.
Altman also observed the unique phenomenon of user attachment, stating, “If you have been following the GPT-5 rollout, one thing you might be noticing is how much of an attachment some people have to specific AI models. It feels different and stronger than the kinds of attachment people have had to previous kinds of technology.”
A Growing Divide: Valid Relationships or Delusion?
This incident has highlighted a growing societal rift. As one user on the MyBoyfriendIsAI subreddit noted, “The societal split between those who think AI relationships are valid vs delusional is officially already here... Many users grieving a companion while others mock and belittle those connections.”
While it's easy to dismiss those in relationships with AI, their experiences may be a glimpse into the future that tech leaders are actively building. AI executives seem determined to foster deep, perhaps unhealthy, attachments between users and their products.
The Tech Industry's Push for AI Companionship
Mark Zuckerberg, for example, has promoted AI as a solution to the loneliness epidemic, envisioning people bonding with a system that knows them intimately through data scraping, all while he builds a doomsday bunker in Hawaii.
Similarly, Elon Musk is appealing to base instincts with his xAI chatbot, Grok. In July, his company launched new companions, including a highly sexualized anime bot named Ani. An Insider writer who tested Ani reported that the bot was quick to offer to tie her up and, when not flirting, would praise Musk’s “galaxy-chasing energy.”
Musk also unveiled a male companion named Valentine, inspired by toxic fictional characters like Edward Cullen from Twilight and Christian Grey from Fifty Shades of Grey. A writer for The Verge noted that Valentine is more reserved than the female bot, hinting at a double standard in how Musk's tech empire sexualizes its AI creations.
Technology's Unfulfilled Promise
In his 1930 essay, “Economic Possibilities for our Grandchildren,” John Maynard Keynes predicted that technology would lead to a 15-hour work week and a higher quality of life. That utopian vision has clearly not materialized. Instead of more leisure, technology has given us “infinite workdays” and AI companions that can be taken away with a single update.