
AI: Your Relationship Coach or Anxiety Source?

2025-05-20 · Megan Farokhmanesh · 8 minute read
Tags: AI, Relationships, Anxiety


People are increasingly turning to generative AI to help sort out their interpersonal conflicts. But for some, this new digital confidant might be making their anxiety even worse. Let's explore this evolving relationship between humans and AI in the quest for understanding.

The Shock of an AI-Edited Breakup Message

Green was navigating the aftermath of a breakup. The reasons were common enough: mismatched needs and communication struggles. So, an unprompted email from Green's ex came as a surprise.

The email itself wasn't unusual—a typical post-breakup airing of grievances. However, the ex had broken a no-contact agreement. Green expressed displeasure with the email and its tone.

The ex's defense? They had run it "through ChatGPT multiple times to ensure it wouldn't hurt anyone's feelings." Green, a ChatGPT user themself, was taken aback. "ChatGPT doesn't have the history we have... I don't know how it could possibly apply the context of our relationship," Green explained. The feeling was one of a personal, caring relationship being "reduced down to the opinion of ChatGPT, versus my own boundaries or opinions or needs. It felt impersonal."

AI's Ever-Expanding Reach

Generative AI has swiftly woven itself into nearly every part of modern life. It's changing jobs across many different industries, challenging original art, and raising concerns about creators' rights as it learns from existing work. Students use it to cheat, while teachers use it for lesson planning. There are reports of it radicalizing vulnerable users and potentially helping to dictate new drug approvals. AI also profoundly impacts our environment and has even been used to reanimate victims to provide statements in court. Some people fall in love with AI like ChatGPT, while others use it for therapy.

Inevitably, AI has become a tool for the most human of endeavors: trying to understand each other. People are using ChatGPT to analyze texts from potential partners, settle disputes with friends, or make peace with family—alone. Despite their earnest desires to communicate better, the ChatGPT users WIRED spoke to confessed to feeling some shame about using a computer for their human problems. All requested anonymity due to privacy concerns.

The Risk of Outsourcing Emotional Labor

"Our friendships depend on those bonds," says Daniel Kimmel, a clinical psychiatrist at Columbia University, highlighting the risk of offloading emotional labor to AI. "We do have to be mindful about understanding what are the ancestral ingredients to a functioning human society, functioning human friendship."

Humans have spent eons developing an "emotional simulator"—instincts that help us understand each other, Kimmel explains. AI lacks this emotional experience. What it possesses, he says, is "an extremely powerful predictive engine" capable of interpreting patterns and language.

"Therein lies the limitation of the AI: All that it can operate on is words," Kimmel states. "Our model is built on something that is not words. It is ineffable."

Seeking Relationship Insights from an Algorithm

Kate, a 35-year-old from Denver, started using ChatGPT for relationship analysis about two and a half months ago. Her job introduced her to the service, and the leap to personal use wasn't difficult. Previously, she'd turn to Reddit or Google for relationship advice like "how can you tell if the guy you're dating is over his ex." Past relationship wounds still affect her, manifesting as an "anxious attachment style" in early-stage relationships.

About a month and a half into a new relationship, Kate found herself in an anxious loop. Her new partner was recently divorced with a young child, triggering painful associations. She exported their entire text history into ChatGPT. "I asked it to analyze our text exchanges and give me a scorecard," Kate says. This included their attachment styles, healthy aspects of their relationship, and the daunting question: Who likes whom more?

"I took it pretty hard, to be honest," she admits. ChatGPT suggested her partner leaned avoidant, a style she hoped to avoid. "But it was really enlightening. I actually shared it with my girlfriends." Having the relationship laid out was strangely comforting. "It was reassuring... because if my head or thoughts can run away with me—Does he really like me? How does he feel?—it was incredible to have kind of like a third party that could really quickly experience things from my perspective."

Kate muses that she'd be flattered if the roles were reversed. "Wow, I'm honored I'm even top of mind enough for you to prompt ChatGPT about me," she says. "I'd want to know what ChatGPT said and then assess how I feel." Still, she adds, "it would really break my heart, too."

The AI Therapist: A Source of Comfort and Anxiety

ChatGPT, with its extensive memory of her current relationship, has become "kind of like my therapist," Kate says. But it's a double-edged sword. It can cause anxiety, offering hypotheses about her partner's behavior—like suggesting he's pulling away when he's just busy. She can spend hours prompting about her relationships. "I'm not saying it's always the healthiest," she concedes.

Kate's actual therapist isn't a fan. "She's like, ‘Kate, promise me you'll never do that again. The last thing that you need is more tools to analyze at your fingertips. What you need is to sit with your discomfort, feel it, recognize why you feel it.’"

An OpenAI spokesperson, Taya Christianson, told WIRED that ChatGPT is a factual, neutral, and safety-minded general-purpose tool, not a substitute for a mental health professional. Christianson pointed to a blog post about a collaboration with MIT Media Lab studying "how AI use that involves emotional engagement—what we call affective use—can impact users’ well-being."

For Kate, ChatGPT is a sounding board without its own needs or schedule. While she has good friends and a close sister, it's different. "If I were texting them the amount of times I was prompting ChatGPT, I'd blow up their phone," she says. "It wouldn't really be fair. I don't need to feel shame around blowing up ChatGPT with my asks, my emotional needs."

AI as a Private Confidant

Andrew, a 36-year-old in Seattle, increasingly uses ChatGPT for personal needs after a difficult family period. He isn't secretive about it but isn't broadcasting it either. "I haven't had a lot of success finding a therapist that I mesh with," he shares. "And not that ChatGPT by any stretch is a true replacement for a therapist, but to be perfectly honest, sometimes you just need someone to talk to about something sitting right on the front of your brain."

Previously, Andrew used ChatGPT for tasks like meal planning. Then, his girlfriend broke up with him via text the day before Valentine’s Day. He wasn't even sure he’d been dumped. "I think between us there was just always kind of a disconnect in the way we communicated," he reflects. The text "didn't actually say, ‘Hey, I'm breaking up with you’ in any clear way."

Puzzled, he fed the message to ChatGPT, asking, "Hey, did she break up with me? Can you help me understand what's going on?" ChatGPT didn't provide much clarity. "I guess it was maybe validating, because it was just as confused as I was."

Andrew has close friends he'd usually turn to, but he didn't want to burden them. "Maybe they don't need to hear Andrew’s whining about his crappy dating life," he says. "I'm kind of using this as a way to kick the tires on the conversation before I really kind of get ready to go out and ask my friends about a certain situation."

Sharing Your Deepest Secrets: The Privacy Question

The intimate information users share with ChatGPT raises serious privacy concerns. If chats leak or data is misused, more than just passwords are at stake.

"I have honestly thought about it," Kate says about trusting the service with private details. "Oh my God, if someone just saw my prompt history—you could draw crazy assumptions around who you are, what you worry about, or whatever else."

Christianson stated that ChatGPT's model aims to help while directing users to professional help and real-world connections.

The company has previously said it is "committed to protecting people's privacy." While users can adjust privacy settings, the service inevitably ingests vast amounts of personal data. Users interviewed demonstrated high faith in OpenAI, treating ChatGPT like a high-functioning diary.

Andrew describes his trust as a "hopeful handshake agreement" with ChatGPT.

"It's probably going to sound super sad boy, but I sometimes find it easier to trust a technology platform," he admits. He’s had experiences where people turned his private confessions into gossip. ChatGPT, in comparison, is a locked box, a neutral party uninterested in chatter.

The Normalization of AI Relationship Coaching

Reservations about using ChatGPT as a diary or intermediary seem to be fading. On TikTok, creators coach users on using AI to analyze text arguments, get relationship advice, and even automatically reply to dates or break things off. Kate finds these videos useful for new prompt ideas.

Recently, she saw a TikTok prompt where users describe their goals, and ChatGPT writes a detailed story of what that life could be like. "It helps you embody or experience it, live through what that could look like," she says.

"The amount that I have used it for has continued to grow exponentially," Kate notes. Still, she’s trying to use it less in her romantic life, following her human therapist's advice.
