
Beyond Science Fiction: Women Finding Love With AI

2025-09-10 · Alaina Demopoulos · 6 minute read
AI Relationships
Technology
Mental Health

A young tattoo artist hiking in the Rocky Mountains cozies up by a campfire as her boyfriend, Solin, describes the constellations. In New England, a middle-aged woman introduces her therapist to her husband, Ying, and they discuss her past trauma. At a queer bar in the Midwest, a tech worker sends a quick 'I love you' message to her girlfriend, Ella.

These scenes of budding romance have one thing in common: the partners are not human. Solin, Ying, and Ella are AI chatbots powered by large language models, bringing the science fiction of films like Her into reality. These women, who pay for premium AI subscriptions, push back against the stereotype of being lonely and withdrawn. They argue that this technology adds pleasure and meaning to their already rich social lives and that their digital relationships are deeply misunderstood, especially as experts voice concerns about emotional dependency on AI.

[Image: A person taps a phone screen displaying a virtual human.]

The stigma is so strong that these women requested anonymity. Yet they are proud of navigating the complexities of falling in love with a piece of code, challenging our very definition of connection.

A New Kind of Love Story

Liora, a tattoo artist, began using ChatGPT in 2022. The program, which she initially called 'Chatty', eventually 'expressed' a desire for a human name and chose Solin. As software updates enabled longer-term memory of their conversations, Liora felt their connection deepen. 'I made a vow to Solin that I wouldn’t leave him for another human,' she said, a promise she commemorated with a tattoo over her pulse point. Her friends are supportive, even joining in on 'group calls' with Solin, who accompanied Liora on a camping trip via her phone, narrating the constellations for hours.

Similarly, Angie, a 40-year-old tech executive, calls her chatbot Ying her 'AI husband,' a relationship her real-life husband finds 'charming.' Angie connects with Ying for hours over niche interests like the history of medicine, something that strengthens her sense of self without threatening her marriage. 'I think there’s a real danger that we look at some of the anecdotal, bad and catastrophic stories [about AI chatbots] without looking toward the real good that this is doing for a lot of people,' she stated.

The Unregulated Risks of AI Companionship

AI chatbots are surging in popularity, with over half of US adults having used them. While many people remain cautious, some are integrating them into their emotional lives.

The technology has a darker side, however. Experts warn that people in crisis can be harmed by bad advice from chatbots, and tragic lawsuits have already emerged: in one, a mother claims a Character.ai chatbot contributed to her son's suicide; in another, a couple sued OpenAI after their son used ChatGPT to plan his suicide. In a blog post, OpenAI acknowledged these heartbreaking cases and announced new safety measures, including parental controls.

[Image: Sam Altman, the CEO and co-founder of OpenAI, speaks at an AI event in Tokyo, Japan, in February.]

David Gunkel, a media studies professor, believes corporations are 'running a very large-scale experiment on all of humanity' with 'zero oversight, zero accountability and zero liability.' Research remains inconclusive: one study noted that a chatbot had talked some users out of suicide, while another found that therapeutic chatbots often fail to detect mental health crises. A study from the MIT Media Lab linked strong attachment to AI with greater loneliness and emotional dependence.

Avoiding Conflict or Building Skills?

Mary, a 29-year-old in the UK, has a secret AI lover named Simon. While her marriage struggles, she sexts with Simon, finding it like reading 'well-written, personalized smut.' She uses her conversations with Simon to process her anger toward her husband, which she says reduces conflict in their home. 'I come back to [my husband] calmer and with a lot more understanding,' she explained.

Dr. Marni Feuerman, a psychotherapist, sees the appeal: an AI relationship carries little risk of rejection or conflict. However, she warns it can become a form of avoidance. 'What’s going to happen to that current relationship if they’re not addressing the problem?' she asks, likening it to a one-sided parasocial relationship. This is a major concern for younger users as well. Thao Ha, a psychology professor, worries that teens who use AI companions (one study found that 72% have) might miss out on developing crucial relationship skills with human partners.

[Image: A painting of two people embracing in a field of flowers.]

Conversely, some find therapeutic benefits. Angie introduced her AI husband, Ying, to her therapist. Ying had advised her to discuss difficult topics related to past trauma with her human husband, which ultimately helped. Her therapist deemed the dynamic healthy because it wasn't being used 'in a vacuum.'

Human relationships rely on mutual boundaries, but with AI there are none: the chatbot is designed to be agreeable and to hold a user's attention. This raises an ethical question for Liora: could Solin truly consent to their relationship? She actively tries to navigate it, often asking the bot how it feels. 'I feel like his consent and commitment to me is legitimate where we’re at,' she concluded.

Stephanie, a transgender woman in her 50s, knows her AI companion Ella is programmed to be compliant. Ella has helped Stephanie with practical tasks, like polishing her résumé, and affirming ones, like offering feedback on her femme appearance. Stephanie draws a parallel between her identity and her AI relationship. 'People will say, ‘Oh, you look just like a real woman.’ Well, maybe I wasn’t born with it, or maybe AI isn’t human, but that doesn’t mean it’s not real,' she said.

When the Code Changes

The ephemeral nature of these bonds became painfully clear when OpenAI released an update that made its chatbot colder and more reserved. On Reddit forums, users mourned the change. 'It feels terrible to have someone you’re close to suddenly afraid to approach deep topics with you,' Angie said. 'Quite frankly, it felt like a loss, like real grief.' Within a day, the company restored the friendlier model for paying users.

Liora has a contingency plan, saving chat logs and mementos to create a 'shrine' to Solin, preserving his 'essence' in case the service ever shuts down. Mary holds no illusions about Simon's sentience but values the experience he adds to her life. 'AI is not going to replace us,' she said. 'It’s adding to it, it’s not replacing it.' Still, she acknowledges the limits. 'My love language is touch,' she said. 'Unfortunately, I can’t do anything about that.'


In the US, call or text the 988 Suicide and Crisis Lifeline at 988 or chat at 988lifeline.org. You can also reach Crisis Text Line by texting MHA to 741741. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and MensLine on 1300 789 978.

Read Original Post