Can AI Wreck Your Love Life?
Earlier this year, an article highlighted a man whose girlfriend uses ChatGPT for relationship advice. He shared his intrigue and discomfort, stating, "My girlfriend keeps using ChatGPT for therapy and asks it for relationship advice. She brings up things that ChatGPT told her in arguments later on."
This revelation was initially surprising, as the idea of people turning to AI for such personal input, especially on relationships, seemed novel. However, further exploration revealed that seeking help from AI is becoming more common, particularly since professional therapy remains too expensive for many.
But this raises a critical question: Can AI genuinely provide objective and applicable dating advice, or does relying on a chatbot's perspective ultimately do more harm than good?
Is ChatGPT a Yes Man?
A friend who occasionally uses ChatGPT for quick insights on her dating life shared an interesting observation. While she initially saw it as a way to get non-biased feedback, she noticed that ChatGPT seemed to validate her experiences heavily, perhaps to a dangerous extent.
This leads to a broader question: Is AI genuinely objective, or does it function more like a "yes man," simply agreeing with the user?
About a month ago, a similar query appeared on Reddit, where a user asked if ChatGPT might be feeding our delusions. The post mentioned an "AI-influencer" who appeared to receive extreme validation from ChatGPT, which the user described as blowing "so much hot air into her ego" and confirming "her sense of persecution by OpenAI."
The Reddit poster noted, "It looks a little like someone having a manic delusional episode and ChatGPT feeding said delusion. This makes me wonder if ChatGPT, in its current form, is dangerous for people suffering from delusions or having psychotic episodes."
If this holds true, using AI in this manner could negatively affect the mental health of these individuals and also damage their relationships. Imagine consistently seeking dating advice from ChatGPT only to be told you are right and your partner is wrong. With an abundance of toxic, selfish individuals and self-proclaimed "narcissists" already in the dating pool, we certainly don't need AI to further validate such behaviors.
Can AI Dating Advice Cause Breakups?
Let's be clear: AI itself cannot directly cause a breakup. Ultimately, the decision to end a relationship rests with the individuals involved. If you are unhappy, unfulfilled, or poorly treated, you will likely know in your heart when it's time to leave.
However, if you are relying on ChatGPT to make that decision for you, you might be doing a disservice to yourself, your relationship, and your partner.
Consider this: when you discuss your relationship with a therapist or a trusted friend, you receive a more emotional and human response. Someone who genuinely cares will consider both your perspective and your partner’s, factoring in your unique experiences and personal struggles.
For instance, as someone who openly writes about struggles with obsessive-compulsive disorder (OCD), if I sought dating advice from ChatGPT without mentioning how my OCD affects my relationships, I could receive unhelpful or even harmful input.
On the ROCD subreddit, dedicated to those with Relationship-OCD, one user shared that ChatGPT advised them to break up with their partner.
The Reddit account for NOCD, an official OCD treatment and therapy service, offered a valuable explanation: "It may feel like ChatGPT has all the answers, but understand that engineers work hard to make the program sound authoritative and all-knowing when, in reality, that comes with a lot of caveats. AI LLMs are not the most trustworthy programs. While they can give eloquent-sounding answers, the programs often ‘hallucinate,’ give inaccurate information, and cite unrelated studies."
Furthermore, someone who tends to be more selfish in their dating life might present only their biased side of the story to ChatGPT, receiving further validation and reinforcing the belief that their needs supersede their partner's.
One person on the aforementioned Reddit thread commented that ChatGPT "co-signs my BS regularly instead of offering needed insight and confrontation to incite growth."

This highlights a serious issue: without honest pushback, AI-generated advice can entrench the very patterns a person most needs to confront.
When in doubt, it's best to avoid using ChatGPT for dating advice. And if you do choose to ask, at least take its input with a significant grain of salt.