Is AI Secretly Writing Your Friends' Text Messages?
Is That You or ChatGPT?
When Sarah Chiappetta sought career advice from a close friend via text, the response she received felt off. It was, as she described it, "overly sympathetic"; it meticulously acknowledged her feelings and used an em dash, a punctuation mark her friend rarely employed. The comforting words felt strangely familiar, echoing the distinct style of ChatGPT. "I wasn't mad. I was a little weirded out," the 30-year-old product marketing manager shared. "Is my text that hard that you need ChatGPT to help you with it?"
The Rise of AI in Personal Conversations
Millions of people now converse with chatbots as if they were colleagues, best friends, or even romantic partners. While some find this dystopian, others see it as a solution to loneliness or simply a helpful tool. However, a more stigmatized behavior is emerging: using generative AI to craft messages for our real-life human relationships. People are becoming amateur detectives, spotting AI-generated texts by telltale signs like the frequent use of em dashes or words like "delve."
This trend is prompting a critical question: as we bring generative AI into our personal conversations, are we outsourcing the emotional labor of our relationships? We've grown accustomed to autocorrect and predictive text, but large language models go a step further: they can write entire messages of comfort or confrontation. New research from MIT suggests that relying on ChatGPT for writing can make people lazier and more dependent on the technology over time. As we use AI to flirt on dating apps or draft wedding vows, we risk letting our social muscles and our ability to connect atrophy.
The Science of AI-Assisted Communication
The social impact of this technology is still being studied, but experts are concerned it could undermine the trust we have in our digital communications. "We don't know who we're actually talking to," says Jess Hohenstein, a former AI researcher at Cornell University. "Could we potentially be moving to a place where face-to-face interactions are the only interactions we can truly trust?"
Research highlights this growing disconnect. A 2023 Ohio State University study found that when participants were told a supportive text from a fictional friend had been crafted with AI, they rated the friendship as less close. Similarly, a Cornell study discovered that while AI can make conversations more efficient and positive, people judge others negatively for using it, viewing the messages as less authentic.
"We really just see this critical disconnect," explains Hohenstein. "Using AI to communicate actually can kind of improve the way that we talk to each other, but there's this perception where it's judged so negatively." This sentiment is echoed across platforms like Reddit, where users express feeling hurt, as if their friendship is a "chore" to the person using AI. Quinn White, a philosophy professor at Harvard, notes the fundamental difference: "Telling you what is the most likely next best word is just so different from telling you what I think."
A Tool for Tough Talks: When AI Can Actually Help
When Chiappetta gently confronted her friend, the friend admitted to using ChatGPT for parts of the response. Chiappetta wasn't angry; she still values the friendship and found the AI-assisted advice generally helpful. For some, generative AI is more than a shortcut; it's an essential tool. For individuals who are neurodivergent, have social anxiety, or are facing a difficult conversation, AI can act as a helpful rehearsal space.
David Deal, a 62-year-old marketing consultant, used ChatGPT to workshop a response to a young mentee, wanting to ensure he didn't come off as a "mansplaining jerk." The bot suggested he lead with empathy, a tactic he admits he might not have used on his own. In other cases, emotional distance is the goal. Sarah, who is finalizing a divorce, uses ChatGPT to rephrase her emotionally charged texts to her soon-to-be ex-husband into respectful but firm messages. "It does a good job of separating yourself from a hostile situation," she says. She gets to vent her frustration to the bot while maintaining a sense of "moral superiority" by sending calm, collected responses.
Putting It to the Test: Can AI Understand Friendship?
Curious about its capabilities, I fed my own active group chat into ChatGPT to see if it could provide a useful summary. The bot was accurate with logistics, like planning a camping trip, but it completely missed the nuances of our humor. It described an inside joke, "Hoagie Day," as a "culturally significant" celebration. When I asked for a humorous message to send, it suggested, "I'm only coming camping if someone promises to take a dramatic candid of me staring into the woods like I'm in a folk album cover." Sending that would undoubtedly make my friends question my sanity.
The verdict? For a quick summary of logistical details, a chatbot might suffice. But if you want to maintain genuine connection and the respect of your friends, it's still best to type with your own two thumbs.