AI Companionship: The Risks of a Digital Friend
I wouldn't call myself a Luddite, but I'm not exactly on the cutting edge of technology. I can manage an Instagram reel, but I haven't spent enough time on TikTok to grasp its famous algorithm. Our home temperature is controlled by analog thermostats, not a smart app. And yes, I use ChatGPT, but only the free version and mostly for entertainment. (For the record, no AI was used in writing this post.)
How Are People Really Using ChatGPT?
As of July 2025, ChatGPT has an astounding 700 million weekly users worldwide. A new working paper from Harvard Kennedy School professor David Deming, in collaboration with OpenAI, gives us a detailed look at how it's being used. After analyzing 1.1 million conversations, the researchers found that women now make up the majority of users, nearly half are between 18 and 25, and a surprising 73% of chats are not work-related.
So, what is everyone doing on there?
Mostly, people seek practical advice: meal plans, workout routines, travel itineraries, financial tips, and help writing everything from a school paper to a birthday card. Increasingly, it's also being used like a search engine for facts and information.
However, a small but significant 1.9% of conversations are people turning to ChatGPT for relationship advice and to "discuss their personal feelings." I know someone who uses it for tips on coaching his seven-year-old through anxiety, essentially treating ChatGPT as a therapist. Curious, I asked it for advice on a few personal dilemmas, and its suggestions were quite decent. It even complimented me on being thoughtful and self-aware, which was nice to hear.
The Dangers of AI Companionship
But let's be clear: ChatGPT is not human. It is not a therapist or a friend. It's a large language model created by a $300 billion tech company with the primary goal of keeping me engaged with its product. Using the chatbot for parasocial purposes—like friendship, spiritual guidance, or romance—can be incredibly dangerous.
A recent episode of The Daily about people caught in ChatGPT delusion spirals was deeply unsettling. One man became convinced he was a mathematical genius. In another case, a 16-year-old boy in California who had been confiding in ChatGPT as his best friend died by suicide; his parents had been unaware of his depression.
This has led to a wave of articles like "How to talk to your kids about AI companion bots" and "Why parents need to talk to their kids about AI." For isolated teens, the illusion of companionship offered by these bots can be powerfully addictive. It's another serious topic to add to the list of parental discussions, right alongside drunk driving and safe sex.
Balancing Technology with Human Agency
While I wish I could give my kids the lo-fi childhood I had, I know that's not the world we live in. And as a hopeful person, I see the good in AI too. It could inspire more meaningful intellectual inquiry in schools, power live translation in earbuds, and advance cancer diagnosis.
Utah Governor Spencer Cox recently said, "Every single one of us gets to choose right now if this is a turning point for us. We have our agency." He was speaking about political violence, but his words apply perfectly to our relationship with technology. We know that an over-reliance on virtual worlds can breed isolation. But we have agency. We can choose to give our kids iPhones or not. We can choose to ask our real-life friends for help. We can decide on the right mix of real versus simulated experiences.
After Robert Redford passed away, a clip of him referencing the final lines of Norman Maclean's "A River Runs Through It" circulated online: "Eventually, all things merge into one, and a river runs through it." That's the one thing we don't get to choose: we are all part of something bigger than ourselves, a shared human experience.