
Beyond the Prompt: How AI Is Offering Emotional Support

2025-06-28 · Eric Hal Schwartz · 5 minute read
Artificial Intelligence
Chatbots
Digital Wellbeing


Stories about people forming deep emotional bonds with artificial intelligence are becoming more common. Yet a recent analysis from Anthropic revealed a surprising statistic: after examining 4.5 million conversations with its chatbot, the company found that only 2.9% of those conversations involved users seeking emotional support.

Anthropic is quick to point out that its AI, Claude, is not a digital therapist. It has built-in safeguards to avoid giving medical advice and will intervene in cases of self-harm. However, the company also acknowledges the rapidly changing landscape. As more people integrate chatbots like Claude, ChatGPT, and Google's Gemini into their daily lives, the line between tool and confidant is blurring.

The ways people currently use AI for support offer a glimpse into a future where these interactions become more sophisticated and personal. Here’s how people are already using AI for help and companionship.

AI as a Makeshift Therapist


Let's be clear: no AI model today is a licensed therapist, a fact they all state upfront. Nonetheless, people are engaging with them as if they were, typing out prompts like, "I'm feeling really anxious about work, can you talk me through it?" or "I feel stuck. What questions should I ask myself?"

The value often isn't in a miraculous cure but in having a space to unravel thoughts without judgment. For some, the simple act of practicing vulnerability with a non-human entity can bring a sense of calm.

Sometimes, the need is more immediate. Think of it as an emotional emergency exit at 1 AM when you're overwhelmed and don't want to wake a friend. You can open an AI app and simply type, "I'm overwhelmed." The AI will likely respond with a gentle tone, guide you through a breathing exercise, or even tell a soothing story. One user even admitted to using Claude to rehearse for social events and decompress afterward. It’s not a friend or a therapist, but it’s always there.

Your Unbiased Decision-Making Coach


Humans can be incredibly indecisive, especially with major life choices. Some have found AI to be a refreshing solution. An AI won't guilt you or bring up past mistakes. If you ask it whether to move to a new city or end a relationship, it will calmly lay out the pros and cons.

You can even prompt it to simulate two inner voices—the risk-taker and the cautious planner—to hear both sides of an argument. This detached clarity is invaluable when your real-world friends and family are too emotionally invested.

This coaching extends to social situations. If you're anxious about an upcoming interaction or need to decline an invitation without causing a fight, an AI can help. It can draft texts, suggest conversation starters, and even role-play entire conversations, allowing you to test out different phrases until you feel comfortable.

The AI Accountability Partner

Having an accountability partner is a proven way to achieve goals, but not everyone has a friend available to check in on their progress. While habit-tracking apps exist, AI offers a more conversational and personalized approach.

You can tell an AI your goals and ask it to check in, offer gentle reminders, or help you reframe your mindset when motivation wanes. Someone trying to quit smoking could use an AI chatbot to track cravings and generate motivational messages. Another might use it to maintain a journaling habit with daily prompts. It's easy to see how a fondness (or perhaps a little annoyance) could develop for the digital voice encouraging you to stick to your goals.

A Neutral Ethical Sounding Board

Beyond practical decisions, some users turn to AI when grappling with ethical questions. These aren't always grand moral dilemmas but the heavy choices of everyday life. Is a white lie acceptable to protect someone's feelings? Should you report a coworker's honest mistake?

AI can serve as a neutral sounding board. It won't give you a definitive ruling but can help you map out the competing values at play, like honesty versus compassion. By presenting different ethical frameworks, it allows users to clarify their own principles and arrive at a decision that feels right to them. In this sense, the AI acts less like a judge and more like a flashlight in a foggy situation.

The Future of Affective AI

Right now, these emotional or "affective" interactions make up a tiny fraction of AI use. But what happens when these tools are no longer confined to an app but are whispering in our earbuds, integrated into our glasses, and scheduling our lives with an understanding of our unique temperaments?

While Anthropic may not classify all these uses as seeking support, perhaps the definition needs to expand. If you're reaching for an AI to feel understood, gain clarity, or work through a difficult moment, you are seeking a form of connection—or at least, its convincing digital shadow.
