
AI Reassurance: A Double-Edged Sword for Anxiety

2025-06-19 · Elizabeth Sadock, Ph.D. · 6 minute read
AI
Mental Health
Anxiety

The surge in popularity of Large Language Models (LLMs) like ChatGPT has led to widespread speculation about their impact on various industries, and mental health is no exception. Even though the field is deeply rooted in human connection, LLMs are being explored for their potential benefits: assistance with diagnosis and treatment, as well as more accessible and affordable alternatives to traditional therapy. Some individuals might find it easier to disclose sensitive thoughts to an AI chatbot than to a person, perceiving it as a safer, less judgmental space. Overall, LLMs show promise as useful tools or adjuncts to conventional therapy.

However, the most common concern voiced about LLMs in mental health revolves around confidentiality and privacy, specifically the security of personal information online. Far fewer studies examine the potential negative effects of LLMs on particular mental health conditions. While formal research on this topic is still scarce, cautionary tales on platforms like Reddit describe instances in which ChatGPT has adversely affected individuals' mental well-being.

From my experience working with clients who have OCD, somatic symptom disorders, and other anxiety-related conditions, I've noticed some potential concerns for them, and even for myself.

Understanding Ruminative Thoughts and OCD

To grasp how OCD and anxiety disorders might interact with tools like ChatGPT, it's important to understand how these conditions generally operate.

Anxiety disorders thrive on uncertainty, a characteristic particularly prominent in individuals diagnosed with Obsessive-Compulsive Disorder (OCD), often called the “doubting disorder.” Treatment frequently involves exposing individuals to their fears while preventing the compulsive behaviors that usually follow.

Underneath this process, what truly happens is that individuals learn to tolerate the discomfort of uncertainty. They refrain from taking action or trying to control thoughts through compulsive behaviors; instead, they must accept that certainty is an illusion. Only then can they let go of obsessional thoughts and cease the struggle. This principle also applies to other anxiety disorders—treatment aims to enhance a person’s capacity to tolerate the fear of the unknown.

This is much easier said than done for people with anxiety, especially those with OCD, as they often struggle with self-trust. Neurological research supports this, showing differences in brain functioning related to decision-making and confidence. Consequently, even if the statistical probability of a feared event is negligible, individuals with anxiety know there’s always a chance. Attempting to convince them otherwise rarely alleviates their anxiety, underscoring its tenacious nature.

Consider a client I'm seeing with health anxiety, who experienced a life-threatening brain tumor as a child—an exceptionally rare condition. Imagine trying to persuade her that the likelihood of encountering another rare medical issue is low. How can her brain accept this when it starkly contrasts with her lived experience? How can it relax and not constantly be on guard for other medical threats? Instead, our work has focused on helping her live with the reality that she can never completely rule out all medical conditions at all times. She has to live with uncertainty while still taking reasonable steps to monitor her health. Otherwise, her life would be consumed by constant doctor visits, symptom research, repetitive body checking, and ultimately, living as if she were already ill.

When Anxiety and OCD Meet ChatGPT

Into this landscape of individuals desperate for certainty comes ChatGPT. As someone who experiences anxiety, I was instantly drawn to this technology. My doubting mind suddenly had access to all the answers: What job should I pursue? Which vacation destination is best? What activities should I plan for my vacation? Who should I invite? And most unsettlingly, applying this prompt to any and all questions: “Using what you know about me, what should I choose?” Within seconds, a direct, confident response appears. No hesitation, no lengthy deliberation, no second-guessing, and crucially—no uncertainty.


But this makes me ponder how such technology could significantly disrupt the therapeutic process. Whenever anxiety arises, one could simply type the question or worry into the algorithm and receive an answer. Reassurance-seeking, after all, is a key compulsive behavior that perpetuates OCD and other anxiety disorders.

You never get to the point of accepting uncertainty. You never learn to trust yourself. You become entirely dependent on a machine to guide you whenever you lack confidence in your own thoughts and feelings.

And its capacity for providing answers is limitless. If you’re seeking reassurance, you could ask ChatGPT the same question repeatedly, and it won’t object; it will simply comply. “Can you give me 20 reasons why I’m right? Can you give me 20 more? 20 more? Are you sure?” Unlike a human, it won’t grow tired of your inquiries and will be available to you at any time, day or night.


Questions to Consider

What impact does this have on the psyche? Is it essential that we confront uncertainty, or can we delegate our ruminative doubts to a machine? Does this approach solve the problem, or does it merely amplify our fear in a world that is inherently uncertain?

If it proves detrimental, how do we assist people in resisting the compulsion to consult ChatGPT—to neutralize their fears temporarily, only to be left unsatisfied and perpetually needing more reassurance?

Final Thoughts

Ultimately, I believe a cautious approach is necessary, one in which therapists and clients alike stay mindful of unintended negative outcomes from having a chatbot readily available. We will need to evaluate whether it is reinforcing negative thought patterns and unhelpful behaviors, so that, if needed, we can intentionally set goals to limit overuse. This is similar to how I work with clients who have Illness Anxiety Disorder to set goals for avoiding repetitive symptom checking on WebMD.

AI is now an integral part of our reality. Therefore, it is wise to reflect and remain vigilant to identify any adverse impacts in the mental health domain. Hopefully, research will soon provide more clarity on this emerging and highly consequential topic for our collective mental health.

References

Das, K. P., & Gavade, P. (2024). A review on the efficacy of artificial intelligence for managing anxiety disorders. Frontiers in Artificial Intelligence, 7, Article 1435895. https://doi.org/10.3389/frai.2024.1435895

Hauser, T. U., Iannaccone, R., Dolan, R. J., et al. (2017). Increased fronto-striatal reward prediction errors moderate decision making in obsessive-compulsive disorder. Psychological Medicine, 47(7), 1246–1258. https://doi.org/10.1017/S0033291716003305
