Telehealth Shock: Therapist Caught Using ChatGPT With Patient
A therapy patient is reportedly being encouraged to file a complaint against their therapist after accidentally discovering that the therapist was using ChatGPT to formulate responses during sessions.
The individual, known online as TomorrowFutureFate, shared their experience on Reddit. They explained that during a recent video therapy session, poor internet quality led them to suggest switching to an audio-only call.
However, instead of merely turning off the video feed, the therapist inadvertently shared their entire computer screen with the patient, revealing their web browser activity.
Accidental Screen Share Exposes AI Use
The patient, referred to as the original poster (OP), described what they saw: "He had several tabs open with Google searches related to what I had previously mentioned, which I think is fine (e.g., I was mentioning I had seen a movie over the weekend, and he had googled that movie)."
More concerningly, the OP continued, "However, he also had ChatGPT open, and to varying degrees was inputting what I was saying into ChatGPT using first person and then summarizing and cherry-picking things from its response."
This unexpected insight into the therapist's methods led to a bizarre experience for the patient. "This led to a very surreal session in which, out of sheer shock, I also ended up basically cribbing from ChatGPT in my responses," the OP wrote.
A Surreal ChatGPT-Assisted Session
The OP provided an example of the interaction: "I'd say something, he would type it into ChatGPT, it would return a result, like a summary of 'Cognitive Flexibility', and then because I could see his screen, I would say something like 'I guess I could be more flexible...' and he'd say, 'Yes! Exactly!'"
The patient also suspects this wasn't a one-time occurrence. "I don't think this is an isolated incident, either, because I could see some of his ChatGPT history at the start of the session and could see that he had been asking it questions about depersonalization, which I can only assume would have to do with the patient before me."
Patient Questions Trust And Future Of Therapy
In a direct message to Newsweek, the OP expressed their bewilderment, saying they were "reluctant" to raise the matter with the therapist because it would mean challenging an expert with whom they had built a meaningful relationship over the past year.
"I think more than anything else, I found it surreal, like something out of an episode of Black Mirror," the OP shared. "But people I've told this story to in real life have reacted with a mixture of disbelief and horror, so I suspect I am underreacting."
The patient is now considering ending their therapeutic relationship and expressed concerns about trusting therapists in the future.
A particularly striking point for the OP was the financial implication: "One of the things that I can't get over is that he was using the free version of ChatGPT. I pay $50 a session. If he's really going to try to outsource my therapy to AI, certainly, some of that funding could go towards the $20/month ChatGPT Plus subscription so I can at least get a cutting-edge AI therapist."
Ethical Questions Arise Over AI In Mental Health
The incident, originally posted on Reddit, has sparked debate about the ethical use of AI in therapy, and Newsweek has reached out to online therapy services for their policies on AI use.