Your ChatGPT Therapy Sessions Aren't Private, Sam Altman Warns
As more people turn to artificial intelligence for personal advice, a stark warning has been issued about the privacy of these digital conversations. Many users, especially younger individuals, are now using ChatGPT as a stand-in therapist or life coach, but these chats don't come with the same confidentiality guarantees as a human professional.
OpenAI CEO Sam Altman says young people, in particular, are using ChatGPT as a therapist. Sebastian Gollnow/picture alliance via Getty Images
The Confidentiality Gap: AI vs. Traditional Therapy
OpenAI CEO Sam Altman recently highlighted a critical vulnerability for those confiding in ChatGPT. Speaking on a podcast, he explained that conversations with the AI lack the legal protections that shield discussions with doctors, lawyers, or therapists.
"If you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up," Altman stated. He contrasted this with traditional professional relationships where doctor-patient or attorney-client privilege is legally recognized. For AI interactions, this framework is non-existent.
How Your ChatGPT Data Is Handled
Unlike end-to-end encrypted services, conversations with ChatGPT are not entirely private. OpenAI staff can access user chats to fine-tune the AI models and monitor for misuse of the platform. According to the company's data policy, while users can delete their chat histories, these conversations are not immediately erased. They are typically wiped permanently after 30 days, but there's a significant exception: OpenAI may retain data for longer if required for "legal or security reasons."
This policy has already been put to the test. In a major copyright lawsuit filed by The New York Times and other news organizations, a court order was issued compelling OpenAI to retain all user logs, including those that were deleted. OpenAI is currently appealing this order.
A Call for Urgent Regulation
Altman stressed the urgency of addressing this legal gray area, saying there should be the "same concept of privacy for your conversations with AI that we do with a therapist." The rapid adoption of AI for personal and sensitive matters has created a new set of challenges that legal systems have not yet caught up with.
"No one had to think about that even a year ago, and now I think it's this huge issue of like, 'How are we gonna treat the laws around this?'" Altman remarked, pointing to the need for new regulations to protect user privacy in the age of AI.