Your secrets aren’t safe: ChatGPT conversations can be used against you

Millions of users confide in ChatGPT as if it were a trusted friend, therapist, or adviser. But a stark admission has shattered that illusion of privacy: your conversations are not legally protected, and they could be used against you in court.
In a recent podcast interview, OpenAI’s CEO warned users to think twice before sharing anything sensitive with ChatGPT. “We don’t have legal privilege, which is very screwed up,” said Sam Altman. “People talk to ChatGPT like it’s a therapist. It’s not. We do not have the ability to protect that information in the way a doctor or lawyer would.”
This means that if law enforcement or a court issues a subpoena, OpenAI can be legally required to hand over your chats, even deleted ones. That is no longer hypothetical: in its ongoing legal battle with The New York Times, OpenAI has already been ordered to retain all user chat logs, an order the company is now fighting to overturn.
Experts say the implications are serious. While AI tools are increasingly being used as emotional outlets or mental health aids, no current law grants them the privacy protections afforded to professionals in those roles.
“Right now, if you tell ChatGPT something you wouldn’t even tell your closest friend, that data is just floating—vulnerable to legal discovery,” said tech lawyer Nicole Grant.
The danger, privacy advocates argue, is that people often treat AI like it’s human—sharing secrets, confessions, fears, and personal dilemmas—without realizing those conversations can legally be exposed.
Although OpenAI allows users to opt out of having their conversations used for model training, that opt-out does not shield them from legal subpoenas or regulatory demands. And under mounting pressure from the courts, OpenAI may be forced to retain user chats well beyond its standard 30-day deletion window, a major rollback of its privacy practices.
Altman says he supports legislation that would create an "AI privilege" akin to the confidentiality between clients and licensed professionals. But until such laws exist, he urges users to be cautious.
“We should be able to protect that information legally, but we can’t right now. And people don’t know that,” Altman admitted.