This week, OpenAI CEO Sam Altman put a spotlight on a critical issue. “People talk about the most personal sh** in their lives to ChatGPT,” he said. “And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it… And we haven’t figured that out yet for when you talk to ChatGPT.”
Legal privilege, confidentiality, and data control are not built into cloud-based AI tools. They never were. And despite Altman’s surprising candor, this isn’t new information.
Back in early 2023, legal journalists and ethics watchdogs were already sounding the alarm: sharing client information with a public AI tool like ChatGPT is tantamount to disclosing it to a third party, destroying privilege. That’s a malpractice minefield.
In March 2024, Microsoft’s Azure-based deployment of ChatGPT exposed another loophole: human access to user conversations. And if that weren’t enough, a federal judge has now ordered OpenAI to retain all user conversation logs, even those that users chose to delete.
The bottom line?
If your tools require the cloud, your data isn’t entirely yours.
At TheFormTool, we built our software—TheFormTool PRO and Doxserá—to operate entirely offline. No cloud, no outside access, and no built-in vulnerability to discovery, data mining, or silent subpoenas.
It’s time to rethink how we treat client confidentiality. If GenAI is going to be part of your practice, it must come with built-in protections. Ours does.