ChatGPT users on macOS are shocked to discover chats are stored unencrypted.
The partnership between Apple and OpenAI is off to a shaky start, as ChatGPT users on macOS recently discovered that their conversations were being saved in unencrypted plain-text files.
Apple has positioned itself as a company that prioritizes privacy in a market where many of its competitors make the lion's share of their profits by selling or exchanging user data. However, as data and electronics engineer Pedro José Pereira Vietto points out in a post on Meta's Threads, someone dropped the ball when it came to the third-party integration of OpenAI's ChatGPT on macOS.
Privacy risk
ChatGPT was released to subscribers on macOS in May, and access was opened to non-subscribers on June 25. As of Friday, July 5, however, the app stored all chat logs as unencrypted plain-text files on users' hard drives.
This means that anyone with access to the computer, whether physically or through a remote attack such as malware or phishing, could read every conversation a user had with ChatGPT on that machine.
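To illustrate the exposure, the sketch below simulates the situation with a hypothetical app directory and file name (the real storage path used by the app is not reproduced here): any process running under the same user account can read a plain-text file written with default permissions.

```shell
#!/bin/sh
# Sketch: simulate an app writing an unencrypted chat log, then show that
# any other process running as the same user can read it back verbatim.
# The directory and file names below are hypothetical stand-ins.
LOGDIR="$(mktemp -d)/com.example.chatapp"
mkdir -p "$LOGDIR"

# The "app" saves a conversation as plain text.
printf 'user: remind me of my bank password hint\nassistant: ...\n' \
    > "$LOGDIR/conversation-1234.txt"

# Any other process (here, plain cat) can read the full conversation.
cat "$LOGDIR/conversation-1234.txt"

# Default file permissions offer no confidentiality beyond the user account.
ls -l "$LOGDIR/conversation-1234.txt"
```

Nothing here requires elevated privileges, which is exactly why unencrypted storage is risky once malware runs as the logged-in user.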
Sandbox
Apple's macOS has a privacy protection measure called the “sandbox” that controls apps' access to software and data at the kernel level. Apps installed through Apple's App Store are sandboxed by default, so that their data is isolated from other applications and processes.
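For context, a Mac App Store app declares the sandbox in its entitlements file at build time; a minimal, illustrative fragment (the key is Apple's real entitlement name, the rest is a generic plist skeleton) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Opts the app into the App Sandbox; required for
         Mac App Store distribution. -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
</dict>
</plist>
```

Apps distributed outside the App Store, as the ChatGPT app is, are not required to adopt this entitlement.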
Pereira Vietto notes that OpenAI distributes the ChatGPT app for macOS exclusively through its own website, bypassing the App Store and its sandbox requirement:
“OpenAI chose to opt out of the sandbox and store the conversations in plain text, disabling all built-in defenses.”
It's currently unclear whether any users were actually affected by the apparent oversight, but commentary on social media and from pundits suggests consternation.
For example, in the comments section of an article published in The Verge, user GeneralLex described finding unencrypted chat logs by inspecting the app's memory:
“I used Activity Monitor to dump the ChatGPT executable from memory, and horror of horrors, I discovered that the chat log was plaintext and unencrypted!”
A simple mistake?
The real question is why it happened. We know how it happened, and the problem has apparently since been fixed, but the why remains unknown.
Presumably, this was done so that OpenAI could easily access chat logs for further ChatGPT development. According to the app's terms of use, users must expressly opt out of sharing their data with OpenAI.
But why didn't Apple intervene on users' behalf before the application went live, and how did it not know that OpenAI was storing sensitive, unencrypted data on users' machines?
Cointelegraph contacted OpenAI and Apple for more information, but did not receive an immediate response from either.
Related: Apple supercharging Siri and iOS with ‘Apple Intelligence' and OpenAI