According to a new report, the ChatGPT app for Mac had a troubling security issue that could easily compromise user conversations and data.
Developer Pedro José Pereira Vieito revealed on Threads that ChatGPT stored data as plain text, bypassing the system's security measures. This allowed other apps to easily find and read the data; as a demonstration, Vieito built an app of his own that could read users' past conversations with ChatGPT.
As The Verge noted, this means the ChatGPT app did not follow the sandboxing used by most apps on Apple's platforms. Sandboxing restricts an app's data to its own container, making it impossible for other apps to access or change that data without explicit permission.
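As a rough illustration (this is not Vieito's actual code, and the directory path shown is a guess), a few lines of Swift are all an unsandboxed macOS app needs to read another app's plain-text files from the user's Library folder, which is what made unencrypted, un-sandboxed storage risky:

```swift
import Foundation

// Hypothetical location of the ChatGPT app's conversation store; the real
// path and file format may differ. The point is that nothing on macOS stops
// an unsandboxed app from reading another app's plain-text files here.
let home = FileManager.default.homeDirectoryForCurrentUser
let conversationsDir = home.appendingPathComponent(
    "Library/Application Support/com.openai.chat", isDirectory: true)

if let files = try? FileManager.default.contentsOfDirectory(
    at: conversationsDir, includingPropertiesForKeys: nil) {
    for file in files {
        // Unencrypted text or JSON contents are readable as-is.
        if let contents = try? String(contentsOf: file, encoding: .utf8) {
            print("Readable: \(file.lastPathComponent) (\(contents.count) characters)")
        }
    }
}
```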
When The Verge contacted OpenAI, the company that developed ChatGPT, spokesperson Taya Christianson responded:
"We are aware of this issue and have shipped a new version of the application that encrypts these conversations We have done so
"We are committed to providing a useful user experience while maintaining our high security standards as our technology evolves"
An update to the ChatGPT app for macOS has since been released, and it appears to resolve the issue by locking ChatGPT data behind proper encryption and preventing Pereira Vieito's method from working.
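OpenAI has not published details of the fix, but as a general sketch of the technique, here is how an app might encrypt data with Apple's CryptoKit framework before writing it to disk. The key handling is simplified for illustration; a real app would generate the key once and store it in the Keychain rather than create it on every run.

```swift
import Foundation
import CryptoKit

// Simplified: in practice the key lives in the Keychain, not in memory like this.
let key = SymmetricKey(size: .bits256)

func save(conversation text: String, to url: URL, with key: SymmetricKey) throws {
    // Seal the plaintext with AES-GCM, then write nonce + ciphertext + tag.
    let sealed = try AES.GCM.seal(Data(text.utf8), using: key)
    try sealed.combined!.write(to: url)
}

func load(from url: URL, with key: SymmetricKey) throws -> String {
    // Without the key, the file on disk is unreadable ciphertext.
    let box = try AES.GCM.SealedBox(combined: try Data(contentsOf: url))
    let plaintext = try AES.GCM.open(box, using: key)
    return String(decoding: plaintext, as: UTF8.self)
}
```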
It should be noted that the ChatGPT app is separate from the Apple-OpenAI partnership that forms part of Apple Intelligence. That integration also requires some degree of data transfer between applications, but Apple is likely to be quite strict about the security details of its own implementation.
This issue may cause some users to worry about the security of their data and their AI chatbot conversations. However, the quick resolution of the problem should allay those concerns. To continue protecting yourself and your data, we recommend keeping your apps and macOS up to date and investing in good antivirus software.