this post was submitted on 06 Jul 2024

Linux and Tech News

[–] autotldr@lemmings.world 2 points 4 months ago

This is the best summary I could come up with:


OpenAI announced its Mac desktop app for ChatGPT with a lot of fanfare a few weeks ago, but it turns out it had a rather serious security issue: user chats were stored in plain text, where any bad actor could find them if they gained access to your machine.

As Threads user Pedro José Pereira Vieito noted earlier this week, "the OpenAI ChatGPT app on macOS is not sandboxed and stores all the conversations in plain-text in a non-protected location," meaning "any other running app / process / malware can read all your ChatGPT conversations without any permission prompt."

OpenAI chose to opt out of the macOS App Sandbox and store the conversations in plain text in an unprotected location, disabling the operating system's built-in defenses.
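The summary hinges on two macOS protections the app skipped: the App Sandbox and encrypted storage. Below is a minimal Python sketch of why that matters, using a temporary directory to stand in for the app's data folder; the `com.example.chat` name and file layout are hypothetical, chosen only for illustration.

```python
import json
import tempfile
from pathlib import Path

# A minimal sketch of the reported issue: outside the macOS App Sandbox,
# an app's data directory is readable by any process running as the same
# user. The directory and file names below are hypothetical.
app_dir = Path(tempfile.mkdtemp()) / "Library" / "Application Support" / "com.example.chat"
app_dir.mkdir(parents=True)

# The "chat app" writes a conversation to disk in plain text.
(app_dir / "conversation-001.json").write_text(
    json.dumps({"role": "user", "content": "my secret question"})
)

# Any other process can read it back with no permission prompt, because
# nothing on disk is encrypted or access-restricted.
leaked = [json.loads(p.read_text())["content"] for p in app_dir.glob("*.json")]
print(leaked)  # every stored conversation is exposed
```

Sandboxing would have confined other apps away from this directory, and encryption at rest (which OpenAI later added) makes the files unreadable even when they can be opened.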

OpenAI has now updated the app, and the local chats are now encrypted, though they are still not sandboxed.

It's not a great look for OpenAI, which recently entered into a partnership with Apple to offer chatbot services built into Siri queries in Apple operating systems.

Apple detailed some of the security measures around those queries at WWDC last month, and they're more stringent than what OpenAI did (or, more precisely, didn't do) with its Mac app, which is a separate initiative from the partnership.


The original article contains 291 words, the summary contains 211 words. Saved 27%. I'm a bot and I'm open source!

[–] csolisr@hub.azkware.net 1 points 4 months ago

@leo Well, which one doesn't at this rate... Still waiting on OpenRecall to actually do something about encrypting its internal memory in some way.