this post was submitted on 27 Feb 2025
996 points (96.6% liked)

Technology


Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to 'Settings’ > 'Apps’, then delete the application.”

top 50 comments
[–] UltraGiGaGigantic@lemmy.ml 3 points 2 hours ago

I didn't see it anywhere on my phone but ill look into it more after work. Thanks for the heads up.

[–] AWittyUsername@lemmy.world 21 points 12 hours ago

Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content”.

Cheers Google, but I'm a capable adult and able to do this myself.

[–] Lanske@lemmy.world 20 points 17 hours ago

Thanks for this, just uninstalled it. Google are arseholes.

[–] teohhanhui@lemmy.world 41 points 20 hours ago (7 children)
[–] kattfisk@lemmy.dbzer0.com 31 points 18 hours ago (4 children)

To quote the most salient post:

The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

Which is a sorely needed feature to tackle problems like SMS scams.
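The architecture the quoted post describes can be sketched in a few lines. This is not SafetyCore's real API — the keyword "model" below is a hypothetical stand-in — but it shows the key property: classification happens locally, and only a label ever leaves the function, never the message itself.

```python
# Hypothetical sketch of on-device classification: the "model" here is a
# trivial keyword heuristic standing in for a real ML model. Note there
# is no network I/O anywhere -- the message text stays on the device.

SCAM_MARKERS = {"verify your account", "urgent", "gift card", "wire transfer"}

def classify_locally(message: str) -> str:
    """Return a warning label; the message is never sent anywhere."""
    text = message.lower()
    hits = sum(marker in text for marker in SCAM_MARKERS)
    return "likely_scam" if hits >= 2 else "ok"

print(classify_locally("URGENT: verify your account with a gift card"))  # likely_scam
print(classify_locally("See you at dinner tonight"))  # ok
```

An app using such a service could then blur or flag the message itself, without Google (or anyone else) seeing the content.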

[–] throwback3090@lemmy.nz 7 points 16 hours ago (1 children)

Why do you need machine learning for detecting scams?

Is someone in 2025 trying to help you out of the goodness of their heart? No. Move on.

[–] Aermis@lemmy.world 5 points 15 hours ago (1 children)

If you want to talk money, then it is in businesses' best interest that money from their users is spent on their products, not scammed away through the use of their products.

Secondly, machine learning algorithms can detect patterns in ways a human can't. In some circles I've read that the programmers themselves can't decipher from the code how the end result is produced, just that the inputs guide it. Besides the fact that scammers can circumvent any carefully laid down anti-spam, anti-scam, or anti-virus rules in traditional software, a learning algorithm will be orders of magnitude harder to bypass. Or easier. Depends on the algorithm.
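The point about learned patterns versus fixed rules can be illustrated with a toy naive Bayes scorer. The training phrases below are invented for the sketch; the takeaway is that a reworded scam still scores high because the model weighs word statistics, not exact rule matches.

```python
# Toy naive Bayes spam scorer (add-one smoothing) built from invented
# training phrases, illustrating pattern-based detection.
from collections import Counter
import math

spam = ["claim your prize now", "account suspended verify now", "free prize claim"]
ham = ["lunch at noon", "meeting moved to friday", "call me later"]

def counts(docs):
    c = Counter()
    for d in docs:
        c.update(d.split())
    return c

spam_c, ham_c = counts(spam), counts(ham)
vocab = set(spam_c) | set(ham_c)

def log_odds(message: str) -> float:
    """Positive score = more spam-like, even for unseen word combinations."""
    score = 0.0
    for w in message.lower().split():
        p_spam = (spam_c[w] + 1) / (sum(spam_c.values()) + len(vocab))
        p_ham = (ham_c[w] + 1) / (sum(ham_c.values()) + len(vocab))
        score += math.log(p_spam / p_ham)
    return score

print(log_odds("verify your prize now") > 0)  # True: spam-like wording
print(log_odds("see you at lunch") > 0)       # False: ham-like wording
```

A blocklist would need the exact phrase; the statistical model generalizes, which is why it's harder (though not impossible) to route around.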

[–] desktop_user@lemmy.blahaj.zone 6 points 16 hours ago (1 children)

If the cellular carriers were forced to verify that caller ID (or the SMS equivalent) was accurate, SMS scams would disappear, or at least be weakened. Google shouldn't have to do the job of the carriers, and if they wanted to implement this anyway, they should let the user choose which service performs the task, similar to how they let the user choose which "Android System WebView" is used.

[–] Aermis@lemmy.world 4 points 15 hours ago

Carriers don't care. They are selling you data; they don't care how it's used. Google is selling you a phone. Apple held the market for a long time by being the phone with some of the best security. As an Android user, that makes me want to switch phones, not carriers.

[–] Spaniard@lemmy.world 7 points 17 hours ago

If the app did what the OP is claiming, then the EU would have a field day fining Google.

[–] dan@upvote.au 4 points 20 hours ago (2 children)

So is this really just a local AI model? Or is it something bigger? My S25 Ultra has the app but it hasn't used any battery or data.

[–] pfr@lemmy.sdf.org 10 points 20 hours ago

laughs in GrapheneOS

[–] mctoasterson@reddthat.com 42 points 1 day ago

People don't seem to understand the risks presented by normalizing client-side scanning on closed-source devices. Think about how image recognition works. It scans image content locally and matches it to keywords or tags, describing the people, objects, emotions, and other characteristics. Even the rudimentary open-source model in an Immich deployment on a Raspberry Pi can process thousands of images and make all the contents searchable with alarming speed and accuracy.

So once similar image analysis is done locally on a phone, pre-encryption, it is trivial for Apple or Google to use it for whatever purposes their terms of use allow. Forget the iCloud encryption backdoor: the big tech players can already scan content on your device pre-encryption.

And just because someone does a traffic analysis of the process itself (SafetyCore, mediaanalysisd, or whatever) and shows it doesn't directly phone home doesn't mean it is safe. The entire OS is closed source, and it only needs to backchannel small amounts of data in order to fuck you over.

Remember, the original justification for client-side scanning from Apple was "detecting CSAM". Well, they backed away from that line of thinking, but they kept all the client-side scanning in iOS and macOS. It would be trivial for them to flag many other types of content and furnish that data to governments or third parties.
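The local-tagging pipeline described above (classify each image, index the tags, search the index) is simple enough to sketch. `fake_classifier` below is a hypothetical stand-in for a real on-device vision model such as the one Immich runs; the indexing logic is the part that makes everything searchable.

```python
# Sketch of local image tagging and search. fake_classifier is a
# hypothetical stand-in for an on-device vision model; its fixture data
# is invented for this example.

def fake_classifier(filename: str) -> list[str]:
    """Stand-in for a vision model: returns descriptive tags per image."""
    fixtures = {
        "beach.jpg": ["person", "ocean", "smiling"],
        "receipt.png": ["document", "text"],
        "dog.jpg": ["dog", "grass"],
    }
    return fixtures.get(filename, [])

def build_index(files: list[str]) -> dict[str, list[str]]:
    """Invert tags into a searchable tag -> files index."""
    index: dict[str, list[str]] = {}
    for f in files:
        for tag in fake_classifier(f):
            index.setdefault(tag, []).append(f)
    return index

index = build_index(["beach.jpg", "receipt.png", "dog.jpg"])
print(index["document"])  # ['receipt.png']
```

Once such an index exists pre-encryption, exporting it (or answering queries against it) requires only tiny amounts of traffic, which is the backchannel concern raised above.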

[–] Denalduh@lemmy.world 35 points 1 day ago (2 children)

I didn't have it in my app drawer, but once I went to this link it showed as installed. I uninstalled it ASAP.

https://play.google.com/store/apps/details?id=com.google.android.safetycore&hl=en-US

[–] danciestlobster@lemm.ee 14 points 1 day ago

I also reported it as hostile and inappropriate. I am sure Google will do fuck all with that report but I enjoy being petty sometimes

[–] digdilem@lemmy.ml 9 points 21 hours ago (1 children)

More information: it's been rolling out to Android 9+ users since November 2024 as a high-priority update. Some users report it installs even when on battery and off Wi-Fi, unlike most apps.

App description on Play store: SafetyCore is a Google system service for Android 9+ devices. It provides the underlying technology for features like the upcoming Sensitive Content Warnings feature in Google Messages that helps users protect themselves when receiving potentially unwanted content. While SafetyCore started rolling out last year, the Sensitive Content Warnings feature in Google Messages is a separate, optional feature and will begin its gradual rollout in 2025. The processing for the Sensitive Content Warnings feature is done on-device and all of the images or specific results and warnings are private to the user.

Description by Google: "Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a 'speed bump' that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and to prevent accidental shares." - https://9to5google.com/android-safetycore-app-what-is-it/

So it looks like something that sends pictures from your messages (at least initially) to Google for an AI to check whether they're "sensitive". The app is 44 MB, too small to contain a useful AI, and I don't think this could happen on-phone, so it must require sending your on-phone data to Google?

[–] loics2@lemm.ee 3 points 13 hours ago

I guess the app then downloads the required models.
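That download-on-first-use pattern is common for on-device ML and resolves the size puzzle: the installed app ships only glue code, and large model files are fetched and cached when a feature first needs them. A minimal sketch, with an invented cache path and a placeholder fetch function (not SafetyCore's real internals):

```python
# Hypothetical lazy model download: small installed app, large cached
# payload fetched on first use. fetch_from_server is a placeholder for
# an HTTPS download of real model weights.
from pathlib import Path
import tempfile

CACHE = Path(tempfile.gettempdir()) / "model_cache"

def fetch_from_server(name: str) -> bytes:
    # Placeholder: a real implementation would download model weights here.
    return b"\x00" * 1024

def get_model(name: str) -> Path:
    """Return a local path to the named model, downloading it only once."""
    CACHE.mkdir(parents=True, exist_ok=True)
    local = CACHE / name
    if not local.exists():
        local.write_bytes(fetch_from_server(name))
    return local

model_path = get_model("content_classifier.tflite")
print(model_path.exists())  # True
```

This would also explain users seeing the app transfer data despite the classification itself being local: the traffic is the model coming down, not your content going up.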

[–] sommerset@thelemmy.club 12 points 1 day ago (2 children)

Thanks. Just uninstalled. What cunts.

[–] dev_null@lemmy.ml 6 points 19 hours ago (2 children)

Do we have any proof of it doing anything bad?

Taking Google's description of what it is, it seems like a good thing. Of course we should absolutely assume Google is lying and it actually does something nefarious, but we should get some proof before picking up the pitchforks.

[–] btaf45@lemmy.world 2 points 12 hours ago (1 children)

Whether the people at Google who did this know they are evil or think they are not doesn't really matter. Having a phone app that automatically scans all your photos should scare the shit out of you. At the very least, it wastes your battery and slows down your phone.

[–] dev_null@lemmy.ml 2 points 11 hours ago

If it provided a feature to automatically block incoming dick pics - which Google claims it's for - was fully local, and only scanned incoming messages rather than my own gallery (again, as Google claims), I would likely find it useful. There is nothing wrong with the idea in general.

At the very least it wastes your battery

Again, if it's an optional feature that you can choose to turn on or off, there is nothing wrong with that.

[–] sommerset@thelemmy.club 9 points 18 hours ago* (last edited 18 hours ago) (3 children)

Google is always 100% lying.
There are too many instances to list, and I'm not spending 5 hours collecting examples for you.
They removed "don't be evil" a long time ago.

[–] dev_null@lemmy.ml 12 points 16 hours ago* (last edited 16 hours ago)

They removed "don't be evil" a long time ago

See, this is why I like proof. If you go to Google's Code of Conduct today, or any archived version of it, you can see for yourself that it was never removed. Yet everyone believed the clickbait articles claiming so. What happened is they moved it from the header to the footer; clickbait media reported that as "removed" and everyone ran with it, even though anyone can easily see it's not true, and it takes 30 seconds to verify, not even 5 hours.

Years later you are still repeating something that was made up just because you heard it a lot.

Of course Google is absolutely evil, and the phrase was always meaningless whether it's there or not, but we can't just make up facts because they fit our world view. And we have to be aware of confirmation bias. Google removing "don't be evil" sounds about right for them, right? It makes perfect sense. But it just plain didn't happen.

[–] tame@dormi.zone 2 points 17 hours ago

Why check any sources first when you can just blindly rage and assume the worst?

https://grapheneos.social/@GrapheneOS/113969399311251057

[–] static@lemm.ee 11 points 22 hours ago

I uninstalled it, and a couple of days later, it reappeared on my phone.

[–] perestroika@lemm.ee 6 points 21 hours ago (3 children)

The countdown to Android's slow and painful death has been ticking for a while.

It has become over-engineered and is no longer appealing from a developer's viewpoint.

I still write code for Android because my customers need it - and will for a while - but I've stopped writing code for Apple's i-things, and I'm researching alternatives to Android. Rolling my own environment with FOSS components on top of Raspbian already looks feasible. On robots and automation, I already use it.

[–] latenightnoir@lemmy.world 2 points 17 hours ago

Great, it'll have to plow through ~30GB of 1080p recordings of darkness and my upstairs neighbors living it up in the AMs. And nothing else.
