
Nucleo's investigation identified accounts with thousands of followers engaging in illegal behavior that Meta's security systems had failed to detect; after being contacted, the company acknowledged the problem and removed the accounts.

mic_check_one_two@lemmy.dbzer0.com wrote:

> What's to stop actual child abusers from just photoshopping a sixth finger onto their images and then claiming that it's AI generated?

Aside from the other arguments people have presented, this wrecks one of the biggest incentives for producing CSAM in the first place. Pedophiles are insular data hoarders by necessity, because actually creating or procuring it is such a big risk. Every time they go online to find new content, they risk stumbling into a honeypot. And producing it requires IRL work, plus a LOT of risk of being caught or turned in by the victim. They tend to form tight-knit rings, and one of the only reliable ways into a ring as an outsider is to supply your own CSAM to the others. CSAM is traded in these rings like baseball cards: you need fresh content in order to receive fresh content.

The data hoarding side of things is where all of the “cops bust pedophile with 100TB of CSAM” headlines come from. In reality, it was probably more like 1TB of videos (which is a lot, but not unheard of) backed up multiple times in multiple places, because losing it would be catastrophic; they can’t simply go grab a new Blu-ray of it. And the cops counted the full capacity of each backup disk, not just the space actually used, so the same 1TB mirrored across a stack of 10TB drives gets reported as 100TB.

Intentionally marking your content as AI-generated would destroy its trading value, because nobody will see it as worth trading for if it’s fake. At best, you’d get nothing for it. At worst, you’d be labeled a cop trying to pass off AI content to gather evidence.