this post was submitted on 21 May 2024
503 points (95.3% liked)

[–] crazyminner@lemmy.ml 24 points 5 months ago* (last edited 5 months ago) (7 children)

I had an idea when these first AI image generators started gaining traction: flood the CSAM market with AI-generated images (good enough that you can't tell them apart). In theory this would put the actual creators of CSAM out of business, saving a lot of children from that trauma.

Most people downvote the idea as a gut reaction tho.

Looks like they might do it on their own.

[–] Itwasthegoat@lemmy.world 8 points 5 months ago (1 children)

My concern is: why would it put them out of business? If we just look at legal porn, there are already huge amounts of existing content, yet the market is still there and new content is created constantly. AI porn hasn't noticeably decreased the amount produced.

Really, flooding the market with CSAM makes it easier to consume and may end up INCREASING the number of people trying to get CSAM. That could end up encouraging more to be produced.

[–] crazyminner@lemmy.ml 4 points 5 months ago

The market is slightly different tho. Most CSAM is images, while with porn there's a lot of video as well as images.
