this post was submitted on 20 Oct 2023
1344 points (100.0% liked)
[–] someguy7734206@sh.itjust.works 26 points 11 months ago (14 children)

One thing I've started to think about for some reason is the problem of using AI to detect child porn. In order to create such a model, you need actual child porn to train it on, which raises a lot of ethical questions.

[–] LeylaaLovee@lemmy.blahaj.zone 2 points 11 months ago* (last edited 11 months ago)

This is a stupid comment trying to hide as philosophical. If your website is based in the US (like 80 percent of the internet is), you are REQUIRED to preserve any CSAM uploaded to your website and report it; otherwise, you're deleting evidence. So all these websites ALREADY HAVE giant databases of child porn. We learned this when Lemmy was getting overrun with CP and DB0 made a tool to find it.

This is essentially just using material any legally operating website would already have around the office, and having a computer handle it instead of a human who could be traumatized or turned on by it. Are websites better off just keeping a database of CP and doing nothing but reporting it to cops who do nothing? This isn't even getting into how moderators who look for CP STILL HAVE TO BE TRAINED TO DO IT!

Yeah, a real fuckin moral quandary there, I bet this is the question that killed Kant.