this post was submitted on 26 Aug 2024
364 points (97.2% liked)

News


A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, which made for dramatic video footage: law enforcement led the uniform-wearing McCorkle from the theater in handcuffs.

[–] Nollij@sopuli.xyz 85 points 2 months ago (10 children)

This creates a significant legal issue - AI generated images have no age, nor is there consent.

The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense. This is based entirely on a concept that cannot apply.

How do you define what depicts a fictional child, especially without including real adults? I've met people who believe that preferring a shaved pubic area is pedophilic, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

Even the extremes aren't clear. Adult star "Little Lupe", who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there's full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?

[–] DmMacniel@feddit.org 73 points 2 months ago (121 children)

I don't see how children were abused in this case? It's just AI imagery.

It's the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

[–] timestatic@feddit.org 33 points 2 months ago (8 children)

Then also every artist creating loli porn would have to be jailed for child pornography.

[–] jaggedrobotpubes@lemmy.world 61 points 2 months ago (19 children)

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go "ok, itch scratched", and tank the demand for the real stuff.

Depending on which way it goes, it could be massively helpful for protecting kids. I just don't have a sense for what the effect would be, and I've never seen any experts weigh in.

[–] damnedfurry@lemmy.world 33 points 2 months ago

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

From bits/articles I've seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

I'm reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So...that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

[–] PhilMcGraw@lemmy.world 22 points 2 months ago (3 children)

In Australia, cartoon child porn is treated under the law the same way as actual child porn. Not that it answers your question, but it's interesting.

I'd imagine for your question "it depends", some people who would have acted on their urges may get their jollies from AI child porn, others who have never considered being pedophiles might find the AI child porn (assuming legal) and realise it's something they were into.

I guess it may lower the production of real child porn which feels like a good thing. I'd hazard a guess that there are way more child porn viewers than child abusers.

[–] Thespiralsong@lemmy.world 16 points 2 months ago

I seem to remember Sweden did a study on this, but I don't really want to google around to find it for you. Good luck!

[–] BonesOfTheMoon@lemmy.world 56 points 2 months ago (16 children)

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

I've read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it's such a problem.

[–] RandomlyNice@lemmy.world 38 points 2 months ago (3 children)

Many years ago (about 25) I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). This article noted that a study had been commissioned to show that cp access increases child abuse. The study seemed to show the opposite.

Here's the problem with even AI generated cp: It might lower abuse in the beginning, but with increased access it would 'normalise' the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

This is all very complex. A solution isn't simple. Shunning things in any way won't help though, and that seems to be the most popular current way to deal with the issue.

[–] Facebones@reddthat.com 25 points 2 months ago (8 children)

Actual pedophiles (a lot of CSA is abuse of power, not pedophilia - though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it's as fucked up as everyone else does. Of course we can't just say "sure, AI material is legal now", but I could imagine a regulated system accessed via doctors, akin to how controlled substances work.

People take this firm "kill em all" stance, but these people just feel the way they do, same as I do towards women or a gay man feels toward men. It just is what it is - we all generally agree being gay isn't a choice, and this is no different. As long as they don't act on it, I think we should be sympathetic and open to helping them live a less tortured life.

I'm not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.

[–] Cryophilia@lemmy.world 18 points 2 months ago (1 children)

"Normalized" violent media doesn't seem to have increased the prevalence of real world violence.

[–] Stern@lemmy.world 39 points 2 months ago (1 children)

Lolicon fans in absolute shambles.

[–] RangerJosie@lemmy.world 38 points 2 months ago (1 children)

Hey, remember that terrible thing everyone said would happen?

It's happening.

[–] hexdream@lemmy.world 25 points 2 months ago (10 children)

If this thread (and others like it) has taught me anything, it's that facts be damned, people are opinionated either way. Nuance means nothing, and it's basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study 100% said AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed any real children as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

[–] recapitated@lemmy.world 24 points 2 months ago* (last edited 2 months ago) (1 children)

To be clear, I am happy to see a pedo contained and isolated from society.

At the same time, this direction of law is something that I don't feel I have the sophistication to truly weigh in on, even though it invokes so many thoughts for me.

I hope we as a society get this one right.

[–] Cryophilia@lemmy.world 14 points 2 months ago

We never do.

[–] Mubelotix@jlai.lu 17 points 2 months ago (7 children)

It's not really children in these pics. We can't condemn people for things that are not illegal yet.

[–] Microw@lemm.ee 15 points 2 months ago (1 children)

It's Florida. They will simply book him and then present him with a deal for "only x years prison", which he'll take, thereby preventing this from going to court and actually being ruled upon.
