this post was submitted on 30 Sep 2023
550 points (97.9% liked)

World News

39104 readers
2216 users here now

A community for discussing events around the World

Rules:

Similarly, if you see posts along these lines, do not engage. Report them, block them, and live a happier life than they do. We see too many slapfights that boil down to "Mom! He's bugging me!" and "I'm not touching you!" Going forward, slapfights will result in removed comments and temp bans to cool off.

We ask users to report any comment or post that violates the rules, and to use critical thinking when reading, posting, or commenting. Users who post off-topic spam, advocate violence, have multiple comments or posts removed, weaponize reports, or violate the code of conduct will be banned.

All posts and comments will be reviewed on a case-by-case basis. This means that some content that violates the rules may be allowed, while other content that does not violate the rules may be removed. The moderators retain the right to remove any content and ban users.


Lemmy World Partners

News !news@lemmy.world

Politics !politics@lemmy.world

World Politics !globalpolitics@lemmy.world


Recommendations

For Firefox users, there is a media bias / propaganda / fact check plugin:

https://addons.mozilla.org/en-US/firefox/addon/media-bias-fact-check/

founded 1 year ago
MODERATORS
[–] SayJess@lemmy.blahaj.zone 4 points 1 year ago* (last edited 1 year ago) (32 children)

My god there are way too many comments in here trying to normalize pedophilia. Disgusting. Pathetic.

These are people that need serious psychiatric care, not acceptance or to be included in the LGBTQ+ community. There is absolutely nothing to compare between them and any group within the LGBTQ+ community. Nothing.

Combatting CP is a hard enough task for the poor bastards that have to do it. There does not need to be AI produced images in the mix.

Lemmy, do better.

[–] 0ddysseus@lemmy.world 4 points 1 year ago (9 children)

(Apologies if I use the wrong terminology here, I'm not an AI expert, just have a fact to share)

The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution videos, and they have all this stuff stored and use it to train the algorithms for AIs. They refuse to delete this material, claiming that they just find the stuff and aren't responsible for what it is.

Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

[–] mcgravier@kbin.social 2 points 1 year ago

Getting an AI image generator to produce CSAM means it knows what to show

Not necessarily. Part of AI is blending different concepts. An AI trained on images of regular children and nude adults should, in principle, be able to produce underage nudity. This is a side effect of the generalization in the AI.
