this post was submitted on 27 Apr 2025
128 points (94.4% liked)

[–] IllNess@infosec.pub 20 points 8 hours ago

In a statement to CNN, Telegram said the company “has a zero-tolerance policy for illegal pornography” and uses “a combination of human moderation, AI and machine learning tools and reports from users and trusted organizations to combat illegal pornography and other abuses of the platform.”

They've had machine learning algorithms for identifying nudity in pictures for decades now. Tech companies also have some of the best facial recognition software ever built.

The company could combine both technologies to instantly stop uploads of content with the faces of previous victims.
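The pipeline the comment describes can be sketched roughly as follows. This is a hypothetical illustration, not Telegram's actual system: `detect_nudity` and `face_embedding` are stand-ins for real NSFW-classification and face-recognition models, and the thresholds are made up.

```python
# Sketch of an upload filter that blocks content only when BOTH signals fire:
# (a) the image is classified as explicit, and
# (b) its face embedding matches a known prior victim.
from math import sqrt

def detect_nudity(image):
    # Placeholder: a real system would run an NSFW classifier here.
    return image.get("nsfw_score", 0.0) > 0.8

def face_embedding(image):
    # Placeholder: a real system would run face recognition here.
    return image.get("embedding")

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def should_block(image, victim_embeddings, threshold=0.9):
    """Block an upload only if it is explicit AND matches a protected face."""
    if not detect_nudity(image):
        return False
    emb = face_embedding(image)
    if emb is None:
        return False
    return any(cosine_similarity(emb, v) >= threshold for v in victim_embeddings)

# Example: an explicit upload whose face matches a protected person.
victims = [[0.1, 0.9, 0.4]]
upload = {"nsfw_score": 0.95, "embedding": [0.1, 0.9, 0.4]}
print(should_block(upload, victims))  # True
```

Requiring both signals keeps false positives down: ordinary photos of a victim, and explicit content of strangers, both pass through.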

[–] demunted@lemmy.ml 20 points 8 hours ago (1 children)

The incel chuds out there redistributing this stuff and shaming the targets are something I wish would be dealt with in the mainstream. It's disgusting that we're regressing as a society by allowing this bullying to continue.

[–] BigRigButters@lemm.ee 1 points 2 hours ago

“Incel chuds” lol

[–] taladar@sh.itjust.works 45 points 10 hours ago (4 children)

Let's be perfectly honest here: especially for pictures, this has been possible for at least 20 years.

What is ruining people's lives is the obsession a conservative society has with demonizing nudity and sex.

[–] MoonlightFox@lemmy.world 19 points 9 hours ago (2 children)

I agree in principle, society is demonizing nudity and sex. This has got to change. Society needs to change in order to fix this and many other issues related to sex and nudity.

As long as it affects a person's reputation and standing, this is a problem. Anyone can harm someone with this technology, and as a society we cannot accept that.

Before, most people could not make a decent fake sex tape of any person in the world with low effort. Now they can.

Should creating deepfakes for personal consumption be legal or illegal? Distribution is the real problem; the rest is fantasizing with tools, more or less. Some people will understandably not like it if they find out others fantasize about them, but that is close to a thought crime. What is acceptable? Is a stick drawing with names too much? What if I am really good at realistic drawings? What if I draw many images in a book and make a physical animation out of it? Is the limit anything outside my head? What if I draw one politician fellating another and distribute it as art or satire?

The short-term solution is to ban deepfakes; the long-term one is probably something else, but I am not sure what. There is no inherent abuse in a deepfake, and no actual sex either, so it's a reputation/honor and disgust thing. These things still matter a lot in societies, so we can't ignore them either.

[–] BigRigButters@lemm.ee 1 points 2 hours ago

Also, people fixate on and idolize them, making it worse.

[–] DeathsEmbrace@lemm.ee 2 points 9 hours ago

Religion would like to have a word

[–] wizardbeard@lemmy.dbzer0.com 31 points 10 hours ago (2 children)

Let's not pretend that lowering the barrier to making fake nudes hasn't changed the situation either.

[–] altphoto@lemmy.today 5 points 8 hours ago (1 children)

If there were no shame in being nude, then there wouldn't be a problem..... Imagine finding out that a guy you know has hundreds of portraits of you. Portraits you were never in! Absolutely not terrifying.

[–] catloaf@lemm.ee 4 points 3 hours ago (1 children)

I mean that's still creepy and probably grounds for a restraining order

[–] altphoto@lemmy.today 2 points 2 hours ago

It's sarcastic, also. LOL.

[–] taladar@sh.itjust.works 7 points 9 hours ago (1 children)

In a way it has improved it. It makes it more plausible to claim that any nudes that do show up of you are fake.

[–] turnip@sh.itjust.works 1 points 3 hours ago (1 children)

It will also multiply the amount of porn a hundredfold, which will disconnect it from any kind of basis in reality. When you're viewing nothing but fake nudes and you are aware of it, I'd assume it gets kind of weird.

[–] taladar@sh.itjust.works 1 points 2 hours ago

It is just another step in that direction, starting from real amateur porn (i.e. real sex between real people being filmed), through professional porn (actors being filmed), fake amateur porn (professionals pretending to be amateurs), and animated, rendered, or machinima porn...

There hasn't been much genuine content left in porn for quite a while.

[–] werty@sh.itjust.works 7 points 7 hours ago (1 children)

Prude shaming is exactly the same as slut shaming. Implicit in your comment is the notion that women ought to be perfectly fine with deepfake porn of themselves being created and viewed by anyone, but society has made them prudish through the demonization of nudity and sex.

Women are people and have rights and feelings that should not be ignored simply to serve the male sex drive, and a great many women do not want to be deepfake porn stars. If you disagree then say so. Don't hide behind the notion that society has robbed women of their desire to serve your fantasy through the demonization of sex. Sexual liberation includes the right to not have or be involved in sexual activity if one chooses. Prude shaming, like yours, is designed to remove that right.

No wonder 4b took off in Korea.

[–] taladar@sh.itjust.works 0 points 5 hours ago (1 children)

Society subjects people to a lot of things they aren't "perfectly fine with", but only some of those have wider implications for reputation, career chances, ostracism, and so on, and that is the main problem with this kind of technology.

Your assumption that people need deepfake technology to fantasize about people sexually who have no interest in being on the receiving end of that is at best naive and at worst arguing in bad faith.

[–] werty@sh.itjust.works 1 points 5 hours ago

and that is the main problem with this kind of technology.

In your opinion.

Your assumption that people need deepfake technology to fantasize about people sexually who have no interest in being on the receiving end of that is at best naive and at worst arguing in bad faith.

I never assumed or suggested any such thing. You are making shit up.

[–] arararagi@ani.social 3 points 9 hours ago

It's just like piracy: when it becomes too easy is when companies and governments begin the crackdown.

Theoretically possible since Photoshop, but you had to be pretty fucking good at it; now even teens are making deepfakes of their teachers and classmates.

[–] SunshineJogger@feddit.org 65 points 13 hours ago (3 children)

This problem has existed since people got good enough with MS Paint. It got a level bigger with Photoshop, and is now simply on the next level once more with AI generation.

It's bad, and it's also not something new and not caused by AI, but by humans misusing a tool. Again.

[–] tetris11@lemmy.ml 37 points 12 hours ago (3 children)

Aren't we bringing about an era where you can't trust what you see or hear, unless it comes from a source you trust?

Essentially, aren't we just reverting back to the 1800s, when news came from newspapers of reputation and hearsay came from elsewhere?

[–] remotelove@lemmy.ca 35 points 12 hours ago (1 children)

It's worse. We are reverting to the age of lügenpresse, and hearsay comes in short-form video formats.

Many people simply do not care (or are not even aware) whether a source is trusted if the message aligns with their own bias or is presented as a new "fact". Trust is irrelevant, unfortunately.

[–] jeena@piefed.jeena.net 4 points 11 hours ago

How is that different from before the 1800s?

https://www.youtube.com/watch?v=zrzMhU_4m-g back then they also burned witches because of hearsay.

[–] halcyoncmdr@lemmy.world 19 points 12 hours ago

Basically, except the newspapers of today no longer care about reputation. They only care about clicks, the bottom line, and speed. Accuracy is no longer a primary focus.

[–] DeathsEmbrace@lemm.ee -1 points 8 hours ago (1 children)

No, because in the 1800s you could argue there was a thing called journalism. Nowadays, the tension between clicks and news means there isn't going to be trustworthy news, because it's brought to you by Amazon AWS.

[–] tetris11@lemmy.ml 2 points 6 hours ago

Definitely agree for the most part. I would say that independent outlets like AP and (arguably) the Guardian do have reputations they try to uphold, but I hear you.

[–] Flemmy@lemm.ee 5 points 13 hours ago

The fetish genres you'd rather not know about exist, and yet someone out there is providing them. That's why normalcy is OK.

The combination of Rule 34, robot sex toys, and VR is a huge mistake that should be noped if you think your child is heading in that direction.

[–] prole@lemmy.blahaj.zone 0 points 8 hours ago (1 children)

It's so disingenuous to pretend this is the same as editing a photo in MS Paint.

[–] SunshineJogger@feddit.org 8 points 7 hours ago

You missed the point. It's about this simply being the next step in an old, longstanding problem.

I'm quite simply fed up with people treating AI as the starting point and scapegoat for so many things they dislike that already existed.

[–] FartMaster69@lemmy.dbzer0.com 7 points 13 hours ago

Well, making deepfakes is certainly ruining Johnny Somali’s life