this post was submitted on 27 Nov 2023
6 points (71.4% liked)


With AI-powered deepfakes proliferating at galactic speed, nobody (especially women) will have a shred of privacy left in a few years. It does not matter whether a photo is fake or not; nobody should be able to see "you" naked unless you allow it. But with the rise of tools that run on consumer-level hardware, that seems like a losing battle. After all, how can we police what a person can or cannot run on their personal computer? That is another can of worms better left unopened, since the idea of some agency monitoring what you do on your PC is its own dystopia. Soon you can never be sure that your neighbor or coworker has not deepfaked you, and that every time he looks at you he now sees you as a sexual object. That is a highly uncomfortable thought, for sure.

Since we cannot possibly stop it, what is the best option moving forward? Normalizing it? Marginalizing it, since it is fake after all? Ignoring it? None of the options seems very good.

This goes way beyond the current framework of "revenge porn". With revenge porn the case is simple: unlawful distribution without consent. But what about unlawful generation for personal use without consent? I cannot think of legal grounds that could make that a criminal offense; taken to its conclusion, we would soon have to ban even drawing lewd doodles with a pencil at home.

top 7 comments
[–] Hermano@feddit.de 5 points 11 months ago (1 children)

I think these are two different scenarios. If the generated images are kept private, it's not too different from earlier times: people could draw pictures, glue a head onto a magazine page, photoshop things, or just imagine stuff. Sharing deepfakes is new, though; I'd say that should be treated much like revenge porn, as the damage is similar.

[–] Adequately_Insane@lemmy.world -2 points 11 months ago* (last edited 11 months ago)

The main problem with gluing a head to a magazine page, for example, was that the result was not that great, so in the end it was not very enticing for the would-be faker. Plus, it was obviously not you, so most people would just laugh it off. But deepfakes are a whole different thing: without proper labeling they can be passed off as the real you, and even if the person never distributes them, it can be unnerving just to know that someone could have nude photos of you that look like the real thing, made without your consent, and that you would never know who has them. That is why marginalizing or ignoring them will be a lot harder than it was with old-school fakery.

I for one would laugh it off. Want to see a fake naked picture of me and rub one off? Be my guest, I couldn't care less. I might even be flattered. But then there is a whole other group that takes it seriously. That is why, when most social networks started, private profiles were not a thing, but soon a bunch of people started thinking, "Eww, I shared a bunch of pictures of myself online, and now what if someone rubs one off to my fully clothed photos? How do I counter that? How can we stop those creeps?" Bam, and we got private profiles. But doing something like that now to stop AI generation seems all but impossible: the cat is already out of the bag, and anyone with any kind of online presence most likely has at least one picture of their face available somewhere. And even if not, there are always yearbooks. Or something.

[–] BrikoX@lemmy.zip 4 points 11 months ago (1 children)

All LLMs do is lower the bar for entry. Fake nudes were a thing long before LLMs; now, instead of needing photo or video editing skills, you can just ask a model to do it for you.

And the majority of people don't care about privacy until it affects them personally; they put their whole lives online for anyone to see and scrape. There is no stopping LLMs now; the time for that was 10+ years ago, but everyone dismissed the people sounding the alarm as privacy nuts or conspiracy theorists...

[–] andrewta@lemmy.world 1 points 11 months ago (2 children)
[–] BrikoX@lemmy.zip 2 points 11 months ago

https://en.wikipedia.org/wiki/Large_language_model

The way "AI" is used lately misrepresents what that truly means. It became a useless buzz word marketing teams are using.

[–] Adequately_Insane@lemmy.world 1 points 11 months ago

Lewd Large Language Model

[–] canis_majoris@lemmy.ca 1 points 11 months ago

Deepfakes for porn are not the problem.

Deepfakes of media, and the propaganda they enable, are the real problem.

Does it suck that a person can have their clothed photos turned into porn? Sure, but it operates on a far smaller scale than the mass creation of propaganda being done with LLMs. In comparison, deepfaked nudes are practically a non-issue.