[–] otp@sh.itjust.works 83 points 8 months ago* (last edited 8 months ago) (12 children)

The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.

If we allow people to use this tech for adults (which we really shouldn't), then we have to accept that people will use the same tech on minors. It isn't even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it's still something that very obviously shouldn't be happening.

* we don't need to get into semantics. I'm just saying it's not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.

Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.

The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them """legitimately""" would want to use it...or to just ban them outright.

[–] micka190@lemmy.world 20 points 8 months ago (3 children)

such as when the person making them is also a minor

I get the point you're trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it's still possession.

[–] BrianTheeBiscuiteer@lemmy.world 56 points 8 months ago (1 children)

And that's still a bit messed up. It's a felony for a teen to have nude pictures of themselves, and they'll be registered as sex offenders for life and probably ineligible for most professions. Seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a "friend" to pedophiles.

[–] gravitas_deficiency@sh.itjust.works 9 points 8 months ago (2 children)

It does seem a bit heavy handed when the context is just two high schoolers tryna smash.

[–] micka190@lemmy.world 2 points 8 months ago (1 children)

The issue is that the picture then exists, and it's hard to prove it was actually destroyed.

For example, when I was in high school, a bunch of girls would send nudes to guys. But that was 10 years ago. Those pictures still exist. Those dudes aren't minors anymore. Their Messenger chats probably still exist somewhere. Nothing's really preventing them from looking at those pictures again.

I get why it's illegal. And, honestly, I find it kind of weird that there are people trying to justify why it shouldn't be illegal. You're still allowed to have sex at that age. Just don't take pictures/videos of it.

[–] BrianTheeBiscuiteer@lemmy.world 2 points 7 months ago

That makes complete sense, except that stuff just does not register with teens. If a couple of months in juvenile hall and 100 hours of community service isn't enough of a deterrent for a teenager, then 5 years in jail and a lifelong label of "sex offender" won't deter them either. I recall seeing a picture of a classmate topless (under 18), and over 20 years later it finally dawned on me that it was child pornography.

If we prosecuted every offender to the full extent of the law, then like half of every high school class would be in jail. Not to say that something should be legal as long as enough people are breaking the law, but if millions of kids are violating some of the strictest laws in the country, we're probably not getting the full picture.

[–] Zorque@kbin.social 22 points 8 months ago

Which is more of a "zero-tolerance" policy, akin to giving the same punishment to a student defending themselves as to the person who initiated the attack.

[–] otp@sh.itjust.works 10 points 8 months ago (2 children)

I get the point you're trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it's still possession.

I agree, and on the one hand, I understand why it could be good to consider it illegal (to prevent child porn from existing), but it does also seem silly to treat it as a case of pedophilia.

[–] ristoril_zip@lemmy.zip 48 points 8 months ago (10 children)

This genie is probably impossible to get back in the bottle.

People are just going to direct the imitative, so-called AI program to make the face just different enough to have plausible deniability that it's a fake of this person or that person. Or use existing tech to age them to 18+ (or 30+ or whatever). Or darken or lighten their skin or change their eye or hair color. Or add tattoos or piercings or scars...

I'm not saying we should be happy about it, but it is here and I don't think it's going anywhere. Like, if you tell your so-called AI to give you a completely fictional nude image or animation of someone who looks similar to Taylor Swift but isn't Taylor Swift, what's the privacy (or other) violation, exactly?

Does Taylor Swift own every likeness that looks somewhat like hers?

[–] PM_Your_Nudes_Please@lemmy.world 19 points 8 months ago

It’s also not a new thing. It’s just suddenly much easier for the layman to do. Previously, you needed some really good photoshop skills to pull it off. But you could make fake nudes if you really wanted to, and were willing to put in the time and effort.

[–] FlyingSquid@lemmy.world 3 points 8 months ago (1 children)

This does give prosecutors a new angle though. So it's not for nothing.

[–] Daft_ish@lemmy.world 40 points 8 months ago* (last edited 8 months ago) (5 children)

This is probably not the best context, but I find it crazy how fast the government will get involved if it involves lude content, while children are getting murdered in school shootings and gun control is just a bridge too far.

[–] pro_grammer@programming.dev 11 points 8 months ago* (last edited 8 months ago)

I think they act faster on those matters because, aside from it being a very serious problem, it also fits their conservative agenda.

It's very easy to say: "LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!"

But they can't just go and say "let's enforce gun safety in schools", because the moment a conservative voter reads "gun safety", things already go badly for them.

They know they are sacrificing the well-being of children by not acting on school shootings, but for them it's just the price of a few lives to stay in power.

[–] Psythik@lemmy.world 5 points 8 months ago* (last edited 8 months ago) (2 children)

Are quaaludes even still available in 2024?

Or did you mean to say "lewd"?

[–] Grandwolf319@sh.itjust.works 2 points 8 months ago

Those shootings don’t happen in private schools.

Nudes happen in private schools.

[–] EatATaco@lemm.ee 2 points 8 months ago

There are no competing interests when it comes to protecting children from child sexual exploitation. When it comes to protecting them from guns, there is the competing interest of the Second Amendment.

[–] themeatbridge@lemmy.world 31 points 8 months ago (2 children)

No reason not to ban them entirely.

The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.

[–] 520@kbin.social 21 points 8 months ago (3 children)

Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?

The problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.

[–] themeatbridge@lemmy.world 6 points 8 months ago

Right, this is my point. The toothpaste is out of the tube. So would simply having the software capable of making deepfake porn be a crime?

[–] mynamesnotrick@lemmy.zip 11 points 8 months ago (2 children)

I feel like a sensible, realistic course of action is for this to hinge on the act of sharing/distributing. It would be way too broad otherwise, as the tools that generate this stuff have unlimited purposes. Obvious child situations should be dealt with at the point of production, but the enforcement mechanism needs to be on the sharing/distribution side. Unfortunately, the analogy here is "blame the person, not the tool."

[–] catloaf@lemm.ee 6 points 8 months ago

Right. And honestly, this should already be covered under existing harassment laws.

[–] NutWrench@lemmy.world 23 points 8 months ago (1 children)

"We're gonna ban Internet stuff" is something said by people who have no idea how the Internet works.

[–] pro_grammer@programming.dev 9 points 8 months ago* (last edited 8 months ago)

They probably do this to satisfy voters who also don't know how the internet works.

[–] TheFriar@lemm.ee 17 points 8 months ago (2 children)

That title is…misleading. Why start it that way?

[–] sugar_in_your_tea@sh.itjust.works 11 points 8 months ago (1 children)
[–] Son_of_dad@lemmy.world 7 points 8 months ago

Hey we said no deepfakes!

[–] FenrirIII@lemmy.world 5 points 8 months ago

It does sound off. But, then again, these are politicians, so it could go either way.

[–] WhyDoYouPersist@lemmy.world 11 points 8 months ago (2 children)

For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.

[–] Imgonnatrythis@sh.itjust.works 3 points 8 months ago (1 children)

Won't somebody please think of Taylor?!

[–] GreyEyedGhost@lemmy.ca 3 points 8 months ago

But not that way...

[–] Buelldozer@lemmy.today 8 points 8 months ago
[–] boatsnhos931@lemmy.world 3 points 8 months ago

My digital bobs and vagene is very special I'll have you know

[–] hexdream@lemmy.world 2 points 8 months ago

And no chance it's because they want to, uh, thoroughly investigate the evidence....
