this post was submitted on 06 Apr 2024
78 points (100.0% liked)

Privacy


A community for Lemmy users interested in privacy


founded 1 year ago
top 7 comments
[–] noodlejetski@lemm.ee 16 points 7 months ago
[–] elshandra@lemmy.world 10 points 7 months ago

If my AI bot is as exaggerated, fake, and dense as so many YouTubers seem to be these days, I think it will find itself without communication components in very short order.

[–] FarraigePlaisteach@kbin.social 5 points 7 months ago (1 children)

If they’re scraping the web, and they’re generating AI content on the web, how do they avoid training their AI on its own nonsense somewhere?

[–] harrys_balzac@lemmy.dbzer0.com 2 points 7 months ago

They can't since most AI generated content isn't tagged in any meaningful way to keep it from being scraped.

[–] AceFuzzLord@lemm.ee 5 points 7 months ago

Can't wait for this to backfire when those same companies end up training on data created with their own AI. Can't wait for the most popular videos you see when you're not logged in to become AI-generated videos so bad they're literal nonsense, causing the algorithm to start punishing any videos that do make sense.

I have a feeling we're gonna go into a content death spiral on the platform, worse than anything we've ever seen before, if only because the largest channels will end up abusing it with reckless abandon.

[–] Immersive_Matthew@sh.itjust.works 1 point 7 months ago (1 children)

We already know they used all the public information on the Internet. How is this news? If AI is going to be any use, it needs to learn from somewhere.

[–] circuitfarmer@lemmy.sdf.org 3 points 7 months ago

People have been used to a lot of private services for a while now. YouTube is so ubiquitous it's almost like a utility: everyone has access to it, it's everywhere, and it has no real competitor.

But all of these social media services are private, so as much as they feel like public information utilities, once you're on one, your data isn't your own. I think that's the disconnect when people hear that "their data" has been used for AI training. It ceased to be their data as soon as it went on the platform, at least tacitly in the US.

There has traditionally been a public expectation of control that simply isn't there for any of these services. The industry knows this and capitalizes on it regularly. It's a key tenet of technofeudalism.