this post was submitted on 11 May 2025
710 points (92.7% liked)

Fuck AI

2780 readers
1037 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago

cross-posted from: https://pawb.social/post/24295950

Source (Bluesky)

[–] RandomVideos@programming.dev 1 points 4 days ago (1 children)

Had the big ones not stolen their training data, and were they not being used to leverage corporate goals over humans, they could be a very useful thing

AI still has the problems of spam (propaganda being the most dangerous variant of it), disinformation, and impersonating real artists. These could be fixed if every AI image/video had a watermark, but I don't think that could be enforced well enough to completely eliminate these issues

[–] southsamurai@sh.itjust.works 1 points 4 days ago (1 children)

Those specific flaws come down to the same issue, though. The training data was flawed enough, in large part due to being stolen wholesale, that it skews the matter towards counterfeits being easier. I would agree that in the absence of legislation, no for-profit business based on AI will ever tag its output. It could be an easier task for non-profit and/or open-source models, though. Definitely something that needs addressing.

I'm not sure what you mean by spam being a direct problem of AI. Are you saying that it's easier to generate propaganda, and thus allow it to be spammed?

As near as I can tell, the propaganda farms were doing quite well spreading misinformation and disinformation before AI. Spamming it too, when that was useful to their goals.

[–] RandomVideos@programming.dev 1 points 4 days ago* (last edited 4 days ago)

As far as I know, the Twitter AI tags its images

Propaganda is more of a problem with text generation than image generation, but both can be used to change people's opinions much more easily than before