[–] T156@lemmy.world 10 points 5 days ago (1 children)

Conversely, while the research is good in theory, the data isn't that reliable.

The subreddit has rules requiring users to engage with everything as though it were written by real people acting in good faith. Users aren't likely to point out a bot when the rules explicitly prevent them from doing that.

There wasn't a good control either. The researchers were comparing themselves to the bots, so it could easily be that they themselves were less convincing, since they were acting outside their area of expertise.

And that's even before the whole ethical mess that is experimenting on people without their consent. Post-hoc consent is not informed consent, and that is the crux of human experimentation.

[–] thanksforallthefish 2 points 4 days ago (1 children)

> Users aren't likely to point out a bot when the rules explicitly prevent them from doing that.

In fact, one user commented that his comment calling out one of the bots as a bot was deleted by the mods for breaking that rule.

[–] FriendBesto@lemmy.ml 1 points 2 days ago

The point there is clear: even the mods helped the bots manipulate people toward a cause or position. That proves the study's point even more, in practice and in the real world.

Imagine the experiment had been allowed to keep running secretly; it would have changed users' minds, since the study claims the bots were 3 to 6 times better at manipulating people than humans across different metrics.

Given that Reddit is a bunch of hive minds, it obviously would have made huge dents, since mods have a tendency to delete or ban anyone who rejects the groupthink. So the mods are also part of the problem.