this post was submitted on 05 Jun 2025
950 points (98.8% liked)

Not The Onion


[–] deathbird@mander.xyz 7 points 22 hours ago (1 children)

Sue that therapist for malpractice! Wait....oh.

[–] jagged_circle@feddit.nl 3 points 20 hours ago (2 children)

Pretty sure you can sue the AI company

[–] Zacryon@feddit.org 100 points 1 day ago (14 children)

I feel like humanity is stupid. Over and over again we develop new technologies and make breakthroughs, and instead of calmly evaluating them and making sure they're safe, we just jump blindly on the bandwagon and adopt them for everything, everywhere. Just like with asbestos, plastics, and now LLMs.

Fucking idiots.

[–] iAvicenna@lemmy.world 49 points 1 day ago (1 children)

"adopt it for everything, everywhere."

The sole reason for this is people realizing they can make a quick buck off these hype balloons.

[–] dil@lemmy.zip 11 points 1 day ago (1 children)

They usually know it's bad but want to make money before the method is patched. Like cigarettes causing cancer and health issues, but that kid money was so good.

[–] WorldsDumbestMan@lemmy.today 1 points 16 hours ago (1 children)

Claude has simply been an amazing help in ways humans have not. Because humans are kind of dicks.

If it gets something wrong, I simply correct it and ask better.

[–] untakenusername@sh.itjust.works 31 points 1 day ago (22 children)

LLMs have a use case.

But they really shouldn't be used for therapy.

[–] TheDeadlySquid@lemm.ee 5 points 23 hours ago

And thus the flaw in AI is revealed.

[–] ExtremeDullard@lemmy.sdf.org 208 points 1 day ago (7 children)

Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.
