this post was submitted on 21 Oct 2024
478 points (98.2% liked)

Facepalm

[–] Muffi@programming.dev 44 points 20 hours ago (3 children)

I was having lunch at a restaurant a couple of months back, and overheard two women (~55 y/o) sitting behind me. One of them talked about how she used ChatGPT to decide if her partner was being unreasonable. I think this is only gonna get more normal.

[–] GreenKnight23@lemmy.world 1 points 41 minutes ago

I would rather get it from an LLM than some dumb shit magazine quiz, and I fucking hate LLMs.

[–] Wolf314159@startrek.website 36 points 16 hours ago (1 children)

A decade ago she would have been seeking that validation from her friends. ChatGPT is just a validation machine, like an emotional vibrator.

[–] Trainguyrom@reddthat.com 8 points 11 hours ago

The difference between asking a trusted friend for advice vs asking ChatGPT or even just Reddit is that a trusted friend will have more historical context. They have probably met or at least interacted with the person in question, and they can bring in the context of how this person previously made you feel. They can help you figure out whether you're just at a low point or whether it's truly a bad situation to get out of.

Asking ChatGPT or Reddit is really like asking a Magic 8 Ball. Framing the question, and simply asking it at all, helps you interrogate your feelings and form new opinions about the situation, but the answers are pretty useless since there's no historical context to base them on, and the answers are only as good as the question asked.

[–] orcrist@lemm.ee 1 points 11 hours ago

I don't think anyone who thinks it through would bother asking ChatGPT, unless they didn't have any friends, because it's quite obvious that relationship advice is delicate and you certainly want the advice giver to know something about your situation. You know, like your friends do, and computers don't.

We don't even have to look at the low quality advice, because there's no way it would be informed advice.