this post was submitted on 12 Jun 2024
174 points (91.0% liked)

Technology


It's time to call a spade a spade. ChatGPT isn't just hallucinating. It's a bullshit machine.

From TFA (thanks @mxtiffanyleigh for sharing):

"Bullshit is 'any utterance produced where a speaker has indifference towards the truth of the utterance'. That explanation, in turn, is divided into two "species": hard bullshit, which occurs when there is an agenda to mislead, or soft bullshit, which is uttered without agenda.

"ChatGPT is at minimum a soft bullshitter or a bullshit machine, because if it is not an agent then it can neither hold any attitudes towards truth nor towards deceiving hearers about its (or, perhaps more properly, its users') agenda."

https://futurism.com/the-byte/researchers-ai-chatgpt-hallucinations-terminology

@technology #technology #chatGPT #LLM #LargeLanguageModels

[–] davel@lemmy.ml 21 points 5 months ago* (last edited 5 months ago) (4 children)

I think “hallucinating” and “bullshitting” are pretty much synonyms in the context of LLMs. And I think they’re both equally imperfect analogies for the exact same reasons. When we talk about hallucinators & bullshitters, we’re almost always talking about beings with consciousness/understanding/agency/intent (people usually, pets occasionally), but spicy autocompleters don’t really have those things.

But if calling them “bullshit machines” is more effective communication, that’s great—let’s go with that.

To say that they bullshit reminds me of On Bullshit, which distinguishes between lying and bullshitting: “The main difference between the two is intent and deception.” But again I think it’s a bit of a stretch to say LLMs have intent.

I might say that LLMs hallucinate/bullshit, and the rules & guard rails that developers build into & around them are attempts to mitigate the madness.

[–] heavyboots@lemmy.ml 4 points 5 months ago

I totally agree that both seem to imply intent, but IMHO hallucinating is something that seems to imply not only more agency than an LLM has, but also less culpability. Like, "Aw, it's sick and hallucinating, otherwise it would tell us the truth."

Whereas calling it a bullshit machine still implies more intentionality than an LLM is capable of, but at least skews the perception of that intention more in the direction of "It's making stuff up" which seems closer to the mechanisms behind an LLM to me.

I also love that the researchers actually took the time to not only provide the technical definition of bullshit, but also sub-categorized it too, lol.

[–] medicsofanarchy@lemmy.world 3 points 5 months ago (1 children)

I think for the sake of mixed company and delicate sensibilities we should refer to this as a "BM" rather than a "bullshit machine". Therefore it could be a LLM BM, or simply a BM.

[–] davel@lemmy.ml 0 points 5 months ago

Large Bowel Movement, got it.

[–] bastardsheep@aus.social 2 points 5 months ago

@davel Very well said. I'll continue to call it bullshit because I think that's still a closer and more accurate term than "hallucinate". But it's far from the perfect descriptor of what AI does, for the reasons you point out.

[–] theory@xoxo.zone 1 points 5 months ago

@davel @ajsadauskas I enjoy the bullshitting analogy, but regression to mediocrity seems most accurate to me. I think it makes sense to call them mediocrity machines. (h/t @ElleGray)