SnotFlickerman@lemmy.blahaj.zone 41 points 5 months ago

To be fair, they call it a hallucination because hallucinations don't have intent behind them.

LLMs don't have any intent. Period.

A purposeful lie requires an intent to deceive.

Without any intent, it's not a lie.

I agree that "fabrication" is probably a better word for it, especially because it evokes the industrial computing processes required to produce these outputs. It lets the word work as a double entendre: the output has been fabricated by industrial processes, and it is a fabrication in the sense of a falsehood made up from nothing.

dohpaz42@lemmy.world 4 points 5 months ago

I looked up an article about it that said basically the same thing, and while I get that "lie" implies malicious intent, I agree with you that "fabrication" is a better term than "hallucination."

zout@fedia.io 4 points 5 months ago

LLMs may not have any intent, but companies do. In this case, Google decided to present the AI answer on top of the regular search results, knowing that the AI can make stuff up. Maybe the AI isn't lying, but Google definitely is. Even with the "everything is experimental, learn more" line: if they really wanted you to learn more, they'd give you the information up front instead of making you click through for it.

SnotFlickerman@lemmy.blahaj.zone 2 points 5 months ago

In other words, I agree with your assessment here. The petty, abject attempts by all these companies to produce the world's first real "Jarvis" are all a case of "they didn't stop to think if they should."

zout@fedia.io 4 points 5 months ago

My actual opinion is that they don't want to think about whether they should, because they already know the answer. The pressure to go public with a shitty model outweighs their responsibility to the people relying on the search results.

SnotFlickerman@lemmy.blahaj.zone 6 points 5 months ago

It is difficult to get a man to understand something when his salary depends on his not understanding it.

-Upton Sinclair

Sadly, same as it ever was. You are correct: they already know the answer, so they don't want to consider the question.

dohpaz42@lemmy.world 2 points 5 months ago

There's also the argument that "if we don't do it, somebody else will," and I kind of understand that reasoning, even though I disagree with it.

marcos@lemmy.world 1 point 5 months ago

Oh, they absolutely should. A "Jarvis" would be great.

But that thing they are pushing has absolutely no relation to a "Jarvis".