this post was submitted on 23 Mar 2025
767 points (97.8% liked)
you are viewing a single comment's thread
Humans hallucinate. These things extrapolate tokens statistically. On average, the tone of his requests would have been likely to lead to some murder story.
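A minimal sketch of what "extrapolating tokens statistically" means, with an invented toy vocabulary and made-up probabilities (not any real model's numbers): each next word is sampled from a distribution conditioned on the recent context, so a plausible-sounding but entirely fabricated continuation can win.

```python
import random

# Toy next-token distributions (invented numbers, for illustration only).
# Given the last two tokens of context, assign probabilities to continuations.
next_token_probs = {
    ("he", "was"): {"convicted": 0.4, "acquitted": 0.1, "famous": 0.5},
    ("was", "convicted"): {"of": 1.0},
    ("convicted", "of"): {"murder": 0.6, "fraud": 0.4},
}

def extrapolate(context, steps, rng):
    """Continue the context by sampling each next token from the
    distribution conditioned on the last two tokens."""
    tokens = list(context)
    for _ in range(steps):
        dist = next_token_probs.get(tuple(tokens[-2:]))
        if dist is None:
            break  # no learned continuation for this context
        words, probs = zip(*dist.items())
        tokens.append(rng.choices(words, probs)[0])
    return tokens

rng = random.Random(1)
print(extrapolate(["he", "was"], 3, rng))
```

Nothing in the sampling loop checks whether the continuation is true; it only checks that it is statistically likely given the context, which is the whole point of the comment above.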
Nothing is wrong with the tech (except that it doesn't seem very useful once you firmly understand what it can't do), but everything is wrong with that tech being called artificial intelligence.
It's almost like calling a polygraph a "lie detector".
I replied to the following statement:
I countered this dismissal by quoting the article, which explains that it was more than just a coincidental name mix-up.
Your response is not really relevant to mine, unless you are assuming I'm arguing for one side or the other. I'm just informing someone who dismissed the article's headline with an explanation that demonstrated they didn't bother to read the article.
If the owners of the technology call it artificial intelligence and hype or sell it as a potential replacement for intelligent human decision-making, then it should absolutely be judged on those grounds.
I know; I hate the fact that we've settled on the word "hallucinate". It anthropomorphizes something that absolutely isn't intelligent.
It's not capable of thinking a particular piece of information is true when it isn't, because it isn't capable of thinking about information in general.
Well, one of its features is remembering details you told it previously. I am surprised it went into child-murder territory; in my experience it will usually avoid such topics unless prompted carefully. My initial suspicion is that he prompted it into saying these things, but to be fair, these things have gone off the rails before. Either way, most people have little or no understanding of what these things are and how they work.
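For what "remembering details" typically amounts to, here is a hedged sketch of an assumed design (the class name `ChatMemory` and the prompt layout are hypothetical, not ChatGPT's actual implementation): saved details are simply prepended to every prompt, so they steer whatever the model statistically generates next.

```python
# Hypothetical memory store: saved user details are injected into every
# prompt, so they influence whatever the model continues with.
class ChatMemory:
    def __init__(self):
        self.facts = []

    def remember(self, fact):
        """Store a detail the user mentioned in an earlier conversation."""
        self.facts.append(fact)

    def build_prompt(self, user_message):
        """Prepend all remembered facts to the new message."""
        header = "\n".join(f"- {f}" for f in self.facts)
        return (
            "Known details about the user:\n"
            + header
            + "\n\nUser: " + user_message + "\nAssistant:"
        )

memory = ChatMemory()
memory.remember("has three children")
prompt = memory.build_prompt("Tell me about someone named Arve.")
print(prompt)
```

Under a design like this, a remembered detail such as a number of children can leak into an unrelated continuation, which would be one mundane route to the kind of story described in the article.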
It's a function approximate...er!
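"Function approximator" is the accurate framing: a model is a set of tunable parameters nudged until it imitates an input-to-output mapping seen in training data. A toy sketch with a single weight fitted by gradient descent (the target f(x) = 2x is an arbitrary example):

```python
# Toy function approximation: learn w so that w * x imitates f(x) = 2x.
data = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0    # the single "parameter" of our one-weight model
lr = 0.01  # learning rate
for _ in range(200):
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # d/dw of the squared error (w*x - y)^2
        w -= lr * grad

print(round(w, 3))
```

The model never "knows" anything about the function it imitates; it just ends up with parameters that reproduce the training mapping, which is the distinction the comments above are drawing.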
I have that understanding, and this is horrible, extreme malpractice, not an inherent feature.