this post was submitted on 07 May 2025
685 points (100.0% liked)

TechTakes

[–] Flipper@feddit.org -5 points 1 day ago (2 children)

Apparently it's useful for extracting information from a text into a format you specify. A friend is using it to extract transactions out of 500-year-old texts. However, to get rid of hallucinations the temperature needs to be 0, so the only way is to self-host.

[–] OhNoMoreLemmy@lemmy.ml 11 points 1 day ago

Setting the temperature to 0 doesn't get rid of hallucinations.

It might slightly increase accuracy, but it's still going to go wrong.
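A minimal sketch of why that is (the logit values are made up for illustration): temperature 0 just collapses sampling into a greedy argmax over the model's logits. That makes the output deterministic, but if the model assigns its highest score to a wrong token, you get the same wrong token every time.

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw logits at a given temperature."""
    if temperature == 0:
        # Greedy decoding: always take the highest-scoring token.
        # Deterministic, but a confidently wrong model still
        # "hallucinates" -- just reproducibly.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Otherwise: softmax with temperature, then weighted sampling.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return random.choices(range(len(logits)), weights=weights)[0]

# Toy logits for three candidate tokens; index 1 scores highest.
logits = [1.0, 3.0, 2.0]
print(sample_token(logits, 0))    # always 1 (greedy)
print(sample_token(logits, 1.0))  # 0, 1, or 2, weighted by softmax
```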

[–] daellat@lemmy.world 8 points 1 day ago

Well, LLMs are capable (but prone to hallucination) and cost an absolute fuckton of energy. There have been purpose-trained, efficient ML models that we've used for years. Document understanding and computer vision are great; just don't use an LLM for them.