This post was submitted on 03 Sep 2023
6 points (100.0% liked)

top 4 comments
[–] huginn@feddit.it 0 points 1 year ago (2 children)

As a programmer: most people vastly overestimate the efficacy of large language models.

CEOs seem to overestimate them even more than everyone else.

A lot of AI researchers think LLMs are a dead end (see: Timnit Gebru) because, by their structure, they cannot understand truth.

The "hallucinations" are intrinsic to the structure and the best minds are saying there's no way around that.

We might be able to kludge together filters over it, but at some point that's just hard-coding the world anyway, which is what LLMs are supposed to avoid.
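
A minimal sketch of what that kind of filter might look like, assuming a made-up llm_generate() stand-in and a hand-maintained KNOWN_FACTS table (both hypothetical, just to illustrate the point):

```python
# Hypothetical "filter over the model" sketch: a hand-written fact table
# bolted onto an LLM's output. llm_generate() is a stand-in, not a real API.

KNOWN_FACTS = {
    # Every claim the filter can check has to be hard-coded by hand.
    "the capital of France": "Paris",
    "the boiling point of water at sea level": "100 C",
}

def llm_generate(prompt: str) -> str:
    """Stand-in for a real model call; may confidently return a wrong answer."""
    return "The capital of France is Lyon."

def filtered_answer(prompt: str, topic: str) -> str:
    answer = llm_generate(prompt)
    expected = KNOWN_FACTS.get(topic)
    # If the hand-coded fact isn't present in the model's output, refuse it.
    if expected is not None and expected not in answer:
        return "No reliable answer."
    return answer

if __name__ == "__main__":
    print(filtered_answer("What is the capital of France?", "the capital of France"))
    # -> "No reliable answer."
```

The catch shows up immediately: the filter only catches what you've already hand-coded, so scaling it up amounts to hard-coding the world.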

[–] secrethat@kbin.social 1 point 1 year ago

As a data scientist: people seem to attribute anything computer-related that they don't understand to AI, or worse, to ChatGPT. Shudder

[–] sj_zero@lotide.fbxl.net 1 point 1 year ago (1 child)

I've been using ChatGPT a lot, and it's really clear to me that it has many uses, but it's almost more like asking your buddy who knows a lot but is also full of shit -- sometimes he tells you exactly what you need, sometimes he sends you on a wild goose chase with all kinds of false leads.

In the end you still need your own competence, because a human has to be able to make the final call about whether to listen or not.

[–] huginn@feddit.it 1 point 1 year ago

My EM suggested an integration using an SDK that doesn't exist.

He was very insistent that we just hadn't read the docs.

Then it came out that ChatGPT had suggested it.