this post was submitted on 17 Mar 2025
575 points (96.9% liked)


Half of LLM users (49%) think the models they use are smarter than they are, including 26% who think their LLMs are “a lot smarter.” Another 18% think LLMs are as smart as they are. Here are some of the other attributes they see:

  • Confident: 57% say the main LLM they use seems to act in a confident way.
  • Reasoning: 39% say the main LLM they use shows the capacity to think and reason at least some of the time.
  • Sense of humor: 32% say their main LLM seems to have a sense of humor.
  • Morals: 25% say their main model acts like it makes moral judgments about right and wrong at least sometimes.
  • Sarcasm: 17% say their prime LLM seems to respond sarcastically.
  • Sad: 11% say the main model they use seems to express sadness, while 24% say that model also expresses hope.
(page 4)

50 comments
[–] interested_party@lemmy.org 2 points 2 days ago

It's probably true too.

[–] Schadrach@lemmy.sdf.org 1 points 1 day ago

An LLM is roughly as smart as its corpus is accurate for the topic at hand, because at their best these models are good natural-language summarizers. Most of the main ones basically run a web search and summarize the top couple of results, which means they are only as good as the search engine backing them. That's good enough for a lot of topics, but...not so much for the rest.
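
A minimal sketch of that search-then-summarize pattern, using hypothetical web_search and llm_summarize stand-ins rather than any real vendor's API:

```python
# Hypothetical helpers: neither function is a real library call; both stand in
# for whatever search backend and LLM backend a product actually uses.

def web_search(query: str, top_k: int = 3) -> list[str]:
    """Placeholder: return the text of the top_k search results for `query`."""
    raise NotImplementedError("plug in a search backend here")

def llm_summarize(prompt: str) -> str:
    """Placeholder: send `prompt` to an LLM and return its completion."""
    raise NotImplementedError("plug in an LLM backend here")

def answer_with_search(question: str) -> str:
    # 1. Retrieve: the quality of the final answer is bounded by these results.
    snippets = web_search(question, top_k=3)
    context = "\n\n".join(snippets)
    # 2. Summarize: the model largely restates what the search engine returned.
    prompt = (
        "Using only the sources below, answer the question.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )
    return llm_summarize(prompt)
```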

[–] Montreal_Metro@lemmy.ca 7 points 2 days ago

There are a lot of ignorant people out there, so yeah, technically an LLM is smarter than most people.

[–] Dindonmasker@sh.itjust.works 10 points 3 days ago (1 children)

I don't think a single human exists who knows as much as ChatGPT does. Does that mean ChatGPT is smarter than everyone? No, obviously not, based on what we've seen so far. But the amount of information available to these LLMs is incredible and can be very useful. Like a library: it contains a lot of useful information but isn't intelligent itself.

[–] kameecoding@lemmy.world 4 points 2 days ago

That's pretty weak reasoning. By your own words, it isn't intelligent; it doesn't know anything.

By that logic Wikipedia is also smarter than any human because it holds a lot of knowledge.

[–] AbnormalHumanBeing@lemmy.abnormalbeings.space 11 points 3 days ago (2 children)

I wouldn't be surprised if that were true outside the US as well. People who actually (have to) work with the stuff usually learn quickly that it's only good at a few things, but if you just hear about it in the (pop-, non-techie-) media (including YT and such), you might be deceived into thinking Skynet is just a few years away.

[–] singletona@lemmy.world 5 points 2 days ago

It's a one trick pony.

That trick also happens to be a really neat one, which can make people think it's a Swiss Army knife instead of a shovel.

[–] avidamoeba@lemmy.ca 8 points 2 days ago* (last edited 2 days ago)

Just a thought, perhaps instead of considering the mental and educational state of the people without power to significantly affect this state, we should focus on the people who have power.

For example, why don't LLM providers explicitly and loudly state, or require acknowledgement, that their products are just imitating human thought and make significant mistakes regularly, and therefore should be used with plenty of caution?

It's a rhetorical question; we know why, and I think we should focus on that, not on its effects. It's also much cheaper and easier to do than refilling years of quality education in individuals' heads.

[–] transMexicanCRTcowfart@lemmy.world 8 points 3 days ago* (last edited 3 days ago)

Aside from the unfortunate name of the university, I think that part of why LLMs may be perceived as smart or 'smarter' is that they are very articulate and, unless prompted otherwise, use proper spelling and grammar, and tend to structure their sentences logically.

Which 'smart' humans may not do, out of haste or contextual adaptation.

[–] EncryptKeeper@lemmy.world 6 points 2 days ago* (last edited 2 days ago)

The funny thing about this scenario is that by simply thinking it's true, it actually becomes true.

[–] Fubarberry@sopuli.xyz 7 points 3 days ago* (last edited 3 days ago)

I wasn't sure from the title if it was "Nearly half of U.S. adults believe LLMs are smarter than [the US adults] are." or "Nearly half of U.S. adults believe LLMs are smarter than [the LLMs actually] are." It's the former, although you could probably argue the latter is true too.

Either way, I'm not surprised that people rate LLMs' intelligence highly. They obviously have limited scope in what they can do, and hallucinating false info is a serious issue, but you can ask them a lot of questions that your typical person couldn't answer and get a decent answer. I feel like they're generally good at meeting people's expectations of a "smart person", even if they have major shortcomings in other areas.

[–] MITM0@lemmy.world 2 points 2 days ago

Why are you even surprised at this point, when it comes to Americans?

[–] TheObviousSolution@lemm.ee -2 points 1 day ago* (last edited 1 day ago) (2 children)

They are. Unless you can translate what I'm saying into any language I tell you to on the fly, I'm going to assume that anyone who tells me they are smarter than LLMs is lower on the spectrum than usual. Wikipedia and a lot of libraries are also more knowledgeable than me, who knew. If I am grateful for one thing, it is that I am not one of those people whose ego has to be jizzing everywhere, including their perception of things.

[–] cmhe@lemmy.world 1 points 2 days ago

I suppose some of that comes down to the personal understanding of what "smart" is.

I guess you could call a person who doesn't understand a topic, but still manages to sound reasonable when talking about it, and might even convince people that they actually have a deep understanding of it, "smart", in a kind of "smart impostor" way.

[–] Ledericas@lemm.ee 0 points 1 day ago (1 children)

Only boomers and tech-unsavvy people think that.

[–] 1984@lemmy.today 4 points 2 days ago (1 children)

An LLM simply has memorized facts. If that is smart, then sure, no human can compete.

Now ask an LLM to build a house. Oh shit, no legs, and it can't walk. A human can walk without even thinking about it.

In the future, though, there will be robots that can build houses using AI models to learn from. But not for a long time.

[–] Omgpwnies@lemmy.world 3 points 2 days ago (1 children)

3D-printed concrete houses are already a thing; there's no need for human-like machines to build stuff. They can be purpose-built to perform whatever portion of the house-building task they need to do. There's absolutely no barrier today to having a hive of machines built for specific purposes build houses, besides the fact that no one has yet stitched the necessary components together.

It's not at all out of the question that an AI could be trained on a dataset of engineering diagrams, house layouts, materials, and construction methods, with subordinate AIs trained on specific aspects of housing systems like insulation, roofing, plumbing, framing, electrical, etc., which are then used to drive the actual machines building the house. The principal human requirement at that point would be engineers to check the math and sign off on a design for safety purposes.

[–] echodot@feddit.uk 5 points 3 days ago

Maybe if adults didn't actually use LLMs so much, this wouldn't be the case.
