this post was submitted on 28 Jun 2025
881 points (94.6% liked)

Technology


We are constantly fed a version of AI that looks, sounds and acts suspiciously like us. It speaks in polished sentences, mimics emotions, expresses curiosity, claims to feel compassion, even dabbles in what it calls creativity.

But what we call AI today is nothing more than a statistical machine: a digital parrot regurgitating patterns mined from oceans of human data (the situation hasn’t changed much since it was discussed here five years ago). When it writes an answer to a question, it literally just guesses which letter and word will come next in a sequence – based on the data it’s been trained on.
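
A toy sketch of that statistical guessing (my own illustration, not the article's; a real LLM predicts subword tokens with a neural network rather than raw word counts): a bigram model that picks the next word purely from how often it followed the previous word in its training text.

```python
# A toy bigram "language model": it guesses the next word purely from
# frequency counts in its training text, with no notion of meaning.
import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish and the dog".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    counts = follows[prev]
    if not counts:  # unseen context: fall back to any word from the corpus
        return random.choice(corpus)
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights, k=1)[0]

# Generate a short continuation, one guessed word at a time.
sentence = ["the"]
for _ in range(6):
    sentence.append(next_word(sentence[-1]))
print(" ".join(sentence))  # e.g. "the cat sat on the mat the"
```

Nothing in the sketch "knows" what a cat or a mat is; it only reproduces statistics of the text it was fed, which is the article's point about pattern regurgitation.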

This means AI has no understanding. No consciousness. No knowledge in any real, human sense. Just pure probability-driven, engineered brilliance — nothing more, and nothing less.

So why is a real “thinking” AI likely impossible? Because it’s bodiless. It has no senses, no flesh, no nerves, no pain, no pleasure. It doesn’t hunger, desire or fear. And because there is no cognition — not a shred — there’s a fundamental gap between the data it consumes (data born out of human feelings and experience) and what it can do with them.

Philosopher David Chalmers calls the mysterious mechanism underlying the relationship between our physical body and consciousness the “hard problem of consciousness”. Eminent scientists have recently hypothesised that consciousness actually emerges from the integration of internal, mental states with sensory representations (such as changes in heart rate, sweating and much more).

Given the paramount importance of the human senses and emotion for consciousness to “happen”, there is a profound and probably irreconcilable disconnect between general AI, the machine, and consciousness, a human phenomenon.

https://archive.ph/Fapar

(page 4) 50 comments
[–] Geodad@lemmy.world 32 points 1 day ago (10 children)

I've never been fooled by their claims of it being intelligent.

It's basically an overly complicated series of if/then statements that try to guess the next series of inputs.

[–] Flagstaff@programming.dev 11 points 1 day ago (3 children)

ChatGPT 2 was literally an Excel spreadsheet.

I guesstimate that it's effectively a supermassive autocomplete algo that uses some TOTP-like factor to help it produce "unique" output every time.

And they're running into issues due to increasingly ingesting AI-generated data.

Get your popcorn out! 🍿

[–] aesthelete@lemmy.world 25 points 1 day ago* (last edited 1 day ago) (2 children)

I really hate the current AI bubble, but that "ChatGPT 2 was literally an Excel spreadsheet" claim isn't what the article you linked actually says.

[–] some_guy@lemmy.sdf.org 20 points 1 day ago (2 children)

People who don't like "AI" should check out the newsletter and/or podcast of Ed Zitron. He goes hard on the topic.

[–] RalphWolf@lemmy.world 24 points 1 day ago (7 children)

Steve Gibson on his podcast, Security Now!, recently suggested that we should call it "Simulated Intelligence". I tend to agree.

[–] pyre@lemmy.world 8 points 1 day ago (1 children)

Reminds me of Mass Effect's VI, "virtual intelligence": a system that's specifically designed not to be truly intelligent, as AI systems are banned throughout the galaxy for their potential to go rogue.

[–] Repelle@lemmy.world 6 points 1 day ago

Same, I tend to think of LLMs as a very primitive version of that, or of the Enterprise's computer, which is pretty magical in ability but which no one claims is actually intelligent.

[–] goondaba@lemmy.world 7 points 1 day ago (1 children)

I’ve taken to calling it Automated Inference

[–] modifier@lemmy.ca 6 points 1 day ago

Pseudo-intelligence

[–] confuser@lemmy.zip 8 points 1 day ago* (last edited 1 day ago) (1 children)

The thing is, AI is a compression of intelligence, but not intelligence itself. That's the part that confuses people. AI is the ability to put anything describable into a compressed zip.

[–] elrik@lemmy.world 5 points 1 day ago (2 children)

I think you meant compression. This is exactly how I prefer to describe it, except I also mention lossy compression for those who would understand what that means.

[–] confuser@lemmy.zip 4 points 1 day ago

Lol woops I guess autocorrect got me with the compassion
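
A minimal illustration of that lossy-compression framing (my own toy example, not anything from the thread): quantizing values shrinks the data, and decompressing recovers only an approximation of the original, much as a model reproduces the gist of its training data rather than the data itself.

```python
# Toy lossy compression: store coarse 8-bit approximations of float samples.
# The round trip recovers something close to, but not exactly, the original.
original = [0.113, 0.857, 0.402, 0.991, 0.250]

# "Compress": map each value in [0, 1] to an integer 0..255.
compressed = [round(x * 255) for x in original]

# "Decompress": map back to floats; the fine detail is gone for good.
recovered = [c / 255 for c in compressed]

for before, after in zip(original, recovered):
    print(f"{before:.3f} -> {after:.3f} (error {abs(before - after):.4f})")
```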

[–] Nomad@infosec.pub 7 points 1 day ago (1 children)

I think most people tend to overlook the most obvious advantages and are overly focused on what it is supposed to be and marketed as.

No need to think about how to feed a thing into Google to get a decent starting point for reading. No finding the correct terminology before finding the thing you are looking for. Just ask the way you would ask a knowledgeable individual, and you get an overview of what you wanted to know in the first place.

Discuss a little to get the options and then start reading and researching the everliving shit out of them to confirm all the details.

[–] grabyourmotherskeys@lemmy.world 9 points 1 day ago (1 children)

Agreed.

When I was a kid, we went to the library. If the card catalog didn't yield the book you needed, you asked the librarian. They often helped. No one sat around after leaving the library wondering if the librarian was "truly intelligent".

These are tools. Tools slowly get better. If a tool makes life easier or your work better, you'll eventually use it.

Yes, there are woodworkers who eschew power tools, but they are not typical. They have a niche market, and that's great, but it's a choice for the maker and the user of their work.

[–] Angelusz@lemmy.world 6 points 1 day ago (2 children)

Super duper shortsighted article.

I mean, sure, some points are valid. But it's not just programmers involved; other professions are too: psychologists, philosophers, artists, doctors, etc.

And I agree AGI probably won't emerge from binary systems. However... quantum computing is on the rise, and the latest theories of the mind and consciousness discuss how consciousness and our minds in general also appear to work with quantum states.

Finally, if biofeedback were the deciding factor... that can be simulated, modeled after a sample of humans.

The article is just doomsday hoo ha, unbalanced.

Show both sides of the coin...
