this post was submitted on 18 Jul 2024
802 points (99.5% liked)

Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To discover if people would be willing to pay extra for hardware with AI capabilities, the question was asked on the TechPowerUp forums.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn't know, while just under 2,000 voters said yes.

[–] EliteDragonX@lemmy.world 24 points 4 months ago (2 children)

This is yet another dent in the “exponential growth, AGI by 2028” argument I see popping up a lot. Despite what the likes of Kurzweil, Musk, etc. would have you believe, AI is severely overhyped and will take decades to fully materialise.

You have to understand that most of what you read about this is mostly, if not entirely, hype. AI, self-driving cars, LLMs, job automation, robots, etc. are buzzwords the media loves to invoke to generate clicks. But the reality is that all of this stuff is extremely hyped up, with not much substance behind it.

It’s no wonder that the vast majority of people hate AI. You only have to look at self-driving cars still being unable to handle fog and rain after decades of research, or at LLMs that remain dumb after all this time, to see why. The only things that have really progressed quickly since the 80s are cell phones, computers, and the like. Electric cars, self-driving cars, stem cells, AI, and so on have all progressed far more slowly. And even the electronics progress is slowing down as Moore’s Law reaches its end.

[–] cestvrai@lemm.ee 17 points 4 months ago* (last edited 4 months ago) (3 children)

There is more to AI than self driving cars and LLMs.

For example, I work at a company that trained a deep learning model to count potatoes in a field. The computer can count so much faster than we can, it’s incredible. There are many useful, but not so glamorous, applications for this sort of technology.
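For a sense of how simple the non-glamorous part can be: once a trained model has produced a binary mask of "potato" pixels, counting is just connected-component labeling. A minimal sketch (the mask here is synthetic; in a real pipeline it would come from the detection model, and `count_blobs` is an illustrative name, not any particular library's API):

```python
import numpy as np
from scipy import ndimage

def count_blobs(mask: np.ndarray) -> int:
    """Count connected regions in a binary mask.

    In a real pipeline the mask would come from a trained
    segmentation/detection model; here we only label and count.
    """
    _, n = ndimage.label(mask)
    return n

# Tiny synthetic "field": three separate blobs.
mask = np.zeros((10, 10), dtype=bool)
mask[1:3, 1:3] = True    # blob 1
mask[5:7, 5:8] = True    # blob 2
mask[8, 0:2] = True      # blob 3
print(count_blobs(mask))  # 3
```

The hard part, of course, is training the model that produces a good mask in the first place; the counting on top of it is cheap.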

I think it’s more that we will slowly piece together bits of useful AI while the hyped areas that can’t deliver will die out.

[–] jj4211@lemmy.world 15 points 4 months ago (1 children)

Machine vision is absolutely the most slam-dunk case of "AI" that works and has practical applications. However, it was doing so a few years before the current craze. The current craze was driven by ChatGPT: because it almost acts like a human conversation partner, and that seemed so powerful, people overestimated how far it will go in the short term.

[–] AA5B@lemmy.world 0 points 4 months ago (2 children)

That’s why I love AI: I know it’s been a huge part of phone camera improvements in the last few years.

I seem to get more use out of voice assistants because I know how to speak their language, but if language processing noticeably improves, that will be huge

Motion detection and person detection have been a revolution in cheap home cameras by very reliably flagging video of interest, but there’s always room for improvement. More importantly, I want to be able to do that processing in real time, on a device that doesn’t consume much power.
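The motion-flagging part of those cheap cameras can be surprisingly basic. A minimal frame-differencing sketch (real products layer blurring, background models, and person detection on top of this; the function name and thresholds here are illustrative assumptions):

```python
import numpy as np

def motion_detected(prev: np.ndarray, curr: np.ndarray,
                    pixel_thresh: int = 25, area_thresh: int = 50) -> bool:
    """Flag motion when enough pixels change between two grayscale frames.

    Cast to a signed type before subtracting so the difference
    doesn't wrap around in uint8 arithmetic.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    changed = int((diff > pixel_thresh).sum())
    return changed > area_thresh

# Static scene vs. a scene where a bright object appears.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[20:40, 20:40] = 200  # 400 changed pixels
print(motion_detected(prev, curr))  # True
print(motion_detected(prev, prev))  # False
```

This kind of thresholded differencing is cheap enough to run on low-power devices, which is exactly why it shipped in home cameras long before the current hype.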

[–] technocrit@lemmy.dbzer0.com 5 points 4 months ago

None of what you're describing is anything close to "intelligence". And it's all existed before this nonsense hype cycle.

[–] khaleer@sopuli.xyz 3 points 4 months ago (1 children)

AI camera generation isn't camera improvement.

[–] AA5B@lemmy.world 1 points 4 months ago* (last edited 4 months ago)

When my phone takes a clearer picture in darker conditions and catches a recognizable action shot of my kid across a soccer field, it’s a better camera. It doesn’t matter whether the improvements came from hardware or software, or even how true to life the result is in some cases; it’s a better camera.

Apple has done a great job of not only making cameras physically better, but also of integrating LiDAR for faster focus, composing images across multiple lenses, improving low-light shots, and post-processing to produce dramatically better pictures in a wide range of conditions.

[–] EliteDragonX@lemmy.world 4 points 4 months ago

That’s nice and all, but that’s nowhere close to a real intelligence. That’s just an algorithm that has “learned” what a potato is.

[–] technocrit@lemmy.dbzer0.com 3 points 4 months ago (1 children)

So... A machine is "intelligent" because it can count potatoes? This sort of nonsense is a huge part of the problem.

[–] captainlezbian@lemmy.world 1 points 4 months ago (1 children)

Idk, robots are absolutely here and in use. They’re just more Honda than Jetsons. I work in manufacturing, and even in a shithole plant there are dozens of robots at minimum, unless everything is skilled labor.

[–] nadram@lemmy.world 4 points 4 months ago (1 children)

I might be wrong, but those don’t make use of AI, do they? It’s just programming for some repetitive tasks.

[–] captainlezbian@lemmy.world 4 points 4 months ago

The nicer ones use machine learning these days, but I misinterpreted you: I read you as saying that robots were an example of hype like AI is, not that using AI in robots is hype. The ML in robots is stuff like computer vision to sort out defects, detect expected variations, and other similar tasks. It’s definitely far more advanced than back in the day, but it’s still not what people think.