this post was submitted on 13 Nov 2024
661 points (95.0% liked)

Technology

[–] OsrsNeedsF2P@lemmy.ml 12 points 2 days ago

I work with people who work in this field. Everyone knows this, but there's also an increased effort in improvements all across the stack, not just the final LLM. I personally suspect the current generation of LLMs is at its peak, but with each breakthrough the technology will climb again.

Put differently, I still suspect LLMs will be at least twice as good in 10 years.

[–] jpablo68@infosec.pub 10 points 2 days ago (1 children)

I just want a portable self hosted LLM for specific tasks like programming or language learning.

[–] plixel@programming.dev 7 points 2 days ago

You can install Ollama in a Docker container and use it to pull models to run locally. Some are really small and still pretty effective: Llama 3.2 is only 3B, and some are as small as 1B. It can be accessed through the terminal, or you can use something like Open WebUI for a more "ChatGPT-like" interface.
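For anyone curious, a rough sketch of that setup (the image name, port, and model tag below are the Ollama project's current defaults, so double-check against their docs before copying):

```shell
# Pull and run the official Ollama image, exposing its default API port
docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

# Pull a small model inside the container (Llama 3.2 3B)
docker exec -it ollama ollama pull llama3.2:3b

# Chat from the terminal
docker exec -it ollama ollama run llama3.2:3b "Explain list comprehensions"
```

Open WebUI (or any OpenAI-compatible client) can then point at `http://localhost:11434` for a browser interface.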

[–] dejected_warp_core@lemmy.world 13 points 2 days ago (1 children)

Welcome to the top of the sigmoid curve.

If you were wondering what 1999 felt like WRT the internet, well, here we are. The Matrix was still fresh in everyone's mind and a lot of online tech innovation kinda plateaued, followed by some "market adjustments."

[–] Hackworth@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

I think it's more likely a compound sigmoid (don't Google that). LLMs are composed of distinct technologies working together. As we've reached the inflection point of the scaling for one, we've pivoted implementations to get back on track. Notably, context windows are no longer an issue. But the most recent pivot came just this week, allowing for a huge jump in performance. There are more promising stepping stones coming into view. Is the exponential curve just a series of sigmoids stacked too close together? In any case, the article's correct - just adding more compute to the same exact implementation hasn't enabled scaling exponentially.
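The compound-sigmoid picture can be illustrated numerically: each breakthrough is its own S-curve, and summing staggered S-curves with growing ceilings keeps the total climbing long after any single curve flattens. A toy sketch (the midpoints and scale factors are made up purely for illustration):

```python
import math

def sigmoid(x, midpoint, scale=1.0):
    """Logistic curve centered at `midpoint`, saturating at `scale`."""
    return scale / (1.0 + math.exp(-(x - midpoint)))

# A single sigmoid flattens: gains past the midpoint keep shrinking.
single = [sigmoid(x, midpoint=5) for x in range(11)]
assert single[10] - single[9] < single[5] - single[4]

# A "compound sigmoid": each breakthrough is a new S-curve with a later
# midpoint and a higher ceiling, so the sum keeps climbing.
def stacked(x):
    return sum(sigmoid(x, midpoint=m, scale=2.0 ** i)
               for i, m in enumerate([2, 5, 8]))

# Unlike a lone sigmoid, growth revives as each new curve kicks in.
growth = [stacked(x + 1) - stacked(x) for x in range(10)]
```

From a distance, that staircase of S-curves is hard to tell apart from an exponential, which is the ambiguity the comment is pointing at.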

[–] masquenox@lemmy.world 40 points 3 days ago (1 children)
[–] UnderpantsWeevil@lemmy.world 16 points 3 days ago (1 children)

I've been hearing about the imminent crash for the last two years. New money keeps getting injected into the system. The bubble can't deflate while both the public and private sector have an unlimited lung capacity to keep puffing into it. FFS, bitcoin is on a tear right now, just because Trump won the election.

This bullshit isn't going away. It's only going to get forced down our throats harder and harder, until we swallow or choke on it.

[–] thatKamGuy@sh.itjust.works 4 points 2 days ago (1 children)

With the right level of Government support, bubbles can seemingly go on for literal decades. Case in point, Australian housing since the late 90s has been on an uninterrupted tear (yes, even in ‘08 and ‘20).

[–] aesthelete@lemmy.world 7 points 2 days ago

I hope it all burns.

[–] Blackmist@feddit.uk 44 points 3 days ago (14 children)

Thank fuck. Can we have cheaper graphics cards again please?

I'm sure an RTX 4090 is very impressive, but it's not £1800 impressive.

[–] lorty@lemmy.ml 9 points 2 days ago (11 children)

Just wait for the 5090 prices...

[–] bountygiver@lemmy.ml 3 points 2 days ago

Nope. If normal gamers are already willing to pay that price, there's no reason for Nvidia to reduce it.

There are more 4090s on Steam than any AMD dedicated GPU; there's no competition.

[–] explodicle@sh.itjust.works 7 points 3 days ago

Sorry, crypto is back in season.

[–] theacharnian@lemmy.ca 53 points 3 days ago (2 children)

It's so funny how all this is only a problem within a capitalist frame of reference.

[–] kromem@lemmy.world 5 points 2 days ago

Oh nice, another Gary Marcus "AI is hitting a wall" post.

Like his "Deep Learning Is Hitting a Wall" post on March 10th, 2022.

Indeed, not much has changed in the world of deep learning between spring 2022 and now.

No new model releases.

No leaps beyond what was expected.

\s

Gary Marcus is like a reverse Cassandra.

Consistently wrong, and yet regularly listened to, amplified, and believed.

[–] Someplaceunknown@fedia.io 226 points 4 days ago (1 children)

"LLMs such as they are, will become a commodity; price wars will keep revenue low. Given the cost of chips, profits will be elusive," Marcus predicts. "When everyone realizes this, the financial bubble may burst quickly."

Please let this happen

[–] orl0pl@lemmy.world 35 points 4 days ago

Market crash and third world war. What a time to be alive!

[–] Semi_Hemi_Demigod@lemmy.world 198 points 4 days ago (4 children)

I wish just once we could have some kind of tech innovation without a bunch of douchebag techbros thinking it's going to solve all the world's problems with no side effects while they get super rich off it.

[–] ohwhatfollyisman@lemmy.world 63 points 4 days ago (10 children)

... bunch of douchebag techbros thinking it's going to solve all the world's problems with no side effects...

one doesn't imagine any of them even remotely thinks a technological panacea is feasible.

... while they get super rich off it.

because they're only focusing on this.

[–] recapitated@lemmy.world 22 points 3 days ago

I think I've heard about enough of experts predicting the future lately.

[–] TankovayaDiviziya@lemmy.world 8 points 2 days ago (1 children)

Short the AI stocks before they crash!

[–] sugar_in_your_tea@sh.itjust.works 15 points 2 days ago* (last edited 2 days ago)

The market can remain irrational longer than you can remain solvent.

A. Gary Shilling

[–] randon31415@lemmy.world 30 points 3 days ago (6 children)

The hype should go the other way. Instead of bigger and bigger models that do more and more - have smaller models that are just as effective. Get them onto personal computers; get them onto phones; get them onto Arduino minis that cost $20 - and then have those models be as good as the big LLMs and Image gen programs.
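Back-of-the-envelope arithmetic shows why the small-model direction is plausible: a model's weight footprint is roughly parameter count × bytes per parameter, so quantization is what moves models down the hardware ladder. A sketch (the function name is mine, and the numbers ignore activation and KV-cache overhead):

```python
def weight_footprint_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory in GB: params * bits-per-param / 8 bytes."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# A 3B model at fp16 needs ~6 GB of weights; 4-bit quantization gets it
# under 2 GB, which is phone territory. A 70B model at fp16 does not fit.
print(weight_footprint_gb(3, 16))   # 6.0
print(weight_footprint_gb(3, 4))    # 1.5
print(weight_footprint_gb(70, 16))  # 140.0
```

A $20 microcontroller is still far below even the 4-bit numbers, but phones and ordinary PCs are not.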

[–] Yaky@slrpnk.net 23 points 3 days ago (2 children)

Other than with language models, this has already happened: take a look at apps such as Merlin Bird ID (identifies birds fairly well by sound and somewhat okay visually), WhoBird (identifies birds by sound), and Seek (visually identifies plants, fungi, insects, and animals). All of them work offline. IMO these are much better uses of ML than spammer-friendly text generation.

[–] rumba@lemmy.zip 10 points 3 days ago

This has already started to happen. The new llama3.2 model is only 3.7GB and it's WAAAAY faster than anything else. It can throw a wall of text at you in just a couple of seconds. You're still not running it on $20 hardware, but you no longer need a 3090 to have something useful.

[–] LovableSidekick@lemmy.world 20 points 3 days ago* (last edited 3 days ago)

Marcus is right: incremental improvements in AIs like ChatGPT will not lead to AGI and were never on that course to begin with. What LLMs do is fundamentally not "intelligence"; they just imitate human responses based on existing human-generated content. This can produce usable results, but not because the LLM has any understanding of the question. Since the current AI surge is based almost entirely on LLMs, the delusion that the industry will soon achieve AGI is doomed to fall apart - but not until a lot of smart speculators have gotten in and out and made a pile of money.
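The "imitation without understanding" point can be made concrete with the simplest possible language model, a bigram count table: it produces plausible-looking continuations purely from co-occurrence statistics, with no representation of meaning. An LLM is, very loosely, this idea scaled up with a learned network instead of a count table. A toy sketch (corpus and function names are invented for illustration):

```python
from collections import defaultdict

def train_bigrams(corpus):
    """Count which word follows which across a corpus."""
    follows = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            follows[a][b] += 1
    return follows

def most_likely_next(follows, word):
    """Pick the most frequent continuation -- pure imitation, no meaning."""
    return max(follows[word], key=follows[word].get)

corpus = ["the cat sat on the mat", "the cat ate the fish"]
model = train_bigrams(corpus)
print(most_likely_next(model, "the"))  # "cat" -- seen twice after "the"
```

The model "answers" fluently within its training data yet has no concept of cats or mats; scaling the table up to a transformer makes the outputs vastly better without changing that basic character.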

[–] halcyoncmdr@lemmy.world 102 points 4 days ago (12 children)

No shit. This was obvious from day one. This was never AGI, and was never going to be AGI.

Institutional investors saw an opportunity to make a shit ton of money and pumped it up as if it was world changing. They'll dump it like they always do, it will crash, and they'll make billions in the process with absolutely no negative repercussions.

[–] CerealKiller01@lemmy.world 31 points 3 days ago (17 children)

Huh?

Smartphone improvements hit a rubber wall a few years ago (disregarding folding screens, which make up a small market share, the rate of improvement slowed drastically), and the industry is doing fine. It's not growing like it used to, but that just means people are keeping their smartphones longer, not that people stopped using them.

Even if AI were to completely freeze right now, people will continue using it.

Why are people reacting like AI is going to get dropped?

[–] finitebanjo@lemmy.world 18 points 3 days ago* (last edited 3 days ago)

People are dumping billions of dollars into it, mostly power, but it cannot turn profit.

So the companies who, for example, revived a nuclear power facility to feed their machines with ever-diminishing returns in output quality are going to shut everything down at massive losses, with countless hours of human work and lifespan thrown down the drain.

This will have an economic impact quite large as many newly created jobs go up in smoke and businesses who structured around the assumption of continued availability of high end AI need to reorganize or go out of business.

Look up the dot-com bubble.

[–] theherk@lemmy.world 18 points 3 days ago

Because in some eyes, infinite rapid growth is the only measure of success.

[–] drake@lemmy.sdf.org 4 points 2 days ago (2 children)

It’s absurdly unprofitable. OpenAI is billions of dollars in debt. It absolutely burns through energy and requires a lot of expensive hardware, and people aren’t willing to pay enough for it to break even, let alone profit.

[–] Greg@lemmy.ca 65 points 4 days ago (5 children)

largely based on the notion that LLMs will, with continued scaling, become artificial general intelligence

Who said that LLMs were going to become AGI? LLMs as part of an AGI system makes sense but not LLMs alone becoming AGI. Only articles and blog posts from people who didn't understand the technology were making those claims. Which helped feed the hype.

I 100% agree that we're going to see an AI market correction. It's going to take a lot of hard human work to achieve the real value of LLMs. The hype is distracting from the real valuable and interesting work.

[–] mutant_zz@lemmy.world 28 points 4 days ago (2 children)

Microsoft Research published a paper about GPT-4 titled "Sparks of AGI".

I don't think they really believe it, but it's good for bringing in VC money.

[–] DirigibleProtein@aussie.zone 52 points 4 days ago
[–] Defaced@lemmy.world 16 points 3 days ago

This is why you're seeing news articles from Sam Altman saying that AGI will blow past us without any societal impact. He's trying to lessen the blow of the bubble bursting for AI/ML.

[–] Mushroomm@sh.itjust.works 13 points 3 days ago

It's been 5 minutes since the new thing did a new thing. Is it the end?
