this post was submitted on 28 Sep 2024
429 points (98.2% liked)

Technology

top 50 comments
[–] Blackout@fedia.io 220 points 1 month ago (1 children)

What does OpenAI need a CEO for anyway? Just let ChatGPT run the company if they're so gung-ho about it.

[–] CosmicTurtle0@lemmy.dbzer0.com 72 points 1 month ago (1 children)

I've been saying this for almost a year. Not OpenAI specifically, but any company with a board of directors.

They aren't considering the shareholder value lost to their most expensive liability: the CEO.

He (because let's face it, it's going to be a he in most cases) is paid millions of dollars, plus a golden parachute. That's money that could literally be given back to shareholders through dividends.

The fact that boards of directors aren't doing this could be evidence that they aren't looking out for shareholders' interests.

[–] eskimofry@lemm.ee 49 points 1 month ago

Here's why:

Boards of directors are made up of CEOs of other companies who are buddies with the CEO of the company they oversee. It's like a shitty game of musical chairs among boards of directors.

[–] Ilovethebomb@lemm.ee 176 points 1 month ago (3 children)

I can't wait for the AI bubble to burst.

[–] Alphane_Moon@lemmy.world 71 points 1 month ago (3 children)

We are all waiting. If they don't come up with proven revenue opportunities in the next ~18 months, it's going to be difficult to justify the astronomical capex spend.

[–] NeoNachtwaechter@lemmy.world 25 points 1 month ago

This podcasting bro is NOT chasing revenue (yet).

He wants power.

He wants to collect 11-12 figure sums of venture capital and then build things that let him rule the world.

And afterwards, maybe revenue.

[–] john_lemmy@slrpnk.net 16 points 1 month ago

Another year of this shit? I don't think we can take it, honestly

[–] rollerbang@lemmy.world 3 points 1 month ago

Meh, it won't happen like that. It was similar with Facebook 10 years ago, and look at where it is now.

[–] wewbull@feddit.uk 3 points 1 month ago (3 children)

Which tech titan does it take with it?

My money is on Microsoft as OpenAI's biggest backer, but most of them have sunk more into it than they should have.

[–] Ilovethebomb@lemm.ee 23 points 1 month ago

I doubt anyone that big will fall. Microsoft have so many fingers in so many pies that they can afford to take a hit like this. Plus, with the Office suite of products, they're probably in the best place to make something back, even if they don't make all of their money back.

[–] Voroxpete@sh.itjust.works 11 points 1 month ago

Microsoft are bulletproof. Their share price will take a big hit, and an exec or two will take a golden parachute, but they'll bounce back very quickly. The bigger problem is that along the way they'll balance the capex with multiple rounds of cutbacks and layoffs in other departments, and that's before they're finally forced to lay off everyone actually connected to this AI nonsense (anyone who isn't a senior manager or C-suite, that is; they'll all be fine).

[–] frezik@midwest.social 6 points 1 month ago (2 children)

Nvidia. Their share price would be a fraction of what it is without AI. Just like with the last two cryptocurrency bubbles, they went all in and then acted surprised when those popped.

At the same time, they've lost a lot of goodwill with gamers, formerly their core audience. With the AAA industry pulling back, games might not be pushing the limits of GPU tech anymore. Microsoft still has their old core products, but Nvidia may return to that market to find a wasteland.

[–] conciselyverbose@sh.itjust.works 5 points 1 month ago (1 children)

Nvidia isn't going to be left holding any bag. They're selling through everything they make, and LLMs are just one of many uses for the massively parallel math they're at the forefront of. At most they have to bring pricing down, but they don't own the fab, so if demand did drop (which isn't really all that likely), their costs would go down too. They have contracts in terms of volume and price, but those aren't nearly long-term enough to cause them more than a blip, and all their investment in developing architecture and tooling has value well outside of LLM nonsense.

[–] frezik@midwest.social 3 points 1 month ago (2 children)

Their stock price will tank. They have a roughly $3T market cap because they're selling shovels in a gold rush. Once the gold rush is over, that valuation will go back to where it was three years ago. Probably lower, because the stock market tends to overcorrect on these things.

Companies raise capital against their stock price, and a drop like that can kill companies. That doesn't mean Nvidia will die for sure, but they could.

[–] conciselyverbose@sh.itjust.works 3 points 1 month ago* (last edited 1 month ago)

Their fundamentals are too strong. They have market dominance with extremely steady technological progress against really bad competition. LLMs aren't going to disappear when the shitty overpromising bubble pops. Generative AI isn't going anywhere. Any of the thousands of other uses for their raw power are still there. They'll just be on the ground floor of whatever the next math-heavy hype cycle is, just like they were with crypto and LLMs, because CUDA is the best way to get shit done, whether what you're doing is useful or trash.

[–] cybersandwich@lemmy.world 109 points 1 month ago (2 children)

I've heard someone call it billionaire brain rot. I think at some point you end up with so much money, and so few people telling you no, that it literally changes your brain.

Seems likely.

[–] noobface@lemmy.world 49 points 1 month ago (1 children)

Imagine never hearing the word "No." as a complete sentence ever again in your life.

[–] normanwall@lemmy.world 19 points 1 month ago (1 children)

Or when you do, just assuming you can override them eventually.

We had a guy like that at work; he basically said, "How far above you do I have to go to get what I want?"

[–] pearsaltchocolatebar@discuss.online 4 points 1 month ago (1 children)

That's not a bad power to have if you use it for good

[–] normanwall@lemmy.world 12 points 1 month ago

He wanted a second laptop dock for home when we had limited supply and not everyone had one yet

[–] Dasus@lemmy.world 6 points 1 month ago

I think it's also likely that it's very hard to amass billions unless you already have some sort of brain rot.

[–] MrMakabar@slrpnk.net 88 points 1 month ago (2 children)

$7 trillion is three times the GDP of Brazil. It is bigger than the US federal budget. Seriously, it is insane.

[–] vonxylofon@lemmy.world 57 points 1 month ago

This guy is an absolute lunatic.

"Gimme all of the world's money several times over for this fancy T9 that I'm playing with."

If someone wrote a cartoon villain using his quotes, it would be dismissed as unbelievable and rubbish.

[–] TimeNaan@lemmy.world 75 points 1 month ago (1 children)

He is an empty husk of a man who has completed his transformation into a pure PR machine

[–] Alphane_Moon@lemmy.world 42 points 1 month ago* (last edited 1 month ago) (1 children)

His involvement in the infamous WorldCoin provides useful insight into his character.

An oligarch and a degenerate (outside the US many oligarchs have a more or less sober understanding of who they are, although degeneracy among oligarchs is a global issue).

[–] 2pt_perversion@lemmy.world 62 points 1 month ago (1 children)

OpenAI has projected revenue of $3 billion this year.
It is currently projected to burn $8 billion on training costs this year.
Now it needs 5-gigawatt data centers worth over $100 billion.
And new fabs worth $7 trillion to supply all the chips.

I get that it's trying to dominate a new market, but that's ludicrous. And even with everything so far, they haven't really pulled far ahead of competing models like Claude and Gemini, whose makers are also training like crazy.

[–] BananaTrifleViolin@lemmy.world 46 points 1 month ago* (last edited 1 month ago) (4 children)

There is no market, or not much of one. This whole thing is a huge speculative bubble, a bit like crypto. The core idea of crypto makes some sense long term, but the speculative value does not. The core idea of LLMs (we are nowhere near true AI) makes some sense, but it is half-baked technology. It hasn't even reached maturity, and enshittification has already set in.

OpenAI doesn't have a realistic business plan. It has a grifter who is riding a wave of nonsense in the tech markets.

No one is making a profit because no one has found a truly profitable use for what's available now. Even areas with potential utility (like healthcare) are dominated by focused companies working in limited scenarios.

[–] makyo@lemmy.world 29 points 1 month ago (3 children)

IMO it's even worse than that, at least from what I gather from the AI/singularity communities I follow. For them, AGI is the end goal: a creative, thinking AI capable of deduction far greater than humanity's. The company that owns that suddenly has the capability to solve all manner of problems that are slowing down technological advancement. Obviously, owning that would be worth trillions.

However it's really hard to see through the smoke that the Altmans etc. are putting up - how much of it is actual genuine prediction and how much is fairy tales they're telling to get more investment?

And I'd have a hard time believing it isn't mostly the latter, because while LLMs have made some pretty impressive advancements, they still can't have specialized discussions about pretty much anything without hallucinating answers. I have a test I use for each new generation of LLMs where I interview them about a book I'm relatively familiar with, and even with the newest ChatGPT model it still makes up a ton of shit, often contradicting its own answers in the same thread, all the while absolutely confident that it's familiar with the source material.

Honestly, I'll believe they're capable of advancing AI when we get an AI that can say 'I actually am not sure about that, let me do a search...' or something like that.

[–] itslilith@lemmy.blahaj.zone 12 points 1 month ago (3 children)

I follow a YouTube channel, AI Explained, that has some pretty grounded analysis of the latest models and capabilities. He compared LLMs to the creative-writing center of the brain: they're really nice to interact with and output things that sound correct, but are ultimately missing the reasoning and factuality needed for AGI.

[–] lurch@sh.itjust.works 3 points 1 month ago* (last edited 1 month ago)

Yeah, I really hate this. I have shares in multiple tech companies, like Nvidia, Intel, AMD, TSMC, etc., and because of the AI bubble I don't know how much they are really worth. The market is all warped, and one day a company is doing well, the next day it seems to be in peril. I would like to know how much they would be worth after the bubble bursts, but there is no way to know.

[–] ColdWater@lemmy.ca 45 points 1 month ago (1 children)

This guy is losing touch with reality.

[–] Tramort@programming.dev 12 points 1 month ago (2 children)

I know Musk is bipolar. Is Altman too?

Bipolar disorder can cause this kind of request. It's called a delusion of grandiosity.

[–] downhomechunk@midwest.social 6 points 1 month ago (1 children)

I think it's a delusion of grandeur.

[–] NotMyOldRedditName@lemmy.world 6 points 1 month ago* (last edited 1 month ago) (1 children)

Oh wow, I didn't actually know he was bipolar (I checked to confirm it is the case). I knew he was on the autism spectrum.

I can't imagine those two play nicely together.

[–] Crashumbc@lemmy.world 3 points 1 month ago

Well, the insane amount of random drugs he's taking doesn't help either.

[–] Bookmeat@lemmy.world 44 points 1 month ago

"But the breakthrough will come just as soon as the chips no one can make are delivered."

Probably.

[–] apfelwoiSchoppen@lemmy.world 40 points 1 month ago

The climate? What climate? Who cares about the climate?

[–] geneva_convenience@lemmy.ml 26 points 1 month ago

You can buy a lot of Twitters for that money

[–] rimu@piefed.social 21 points 1 month ago (2 children)

Middle Eastern money

Something tells me the Saudis don't want AI for the betterment of all humanity.

Could be the human rights abuses, dunno.

[–] Speculater@lemmy.world 16 points 1 month ago (1 children)

Imagine an AI bound by Sharia law; the current ones, limited by American puritanical bullshit, are already bad enough. "How did prostitution during the gold rush affect the economy of mining towns?"

ERROR THIS QUESTION VIOLATES GUIDELINES

[–] quant@leminal.space 3 points 1 month ago

Which is why China is pushing hard for AI as well.

[–] wewbull@feddit.uk 3 points 1 month ago

It needs lots of energy.

[–] Halcyon@discuss.tchncs.de 8 points 1 month ago

The Bloomberg podcast series 'Foundering – The OpenAI Story' is quite insightful regarding Sam Altman's psyche.

There are five episodes; the first is here:

https://www.bloomberg.com/news/audio/2024-06-05/foundering-openai-e1-most-silicon-valley-man-alive-podcast

[–] Halcyon@discuss.tchncs.de 6 points 1 month ago

He will get that. The ultra-rich ignore all healthy limits.
