this post was submitted on 05 Jun 2024
406 points (96.6% liked)

Technology

[–] dogslayeggs@lemmy.world 76 points 5 months ago (6 children)

I didn't know there were that many PC gamers out there. /s

Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. The AI thing, though, I'm not sure how they decided to focus on that or who first pitched the idea to the board, but that was business genius.

[–] dkc@lemmy.world 38 points 5 months ago (1 children)

To your point, when I look at both crypto and AI I see a common theme: they both need a lot of computation, call it supercomputing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up, I think they’ll do fine as more applications that require that much computation are found.

Basically, I think of Nvidia as a supercomputer company. When I think of them that way, their position makes more sense.

[–] Aceticon@lemmy.world 3 points 5 months ago* (last edited 5 months ago)

Also, those things are highly parallelizable and mainly deal with vector and matrix data. So the same approach of lots of really simple but fast processing units, optimized for vector and matrix operations and working in parallel, that works fine for modern 3D graphics (for example, each point of a frame displayed on screen can be calculated in parallel with all the other points, in what's called a fragment shader, and most 3D data is made of 3D vectors whilst the transforms are 3x3 matrices) turns out to also work fine for things like neural networks, where the neurons in each layer are quite simple and can all be processed in parallel (if that architecture weren't layered, GPUs would be far less effective for it).
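
To make that mapping concrete, here's a minimal sketch (illustrative only, not anything Nvidia ships; the kernel name `denseLayer` and the toy sizes are made up for the example): each GPU thread computes one output neuron of a fully connected layer, the same one-thread-per-output pattern a fragment shader uses for pixels.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per output neuron: y[o] = relu(bias[o] + sum_i W[o][i] * x[i]).
// Same "many simple independent outputs" shape as a fragment shader.
__global__ void denseLayer(const float* x, const float* W, const float* bias,
                           float* y, int inDim, int outDim) {
    int o = blockIdx.x * blockDim.x + threadIdx.x;   // which neuron this thread owns
    if (o >= outDim) return;
    float acc = bias[o];
    for (int i = 0; i < inDim; ++i)
        acc += W[o * inDim + i] * x[i];              // row of W dotted with the input
    y[o] = acc > 0.0f ? acc : 0.0f;                  // ReLU
}

int main() {
    const int IN = 4, OUT = 3;                       // toy sizes, made up for the sketch
    float hX[IN]       = {1.f, 2.f, 3.f, 4.f};
    float hW[OUT * IN] = {0.1f, 0.2f, 0.3f, 0.4f,
                          0.5f, 0.6f, 0.7f, 0.8f,
                         -1.0f, -1.0f, -1.0f, -1.0f};
    float hB[OUT]      = {0.f, 0.1f, 0.2f};
    float hY[OUT];

    float *dX, *dW, *dB, *dY;
    cudaMalloc(&dX, sizeof hX); cudaMalloc(&dW, sizeof hW);
    cudaMalloc(&dB, sizeof hB); cudaMalloc(&dY, sizeof hY);
    cudaMemcpy(dX, hX, sizeof hX, cudaMemcpyHostToDevice);
    cudaMemcpy(dW, hW, sizeof hW, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, sizeof hB, cudaMemcpyHostToDevice);

    denseLayer<<<1, 32>>>(dX, dW, dB, dY, IN, OUT);  // all neurons computed in parallel
    cudaMemcpy(hY, dY, sizeof hY, cudaMemcpyDeviceToHost);

    for (int o = 0; o < OUT; ++o) printf("neuron %d -> %f\n", o, hY[o]);
    cudaFree(dX); cudaFree(dW); cudaFree(dB); cudaFree(dY);
    return 0;
}
```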

To a large extent Nvidia got lucky that the stuff that became fashionable now works by doing lots of simple, highly parallelizable computations; otherwise it would've been the makers of CPUs that gained from the rise of such compute-hungry tech.

[–] kromem@lemmy.world 24 points 5 months ago* (last edited 5 months ago) (1 children)

They were doing that for years before it became popular. The same tech used for video graphics just so happened to be useful for AI and big data; they doubled down on supporting enterprise and research efforts in that area before their competitors did, back when it was a tiny field, and they continued to specialize as it grew.

Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

[–] webghost0101@sopuli.xyz 7 points 5 months ago

Hardware made for heavy computing being good at stuff like this isn't all that shocking, though. The biggest gamble is whether a new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

[–] chrash0@lemmy.world 18 points 5 months ago

Same as with crypto: the software community started using GPUs for deep learning, and Nvidia was just meeting that demand.

[–] RecallMadness@lemmy.nz 11 points 5 months ago* (last edited 5 months ago)

They were first to market with a decent GPGPU toolkit (CUDA), which built them a pretty sizeable user base.

Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

Like Apple, but worse.

I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX, etc.) means they can't do lock-in.
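
As a rough illustration of what that lock-in looks like in practice (a hedged sketch; the kernel name `scale` and the launch parameters are made up), even the simplest CUDA program is written against Nvidia-specific launch syntax and runtime calls, so moving it to another vendor's stack means rewriting rather than just relinking.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// The classic GPGPU "hello world": scale a vector, one thread per element.
// The <<<blocks, threads>>> launch syntax and the cuda* runtime calls are
// Nvidia-specific, which is a big part of why code written this way is
// costly to port to a competitor's hardware.
__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global element index
    if (i < n) data[i] *= factor;                   // each thread handles one element
}

int main() {
    const int N = 1 << 20;
    size_t bytes = N * sizeof(float);

    float* h = (float*)malloc(bytes);
    for (int i = 0; i < N; ++i) h[i] = (float)i;

    float* d;
    cudaMalloc(&d, bytes);                          // device allocation: CUDA runtime API
    cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks  = (N + threads - 1) / threads;
    scale<<<blocks, threads>>>(d, 2.0f, N);         // kernel launch: CUDA-only syntax
    cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);

    printf("h[12345] = %f (expected %f)\n", h[12345], 2.0f * 12345);
    cudaFree(d);
    free(h);
    return 0;
}
```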

[–] slacktoid@lemmy.ml 9 points 5 months ago

To their credit they've been pushing GPGPUs for a while. They did position themselves well for accelerators. Doesn't mean they don't suck.

[–] swayevenly@lemm.ee 8 points 5 months ago

DLSS was a necessity: it let them deliver frame-rate gains at a pace their raw hardware alone could not keep up with.