this post was submitted on 10 Dec 2024

TechTakes


I can't wait for the spectacular implosion

[–] db0@lemmy.dbzer0.com 39 points 1 week ago (1 children)

I expect a creative destruction, like what happened with the dotcom bubble. A ton of GenAI companies will go bust, the market will be flooded with cheap GPUs and other AI hardware that gets snapped up on the cheap, and enthusiasts and researchers will use them to make actually useful stuff.

[–] dgerard@awful.systems 12 points 1 week ago* (last edited 1 week ago) (1 children)

these are compute GPUs that don't even have graphics ports

[–] db0@lemmy.dbzer0.com 16 points 1 week ago (2 children)

Yes, my point is that the compute from those chips can still be used. Maybe on actually useful machine learning tools that get developed later, or on some other technology that can make use of parallel computing like this.

[–] JackRiddle@sh.itjust.works 5 points 6 days ago

I know of at least one company that uses CUDA for ray tracing for, I believe, ground research, so there are definitely already some useful things happening.
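(Ray tracing is a good fit for this hardware because it's embarrassingly parallel: every ray can be computed independently, which is exactly what CUDA-style hardware is built for. A minimal NumPy sketch of the idea, intersecting many rays with a single sphere at once; the scene and all the numbers here are made up for illustration:)

```python
import numpy as np

# Intersect n rays with one sphere, vectorized across all rays at once.
# On a GPU, each ray would be handled by its own thread; here NumPy
# broadcasting plays the same role on the CPU.
n = 10_000
rng = np.random.default_rng(1)
origins = np.zeros((n, 3))                      # all rays start at the origin
dirs = rng.standard_normal((n, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit direction vectors

center = np.array([0.0, 0.0, 5.0])              # sphere 5 units down +z
radius = 1.0

# Solve |o + t*d - c|^2 = r^2 for t: a quadratic per ray.
# Since dirs are unit vectors, the leading coefficient a = 1.
oc = origins - center
b = 2.0 * np.einsum("ij,ij->i", dirs, oc)
c = np.einsum("ij,ij->i", oc, oc) - radius**2
disc = b * b - 4.0 * c                          # discriminant per ray
hit = disc >= 0.0                               # which rays hit the sphere
print(hit.sum(), "of", n, "rays hit the sphere")
```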

I mean, there are a lot of applications for linear algebra, although I admit I don't fully know how "AI" uses linear algebra or what other uses overlap with it.
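(For what it's worth, the overlap is pretty direct: the core operation in neural networks is a dense matrix multiply, the same GEMM that cuBLAS and these compute GPUs accelerate, and that plenty of non-AI workloads also need. A tiny sketch, with shapes chosen arbitrarily for illustration:)

```python
import numpy as np

# A dense neural-network layer is just a matrix multiply plus a bias:
# the same GEMM operation GPUs accelerate, whether the workload is
# "AI" or any other linear-algebra-heavy computation.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 512))    # batch of 32 inputs, 512 features each
W = rng.standard_normal((512, 256))   # weight matrix
b = rng.standard_normal(256)          # bias vector

y = x @ W + b                         # forward pass: one GEMM + broadcast add
print(y.shape)                        # (32, 256)
```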

I'm waiting on the A100 fire sale next year