My problem when buying my last GPU was that AMD's answer to CUDA, ROCm, was just miles behind and not really supported on their consumer GPUs. From what I see now, that has changed for the better, but it's still hard to trust when CUDA is so dominant and mature. I don't want to reward NVIDIA, but I want to use my GPU for some deep learning projects too, and I don't really have a choice at the moment.
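(To make the ROCm angle concrete: PyTorch's ROCm builds reuse the `torch.cuda` namespace, so in principle the same script runs on either vendor. A minimal sketch, assuming a PyTorch install with a matching CUDA or ROCm backend:)

```python
import torch

# On ROCm builds of PyTorch, torch.cuda.is_available() reports
# a supported AMD GPU too, so this one code path covers both vendors.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

# A quick GPU-bound op to sanity-check the setup.
x = torch.randn(2048, 2048, device=device)
y = x @ x  # matrix multiply runs on the GPU if one was found
print(y.sum().item())
```

Whether that actually works on a given consumer AMD card is exactly the support question you're describing, which is why the official ROCm compatibility list matters so much.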
I've become more and more convinced that considerations like yours (which I can't evaluate firsthand, since I don't rely on GPUs professionally) have been the main driver of Nvidia's market share. It makes sense.
The online gamer talk is that people buy Nvidia for no good reason, but that's mostly internet guys refusing to do any real research because they only want a reason to stroke their own egos. The gamer segment of the GPU market is a loud minority whose video games don't depend heavily on vendor-specific features, either for performance or for compatibility. Hence the constant refrain that people "buy Nvidia for no good reason except marketing".
But if AMD cards can't really handle things like machine learning, then obviously that is a HUGE deficiency. The public probably isn't certain of its needs when it spends $400 on a graphics card; it just notices that serious users choose Nvidia for some reason. So the public buys Nvidia, just in case. Maybe they want to do something they haven't thought of yet. I guess they're right. The card also plays games pretty well, if that's all they ever do.
If you KNOW for certain that you just want to play games, then yeah, the AMD card offers a lot of bang for your buck. People aren't that certain when they assemble a system, though, or when they buy a pre-built. I would venture that the average shopper at least entertains the idea of doing some light video editing; the use case feels inevitable for the modern PC owner. So already they're worrying about possible compatibility issues with software they haven't even bought yet. I've heard a lot of stories like yours, and so have they. I've never heard the reverse. I've never heard somebody say they'd like to try Nvidia but they need AMD. Never. So everyone tends to buy Nvidia.
The people dropping the ball are the reviewers, who should be putting a LOT more emphasis on use cases like yours. People are pouring money into labs for exhaustive testing of cooling fans, for fuck's sake, yet still running the same old gaming benchmarks as if that's the only thing anyone will ever do with the most expensive component in the modern PC.
I've also heard of software that simply does not work without CUDA. Those differences between cards should be tested and the results made public. The hardware journalism scene needs to stop focusing so hard on damned video games and start covering all the software where Nvidia vs AMD really does make a difference; maybe that would force AMD to step up its game. At the very least, the gamebros would stop acting like people buy Nvidia cards for no reason except some sort of weird flex.
No, dummy, AMD can't run a lot of important shit that you don't care about. There's more to this than the FPS count on Shadow of the Tomb Raider.
Well, the counterpoint is that NVIDIA's Linux drivers are famously garbage, which also pisses off professionals. From what I see of AMD now with ROCm, it seems like they've gone the right way. Maybe they can convince me next time I'm in the market for a GPU.
But overall you're right, yeah. My feeling is that AMD is competitive with NVIDIA on price/performance, but NVIDIA has broader feature support, both in games and in professional use cases. I do feel like AMD has been steadily improving over the past few years, though. In the gaming world FSR seems almost as ubiquitous as DLSS (or maybe even more so), and ROCm support seems to have grown rapidly as well. Hopefully they keep going, so I'll have a choice for my next GPU.
It's a shame there's not really an equivalent to CUDA cores on AMD cards. Being able to offload rendering to the GPU and get instant feedback is so important when sculpting, without having to fall back to Eevee.
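(Worth noting that recent Blender versions do expose AMD cards to Cycles through HIP. A minimal sketch of switching the compute backend via Blender's Python API, assuming Blender 3.x or newer and a supported GPU; exact behavior varies a bit between versions:)

```python
import bpy

# The Cycles add-on preferences hold the compute backend setting.
prefs = bpy.context.preferences.addons["cycles"].preferences

# 'CUDA' / 'OPTIX' on NVIDIA, 'HIP' on supported AMD cards.
prefs.compute_device_type = "HIP"
prefs.get_devices()  # refresh the detected device list

# Enable every device Blender found for that backend.
for device in prefs.devices:
    device.use = True

# Tell the current scene's Cycles renderer to use the GPU.
bpy.context.scene.cycles.device = "GPU"
```

How well that performs compared to CUDA/OptiX on a comparable Nvidia card is a separate question, and it's exactly the kind of thing reviewers should be benchmarking.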