this post was submitted on 17 Nov 2023
134 points (92.4% liked)

PC Gaming

[–] Outtatime@sh.itjust.works 61 points 10 months ago (5 children)

I'm so sick of Nvidia's bullshit. My next system will be AMD just out of spite. That goes for processors as well.

[–] CaptainEffort@sh.itjust.works 18 points 10 months ago

That’s exactly why I’ve been using AMD for the past 2 years. Fuck Nvidia

[–] kureta@lemmy.ml 16 points 10 months ago

The only thing keeping me is CUDA, and there's no replacement for it. I know AMD has I-forgot-what-it's-called, but it's not a realistic option for many machine learning tasks.

[–] dojan@lemmy.world 14 points 10 months ago (1 children)

I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

[–] Nanomerce@lemmy.world 5 points 10 months ago (1 children)

How is the stability in modern games? I know the drivers are way better now but more samples is always great.

[–] dojan@lemmy.world 6 points 10 months ago

Like, new releases? I don’t really play many new games.

Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

[–] Vinny_93@lemmy.world 6 points 10 months ago

Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

But yes, the way AMD makes their software open (FSR, FreeSync) and puts DisplayPort 2.1 on their cards creates a lot of goodwill for me.

[–] Cagi@lemmy.ca 3 points 10 months ago* (last edited 10 months ago) (1 children)

The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she's getting a bit long in the tooth for some games.

[–] limitedduck@awful.systems 17 points 10 months ago (1 children)

ATI

"Now that's a name I've not heard in a long time"

[–] Cagi@lemmy.ca 18 points 10 months ago (1 children)

Not since, oh before most of Lemmy was born. I'm old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

[–] PenguinTD@lemmy.ca 4 points 10 months ago

yeah, that's pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of their practices, but not that impactful, since OpenCL etc. could still be tuned to work with similar performance; software developers/researchers just love free support/R&D/money to advance their goals. They're willing to be the minions, and I can't ask them not to take the free money. But RTX and then tensor cores are where I draw the line, since their patents and implementations do actual harm in the computer graphics and AI research space, but I guess it was a bit too late. We're already seeing the results, and Nvidia is making bank with that advantage. They're essentially applying the Intel playbook, just slightly differently: instead of buying the OEM vendors, they "invest in" software developers/researchers to use their closed tech. Now everyone pays a premium for RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires a lot of money.

[–] sederx@programming.dev 19 points 10 months ago

I saw a 4080 on Amazon for 1200, shit's crazy.

[–] GarytheSnail@programming.dev 18 points 10 months ago (1 children)

All three cards are rumored to come with the same memory configuration as their base models...

Sigh.

[–] Fungah@lemmy.world 8 points 10 months ago (1 children)

Give us more fucking vram you dicks.

[–] CanadianCarl@sh.itjust.works 1 points 10 months ago

I have 12 GB of VRAM, do I need more?

[–] Schmuppes@lemmy.world 17 points 10 months ago (1 children)

Major refresh means what nowadays? 7 instead of 4 percent gains compared to the previous generation?

[–] NOT_RICK@lemmy.world 8 points 10 months ago (1 children)

The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

[–] vxx@lemmy.world 2 points 10 months ago (1 children)

Will the price be the same or up to 22% more expensive?

[–] NOT_RICK@lemmy.world 3 points 10 months ago

You’ll pay 30% more for the honor of owning a 4 series

[–] the_q@lemmy.world 17 points 10 months ago (1 children)
[–] zoe@jlai.lu 3 points 10 months ago* (last edited 10 months ago)

Just 10-15 years at least, for smartphones/electronics overall too. Process nodes are now harder than ever to shrink. Holding on to my 12nm CCP phone like there's no tomorrow...

[–] gnuplusmatt@reddthat.com 15 points 10 months ago* (last edited 10 months ago) (2 children)

As a Linux gamer, this really wasn't on the cards anyway

[–] BCsven@lemmy.ca 4 points 10 months ago (1 children)

AMD is a better decision, but my nVidia works great with Linux. I'm on openSUSE, and nVidia hosts their own openSUSE drivers, so it works from the get-go once you add the nVidia repo.

[–] gnuplusmatt@reddthat.com 3 points 10 months ago (1 children)

I had an nvidia 660 GT back in 2013; it was a pain in the arse being on a leading-edge distro. It used to break Xorg for a couple of months every time there was an Xorg release (which admittedly is really rare these days since it's in sunset mode). Buying an AMD was the best hardware decision: no hassles, and I've been on Wayland since Fedora 35.

[–] CeeBee@lemmy.world 3 points 10 months ago (1 children)

A lot has changed in a decade.

[–] lowmane@lemmy.world 1 points 10 months ago (1 children)

Laughs in dual 3090s on Linux coming from 5x 1070tis

[–] gnuplusmatt@reddthat.com 1 points 10 months ago (1 children)

Laughs at dual 3090s on Linux

That sounds like a hassle

[–] lowmane@lemmy.world 1 points 10 months ago* (last edited 10 months ago) (1 children)

It's not at all. You have a dated notion of what the experience with an Nvidia GPU has been like for the past few years.

[–] gnuplusmatt@reddthat.com 1 points 10 months ago

dated notion of the experience

Do I still have to load a module that taints my kernel and could break due to ABI incompatibility? Does Wayland work in an equivalent manner to the in-kernel drivers that properly support GBM?
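For what it's worth, the taint question can be inspected directly on a running system. A minimal sketch, assuming a standard Linux procfs; the `nvidia` module name only applies if the proprietary driver is actually installed:

```shell
# Nonzero means the kernel is tainted; bit 0 (0x1) flags a
# proprietary-licensed module, bit 12 (0x1000) an out-of-tree one.
taint=$(cat /proc/sys/kernel/tainted)
echo "taint value: $taint"

# List any loaded nvidia modules (prints the fallback on an AMD/Intel box).
lsmod | grep '^nvidia' || echo "no nvidia module loaded"
```

On a box running only in-tree drivers (e.g. amdgpu), the taint value is typically 0 and the grep finds nothing.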

[–] RizzRustbolt@lemmy.world 12 points 10 months ago

freezes

stands there with my credit card in my hand while the cashier stares at me awkwardly

[–] joneskind@beehaw.org 8 points 10 months ago (1 children)

It really is a risky bet to make.

I doubt a full-price RTX 4080 SUPER upgrade will be worth it over a discounted regular RTX 4080.

SUPER upgrades have never crossed +10%.

I'd rather wait for the Ti version.

[–] Kit@lemmy.blahaj.zone 3 points 10 months ago

Meh I'm still gonna buy a 4070 Ti on Black Friday. Wish I could wait but my other half wants a PC for Christmas.

[–] dellish@lemmy.world 2 points 10 months ago (3 children)

Perhaps this is a good place to ask now the topic has been raised. I have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card, and I am SO sick of 500MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

[–] vivadanang@lemm.ee 3 points 10 months ago (1 children)

have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card and I am SO sick of 500MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

In a laptop? Practically none. There are some very rare "laptops" out there, really chonk-tops, that have full-size desktop GPUs inside them. The vast majority, on the other hand, have "mobile" versions of these GPUs that are basically permanently connected to the laptop's motherboard (if not on the mobo itself).

One example of a laptop with a full-size GPU (legacy, these aren't sold anymore): https://www.titancomputers.com/Titan-M151-GPU-Computing-Laptop-workstation-p/m151.htm Note the THICK chassis; that's what you need to hold a desktop GPU.

[–] dellish@lemmy.world 1 points 10 months ago

Well that sucks, but unfortunately I'm not too surprised.

[–] chemsed@lemmy.ca 1 points 10 months ago

In my experience, AMD is not more reliable with updates. I had to clean-install three times to get my RX 6600 to function properly, and months later I have a freezing issue that may be caused by my GPU.

[–] gazab@lemmy.world 1 points 10 months ago

You could use a separate external GPU if you have Thunderbolt ports. It's not cheap and you sacrifice some performance, but it's worth it for the flexibility, in my opinion. Check out https://egpu.io/

[–] state_electrician@discuss.tchncs.de 2 points 10 months ago (2 children)

Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs but I am wondering if long-term something like an external GPU might be worth it.

[–] baconisaveg@lemmy.ca 6 points 10 months ago

A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I've seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.

[–] AnotherDirtyAnglo@lemmy.ca 3 points 10 months ago

Generally speaking, buying outright is always cheaper than renting, because you can always continue to run the device potentially for years, or sell it to reclaim some capital.
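The buy-vs-rent tradeoff above reduces to simple break-even arithmetic. A quick sketch; all the prices here are illustrative assumptions, not real quotes:

```python
# Hypothetical break-even: one-time cost of a used GPU vs. hourly cloud
# rental, accounting for resale. All figures are assumed for illustration.
used_gpu_cost = 800.00      # USD for a used RTX 3090 (assumed)
resale_value = 400.00       # USD recouped by selling it later (assumed)
cloud_rate_per_hour = 0.50  # USD/hour for a comparable cloud GPU (assumed)

# Hours of use after which buying becomes cheaper than renting.
break_even_hours = (used_gpu_cost - resale_value) / cloud_rate_per_hour
print(break_even_hours)  # 800.0
```

Under these assumed numbers, anything past a few hundred hours of use favors buying, which is why heavy LLM/StableDiffusion users tend to come out ahead with owned hardware.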
