This is getting out of hand. The other day I saw the requirements for the game "Hell is Us" and they're ridiculous. My RX 6600 can't play anything anymore. I've downloaded several PS3 ROMs and now I'm playing old games. So much better than this insanity. This is probably what I'm going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don't want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors from Facebook Marketplace. Got both monitors for $120.
They go for $800 used on Amazon. Great monitors and I love 4k. I also bought an AMD RX 6600 GPU for $100 from Facebook. It was almost new; the owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can't afford these high-end GPUs, but now very few games work even on low settings, and I get something like 20 FPS max. My friend gave me access to his Steam library and I wanted to play Indiana Jones the other day, and it was an "omfg, wtf is this horrible shit" moment.
I'm so sick of this shit!
I don't regret buying any of these, but man it sucks that the games I want to play barely even work.
So, now I'm emulating and it's actually pretty awesome. I missed out on so many games in my youth, so now I'm just going to catch up. Everything works in 4k now, I'm getting my full 60 FPS, and I'm having so much fun.
Idk man, I bought Sol Cesto yesterday, and I'm pretty sure my toaster could run it
Edit:
Bruh. I have a 3080 Ti and barely feel comfortable running my games in 2k. I'm pretty sure the 6600 was made with only 1080p and lower in mind.
I dunno, my 3080 Ti runs 4k at 90+ FPS no problem. Maybe not all games, but most decently modern games.
I know, dude. That's my whole point. Why do WE have to bear the burden of optimizing the game? Why don't the developers optimize their games? It's bonkers that "Hell is Us" listed the 4090 as the minimum requirement to run 4k at 30 FPS. I was like, wut?
Because running 4k is extreme. Asking a game to run well at 4k is asking the developers to push four times the pixels for the same processing cost (quick pixel math at the end of this comment). You're saying you want the vast majority of people who don't have a 4k setup to have their games downgraded so they'll run well on your boutique monitor.
It's a balancing act, and they can either make the game look like something from 2010 on all systems just to make sure it runs in 4k on older cards, or they can design it to look good in 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I'll keep running new games on my older card at 1080p and be perfectly happy with it.
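For what it's worth, here's the raw math behind the "quadruple the pixels" point, as a quick Python sketch (the resolutions are just the standard 16:9 ones, nothing from the games themselves):

```python
# Rough pixel counts for common 16:9 resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k": (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

That prints 4.00x for 4k, so a card that was comfortable at 1080p suddenly has four times the work per frame.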
(And then you have portable boxes that somehow advertise running games at 4k 60 FPS for $499 🤣🤣)
That's why I went back to the roots. I'm now playing older games at 4k 60 FPS no problem. I'll stick with emulators. I'd rather not spend the $700. I'll still complain about new games not running for me, though. That's the only thing I can do besides playing older games.
Or just run newer games at 1080p. Unless you're unhealthily close to the monitor you probably won't even see the difference.
If you're rubbing it on a TV across the room, you probably literally can't see the difference.
I do run them at 1080p, trust me. Here's the thing, though: running 1080p on a native 4k screen makes for a horrible-looking picture. It just looks off and very bad. Try it if you can. It's best when the screen itself is physically 1080p. I think you fat-fingered the "b" in "running". Came out funny.
1080p scales to 4k perfectly unless you have a weird aspect ratio, since the display can just treat each square of 4 screen pixels as 1.
What looks bad is trying to run anything between 1080p and 4k, since it's not a perfect 4:1 relationship.
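To illustrate the "perfect 4:1" point, here's a small sketch (again assuming standard 16:9 resolutions):

```python
# Which resolutions map onto a 4k (3840x2160) panel by a whole-number factor?
# An integer factor means each source pixel becomes an exact block of screen
# pixels; a fractional factor forces the scaler to blend neighboring pixels.
target_w, target_h = 3840, 2160
sources = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for name, (w, h) in sources.items():
    fx, fy = target_w / w, target_h / h
    exact = fx.is_integer() and fy.is_integer()
    print(f"{name} -> 4k: {fx}x wide, {fy}x tall, integer mapping: {exact}")
```

1080p comes out as a clean 2x in both directions, while 1440p is 1.5x, which is why it ends up blurred when stretched to a 4k panel.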
You'll want to use Lossless Scaling. It'll quadruple the pixels without any filtering, so the output doesn't look weird on a 4k display.
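If you're wondering what "quadruple the pixels without any filtering" means in practice, here's the general idea using Pillow's nearest-neighbor resize. This is just an illustration of integer scaling, not a claim about how Lossless Scaling works internally:

```python
from PIL import Image  # pip install Pillow

# Build a tiny 2x2 test image so the pixel duplication is easy to see.
src = Image.new("RGB", (2, 2))
src.putdata([(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)])

# Nearest-neighbor at an exact 2x factor copies every source pixel into a
# 2x2 block of output pixels -- no blending, so edges stay sharp.
up = src.resize((src.width * 2, src.height * 2), Image.NEAREST)
print(list(up.getdata()))  # each original color now appears four times, in 2x2 blocks
```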
Elaborate, please. What res would that be?
It's not bonkers, though. Fill rate (how fast the GPU can render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can't rely on any simple tricks to optimize the rendering pipeline anymore: there's so much detail on screen that every single pixel can potentially change completely at any given moment, and also be very different from its neighbors. Hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on "kind of". (Some rough throughput numbers at the end of this comment.)
If you don't want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor; you'll have a much easier time running high-end games. Or run your games at a lower res, but that usually looks bad.
I personally experienced this firsthand when I upgraded from 1080p to 1440p a while ago: suddenly none of my games could run at max settings at my native resolution, even though it was perfectly fine before. Saw the same problem on a bigger scale when I replaced my 1440p monitor with a 4k one at work before we'd received the new GPUs.
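Here are those rough throughput numbers, as a back-of-the-envelope Python sketch. It assumes a 60 Hz target and the worst case where every pixel is redrawn each frame, which is the situation described above:

```python
# Pixels the GPU has to produce per second at a given resolution and frame rate,
# assuming the worst case where every pixel changes every frame.
def pixels_per_second(width, height, fps):
    return width * height * fps

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}.items():
    print(f"{name} @ 60 FPS: {pixels_per_second(w, h, 60) / 1e6:.0f} million pixels/second")
```

That's roughly 124 million pixels a second at 1080p versus about 498 million at 4k, before any of the per-pixel shading work even starts.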
I'll just play older games. Everything runs at 4k 60 FPS no issue on RPCS3 and I've been having a freaking blast. Started with Uncharted 1, and man, I'd been missing out on this game. I'm going to stick with older games.