I am probably unqualified to speak about this, as I am using a low-profile RX 550 with a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and for most of that time it made sense, as anyone who has looked at an older game can confirm - I'm someone who has fun making fun of weird-looking 3D people.

But I feel game graphics have reached the point of diminishing returns. Today's AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn't need anything more powerful than a 1080 Ti for years. I think game studios should slow down their graphical improvements, as they are unnecessary - in my opinion - and only prevent people with lower-end systems from enjoying games. And who knows, maybe we will start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TL;DR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is no big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right one (5 years old)?

Note 2: this is not much more than a discussion starter and is unlikely to evolve into something larger.

[–] Doods@infosec.pub 15 points 1 year ago

I didn't mean we should stop improving; what I meant is we should focus more on efficiency and less on raw power.

[–] mcforest@kbin.social 11 points 1 year ago

I think game and engine developers should do both. If it's possible to improve efficiency and performance, it should be done. But at the same time, hardware is improving as well, and that performance gain should be used.

I'm kinda worried a little bit about recent developments in hardware, though. At the moment, GPU power mostly increases with energy consumption and only a little with improved architecture. That was different some years ago. But in my eyes, that's a problem the hardware manufacturers have, not the game developers.

[–] icesentry@lemmy.ca 8 points 1 year ago

Performance is always about doing as much as possible with as little as possible. Making a game run faster automatically makes it more efficient, because the only way it can run faster is by doing less work. It's just that whenever it can run faster, the game has more room for other things.

[–] squaresinger@feddit.de 2 points 1 year ago

Resource consumption and graphics output are directly linked. If you gain more efficiency, that gives you headroom to either reduce resource consumption or increase graphics output, but you can't maximize both. You have to decide what to do with the resources you have: use them or don't. You can't do both at the same time.