I wonder how many people can even tell the difference between 144Hz and 480Hz apart from the extra heat and screaming GPU fans.
It's actually very simple, just activate the FPS counter and it will display the frame rate for you. (/s)
Literally zero, but they'll constantly tell you how amazing the difference is.
I definitely can’t tell the difference for anything above 120Hz or so, but I recall reading an article about Counter-Strike several years ago which showed that pro players see an increase in performance with higher refresh rates up to about 300-350Hz (as long as you have the FPS to match it, lol).
At that point, was it really the rendering speed, or just the finer game engine granularity that made a difference?
What is "game engine granularity" supposed to mean?
Unless they fucked up the test, the only difference is how fast pictures arrive on the screen. If the test showed that pro players were able to tell a difference, it's reasonable to assume that this is actually the case, unless you can show a flaw in their test setup.
A frame is not just the picture arriving at the screen; it's everything from input processing to game logic to rendering to the picture arriving at the screen. What the other commenter was saying is that things like input lag and game-logic smoothness should affect player performance as well. In fact, you can isolate those variables with an unlocked frame rate, where you can get a frame rate in the 250s on a 144Hz monitor, and pros still see an improvement in that case because those hidden subcycles are smoothing out the non-visual calculations.
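To make that concrete, here's a minimal sketch of a naive single-threaded game loop (all the function names are made-up placeholders, not any real engine's API). Input gets sampled once per frame, so running at 250 FPS on a 144Hz panel still means inputs are consumed more often, even though not every rendered frame reaches the screen:

```python
import time

def poll_input():
    # hypothetical placeholder: read current mouse/keyboard state
    return {}

def update_game(state, inputs, dt):
    # hypothetical placeholder: advance game logic by dt seconds
    return state

def render(state):
    # hypothetical placeholder: draw the frame; the display only shows it
    # on its next refresh, but the inputs above were already consumed
    pass

state = {}
prev = time.perf_counter()
while True:
    now = time.perf_counter()
    dt, prev = now - prev, now
    inputs = poll_input()                   # sampled once per frame...
    state = update_game(state, inputs, dt)  # ...so more FPS means fresher inputs,
    render(state)                           # even if the panel can't show every frame
```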
Sure, but why do you expect the input lag to be different for the different monitors that were tested? If that's the case, we should be able to point at those differences in the test setup, instead of saying "yeah, they were probably just too dumb".
The game logic also shouldn't be different, as CS logic hasn't been reliant on frame rates for a long time (if ever).
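(For the curious, that decoupling is usually done with the classic fixed-timestep loop sketched below. 64 is the commonly cited tick rate for official CS servers; the function names are placeholders, not actual engine code.)

```python
import time

TICK_RATE = 64            # fixed simulation rate, independent of frame rate
TICK_DT = 1.0 / TICK_RATE

def simulate(state, dt):
    # hypothetical placeholder: advance game logic by one fixed tick
    return state

def render(state, alpha):
    # hypothetical placeholder: draw, interpolating between ticks by alpha (0..1)
    pass

state, accumulator = {}, 0.0
prev = time.perf_counter()
while True:
    now = time.perf_counter()
    accumulator += now - prev
    prev = now
    while accumulator >= TICK_DT:     # logic always advances in fixed 1/64s steps
        state = simulate(state, TICK_DT)
        accumulator -= TICK_DT
    render(state, accumulator / TICK_DT)  # rendering runs as fast as the GPU allows
```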
I can’t run most games much above 100Hz with a 3090.
I've got the 45" ultrawide 240Hz OLED that LG makes. That's plenty fast enough.
What's the point if the human eye can't see beyond 30fps?
It’s okay, I understood your comment without you needing to put that stupid “/s” in
I thought the joke was obvious but the number of downvotes says otherwise.
Don't display downvotes, save that cortisol
It was obvious to me!
That's not entirely correct. Have you tried comparing a 60Hz and a 165Hz monitor? The difference is not slight.
Number go up
Presenting the human eye with the most up-to-date frames provides a (however slight, but still measurable) performance edge over slower refresh rates.
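Some back-of-envelope numbers on how slight (this rough sketch only counts the display's refresh interval and ignores render and input latency entirely):

```python
# Rough estimate: on average, the image you're looking at is about half a
# refresh interval old (ignoring everything upstream of the panel).
for hz in (60, 144, 240, 360, 480):
    frame_ms = 1000 / hz
    print(f"{hz:>3}Hz: frame time {frame_ms:5.2f}ms, avg image age ~{frame_ms / 2:.2f}ms")
```

Going from 144Hz to 480Hz cuts the average age of what's on screen from about 3.5ms to about 1ms. A real edge, but a slim one.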