this post was submitted on 07 Mar 2024
106 points (93.4% liked)

[–] dangblingus@lemmy.dbzer0.com 60 points 8 months ago (2 children)

Okay, so all they're saying is that they won't certify any monitor under 144Hz for FreeSync. Okay? That basically changes nothing. If you're using a 60Hz monitor, basic vsync is all you need.

[–] Perfide@reddthat.com 20 points 8 months ago (2 children)

Basic vsync worsens response time, often by a lot. I'd take screen tearing over vsync.
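For a rough sense of scale, here's a back-of-the-envelope sketch (illustrative numbers only, not a measurement of any particular game or driver) of how much waiting vsync can add at different refresh rates:

```python
# Rough, illustrative numbers only: with vsync on, a finished frame can wait
# for the next refresh before it's shown, and a missed deadline costs a whole
# extra refresh interval on top of that.
for hz in (60, 120, 144, 240):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz: refresh every {frame_ms:5.2f} ms, "
          f"a missed vsync deadline adds up to ~{frame_ms:5.2f} ms more")
```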

[–] Sibbo@sopuli.xyz 9 points 8 months ago* (last edited 8 months ago)

True. For FPS games I often do that too. For top-down view games it's often nicer with vsync. Even triple buffered.

[–] snugglesthefalse@sh.itjust.works 5 points 8 months ago

I'd take response time over screen tearing.

[–] Sibbo@sopuli.xyz 13 points 8 months ago (1 children)

Their FreeSync is probably mostly ineffective at 60Hz but works better at higher refresh rates, and the marketing department just put a good spin on that.

[–] Still@programming.dev 12 points 8 months ago (1 children)

I have a monitor that locks to 33-92Hz when FreeSync is enabled (144Hz otherwise). It's way more useful at lower fps values than at higher ones.

[–] Sibbo@sopuli.xyz 4 points 8 months ago

Interesting. Yeah, my comment was just a shitpost. But what you're saying makes what AMD is doing seem completely nonsensical.

Then again, maybe they just want to push faster video cards, and FreeSync is something people care about. That way they can still sell better cards to people who don't know that FreeSync is less useful at higher refresh rates.
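As a rough sketch of why a range like the 33-92Hz one mentioned above helps most at low frame rates: FreeSync's low framerate compensation (LFC) can repeat frames so the panel stays inside its VRR window. The range and logic below are simplified for illustration, not how any particular driver actually implements it:

```python
# Simplified illustration of a FreeSync range like 33-92 Hz: if the game's
# frame rate falls below the panel's minimum, low framerate compensation (LFC)
# can repeat each frame so the effective refresh stays inside the window.
VRR_MIN, VRR_MAX = 33, 92

def effective_refresh(fps: float) -> float:
    if fps >= VRR_MIN:
        return min(fps, VRR_MAX)          # panel just follows the game
    multiplier = 1
    while fps * (multiplier + 1) <= VRR_MAX:
        multiplier += 1                   # e.g. 20 fps -> shown at 80 Hz (x4)
    return fps * multiplier

for fps in (20, 30, 45, 60, 100):
    print(f"{fps:>3} fps -> panel refreshes at ~{effective_refresh(fps):.0f} Hz")
```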

[–] Kyrgizion@lemmy.world 29 points 8 months ago (3 children)

4K at 60Hz until I die. Or until I can afford 4K at 144Hz. Which'll probably be around the same date, give or take.

[–] CanadianCorhen@lemmy.ca 38 points 8 months ago (1 children)

I'm very much a 1440p 144hz guy.

I'd like 4k, but would take this compromise for now.

I want my next screen to be a 4K 144Hz OLED.

[–] domi@lemmy.secnd.me 5 points 8 months ago

1440p high refresh rate gamers unite. I have an Alienware AW3423DWF and boy are those new OLED panels beautiful. Expensive but beautiful. I still remember playing Left 4 Dead right after I got it and even without HDR I was baffled by the credits at the end of the match. Just white text floating in nothingness.

They also recently released the AW3225QF, which is 4K@240Hz.

[–] Kolanaki@yiffit.net 23 points 8 months ago

Buck up, chum. Maybe the world will go to shit and you can experience 144hz @ 4K after the looting starts.

[–] noobnarski@feddit.de 5 points 8 months ago (1 children)

I just bought a 4k 144hz screen, and let me say, it is worth the price.

[–] cyberpunk007@lemmy.ca 1 points 8 months ago (1 children)

I have a 2.5K ultrawide 144Hz. Even when this PC was new it struggled with games of that era :(. We need better graphics cards that don't cost the price of a mortgage.

[–] noobnarski@feddit.de 1 points 8 months ago

Yeah, I invested in a 4070, I wish I could get a 4090 for that price.

[–] mox@lemmy.sdf.org 16 points 8 months ago* (last edited 8 months ago) (1 children)

FreeSync is for variable refresh rates, which 60Hz monitors generally don't support anyway. So this headline is nothing but clickbait.

Also, I don't know of any sub-120Hz VRR monitors that are still being made, but if they exist, they're not aimed at anyone who cares about FreeSync branding.

So this whole article is a pointless waste of time.
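If you want to check whether your own display actually reports VRR support, here's a minimal sketch assuming a Linux/X11 desktop where the driver (amdgpu, for example) exposes a `vrr_capable` output property through xrandr; on other setups the property name and tooling will differ:

```python
# Quick-and-dirty check (assumes Linux/X11 with a driver such as amdgpu that
# exposes a "vrr_capable" output property via `xrandr --props`; adjust for
# your own setup).
import subprocess

out = subprocess.run(["xrandr", "--props"], capture_output=True, text=True).stdout
current = None
for line in out.splitlines():
    if line and not line.startswith((" ", "\t")):
        current = line.split()[0]                 # output name, e.g. DP-1
    elif "vrr_capable" in line and current:
        print(current, "->", line.strip())
```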

[–] notfromhere@lemmy.ml 3 points 8 months ago

I recently got a 75Hz 1080p monitor with FreeSync branding from Costco, so yeah, they are still made.

[–] Fiivemacs@lemmy.ca 15 points 8 months ago (1 children)

Provided the 60Hz monitors still work, who cares... if they do some arbitrary bullshit to stop stuff from working just for profit, then get fucked. I personally don't care about their certification or claims.

[–] SchmidtGenetics@lemmy.world 15 points 8 months ago (5 children)

FreeSync is an open standard, so it wouldn't be profit motivated.

[–] ArbitraryValue@sh.itjust.works 4 points 8 months ago (10 children)

Maybe I'm weird because anything over 20 FPS looks smooth to me (and I know it doesn't to other people) but what's the point of going over 60 FPS? Can anyone actually see the difference or is this just a matter of "bigger numbers must be better"?

[–] CaptainEffort@sh.itjust.works 13 points 8 months ago (1 children)

There's a huge difference once you use it for long enough. I have a 144Hz monitor and love getting to play games at that refresh rate, they're so smooth! If you play long enough the difference becomes night and day.

[–] nik282000@lemmy.ca 0 points 8 months ago (2 children)

There's a huge difference once you use it for long enough.

sus. If you can't notice a big difference right away then what difference IS there?

[–] CaptainEffort@sh.itjust.works 2 points 8 months ago* (last edited 8 months ago)

It basically just takes your eyes a second to adjust; the same is true going from 30 fps to 60. If you only play at 30 you might not notice the leap when going to 60, but once you start playing at 60 regularly your eyes will adjust, and 30 will start to look choppy. Once that happens the difference becomes easy to point out, and you'll be able to appreciate the frame increase.

Eventually it’ll become night and day, taking zero effort to notice the difference between frame rates, and the difference being a massive deal. I still only play at 1080p, for example, so I can play at 120-144 fps consistently, as that smoothness is infinitely more important to me than a sharper image.

[–] Buddahriffic@lemmy.world 1 points 8 months ago

It's hard to say exactly what it is, but if my monitor ends up getting set to 60hz because some game has weird defaults, I'll notice that something is off until I change the setting and get it back to 144hz. Maybe it's the monitor itself being tuned for 144hz so the pixels fade a bit before they get refreshed? Or maybe my eyes/brain can tell the difference after getting used to the higher refresh rate.

I think it is different for games that are fps locked to 60fps while the monitor is set to 144hz, which suggests that it might be the fading thing (or something similar).

Though I did notice a big difference in overwatch when I upgraded my GPU from one that would get fps in the range of 60-90 to one that could consistently get over 120.

It's really hard to quantify your own senses, so all I can say for sure is that I definitely notice it when my monitor is set to 60hz instead of 144hz.

[–] glimse@lemmy.world 12 points 8 months ago (1 children)

Lucky you. Seriously. I wish I didn't care because it means displays are more expensive for me.

I definitely thought it was all hype, but once I saw games at 120+ fps, even 60fps looks choppy to me. I also very much notice the difference between 30fps and 60fps video, but 120fps (at full speed) didn't do much for me.

For what it's worth, I was a professional video editor for years so I'm a bit more inclined to notice than the average person

[–] Formes@lemmy.ca 1 points 8 months ago

I'm kind of in that boat - digital art and so on matter more to me. I never understood buying a computer monitor over about 22" that was only 1080p. I want decent colour reproduction - I get it, it won't be perfect unless you spend a fortune, but it should at least be decent.

120hz w/ good HDR support is fantastic for content that supports it, and 240hz is just buttery smooth. Variable refresh is pretty much a must for modern gaming.

[–] TheOneCurly@lemm.ee 11 points 8 months ago

There are diminishing returns but I can absolutely tell the difference between my 165Hz display and my wife's 240Hz.

[–] RaoulDook@lemmy.world 9 points 8 months ago

It's very easy to tell the difference when you see them in person. I have a 60Hz monitor and a 144Hz monitor on the same PC, and you can drag a window across the desktop from one to the other; the lack of animation frames going from 144 to 60 makes the movement look choppy on the slower one. In games, the animation becomes smooth to the point of being lifelike and visually vibrant when your framerate can reach 90-100 or more FPS.

[–] Fermion@mander.xyz 8 points 8 months ago* (last edited 8 months ago)

It really depends on what and how you play. If reaction time is important then you'll feel, more than see, the difference in refresh rates. If none of your games require sub-second reaction-time accuracy, then it's much more of a nice-to-have luxury than a game changer.

Also, frametime pacing matters a lot. If your system very consistently puts out 30 fps, you'll have more accurate keypresses than if you normally get 50 but it hangs on a few frames and dips to 30 fps. Your nervous system adapts pretty well to consistent delay, but it's much harder to compensate for delay that varies a lot.
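To put toy numbers on the pacing point (values are made up, purely illustrative):

```python
# Toy comparison: a steady 30 fps has identical frame times, while "usually
# 50 fps with a few hitches" has a higher average frame rate but much larger
# frame-time variation, which is what you feel as stutter.
from statistics import mean, pstdev

steady_30 = [1000 / 30] * 60                      # 60 frames at 33.3 ms each
hitchy_50 = [1000 / 50] * 55 + [1000 / 15] * 5    # mostly 20 ms, a few 66.7 ms spikes

for name, times in (("steady 30 fps", steady_30), ("hitchy ~50 fps", hitchy_50)):
    print(f"{name}: avg {mean(times):.1f} ms, jitter (stdev) {pstdev(times):.1f} ms")
```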

I don't really play first person shooters so resolution matters more to me than framerate.

[–] TheSambassador@lemmy.world 7 points 8 months ago

For 3d games where the whole screen is moving and changing as the camera moves, I've noticed a big difference between 60 and 144. It just makes the game feel absurdly smooth.

For smaller games with more static views it doesn't really make much difference.

It mostly depends on the speed of the game.

[–] ramjambamalam@lemmy.ca 6 points 8 months ago

Here's a site which nicely demonstrates the effect: https://www.testufo.com/

[–] Incandemon@lemmy.ca 2 points 8 months ago

I don't play competitive games so I don't need the extra shooting accuracy. What I have found is that the higher refresh rate has made panning maps in RTS games or looking around quickly in FPS games much smoother. It's an overall nicer experience, but not really any better for gaming than 60Hz.

[–] cevn@lemmy.world 2 points 8 months ago

Occasionally my fps gets set to 60. As soon as I start playing Rocket League I can tell it is off. I went to a friend's house and asked why everything was so choppy; I checked his monitor settings and it was set to 60 instead of 144. There are people who can see the difference.

[–] ILikeBoobies@lemmy.ca 0 points 8 months ago

I can tell you I notice no difference between 240 and 60

Stability is all that really matters

[–] someguy3@lemmy.ca 3 points 8 months ago* (last edited 8 months ago) (3 children)

What does "certification" mean? It won't work?

[–] gravitas_deficiency@sh.itjust.works 5 points 8 months ago* (last edited 8 months ago) (1 children)

It’s basically ~~ATI~~ AMD (lol) saying “this product meets or exceeds the required hardware standards to be granted this label”.

[–] uninvitedguest@lemmy.ca 14 points 8 months ago (1 children)

ATI is a name I haven't seen in a while 🙂

[–] BradleyUffner@lemmy.world 4 points 8 months ago* (last edited 8 months ago)

It means that they are allowed to put another sticker on the monitor.

[–] CountVon@sh.itjust.works 2 points 8 months ago

To be "FreeSync certified", a monitor has to have certain minimum specs and must pass some tests regarding its ability to handle Variable Refresh Rate (VRR). In exchange for meeting the minimum spec and passing the tests, the monitor manufacturer gets to put the FreeSync logo on the box and include FreeSync support in its marketing. If a consumer buys an AMD graphics card and a FreeSync certified monitor then FreeSync (AMD's implementation of VRR) should work out of the box. The monitor might also be certified by Nvidia as GSync compatible, in which case another customer with an Nvidia graphics card should have the same experience with Gsync.

[–] stanka@lemmy.ml 1 points 8 months ago (1 children)

What does this mean for standard TVs that people use for gaming? LG/Sony/Samsung OLEDs tend to be able to do 4K@120Hz, having native 120Hz panels. Maybe this only covers "monitors" getting FreeSync certified.

[–] Dudewitbow@lemmy.zip 2 points 8 months ago (1 children)

Those handle VRR with the HDMI 2.1 hardware spec, which is a little different from the traditional method of VRR.

It's the main reason current-gen consoles have VRR (through the HDMI 2.1 spec).

[–] stanka@lemmy.ml 1 points 8 months ago (1 children)

Rtings says that LG TVs (the B2 at least) support VRR via several standards: HDMI 2.1, FreeSync, and G-Sync. I have a console hooked up, but no GPU good enough in a PC.

[–] Dudewitbow@lemmy.zip 1 points 8 months ago* (last edited 8 months ago) (1 children)

It's FreeSync/G-Sync over the HDMI 2.1 standard. Nvidia does not have G-Sync over HDMI on standard (non-2.1) HDMI connections; no non-2.1 HDMI monitor/TV will accept VRR over HDMI on Nvidia. Only AMD had FreeSync over HDMI (on very low-end budget monitors).

G-Sync Compatible is basically G-Sync over the DisplayPort standard. G-Sync Ultimate runs over the FPGA module, which uses DisplayPort as a medium.

[–] stanka@lemmy.ml 1 points 8 months ago (1 children)

I would love to learn more about this. Know of any technical papers or references?

[–] Dudewitbow@lemmy.zip 2 points 8 months ago

I don't know about technical documents per se, but here's a news article from when AMD introduced VRR over HDMI a ways back, noting how VRR on HDMI wasn't a thing yet, so AMD partnered with monitor makers to use a different scaler that would make it compatible with FreeSync.

VRR over DisplayPort would be in the DisplayPort 1.2a specification. VRR over HDMI (officially) is under the HDMI 2.1b spec.
