
Refresh rate question

DanDan825

1 month ago

Just to clear it up, CAN you see any difference if your game goes above 60 fps on a 60 hz display?

Comments

  • 1 month ago
  • 3 points

Short answer is no.

Long answer is, it still helps your visual experience to render more than 60 fps on average even if your monitor is 60 Hz. Shooting for a higher performance level benefits you in a tangential way: you won't see any difference on average, but if you are able to render a higher FPS, you have more of a "buffer", so to speak, for when things get really hairy in-game.

For example, if you're averaging exactly 60 fps in game, that's fine, but when you hit an unusually dense/active scene, let's say a bunch of explosions go off and a building collapses or something (and this isn't part of the average gameplay), you are pretty much guaranteed to dip substantially below 60 fps and you're going to feel it. On the other hand, if you're averaging 90 fps or so, you won't see a difference during average gameplay, but when that exploding-building scene occurs you have a "buffer" of fps between your average and 60 that you can dip into without it really impacting your visual experience.

The same rationale applies from the perspective of frametimes: if you are rendering frames quicker on average (because your FPS is higher), then when some extra-long frame render time comes up, the stutter likely won't be as noticeable. If your frametimes are all over the place, though, it doesn't matter what your fps is.
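To put rough numbers on that "buffer" idea (my own back-of-the-envelope arithmetic, not something from the comment above):

```python
# Rough frametime arithmetic to illustrate the "buffer" idea.
# A 60 Hz display shows a new frame every 1000 / 60 ~= 16.7 ms.

def frametime_ms(fps):
    """Average time per frame in milliseconds at a given fps."""
    return 1000.0 / fps

display_budget = frametime_ms(60)   # ~16.7 ms per refresh on a 60 Hz panel

for avg_fps in (60, 90, 144):
    ft = frametime_ms(avg_fps)
    headroom = display_budget - ft
    print(f"{avg_fps:>3} fps average -> {ft:5.1f} ms/frame, "
          f"{headroom:4.1f} ms of headroom before a spike misses a refresh")

# At exactly 60 fps there is zero headroom, so any heavy scene pushes a frame
# past the 16.7 ms budget and you feel the dip; at 90 fps average you have
# several milliseconds of slack per frame before that happens.
```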

  • 1 month ago
  • 1 point

Thank you!

  • 1 month ago
  • 2 points

A 60 Hz display repaints the screen 60 times a second, period. If the computer is going faster, I suppose you might see some artifacts, but I can't think of any way that you could see an actual improvement.

If anyone has proof or a claim to the contrary, I'd be curious how it works.

  • 1 month ago
  • 1 point

In general that means you'd have Vsync turned off, so you might see tearing and such. If you had Vsync turned on and your machine could always run the game above 60 FPS, you'd have a stable 60 FPS, which you might be able to notice a tiny bit, since you'd never have dips.

But I'm not sure there's a definitive answer, and in my experience people notice different things. Some people will say yes, some will say no, and they'll have reasons. So the best test would be to try it yourself and decide, but also be open to the possibility that while you might notice a difference, someone who's not looking for it may not see it, or it might be too small for them to care about on any level, so it's effectively no difference to them.
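To make the "stable 60 with Vsync" point above concrete, here's a toy sketch (my own illustration with made-up render rates, not a measurement):

```python
# Toy illustration: with Vsync on, a 60 Hz display only ever shows 60 fps,
# so a machine rendering faster than 60 just gets capped, while a machine
# that dips below 60 shows the dip.

REFRESH_HZ = 60

def displayed_fps(render_fps, vsync=True):
    """Very simplified: Vsync caps displayed output at the refresh rate."""
    return min(render_fps, REFRESH_HZ) if vsync else render_fps

for scene_fps in (120, 90, 61, 45):   # hypothetical render rates in different scenes
    print(f"render {scene_fps:>3} fps -> display shows {displayed_fps(scene_fps)} fps")

# A rig that never drops below 60 shows a perfectly steady 60 with Vsync on;
# one that averages 60 but dips to 45 will visibly stutter at those dips.
```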

  • 1 month ago
  • 1 point

Like gorkti200 has already said, the short answer is no. But even if you can't really see the difference, you can reduce input lag by keeping your fps higher than the refresh rate of your monitor. I'm not sure where I read this, but I think the rule of thumb is double the Hz (refresh rate) of your monitor. Additionally, if you're going to enable Vsync, just know that it will also increase input lag. I could be wrong, but I've done quite a bit of research on this kind of stuff since I was having some issues with my Overwatch settings.
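Rough arithmetic for why higher fps can still mean fresher input on a 60 Hz panel (my own simplified sketch; the real input pipeline has more stages than this):

```python
# Simplified sketch of why fps above the refresh rate can reduce input lag:
# the frame the 60 Hz monitor scans out is, on average, about half a frametime
# old, so shorter frametimes mean fresher input in the displayed frame.
# These are idealized averages, not measurements.

def avg_frame_age_ms(render_fps):
    """Average age of the newest completed frame when a refresh happens."""
    return (1000.0 / render_fps) / 2

for fps in (60, 120, 240):
    print(f"{fps:>3} fps -> newest frame is ~{avg_frame_age_ms(fps):.1f} ms old at scan-out")

# Running fps at roughly double the refresh rate (the rule of thumb mentioned
# above) roughly halves that age: ~8.3 ms at 60 fps vs ~4.2 ms at 120 fps.
```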

  • 1 month ago
  • 1 point

Well, our eyes don't see in FPS the way our screens do, and everything we see is processed by the brain; that's largely how optical illusions work, by taking advantage of side effects of that processing.

Now, whether you can see the difference between 60 fps and 144 fps also depends on a few things. Typically fast-moving objects are where your brain notices the difference at higher FPS. If you have a 144 Hz display, give this a test; it's a frame rate test that shows a 144/72/36 fps comparison on a 144 Hz display, and I can easily see a difference between 144 and 72. Even when playing games like DOOM (2016) it was a night and day difference between my old 60 Hz display and my new 144 Hz display when it came to aiming, because moving objects were smoother and it was easier to nail headshots.

As for how your brain processes it: above roughly 15 fps, your brain takes all the images and blends them into a moving picture, basically to save on processing power. Sure, there are limitations, and workarounds for those limitations that the television sector has used for years. One such limitation is that when something moves fast on the screen, it can cover a good distance between frames, and if that gap is large enough the brain may perceive it as a jump rather than continuous movement. Adding a blur effect (motion blur) can make that jump less perceivable, but it also makes the object harder to see. Again, if something is moving slowly (or not at all), you likely won't be able to tell the difference between 30 and 60 fps, let alone 144.

Now, things in games can move fast enough on your screen that you might be able to see those jumps at 60 Hz, so 144 Hz can deliver a better experience. There are 240 Hz displays out there too, but the time between frames at 144 Hz is already far smaller than at 60 Hz, so the jumps on fast objects aren't nearly as large as they are at lower framerates.
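To put example numbers on those jumps (assuming, purely for illustration, an object crossing a 1920 px wide screen in half a second):

```python
# How far a fast-moving object jumps between consecutive frames at different
# refresh rates.  Assumed scenario: something crossing a 1920 px wide screen
# in 0.5 s, i.e. moving at 3840 px/s.  Purely illustrative numbers.

SPEED_PX_PER_S = 1920 / 0.5   # 3840 px/s

def jump_per_frame(hz):
    """Pixels the object moves between two consecutive frames."""
    return SPEED_PX_PER_S / hz

for hz in (30, 60, 144, 240):
    print(f"{hz:>3} Hz -> object jumps ~{jump_per_frame(hz):5.1f} px per frame")

# 64 px jumps at 60 Hz vs ~27 px at 144 Hz and 16 px at 240 Hz: the step from
# 60 to 144 Hz shrinks the gap far more than going from 144 to 240 does.
```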

