Royal_Sir

Why do people care about FPS past 120?


A recurring theme I've seen in these forums is the comparison of frames per second, especially in regard to performance. Which is all fine and dandy until I see complaints like:

 

"My computer can only run this at 165 fps"

"How to get 300 fps?"

"At times my frame rate even came as low as 120 fps!"

 

Personally I think past 60 fps is redundant, but for the sake of having a safety factor, let's call it 120 fps. Why would *only* having an fps of 120, a difference you likely couldn't even visually detect, bother you guys?


humans can perceive input delays as low as 2ms

 

1 second divided by 120 is 8 milliseconds and some change
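A quick sanity check of that arithmetic (frame time in milliseconds is just 1000 divided by the frame rate; a throwaway snippet for illustration, nothing more):

```c
#include <stdio.h>

int main(void)
{
    /* frame time (ms) = 1000 ms / frames per second */
    const double fps[] = { 60.0, 120.0, 144.0, 300.0 };

    for (int i = 0; i < 4; i++)
        printf("%5.0f fps -> %5.2f ms per frame\n", fps[i], 1000.0 / fps[i]);

    return 0;
}
```

which prints 16.67 ms at 60 fps, 8.33 ms at 120 fps (the "8 milliseconds and some change" above), 6.94 ms at 144 fps, and 3.33 ms at 300 fps.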

 

 

for more information on how that relates to input delay you could watch this talk by john carmack, the lead programmer of the game this website is about


Don't know how best to describe my take on this, but I'll try anyway.

 

I don't think having "only" 120 fps is the problem in itself, as one might not be able to notice the fps difference visually. But if your fps drops from 300 to 100 or even 60, that surely makes a difference, and it's most noticeable when moving the mouse around, which could be the real reason behind some people's complaints.


What's wrong with 60 fps? My system runs most games at that, and I don't think any further investment to push it to 120 fps or more would be cost-effective.


fps is always interesting. Is it visual frames per second, or is the game running through its logic at 60 fps? Does something need to be rendered faster than we can see it or feel it? Does logic need to run 60 times per second and burden your CPU? If I'm sitting close to a monitor I can notice the slightest drop in frame rate. If I'm sitting further back, suddenly I can't, even with an fps counter to verify that yes, there was a dip. Then again, that's just me.
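The logic-versus-visual question above is usually answered by decoupling the two: game logic advances at a fixed tic rate while the renderer draws as often as it can and interpolates between tics. A rough, generic sketch of that idea (the 35 tics per second is just classic Doom's logic rate, used here for illustration; run_tic, render_frame and so on are stand-in stubs, not any real engine's code):

```c
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <stdbool.h>
#include <time.h>

#define TICRATE     35              /* fixed logic updates per second */
#define TIC_SECONDS (1.0 / TICRATE)

static int  tics_run     = 0;
static long frames_drawn = 0;

/* Stand-in stubs so the sketch compiles on its own. */
static void run_tic(void)              { tics_run++; }
static void render_frame(double alpha) { (void)alpha; frames_drawn++; }
static bool game_running(void)         { return tics_run < 2 * TICRATE; } /* ~2 seconds */

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    double previous = now_seconds();
    double lag = 0.0;

    while (game_running()) {
        double current = now_seconds();
        lag += current - previous;
        previous = current;

        /* Logic only ever advances in fixed TIC_SECONDS steps. */
        while (lag >= TIC_SECONDS) {
            run_tic();
            lag -= TIC_SECONDS;
        }

        /* Rendering happens as often as the hardware allows,
           interpolating between the last two logic states. */
        render_frame(lag / TIC_SECONDS);
    }

    printf("ran %d tics, drew %ld frames\n", tics_run, frames_drawn);
    return 0;
}
```

On a typical machine the frame count dwarfs the tic count, which is the point: the fps counter only measures the visual half of the question.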

 

Games limited to 30 fps are usually capped that way because the frame rate would otherwise drop when things get chaotic. Would you rather have a smooth 30 fps that stays at 30, or 60 fps for 90% of the game until explosions, mayhem and carnage hit and it suddenly dips to 30 or worse because it's doing twice as much work? Chances are programmers should scale back the intensive stuff when said mayhem starts happening.

 

Then there's the Windows popup of "something is taking up resources" for games that use 100% of CPU to maintain 200 - 1,000 fps.

 

As for online multiplayer and smoothness, that stuff matters. If you aren't smooth, the other guy is. If you don't have your FOV cranked up to 110, the guy next to you does.


I require no less than 4200 FPS because this way it's evenly divisible by 24, 25, 30, 35, 50, and 60, providing a smooth experience regardless of game or video being played. Also the HDMI cables need to be gold-plated to make sure the bits stay crisp.

14 minutes ago, Pegg said:

They can with a better monitor.

Methinks I'm due for a new one anyway. No hurry though.

 

@Gez 4200 FPS?! Isn't that a bit overkill?

15 hours ago, Royal_Sir said:

Personally I think past 60 fps is redundant

 

If you have seen the difference between 60 and 100+ fps on a 144Hz screen, you will think otherwise. It's a lot smoother and just looks so much better.

 

If someone wants more fps than their 144Hz monitor can display, then it's probably a CS:GO player at a very high, if not professional, skill level, as higher framerates still give lower input latency, even if the difference is very subtle.


After getting a 144hz monitor, I no longer agree that 60 FPS is a fine max. It was already noticeable, and then when I ran a dual-monitor setup with a 60hz panel and went back and forth with the mouse, it was even more so, especially the ghosting on the cursor.

At this point 60 is my minimum for games (so bye bye 9/10 of my PS4 games), but I couldn't really tell a difference between 120 and 144, just as long as it doesn't have random hitching or stuttering (PUBG).

11 hours ago, Gez said:

Also the HDMI cables need to be gold-plated to make sure the bits stay crisp.

 

You trust the bits to just walk past so much gold without helping themselves to any of it? Tsk!

On 3/17/2018 at 3:45 PM, Pegg said:

They can with a better monitor.

 

Perversely, old CRTs could go beyond 60 fps way before LCD monitors were even seriously considered for desktop computers (in fact, a 60 Hz refresh rate was considered ergonomically too low for a CRT, even in the 1990s, and every monitor and VGA card manufacturer was competing to get the highest possible numbers at the highest possible resolution). It was not unusual for a good quality monitor to go beyond 200 Hz @ 640 x 480, or 100 Hz @ 1024 x 768. Granted, the goal with CRTs was actually flicker reduction, rather than Sup3r Dup3r Fux0r H4x0r 1337 pr0 G4m3r d00d frame rates, but high frame rates and display immediacy just 'came naturally' to CRTs. You needed no special "game grade" gear to have high frame rates and low lag/latency.

 

But when LCDs took over, refresh rates paradoxically took a step back to 60/70 Hz, and it took nearly a decade and specialist "gamer monitors" to break the 100 Hz barrier again. Of course, with an LCD monitor you don't need a high refresh rate to get a flicker-free image, so no one really bothered with it, at first at least. Input lag is another sore point, as an entire frame must be buffered before it can even be displayed, unlike a CRT, which is continuously plotted pixel-by-pixel.
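For a sense of scale: if a panel really does buffer one full frame before scanout, that buffer alone adds, at a 60 Hz refresh,

$$ t_{\text{buffer}} = \frac{1}{f_{\text{refresh}}} = \frac{1}{60\ \text{Hz}} \approx 16.7\ \text{ms} $$

on top of whatever the pixel response and the rest of the pipeline contribute, whereas a CRT starts drawing the signal essentially as it arrives.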


LCDs don't focus as much on refresh rate because they don't handle it the same way a CRT monitor does. Anyway, I don't think the OP used a CRT, let alone one running above 85 Hz, to make that remark about frames per sec (don't make me miss my giant toaster monitor, ffs).


60 FPS is sufficient, but higher FPS is definitely noticeable. Nevertheless, going beyond 120/144 is pointless IMHO. All it does after that is tax your hardware needlessly for pretty much no increase in image fluidity; all you really gain is bragging rights (some people will judge your worth as a gamer based on your FPS count...).

 

I think that a 1080ti and a very good 120/144 Hz monitor should last you a good many years in terms of image quality.

On 3/16/2018 at 7:42 PM, anotak said:

for more information on how that relates to input delay you could watch this talk by john carmack, the lead programmer of the game this website is about

 

That's not Sgt Mark though?


Anyways I (used to) use a 144hz GSync monitor and honestly it was mostly lost on me. I could tell the difference between 60hz and 100hz+ when I looked at a pendulum demo, but while playing a game it was practically unnoticeable to me.


There are many factors to consider here: some are scientific, some relate to human physiology, some are psychological, and some are pure b.s. Here are a few of those factors, in no particular order:

  • Movies, especially older movies are traditionally shown at a very low frame rate (24 fps to 30 fps). But they typically look rather fluid in high-action scenes, regardless. If you slow down a movie, and look at each frame, you'll notice the 'motion-blur' effect. Cameras create this effect due to exposure time (shutter speed), and due to certain properties of the film. In essence, this effect works hand-in-hand with how the eye works, allowing the viewer to perceive fluid motion.
  • CRT screens have a retention period, where the phosphor of each pixel continues to glow after being charged. This assists the motion-blur effect. LCD/LED screens lack this ability, though many of the high frame rate monitors have special circuitry that blends frames to fake a higher frame rate.
  • Human eyesight is not as precise as one might think. The eyes also have a retention effect. More accurately, sight works largely by differential. If you stare at a bright light, then look away, you see an inverse color effect. The eyes are also more sensitive to some colors, and less sensitive to others. This effect also depends on position in the field of vision. Near the edges, the eyes are more sensitive to brightness than color. And each primary color is retained for different periods of time.
  • Some scene changes (like a flickering light) can be detected by your eyes in as little as 1/500th of a second. Most others require more time to detect.
  • When you pan a real-life scene, your eyes don't stay fixed: they make micro-movements as they try to focus on interesting areas. The equilibrium mechanism in your ears aids this effect. But in video games, the mouse does the panning, not your head. This affects the micro-movements of the eye, as they are no longer guided by actual movement.

So, the motion-blur effect is very complex, and it depends on a lot of different factors. In movies, the camera creates a motion-blur effect very similar to what happens in the eye. In real-life interaction, the eye's retention, combined with micro-movements has a similar effect, which the brain interprets as fluid motion.

 

In video games with low frame rates, lack of a motion-blur effect can make turning look very jarring. The more an object moves per frame, the worse it looks.

 

However: because of the Nyquist theorem, producing more than double the number of frames your display can actually show provides no visual benefit whatsoever.

 

Now, some people claim that, with faster frame rates, the time between the mouse being read and the next frame is reduced, allowing better control. I counter with this:

In the music for Doom E1M1, with that crazy guitar solo, those notes are being played slower than 12 notes per second. Imagine the picking hand, while playing that solo, alternating 12 times per second. Some guitarists can do it, but if you can, you're fricking wailing!

 

Now, imagine alternating the mouse, instead of the pick, 12 times per second. Now imagine doing it 60 times per second. See what I mean?
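For reference, the frame-rate-dependent slice of input delay can be ball-parked with a rough back-of-the-envelope model (not a measured figure, and it ignores the mouse, OS and display): an input waits on average half a frame before the engine samples it, then one more frame before the result is drawn, so

$$ \bar{t}_{\text{delay}} \approx \frac{1.5}{f} \approx \begin{cases} 25\ \text{ms} & \text{at } 60\ \text{fps} \\ 12.5\ \text{ms} & \text{at } 120\ \text{fps} \\ 5\ \text{ms} & \text{at } 300\ \text{fps} \end{cases} $$

Whether shaving off those last few milliseconds matters to a human is exactly what the guitar analogy is questioning.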

 

I suggest that a nice motion-blur effect would work as well as, if not better than, a massive frame rate, and it would be easier on the electricity bill. This is not an easy effect to build right, but it's worth looking into.
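For what it's worth, the crudest way to build such an effect is temporal accumulation: render several sub-frames across one displayed frame's time span and average them, mimicking a camera shutter staying open. A toy sketch on grayscale buffers (render_subframe, the resolution and the sub-frame count are all made up for illustration, and this is a deliberately naive approach rather than how modern engines do it):

```c
#include <stddef.h>
#include <stdint.h>

#define WIDTH     320
#define HEIGHT    200
#define SUBFRAMES 4       /* sub-frames averaged into each displayed frame */

/* Placeholder hook: renders the scene at time t into an 8-bit grayscale buffer. */
extern void render_subframe(double t, uint8_t buf[WIDTH * HEIGHT]);

/* Average SUBFRAMES renders spread across one displayed frame's time span
   [t, t + frame_time) -- a brute-force stand-in for camera shutter blur. */
void render_motion_blurred(double t, double frame_time, uint8_t out[WIDTH * HEIGHT])
{
    static uint8_t  sub[WIDTH * HEIGHT];
    static uint32_t accum[WIDTH * HEIGHT];

    for (size_t i = 0; i < WIDTH * HEIGHT; i++)
        accum[i] = 0;

    for (int s = 0; s < SUBFRAMES; s++) {
        render_subframe(t + frame_time * s / SUBFRAMES, sub);
        for (size_t i = 0; i < WIDTH * HEIGHT; i++)
            accum[i] += sub[i];
    }

    for (size_t i = 0; i < WIDTH * HEIGHT; i++)
        out[i] = (uint8_t)(accum[i] / SUBFRAMES);
}
```

The obvious catch is that it costs SUBFRAMES times the rendering work per displayed frame, which is part of why it's not an easy effect to build right; games that ship motion blur typically reconstruct it from per-pixel velocity in a cheap post-process pass instead, which comes up again further down the thread.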

Edited by kb1

9 hours ago, kb1 said:

Movies, especially older movies are traditionally shown at a very low frame rate (24 fps to 30 fps). But they typically look rather fluid in high-action scenes, regardless. If you slow down a movie, and look at each frame, you'll notice the 'motion-blur' effect. Cameras create this effect due to exposure time (shutter speed), and due to certain properties of the film. In essence, this effect works hand-in-hand with how the eye works, allowing the viewer to perceive fluid motion.

 

That is true for movies. In PC games, however, you are not bound by the limits of physical hardware such as cameras. Once you reach a high enough framerate (60 is easily enough for this, from my long experience with gaming), gimmicks such as Depth of Field or Motion Blur have just one effect: the final image looks like shit.

This is most obvious in console ports, where you usually have to do some workaround to disable them, as these "features" (as the apologists usually refer to them) are enabled by default with no option to turn them off. You have to do an ini tweak, add a command line parameter, etc. The devs usually forget, when porting games to PC, that PCs don't have the shit hardware that can't even reach a stable 60 FPS.

 

Simply put:

Motion Blur is used to mask a low framerate. It is used in movies and should be used ONLY in console games that are locked to 30 FPS for that "cinematic" feel (as an Ubisoft dev once put it, ROFL). It also MIGHT be useful for people with garbage hardware to make a game feel smooth enough to play. People with average or better hardware are better off disabling it as soon as they can.

 

Depth of Field is probably the most redundant "feature" ever, because the game can't possibly know where on the screen you are focusing at any given moment. Needless to say, your eyes produce this effect on their own, as they can't focus on the whole screen at once; forcing it in-game just results in a blurry, smeared image everywhere except the middle of the screen.

 

The first thing I do with any game is check whether these two are on and disable them immediately. Why would I voluntarily lower my image quality?

 

TL;DR

With a high enough framerate (45-50+), Motion Blur loses all its benefits and only hurts your final image quality.

On 3/22/2018 at 9:09 AM, Linguica said:

Anyways I (used to) use a 144hz GSync monitor and honestly it was mostly lost on me. I could tell the difference between 60hz and 100hz+ when I looked at a pendulum demo, but while playing a game it was practically unnoticeable to me.

 

I kinda felt the same way for the first few weeks with my 144hz monitor, but then when I went back and started playing games on an old 60hz screen, I could instantly tell the difference in-game. 60hz felt pretty choppy in comparison, at least in the games that I regularly play at 144hz.


If I am engrossed in a game then the frame rate is not even something I am really aware of, so long as it's "good enough." I experimented for a while in Doom 2016 with different resolution scalings to try and get different FPS targets, and hitting 144 FPS was fun and all but it didn't make my game more enjoyable than hitting 80 or 90 FPS - in fact, it made it less so, since I could more easily notice the lower resolution than I could notice the higher frame rate.

 

edit: I guess I should make a distinction between two things: I can notice the difference between 60 and 100 FPS, but I don't find it a very dramatic change in terms of my experience, and not one really worth the tradeoffs of buying a high-refresh panel; and also, I really can't tell the difference at all between 100 and 144 FPS.

7 hours ago, idbeholdME said:

Simply put:

Motion Blur is used to mask a low framerate. It is used in movies and should be used ONLY in console games that are locked to 30 FPS for that "cinematic" feel (as an Ubisoft dev once put it, ROFL). It also MIGHT be useful for people with garbage hardware to make a game feel smooth enough to play. People with average or better hardware are better off disabling it as soon as they can.

 

But doesn't Motion Blur require extra inter-frame processing and hence GPU and/or CPU time to accomplish? If anything, I thought that having it was a privilege for hardware powerful/advanced enough to support it. Unless it can be done with little penalty "on top" of the actual rendering of distinct frames, it doesn't really sound like a very good "mask" at all.


In Doom 2016 the entire post-processing pipeline took about 2.5 ms per frame, and the motion blur part of that was probably less than 1 ms per frame. GPUs are absurdly powerful when used well.
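To put those numbers in perspective: at 60 fps the whole frame budget is 1000/60 ≈ 16.7 ms, so

$$ \frac{2.5\ \text{ms}}{16.7\ \text{ms}} \approx 15\% \quad \text{(full post-processing pass)}, \qquad \frac{1\ \text{ms}}{16.7\ \text{ms}} \approx 6\% \quad \text{(motion blur share)}. $$

In other words the blur itself is a small slice of the budget, which is why it can sit "on top" of normal rendering without dragging the frame rate down much.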

