Koko Ricky

How necessary is 60fps?


I think part of the thing is we're on a PC FPS forum. This is the genre where you want the most precision, the highest FPS, etc. - fitting coincidence that FPS stands for both things. But when playing, say, GTA5 when it first hit the 360, at 30FPS, it looked incredible to me. For a cinematic type of game where twitch reflexes aren't needed, I think 30 FPS can look just fine.

 

Naturally, if I could have every game run at 60FPS while looking solid, I'd do that. And for consoles it was nice that Metal Gear Solid V was at 60FPS on PS4 and whatnot - you could tell the difference - but nonetheless I can always enjoy a 30 FPS game. On PC it's a bit different, though, maybe because I'm sitting closer to the monitor, and also because I'm customizing stuff, so I want it to be able to hit a higher FPS than that. Plus the FPS genre feels nice at higher FPS.

1 hour ago, GoatLord said:

Numerous platformers—even after the release of the smooth scrolling Commander Keen games—would scroll at a monstrous 8 pixels at a time or so. 

PC games in general tended to have shitty scrolling well into the 90s - it was embarrassing seeing a 486 PC being owned by a lowly Amiga, Atari ST or even C64, all 80s technology. There was something about getting smooth scrolling on the PC that most game developers simply didn't "get". PCs were more like glorified Amstrad CPCs or Spectrums: no scrolling, no blitter, no sprites... It's surprising how they got on top simply with brute force, 100% reliant on the CPU.

13 hours ago, dpJudas said:

No, not really, not noticing that. Movies are quite choppy to me when I'm paying attention to it. I'm also able to instantly tell when just moving a mouse cursor if my monitor is running at 30 or 60 Hz.

 

There are also several movie directors who complained publicly about how they didn't like 60 Hz recording equipment because it removed the "film effect" that the 24 Hz motion blur creates. As both 60 and 24 Hz cameras apply motion blur, if 24 Hz were plenty they wouldn't complain about this.

 

I generally agree that 'stable frame rate > higher frame rate', though. Micro-stuttering like in BoA is so destructive that I'd much rather have that mod run at 30 fps, and Frozen Time can give me motion sickness from how the framerate can fluctuate between 30 and 300 fps.

Well, you're not supposed to notice - that's the point. Now, I can't speak for movie directors, just my own experience. I spent some dedicated time studying this effect, because I wanted to know exactly why Doom looks choppy, and movies don't, even though Doom was running at a higher frame rate. So, I chose scenes in the movie where the camera was turning kinda like you might expect to see in Doom, and there was the answer: In the movie, the entire scene was blurred, yet it somehow looked "cleaner" than Doom did. The motion blur somehow makes the scene more palatable to the eyes. It's counter-intuitive, but it works.

 

7 hours ago, GoatLord said:

Choppiness in 24fps film is actually quite noticeable when the camera is panning. This results in a mildly jarring stuttering effect because even though motion blur naturally occurs during panning, that type of movement reveals the limitations of the framerate. It's also worth noting that it's not "overexposure in the camera" that results in motion blur. Exposure is related to how much light the camera is taking in, which is governed by the aperture or f-stop, which affects brightness and depth-of-field, not motion. Motion is regulated by shutter speed, which can be measured in degrees or fractions.

 

Let us use fractions as the example, as that's a bit easier to understand. If your shutter speed is set to 1/50th, it means that, regardless of framerate, the shutter captures motion over 1/50th of a second, which provides a very "true to life" appearance; that is, motion blur occurs frequently but is only slightly visible in most cases, becoming significant during very fast movement, which is very similar to how motion appears to the naked eye. If you lower the shutter to, say, 1/24th - the same number as the framerate (24) - much more motion blur appears, because the shutter stays open for the entire duration of each frame. If you set it to, say, 1/500th, it will eliminate almost all motion blur, causing the video to look somewhat unnatural, as everything will be sharp, which does not reflect how we perceive reality.

 

In video games there is no shutter speed; rather, motion blur is handled by the engine. The idea is to simulate a particular shutter speed, so without motion blur games look like the camera has the shutter set to something like 1/2000th. Much like video captured at a high shutter speed, this looks...odd, unless the framerate is significantly higher than 24. However, when motion blur is applied to a game, 24fps can actually look pretty natural, though probably not smooth enough for extremely precise aiming.
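To put rough numbers on the shutter fractions the quote describes, here's a small back-of-the-envelope sketch (the object speed and frame width are invented purely for illustration): the streak an object leaves in a frame is simply how far it travels while the shutter is open.

```c
#include <stdio.h>

int main(void)
{
    /* Hypothetical object: crosses a 1920-pixel-wide frame in one second. */
    double speed_px_per_sec = 1920.0;

    double shutters[] = { 1.0 / 50.0, 1.0 / 24.0, 1.0 / 500.0 };
    const char *labels[] = { "1/50", "1/24", "1/500" };

    for (int i = 0; i < 3; i++) {
        /* Blur streak length = distance travelled while the shutter is open. */
        double blur_px = speed_px_per_sec * shutters[i];
        printf("shutter %-6s -> streak of roughly %.0f px per frame\n",
               labels[i], blur_px);
    }
    return 0;
}
```

That works out to streaks of roughly 38, 80 and 4 pixels respectively, which lines up with the "true to life / very blurry / unnaturally sharp" descriptions above.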

Noticeable? Maybe. But much less noticeable. You have to describe such things in relative terms, because we're merely simulating vision, using what's available to produce the best result without being technically impractical or overly expensive.

 

"Overexposure" was a misleading term, I guess, though I'm not sure what term would be more accurate or understandable, to describe the effect of the shutter being open for too long per frame, during the capturing of high-speed action, allowing the film to be exposed to enough varying video input that the resultant picture is the translucent sum of all of the action that occurred while the shutter was open. Motion blur is the effect of capturing too much visual input per frame

 

Yes, it is a function of the shutter speed - the duty cycle - the pulse width, so to speak. So, if your goal was to avoid motion blur, yet your shutter speed was too slow, then it could be said that you overexposed the film.

 

In contrast, the eyes do not have a "shutter speed" or frame rate; instead, they capture constantly, with a retention rate. This is obvious when you look at a bright light, then look away, and you can see an imprint of the bright light in your field of vision, with colors inverted, as your eyes attempt to compensate for the bright light. Motion blur works because of this retention. The blur essentially becomes invisible, as your eyes and brain are compensating for the image you just saw in a very similar manner. It just so happens that the motion blur in the video mimics what your eyes would be doing anyway. In a way, the motion blur makes your eyes' job a bit easier.

 

However, if the artificial motion blur is eliminated, your eyes will still compensate, but not nearly as much, and this is detected as choppiness.

 

Motion blur can be emulated in video games quite believably. But it's difficult to do efficiently, especially in software mode, without being very clever. You can translucently blend the current frame with the previous buffer at a large alpha, but this is murder on your memory cache. You could also experiment with interpolation code and try to apply some blur to just fireballs, which might create a nice effect, even if it doesn't generate believable motion blur. I think GL mode is where it becomes practical.
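For what it's worth, here's a minimal sketch of that "blend the current frame with the previous buffer" idea. The function name, the assumption of a truecolor (RGB) framebuffer, and the alpha handling are mine for illustration, not taken from any port; a paletted renderer would need a colormap-based blend instead.

```c
#include <stdint.h>
#include <stddef.h>

/* Blend the freshly rendered frame into a persistent accumulation buffer.
 * alpha = 255 keeps only the new frame (no blur); smaller values leave a
 * longer trail of previous frames behind moving objects. */
void blend_frames(uint8_t *accum, const uint8_t *current,
                  size_t bytes, unsigned alpha)
{
    for (size_t i = 0; i < bytes; i++) {
        /* Per-byte linear blend; walking two full framebuffers every frame
         * is exactly the cache pressure described above. */
        accum[i] = (uint8_t)((current[i] * alpha + accum[i] * (255 - alpha)) / 255);
    }
}
```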

 

6 hours ago, GoatLord said:

I was reading about why PAL and NTSC have different refresh rates but can't remember, I want to say it's related to voltage or something. Personally I think 25fps "European" video looks kinda shitty, but probably because I'm used to 24.

European AC is delivered at 50 Hz (cycles per second), and American at 60 Hz. Many devices use this rate for timing purposes (clocks, for one). Interestingly, if the power companies realize that they've been sending, say, 59 Hz for the past 30 minutes, they'll start sending 61 Hz for a while to allow clocks to "catch up". This surprised me.
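A quick back-of-the-envelope check on that (all figures illustrative): a mains-synchronous clock just counts cycles, so the catch-up time falls straight out of the cycle deficit.

```c
#include <stdio.h>

int main(void)
{
    double nominal_hz = 60.0;           /* what the clock assumes */
    double slow_hz    = 59.0;           /* what the grid actually delivered */
    double fast_hz    = 61.0;           /* the catch-up frequency */
    double slow_secs  = 30.0 * 60.0;    /* 30 minutes running slow */

    /* Cycles the clock fell behind while the grid ran slow: 1 * 1800 = 1800. */
    double deficit_cycles = (nominal_hz - slow_hz) * slow_secs;

    /* At 61 Hz the clock gains one cycle per second, so 1800 s to recover. */
    double catchup_secs = deficit_cycles / (fast_hz - nominal_hz);

    printf("deficit: %.0f cycles, catch-up at 61 Hz: %.0f minutes\n",
           deficit_cycles, catchup_secs / 60.0);
    return 0;
}
```

So 30 minutes at 59 Hz needs roughly 30 minutes at 61 Hz to cancel out.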

1 hour ago, kb1 said:

Well, you're not supposed to notice - that's the point. Now, I can't speak for movie directors, just my own experience. I spent some dedicated time studying this effect, because I wanted to know exactly why Doom looks choppy, and movies don't, even though Doom was running at a higher frame rate. So, I chose scenes in the movie where the camera was turning kinda like you might expect to see in Doom, and there was the answer: In the movie, the entire scene was blurred, yet it somehow looked "cleaner" than Doom did. The motion blur somehow makes the scene more palatable to the eyes. It's counter-intuitive, but it works.

I'm not denying that motion blur helps with things looking choppy. What I am saying is that the eyes are more than capable of seeing 24 fps without blur. Notice how you can see the detail of the fan in the video clip a lot better at the higher shutter speeds. If it had been running on a 500 Hz monitor at 500 fps, then your eyes might very well have been able to see the movement of the fan a lot more clearly than at a lousy 24 fps, despite the high shutter speed, without it looking "unnatural".

 

As Doom is a game where aim is one of the key elements of who wins a deathmatch, being able to track someone's movement down to the pixel level gets extremely important. "Cinematic" low-FPS motion blur totally kills your ability to do that. I don't know what the limit is for what eyes can track, but I'm quite sure it isn't 24 or 30 Hz.


@dpJudas: No arguments here. I believe most people can easily distinguish 120 fps from lower rates, and sense well above 240. This has been my experience, anyway.

 

I think you exaggerate when you say that low FPS motion blur kills your ability to track someone's movement. Now, to be accurate, I have to get technical, and provide some exact example specs:

 

Given 2 systems:

1. System 1 can internally render at 120 fps, but can only display 30 fps; with no motion blur

2. System 2 can internally render at 120 fps, but can only display 30 fps; with motion blur enabled

 

...system 2 has some advantages going for it:

System 2 output contains details from 4 frames

System 2 appears more natural to your eyes, in that it works with the retention of your eyes seamlessly, emulating a perceived continuous frame rate

System 2 softens the choppiness effect, while providing extra directional clues

 

...and the one disadvantage:

System 2 is more blurry. But, again, the eyes largely hide this effect.

 

Now, if the motion blur effect slows down the output frame rate, all bets are off.
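Here's a minimal sketch of the System 2 idea above - render four 120 fps sub-frames, then average them into the single 30 fps frame that actually gets displayed, so it carries directional detail from all four. The names, the truecolor buffer, and the equal-weight averaging are my own assumptions, not any real renderer.

```c
#include <stdint.h>
#include <stddef.h>

#define SUBFRAMES 4   /* 120 fps rendered / 30 fps displayed */

/* Average SUBFRAMES consecutive sub-frames into one output frame. */
void average_subframes(uint8_t *out, const uint8_t *sub[SUBFRAMES], size_t bytes)
{
    for (size_t i = 0; i < bytes; i++) {
        unsigned sum = 0;
        for (int f = 0; f < SUBFRAMES; f++)
            sum += sub[f][i];
        out[i] = (uint8_t)(sum / SUBFRAMES);   /* equal-weight box filter over time */
    }
}
```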

 

Of course, the proof is in the pudding. I cannot claim to have mocked up a convincing motion blur effect in Doom. But I think one of the ports added a translucent pass option to their software renderer - was it Doom Retro? I haven't looked at it yet. All logic suggests that the unblurred renderer should be the absolute best. But I will claim that the motion blur in movies does something miraculous that, for me, all but hides animation choppiness, emulating a much higher, closer-to-reality perceived frame rate. There's no reason it couldn't work for Doom, given a good implementation and a sufficient frame rate.

 

Can anyone else comment on the implementation in Retro? How much does it slow down rendering, if any? Does it smooth out choppiness, or is it awkward?

 


@dpJudas, on the topic of what our eyes track and how it relates to frame rate... I googled this a while back and a study suggested that human eyes do not have a constant "frame rate" but rather one that fluctuates constantly, and possibly multiple frame rates exist at once, since you see far more detail in the center of your vision than in the periphery. I've also read that we only process a pitiful few images at a time and our brain compensates for anything in between. I've witnessed this in action when watching a ceiling fan; if I stare at it, it will appear blurry most of the time, but for a split second I'll see a very sharp image of a single blade. I'm also reminded of how fans look like they're spinning backwards very slowly if they're rotating fast enough. Very strange tricks our eyes play on us. On a related but slightly tangential note, it's been suggested that the resolution of the human eye is something like 8K. This would indicate that a 4K monitor is sufficient in most cases, but 8K would be optimal for VR headsets, since the display would be inches from your face.


I was quite surprised to learn that a lot of older games ran at high frame rates; many were at 60fps. Which is really funny, because if memory serves correctly, I came across some blatantly false advertising for both Wolfenstein 3D and Flashback that claimed the games ran at a high frame rate "just like the movies," even though the former ran at 75fps and the latter at different rates for gameplay versus cutscenes, at least on the Genesis...


Re: The fans spinning backwards: This only occurs when your view of the object is being regularly impaired, either by flashing the light source, by physically blocking the view, or by using a mechanism such as a film projector or television that breaks up the image into frames.

 

Flashing the light source: This is the principle behind a car mechanic's timing light. Beforehand, an easy-to-see mark is placed on the flywheel with crayon or chalk (or the mark already exists). When the equipment inductively detects that spark plug #1 is firing, it powers the light. The light is only lit for a fraction of a second. The effect is that the mark appears to stay in the same spot, even though the flywheel is spinning very quickly. Slight adjustments in timing cause the mark to appear to move a tiny bit forward or backward, allowing the mechanic to precisely time the engine.

 

An easy way to see this: Find a fluorescent light. Spread out your fingers and hold your hand between your eyes and the light, in such a way as to allow your fingers to partially block your view of the light. If you then rapidly move your hand left and right, you will see multiple distinct copies of your fingers. The effect does not work with an incandescent light: although the incandescent bulb is being powered off and on 120 times a second, the filament stays hot enough between cycles that the light continues to glow.

 

Blocking the view: This can sometimes be seen if you're riding in a car, looking through regularly spaced posts, like guardrail posts. This can create the effect where a car's hubcaps look like they're spinning backwards, slowly, or even stopped. Walking beside a wooden fence can have a similar effect.
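The "spinning backwards" illusion in all three cases is just aliasing: you only see the wheel at discrete instants, so its apparent motion is the true motion folded against the sampling rate. A tiny sketch with invented numbers:

```c
#include <stdio.h>

int main(void)
{
    double sample_hz = 24.0;   /* frames per second of the projector/camera */
    double wheel_hz  = 23.5;   /* true rotations per second of the wheel */

    /* Rotations per sampled frame, wrapped toward the nearest whole turn:
     * 23.5/24 = 0.979 of a turn looks like -0.021 of a turn, i.e. slow reverse. */
    double per_frame = wheel_hz / sample_hz;
    per_frame -= (double)(long)(per_frame + 0.5);

    printf("apparent speed: %+.2f rotations/s (%s)\n",
           per_frame * sample_hz,
           per_frame < 0.0 ? "appears to spin backwards" : "appears to spin forwards");
    return 0;
}
```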

 

(Yes, I had a strange, though interesting childhood :)

 

8K eye resolution: Don't know what to make of that. When comparing digital and analog systems, we must be very careful. Analog systems do not always compare as nicely as we would like. For example, how many colors can we detect? I know a lot of estimates have been attempted, but these are typically done as tests where a bunch of college students try to differentiate colors. I personally have never seen "pixels" in any outdoor view, but I sure can see the pixels in a 4K display. Also, the retina is teeny-tiny vs. the size of that massive display. I have a feeling we have a long way to go yet before the display pissing contest is over. Hope it ends soon, though: it's getting more and more difficult to maintain a nice framerate in Doom...

 


I run DOOM4 at 120+ fps. When I saw the game drop to 60 during one of the cutscenes for the first time (because those are locked at 60), I thought they'd locked them at 30, because it was so jarring in direct comparison (I actually had to pull up the fps counter to make sure).

 

Is 60 vs. 144 noticeable? Absolutely, especially with fast-moving objects. Hell, switch between the two and move the cursor across the screen - 144 IS less fragmented than 60.

 

Does a higher framerate improve the experience? Absolutely (especially for action games). Smoother, more responsive gameplay feels and looks better.

 

Is 60fps enough? No. More is better. I don't know what the limit is but I can easily differentiate between 60 and 144.

 

Is 30fps more cinematic? No. There's no such thing as "lower fps = more cinematic". Video games are not movies.

 

Is 30fps ever acceptable over 60+? Sometimes choppy animation works in certain cases (South Park, anime) but for games? Maybe in turn based strategy games it's not a big deal. 60+ always feels better though.

 

Can you get used to <30? Sure. Humans can get used to just about anything. That doesn't mean it's enough though. Most people who defend 30fps are just making excuses for their poor PCs.


Looking back at when I played StarFox on the SNES as a kid, the choppy framerate (which could sometimes bog down to something like 10 FPS or lower) really didn't bother me. As long as the core gameplay is solid, that's pretty much the most important factor. Vanilla Doom also wasn't exactly the smoothest looking game, but I don't think anybody would complain about that.

Of course a smooth framerate is always preferable, but I do think that a lot of people (mostly "PC gamers") tend to put too much emphasis on it as if having a framerate that's below 60 automatically makes the game suck.

54 minutes ago, Bugnotnotthegreat said:

Duke nukem 3D was capped at 15 fps and I don't notice it.

Uhh... No. This is demonstrably false.


Motion blur can fuck off, give me better animations over that crap anytime of the day. 

 

Some games had hilarious side effects due to how their frame timing worked; some became 100 times harder to beat due to better hardware, like Supaplex.

8 hours ago, kb1 said:

...I personally have never seen "pixels" in any outdoor view, but I sure can see the pixels in a 4K display.

 

Okay, I'm getting pretty tangential here, but hey, this is what conversation is all about. I can sort of see the "pixels" of my vision all the time, and can recall this being the case even as a child. There is a wikipedia article on a physiological phenomenon known as visual snow, and neuro-ophthalmologists have hypothesized "that what the patients see as 'snow' is their own intrinsic visual noise." The article uses an interesting gif as an example, which I have included in the post. It accurately represents what I see all the time, although in most situations it's so slight as to go unnoticed unless I specifically pay attention to it. However, in low lighting, or when staring at anything dark, it looks a lot like the gif, where the brain's inability to coherently form images from insufficient photon information is interpreted in a way very similar to the video/film noise of low-quality cameras. In this sense I am literally witnessing the limit of the resolution of my vision, and while it's rather difficult, I can more or less decipher individual dots. This leads me to believe that the 8K human resolution proposal is probably somewhat accurate.

Red-blue-noise.gif

2 hours ago, Pegg said:

Motion blur can fuck off, give me better animations over that crap anytime of the day. 

 

Some games had hilarious side effects due to how their frame timing worked; some became 100 times harder to beat due to better hardware, like Supaplex.

I find this a bit of an odd comment. Motion blur is a natural and very explicit phenomenon in human vision. Photography and video capture motion blur all the time. Older cartoons would use elaborate, abstract paintings for frames of high speed movement to simulate motion blur. So why is the effect somehow inappropriate for gaming?


The difference is that the smoothing done in cartoons/movies is way better than the shit in games. They aren't just in old cartoons, they're in things like Pixar movies, and they look way better than a stupid motion blur smudge effect.


There is a reason smear frames in 2D animations are abstract looking and hand drawn, and not just somebody smudging a key frame with their thumbs.

35 minutes ago, GoatLord said:

Am I the only one who thinks we're getting pretty good at simulating motion blur in modern games?

You might be, because my first order of business with any modern game is to disable any mention of motion blur in the main menu. It irritates me every time I see it, and it was one of the first things that put me off about Doom 4. I get that it's sort of "realistic", like when you shake your head violently, but most cases I've seen use it with nearly *every* camera movement, which just makes for an annoying experience.

Of all the modern graphical features, I prefer bloom, because bloom actually looks pretty nice when it's used in a subtle way, such as casting a subtle bloom on light sources when you look into them.
 

6 hours ago, GoatLord said:

Okay, I'm getting pretty tangential here, but hey, this is what conversation is all about. I can sort of see the "pixels" of my vision all the time, and can recall this being the case even as a child. There is a wikipedia article on a physiological phenomenon known as visual snow, and neuro-ophthalmologists have hypothesized "that what the patients see as 'snow' is their own intrinsic visual noise."

Red-blue-noise.gif

I also have visual snow. It's especially bad when I'm in a dark environment. Represent! *fist-bumps*


32 minutes ago, GoatLord said:

Am I the only one who thinks we're getting pretty good at simulating motion blur in modern games?

No, I like subtle motion blur. A little bit of it smooths out the experience. The DOOM4 amount is just about right for me.

On 8/31/2017 at 9:16 AM, everennui said:

I want monitor manufacturers to stop discontinuing nice screens after they've become affordable to make. Or at least not soup them up with a bunch of unneeded crap, like speakers, a small bezel, being super thin, and give the option of FreeSync on ALL monitors even if they have G-Sync (it's royalty-free and open source (free) to implement).

Can I please have a 144hz 1440p 21:9 screen for less than a down payment on a house? Please?

Give me a 5" thick monitor, with corn cobs for the stand, a menu in Klingon and the logo upside down if you have to. Just give me what I want for a reasonable price. It doesn't seem like it's worth it to "push" any technology forward with my wallet because a new standard is just a year away. I'm sure there's some real challenges in the production of monitors that I'm overlooking, but fuck, throw a dog a bone.

I'm just going to wait until Black Friday for a monitor. I don't know why it's so amusing to me to use such an old CRT with such a beast of a computer, but the prices on these things are outrageous.

Buy a used NEC MultiSync on eBay. I got a 20" 1600x1200 S-IPS monitor for $60 (original price was probably ten times that). Other than its almost CRT-like mass (over 20 pounds), frustrating stand mount, and CCFL backlight that's dimmer than modern LCD backlights, it's a great display. A 1920x1200 widescreen one would probably run you about twice as much and weigh even more, but it's certainly a lot cheaper than a new one.

8 hours ago, GoatLord said:

...visual snow

Red-blue-noise.gif

 

1 hour ago, Agentbromsnor said:

You might be, because my first order of business with any modern game is to disable any mention of motion blur in the main menu. It irritates me every time I see it, and it was one of the first things that put me off about Doom 4. I get that it's sort of "realistic", like when you shake your head violently, but most cases I've seen use it with nearly *every* camera movement, which just makes for an annoying experience.

Of all the modern graphical features, I prefer bloom, because bloom actually looks pretty nice when it's used in a subtle way, such as casting a subtle bloom on light sources when you look into them.
 

I also have visual snow. It's especially bad when I'm in a dark environment. Represent! *fist-bumps*

Oh, wow - I thought I had heard about visual snow before. It must have been when you mentioned it in a previous thread. Question for you guys: So you both notice it in dark environments. How about a dark screen, but with a bright area, like a fireball in a dark room - does the snow effect persist, or does the contrast mitigate the issue, even though it's only a small area of the screen?

 

2 hours ago, GoatLord said:

Am I the only one who thinks we're getting pretty good at simulating motion blur in modern games?

The speeds and alpha (perhaps a bit different for each color channel, to simulate the eye's varying sensitivity to certain colors) must be chosen carefully, but once that's done, it can be made very realistic.

 

7 hours ago, Pegg said:

The difference is that the smoothing done in cartoons/movies is way better than the shit in games. They aren't just in old cartoons, they're in things like Pixar movies, and they look way better than a stupid motion blur smudge effect.

"a stupid motion blur smudge effect"?

6 hours ago, Red said:

There is a reason smear frames in 2D animations are abstract looking and hand drawn, and not just somebody smudging a key frame with their thumbs.

"smudging a key frame with their thumbs"?

 

You guys must have seen some really crappy software. Who said anything about a stupid, or smudgy effect? Again, a proper motion blur is nearly invisible in and of itself - it simply helps the eyes believe the movement.

 

7 hours ago, Agentbromsnor said:

I also have visual snow. It's especially bad when I'm in a dark environment. Represent! *fist-bumps*

OH YA, FIST BUMPS ALL DAY BROMSNOR!!! @kb1, I've mentioned it a few times. I tried saying it's a minor hallucination, since doctors say nothing is wrong with the eyes of people with visual snow - it might be a brain thing - but J4rio disagrees with me. For me it's far more noticeable in dark rooms; I can't see it in bright lights, mainly because I try not to stare directly at bright lights. I don't know, that might just be a me thing though.


You can see a big difference when playing videogames with bad FPS.

 

15-20 fps is impossible.

30-40 is regular

50-60 is just right

60 and above is just bragging.

 

Yet on PC, a stable fps is necessary. Why? Well, your mouse.

The lower the fps, the worse the pointer. It really can get frustrating, slow, painful and unplayable.

60 fps gives you the best out of your mouse speed, being fast and also smooth.

Here's a gif that shows the visual difference in FPS. Btw, 60, 70, 120 fps, etc. look the same; the difference starts below 60.

giphy.gif

19 minutes ago, Anidrex_1009 said:

Btw, 60, 70, 120 fps, etc. look the same; the difference starts below 60.

Double the speed of that line and you'll instantly see a difference.

1 hour ago, MrGlide said:

Double the speed of that line and you'll instantly see a difference.

Unless he has a 60Hz monitor, obviously. Or is restricted by GIF framerate limitations, which change from browser to browser (I sometimes hear 50 FPS is typical, but can't confirm it).

9 hours ago, kb1 said:

 

Oh, wow - I thought I had heard about visual snow before. It must have been when you mentioned it in a previous thread. Question for you guys: So you both notice it in dark environments. How about a dark screen, but with a bright area, like a fireball in a dark room - does the snow effect persist, or does the contrast mitigate the issue, even though it's only a small area of the screen?

Since screens (I take it you're talking about monitors here) these days are mostly light sources, it doesn't affect the visual snow too much.


I'm really tired of the ">60 FPS isn't even noticeable" myth. I suspect most of these people either haven't looked at a 120Hz screen long enough to really get a feel for how much smoother it is, or their eyesight just isn't sensitive enough.

 

I will say that as easily as I can tell the difference between 60 and 120, it isn't as significant a leap as 30 vs. 60 (just like HD to 4K isn't as big of a leap as SD to HD was).

 

 

