Koko Ricky

How necessary is 60fps?


4 minutes ago, scifista42 said:

As long as the delay between frames is relatively consistent over time and never higher than 1/25 of a second, I'm fully happy.

You really like to use terms that confuse mere mortals, don't you?
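
For anyone else tripped up by the frame-delay phrasing: frame delay and framerate are just reciprocals of each other, so "a delay never higher than 1/25 of a second" is the same thing as "never dropping below 25 fps". A tiny sketch of the conversion (the helper names here are made up purely for illustration):

```python
# Frame delay (seconds per frame) and framerate (frames per second)
# are reciprocals, so a worst-case delay of 1/25 s is a 25 fps floor.

def delay_to_fps(delay_seconds: float) -> float:
    """Convert a per-frame delay into an instantaneous framerate."""
    return 1.0 / delay_seconds

def fps_to_delay_ms(fps: float) -> float:
    """Convert a framerate into the delay between frames, in milliseconds."""
    return 1000.0 / fps

print(delay_to_fps(1 / 25))    # 25.0 fps -> the stated worst case
print(fps_to_delay_ms(60))     # ~16.7 ms between frames at 60 fps
print(fps_to_delay_ms(144))    # ~6.9 ms between frames at 144 fps
```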

8 hours ago, MrGlide said:

Most definitely not, but hey, you asked, I answered, then you called me crazy. Whatever.

I stand corrected. I didn't think anyone could really tell the dif between 60 and 144.

1 hour ago, GoatLord said:

I didn't think anyone could really tell the dif between 60 and 144.

You need to see it to believe it. You can REALLY tell the difference.

5 hours ago, scifista42 said:

What terms?

I was trying to say that I barely understood what you said in your post at first, because it's the first time I've seen someone talk about the delay between frames rather than the actual framerate. If the word "terms" was misused, I apologize. My English still needs some refinement.


Gameplay at 60 fps is ok...

Recording gameplay at 60 fps is a different thing that many aspire to achieve... 


30 is slightly painful. 60 is very good, but 120 looks far better.

 

I think it's disingenuous to assume that what your own eyes are capable of seeing is all that anyone else can see, too. Some people have more sensitive eyesight than others.

 

To me, a CRT running at 50hz looks like a goddamn strobe-light, yet my friend said he only noticed any flickering when viewing the screen through his peripheral vision and not when directly staring at it.

On 8/28/2017 at 4:19 PM, GoatLord said:

Framerate shifts are definitely jarring as fuck. I definitely do not like that. I notice in those moments for sure. Like, I may not see a huge difference in a side-by-side comparison, but if refresh rate keeps changing depending on how much is being rendered at any given time, it is for sure a dealbreaker. Also, the people who think 60hz is choppy...I think are possibly delusional. The human eye is not very good at registering fast motion. Like if you literally wave your hand in front of your face you will notice characteristic motion blur that would not occur if say, you were recording a hand wave on a camera with say, a 1/100th shutter speed (the equivalent of 100FPS, downgraded to 24, 30 or whatever the video is shooting at). With that in mind humans are not going to see much difference between 60hz and 144hz.

People always say this but it's very apparent even going from 60 to 144.  30 is just a big flat no.  I could have trained my eyes to detect such things because I've played (mostly PC) games all my life, or something.  But going from 60 to 144 is so noticeable I ain't neva goin back.  It's so incredibly smooth.


As someone who grew up with games like Star Fox and the console ports of Doom, I'm not so much bothered by low framerate as inconsistent framerate. Star Fox ran at 10 fps for the most part, but the gameplay was crafted with that in mind. And let's not forget that the original Doom was locked at 35 fps. A low but consistent framerate can be acclimatized to; it's when the framerate is all over the place that things start to go wrong. It's all down to how the game feels to play, and particularly in physics-heavy games like driving games, variable framerates can actually mess with how the cars handle, making them unpredictable from one corner to the next.
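
That last point about handling is, incidentally, why many engines decouple physics from rendering. A minimal sketch of a fixed-timestep loop, assuming a 60 Hz simulation rate and placeholder update/render functions (none of this is taken from any particular game):

```python
import time

# Fixed-timestep game loop: rendering runs at whatever rate the machine
# manages, but physics always advances in identical steps, so car
# handling stays the same no matter what the framerate is doing.

TIMESTEP = 1.0 / 60.0   # simulate physics at a fixed 60 Hz

def update_physics(dt: float) -> None:
    pass  # advance cars, collisions, etc. by exactly dt seconds

def render() -> None:
    pass  # draw the current state; this may take a variable amount of time

accumulator = 0.0
previous = time.perf_counter()

for _ in range(600):   # a few hundred frames, for the sake of the example
    now = time.perf_counter()
    accumulator += now - previous
    previous = now

    # Consume the real elapsed time in fixed-size physics steps.
    while accumulator >= TIMESTEP:
        update_physics(TIMESTEP)
        accumulator -= TIMESTEP

    render()
```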

 

Sure, a high framerate is objectively better, and anyone who says otherwise frankly doesn't know what they are talking about, but if made to choose, I'd take a steady 30 over a janky 47.

 

Of course, if people weren't so hung up on how shiny things looked we wouldn't be having this discussion because everything would be 60fps.

 

Oh, and as a footnote, the human eye can distinguish anything up to 160-170fps.


As many have said, frames per second isn't the issue; latency, low input lag, and stability are.

I'd rather have a constant 30 than a jittery, screen-tearing "60".

On 8/28/2017 at 0:48 PM, Vorpal said:

Depends on the game: SNES titles look horrible to me at 30 fps and I really need that 60, and Doom mouse control feels horrible at 60 fps and I need to play it at 30 (or 35? whatever)

On a CRT with good contrast and phosphor retention, this was never an issue. For both examples.

9 minutes ago, Csonicgo said:

As many have said, frames per second isn't the issue; latency, low input lag, and stability are.

I'd rather have a constant 30 than a jittery, screen-tearing "60".

On a CRT with good contrast and phosphor retention, this was never an issue. For both examples.


 

Thanks for letting me know what I can and can't notice


I'm not too fussed, personally.  I can notice the difference in Doom ports that do the uncapped thing but I wouldn't call it better.


It's a key factor of skill in certain games, and if not that, then it's extremely helpful. Take CS:GO and Quake as good examples of games where FPS is key (haha, I made a funny).


It is not low frame rate that makes video games look choppy, it is the lack of motion blur.

 

Ever notice how, when watching a movie, all of that choppiness feeling is gone? Television is typically displayed at 24 to 30 fps, which most everyone here has claimed to be awful for gaming. Yet, 24 to 30 looks great on a crisp TV when watching a movie. That's because the motion blur, introduced by the overexposure in the camera actually blurs the animation, which is interpreted realistically by the eye.

 

The choppiness in Doom is noticed most when the player is rotating, and a bright light/fireball makes distinct, crisp bright spots on the screen, and therefore in your retinas.

 

So, 30 fps, blurry > 120 fps, crisp. It doesn't seem right, but it's totally true. You can really see the effect when looking at LED clocks, where only one digit is lit at a time. This occurs at an extremely fast rate (much faster than 30, or even 120). But if you rapidly turn your eyes or head back and forth, while keeping the clock in your periphery, you will see the image in distinct spots across your field of view. Many people can distinguish over 200 fps to some extent. But that's still not fast enough to defeat the lack of motion blur.

 

Regardless of all that, though, faster is always better, all other things being equal, up to 240 or so for me, beyond which it becomes difficult to see a difference. But the higher the contrast, the choppier it will appear. A fireball moving left to right in a dark room will look choppy without motion blur, even at 240 fps, for example.
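
A crude way to see what's being described here is to fake a camera shutter in software by averaging several sub-frames into each output frame; the 10-pixel "scene" and fireball speed below are made up purely for illustration, not how any particular engine does it:

```python
# Approximate camera-style motion blur by sampling the scene several
# times while the "shutter" is open and averaging the results. The
# bright fireball then leaves a smear instead of a crisp dot, which is
# what makes a low framerate look less choppy.

def render_subframe(t: float) -> list[float]:
    """Hypothetical renderer: a 1-pixel fireball sweeping a 10-pixel row."""
    row = [0.0] * 10
    row[int(t * 300) % 10] = 1.0   # fireball moves 300 pixels per second
    return row

def render_blurred_frame(frame_start: float, frame_time: float,
                         subframes: int = 8) -> list[float]:
    """Average several samples taken while the shutter is open."""
    accum = [0.0] * 10
    for i in range(subframes):
        t = frame_start + frame_time * i / subframes
        accum = [a + s for a, s in zip(accum, render_subframe(t))]
    return [a / subframes for a in accum]

# One 30 fps frame: the fireball is smeared across several pixels.
print(render_blurred_frame(frame_start=0.0, frame_time=1 / 30))
```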

3 hours ago, kb1 said:

Ever notice how, when watching a movie, all of that choppiness feeling is gone? Television is typically displayed at 24 to 30 fps, which most everyone here has claimed to be awful for gaming. Yet, 24 to 30 looks great on a crisp TV when watching a movie. That's because the motion blur, introduced by the overexposure in the camera actually blurs the animation, which is interpreted realistically by the eye.

No, not really, not noticing that. Movies are quite choppy to me when I'm paying attention to it. I'm also able to instantly tell when just moving a mouse cursor if my monitor is running at 30 or 60 Hz.

 

There are also several movie directors who have complained publicly about how they didn't like 60 Hz recording equipment because it removed that "film effect" that the 24 Hz motion blur creates. Since both 60 and 24 Hz cameras apply motion blur, if 24 Hz were plenty, they wouldn't complain about this.

 

I generally agree that 'stable frame rate > higher frame rate', though. Micro-stuttering like in BoA is so destructive that I'd much rather have that mod run at 30 fps, and Frozen Time can give me motion sickness from how the framerate can fluctuate between 30 and 300 fps.

23 minutes ago, dpJudas said:

30 and 300 fps

300 fps is being rendered yes, but your monitor certainly won't be outputting that. 

23 minutes ago, Dragonfly said:

300 fps is being rendered yes, but your monitor certainly won't be outputting that. 

I'm aware of that, but that's not the point. When looking in one direction gives a scene render time of 3 ms and the other direction gives 30 ms, it is extremely unpleasant to look at.
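
For scale, that is just the reciprocal arithmetic again: a render time of 3 ms corresponds to roughly 333 fps and 30 ms to roughly 33 fps, a tenfold swing from one view direction to the other.

```python
# Instantaneous framerate is 1 / frame_time, so a scene that takes 3 ms
# in one direction and 30 ms in the other swings by a factor of ten.
for frame_ms in (3.0, 30.0):
    print(f"{frame_ms:>4} ms per frame -> {1000.0 / frame_ms:.0f} fps")
```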


Scrolling 2D games (fighting games, platformers, etc.) looked horrible at anything other than 50 or 60 fps (as the Gods of PAL or NTSC commanded), and, in general, whenever anything moved more than one pixel at a time.

 

For 3D games, the (subjective) requirement was always much more relaxed.

 

BTW, 24 fps is used in cinematography, not TV productions, where the native scan rate of 50 or 60 Hz (interlaced) is still used, at least for SD material. A 24 fps movie translated to 50 or 60 Hz via a telecine will always be an approximation. Of course, now there's a lot of new material being filmed in HD and 30 "true" fps (30 non-interlaced, instead of 60 interlaced) or even 60 fps non-interlaced.
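
The "approximation" in question is usually 3:2 pulldown. A rough sketch of the cadence, ignoring field order and audio sync:

```python
# 3:2 pulldown: every 4 film frames (A, B, C, D) are stretched over 10
# interlaced video fields, turning 24 fps film into 60 fields per second
# (~30 fps video). The uneven 3-2-3-2 pattern is what causes telecine
# judder on panning shots.

def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 3 if i % 2 == 0 else 2   # alternate 3 fields, then 2 fields
        fields.extend([frame] * copies)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']  -> 10 fields
```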

 

And the lowest I've seen a computer monitor actually being able to sync was something like 43(!) Hz by using some weird SVGA video modes under DOS. Not really comfortable to look at for even a second on a 14" SVGA monitor. Most "office" video modes, even today, will use 60 or 70 Hz instead.


I think it's just right, but then again, your mileage may vary.


I want monitor manufacturers to stop discontinuing nice screens after they've become affordable to make. Or at least not soup them up with a bunch of unneeded crap, like speakers, a tiny bezel, and being super thin, and to offer FreeSync on ALL monitors even if they have G-Sync (it's royalty-free and open source to implement).

Can I please have a 144hz 1440p 21:9 screen for less than a down payment on a house? Please?

Give me a 5" thick monitor, with corn cobs for the stand, a menu in Klingon and the logo upside down if you have to. Just give me what I want for a reasonable price. It doesn't seem like it's worth it to "push" any technology forward with my wallet because a new standard is just a year away. I'm sure there's some real challenges in the production of monitors that I'm overlooking, but fuck, throw a dog a bone.

I'm just going to wait until Black Friday for a monitor. I don't know why it's so amusing to me to use such an old CRT with such a beast of a computer, but the prices on these things are outrageous.


I will be convinced of the necessity of 60 fps only when I see an upgraded 60 fps animation of Cybie's butt.

9 hours ago, kb1 said:

It is not low frame rate that makes video games look choppy, it is the lack of motion blur.

 

Ever notice how, when watching a movie, all of that choppiness feeling is gone? ... That's because the motion blur, introduced by the overexposure in the camera actually blurs the animation, which is interpreted realistically by the eye.

Choppiness in 24 fps film is actually quite noticeable when the camera is panning. This results in a mildly jarring stuttering effect because, even though motion blur naturally occurs during panning, that type of movement reveals the limitations of the framerate. It's also worth noting that it's not "overexposure in the camera" that results in motion blur. Exposure is about how much light the camera takes in, and the aperture, or f-stop, affects brightness and depth of field, not motion. Motion blur is governed by shutter speed, which can be measured in degrees or fractions of a second.

 

Let us use fractions as the example, as that's a bit easier to understand. If your shutter speed is set to 1/50th, it means that, regardless of framerate, each frame captures 1/50th of a second of motion, which provides a very "true to life" appearance; that is, motion blur occurs frequently but is only slightly visible in most cases, becoming significant during very fast movement, which is very similar to how motion appears to the naked eye. If you lower the shutter to, say, 1/24th, that may match the framerate number (24), but much more motion blur appears because each frame's exposure lasts longer and smears more motion into the frame. If you set it to, say, 1/500th, it will eliminate almost all motion blur, causing the video to look somewhat unnatural, as everything will be sharp, which does not reflect how we perceive reality.
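
To put rough numbers on those examples, here is the standard shutter-angle arithmetic; the helper function is just for illustration (360 degrees means the shutter is open for the entire frame interval):

```python
# Shutter angle: what fraction of each frame interval the shutter is
# open, expressed in degrees (360 = open the whole frame, 180 = half).

def shutter_angle(fps: float, shutter_denominator: float) -> float:
    exposure = 1.0 / shutter_denominator      # seconds the shutter is open
    frame_interval = 1.0 / fps                # seconds per frame
    return 360.0 * exposure / frame_interval

for denom in (24, 50, 500, 2000):
    print(f"24 fps at 1/{denom}: {shutter_angle(24, denom):.0f} degrees")

# 1/24   -> 360 degrees (maximum blur: the shutter never closes)
# 1/50   -> ~173 degrees (close to the natural-looking case above)
# 1/500  -> ~17 degrees  (almost no blur, the "unnatural" sharp look)
# 1/2000 -> ~4 degrees   (roughly what games without motion blur resemble)
```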

 

In video games there is no shutter speed; rather, motion blur is handled by the engine. The idea is to simulate a particular shutter speed, so without motion blur games look like the camera has the shutter set to something like 1/2000th. Much like video captured at a high shutter speed, this looks...odd, unless the framerate is significantly higher than 24. However, when motion blur is applied to a game, 24fps can actually look pretty natural, though probably not smooth enough for extremely precise aiming.


This is a good example of what I'm talking about. As the shutter speed increases, the motion blur disappears, and thus the video appears "unnatural." 

3 hours ago, Maes said:

BTW, 24 fps is used in cinematography, not TV productions, where the native scan rate of 50 or 60 Hz (interlaced) is still used, at least for SD material. 

Not entirely true. A lot of TV shows are shot on 24fps film or, more recently, 24fps digital video. There was a trend for a while, up until the 90s, where some shows were shot on piece-of-shit video cameras at 30fps (such as "All in the Family" or "Sanford and Son"), but shows with a higher budget were almost always shot on 24fps film ("Seinfeld" and "Friends"), although, quite bizarrely, "Dallas" was shot on 30fps film in order to make the conversion to 60Hz less of a hassle. This came as a surprise to me, because I assumed all TV shows were 30fps, but that's only the cheap ones. Soap operas are still shot at 30fps, and of course live TV broadcasts, news segments, and most talk shows are shot at 30fps. It takes a while to see the difference and notice which shows are shot at which framerates.

 

As far as I can tell, TV still broadcasts, albeit digitally, at 60Hz, and the only productions shot higher than 30fps are experimental 3D movies like "The Hobbit," which was shot at 48fps as a response to James Cameron's "Avatar," which was shot at 24fps and stutters noticeably in 3D.

 

 

1 hour ago, GoatLord said:

Soap operas are still shot at 30fps, and of course live TV broadcasts, news segments, and most talk shows are shot at 30fps.

Actually at 60i, for you Yanks, and 50i for us Euro Pinko Commies. There are still advantages to interlaced video (such as an apparently higher fps number), as well as decades' worth of available interlaced video material. Even DVB-T and similar digital TV technologies still maintain "backwards compatibility" with these modes, so to speak. Footage originally shot entirely in progressive scan modes is still relatively in the minority, though this will change in the future. The earliest convenient source of such footage for the average consumer was 24 fps Blu-rays of theatrical releases; all other consumer video media, including DVDs, followed NTSC or PAL video conventions. That is, if we exclude computer video file formats.

 

Dallas' example is more of an oddity: shot at "30p", if such terminology can be applied to film, and broadcast at 60i by necessity. Other shot-on-video soap operas exhibit...well, the "soap opera effect" for a good reason.

 

Let's not even get into what game consoles did in order to display "50p" or "60p" ;-)
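
For anyone unfamiliar with the i/p distinction here, a toy illustration (the six-line "frame" is obviously not a real video standard):

```python
# Interlaced video transmits half the scanlines at a time: one field
# carries every other line, the next field carries the lines in between.
# 60i is 60 fields per second, i.e. only 30 complete frames per second,
# whereas 60p transmits every line of every frame, 60 times per second.

frame = [f"scanline {n}" for n in range(1, 7)]   # a toy six-line frame

field_a = frame[0::2]   # scanlines 1, 3, 5 - sent in the first 1/60 s
field_b = frame[1::2]   # scanlines 2, 4, 6 - sent in the next 1/60 s

print("field A:", field_a)
print("field B:", field_b)
```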



I was reading about why PAL and NTSC have different refresh rates but can't remember the reason; I want to say it's related to voltage or something. Personally I think 25fps "European" video looks kinda shitty, but that's probably because I'm used to 24.


It mostly depends on what you are comfortable with. A higher frame rate just means that things will be more responsive, so it doesn't mean much. For me, I can't play with anything lower than 24 fps.


Sheesh, a lot of posts are using the word "tolerate" when referring to fps. Video game companies have it rough.


It's funny when you look back at 90s gaming, and what we accepted because there was no better alternative. Numerous platformers—even after the release of the smooth scrolling Commander Keen games—would scroll at a monstrous 8 pixels at a time or so. I was playing Cosmo's Cosmic Adventure recently, which has this issue (in addition to a low frame rate), but because it's such a relaxed game I can deal with it perfectly fine, even today. It seems Duke Nukem 2 has this problem as well, and yet it's still a very playable game. I also remember some horrific frame rates from Perfect Dark, particularly with the high resolution mode. I'm not sure I could play that now because I'm used to precise aiming with far higher frame rates, but back then console gaming was still pretty sluggish at times. 

