Carmack: "unfortunately, a lot of next gen games will target 30 fps"

From a console standpoint, yes. But on the PC this shouldn't be that big of an issue. A push to get developers to release 3D shooters with 120 Hz options and adjustable FOV has proven to pay off, and a lot more new games have shipped with those features.


He is most likely talking about the consoles. Given that he is very excited about Doom 4 supporting the Oculus Rift, it is very likely that PCs will go up to 120 Hz.

Platinum Shell said:

Well, the Human eye cannot really notice a difference from anything above 30-31 FPS.

Maybe YOUR eye :) ...I can easily detect 120, and I don't think that's very special.

On the other hand, the human brain can manage to make 30 FPS tolerable, and support the illusion of more or less fluid motion, but that's IMHO a bare minimum.

Platinum Shell said:

Well, the Human eye cannot really notice a difference from anything above 30-31 FPS.


I can notice one Hell of a difference.

Platinum Shell said:

Well, the Human eye cannot really notice a difference from anything above 30-31 FPS.

This has been disproven so many times.


Is it not more like after you go beyond 60 fps you can't really notice? IDK...just throwing that out there.

Aleaver said:

Is it not more like after you go beyond 60 fps you can't really notice? IDK...just throwing that out there.

I think that's more like it.
60 is clearly smoother than 30, but 30 is still pretty much playable. Below roughly 20 fps, though, I start to get headaches.

Aleaver said:

Is it not more like after you go beyond 60 fps you can't really notice? IDK...just throwing that out there.


I think it really depends on what you're using to view it. All of the computer monitors I have max out at 60 Hz, so while a lot of the games I play *say* they're running at 200+ frames per second, all I ever really *see* is 60 fps, because that's all the monitor is capable of pushing out. I remember using CRT monitors several years back that could go from 60 Hz all the way up to 120 Hz, and I recall remarking on how much smoother the image appeared.

I could be wrong, but that is merely my understanding.
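The point above can be put as simple arithmetic: with v-sync, the number of distinct frames you actually see is capped by the panel's refresh rate. A minimal sketch (the function name is mine, not any real API):

```python
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    """A monitor can show at most refresh_hz distinct images per second,
    no matter how many frames the game renders internally."""
    return min(render_fps, refresh_hz)

# A game reporting 200+ fps on a 60 Hz panel still shows at most 60 new images/s;
# the same game on a 120 Hz CRT visibly doubles that.
print(displayed_fps(200, 60))   # 60
print(displayed_fps(200, 120))  # 120
```

Without v-sync the extra rendered frames aren't wasted entirely (they reduce input latency, at the cost of tearing), but they still can't appear as additional whole images.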

kb1 said:

On the other hand, the human brain can manage to make 30 FPS tolerable, and support the illusion of more or less fluid motion, but that's IMHO a bare minimum.


That's more or less what I heard, I suppose.

Guess I got some bad information. I'll do a little more research. Again.


Meh. I can live happily while staying confined to ancient games. So this is really no big deal. They can try whatever they want cause they won't be getting any of my dollars anyway. :p


Apparently it's still a low amount but our brain fills in the gaps. Much like one of those flip-page animations.

Platinum Shell said:

Well, the Human eye cannot really notice a difference from anything above 30-31 FPS.

I think the noticeable threshold was something like 48 Hz, which happens to be just below European mains 50 Hz.


The frame rates one can notice go pretty high. Just as an example, some people get a headache when looking at a CRT monitor at 60-75 Hz (that would be 60-75 fps) because it flickers too often. Those people have to run their monitors at 85 Hz or higher to avoid those effects, so obviously your eyes can tell the difference.

Of course the headache issue doesn't happen on TFTs because of the monitor technology being different, so no "hurr durr my 60 Hz TFT doesn't make me bleed you're wrong" plz. :P

Jodwin said:

The frame rates one can notice go pretty high. Just as an example, some people get a headache when looking at a CRT monitor at 60-75 Hz (that would be 60-75 fps) because it flickers too often. Those people have to run their monitors at 85 Hz or higher to avoid those effects, so obviously your eyes can tell the difference.


They can maybe tell a difference, but that difference likely isn't smoothness of motion on the screen. So it's important, but not for the reason some people would assume.

As for 30 vs 60, I don't think much is going to change. Anecdotally, I think I don't see 60 very often. I say that because I have the impression that I'm surprised at how smooth 60 looks when I happen to find myself playing a game at that framerate. Maybe the majority of console games are at 30.


The difference between 30 and 60 is huge for fast action.

As for CRT flicker, I find 50 Hz PAL TVs completely unwatchable. They look like strobes to me. The centres of my retinas are basically useless so I look at everything with peripheral vision. Anything flickering below 75-85 Hz is painful for me. LED Christmas lights aren't fun either. 60 Hz monitors are also unpleasant, but much better than 50 Hz.


How can you NOT see the difference between 30 and 60 especially in an FPS game? For me 30 is playable and 60 is optimal.


I recall there was a blog that pretty much filled the fps myth's ass with concrete.

That blog analyzed some borderline cases: e.g. if in a completely black room, you're flashed a single white rectangle for an instant, or the opposite, in a white room you're flashed a black one, then you can "perceive" a difference as fast as 1/800 sec (or 800 fps), but not actually make sense out of it.

If you need to make sense out of what you see, then you must go much, much lower than that. The final "decision" was that, to keep your mind at peace, you should ONLY play at 800 fps after reading that blog. And even then I'm sure that for SOME, that wouldn't be enough ;-)
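For scale, the 1/800 s figure quoted above corresponds to a per-frame exposure of just over a millisecond; a quick back-of-the-envelope check:

```python
# Converting the claimed detection threshold into a per-frame exposure time.
threshold_fps = 800
exposure_ms = 1000 / threshold_fps
print(f"one frame at {threshold_fps} fps lasts {exposure_ms:.2f} ms")  # 1.25 ms
```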


Also, a flash of white in darkness is more perceptible than a flash of black in bright light. The eye still "sees" the last image for a very short while in the absence of new stimulus. (This is how stroboscopes work.) You might not notice if lights go off for a fraction of a second while you were in a bright room; but you will definitely notice a flash of bright light if you were in total darkness.

Platinum Shell said:

Well, the Human eye cannot really notice a difference from anything above 30-31 FPS.

Seriously, THIS again? Look what you've started!

Frame rate perceptibility is one of the great schisms of the gaming and hardware enthusiast communities, if not the most enduring topic of debate and division. We're never going to reach a mutual conclusion on what frame rate is ideal, or on the refresh rate beyond which a monitor redraws the screen unnecessarily often.

DoomUK said:

Frame rate perceptibility is one of the great schisms of the gaming and hardware enthusiast communities


Hmm... so who are the Shiites, the Sufis, the Orthodox and the Catholics, and who are the Protestants? :-p


Tsk tsk. And to think that Doom, at least, should've taught everybody that a refresh rate "almost as fast as television" is good enough, especially if it can actually be sustained.

Having been brought up with Doom frame rates ranging between 8-24 on a 486, a sustained 30 or 35 fps (for a non-scrolling game, i.e. anything 3D) sounds like the ultimate. Watching a movie at the theater wouldn't have been any better (a conventional one, at least).


If you want to see some real badass flicker, get an Amiga 500 and set Workbench to high resolution with light background. It'll start getting annoying real quick. Helps to use a white text on black background though (and make the room dark by turning off lights, covering windows, etc.) But unless you get a converter to allow using a real VGA monitor, there's only so much you can do with the PAL/NTSC signal.

I never saw horrible flicker on PCs with SVGA cards, though. Sure, if you turn your head sideways and observe carefully you might notice the refresh, but I never got headaches from it or any kind of distraction in regular use. For desktop work I'd set my refresh to 70 Hz and it was fine all day. Later on, after getting a new computer/monitor, I tried higher refresh rates (up to 85 Hz) but didn't notice anything better. Other than the monitor being nicer (finer dot pitch, bigger screen), it didn't feel any more comfortable.

Game framerate is a different matter, and I found it tolerable down around 15-20 fps (often the case when playing Quake 1 on a 28.8 Kbps modem), but the 35 fps that vanilla DOOM caps it at is perfectly fine. I'm more interested in a steady framerate than a high one, because when the framerate speeds up or slows down quickly it feels unnatural, and the jerkiness can also cause me to screw up (miss a shot, run into a projectile, get stuck on architecture, fall into pits, etc.)


Well, the Amiga's "monitor" was nothing but a TV-grade CRT. Any budget 14" TV with a SCART input in RGB mode was more than a match for it, so the comparison with even "office grade" SVGA monitors of the period is really not a fair one: they had better specs all-around for pure RGB signals, better multisyncing capabilities, and smaller dot pitches (no scanlines!).

hex11 said:

Game framerate is a different matter, and I found it tolerable down around 15-20 fps (often the case when playing Quake 1 on a 28.8 Kbps modem), but the 35 fps that vanilla DOOM caps it at is perfectly fine. I'm more interested in a steady framerate than a high one, because when the framerate speeds up or slows down quickly it feels unnatural, and the jerkiness can also cause me to screw up (miss a shot, run into a projectile, get stuck on architecture, fall into pits, etc.)


Agreed, though the tolerability of lower framerates depends on the game genre. For FMV or 3D, lower is fine (I still mean movie-grade, so around 20-24 fps is OK; lower than that, only in exceptional cases), but for 2D platformers/shooters, jerkiness is unacceptable: I'd rather have a slowdown/"miss no frame" strategy in such cases, as many arcade games seem to do, than frame skipping (which is instead the norm for 3D games).

BTW, worst games ever: pinball or breakout games with frame skipping. UGGGGGGHHHHHHHHHHH

DooM_RO said:

How can you NOT see the difference between 30 and 60 especially in an FPS game? For me 30 is playable and 60 is optimal.


Maybe I missed someone's post, but who said they couldn't see the difference in 30 and 60?


Not sure what the big deal is. DOOM was capped at 35 fps, and it's at least as fast-paced as any modern game I've seen.
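For comparison, here is the per-frame time budget at the rates discussed in this thread; DOOM's 35 fps cap, as I understand it, comes from running its game logic at half of the 70 Hz VGA refresh rate:

```python
# Frame-time budget at common rates. Each halving of the rate
# doubles the time a single frame stays on screen.
for fps in (30, 35, 60, 70, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```

So the jump from 30 to 60 fps cuts the time between new images from about 33 ms to about 17 ms, which is why it is so noticeable in fast action.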

DataSnake said:

Not sure what the big deal is. DOOM was capped at 35 fps, and it's at least as fast-paced as any modern game I've seen.

But I'm used to "uncapped" ZDoom and if I play at 35 FPS it is very noticeable to me - and not in a good way IMO.


The human eyes cannot perceive more than 30 FPS, global warming is a hoax, we didn't land on the Moon, and the Earth is flat. You guys really need to read up on the facts.

