Doom_user

Does the uncapped framerate cause input lag/display lag?



After reading an explanation of how it works, it seems to me that on a 60 Hz display there would have to be at least 11/12 of a tic (26.19 milliseconds) of input lag/display lag compared to the capped framerate.
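
(For reference, one gametic is 1/35 s ≈ 28.571 ms, so 11/12 of a tic works out to 11/12 × 28.571 ms ≈ 26.19 ms, which appears to be where that figure comes from.)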


The timing of player input would be unaffected, but visualization of the results of the player's actions might indeed be out of phase, either ahead of or behind the moment the input is processed. With a 35 Hz refresh rate (matching the TICRATE), the player can base his input on the game's actual status: What You See Is How Things Really Are.

With "fractional rendering" however, player input does not necessarily coincide with the time he gets visual feedback: you may get rendering e.g. 1/35th of a second right before input gets processed (which is really not that bad), or some tics later (which could be problematic).

What saves the day is that the "phase" (the delay between the last rendering and the player's input) is variable, so you never get consistently delayed or anticipated renderings.

[Edit: I had some calculations here, but I screwed up one major assumption. Disregard]

Maes said:

In general, for any given refresh rate, you will get a fixed pattern of premature and delayed renderings (premature: if it's up to 17 tics before player input, delayed: up to 17 tics after player input. 18 tics would be a middle ground).

However, this is not seen as much of a problem because (typically) you will get at least two renderings per second, and at least one of them will be close enough to the player input time. That being said, there are indeed "trouble seconds".

Some examples with refresh times expressed in tics:

50 Hz: one refresh every 24.5 gametics.


I have no idea what you're talking about.

If you have 35 game tics per second, and 50 screen refreshes per second, then you get about 1.43 refreshes per gametic. 35 of the 50 renderings will be "on the tic", and interspersed with them will be 15 refreshes that show the previous tic again (possibly with interpolation based on prediction to make it look smoother).

How do you derive the display lagging 17 tics behind (half a second!) from having a refresh rate greater than the game logic?
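
A quick way to sanity-check that 35-on-the-tic / 15-repeat split is to map each refresh in one second to the gametic it displays (a throwaway C snippet, not code from any port):

```c
/* Map each of the 50 refreshes in one second to the gametic current at
 * that moment, and count how many merely repeat the previous refresh's
 * tic. */
#include <stdio.h>

int main(void)
{
    int repeats = 0, prev_tic = -1;

    for (int r = 0; r < 50; r++) {
        int tic = r * 35 / 50;       /* gametic current at refresh r */
        if (tic == prev_tic)
            repeats++;               /* same tic shown twice in a row */
        prev_tic = tic;
    }
    printf("%d of 50 refreshes repeat a tic\n", repeats);  /* prints 15 */
    return 0;
}
```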


I believe the perception of "lag" is entirely psychological, due simply to the empirical fact that the rate of input sampling has not changed. However, that doesn't mean I think it's "fake" or unimportant. Lots of psychological effects are very important, and systems sometimes have to try to compensate for them.

EE's sound engine is an example. There are a limited number of sound channels available, so Eternity uses a "psycho-acoustic" model to determine what is most important to hear. The result is a more pleasant play experience.

However, with input I don't see what can be done - you can't interpolate the mouse input, because the game logic is only updating at 35 Hz. You can read the mouse between those tics all you want, but the game isn't going to respond to any of that accumulating input until it's time for the next tic to run.
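
As a minimal sketch of that point (hypothetical names, not Eternity's actual input code): polls between tics only accumulate in a buffer, and the game consumes the buffered input in one lump when the next tic runs, so the effective input rate stays at 35 Hz no matter how fast you poll.

```c
/* Why faster polling alone doesn't reduce input latency. */
#include <stdio.h>

static int accum_dx;                 /* mouse movement gathered so far */

static void PollInput(int dx)        /* may run at any refresh rate */
{
    accum_dx += dx;                  /* buffered, not yet acted upon */
}

static void RunGametic(void)         /* runs at exactly 35 Hz */
{
    /* Only here does input change the game state. */
    printf("tic consumed %d units of turn\n", accum_dx);
    accum_dx = 0;
}

int main(void)
{
    PollInput(3);    /* three polls arrive between two tics... */
    PollInput(2);
    PollInput(4);
    RunGametic();    /* ...but are applied in a single lump here */
    return 0;
}
```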

The only solution would be to allow higher gametic rates - say, allowing 70 Hz by doubling tic values and halving movement impulses. This would drastically change some things in the game, though, even if everything is mathematically "perfect". You'd get complaints about glides, wall running, and the timing of stuff.

Gez said:

How do you derive the display lagging 17 tics behind (half a second!) from having a refresh rate greater than the game logic?


WTF, major braino there. Point taken. I will have to redo everything. :-/

OK, here's the thing: for comparing 35 and 50 Hz, their LCM is 350, so it's best to see what happens in terms of 1/350ths of a second; let's call that one Time Unit (TU). 1 TU = 1/350 sec.

One GAMETIC is then 10 TUs, while a screen refresh occurs every 7 TUs. They are guaranteed to sync up every 350 TUs (1 second of real time), but what happens in between?

Screen refresh @ 50 Hz is perfectly matched with the ticrate at 0, 70, 140, 210, 280 and 350 TUs or, in terms of seconds, at 0.0, 0.2, 0.4, 0.6 and 0.8 seconds after the start of each second - i.e. every 7 gametics.

So for every second at 50 Hz refresh, you get at least 5 "in-sync" screen refreshes. The other 45 will be off-phase to some degree, but by no more than 5 TUs, or 14.3 ms (5/350 sec). Whether this is enough to result in perceptible lag between player input and screen output is debatable: certain cheap TFT monitors add far more than that (70 ms or more), and playing over the Internet can add over 100 ms, if not over 200.

For a refresh rate of 60 Hz we need a different TU base: the LCM of 35 and 60 is 420, so 1 gametic = 12 TUs and 1 screen refresh = 7 TUs (the same number as before, but for a different TU). This time they sync up every 84 TUs (0, 84, 168, 252, 336, 420), which means 5 times per second, or every 200 ms. The maximum delay can now be up to 6 TUs, but that is once again about 14.3 ms (6/420 sec).
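
If you want to check this arithmetic mechanically for other refresh rates, a throwaway C program (nothing from any source port, just the TU definitions above) does the job:

```c
/* For a given refresh rate, find how far each screen refresh lands from
 * the nearest 35 Hz gametic boundary, in Time Units (TUs). */
#include <stdio.h>

static int gcd(int a, int b) { return b ? gcd(b, a % b) : a; }

static void phase_report(int ticrate, int refresh)
{
    int lcm = ticrate / gcd(ticrate, refresh) * refresh; /* TUs per second  */
    int tic_tu = lcm / ticrate;      /* TUs per gametic        */
    int ref_tu = lcm / refresh;      /* TUs per screen refresh */
    int worst = 0;

    for (int t = 0; t < lcm; t += ref_tu) {
        int off = t % tic_tu;                            /* phase in tic */
        int d = off < tic_tu - off ? off : tic_tu - off; /* nearest tic  */
        if (d > worst)
            worst = d;
    }
    printf("%d Hz: 1 TU = 1/%d s, worst offset = %d TU (%.2f ms)\n",
           refresh, lcm, worst, 1000.0 * worst / lcm);
}

int main(void)
{
    phase_report(35, 50);   /* worst offset 5 TU = 14.29 ms */
    phase_report(35, 60);   /* worst offset 6 TU = 14.29 ms */
    return 0;
}
```

For 50 Hz it reports a worst offset of 5 TU (14.29 ms), and for 60 Hz 6 TU (also 14.29 ms), matching the figures above.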


I am thinking along other lines.
The eye is limited to about 50Hz. At 60Hz it cannot perceive individual events anymore. Let us assume the entire nervous perception system runs at about 50Hz.

Also present is anticipation, which would likewise run at about 50Hz.
The tic rate, 35Hz, is obviously slower, so there is some lag.
When video and input are both lagging, anticipation adjusts itself to eliminate the perceived error and to anticipate better. That is what it does.

When the video rate gets better, it gives faster feedback to anticipation. As anticipation adjusts to the confirmations of events given by the faster video rate, it becomes more aware of the error (measured against anticipation) of the input, which is still limited to the 35 Hz tic rate.

The lag has not changed; the perception of it has, because anticipation is now error-adjusting against something better than the input tic rate.

This is my view on it.

wesleyjohnson said:

I am thinking along other lines.
The eye is limited to about 50Hz. At 60Hz it cannot perceive individual events anymore.


That's an oversimplification. First, it varies greatly between individuals, and second, it also depends on the individual event in question. For example, a flash in darkness that lasts 1/100th of a second is going to be more noticeable than a bright light going out for 1/20th of a second.


Uncapped framerates do not cause additional input latency. However, most ports that have uncapped framerates also use interpolation to approximate the player's view angle and position between 35 Hz gametics. This does add a sense of input latency, because there is now additional latency before the result of the player's input is displayed on the screen.

Interpolation requires two known data points; for a Doom renderer trying to interpolate the player's viewing position, those two data points are the player's position in the current gametic and the player's position in the previous gametic. If it has been 14 ms (approximately 50% of a gametic) since the game last ran its input handling and physics, interpolation will find the position halfway between the previous position and the current position, and then render from that interpolated position.

So the interpolated position is always somewhere between the current position and the previous position. This creates a varying latency that can range from 0 ms to 28.6 ms (1/35 s).
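
In code, the scheme described above looks roughly like this (a minimal sketch in Doom-style 16.16 fixed point; the names are illustrative, not from a specific port):

```c
#include <stdint.h>
#include <stdio.h>

typedef int32_t fixed_t;
#define FRACBITS 16
#define FRACUNIT (1 << FRACBITS)

/* Interpolate between the previous and current gametic values.
 * frac is how far we are into the current tic, in 16.16 fixed point:
 * 0 = show the previous tic, FRACUNIT = show the current tic. */
static fixed_t lerp_fixed(fixed_t prev, fixed_t cur, fixed_t frac)
{
    /* prev + frac * (cur - prev): the result always lies between the
     * two known data points, so the rendered view trails the true
     * position by anywhere from 0 up to one full gametic (~28.6 ms). */
    return prev + (fixed_t)(((int64_t)(cur - prev) * frac) >> FRACBITS);
}

int main(void)
{
    fixed_t prev_x = 0 * FRACUNIT;    /* player position, previous tic */
    fixed_t cur_x  = 16 * FRACUNIT;   /* player position, current tic  */

    /* Halfway into a tic (the ~14 ms example above), render from the
     * midpoint of the two positions. */
    fixed_t view_x = lerp_fixed(prev_x, cur_x, FRACUNIT / 2);
    printf("view_x = %f\n", view_x / (double)FRACUNIT);  /* 8.000000 */
    return 0;
}
```

Extrapolating ahead of the current position instead of interpolating behind it would hide this latency, at the cost of the kind of mispredictions discussed in the next post.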

Dr. Sean said:

This creates a varying latency that can range from 0 ms to 28.6 ms (1/35 s).


A lag of a full 1/35th of a second cannot occur with any of the usual refresh rates (50, 60, 70, etc.), as there is no instance where the screen won't be updated at all between two distinct gametics. Even if you choose a pathological rate, e.g. 36 Hz, the screen refresh will never "stray" farther than half a gametic.

You'd need to choose a refresh rate SLOWER than the ticrate, e.g. 34 Hz, for an entire gametic of lag to occur regularly.

It may also happen when the player's (or any other object's) movement changes in a way that the interpolation algorithm cannot predict (e.g. stopping on a dime, or a full-speed reversal within a single gametic); then you'll indeed get a bogus display.

