Patrol1985

What's the tickrate of Wolfenstein 3D?


I know that the vanilla Doom engine is internally locked to 35 FPS. I'd like to ask if any of you know the FPS limit for Wolfenstein 3D? Was there any improvement in this regard between Wolfenstein 3D and Doom?

70Hz, which is defined by the vsync rate of the VGA mode used.

However, unlike Doom, actor logic does not always run at 70Hz. A typical frame rate for the day was probably around 17fps, and the AI would indeed run 17 times per second in that case. The rest of the tics that passed per frame would be accounted for by simply multiplying the effect by the number of tics passed. This makes the Wolf3D engine completely non-deterministic. (The slowest the engine can run is 7Hz.)

id Software was aware of this, and thus demos are recorded and played back with a constant tic skip (4 tics per frame). In that case the TICRATE is effectively 17.5Hz.
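A minimal Python sketch of the adaptive-tics scheme described above (the function names are hypothetical, and the 10-tic clamp is my assumption, inferred from the 7Hz floor, i.e. 70/10):

```python
VBL_RATE = 70  # VGA vertical blank interrupts per second

def frame_tics(vbls_elapsed, demo=False):
    """How many game tics to simulate for this frame."""
    if demo:
        # Demos use a fixed skip of 4 tics -> 70/4 = 17.5 logic updates/sec.
        return 4
    # Normal play adapts to the real frame time; the clamp at 10 tics
    # matches the 7Hz floor (70/10) mentioned above.
    return max(1, min(vbls_elapsed, 10))

def move_actor(pos, speed, tics):
    # Movement is scaled by elapsed tics, so a slower machine takes
    # bigger per-frame steps rather than slowing the game down.
    return pos + speed * tics
```

On a machine managing ~17fps, each frame passes about 4 VBLs, so the AI runs ~17 times per second with movement scaled up 4x, which is where the non-determinism comes from.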


I'm surprised this isn't a Sodaholic thread.

Patrol1985 said:

Do I understand correctly then that Doom is twice as smooth as Wolfenstein 3D (17 vs. 35)?

Assuming a modern computer, only during demo recording/playback. Under normal play Wolfenstein 3D is twice as smooth as Doom (70 vs. 35).


Although as a side note, Rise of the Triad runs at 35Hz. I think they also fixed the non-determinism.

Blzut3 said:

Under normal play Wolfenstein 3D is twice as smooth as Doom


That's intriguing :o Why would they decrease the tickrate in the Doom engine? To save resources for more advanced rendering than in Wolfenstein 3D?

Maes said:

I'm surprised this isn't a Sodaholic thread.


I'm flattered :P

Maes said:

I'm surprised this isn't a Sodaholic thread.

Heh. ;)

Patrol1985 said:

That's intriguing :o Why would they decrease the tickrate in Doom engine? To save resources for more advanced rendering than the one in Wolfenstein 3D?

Basically, yeah. Not only the rendering, but it had more complex gameplay code (especially the monster line-of-sight checks) that could have caused a performance hit on contemporary hardware, at least assuming deterministic behavior. Wolf 3D also made occasional use of x86 assembly, while Doom was written in pure C.

They probably could've gotten away with 70Hz if they had used a smaller view window like originally intended, had self-adjusting behavior like Wolf 3D, and written the renderer itself in assembly. It's worth noting that neither the reduced framerate nor the pure-C aspects of Doom were always the case.


Most CRT screens had a 70 Hz refresh rate, so targeting that value (or a clean fraction of it) made sense. Kind of like nowadays games tend to target 60 FPS or 30 FPS because most LCD screens have a refresh rate of 60 Hz.

Gez said:

Most CRT screens had a 70 Hz refresh rate, so targeting that value (or a clean fraction of it) made sense. Kind of like nowadays games tend to target 60 FPS or 30 FPS because most LCD screens have a refresh rate of 60 Hz.


Gotcha! Now I understand it. Thanks!

Gez said:

Most CRT screens had a 70 Hz refresh rate, so targeting that value (or a clean fraction of it) made sense.

VGA. Televisions either operated at 60 in some places (usually politically influenced by the United States in some way) or 50 in the rest of the world.

I assume you're non-American since you tend to refer to the US and its people in the 3rd person, so shouldn't you be more familiar with differing, lower refresh rates on CRT displays?

Gez said:

Kind of like nowadays games tend to target 60 FPS or 30 FPS because most LCD screens have a refresh rate of 60 Hz.

Not really. It had more to do with most console games natively targeting NTSC TVs, and LCDs happened to pick up that refresh rate as the pre-established standard. Funny, given that 50Hz was technically far more common globally, even if barely supported by games. Probably a result of the most influential engineers being in places like the US, Japan, and South Korea.

Sodaholic said:

They probably could've gotten away with 70hz if they used a smaller view window like originally intended, had self-adjusting behavior like Wolf 3D and wrote the renderer itself in assembly.


That's extremely optimistic. For one, the percentage of CPU time spent on parts other than actually drawing pixels to the screen is not negligible; it may amount to 50% of total CPU time or even more, especially in maps with a lot of actors. "A lot" may mean 5000-10000 today on NUTS.WAD or something, but on a 486 it might have meant as little as 200 active monsters on the same map, without even necessarily seeing them, as early "nuts-like" mods like DMINATOR.WAD prove. You were lucky to get 1 fps in some spots.

But even assuming that on a given system non-rendering CPU time was on average 30% and that you could barely get a constant 35 fps on all of them most of the time, that left you with 70% of the CPU time being rendering.

In order to get double the framerate, you really needed to reduce the overall CPU time to 50% of the current one. Since non-rendering calcs would remain constant (assuming you didn't optimize them at all), they now would be 60% of the halved CPU time, leaving you 40% for rendering.

So, for simplicity's sake, if a tic was 1 second, what you could render in 0.7 seconds before, now you have to render it in just 0.2 seconds, since non-rendering time will still be 0.3 seconds in either case.

That's more than a tripling of rendering speed required to achieve double the framerate. You could of course reduce the number of on-screen pixels by a third or more by shrinking the view window, but the benefits from shrinking it are far from linear, as a significant part of the renderer's CPU time is actually spent in the BSP calcs, and those do not get any faster at lower resolution (at least not significantly). On the bright side, they don't get much slower by increasing it, either.
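The budget arithmetic above can be checked in a few lines, using the same assumed split (30% game logic, 70% rendering per frame):

```python
# Fractions of one frame's CPU time under the assumed split.
logic = 0.3    # non-rendering work; constant in absolute terms
render = 0.7   # rendering work at the original frame rate

# Doubling the frame rate halves the time available per frame,
# but the logic cost does not shrink with it.
frame_budget = (logic + render) / 2
render_budget = frame_budget - logic

# Required speedup of the renderer alone: 0.7 / 0.2, i.e. ~3.5x,
# which is the "over a tripling" figure above.
speedup = render / render_budget
```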


I will note that there are a large number of differences in how the Wolf3D engine and Doom engine work internally beyond rendering techniques. Increasing Doom to 70Hz wouldn't have significantly increased line-of-sight checks, as the AI routines aren't called every tic. Besides going through the list of actors to decrement the frame duration, the only thing it would do every tic is player movement. However, the time spent processing the actor list is probably better spent on the renderer if typical computers weren't going to get 35fps anyway. (Seriously, people often forget that most computers didn't obtain max frame rate on these games when they came out. I've never even seen a 486 do so, but that's probably because we never had a gaming graphics card.)

By contrast, in Wolfenstein 3D the AI routines are called almost every tic (except during those brief pauses each "step"). This has interesting side effects when paired with the adaptive tics system, in that even the speed and aggressiveness of enemies is indeterminate, although it's hard to notice. (The easiest way to notice this is the fake Hitler fireballs, which have a typo that makes them work correctly only when adaptive tics are used, but technically the speed variation happens on a much smaller scale for other actors. Since the slight pauses are intentional, I can't even tell you whether actors will move faster or slower than they should; it depends on which frame you spend more tics on.)

Gez said:

Most CRT screens had a 70 Hz refresh rate, so targeting that value (or a clean fraction of it) made sense. Kind of like nowadays games tend to target 60 FPS or 30 FPS because most LCD screens have a refresh rate of 60 Hz.

CRT screens don't have a refresh rate in the same way they don't have a resolution. The explanation is more specific (it was mentioned in my first post here): CGA/EGA/VGA 320x200 is 70Hz. This is basically a quirk of the resolution, to go along with the rectangular pixels, as 320x240 and 640x480 run at 60Hz.
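For reference, the 70Hz figure falls out of the standard VGA timings: a 25.175 MHz pixel clock, 800 clocks per scanline, and 449 total scanlines for the 200-line modes versus 525 for the 480-line modes. A back-of-envelope check (not engine code):

```python
PIXEL_CLOCK = 25_175_000   # Hz; the standard VGA 25.175 MHz dot clock
CLOCKS_PER_LINE = 800      # total horizontal clocks, including blanking

def refresh_hz(total_lines):
    """Vertical refresh rate for a mode scanning the given total lines."""
    return PIXEL_CLOCK / (CLOCKS_PER_LINE * total_lines)

# 320x200 (mode 13h) scans 449 total lines -> ~70Hz
# 640x480 scans 525 total lines -> ~60Hz (59.94)
```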

Blzut3 said:

However, unlike Doom, actor logic does not always run at 70Hz. A typical frame rate for the day was probably around 17fps and the AI would indeed run 17 times per second in that case. The rest of the tics that passed per frame would be accounted for by simply multiplying the effect by the number of tics passed. This makes the Wolf3D engine completely indeterministic. (The slowest the engine can run is 7Hz.)

Interesting. Does ECWolf retain that quirk?

printz said:

Does ECWolf retain that quirk?

No, ECWolf is deterministic. Otherwise multiplayer wouldn't work. Besides, computers these days can easily obtain 70fps@320x200.


Maes is absolutely correct; there's basically no chance that Doom could've matched Wolf's capacity for 70fps given the hardware of the time. Think about it: a nice 486 might only barely meet that level of smoothness consistently for Wolf, and Doom came out only a year and a half later with more graphical bells and whistles than any game before it. It was the Crysis of its time.

Carmack very consciously picked the 35Hz ticrate for Doom because he knew that making it a standard was already setting the bar quite high for consumer hardware of the era.

sheridan said:

Think about it, a nice 486 might only barely meet that level of smoothness consistently for wolf


Are you sure? I thought a 386 was more than enough to get everything there was to get from Wolf 3D.

Patrol1985 said:

Are you sure? I thought a 386 was more than enough to get everything there was to get from Wolf 3D.

Depends on your video card. My 486 DX4/100 isn't able to run Wolf3D at 70fps or Doom at 35fps. I don't recall how it performed on the 386 I have, but I don't recall it being particularly smooth. One thing is for sure: I do remember the fast Hitler fireballs, so it had to be sub-70.


Maybe Blzut3 had one of those weird overdrive 486s coupled with a really dodgy video card, or played DOS games within Windows 3.1, which could have unpredictable effects on their performance.

Blzut3 said:

Increasing Doom to 70Hz wouldn't have significantly increased line of sight checks as the AI routines aren't called every tic.

...what? All thinkers are stored in a list that is iterated over every tic.

Maes said:

weird overdrive 486s

I think it is an overdrive, but it's not the "co-processor" variant. Does this actually make a difference? That is to say, it was harvested from one broken 486 (well, it used a proprietary PSU) and swapped into the one and only socket in another, more standard tower with VLB.

Maes said:

really dodgy video card

I believe it's a garbage pick from work. I'm sure it is. I do have a 486 with a VLB card, but it seems to be one of the worst VLB cards available. It's a "typically good" brand (can't remember off the top of my head, but maybe SiS), but I swear the thing can't exceed 20-something fps.

But that is exactly my point: while some people are no doubt going to post that their 25MHz 386 ran Doom just fine, I'm going to assume most people at the time did not have especially great video hardware. Otherwise, I can't figure out why a good card never came to my household at any point. (We actually have multiple 486s, and the ThinkPad has the best performance out of all of them. That one might actually get full frame rate in some games, but I haven't personally played with it.)

Linguica said:

...what? All thinkers are stored in a list that is iterated over every tic.

Right, but the AI routines are only called when the frame changes. For the Zombieman, that's every 4th tic while active, every 10th tic while looking for the player. Any other tic the only thing that happens is the duration is decremented.
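A toy Python sketch of that distinction (hypothetical names; in the real engine this logic lives in the thinker and state-table code): the expensive AI action runs only when the current state's tic counter expires, while every other tic just decrements it.

```python
class Actor:
    def __init__(self, tics, action):
        self.tics = tics        # tics left in the current state
        self.action = action    # e.g. A_Chase; where LOS checks happen
        self.calls = 0

def chase(actor):
    actor.calls += 1            # stands in for the expensive AI work

def run_tic(actors, state_tics=4):
    # One game tic: only actors whose state just expired run their action.
    for a in actors:
        a.tics -= 1
        if a.tics <= 0:
            a.action(a)
            a.tics = state_tics  # Zombieman chase frames last 4 tics

zombie = Actor(4, chase)
for _ in range(35):              # one second at Doom's 35Hz
    run_tic([zombie])
# The action runs only 8 times in those 35 tics, not 35 times.
```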

Blzut3 said:

I think it is an overdrive, but it's not the "co-processor" variant. Does this actually make a difference?


From my experience, overdrive machines of any kind (whether it was a 386 to 486 overdrive, or 5V 486 to Pentium or DX/4 486) exhibited all sorts of weird bottlenecks and limitations, especially with regards to their memory and system bus speed.

Sometimes they worked as fast as expected (pure CPU tasks, preferably those that could take advantage of the cache(s)), but when it came to push a lot of data through the bus or do a lot of memory moving, they fell flat on their face.

This is to be expected, if you think about it. You can't expect pure 486 performance from a 486 overdrive mounted on a 386 mobo, which still uses the 386 mobo's chipset. Similarly, you can't expect Pentium or DX4 performance from an old 5V 486 mobo still using the same chipsets and memory types as lower-performance CPUs. At best, you will get very uneven and circumstantial performance.

The most extreme case I had seen was a weird 486 DX4/120 system which somehow got jury-rigged onto a lowly ISA-only 486 mobo. Most hardware detection tools couldn't detect it as a 486, and while some things ran very fast (e.g. loading Windows), other things (like running scene demos) were actually worse than on my 486 DX/50!

VLB video card performance is another sore point: the average user often operated those cards only in ISA mode, especially in the VGA modes used by most DOS games. To fully use their VLB capabilities, you had to use special drivers, VESA modes, or software which could actually take advantage of them.

PCI videocards didn't have this dichotomy, AFAIK, and you always got the faster bus's speed benefits.

BaronOfStuff said:

A 486 was certainly more than enough for Wolf3D, or at least I never encountered anything close to gameplay issues (DX2 66MHz).

More than enough by the standards of the time. After being spoiled on consistently smooth 60Hz gameplay for years, it's easy to forget how games looked and felt on the rocky framerates of yesterday. (I'm not saying that to put anybody down; the truth is I can hardly stand Doom at 35Hz now that I mostly play source ports like GZDoom.)

