KVELLER

What's with the arbitrary limitations of the Doom engine?


Only 128 sprites can be on screen, demos can't be larger than 128 KB, only 30 platforms can be active at any given time, etc. Why is this? I'm guessing it was made for performance reasons, but why not just let people go nuts(.wad) like we do today? Was hardware not able to stop itself before incinerating or something? What's the reason behind this?

21 minutes ago, KVELLER said:

Why is this?

 

21 minutes ago, KVELLER said:

I'm guessing it was made for performance reasons

Doom was programmed in C. Unlike most modern languages, C doesn't automatically give you arbitrarily sized data structures that can grow and shrink as needed. If you want a certain amount of data in memory, you have to explicitly allocate and keep track of the memory for that data (or else have some C library in place that keeps track of it for you). If you want to draw 129 sprites on screen, you need a data structure that can hold information about 129 sprites at once, which means it can't just be fixed at 128 permanently; it needs to be able to grow.

 

Due to time and performance constraints, John Carmack didn't bother making all data structures in the engine arbitrarily and dynamically sized, because that would have made the engine more complicated. Instead, he used statically sized arrays and so forth all over the place, and trusted the designers to stay within those limits.
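
In concrete terms, the static-array pattern looks something like this (a minimal sketch, not actual Doom source; thing_t, MAXTHINGS, things, and NewThing are made-up names):

#include <stdio.h>

#define MAXTHINGS 128              /* hard limit, fixed when the .exe is compiled */

typedef struct { int x, y; } thing_t;

static thing_t things[MAXTHINGS];  /* storage reserved once, for the life of the program */
static int     numthings;          /* how many slots are currently in use */

/* Claim the next slot, or fail if the fixed array is already full. */
static thing_t *NewThing(int x, int y)
{
    if (numthings == MAXTHINGS)
        return NULL;               /* past the limit: nowhere left to put it */
    things[numthings].x = x;
    things[numthings].y = y;
    return &things[numthings++];
}

int main(void)
{
    int i;
    for (i = 0; i < 200; i++)
    {
        if (NewThing(i, i) == NULL)
        {
            printf("hit the %d-element limit at i = %d\n", MAXTHINGS, i);
            break;
        }
    }
    return 0;
}

Nothing ever grows; depending on which limit you hit, vanilla Doom either quietly drops the excess (sprites) or exits with an error (visplanes, plats).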


It's easier and probably faster to have an array of 30 plat thinkers that you recycle instead of having to allocate and free them on the fly. They do that when it's important (map objects), but beyond that most limits can be managed by the designers, and hey, if they went over they could yell at Carmack or Romero to increase the limit.
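
Roughly, that recycling pattern looks like this (a sketch of the idea, modeled on the activeplats pool in the Doom source but heavily simplified; plat_t here is a stub):

#include <stddef.h>

#define MAXPLATS 30                    /* fixed pool of active platform slots */

typedef struct { int tag; } plat_t;    /* stub; the real struct holds much more */

plat_t *activeplats[MAXPLATS];         /* the 30 slots, reused over and over */

/* Park a platform in the first free slot; fail if all 30 are busy. */
int AddActivePlat(plat_t *plat)
{
    int i;
    for (i = 0; i < MAXPLATS; i++)
    {
        if (activeplats[i] == NULL)
        {
            activeplats[i] = plat;
            return 1;
        }
    }
    return 0;    /* no free slot; vanilla Doom bails out with an error here */
}

/* Clear the slot so the next platform can reuse it. */
void RemoveActivePlat(plat_t *plat)
{
    int i;
    for (i = 0; i < MAXPLATS; i++)
    {
        if (activeplats[i] == plat)
        {
            activeplats[i] = NULL;
            return;
        }
    }
}

No malloc or free anywhere in the hot path; the 30 slots just get reused.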

56 minutes ago, Linguica said:

Due to time and performance constraints, John Carmack didn't bother making all data structures in the engine arbitrarily and dynamically sized, because that would have made the engine more complicated.

Not to mention that dynamic limits would have made the engine slower and required more memory, and with Doom's performance already only barely adequate on the 486DX/DX2 machines of the time, that was unacceptable. Every clock cycle and kilobyte of RAM counted in those days.

1 hour ago, KVELLER said:

demos can't be larger than 128 KB,

 

This is also not the best example because later versions of Doom actually let you use -maxdemo to raise this. A better example is maximum savegame size, where this comes about not because Carmack arbitrarily decided on 180KB, but because he decided to stick the savegame data in the screen buffer as a temporary holding spot. This made his life easier because he already had a decently sized chunk of memory he could read and write from and he didn't need to allocate extra memory just for the savegame. (It also meant that he didn't have to worry about players on low-spec PCs literally running out of RAM and being unable to save their game - if the game already runs, it can save a game, because it's just using memory it already allocated.)
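
Very roughly, the trick looks like this (a sketch of the idea only: in the real engine the scratch space is a chunk of the already-allocated screen memory and the write helpers are different, here it's just a static array and a dummy byte):

#include <stdio.h>

#define SAVEGAMESIZE 0x2c000                 /* the ~180 KB ceiling mentioned above */

static unsigned char scratch[SAVEGAMESIZE];  /* stand-in for the already-allocated screen buffer */
static unsigned char *save_p;                /* write cursor into the scratch space */

static void WriteByte(unsigned char b) { *save_p++ = b; }

static void SaveGame(const char *filename)
{
    long length;
    FILE *f;

    save_p = scratch;

    /* ...serialize players, thinkers, sectors, etc. into the buffer... */
    WriteByte(109);                          /* dummy stand-in for the real data */

    length = (long)(save_p - scratch);
    if (length > SAVEGAMESIZE)
        return;                              /* vanilla aborts with a "Savegame buffer overrun" error here */

    f = fopen(filename, "wb");
    if (f != NULL)
    {
        fwrite(scratch, 1, (size_t)length, f);
        fclose(f);
    }
}

int main(void)
{
    SaveGame("savegame.dsg");
    return 0;
}

The file I/O happens once, at the end, and no new memory is allocated for the save itself.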


Let's take visplanes as an example. Doom has a fixed-size array of visplanes, 128 elements long:

#define MAXVISPLANES 128

visplane_t visplanes[MAXVISPLANES];

This means that the size of the array is predefined and its exact location in memory is fixed when the .exe is compiled. Once you reach 128 visplanes, the array can hold no more, and vanilla Doom exits with an error (the infamous visplane overflow).

 

Why is this the case?

 

Answer 1: The "John Carmack is the world's greatest micro-optimizing programmer genius" answer:

 

Because the array is a fixed size and its location is statically assigned at compile time, the game runs faster. If the visplanes array was dynamically allocated (so it could grow as needed), it would require another pointer dereference every time the code looked up a visplane. Because Carmack was the smartest programming genius in the world he wisely chose to use fixed-size arrays, so that Doom would run efficiently on 386es, as everyone knows it definitely did.
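
The difference being claimed there, in miniature (hypothetical code, not from the Doom source; the visplane_t fields and visplanes_dyn are placeholders):

#define MAXVISPLANES 128

typedef struct { int height, picnum; } visplane_t;   /* placeholder fields */

/* Static array: its address is a constant baked into the executable, so
   visplanes[i] compiles to a single indexed memory access. */
visplane_t visplanes[MAXVISPLANES];

/* Dynamic array: the code must first load the pointer from memory, then
   index through it, which is one extra memory access on every lookup. */
visplane_t *visplanes_dyn;

int StaticLookup(int i)  { return visplanes[i].height; }
int DynamicLookup(int i) { return visplanes_dyn[i].height; }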

 

Answer 2: The "John Carmack is lazy and the C programming language makes working with dynamically-allocated arrays a huge pain" answer:

 

In more modern programming languages it's easy to make an infinitely expanding list. For example in Python you can do something like this:

my_list = []
for i in range(10000000):
  my_list.append(i)

There's no need to tell the language in advance exactly how long the list is going to be. You can achieve the same thing in C, but it's tedious: you have to maintain separate length and capacity fields yourself and check every time you add something whether the array needs to grow. So the language tacitly encourages programmers to use fixed-size arrays.
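
For comparison, here's roughly what the same loop looks like in C once you do that bookkeeping by hand (a generic sketch, nothing to do with the Doom source):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int    *my_list  = NULL;
    size_t  length   = 0;      /* elements in use */
    size_t  capacity = 0;      /* elements we have room for */
    size_t  i;

    for (i = 0; i < 10000000; i++)
    {
        /* Grow the array whenever it fills up. */
        if (length == capacity)
        {
            size_t newcap = capacity ? capacity * 2 : 16;
            int *grown = realloc(my_list, newcap * sizeof *grown);
            if (grown == NULL)
            {
                free(my_list);
                return 1;      /* out of memory */
            }
            my_list  = grown;
            capacity = newcap;
        }
        my_list[length++] = (int)i;
    }

    printf("%zu elements\n", length);
    free(my_list);
    return 0;
}

Multiply that boilerplate across every list in an engine and fixed-size arrays start looking pretty attractive.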

2 hours ago, Linguica said:

A better example is maximum savegame size, where this comes about not because Carmack arbitrarily decided on 180KB, but because he decided to stick the savegame data in the screen buffer as a temporary holding spot.

Couldn't they just fopen and fwrite and trust the built-in buffering provided by the standard C library?

 

EDIT: it kinda saddens me you keep saying "Carmack did this, Carmack did that". I prefer saying "id did this, id did that", as if it were a big faceless entity doing stuff collectively and owning the rights. Carmack wasn't the only programmer, was he?


AFAIK, Carmack did most of the engine, including renderer, memory management, etc. Romero did level management stuff (like line type handling). Taylor did various interface stuff (HUD, automap, etc.).


To be fair, you can tell at least a number of the examples here are Carmack's, if only because of his commenting style (no caps, misspellings abound). I think Romero had a tendency to write all-caps comments; at least they appear a lot in the level stuff in Doom and Wolfenstein, and there's almost none of that in the engine internals.

1 hour ago, printz said:

I prefer saying "id did this, id did that", as if it were a big faceless entity doing stuff collectively and owning the rights.

 

Why do you prefer pretending that id was a big faceless entity? As @Gez already mentioned, there are certain types of things (the various line actions) where it's pretty clear that Romero was responsible, and other semi-independent parts (automap, status bar) that are clearly written in a different style and that we know were written by Dave Taylor, but the majority of the code was Carmack's.

 

1 hour ago, printz said:

Couldn't they just fopen and fwrite and trust the built-in buffering provided by the standard C library?

 

Sure, I suppose - the Heretic source shows that Raven changed the savegame routine to first attempt to 1) allocate space via the engine's zone memory system, and if that doesn't work for some reason, 2) just use fopen and fwrite.
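
A rough sketch of that shape, just to make it concrete (this is not the Heretic code: it uses plain malloc where the real engine would go through its zone allocator, and SerializeGameState is an invented stand-in):

#include <stdio.h>
#include <stdlib.h>

#define SAVEGAMESIZE 0x2c000

/* Invented stand-in for the real serialization: writes the game state either
   into a memory buffer (when buf != NULL) or directly to an open file. */
static size_t SerializeGameState(unsigned char *buf, FILE *f)
{
    unsigned char version = 1;          /* dummy payload */
    if (buf != NULL)
    {
        buf[0] = version;
        return 1;
    }
    fwrite(&version, 1, 1, f);
    return 1;
}

static void SaveGameWithFallback(const char *filename)
{
    FILE *f = fopen(filename, "wb");
    unsigned char *buffer;

    if (f == NULL)
        return;

    buffer = malloc(SAVEGAMESIZE);      /* 1) try to get a buffer first          */
    if (buffer != NULL)
    {
        size_t length = SerializeGameState(buffer, NULL);
        fwrite(buffer, 1, length, f);
        free(buffer);
    }
    else
    {
        SerializeGameState(NULL, f);    /* 2) no memory? stream straight to disk */
    }
    fclose(f);
}

int main(void)
{
    SaveGameWithFallback("savegame.dat");
    return 0;
}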

