Linguica

Creating a Doom-style engine in C


Neat. He explains how it works, what he does, and why he does it! Not entirely in detail, but enough to keep it interesting and possibly useful.


Damn, I need to pay more attention, it seems...
Well, it was by accident, but I just posted the videos by the same guy. lol.

boris said:

Pretty cool video. Although he makes it sound like BSP has no use.

Carmack himself has said before that adding the BSP tree wasn't as enormous a deal in Doom as people seem to give it credit for:

DOOM was known for using BSP trees, more so than rasterization, but that was a small part of what made up the rendering. There was another competitive engine at the time, the Build engine, that was used for Duke Nukem and Shadow Warrior, which used a completely different rendering architecture internally and produced pictures that were basically the same effects. That's just another example about how the exact particulars of implementation are not that important on the larger scale side of things.

People like to look for the magic special sauce. They like to look for that narrative. But for almost anything, there are multiple valid ways to get to the same end result. And DOOM started off with a different approach that wasn’t getting the speed I wanted. I first used BSPs on the Super Nintendo Wolfenstein 3D port, where I had to come up with more speed than the raycasting approach. And then when I came back to working on DOOM, I wound up working in that way because it seemed like a good approach. Conversely though, the Build engine (Duke Nukem 3D) didn’t use BSPs, and it was every bit as effective as the way DOOM was implemented. But certainly because of DOOM’s success, thousands of people learned what BSP trees were and followed up on some of those academic threads.

Linguica said:

Carmack himself has said before that adding the BSP tree wasn't as enormous a deal in Doom as people seem to give it credit for:

Visually yes. But what about performance? I was always under the impression that the main reason to use BSP was its speed.


Marathon had a pretty cool portal-based renderer for being so early in the sector-based shooter era, especially having been built by a rag-tag team targeting the more limited M68k family of Macs (though they of course had PowerPC builds for those who could afford such machines). A lot of the math was far less precise than in Doom. IIRC, Doom uses 16-bit subunits for precise coordinates within 64 units per cell, while Marathon only has 1024 subunits within one of its cells (4096x less precision).

It's unfortunately limited in a few ways, however. There was a hard limit on map size of 64x64x18 units (each unit being roughly the size of a Wolf 3D block). Sectors could only have 8 sides max and had to be convex. This means what would take one sector in Doom could require more in Marathon.

The midpoints of 2-sided lines were used as path nodes for AI, and its AI movement was "on rails" as opposed to "free roam" like Doom's AI. It's highly inadvisable to have a 1-sided line surrounded by two 2-sided lines all parallel to each other, as monsters would travel embedded halfway in the solid wall to get to the opening on the opposite side.

It can also lead to nasty collision issues with players, possibly letting the player walk through a wall until he gets spit backward a few feet after "leaving" the sector. Sectors seem to act as "containers", since part of an object's world coordinates is the sector it currently occupies. It is impossible for an object to occupy the void as it can in Doom.


Okay, fine, the whole engine was a hacky kludge by relatively inexperienced amateurs, but it's still pretty cool.

Sodaholic said:

Sectors could only have 8 sides max and had to be convex. This means what would take one sector in Doom could require more in Marathon.

Well, I guess it's a bit like drawing with subsectors directly, instead of letting the node builder take care of it, but I won't pretend that I know all the specifics.

The midpoints of 2-sided lines were used as path nodes for AI, and its AI movement was "on rails" as opposed to "free roam" like Doom's AI.

On the other hand, pathfinding works better in Marathon (if you can even use the word "pathfinding" for Doom's method).

boris said:

Visually yes. But what about performance? I was always under the impression that the main reason to use BSP was its speed.


BSP is faster than portals for very complicated geometry. Take things like some areas of ZDCMP2, Hobb's End Horror (Hellcore 2.0) or The Hag's Fingers (Sunder) where you can have hundreds of sector boundaries in sight; with BSP it's much faster than with portals.

But for the typical original game level, whether from Doom II or Duke Nukem, the difference isn't that noticeable. And that's especially true if you take into account the static limits in the vanilla engine, which prevent mods like Frozen Time from existing for vanilla.

So for their original purposes, the use of BSP instead of portals in Doom wasn't really about performance.
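For reference, the reason the BSP side stays cheap is that the drawing order falls straight out of the tree walk. Here's a minimal front-to-back traversal sketch in C (illustrative types and names, not actual Doom code):

#include <stddef.h>

/* Illustrative 2D BSP node. Doom's real node_t uses fixed-point coordinates
   and packed child references, but the traversal idea is the same. */
typedef struct BspNode {
    float x, y;                  /* a point on the partition line            */
    float dx, dy;                /* the partition line's direction           */
    struct BspNode *child[2];    /* [0] = front side, [1] = back side        */
    int subsector;               /* leaf payload when both children are NULL */
} BspNode;

/* Which side of this node's partition line is the viewpoint on? */
static int point_on_side(const BspNode *n, float px, float py)
{
    float cross = (px - n->x) * n->dy - (py - n->y) * n->dx;
    return cross <= 0.0f ? 0 : 1;
}

/* Walk the tree front to back: always descend into the child containing the
   viewer first, then the far child. Subsectors come out in near-to-far order
   with no per-frame sorting at all. */
void render_bsp(const BspNode *n, float viewx, float viewy,
                void (*draw_subsector)(int id))
{
    if (n == NULL)
        return;
    if (n->child[0] == NULL && n->child[1] == NULL) {
        draw_subsector(n->subsector);    /* leaf: one convex piece of a sector */
        return;
    }
    int side = point_on_side(n, viewx, viewy);
    render_bsp(n->child[side], viewx, viewy, draw_subsector);      /* near half first */
    render_bsp(n->child[side ^ 1], viewx, viewy, draw_subsector);  /* then far half   */
}

A real renderer would also test the far child's bounding box against the view and stop once the screen is filled, but the key point stands: the sorting work was all done when the tree was built.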


Most people think using BSPs for world rendering was a stroke of genius in Doom. It wasn't so much genius as it was a shot in the dark to get the game, which was originally limited and slow, to work at an acceptable pace with more complicated geometry. Granted, nobody had ever done it before, and it proved itself to be a fruitful approach, especially in Quake, but at that time pretty much everything was a first in fast consumer 3D graphics. The required technology was quite new, and simply nobody had done anything with it in that area yet.

Sodaholic said:

A lot of the math [in Marathon] was far less precise than in Doom. IIRC, Doom uses 16-bit subunits for precise coordinates within 64 units per cell, while Marathon only has 1024 subunits within one of its cells (4096x less precision).

Uhh, no, this is actually completely wrong. You are confusing terminology that you've picked up on the two engines and it's having a serious effect on your math, because there is no way Marathon could've been able to produce anything remotely pretty with precision like that (the smallest number it could've represented would be 1/16, which neither makes sense from a technical standpoint nor would it work from a graphical standpoint).

I was actually in the midst of writing a huge explanation of fixed point arithmetic and 3d graphics to explain how wrong this is but I stopped basically because it was getting way too long for one forum post.

Sodaholic said:

It's unfortunately limited in a few ways, however. There was a hard limit on map size of 64x64x18 units (each unit being roughly the size of a Wolf 3D block).

This is wrong too because you are once again confusing your units. The size limit of a Marathon map is actually exactly the same as a Doom map, and it's limited only by how big a 16-bit number can get (65535 being the biggest value you can represent in 16 bits).

Sodaholic said:

Okay, fine, the whole engine was a hacky kludge by relatively inexperienced amateurs, but it's still pretty cool.

Such was everything else at the time, including Doom. And Werecat is correct: though Doom let you build maps with complex sectors, the node builder still busted them down into collections of simple convex polygons. All Marathon did was move this task from the compiler to the designer, which is oftentimes a fair tradeoff.

sheridan said:

...I was actually in the midst of writing a huge explanation of fixed point arithmetic and 3d graphics to explain how wrong this is but I stopped basically because it was getting way too long for one forum post...

I'd be interested to read it (if it's good :)


The gist of it is that you need a helluva lot of precision and range in your numbers to generate any kind of recognizable 3D graphics at reasonable scale. Think about it... at 16 bits, even if you defined 1 "map unit" as being equal to 1 millimeter of real world length, you'd end up with less than 66 meters (~200 ft) on each axis to make any of your big mazy levels. Tiny? Yes. So clearly you need a ton of range to make anything reasonably large, and Doom fixes this by making 1 map unit equal to something in the range of ~50mm. That gives them plenty of range to make nice big levels.

But why is the precision such a big deal? Well, stop and think about that for a moment too. The most basic trig functions, sine and cosine, return FRACTIONS. That's not an accident. But it poses a problem, because we need to use these functions constantly as coefficients in our 3D transformations, and we'll lose precious information if most of the fractional digits from those computations get thrown away afterwards. So clearly, we don't just need range, we need precision too.

Doom meets this balance of range and precision by using a thing called 16-bit fixed point arithmetic (often written 16.16). It is called 16-bit because each of the numbers it uses has a 16-bit integer part AND a 16-bit fractional part (binary fractions?! well, yeah, did you think fractions only worked if you counted in tens?). It is called FIXED POINT because the radix point (decimal point in plain English, but binary point more specifically) sits in the middle of the number and never moves (as opposed to a FLOATING POINT, which will put more digits on the left or right side of the point as necessary to increase the range or precision of the number). Lastly, it's called arithmetic because we're adding and stuff. Duh.
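To make that concrete, here's a minimal 16.16 fixed point sketch in C. The names are in the spirit of Doom's m_fixed.h, but this is an illustrative reimplementation, not the original code:

#include <stdint.h>

/* 16.16 fixed point: a 32-bit integer with 16 integer bits and 16
   fractional bits. FRACUNIT is the representation of 1.0. */
typedef int32_t fixed_t;

#define FRACBITS 16
#define FRACUNIT ((fixed_t)1 << FRACBITS)   /* 65536 == 1.0 */

/* The product of two 16.16 numbers carries 32 fractional bits, so do the
   multiply in 64 bits and shift back down to 16 fractional bits. */
static inline fixed_t FixedMul(fixed_t a, fixed_t b)
{
    return (fixed_t)(((int64_t)a * b) >> FRACBITS);
}

/* Division: scale the numerator up first so the quotient keeps its 16
   fractional bits. (No overflow or divide-by-zero checks in this sketch.) */
static inline fixed_t FixedDiv(fixed_t a, fixed_t b)
{
    return (fixed_t)(((int64_t)a * FRACUNIT) / b);
}

So a wall vertex at x = 96.25 map units is stored as 96 * FRACUNIT + FRACUNIT / 4, and rotating it into view space is just a few FixedMul calls against a sine table that is itself stored as fixed_t values.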

So, 16-bit fixed point arithmetic. That means our values span a range of roughly 65536 map units (about ±32768, since the number is signed), and the smallest nonzero step we can represent is 1/65536 of a unit. That's a lot of precision! Is it enough?

Sort of. But if you've played Doom a lot, and you're an astute person who picks up easily on visual glitches, you'll notice that even with ALL of this range and precision, you still get occasional visual artifacts such as "straight" walls that distort when you are extremely close to them, or extremely long sidedefs that appear to "jump around" as you move the camera.

Can you imagine how bad those artifacts would look with over FOUR THOUSAND TIMES less precision? Walls would be CONSTANTLY jumping around. The bending and warping would be obscene. It'd be like you jumped into Picassoland. Ridiculous.

So that's PART of the reason it would make no sense for Marathon to use precision that small.

The second part is that using so little precision makes absolutely positively no sense from an implementation perspective. Marathon used fixed point math just like Doom. But if it really had FOUR THOUSAND TIMES less precision, that would mean it was only allocating 4 bits for the fractional parts of its numbers. Were the Marathon devs so extremely short on memory that they had to cram the fractional parts of their variables into half a byte? Were they just masochistic? Or is somebody confused in their explanation? I'll let you decide.

By the way... the visual artifacts I mentioned from Doom have been fixed in many source ports such as ZDoom, partly by increasing the size (and therefore the resolution) of their variables, and partly by switching to floating-point arithmetic (where the number of bits allocated to range and precision is adjusted as needed).

But even with this extended size, the visual inaccuracies haven't been completely eliminated so much as pushed below the point where you'd notice them. Visual inaccuracy, the result of information lost during transformations, is endemic to ALL 3D hardware and software. You just can't see it most of the time anymore thanks to modern variable sizes, which are quite enormous. 64 bits = 64 binary digits, you know... and that is a lot of digits.


Well, that was good. Thanks. And you're right: 16 bits is not enough. 32 bits is not enough either, but it starts to get good at 32. It's the tradeoff of level size vs. tiny detail precision. At 32 bits, levels can be reasonably big and relatively precise, but still leave something to be desired.


I wasn't talking about rendering at all, aside from the fact that Marathon is portal-based.

sheridan said:

Uhh, no, this is actually completely wrong. You are confusing terminology that you've picked up on the two engines and it's having a serious effect on your math, because there is no way Marathon could've been able to produce anything remotely pretty with precision like that (the smallest number it could've represented would be 1/16, which neither makes sense from a technical standpoint nor would it work from a graphical standpoint).

I was actually in the midst of writing a huge explanation of fixed point arithmetic and 3d graphics to explain how wrong this is but I stopped basically because it was getting way too long for one forum post.

I think you've completely misinterpreted what I wrote. A unit in Marathon is equivalent to 64 units in Doom (frequently called a World Unit, or just WU for short). And when I was talking about scale, I wasn't talking about its rendering precision, just how precise actor and vertex coordinates could be. It was indeed 1024 subunits per WU for those things. Technically this means that maps could have more precision for map detail (Doom's vertices sit on an integer map-unit grid, while Marathon's could be placed at 1/16 of a Doom unit), but far less smoothness for moving actors.

Another precision limitation for moving actors was that the engine only recognized 512 unique angles for the direction an actor could face.

sheridan said:

This is wrong too because you are once again confusing your units. The size limit of a Marathon map is actually exactly the same as a Doom map, and it's limited only by how big a 16-bit number can get (65535 being the biggest value you can represent in 16 bits).

The scale is different; again, you misinterpreted my post. Much more of that 16-bit number is spent on map precision in Marathon, since it takes 1024 of its units to cover what 64 units cover in Doom.

And yes, it is limited to 18-story-high levels. It's quite arbitrary and makes no sense, but that's all the level editor lets me do. I've not looked into the map part of the engine source to find out why (I did read parts of the source, but mostly the parts related to player physics).
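For a sense of scale, here's a quick back-of-the-envelope in C using the figures above (the constants come from this discussion, not from either engine's headers):

#include <stdio.h>

/* Figures as described in this thread (assumed, not pulled from engine source). */
#define SUBUNITS_PER_WU    1024   /* Marathon subunits per World Unit        */
#define DOOM_UNITS_PER_WU    64   /* Doom map units per Marathon World Unit  */
#define MARATHON_ANGLES     512   /* distinct facing directions for actors   */

int main(void)
{
    /* smallest coordinate step, expressed in Doom map units */
    double coord_step = (double)DOOM_UNITS_PER_WU / SUBUNITS_PER_WU;
    /* smallest change of facing, in degrees */
    double angle_step = 360.0 / MARATHON_ANGLES;

    printf("coordinate step: %.4f Doom map units (1/16 of a unit)\n", coord_step);
    printf("angle step:      %.3f degrees\n", angle_step);
    return 0;
}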

Gez said:

BSP is faster than portals for very complicated geometry. Take things like some areas of ZDCMP2, Hobb's End Horror (Hellcore 2.0) or The Hag's Fingers (Sunder) where you can have hundreds of sector boundaries in sight; with BSP it's much faster than with portals.


The main performance issue is not the number of sector boundaries but the size of the sectors. In a portal engine, any sector boundary is considered a portal, so the larger each such 'portal' gets, and the more of them get collected before sorting by distance becomes necessary, the slower the engine gets. If you've just got tons of tiny sectors, they won't accumulate in significant numbers.

But on a map like Hobb's End Horror, where all the streets are one huge sector spanning the entire map, this portal sorting will pretty much kill performance for good.

With a BSP it's no issue at all because all the sorting has already been done when the BSP was built.
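To illustrate the cost, here is a rough sketch in C of the per-frame work a portal-style renderer does for the sector the camera is in. The structures and names are hypothetical, not code from Build, Marathon, or any real engine:

#include <stdlib.h>

typedef struct Wall {
    float x1, y1, x2, y2;   /* endpoints in map coordinates                   */
    int   next_sector;      /* -1 for a solid wall, else the sector behind it */
} Wall;

typedef struct Sector {
    Wall *walls;
    int   num_walls;
} Sector;

/* Squared distance from the viewpoint to a wall's midpoint. */
static float g_viewx, g_viewy;

static float wall_dist2(const Wall *w)
{
    float mx = 0.5f * (w->x1 + w->x2) - g_viewx;
    float my = 0.5f * (w->y1 + w->y2) - g_viewy;
    return mx * mx + my * my;
}

static int cmp_near_to_far(const void *a, const void *b)
{
    float da = wall_dist2((const Wall *)a);
    float db = wall_dist2((const Wall *)b);
    return (da > db) - (da < db);
}

/* Sort every wall of the current sector by distance, then draw near to far,
   recursing into the sectors behind portal walls. With one giant sector and
   hundreds of visible boundaries, this per-frame sort is what hurts. */
void render_current_sector(Sector *sec, float viewx, float viewy,
                           void (*draw_wall)(const Wall *))
{
    g_viewx = viewx;
    g_viewy = viewy;
    qsort(sec->walls, (size_t)sec->num_walls, sizeof(Wall), cmp_near_to_far);
    for (int i = 0; i < sec->num_walls; i++) {
        draw_wall(&sec->walls[i]);
        /* a real engine would recurse into sec->walls[i].next_sector here,
           clipped to the screen span left open by this portal */
    }
}

A BSP renderer never runs that qsort; the equivalent ordering was baked into the tree by the node builder.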


@Sodaholic
Okay, I've thought further about what you said, and you are certainly right: Marathon does have 4096x less precision in its physics engine, because even though its base unit is 16x finer than Doom's map unit, it doesn't record a fractional part for any of its physics calculations, resulting in movement that is jerky and inaccurate, especially in non-orthogonal directions. So I see what you were saying now; what confused me earlier was that you didn't note the jump in context from rendering to physics, which is crucial to understanding what the heck you're talking about.

Also, both the "World Unit" and "cell" are actually completely meaningless terms to both of the engines, so for clarity's sake could we please stop using them? The term "subunit" is extremely misleading too, especially when you start mixing it with the fractional parts of Doom's units.

In fact, let's establish a very clear ratio from all of this:

(1 world unit / 64 map units) × (1024 "subunits" / 1 world unit) = 16 "subunits" (Marathon units) per 1 "map unit" (Doom unit)

Sodaholic said:

And yes, it is limited to 18-story-high levels.

I'm not doubting that; both Doom and Marathon seem to have weird issues with really tall sectors. It could've simply been an arbitrary limit set by one of the Marathon devs to ensure stability.

