Vermil

How did ID/Raven/Rogue settle on the amount of animation and screen resolution?


I've been wondering whether there is any information about how id/Raven/Rogue and other FPS makers settled on the number of animation frames and the screen resolution they used.

 

For instance, Strife came only on CD, so why didn't they take advantage of the space to include a lot more animation frames?


5 hours ago, Vermil said:

I've been wondering whether there is any information about how ID/Raven/Rogue and other FPS makers settled on using the amount of animation frames and screen resolution they did?

 

For instance, Strife came only on CD, so why didn't they take advantage of the space to include a lot more animation frames?

Probably because it would have been more work, and 2d sprite games were coming to an end. Quake was released the following month!


Take this with the caveat that it's from a non-expert without any particular inside info unless otherwise noted, but I believe RAM was a bigger limiting factor on how many resources you could reasonably use in your game than CD-ROM space. Also bear in mind that even all the way up to Hexen, floppy disk releases were still a thing, and every additional floppy added manufacturing cost to each copy, so there was an incentive to trim any excess fat from the file sizes. There was a period when the rule among a lot of game publishers was "nope, it has to fit on one floppy," although the Doom-era stuff obviously all came out a bit after that.

 

I don't have any exact citations for Doom-engine stuff, but I think it's fairly well known that Rise of the Triad cut a bunch of enemy variants they'd already done at least partial resources for, because they would have pushed the game's RAM requirement above what was considered the acceptable maximum at the time. As I recall, you get a peek at a few of them when you beat the game.

 

Screen resolution, on the other hand, was to some degree decided by CPU power, since there was none of the 3D hardware acceleration we're used to today. It's also worth noting that, as far as I can tell, VESA standards weren't really a thing until about a year after Doom came out. That means that if they'd wanted to support higher resolutions, it would have to be either in a 16-color mode, which was all standard VGA guaranteed for "high resolution" (the Hexen skull-and-bones loading screen actually uses this mode), or in one of several proprietary and incompatible "Super VGA" schemes that various manufacturers were shipping before VESA standardized things. If you've ever played any of those old MoraffWare games (like Moraff's World), remember how they started up with a huge list of video cards to pick from, and if you picked something different from what you actually had, bad things would happen? It would've been like that. BUILD games came out just enough later that they could use VESA resolutions, but I'm guessing no one who licensed the Doom engine wanted to dig deep enough into it to implement VESA support.
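For context, here is roughly what that later standardization looked like in code: once VESA VBE existed, setting a "Super VGA" mode became one BIOS call that worked on any compliant card, instead of per-manufacturer code. This is only a sketch assuming a Borland/Watcom-style <dos.h>; the function number, mode number and return value follow the VBE spec.

```c
/* Sketch: what VESA VBE eventually standardized -- one BIOS call to set a
 * "Super VGA" mode on any compliant card, rather than per-card code.
 * Assumes a Borland/Watcom-style <dos.h>. */
#include <dos.h>

int set_vesa_mode(unsigned short mode)
{
    union REGS r;
    r.x.ax = 0x4F02;            /* VBE function 02h: set video mode      */
    r.x.bx = mode;              /* e.g. 0x101 = 640x480 in 256 colours   */
    int86(0x10, &r, &r);
    return r.x.ax == 0x004F;    /* VBE functions return 004Fh on success */
}
```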


ETTiNGRiNDER's answers are spot on. The target CPUs at that time were the 486 or, if you were lucky, a Pentium. Those CPUs could paint a 320x200 screen at a decent frame rate, but not much more, even if the video card could support higher resolutions.

 

Doom's memory subsystem was designed to swap out previously-used resources and swap in the resources needed to paint the current frame, play the current sounds, and so on. Having more frames means more swapping unless you have enough memory, and the target PC was a 4MB machine, which struggles with swapping.
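A minimal sketch of that swap-in/swap-out idea, purely for illustration: this is not Doom's actual zone allocator, and every name and size here is made up. Resources stay cached in RAM until a memory budget forces the least recently used ones out.

```c
/* Hypothetical cache-and-evict resource loading, in the spirit of purgeable
 * cached lumps -- not the actual id code. */
#include <stdio.h>
#include <stdlib.h>

#define MEMORY_BUDGET (4L * 1024 * 1024)   /* pretend 4 MB is available for assets */
#define NUM_LUMPS     1024

typedef struct {
    int    loaded;       /* is this lump currently resident?   */
    size_t size;         /* bytes it occupies when loaded      */
    long   last_used;    /* tick of last access, for eviction  */
    void  *data;
} lump_t;

static lump_t lumps[NUM_LUMPS];
static size_t bytes_in_use = 0;
static long   tick = 0;

/* Evict the least recently used lumps until the new allocation fits. */
static void make_room(size_t needed)
{
    while (bytes_in_use + needed > MEMORY_BUDGET) {
        int victim = -1;
        for (int i = 0; i < NUM_LUMPS; i++)
            if (lumps[i].loaded &&
                (victim < 0 || lumps[i].last_used < lumps[victim].last_used))
                victim = i;
        if (victim < 0)
            break;                          /* nothing left to evict */
        free(lumps[victim].data);
        lumps[victim].loaded = 0;
        bytes_in_use -= lumps[victim].size;
    }
}

/* Return lump data, loading (and possibly evicting others) on demand. */
void *cache_lump(int num, size_t size)
{
    lumps[num].last_used = ++tick;
    if (!lumps[num].loaded) {
        make_room(size);
        lumps[num].data = malloc(size);     /* stand-in for reading from the WAD */
        lumps[num].size = size;
        lumps[num].loaded = 1;
        bytes_in_use += size;
    }
    return lumps[num].data;
}

int main(void)
{
    cache_lump(0, 500 * 1024);
    cache_lump(1, 3L * 1024 * 1024);
    cache_lump(2, 900 * 1024);              /* forces lump 0 out of the budget */
    printf("resident bytes: %lu\n", (unsigned long)bytes_in_use);
    return 0;
}
```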

 

This also gives a good reason why they cut down on rotations and used mirrored sprites for a lot of the monsters. Computers were only just starting to be able to do Doom-like things.


Doom originally had a hi-color mode planned, but it was scrapped fairly early in development. You can see traces of placeholder stuff for it in the early alphas.

As for the graphics mode, 320x200 was available in several different hardware modes, and I think different versions of Doom use slightly different ones. This mode refreshes at 70Hz, and the game runs most of its internal logic at 35Hz, producing two identical frames per screen refresh. Other modes might have had a lower refresh rate, giving choppier gameplay and possibly more eye strain; 60Hz on a CRT isn't all that comfortable. If you want more information about mode X, mode 13h and so on, just read Michael Abrash's black book.
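For reference, here is how a DOS program typically dropped into the plain 320x200 256-color mode (mode 13h) through the BIOS, assuming a Borland/Watcom-style <dos.h>. Doom actually reprograms the VGA into an unchained planar variant of this mode, which isn't shown here.

```c
/* Sketch: entering 320x200x256 (BIOS mode 13h) under DOS.
 * Assumes a Borland/Watcom-style <dos.h> with union REGS and int86(). */
#include <dos.h>

void set_mode_13h(void)
{
    union REGS r;
    r.x.ax = 0x0013;            /* AH=00h set video mode, AL=13h */
    int86(0x10, &r, &r);        /* video BIOS interrupt          */
}

void back_to_text_mode(void)
{
    union REGS r;
    r.x.ax = 0x0003;            /* 80x25 colour text mode */
    int86(0x10, &r, &r);
}
```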

The target for Doom was later 386 machines and 486 machines, and it runs OK on those, especially the first game. These machines, however, didn't have much video memory, and that limits the resolution the game can run at. A basic VGA card at the time typically had 256KiB of RAM, as the VGA spec required. Some really early cards had 64 or 128KiB, but those probably weren't found in machines that ran Doom very well.

320*200 at 1 byte per pixel gives Doom a memory requirement of 64KB for one screen update. Factor in double buffering, and 128KB is needed. Then there are also the palettes: Doom uses 256 paletted colors, where each value is a lookup into 256 user-definable 18-bit entries, so a palette needs 768 bytes of memory.
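Spelled out as a quick computation (same figures as in the post):

```c
/* The framebuffer and palette arithmetic from the post, spelled out. */
#include <stdio.h>

int main(void)
{
    int width = 320, height = 200;
    int frame   = width * height * 1;   /* 1 byte per pixel = 64,000 bytes        */
    int doubled = frame * 2;            /* front + back buffer = 128,000 bytes    */
    int palette = 256 * 3;              /* 256 entries, 3 DAC bytes each = 768 B  */

    printf("one frame:       %d bytes\n", frame);
    printf("double buffered: %d bytes\n", doubled);
    printf("palette:         %d bytes\n", palette);
    return 0;
}
```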

As far as I remember, Doom is double buffered. The idea behind double buffering is to write to one memory area while displaying the other, then "swap" the buffers once you're done drawing an entire frame. This tends to produce images without tearing; look up screenshots online if you're interested.
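A tiny sketch of the double-buffering idea, with hypothetical draw_frame() and display() functions standing in for the real renderer and the VGA page flip or blit:

```c
/* Double buffering in miniature: render into the hidden buffer, then swap.
 * draw_frame() and display() are hypothetical stand-ins, not Doom code. */
#include <string.h>

#define W 320
#define H 200

static unsigned char buffer_a[W * H];
static unsigned char buffer_b[W * H];

static unsigned char *front = buffer_a;    /* being shown        */
static unsigned char *back  = buffer_b;    /* being drawn into   */

static void draw_frame(unsigned char *dst) { memset(dst, 0, W * H); /* render here */ }
static void display(unsigned char *src)    { (void)src; /* copy or flip to VRAM */ }

void run_one_frame(void)
{
    draw_frame(back);                      /* viewer never sees a half-drawn image */
    unsigned char *tmp = front;            /* swap the roles of the two buffers    */
    front = back;
    back  = tmp;
    display(front);
}

int main(void)
{
    run_one_frame();
    return 0;
}
```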

As for sprites, more rotations require more work and more memory for a limited amount of gain. Going from 8 rotations to 16 would roughly halve the number of monster types a map could feature, or mean far fewer textures, flats and so on. Memory was already fairly tight in Doom, which required a 4 megabyte machine without many TSRs or other things running.
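Some back-of-the-envelope numbers to make the tradeoff concrete; the frame count and per-frame size below are made-up illustrative figures, not actual Doom data (and mirroring would reduce the stored count either way), but the ratio is what matters:

```c
/* Back-of-the-envelope cost of extra rotations.  Frame count and per-frame
 * size are illustrative stand-ins, not real Doom figures. */
#include <stdio.h>

int main(void)
{
    int frames_per_monster = 12;     /* walk/attack/pain frames, say          */
    int bytes_per_frame    = 3000;   /* a mid-sized sprite patch, roughly     */

    long with_8  = (long)frames_per_monster *  8 * bytes_per_frame;
    long with_16 = (long)frames_per_monster * 16 * bytes_per_frame;

    printf("8 rotations:  %ld bytes per monster\n", with_8);    /* 288,000 */
    printf("16 rotations: %ld bytes per monster\n", with_16);   /* 576,000 */
    return 0;
}
```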

The only sprites I do miss more rotations for are the player and rocket sprites; that would have made deathmatch look and play slightly better.

I have always assumed that Doom 2's MAP30 doesn't spawn former humans mostly to cut down on the number of sprites needed. If all non-boss monster sprites had to be loaded and an infinite number of enemies could spawn, the game could run out of memory pretty quickly. In a similar vein, Doom 2's MAP15 is one of the biggest maps in the first two games, and it has that one arachnotron in a secret area, an odd inclusion. The map doesn't have any demons, revenants, arch-viles, mancubi or spider masterminds. That's probably a tradeoff for memory usage just as much as a design decision.

Fun fact: Doom 2's MAP15 has areas where 90 visplanes are visible, most likely making it one of the most complex scenes rendered in the game.


I think that in terms of graphics mode, Doom does the same things as the well-known Mode X but leaves the resolution at 320x200. In his Wolfenstein 3D book, Fabien Sanglard assumed the reasons were to avoid having to fill those 40 extra rows (which would be fairly expensive), and that 320x240's 1:1 pixel aspect ratio would clash with the artists working at the usual 320x200 pixel ratio, making their lives more difficult.
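The aspect-ratio arithmetic behind that, for reference: on a 4:3 monitor, 320 pixels across needs 240 rows for square pixels, so at 200 rows each pixel is drawn 1.2 times taller than it is wide.

```c
/* Pixel aspect ratio arithmetic: a 4:3 monitor showing 320 pixels across
 * "wants" 240 rows for square pixels; at 200 rows each pixel is stretched. */
#include <stdio.h>

int main(void)
{
    double square_rows  = 320.0 * 3.0 / 4.0;     /* 240                           */
    double pixel_aspect = square_rows / 200.0;   /* 1.2: pixels are 20% taller    */
    printf("rows for square pixels: %.0f\n", square_rows);
    printf("320x200 pixel aspect (h/w): %.1f\n", pixel_aspect);
    return 0;
}
```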


Doom is triple buffered (actually quad, but they almost never used the 4th plane), not double buffered, but that was in RAM I think; there were also two banks of VRAM, IIRC, for the VGA mode it used, and some of the drawers went straight to VRAM rather than through the RAM triple buffers (e.g. the status bar).

 

The alpha/beta video stuff demonstrates that they had some ideas about diminished lighting that got simplified, and the game wasn't yet capped to 35fps, but I'm not sure there's any evidence that they targeted any other video mode for the PC. When Doom shipped, no PC in existence could run it at the full 35fps.

 

10 hours ago, InsanityBringer said:

I think in terms of graphics mode Doom does the same things as the well known Mode X but leaves the resolution at 320x200. In his wolf3d book fabien sanglard assumed the reason for that was to not have to fill those 40 extra rows (which would be fairly expensive) and it would result in the artists doing things at the normal 320x200 pixel ratio while the game uses a 1:1 pixel ratio, making their lives more difficult.

 

The artist explanation doesn't make sense, because they used Deluxe Paint, which could be configured for a 320x240 screen mode as desired. They switched to/added scans via the NeXTstations later on, and had to do aspect ratio correction on those, so it would actually have been less work if they'd used 320x240 for those.

 

On 11/6/2018 at 7:15 AM, Vermil said:

For instance, Strife came only on CD, so why didn't they take advantage of the space to include a lot more animation frames?

 

Paging lumps off CD-ROM would have been far too slow.
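Some ballpark numbers for why; the drive speed, seek time and lump size here are illustrative, roughly 2x-drive-era figures:

```c
/* Rough illustration of why paging lumps from CD-ROM on demand is a non-starter.
 * Drive speed and seek time are ballpark 2x-era figures; the lump size is an example. */
#include <stdio.h>

int main(void)
{
    double transfer_kb_s = 300.0;    /* 2x CD-ROM sustained transfer      */
    double seek_ms       = 250.0;    /* ballpark average seek             */
    double lump_kb       = 64.0;     /* one largish sprite/texture lump   */

    double total_ms = seek_ms + (lump_kb / transfer_kb_s) * 1000.0;
    printf("~%.0f ms to fetch one %.0f KB lump\n", total_ms, lump_kb);
    /* At 35 tics per second a frame has ~28.6 ms; one miss stalls many frames. */
    return 0;
}
```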



The Pentium 60 and 66MHz were introduced in March 1993. I think they could run the game at 35 fps at least some of the time, but they were extremely high-end machines at that time, and I doubt they were a platform id targeted. They're more Final Doom and Quake platforms.

As for Mode X, it's not really one specific resolution; it's a set of different resolutions that require a bit of manual fiddling to set up.

As for the hi-color mode, here's a fair bit of documentation with screenshots from version 0.4:


 


Soon after Doom's release, I bought a 486 at 50MHz (straight 50, not clock-doubled) with 8MB of RAM and a 40MHz Diamond Stealth local bus video card. It had a pretty crazily lean DOS memory setup, and counter to the instructions, I ran SmartDrive, which made a big difference for Doom's lump paging.

 

This box regularly enjoyed 35 fps. But because the local bus video was slower than the CPU, I'd occasionally get "sparkles" on the screen: single pixels of the wrong color. I think a conflict occurred between the CPU writing to video memory and the video card reading the same memory to update the monitor (at least, that was my best guess).

 

I called Diamond tech support, and the tech was extremely rude and kept saying that they would not honor a refund, even though I wasn't asking for a refund. I just wanted to know what was happening, and if it was ok to keep using that card on that motherboard, with that CPU. The guy refused to answer any questions (probably was clueless), and he eventually hung up on me!

 

Anyway, I think Windows 95 and the DirectX effort may have played a big part in pushing some video mode standardization. It was sorely needed then, because those old DOS games really struggled to support those fancy hi-res video modes. The video cards claimed to be able to do hi-res with tons of colors, but it wasn't easy to find programs that would try to use those modes.

 

Then again, the hi-res/hi-color modes might have been a disappointment, because the computers back then simply couldn't push that many pixels quickly enough. Wolf and Doom were not the first to do 320x200 or 3D rendering, but they were probably the first wildly popular games to combine 320x200, 256 colors, and high-speed full(ish)-screen 3D first-person video. Textured 3D at 320x200 in 256 colors at >20fps was just about the most that could be done with affordable computers back then.

 

For those wondering about the power of Doom nostalgia:

When I bought that computer, I also bought a tape drive for backups. The drive was bought used, and it came with a handful of used tapes (those tapes were expensive). The tapes had been used to back up a local bulletin board system (BBS). One of the tapes had doom1.exe (the shareware Doom v0.99). I had never heard of Doom, and there was no description.

 

I had been checking out the files on the tapes, and decided to give Doom a try, with no idea what to expect. When the game started, my jaw hit the floor - I had *never* seen a convincing fully textured, shaded, computer-generated 3D view before. I wasn't sure that it wasn't a movie at first. But, when I figured out how to open the door, and I walked into the zig-zag room, I must admit that I freaked when I saw the imps, and they threw fireballs at me!

 

That experience burned E1M1, and Doom, into my mind forever. *This* is what Doom nostalgia is for me: on that blurry 320x200 screen, I had seen something I couldn't imagine was even possible.

 

I think they chose 320x200 because it was the highest resolution that could produce the desired frame rate on machines of the day, and at the time, it was plenty effective.

 



The problem with VLB was that it ran at the same frequency as the bus. A 50MHz CPU that wasn't clock-doubled meant a VLB graphics card and other devices ran at that speed as well. Many cards couldn't handle this, and there were issues with stability and image degradation. That is why the 486 DX2 66MHz runs at double 33MHz, the DX4 75MHz at 25MHz x 3, and the DX4 100MHz at 33MHz x 3. Doubling a 50MHz bus to run a 100MHz CPU would have led to the same problems you had with your 50MHz system.

There were 80MHz systems that were 2x40MHz, and a lot of other variations, but those were non-Intel CPUs. Other vendors produced a slew of 486-compatible chips with various speeds and enhancements.

VLB was a simple stopgap while everyone was waiting for the PCI standard, and it was tightly tied to the 486 CPU. Before it, you basically had ISA at ~8MHz and at most 16 bits wide. PCI was a much more comfortable 32-bit, 33MHz interface, with a much better form factor, and the way it worked behind the scenes was better too. I'm keeping EISA and MCA out of the picture due to cost.

For home consumers, though, VLB was the preferred choice. DirectX was an OK way to standardize things like video modes, but the VESA group had mostly fixed that already; check out things like SciTech Display Doctor. The main thing about DirectX was that it did this for video, sound, modems, mice, keyboards, joysticks, networking, etc.: one unified interface instead of having to test tons of configurations and support weird, badly documented hardware. Early versions of DX weren't the best, but after a few versions things started to work well enough for it to be a usable standard.

DX also opened up the market for smaller vendors, since things like Sound Blaster emulation were no longer a de facto requirement for sound cards. Games could have MIDI support without a ton of work.

Doom was released 4-5 years before this was a working standard, so it relied on supporting the most common configuration: a few sound cards, a basic VGA mode and a non-FPU CPU. A notable omission from Doom's sound card support is the Roland MT-32. It would have been interesting to hear the soundtrack optimized for an MT-32, with new samples, custom instruments and all that jazz the MT-32 supported. Dune 2 had a soundtrack written especially for the MT-32; try listening to that one compared to the GM version.


@zokum Very interesting info about local bus video cards! I knew it could be a problem when I bought it, but I got it anyway, because I had done some timing tests and verified that writing to video memory was much slower than standard memory, and much slower than I thought it should be. The LB card was much faster, and I could live with the issues: I traded the occasional wrong pixel for much faster write speeds and a very obvious frame rate boost.

 

Regarding DirectX (and the 'fun' of ISA cards in general), I think younger programmers would be surprised/horrified by just how ugly game (and other) development was in the DOS days. A few topics come to mind:

 

  • 16-bit/segmented memory model: all memory had to be handled in chunks/segments of at most 64KB, unless the developers were brave enough to use a DOS extender, which let them compile 32-bit programs that temporarily dipped into 16-bit mode when DOS functions were needed (see the address arithmetic sketch after this list)
  • The above-mentioned video mode nightmare, including video memory banks that worked differently in the different modes: in some modes a bank covered only a portion of the screen, while in others a bank controlled a color channel
  • ISA cards with DIP-switch-selectable memory ranges and interrupts/extended interrupts. None of this fancy plug-and-play or self-configuring USB stuff: conflicts had to be resolved manually, by physically configuring individual cards. Some cards only gave you a few choices of the available range, so this could get tricky. And once the hardware conflicts were resolved, the software had to offer the same options so it could communicate with the hardware.
  • Each brand of a particular type of hardware had unique capabilities. To take advantage of any of these specialized capabilities, each software title had to add brand-specific support, so it was rare that they were ever used.
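As a concrete example of the first bullet, here is the real-mode segment:offset arithmetic that made the 64KB chunking unavoidable; the addresses are just illustrative.

```c
/* Real-mode segment:offset arithmetic, as mentioned in the first bullet above.
 * A physical address is segment * 16 + offset, so many segment:offset pairs
 * alias the same byte, and no single offset span exceeds 64 KB. */
#include <stdio.h>

unsigned long physical(unsigned segment, unsigned offset)
{
    return (unsigned long)segment * 16UL + offset;
}

int main(void)
{
    /* The classic VGA framebuffer at A000:0000 */
    printf("A000:0000 -> %05lXh\n", physical(0xA000, 0x0000));  /* A0000h */
    /* Two different pairs naming the same physical byte */
    printf("1234:0010 -> %05lXh\n", physical(0x1234, 0x0010));  /* 12350h */
    printf("1235:0000 -> %05lXh\n", physical(0x1235, 0x0000));  /* 12350h */
    return 0;
}
```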

Windows 95 and DirectX must be given some credit for their role in standardizing access to this hardware without crippling it and without degrading performance too much. This took the hardware code out of the games and put it in the OS.

 

But in Doom's day, you either had to include hardware code for every device you wanted to support, or you made "safe" choices and targeted hardware you could expect to exist. 320x200 mode was a safe choice. Just look at all the difficulties Doom had with sound and music as it tried to take advantage of some special capabilities.

 

Today's programmers can concentrate almost exclusively on the game, and just expect the hardware to work.

 


id outsourced some of the sound stuff and apparently also the first version of the networking code. By doing so, they could concentrate on some of the more interesting features they had planned for the game. I'm not sure about Wolfenstein 3D, but it is apparently very picky about Sound Blaster support; many cards with SB emulation do not work with that game, so a card that works in Wolfenstein 3D is apparently a gold standard of emulation. If that was the case, it would seem very reasonable to just outsource the sound card support and get the game running with well-tested code that worked with many sound cards. The DMX library didn't quite live up to what they had envisioned, but newer versions of the game did get support for more sound cards, like the Pro Audio Spectrum and the AWE32.

I would also say that the OPL MIDI instruments that come with this library are quite decent, even if the implementation is a bit buggy if one tries to make different sounds. This stuff is well documented in another thread here on Doomworld. All in all, it seems id tried to outsource a fair bit of stuff and use well-tested components instead of doing everything on their own, so they could spend more time on features, testing and innovative code solutions.


Carmack did say he regretted outsourcing the sound code in the end, but it sure seemed like a pragmatic decision at the time, given they were such a small team, and it would have certainly been a waste of Carmack's time to write that stuff. They'd have needed a second Dave Taylor.

