There are three main factors that inform the design of the PlayStation's engine, in order of importance:
- RAM
- Drive access times
- CPU time

Out of these three, the second was solved pretty trivially in various ways by Williams - primarily by creating compact compressed files which hold the resources needed at the beginning of each level. For example, all the sprites from the IWAD are duplicated in a different file for each map, containing only the sprites that map needs, pre-arranged to fit into VRAM.
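To make the bundling idea concrete, here's a minimal sketch in C. The lump layout and the function are hypothetical - loosely modeled on the WAD directory entries in the Doom source, not Williams' actual tooling:

```c
/* Hypothetical sketch: pack only the sprite lumps one map needs into a
 * single file, so the level loader can pull them in with one sequential
 * read. The lump_t layout is modeled on Doom's WAD directory entries;
 * BundleMapSprites itself is an illustration, not the real tool. */
#include <stdio.h>
#include <stdlib.h>

typedef struct
{
    char name[8];   /* lump name, space padded, not NUL terminated */
    int  size;      /* payload size in bytes */
    void *data;     /* raw lump contents, already loaded */
} lump_t;

void BundleMapSprites(const char *outpath, lump_t *lumps, int count)
{
    FILE *f = fopen(outpath, "wb");
    int i;

    if (!f)
    {
        perror(outpath);
        exit(1);
    }

    /* small directory up front: count, then (name, size) pairs */
    fwrite(&count, sizeof count, 1, f);
    for (i = 0; i < count; i++)
    {
        fwrite(lumps[i].name, 8, 1, f);
        fwrite(&lumps[i].size, sizeof(int), 1, f);
    }

    /* payloads back-to-back; a real tool would also pre-arrange them
       to land on VRAM page boundaries */
    for (i = 0; i < count; i++)
        fwrite(lumps[i].data, lumps[i].size, 1, f);

    fclose(f);
}
```

The win is that one sequential read replaces hundreds of scattered seeks into the IWAD, which is what makes the drive access problem go away.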
The first and third, however, remain, and of those two, the first is by far the most arduous for the PSX. You may have heard that the system has 2 MB of RAM. That's already half of what the PC version had to work with. And that 2 MB is for *everything* - there isn't a bunch of extra banks of other kinds of RAM sitting outside that limit.
PlayStation Doom is BARELY squeezing itself into these limits, particularly in terms of graphics (wall textures, flats, and monster sprites). This is why it suffers frequent breakdowns, such as the "TEXTURE CACHE OVERFLOW" error when firing the BFG. It's sitting right on the edge of feasibility in this regard. To be fair, there were further optimizations Williams could have made and did not - using a color translation to turn Barons into Hell Knights, for example (which they did do for Doom 64, where cart space demanded it).
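The failure mode is easy to picture: a fixed-size cache fed by a bump-style allocator has nowhere to go once a single frame demands more texture data than fits. The sketch below is only an illustration of that idea - the size and names are assumptions, not the actual PSX code:

```c
/* Illustrative bump allocator over a fixed texture cache. If one
 * frame's working set exceeds the budget, there is nothing left to do
 * but die - which is the behavior the "TEXTURE CACHE OVERFLOW" error
 * suggests. CACHE_SIZE and the names here are assumptions. */
#include <stdio.h>
#include <stdlib.h>

#define CACHE_SIZE (256 * 1024)   /* hypothetical texture budget */

static unsigned char cache[CACHE_SIZE];
static int cache_top;             /* bump pointer, reset per level */

void *Cache_Alloc(int size)
{
    void *p;

    if (cache_top + size > CACHE_SIZE)
    {
        fprintf(stderr, "TEXTURE CACHE OVERFLOW\n");
        exit(1);
    }
    p = &cache[cache_top];
    cache_top += size;
    return p;
}
```

The BFG is presumably the worst case because its blast needs a pile of large explosion sprite frames resident at once, on top of everything already in view.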
In terms of CPU speed there's not much to work with either: a 32-bit MIPS RISC @ 33 MHz, roughly a third of the clock speed you needed for a guaranteed smooth experience on PC. On top of that, the compiler they used was completely boneheaded. There's massive inlining of functions throughout the program, but absolutely nowhere that it would actually matter for performance (in fact, I am fairly confident performance is significantly degraded by it). Nor is there a cheap single-instruction mul or div on the R3000A - they're macro instructions that expand into 12 to 13 real instructions each, so the compiler sometimes goes way out of its way to avoid them. And when it didn't or couldn't, you get a huge sequence of seemingly nonsensical operations that are logically equivalent to a single "mul" on x86. Whether those sequences are really faster, as MIPS certainly liked to claim, is for CPU historians to argue, but I have a feeling they're not, particularly once cache is considered.
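To make that concrete, here's the "avoid the multiply" path - a single constant multiply in C strength-reduced into shifts and adds. The MIPS assembly in the comment is the kind of sequence a compiler of that era might emit; it's an illustration, not disassembly of the game:

```c
/* Strength reduction: 100*x = 64*x + 32*x + 4*x, so three shifts and
 * two adds replace a trip through the multiply unit. A plausible MIPS
 * sequence (illustrative, not from the actual binary):
 *     sll  t0, a0, 6     # x * 64
 *     sll  t1, a0, 5     # x * 32
 *     addu t0, t0, t1
 *     sll  t1, a0, 2     # x * 4
 *     addu v0, t0, t1
 */
int times100(int x)
{
    return (x << 6) + (x << 5) + (x << 2);
}
```

For a variable (non-constant) multiply or divide there's no such trick, which is where those 12-to-13-instruction macro expansions come in.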
So in short, I believe this: PlayStation Doom could be optimized a bit more and could probably handle many of the PC maps, but some of them would still be a no-go, and at the very least some retexturing and restricted monster deployment would be necessary (i.e. no 3 or 4 variants of STAR* in one map, and no Spiderdemons in a map with anything wider than a Baron). It really is pushing the system as it is.
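As a sanity check on that claim, a simple audit pass over a map's sidedefs gives a first-order estimate of its texture appetite. The struct follows the standard Doom SIDEDEFS layout; the audit function itself is a hypothetical illustration:

```c
/* Hypothetical audit: count the distinct wall texture names a map's
 * sidedefs reference. mapsidedef_t mirrors the on-disk SIDEDEFS
 * layout from the Doom source; CountUniqueTextures is illustrative. */
#include <string.h>

typedef struct
{
    short textureoffset;
    short rowoffset;
    char  toptexture[8];      /* space padded, not NUL terminated */
    char  bottomtexture[8];
    char  midtexture[8];
    short sector;
} mapsidedef_t;

int CountUniqueTextures(const mapsidedef_t *sides, int numsides)
{
    static char seen[1024][8];
    int nseen = 0, i, j, k;

    for (i = 0; i < numsides; i++)
    {
        const char *names[3];
        names[0] = sides[i].toptexture;
        names[1] = sides[i].bottomtexture;
        names[2] = sides[i].midtexture;

        for (j = 0; j < 3; j++)
        {
            if (names[j][0] == '-')   /* "-" means no texture */
                continue;
            for (k = 0; k < nseen; k++)
                if (!strncmp(seen[k], names[j], 8))
                    break;
            if (k == nseen && nseen < 1024)
                memcpy(seen[nseen++], names[j], 8);
        }
    }
    return nseen;
}
```

Run something like that over the stock PC maps and you'd quickly see which ones blow past a console-sized texture budget once the sprites are accounted for too.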