Quasar

Simple linear format


The real problem is not what the on-disk format is, as long as it's not too brain-dead with handling transparency, doesn't have too much overhead, and it's reasonably easy to support in editors. PNG and even GIF already do that, if certain assumptions about the use of the palette are ironed out.

The real trouble is designing a good in-memory format that can work with existing or minimally modified column drawing functions, especially at EXTENDED color depths, and which most ports could support. The most natural thing to do would be an extension of the column_t and patch_t formats, but those would be strictly in-memory, not on-disk formats. It would be pointless to design an arbitrary linear format from scratch that is different between memory and disk.
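For illustration only, something along these lines is what I mean - a rough sketch with made-up names and layout, not a finished spec:

 #include <cstdint>

 // Rough sketch of a headered linear lump that is identical on disk and in
 // memory. The name, magic, and exact field layout are all hypothetical.
 struct linearpic_t
 {
    char     magic[4];    // e.g. "ELIN"
    uint16_t width;       // pitch == width; rows are tightly packed
    uint16_t height;
    int16_t  leftoffset;  // drawing offsets, same meaning as in patch_t
    int16_t  topoffset;
    uint8_t  transindex;  // palette index to treat as transparent
    uint8_t  reserved[3]; // padding / room for future flags
    // width*height palette indices follow immediately, row-major
 };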


Where == linear font caching. And you're right, it's entirely loading overhead. The PNG is loaded, converted to 8-bit with color requantization where necessary and/or indicated as desired by the user (with necessity overriding desire), and then converted to a naked linear buffer for later rect-based blitting by the game engine.

You have to admit it's a hell of a lot more complicated than

 byte *linear = static_cast<byte *>(wGlobalDir.cacheLumpName(foo, PU_STATIC));
but yes, the amount of time it takes is really not a concern in this particular instance.

Now, if there were anything that needed to only *cache* these graphics, and not keep them around in memory forever as fonts have the benefit of doing, there would be a problem. There is a disconnect between the physical file data and what has to be cached in memory, and it's very difficult to fix. The naive approach would be to redo the PNG decompression and format conversion every time the lump was dumped from the cache, and that would be much slower, as you point out.

EE's wad system might already offer a solution, however, as we can now maintain multiple caches per lump: a generic one plus special ones for indicated formats. I suppose adding a "linear" cache pointer may not be a bad idea if/when a situation like the above comes up (the only current alternate-format cache is for patches).
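Roughly what I'm picturing, as a sketch only - these are not EE's real structure or function names:

 // Sketch only; not EE's actual wad code.
 struct lumpformats_t
 {
    void    *generic; // raw lump data, as stored in the wad
    patch_t *patch;   // the existing alternate-format cache
    byte    *linear;  // proposed: flat pixel buffer for rect blits
 };

 byte *W_CacheLumpLinear(int lumpnum, int tag)
 {
    lumpformats_t &fmts = lumpformats[lumpnum];
    if(!fmts.linear)                                    // convert on first miss only
       fmts.linear = ConvertLumpToLinear(lumpnum, tag); // hypothetical helper
    return fmts.linear;
 }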

Maes: maybe I need to clarify that this isn't a replacement for patches. I specified a headered LINEAR format because this is an application where LINEAR-format graphics already have support in Eternity. I just do not like the idea of linears without any format metadata. It took me a good 10 minutes between lump extraction, source-code diving, and calculator punching to work out the size of EE's CONCHARS lump, as I had forgotten its width*height. After working out from the source that 32 chars are required per row, I divided the 8192-byte lump size by 32x8 (thank god I knew it's an 8-char-wide font) to get the pitch.

Graf Zahl said:

decals in zdoom.pk3

Those things are really fugly and don't blend well with id's art style. You guys should keep them in an optional extra PK3, kinda like the dynamic lights pk3 that comes with GZDoom.


Not sure exactly how you plan to use this, but if it were me, I'd be adamant about using what already exists. For example, for console characters, I'd probably use a standard texture. Each patch could be a character. Yes, you'd have to have code that renders to a block buffer for run-time display, but you'd have your headers, it'd be in a standard format, there'd be no decompression time, and only minimal extraction time. As a bonus, you get proportional fonts, if you desire (not so good for a console, though, heh :)

Of course, make whatever format you wish, it's your port. But if the goal is for easy end-user modification, you'll have to consider that too. No one wants to "load raw" if they don't have to - doesn't matter how easy it is. Just my 2 cents.

EDIT: Actually, having an icon-like, multi-patch drawing gizmo would be kinda neat, and useful. Things like HUD icons, number displays, Heretic-like inventory pics, splats, etc. Could be useful to shove all of those into a texture... With the right set of extraction/drawing support functions, that could be very useful, indeed.
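The drawing side could be about this simple (just a sketch - the names are made up, and it ignores transparency and clipping for brevity):

 // Sketch of an atlas-entry blit; assumes the usual Doom screens[0] and
 // SCREENWIDTH framebuffer globals. Names are made up.
 struct atlasentry_t
 {
    short x, y;           // top-left corner of the sub-image in the atlas
    short width, height;
 };

 void V_DrawAtlasEntry(const byte *atlas, int atlaspitch,
                       const atlasentry_t *e, int destx, int desty)
 {
    for(int row = 0; row < e->height; row++)
    {
       const byte *src  = atlas + (e->y + row) * atlaspitch + e->x;
       byte       *dest = screens[0] + (desty + row) * SCREENWIDTH + destx;
       for(int col = 0; col < e->width; col++)
          dest[col] = src[col];          // opaque copy, no transparency
    }
 }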

kb1 said:

Actually, having an icon-like, multi-patch drawing gizmo would be kinda neat, and useful. Things like HUD icons, number displays, Heretic-like inventory pics, splats, etc. Could be useful to shove all of those into a texture... With the right set of extraction/drawing support functions, that could be very useful, indeed.

Texture Atlas


The point is that there's a bunch of crap you have to deal with when you're handling PNGs. Chunking, interlacing, compression, blah blah blah. It's a little silly when all you want to do is store a tiny amount of header info and raw pixel data.

The appeal of PNG is that it's common, compressed and lossless.

The idea is that your favorite resource editor would accept PNGs and save as the simple linear format, and allow exporting as PNG. That way ports aren't bloated with dumb PNG crap, and mappers/modders can still use their favorite tools.
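The editor-side save could be about as dumb as this (sketch only - it assumes the hypothetical header sketched earlier in the thread, a little-endian target, and an image the tool has already decoded and quantized to 8-bit):

 #include <cstdio>
 #include <cstring>

 // Sketch: write the hypothetical linearpic_t header plus raw indices.
 void WriteLinearLump(FILE *f, const unsigned char *pixels,
                      unsigned short width, unsigned short height)
 {
    linearpic_t hdr = {};
    memcpy(hdr.magic, "ELIN", 4);
    hdr.width      = width;
    hdr.height     = height;
    hdr.transindex = 0;                        // say index 0 is transparent
    fwrite(&hdr, sizeof hdr, 1, f);
    fwrite(pixels, 1, (size_t)width * height, f);
 }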


Bullshit!

PNG isn't 'bloat'. Furthermore, once the code is in, you do not have to worry about it anymore.

In the case of ZDoom, zlib is needed anyway for various things, like loading Zip/PK3 files and compressing savegame and demo data, so all that needs to be added is the PNG loader, which is a moderately sized file. And if you don't want to roll your own solution, use libpng. So your executable may end up being 100kb larger than it was before. Who cares? ZDoom adds more code to support far less-demanded features and nobody cares.

Offloading stuff like this to external tools is a very, very bad design. With proper graphics format support you don't even need a resource management tool except for some very specific stuff like adding the offsets to the graphics.

We don't live in 1993 anymore where caching efficiency was important to keep the game running at acceptable speeds. In today's world where even the cheapest available computer has multiple gigabytes of RAM any resource management scheme that tries to keep the memory footprint at as few megabytes as possible is a wasted effort.

Yes, you can brag that your code may run on some ancient setup but what's the point? It's wasting time and effort to cater to a very small minority of potential users.

ZDoom does not purge any loaded resources based on memory footprint. It keeps everything in memory as long as it's needed. And if it really gets too much I still think that relying on the OS's virtual memory management is better than doing this yourself.


I don't mean execution speed and memory use, I mean code/repo bloat. If you use the linear format, the function that reads it into a patch or whatever is like 8 lines tops, and you don't have to do weird lump caching stuff because the format in the WAD is different from the in-memory format.
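Something like this, roughly (a sketch, assuming a header along the lines of the one proposed earlier in the thread):

 // Sketch: read a headered linear lump straight out of the wad.
 byte *R_GetLinear(int lumpnum, int *width, int *height)
 {
    linearpic_t *lp = (linearpic_t *)W_CacheLumpNum(lumpnum, PU_STATIC);
    *width  = SHORT(lp->width);               // SHORT() = little-endian swap
    *height = SHORT(lp->height);
    return (byte *)lp + sizeof(linearpic_t);  // pixel data follows the header
 }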

It's also worth saying that ZDoom's philosophy (and EE is not immune to this either) of "just put the library in the repo" is kind of insane. I can understand it though. POSIX systems have nice, sophisticated mechanisms for finding libraries during linking; on Windows you're stuck menu diving - I'm pretty good at it and I still hate it every time - so I can totally see the appeal of having all your deps in the repo and configuring your project files to point to them. It's just super bloated and nasty is all.

Anyway, the definition of bloat is stuff you don't use. No one uses PNG's chunking feature (it exists for streaming and transfer), so you can call it very small, maybe even inconsequential, bloat, but it's still bloat.

Finally, you can probably argue that caching efficiency is pretty essential to software rendering. I'm not an expert by any means, but I would expect a pretty big FPS drop if you branch too much in the renderer. You're right in this context though, you're not gonna run into that regarding PNG/LINEAR.

===

EDIT:

Oh and I thought of something else. While this is largely in a single-player context, keeping an eye on memory footprint is important for a server. For example, if I wanted to run 10 EE servers, and an EE binary uses up 80MB running Sunder MAP05 or something, that's 800MB of RAM I'm eating. Urghhhhh.

(Of course, the EE server doesn't load textures and doesn't use 80MB of RAM ever, this is just an example).

Graf Zahl said:

We don't live in 1993 anymore where caching efficiency was important to keep the game running at acceptable speeds. In today's world where even the cheapest available computer has multiple gigabytes of RAM any resource management scheme that tries to keep the memory footprint at as few megabytes as possible is a wasted effort.


If you replace "multiple gigabytes" with "multiple tens of megabytes", and by "cheapest computer" you mean a Pentium with a three-digit MHz clock (as opposed to a 486 or first-gen Pentium), it sounds like exactly the same argument mid-90s Windows 95 developers used to justify the bloat it introduced, all over again.

This was very apparent for those games that still shipped with separate DOS/Windows executables, first and foremost, our very "own" Doom95 vs doom.exe.

I'm still perplexed that there are people still supporting the discredited trope that CPU and memory advancements will swallow up any inefficiency. Sadly, it's still the norm (e.g. modern smartphones have incredibly bloated UIs and are less responsive for basic tasks than even early-90s mobiles), but I was sincerely hoping this line of thinking would lose steam once it became evident that there is a performance wall somewhere down the line, and we've hit it full in the face.


Cooking up convoluted caching schemes has nothing to do with avoiding bloat.

If you've got sufficient RAM to keep all your needed data in memory, such schemes are a wasted effort.

What I mean is: if you know you need the data, you keep it in memory, and you delete it once you don't need it anymore.

And once you've got these issues out of the way, there's absolutely no reason whatsoever to make your on-disc data match the in-memory data structures as closely as possible - which to me seems to be the whole point of this entire idea.


And by the way, some time ago I did some profiling tests to see how loading compressed PNGs might affect loading times. It was quite clear from those that loading from disc - even if the data was already in the HD cache - cost more than performing the decompression, and that in many cases decompressing was considerably faster than loading the uncompressed data instead. And in the case of OpenGL it was all irrelevant compared to the time needed to upload the texture data to the graphics card.
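The test was basically of this shape (not the original code, just a reconstruction of the idea, timing zlib decompression of an already-loaded buffer):

 #include <chrono>
 #include <vector>
 #include <zlib.h>

 // Reconstruction sketch: time how long inflating a pre-loaded buffer takes.
 // Compare against the wall-clock time of reading the uncompressed equivalent
 // from disc, and against the glTexImage2D upload in the OpenGL case.
 double TimeDecompressMs(const std::vector<Bytef> &compressed, uLongf rawSize)
 {
    std::vector<Bytef> out(rawSize);
    auto t0 = std::chrono::steady_clock::now();
    uncompress(out.data(), &rawSize, compressed.data(), (uLong)compressed.size());
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
 }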

Sorry, but to me the whole idea smells too much like trying to be efficient in the wrong place.


For what it's worth, EE now keeps everything PU_CACHE in memory until libc malloc() returns NULL (which can happen, but almost only when the engine is stuck in an infinite loop or runaway recursion >_> ).

I've been maintaining the functionality of the cache system on the premise that it could become active again in the future, particularly if EE were ported to a platform with limited resources. "Load everything and let God sort it out" works great as long as you only run on PC. What if EE wants to run on Android in the future? Phones don't have as much RAM, and what they have certainly isn't all available for your own process to hog. The DS/3DS would be even more restrictive platforms (ask Kaiser about this).

I hate to say it Graf, but sometimes you kinda have blinders on when it comes to that kind of thing because ZDoom only targets PC, and barely even other operating systems for PC other than the very latest version of Windows.

I may just up and eliminate the caching layer altogether, though, especially since even phones are starting to have enough RAM that you could load the entire IWAD. Frankly, it would be more efficient most of the time - especially the more compressed resources you introduce - to just walk the entire directory at startup and precache everything, so long as it's physically possible.

Maybe where you get into trouble is that, once you have DECORATE-level support, people start making these "library" WADs like AeoD that are not just "hundreds of megabytes" but GIGABYTES in size. Completely precaching that much shit - especially sprites - could be disastrous >_> - and it doesn't help that so many newb wads just include the entire thing, instead of picking and choosing only what they actually use. It seems to me you'd still need to set some kind of upper bound and/or be smarter about precaching certain classes of resources; i.e., "I'm not going to cache sprites until R_PrecacheLevel detects that a thing is going to spawn using them".
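For illustration, the gating I mean is roughly this (a sketch in the spirit of vanilla R_PrecacheLevel's thing-scanning pass, not actual EE code):

 // Sketch: mark only sprite numbers used by things actually spawned on the
 // level, so the precache pass can skip everything else. Not EE's real code.
 static void P_MarkSpawnedSprites(bool *spritepresent) // numsprites entries
 {
    for(int i = 0; i < numsprites; i++)
       spritepresent[i] = false;

    for(thinker_t *th = thinkercap.next; th != &thinkercap; th = th->next)
    {
       if(th->function.acp1 == (actionf_p1)P_MobjThinker)
          spritepresent[((mobj_t *)th)->sprite] = true;
    }
    // Sprites left unmarked stay uncached until a spawn actually demands them.
 }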

Then of course you run into scripts. Guess you gotta hope the majority of monsters are not script-spawned, which is true of most reasonable mods.

Quasar said:

ZDoom only targets PC, and barely even other operating systems for PC other than the very latest version of Windows.



That's utter and complete bullshit.

For the record, there is an official MacOS download, Linux can be built from source (just like the Linux guys want to have it) and all operating systems down to Windows 98 are supported, although on Win98 there are some limitations due to FMod not working on that system.

Graf Zahl said:

That's utter and complete bullshit.

For the record, there is an official MacOS download, Linux can be built from source (just like the Linux guys want to have it) and all operating systems down to Windows 98 are supported, although on Win98 there are some limitations due to FMod not working on that system.

We also have semi-official Ubuntu binaries. Also ZDoom once ran on Solaris, but Firelight took down the downloads for the Solaris version of Fmod.

The only Windows 98 problem I was aware of is the startup console being unreadable, but that doesn't affect game play.

Quasar said:

Maybe where you get into trouble is that, once you have DECORATE-level support, people start making these "library" WADs like AeoD that are not just "hundreds of megabytes" but GIGABYTES in size. Completely precaching that much shit - especially sprites - could be disastrous >_> - and it doesn't help that so many newb wads just include the entire thing, instead of picking and choosing only what they actually use. It seems to me you'd still need to set some kind of upper bound and/or be smarter about precaching certain classes of resources; i.e., "I'm not going to cache sprites until R_PrecacheLevel detects that a thing is going to spawn using them".

For the record, ZDoom does only precache what is needed at level start. There are a few rare threads about stuttering in heavy ACS-based levels where ZDoom has to load in large resources.

In terms of memory footprint I'm getting 16MB with vanilla and 40MB with a somewhat modern mod. From what I can tell the only thing stopping ZDoom from running on Android/iOS is that no one is willing to port it.


ZDoom has been ported to platforms such as the GP2X, whose specs aren't all that impressive. Sure, you probably wouldn't run Aeons of Death on it; but it would be good enough to play, say RTC-3057 or Daedalus: Alien Vanguard.


We've kind of wandered off into the ZDoom weeds a little, but I would say a couple things:

- If it has to be ported to a platform, that platform wasn't a target.
- Blzut3 said ZDoom was using 40MB of RAM with that mod; the GP2X has 64MB of RAM total, leaving 24MB for the OS and other programs. That's not a lot of headroom.
- The GP2X is 320x240 16-bit, so that's a big boon for performance right there.
- Graf is pretty famous for telling people to upgrade their crap hardware when they complain about GZDoom performance.

But anyway, I agree that while it's important to get into good efficiency habits (heap allocations, etc.), chasing performance in general is probably a big timesink for programmers. I think an even bigger timesink is code bloat and complexity, though, which is what the LINEAR format tries to address.


Making up some fancy new but non-extendable binary format where one exists already is a bad idea.

In ReMooD, there is the Image backend. This backend can load any supported source format, then expose the image to the game in whichever format it needs. It has auto-detection, but that can guess wrong depending on the image data, and it is also a bit more complex.*

* On the note of complexity: Doom Legacy has a pic_t which has "auto-detection", but it isn't really that great. As a result, I make sure a patch_t is mostly valid before using it.

So basically, ReMooD can read patch_t, pic_t, and raw images (currently only square ones), and then the game can grab any of those in whatever form it needs. This is how ReMooD supports floors on walls and walls on floors, etc. OpenGL texturing is easy: the renderer can just grab a GL version of the image.

I could easily add PNG support, with decompression via miniz, and have it work without changing any of the rest of the codebase.
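The interface boils down to roughly this (a rough sketch of the idea, not ReMooD's actual declarations):

 // Rough sketch only; not ReMooD's real types or function names.
 enum imageformat_t { IMGFMT_PATCH, IMGFMT_PIC, IMGFMT_RAW, IMGFMT_PNG };

 struct image_t
 {
    int           width, height;
    imageformat_t srcformat;   // what auto-detection decided the lump was
    byte         *pixels;      // decoded 8-bit paletted data, row-major
 };

 image_t       *Image_Load(const char *lumpname);  // detect + decode once
 const patch_t *Image_AsPatch(image_t *img);       // built and cached on demand
 const byte    *Image_AsFlat(image_t *img);        // linear view, for flats
 unsigned int   Image_AsGLTexture(image_t *img);   // GL texture name, uploaded once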

...

On the other argument - "we have so many resources, it is OK to waste them!" - you can make the same excuse for Firefox: even though it uses gigs of memory and sometimes eats all your CPU (to run all those Flash ads at once), you can always dedicate an entire 1TiB disk to swap! That attitude also shows that you are lazy and couldn't care less about your own code. It also means that you probably won't be supporting embedded targets within this decade.

...

ReMooD supports many systems though, including DOS. Making the game run smoothly on ancient DOS systems benefits everyone.

Ladna said:

- Blzut3 said ZDoom was using 40MB of RAM with that mod; the GP2X has 64MB of RAM total, leaving 24MB for the OS and other programs. That's not a lot of headroom.



That 40MB footprint also covers lots and lots of system resources that won't be present on other platforms. And the OS won't use up as much as something like Windows or Linux would - if parts of it aren't in fact stored in ROM, where they take up no RAM whatsoever.

Ladna said:

But anyway, I agree that while it's important to get into good efficiency habits (heap allocations, etc.), chasing performance in general is probably a big timesink for programmers. I think an even bigger timesink is code bloat and complexity, though, which is what the LINEAR format tries to address.



Does it really? It's a classic case of a programmer thinking only of himself and not his customers - who do not care if the executable is 100 or 200 kb larger. They do care, however, if ease of use is compromised by opting for a non-standard solution.

That's the nice thing about using PNGs for graphics and Zips for resource containers: They are standard formats that can be used by all the tools in the world.

Aside from adding offsets to graphics I haven't had any need to use a Doom resource manager like Slade at all in recent years - and that counts for a lot more to me than avoiding some minor complexity in the resource management.

As for the amount of complexity we are talking about here, aside from adding zlib, there's a 31kb source file for PNG management and an 11 kb source file for Zip management, not counting the 10kb which were added later to support the ancient compression formats - all of that relatively straightforward code. Nothing complex at all, really...

Graf Zahl said:

I haven't had any need to use a Doom resource manager like Slade at all in recent years

So, how do you go about creating WADs without dedicated or compatible tools? Do you use a hex editor or something absurd like that?

Sodaholic said:

So, how do you go about creating WADs without dedicated or compatible tools? Do you use a hex editor or something absurd like that?

Personally I use a Python script, as it's more convenient (Slade takes way too long to start up given that it's a file utility) and far more powerful (I can immediately customise it to do whatever job I need).


There's no need to call out the entire brigade just because I have been misinformed about ZDoom's total portability :P

That aside:

  • I already said I'm using PNG in the place where I thought this format would be nice to have as an additional option, so there's no use in anybody bringing that back up at this point.
  • If it were added as a supported format, its use wouldn't be much more or less convenient than any other format in the typical use case (SLADE or a similar WAD/pk3 editor), so I don't really believe that's a valid argument against it. I'm stating this only on principle and not because I care (see point 1 above again).

GhostlyDeath said:

* On the note of complexity: Doom Legacy has a pic_t which has "auto-detection", but it isn't really that great. As a result, I make sure a patch_t is mostly valid before using it.


That pic_t format, that's the image format taken from Quake, isn't it?


I see the point of having a tight header format, but you're not going to get others to support it. Just about every editor and tool would need to be expanded to support it, and that is the real problem. To get around that, write your own wad and file converter: you can download standard wads and convert them to your better, more compact format. You don't need any cooperation from the community to do that.

This header has unused fields but no format version. You should always have a format version, because after a year there will always be a version 2. Put it in the first field, e.g. "DTEX001", "DTEX002".
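Checking it up front is trivial (a sketch - "DTEX" and the reader helpers are placeholder names, not a real format):

 #include <cstring>

 // Sketch: branch on the version tag before touching the rest of the header.
 bool ReadDTexLump(const unsigned char *lump, size_t len)
 {
    if(len < 7 || memcmp(lump, "DTEX", 4) != 0)
       return false;                     // not this format at all
    if(memcmp(lump + 4, "001", 3) == 0)
       return ReadDTexV1(lump, len);     // original layout
    if(memcmp(lump + 4, "002", 3) == 0)
       return ReadDTexV2(lump, len);     // whatever version 2 ends up adding
    return false;                        // unknown future version: refuse it
 }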

Warning .. Irritated poster .. Warning:

I have an almost identical problem with the SimGear format used by FlightGear. They chose XML for their data format; the required data file is 600MB, and I have tried 3 or 4 times to download that monster.
Another case of developers who thought that everyone has a megabyte-per-second cable modem. Even at the library I can only get 150Kb/sec, and they kick you off after 60 minutes. It needs to be stored in a better, tighter format. Doom wads are headed in the same direction.

I don't think there is any excuse for bloat.
There is no problem with supporting PNG or other formats, but I cannot accept the argument that waste does not matter because everyone has multiple gigabytes of memory. The only machines I know with that much are built for Windows game playing; the machines I see run about 800 MB to 1500 MB of main memory. It depends on what crowd you run with, and the world is nowhere near homogeneous.


The overhead from using the PNG format will be at most 768 bytes (for the full palette) + 12*5 (for the necessary chunk headers, supposing we have IHDR, PLTE, IDAT, IEND, and grAb) + 8 (the PNG signature). That's 836 bytes of overhead per graphic; the rest is raw data that you cannot omit.

Unless you're using it for a tiny, tiny picture, that sub-1kb overhead will be offset by the compression.

Keep in mind that you can always optimize your PNGs a bit by running them through PNGOUT and then DeflOpt. Also remember that the PNG format doesn't require a full palette: if your image only uses ten distinct colors, the palette data will only take up 30 bytes.

You cannot compare the "bloat" from PNG to the bloat from XML. XML is a super-bloaty format: first, it's text-based; secondly, it's very, very verbose. Compare these two examples:

<tag>some data</tag>
Nine characters used for markup.
tag{some data}
Five characters used for markup.

Just because hundreds of megabytes of bloat are far too much doesn't mean that a couple hundred bytes of bloat are also unacceptable.






Anyway, I just noticed that an IMGZ header has the same size (24 bytes) as Quasar's proposed linear format. Offsets and size data are stored in the same type of variables (16-bit each). There are 11 unused bytes in the header which could be used to mark the transparent index (since it's 0 by default and the unused bytes are always zero, it wouldn't break existing graphics), as flags for whether the data is paletted or alpha-mapped, and for whether to use transparency at all. The run-length encoding is optional, but might be useful for UI graphics like font characters anyway.
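For reference, the layout as I remember it (double-check against the ZDoom source before relying on this):

 #include <cstdint>

 // IMGZ header as I recall it: 24 bytes total, 11 of them currently unused.
 struct imgzheader_t
 {
    char     magic[4];       // 'I','M','G','Z'
    uint16_t width, height;  // 16-bit, same as the proposed linear format
    int16_t  leftoffset, topoffset;
    uint8_t  compression;    // nonzero = RLE-packed pixel data
    uint8_t  reserved[11];   // the unused bytes mentioned above
 };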

Gez said:

<tag>some data</tag>
Nine characters used for markup.


I count eleven. ;)

But anyway, XML is one big mess of redundant markup data.

For one tool I work with, the XML output is currently at 32 MB; compressed with 7z it shrinks to 250kb. Not surprising when approx. 70-80% of your file is markup, not usable data.


XML is perhaps slightly more verbose than it needs to be, but seriously, this doesn't even matter. XML (like PNG in a computer graphics context) is an exchange file format, i.e., it is intended as an interim format used when transferring data between systems.

If you think that XML is "bloated" then you are more than likely using the wrong tool for the job. If markup accounts for 80% of your file size then this should be pretty obvious IMO.

DaniJ said:

If you think that XML is "bloated" then you are more than likely using the wrong tool for the job. If markup accounts for 80% of your file size then this should be pretty obvious IMO.


Pretty much correct, but that's what happens when people who think XML is the right tool for the job are the ones making the decisions.

XML is still bad because of all the redundancy in the markup.

