GooberMan Posted September 4

Aight, so I checked with the mods whether we could make a new thread to discuss the spec and they were cool with it. This thread is a clean-slate thread where conversation should stick to the specification and avoid speculation.

I'm putting together the 0.99.2 revision of the spec. It's currently in draft status. Some of the feedback has been rolled in, but I've very likely missed quite a few valid points raised about the spec. So before I consider this revision finalised, let's get some conversation going.

Major changes

- Just to make clear that copy/pasting the text of the spec is totally cool, the documents are labelled CC0. While I haven't copied the assets and code into the draft folder just yet, the code will remain GPL; and the data released with the spec will also have a Creative Commons licence applied to it.
- The wording in regards to id24res.wad has been rephrased to make clear the original intention of it just being another set of assets that is loaded to support the feature set, and that the code itself only requires assets with those names to resolve and handle as it always has (highlighting that projects like Freedoom or IWAD replacements for TCs are totally cool).
- A new field for DeHackEd things, Self damage factor, after feedback on the Immolator's splash damage highlighted the need for it.
- The ability to apply a translation to every graphic asset in a WAD has been added to GAMECONF.

There's also a bunch of minor wording changes throughout the spec to clarify a few concerns that have popped up. Subjects of interest/that require more feedback include:

JSON comments

Many people see JSON comments as a necessary way of working based on how they use comments in other formats. JSON itself (as has been observed) has a very robust ecosystem surrounding it. JSON parsers, however, fall under two basic rules:

- Every JSON parser supports no comments
- Only some JSON parsers support comments

I've personally been bitten by JSON documents with comments failing to parse when using tools on Nightdive projects that compile resources into larger resources. And this is something that we should highlight here: compiling a WAD file is becoming increasingly common thanks to things like DoomTools. That's not to say DoomTools needs JSON right now, just that I have no idea what toolsets and toolchains will be used in the future to handle Doom data authoring. The safest bet is to take an approach that covers all bases.

Which means discussing the types of JSON parsers. There's the kind that returns associative containers, and these parsers generally can handle comments just fine. But then there are the reflection-based parsers, where you point a JSON parser at an instance of your object and it automatically deserialises the values into each field for you. These parsers generally cannot preserve comments without a bunch of hoops for the programmer to jump through. So when someone uses a theoretical tool based on this kind of parser and saves it back out, there's a good chance the comments disappear.

I've used JSON across five different programming languages at this point, both of the associative-container and reflection styles. And there's one solution I've found that works for everything: add a "comment" value to each object. On top of everything discussed, this also has the benefit that reflection-based UIs don't need to write special-case code to handle information the user considers relevant.
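For illustration, that solution is nothing more than another string field - the field names below are made up for the example, not taken from any ID24 lump:

```json
{
  "comment": "just another string value, so every parser round-trips it untouched",
  "someflag": true,
  "child":
  {
    "comment": "nested objects can carry their own notes the same way",
    "somevalue": 10
  }
}
```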
The JSON Lump spec however takes a hands-off approach to everything except the root object. This is intentional, as the root node is designed to be used for functionality identification and putting data in expected places. But if there's demand for it, we can spec it so that each object will always have a "comment" field reserved for it.

Translating every graphic in a WAD

I had an interesting realisation a few days after the initial publication of the ID24 spec - we actually had a path with this spec to handle converting from one PLAYPAL to another. A translation in regards to Doom is traditionally used to map one colour range to another, but at its heart it's just a dumb lookup table. It is entirely possible to use a translation to describe a complete PLAYPAL conversion.

So I mentioned to a number of prominent modders I'm in contact with that the Translation field in a thing could be used to convert, for example, the Doom IWAD's assets to their new custom PLAYPALs. This was met with a unanimous, enthusiastic, and resolute "Okay, but what about walls?" So I set about upgrading the idea to a complete feature. Short story, you no longer need to ship an entire IWAD's worth of content with your mods just to change the palettes around.

In terms of implementation, this is handled by a couple of templated WAD entry loading functions that check if a handler is installed for that WAD by comparing the typeid hash provided by the templated type. If the lump is contained within such a WAD, the handler immediately processes that data on load to perform the convert operation. From that point on, as far as the rest of the code is concerned it's like the lump has been loaded in natively. So translations for player palettes work just fine on the resources afterwards.

(Of course, to test this I did some truly disgusting things to Doom. Images inside the spoiler tag. I was going to post more screenshots but I hit the filesize limit.)

Tranmaps and colormaps in true colour/hardware renderers

Now, I haven't kept my professional resume a secret. So it's always struck me as weird that software rendering features that have been around since 1998 for the tranmap (1993 for the colormap) have been considered extremely difficult to impossible to implement on a hardware renderer. Needless to say, I put some thought into how to handle these things on the hardware side when I made sure my software renderer was compliant with Boom. Spoiler tag since it's on the technical side.

Okay, so the problem consists of several parts that we can break down into different chunks. The first one is addressing true colour renderers. There's a bit of a point that needs to be made here before we get into the theory: when a content creator decides to use a feature that reduces the fidelity of the scene down to 8-bit paletted, that's entirely their choice. It's up to the engineers to make sure their vision is presented to the end user accurately and to their satisfaction. Which means somehow we need to bring a true colour space back to the 8-bit palette.

There's actually an incredibly simple way to handle this with a little bit of preprocessing time on program initialisation:

- Generate a 256x256x256 8-bit 3D texture.
- Treat each axis as a colour component (X for R, Y for G, Z for B).
- For each entry in that texture, fill it with the closest match the colour has to an entry in the PLAYPAL (effectively the same code used to generate the default 65% tranmap for Boom rendering features).
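A rough CPU-side sketch of that preprocessing step, assuming an RGB888 PLAYPAL in memory - this is illustrative only, not the kexDoom implementation:

```cpp
#include <climits>
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGB { uint8_t r, g, b; };

// Brute-force nearest palette index by squared distance - the same matching used
// when baking a Boom-style default tranmap.
static uint8_t NearestIndex( const RGB* playpal, int r, int g, int b )
{
    int best = 0;
    int bestDist = INT_MAX;
    for( int i = 0; i < 256; ++i )
    {
        int dr = r - playpal[ i ].r, dg = g - playpal[ i ].g, db = b - playpal[ i ].b;
        int dist = dr * dr + dg * dg + db * db;
        if( dist < bestDist ) { bestDist = dist; best = i; }
    }
    return (uint8_t)best;
}

// Fills a 256x256x256 table indexed as [r][g][b]; upload the result as an R8 3D texture.
// 16.7 million entries each doing a 256-entry search, hence the notes below about
// compute pipelines and task-based threading.
std::vector< uint8_t > BuildPaletteLookup( const RGB* playpal )
{
    std::vector< uint8_t > lut( 256u * 256u * 256u );
    for( int r = 0; r < 256; ++r )
        for( int g = 0; g < 256; ++g )
            for( int b = 0; b < 256; ++b )
                lut[ ( (size_t)r << 16 ) | ( (size_t)g << 8 ) | (size_t)b ] = NearestIndex( playpal, r, g, b );
    return lut;
}
```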
This is a critical building block that's kinda brute force but is also pretty efficient once you get to the renderer. Register usage is kept to a minimum; the only performance concern would be how a GPU's cache handles it. Generating it could be seen as unacceptably slow on the CPU, but if you've got a compute pipeline then this is exactly the kind of job it excels at. And if you don't have a compute pipeline, a bit of task-based threading will get you a decent speedup. But having seen 3D SDFs used extensively in products I've shipped over the last decade, the runtime cost per frame is basically only a concern for lower-powered devices. I am very interested in whether there's a faster method for matching true colours to palette indices on such hardware, and if so then providing multiple implementations and picking-and-choosing the correct shader for the hardware sounds like the way to cover all bases.

This then means that when applying a colormap to a surface that uses true colour textures, for example, the steps are:

- Sample the texture
- Perform a lookup using that sample on the 3D palette lookup
- Use that index to look up the remapped entry from the colormap
- Use that index to look up the PLAYPAL
- Send that value along to the next function in the chain

Tranmaps though, well, that's another kettle of fish. It's not an unsolvable kettle though. In particular, if you've ever needed to solve scene sorting and layering for multiple glass refraction shaders then you already have an idea on how to approach this. The actual steps involved to do a correct tranmap operation are just as simple as the colormap:

- Sample the "background" true colour image
- Perform a lookup using that sample on the 3D palette lookup
- Sample the texture used for this surface
- If it's true colour, perform the 3D palette lookup
- Sample the tranmap using those two indices as UV coordinates
- Use that index to look up the PLAYPAL
- Send that value along to the next function in the chain

(A plain-C++ sketch of both lookup chains is at the end of this post.)

The tricky part of course comes from correctly fencing those layered calls and propagating the correct samples to the next fenced area (and, for heavy usage of tranmaps, not instantly throttling your GPU down to no real work being done). There are two decades of real-world research and practical applications in gaming on this front though, so finding a method that works nicely with your renderer's architecture is well outside of the scope of this post.

UDB config files

These are grossly incomplete and basically only let you use the line specials and texture any surface right now. But they're there for reference. udb_id24config_v0.1.zip
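As promised above, here are both lookup chains expressed as plain C++ rather than shader code - a sketch of the logic only, assuming the 256x256x256 table from the earlier snippet and raw COLORMAP/TRANMAP/PLAYPAL lumps in memory:

```cpp
#include <cstddef>
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// lut is the 256x256x256 table from the earlier sketch, indexed as [r][g][b].
inline uint8_t PaletteIndex( const uint8_t* lut, RGB c )
{
    return lut[ ( (size_t)c.r << 16 ) | ( (size_t)c.g << 8 ) | (size_t)c.b ];
}

// Colormap chain: true colour sample -> palette index -> colormap remap -> PLAYPAL.
RGB ApplyColormap( const uint8_t* lut, const uint8_t* colormapRow, const RGB* playpal, RGB texel )
{
    uint8_t index = PaletteIndex( lut, texel );
    uint8_t remapped = colormapRow[ index ];
    return playpal[ remapped ];
}

// Tranmap chain: background and surface samples are both reduced to palette indices,
// then used together as the lookup into the 256x256 tranmap. The axis order here
// follows Boom's column drawers (background in the high byte) - check it against
// whatever convention your own renderer uses.
RGB ApplyTranmap( const uint8_t* lut, const uint8_t* tranmap, const RGB* playpal, RGB background, RGB surface )
{
    uint8_t bg = PaletteIndex( lut, background );
    uint8_t fg = PaletteIndex( lut, surface );
    uint8_t blended = tranmap[ ( (size_t)bg << 8 ) | fg ];
    return playpal[ blended ];
}
```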
Gez Posted September 4

4 minutes ago, GooberMan said: Translating every graphic in a WAD

Reminds me of this: [embedded image] As you can see, not a new idea.
LexiMax Posted September 4

I don't have a ton to say about the spec at this moment, but one thing about JSON caught my eye.

32 minutes ago, GooberMan said: I've used JSON across five different programming languages at this point, both of the associative-container and reflection styles. And there's one solution I've found that works for everything: add a "comment" value to each object.

That's fair. I've often hand-written JSON for strict parsers, and adding a _comment field is incredibly common.

It's also worth mentioning that JSON has schema support as well, which is enabled automatically in editors like Visual Studio Code. I can attest that it has made hand-writing JSON much easier, as it allows me to tab-complete valid keys and values for a specific scope. I wonder if creating such a schema for ID24 would help.
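For illustration, even a minimal hand-written schema gets you that editor support - this is a made-up example, not an official ID24 schema, and the field names are only placeholders:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "Hypothetical ID24 JSON lump",
  "type": "object",
  "required": [ "type", "version", "data" ],
  "properties":
  {
    "type": { "type": "string" },
    "version": { "type": "string", "pattern": "^\\d+\\.\\d+\\.\\d+$" },
    "data": { "type": "object" }
  }
}
```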
Trov Posted September 4

Quote:
Determine an authoritative session feature level by the maximum value encountered from the following checks
- Current gameconf declared level
- Feature level found in COMPLVL lump
If authoritative feature level is found and is lower than what is found in DeHackEd, this will produce undefined behavior and therefore is an error condition

How deep is the DeHackEd feature level check? It is extremely common, for instance, for Limit Removing or even Vanilla wads to have DEHACKED lumps with BEX [STRINGS] level name/story text/etc string replacements and a [PARS] section that otherwise have no Boom features.

As for JSON comments, which I discussed in the other thread, I think a dedicated field for them is a good solution.
GooberMan Posted September 4

33 minutes ago, Trov said: How deep is the Dehacked feature level check? It is extremely common for instance for Limit Removing or even Vanilla wads to have DEHACKED lumps with BEX [STRINGS] level name/story text/etc string replacements and [PARS] section that otherwise have no Boom features.

As a reference implementation detail, this is something no other implementation is required to follow, but it does give an insight into how kexDoom handles it if you're striving to match functionality specifically with kexDoom. But short story: the kexDoom implementation is not taking the hardline stance that only what DeHackEd as a tool could do in the '90s may modify data found in vanilla/limit-removing feature levels. BEX and other patch types that only modify things like par times and strings do not indicate a higher required feature level by themselves. It's purely what data is modified by the patch that indicates what feature level is required.

As such, if your BEX string patch modifies a string introduced in Boom then that is considered reason enough to indicate that the boom2.02 feature set is wanted (these are the strings used by the generalised linedef locks, for reference). Same deal with modifying the ID24 strings or using a custom USER_ mnemonic; that indicates id24 is wanted.

I can see a usecase where a generic gameplay mod patch that is otherwise vanilla compatible might want to change the Boom strings. I would love some examples there to see if this is an actual thing that is done or if mods have generally avoided that.
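To make the quoted determination rules concrete, here is a minimal C++ sketch of one reading of them - not the kexDoom source, and the type and function names are made up:

```cpp
#include <algorithm>
#include <stdexcept>

// Feature levels in ascending order, matching the spec's list.
enum class FeatureLevel
{
    Doom19, LimitRemoving, Bugfixed, Boom202, Complevel9,
    MBF, MBFExtra, MBF21, MBF21Ex, ID24
};

FeatureLevel DetermineSessionFeatureLevel( FeatureLevel gameconf, FeatureLevel complvl, FeatureLevel impliedByDehacked )
{
    // The authoritative level is the maximum of the explicitly declared sources.
    FeatureLevel authoritative = std::max( gameconf, complvl );

    // A patch that modifies data from a higher feature set than was declared is an
    // error condition, not a silent upgrade.
    if( impliedByDehacked > authoritative )
    {
        throw std::runtime_error( "DEHACKED requires a higher feature level than the session declares" );
    }

    return authoritative;
}
```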
OpenRift Posted September 4

So on a semi-related note, once ID24's format is finalized, will we see Legacy of Rust receive newly recorded demos? As it currently stands, the new ID24 demo format does still have inaccuracies due to Kex's Boom/MBF/MBF21 implementations not having behavior that's 100% accurate to the real deal, making it incompatible with other source ports. I would alternatively suggest re-recording the demos in MBF21 format (given that LoR's uses of ID24 features are all cosmetic), but as I said, I don't know how much the MBF21 inaccuracies would cause issues. However, maybe this approach could be done if Boom/MBF/MBF21 is made fully accurate before ID24 is finished? I don't know. Apologies if this is a bad place to post about this.
Arsinikk Posted September 4

Quote:
If authoritative feature level is found and is lower than what is found in DeHackEd, this will produce undefined behavior and therefore is an error condition

I still don't like the idea that DeHackEd can overwrite GAMECONF and COMPLVL. Even if it's a rare occurrence that a certain string could bump up the compatibility level, I just really don't like that an "authoritative" feature level can be overwritten. At that point it isn't "authoritative".

I honestly don't understand why DeHackEd is being used to determine compatibility anyway. In ports such as PrBoom, DSDA-Doom, and Woof, the DeHackEd isn't used to specify compatibility, and any Boom, MBF, or MBF21 codepointer is just skipped over in Vanilla complevels. I don't understand why we can't just do that here as well. I feel that introducing DeHackEd into the compatibility check is unnecessary, and I really don't like the idea that "authoritative" isn't really "authoritative" here. Especially since Boom-only strings (ex: the "3 keys to unlock door" message) are just skipped in Chocolate and Crispy Doom.

Quote:
Feature level of Complevel 9 if MUSINFO lump is found

I think that MUSINFO should be detached from complevel9, and shouldn't be tied to any compatibility. It's commonly used for Limit-Removing maps, and even sometimes in Vanilla maps (MAP33-35) for Crispy Doom. I could say the same thing about MBF sky transfers as well.

One thing to note about Vanilla maps is that you can have Boom/MBF line actions in Vanilla maps and they work perfectly fine in Vanilla. The engine just ignores reading them. It's only unknown things and unknown sector effects that actually crash vanilla in DOS.
GooberMan Posted September 4

1 hour ago, OpenRift said: So on a semi-related note, once ID24's format is finalized, will we see Legacy of Rust receive newly recorded demos?

Yeah, this is outside of spec talk. But I will answer it with a statement that demo compatibility is a long-term goal of the codebase. The kexDoom demo format as such is handled separately from the spec, and my honest recommendation is that everyone should still record their demos with other source ports for normal releases and record new demos with Kex for the BNET uploads. It's not really worth trying to support kexDoom/Rum and Raisin Doom format demos there, as from the perspective of another source port it's basically un-engineering your code to be as incomplete as those implementations. The spec itself has no random behaviors outside of non-deterministic visual elements, so ports that already have a demo-compatible MBF21 codebase can assume that implementing features for ID24 will result in correct demos. I guess there's a gap in the spec there, since my focus has been on kexDoom: such ports should have a demo version that indicates that, but I'm not up to speed on how MBF21 demos define that version.

51 minutes ago, Arsinikk said: I honestly don't understand why Dehacked is being used to determine compatibility anyway. In ports such as PrBoom, DSDA Doom, Woof, the dehacked isn't used to specify compatibility, and any boom, mbf, mbf21 codepointer is just skipped over in Vanilla complevels. I don't understand why we can't just do that here as well.

Again, this is an implementation detail for kexDoom. It has been designed to eliminate undefined behavior. Undefined behavior leads to crashes and weird edge cases, and when you're running on a console that tends to get you in trouble with the platform holders. A mod that is trying to tell kexDoom that it is for one feature set but is clearly using features from another feature set is not something that kexDoom should make assumptions about as a result, and it will require the user to deal with it in one way or another.

Since this is specifically an implementation detail of the reference implementation and not a requirement of the spec, it's something that your own implementation is free to do differently. But as long as no one here is a console platform holder, the rules need to be adhered to as best as possible. And this often means placing restrictions on things that are otherwise fine when someone is running a source port on their PC. It's just the way things need to be.
Xaser Posted September 5

Demo compatibility and complevels are a separate (though related) concept from the standards themselves, since different ports have different goals on this front. With MBF21, for example, dsda-doom and Woof! both implement a shared demo-compatible complevel 21, while other ports (GZDoom, Eternity, Odamex, etc.) do not. It's not expected that you'll be able to record a demo in dsda-doom and play it back on Eternity, for example. Demo compatibility isn't something the spec can require, realistically, but that's fine in practice since the relevant ports practice good demo-compat hygiene in every feature that gets added.

Anyhow, that's a long-winded way to say: I wouldn't use the phrase "ID24 format" to describe Kex-recorded demos, since the incompatibility comes from the source port, not the feature set. "Kex format" or "kexDoom format" is fine, maybe "OSRS2" if you want to be a nerd and use the demo signature as a reference. :P
rfomin Posted September 5

Quote:
* doom1.9 - Equivalent to the final DOS retail releases of The Ultimate Doom, Doom II, and Final Doom.
* limitremoving - Certain limits removed to stop crashes and memory overwrites; includes a select few line actions from Boom that only affect visual features and do not break demo compatibility.
* bugfixed - A limit removing mode that fixes well known bugs that outright break demo compatibility with doom1.9.
* boom2.02 - The Boom feature set as originally released in 1998.
* complevel9 - Sky transfers from MBF, MUSINFO, longtic demos, and a few other bits and bobs that do not exist in Boom 2.02 but are generally accepted by the community to mean "Boom compatible".
* mbf - The MBF 2.03 feature set as released in 1999.
* mbfextra - Everything from mbf plus DEHEXTRA.
* mbf21 - The MBF21 feature set as released in 2021.
* mbf21ex - Everything from mbf21 plus DSDHACKED.
* id24 - The featureset described by this group of specifications.

The COMPLVL lump is intentionally very minimalistic. Currently this list looks like a roadmap for R&R/KEX ports; it's hard to implement even for demo-compatible ports. We've already discussed this a bit, but can it be simplified? I suggest removing all cosmetic/visual-only features from compatibility checks. I also don't think we need the following:

* limitremoving/bugfixed distinction. Current limitremoving is supposed to be doom1.9 compatible; bugfixed is complevel boom (9) and higher.
* boom2.02 - The difference with complevel 9 is only cosmetic/visual. There are very few demos recorded in Boom 2.02; almost all of them are complevel 9.
* mbf - All notable MBF PWADs are complevel 11 and desync/don't work in MBF.EXE.
* mbfextra - Currently DEHEXTRA works with all complevels in demo-compatible ports.
* mbf21ex - Same as DEHEXTRA.

Otherwise, we will end up in a PrBoom+ situation where there are tons of complevels and only a few of them are used. This is confusing for mod creators and newcomers.
Lollie Posted September 5

So I don't really have a horse in the spec definitions race, but I have questions about TRAKINFO. It doesn't seem to be mentioned anywhere in the specs, even though it's used by extras.wad to define the SC55 and remix soundtracks. MUSINFO is mentioned, but it sounds like it's only present as part of compatibility for complevel9. Is TRAKINFO meant to be part of ID24, or is it just to be treated as a D+D2 feature?
Edward850 Posted September 5

It's just meant as an engine feature, though it could be expanded into a spec of its own if the need arises. Currently all it does is define track replacements and volume.
The Dommo Posted September 5

Are there any plans to update Legacy of Rust or introduce some test maps when ID24 is fully finished? I think it'd be great to see what this new feature set is fully capable of when finished, especially as someone who now mains the Kex port. (Good job to everyone at Nightdive and GooberMan, by the way!)
Cacodemon345 Posted September 5

The tranmap handling as described here in the opening post still isn't friendly with hardware renderers IMO (although colormap rendering is). The reason is that sampling from the "background" image is not as easy as it may sound. Most desktop GPU hardware is incapable of sampling the framebuffer in a fragment shader (or, even if possible, not without serious performance hits and/or introducing design issues on existing hardware renderers). The only other alternative involves making a copy of the image after each draw call, which is sure to destroy performance. Even if such an alternative were invoked only when encountering TRANMAP mobjs, it would still significantly slow down the renderer on maps featuring a large amount of such mobjs.

It may work well for software renderers, but it will not be friendly to any hardware renderer that does not target solely non-desktop hardware, including the ones in DSDA-Doom and GZDoom. What should be done is abandoning this non-futureproof idea altogether and instead going for a comprehensive solution that is compatible with most hardware renderers. TRANMAP's arbitrary nature makes it impossible to support properly, and even with various Vulkan/OpenGL extensions it would still remain impractical to support. It's better to stick to alpha levels (as I mentioned in the original post) and what can be reasonably and easily adapted by hardware renderers (a good starting point is this). I'm sure colormaps to deal with various blending modes can be reasonably generated on the fly.
GooberMan Posted September 5

1 hour ago, Cacodemon345 said: The tranmap handling as described here in the opening post still isn't friendly with hardware renderers IMO (although colormap rendering is).

Well, there it is. The exact kind of opinion where I would reject an application for a junior graphics programmer in the AAA space. I've already been in communication with hobomaster, so rather than derail the thread and take apart why this fails the litmus test (other than to say no one in AAA uses the final framebuffer for sampling, so stop bringing that up as a smoking gun against the technique), I'm just going to focus on providing a sounding board and support for Helion and watch the goal posts move yet again about what is impossible/worth doing when it supports tranmaps correctly.
Cacodemon345 Posted September 5

1 hour ago, GooberMan said: (other than to say no one in AAA uses the final framebuffer for sampling so stop bringing that up as a smoking gun against the technique)

No BSP-based Doom hardware renderer does so either. Not sure why you're bringing this up.

1 hour ago, GooberMan said: watch the goal posts move yet again about what is impossible/worth doing when it supports tranmaps correctly.

Tell us how Helion does it (especially with OpenGL 3.3) and I'll be happy to be proven wrong. :)
elf-alchemist Posted September 5

17 hours ago, rfomin said: * mbfextra - Currently DEHEXTRA works with all complevels in demo compatible ports * mbf21ex - Same as DEHEXTRA.

I had actually DM'd GooberMan earlier about this, specifically about DEHEXTRA running on lower complevels, and to quote him from almost two weeks ago:

Quote:
First off, there's a fundamental misunderstanding there of what DEHEXTRA is meant to be. As per the development thread where all this was hashed out, it explicitly requires both the MBF tables (and a few extra prBoom tables) to be a complete spec. DEHEXTRA without MBF functionality is effectively undefined behavior and in violation of the spec. As I understand it, Crispy Doom is basically the only port to break with this and implement DEHEXTRA. And it cannot be a complete implementation by definition if it is not implementing MBF functionality.

The thing to understand about how the ID24 spec defines those feature levels is that they have been chosen as a result of the clean room implementation running the actual Boom and MBF executables to see what is actually included in each spec. prBoom added a lot of stuff that has been assumed to just mean Boom, but this is provably wrong (Overboard, a mod that lists itself as Boom compatible, cannot be loaded in Boom nor MBF, as just one example). This is also much to the chagrin of other port developers who continually get bug reports about 25-year-old code that actually was taken from Boom. Further, as this is a spec designed to run on consoles (which have far more restrictive environments than the anything-goes PC space) it is a requirement to eliminate undefined behavior wherever it is found. While most feature sets are pretty straightforward, the -complevel 9 feature level was defined to accept the reality of the above Boom incompatibilities, as well as the fact that MBF was virtually unused until it became popular with prBoom's implementation over a decade later. As such, the suggested feature levels go against the more restrictive requirements and overlook many of the reasons these feature levels have been defined in this manner.

So, the big picture: source ports in the PC space seem to have a lot more leniency than KEX Doom can afford, hence the extensions to DEHACKED being fully locked behind their respective complevels in here. One can imagine these features on a two-level system of strictness, whereby KEX Doom has to follow the stricter version of it, and other ports can generally assume the looser version of the spec, where backporting features as plausibility allows is permitted.

I will agree however that the nomenclature is a little taxing for those not already in The Know™, and perhaps something simpler like the following may very well be an improvement on that front, bringing it closer to the minimalism of COMPLVL, if losing a little on being technically descriptive.

* vanilla -> doom1.9 and limit-removing
* vanilla_extra -> bugfixed
* boom -> boom202
* boom_extra -> complevel9
* mbf -> mbf
* mbf_extra -> mbfextra
* mbf21 -> mbf21
* mbf21_extra -> mbf21ex
GooberMan Posted September 5

Sorry, but I'm going to have to tell you not to quote me from DMs if you're not going to make clear the fact that you wanted DEHEXTRA specifically on the limit-removing feature set. Putting it out in this context muddies the waters quite a bit. rfomin's post is being discussed with a few other people, and I will reply to that when there is something worth saying on it.
Professor Hastig Posted September 6

14 hours ago, Cacodemon345 said: Tell us how Helion does it (especially with OpenGL 3.3) and I'll be happy to be proven wrong. :)

Does it even matter? Helion is a palette-emulated hardware renderer, so it doesn't mean much if it gets TRANMAPs to work somehow. A true colour renderer will never be able to handle TRANMAP properly. Even for palette-emulated renderers things can get tricky if they blend the depth fade levels like various Build engines do, or allow true colour images mixed with palette emulation - again a standard feature in Build engines for hi-res replacements.

A common standard meant to be implemented by all interested ports out there should only define features that do not assume implementation-specific traits of the renderer - and "strictly 8-bit paletted with a single global palette" is such a trait - that alone disqualifies TRANMAP as part of a "universal" standard. Having an alpha value for transparency is both easier to use and more compatible while covering the majority of TRANMAP use cases without forcing the mapper to provide data the engine could easily generate itself.
GooberMan Posted September 6 That's quite honestly a fundamental misunderstanding of what a tranmap is. The data represented by a tranmap is the results of a blend operation between a render target and a surface. It is not just transparency, that is just one function that can be expressed. Consider the SKYTRAN lump that I uploaded to the resources folder in the 0.99.1 spec release - it chooses different results for palette index 0 and thus can implement Hexen skies entirely with a tranmap with no other specialised code paths required. This is where the distinction of a tranmap being the results becomes clear - each entry in a tranmap can have an entirely different blend operation. This is even a valid optimisation path in modern AAA gaming. Leaving it entirely to shaders and not baking results is obviously the easiest way to do it but it's going to make it far more difficult to achieve 60FPS consistently if you're mucking your material pipelines up with complicated user-defined functions. All this discussion about true colour renderers never blah blah blah and alpha is easier blah blah blah- it's straight up misdirection. The simple fact of the matter is that tranmaps have existed since 1998's Boom releases (1994 if you want to go even further back to Heretic). It is a staple part of a software renderer and an identifiable part of the graphical aesthetic. For people who willingly choose that aesthetic in 2024, it is a creative choice. It doesn't stop them from targeting hardware renderers specifically. And quite frankly, saying "no" to the choice is the antithesis of an engineer's mindset. An engineer seeks solutions to a problem, and the problem in question here is how to support artistic expression. The direct feedback I've gotten from prominent modders that are preparing mods with tranmap and colormap usage, in fact, have thanked me for giving so much control over the end result to them. 5 Share this post Link to post
Gez Posted September 6

Quote:
- Update UMAPINFO's bossaction field to allow thing numbers as well as mnemonics; and add the bossactionednum field to use a thing's doomednum instead
- Allow user-defined string mnemonics starting with USER_ to be added to your string lookup table

Okay, so. Why not, instead of using DEHACKED numbers in UMAPINFO, allow defining new thing mnemonics in DEHACKED that can then be used in UMAPINFO? From the perspective of DecoHack, it'd be a new custom property that things can have or not. The DEHACKED parser keeps a table of these custom strings and associates them with the relevant number. And when encountering a mnemonic in UMAPINFO, the engine looks at the table filled in by the DEHACKED parser. At no point does the modder need to bother knowing which number DecoHack will give. They can just keep using "auto thing" for everything. Inserting or deleting things doesn't force them to update anything.
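Mechanically, that's just a shared string-to-thing-number table between the two parsers. A minimal sketch of the idea (names and error handling made up, nothing here is specced):

```cpp
#include <string>
#include <unordered_map>

// Filled in by the DEHACKED parser whenever it sees the (hypothetical) custom
// mnemonic property on a thing definition.
static std::unordered_map< std::string, int > userThingMnemonics;

// Called by the UMAPINFO parser when a bossaction names a mnemonic rather than a number.
int ResolveBossActionThing( const std::string& mnemonic )
{
    auto found = userThingMnemonics.find( mnemonic );
    return found != userThingMnemonics.end() ? found->second : -1; // -1 = unknown mnemonic
}
```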
GooberMan Posted September 6

Xaser asked for the editor number functionality for that exact DecoHack reason: the numbers it generates for thing IDs can change, but the editor number is generally entirely controlled by the mod author. This, however, is a good suggestion that never crossed anyone's mind. To simplify the implementation, I would suggest that a USER_ mnemonic used in a bossaction would do a string-to-integer conversion on request. But otherwise we can spec that up pretty easily. Perhaps introduce a new TID_ mnemonic to handle it so that USER_ doesn't get overloaded?
Gez Posted September 6

TID_Name seems reasonable enough, though the association of TID generally is with instances of things rather than with classes. Maybe a MOBJ_ mnemonic would fit better? That's just bikeshedding, though; it's not excessively important which convention is used.
SaladBadger Posted September 9

Just a heads-up that the weird copy-pasting seems to still be present in the Finale lump specification. "castmembers" is still using the wrong description, and the format of "castmember" isn't precisely defined anywhere.

Also, just a small thing, but I've wondered for a while if an inverse colormap would have worked for tranmap emulation, so hearing an actual AAA graphics dev bring it up makes me kinda happy. I'd be happy if more ports ended up implementing it.
GooberMan Posted September 9

Ah crap, yeah, I was in a bit of a hurry to get the spec written down before QuakeCon and missed finishing the finale spec. I'll get on that.

4 hours ago, SaladBadger said: I've wondered for a while if an inverse colormap would have worked for tranmap emulation, so hearing an actual AAA graphics dev bring it up makes me kinda happy

Colour space conversions are super common in AAA (there's a ton of effects that are best done in HSV, just as one example), and that's all this is at the end of the day. When it comes to sorting blended objects for effects like refraction - and by extension tranmaps - how each engine does it is quite different and is highly dependent on the architecture under the hood.
GooberMan Posted September 13

Minor bump to highlight that UDB configuration files that let you use the new linedef types and place any texture on any surface have been attached to the first post. The configurations are grossly incomplete and require you to manually extract them into the UDB "Configurations" folder, so for right now they come with an "unsupported" disclaimer. They will be updated in the future to provide better support.
Cacodemon345 Posted September 13

Can an explicit "levelnum" field be added to UMAPINFO for the ID24 spec? The currently undocumented method used to determine map numbers is very suboptimal. Episode fields shouldn't restart the map number from 1, and when there's no "levelnum" field the number should be determined by checking the lump name of the map specified.
GooberMan Posted September 13

It's one of my primary bugbears with UMAPINFO that determining the levelnum is unspecced and basically "implementation defined". As it stands though, the behavior that's in the current live version of Doom + Doom II is considered to be buggy because it doesn't match that of other source ports, and I have fixed that behavior for a future release to scrape numbers from ExMx and MAPxx at all times.

I think in the case of adding a new levelnum field though, it's something there should be a larger discussion on. Just a few questions I have:

- How do you define an episode number when you're in commercial (i.e. Doom II) mode?
- What does that mean for -warp/idclev?
- Demos save episode and map number to resolve the map to load; should this go hand-in-hand with an expanded demo format that allows you to explicitly declare the map being loaded?
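For reference, the scraping behaviour described above amounts to something like the following sketch - not the kexDoom code, and it deliberately sidesteps the open questions by ignoring names that match neither pattern and reporting commercial-mode maps as episode 1:

```cpp
#include <cctype>
#include <cstring>

bool ScrapeLevelNumbers( const char* lumpname, int& episode, int& map )
{
    const std::size_t len = std::strlen( lumpname );

    // ExMy
    if( len == 4 && lumpname[ 0 ] == 'E' && std::isdigit( (unsigned char)lumpname[ 1 ] )
        && lumpname[ 2 ] == 'M' && std::isdigit( (unsigned char)lumpname[ 3 ] ) )
    {
        episode = lumpname[ 1 ] - '0';
        map = lumpname[ 3 ] - '0';
        return true;
    }

    // MAPxx
    if( len == 5 && std::strncmp( lumpname, "MAP", 3 ) == 0
        && std::isdigit( (unsigned char)lumpname[ 3 ] ) && std::isdigit( (unsigned char)lumpname[ 4 ] ) )
    {
        episode = 1;
        map = ( lumpname[ 3 ] - '0' ) * 10 + ( lumpname[ 4 ] - '0' );
        return true;
    }

    return false;
}
```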
Arsinikk Posted September 13

On 9/4/2024 at 8:18 PM, rfomin said: * limitremoving/bugfixed distinction. Current limitremoving supposed to be doom 1.9 compatible, bugfixed is complevel boom (9) and higher.

I very much disagree with this. "Bugfixed" does not equal "Boom", nor would I want it to equal "Boom". "Boom" changes so much shit in the backend that basically makes certain Vanilla things not work correctly - stuff such as floor action behaviour being "fixed", stuff like actors falling off ledges, and just so much other stuff that just isn't Vanilla.

I realise this is going to be extremely confusing here, but my port / DSDA-Doom fork Nyan Doom actually adds what I call true "limit-removing" functionality, which is very similar to what "bugfixed" does here, in that it removes all vanilla overflows (which I consider part of Vanilla's limits) in Vanilla complevels (0-4). Obviously I felt that there was a need for such a "compatibility" option: there are many limit-removing wads that exceed overflows, so there actually needs to be a Vanilla+ compatibility option, which is currently sorely missing in source ports.