Tech Idea: Cross-Port Intermediate Demo Format

Recommended Posts

Using delta compression and skipping objects that aren't moving should really get the size down, especially in conjunction with something like gzipping the stream. You're basically recreating how the network protocol works in some client/server ports and in later Quake games. Delta compression would make rewind a little more complicated, but the space savings probably make it well worth it, and it would scale to thousands of monsters.
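
To make that concrete, here is a minimal sketch in C of what a per-tic delta pass could look like. The actor_snap_t layout and the two-pass loop are invented for illustration; the idea is just that any actor whose snapshot is identical to last tic's costs nothing, and the resulting stream would then be gzipped as a whole.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical per-actor snapshot; a real format would define its own fields. */
typedef struct
{
    uint16_t id;
    int32_t  x, y, z;        /* fixed-point position */
    uint32_t angle;
    uint16_t sprite, frame;
} actor_snap_t;

/* Write one tic's worth of deltas: idle monsters are skipped entirely. */
static void write_tic_delta(FILE *out, const actor_snap_t *cur,
                            const actor_snap_t *prev, int num_actors)
{
    uint16_t changed = 0;

    for (int i = 0; i < num_actors; i++)
        if (memcmp(&cur[i], &prev[i], sizeof(actor_snap_t)) != 0)
            changed++;

    fwrite(&changed, sizeof(changed), 1, out);   /* tic header: record count */

    for (int i = 0; i < num_actors; i++)
        if (memcmp(&cur[i], &prev[i], sizeof(actor_snap_t)) != 0)
            fwrite(&cur[i], sizeof(actor_snap_t), 1, out);
}

A field-level change mask per actor would shrink this further, and compressing the whole file afterwards removes most of the remaining redundancy.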


I know this sort of thing has been discussed in chatrooms to some extent, although there's never been any formal community effort, since those seem to seldom go well. The gist of the discussions was to take the existing savegame code, which already serializes most of the game state, and overhaul / repurpose it as the demo format, maybe even by delta compressing the savegame each tic and having occasional I-frames, to use video encoding terminology.
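
As a rough illustration of that I-frame / delta-frame split (the serializer names and the 'I'/'D' tags are made up here, not existing savegame API):

#include <stdio.h>

/* Hypothetical serializers repurposed from the savegame code -- not existing API. */
void G_SerializeGameState(FILE *out);     /* full state ("I-frame") */
void G_SerializeStateDelta(FILE *out);    /* only what changed since the last tic */

#define KEYFRAME_INTERVAL 350             /* every 10 seconds at 35 tics/sec (arbitrary) */

static void record_tic(FILE *out, int gametic)
{
    if (gametic % KEYFRAME_INTERVAL == 0)
    {
        fputc('I', out);                  /* playback and seeking can resync here */
        G_SerializeGameState(out);
    }
    else
    {
        fputc('D', out);                  /* delta frame: depends on the previous tic */
        G_SerializeStateDelta(out);
    }
}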


Thanks for the responses! A couple more thoughts:

 

The first steps towards proving something like this out would probably be to take a vanilla-compatible source port that is reasonably easy to work with, like Chocolate Doom, and hack in just the LMP playback + export support described above. Then hack in playback support for the generated data. See how much work it is to make that process stable. Check the I/O and disk space concerns. If it seems promising, start thinking about how to actually standardize the format and what features it would need to work with the full range of possible codebases.

 

This might be a can of worms, but I'm thinking about things like the Heretic teleport "glitter", which spawns tons of actors that essentially don't affect the playsim at all, and ZDoom-derived mods that spawn tons of particle FX actors. These would bulk up the data a lot without much point. So maybe you could flag certain actor classes as "let the playsim handle these": only their spawn is recorded in the demo data, and the game is responsible for updating them as in a traditional-format demo. Potentially messy implications, but it seems like it could be a good option to have.
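
A hedged sketch of how such a flag could be checked while serializing a tic; the MF3_DEMOLOCAL flag and the demo_spawn_recorded field are invented for this example, not anything an existing port defines.

#include "doomtype.h"   /* boolean */
#include "p_mobj.h"     /* mobj_t; the flags3 / demo_spawn_recorded fields below are invented */

/* Cosmetic actors (Heretic glitter, particle-effect actors, ...) are written
   only once, at spawn, and the playsim runs them normally afterwards. */
static boolean ShouldRecordActor(const mobj_t *actor)
{
    if (actor->flags3 & MF3_DEMOLOCAL)          /* invented flag */
        return !actor->demo_spawn_recorded;     /* record the spawn, then never again */

    return true;                                /* everything else: captured every tic */
}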


I've thought about this in the past, before realizing that the juice was not worth the squeeze.

 

The people who use demos the most in 2020 are speedrunners.  I'm not a speedrunner myself, but I highly suspect that most speedrunners would consider state-capture demos strictly inferior to deterministic ones in determining authenticity - if you can't trust the sequence of events in the demo, why not record an attempt live on Twitch instead?

 

And once you cross speedrunners off the list, what do you have left?  Attract demos, which only appear at the start when you're not actually playing the game, and have been disabled by ZDoom for decades anyway?  Consider also that any new demo standard would then need WADs that include demos in that new standard, with a lot of older ones being left out.

 

I dunno, just doesn't seem worth it.

21 minutes ago, AlexMax said:

I've thought about this in the past, before realizing that the juice was not worth the squeeze.

 

The people who use demos the most in 2020 are speedrunners.  I'm not a speedrunner myself, but I highly suspect that most speedrunners would consider state-capture demos strictly inferior to deterministic ones in determining authenticity - if you can't trust the sequence of events in the demo, why not record an attempt live on Twitch instead?

 

And once you cross speedrunners off the list, what do you have left?  Attract demos, which only appear at the start when you're not actually playing the game, and have been disabled by ZDoom for decades anyway?  Consider also that any new demo standard would then need WADs that include demos in that new standard, with a lot of older ones being left out.

 

I dunno, just doesn't seem worth it.

 

Yeah, these are 100% valid concerns and questions. That's why I'm wondering how much work the initial research would be, i.e. to determine what the real squeeze would be. (I'm not quite comfortable enough with C that I could just bang this out myself, unfortunately.)

I do think there's a chicken-and-egg dynamic where more people (beyond speedrunners) might use demos if they were more widely compatible and accessible, i.e. as simple as pressing the Share button on a PS4 controller, but that's speculation. As for the more experimental uses, I think it'd need a killer app: something someone really wants to make that they couldn't do as easily any other way.


I've discussed the matter in the past but I'm a lazy sod. I'm all for this happening though.

One major benefit this would probably enable (I know people have wanted it before) is being able to, ideally, rewind demo playback. Hell, imagine being able to rewind during gameplay because you're snapshotting everything. Forza Doom sounds sweet.


Is there even anything preventing implementation of demo rewind during playback in the current state of the engine? Sounds like a simple matter of writing "reverse" game logic that does everything backwards from a given tic.

EDIT: disregard that, I'm an idiot.

Edited by tchkb

22 minutes ago, JPL said:

Yeah, these are 100% valid concerns and questions. That's why I'm wondering how much work the initial research would be

 

I mean, that's just my assumption.  If you want, you could ask the speedrunners if they could find some use out of it.  There's a forum for them here, and a discord as well.

34 minutes ago, tchkb said:

Is there even anything preventing implementation of demo rewind during playback in the current state of the engine? Sounds like a simple matter of writing "reverse" game logic that does everything backwards from a given tic.

 

I'll defer to the testimony of actual port authors, but speaking just as a programmer I'd have to really think carefully about how to do it in my own engine. And I think certain architectural assumptions about how to handle time and simulation stepping make it relatively easy, hard, or nigh-impossible. From everything I know of Doom's architecture it wouldn't be easy. For example, how do you "rewind" an actor that existed for a while but is now destroyed (as in, not just fatally damaged, but actually despawned and garbage-collected)? You can determine the future state of an actor much more easily than you can determine what happened to it previously from a given tick. It's different for, e.g., a time-based puzzle game where you deliberately write everything so time can move backwards, but Doom obviously ain't that.


As a point in favor of this kind of thing: if it is designed so it can easily be parsed from outside a Doom engine (perhaps implemented with protobufs or something similar), then it greatly opens up the ability to do data visualization en masse.

 

It would be cool to map out every speedrun submission and watch the route shift around as new techniques are discovered. Or tools for playtesting new WADs, where playtesters record and submit demos and the data is analyzed to find the hotspots where players run short of ammo or low on health, to improve balance. Map walkthroughs on DoomWiki could show a path with highlights for where keys are picked up, by looking at the change in the inventory flags.

 

As a joke, I hooked up Crispy Doom to Redis, where it basically (very poorly) serializes out every mobj in the game after the frame runs, and reads it all back in right before running the next frame. This is obviously silly on its own, but this kind of serialization into a simple format that can be interacted with in any language opens up a lot of potential applications.
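
As a toy example of the kind of external tooling that becomes possible, here is a small standalone C program. It assumes a hypothetical line-oriented text dump of the demo where each line reads "tic player x y" (not any existing format).

#include <stdio.h>

/* Reads the hypothetical "tic player x y" dump from stdin and prints the
   bounding box of the route -- the starting point for a heatmap or a
   route-visualization tool that never touches the engine itself. */
int main(void)
{
    int tic, player;
    double x, y;
    double minx = 1e30, miny = 1e30, maxx = -1e30, maxy = -1e30;

    while (scanf("%d %d %lf %lf", &tic, &player, &x, &y) == 4)
    {
        if (x < minx) minx = x;
        if (x > maxx) maxx = x;
        if (y < miny) miny = y;
        if (y > maxy) maxy = y;
    }

    printf("route bounds: (%.1f, %.1f) to (%.1f, %.1f)\n", minx, miny, maxx, maxy);
    return 0;
}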

2 hours ago, JPL said:

This might be a can of worms, but I'm thinking about things like the Heretic teleport "glitter", which spawns tons of actors that essentially don't affect the playsim at all, and ZDoom-derived mods that spawn tons of particle FX actors. These would bulk up the data a lot without much point. So maybe you could flag certain actor classes as "let the playsim handle these": only their spawn is recorded in the demo data, and the game is responsible for updating them as in a traditional-format demo. Potentially messy implications, but it seems like it could be a good option to have.

Skulltag/Zandronum has the CLIENTSIDEONLY actor flag for, well, basically the netcode equivalent of this purpose: let the client handle it on its own; the behavior of this actor will not affect the rest of the sim.


Roughly speaking the disk space requirements would be about the same as a Zandronum client side demo.  In fact, a Zandronum client side demo is more or less what you want and demonstrates the real issue with implementing this idea: It's not that simple to make a stable format.  I don't know how much bloat would be added once you make a protocol which is actually able to support any future actor extensions.  What about floating point vs vanilla's fixed point?

 

With that said, I don't have much interest personally, since to me demos are a debugging tool and ticcmd captures are way more useful for that.  There have been plenty of issues in Zandronum that were difficult to investigate even though the issue was captured in a demo: since the demo is just a dump of the network traffic, it doesn't actually reproduce the bug, just the observable result.

 

The stated features are more nice-to-haves than anything.  Speedrunners would still need to stick to one version in order for times to be fair, so this feature is inherently useless to them, since the current ticcmd demos work fine for that even in GZDoom.  So that just leaves viewing demos for fun in a different environment from the one they were played in.  Which is cool I guess, but I don't see much point.


If only it were that easy.

Even if you got the basics figured out, there are still things like vastly differing feature sets that have to be tracked as well, even for something as basic as translucency. Some ports do not implement it at all, some only implement it as a cheap TRANTBL effect controlled by actor flags, while others provide full expression of alpha and render style. And among those, it isn't even guaranteed that the render style is expressed in a compatible format.

 

And this will go on for every single variable that gets added - if its type does not match or the internal semantics differ, it doesn't matter one bit if the data stream is supposedly complete - some ports won't be able to reconstruct the actor from it anymore and the idea of a cross-port format goes out of the window.

 

The entire thing would only work if all participating ports had compatible feature sets that allow reconstruction of the internal game state from the data. Which ultimately brings us back to where we are right now: feature-centric ports, for which preserving demo compatibility has always been a hassle, would be inconvenienced even more than they are now, because the format is far more invasive and imposes most of the same limitations that current demo support does.

 

I can say with absolute certainty that I won't ever put up with this. It's something that sounds nice on the surface but the horrors linger below and will haunt the developers for a long time.

 

 

2 hours ago, Graf Zahl said:

Even if you got the basics figured out, there are still things like vastly differing feature sets that have to be tracked as well, even for something as basic as translucency. Some ports do not implement it at all, some only implement it as a cheap TRANTBL effect controlled by actor flags, while others provide full expression of alpha and render style. And among those, it isn't even guaranteed that the render style is expressed in a compatible format.

Different feature sets didn't stop UDMF from existing.

 

So a new demo format could be built upon the existing UDMF spec and parsing code.

 

Though I tend to agree with AlexMax that speedrunners would not fully trust these types of demos, and that makes the whole enterprise a lot less beneficial (not worth the squeeze).


The translucency example doesn't really fit, since rendering properties should not have any effect on gameplay. And if there are gameplay effects (like the SHADOW flag that throws off monster aim), that effect can be implemented identically despite the differences in rendering implementation.


Even with a new demo format we would still need to appease the speedrunners. An attempt at such a thing was made once by Xaser, when he made one of his maps (or a mapset?) compatible with Eternity, and it failed to attract the speedrunners as it wasn't PrBoom+ compatible.

If we are trying out a UDMF-like format for the demos anyway, my proposal for it would be like this:


#maplump nameOfMapLump        // Indicates the map used to record it.
#mapchecksum checkSum         // checkSum computed from the map lump data.

tic numOfAbsoluteTic          // would start from 0
{
     cam
     {
          String PSpriteName;        // PSpriteName indicates the sprite the player is holding.
          Vector3 pos;
     }
     actor
     {
          RenderStyle styleNum;      // styleNum is an integer indicating the RenderStyle.
          String curSpriteName;      // curSpriteName is the sprite name used to draw it.
          Vector3 spritePos;         // the position of the sprite.
          String curPlayingSound;    // sound played at this tic point.
     }
     // Keep adding actor structs here until it ends.
}

Granted, I think much better could be done than this mockup (including using numbers instead of sprite names, although that would defeat the purpose of a universal demo format), and the source port devs could come up with a better solution than what I made. I don't consider this a perfect solution anyway.


Hmm, yeah, I knew this would have to account for the potential range of port features somehow, but I'd hoped that stuff as specific as render style wouldn't have to be stored. Is the point of a UDMF-like format here just that keys like RenderStyle can be optional?

 

Anyways yeah if the answer is "It'd be a lot of work, and people don't care enough about demos / are fine with demos as they currently work, for it to be worth the effort", that's fine, it was just an idea I had.


I mean, personally at least, I think it'd be a cool idea for something like Eternity, if not for cross-port compat then at least for self-demo compat. Given that it can play back vanilla demos through to MBF, it's a shame it can't play old demos of itself. Having a more robust demo format would definitely be handy for it, at least.

37 minutes ago, JPL said:

Is the point of a UDMF-like format here just that keys like RenderStyle can be optional?

Keys like RenderStyle can be ignored by ports and can be omitted in ports not supporting them (although that would require predefined default values across ports).
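
For illustration, a hedged sketch of what optional-key handling could look like on the parsing side; parser_t, demo_actor_t and the Parse*/Skip* helpers are placeholders for whatever lexer a port already has, and RS_DEFAULT stands in for an agreed cross-port default.

#define RS_DEFAULT 0    /* assumed cross-port default: plain opaque rendering */

/* Parse one actor block of a UDMF-style text demo. If "renderstyle" never
   appears, the default is used; unknown keys are skipped so older ports can
   read demos written by newer, more featureful ones. */
static void ParseActorBlock(parser_t *p, demo_actor_t *a)
{
    a->renderstyle = RS_DEFAULT;

    while (!ParseToken(p, "}"))
    {
        if (ParseToken(p, "renderstyle"))
            a->renderstyle = ParseInt(p);
        else
            SkipUnknownKey(p);
    }
}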


I think I like the idea most for inter-port compatibility, since that is a big problem with things like ZDoom, or heck, even Eternity I believe. The difficulty is there, but if it could be overcome it might assist in making ZDoom demo-making at least somewhat feasible.

4 hours ago, JPL said:

Anyways yeah if the answer is "It'd be a lot of work, and people don't care enough about demos / are fine with demos as they currently work, for it to be worth the effort", that's fine, it was just an idea I had.

 

It's still a good idea as far as I am concerned because some sort of intermediate format is necessary to be able to "scrub" a demo's timeline, and any source port that added such a feature would surely find an audience. Even if it was just generating the intermediate format internally without saving to disk, once a single source port had it working it could publish a reference implementation for other ports or tools to follow if they so desired.
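
To sketch what "scrubbing" could look like with the keyframe scheme discussed earlier in the thread (all names here are invented; the 'I'/'D' frame tags follow the recording sketch above):

#include <stdio.h>

/* Hypothetical keyframe index built when the demo is opened: one entry per
   full-state frame, recording its tic number and byte offset in the file. */
typedef struct
{
    int  tic;
    long offset;
} keyframe_t;

/* Hypothetical deserializers mirroring the recording side -- not existing API. */
void G_DeserializeGameState(FILE *in);    /* restore a full snapshot */
void G_DeserializeStateDelta(FILE *in);   /* apply one delta frame */

/* Scrub/seek: jump to the last keyframe at or before target_tic, then roll
   delta frames forward until we arrive. Rewinding is the same call with an
   earlier target tic. */
static void Demo_SeekToTic(FILE *in, const keyframe_t *index, int num_keys, int target_tic)
{
    int k = 0;
    while (k + 1 < num_keys && index[k + 1].tic <= target_tic)
        k++;

    fseek(in, index[k].offset, SEEK_SET);

    for (int tic = index[k].tic; tic <= target_tic; tic++)
    {
        int tag = fgetc(in);
        if (tag == 'I')
            G_DeserializeGameState(in);
        else
            G_DeserializeStateDelta(in);
    }
}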

On 8/14/2020 at 6:54 PM, AlexMax said:

 

I mean, that's just my assumption.  If you want, you could ask the speedrunners if they could find some use out of it.  There's a forum for them here, and a discord as well.

We would never use a format like this in competition because it removes most of the (speedrunning-related) benefits of the demo format. That doesn't mean it wouldn't be useful though. In particular, it would be cool if you could take a gzdoom lmp (version-specific syncing demo) and convert it into a universal lmp or whatever you want to call it that can be played back by future versions or other ports. Then playback becomes much easier when it comes to sharing with others (and the real demo is there for verification purposes).

 

The biggest thing that comes to mind for me would be using i-frames as Linguica mentioned. If you linked those frames to places in the source demo, you could use them to speed up the TASing process significantly on larger / longer demos.


The idea of taking a GZDoom demo and converting it to a "universal" format fundamentally doesn't make any sense. Say the demo is played on a level with slopes, or uses an ACS script or a custom DECORATE monster that relies on RNG calls... How on earth do you do that in a way that plays back in any meaningful capacity in PrBoom?

 

Serializing everything would be great for scrubbing purposes or maybe also for helping analyze a demo and determine what makes it go out of sync between versions, but that's about it.

On 8/18/2020 at 6:19 PM, Wagi said:

The idea of taking a GZDoom demo and converting it to a "universal" format fundamentally doesn't make any sense.

I think you're missing the point. You're asking for authenticity, but by design this format would have no authenticity - it's specifically not meant to represent the game state that your port would have produced. If you start with just positions and transformations, as described in the OP, you already satisfactorily cover the vast majority of demos that have ever been recorded, and that will be recorded in the future. You could probably convert all 58k demos on dsda into such a format and play them back in any port that can load the maps. What you see when playing the demo would be missing certain details obviously, more or less depending on the complexity of the port used to record them.

On 8/15/2020 at 6:06 AM, Blzut3 said:

Roughly speaking the disk space requirements would be about the same as a Zandronum client side demo.  In fact, a Zandronum client side demo is more or less what you want and demonstrates the real issue with implementing this idea: It's not that simple to make a stable format.  I don't know how much bloat would be added once you make a protocol which is actually able to support any future actor extensions.  What about floating point vs vanilla's fixed point?

Extensible formats exist which already solve this kind of backward/forward compatibility question. If you wanted an off-the-shelf example for binary formats, protocol buffers are an obvious suggestion. However, for consistency with past efforts like UDMF, a plain text format would probably be preferable since it would address the floating/fixed-point discussion. You can compress the output to save space.
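
If the "plain text, then compress" route were taken, the writer could go straight through zlib's gzip layer. A minimal sketch, where the record layout is invented and only the zlib calls themselves are real:

#include <zlib.h>

static gzFile demo_out;

/* Open the demo for writing with maximum gzip compression. */
void Demo_Open(const char *path)
{
    demo_out = gzopen(path, "wb9");
}

/* Emit one text record per actor per tic; the layout here is just an example. */
void Demo_WriteActor(int tic, int id, double x, double y, double z)
{
    gzprintf(demo_out, "tic %d actor %d { pos = %f, %f, %f; }\n", tic, id, x, y, z);
}

void Demo_Close(void)
{
    gzclose(demo_out);
}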

 

Most of the objections I see here in the thread seem kind of nitpicky and I think it would be a real shame if a good idea got buried because of them.


I have been considering a similar concept: a demo format that is more robust.

DoomLegacy has already modified the demo format for its own use. It records the facing angle as an absolute value, instead of a relative difference from the previous facing. DoomLegacy also tries to fix network play sync issues by updating the absolute positions of players on the client machines.

I feel it would be easier to just fix the errors of a demo rather than try to record everything. Close to sync is adequate, as long as the time and distance of de-sync are limited. Important interactions need to be fixed, such as dead or not dead. This minimizes the amount of new work that is needed.

The Doom ports can already reconstruct the replay almost correctly, so keep that and refine it. Keep recording keyboard input, as it is the most important data.

Let the recording machine add postings that add to the fidelity of the result. There are other formats that behave this way, adding refining information that is optional.

Create a way to identify monsters, such as by sequence number, position, or mobj id. Players can be identified by player number.

Periodically post the absolute position and angle of each player; once every two seconds ought to be enough. When a player is damaged, the player's position and health are posted as absolute values.

Periodically post the absolute position and angle of a monster; one or two monsters each frame should keep them close to sync. This rate can be variable or controlled by a user setting without affecting compatibility.

When a monster shoots, post the initial position, angle, and momentum of the monster and the projectile. When a monster is damaged (by anything), post the position, angle, momentum, and health of the monster. When a monster dies, its death is posted, identified by id. When a monster spawns, its spawn is posted, with a position and a new id.

Keep the positions in fixed-point format. Slight errors are tolerable, as another absolute update will appear before they become significant. Important interactions will be recorded as absolute events.

One significant problem during playback will be the engine discovering an interaction before the posting in the demo is read. It might be possible to find the demo posting of the interaction and block the engine from doing anything about the interaction it discovered. EDIT: on second and third thought, trying to block anything the engine does on its own is a can of worms. The engine should just do what it does normally, and let the demo postings adjust the results.

During playback, make it possible for a demo posting to always override whatever the engine may have done on its own with a discovered interaction. This can be as simple as having the demo postings overwrite the player and monster values with more accurate values. A posting may have to recreate a monster or mobj, so it may be safer to put mobjs in a recently-died FIFO, just in case the demo is found to still be using them. Playback of demo postings might have to create or kill a monster, player, or object.

This would fix the inevitable differences before they accumulate enough to affect interactions. The important results of an interaction would be made absolute directly. The demo playback may be constantly slightly out of sync, but this demo format deals with that situation: it can continually bring monsters and players back into sync. The adjustments may be slightly visible, but that is far better than the total de-sync that is commonly experienced now.

As all new parts of this demo format are optional, it can be expanded and modified extensively without affecting its port compatibility. The tic format would have to be reformatted, as there is little room for additional information now.

EDIT: Put postings between the tics in the demo recording. During recording, the tic command is written first, then the tic command is executed. Significant events that happen during the tic execution might be recorded as postings after the tic command. During playback, the tic command is read, then the tic command is executed, and significant events will be discovered by the engine. Then the postings after the tic command are read. These adjust and correct the values of positions, angles, etc., of the player, monsters, and objects. This lets the engine execute normally, but refines the accuracy of the result. Health, death, and object existence are always fixed exactly.
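
To make the "posting" idea concrete, a hedged sketch of how a correction posting could be applied during playback, right after the tic command has been executed. The demo_posting_t layout and the P_FindMobjByDemoId / P_RespawnFromPosting helpers are invented for this example; only P_UnsetThingPosition, P_SetThingPosition, and P_KillMobj are real engine functions.

#include <stdint.h>
#include "p_local.h"    /* P_UnsetThingPosition, P_SetThingPosition, mobj_t, fixed_t */

/* Hypothetical correction posting, read from the demo stream after a tic command. */
typedef struct
{
    uint16_t mobj_id;                 /* recorded id of the monster/player/object */
    fixed_t  x, y, z;                 /* absolute position, fixed point */
    angle_t  angle;
    fixed_t  momx, momy, momz;
    int      health;                  /* <= 0 means the mobj should be dead */
} demo_posting_t;

/* Invented lookup helpers -- a real implementation would define these. */
mobj_t *P_FindMobjByDemoId(uint16_t id);
mobj_t *P_RespawnFromPosting(const demo_posting_t *post);

static void P_ApplyDemoPosting(const demo_posting_t *post)
{
    mobj_t *mo = P_FindMobjByDemoId(post->mobj_id);

    if (mo == NULL)
    {
        /* The engine already removed this mobj but the demo still refers to it:
           pull it back from the recently-died FIFO (or respawn it). */
        mo = P_RespawnFromPosting(post);
        if (mo == NULL)
            return;
    }

    /* Overwrite whatever the engine computed this tic with the recorded values. */
    P_UnsetThingPosition(mo);
    mo->x = post->x;
    mo->y = post->y;
    mo->z = post->z;
    P_SetThingPosition(mo);

    mo->angle = post->angle;
    mo->momx  = post->momx;
    mo->momy  = post->momy;
    mo->momz  = post->momz;

    if (post->health <= 0 && mo->health > 0)
        P_KillMobj(NULL, mo);         /* force the recorded death if playback missed it */
    else
        mo->health = post->health;
}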

 

 

 

 

Edited by wesleyjohnson


@wesleyjohnson The proposal this thread presented included the ability to scrub the timeline, which your ideas do not lend themselves to making particularly efficient. This is why a delta-snapshot methodology is presently the preferred route: it can be keyframed, which allows easy, reliable on-demand reconstruction without the need for extensive pre-processing.


Is there a term I'm missing? Can someone explain "scrub"? It seems to have no entry on the Doom Wiki...

