Catoptromancy

Savegame buffer overruns


We have a problem with pure vanilla compatibility. Many otherwise easily convertible or already exitable vanilla maps have savegame buffer overrun crashes. All these crashes were hit while standing still at the player start, meaning even more maps can overrun if the player saves the game after waking up many enemies.

Now there is a philosophical question. Chocolate-doom has a compatibility option to turn off savegame buffer overruns. Should we go for true vanilla compatibility, or chocolate-doom compatibility with savegame buffer protection enabled?

If Choco wants to ship freedoom1 and 2 in a Linux repo, it will need to flip savegame buffer overrun protection on as a default. Random linuxers will otherwise just assume a crashy game. Maybe the savegame buffer overrun messages can point players to how to disable the limit.

We need to severely strip many vanilla maps, or chocolate-doom needs to enable buffer overrun protection by default.

EDIT: After a good discussion, Freedoom's ideal target seems to be vanilla compatibility, except for the savegame buffer. Freedoom's only real incompatibility would be with the commercial DOS binaries, and only when trying to save a game.

Catoptromancy said:

We need to severely strip many vanilla maps, or chocolate-doom needs to enable buffer overrun protection by default.

The latter sounds like a much better idea. Freedoom's already got the vanilla-compatibility goal over its head; I don't think stripping down the vanilla maps would be a good idea.

Catoptromancy said:

or chocolate-doom needs to enable buffer overrun protection by default.

That would surely go against Chocolate-doom's primary goal: compatibility with vanilla.

It would be the thin edge of the wedge. If Chocolate-doom changed this for Freedoom's sake, then maybe next year it will stop crashing on VPO errors, which is the gold standard of vanilla compatibility. Pretty soon it would turn into just another source port with arbitrary modifications added at the whim of the developers.

andrewj said:

It would be the thin edge of the wedge.


This is the tricky question: do any other programs, especially in Linux repos, have a "stop crashing" option? Intentional and guaranteed crashing.

Even though chocolate-doom is (or would be) 100% compatible with freedoom, it is not by default. The savegame buffer is the last of the limit-removing options needed for true vanilla compatibility, and lifting it allows freedoom's vanilla maps to be quite busy with items and linedefs.


Been looking for this post: https://www.doomworld.com/vb/post/1534583

How can a map be accurately tested for staying within the savegame buffer limits when anything could be happening in the map at any given moment?

For simplicity's sake, pretend that doesn't exist for now and concentrate on the maps themselves. This issue can be addressed in the future.

Voros said:

How can a map be accurately tested for staying within the savegame buffer limits when anything could be happening in the map at any given moment?

Playtesters can save once in a while.

Voros said:

Been looking for this post: https://www.doomworld.com/vb/post/1534583
For simplicity's sake, pretend that doesn't exist for now and concentrate on the maps themselves. This issue can be addressed in the future.


The savegame buffer is an invisible limit that just requires tons of saving and playtesting; an invisible limit that constantly changes. All other compatibility problems are relatively easy to fix and straightforward.

Might be best to scrap almost all the maps and start from scratch. We could focus on a coherent theme and style while keeping maps within savegame buffer limits.


If you want 100% vanilla compatibility, it may really be best to scrap everything and start fresh with completely new maps - keeping the remains of the current Freedoom as a separate mod with the maps you have to scrap.

Or split some of the larger ones and ditch a few of the bad ones - there are certainly a few in here that are better replaced anyway.

Unlike everything else, this simply is not fixable without being constantly aware that maps must not get too large, and it requires an entirely different mapping style than the existing maps.

I remember back in the day how I had to amputate some larger maps to make them saveable; it's often necessary to strip out huge chunks to make them fit the savegame buffer.


You're talking about ditching over 60 maps, which were brought together over 15 years. I seriously think that's a bad idea.

And like you said, it requires tons of playtesting to check whether a map stays within savegame buffer limits. I believe that's inefficient and unnecessary. Hell, even BTSX has the same problem IIRC, and it's supposed to run in vanilla.

Also, I doubt new users will play with Chocolate Doom, as the official website highly recommends two good ports for new users to download. Naturally, they'll download one or both, find which one they like, and continue with that port. They might play Chocolate Doom or vanilla, probably for shits and giggles (I remember I did this once, and this was before I knew anything about PWADs, source ports, etc.).

Point is, testing savegame buffer limits is a huge pain in the ass, mainly because it's time-consuming. Due to the randomness, there's always a possibility that something slipped past the playtesters' radar. Unless there are absolutely devoted playtesters out there willing to give their time to this, I don't think it'll go far.

That's my opinion.


I agree with Voros, and this kind of error can only be perfectly traced with a proper tool (and I don't think we have one), but with some luck the developer of ChocoRenderLimits could try adding a real-time savegame buffer scan while playing.

Note: Doom 2's MAP30, TNT, and Plutonia 1 & 2 are easy to buffer overrun (that's why Chocolate Doom has this option; TNT is vanilla...).


I think I have a logical solution.

Linux repo install scripts can enable savegame buffer overrun protection in the cfg. Chocolate itself is untouched, and the iwad it ships with is then fully compatible with the binary by default.

Actually, even better:

Make sure the issue is well documented and the information easily findable. Chocolate-doom's wiki can make sure people turn on the compatibility option when using freedoom.

I think "chocolate-doom compatible" is a much nicer goal, and it's still basically vanilla. I do not think any other ports maintain the savegame buffer. Who is going to use id's binaries with freedoom anyway?


Linux isn't the only OS; there's still Windows and OSX.
But what you propose sounds like a good way to overcome the savegame buffer limit for Linux systems.


Can savegame buffer overruns only be disabled via the cfg, or also via a command line option? If the latter is possible, it may make sense to include a shell script or batch file that starts the game with the proper settings.


From a short skim of Chocolate Doom's command line parameter page, there doesn't seem to be a direct way to manipulate the savegame buffer limit, although there are -config and -extraconfig.

But Freedoom consists of three IWADs. Does that mean three different batch files should be included, or should we use the choice command (if it's on Windows)? Not sure what commands an Apple computer has, though.
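For what it's worth, one small cross-platform wrapper could cover all three IWADs instead of three batch files. This is only a hypothetical sketch: it assumes Chocolate Doom's -iwad and -extraconfig options and the vanilla_savegame_limit key from chocolate-doom.cfg, and the IWAD filenames are illustrative.

```python
# Hypothetical launcher sketch, not a tested tool: it assumes the
# vanilla_savegame_limit key (chocolate-doom.cfg) and the -iwad /
# -extraconfig command line options; IWAD filenames are illustrative.
import subprocess
import tempfile

IWADS = {"phase1": "freedoom1.wad", "phase2": "freedoom2.wad", "freedm": "freedm.wad"}

def write_extraconfig():
    """Write a one-line extra config that lifts the savegame limit."""
    cfg = tempfile.NamedTemporaryFile("w", suffix=".cfg", delete=False)
    cfg.write("vanilla_savegame_limit 0\n")
    cfg.close()
    return cfg.name

def build_command(iwad, extraconfig):
    """Assemble the Chocolate Doom invocation as an argument list."""
    return ["chocolate-doom", "-iwad", iwad, "-extraconfig", extraconfig]

def launch(which="phase1"):
    """Start the chosen IWAD with the extra config applied."""
    subprocess.call(build_command(IWADS[which], write_extraconfig()))
```

Calling `launch("phase2")` would then start Phase 2 with the limit lifted; a Windows .bat equivalent would only be a couple of lines per IWAD.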

Voros said:

Not sure what commands an Apple computer has though.


OSX is Unix-based so it should be mostly similar to Linux.


I feel like I can live with having freedoom/choco users manually lift the savegame buffer limit.

Almost all freedoom maps are now nearly choco compatible (with the buffer limit lifted). All future maps can be quite detailed in smaller areas. Ignoring the savegame buffer might be the edge freedoom needs. This would make it the only ignored vanilla limitation and allow for epic maps.

Since freedoom is only a dependency of choco and not directly bundled, it might make sense to allow a game that initially appears crashy.


I recommend ignoring the savegame buffer limit.

If you're really concerned about it then it's probably possible to make a tool that (given a WAD file) will examine a particular level and tell you whether it's likely to exceed the savegame limit or not. We could add that as part of the build process and output a warning. But in all honesty I don't think it's worth it. Most vanilla mods don't even bother addressing the savegame buffer limit.
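As a sketch of what such a build-time checker could look like: the per-record sizes below are my assumptions, loosely approximating what vanilla's p_saveg.c serializes, so treat the verdict as a heuristic rather than an exact model.

```python
# Back-of-the-envelope savegame size estimator for a vanilla Doom level.
# The per-record sizes are assumptions approximating p_saveg.c output;
# the verdict is a heuristic, not a guarantee.

SAVEGAMESIZE = 0x2C000   # 180224 bytes: vanilla's static savegame buffer

SECTOR_RECORD = 14       # P_ArchiveWorld: 7 shorts per sector
LINE_RECORD = 6          # flags, special, tag per linedef
SIDE_RECORD = 10         # 5 shorts per sidedef
MOBJ_RECORD = 1 + 154    # thinker class byte + approx mobj_t size (assumed)
HEADER_APPROX = 400      # description, version, game state, approx player data

def estimate_savegame_bytes(sectors, linedefs, sidedefs, mobjs):
    """Estimate the savegame size for a level with these counts."""
    return (HEADER_APPROX
            + sectors * SECTOR_RECORD
            + linedefs * LINE_RECORD
            + sidedefs * SIDE_RECORD
            + mobjs * MOBJ_RECORD)

def likely_overruns(sectors, linedefs, sidedefs, mobjs):
    """True if the level would probably blow the vanilla buffer when saved."""
    return estimate_savegame_bytes(sectors, linedefs, sidedefs, mobjs) > SAVEGAMESIZE
```

A build step could pull the counts straight from the map lumps (THINGS entries are 10 bytes each, LINEDEFS 14, SIDEDEFS 30, SECTORS 26) and print a warning per level. Note that the mobj count grows at runtime (lost souls, projectiles, dropped items), so any static estimate undershoots.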

Catoptromancy said:

Ignoring the savegame buffer might be the edge freedoom needs.

Why does Freedoom need an edge, and how is ignoring the save buffer that edge?

And if Freedoom needs an edge, why did it blunt itself by forsaking its longstanding historical Boom requirement?


I think Cato meant "the edge it needs right now", and he said "might be" too.
Ignoring the savegame buffer is an edge for the vanilla goal; otherwise all the maps would have to be scrapped/remade, both vanilla and Boom format maps, in both IWADs. Like several people have said here.


I think all ports allow savegame buffer overrun protection. So ignoring the savegame buffer will allow freedoom to be compatible and savable with all ports, even with chocolate if the compatibility option is turned on.

Maps can get surprisingly huge and detailed if the buffer limit is turned off. Just tone down the large open areas, while smaller rooms can be very detailed.

The only binaries not savegame compatible would be the actual dos exes.

Catoptromancy said:

I think all ports allow savegame buffer overrun protection.


I think most ports have completely abandoned the concept of a savegame buffer, which was utter stupidity to begin with. It would have been better to just write out the data in pieces with fwrite and load it back in piece by piece the same way; fwrite already performs I/O buffering to improve performance.

For Freedoom, a far bigger problem will be making the maps vanilla compatible in the first place. Those which exceed the limits do so badly enough that they'll have to be completely torn apart. I really think that limit-removing is the best that's achievable. Getting rid of the Boom stuff is relatively simple; going down to vanilla from there will end in disaster, if even a map like MAP07 had to be butchered to get it below the limits.


MAP07 did not have to be cut down that far. I made a nearly identical version years ago where the changes were almost unnoticeable. E4M7 is a huge map with wide open areas, and it is now vanilla. It did take constructing a wall in the middle, though; gameplay is basically identical in the new E4M7. MAP07 did greatly change gameplay, though. My favorite boxes to hide behind are gone, but there is still a bit of a hiding spot.

There is no need to butcher maps, just carefully planned adjustments. MAP07 was an easy one. E4M1, like E4M7, had carefully crafted lighting. I still managed to make them vanilla without touching the lighting... too much.

[Spoiler: old vs. new screenshots of the lighting in the finished vanilla map.]

MAP15 will be a challenge to tone down properly. Maybe I need to turn some windows into walls, mess with lighting a bit, or join vertices in outside areas. But gameplay will be unchanged and the overall feel of the map will be identical.

Gez said:

And if Freedoom needs an edge, why did it blunt itself by forsaking its longstanding historical Boom requirement?

Can we stop arguing about this? The decision's been made; I don't see any good coming from endlessly bringing it back up all the time.

Catoptromancy said:

There is no need to butcher maps, just carefully planned adjustments. MAP07 was an easy one. E4M1, like E4M7, had carefully crafted lighting. I still managed to make them vanilla without touching the lighting... too much.

A suggestion: if we want to keep to the vanilla savegame limit, then do this in stages rather than all at once. This is essentially the same as what I suggested a couple of months ago: do the vanilla conversion in progressive milestones, i.e.:

Milestone 1: All levels limit-removing (no use of Boom extensions)
Milestone 2: All levels vanilla-compatible (runs in vanilla)
Milestone 3 (if necessary): Levels keep to vanilla savegame limit

Even if you think the savegame limit is worth doing, it's of lesser importance than milestones (1) and (2) above. It's better to have all levels reach one of those milestones than to have one level reach all of them. Does anyone disagree?

One thing to note: in the text I've quoted above, it seems like you're arguing that keeping to the savegame limit will be easy because keeping to other vanilla limits has been easy. That's a false equivalence: writing savegames involves writing the entire level. That means you're constrained by the complexity of the level as a whole - it's not like, e.g., the visplane limit or the drawsegs limit, where you can partition the level to reduce the complexity of a particular region. I think you probably do need to "butcher" maps to conform to it - or, perhaps easier, split them into multiple smaller maps.
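(To put rough numbers on why partitioning doesn't help here, using an assumed per-mobj record size of about 155 bytes, approximating vanilla's saved thinker records:)

```python
# The savegame caps the level as a whole: splitting a map into regions
# changes visplane counts, but not the total written on save.
SAVEGAMESIZE = 0x2C000        # 180224 bytes: vanilla's static buffer
MOBJ_RECORD = 1 + 154         # assumed bytes per saved thinker

ceiling = SAVEGAMESIZE // MOBJ_RECORD
print(ceiling)                # 1162 mobjs, before any geometry is even counted
```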


Maybe I was not clear. We should have maps run in vanilla, while ignoring the savegame limit.

All ports would be able to play and save in freedoom. Only people that use a commercial dos .exe with freedoom would have savegame issues, which seems kind of silly.

Graf Zahl said:

Those which exceed the limits do it so badly that they'll have to be completely torn apart.


Not really. I was able to take MAP10, which is probably one of the most detailed maps in the game, and make it run in vanilla without too many compromises. If you have the Visplane Explorer plugin for Doom Builder and ChocoRenderLimits for testing, it's not too difficult to make most maps work. IIRC I did the conversion in two evenings.

I agree that dealing with the savegame buffer would be excessive. We have tools for testing for visplane and seg overflows, yet progress is slow. Savegame buffer overflow is hard to test for, and that would only reduce motivation to get the conversion done.


I agree with fraggle here. Even if the savegame buffer is taken into account, it shouldn't be taken too seriously ATM. Also, the DOS exes won't be able to load Phase 1 and 2 anyway, because the names of both IWADs are 9 characters long, officially.

Voros said:

the DOS exes won't be able to load Phase 1 and 2 anyway, because the names of both IWADs are 9 characters long, officially.


You'd need to rename the iwads to doom.wad or doom2.wad, which is also quite silly. So ya, freedoom working in all ports is a fine target.

Edited top post with general conclusion.

Jewellds said:

Not really. I was able to take MAP10, which is probably one of the most detailed maps in the game, and make it run in vanilla without too many compromises. If you have the Visplane Explorer plugin for Doom Builder and ChocoRenderLimits for testing, it's not too difficult to make most maps work. IIRC I did the conversion in two evenings.



It always depends on how the maps exceed the limits. I had a look at a few. MAP02 should be relatively easy to fix, but with MAP0h, for example, I really see no chance of doing it without tearing out the center section or doing some major redesigns on it. Or take MAP24: no chance that's ever going to work in vanilla.

fraggle said:

Can we stop arguing about this? The decision's been made; I don't see any good coming from endlessly bringing it back up all the time.


Actually, no. Some people here think - and rightfully so in my opinion - that the decision to go vanilla is stupid and self-destructive.

I'd agree that the small amount of Boom features in these maps is not really needed, but I'll repeat, as long as this goes on, that with the existing material, targeting limit-removing is the best that can be achieved.

But I agree with your milestones.
First, the last empty map slot needs to be filled so that the games are at least complete.
The next milestone should be a limit-removing release; call it 0.9 if you will.
Only after that should vanillafication be undertaken, and only make a new release once everything has been hacked apart. There's no point compromising a working limit-removing release unless all maps work with vanilla. In any case, that final limit-removing release should be kept, best with a download counter so you guys can see how much players appreciate your hack work! :D

Graf Zahl said:

Actually, no. Some people here think - and rightfully so in my opinion - that the decision to go vanilla is stupid and self-destructive.

I don't doubt that you believe that, but regardless of how destructive you may think it is, it's far more destructive to have this argument dragged up again and again, derailing every single thread. If you don't have anything constructive to contribute, if all you're going to do is stand at the sidelines yelling about the project being "hack work", I'd like to request that you stop posting in this forum.


If the maps are going to be truly vanilla, including the savegame buffer limit, they would have to be toned down even more.

Wouldn't it be a better idea to put a warning message in the README file? The problem seems specific to vanilla/Chocolate Doom only. That way, players will know how to overcome it if they're playing on vanilla/Chocolate Doom.


Just for comparison and your information: in BTSX we ignore the savegame buffer limit. There's that dynamic nature of its possible occurrence, but mostly it's just way too restrictive for the mappers. Basically, mapping stops being fun altogether, heh. IIRC even some Doom 2 maps can break it?

This topic is now closed to further replies.