ketmar

k8vavoom: no good thing ever dies!


@Remilia Scarlet and sector damage is implemented too. eh, you gave me a hard time by setting suit leakiness to 128: i didn't notice it, and spent almost an hour trying to understand why my leaking code was "not working right". ;-)

4 minutes ago, ketmar said:

and sector damage is implemented too. eh, you gave me a hard time by setting suit leakiness to 128: i didn't notice it, and spent almost an hour trying to understand why my leaking code was "not working right". ;-)

lol sorry about that.  Figured it was part of the whole damage code, so I should set it to something.

2 minutes ago, Remilia Scarlet said:

lol sorry about that.

ah, it's ok, i'm not complaining. it is good that you included that flag, because my leaking code was indeed broken. i am just making fun of my "selective blindness". ;-)


Hi! K8vavoom used to have breaking bugs on Arch Linux (everything black & white, that kind of stuff), but now it's working nicely :D.
Do you have any wads to recommend where its lighting system shines the most?

8 hours ago, Baron Pampa said:

Do you have any wads to recommend where its lighting system shines the most?

it highly depends on what you mean by "shines". ;-) good lighting is not really noticeable unless you're specifically looking for it. for example, Silent Steel is done with *a lot* of lights, but they're quite natural, and you may not even realise that some lighting is not done with the map/textures. that's what i call Great Lighting. ;-)

 

basically, most maps with dynamic lighting (GZDoom ones) will look... interesting. and you may try all of @Gunrock's (and his brother's ;-) maps: Braham Manor, Dark Wispers, Storage Area 32, etc...


and i fixed save/load with long texture names. it should work now. plus, some other save/load bugs are no more. old saves shouldn't be broken.


i accidentally looked into the network code, and rewrote a big part of it. multiplayer still sux (there is no client-side prediction; the player moves only when the server acks the movement), but i was able to play "localhost 2plr coop" over several maps. that is, mp is close to a "playable over LAN" state. i won't enable it yet, tho, because it still needs a lot of work (yet you can run it from the CLI if you know all the magic cvars), but hey, i said that multiplayer is not dropped, and i still stand by my word! ;-)

 

at least you can connect mid-game, because Vavoom is real client/server (and dedicated server can be compiled again). and no desyncs, of course (the server is the authority).
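for the curious, the "player moves only when the server acks the movement" model above can be sketched like this (a toy Python illustration, not actual engine code; all names are made up):

```python
# Without client-side prediction, input goes to the server, and the local
# player position only changes when the authoritative reply comes back.

class Server:
    def __init__(self):
        self.pos = 0

    def process(self, delta):
        self.pos += delta              # the server is the authority; no desyncs
        return self.pos                # reply with the authoritative position

class Client:
    def __init__(self):
        self.pos = 0

    def send_move(self, delta, server):
        return server.process(delta)   # round trip to the server...

    def on_ack(self, authoritative_pos):
        self.pos = authoritative_pos   # ...and only then does the player move

srv, cl = Server(), Client()
cl.on_ack(cl.send_move(+8, srv))
print(cl.pos)  # 8 -- visible movement lags the input by one round trip
```

the round trip is exactly why it feels laggy over the internet, and why client-side prediction (applying the move locally and reconciling later) is the usual next step.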

 

don't expect a working mp mode soon, tho: it needs *a lot* of love first. but we're moving towards being Teh Best MP sourceport! ;-)

 

stay with us, we have plans!

 

p.s.: no, you cannot run nuts.wad over the network. don't even ask for it.


1 hour ago, Remilia Scarlet said:

Once I have time to make a map for it.

the longer you wait, the more bugs will be fixed, and the more features will be implemented. just don't wait for +INF time! ;-)

2 hours ago, ketmar said:

i accidentally looked into the network code, and rewrote a big part of it. multiplayer still sux (there is no client-side prediction; the player moves only when the server acks the movement), but i was able to play "localhost 2plr coop" over several maps. that is, mp is close to a "playable over LAN" state. i won't enable it yet, tho, because it still needs a lot of work (yet you can run it from the CLI if you know all the magic cvars), but hey, i said that multiplayer is not dropped, and i still stand by my word! ;-)

That's good news! By the way, remember the TDBots? Yeah, turns out coop works with them in K8Vavoom so that's something that i'll definitely put out a new version for.

Also, did K8Vavoom get much slower for anyone else? It's been a while since i last updated, but now it's super stuttery for some reason. Not so much low framerates as stuttering (at 35 fps :).

11 minutes ago, -TDRR- said:

That's good news! By the way, remember the TDBots? Yeah, turns out coop works with them in K8Vavoom so that's something that i'll definitely put out a new version for.

i also fixed a major fuckup in `A_LookEx()`. it should be in the next build.

 

11 minutes ago, -TDRR- said:

did K8Vavoom get much slower for anyone else? Been a while since i last updated, but now it's super stuttery for some reason

that's strange. i am pretty sure i didn't change anything related to that. actually, it should be smoother now. you may try turning off the bloom effect; it may cause some slowdowns. but that's basically all that was added...

 

p.s.: tbh, i am slowly deprecating GPUs without OpenGL 3.3 support. not that i am happy with that decision, but i really need to streamline the rendering code, and that is almost impossible with 2.1. there's only so much time and so many resources i can put into k8vavoom (not more than 12-15 hours per day ;-), and i have to choose. moving to a newer OpenGL will allow me to spend less time maintaining the renderer code (and i'll be able to add more features, but that is not the main reason, just a coincidence ;-). for now it means that i am paying much less attention to properly supporting 2.1 GPU specifics. that may cause some slowdowns too...


1 hour ago, ketmar said:

that's strange. i am pretty sure i didn't change anything related to that. actually, it should be smoother now. you may try turning off the bloom effect; it may cause some slowdowns. but that's basically all that was added...

That's not it; i disabled it and performance didn't improve much. I should probably fiddle some more with the other settings, but the ones i'm using now were fine for vanilla maps previously (50fps+). The last build i used before this 14/02/20 one was from very late 2019, almost the end of the year. Maybe my laptop is just too crappy lol.

12 minutes ago, -TDRR- said:

That's not it; i disabled it and performance didn't improve much. I should probably fiddle some more with the other settings, but the ones i'm using now were fine for vanilla maps previously (50fps+).

very, very strange. if you manage to find some option that affects it, that would be great. i did some little changes here and there in the OpenGL code, but they should *improve* performance. and i optimised the VM code too.

 

you can try setting "dbg_world_think_vm_time 1" to see how long the VM takes, and "dbg_vm_disable_thinkers 1" to almost completely disable the VM, to check if the VM is to blame.

 

p.s.: i am really sorry that k8vavoom became worse for you. this is something i didn't plan... yet. ;-)


I attached a shot of a room that's particularly lag-spiky/stuttery/whatever; this is on Doom 2 MAP20: "Gotcha!".

 

The VM reports 4-5 "time taken" when staring at the cyber and mastermind, and 3-5 when not facing them, at which point FPS rises to 60. Disabling thinkers makes the game run at a rock-solid 60fps. For some reason, it didn't report 4-5 when first facing the monsters, only a while after (this is why the screenshot shows VM: 3)

EDIT: You mentioned that you are deprecating GL2.1 support; however, my GPU is fully GL4.0 compliant, so it's not that i won't be able to run K8Vavoom, i just won't be able to run it smoothly :p

Regardless, i like playing around with K8Vavoom's lighting, and it running stuttery won't stop me from doing that. I hope the transition goes well! :D

shot0004.png


8 hours ago, -TDRR- said:

I attached a shot of a room that's particularly lag-spiky/stuttery/whatever; this is on Doom 2 MAP20: "Gotcha!".

hm. now, this is an "i absolutely don't know" kind of bug. ;-) i cannot even make a guess at what could be wrong there.

 

8 hours ago, -TDRR- said:

VM reports 4-5 "time taken"

those are rounded milliseconds. ;-) so it doesn't really matter if it is 3 or 5; two milliseconds don't make a huge difference.

 

but... 60 FPS w/o thinkers? it means that rendering takes almost the whole frame budget, and VM time matters... this is something that absolutely should not happen if your GPU is capable of *any* hw acceleration! ;-)

 

8 hours ago, -TDRR- said:

You mentioned that you are deprecating GL2.1 support, however my GPU is fully GL4.0 compliant so it's not that i won't be able to run K8Vavoom

there is a huge difference between declaring support for newer OpenGL, and actually supporting it. ;-) i mean, some GPUs (hello, intel and amd!) can straight out lie, and even when they don't, the things that should be performant with newer OpenGL are freakin' slow...

optimisations for OpenGL2 and OpenGL3.3+ are different (usually because they assume different GPU generations). i think you know all that, but i cannot stop myself from lecturing. ;-)

 

i guess you cannot run it with the shadow volume renderer?

 

also, are you playing with vsync on? if yes, can you try to turn it off?

 

8 hours ago, -TDRR- said:

Regardless, i like playing around with K8Vavoom's lighting, and it running stuttery won't stop me from doing that. I hope the transition goes well! :D

thank you! i am really sad that i broke it for you, but i haven't the slightest idea of what i did wrong... this is prolly one of those "cannot ever happen, because it is totally impossible" bugs. and i don't even know where to start with it.

 

p.s.: i really need to add profiling tools to the engine. this is something i have been planning to do for years, but... you know how it usually goes with Good Plans. ;-)

5 hours ago, ketmar said:

but... 60 FPS w/o thinkers? it means that rendering takes almost the whole frame budget, and VM time matters... this is something that absolutely should not happen if your GPU is capable of *any* hw acceleration! ;-)

It's an Intel Baytrail, which is absolute garbage in all honesty. By the way, disabling VSync helped, but only a tiny little bit.

 

5 hours ago, ketmar said:

there is a huge difference between declaring support for newer OpenGL, and actually supporting it. ;-) i mean, some GPUs (hello, intel and amd!) can straight out lie, and even when they don't, the things that should be performant with newer OpenGL are freakin' slow...

optimisations for OpenGL2 and OpenGL3.3+ are different (usually because they assume different GPU generations). i think you know all that, but i cannot stop myself from lecturing. ;-)

Yeah, i know that. This Intel definitely has slightly slower GL2 than GL3 (as if its GL2 wasn't already slow enough), but it's definitely not enough of a difference to cause all this stuttery junk. I just played through Doom 2 MAP26, by the way, and it stuttered only a teeny tiny bit; it appears that places with setups like Gotcha! are just really heavy for whatever reason.

 

5 hours ago, ketmar said:

thank you! i am really sad that i broke it for you, but i haven't the slightest idea of what i did wrong... this is prolly one of those "cannot ever happen, because it is totally impossible" bugs. and i don't even know where to start with it.

 

p.s.: i really need to add profiling tools to the engine. this is something i have been planning to do for years, but... you know how it usually goes with Good Plans. ;-)

No big deal, guess i'll just have to avoid some maps! Just hope you can get those profiling tools done in case someone else has an issue like mine :)

 

By the way, hope it's not bothersome, but it would be nice if you could ping/tag/mention me when the new update releases. I don't wanna miss it like i did with a couple ones in the past!

2 hours ago, -TDRR- said:

I just played through Doom 2 MAP26, by the way, and it stuttered only a teeny tiny bit; it appears that places with setups like Gotcha! are just really heavy for whatever reason.

i suspect lightmaps then, because that is the only "advanced" thing left. also, please note that i sometimes change cvar names (usually when i need to unconditionally change a default, or when a cvar's meaning changes radically), so you may want to re-check your options.

 

2 hours ago, -TDRR- said:

Just hope you can get those profiling tools done in case someone else has an issue like mine :)

ah, i need them prolly more than anyone else! but putting profiler calls all around the code, and creating profiling UI/reports is soooo boring... ;-)

 

2 hours ago, -TDRR- said:

By the way, hope it's not bothersome, but it would be nice if you could ping/tag/mention me when the new update releases. I don't wanna miss it like i did with a couple ones in the past!

no problem, i put a reminder in my changelog file; let's hope i won't forget it. ;-)

 

(whispering) spam Linguica with requests for a separate subforum for k8vavoom, and we'll get proper release topics! ;-)


oh... i am still at the net code. (and i just wanted to fix ONE bug there!)

 

i see what Janis wanted to do with it, but i cannot make heads or tails of the way he intended to implement it. the ack/resend logic seems to be completely broken: it could only work by sheer luck (some key packets should never be lost). absolutely not a problem on localhost, of course, but a complete disaster for any real-case scenario.

 

for example, the net layer tries to "pack" several smaller data chunks into one bigger network packet (roughly up to MTU size), and "i got it" aka "ack" is sent for the whole big packet. which is a smart way to avoid flooding the network with acks, but... it doesn't work. the game opens a "virtual channel" for each entity it has to process. the netcode does this by setting a "this is the first data chunk for the virtual channel" flag. so far, so good. but what if we get some data for the virtual channel first, and only then get the "open the channel" request? as the channel is not open, the first data will be discarded. but it will still be "acked", because we ack only the one huge packet, not individual data chunks. then we get the "open channel" request, but we will never get the data that we dropped, because the other side is sure that we already collected and processed it. this won't happen on localhost, but it will happen often over the internet. oopsie.
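a toy model of that bug, just to make the sequence concrete (entirely hypothetical Python, nothing here is real engine code):

```python
# Chunks for a virtual channel are bundled into one packet; the receiver acks
# the WHOLE packet, even though it silently drops chunks for channels that
# aren't open yet. The sender then never resends the dropped chunk.

class Receiver:
    def __init__(self):
        self.open_channels = set()
        self.delivered = []

    def on_packet(self, chunks):
        for chan, kind, payload in chunks:
            if kind == "open":
                self.open_channels.add(chan)
            elif chan in self.open_channels:
                self.delivered.append((chan, payload))
            # else: data chunk silently dropped -- but the packet is still acked!
        return "ack"  # one ack for the whole packet, not per chunk

rx = Receiver()
# The "open" chunk for channel 7 arrives in a LATER packet than its data:
ack1 = rx.on_packet([(7, "data", "spawn imp")])  # dropped, yet acked
ack2 = rx.on_packet([(7, "open", None)])         # channel opens too late
# The sender saw ack1, so "spawn imp" is never resent:
print(rx.delivered)  # [] -- the data is lost forever
```

on localhost, packets never reorder, so the "open" chunk always lands first and the bug stays invisible; with real internet reordering or loss, it bites.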

 

of course, i can create workarounds for every corner case i know of, but this won't last long, because different corner cases will pop up again and again. looks like i have to stop trying to patch that code, and move to the drawing board to redesign the whole thing from the ground up.

 

again, the ideas are right, and the foundation is good; Janis didn't do anything stupid. it just looks like he wrote the first draft of the code, and moved on to other things, prolly with the intention to fix it all later. and now it's all on me. and i am not even playing MP games, and never wanted to. sigh.



eh... while i am working on the new network code, here's a new build for you all. i wanted to delay it until you could play at least LAN games, but that may take a while, so here it is!

 

Spoiler

* a lot of internal code cleanups (as usual), not noticeable to the end user (but it still adds a line to the changelog)
* GZDoom MODELDEF fixes: "ZOffset" corrections, correctly using (or not using) actor pitch and roll angles
* hackfix for GZDoom HUD weapon models: for some reason most of them have a negative x scale; wtf?!
* fixed HUD weapon model rendering with non-default FOVs and non-zero pitch
* "gimme ammo" now gives ammo only for the weapons you have in your inventory; use "gimme ammo full" to get all possible ammo
* "gimme weapons" doesn't give weapons without slot number assigned; use "gimme weapons full" to get all possible weapons
* added "gimme currammo" cheat to give ammo only to the active weapon
* added support for wall and flat sprites
* DECORATE fixes (paren-less function calls in expressions, added `GetCrouchFactor()`)
* implemented "DontThrust", "NeverTarget", and "SeeInvisible" DECORATE flags (i should use three bullets for this!)
* implemented "ProjectileKickback" DECORATE property
* some fixes in graphics lump searching (some overridden picture lumps went unnoticed)
* parse MAPINFO "DamageType" sections
* some fixes in FONTDEF parser (added support for "SpaceWidth" property, fixed some typos)
* some DEHACKED fixes (more ammo types, allow weapon with no ammo, codepointer autorouting to decorate actions)
* "nointermission" MAPINFO flag should skip everything, not only stats screen
* various actor optimisations (some decorations become "notick" when idle)
* fixed MAJOR logic bug with ambushed monsters in `A_LookEx()` (such monsters could be totally blind sometimes)
* added options to turn off "you found a secret" message and the corresponding sound (thanks, Remilia Scarlet)
* band-aid hacks to support textures with long names in animdefs and ACS (thanks, Remilia Scarlet)
* added `A_SetSpecial()` DECORATE action
* fixed segfault when moving an entity destroyed by "ThingRemove" line special
* forgot to add CVAR_Archive attribute to "sv_pushable_barrels" (thanks, plums)
* implemented UDMF sector damage properties (thanks, Remilia Scarlet)
* fixes to envirosuit leaking logic (it really leaks now! ;-)
* alias model textures should be repeated (fixes some broken texturing on MD3 models)
* added gameplay option to control tossing of dropped items
* fixed ACS `ChangeLevel()` processing (extra inventory reset after running ENTER scripts, etc.)
* implemented "ProjectilePassHeight" DECORATE property
* recenter UI menus on resolution change
* added support for "defaultterrain" in terrain definitions lump
* fixed rendering bugs with additive alias models in shadow volume renderer
* fixed bug with different left/right (and fwd/back) player movement speeds with non-default walk and run speeds
* implemented missing render styles ("stencil", "shadow", "shaded", "subtractive")
* externalised detection of some known wads (the engine can adjust itself to some of them; previously it was hardcoded)
* greatly reduced occasional static lightmap artifacts (removed spurious unlit surface parts; it now looks like spurious AO, lol); this should be properly fixed in the future, of course
* fixed wrong translation of exit/secret_exit line specials -- they should always use zero location (thanks, steinkrauz)
* implemented flags argument for `A_SelectWeapon()`
* added `ExplosionDamage` and `ExplosionRadius` DECORATE vars
* moved `Stamina` and `Accuracy` properties from `Player` to `Entity` (and implemented DECORATE/ACS accessors)
* fixed bug in "monster_dropoff" (and turned it on by default); fixed wall bouncing (i hope)
* fixed two typos in UDMF parser (thanks, Khorus)

 

@-TDRR- ping! ;-)



so, i completely rewrote the network communication layer yet again. it seems that Janis modelled it after Unreal (that complex logic with channels, RPCs, reliable/unreliable packets -- it is all how Unreal does its business). i had two choices here: either nuke all the nice code Janis wrote and switch to q3-like snapshots, or try to finish the existing implementation.

 

of course, the snapshot approach looks easier, but hey, we have all that replication info, and remote procedure calls in VavoomC; i really don't want to throw that away! so i read about Unreal networking several times and more, and wrote what i believe is a similar implementation.

 

fun fact: Janis was *almost* there; the code only had some small bugs in the acking logic, and missed channel saturation checking. scary fact: i don't know if the Unreal network design is copyright-protected. i mean, i am using names and such very similar to Unreal's (because there's a lot of info on Teh Internets about that), and it looks like even such small things can attract lawyers (hi, oracle!). ;-) not that i think epic will come after me for that, but it is still a somewhat... uneasy feeling.

 

now, something more interesting: i tested the new implementation with a 56K bandwidth limit (those old phone line modems, if somebody remembers 'em ;-), and with ~30% packet loss (emulated, of course; the emulation sux and fails to capture the real packet loss picture, but it is still better than nothing), and the game is playable on Doom2 MAP01 with monsters. of course, without client-side prediction it stutters like hell due to all the lost packets, but i still managed to play through 2.5 maps. of course, this is far from what can be considered "good network code", but i believe that in the next build you'll be able to play on LAN, and maybe even over a good and fast internet connection. there's still A LOT of work to do yet, but it seems that we have a solid *and* *working* foundation at least.
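the loss emulation can be as dumb as randomly dropping packets before the real send. a hypothetical sketch of that idea (only the ~30% figure is from the post; the function and names are made up):

```python
import random

def lossy_send(send_fn, packet, loss_rate=0.30, rng=random):
    """Drop roughly loss_rate of all packets before the real send."""
    if rng.random() < loss_rate:
        return False  # packet "lost on the wire"
    send_fn(packet)
    return True

sent = []
rng = random.Random(42)  # fixed seed, so the example is reproducible
delivered = sum(lossy_send(sent.append, i, 0.30, rng) for i in range(1000))
print(f"delivered {delivered} of 1000")  # roughly 700 with ~30% loss
```

this is also why such emulation "fails to capture the real packet loss picture": real-world loss is bursty (whole runs of packets vanish together), not uniform per packet.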



just tried to play over the internet. with ~55 msec lag it is quite playable. sure, you have to get used to a little input lag (something i completely forgot about with Vavoom ;-), but i was able to play MAP01-MAP03, kill monsters, collect pickups, manipulate doors and switches, and such. now, i think this is *much* better than nothing at all. the most important thing here is that i am quite confident in what we have for the communication layer now, and i can build other features on top of it.

 

i will prolly enable MP in the next build. expect the MP protocol to change with each new build, tho (it has a built-in version check, at least), so don't set up your dedicated servers yet (yep, the dedicated server is working again too ;-).



and now i've stumbled across a mysterious bug. after some time playing (a random time, it seems), the network communication comes to a halt. neither server nor client has its bandwidth saturated, there are no channel overflows; it just mysteriously halts, and then times out. this is something that cannot happen at all. "solid foundation", haha. craptastic code! and we're back to square one.

 

that's why i haet networking.


Just use IPoAC to carry HTCPCP packets and it'll work fine.


*Cough* Anyway... I have some thoughts on the engine!  I noticed there's an option to control Lost Soul transparency... could there be an option for Specter transparency as well?  I'm used to having hard-to-see specters like in GZDoom and vanilla, but others might not feel the same.

Also, the sliders to control the random pitch don't seem to work.  I have them both turned all the way down to try and disable that, but I still get random pitches.  Is this a bug?

2 hours ago, Remilia Scarlet said:

could there be an option for Specter transparency as well?

currently it is hardcoded in the decorate (lost souls have a dedicated render style, though). sure, i can introduce one more render style for Spectres. i'll prolly do that, why not? ;-)

 

2 hours ago, Remilia Scarlet said:

Also, the sliders to control the random pitch don't seem to work.  I have them both turned all the way down to try and disable that, but I still get random pitches.  Is this a bug?

yes. i never bothered to change those, and the logic of pitch application is completely broken. currently, "pitchrange" from sndinfo overrides any user settings instead of serving as a hint to the sound system. and all the default sndinfos have those ranges set, of course.

 

basically, the "random pitch" slider only has an effect on sounds that should not be pitched at all according to sndinfo. the pitch boost should be able to turn the whole thing off, but... you cannot set it to zero from the menu, lol.

 

so thank you, it will be fixed. i will also add an option to disable any random pitching without moving the sliders.
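the intended behaviour could look something like this (a purely hypothetical sketch of "sndinfo range as a hint, user slider as a scale"; the names and formula are mine, not the engine's sound code):

```python
import random

def resolve_pitch(sndinfo_range, user_scale, rng=random):
    """Random pitch multiplier around 1.0. The sndinfo "pitchrange" is only
    a hint; the user slider scales it, and zero really means 'off'."""
    if user_scale <= 0.0 or sndinfo_range <= 0.0:
        return 1.0  # no random pitching at all
    spread = sndinfo_range * user_scale  # slider scales the hint, never loses to it
    return 1.0 + rng.uniform(-spread, spread)

print(resolve_pitch(0.2, 0.0))  # 1.0 -- slider at zero actually disables pitching
print(resolve_pitch(0.0, 1.0))  # 1.0 -- unpitched sounds stay unpitched
```

the broken behaviour described above is the opposite: the sndinfo range wins over the slider, so turning the slider down does nothing for most sounds.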


ooooh. this is fun. it looks like something is VERY wrong with network+light sources. static light sources (like candelabras) spawn dynamic lights in network mode (in addition to static lights), and dynamic lights with shadows cut network communication. but in a very strange way: ONLY if shadows are enabled, and `send()`/`recv()` are still called. and the client still sends keepalive packets. but for some reason it stops receiving packets from the server, and the server stops receiving client packets. at the same time, client FPS is still around 60, which means the client is not slowed down that much. something strange is going on...

 

yeah, candelabras are recognized as static light sources by the engine, so they should never spawn dynamic lights, in any case. and even with a lot of dynamic lights, we still have stable FPS, which means the network communication code is still called at least 60 times per second (as it should be).

 

but at least this is something i can work with: now i have something to debug!

 

p.s.: "normal" dynamic light sources (like barrels, or pickups) are ok, totally no slowdowns at all.


Thanks ketmar for the new build. Testing it now with the upcoming "Dissolution: Remastered Edition" ;)

2 hours ago, Gunrock said:

"Dissolution: Remastered Edition"

great! ;-) feel free to report everything wrong, as usual. i love it when mappers report bugs: i can request test maps from them! ;-)

 

i still have a major rendering bug with 3d floors and water, tho. but only rendering, coldet is ok. ;-)

 

also, i still hope to make your DM maps usable too! ;-) if you look at the commit history, you'll see that the last few weeks were spent mostly on network code. that's what can happen when you only wanted to fix one small bug.


and the hiccup/disconnect bug seems to be connected with... now, try to guess it... the GPU. at least this is what i've figured out so far: when both client and server are writing a lot of logs to a terminal emulator, and the GPU is busy with something fillrate-consuming (like shadow volumes), it locks up terminal updates. and the engine cannot proceed, because it wants to write its logs to the terminal. boom! hiccup. then the GPU releases the terminal, the engine eventually catches up with the logs, but sometimes it is too late (timeout), and sometimes it is just a huge stutter.

 

so, i turned off tty logging, and saw no more hiccups in that light-heavy room. sometimes the bug is not where you think it is. ;-) it doesn't mean that the network comm layer is perfect, but at least *this* bug seems to be not mine. ;-)
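for illustration, the usual cure for "the engine blocks on a slow tty" is to push log lines through a queue serviced by a background thread, so only that thread ever waits on the terminal (a generic sketch, not k8vavoom's logging code):

```python
import queue
import threading

log_q = queue.Queue()

def log_worker(write):
    """Drain the queue; only THIS thread blocks if the terminal stalls."""
    while True:
        msg = log_q.get()
        if msg is None:  # sentinel: shut the worker down
            break
        write(msg)

out = []  # stand-in for a (possibly slow) terminal
t = threading.Thread(target=log_worker, args=(out.append,))
t.start()

for i in range(3):
    log_q.put(f"frame {i}")  # the game loop never waits on the tty

log_q.put(None)
t.join()
print(out)  # ['frame 0', 'frame 1', 'frame 2']
```

the trade-off is that log output can lag behind the frames that produced it, and an unbounded queue can grow if the terminal stalls for a long time.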


so, i implemented a hack: if the client is starving for packets (i.e. is going to time out), the engine temporarily caps its FPS at a low value (around 28 for now). this magically cured all the disconnects: once the client starts to hiccup, it sees the "dangerous timeout", degrades FPS, and yay! the communication is alive again. so this is *definitely* my GPU driver doing weird things. and it has nothing to do with my terminal emulator: writing excessive logs in the terminal emu just consumes more GPU resources, so it starts doing weird things sooner. i tested with logs enabled (the Very Bad Situation that caused disconnects earlier), and FPS throttling helps there too.
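the throttle itself can be a one-liner decision per frame. a hypothetical sketch (only the ~28 FPS cap is from the post; the timeout, the 90 FPS figure, and the "half the timeout" danger threshold are invented for illustration):

```python
def frame_cap(secs_since_last_server_packet,
              timeout=3.0, normal_fps=90, starved_fps=28):
    """Cap the client frame rate when the connection is close to timing out,
    so the GPU backs off and the net layer can catch up."""
    in_danger = secs_since_last_server_packet > timeout * 0.5
    return starved_fps if in_danger else normal_fps

print(frame_cap(0.1))  # 90 -- healthy connection, run at full speed
print(frame_cap(2.0))  # 28 -- packet starvation: degrade FPS until packets flow
```

once packets arrive again, `secs_since_last_server_packet` resets and the cap lifts on its own, which matches the "the communication is alive again" behaviour described above.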


What terminal emulator are you using?  If you're using one that's using compositing, there's a slim chance that might be affecting it.  Not to mention, not all terminal emulators are exactly fast...

This topic is now closed to further replies.