Blondie

Posts posted by Blondie


  1. And since Memfis mentioned it (thanks, Memfis!), could somebody who still has Grazza's "translucent things" DeHackEd patches upload them somewhere, just in case Andrey decides not to implement a config variable? Grazza's old post contains broken links to both: http://www.doomworld.com/vb/showthread.php?s=&postid=593019#post593019

    I still believe that PrBoom needs a way to universally disable transparent sprites internal to the engine, though, because having to use dehacked patches is inconvenient, unreliable, and confusing — particularly to anybody who isn't aware of them.

    Edit: Nevermind. Myk attached the file here: http://www.doomworld.com/vb/showthread.php?s=&postid=871780#post871780


  2. That would work, although it would be inconvenient. Wouldn't it be easier simply to add a config variable that toggles sprite translucency in software mode? I don't see why this couldn't be done, since transparent sprites are already universally disabled in GL mode.

    Edit: Or you could simply expand the pre-existing compat option "comp_translucency" to apply to all complevels. Either way, I think transparent sprites need to be universally toggleable in the engine itself, for software-mode users with legacy hardware.


  3. This is unrelated to what you guys were discussing, but would it be possible, e6y, to create a config variable to force transparency off for Boom's translucent sprites in software mode? I know that "comp_translucency" technically already does this, but the problem is that it doesn't work at either the Boom or MBF complevels, which are the two complevels I use for playing Boom-compatible WADs.

    The reason I ask is that I play on an old computer with a single-core CPU, and I get severe framerate drops whenever a lot of transparent sprites are on the screen at once, such as when the plasma gun is fired at close range. Transparent sprites are already disabled at all complevels in OpenGL mode due to the sprites/walls sorting bug in GLBoom, but there is no way for me to achieve this in software mode other than to disable translucency altogether, which I don't want to do.

    A config variable like "sprite_translucency," however, would easily solve this problem for those of us who still use legacy hardware and prefer the software renderer. Thanks in advance.
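    For illustration, a toggle like the one I'm proposing could gate the sprite-drawing decision roughly as in the sketch below. This is only a sketch of the idea: "sprite_translucency", the MF_TRANSLUCENT value used here, and the function names are my assumptions, not actual PrBoom-Plus code.

    ```c
    /* Hedged sketch of the proposed toggle: a config variable gates whether
     * a sprite flagged as translucent goes through the translucent blend or
     * is drawn opaque. Names are illustrative, not PrBoom-Plus source. */
    #include <assert.h>
    #include <stdbool.h>

    static bool sprite_translucency = true;  /* would be read from the cfg file */

    #define MF_TRANSLUCENT 0x1               /* stand-in for the real mobj flag */

    /* Returns true if this sprite should use the translucent drawer. */
    static bool use_translucent_drawer(int mobj_flags)
    {
        return (mobj_flags & MF_TRANSLUCENT) && sprite_translucency;
    }
    ```

    Unlike a DeHackEd patch, a gate like this would apply to every WAD at every complevel, which is the whole point of the request.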


  4. entryway said:

    Can't reproduce. Just tried:
    glboom-plus -iwad doom.wad -file pwads/DTWID.wad -deh pwads/DTWID.deh

    Strange. I changed my DOOMWADDIR environment variable path from "C:\Games\DOOM\" (PWADs subfolder) to "C:\Games\DOOM\WADs\" (DTWID subfolder), and I still get the same issue:

    D_DoomMainSetup: Cannot find .deh or .bex file named DTWID\DTWID.deh

    Here are my exact paths and command line as of now:

    DOOMWADDIR environment variable path: "C:\Games\DOOM\WADs\"

    IWADs subfolder path: "C:\Games\DOOM\WADs\IWADs\"

    DTWID subfolder path: "C:\Games\DOOM\WADs\DTWID\"

    Command line: "prboom-plus.exe -iwad IWADs\DOOM.WAD -file DTWID\DTWID.wad -deh DTWID\DTWID.deh -complevel 3"

    This is definitely some sort of bug introduced by 2.5.1.4.test, because I do not encounter this problem with 2.5.1.3. If I remove the .deh file from the command line, the game loads as expected. However, including a .deh or .bex file in the command line via my DOOMWADDIR path results in a crash. I'm using Windows XP SP3, if it makes any difference.


  5. Just wanted to add that this bug appears to affect only environment variables. When loading a .deh file with a full, explicit path, the game launches as normal. However, when I shorten the path to rely on my DOOMWADDIR environment variable, it crashes. I don't experience this issue with either IWADs or PWADs, only DeHackEd patches and Boom extension (.bex) files.
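    For what it's worth, the lookup involved here is just joining the directory from the environment variable with the relative name from the command line, something like the sketch below. This is illustrative C, not PrBoom-Plus's actual code; a regression where this step is skipped for .deh/.bex files (but still performed for .wad files) would match the symptom exactly.

    ```c
    /* Illustrative sketch of a DOOMWADDIR-style lookup: join the directory
     * taken from the environment variable with the relative name given on
     * the command line. Not PrBoom-Plus source code. */
    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Caller frees the returned path. */
    static char *join_waddir(const char *waddir, const char *name)
    {
        size_t len = strlen(waddir) + strlen(name) + 2;  /* separator + NUL */
        char *path = malloc(len);
        snprintf(path, len, "%s/%s", waddir, name);
        return path;
    }
    ```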


  6. Vorpal said:

    Is this a known issue? Or did vanilla simply behave this way also?

    Are you using complevel 11? If so, there is a bug at that complevel introduced by Lee Killough's recoded clipping behavior for MBF (monsters can respawn and resurrect on top of other things). A compatibility option that disables MBF's buggy clipping (comp_moveblock) was later added to PrBoom, but unfortunately, the complevels that acknowledge it also suffer from the ledge compatibility bug.


  7. HackNeyed said:

    I just wanted to point out this annoying visual glitch with the Plasma Gun in hopes it could be fixed, please.

    That's a really old PrBoom bug. If I remember correctly, it was introduced when high color modes were added. It actually affects all weapons, not just the plasma gun. Never_Again and I both reported this to e6y on the PrBoom-Plus bugtracker, but it has never been addressed, because fixing it would involve disabling color modes above 8-bit (I think?).

    The only two resolutions at which this bug does not occur are 320x200 and 640x400. However, the prevalence of the bug varies per resolution, as you've already noted with 640x480. If you want to use non-standard DOOM resolutions in software mode, my only suggestion would be to use a resolution where it is least noticeable and try to ignore it. Personally, I play at 848x480 with the default aspect ratio, and it doesn't bother me too much.

    Here's the link to my bugtracker report: http://sourceforge.net/tracker/?func=detail&aid=3004102&group_id=148658&atid=772943


  8. fraggle said:

    Latest SVN version uses double buffering, and I've also stopped borders from flashing in boxed screen modes.

    Fantastic! It looks like we've identified the culprit behind the reduced framerate at directx boxed fullscreen resolutions: now that only the game image flashes during palette changes, and not the borders, directx 32-bit runs smoothly at all resolutions, including 1280x1024. However, 8-bit modes still flash the borders, and I suspect this could be what causes the aforementioned stuttering in directx 8-bit, although I may be wrong. It's odd, though, that directx 32-bit outperforms windib 32-bit, since it's usually the reverse for me.

    Porsche Monty said:

    What I still don't get is the "slower" part. At least on my 3.0ghz + GF6200 + XP-powered machine, the performance drop is unnoticeable regardless of the resolution. Makes you wonder if it's something only Vista/7 users get.

    Try 32-bit windib instead of directx. On my machine, directx 32-bit performs as expected at all desired resolutions now that fraggle addressed the framerate problem, but windib 32-bit still chokes at high resolutions. I'm running XP32 with a 2.2 GHz AMD Athlon, 2 GB of DDR400 SDRAM, and a Sapphire ATI HD 3850 AGP video card.


  9. Quasar said:

    I had THOUGHT this bug was fixed as of SDL 1.2.14 myself. The suspected culprit code was a bizarre and unnecessary conversion of the existing screen buffer contents into the new palette by use of a closest color match on every pixel whenever the palette was changed, in SDL's code.

    Unfortunately, it's still present, and quite noticeably, in Chocolate Doom's 8-bit windib mode, although the severity appears to differ from setup to setup. In my case, the bug is so pronounced as to render this mode entirely useless, because the constant intermittent pauses during palette changes halt fluid gameplay altogether. The inclusion of a 32-bit mode has at least provided a workaround, though.
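    To show why the conversion Quasar describes was so costly, here's a sketch of that closest-colour remap (my reconstruction of the described behavior, not SDL's actual code): on every palette change, each 8-bit pixel is re-matched against all 256 entries of the new palette.

    ```c
    /* Sketch of the per-pixel "closest colour" remap described above: on a
     * palette change, every framebuffer pixel is matched against all 256
     * entries of the new palette. At 320x200 that is 64000 pixels x 256
     * candidates per flash, which is why rapid palette changes (damage,
     * item pickups) can stall a frame. Illustrative, not SDL internals. */
    #include <assert.h>

    typedef struct { unsigned char r, g, b; } rgb_t;

    static int closest_index(rgb_t c, const rgb_t pal[256])
    {
        int best = 0;
        long best_d = 1L << 30;  /* larger than any possible distance */
        for (int i = 0; i < 256; i++) {
            long dr = c.r - pal[i].r, dg = c.g - pal[i].g, db = c.b - pal[i].b;
            long d = dr * dr + dg * dg + db * db;
            if (d < best_d) { best_d = d; best = i; }
        }
        return best;
    }

    /* Remap an 8-bit buffer from the old palette to the new: O(pixels * 256). */
    static void remap_buffer(unsigned char *buf, long n,
                             const rgb_t oldpal[256], const rgb_t newpal[256])
    {
        for (long i = 0; i < n; i++)
            buf[i] = (unsigned char)closest_index(oldpal[buf[i]], newpal);
    }
    ```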


  10. Porsche Monty said:

    So we have:

    8bit + directx + vsync + boxed resolution = stuttery but full framerate?

    32bit + directx + vsync + boxed resolution = no stuttering but reduced framerate? (-8in32's supposed to be slower, but not anywhere near this much, and normally you shouldn't have to enable it on XP anyways)

    Pretty much, yes, although the framerate is also significantly reduced in 8-bit directx mode at fullscreen boxed resolutions. And vsync can be subtracted from the equation, since I ruled it out after disabling it within my drivers.

    Porsche Monty said:

    Which video card/drivers do you have?

    Sapphire ATI HD 3850 AGP with Catalyst 9.12.

    Porsche Monty said:

    I can only suggest you make sure the refresh rate for whatever resolution you've chosen is 70Hz, that should deal with some of the stuttering.

    Tried it, but to no avail. I'm beginning to suspect that it's something hardware-related.

    fraggle said:

    Chocolate Doom can only scale the screen to a particular set of dimensions (the dimensions of windowed-mode windows). When you choose a fullscreen resolution, it picks the largest scale size that will fit inside the screen. For 1280x1024, that is 1280x1000.

    Ah, that would indeed explain the letterboxing/windowboxing in fullscreen. Thanks for clarifying!
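    fraggle's fitting rule can be sketched like this. The struct and function names are my assumptions, and the candidate list in the usage below is hypothetical, except for 1280x1000, which is the size my stdout.txt reports.

    ```c
    /* Sketch of the fullscreen-fit rule described above: from a fixed set of
     * supported window sizes, pick the largest one that still fits inside
     * the chosen screen mode; the remainder becomes the black borders.
     * Illustrative only, not Chocolate Doom source. */
    #include <assert.h>

    typedef struct { int w, h; } screensize_t;

    static screensize_t largest_fit(const screensize_t *sizes, int n,
                                    int screen_w, int screen_h)
    {
        screensize_t best = { 0, 0 };
        for (int i = 0; i < n; i++)
            if (sizes[i].w <= screen_w && sizes[i].h <= screen_h &&
                sizes[i].w * sizes[i].h > best.w * best.h)
                best = sizes[i];
        return best;
    }
    ```

    For a 1280x1024 screen, a candidate list containing 1280x1000 would select exactly that, leaving the 24-pixel letterbox my stdout.txt reports.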

    fraggle said:

    Which resolutions?

    The only resolutions affected are those that letterbox, pillarbox, or windowbox the game image in fullscreen, specifically 1280x1024, 1400x1050, and 1680x1050. And it only happens when using 8-bit directx mode. Maybe it's specific to my system?

    fraggle said:

    32-bit is always going to be slower than 8-bit, because there's an extra conversion stage that needs to be performed. Best suggestion I can give you is to pick a lower screen resolution.

    Thanks for the suggestion. 800x600 in 32-bit windib mode works perfectly! Of course, it's a non-native resolution for my LCD, but you have to do what you have to do. Just out of curiosity, do you happen to know why 8-bit windib mode still exhibits the glitch that causes the game to briefly lag during palette flashes when grabbing items, taking damage, etc.? I only ask because I get the best performance in that mode, but can't make use of it due to the bug. I assume it differs between setups, depending on complications with SDL.
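    For context, the "extra conversion stage" fraggle mentions amounts to a per-pixel palette lookup on every frame, roughly like the sketch below (illustrative, not Chocolate Doom's actual code). It also suggests why 32-bit mode dodges the palette-lag: a palette change only rewrites the 256-entry table, instead of remapping the whole framebuffer.

    ```c
    /* Sketch of an 8->32 expansion stage: each frame, every 8-bit pixel is
     * expanded to a 32-bit color through a 256-entry palette table. On a
     * palette flash, only the table changes; the steady per-frame cost is
     * this extra pass. Illustrative names, not Chocolate Doom source. */
    #include <assert.h>

    static void expand_8_to_32(const unsigned char *src, unsigned int *dst,
                               long npixels, const unsigned int pal[256])
    {
        for (long i = 0; i < npixels; i++)
            dst[i] = pal[src[i]];  /* one table lookup per pixel, every frame */
    }
    ```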


  11. Porsche Monty said:

    Vsync does not normally cause any framerate issues in Chocolate Doom, not even under Vista/7.

    You're right! I forced vsync off through the drivers (I had forgotten I'd set them to enable vsync whenever the application lacked the option). However, it turns out vsync wasn't the issue: I still get severe stuttering at certain resolutions when using directx. Incidentally, this only seems to occur at resolutions where the image is pillarboxed, letterboxed, or windowboxed within another resolution. -8in32 seems to fix this, but the framerate appears noticeably reduced, even choppy, in 32-bit mode when compared to resolutions that don't exhibit the stuttering issue in 8-bit mode. Also, -8in32 seems to negate vsync entirely, even if I have it forced in my drivers' settings. I'm using XP32, if it makes any difference.

    Porsche Monty said:

    Try -8in32 on the command line. You'll need the latest svn build.

    That corrects both the palette-lag bug and the stuttering bug, as far as I can tell, in both windib and directx. However, like I said, gameplay seems noticeably choppier in 32-bit mode, though I'm willing to make that sacrifice if there is no other way around these bugs. Ironically, it now seems that even directx exhibits the palette glitch if -8in32 isn't used with the latest SVN, whereas it didn't occur before. This effectively forces me to use -8in32 no matter what. Has the latest SVN introduced another bug?

    Porsche Monty said:

    How are you measuring this? The sensible thing to do here would be disabling/enabling "correct aspect ratio" from chocolate-setup.exe, but it could also be your monitor messing with the image.

    Disabling "correct aspect ratio" merely renders the image widescreen instead of 4:3. My monitor isn't the culprit: Chocolate Doom literally forces a 1280x1000 image within a 1280x1024, 1400x1050, or 1680x1050 resolution. During palette changes, the black borders of the letterboxed/windowboxed image illuminate, so the screen technically is running at the proper resolution, just not at the desired image dimensions. Here's what it says in my stdout.txt:

    I_InitGraphics: Letterboxed (1280x1000 within 1280x1024)


  12. Is there any possible way to disable vsync in Chocolate Doom when using directx? I've run multiple tests, and it appears that vsync is always forced in this mode, which is causing serious framerate issues to the point that it's unplayable.

    I can't use windib, because I get that god-awful palette glitch. Also, I've noticed that when attempting to play at 1280x1024, 1400x1050, or 1680x1050, the screen is always letterboxed/windowboxed to 1280x1000. Is there any specific reason for this, or is it a bug?


  13. I've recently attempted to use the latest raven-branch win32 release to play Hexen, but I've encountered a serious issue that renders the port unplayable. Specifically, if I attempt to play in windib mode, I run into the annoying palette-delay bug that bothered so many users until SDL incorporated a fix. The patch resolved the delay for me when using directx, but it still occurs with windib. That alone wouldn't be a problem, except that directx mode exhibits an issue of its own.

    When playing in directx, the game suffers from severe stuttering and slowdown. This isn't exclusive to Chocolate Doom, though, as I've also encountered it in PrBoom-Plus: in that port, enabling vsync while using the directx renderer with the screen multiply function also produces severe stuttering, which is resolved if vsync is disabled. I would therefore assume that something directly related to the way these two SDL-based ports stretch the game image to the desired resolution conflicts with vsync. As far as I can tell, Chocolate Doom's cfg files lack any specific setting to toggle vsync. Is vsync forced in Chocolate Doom when using the directx renderer, and has anyone else encountered this problem? Any help would be appreciated. I'm running XP32, by the way. Thanks!
