
Graf Zahl


Posts posted by Graf Zahl

  1. Ah, ok, I missed the part where the flags are being translated. It was a bit off the path where I was looking. This already clears things up a lot.

    Seeing how your checker works, I don't think you can simply go 32 bit, as you want to check the higher bits as well.


    That just leaves some of the flags in the first word.


    22 minutes ago, kraflab said:

    Also agree we need 64 bit to cover the extra mbf flags. I'd say that's an oversight in the spec and not intended. Assuming we limit the flag options based on the ones that make sense, 32 bits is probably still enough for the field in dehacked at least.


    No idea how you want to handle it. Currently the args array is 'long', which means the size is not consistently defined - it's 32 bit on Windows but 64 bit on Linux. If it's supposed to be 64 bit it needs to be 'long long'. Considering how your code performs the checks you definitely need 64 bit words here to do it right.


    But the flag functions truncate the value to 32 bit, losing all the upper flags, even if the args were 64 bit, so this definitely needs to be fixed.
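
    A minimal C sketch of the size issue described above. The flag name, bit position and helper function here are purely illustrative, not taken from any port's actual headers:

    ```c
    /* Why the flag words need a fixed 64-bit type: 'long' is 32 bits on
       Windows (LLP64) but 64 bits on Linux (LP64), so anything above bit 31
       can be silently lost. The names below are illustrative only. */
    #include <stdint.h>
    #include <stdio.h>

    #define MF2_EXAMPLE_HIGH (UINT64_C(1) << 35)  /* hypothetical high MBF21-style flag */

    static int flags_set(uint64_t actor_flags, uint64_t mask)
    {
        return (actor_flags & mask) == mask;
    }

    int main(void)
    {
        uint64_t flags = MF2_EXAMPLE_HIGH;
        uint32_t truncated = (uint32_t)flags;  /* what a 32-bit code path sees */

        printf("64-bit check finds the flag: %d\n", flags_set(flags, MF2_EXAMPLE_HIGH)); /* 1 */
        printf("flag survives 32-bit truncation: %d\n", truncated != 0);                 /* 0 */
        return 0;
    }
    ```

    Using `uint64_t` from `<stdint.h>` instead of `long` sidesteps the platform data-model difference entirely.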


    Plus handling MF_NOBLOCKMAP and MF_NOSECTOR properly. AFAIK this can cause crashes if you clear these without unlinking the actor in the process.
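
    A sketch of the safe pattern, modeled on how the Doom source unlinks and relinks things around position changes (P_UnsetThingPosition / P_SetThingPosition). The types and functions here are simplified stand-ins, not the actual engine code:

    ```c
    /* Safe handling of MF_NOBLOCKMAP / MF_NOSECTOR: unlink the actor from
       the blockmap/sector chains before changing these flags, relink it
       afterwards. Everything here is a simplified stand-in; the bit
       positions are made up for the example. */
    #include <stdint.h>
    #include <assert.h>

    #define MF_NOBLOCKMAP (UINT64_C(1) << 4)   /* illustrative bit positions */
    #define MF_NOSECTOR   (UINT64_C(1) << 3)
    #define LINK_FLAGS    (MF_NOBLOCKMAP | MF_NOSECTOR)

    typedef struct {
        uint64_t flags;
        int linked;  /* stands in for the blockmap/sector links */
    } mobj_t;

    static void P_UnsetThingPosition(mobj_t *mo) { mo->linked = 0; }
    static void P_SetThingPosition(mobj_t *mo)   { mo->linked = 1; }

    /* Change flags; relink only when one of the linking flags is touched. */
    static void A_ChangeFlags(mobj_t *mo, uint64_t set, uint64_t clear)
    {
        int relink = ((set | clear) & LINK_FLAGS) != 0;

        if (relink) P_UnsetThingPosition(mo);
        mo->flags = (mo->flags | set) & ~clear;
        if (relink) P_SetThingPosition(mo);
    }

    int main(void)
    {
        mobj_t mo = { 0, 1 };
        A_ChangeFlags(&mo, MF_NOBLOCKMAP, 0);
        assert(mo.linked == 1);            /* chains stay consistent */
        assert(mo.flags & MF_NOBLOCKMAP);
        return 0;
    }
    ```

    In the real engine, P_SetThingPosition inspects the flags to decide which chains to link into, which is exactly why flipping them without the unlink/relink pair corrupts those chains.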




  2. As I am currently implementing the standard in GZDoom I'd like to give some feedback.


    The vast majority of features were no problem, but there's one thing in here that I have to consider a major portability hassle for an open standard, and that's the three functions for setting and checking flags (A_JumpIfFlagsSet, A_AddFlags and A_ClearFlags).


    Here's a list of problems with this; some are only relevant for ports that changed things under the hood, others are general issues applying to all ports:


    * the set and clear functions make no checks for special flags like NOBLOCKMAP and NOSECTOR. Just flipping these will most definitely cause problems with actors not being unlinked/relinked into the proper chains.

    * No checks are being done to mask out undefined/non-universal flags. The functions just perform a binary comparison of the flag words, requiring each port that wants to implement this feature to replicate these words bit by bit. Take MF_SLIDE: ZDoom removed it long ago because it was not used for anything useful. MF_TRANSLUCENT and MF_TRANSLATION are other examples - this is not how more advanced ports handle translucency. It's very limiting, and if a port handles these features differently, having the flags among the checkable ones may be a major robustness issue, resulting in undefined behavior of the A_Jump... function.

    * The args are only 32 bit, so this covers all the Heretic flags in the second flag word but only half of the MBF21 flags that would be of actual interest. It also misses several important high flags in the first flag word.


    With this in mind, is there still a chance to revise these 3 functions so that they work in a truly portable fashion? That would mean only allowing a well-defined set of flags to be checked - one that other ports can easily implement without compromising their implementation of the underlying features - and making sure the functions can actually access all relevant flags instead of missing a significant portion by being unable to check the higher bits.
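
    One possible shape for such a revision - a sketch only, with made-up flag names and bit positions - would be to define the spec-level flags as their own namespace and let each port translate them to its internal representation:

    ```c
    /* Sketch: expose only a well-defined, spec-level flag set and translate
       it to the port's internal flags. All names and bits are made up. */
    #include <stdint.h>
    #include <assert.h>

    /* spec-level flags (stable across all ports) */
    #define UDB_SHOOTABLE (UINT64_C(1) << 0)
    #define UDB_FLOAT     (UINT64_C(1) << 1)
    #define UDB_ALL       (UDB_SHOOTABLE | UDB_FLOAT)

    /* this port's internal flags (free to differ per port) */
    #define MF_SHOOTABLE  (UINT64_C(1) << 2)
    #define MF_FLOAT      (UINT64_C(1) << 9)

    static uint64_t translate_flags(uint64_t spec)
    {
        uint64_t internal = 0;
        spec &= UDB_ALL;                   /* mask out anything undefined */
        if (spec & UDB_SHOOTABLE) internal |= MF_SHOOTABLE;
        if (spec & UDB_FLOAT)     internal |= MF_FLOAT;
        return internal;
    }

    int main(void)
    {
        /* undefined bits are dropped instead of leaking into port state */
        assert(translate_flags(UDB_FLOAT | (UINT64_C(1) << 40)) == MF_FLOAT);
        return 0;
    }
    ```

    With a translation layer like this, a port that dropped MF_SLIDE or handles translucency differently simply maps those spec flags to nothing (or to its own mechanism), instead of being forced to mirror another port's bit layout.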



  3. 2 hours ago, Dragonfly said:

    If the wad/mod is worth playing, people can and will go the extra mile to set up the environment to play it, provided there's clear / easy enough instructions to follow.


    I doubt it. The last time this happened for me was Caverns of Darkness. I thankfully declined to play it with the semi-broken DOS EXE it came with; fortunately I was able to come up with an alternative solution.

    Now what would a technically less knowledgeable user do if faced with a setup they do not like? Not even the best mod in the world would help if the obstacles are too high - and that's a totally subjective assessment.


  4. No. My guess is that most regular people who want to play Doom on a modern system will Google for "Play Doom on Windows/Mac/Linux" and be guided to some tutorial to set up a source port - or use the Unity version right away if they do not own the game yet.


    DOSBox is arcane for those who grew up on modern computers and have no interest in technical solutions. They may unknowingly use it if they get a predefined setup, but they most likely won't be able to create one themselves.


    But even if you gave one to them, the first question you are likely to get is "How can I set this to a higher resolution?" Since the answer is "You can't", they either stop right there or look for a better solution (read: a source port that best matches their interests).


  5. 7 hours ago, Blzut3 said:

     Users don't read anything, so I would personally say being transparent to the user is more important, but that's just me. 

    I don't think that this kind of user will ever use DOSBox and Doom.exe to begin with - they'd probably settle on something easier to handle...


  6. Sorry, Lila Feuer, but that's classic armchair advice you were giving here.


    Where I work there's one guy whose sole responsibility is to think up new test cases and run our software through all sorts of ludicrous scenarios. But he's doing nothing else the entire day, 5 days a week, 52 weeks a year minus vacation.


    But for something that's ultimately a hobby project, with a limited time budget and too few people working on it, that's totally impossible, even on a smaller scale. We inevitably have to depend on users reporting when they see something wrong.



  7. 1 hour ago, NY00123 said:

    I know that the latter approach is being used for Commander Keen mods. Of course, it's practically impossible to properly support all possible patches in a re-implementation (like Commander Genius) or a source port, not without more-or-less running the original exe in an emulator. A small, finite subset of patches can still be supported, of course, but there will obviously be more.


    Binary patches of executables always come with a risk, though, and that's not limited to exploits. Just have a look at Quake 2. Thanks to its DLL interface, any port switching to a different architecture other than 32 bit Windows is left in the cold with many of those old user mods which shipped their own game DLL but didn't bother including the source.


  8. 30 minutes ago, Redneckerz said:

    Calling it a Exe hack in the DoomWiki defined sense would mean it would actively tamper with Doom's internal works. Doom-ACE does not really do that to begin with: It exploits a flaw, but it does not actively change anything within Doom's internals.


    If it didn't "change any of Doom's internals" it would not work. The entire purpose is to do that! And in this regard it is completely irrelevant whether you apply it statically to the .EXE file or through an exploit at run time.

    Seriously, the only new thing here is the delivery method. Patching an executable to do things it was never made for is an ancient concept - remember Entryway's limit-expanding modified EXE?

  9. 3 hours ago, Maes said:


    How? Isn't Chocolate Doom's purpose to be "as close as possible to the original executable's experience", going to extra lengths as to include Dehacked support, preserving limitations/bugs and even emulating bugs if need be, for the sake of demo compatibility (albeit not to the extreme extents of prBoom+)?


    Then maybe we need "Real Chocolate Doom": just like Chocolate Doom, but, you know, really real ;-)



    It's more a question of feasibility. The way this thing works, you need to create machine code that's hardwired to the DOS EXE's intricacies. Trying to execute that code in a different context would be a gargantuan effort that'd only be worth it if there was a genuine gain - and a handful of projects isn't really it. My guess is that the scope of these changes will be on a scale where it is easier to reverse engineer them to source code (unless provided directly, of course) and either integrate that code directly into the port or script it in a port capable of scripting, rather than trying to run the actual machine code.






  10. Obviously any port trying to support things like this would have to check for the overflow condition and deal with it on its own terms instead of letting it run amok on the system.

    But the real problem lies elsewhere. This would only work on 32-bit x86, which is not really a future-proof platform. To support it on other architectures you need an emulator - and not only an emulator, but also a code checker that blocks malicious attempts to access functions and system calls that can cause real damage. It also needs total knowledge of where functions start, especially in cases where jumps into the middle of a function are performed. The amount of work needed can quickly exceed any reasonable effort. So sorry for anyone dreaming of getting this to work with more modern ports: it won't happen.

  11. 15 hours ago, seed said:

    If it wasn't for GZDoom's very annoying RNG - yes for me it is annoying because I can immediately spot the discrepancy, the player deals less damage and I think the enemies do too - it would take the #1 spot for me, but for that reason alone I tend to prefer more conservative ports like Eternity and PrBoom.


    I'm sorry to tell you, but you must really be imagining things here. In the past I've run some tests on both GZDoom's RNG and Boom's - the same one as in its child ports - and while the actual sequences are not the same, the resulting distributions are very similar. Both are a lot more random than Doom's original one.


    15 hours ago, seed said:

    If vanilla/Boom RNG or a variation thereof ever gets introduced to replace the current method, I'd be sold for good.

    I think what you are really noticing is that GZDoom does not use a single RNG but hundreds - a lot more than even Boom - and as a result you get poor sequences for some events.

    If you really want, I can do a test build for you that switches back to Boom's RNG, which ZDoom had been using in the beginning, and maps all calls to a single one for SP games. It might be interesting to hear your impressions.
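
    For illustration, a Boom-style generator is a simple linear congruential generator returning bytes - roughly like the sketch below. The multiplier and increment are quoted from memory and may not match any port exactly; treat them as an assumption:

    ```c
    /* Sketch of a Boom-style LCG returning bytes, plus a crude uniformity
       check. Constants are quoted from memory, not verified against any
       port's source. */
    #include <stdint.h>
    #include <stdio.h>

    static uint32_t rng_seed = 1993;

    static int boom_style_random(void)
    {
        rng_seed = rng_seed * 1664525u + 221297u;  /* assumed constants */
        return (rng_seed >> 20) & 255;             /* take the high bits */
    }

    int main(void)
    {
        int counts[256] = { 0 };
        for (int i = 0; i < 256 * 1000; i++)
            counts[boom_style_random()]++;

        int min = counts[0], max = counts[0];
        for (int i = 1; i < 256; i++) {
            if (counts[i] < min) min = counts[i];
            if (counts[i] > max) max = counts[i];
        }
        printf("min=%d max=%d (roughly 1000 each if well distributed)\n", min, max);
        return 0;
    }
    ```

    The distribution of any single such generator is close to uniform; the difference a player can perceive comes from which game events share a sequence, not from the generator itself.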

  12. If you look at the entire Blood source you will see this attitude everywhere - the map format is different (hopelessly convoluted, with bit fields for properties, and encrypted on top of that), the palette management does not use the standard features, several engine functions are overridden with their own, courtesy of the linker prioritizing their own content, and so on.


    Although with the file system the actual reason was that GRP support was added to the engine relatively late, and since the Blood team had already set themselves up to use their stupid resource indices (another thing that makes it very hard to replace stuff), they just circumvented the GRP code - which, admittedly, was a piece of shit all of its own.




  13. 1 hour ago, NoXion said:

    How do you think the Amazon/Google model compares with the Microsoft one? One of my friends has an Xbox Game Pass subscription that he claims gives him cheap access to new(ish?) games, most of which he has no real desire to hold onto. I don't think he would like services like Stadia or this Amazon one, but I don't think I've talked with him about it. Microsoft have at least 10 million subscribers so there's at least a market for it, but a quick Googling seems to imply that Game Pass is a loss maker? I think the recent acquisition of ZeniMax might indicate a longer-term strategy.


    The Microsoft model depends on local play; it's not just an interactive video stream. As such, it's a business model I'd find more interesting than renting DRM-crippled games at full price. At least it is honest about its intentions.


    57 minutes ago, Mr. Freeze said:

    B: Praying that by the time this stuff releases, the internet speeds around the country will have increased so latency is a small factor. 


    Too bad, then, that the speeds that need increasing can't be increased. The problem is not a question of bandwidth - it's a question of how long a single bit from your system takes to reach the server, and as I pointed out, that's subject to physical limitations: the further away the server farm is from the customers, the longer the bits take to travel back and forth, and the more lag the customer will experience.
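
    A back-of-the-envelope sketch of that limit, assuming signals in fiber travel at roughly two thirds of the speed of light (~200,000 km/s) and ignoring routing, queueing and processing overhead entirely:

    ```c
    /* Minimum round-trip time imposed purely by signal propagation in
       fiber (~200,000 km/s, about 2/3 of c). Real latency is strictly
       higher once routing and processing are added. */
    #include <stdio.h>

    static double min_rtt_ms(double distance_km)
    {
        const double fiber_speed_km_per_s = 200000.0;  /* assumed 2/3 c */
        return 2.0 * distance_km / fiber_speed_km_per_s * 1000.0;
    }

    int main(void)
    {
        const double distances[] = { 100.0, 1000.0, 5000.0 };
        for (int i = 0; i < 3; i++)
            printf("%6.0f km away -> at least %4.0f ms round trip\n",
                   distances[i], min_rtt_ms(distances[i]));
        return 0;
    }
    ```

    At 5000 km that's already a hard floor of about 50 ms before the server even starts processing the input - no amount of bandwidth investment changes it.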


  14. 7 hours ago, Artman2004 said:

    Who is the customer base for this, anyways? Why not play on a PC or console?


    Those people who, according to the industry, "have no need for a real computer". The story probably goes like this: "Hey, in order to play new games you constantly have to buy upgrades for your home computer. Why not come to us? All you need is some cheap hardware that's capable of decoding a video stream."


    43 minutes ago, NoXion said:

    I honestly wonder what the real motive is behind these companies pushing cloud gaming. Obviously they must have done some market research, but since we're not privy to their findings, we can't honestly be sure about what exactly were the goals and motivation behind that research. So the idea that there must be some genuine market for this kinda crap seems like a specious argument.


    "If there is no market, we will create one."

    Sometimes there's a genuine cloud-cuckoo-land mentality out there; it's very clear that these IT guys often have no idea how real people tick.

    I think a big part of the master plan is to take control of as much as possible before our dear politicians finally wake up and try to combat the monster they had allowed to grow.


    But seriously, there's definitely a market for gaming subscriptions. The problem is that streaming is not the solution for it because it simply cannot work due to the technical limitations.



  15. 4 minutes ago, Ajora said:

    I don't disagree. Just wanted to provide the other side of the argument. And thanks for shedding some light on this discussion. You made some very good points.


    Heh, yes, and it was inevitable that it totally sidestepped the one problem they'll never be able to overcome: you can't beat the hard limits of physics.

    What they totally forget is that unlike movies, gaming is a two-way affair - data goes in and out. With movies it doesn't matter if the data travels for a second or two; what is important is that you get an uninterrupted stream. But with games it is of the utmost importance that no time at all is lost in data transmission - which is a physical impossibility.


    1 minute ago, NoXion said:

    They'd never let cloud gamers face off against PC or console gamers. The cloudy lot would be getting their arses kicked consistently.


    Yup, most definitely.