
dpJudas

Members
  • Content count

    267
  • Joined

  • Last visited

5 Followers

About dpJudas

  • Rank
    Member


  1. dpJudas

    Programming Languages in 2021

    You actually want them to hate your choices. As a C++ developer, the day everyone else stops hating me is the sign that C++ is done and everyone has moved on. ;)
  2. dpJudas

    Programming Languages in 2021

    When people sigh about Java it isn't because "Java is slow". It has to do with what it takes to run a Java application: the JRE needs to be installed. For most types of applications it doesn't really matter how fast your language is. Also, since it sounds like you are relatively new to programming, keep in mind that developers invest a lot in the technology stack they use. That means you will almost always hear people from other stacks complain, no matter what you use. If you use C++ you'll hear the C# dudes sigh. If you use C#, the Python devs will sigh. If you use Python, the Java devs will sigh. And so on. That's just how the tech world works. Don't let the peer pressure of other developers dictate which tools you use. Pick whatever you find to be the best solution to the problem at hand. :)
  3. This isn't true. Copyright doesn't mean you can't copy an idea; that's a patent, which is an entirely different thing. Copyright means you can't take the original implementation of an idea and duplicate it or create derivatives of it. The reason Wine developers won't look at the Windows source code is simply that, if they have looked at it, it becomes harder in court to argue that their source code ended up similar by coincidence. This is particularly important when reimplementing an API, where there are only so many ways you can code something.
  4. dpJudas

    What does your desk space look like?

    Here's my current setup - compared to most other pictures here it is clear I need to add some personality to it :)
  5. dpJudas

    GZDoom Software Rendering Question

    The easiest way for you to test those fixes is to download the latest nightly build from https://devbuilds.drdteam.org/gzdoom/ - all those changes will eventually make it into the next release.
  6. dpJudas

    GZDoom Software Rendering Question

    The dancing sprites effect is caused by rounding errors when sampling from textures or when deciding which screen pixels are covered by a wall/flat/triangle. ZDoom has (or had? can't remember if my fix made it into ZDoom itself) those rounding errors; I fixed them back in the day, and then apparently managed to break it again in that commit. In ZDoom it wasn't only the sprites that were affected, by the way. The midtextures in particular had a tendency to wrap too early. The same thing happened with flats (the teleporters made out of 4 textures at the end of E1M8 are the most obvious example). The general stability of the picture when moving around is also much better without those rounding errors. In short: this bug could probably be fixed by adding or removing a +0.5f in a single place somewhere in the sampling or screen column/row code. Maybe one of the active GZDoom developers will do that, though I don't think you should put too much hope into that. Most of them only seem to be interested in the hardware renderer.
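The pixel-center idea above can be sketched in a small column sampler. This is illustrative only (the names and signature are not GZDoom's actual code): evaluating the texture gradient at `y + 0.5f`, the center of each screen pixel, keeps the texel choice stable as geometry moves, which is what removes the "dancing" artifact.

```cpp
#include <vector>

// Hypothetical column sampler: maps screen rows [y0, y1) onto a texture
// column of height tex_height. Sampling at the pixel *center* (y + 0.5)
// instead of the pixel edge stabilizes which texel each row picks.
std::vector<int> sample_column(int y0, int y1, float v_start, float v_step, int tex_height)
{
    std::vector<int> texels;
    for (int y = y0; y < y1; ++y)
    {
        // +0.5f: evaluate the gradient at the center of screen pixel y
        float v = v_start + (static_cast<float>(y - y0) + 0.5f) * v_step;
        int texel = static_cast<int>(v) % tex_height;
        if (texel < 0) texel += tex_height;
        texels.push_back(texel);
    }
    return texels;
}
```

Removing the `+0.5f` here is exactly the kind of single-place change the post describes: the sampler still works, but texel boundaries then coincide with pixel edges and wobble under movement.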
  7. So you keep saying. Yet, the last time I tested those claims I found no memory leaks at all on my system. Either you're using some setup that doesn't match mine (read: the bug report was not reproducible), or the memory leak has since been fixed. In any case, I added a ccmd that can output the memory usage according to the Vulkan memory allocator if you truly want to prove there is one.
  8. That was done on purpose to make custom GLSL shaders written for OpenGL compatible with the Vulkan backend. The RHI was mainly written with OpenGL in mind, which meant I had to either do something like this or make extensive improvements to the RHI.
  9. Most Doom fans in the community aren't developers, so of course they can't go into technical details. Nevertheless, several people in the community have attempted over time to address the same issues you are battling here. Will you do it better? Who knows, but don't assume that Chocolate Doom is the state of the art when it comes to improving Doom's performance. That's not at all the focus of that port. Also please keep in mind that computers in 2020 are significantly different from what they were in 1993. That no source port did the transpose is mainly because using the GPU to rotate the final frame buffer image wasn't a real option in the golden age of Doom software renderer source ports. However, that still doesn't mean they all perform as badly as your Chocolate Doom numbers indicate. Take ZDoom for example: randi spent quite some time optimizing the drawer functions there to write 4 columns at a time (DWORDs instead of bytes). That port also memory-aligned its frame buffer for Pentium-era cache lines. Now don't get me wrong - I don't want to take away your thunder for being the first guy to actually implement the transpose, because that really is cool. And fixing the span drawer is an important quality fix. The real issue here isn't so much the precision of the span renderer, but rather that the Doom renderer never properly implemented drawing at pixel centers. The GZDoom drawer, for example, doesn't need to self-correct after N pixels at all; the errors that creep in aren't enough for the artifacts to show. I don't have the link, but there's an image somewhere here on Doomworld that shows dancing sprites in older versions of (G)ZDoom - all that stuff happens if pixels aren't clipped and sampled at pixel centers.
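The transpose being discussed can be sketched as a final pass: the renderer draws into a column-major buffer (each texture column contiguous in memory, which is cache-friendly for Doom's vertical wall drawers), then flips it to the row-major layout the display expects. This is a minimal illustration, not any port's actual code:

```cpp
#include <cstdint>
#include <vector>

// Transpose a column-major 8-bit frame buffer (width columns of height
// pixels each) into the row-major layout used for presentation. On
// modern hardware this final rotation could also be done on the GPU.
std::vector<uint8_t> transpose(const std::vector<uint8_t>& src, int width, int height)
{
    std::vector<uint8_t> dst(src.size());
    for (int x = 0; x < width; ++x)
        for (int y = 0; y < height; ++y)
            dst[y * width + x] = src[x * height + y]; // column-major -> row-major
    return dst;
}
```

The naive double loop shown here touches `dst` with a large stride; a production version would typically transpose in cache-sized tiles, but the idea is the same.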
  10. The span drawer in GZDoom also uses 32 bits for the sampling coordinates. I also fixed the sampling to be done at the pixel center. If I remember correctly, Eternity uses floats for the sampling coordinates.
  11. About the backbuffer transpose, I wonder how much that would affect the bottleneck in the GZDoom software renderer. Right now the drawers don't really seem to be the performance bottleneck in GZDoom, at least not if you increase the resolution to 4K. Even though I went from a Haswell i7 CPU (4c/8t) to a Threadripper (32c/64t), the frame time stayed virtually the same. Right now it seems that drawer setup slows it down more than anything. If you're lucky you'll be less impacted by this in vanilla Doom, because there are fewer features there than what ZDoom supported. For very complex scenes the BSP traversal and sprites become the main bottlenecks. You can multithread the BSP by splitting the frame buffer into multiple subsections of the scene, reducing the field of view for each thread. The sprite performance can be improved by not always calculating the top/bottom clipping lists from scratch.
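The screen-splitting idea can be sketched as follows: divide the frame buffer's columns into one vertical strip per thread and let each thread render only the part of the scene covering its strip (effectively a narrower field of view per thread). `render_strip` is a hypothetical stand-in for a real per-strip renderer:

```cpp
#include <atomic>     // used by the example below
#include <functional>
#include <thread>
#include <vector>

// Split screen columns [0, screen_width) into num_threads vertical
// strips and render each strip on its own thread. Strips don't overlap,
// so the workers write to disjoint parts of the frame buffer.
void render_parallel(int screen_width, int num_threads,
                     const std::function<void(int x0, int x1)>& render_strip)
{
    std::vector<std::thread> workers;
    for (int i = 0; i < num_threads; ++i)
    {
        int x0 = screen_width * i / num_threads;
        int x1 = screen_width * (i + 1) / num_threads;
        workers.emplace_back(render_strip, x0, x1);
    }
    for (auto& t : workers)
        t.join();
}
```

For example, counting the columns each strip covers with a `std::atomic<int>` confirms the strips tile the screen exactly once. The real work, of course, is making the BSP walk for each strip cheap by culling against the strip's narrower view frustum.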
  12. dpJudas

    Your favorite game engine?

    The Unreal 1 engine and the Unreal Tournament game that went along with it. Shame Epic never open-sourced the engine or that game.
  13. I rarely need to use tools like address sanitizers, as they typically only come into play for C code or when the code never gave any proper ownership to allocated data. In quality C++ codebases such things are handled automatically by unique_ptr and shared_ptr in 99% of all cases. That's not to say the tool isn't incredibly useful whenever that situation arises, but still, I'd say I need to set a breakpoint 1000 times more often than I track down a memory misuse like that. In fact, I can't remember the last time I had to debug a use-after-free bug. Even buffer overruns are very rare the way I code things in 2020 - I haven't had one yet this year.
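The ownership point above in a minimal sketch (the `Texture` type and function names are made up for illustration): with `unique_ptr` the allocation has exactly one owner, is freed automatically when that owner goes out of scope, and use-after-free or double-free bugs become hard to write by accident.

```cpp
#include <memory>
#include <string>
#include <utility>

struct Texture
{
    std::string name;
    explicit Texture(std::string n) : name(std::move(n)) {}
};

// The caller receives sole ownership; no manual delete is ever needed.
std::unique_ptr<Texture> load_texture(const std::string& name)
{
    return std::make_unique<Texture>(name);
}

// Takes a reference: borrows the texture without taking ownership,
// so it cannot leak or free it.
std::string describe(const Texture& tex)
{
    return "texture: " + tex.name;
}
```

Because ownership is expressed in the types, passing a `Texture&` around is visibly a borrow, and transferring ownership requires an explicit `std::move` of the `unique_ptr`.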
  14. Lua is essentially just JavaScript with an uglier syntax. IMO its only selling point is that it's easier to embed.
  15. I completely agree that new code should use 'enum class', and all my own new code does that. The issue here is having a tool waste my time with stuff that barely improves the quality of the code at all. In Visual Studio it even highlights it as if it were an actual error. Basically, we have errors, which the compiler knows are always wrong. Then we have 4 levels of warnings with an increasing chance of false positives. Then after *that* we have the linter, with mostly useless stuff about theoretical problems in the code. At some point it just isn't worth it anymore. Now, thanks to the linter, the team doesn't spend its time discussing the real problems to solve anymore and instead debates whether we should type "NULL" vs "nullptr", "==" vs "===", "enum" vs "enum class".
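For readers unfamiliar with the 'enum class' point being debated, a minimal illustration (the enum names are made up): a plain enum implicitly converts to int and leaks its enumerators into the enclosing scope, while an enum class does neither, so mixing unrelated enums becomes a compile error instead of a silent bug.

```cpp
enum RenderMode { Software, Hardware };  // old style: Software/Hardware are global names
enum class Backend { OpenGL, Vulkan };   // scoped: must write Backend::Vulkan

// Plain enums convert to int implicitly, so this compiles without a cast.
int legacy(RenderMode m) { return m; }

// enum class values don't convert implicitly; the cast must be spelled out.
int scoped(Backend b) { return static_cast<int>(b); }
```

Calling `legacy(Hardware)` compiles because of the implicit conversion; writing `scoped(1)` would not, which is exactly the safety 'enum class' buys.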