Harha

chocolate-doom in vmware debian linux host


So, I set up a Debian 8 x64 OS in VMware with a MATE desktop environment. Everything works fine: I cloned the latest chocolate-doom source from the GitHub repo, compiled it with the latest GCC and installed it. It works, but there's one slight problem. The framerate is crap...

I have an Intel Core i7-4790K 4.0 GHz processor and a 'slightly' older ATI HD 4890 GPU. The virtual machine itself runs absolutely fine; I gave it 4 CPU threads and 4 GB of memory, and I have all the VMware drivers installed and so on, so 3D graphics or anything like that shouldn't be an issue. So I really can't understand why a simple game like Doom would lag... The framerate hovers around 10-25, changing constantly. Audio works fine. I'd guess that chocolate-doom is purely software-rendered, but even so, it should run absolutely fine... :/

Does anyone know what could be causing this? I'm quite sure it's VMware's fault. I'm planning to install Debian on my main computer after doing some more testing in this virtual machine, so I'd like to know whether this is Debian/MATE's fault or a fault in the virtualization tech.

Here's a video demonstrating the framerate: https://dl.dropboxusercontent.com/u/5184285/out-1.ogv

I've been asking around in VMware's and Debian's IRC channels, but nobody knew what caused it. Chocolate-doom feels way more slick/responsive in my host OS, which is Windows 7 x64...

chungy

Virtual machines are rarely optimized for good 2D performance (and Chocolate Doom currently has no OpenGL capability). You might try booting your VM in VirtualBox and seeing how that fares (Linux is fairly forgiving about drivers and hardware changing, but you might still want to clone the vmdk just in case). I very much doubt your hardware is at fault.

If you really want to, though, you can always run Debian from a CD or USB stick without having to install it: https://www.debian.org/CD/live/

Ladna

It really shouldn't be that bad. I can run Pr+ inside VirtualBox on a Windows Host/Arch Linux guest and get over 60 FPS in FreeDM's first map (640x480). Core i5 2520M. My guess is bad Linux drivers, or you haven't installed the VirtualBox client extensions.

Linguica

My uneducated guess is that it has to do with the fact that Chocolate Doom, and the vanilla renderer in general, is extremely cache-dependent. I'd suggest downloading PrBoom and trying both the software and the hardware (OpenGL) renderer. If hardware works fine but software is slow as a dog, then that's probably the culprit.

fraggle

Hard to diagnose, but I remember getting poor performance in the past when using a compositing window manager. Maybe try a simpler window manager; if the problem goes away, you'll at least know what's causing it.

Harha
chungy said:

Virtual machines are rarely optimized for good 2D performance (and Chocolate Doom currently has no OpenGL capability). You might try booting your VM in VirtualBox and seeing how that fares (Linux is fairly forgiving about drivers and hardware changing, but you might still want to clone the vmdk just in case). I very much doubt your hardware is at fault.

If you really want to, though, you can always run Debian from a CD or USB stick without having to install it: https://www.debian.org/CD/live/

I never said my hardware was at fault; I said that VMware's virtualization tech apparently doesn't support software-rendered 2D graphics that well. And yeah, I will surely try out the live CD before installing the OS. :P

Ladna said:

It really shouldn't be that bad. I can run Pr+ inside VirtualBox on a Windows Host/Arch Linux guest and get over 60 FPS in FreeDM's first map (640x480). Core i5 2520M. My guess is bad Linux drivers, or you haven't installed the VirtualBox client extensions.

I'm not using VirtualBox, I'm using VMware, which according to my research/tests performs a lot better at everything. The virtual machine is silky smooth, while in VirtualBox there are random jitters and other irritations, present in any distro, even in a minimalistic Arch Linux / LXDE setup I tried. I'll try some OpenGL port, I guess; it will most likely run fine.

Linguica said:

My uneducated guess is that it has to do with the fact that Chocolate Doom, and the vanilla renderer in general, is extremely cache-dependent. I'd suggest downloading PrBoom and trying both the software and the hardware (OpenGL) renderer. If hardware works fine but software is slow as a dog, then that's probably the culprit.

I'll do just that, thanks.

fraggle said:

Hard to diagnose, but I remember getting poor performance in the past when using a compositing window manager. Maybe try a simpler window manager; if the problem goes away, you'll at least know what's causing it.

I'll probably give this a try too. I'll report back here when I'm done. ;P

fabian

Just as an aside, I experience the same thing when I play Doom Retro in a VirtualBox VM that runs Windows 7 64-bit and is hosted by a Debian unstable system. The game is extremely slow, close to unplayable, whereas the native version compiled on Linux runs flawlessly.

Harha
fabian said:

Just as an aside, I experience the same thing when I play Doom Retro in a VirtualBox VM that runs Windows 7 64-bit and is hosted by a Debian unstable system. The game is extremely slow, close to unplayable, whereas the native version compiled on Linux runs flawlessly.

VirtualBox is pretty bad anyway. I think you'd have a much better experience with, for example, QEMU.

Anyway, I compiled and tried out PrBoom, and it runs silky smooth, really smooth. So I guess it's just a matter of VMware not liking the way chocolate-doom renders things, for some reason. Both the software-rendered and the GPU-rendered versions of PrBoom performed pretty much identically.

fraggle
Harha said:

Anyway, I compiled and tried out PrBoom, and it runs silky smooth, really smooth. So I guess it's just a matter of VMware not liking the way chocolate-doom renders things, for some reason. Both the software-rendered and the GPU-rendered versions of PrBoom performed pretty much identically.

That's really puzzling, especially as PrBoom and Chocolate Doom both use SDL, so there's no underlying difference there that could be the cause. Software vs. GL differences were going to be my next suggestion of things to check.

What happens if you try:

chocolate-doom -timedemo demo2

What FPS do you get?

bradharding said:

Could this be fixed by playing around with values for the SDL_VIDEODRIVER environment variable (as described here)?

No, because Harha is using Linux, and x11 is the only real driver he/she should be using (there are others, like svgalib, that can be used without X, but they aren't a good idea to use).

Harha
fraggle said:

That's really puzzling, especially as PrBoom and Chocolate Doom both use SDL, so there's no underlying difference there that could be the cause. Software vs. GL differences were going to be my next suggestion of things to check.

What happens if you try:

chocolate-doom -timedemo demo2

What FPS do you get?

timed 2347 gametics in 199 realtics (412.788940 fps)
:D What the hell. I don't understand this anymore...

fraggle

Interesting.

Were you using PrBoom or PrBoom+? If the latter, are you running with an uncapped frame rate?

If you run Chocolate Doom with -devparm, how many dots do you tend to see at the bottom left of the screen?

One thing that Chocolate Doom does is call SDL_Delay() between frames so that it doesn't eat up the CPU (the amount of CPU required to run Doom is minuscule on modern machines). I'm wondering if the actual delay perhaps ends up being much longer than usual because you're running in a virtual machine.

Harha
fraggle said:

Interesting.

Were you using PrBoom or PrBoom+? If the latter, are you running with an uncapped frame rate?

If you run Chocolate Doom with -devparm, how many dots do you tend to see at the bottom left of the screen?

One thing that Chocolate Doom does is call SDL_Delay() between frames so that it doesn't eat up the CPU (the amount of CPU required to run Doom is minuscule on modern machines). I'm wondering if the actual delay perhaps ends up being much longer than usual because you're running in a virtual machine.

I can't access my PC atm, but when I can, I'll try to fiddle around with the main loop and rendering loop and see if I can make it work, if that -devparm doesn't do anything.

I wouldn't count on SDL_Delay(Uint32 ms) being accurate, though.

Share this post


Link to post

Assuming you have compiled your own version of Chocolate Doom: in src/d_loop.c, there's this line in TryRunTics():

        I_Sleep(1);

Comment that out and see if it makes any difference.

Harha said:

see if I can make it work, if that -devparm doesn't do anything.

-devparm won't fix anything, but I'm curious how many dots you see.

Harha

I was using PrBoom, not the plus version, with default settings, just with the resolution set a bit higher than normal.

I commented out that line, did make uninstall and make clean, then make and make install, and tried the game again. It seems to have reduced the lag a bit. It's still noticeable compared to the performance my host OS delivers, but it's better. I haven't looked into the Doom source that closely, but it looks to me like that loop is intended to fix some timing problems: it triggers whenever the timing is off, sleeps 1 ms, and repeats until the problem is fixed. I honestly didn't have enough interest to check whether it's being triggered multiple times per frame in my VMware setup or something like that, but I guess that's what you're wondering?

I also tried out -devparm (before I recompiled with the small source edit) and got these dots.

fraggle

Looks like you're at one dot; it doesn't jump up to two or three dots at all? Maybe you can record a video like the one in the OP.

Is there any noticeable difference in the dots between having the I_Sleep line enabled and commented out?

One dot means you're running at maximum frame rate, so if it's steady at one and not jumping higher at all, then it would seem to be working as intended. Not that I'm saying there's no bug to investigate here, but it does make the situation harder to understand.

Linguica

You are aware that Chocolate Doom purposely runs at only 35 FPS, right? I hope it's not the case that this whole time you've just been wondering why it doesn't run at 60 FPS or what have you.

Harha
fraggle said:

Looks like you're at one dot; it doesn't jump up to two or three dots at all? Maybe you can record a video like the one in the OP.

Is there any noticeable difference in the dots between having the I_Sleep line enabled and commented out?

One dot means you're running at maximum frame rate, so if it's steady at one and not jumping higher at all, then it would seem to be working as intended. Not that I'm saying there's no bug to investigate here, but it does make the situation harder to understand.

Removing I_Sleep(1); does make a difference gameplay/FPS-wise, but nothing in the dot section. The dot section always seems to report full framerate, though it behaves slightly differently (the first dot is blinking); still, just one dot.

Linguica said:

You are aware that Chocolate Doom purposely runs at only 35 FPS, right? I hope it's not the case that this whole time you've just been wondering why it doesn't run at 60 FPS or what have you.

Yeah, obviously I'm aware of this, and I think you could see that from my previous posts. The program constantly running at the full 35 frames per second doesn't always mean the visual feedback you see in your window matches it. For some reason VMware causes this: the game runs completely fine at 100% speed in the background but displays the output framebuffer poorly.

https://dl.dropboxusercontent.com/u/5184285/chocolate-doom.webmhd.webm

There's a video showing how the framerate differs between the Debian VM and Windows 7. The difference isn't big anymore, since removing I_Sleep did help a bit, but it's still noticeable.

https://dl.dropboxusercontent.com/u/5184285/out%20%282%29.ogv

There's a video showing that the virtual machine can actually output smooth gameplay just fine, this time using PrBoom+ with default settings.

Anyway, as interesting as this might be, I'm sick of trying to fix it. I tried playing around with the rendering loop with no luck. I think it's SDL being incompatible with the VMware stuff somehow.

Edit: Weirdly, PrBoom seems laggy in that video. :D That's just because the recording software I'm using on Linux isn't capturing at the full fps at all; Fraps, which I used for the chocolate-doom video, records at 60 fps. Anyway, chocolate-doom on Windows feels as smooth as PrBoom or PrBoom+ in the Linux virtual machine, which is what bothers me.

fraggle
Harha said:

Removing I_Sleep(1); does make a difference gameplay/FPS-wise, but nothing in the dot section. The dot section always seems to report full framerate, though it behaves slightly differently (the first dot is blinking); still, just one dot.

Interesting. Maybe it could be related to this.

Also related bug.

I think I need to investigate this "double rendering" issue that Linguica brought up more carefully because it may be causing some undesirable side effects.

Harha said:

Anyways, as interesting as this might be I'm sick of trying to fix it.

I totally understand. Thanks for taking the time to patiently answer my questions; I think there may still be a subtle bug here that deserves further investigation and your answers have been very helpful in that regard.

EDIT: I committed a fix for the bug I mentioned above. I'm curious to see whether it fixes your issue as well (i.e. no I_Sleep() tweak needed), but I totally understand if you've had enough of dealing with this issue by now.

