Patrol1985

What's the benefit of higher resolution?


Patrol1985 said:

By "stable" I mean "no less than 60". The framerate may be unstable as long as it stays within "60 - infinity" range :P


Let me rephrase: wanting a guaranteed minimum performance in a 3D game with heavily variable visuals, heavily variable internal state, and no self-imposed automatic cutback is impossible. It's easy to tailor the engine to the limits of known, fixed hardware, as happens on game consoles or other custom 3D simulation applications, but this simply isn't done anymore in PC games. The "solution" is simply to throw more hardware at the problem when better PCs become available.

Even if you throw enough hardware at it to make the most complex map known today run at no less than 60 fps all the time (practically capped at 60 Hz, because that's what most monitors can display nowadays), I can always come up with something that pushes the limit a bit (or a lot) farther, so when do we stop?

It might be possible to construct some sort of pathological benchmark map (e.g. one that floods the BSP tree calculations with ridiculously deep recursion that breaks the stack, or one that has 1,000,000 monsters), but would you really want to invest in hardware that's guaranteed to run that at a constant frame rate?


I see what you mean and I'm aware that it works like that, so to further specify what I want:

"I want Doom to run at no less than 60 FPS when playing any official IWAD"

To be honest, I'd like it to render the graphics at 60 FPS with some common WADs too (Hell Revealed, Memento Mori, etc.), but I'm certainly not interested in playing some bullshit WADs devised only to cause trouble (e.g. NUTS.WAD). Those can stay at 1 FPS on my machine for all I care :P

printz said:

For bragging rights. Being able to run software rendering (with a bit of hardware blitting help, yeah) at the likes of ultra-HD is quite a feat for the port author, considering we're counting millions of operations per tic.

At 1920x1080x32-bit (8-bit internal, blit to 32-bit), that's 1920x1080x4 bytes x 35 fps = 290,304,000 bytes/sec.

Now, consider that the 8-to-32 table is being referenced - add another 290,304,000 = 580,608,000.
Then, consider that that data came from somewhere, so add 1920x1080x35 to that = 580,608,000 + 72,576,000 = 653,184,000.
Then, consider that those colors are "looked up", so add another 72,576,000 = 725,760,000 bytes per sec.
Finally, in Windows, that frame is blitted to the actual screen memory, so add another 290,304,000 bytes = 1,016,064,000 = just shy of 1 gigabyte per second.

And that's for 35 fps. For 70 fps, it becomes roughly 2 GB/second.
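
For what it's worth, here's a rough back-of-the-envelope C snippet that just adds up those five memory passes. It's only a sketch: it assumes the cost is pure memory traffic, and the names are made up rather than taken from any actual port.

#include <stdio.h>

int main(void)
{
    const long long w = 1920, h = 1080, fps = 35;
    const long long pixels_per_sec = w * h * fps;      /* 72,576,000 */

    long long write_32bit   = pixels_per_sec * 4;  /* write the 32-bit frame      */
    long long read_lut      = pixels_per_sec * 4;  /* read the 8-to-32 table      */
    long long read_8bit     = pixels_per_sec * 1;  /* read the 8-bit source frame */
    long long read_colormap = pixels_per_sec * 1;  /* colormap lookups            */
    long long os_blit       = pixels_per_sec * 4;  /* OS blit to screen memory    */

    long long total = write_32bit + read_lut + read_8bit + read_colormap + os_blit;

    printf("%lld bytes/sec at 35 fps\n", total);      /* ~1,016,064,000 */
    printf("%lld bytes/sec at 70 fps\n", total * 2);  /* ~2 GB/sec      */
    return 0;
}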

And that does not take into account the fact that the image is composed of scaled textures, with hidden surface removal, in a 3D projection. Then add 44.1 kHz sound generation, player movement and collision detection, monster animation and AI, input controller handling, status bar updates, possible network sync and handling, etc.

Yes, it's quite amazing that it works as well as it does. I fear the day when I bring home one of those 4k monitors :)

Patrol1985 said:

"I want Doom to run at no less than 60 FPS when playing any official IWAD"


That's shit-easy to do. With any Pentium-class machine you already had that in vanilla (well, capped at 35 fps and at vanilla resolution, but you see what I mean).

Patrol1985 said:

To be honest, I'd like it to render the graphics at 60 FPS with some common wads too (Hell Revealed, Memento Mori, etc.)


Yeah, because those WADs are totally at the same complexity level as the official IWADs (though HR is arguably more about monster count than anything). But even those PWADs are, by modern standards, shit-easy to run. They are vanilla PWADs, after all. I do not know how well they would run on a first-gen Pentium, but I seem to recall that at least HR gave even high-end 486s (DX/100 and above) a hard time.

In any case, your bottleneck is wanting to use ridiculously high resolutions (compared to vanilla). FYI, 640x400 requires 4x as much CPU power to render as 320x200 (at the very minimum). 1280x800 would require 16x as much (again, as a minimum). Plus you want more fps (nearly double), so 32x as much raw CPU power. Plus there are all those other factors kb1 mentioned.
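
A quick C sketch of that scaling math, under the simplifying assumption that rendering cost is proportional to pixels drawn per second (which ignores overdraw and per-column overhead):

#include <stdio.h>

/* Cost relative to vanilla Doom: 320x200 at 35 fps. */
static double relative_cost(int w, int h, int fps)
{
    return ((double)w * h * fps) / (320.0 * 200.0 * 35.0);
}

int main(void)
{
    printf("640x400  @ 35 fps: %4.0fx\n", relative_cost(640, 400, 35));   /*  4x */
    printf("1280x800 @ 35 fps: %4.0fx\n", relative_cost(1280, 800, 35));  /* 16x */
    printf("1280x800 @ 70 fps: %4.0fx\n", relative_cost(1280, 800, 70));  /* 32x */
    return 0;
}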

Using OpenGL might not always be an advantage in Doom source ports; depending on the resolution, GPU and drivers, it might even perform worse than the software renderer under certain conditions.

That being said, as long as you stick to 1994-1995 PWADs you should be fine with any computer made in the last 10 years, provided you don't use a particularly unoptimized port or an otherwise broken setup... but of course I know and you know that nobody today would be satisfied with good performance out of vanilla PWADs alone. Sooner or later you'll expect the same performance out of limit-removing maps or something like Deus Vult. What then?


I want those ridiculous resolutions for the same reason as a number of users who replied in this thread: to align nicely with my screen's native resolution without any stretching/cropping, etc.

Personally, I've always been a fan of 800x600 (perfectly sufficient for me), but that's 4:3, and displays aren't really made in that ratio anymore.

Maes said:

Sooner or later you'll expect the same performance out of limit removing or something like Deus Vult. What then?


Like you said - I'll have to throw "more hardware" at the game. No way around it :D


All these strange effects depend upon the port and render mode.
I will have to adapt DoomLegacy to these 16:9 and 16:10 monitors.
This is hampered by only having 1600x1200 monitors here.

So what are the desired behavior(s)? Do they vary between the software renderer and OpenGL?

1. Wider viewport filling the screen.
2. Use the center portion of the screen, no stretching. Really needed for intermission screens, IMO.
3. Use a window with the normal 4:3 proportions (current solution).
4. Keep using 800x600 and similar and let the monitor cope (I doubt this works with the usual widescreen TV people get these days as a monitor).
5. Native modes of widescreen monitors, assuming they cannot change modes.
6. Divided multiples of some popular widescreen modes. Are there any monitors that can actually change to odd modes? Anyone have any mode lists for widescreen monitors?
7. Find the native resolution in use and make the largest window that has the normal Doom proportions, using the current scaling of the software renderer, such as x4 or x5 (see the sketch after this list).
8. Other??

9. All of the above (of course).
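
For option 7, a minimal C sketch of the idea (a hypothetical helper, not actual DoomLegacy code, and assuming the 320x200 base the software renderer scales from): given the monitor's native resolution, pick the largest integer multiple of 320x200 that still fits, then center that window.

#include <stdio.h>

/* Largest integer-scaled 320x200 window that fits the native resolution. */
static void largest_doom_window(int native_w, int native_h, int *out_w, int *out_h)
{
    int scale_w = native_w / 320;
    int scale_h = native_h / 200;
    int scale   = (scale_w < scale_h) ? scale_w : scale_h;
    if (scale < 1)
        scale = 1;
    *out_w = 320 * scale;
    *out_h = 200 * scale;
}

int main(void)
{
    int w, h;
    largest_doom_window(1920, 1080, &w, &h);
    printf("1920x1080 -> %dx%d (x%d)\n", w, h, w / 320);  /* 1600x1000 (x5) */
    return 0;
}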


Back in the day, before every machine had 3D acceleration, playing a game like Doom would show a bunch of swirling pixels as you looked around. Because of the limited number of pixels, they were constantly forced to change what they were representing, causing this distortion. Then came mipmapping and all sorts of texture filtering schemes to try to overcome this, but all these techniques tend to soften the textures and blur them slightly. With a decent resolution like 1280x1024 or so you don't need any texture filtering, so you don't lose the contrast and detail designed into the textures.

That's my take anyway.

rampancy said:

With a decent resolution like 1280x1024 or so you don't need any texture filtering and so don't lose the contrast and detail designed into the textures.


Finding a "good enough" point in visuals to please everyone is a never-ending quest. There are people who aren't pleased even at double that resolution, with 8x antialiasing, and with all filters on.

Now, there's a legitimate reason to use filtering even at high resolutions: certain patterns can cause unwanted visual phenomena (fringes, moire noise, crawling pixels, etc.), plus the eye's ability to spot "pixelated lines" seems to be uncanny, even on 4000x4000 displays up close.

There even was a thread which showed what happens in Doom with a tall, single-textured wall seen from a distance: without filtering, higher resolutions just changed the distances at which the weird optical phenomena appeared, but didn't eliminate them.

Edit: good stuff here...

http://www.doomworld.com/vb/doom-general/47773-dunno-what-causes-this-but-wow/


That is an interesting thread. I'm not very knowledgeable, but I wonder if having a high frame rate (like 75+) and a matching refresh rate on your monitor would make any difference.

Edit: I suppose I am describing vsync.

As far as pleasing everyone goes: impossible, lol. So much of what we see has to be interpreted by a biased brain before we can rate our experience. Everyone is coming at the situation from a slightly different perspective.

As a collective it ought to give us all a better understanding of what any given situation really is, but we are typically poor communicators.

Edit: re the uncanny ability to see jaggies. If people are looking for them, they will certainly see them, I believe. The whole industry has become obsessed with them, and so we see them. I remember a time when they weren't the end of the world and didn't detract from the experience. At about 1280x1024, and definitely by 1600x1200, they become a non-issue for me personally. Is the push towards large screens necessitating higher resolutions to achieve the same effect, since the physical pixels are larger?

I guess anything that pushes things forward is a positive, but we tend to like our games the way we like our women (good looking), and likewise there are probably a lot of other things being neglected due to our obsession with visuals.

Edit: I guess when you think about it, we interact through our senses, and it is easier to change the visuals than it is to come up with a better way to deliver audio or provide a better control mechanism. Maybe in the future we will wear headgear that stimulates our brain while we play, and then we can be addicted to games for real lol.

invictius said:

Is this a level of Mockery 2? And what commands are you using to keep track of the number of sprites etc?


nuts.wad.

It's pretty easy to get the all-ghosts glitch in nuts.wad on Nightmare difficulty, so using the fly cheat and idrate (prboom-plus is the engine) I can get a very large number of sprites on screen quickly.

Salt-Man Z said:

I stick to lower resolutions (I think I use 800x600) because any higher and the onscreen text becomes too small.


Zdoom has a console command to resize the text at higher resolutions. Actually I think it's in display options.

invictius said:

Zdoom has a console command to resize the text at higher resolutions. Actually I think it's in display options.

Thanks! I'll have to check that out; it's been bugging me for years.


Well, 640x480 has always been my preferred optimum for playing Doom to "feel okay". I use it in all ports except vanilla and Chocolate Doom, where I can't.

Graf Zahl said:

I play at 1920x1080 because anything less than the native resolution looks like shit on a TFT flatscreen.



This. Besides, I use a monitor in LightBoost mode to emulate a CRT's lack of motion blur, since blurred motion literally gives me a headache. Playing at less than the native resolution would serve no purpose.

