invictius

When did PCs begin to handle all of ZDoom's higher software resolutions?


Playing on a netbook reminded me that there was a time when PCs just couldn't handle 1080p in software mode.


Well, a Pentium III in the 700-900 MHz range could handle up to 1024 x 768 depending on the map, which is not too far from 1080p (BTW, 1080p is a TV resolution. The closest standard desktop SVGA resolution in 4:3 would be 1280 x 1024, and that was considered "normal" by the early 2000s), while with prBoom+ you could certainly venture higher. If only it had been available back then, when Pentium IIIs were in their heyday, of course.

Honestly, 1080p is nothing exceptional on the desktop, and hasn't been for more than a decade now. It was, however, a major improvement for devices that are normally connected to TVs (e.g. game consoles, PVRs and other set-top boxes), which were stuck at SDTV PAL/NTSC resolutions for decades.

Maes said:

1024 x 768 depending on the map, which is not too far from 1080p

Lol, no, it's like 40% as many pixels. 1024x768 is 0.78 megapixels, 1080p is 2 megapixels.
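A quick back-of-envelope check of the two figures (plain arithmetic, nothing else assumed):

```python
# Pixel-count comparison of the two resolutions discussed above.
xga = 1024 * 768     # 786,432 pixels (~0.79 MP)
fhd = 1920 * 1080    # 2,073,600 pixels (~2.07 MP)
ratio = xga / fhd

print(xga, fhd, round(ratio, 2))  # 786432 2073600 0.38
```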

Maes said:

...The closest standard desktop SVGA resolution in 4:3 would be 1280 x 1024, ...

I'm being excessively nit-picky here, but 1280x1024 is 5:4.

Maes said:

Honestly, 1080p is nothing exceptional on the desktop, and hasn't been so more than a decade now. It was, however, a major improvement for devices that are normally connected to TVs (e.g. game consoles, PVRs and other set-top boxes), which were stuck at SDTV PAL/NTSC resolutions for decades.


1920x1080 is likely sticking around so long as a 'standard' resolution because most HD media is in 1080p, so larger resolutions like 2560x1440 don't provide much extra value to those who use their screens mostly for consuming media. 3840x2160 (4K) is likely to be the next stepping stone. I don't think the future of screens after that will be in higher resolutions but in better pixel quality, as 4K is pushing the limits of what the human eye can actually see.
edit
I'd also like to put this here

doom2day said:

I'm being excessively nit-picky here, but 1280x1024 is 5:4.
1920x1080 is likely sticking around so long as a 'standard' resolution because most HD media is in 1080p, so larger resolutions like 2560x1440 don't provide much extra value to those who use their screens mostly for consuming media. 3840x2160 (4K) is likely to be the next stepping stone. I don't think the future of screens after that will be in higher resolutions but in better pixel quality, as 4K is pushing the limits of what the human eye can actually see.
edit
I'd also like to put this here
http://i.i.cbsi.com/cnwk.1d/i/tim/2013/01/27/resolution_chart.jpg


That graph hurts my eyes just to look at it...

To notice the full effect of 4K on a 55 inch TV, you'd need to sit less than 4 feet away; 7-7.5 feet for 1080p.

Hey I bet optometrists are happy.
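Those distance figures can be reproduced with the common 20/20 rule of thumb that the eye resolves about one arcminute per pixel; a sketch under that assumption (the function name and the 16:9 default are mine):

```python
import math

# Distance at which one pixel subtends one arcminute of visual angle:
# closer than this, individual pixels become distinguishable.
def max_benefit_distance_ft(diagonal_in, rows, aspect=(16, 9)):
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)  # screen height from diagonal
    pixel_in = height_in / rows                     # height of one pixel
    arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(arcmin) / 12         # inches -> feet

print(round(max_benefit_distance_ft(55, 2160), 1))  # ~3.6 ft for a 55" 4K set
print(round(max_benefit_distance_ft(55, 1080), 1))  # ~7.2 ft for 1080p
```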


I played around with a friend's Vive a month ago. The pixelation is fairly obvious with such wide angles of vision even at 2x1080x1200, so I suspect that might be the next 'big' thing to push resolutions higher if VR takes off in any serious way.

doom2day said:

I'm being excessively nit-picky here, but 1280x1024 is 5:4.

(Virtual) pixels are not always square, so you have to consider this as well when thinking about aspect ratio.
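A small illustration of the point: compare the storage aspect of the pixel grid with the display aspect of the screen; any mismatch is the pixel aspect ratio. Doom's 320x200-on-4:3 and the 1280x1024-on-4:3 case both come out non-square (the helper function is hypothetical):

```python
from fractions import Fraction

# Pixel aspect ratio = display aspect / storage aspect.
# A result other than 1 means the (virtual) pixels are not square.
def pixel_aspect(storage_w, storage_h, display_aspect):
    return display_aspect / Fraction(storage_w, storage_h)

print(pixel_aspect(320, 200, Fraction(4, 3)))    # 5/6: Doom's pixels, taller than wide
print(pixel_aspect(1280, 1024, Fraction(4, 3)))  # 16/15: slightly wide on a 4:3 CRT
```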

Linguica said:

They're already planning 8K though. https://en.wikipedia.org/wiki/8K_UHDTV

Ugh! That's going to be a huge burden on Doom software rendering! Let's see...

Assuming an 8K display is 7680 x 4320 = 33,177,600 pixels (double 4K in each dimension, so four times the pixels)...

That's 33,177,600*4 bytes (32-bit color) = 132,710,400 bytes...
Then add 2 back buffers: 132,710,400*3 = 398,131,200 bytes...
Now that screen is painted using texture pixels, so add another buffer: = 530,841,600 bytes of data transfer...

Consider 35 fps: 18,579,456,000. That's over 18 GB per second (unless my math skills suck), just to paint the screen with TITLEPIC. Now, consider visplane calculation, flat rotation and perspective, wall perspective, diminishing light calcs, etc. It's not pretty. So, what's the benefit? That cacodemon, 2 miles away, will be a bit sharper. That's about it :)
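The estimate is easy to redo as a script for any resolution; using 8K UHD's actual 7680 x 4320 grid, and keeping the same buffer-count and 35 fps assumptions (which are back-of-envelope guesses, not measurements of any real renderer):

```python
# Naive worst-case memory traffic for painting one 8K frame in software.
WIDTH, HEIGHT = 7680, 4320   # 8K UHD
BYTES_PER_PIXEL = 4          # 32-bit color
BUFFERS = 3 + 1              # front buffer + 2 back buffers + texture reads
FPS = 35                     # Doom's tic rate

pixels = WIDTH * HEIGHT
bytes_per_second = pixels * BYTES_PER_PIXEL * BUFFERS * FPS

print(pixels)                            # 33177600
print(round(bytes_per_second / 1e9, 1))  # 18.6 (GB per second)
```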


Just do what choco does, render to 320x200 and scale. No need for 8k backbuffers. TITLEPIC never looked better.


@kb1: I think that after a certain resolution point, pure software rendering will not be an option anymore, simply because the req'd memory bandwidth will exceed what the current motherboard technology will be capable of, even if the CPUs could (theoretically) spit out data that fast and even faster. At least the trend so far has been that memory bandwidth always lagged significantly behind what the CPUs are capable of (and things get worse if you start adding cores. OK, so you added 16 cores by plugging in a single, new chip. Now, can you upgrade the memory bandwidth 16-fold just as easily?)

OTOH, such bandwidths and data rates are not uncommon even today inside GPUs and their multi-channel, extra-wide-bus memories. So perhaps the future of Doom's software rendering will lie in some form of hardware acceleration, even partial. It would be cool to think of Doom's column rendering functions implemented as GPU pixel shaders, and flats as vertex shaders... that would be an interesting approach: there would be no OpenGL-like scene geometry, but there would be some acceleration nonetheless.


But why bother with all that when 1024x768 is perfectly fine?

kb1 said:

what's the benefit? That cacodemon, 2 miles away, will be a bit sharper. That's about it :)

bzzrak said:

But why bothering with all that when 1024x768 is perfectly fine?



That is not fine on a 1920x1080 TFT monitor. But anything more sounds like a waste of perfectly good computing resources.

Graf Zahl said:

That is not fine on a 1920x1080 TFT monitor.


#BringBackCRTs

Jon said:

Just do what choco does, render to 320x200 and scale. No need for 8k backbuffers. TITLEPIC never looked better.


I'm not sure what you mean by scaling. Let your monitor upconvert to the desired resolution?

invictius said:

I'm not sure what you mean by scaling. Let your monitor upconvert to the desired resolution?

No, monitors increasingly don't support scaling from such low resolutions. Choco uses a software scaler but IIRC they're moving to letting the GPU handle it.


I thought that what Choco did was simply line doubling/tripling etc. without attempting any fancy interpolation or filtering. That hardly requires GPU assistance, as it can be done with fast REP MOVS memory instructions, at least on Intel. Of course there's a hardware scaling mode as well, if I remember correctly.

The former mode is quite different from what modern monitors and TVs do: those employ quite complex transformations and processing (usually DCT-based) in order to auto-scale signals of (reasonably) arbitrarily low resolution to whatever the native resolution is.
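The doubling/tripling mode really is trivial; a toy sketch of plain integer scaling, with frames as nested lists rather than real framebuffers:

```python
# Plain integer scaling: no interpolation or filtering, each source
# pixel just becomes a factor-by-factor block on the output.
def scale_integer(frame, factor):
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]  # repeat each pixel
        out.extend([wide] * factor)                       # repeat each scanline
    return out

src = [[1, 2],
       [3, 4]]
print(scale_integer(src, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```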

Maes said:

I thought that what Choco did was simply line doubling/tripling etc. without attempting any fancy interpolation or filtering. That hardly requires GPU assistance, as it can be done with fast REP MOVS memory instructions, at least on Intel.


But then you'll have to transfer up to 16x as much data to the GPU, so it is more efficient to do the scaling on the GPU side. You'll have to get it there anyway.


Yeah, you can scale, but then why did you shell out $mass-green for that 8K monitor? Scaling just makes a low-resolution picture fit on a high-res screen, by rendering pixels as rectangular blocks. Fine if you want that 320x200 vintage look, but, if I buy an 8K monitor, I want an 8K picture, so I can see that mile away caco in full sharpness.

Graf Zahl said:

#Notgoingtohappen

Nope. It's kind of a shame. CRTs are a neat technology that has all but gone away. They'll still be made for certain specific applications, but not for general use. VCRs are another neat technology that's going to disappear. I recently looked at an old VCR tape, and it looked A LOT crappier than I remember. We've all become spoiled by digital technology. But, I have to say: I have yet to see a digital "anything" that lets me pause, rewind a few frames, fast-forward a few seconds, and instantly start playing, the way Beta VCRs used to.

Maes said:

@kb1: I think that after a certain resolution point, pure software rendering will not be an option anymore, simply because the req'd memory bandwidth will exceed what the current motherboard technology will be capable of, even if the CPUs could (theoretically) spit out data that fast and even faster. At least the trend so far has been that memory bandwidth always lagged significantly behind what the CPUs are capable of (and things get worse if you start adding cores. OK, so you added 16 cores by plugging in a single, new chip. Now, can you upgrade the memory bandwidth 16-fold just as easily?)

OTOH, such bandwidths and data rates are not uncommon even today inside GPUs and their multi-channel, extra-wide-bus memories. So perhaps the future of Doom's software rendering will lie in some form of hardware acceleration, even partial. It would be cool to think of Doom's column rendering functions implemented as GPU pixel shaders, and flats as vertex shaders... that would be an interesting approach: there would be no OpenGL-like scene geometry, but there would be some acceleration nonetheless.

Yes, it's a neat idea. I suppose the card could hold texture strips, and a shader could stretch the strips as columns on the screen. You'd run the card in 2D mode I suppose, and still handle the visible surface detection on the CPU with the nodes. I guess you could write the column distance to a Z buffer, even in 2D mode, and have another shader clip the sprites using the Z buffer, which would be really helpful, speedwise. That would also correct some of the weird flickery sprite issues with the software renderer, when a sprite is on stairs. (I'm sure you've seen a mis-drawn sprite in the software renderer before, where the simple vertical clipper doesn't handle proper visible surface removal. Luckily it's usually kinda rare. That is to say, it occurs a lot, but it's just not that noticeable for some reason.)

I have an unnatural love of keeping everything completely software-based, though, but your approach is essentially software "pre-rendering", with the GPU doing the BLITs (the heavy lifting). I could get around that approach, especially after seeing the increased frame rate :) I'd like to see it on 1920x1080, actually :)
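A toy version of the per-column depth test described above; the names, the 1D buffer, and the single-distance sprite are all illustrative, not Doom's actual code:

```python
# Walls write a distance into a per-column "z-buffer"; a sprite column
# is drawn only where the sprite is nearer than the wall already there.
INF = float("inf")

def draw_sprite_columns(zbuf, sprite_x, sprite_w, sprite_dist):
    drawn = []
    for x in range(sprite_x, sprite_x + sprite_w):
        if sprite_dist < zbuf[x]:   # sprite is in front at this column
            drawn.append(x)
    return drawn

zbuf = [INF, 50, 50, 200, 200]               # wall distance per screen column
print(draw_sprite_columns(zbuf, 0, 5, 100))  # [0, 3, 4]: clipped behind columns 1-2
```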

kb1 said:

I have yet to see a digital "anything" that lets me pause, rewind a few frames, fast-forward a few seconds, and instantly start playing, the way Beta VCRs used to.


D-VHS? Granted, the format never caught on with the masses, but in its day it was quite revolutionary. 720p and 1080i on tape, before the Blu-Ray format came out. Granted, the Blu-Ray format is superior to D-VHS, but D-VHS could record DTV the way a regular VHS deck could record regular TV.

Danfun64 said:

D-VHS? Granted, the format never caught on with the masses, but in its day it was quite revolutionary. 720p and 1080i on tape, before the Blu-Ray format came out. Granted, the Blu-Ray format is superior to D-VHS, but D-VHS could record DTV the way a regular VHS deck could record regular TV.

Cool, didn't know about that. Can you see the output while you're rewinding or fast-forwarding, though? That's what I always liked about the analog stuff. (And it was the only option way back when.) I can remember this 100-pound monster VCR my Dad would bring home from work every year to play Christmas cartoons on. Man, that thing was huge, and it was very difficult to fine-tune to get a good picture. It had lots of manual tracking and speed adjustments. Things have come a long way. They will soon be forgotten like CRTs, typewriters, and other things. Ask a kid what a tape recorder is :) I have a very old wire recorder that recorded audio on super-thin steel wire! Get that wire tangled and it's all over.

kb1 said:

Cool, didn't know about that. Can you see the output while you're rewinding or fast-forwarding, though? That's what I always liked about the analog stuff. (And it was the only option way back when.) I can remember this 100-pound monster VCR my Dad would bring home from work every year to play Christmas cartoons on. Man, that thing was huge, and it was very difficult to fine-tune to get a good picture. It had lots of manual tracking and speed adjustments. Things have come a long way. They will soon be forgotten like CRTs, typewriters, and other things. Ask a kid what a tape recorder is :) I have a very old wire recorder that recorded audio on super-thin steel wire! Get that wire tangled and it's all over.


The Russians used to use VHS tape as computer storage. An ISA card that went to a composite socket.

invictius said:

The Russians used to use vhs tape as computer storage. An ISA card that went to a composite socket.

Smart, there's a lot of storage capability, and tapes are cheap... You know, the head spins at an angle diagonal to the tape movement to get a longer surface to record a strip of data on. I wonder just how much digital data you can get on a minute of VHS tape? Could make a nice backup solution.

invictius said:

The Russians used to use vhs tape as computer storage. An ISA card that went to a composite socket.


Even the technology behind ArVid was nothing new: VHS (or any video tape, for that matter) was already being used to store data with the so-called PCM adapters in the late 70s/early 80s. Granted, those were ridiculously expensive devices aimed at audiophiles, which converted analog audio to digital and stored the bits directly as a video signal, and were nominally audio-only, but they already achieved CD bitrate (1.4 Mbps). ArVid went just a tad higher than that. Either way, both were incomparable with formats designed to store digital data directly (e.g. Digital8 or DV cassettes: a fraction of the size, and 30 times as much capacity. 60 GB per DV tape is still a respectable capacity, even today).

Edit: fix'd ArVid link
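The capacities follow directly from bitrate times running time; a rough calculator using the CD bitrate quoted above and DV's nominal ~25 Mbps video rate (ballpark figures, not spec values):

```python
# Tape capacity as bitrate x running time, in decimal gigabytes.
def tape_capacity_gb(bitrate_mbps, minutes):
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

print(round(tape_capacity_gb(1.4, 180), 1))  # ~1.9 GB: a 180-min tape at CD bitrate
print(round(tape_capacity_gb(25.0, 60), 1))  # ~11.2 GB/hour at DV's ~25 Mbps
```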

kb1 said:

Can you see the output while you're rewinding or fast-forwarding, though?


Not 100% sure, but I know you can with MiniDV, so it wouldn't surprise me if D-VHS also supported this feature.

kb1 said:
invictius said:

The Russians used to use vhs tape as computer storage. An ISA card that went to a composite socket.

Smart, there's a lot of storage capability, and tapes are cheap... You know, the head spins at an angle diagonal to the tape movement to get a longer surface to record a strip of data on. I wonder just how much digital data you can get on a minute of VHS tape? Could make a nice backup solution.


For the record, the maximum storage size a D-VHS supported was 50 GB.

Maes said:

ArVid


Don't kill me for this meme...but I couldn't resist. It took about an hour to make this abomination...



For those curious, the binary is supposed to read "телекомпания ВИD представляет..." ("VID Production presents..."). Unfortunately, it had to be cut off... perhaps it's for the best.

Danfun64 said:

For the record, the maximum storage size a D-VHS supported was 50 GB.


Much of the storage improvement in these later formats came not so much from the medium itself as from the encoding/decoding electronics and modulation used. Using just two signal levels, when the VHS medium (in theory) has a 40 dB SNR in the luma component (so at least 6 bits could be encoded somewhat reliably per "pixel", rather than one), was very inefficient, but easy to encode/decode with reasonably priced hardware, which certainly wasn't the case in the early 90s, let alone the 80s/late 70s. Therefore, just improving the data encoding could yield an improvement of almost an order of magnitude. Coupling that with other improvements would give you the 50:2 or 60:2 ratio of D-VHS or MiniDV vs VHS with ArVid or PCM adapters.

Similar improvements are what took POTS modems from 300 bit/s to 56 kbit/s, eventually approaching the theoretical capacity limit of POTS lines.
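The 6-bits-from-40-dB figure checks out with the simple rule that N dB of SNR gives about 10^(N/20) distinguishable amplitude levels, hence log2 of that many bits per symbol (a crude model that ignores noise statistics):

```python
import math

# dB of SNR -> distinguishable amplitude levels -> bits per symbol.
def bits_per_symbol(snr_db):
    levels = 10 ** (snr_db / 20)   # 40 dB -> 100 levels
    return math.log2(levels)

print(round(bits_per_symbol(40), 1))  # ~6.6 bits, vs the 1 bit ArVid-style coding used
```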

