Holering

DisplayPort 1.3 = WTF?!


When I first saw this news I couldn't believe it. http://www.guru3d.com/news-story/vesa-displayport-1-3-standard-outlineds.html

Quote from Guru3D.com's News "Hilbert Hagedoorn on: 09/15/2014":

This new spec doubles signal bandwidth to up to 32.4 Gbps. The higher bandwidth is needed for 5K screens with a resolution of up to 5120 x 2880 pixels, as well as for driving not one but two 4K UHD displays.


Okay, more bits is great, but so what about the higher resolution? Then it starts getting a little worse:

The new standard adds support for the 4:2:0 pixel structure, a video format commonly used on consumer digital television interfaces, which enables support for future 8K x 4K displays.


4:2:0? Are you kidding me?! That's like VHS color! We are less than a month from the year 2015 and 4:2:0 is being considered for the future? WTF... And who cares about 8K x 4K displays?! Do people need a wall as their monitor? Do they need 20 damn monitors to simulate a larger one? How is that going to improve gaming? (Do we need 16 cylinders and 5 mpg to go to the bathroom?) I personally will take a virtual reality headset any day!

With its higher 8.1 Gbps per-lane link rate, DisplayPort 1.3 can support a single UHD monitor with 60Hz refresh and 24-bit color over two lanes, while assigning the remaining two lanes to increase capacity for alternate data types, such as SuperSpeed USB data, as allowed in DockPort.
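A back-of-the-envelope check of those figures (my own math, assuming DisplayPort's 8b/10b line coding and ignoring blanking overhead; none of these numbers come from the article itself):

```python
# Rough sanity check of the quoted DisplayPort 1.3 figures.
# Assumptions (mine): 8b/10b line coding, blanking overhead ignored.

LINK_RATE_GBPS = 8.1          # HBR3 line rate per lane
LANES = 4
PAYLOAD_EFFICIENCY = 8 / 10   # 8b/10b coding: 8 payload bits per 10 line bits

total_line_rate = LINK_RATE_GBPS * LANES                      # 32.4 Gbps headline number
payload_two_lanes = LINK_RATE_GBPS * 2 * PAYLOAD_EFFICIENCY   # ~12.96 Gbps usable

# Raw pixel rate for one 4K UHD monitor at 60 Hz with 24-bit color
uhd_bits_per_second = 3840 * 2160 * 60 * 24 / 1e9             # ~11.94 Gbps

print(f"Total line rate:        {total_line_rate:.1f} Gbps")
print(f"Payload on two lanes:   {payload_two_lanes:.2f} Gbps")
print(f"UHD@60Hz, 24bpp needs:  {uhd_bits_per_second:.2f} Gbps")
```

Two lanes' worth of payload (~13 Gbps) just about covers the ~12 Gbps a UHD 60 Hz 24-bit stream needs, which is presumably why the other two lanes can be given over to USB data.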

24-bit color?! Another wtf. Is this a joke? Does anyone here remember Carmack in 2001-2002 commenting on 32-bit color not being enough for Doom3? There was talk about the level designs using more grey to work within 32-bit color limits. Here's a link with some of that: http://www.doomworld.com/vb/doom-3-general/9363-interesting-new-doom-iii-facts/

homelanfed.com had this: Carmack talked quite a bit about his views on games moving up to 64-bit color in the future and how that will open up how lighting can be improved in future games. Even though Doom III won't support 64-bit color when it is released, he sees it as a natural progression from 32-bit color, even though he knows some game programmers have said they see no real need to do so for gaming. Carmack said those kinds of comments reminded him of what some programmers said a few years ago about moving up from 16- to 32-bit color.


Does anyone else here see this as absurd? 24-bit color and 4:2:0 for the future? No mention of 48-bit (or more) color, or dynamic 3D positional audio via 2.0 headphones? Why do people need 4 freakin' monitors that can't do full blown RGB, with a freakin' nasty gap in between each? What's wrong with one monitor? And why such high resolutions? Are they going to stare point blank to see every detail? People used to use 17" 1280x1024 monitors as standard and that was quite big enough for a 2D display. What about stereo 3D vision via some small compact goggles no larger than a helmet? This is a WTF from my end.

Honestly, stereo 3D vision with dynamic 3D positional audio via the Oculus Rift in a Metroid Prime title would be total sweetness. What about "Brutal Doom" (Get off scumm!!)? Heck, even the headgear itself would make me feel more like Doom-guy (AAaaarrrggghh!).

There's other stuff going on in computer technology that I'm not so sure about; some of it might be sad. Samsung suing Nvidia, and Nvidia suing Samsung back. Yeah... Imagination Technologies released the PowerVR SDK v3.4 in October 2014. I'm not sure what else there is, but these things stick out against the non-changing 24-bit color and the push for multiple monitors at ever higher resolutions. (Keywords for Google.)


32-bit color uses the extra bits to manage transparency; when it comes to actual color, you will not notice a difference from 24-bit.
For the rest, this insane hunt for giant screens with 4K and 8K resolutions is just lost on me...
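For illustration, a minimal sketch of the usual 32-bit layout (the ARGB byte order and the helper names are just for the example; some APIs use RGBA or BGRA instead):

```python
# A 32-bit "color" is usually 8 bits of alpha plus the same 24 bits of actual RGB.
def pack_argb(a, r, g, b):
    """Pack four 8-bit channels into one 32-bit word (ARGB order assumed)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_argb(pixel):
    return ((pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF,
            (pixel >> 8) & 0xFF, pixel & 0xFF)

opaque_orange = pack_argb(255, 255, 128, 0)
print(hex(opaque_orange))         # 0xffff8000
print(unpack_argb(opaque_orange)) # (255, 255, 128, 0)
# Drop the alpha byte and the color information is the same 24 bits.
```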


With anything but CRT or plasma technology, anything beyond 18 bits of color depth is generally wasted anyway. Also, the reason 4:2:0 chroma subsampling (and other schemes of this kind) works so well is that the human eye is much less sensitive to hue changes than it is to luminosity. VHS had much worse chroma resolution, BTW: luma bandwidth was 3 MHz (for 50/60 fields in PAL/NTSC) but chroma was only 0.5 MHz.

Trying to transmit full color information when the bandwidth is already at the physical limits would just mean double the effort with no real benefit.
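To put rough numbers on the savings (my own sample counting, assuming 8 bits per sample and a 60 Hz 8K frame; the function name is just for the sketch):

```python
# Samples per frame for common chroma subsampling schemes.
# 4:4:4 -> every pixel has its own chroma pair; 4:2:0 -> one chroma pair per 2x2 block.
def bits_per_frame(width, height, scheme, bits_per_sample=8):
    luma = width * height
    chroma_pairs = {"4:4:4": width * height,
                    "4:2:2": width * height // 2,
                    "4:2:0": width * height // 4}[scheme]
    return (luma + 2 * chroma_pairs) * bits_per_sample

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    gbits = bits_per_frame(7680, 4320, scheme) * 60 / 1e9
    print(f"{scheme}: {gbits:.1f} Gbps at 8K, 60 Hz")
```

Roughly half the data of 4:4:4, which is the whole point of allowing it for the 8K case.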


Won't comment on the rest, but I don't understand what your problem is with high resolution screens. It actually makes a noticeable, significant improvement to use high res monitors - we've been stuck at pitifully low resolutions for years while phones and tablets have been leading the way in getting better and better screens. Seems insane that the pixel density on my phone is higher than on my desktop machine I use for work.

If you don't understand what I mean, go to your local Apple store, check out one of the Macbook Pros. I'm not trying to convince you to buy a Mac or anything, just check out the screen ("retina display" as they market it). Say what you like about Apple but they're doing modern displays as they ought to be done.


^ Heh, for someone who just went on a rant about audiophiles, that's a funny thing to say.

I can't spot the pixels on my 22" 1680x1050 screen, sitting 50cm away. I have 10/10 eyesight with correction.

Obviously if I bend forward until I'm ~20cm away from the screen and purposefully look for it, I can see the pixels. But this is not a normal use case by any stretch of the imagination.

I have the 2013 Nexus 7 which I think is 1920x1080? for 7", it is pretty nice how I can't see the pixels even if I put my nose on it. Which makes sense for such a device because it actually can sit pretty close to my nose at times. My brother has an iPad 4 and let me try it for a week. I can't tell the difference.

Now, if some of you want 8K displays or gold cables on your headphones, more power to you. I literally can't handle gaming under 40 FPS anymore on a computer screen, so I know how it is when people insist specific quirks that are critical to you don't matter. Let's not pretend those personal preferences are objective and universally shared, though.

Phml said:

^ Heh, for someone who just went on a rant about audiophiles, that's a funny thing to say.

I can't spot the pixels on my 22" 1680x1050 screen, sitting 50cm away.

I think that being able to "spot the pixels" is slightly misleading. Writing this from my work desktop machine, I can't say that I can make out individual pixels either. However, I do notice pixelisation, if that makes sense? I can perceive the limits of the display.

The point is that it's a noticeable improvement in quality. I've been using a "retina display" Macbook Pro as my main laptop for the past year and I do really notice the difference to certain things - particularly font rendering, where the characters are much crisper and better defined.

Actually, font rendering is a really good example of why hi-res displays are important. I don't know how much you know about how fonts are displayed, but when text is rendered on a display screen (as opposed to print), hinting is used to adjust the shapes of the characters to fit to pixel boundaries. Essentially, the shapes are distorted slightly so that the characters look crisper (there's a good example in the Wikipedia article). But if you're using a hi-res screen, there's less of a need to use hinting, so you get better rendering.
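A toy way to see why hinting matters less as density goes up (my own numbers, not how any real font engine measures it): snapping an outline edge to the pixel grid can move it by up to half a pixel, and half a pixel becomes a smaller and smaller fraction of the glyph as pixels per em increase.

```python
# Worst case: hinting snaps an outline edge by up to half a pixel.
# At higher pixel densities that half pixel is a tiny fraction of the glyph,
# so the un-hinted shape already looks right.
for ppem in (12, 24, 48):          # pixels per em: roughly 9pt text at 96/192/384 dpi
    worst_case_em = 0.5 / ppem     # half-pixel snap, expressed in em units
    print(f"{ppem:2d} px/em: worst-case hinting distortion = {worst_case_em:.3f} em "
          f"({100 * worst_case_em:.1f}% of the glyph size)")
```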

I do think that it's diminishing returns, though. Certainly I think the hi-res screens found on modern tablets and Macs are "good enough" and that if we increase the resolution further it's unlikely to bring any serious improvement. After all, if it's a "retina display" then the entire point is that it's at the limit that the human eye can perceive.

And that's the same as the attitude I have towards headphones really - there's a noticeable difference between $30 and $300 pairs of headphones, but beyond that, you're going into subtleties that are probably difficult to notice.

fraggle said:

Won't comment on the rest, but I don't understand what your problem is with high resolution screens. It actually makes a noticeable, significant improvement to use high res monitors - we've been stuck at pitifully low resolutions for years while phones and tablets have been leading the way in getting better and better screens. Seems insane that the pixel density on my phone is higher than on my desktop machine I use for work.

If you don't understand what I mean, go to your local Apple store, check out one of the Macbook Pros. I'm not trying to convince you to buy a Mac or anything, just check out the screen ("retina display" as they market it). Say what you like about Apple but they're doing modern displays as they ought to be done.

I just don't understand why higher resolutions are pushed forward more so than other features like 3D vision, higher color depth, and oversampling. It's like people want their computer to become a sort of Hollywood production machine. What's wrong with high fidelity color and interactive gaming?

I'm never really around Macs but I've heard of those retina displays. I'm assuming they squeeze more resolution than normal into portable displays. I think that's more cool on a Mac or portable device. I just don't understand the problem with something compact like goggles for games. It's way smaller than a monitor and you don't need to adjust your seating position. Standing while playing at a desktop is actually way better and makes it more fun IMO. Kinda like playing arcade games and the Wii; I think the blood flow has something to do with it.


In regards to the pixel density question: A number of Thinkpad users have commented on a similar topic saying that around 140-150ppi is their upper limit for comfortable viewing of unscaled media. I agree with this based on the 144ppi display on my laptop, but as fraggle stated I can still see pixelation in content. So following Apple's theory, around 280ppi is the point where I could get the same screen space without seeing the pixel limit. (For desktop/laptop displays.) I patiently await the day 17" 4K or 21" 5K displays are available.
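For reference, pixel density is just diagonal resolution over diagonal size; a quick sketch with the figures being thrown around (the 15.4" and 21.5" diagonals are my guesses at typical panels, not quoted specs):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'22" 1680x1050: {ppi(1680, 1050, 22):.0f} ppi')    # typical older desktop monitor
print(f'15" 2880x1800: {ppi(2880, 1800, 15.4):.0f} ppi')  # retina MacBook Pro (assumed 15.4")
print(f'17" 3840x2160: {ppi(3840, 2160, 17):.0f} ppi')    # the hoped-for 17" 4K
print(f'21" 5120x2880: {ppi(5120, 2880, 21.5):.0f} ppi')  # the hoped-for 21" 5K (assumed 21.5")
```

The hoped-for panels land in the ballpark of that ~280 ppi "doubling" target.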

Maes said:

With anything but CRT or plasma technology, anything beyond 18 bits of color depth is generally wasted anyway. Also, the reason 4:2:0 chroma subsampling (and other schemes of this kind) works so well is that the human eye is much less sensitive to hue changes than it is to luminosity.

IPS displays haven't made it to Greece yet? High-end color-critical LCDs have been able to do 10-bpc/30-bit color for some time now. Standard IPS displays, which do 24-bit color, are fairly common now (especially on mobile devices, and I think most TVs are IPS these days). Granted, you are correct if you're talking about gaming monitors, which still use TN panels for higher frame rates.

As for 4:2:0, another good point is that a lot of the content we consume is lossily compressed to 4:2:0 (I think TV content is, for example, but don't quote me on that). Might as well send that over the wire instead of expanding it to 4:4:4.

Holering said:

Does anyone here remember Carmack in 2001-2002 commenting on 32-bit color not being enough for Doom3?

He was talking about the calculating / rendering side, not the display side. Using 64 bits internally would allow for greater precision when doing lots of lighting calculations, where otherwise the small errors from lack of precision could end up being multiplied together and be far more apparent in the final value. No one is suggesting 64-bit color on displays, which would be insane overkill.

edit: here's the .plan where he talked about it http://floodyberry.com/carmack/johnc_plan_2000.html#d20000429

The situation becomes much worse when you consider the losses after multiple operations. As a trivial case, consider having multiple lights on a wall, with their contribution to a pixel determined by a texture lookup. A single light will fall off towards 0 some distance away, and if it covers a large area, it will have visible bands as the light adds one unit, two units, etc. Each additional light from the same relative distance stacks its contribution on top of the earlier ones, which magnifies the amount of the step between bands: instead of going 0,1,2, it goes 0,2,4, etc. Pile a few lights up like this and look towards the dimmer area of the falloff, and you can believe you are back in 256-color land.

Ironically, nowadays Carmack *IS* complaining about dealing with lack of color depth on the display side, because 24-bit color banding is much more visible on VR goggles than on a monitor: https://twitter.com/ID_AA_Carmack/status/486660710672248833 Like he says, however, the solution is dithering, which makes the problem basically unnoticeable.
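A minimal sketch of both effects, the band-doubling from the .plan quote and the dithering fix Carmack mentions (toy Python with made-up light values and a simple random-noise dither; not anything from id's code):

```python
import random

def quantize(x, levels=256):
    """Quantize a 0..1 light value down to an 8-bit framebuffer step."""
    return min(levels - 1, int(x * (levels - 1)))

def quantize_dithered(x, levels=256):
    """Same, but add sub-step noise first so neighbouring pixels straddle
    the two nearest steps instead of all snapping to the same one."""
    return min(levels - 1, int(x * (levels - 1) + random.random()))

# A dim falloff: one light's contribution ramps from 0.000 up to 0.020.
for d in range(6):
    light = 0.004 * d
    one = quantize(light)                     # bands step 0, 1, 2, ...
    two = quantize(light) + quantize(light)   # two stacked lights: 0, 2, 4, ...
    avg_dithered = sum(quantize_dithered(light) for _ in range(10000)) / 10000
    print(f"1 light: {one}   2 lights: {two}   dithered average: {avg_dithered:.2f}")
```

Averaged over neighbouring pixels, the dithered values recover the in-between intensity, which is why the banding stops being visible.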

Incidentally, HP just came out with the very first monitor to (mostly) display the new Rec. 2020 standard. The standard was designed for UHD-type displays and supports 12 bits per color channel, so you can start slavering over that now.


fraggle said:

However, I do notice pixelisation, if that makes sense? I can perceive the limits of the display.

The point is that it's a noticeable improvement in quality. I've been using a "retina display" Macbook Pro as my main laptop for the past year and I do really notice the difference to certain things - particularly font rendering, where the characters are much crisper and better defined.


Hmm, you make a good case. Now that you say that I can spot it too. I don't mind and don't pay attention to it, and I'm unsure the average person does; but it's not like there's any evidence to back up my feeling, so I can definitely see where you're coming from.

Holering said:

I just don't understand why higher resolutions are pushed forward more so than other features like 3D vision, higher color depth, and oversampling. It's like people want their computer to become a sort of Hollywood production machine. What's wrong with high fidelity color and interactive gaming?

Modern machines use 24-bit color and have done so for years. Maybe there's some room for improvement but to be honest it's a pretty big range already and I've never really heard anyone suggest before that it's insufficient.


I'm never really around Macs but I've heard of those retina displays. I'm assuming they squeeze more resolution than normal into portable displays. I think that's more cool on a Mac or portable device.

Actually my point was the opposite - pretty much every modern smartphone or tablet has a hi-res display like this now, but most "real computers" - i.e. desktop PCs, laptops, etc. - are still stuck with the same comparatively low-resolution screens that we've been using for years. Seems really backwards when we usually expect the PC world to be leading these kinds of innovations.

It's still hard to find a decent laptop with a hi-res ("retina") screen - until recently there was the Macbook Pro, Chromebook Pixel, and that was pretty much it. Lately a few manufacturers have started bringing out high-end laptops with decent screens, but they're still comparatively rare.

I just don't understand the problem with something compact like goggles for games. It's way smaller than a monitor and you don't need to adjust your seating position. Standing while playing at a desktop is actually way better and makes it more fun IMO. Kinda like playing arcade games and the Wii; I think the blood flow has something to do with it.

Okay, sure, that's what the Oculus Rift is.

But I guess I just don't understand what it is you're objecting to, really. We're getting higher resolution screens, but that doesn't have anything to do with 3D vision. Why do you think that one precludes the other?

Blzut3 said:

In regards to the pixel density question: A number of Thinkpad users have commented on a similar topic saying that around 140-150ppi is their upper limit for comfortable viewing of unscaled media. I agree with this based on the 144ppi display on my laptop, but as fraggle stated I can still see pixelation in content. So following Apple's theory, around 280ppi is the point where I could get the same screen space without seeing the pixel limit. (For desktop/laptop displays.) I patiently await the day 17" 4K or 21" 5K displays are available.

Yeah, to clarify: I don't want higher resolution just so that I can get more desktop space; I want it so that I can see the same as what I'm already seeing in greater clarity. Most modern OSes now have support for running at a "doubled" resolution - software support is essential to make this work because lots of legacy code assumes a particular DPI.

Mac OS X doesn't actually even allow you to run at the full resolution - that is, the resolution on the MBP is 2880x1800, but you can't run at that resolution in the "traditional" sense. There's actually a hack that allows you to enable it, but when I tried it out, my eyes hurt from straining to read things.
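For anyone wondering what that "doubled" mode actually does, here's a minimal sketch of the idea (a simplification of HiDPI scaling in general, not any particular OS's API; the function name is made up):

```python
# HiDPI ("retina") rendering in a nutshell: layout is done in logical points,
# while the backing store is scale_factor times larger in each dimension.
def backing_size(logical_w, logical_h, scale_factor=2):
    return logical_w * scale_factor, logical_h * scale_factor

# A 2880x1800 panel driven at a logical 1440x900 with 2x scaling:
print(backing_size(1440, 900))   # (2880, 1800) physical pixels
# Legacy code that assumes 1 point == 1 pixel draws at the logical size and
# simply gets scaled up, which is why un-updated apps look blurry or blocky.
```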

fraggle said:

Modern machines use 24-bit color and have done so for years. Maybe there's some room for improvement but to be honest it's a pretty big range already and I've never really heard anyone suggest before that it's insufficient.



Depends on what you intend to display.
Even with Doom, some darker levels can cause visible banding due to lack of color precision. I have seen it in GZDoom on occasion: a clear, distinct border between an area with color (4,4,4) and one with (5,5,5), for example.

Of course, for computing this is all a moot point as long as consumer graphics hardware cannot even create a screen buffer with more than 8 bits per color channel. Pity...


I've wished for higher color depth. In some cases, color banding is still obvious. One of the most prominent cases IMO is any photo of a clear sky: the shifting values of blue don't get represented well in 24-bit color and can actually create some ugly banding.

Then there's my desktop, which I have set to a gradient from #123 to #456, and the bands are somewhat annoying. This is a bit of a more artificial example, but it's still kind of telling (perhaps dithering is the solution, but MATE doesn't dither this...).
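To put numbers on that gradient (my own arithmetic, assuming it runs vertically over a 1080-pixel-tall desktop):

```python
# How wide are the bands in a vertical gradient from #123 to #456 on an 8-bit display?
# Assumes a 1080-pixel-tall desktop.
start = (0x11, 0x22, 0x33)   # #123 expanded to 8 bits per channel
end   = (0x44, 0x55, 0x66)   # #456
height_px = 1080

steps = max(e - s for s, e in zip(start, end))   # only 51 steps per channel
band_width = height_px / steps
print(f"{steps} steps over {height_px} px -> bands ~{band_width:.0f} px wide")
```

A solid stripe roughly 21 pixels tall per step is easy to see, so it's no surprise the bands are annoying without dithering.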

Phml said:

Hmm, you make a good case. Now that you say that I can spot it too. I don't mind and don't pay attention to it, and I'm unsure the average person does; but it's not like there's any evidence to back up my feeling, so I can definitely see where you're coming from.

I don't particularly mind either, but it's "nice" to be able to look at characters on my screen and have them look like actual characters of the kind that you'd see on a printed page of a book. Feels like progress even if it's ultimately a fairly subtle change.

I remember seeing a blog post someone made about the font rendering stuff, with scaled up text of different sizes and how with a hi-res screen they scale up smoothly along a consistent curve. I can't find where I read it now, though. It was a pretty good illustration.

Certain apps on OS X run in "low res" mode - they look like they do on a non-retina display Mac. The Steam updater window is one example. I do notice immediately when an app looks like that - it really shows up the difference.


The color space used limits what can be displayed efficiently, more so than the total bit depth.

E.g. 24 bits and 16.7M colors have been around for a good 20 years now, but with 8 bpc you can only display 256 grayscale levels; there's no getting around that. Getting incrementally more bits per channel (9, 10, 11, 12, etc.) only increases this amount incrementally: 512, 1024, etc.

That's why there are special grayscale displays capable of 10 or 12 monochrome bits, for purposes such as medical imaging, where accuracy and shade depth are important.

If you use a different color space, e.g. HSV or HSL, and dedicate unequal numbers of bits to luma, hue, saturation, etc., you might get more grayscale precision out of 24 bits while still allowing 16.7M colors, but with a different gamut. Too bad that most computer systems use either RGB (straightforward, but also simplistic) or, if they are older, LUTs.
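The arithmetic behind the "only increases incrementally" point, for what it's worth:

```python
# Distinct grayscale levels available at a given number of bits per channel.
for bits in (8, 9, 10, 11, 12):
    print(f"{bits:2d} bpc: {2 ** bits:5d} gray levels")
# Going from 8 to 12 bpc multiplies the level count by 16, but it costs four
# extra bits per channel, which is why it stays confined to niches like
# medical imaging rather than consumer desktops.
```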


