Dima

carmack convinced nvidia!


My source reports the following:

One of the best programmers in the world and id Software co-owner, John Carmack, has once again managed to do the impossible.
His attempts to get Epic and Monolith to embrace the new concept of 64-bit color failed, so Carmack decided to try to convince his old counterparts at Nvidia, on his own, to include 64-bit in their next chip.
Long discussions with Microsoft and Nvidia ended in success: the NV35 will be released not only with support for 64-bit color, but with 128MB of RAM as well!
Besides that, the new feature will probably be included in the DirectX 9.0 specifications as well.
Look for an official announcement at the beginning of the year.


Shit, I bet Nvidia would paint all their chipsets pink and fit them with 512MB if Carmack asked nicely. When he goes down a certain path, the rest of us have no choice but to follow.


Dima, you are such a moron. Every time I see one of your posts it makes me nauseous because your rumors sound like the fanciful creations of a 12-year-old on crack.

NV35 is THREE GENERATIONS away. NV25 and NV30 will come first. That's at LEAST 18 months away. That card likely won't come out 'til after Doom 3 does, and besides that, 64-bit color support has been hinted at for NV30 by much more reliable sources than the "source" of some guy on a forum whom no one knows. And there are already 128MB graphics cards out there for consumer use (not just professional 3D cards), yet you make it sound like it's such a big deal. And on top of all this, you say that 64-bit color might be included in DX9 - I've got news for you, honey: the DX9 specs are already out.

Quit posting these "rumors" unless you can provide something to back them up.


Katarhyne - I will re-ask my source about the 'NV35' thingie.
But I'm pretty sure it's 'NV35' and not 'NV30'.

Yeah, I know that there are 128MB cards out there. Still, that's a LOT of RAM.

About the 64bit color:

You said it yourself: it's been rumoured that NV30 will support 64-bit color. The chip hasn't been announced yet, so you can't be certain of what you're saying.

About the DirectX 9 specs:

Yeah, I know that they are out already, but they are not final yet; things can change, and probably will.

P.S.
No need to flame me. I'm pretty sure this information is correct; if the chip number is wrong, fine, but I doubt it.

Still, even if it is like you say, the fact remains: Carmack convinced Nvidia to include 64-bit color!

Pretty cool stuff... such an influence this guy has :)


YAY

Although I wonder how many programmers will actually be able to use it to its fullest. It takes a lot of prediction about how your engine will/should behave to fully encode extra crap in a 64-bit environment.


Katarhyne - well, what can I say? Do what you're gonna do, just don't push the limits!


Dima: Don't take it personally - even if it is personal. I flame people; it's what I do.


Maybe you should flame me some more, Kat - just so that I get used to it.


If it's true, then yay. We always knew 24 bits wasn't enough. </sarcasm>

Well, really, 64-bit would be something saucy :)


Katarhyne - I will re-ask my source about the 'NV35' thingie.

Tell Janitor Joe at Wal-Mart I said hi.

Jesus Christ BBG, you just got on my good side. =)

I remember Carmack talking about how he wanted to get 64-bit color up and rolling a year ago. One of those silly Doom interview things that seemed to pop up once in a while; I believe I read the snippet on DoomCenter.

Of course, 64-bit color doesn't really do anything for me. Probably because I don't know what the hell it does. I know that GL uses 32-bit color over 24 because the 32-bit color has the extra shit for alpha transparencies, and you can really tell the fucking difference (does nVidia have the WORST FUCKING DITHERING ALGORITHMS ON THE FUCKING PLANET OR IS IT JUST ME?). So what the hell does 64-bit color add, luminance and hue?

So is this going to be like 16 and 32? Going up to the new notch will effectively dropkick your framerate in the balls and give you better colors, or will it add much more than that this time around?

I still can't believe the idiocy of the AGP slot. I mean, shit, it's been long enough it's Goddamn time for a replacement. Something that will supply the additional power needed for the multiple processors and fans on the card.

I was also thinking, which I usually try to avoid, that buying 64 megs of ram each time you get a video card is kind of a damn waste of money. Soon we'll be doing 128s then 256s and shit... I don't get it.

Why don't they put another RAM slot RIGHT NEXT TO it, or perhaps... just make the AGP slot a hell of a lot longer, and have a RAM slot on the back end of it?

It would cut the cost of the card, plus, you could just upgrade the card and keep your ram for it.

I mean like picture your ugly little AGP vagina right now, now make it longer so it can transfer more data and have more power connectors, now splice a ram slot onto the very end of it too. That'd be long as hell...

But you see my point, right? I mean, shit, if the architecture is designed well enough there's no reason NOT to do something like this.

You could get a generic video card and plug 512 megs of ram into it. =)

As many generations as these video card companies are into making their products, I'm still disappointed that mainstream gaming cards don't all have built-in video in and out connectors. Only some even have video out, and only ATI really does it at decent quality.

I also find it disappointing that no matter how many new features are implemented in these new engines - which all pretty much puff on the nVidia cock and force us to constantly upgrade to see the new features - they always seem to overlook the nice, old features. Like TV out and splitscreen action.

Game companies have to devote so much fucking time to a single-player game now that they don't even have the resources left over to make a multiplayer mode to go with it. Undying, Max Payne, and SOF2 will be like that too. Since Carmack apparently thinks Q3A is the peak of multiplay (for some unfathomable fucking reason) they might not even add multiplayer to Doom 3 (which would be pretty fucking idiotic). Maybe if they made TWO GAMES on the current tech instead of constantly bunny hopping to new tech with each game they'd get more accomplished.

For the game, of course.

It wouldn't look as good, and we all know Dima would rather DIE than play any game that isn't 100% bleeding edge tech, but the damn game would be a hell of a lot better.

Heretic, THEN Hexen.
Heretic 2, THEN SOF.

This is probably why I still like console gaming so fucking much. I mean, shit, Undying is just about the ONLY PC game I've played that was done well enough from the initial release to not need a serious patch. AFAIK it doesn't even have a minor patch. Max Payne had one out in a few weeks 'cuz some people had major problems with it. Half-Life has over 150 megs of patches now. Even Hexen had one patch.

Playing a game on PC with the quality level of ICO, Klonoa 2 or Parasite Eve 2 is fucking rare as hell.

Just because new tech is being vomited all over the floor doesn't mean you have to drop EVERYTHING YOU'RE DOING to pick it all up, for Chrissakes.

With a good console, developers have a good 3 to 5 years to get used to the tech and push its limits, refine existing engines, and most importantly just focus on making the Goddamn games run fine and be FUN TO PLAY.

I mean, shit, bumpmaps are nice to look at but they aren't necessary. 64 bit color would be nice but it isn't necessary. 200,000 polygons on a bottle of pepsi in the game would be nice but it isn't necessary.

Max Payne was nice and pretty. To me, though, it basically looked like the Q2 engine with hi-res textures, bumpmaps, that stupid Matrix slow-time effect and better scripting. If they had started from Q2, though, I bet the Goddamn framerate would've been better.

Railgun... Matrix slow mo... I wonder if there's going to be any other SHITTY action movie gimmicks being GROUND INTO THE FUCKING DIRT in video games soon. New super gun in a muscle bound freak movie? It'll be there. New camera trick used in Gap commercials? Sure. Why the hell not. Anything we can do to distract gamers so they don't notice our level design sucks and the genre hasn't had any new life breathed into it in five years. Shh.

Now that I think about it, when Frost threw that cute little Asian girl into traffic... you saw Blade's silver bullets coming out of his faggotated Uzi... so wasn't Matrix ripping off Blade? I can't remember which one came first, but Blade rocked and Matrix SUCKED DONKEY COCK so I hope Blade came first. ^.^

Jesus Christ I'm bored. I gotta stay the hell out of this forum before I start polluting it again.

GODDAMMIT, MATRIX FUCKING SUCKED!


Of course, 64-bit color doesn't really do anything for me. Probably because I don't know what the hell it does. I know that GL uses 32-bit color over 24 because the 32-bit color has the extra shit for alpha transparencies, and you can really tell the fucking difference (does nVidia have the WORST FUCKING DITHERING ALGORITHMS ON THE FUCKING PLANET OR IS IT JUST ME?). So what the hell does 64-bit color add, luminance and hue?


Apparently, the extra 32 bits are used to get smoother transitions in the lighting, because if you put a bunch of lights close to a wall, you wind up with banding that makes shit look like 256-color mode. Or something. I dunno; you should ask Zaldron.


Okay, first off, Renegade_Style:

Nice sig. ^.~

Second off, deadnail...gah, where to begin?

Yes, 64-bit color will drop-kick your framerate right where it hurts. A switch to a higher color depth will not only murder your effective fillrate (halving it compared to 32-bit color), but it will also murder your memory bandwidth, as that's now 8 bytes per pixel that you have to push around.

If you tack on a 32-bit Z buffer and 4X supersampling FSAA, you've got a 100MB framebuffer @ 1024x768. That means every single frame you're rendering is 100MB. Obviously, you can't do that without a 128MB graphics card (or more, obviously), and since most immediate-mode renderers (i.e. every card but a Kyro) require double-buffering, you're looking at a minimum video memory requirement of 200MB.

Remove the FSAA and you're still looking at a 25MB frame, which is much more manageable, but then, the same thing in 32-bit color (1024x768x32) is only a 12MB framebuffer. Which means roughly a 25MB framebuffer for double-buffering, which leaves you about 40MB for textures. Figure a 6:1 texture compression ratio (not unrealistic at all) and you've got 240MB of local video memory for textures. Nice, huh?

But I guess that whole little thing was kind of irrelevant.

Let me do a little science lesson here. A monitor signal (like a TV signal) is made up of red, green, and blue channels. To get gradients thereof, you merely change the mix of red, green, and blue (hereafter referred to as RGB). So, obviously, in an RGB signal, the more gradations of R, G, and B you have, the more precise your colors can be. So, good. We have that down.

The reason I had to explain that is to explain the "bit depths" of color. When someone refers to a certain color depth in bits, they're referring to the "bits per pixel" - the number of bits the video card has to process for each pixel on the screen. Monochrome is 1bpp, or one on-or-off signal per pixel on the screen. Obviously, current displays have more. Nowadays, you pretty much only see 8-bit, 16-bit, and 32-bit color.

8-bit color isn't used much anymore. There's a reason for that; it's ugly. 8-bit color is what Doom was originally in, mostly for purposes of available technology. 8-bit color is so called because there are eight bits per pixel on the screen. So you have up to 256 different colors, since for each pixel you can have any binary combination from 00000000 to 11111111. I don't remember which bits control which in eight-bit color; it's been too long since I took that class.

Anyway, 16-bit color was the de facto standard for quite a while, because it looks much better than eight-bit with only double the performance penalty. In 16-bit color, you have four channels: RGB and then a channel called alpha, which controls the transparency. While rendering in 16-bit color, you have 4444 bits, or 4 red, 4 green, 4 blue, and 4 alpha. So, you can have up to 16 gradations of each of R, G, and B and 16 degrees of transparency, giving you a total of 65,536 values. In a non-3D situation, 16-bit color is often expressed as 565, with an odd bit for the green channel. This allows for somewhat sharper colors in photographic work and the like, but is less useful in 3D games.

32-bit color has recently become a requirement for all 3D applications, basically because everyone realized that 16-bit color just looks like shit without some pretty advanced filtering. 32-bit color one-ups 16-bit color by supplying 8888, or eight bits per channel. You often hear about 24-bit color, which was used for a long time by graphics enthusiasts; 32-bit color is the 3D application of 24-bit color (which has no alpha channel). 32-bit color provides a mind-blowing 4294967296 colors.
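If it helps to see those layouts as actual bits, here's a rough C sketch of the packings described above (plain bit-fiddling, nothing specific to any real card or API; the channel order is just one common convention):

#include <stdint.h>
#include <stdio.h>

/* 16-bit RGBA4444: keep the top 4 bits of each 8-bit channel */
static uint16_t pack_rgba4444(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return (uint16_t)(((r >> 4) << 12) | ((g >> 4) << 8) |
                      ((b >> 4) << 4)  |  (a >> 4));
}

/* 16-bit RGB565: 5 bits red, 6 bits green, 5 bits blue, no alpha */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* 32-bit RGBA8888: a full byte per channel */
static uint32_t pack_rgba8888(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)r << 24) | ((uint32_t)g << 16) | ((uint32_t)b << 8) | a;
}

int main(void)
{
    /* the same orange, stored three different ways */
    printf("4444: %04x  565: %04x  8888: %08x\n",
           pack_rgba4444(255, 128, 0, 255),
           pack_rgb565(255, 128, 0),
           pack_rgba8888(255, 128, 0, 255));
    return 0;
}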

64-bit color I don't know much about yet. I don't know how they're going to do it; I'd assume 16:16:16:16, which would produce...a lot of colors.

Now, obviously, no one is going to want to manually select shades from a palette that large. So, 64-bit color is mainly for the purpose of giving more realistic lighting and shading... which is exactly what Carmack is looking for, and in my opinion, the primary goal in 3D rendering, right in front of increasing polycounts.

Anyway, as for the rest of what you said, deady...

AGP slots don't provide enough power. AGP Pro slots do. The matter's already been taken care of.

They don't do the whole "RAM slots on the card" thing simply because standard SDRAM or DDR SDRAM is much too slow for use in a graphics application. Most SDRAM or DDR SDRAM that you buy is 7 or 6ns. Most of the RAM on graphics cards nowadays is 5ns or less. (some of the faster NV20 and R200 cards have 3.8 and 3.5ns RAM!)

There ARE two games being made on the new Doom tech. Doom and Quake 4. So hush.

Max Payne does a lot of things that the Quake THREE engine isn't capable of. They'd end up re-writing half the engine to do it. And so, optimization would've lost time to conversion. So no, the framerate wouldn't have been better. Buy a new videocard.

I haven't seen Blade in a long time, but IIRC Blade's gun wasn't an Uzi or any permutation thereof. It was an Ingram M11.


No, no, NO

The key word here, people, is INFORMATION. When we add more bits to our displays it's NOT necessarily to add COLORS but more INFORMATION.

Agreed, jumping from 8-bit to 16-bit to 24-bit only gave us more colors, up to a point where, while it's not all the colors the human eye can perceive, it's the amount of colors we can discern. The human eye can't tell the difference between gray 128 and gray 129, so adding a couple more bits to each RGB channel would only end in:

1) SLOWER CALCULATIONS
2) MINIMAL QUALITY GAIN

So, what the heck did we do with 32-bit displays then?
24 bits ultimately ends in 16,777,216 colors, and while 32 bits sounds like 4,000+ million, it's actually 16,777,216 too.

Then what the fuck did we do with those extra 8 bits? Easy: we transformed them into a new type of information we call the ALPHA CHANNEL.

Every pixel in the frontbuffer surface (the piece of RAM that ends up being displayed on your screen) now stores not only the Red, Green, and Blue components, but also an Alpha value ranging from 0 to 255.

Now, this thing is useless unless you plan to use it in your engine. The alpha channel is a way to measure the transparency of THAT particular pixel. Imagine you're in the era BEFORE 32-bit displays, and you want to simulate alpha channels. What do you do?

You need 2 images: the real one with all the pretty colors, and another one in grayscale that shows the transparency of each pixel. Black being transparent, white being opaque, the rest of the shades interpolated between those values.

Now, in order to combine them you need to make a loop, a classic programming-language tool for performing the same algorithm an X number of times.

In this case, X will be as many pixels as the image has, so it's Height x Width. What do the calculations do then?

1) Start with the first pixel of the image you want alpha-blended.
2) Look at the pixel on the game screen that's being hidden by this little surface we're pasting over it.
3) Look at the first pixel of the "alpha" image.
4) With some relatively fast calculations, draw a pixel whose color matches the result of applying that transparency to the source pixel and mixing it with the destination's color.
5) Start again for the second pixel.

Coders, I know there's a better way to do this, but that's not the point of my rant ;)
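For the curious, a minimal C sketch of that loop, assuming 8 bits per channel and a plain RGB screen buffer (the function name and buffer layout are made up for illustration; a real blitter would clip, clamp, and use something faster than a divide per channel):

#include <stdint.h>
#include <stdio.h>

/* dst: screen, 3 bytes (RGB) per pixel, dst_pitch bytes per row
   src: sprite, 3 bytes (RGB) per pixel
   alpha: sprite's grayscale alpha image, 1 byte per pixel (0 = transparent, 255 = opaque) */
static void blit_alpha(uint8_t *dst, int dst_pitch,
                       const uint8_t *src, const uint8_t *alpha,
                       int w, int h, int dst_x, int dst_y)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            const uint8_t *s = src + (y * w + x) * 3;                     /* source pixel     */
            uint8_t a        = alpha[y * w + x];                          /* its transparency */
            uint8_t *d = dst + (dst_y + y) * dst_pitch + (dst_x + x) * 3; /* screen pixel     */
            for (int c = 0; c < 3; c++)                                   /* mix R, G, B      */
                d[c] = (uint8_t)((s[c] * a + d[c] * (255 - a)) / 255);
        }
    }
}

int main(void)
{
    uint8_t screen[4][4 * 3] = {{0}};          /* a tiny 4x4 black "screen"       */
    uint8_t sprite[2 * 2 * 3];                 /* a 2x2 white sprite              */
    uint8_t mask[2 * 2] = {255, 128, 128, 0};  /* opaque, half, half, transparent */
    for (int i = 0; i < 2 * 2 * 3; i++) sprite[i] = 255;
    blit_alpha(&screen[0][0], 4 * 3, sprite, mask, 2, 2, 1, 1);
    printf("%u\n", screen[1][3]);              /* 255: the opaque corner landed here */
    return 0;
}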

These calculations are fast for CPUs, but by no means a valid tool today. Imagine how slow this can get with large surfaces and true color instead of 8-bit...

But once you have the alpha channel supported in hardware (the 32-bit display), this is handled seamlessly by the GPU. It's no longer a time-consuming pain in the ass, but something so natural we've been using it for a fucking long time - practically since the day 3dfx started dying.

So that drives us again to all this 64 bit thing. Since we're not using it for colors, what kind of information are we storing in those extra 32 bits?

Whatever the coder wants. Carmack already has a few ideas on how to improve visual quality, and I'm pretty sure we won't see real-time lighting with radiosity until 64-bit is a fact.

You want some examples? 3DS MAX, for instance, has something called a "Virtual FrameBuffer". It's a 64-bit buffer, and virtual because it's not supported by hardware (at least not yet). MAX uses those extra 32 bits to store important information like Z-depth and the G-Buffer.

Z-depth stores how far that specific pixel is from the viewer, and could be used to create funky effects like Depth of Field or Motion Blur. They'll look a little crappy because we use one sample per frame, but that's another story.

The G-Buffer tells the engine which pixel belongs to which group of objects. This is an incredibly effective way to tell stuff apart from occluded stuff. No more hitscans and polygon collision when trying to guess what you are shooting at; now you only have to check the G-Buffer of the specific pixel at the center of your crosshair.
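As a toy illustration of that last point (every name and size here is invented, not taken from any real engine), the "what am I aiming at?" question becomes a single array lookup:

#include <stdint.h>
#include <stdio.h>

#define SCREEN_W 1024
#define SCREEN_H 768

/* one object ID per pixel, written by the renderer while drawing; 0 = background */
static uint16_t g_buffer[SCREEN_H][SCREEN_W];

static uint16_t object_under_crosshair(void)
{
    return g_buffer[SCREEN_H / 2][SCREEN_W / 2];  /* pixel at the center of the screen */
}

int main(void)
{
    g_buffer[SCREEN_H / 2][SCREEN_W / 2] = 42;    /* pretend object #42 covers that pixel */
    printf("aiming at object id %u\n", object_under_crosshair());
    return 0;
}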

Bear in mind these things are the key values for video post-production. Every non-renderable effect - Lens Flares, Glows, Cartoon Shaders, even applying Photoshop plugins to your scene - is done mainly with these values.

The hardware should work at 64 bits, but since those extra 32 bits aren't exactly information we can perceive directly with our eyes, it's not necessary to output them to your monitor. The hardware, then, will know a lot about the pixels, but the monitor, of course, will only receive the R, G, and B components. Not even 32 bits is visible on your screen; that's 24 bits for your screen, and 32 bits for internal calculations.

Ok, I ranted way too much. I hope things are clear now.


Max Payne does a lot of things that the Quake THREE engine isn't capable of. They'd end up re-writing half the engine to do it. And so, optimization would've lost time to conversion. So no, the framerate wouldn't have been better. Buy a new videocard.

Like what? I haven't seen anything particularly difficult to achieve, technologically speaking.


Zaldron - "real-time lighting with radiosity" - you can't be certain, maybe DOOM engine already supports this future.


Zaldron - "real-time lighting with radiosity" - you can't be certain, maybe DOOM engine already supports this future.

No way. You would need 10 GF3s working in parallel to achieve 30 fps. And that's only if Carmack finds a way to keep the light bouncing from breaking at the portals, so maybe not even with that many.


Yeah, Blade came before The Matrix. It rocks - one of the only superhero movies that doesn't stink.

DOOM will include multiplay. Check the VE interviews.

And the technology/game design aspect doesn't really make sense right now. Yeah, there are too many games with fucking bad game mechanics and awful plots. But last time I checked, the technology and the game content are made by different people. It's not like they're trading resources.


But last time I checked, the technology and the game content are made by different people. It's not like they're trading resources.


What game? You mean DOOM 3, right? And who is making that?


I mean in general. Carmack's the only one who focuses on technology, while the rest of id makes the game.


Bah. You had to go and get all technical on me and stuff.

The way you say it, you're not looking at 64-bit color depth, but merely 64-bit depth per pixel. Which, I suppose, is more useful, like you said, but we're already doing that. Just, not in hardware. (32-bit Z + 32-bit color)

And hey, if you take every color in 24-bits and then display it in every shade possible in 8-bit alpha, you're going to get 4 billion "colors" (not different shades, but they are different values). So, it's semantics. Granted, there are only 16777216 colors, but there are 256 shades of opacity for each color. Bah.


Bah. You had to go and get all technical on me and stuff.

That's my only weapon.

The way you say it, you're not looking at 64-bit color depth, but merely 64-bit depth per pixel. Which, I suppose, is more useful, like you said, but we're already doing that. Just, not in hardware. (32-bit Z + 32-bit color)

Well, that's what Carmack wants, after all. Yes, we're not doing it in hardware, and that pretty much kills all real-time possibilities.

And hey, if you take every color in 24-bits and then display it in every shade possible in 8-bit alpha, you're going to get 4 billion "colors" (not different shades, but they are different values).

Sure, and that's why I said "information" instead of "color". Add the extra 32 bits Carmack wants and we can have a colossal amount of possibilities, but there are only 16,777,216 variations when it reaches our monitor.


Kat, nice long rant, but I DID say the difference between 24- and 32-bit color was that 32 had the alpha channel... only I said it like that and not in six paragraphs. =Þ

AGP slots don't provide enough power. AGP Pro slots do. The matter's already been taken care of.

I've heard of them, and I'm still waiting for them to become somewhat popular. A different kind of slot for people willing to lose 2 or 3 fucking nanoseconds wouldn't be so bad, I'd think, but then again, I guess the whole concept behind AGP is to please people who jack off to framerates.

There ARE two games being made on the new Doom tech. Doom and Quake 4. So hush.

You missed my point, wench. =)

My point is that each company is now only making one game with each new level of tech that's being shovelled out. The more time they spend learning and implementing pretty features, the less time they spend making game content.

Raven spent a while with the Doom engine and the Quake 2 engine... now they're only making one Quake 3 engined game. I hope to Christ people stick around with the New Doom engine for at least TWO FUCKING GAMES.

Max Payne does a lot of things that the Quake THREE engine isn't capable of. They'd end up re-writing half the engine to do it. And so, optimization would've lost time to conversion. So no, the framerate wouldn't have been better. Buy a new videocard.

Every six months? Um, fuck you. =)

I haven't seen Blade in a long time, but IIRC Blade's gun wasn't an Uzi or any permutation thereof. It was an Ingram M11.

With a giant dongle on the front. The back half looks like a Mac10 or Uzi or one of those ghetto fabulous weapons.

Zaldron's insane rant

Aha, so that's the benefit of 64 bit color. Programmability or some shit.

Max Payne does a lot of things that the Quake THREE engine isn't capable of. They'd end up re-writing half the engine to do it. And so, optimization would've lost time to conversion. So no, the framerate wouldn't have been better. Buy a new videocard.

Like what? I haven't seen anything particularly difficult to achieve, technologically speaking.

I sure as fuck don't either. It's Quake 2 with hi-res textures and a pretty particle engine. The model physics are pretty nice, though - but why the fuck did the game take so many years to finish, only for me to beat it in 5 hours and never feel the need to play it again?


Raven spent a while with the Doom engine and the Quake 2 engine... now they're only making one Quake 3 engined game. I hope to Christ people stick around with the New Doom engine for at least TWO FUCKING GAMES.

Do not despair: Raven's making both SoF II and Jedi Knight II with the Q3 engine, and they already tested the engine with Elite Force. They haven't even started Q4 yet, except for some gameplay concepts and the plot.


Blade

The only movie almost as bad as the Matrix.

I heard like a year ago they were working on a sequel that was supposed to be as "revolutionary" as The Matrix was when it was released.

Note: I failed to see anything revolutionary in The Matrix except for an unprecedented drop in standards. It's as if NIN invaded the movie industry.


It's a matter of alter egos. The plot in both movies isn't really satisfactory, but the whole thing of being able to see a character you would love to be or identify with is awesome.

If there was a movie with Hexen's Warrior... *sigh*


He wants 64-bit COLOUR. That's 16-bit floating point per component (RGBA): 16:16:16:16. On top of that you want a 32-bit Z as well.

Now that will take up 1024*768*(64-bit + 32-bit)*5 (for FSAA: one back buffer four times bigger for 4x FSAA, plus the front buffer).

That is 45MB exactly for 1024*768 res + 64-bit colour + 32-bit Z + 4x FSAA, including back and front buffers.
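If anyone wants to double-check that figure, here is the same arithmetic as a few lines of C (same assumptions as above: one 4x-supersampled back buffer plus a front buffer, each pixel carrying 64-bit colour and 32-bit Z):

#include <stdio.h>

int main(void)
{
    long pixels    = 1024L * 768L;
    long bytes_px  = 8 + 4;                         /* 64-bit colour + 32-bit Z          */
    long back_buf  = pixels * 4 * bytes_px;         /* one buffer, 4x bigger for 4x FSAA */
    long front_buf = pixels * bytes_px;             /* plus the front buffer             */
    long total     = back_buf + front_buf;

    printf("%.1f MB\n", total / (1024.0 * 1024.0)); /* prints 45.0 */
    return 0;
}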

Why 64-bit colour? I don't know, I'm not a 3D programmer. All I know is that it's simply about precision. You do maths in binary, you lose precision (i.e. you get error). That's because binary represents discrete numbers; you can't represent every number in the ranges you're representing.

Lots of texture passes = lots of maths = lots of error. More precision = you can handle more error before it looks like shit (i.e. "back in 256-colour land", as Carmack said). It's also something to do with representing the range of light the human eye can see: currently we get 0-255 or something, but we can actually perceive light intensities that differ by a factor of about 10,000 or something. Don't know about that last bit.
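A crude way to see the precision point (this toy uses 16-bit integers rather than the 16-bit floats being proposed, purely to show how rounding error piles up pass after pass; every number here is made up):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double exact = 0.87;                    /* ideal brightness, kept at full precision */
    int q8  = (int)(0.87 * 255.0 + 0.5);    /* same value stored in 8 bits per channel  */
    int q16 = (int)(0.87 * 65535.0 + 0.5);  /* same value stored in 16 bits per channel */
    double scale = 0.93;                    /* some per-pass light/texture modulation   */

    for (int pass = 1; pass <= 8; pass++) {
        exact *= scale;
        q8  = (int)(q8  * scale + 0.5);     /* re-round to 8 bits after every pass      */
        q16 = (int)(q16 * scale + 0.5);     /* re-round to 16 bits after every pass     */
        printf("pass %d: 8-bit error %.5f, 16-bit error %.7f\n", pass,
               fabs(q8 / 255.0 - exact), fabs(q16 / 65535.0 - exact));
    }
    return 0;
}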

Just go read Carmack's .plan update: http://www.gamefinger.com/plan.asp?userid=johnc&id=14308

Also read this post by the member "SA" over at Beyond3D. You'll need to scroll down a bit to see it:
http://bbs.pcstats.com/beyond3d/messageview.cfm?catid=3&threadid=1554

This topic is now closed to further replies.