Reaper978

The Computer Ceiling


So, do you think there is, or ever will be, a limit to computer hardware advancements? Looking at the graphics of today's games, it's difficult for me to imagine how they could be improved. No doubt they will find new ways of advancing graphics still more, but is there really much further to go beyond Crysis and Crysis 2? Will hardware developers simply find new ways of charging people for more advanced hardware? Is this type of marketing ever going to change?

Crysis 2 looks pretty fucking amazing, by the way.


Of course. There's a shitload of things you can do to improve it. Like leaving the polygon-leaf method behind and using volumes, similar to voxels, to create assets that actually have a body and aren't just hollow shells.
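As a minimal sketch of the difference, here's a Python/NumPy toy (grid resolution and sphere radius are arbitrary): the voxel volume knows its interior, while a polygon mesh only ever stores the boundary shell.

```python
# Voxelize a solid sphere into a boolean occupancy grid and compare the
# full volume against its surface shell. Assumes only NumPy.
import numpy as np

n = 64                                        # grid resolution (arbitrary)
ax = np.arange(n) - n / 2
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
solid = x**2 + y**2 + z**2 <= (n / 3) ** 2    # True for every voxel inside the body

# A voxel is interior if all six axis neighbours are also occupied;
# the shell is roughly what a hollow polygon mesh would store.
interior = solid.copy()
for axis in (0, 1, 2):
    interior &= np.roll(solid, 1, axis) & np.roll(solid, -1, axis)
shell = solid & ~interior

print("voxels with a body:", int(solid.sum()))
print("shell-only voxels: ", int(shell.sum()))
```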


Most of today's gaming computing power goes into faking a lot of stuff. As time goes on, less will have to be faked and more simply defined: define the properties of this plant, or this brick, or whatever, and you get real-time calculated results without needing to rely on polygons with a pre-made bitmap stretched over them.
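As a toy illustration of "define the properties and calculate the result", here's a hypothetical procedural brick pattern in Python; every parameter is made up, and a real engine would evaluate something like this per pixel instead of sampling a pre-painted bitmap:

```python
# Colour computed from material properties at runtime, rather than
# looked up in a hand-painted texture. All parameters are invented.
def brick_color(u: float, v: float, mortar: float = 0.1,
                brick=(0.55, 0.20, 0.15), grout=(0.70, 0.70, 0.70)):
    """Return an RGB tuple for texture coordinates (u, v) in [0, 1)."""
    row = int(v * 8)                          # 8 rows of bricks
    offset = 0.5 if row % 2 else 0.0          # stagger alternate rows
    in_mortar = (u * 4 + offset) % 1.0 < mortar or (v * 8) % 1.0 < mortar
    return grout if in_mortar else brick

print(brick_color(0.30, 0.42))                # a point inside a brick
```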


There are also atomic limitations on the size of silicon chips.

So eventually, yes: we will be unable to make smaller processors using 'current' technology, because the features will get so small that quantum effects (tunneling, positional uncertainty) make the system unstable.
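To put a rough number on that, here's a back-of-envelope sketch in Python. The silicon lattice constant (~0.543 nm) is a real figure; treating a process node's marketing name as a literal feature width is a simplification, but it shows how few atoms are left to work with:

```python
# How many silicon unit cells fit across a feature at various process
# nodes. Node names are marketing labels, not exact gate lengths.
SI_LATTICE_NM = 0.543   # silicon lattice constant, ~0.543 nm

for node_nm in (90, 45, 22, 10, 5, 2):
    cells = node_nm / SI_LATTICE_NM
    print(f"a {node_nm:>2} nm feature spans only ~{cells:4.1f} unit cells")
```

At a handful of unit cells, electrons tunnel straight through barriers that are supposed to stop them.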


There always are, and will be, limits dictated by the current state of the art in electronics and, increasingly, by the structure of matter and physics itself. For example, the processing power a single core can deliver has already hit a brick wall; that's why they are going multi-core, but that too has logical limits, plus constraints on the types of problems that can benefit from it.

So, for now, the speed at which we can solve inherently serial problems is limited, and quite severely, I must say. But it's a good day and age for embarrassingly parallel problems... at those you can throw thousands of cores, and graphics evolve so much exactly because they ARE, for the most part, an embarrassingly parallel problem.
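What this amounts to is Amdahl's law: if a fraction p of the work parallelizes and (1 - p) is inherently serial, n cores give a speedup of 1 / ((1 - p) + p / n). A few illustrative numbers in Python (the p values are arbitrary) show how brutally even a small serial fraction caps the gain:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel. Illustrative values, not measurements.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.95, 0.999):        # parallel fraction of the workload
    row = ", ".join(f"{n:>4} cores: {speedup(p, n):6.1f}x" for n in (4, 64, 1024))
    print(f"p = {p:5.3f} -> {row}")
```

With half the work serial, 1024 cores barely manage 2x; with 0.1% serial (graphics territory), they get you about 500x.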


Computers just simulate reality. If you want it to be completely realistic, then you need a 1-to-1 correspondence with whatever the universe uses as bits (if anything), all the way down to the quark level. Maybe a lot of the universe's "information" in a real apple is redundant; maybe you can get a seemingly realistic apple by simulating each atom in a simplified form without worrying about smaller scales, but it still wouldn't be a "real" apple because it wouldn't have quarks or whatever (I only vaguely know physics, so maybe "quarks" is wrong). To really simulate an apple, you'd have to simulate all the little genes and stuff in the DNA of each cell, etc. Like, to really simulate it, you'd need the apple seed to grow into a new apple.

Is it possible for reality to contain another reality? Doubt it; how are you gonna fit reality inside reality when the real reality is hogging all the information/space? Shit, I don't know. Maybe a simulated reality could dwarf reality; maybe the universe has a certain amount of information and a "computer" inside that universe could contain more? Stupid idea, because the computer is IN the universe, so it IS part of the universe, so it's all the same universe.

Well, maybe you can't completely simulate the ENTIRE universe, but what about a tiny portion: can a computer ever completely simulate a single apple? Even if it contains all the DATA (each molecule, each cell, each collision, etc.), where's it gonna represent that data? Right now you can draw an apple on 2D monitor pixels, but you can't have the computer output "apple essence units"/apple molecules or whatever right into the air in 3D. Maybe 3D printers could evolve into something similar, but then you're dealing with "stuff" in reality. Animation is based on the illusion/trick of erasing an item and redrawing it in a new position; how are you gonna erase a real molecular apple and redraw it in the next frame? And it'd just be a boring everyday apple; you couldn't do weird stuff like have a program make it gradually grow taller. OK, I'm done with my armchair pseudointellectual BS post now.
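For what it's worth, the apple idea can be put into rough numbers. Everything below is an order-of-magnitude guess (apple mass, bits per atom); only the magnitudes matter:

```python
# How much storage a one-snapshot, atom-level apple would need.
# The mass and bits-per-atom figures are pure assumptions.
AVOGADRO = 6.022e23
apple_g = 100.0                 # assume a ~100 g apple, mostly water
molar_mass_h2o = 18.0           # g/mol
atoms = apple_g / molar_mass_h2o * AVOGADRO * 3   # 3 atoms per H2O molecule
bits_per_atom = 100             # arbitrary guess at per-atom state
terabytes = atoms * bits_per_atom / 8 / 1e12
print(f"~{atoms:.0e} atoms, ~{terabytes:.0e} TB per static snapshot")
```

That's around 10^14 terabytes before a single frame of "apple physics" is even computed.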


One day, perhaps, computers will have so much power that they can figure out problems like this for themselves, leaving humans to worry about things like grooming and deliberating over whether it's too rainy to take the dog for a walk.

eargosedown said:

There are also atomic limitations on the size of silicon chips.

I predict computers in the future may weigh no more than 1.5 ounces.

Mark my words, people.

Technician said:

I predict computers in the future may weigh no more than 1.5 ounces.

Mark my words, people.

My Raspberry Pi (revision B) weighs 1.6 ounces when stripped naked, so I'd say the future is almost upon us. ;)


Bucket said:
I thought that someone turned their ceiling into a computer.

I thought it was going to be a discussion of the CONS flats :-P

Bucket said:

I thought that someone turned their ceiling into a computer.
:(

I thought this was about some of those textures/flats from the DOOM IWAD... you know, the ones that are either completely blue, completely gray, or a mix of both?

Anyways, yeah, I think that 3D graphics for flat screens have finally reached the peak of their evolution. The next step has to be some kind of external holodeck a la Star Trek.

Reaper978 said:

So, do you think there is, or ever will be, a limit to computer hardware advancements? Looking at the graphics of today's games, it's difficult for me to imagine how they could be improved. No doubt they will find new ways of advancing graphics still more, but is there really much further to go beyond Crysis and Crysis 2? Will hardware developers simply find new ways of charging people for more advanced hardware? Is this type of marketing ever going to change?

Crysis 2 looks pretty fucking amazing, by the way.


You're overestimating the visual fidelity of Crysis 2. I have yet to see a CGI movie that could fool me for long, and Crysis 2 is rendered not by a huge render farm but by a single video card. Video game graphics have a long way to go, and that's before even worrying about simulating an interactive environment.

Maes makes good points here, and he hasn't even discussed how software improvements can squeeze nicer graphics out of older hardware. Even if no new hardware improvements happen, graphics will still improve for a while.


Gaming is not what drives hardware advancement. The two main driving forces are high-throughput (i.e. scientific) computing and ultra-low-power computing. These two problems are unbounded, so advancement will not stop.

exp(x) said:

Gaming is not what drives hardware advancement. The two main driving forces are high-throughput (i.e. scientific) computing and ultra-low-power computing. These two problems are unbounded, so advancement will not stop.


Pffft, gaming drove the advancement of graphics hardware, both in specs and in driving prices down. Graphics cards just happened to turn out to be very nice parallel processors.


Computer graphics are really a subset of "scientific" computing: massively parallelizable number-crunching problems with little or no data dependency. In a way, they are THE ideal "show off" application for multicore and similar technologies.
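A minimal Python sketch of why that's true: each pixel below is shaded from its own inputs alone (the shade function is a toy stand-in for real per-pixel math), so the frame splits across any number of workers with zero coordination:

```python
# Embarrassingly parallel rendering in miniature: no pixel ever reads
# another pixel's result, so a process pool needs no synchronization.
from multiprocessing import Pool

WIDTH, HEIGHT = 320, 200        # arbitrary frame size

def shade(idx: int) -> float:
    """Toy stand-in for a per-pixel shading computation."""
    x, y = idx % WIDTH, idx // WIDTH
    return ((x * y) % 255) / 255.0

if __name__ == "__main__":
    with Pool() as pool:
        frame = pool.map(shade, range(WIDTH * HEIGHT))
    print(len(frame), "pixels shaded independently")
```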

In any case, we've had our fair share of "wow, aren't computers totally awesome?" threads recently, even if indirectly:

http://www.doomworld.com/vb/everything-else/62982-loss-of-things-over-time-and-where-did-the-future-go/

http://www.doomworld.com/vb/everything-else/56946-are-there-limitations-to-computer-generated-imagery/

There were also several "infinite detail" threads, but I won't link to those because they turned into massive, flaming shit-flinging fests.

exp(x) said:

Gaming is not what drives hardware advancement.

AFAIK, video cards for "serious" applications like Quadro and FireGL have always been based off whatever flagship gaming GPU was around at the time, not the other way round. So there's that.

exp(x) said:

Gaming is not what drives hardware advancement.


A statement that might have been true up until the mid-90s, but not afterwards.

Before that, "gaming" needed very specific stuff (e.g. 2D hardware scrolling capabilities, hardware bitblt, hardware sprites, etc.) which were seen as a particular/custom niche of hardware development, not really an "advancement" per se. To make a "next gen" console, you just stuffed it with enough custom proprietary ASICs, and that was it. No need for industry standards, open APIs, or even general purpose processing: you could have a shitty general-purpose CPU driving super-specialized hardware (practically all arcade machines of the 80s and early 90s were exactly that, and home computers like the Amiga thrived on that concept).

But with Pee-Cees lacking any sort of specialized hardware, it was all about raw CPU power, usually far in excess of what would be "enough" with other platforms for the same type of game genres.

A 68000-based Amiga 500 had smooth scrolling by default; on Pee-Cees you REALLY needed at least a 386 or even a 486 to achieve that (a CPU power difference of anywhere from 16:1 up to 50:1). This in turn gave PCs an advantage in 3D titles, which DID NOT benefit from the Amiga's custom hardware and... well, that's why Doom appeared on PCs and not on Amigas ;-)

That "power inflation" drove features and consumer demand sky high. That's why 68000-based Amigas ended up being sold alongside 64-bit Pentium I, which quite literally were 100 times more powerful in raw MIPS. It was only a matter of time until their demise.

We all know the rest: the PC, with its general-purpose architecture and no specializations, thrived on raw CPU power (the only exception being the introduction of GPUs and related APIs, which only became "general purpose" in the late 2000s!). If the industry had gone the Amiga way instead, there would still have been developments in graphics chips (3D accelerators would have appeared anyway), but with an over-reliance on ASICs and DSPs rather than on super-complex general-purpose CPUs.

Check out the platforms that relied too much on that: Jaguar, Dreamcast, N64, PS3 (Cell processor)... yup, not really a winning strategy.

Reaper978 said:

So, do you think there is, or ever will be, a limit to computer hardware advancements? Looking at the graphics of today's games, it's difficult for me to imagine how they could be improved. No doubt they will find new ways of advancing graphics still more, but is there really much further to go beyond Crysis and Crysis 2? Will hardware developers simply find new ways of charging people for more advanced hardware? Is this type of marketing ever going to change?

Crysis 2 looks pretty fucking amazing, by the way.

Back in the day, people said similar things about Doom.

boris said:

Back in the day, people said similar things about Doom.


I doubt that, since there were already more advanced examples of 3D graphics: CAD/prerendered work, movie/TV CGI, or real-time rendering on super-expensive dedicated hardware, e.g. in arcades or early VR simulators. Perhaps not on Joe Average's humble pee-cee, but they existed nonetheless. And everybody had seen T2 by the time Doom came out ;-)

Doom was innovative in that it delivered, for the first time, a convincing combination of a real-time texture-mapped environment with lighting, fast action, good controls, and awesome gameplay, all playable on an entry-level PC of its time. Remove any of these elements and you'll see why previous/similar attempts (e.g. various flight sims, Spectre-like games, 3D mazes, demoscene creations, Ultima Underworld, or even Wolf 3D) were not as memorable ;-)

Now that I think about it, I can't remember ANY polygon-based game with controls as fluid or gameplay as captivating as Doom's. How would a polygon-based Doom clone have been received in, say, 1989 or 1990, assuming levels just as architecturally complex and gameplay unaltered?


The fast rendering speed is why Doom was so impressive technically. Feature-wise, the Ultima Underworld engine was a lot more powerful: texture-mapped everything? Check. Dynamically changing light levels? Check. Dynamic changes to level geometry? Check. Slopes? Check. Walking, running, jumping, swimming, flying? Check. 3D objects with full collision data, such as tables or the UW2 blackrock gem? Check. Physics engine? Check.

But the flip side is that all this was rendered in a tiny window, with the HUD taking up most of the screen space.

I'm not talking about gameplay here because it's a completely different beast: a CRPG focused more on exploration, conversations, and questing than on running around and shooting things. Controls were far clunkier than in Doom, but they could afford to be, because the game was all kinds of awesome nonetheless.

Anyway, the point is, Ultima Underworld II was a superb game, still one of the best CRPGs ever created, and it remains very memorable. Heck, one of id Software's adopted sisters is Arkane Studios, also known as "we created our company to make Ultima Underworld III, but we couldn't get the rights to it, so instead here's Arx Fatalis."

But nobody playing it would have thought of using the engine to make a twitchy run-and-gun with blazing action.

Gez said:

Feature-wise, the Ultima Underworld engine was a lot more powerful


Which is why they fell into the feature trap: certainly, someone could copy super-complex code out of a VR sim of the time and apply it verbatim to a PC port, physics, full texture mapping and all. The question is: would it run acceptably? Short answer: no. Never release a game BEFORE the ideal hardware for it comes along. Even some later Doom clones fell into this trap and ended up unplayable on rigs where Doom was king (DN3D and Descent come to mind; Quake was in an entirely different league, perhaps picking up where Ultima left off).

Gez said:

But nobody playing it would have thought of using the engine to make a twitchy run-and-gun with blazing action.


Another point against it: being a CRPG, it appealed only to a niche segment of gaming, and even RPG fans were bitterly divided over it. In any case, it was not a game for everyone, whereas Doom certainly was a game "for most".

