
Cloud-based gaming


Honestly, this sounds like a pump'n'dump scam (remember the Phantom?). Given the state of US broadband it's only a question of when this will fail, not if. The number of challenges for this is staggering, and the only thing they've talked about is some magical video compression algorithm without discussing the real killer, which is the lag between the server and client. Unless they break that pesky speed of light constant this will never work for twitch games.


Heh, I wasn't really aware of that, but reading about it, it indeed seems very similar. Even the names of the services are spookily apt: phantoms are insubstantial and clouds dissipate. The Roberto Alfonso posting in the comments doesn't bode well, either.

On the other hand, it seems that Steve Perlman, the CEO of the company, has managed to pull off some successful ventures, like QuickTime and WebTV, so there might be something to it.


Penny Arcade's newspost today notes that there are other games besides twitch games that this model could theoretically work for.

Yeah I read PA. What of it?

TheDarkArchon said:

Essentially, the service is streaming video over the network (5MB/s at full pelt), which is a lot more intensive than online gaming.


This could only ever work over a circuit-switched network (imagine a dedicated POTS-like 5 MB/sec service).


Perhaps the actual data rate will be much less than that: e.g. if they are capable of performing real-time DivX SDTV-resolution compression (with audio), that would work with under 200 KB/sec or 1.5 Mbit/sec... hmm, that's pretty much the speed of a T1.
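
A quick back-of-the-envelope check of those numbers, just as a sanity test (the bitrates here are illustrative assumptions, not anything the company has published):

[code]
# Rough bandwidth sanity check for the figures above.
# All bitrates are illustrative assumptions, not published specs.

T1_MBIT = 1.544           # a T1 line, in Mbit/s

def kbytes_to_mbit(kbytes_per_sec):
    """Convert KB/s (decimal) to Mbit/s."""
    return kbytes_per_sec * 1000 * 8 / 1_000_000

sdtv_stream = kbytes_to_mbit(200)    # assumed ~200 KB/s DivX SDTV stream
full_pelt   = kbytes_to_mbit(5000)   # the quoted 5 MB/s "full pelt" figure

print(f"SDTV stream:  {sdtv_stream:.2f} Mbit/s (T1 = {T1_MBIT} Mbit/s)")
print(f"5 MB/s claim: {full_pelt:.1f} Mbit/s")
# SDTV stream:  1.60 Mbit/s (T1 = 1.544 Mbit/s)
# 5 MB/s claim: 40.0 Mbit/s
[/code]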

However, the slightest packet delay or loss will scramble the following frames, and there's the issue of routing player control information back before the next frame arrives :-/

Hell, there are even issues with TFT/TN monitors delaying their output by a few frames and making FPS games unplayable, and that's a LOCAL device. Imagine how shitty things can get with a network bottleneck and the associated buffering, packet drops, etc.
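
To put the lag argument in rough numbers, here's a sketch of a motion-to-photon budget for a remotely rendered frame; every delay figure is a guess for illustration, not a measurement of any actual service:

[code]
# Rough round-trip ("motion-to-photon") budget for a remotely rendered frame.
# Every figure below is an illustrative guess, not a measured value.

budget_ms = {
    "read controller input, send upstream":  5,
    "network uplink to server":             20,
    "server game tick + render":            17,   # roughly one 60 Hz frame
    "video compression":                    10,
    "network downlink to client":           20,
    "decode + display (incl. TFT lag)":     25,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:40s} {ms:3d} ms")
print(f"{'TOTAL':40s} {total:3d} ms")
# TOTAL comes to ~97 ms -- several frames behind a local machine,
# before any jitter, packet loss or buffering is even considered.
[/code]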


Actually, I don't think the gaming console will die, but there will be new consoles that will work with this technology. There was a console version of something similar to this a few years ago, but it must've been too far ahead of its time, plus the stutter was a bit much at times.

I think this'll be great, and the console market will benefit as well.

Georgef551 said:

Actually, I don't think the gaming console will die, but there will be new [s]consoles[/s] dumb a/v terminals that will work with this technology


Corrected that for you.

Georgef551 said:

I think this'll be great, and the console market will benefit as well.


No doubt about that, as soon as people start reverting to "non-cloud" consoles after they experience first-hand what a cartload of doody bullfuck disappointment "cloud gaming" resolves to.

Maes said:

stuff about connection speed

Yes, this will fail because of two big factors:

1) Very few people in the US have access to connections that can sustain those numbers in the real world
2) People will very quickly hit the download caps imposed by their ISP

ISPs absolutely loathe sustained transfers. Comcast did a lot of work just to cut down on BitTorrent transfers, which would pale in comparison to the bandwidth required by this.


BitTorrent doesn't add anything for the ISPs other than possibly some users; well, and legal pressure. You can reduce the load on the servers by charging more for the service, and the effects of long distance (lag) can be curtailed by routing users to nearby servers.

Consoles and purchased games would not die, because this system would not be workable everywhere, but maybe it could work around urbanized areas.

myk said:

Consoles and purchased games would not die, because this system would not be workable everywhere, but maybe it could work around urbanized areas.


The only way this could work with existing infrastructure would be on a large-scale LAN, not even a proper WAN: local enough to have negligible lag (under 20 ms under ALL circumstances), and large enough to call it somewhat "decentralized".

If, on the other hand, all future homes in (rich) countries are built with a two-way optic fiber which is dedicated to this sort of service and attached to a specialized gaming service center (and kept separate from general-purpose internet), then yeah, maybe it could work with acceptable performance.

Emphasis on it being dedicated and separate from general purpose internet, again.

Don't forget that HDTV graphics will be all the rage, and delivering compressed video and audio can't possibly compete visually with what even a PS2 is capable of.

The idea is interesting, but it's also proof of how perversely one can use digital technology: substituting powerful "local" hardware with remotely controlled hardware... well, d'oh. This thing could also be done with analog video and a modem, provided you dedicate an exclusive channel to it. Implementing it over packet-switched, shared-use networks with no QoS guarantee whatsoever is just a big WTF.

david_a said:

Honestly, this sounds like a pump'n'dump scam (remember the Phantom?). Given the state of US broadband it's only a question of when this will fail, not if. The number of challenges for this is staggering, and the only thing they've talked about is some magical video compression algorithm without discussing the real killer, which is the lag between the server and client. Unless they break that pesky speed of light constant this will never work for twitch games.

Here's the thing. People have gotten hands-on access with this thing. There actually is a product here, and the technology definitely works, at least on a small scale. In addition, a few major game companies have signed on. I don't think the Phantom had that.

The real question is, okay, we know the technology works on a small scale. We know that they've figured out a way to stream one video of one game to one computer, and do it in such a way that the game is actually playable. But can they scale that up? Can they handle having thousands of players connecting at the same time?


The servers will need to be powerful brutes if - on top of the games they're serving - they're running a virtual machine for each logged-in user that emulates the hardware required for the game in play.
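
For a sense of scale, here's a back-of-the-envelope sketch of what even a modest concurrent user base would demand from such a data centre; the per-user figures are invented purely for illustration:

[code]
# Back-of-the-envelope data-centre sizing for a streamed-gaming service.
# All per-user figures are invented for illustration only.

concurrent_users     = 10_000
stream_mbit_per_user = 5        # assumed bitrate of one HD-ish stream
users_per_server     = 4        # assumed game instances one beefy box can render

aggregate_gbit = concurrent_users * stream_mbit_per_user / 1000
servers_needed = concurrent_users / users_per_server

print(f"Sustained outbound bandwidth: {aggregate_gbit:.0f} Gbit/s")
print(f"Game/render servers needed:   {servers_needed:,.0f}")
# Sustained outbound bandwidth: 50 Gbit/s
# Game/render servers needed:   2,500
[/code]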


I would agree; it seems completely unrealistic to actually stream graphically intensive games with virtually no lag to millions of users. There is the argument that it would work fine for strategy games or turn-based games, but they're touting its ability to run PS3/Xbox360/PC games with no lag.

Unless, as stated, there was a direct fiber-optic cable going to each house dedicated solely to this service, there's no way they could achieve playable framerates. And I doubt they'd want to pay to install these new cables into every house that would want it. And the fact that they would probably need an exclusion-zone-sized set of the manliest servers ever built to power this infrastructure makes it seem like a pipe dream.

Honestly, after first hearing about it I immediately thought of the Phantom. It seems like vaporware, existing to dump any excess profits into a single field of research that can make one server power a TV and play a game about 15 feet away. The rest of the profits obviously go into the wallets of the executives, who you can see with their hands in their pockets, whistling quietly as they stroll away.

Steve Perlman, the brains behind QuickTime, the most godawful video player short of RealPlayer, and WebTV, which, if you've ever used it, requires no more explanation. Well, still, wow, 640x480 resolution on a 56k modem on my TV, with no mouse support and no hard drive, and I have to navigate with the tab button, fantastic! The internet without being able to download things, how can I lose? He's the king of either failed software or ideas that implode on themselves after a year or two because they're completely unfeasible, and yet he walks away from the exploded building, dusts the drywall off his suit, and goes off to find more investors.

Jello said:

There is the argument that it would work fine for strategy games, or turned based games, but they're touting it's ability to run PS3/Xbox360/PC games with no lag.


Perversely, I know of a strategy game (the Warlords Battlecry series) where sending player input over the network, instead of unit fighting data, would result in a significant data reduction.
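
Roughly what that trade-off looks like in bytes per tick; the unit counts and record sizes below are made up just to show the order of magnitude:

[code]
# Per-tick network payload: syncing full unit state vs. sending player input only.
# Unit counts and record sizes are made up purely for the comparison.

units            = 500   # units on the battlefield
state_bytes_each = 24    # position, health, current order, target, etc.
inputs_per_tick  = 3     # orders a player actually issues in one tick
input_bytes_each = 8     # unit id + command + target coordinates

state_sync = units * state_bytes_each
input_sync = inputs_per_tick * input_bytes_each

print(f"Full state sync: {state_sync} bytes/tick")
print(f"Input only:      {input_sync} bytes/tick  (~{state_sync // input_sync}x smaller)")
# Full state sync: 12000 bytes/tick
# Input only:      24 bytes/tick  (~500x smaller)
[/code]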


I don't expect this to be successful any time soon thanks to the overabundance of shitty internet connections, but once the bandwidth problem gets solved it might be interesting, provided that it doesn't devolve into a copyright mess with regional lock-outs and other nonsense like every other downloadable-content service so far (which I fully expect it will).

Graf Zahl said:

I don't expect this to be successful any time soon thanks to the overabundance of shitty internet connections


That's the self-imposed limit that everybody mysteriously associates with this service: it doesn't have to (and it better not) be operated over general-purpose internet connections, which have no minimum performance or availability standards. This is a service more suitable to be run over fixed-speed, point-to-point dedicated data lines (which however would NOT be usable for general purpose internet, at least not on the same channel/carrier as the gaming service).

Under that aspect, this would work much like a cable TV on demand (but it would require more infrastructure compared to cable TV, which is not point-to-point, and would be different from a cable modem, which relies on bandwidth sharing).

OK, proposing dedicated, switched lines is not a very 21st century "net" approach, but it's the only practical way such a service could work, at least on a local scale. E.g. in large arcades.

Jello said:

Steve Perlman, the brains behind QuickTime, the most godawful video player short of RealPlayer

RealPlayer is the only player on my computer that can play HD-Ready-resolution videos that VLC chokes on, so it can't be that bad.

As for cloud gaming, I dislike the very idea behind it. The dumb-terminal/mainframe model that Maes mentioned is not an unlikely end result, which would only mean removing the user's control over the process. I'm sure quite a few governments would enjoy the possibilities it gives them (Germany's game censorship laws come to mind).

geekmarine said:

Here's the thing. People have gotten hands-on access with this thing. There actually is a product here, and the technology definitely works, at least on a small scale. In addition, a few major game companies have signed on. I don't think the Phantom had that.

That may be the case, but responses from journalists have been pretty "meh", which doesn't bode very well. From the few articles I've read, there have been reports of input lag and artifacting from the compression technology, and this is in a highly controlled, "safe" environment. The experience is only going to get worse as the technology is tested in the real world.

Honestly, for a service claiming to bring Crysis to desktops with integrated graphics, it's really just replacing one requirement with another: in this case, a high-end PC with high-end internet. The minimum requirement for any game running at 480p is 1 Mbit/s, and 5 Mbit/s for 720p. This is on top of requiring subscription fees to access the service and a fast, steady internet connection, which not everyone can afford. Most ISPs in the States also have hefty bandwidth caps, something I'm certain streaming video would eat up in no time.
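
For a rough sense of how quickly those bitrates run into a cap (the 250 GB/month figure is just an assumed cap, not any particular ISP's):

[code]
# How quickly streamed gaming eats a monthly bandwidth cap.
# The 250 GB cap is an assumption; the bitrates are the ones quoted above.

cap_gb = 250
bitrates_mbit = {"480p": 1, "720p": 5}

for res, mbit in bitrates_mbit.items():
    gb_per_hour  = mbit / 8 * 3600 / 1000   # Mbit/s -> GB per hour
    hours_to_cap = cap_gb / gb_per_hour
    print(f"{res}: {gb_per_hour:.2f} GB/hour, cap gone after ~{hours_to_cap:.0f} hours of play")
# 480p: 0.45 GB/hour, cap gone after ~556 hours of play
# 720p: 2.25 GB/hour, cap gone after ~111 hours of play
[/code]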

This doesn't even get into how you don't actually own the game, nor can you modify it as you see fit, which is one of the strengths of PC gaming. If I wanted to just pop a game in and play, I would do so on my Wii, and I'd still own a physical copy of the game that will work on any Wii. Instead, you're paying for a service that isn't guaranteed to be there tomorrow. In the end, I would rather pay $1000 for a PC that can run Crysis than pay to play a laggy video stream of the same.

Something about this seems rather fishy, and just because they have a working product and spent years on R&D doesn't guarantee it will work, or that it's anything more than a waste of time.

fraggle said:

"I think there is a world market for maybe five computers." -- Thomas Watson, chairman of IBM, 1943.

Note: he wasn't talking about micro-computers like the ones that are omnipresent now. For the machines he was talking about, his estimate made sense. Transistors and later microprocessors really redefined what a computer was, how much room it took, how expensive it was, and what it could be used for.

Maes said:

That's the self-imposed limit that everybody mysteriously associates with this service: it doesn't have to (and it better not) be operated over general-purpose internet connections, which have no minimum performance or availability standards. This is a service more suitable to be run over fixed-speed, point-to-point dedicated data lines (which however would NOT be usable for general purpose internet, at least not on the same channel/carrier as the gaming service).

Under that aspect, this would work much like a cable TV on demand (but it would require more infrastructure compared to cable TV, which is not point-to-point, and would be different from a cable modem, which relies on bandwidth sharing).

OK, proposing dedicated, switched lines is not a very 21st century "net" approach, but it's the only practical way such a service could work, at least on a local scale. E.g. in large arcades.

People "mysteriously associate" the internet with this because it is completely and utterly preposterous that a start-up gaming company would try to create an entirely new national network.

EDIT: The EuroGamer article linked above by Snarboo is excellent.


Lots more info on this seems to be coming to light. I went to the OnLive site to snoop around, and it looks like they're taking beta testers for the summer, and launching in the winter, so we'll be able to see how big a pile of bullshit this is (or not, maybe!) fairly soon. While obviously I'd love for this to be true, and the tech demos shown so far make it look very good, I'm getting the feeling that this won't really work with like 90% of current internet connections. Maybe this will be more viable in a country that doesn't have shit web infrastructure, like Korea. No doubt it'll get there eventually, though.

david_a said:

People "mysteriously associate" the internet with this because it is completely and utterly preposterous that a start-up gaming company would try to create an entirely new national network.


Probably the same people that believe that "TV is obsolete" because "everything is done via Internet by now", which is just another example of something that sounds cool and enterprisey on paper, but is actually pwned and made a bitch by reality.

Well, on the other hand most people still don't get why their internets is slow and always will be, so if they are happy to live in a "future" full of packet-mangling and -dropping networks, let'em go on, I'm not gonna stop them.

On a more serious note, yeah, it would be stupid to blindly invest billions to bring the service to every city in the world, so their service, if it ever materializes, will likely come to life as a local-area service, at most covering a small portion of a large town, whether they use a dedicated network or not.

If they don't plan on laying their own "network" (which would be more of a per-user point-to-point leased line), then at the very least they would have to work out an agreement with ISPs in order to give paying users a guaranteed minimum bandwidth on the local loop. That *would* be possible, since the bandwidth would only be used from and to the company's servers, which would need to work in close synergy with the ISPs.

In other words, a paying customer/user would have to get a deal with his ISP to allocate him, e.g., a guaranteed 5 Mbps downstream, but only towards the OnLive servers, not for general-purpose internet. No using those "fixed" 5 Mbps for torrentz, sorry :-P


... or maybe this project ends the way most people here seem to expect:

Some clueless business-type idiots who have no grasp of technology shovel a large amount of money into it, but in six months all that money has mysteriously disappeared and everyone stupidly asks what went wrong...


Glad I'm not the only person who thinks cloud computing is an absurdly stupid idea. :D


The thread just wouldn't be complete without a link to a site full of Slashdot nerds ripping this snake oil scam to shit.

What next, they'll bring back Pixelon and Zeosync?

In any case, perhaps the most meaningful comment I've read about it is this one.

Just quoting the beginning:

"Cloud" is the modern term for a mainframe, time-sharing-like model.

One advantage is that your data lives on a server somewhere, meaning someone else is responsible for backup, and you can access it from any "terminal" (typically a web browser, but could also be things like the Steam client).

Another advantage is a potential pricing model for developers -- Amazon EC2 charges per hour of server time used, at a very flat rate. If you only use an hour, you only pay ten cents.

The big advantage of an infrastructure like EC2 is shown in pathological cases, like websites which tend to receive more traffic at certain times of the day. So every night, you can shut down whatever capacity you don't need, and stop paying for it -- and Amazon can then allocate it to someone else who needs it at that time, possibly overnight.

At a different level, you see the same pattern with web applications -- you don't need a computer more powerful than it needs to be to run Firefox. The server can do whatever computing you need that isn't already happening locally -- but most GUI apps spend a lot of time waiting for the user. So when you do a search in Gmail, that takes some server CPU -- but while you're examining the results, that server is off running someone else's search.

Here's the problem: None of these advantages apply to these guys.
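
To make the pay-per-hour point in that quote concrete, here's a toy calculation; the $0.10/hour rate is the one the comment mentions, and the instance counts are made up:

[code]
# Toy illustration of the quoted EC2-style pay-per-hour model.
# The $0.10/hour rate comes from the quoted comment; instance counts are made up.

rate_per_hour = 0.10

def monthly_cost(day_instances, night_instances, days=30):
    """Cost with capacity scaled down overnight vs. kept flat around the clock."""
    scaled = (day_instances * 16 + night_instances * 8) * days * rate_per_hour
    flat   = day_instances * 24 * days * rate_per_hour
    return scaled, flat

scaled, flat = monthly_cost(day_instances=20, night_instances=5)
print(f"Scaled down overnight: ${scaled:.2f}/month")
print(f"Flat provisioning:     ${flat:.2f}/month")
# Scaled down overnight: $1080.00/month
# Flat provisioning:     $1440.00/month
[/code]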

Graf Zahl said:

... or maybe this project ends the way most people here seem to expect:

Some clueless business-type idiots who have no grasp of technology shovel a large amount of money into it, but in six months all that money has mysteriously disappeared and everyone stupidly asks what went wrong...


Anybody remember Enron Broadband? I was somehow reminded of that... also a theoretically cool but actually totally half-baked, expensive, and unfeasible idea.

