Jehar

Client downloads in modern gaming.


It seems that the average amount of data in a map/mod these days is quickly surpassing current client bandwidth capabilities. ET:QW is coming out soon (I have no doubt there will be modders jumping at the chance to map for it), and the megatextures alone will be quite a bit of data.

Once custom content comes around, a simple server/client download system simply won't cut it, in my mind. I know Quake 3 had a horrible system; Q2 less so.

So, I was thinking about ways for an average player to get data from the server rapidly, short of quitting the client and finding a download site.

Something popped into my head: what if the client could torrent the data from the server he is attempting to connect to, *as well as* other low-ping servers with the same data? Most servers have bandwidth to throw around, and for popular maps/mods this could rapidly increase download speeds to the client, assuming each server uploads at the speeds it usually would.

100K/s from 5 servers in the vicinity seems quite viable, and obviously advantageous compared to 100K/s from only the server the client is connecting to.

So. That's the idea. Thoughts?
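The idea above could be sketched roughly like this. This is a minimal simulation, not a real implementation: the "mirrors" here are plain functions standing in for game servers, where a real client would issue HTTP range requests or torrent-style piece requests. The names and chunking scheme are all assumptions for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def download_from_mirrors(file_size, chunk_size, mirrors):
    """Split the file into chunks and fetch them round-robin from mirrors."""
    chunks = [(i, min(i + chunk_size, file_size))
              for i in range(0, file_size, chunk_size)]
    results = [None] * len(chunks)

    def fetch(index):
        start, end = chunks[index]
        mirror = mirrors[index % len(mirrors)]   # round-robin assignment
        results[index] = mirror(start, end)

    # Fetch from all mirrors concurrently, so their upload speeds add up.
    with ThreadPoolExecutor(max_workers=len(mirrors)) as pool:
        list(pool.map(fetch, range(len(chunks))))
    return b"".join(results)

# Simulated mirrors, all serving the same map data.
map_data = bytes(range(256)) * 64            # 16 KB stand-in for a map file
mirror = lambda start, end: map_data[start:end]
assembled = download_from_mirrors(len(map_data), 1024, [mirror] * 5)
assert assembled == map_data
```

The point of the sketch is just that with n mirrors of comparable upload speed, the client's effective download rate approaches n times a single server's rate, which is the 5 × 100K/s argument above.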


Sounds like a good idea in theory, but I don't really know how viable it is. With physics calculations rapidly coming to the forefront in games, that's a lot of data that needs to be synchronized between the server and clients, and I could see that pushing network bandwidth usage higher and higher in modern games. But yeah, utilizing a BitTorrent-like system (especially on a semi-enclosed network like Steam) could be good.


Physics calculations are done in real time, after the client has all the necessary maps/mods. I'm not talking about servers sharing bandwidth during gameplay. :P That would be silly.


Physics should not push up the bandwidth requirements. The only thing that should have to be sent is the player input, and all computers should be able to calculate the same physics results from that.
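A minimal sketch of the point above, under the assumption that the simulation is fully deterministic: if every machine replays the same input stream through the same update function, they all arrive at the same world state, so inputs are the only thing that needs to cross the wire. The toy `step` function here is a hypothetical stand-in for a real game tick.

```python
def step(state, player_input):
    """One tick of a toy simulation. Integer math only, so the result
    is bit-identical on every machine."""
    x, vx = state
    vx += player_input          # input is treated as an acceleration command
    return (x + vx, vx)

def simulate(inputs, state=(0, 0)):
    for cmd in inputs:
        state = step(state, cmd)
    return state

inputs = [1, 0, -1, 2, 0, 0, -2]     # the only data sent over the network
server_state = simulate(inputs)
client_state = simulate(inputs)      # the client replays the same inputs
assert server_state == client_state  # identical state, zero state traffic
```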

AndrewB said:

Physics should not push up the bandwidth requirements. The only thing that should have to be sent is the player input, and all computers should be able to calculate the same physics results from that.


They do: the Source engine's physics are all calculated client-side (when it was first released they were server-side, which was laggy as fuck).


Sigh ... AndrewB, you know as well as I do that while it's all fine and good to have physics calculated client-side, you have to synchronize them. Sure, that might not be so important in current games, where physics engines are used almost exclusively for rag-doll death animations. But in future games, where complicated physics simulations are used extensively for every object in the game (including players and projectiles), the results of those calculations are much better processed in a single location and then sent to the clients. In a client-server game that location is naturally the server, since, ideally, it isn't doing the rendering and audio processing that the clients are.

Obviously, as physics engines become more complicated and physics calculations become more precise (as well as coming into more widespread use across a game's objects), they will begin to take up a considerable amount of bandwidth within a given PC's data-transport systems, never mind the amount of network traffic such data would generate. I mean, come on, people: we're starting to see serious talk of a need for hardware physics accelerators. You know, an add-in board (like a 3DFX Voodoo of yore) that processes physics? Yeah.

So, the reason I brought it up is that as the physics data that needs to be sent across the network grows, less and less bandwidth will be available for live media downloads. Building on that: yes, I think a torrent-based solution could work well, but the real problem I see is the CPU load it would put on the game server and clients. I don't know if you're aware of this, but simply being in a BitTorrent swarm consumes a HUGE amount of CPU power. I don't know if it would really be viable for that sort of situation.

What I really think needs to happen is for developers to look long and hard at dynamically generated content; just have a look at .kkrieger, for example. I know Will Wright is big on the concept, but I idly wonder whether anyone else has really put much thought into the topic.

Another problem, AndrewB, with the whole "just send player inputs and let them all calculate the physics results" approach is that, for correct physics modeling, there will have to be a small element of randomness (the "chaos" factor, as people love to call it). In future games (say, the Unreal Engine 4 era) it's likely that physics simulations will include this tiny bit of randomness, and I'm sure you can see how a scheme such as the one you propose would fail there.

Even if you don't code in an element of randomness, tiny bit-level memory errors or signal noise within the machine could cause serious desynchronization down the line.
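One common mitigation for the desynchronization worry above, sketched here as an assumption rather than anything from the thread: even if peers only exchange inputs, they can also exchange a cheap checksum of their game state each tick, so a divergence is detected immediately instead of snowballing. The state layout and checksum choice are hypothetical.

```python
import zlib

def state_checksum(state):
    """Cheap fingerprint of the game state; peers compare these each tick."""
    return zlib.crc32(repr(sorted(state.items())).encode())

state_a = {"x": 100, "vx": 3}        # machine A's simulation state
state_b = {"x": 100, "vx": 3}        # machine B's, replayed from inputs
assert state_checksum(state_a) == state_checksum(state_b)   # in sync

state_b["x"] += 1                    # a single bit-level error on machine B
assert state_checksum(state_a) != state_checksum(state_b)   # desync caught
```

This doesn't prevent desync, it just turns a silent divergence into a detectable event, at which point the peers can resynchronize from an agreed state.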

AndrewB said:

Physics should not push up the bandwidth requirements. The only thing that should have to be sent is the player input, and all computers should be able to calculate the same physics results from that.

Quoted for truth, though there are still things that move around in the map that need to be synchronized as well.

I don't see why massive physics would be such an issue, or why the results would need to be sent to the other players in order to be the same for everyone. As long as the simulation runs the same on your end and on theirs, the result should be identical either way. If the other computer can't keep up with what's happening, its video frame rate drops, since displaying what's going on is the biggest part of the process; and if that isn't enough, the frame rate drops further until the computer can handle everything at once again.


Having gone on and designed an underlying architecture for the next games we're working on that will make it network friendly, I feel I should pipe up on this one.

AndrewB is actually on the right track. Recording inputs and sending them across the network is the right way to go if you want to minimize bandwidth (it also happens to double as a rather handy replay feature). "Input" is the wrong word for it, though. Rather, you'd record decisions. Decisions can be made several ways: through human input, AI computations, or network packets. Randomness is not needed in the networking or any other core system at all. A human is random enough, and you can make the AI as random as you like, so in effect your inputs are random but the underlying system is not.
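The record-decisions idea above can be sketched in a few lines. This is a hypothetical illustration, not the poster's actual architecture: decisions are appended to a log as they are made, regardless of their source, and the same log later drives a bit-identical replay through the same update function.

```python
decision_log = []                       # doubles as the replay file

def decide_and_log(source, decision):
    """Record a decision as it is made, whatever its source."""
    decision_log.append((source, decision))
    return decision

def apply(state, decision):
    return state + decision             # stand-in for a real game-state update

# Live play: decisions come from mixed sources.
state = 0
state = apply(state, decide_and_log("human", 5))
state = apply(state, decide_and_log("ai", -2))
state = apply(state, decide_and_log("network", 1))

# Replay: feed the recorded log back through the same update function.
replay_state = 0
for _, decision in decision_log:
    replay_state = apply(replay_state, decision)
assert replay_state == state            # replay reproduces the live game
```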

Using such minimal data is dangerous, but elegant at the same time. The only way you can control your system with such minimal inputs is to give your update loop a fixed timestep. There's been a lot of movement towards dynamic timesteps in games over the past few years so that the maximum framerate can be reached, but I've long viewed that as rubbish. Stability in game systems cannot be guaranteed by a variable that changes every frame. It's quite easy to keep the update loop on a fixed timestep and at the same time let the renderer go crazy pumping out as many frames per second as it can handle, especially these days with the focus shifting from single-threaded to multithreaded applications.
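The loop described above is usually built around a time accumulator. A minimal sketch, with the timestep value and function names as assumptions: any frame time is converted into zero or more fixed-size update ticks, then the renderer draws once, so updates are deterministic while rendering runs as fast as it likes.

```python
FIXED_DT = 1.0 / 60.0           # the simulation only ever advances by this

def run_frame(accumulator, frame_time, update, render):
    """Accumulator loop: variable frame times in, fixed update ticks out."""
    accumulator += frame_time
    while accumulator >= FIXED_DT:
        update(FIXED_DT)        # dt passed to the simulation never varies
        accumulator -= FIXED_DT
    render()                    # renderer runs once per frame, any rate
    return accumulator          # leftover time carries into the next frame

ticks = []
acc = 0.0
for frame_time in (0.016, 0.040, 0.005):    # wildly varying frame times
    acc = run_frame(acc, frame_time, lambda dt: ticks.append(dt), lambda: None)

assert all(dt == FIXED_DT for dt in ticks)  # every tick used the fixed step
```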

Once you have a fixed timestep, you can be assured that the system will produce the same results time and time again given the same inputs. As long as the randomness is purely contained in the inputs, and not in the calculations, recording inputs is very much a feasible way to do things.

Moving along: physics. Physics systems benefit from a stable timestep. Altering the timestep per frame will throw physics calculations out of whack (especially with accelerative forces), and as such you will get different results running on different machines with different update deltas. Clamping it to a fixed delta instantly introduces stability into the system, and consequently reproducibility. Given that, at this point in time (and for the foreseeable future), we have more processing time than we have bandwidth and low-latency connections, it makes sense to make everything as reproducible as possible and calculate as much as possible on the client side. If we can ensure one client will give the same results as every other client out there, networking becomes easier.
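The reproducibility argument above is easy to demonstrate numerically. A sketch under simplifying assumptions (explicit Euler integration, constant acceleration): two runs with the same fixed steps match exactly, while a run covering the same total time with different step sizes lands somewhere else.

```python
def integrate(dts, accel=-9.8):
    """Explicit Euler under constant acceleration; dts is the step sequence."""
    pos, vel = 0.0, 10.0
    for dt in dts:
        vel += accel * dt
        pos += vel * dt
    return pos

fixed = [0.01] * 100                            # fixed 10 ms steps, 1 s total
variable = [0.016, 0.033, 0.05] * 10 + [0.01]   # varying steps, 1 s total

assert integrate(fixed) == integrate(fixed)     # same steps: identical result
assert integrate(fixed) != integrate(variable)  # same elapsed time, different result
```

Two machines updating with different deltas are effectively the two runs above, which is why a variable timestep can never be relied on for lockstep networking.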

There is, however, one nasty bit in the networking area: latency. Clients will receive packets late, or out of order, or not at all. Further, they will receive them a couple of frames after they needed them. This isn't too much of a problem at low latency: if we still assume we have much more processing power than bandwidth, we can just plug the inputs we receive from the network into the simulation from a frame or two ago, run it back up to where we currently are, and continue as planned.

Higher latency requires a trickier solution. If a client misses packets, or gets them late, there's a good chance it already doesn't know what's going on. At that point it will need an opinion from the other clients as to what's happening (in fact, I should stop calling them clients, as they're really peers). Obviously, just adopting one peer's opinion would throw everything out of sync, so each peer will have to agree on what the current gamestate should be, and continue the simulation from that agreed opinion. It's a bit tricky, as opinions are still out-there concepts, but theoretically the opinion system is a sound one. Opinions won't matter so much on a local area network, where latency is always lower than the update loop's timestep; it's only over the internet that it becomes a problem.
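The low-latency recovery described above (plugging late inputs into the simulation from a frame or two ago and running it back up to the present) is what is now usually called rollback. A toy sketch, with the state and input shapes as assumptions: keep a short history of per-frame states and inputs, and when a remote input arrives late, rewind to its frame and re-simulate forward.

```python
def step(state, inputs):
    return state + sum(inputs)          # stand-in for one deterministic tick

history = []                            # (state_before_tick, inputs) per frame
state = 0
for frame in range(5):
    local_inputs = [1]                  # only our own input arrived in time
    history.append((state, local_inputs))
    state = step(state, local_inputs)

# A remote input for frame 2 arrives late: rewind and replay.
late_frame, late_input = 2, 3
history[late_frame][1].append(late_input)
state = history[late_frame][0]          # rewind to the state before frame 2
for _, inputs in history[late_frame:]:
    state = step(state, inputs)         # re-simulate forward to the present

assert state == 5 * 1 + 3               # all inputs applied as if on time
```

Because the simulation is deterministic, the replay is guaranteed to reach the state the peer would have computed had the packet arrived on time.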


This, however, detracts from the original point of data downloads from the server. Obviously, a peer-to-peer system like the above would definitely use something torrent-like to get the data. Linking up servers to torrent data to users has some interesting implications, though. In terms of the human factor, bandwidth costs money, and server maintainers may not want to pipe large chunks of data to users who will never log on to their server; it would be much better to torrent between the clients instead of the servers in that case. The big one, though: if the servers are already aware of one another, you're already halfway to an MMO setup. Why stop at single-server play?

