
Client side prediction

15 comments, last by gabrielefarina 9 years, 1 month ago

In my case the physics simulation is (and will probably remain) very basic, but I still think that having a different tick rate would raise problems. Also, since I'm aiming to support a low number of players per server, I probably won't need to lower the tick rate there; running physics at 30 Hz and rendering as fast as I can with smoothing has already proved to be a good solution.
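For illustration, a minimal fixed-timestep loop of this kind might look like the sketch below; stepPhysics, prevState, currState, interpolate and render are placeholders, not names from any particular engine.

#include <chrono>

void gameLoop() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 30.0;   // fixed physics tick, in seconds
    double accumulator = 0.0;
    auto previous = clock::now();

    bool running = true;
    while (running) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= dt) { // step physics at a fixed 30 Hz
            // stepPhysics(dt);     // advances prevState -> currState
            accumulator -= dt;
        }

        // Render as fast as possible, blending the last two physics
        // states so movement looks smooth between ticks.
        double alpha = accumulator / dt;
        // render(interpolate(prevState, currState, alpha));
        (void)alpha;
    }
}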

By the way, what about dealing with dropped or out-of-order messages delivered to the server? Reading your answers again, I like the idea of delaying execution on the server, since it should let me keep the server code simple and avoid having to store past entity states. Since I'm planning to use a client-server model for single player and local multiplayer as well, maybe I could adapt the delay on the server based on the average latency, or better yet set the delay to 0 when playing single player or on a local network, and set it to something more meaningful when running online. Does that make sense?
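A rough sketch of what that adaptive delay could look like (the function and parameter names are invented for illustration):

#include <cmath>

int inputDelayTicks(bool localSession, double avgRttSeconds, double tickSeconds) {
    if (localSession)
        return 0;                                   // single player / LAN
    // Buffer roughly half the round trip, rounded up, plus one tick
    // of slack for jitter.
    return static_cast<int>(std::ceil(avgRttSeconds * 0.5 / tickSeconds)) + 1;
}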


running physics at 30 Hz and rendering as fast as I can with smoothing has already proved to be a good solution


That's what Tribes and Tribes 2 did ten years ago, and it worked fine for them (with large-ish maps and, for the time, large-ish player counts).
While I'm not a big fan of the Torque game engine underlying them (too little separation of concerns, not very robust as a general-purpose engine), the networking part of it was quite good.


I would also recommend defining a particular time step on the server, and calling that "truth." Clients would then simulate the local entity "ahead of time" and remotely replicated entities "behind time." The amount of ahead/behind would be adjusted based on RTT; the easiest way to derive it is for the server to let each client know when its input arrives too late, or much too early.
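As a sketch of that feedback loop, under invented names (this is one possible shape, not a prescribed implementation):

struct ClientClock {
    int serverTick = 0;   // latest tick heard from the server
    int tickOffset = 2;   // how many ticks ahead of "truth" we simulate

    // Tick to stamp on locally predicted input.
    int localSimTick() const { return serverTick + tickOffset; }

    // Feedback from the server: ticksEarly < 0 means the input missed
    // its simulation tick; a large positive value means it sat in the
    // server's buffer for a long time.
    void onServerFeedback(int ticksEarly) {
        if (ticksEarly < 0)
            ++tickOffset;       // arriving late: run further ahead
        else if (ticksEarly > 3)
            --tickOffset;       // arriving much too early: pull back
    }
};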
enum Bool { True, False, FileNotFound };

OK, I've moved further and things are going better now. I think some of the jumpiness is due to the fact that I discard messages that arrive out of order, so part of the input never gets simulated on the server. At the moment each of my messages contains the computed velocity vector and the predicted client position after the velocity is applied; on the server I apply the command and compare the resulting position: if it differs from what the server simulated, I send a "correction" message with all the required information. Because of this, if I skip an input message on the server, the simulation goes out of sync.

What I'd like to do now is add a buffer of incoming input messages on the server and make sure the messages are reliable (even if unordered): then, once I receive the message I expect, I simulate all the queued messages on the server in a single update step, using the difference between the client tick of the last acknowledged message and the current one to work out how much delta time to use, roughly as in the sketch below.
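Something along these lines (a sketch under assumed names, not final code):

#include <map>

// Inputs are stored by client tick; once the next expected tick is
// present, every contiguous buffered input is applied in order at a
// fixed per-tick dt.
struct InputCmd { float vx = 0, vy = 0, vz = 0; };

struct PlayerInputBuffer {
    std::map<int, InputCmd> pending;  // keyed (and sorted) by client tick
    int nextExpectedTick = 0;

    void onInputReceived(int clientTick, const InputCmd& cmd) {
        if (clientTick >= nextExpectedTick)
            pending[clientTick] = cmd;  // drop ticks we already consumed
    }

    // Apply every contiguous input starting at nextExpectedTick.
    template <typename ApplyFn>
    void drain(ApplyFn applyOneStep, float tickDt) {
        for (auto it = pending.find(nextExpectedTick); it != pending.end();
             it = pending.find(nextExpectedTick)) {
            applyOneStep(it->second, tickDt);  // one fixed step per input
            pending.erase(it);
            ++nextExpectedTick;
        }
    }
};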

As far as I understand, though, it is not common practice to use this kind of buffer on the server; it seems most games discard old input messages and apply only the latest. Doesn't this create serious issues if (as in my case) the movement is based on velocity or acceleration? Do they use a delta time based on the timestamp of the last acknowledged message when applying the input on the server? Or do they simply ignore the situation and rely on some smoothing on the client to deal with it?


I would also recommend defining a particular time step on the server, and calling that "truth." Clients would then simulate the local entity "ahead of time" and remotely replicated entities "behind time." The amount of ahead/behind would be adjusted based on RTT;

Does this mean that I should run the simulation on the client in "server time space"? At the moment, when a client connects, I send it the tick count of the server to sync the client with it, but I hadn't thought about running the simulation ahead of time based on the ping. When you say "ahead of time" here, you don't mean processing the input with a bit of delay, but simply setting the timestamp/tick count on the client to "server_ticks+ping/2", correct?

Thanks,

Gabriele

Something like that, yes, although you'll also need to use a different time stamp for other players' entities on your client machine, which is behind the server time.
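For illustration only, the two timelines could be tracked as simply as this (the tick counts and names are assumptions, not from the thread):

struct ClientTimelines {
    int serverTickEstimate = 0;  // our best guess at the server's tick
    int aheadTicks = 2;          // so input reaches the server in time
    int behindTicks = 3;         // interpolation delay for remote entities

    // The local player is predicted ahead of server time; remote
    // entities are rendered behind it, so there is always received
    // state to interpolate between.
    int localPlayerTick() const  { return serverTickEstimate + aheadTicks; }
    int remoteEntityTick() const { return serverTickEstimate - behindTicks; }
};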

Separately, it's often a good idea to RLE-encode several previous frames' worth of input into each upstream packet, so that if one or two packets are lost, you still have the inputs "from the past."
That said, in most normal networking cases you shouldn't see much loss at all, so if you are seeing lost packets, there may be a bug causing them.
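For what it's worth, a stripped-down sketch of the history-in-every-packet idea (the field layout is invented for the example; a real packet would also be bit-packed and RLE-compressed):

#include <cstdint>
#include <deque>
#include <vector>

struct InputFrame { int32_t tick; uint8_t buttons; int8_t moveX, moveY; };

struct InputSender {
    std::deque<InputFrame> history;               // most recent first
    static constexpr size_t kFramesPerPacket = 4; // current + 3 past frames

    // Each outgoing packet repeats the last few frames, so a single
    // lost packet loses no input.
    std::vector<InputFrame> buildPacket(const InputFrame& current) {
        history.push_front(current);
        if (history.size() > kFramesPerPacket)
            history.pop_back();
        return std::vector<InputFrame>(history.begin(), history.end());
    }
};

On the receive side, frames whose tick has already been consumed are simply ignored, which pairs naturally with an input buffer like the one sketched earlier in the thread.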
enum Bool { True, False, FileNotFound };

Something like that, yes, although you'll also need to use a different time stamp for other players' entities on your client machine, which is behind the server time.

Yeah, I'm using the approach I read in the Valve articles, and it seems to be working fine.


Separately, it's often a good idea to RLE-encode several previous frames' worth of input into each upstream packet, so that if one or two packets are lost, you still have the inputs "from the past."
That said, in most normal networking cases you shouldn't see much loss at all, so if you are seeing lost packets, there may be a bug causing them.

OK, I will try sending the last X input messages, removing a message from the outgoing queue when an ACK from the server is received (or when it gets too old). When the server gets N input messages, should it process them immediately, or should I store them in a queue and execute them one by one as the server ticks? Executing all the inputs at the same time might lead to non-smooth behaviour on the server, but I guess that's expected when the stream of incoming messages from the client is not contiguous?
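A minimal sketch of the queue-and-tick option from that question (all names invented): the server consumes at most one buffered input per fixed tick and repeats the last known input when the queue runs dry.

#include <map>

struct InputCmd { float vx = 0, vy = 0, vz = 0; };  // same shape as earlier

struct ServerPlayer {
    std::map<int, InputCmd> queued;  // filled as client packets arrive
    InputCmd lastApplied{};

    void fixedTick(int serverTick, float dt) {
        auto it = queued.find(serverTick);
        if (it != queued.end()) {
            lastApplied = it->second;
            queued.erase(it);
        }
        // On a gap, repeating the previous input is a common guess;
        // the client gets a correction later if the guess was wrong.
        applyMovement(lastApplied, dt);
    }

    void applyMovement(const InputCmd& /*cmd*/, float /*dt*/) {
        // integrate velocity into position here
    }
};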

Hello everyone, sorry for disappearing for a bit.

It seems I've been able to solve my issues by resending previous inputs. Now I have smooth client-side prediction of the player entity even when simulating packet loss and quite high jitter.

Thanks for the help!

This topic is closed to new replies.
