
"Catching up" when a client's input stream is interrupted.

0 comments, last by frob 9 years, 4 months ago

Hi guys.

I am making a multiplayer platformer using UDP.

I stream input from my client to the server, and the server queues this input up and plays it back.

If the client input is not acknowledged by the server, the client resends the input on its next update.

This repeats until the input is acknowledged.

All very standard stuff.
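Just to make the setup concrete, here's a minimal sketch of what that send-until-acked loop looks like in my head (all the names and types below are illustrative, not my actual code):

```cpp
#include <cstdint>
#include <deque>
#include <vector>

struct InputCommand {
    uint32_t sequence;   // monotonically increasing per input
    uint8_t  buttons;    // bitmask: left/right/jump/etc.
};

class InputSender {
public:
    void AddInput(uint8_t buttons) {
        pending_.push_back({nextSequence_++, buttons});
    }

    // Server acknowledged everything up to and including 'ackSequence'.
    void OnAck(uint32_t ackSequence) {
        while (!pending_.empty() && pending_.front().sequence <= ackSequence)
            pending_.pop_front();
    }

    // Everything still unacknowledged goes into the next UDP packet,
    // so a dropped packet is covered by the one after it.
    std::vector<InputCommand> BuildPacket() const {
        return std::vector<InputCommand>(pending_.begin(), pending_.end());
    }

private:
    std::deque<InputCommand> pending_;
    uint32_t nextSequence_ = 0;
};
```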

My issue: say the client lags or drops packets for 500 ms, then resumes as normal. From the server's perspective no inputs arrive for 500 ms, then all 500 ms worth of input arrives at once. The server then starts to play back the input.

All the client input is played back faithfully but 500ms later than it should be.

What's more, all the new input arriving from the client with no lag is being put at the back of the input queue, so even though it arrives quickly it isn't used for at least 500 ms, and this continues forever. The queue never "catches up" after the lag spike.

Now obviously I could set a maximum size for the input queue (somewhere between one and two updates' worth?) and simply discard anything that doesn't fit in the queue. This would stop it looking delayed, but then I run the risk of missing crucial input. I might miss a jump command, for example, which would make the game feel buggy.
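Roughly, I mean something like this (queue size and names are just placeholders, not my actual code):

```cpp
#include <cstddef>
#include <deque>

struct QueuedInput {
    unsigned sequence;
    unsigned char buttons;   // bitmask of pressed buttons
};

// Roughly one to two updates' worth of input; anything beyond that gets dropped.
constexpr std::size_t kMaxQueuedInputs = 8;

void EnqueueInput(std::deque<QueuedInput>& queue, const QueuedInput& input) {
    queue.push_back(input);
    while (queue.size() > kMaxQueuedInputs)
        queue.pop_front();   // discard the oldest input so playback stays caught up
}
```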

So then I was thinking maybe we look through the input we're going to discard before discarding it and see if there are any jump commands... and then I thought this was getting a little complicated and maybe I should make a forum post... and here we are.

What would be your guys' advice on how to solve this?

Many thanks in advance for any replies. :)



All the client input is played back faithfully but 500ms later than it should be.

What would be your guys' advice on how to solve this?

Time stamps are awesome. Every machine's clock can even be different; the server just needs to figure out what that machine's offset is, then adjust accordingly.

We'll take your scenario: the server does not receive any packets for a while, then receives a bunch of packets with older time stamps in rapid succession. The client said "this event happened at 37.614 seconds into the game", but the server is now at 38.215.
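A minimal sketch of that offset idea, assuming you stamp every input packet with the client's game time (all the names below are made up for illustration):

```cpp
// Estimates a client's clock offset from incoming time stamps and maps
// client time onto the server's timeline.
struct ClockSync {
    double offset = 0.0;        // estimated clientTime - serverTime
    bool   initialized = false;

    // Call for every received packet: 'clientTimestamp' is the time the
    // client stamped, 'serverNow' is the server clock on arrival.
    void OnPacket(double clientTimestamp, double serverNow) {
        double sample = clientTimestamp - serverNow;
        if (!initialized) { offset = sample; initialized = true; }
        else              { offset = 0.9 * offset + 0.1 * sample; }  // smooth jitter
    }

    // Where an event with this client time stamp falls on the server's clock.
    double ToServerTime(double clientTimestamp) const {
        return clientTimestamp - offset;
    }
};
```

With the numbers above, ToServerTime(37.614) tells the server when, on its own clock, that event should have happened, and therefore how stale it is compared to 38.215.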

Exactly how you deal with older packets is up to you. It is one of many decisions that end up giving every game a unique feel.

If you have a sliding window where you roll back the simulation, you might decide the event is still within your resimulation window, or you might decide it is too stale. If you are not doing a resimulation-based server, you might decide that even though the events are stale they should still be processed, adding them to the queue at the server's current time. Or you might decide they are too old and are no longer relevant. You might have a table of all the event types with the time before they expire: some events (e.g. quit game) never expire, others (e.g. shots fired, jump, crouch) expire after some number of milliseconds specified on their respective row of the table.
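A rough sketch of that last, table-based option, building on the offset sketch above (event names and expiry windows are purely illustrative):

```cpp
#include <cstdint>
#include <limits>

enum class EventType : uint8_t { Move, Jump, Crouch, Shoot, QuitGame };

// Maximum age, in milliseconds, before an event of this type is dropped.
double ExpiryMs(EventType type) {
    switch (type) {
        case EventType::Move:     return 100.0;
        case EventType::Jump:     return 250.0;
        case EventType::Crouch:   return 250.0;
        case EventType::Shoot:    return 250.0;
        case EventType::QuitGame: return std::numeric_limits<double>::infinity();
    }
    return 0.0;
}

// True if the event should still be queued (at the server's current time),
// false if it is too stale to matter any more.
bool ShouldProcess(EventType type, double eventServerTime, double serverNow) {
    double ageMs = (serverNow - eventServerTime) * 1000.0;
    return ageMs <= ExpiryMs(type);
}
```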

Or you might have some other solution appropriate for your game.

