
Client/Server sync - reasoning and specific techniques?


this shifting of tick numbers relative to each other seems like it forces us to drop a message, or process 2 messages in 1 tick

For network messages, you need to keep reading the network and dequeuing messages as quickly as they come in. That reading won't be synchronized to ticks or frames, but the messages in the packets will carry the tick numbers they're intended for.
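In case it helps, a minimal sketch of that idea follows. ReceivePacket(), Message, and the per-tick map are made-up names for illustration, not from any particular library:

#include &lt;cstdint&gt;
#include &lt;map&gt;
#include &lt;vector&gt;

// Hypothetical message type: the tick it is meant for, plus its payload.
struct Message {
    uint32_t targetTick;
    std::vector&lt;uint8_t&gt; payload;
};

// Hypothetical non-blocking receive: returns true and fills 'out' if a packet was waiting.
bool ReceivePacket(Message& out);

std::map&lt;uint32_t, std::vector&lt;Message&gt;&gt; pendingByTick;

// Called every frame (or more often); not tied to the simulation tick.
void DrainNetwork() {
    Message msg;
    while (ReceivePacket(msg)) {
        pendingByTick[msg.targetTick].push_back(msg);
    }
}

// Called once per simulated tick; consumes everything queued for that tick.
void ApplyMessagesForTick(uint32_t tick) {
    auto it = pendingByTick.find(tick);
    if (it != pendingByTick.end()) {
        // ... apply each queued message to the simulation ...
        pendingByTick.erase(it);
    }
}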

The goal is to reduce input lag, without sending data too late. This means that the relationship between “target tick” and “wallclock time” will shift when you get an update. After all, the client ends up being slaved to wallclock time, with some offset function, to determine “what is the tick I should be running right now?”

If the client changes its clock offset, yes, the “target tick” will jump, and your simulation will process more than one tick within the same interval – but that's no different from having a slow graphics card, say, that runs at 20 fps, and thus needs 3 ticks processed between each frame for a 60 Hz simulation rate. Except it only happens occasionally in this case.
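To make the "process more than one tick in an interval" point concrete, here is a rough fixed-timestep sketch; Now(), SimulateOneTick(), and the variable names are my own placeholders, not from this post:

#include &lt;cstdint&gt;

extern double Now();                        // placeholder: wallclock time in seconds
extern void SimulateOneTick(int64_t tick);  // placeholder: run one fixed simulation step

const double TicksPerSecond = 60.0;
double startTime = 0.0;        // wallclock captured at startup
double serverOffset = 0.0;     // shifted when the server asks for an adjustment
int64_t currentTick = 0;       // last tick we simulated

void FrameUpdate() {
    // Which tick should we be on right now, according to wallclock + offset?
    double t = (Now() - startTime) * TicksPerSecond + serverOffset;
    int64_t targetTick = (int64_t)t;
    // Usually 0 or 1 iterations, but after a slow frame or an offset change this
    // may run several ticks in one go (or none at all, if the offset moved back).
    while (currentTick < targetTick) {
        ++currentTick;
        SimulateOneTick(currentTick);
    }
}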

Do you have to hide this? For most players, on most connections, this offset will be adjusted once on connection and then stay fixed. It may be that the game is perfectly playable without treating it any differently from "the CPU got locked out for a bit and I'm now behind."

enum Bool { True, False, FileNotFound };

hplus0603 said:

this shifting of tick numbers relative to each other seems like it forces us to drop a message, or process 2 messages in 1 tick

The goal is to reduce input lag, without sending data too late. This means that the relationship between “target tick” and “wallclock time” will shift when you get an update. After all, the client ends up being slaved to wallclock time, with some offset function, to determine “what is the tick I should be running right now?”

If the client changes its clock offset, yes, the “target tick” will jump, and your simulation will process more than one tick within the same interval – but that's no different from having a slow graphics card, say, that runs at 20 fps, and thus needs 3 ticks processed between each frame for a 60 Hz simulation rate. Except it only happens occasionally in this case.

Thanks a ton, that's where my understanding was failing. My client currently gets the server's tick during the connection handshake, then just increments the tick by 1 every tickrate ms. My offset is applied when I send data out to the server (so it looks like the client is ahead), but it doesn't have any effect on the tick that the client is actually processing. That seems to be where the mistake is.

It sounds like the client needs to actually be adaptive, not having a strong concept of its own tick, but instead using the authoritative responses to see what tick it should be processing. Is that accurate? If so, how does this “target tick” concept work? I assume the server just marches along with tick += 1 every tickrate ms, is that still true?

Found a good thread where you already explained it :D https://www.gamedev.net/forums/topic/696756-command-frames-and-tick-synchronization/5386794/?page=3

So that answers my question: when the server says "you're too far ahead, adjust by -3", the client needs to stop processing for 3 simulation ticks; when it's too far behind, it needs to process extra ticks.

I now question what the "offset" concept is for, if the actual client tick is what responds to server adjustments. Is it just a static number that says, "when I get the server's tick during the connection handshake, I'll start at serverTick + offset"?

Edit: Maybe the offset is used when you can't control the base tick? In that case, the client would have a constantly increasing baseTick and an offset, and would process currentTick = baseTick + offset on any given iteration. When an adjustment comes in, you would set targetTick = currentTick + adjustment + 1 (since it'll be processed next tick). On the next simulation iteration, you would run multiple simulation iterations (or freeze) until currentTick == targetTick. (It's missing some steps, but something like that.) This seems to line up with the pseudocode I've been seeing in these threads.
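A rough sketch of that baseTick + offset idea, with every name illustrative and the handshake/initialization omitted:

#include &lt;cstdint&gt;

extern void TickOnce(int64_t tick);   // placeholder: run one simulation step

int64_t baseTick = 0;        // increments by 1 every tickrate interval
int64_t tickOffset = 0;      // shifted when the server says "adjust by N"
int64_t lastSimulated = 0;   // last tick actually simulated

void OnServerAdjustment(int64_t adjustment) {
    // Negative: we're too far ahead; positive: we're too far behind.
    tickOffset += adjustment;
}

void OnTickTimer() {
    // Fired every tickrate ms.
    ++baseTick;
    int64_t current = baseTick + tickOffset;
    // Catch up with extra iterations if we fell behind; if we're ahead,
    // the loop runs zero times, i.e. the client "freezes" for a few ticks.
    while (lastSimulated < current) {
        ++lastSimulated;
        TickOnce(lastSimulated);
    }
}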

The way that the client knows what tick it wants to simulate should still be based on a local wallclock time. Typically, the math will look something like:

extern double ReadHighPrecisionTimer(); // returns seconds

double tick = (ReadHighPrecisionTimer() - TimeExecutableFirstStarted) * TicksPerSecond + ServerOffset;

If your timer returns something other than seconds, adjust the math as appropriate.

And when the server tells you to adjust the offset (and you haven't already made that adjustment), you change the ServerOffset variable. This way, you can still figure out which tick to simulate and render when the client runs at 144 Hz, even if your network tick rate is 20 Hz or whatever.
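Pulling that together, one way it might look in code, reusing the ReadHighPrecisionTimer() declaration above. The correctionId bookkeeping for "haven't already made that adjustment" and the interpolation fraction are assumptions of mine, not something prescribed here:

#include &lt;cstdint&gt;

extern double ReadHighPrecisionTimer();    // returns seconds, as declared above

double TimeExecutableFirstStarted = 0.0;   // captured once at startup
double ServerOffset = 0.0;                 // in ticks
const double TicksPerSecond = 60.0;

double CurrentTick() {
    return (ReadHighPrecisionTimer() - TimeExecutableFirstStarted) * TicksPerSecond + ServerOffset;
}

void OnServerAdjustment(double adjustment, uint32_t correctionId) {
    static uint32_t lastCorrectionApplied = 0;
    // Only apply a correction we haven't already accounted for
    // (assumes the server numbers its corrections monotonically).
    if (correctionId > lastCorrectionApplied) {
        ServerOffset += adjustment;
        lastCorrectionApplied = correctionId;
    }
}

void RenderFrame() {
    double t = CurrentTick();            // e.g. 1234.37 when rendering at 144 Hz
    int64_t simTick = (int64_t)t;        // simulate up to this tick (20 Hz net rate or whatever)
    double alpha = t - (double)simTick;  // fractional part, handy for interpolation
    // ... step the simulation up to simTick, then render interpolated by alpha ...
}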

enum Bool { True, False, FileNotFound };

