
Latency confusion

Started by
1 comment, last by Almar Joling 23 years, 4 months ago
While working, I became uncertain about something very simple. If I have a program that sends a packet every 100 ms and the latency is 500 ms, does the 100 ms packet stream stay intact? In other words, does the server still receive a packet every 100 ms?
There are two important measures of latency: the average latency, which measures the round-trip time of a packet to and from the destination, and the standard deviation, which is a measure of the spread of the individual latency measurements around that mean. If you have a large standard deviation, your packets will not be received in a consistent, well-spaced stream; instead they will arrive clumped together, with occasional long stalls between packets. If you naively process the packets as they arrive, there will be noticeable jerkiness.
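To make the two measures concrete, here is a minimal sketch (the RTT samples are made-up numbers, purely for illustration) that computes the mean and standard deviation of a set of round-trip times:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical round-trip times in milliseconds (illustration only).
    std::vector<double> rtt = {480, 510, 495, 700, 460, 520, 650, 490};

    double sum = 0.0;
    for (double r : rtt) sum += r;
    double mean = sum / rtt.size();

    double sqDiff = 0.0;
    for (double r : rtt) sqDiff += (r - mean) * (r - mean);
    double stdev = std::sqrt(sqDiff / rtt.size());

    std::printf("mean RTT: %.1f ms, standard deviation: %.1f ms\n", mean, stdev);
    return 0;
}
```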

A large standard deviation is much worse than a large latency. This is really the cause of most of the jumpy feel of multiplayer games. You can combat it somewhat by buffering the stream, but this introduces latency. It's a trade-off; you'll have to decide what is best for your game.
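Here is a rough sketch of what that buffering could look like (all the class and function names are mine, not from any particular library): hold each packet for a fixed delay after the first one arrives, then release packets on an even schedule based on their sequence number.

```cpp
#include <cstdio>
#include <map>

// Minimal de-jitter buffer sketch. Packets are sent every sendInterval ms;
// we hold them for bufferDelay ms and play them back on a fixed schedule,
// smoothing out uneven arrival times at the cost of extra latency.
struct Packet { int sequence; /* payload omitted */ };

class JitterBuffer {
public:
    JitterBuffer(double sendInterval, double bufferDelay)
        : sendInterval_(sendInterval), bufferDelay_(bufferDelay) {}

    // Called whenever a packet arrives, in whatever clumped order the network delivers.
    void onReceive(const Packet& p, double arrivalTimeMs) {
        if (!started_) {
            // Play the first packet bufferDelay_ ms after it arrives.
            playbackStart_ = arrivalTimeMs + bufferDelay_;
            firstSequence_ = p.sequence;
            started_ = true;
        }
        pending_[p.sequence] = p;
    }

    // Called every tick; releases the next packet once its scheduled time is reached.
    bool tryPlay(double nowMs, Packet* out) {
        if (!started_ || pending_.empty()) return false;
        auto it = pending_.begin();
        double due = playbackStart_ + (it->first - firstSequence_) * sendInterval_;
        if (nowMs < due) return false;   // not time yet: keeps the spacing even
        *out = it->second;
        pending_.erase(it);
        return true;
    }

private:
    double sendInterval_;
    double bufferDelay_;
    double playbackStart_ = 0.0;
    int firstSequence_ = 0;
    bool started_ = false;
    std::map<int, Packet> pending_;  // ordered by sequence number
};

int main() {
    JitterBuffer buf(100.0, 200.0);  // 100 ms send rate, 200 ms of buffering

    // Simulated clumped arrivals: packets sent at 0, 100, 200, 300 ms
    // but arriving unevenly because of jitter.
    buf.onReceive({0}, 500.0);
    buf.onReceive({1}, 640.0);
    buf.onReceive({2}, 645.0);  // arrived almost together with packet 1
    buf.onReceive({3}, 930.0);

    for (double now = 500.0; now <= 1100.0; now += 10.0) {
        Packet p;
        while (buf.tryPlay(now, &p))
            std::printf("t=%.0f ms: play packet %d\n", now, p.sequence);
    }
    return 0;
}
```

With these numbers the packets play back at 700, 800, 900 and 1000 ms, evenly spaced again; the 200 ms of buffering is the latency you pay for that smoothness.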

Notice that a large standard deviation isn't exclusively the fault of the network; it can also be introduced by how you process the packets (we discussed this somewhat in another thread about read/write and socket types).

Good Luck

-ddn
Thanks Ddn, this is exactly what I needed!

