What’s the difference between lag and jitter?

networking, terminology

I've noticed that nowadays games (and people) only use the term 'lag'.
While playing BZFlag, though, I've noticed there are two separate notifications for latency (the proper term for 'lag'): "Your jitter is too high (### ms), warning #/#" and "Your latency is too high (### ms), warning #/#".

I've noticed that nowadays the "latency" notification sometimes appears alongside the "jitter" notification, but I remember that back then (around 2008) only the "jitter" notification came up.

I've looked both terms up on Wikipedia: the jitter article talks about variation in the intervals between when packets' time measurements start (alongside signal frequencies), while the latency article is about the time it takes to receive, process, and return-send a single packet.
Then again, I could be reading the wrong definition for this application of the terms.

Is there a difference between "Jitter" and "Latency" (if so, what?), or are they the same thing?

Best Answer

Lag is a noticeable delay between the time something is initiated and the time when it happens. For example, pressing an "attack" button and finding that the attack doesn't happen until a second later.

Latency is sometimes used to mean the same thing as lag, but in networking it's generally used interchangeably with ping time: the amount of time it takes for a packet to travel from point A to point B, or to travel there and back again. High packet latency generally leads to lag in a game.
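To make "ping time" concrete, here's a minimal Python sketch that estimates round-trip latency by timing a TCP handshake (a connect completes in roughly one network round trip). Real game clients, BZFlag included, typically timestamp their own UDP packets instead; the host and port here are just placeholders.

```python
import socket
import time

def measure_rtt(host: str, port: int = 80) -> float:
    """Estimate one round trip in milliseconds by timing a TCP handshake."""
    start = time.perf_counter()
    # create_connection() returns once the three-way handshake completes,
    # which takes roughly one network round trip.
    with socket.create_connection((host, port), timeout=2.0):
        pass
    return (time.perf_counter() - start) * 1000.0

print(f"RTT: {measure_rtt('example.com'):.1f} ms")  # placeholder host
```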

Jitter is a variance in latency over time. If every packet takes exactly the same amount of time to travel from A to B, there is no jitter. If the packet delivery times are inconsistent, we call it jitter.
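As a rough illustration of "variance in latency", here's a sketch of the smoothed jitter estimator RTP uses (RFC 3550): each new sample moves the estimate by 1/16 of the change in delay between consecutive packets. BZFlag may compute its number differently; this just shows the idea.

```python
def rfc3550_jitter(latencies_ms: list[float]) -> float:
    """Smoothed jitter estimate in the style of RFC 3550:
    J += (|D| - J) / 16, where D is the change in transit
    time between consecutive packets."""
    jitter = 0.0
    for prev, cur in zip(latencies_ms, latencies_ms[1:]):
        d = abs(cur - prev)            # how much the delay changed
        jitter += (d - jitter) / 16.0  # exponential smoothing
    return jitter

steady = [50.0] * 20                         # identical delivery times
bursty = [50, 90, 40, 120, 45, 80, 55, 130]  # inconsistent delivery times
print(rfc3550_jitter(steady))   # 0.0 -> no jitter
print(rfc3550_jitter(bursty))   # clearly nonzero
```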

Jitter can be overcome with buffering, but that adds to overall latency/lag. Overcoming a lot of jitter might require buffers so large that the resulting lag would make a game terribly unresponsive, possibly not worth playing.
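Here's an illustrative simulation of that trade-off, with made-up latency numbers: delaying playback by a buffer lets more late packets arrive "in time", but every millisecond of buffer is a millisecond of added lag.

```python
import random

def on_time_fraction(latencies_ms: list[float], buffer_ms: float) -> float:
    """Fraction of packets that arrive before playback needs them, when
    playback is delayed buffer_ms beyond the fastest observed packet."""
    deadline = min(latencies_ms) + buffer_ms
    return sum(1 for l in latencies_ms if l <= deadline) / len(latencies_ms)

random.seed(1)
# Simulated jittery link: 50 ms base latency plus an exponential tail.
latencies = [50.0 + random.expovariate(1 / 20) for _ in range(1000)]

for buf in (0, 20, 40, 80):
    frac = on_time_fraction(latencies, buf)
    print(f"buffer {buf:3d} ms -> {frac:6.1%} on time, ~{buf} ms added lag")
```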