Jitter vs. Delay: what is the difference?

Jitter and delay are both important network performance metrics in mobile networks, but they represent different aspects of network behaviour.

Delay (also known as latency) is the time it takes for a data packet to travel from the source to the destination.

Jitter, on the other hand, is the variation in that delay experienced by consecutive data packets.
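To make the distinction concrete, here is a minimal sketch using hypothetical send and receive timestamps: delay is measured per packet, while jitter is computed here as the mean absolute difference between consecutive delays (RFC 3550 defines a smoothed variant for RTP, but the simple form below illustrates the idea).

```python
# Hypothetical timestamps, in seconds, for four packets.
send_times = [0.000, 0.020, 0.040, 0.060]
recv_times = [0.050, 0.072, 0.088, 0.115]

# Delay: one value per packet (receive time minus send time).
delays = [r - s for s, r in zip(send_times, recv_times)]

# Jitter: how much the delay changes from one packet to the next,
# averaged over all consecutive pairs.
jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

print([round(d, 3) for d in delays])  # per-packet delay
print(round(jitter, 4))               # average delay variation
```

Note that all four packets could have a high but identical delay, in which case jitter would be zero: the two metrics are independent.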

Delay is primarily caused by the physical characteristics of the network and the processing capabilities of the network elements. Better network conditions result in lower delay, which in turn means a better user experience.

High delay values can cause problems in real-time applications such as voice and video calls.

Jitter, in contrast, is mainly caused by network congestion and changes in routing paths: consecutive packets may travel from source to destination over different paths, so each packet experiences a different delay. As a result, packets can arrive at the destination out of order.
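The out-of-order effect can be shown with a small sketch using hypothetical per-packet delays: packets sent in order arrive in whatever order their individual delays dictate.

```python
# Hypothetical values, in seconds: three packets sent in order 0, 1, 2,
# each taking a different path and therefore experiencing a different delay.
send_times = [0.000, 0.020, 0.040]
delays     = [0.090, 0.030, 0.035]

# Arrival time of each packet is its send time plus its delay;
# sorting by arrival time gives the order in which packets are received.
arrival_order = sorted(range(len(send_times)),
                       key=lambda i: send_times[i] + delays[i])

print(arrival_order)  # → [1, 2, 0]: packet 0 arrives last despite being sent first
```

Packet 0 is delayed long enough (arriving at 0.090 s) that packets 1 and 2 (0.050 s and 0.075 s) overtake it, which is exactly the reordering described above.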

So high jitter values cause problems, especially in real-time applications.