Jitter vs. Delay: what is the difference?
Jitter and Delay are both important network performance metrics in mobile networks, but they represent different aspects of network behaviour.
Delay (also known as Latency) is the time it takes for a data packet to travel from the source to the destination.
Jitter, on the other hand, is the variation in delay experienced by consecutive data packets.
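To make the distinction concrete, here is a minimal sketch (a hypothetical Python example with made-up timestamps, not taken from any specific tool) that computes per-packet delay from send and receive times, then derives jitter as the average difference between the delays of consecutive packets.

```python
# Hypothetical illustration: computing Delay and Jitter from packet timestamps.
# All timestamps are in milliseconds; the values are made up for the example.

send_times = [0.0, 20.0, 40.0, 60.0, 80.0]    # when each packet left the source
recv_times = [35.0, 58.0, 82.0, 94.0, 121.0]  # when each packet reached the destination

# Delay (latency) of each packet: time from source to destination.
delays = [recv - send for send, recv in zip(send_times, recv_times)]
print("Per-packet delay (ms):", delays)        # [35.0, 38.0, 42.0, 34.0, 41.0]

# Jitter: variation in delay between consecutive packets,
# here taken as the mean absolute difference of successive delays.
diffs = [abs(delays[i] - delays[i - 1]) for i in range(1, len(delays))]
jitter = sum(diffs) / len(diffs)

print(f"Average delay (ms): {sum(delays) / len(delays):.1f}")  # 38.0
print(f"Jitter (ms): {jitter:.1f}")                            # 5.5
```

In this example every packet arrives, but with delays ranging from 34 ms to 42 ms; the average delay describes latency, while the 5.5 ms figure describes jitter.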
Delay is primarily caused by the physical characteristics of the network and the processing capabilities of the network elements. Better network conditions result in less delay, and lower delay means a better user experience.
High Delay values can cause problems in real-time applications such as voice or video calls.
Jitter, by contrast, is mainly caused by network congestion and changes in routing paths: consecutive packets may travel from source to destination over different paths, so each packet experiences a different delay, and packets can reach the destination out of order.
Consequently, high Jitter values cause problems especially in real-time applications.
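The sketch below (again a hypothetical Python example with made-up values) shows how differing per-packet delays can change the arrival order, and how a receive-side jitter buffer can restore the original sequence before playback.

```python
# Hypothetical illustration: varying per-packet delays causing out-of-order arrival.
# Each packet carries a sequence number; delays (ms) differ because packets may
# take different paths or hit congestion along the way.

packets = [
    {"seq": 1, "sent_at": 0,  "delay": 40},
    {"seq": 2, "sent_at": 20, "delay": 90},   # unusually delayed packet
    {"seq": 3, "sent_at": 40, "delay": 35},
    {"seq": 4, "sent_at": 60, "delay": 30},
]

# Arrival order is determined by sent_at + delay, not by sequence number.
arrival_order = sorted(packets, key=lambda p: p["sent_at"] + p["delay"])
print("Arrival order (seq):", [p["seq"] for p in arrival_order])    # [1, 3, 4, 2]

# A receive-side jitter buffer re-sorts packets by sequence number before playback.
playback_order = sorted(arrival_order, key=lambda p: p["seq"])
print("Playback order (seq):", [p["seq"] for p in playback_order])  # [1, 2, 3, 4]
```

Packet 2 arrives last even though it was sent second, which is exactly the kind of reordering that high jitter produces and that real-time applications must compensate for.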