Jitter – A Measure of Broadband Quality

Most people have heard of latency, which is a measure of the average delay of data packets on a network. There is another important measure of network quality that is rarely talked about. Jitter is the variance in the delays of signals being delivered through a broadband network connection. Jitter occurs when the latency increases or decreases over time.

We have a tendency in the industry to oversimplify technical issues; we take a speed test and assume the answer that pops out is our speed. Those same speed tests also measure latency, and even network engineers sometimes get mentally lazy and are satisfied to see an expected latency number on a network test. But in reality, the broadband signal coming into your home is incredibly erratic. From millisecond to millisecond, the amount of data hitting your home network varies widely. Measuring jitter means measuring the degree of network chaos.

Fully understanding the causes of jitter in any specific network is a challenge because the causes can be subtle. It’s often hard to pinpoint a jitter problem because it can be here one millisecond and gone the next, but it’s something we should be discussing more. A lot of the complaints people have about their broadband connection are caused by too-high jitter.
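The difference between average latency and jitter is easy to show with a simple calculation. The short Python sketch below is only an illustration, not something from the article: the jitter_stats helper and the sample numbers are made up. It compares two hypothetical connections with nearly the same average latency, one steady and one erratic, and reports the standard deviation of the samples and the average change between consecutive samples, two common ways of quantifying jitter.

    # Illustrative sketch only: the function name and sample latencies are hypothetical.
    import statistics

    def jitter_stats(latencies_ms):
        """Summarize jitter for a list of round-trip latency samples (milliseconds)."""
        # Change between each pair of consecutive samples.
        deltas = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
        return {
            "mean_latency_ms": statistics.mean(latencies_ms),
            "stdev_ms": statistics.stdev(latencies_ms),
            "mean_consecutive_delta_ms": statistics.mean(deltas),
        }

    # Two made-up connections with similar average latency but very different jitter.
    steady = [20, 21, 20, 22, 21, 20, 21, 22]
    erratic = [5, 45, 12, 38, 8, 41, 10, 9]

    print(jitter_stats(steady))
    print(jitter_stats(erratic))

Both connections would report nearly the same average latency on a speed test, which is exactly why looking only at the average hides the problem.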

[Doug Dawson is President of CCG Consulting.]

