Glossary Definition for: Latency

When it comes to networks, latency refers to how long it takes for a remote server to respond to a request.

Speed-testing apps often report latency as “ping time”, measured in milliseconds (ms).

Think of it as follows: When you click a link, how long before you see a new page start to load? That time delay is latency.
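If you want to see this number for yourself, one rough way is to time how long it takes simply to open a connection to a server. Below is a minimal Python sketch that does exactly that; the host name is just a placeholder, and a real speed-testing app measures this more carefully.

```python
import socket
import time

def measure_latency(host: str, port: int = 443) -> float:
    """Return the time (in milliseconds) taken to complete a TCP handshake with host."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; we only care how long it took
    return (time.perf_counter() - start) * 1000

# Example: print the connection setup time to a placeholder host.
print(f"{measure_latency('example.com'):.0f} ms")
```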




The average latency for wired home internet services is around 30ms.

On a good LTE connection, ping times of 75ms to 100ms are common, and this feels plenty fast in use. Older HSPA+ (“4G”) and 3G networks are slower – averaging around 120ms to 170ms.

Satellite networks, on the other hand, unavoidably suffer from massive latency: the signal must travel at the speed of light up to geosynchronous orbit and back for the request, and again for the response. The best satellite systems achieve latencies of around 600ms, and over 1000ms (a full second!) is not unusual.
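The physics is easy to check: geosynchronous orbit sits roughly 35,786 km above the equator, and a request plus its reply crosses that gap four times in total. Here is a rough sketch of the arithmetic in Python (it ignores processing delays and assumes ground stations directly beneath the satellite, so real-world figures come out higher).

```python
# Back-of-the-envelope: the physics floor on geosynchronous satellite latency.
ORBIT_ALTITUDE_KM = 35_786      # approximate height of geosynchronous orbit
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum

# A request-and-response traverses the ground-to-satellite hop four times:
# up and down to reach the server, then up and down again for the reply.
round_trip_km = 4 * ORBIT_ALTITUDE_KM
minimum_latency_ms = round_trip_km / SPEED_OF_LIGHT_KM_S * 1000

print(f"{minimum_latency_ms:.0f} ms")  # roughly 477 ms, before any processing delay
```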

A high-latency connection may be capable of fast raw transfer speeds, but it will feel slow for interactive use because of all the delays. Latency is particularly painful for audio and video chat (where even a half-second delay has people talking over each other) and for typing into remote terminal sessions.

And online action games are a really bad idea without a low-latency connection – otherwise, you will quite literally be dead before you know it!
