
What is network latency?

Network latency refers to the time it takes for data to travel from one point to another in a network. It is often measured in milliseconds (ms) and represents the delay between a user's action and the system's response. Latency is influenced by factors such as the physical distance between devices, the number of network hops, and the speed of the network infrastructure.
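One common way to observe latency in practice is to time how long it takes to establish a TCP connection to a remote host. The sketch below is a minimal illustration of that idea; the `tcp_latency_ms` function name and the use of a TCP handshake as the measured event are illustrative choices, not a standard tool.

```python
# Minimal sketch: estimate network latency in milliseconds by timing
# how long a TCP connection takes to be established.
import socket
import time


def tcp_latency_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Return the time in ms to complete a TCP handshake with host:port."""
    start = time.perf_counter()
    # create_connection blocks until the handshake succeeds or times out
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000.0


# Example usage (requires network access):
# print(f"Latency: {tcp_latency_ms('example.com'):.1f} ms")
```

Because the handshake requires a full round trip, this number roughly corresponds to the round-trip time (RTT) that tools like `ping` report.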

For example, when you load a webpage, your device sends a request to the server hosting the page. The time it takes for your request to reach the server, for the server to process it, and for the response to travel back to your device all contributes to the latency you experience. High latency can result in slower load times, lag in online gaming, or delays in video calls.
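The webpage example above can be sketched by timing a complete HTTP request/response cycle, which is close to the delay a user actually perceives. This is a hedged illustration using only the Python standard library; the `request_latency_ms` name is an assumption for this example.

```python
# Minimal sketch: measure the end-to-end time of an HTTP request,
# covering request travel, server processing, and response transfer.
import time
import urllib.request


def request_latency_ms(url: str, timeout: float = 5.0) -> float:
    """Return the elapsed time in ms to fetch a URL end to end."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include response transfer time in the measurement
    return (time.perf_counter() - start) * 1000.0


# Example usage (requires network access):
# print(f"Page fetch took {request_latency_ms('https://example.com'):.1f} ms")
```

Note that this measures more than pure network latency: it also includes server processing time and data transfer time, which is why a large page can feel slow even on a low-latency link.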

In cloud computing, reducing network latency is crucial for delivering fast and responsive services. For instance, using a Content Delivery Network (CDN) can help minimize latency by caching content closer to end users. Additionally, deploying applications in geographically distributed data centers can reduce the distance data needs to travel. Services like Tencent Cloud's Global Accelerator can optimize network paths and improve latency for global applications.