Wednesday, December 28, 2011

"Latency" Issues Not Under Full Service Provider Control

Network service providers can do many things to optimize bandwidth and latency on their networks. But it also is true that providers cannot optimize most of the end points on their networks, nor can they control peak load.


That means that latency as a problem can be remedied only partially by steps network service providers can take. 

First-person shooter games such as Call of Duty rely on low network latency to keep pace with players' reaction times, and low latency can confer a competitive advantage in multiplayer matches.


"Hardcore gamers" often recognize latency as a key criterion when selecting their network provider. That, in turn, poses questions for service providers. How gaming is changing the data center

There are some known ways to reduce latency, such as reducing the distance packets travel. Beyond that, one might argue that most latency results from the devices and servers used in sessions, which, by definition, are not under the control of a network operator.
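The distance point can be made concrete with a back-of-the-envelope sketch. Assuming signals travel through fiber at roughly two-thirds the speed of light (an assumption, though a common rule of thumb), geography alone sets a latency floor that no provider can engineer away:

```python
# Back-of-the-envelope propagation delay: distance sets a hard latency floor.
# Assumes signals travel through fiber at roughly 2/3 the speed of light.
SPEED_OF_LIGHT_KM_S = 300_000                      # km/s, in vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 // 3    # assumed speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a fiber path."""
    one_way_s = distance_km / FIBER_SPEED_KM_S
    return one_way_s * 2 * 1000

# A coast-to-coast path of roughly 4,000 km can never beat about 40 ms
# round trip, no matter how much bandwidth the provider adds.
print(round(round_trip_ms(4000)))  # → 40
```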

Still, minimizing the distance packets must travel, or eliminating unnecessary protocol conversions, will help improve latency performance. Peak bandwidth demand, on the other hand, has to be approached differently: adding new capacity, convincing users to regulate their usage, or traffic shaping are potential tools in that regard. But adding capacity does not automatically improve latency performance.

That implies that the techniques used to improve performance of a network under congestion are different from the tools used to manage latency performance. Caching and other techniques that put server resources "closer" to users are one way networks can be designed to minimize latency issues.
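The caching argument follows from the same distance arithmetic: serving content from a nearby edge node instead of a distant origin shrinks the propagation floor. A minimal sketch with hypothetical distances (the 8,000 km origin and 100 km edge node are assumptions for illustration):

```python
# Hypothetical comparison: distant origin server vs. an edge cache near the user.
FIBER_SPEED_KM_S = 200_000  # assumed signal speed in fiber, km/s

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a fiber path."""
    return distance_km / FIBER_SPEED_KM_S * 2 * 1000

origin_rtt = round_trip_ms(8000)  # ~80 ms to a distant origin server
edge_rtt = round_trip_ms(100)     # ~1 ms to a nearby cache node
```

The decision to deploy those cache nodes, however, rests with the application provider, not the access network, which is the point the paragraph above makes.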


But those decisions have to be made by application providers. 
