
In online gaming, lag is a noticeable delay between the action of players and the reaction of the server. Although lag may be caused by high latency, it may also occur due to insufficient processing power in the client and/or server.

The tolerance for lag depends heavily on the type of game. For instance, a slow-paced strategy game or turn-based game may have a high threshold or even be mostly unaffected by high delays, whereas a much faster twitch gameplay game such as a first-person shooter may require significantly lower delay to provide satisfying gameplay. However, the specific characteristics of the game matter: fast chess, for example, is a turn-based game that nevertheless demands quick action and may not tolerate high lag, while some twitch games can be designed so that only events that affect the outcome of the game introduce lag, allowing fast local response most of the time.

Causes

[Image: A simplified game architecture]

While a single-player game maintains the main game state on the local machine, an online game requires it to be maintained on a central server in order to avoid inconsistencies between individual clients. As such, the client has no direct control over the central game state and may only send change requests to the server, and can only update the local game state by receiving updates from the server. This need to communicate causes a delay between the clients and the server, and is the fundamental cause behind lag. While there may be numerous underlying reasons for why a player experiences lag, they can be summarized as insufficient hardware in either the client or the server, or a poor connection between the client and server.

Hardware-related issues cause lag due to the fundamental structure of the game architecture. Generally, a game consists of a looped sequence of states, or "frames". During each frame, the game accepts user input and performs the necessary calculations (AI, graphics etc.). When all processing is finished, the game updates the game state and produces an output, for example a new image on the screen and/or a packet to be sent to the server. The frequency at which frames are generated is often referred to as the frame rate. Because the central game state is located on the server, updated information must be sent from the client to the server in order to take effect, and the client must receive the necessary information from the server in order to fully update its own state. Generating packets to send to the server and processing received packets can only be done as often as the client is able to update its local state. Although packets could theoretically be generated and sent faster than this, doing so would only send redundant data if the game state cannot be updated between packets. A low frame rate therefore makes the game less responsive to updates and may force it to skip outdated data.
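A minimal sketch of that looped frame structure is shown below; the game_state, inputs, renderer and network objects are illustrative placeholders, not any particular engine's API.

```python
# A minimal sketch of the looped frame structure described above.
import time

TARGET_FRAME_TIME = 1.0 / 60.0  # aim for 60 frames per second


def run_game_loop(game_state, inputs, renderer, network):
    while game_state.running:
        frame_start = time.monotonic()

        commands = inputs.poll()                # 1. accept user input
        game_state.update(commands)             # 2. AI, physics, other calculations
        renderer.draw(game_state)               # 3. produce output: a new image...
        network.send(game_state.make_update())  #    ...and a packet for the server

        # Apply whatever the server sent since the last frame.
        for packet in network.receive_all():
            game_state.apply_server_update(packet)

        # A slow frame delays every step above, including network processing,
        # which is why a low frame rate also makes the game slower to send and
        # apply updates.
        elapsed = time.monotonic() - frame_start
        if elapsed < TARGET_FRAME_TIME:
            time.sleep(TARGET_FRAME_TIME - elapsed)
```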

Conversely, the same holds true for the server. The frame rate (or tick rate) of the server determines how often it can process data from clients and send updates. This type of problem is difficult to predict and compensate for; apart from enforcing minimum hardware requirements and attempting to optimize the game for better performance, there is no feasible way to deal with it.
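As a rough illustration (not from the source), the tick rate alone puts a floor under the delay a client can experience: an input that arrives just after a tick has been processed waits up to one full tick before the server even looks at it.

```python
# Illustrative arithmetic only: worst-case extra delay added by the server's
# tick rate, before any network latency is counted.

def worst_case_tick_delay_ms(tick_rate_hz: float) -> float:
    """An input can wait up to one full tick before being processed."""
    return 1000.0 / tick_rate_hz

print(worst_case_tick_delay_ms(64))  # ~15.6 ms at 64 ticks per second
print(worst_case_tick_delay_ms(20))  # 50.0 ms at 20 ticks per second
```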

Perhaps the most common type of lag is caused by network performance problems. Loss, corruption or jitter (an outdated packet is in effect a loss) may all cause problems, but these are relatively rare in networks with sufficient bandwidth and little or no congestion. Instead, the latency involved in transmitting data between clients and the server plays a significant role. Latency varies depending on a number of factors, such as the physical distance between the end systems, as a longer distance means additional transmission length and routing and therefore higher latency. Routing over the Internet may be extremely indirect, resulting in far more transmission length (and consequent latency) than a direct route, although the cloud gaming service OnLive has developed a solution to this issue by establishing peering relationships with multiple Tier 1 network Internet service providers and choosing an optimal route between server and user.[1] In addition, insufficient bandwidth and congestion, even if not severe enough to cause losses, may cause additional delays regardless of distance. As with the hardware issues, packets that arrive slowly or not at all leave both the client and server unable to update the game state in a timely manner.

Online game systems utilizing a wireless network may be subject to significant lag, depending on the architecture of the wireless network and local electromagnetic interference affecting it. Although radio propagation through air is faster than light through optical fiber, wireless systems are often shared among many users and may suffer latency incurred by network congestion or by network protocols that introduce delay. In the event of electromagnetic interference, transmitted packets may be lost, requiring retransmission, which also incurs latency.

Effects

The noticeable effects of lag vary depending on the exact cause, but also on any techniques for lag compensation that the game may implement (described below). As all clients experience some amount of delay, implementing these methods to minimize the effect on players is important for smooth gameplay. Lag causes numerous problems, such as inaccurate rendering of the game state and faulty hit detection. In many games, lag is frowned upon because it disrupts normal gameplay. The severity of lag depends on the type of game and its inherent tolerance for lag. Some games with a slower pace can tolerate significant delays without any need to compensate at all, whereas others with a faster pace are considerably more sensitive and require extensive use of compensation to be playable (the first-person shooter genre is probably the most prominent example). Due to the various problems lag can cause, players with a low-speed Internet connection, a distant server host, or high latency to other players are often not permitted, or are discouraged, from playing with them. Extreme cases of lag may result in extensive desynchronization of the game state.

Lag caused by an insufficient update rate between client and server can cause some problems, but these are generally limited to the client itself. Other players may notice jerky movement and similar problems for the affected player, but the real problem lies on the client's side. If the client cannot update the game state at a quick enough pace, the player may be shown outdated renditions of the game, which in turn causes various problems with hit and collision detection. If the low update rate is caused by a low frame rate (as opposed to a setting on the client, as some games allow), these problems are usually overshadowed by the many problems related to client-side processing itself: both the display and the controls become sluggish and unresponsive. While this may increase the perceived lag, it is important to note that it is of a different kind than network-related delays. In comparison, the same problem on the server may cause significant problems for all clients involved. If the server is unable or unwilling to accept packets from clients fast enough and process them in a timely manner, client actions may never be registered. When the server then sends out updates to the clients, they may experience freezing (an unresponsive game) and/or rollbacks, depending on what types of lag compensation, if any, the game uses.

Lag due to network delay is in contrast often less of a problem. Though more common, the actual effects are generally smaller, and it is possible to compensate for these types of delays. Without any form of lag compensation, the clients will notice that the game responds only a short time after an action is performed. This is especially problematic in first-person shooters, where enemies are likely to move as a player attempts to shoot them and the margin for errors is often small.

Solutions and lag compensation

There are various methods for reducing or disguising delays, though many of these have drawbacks and may not be applicable in all cases. If synchronization is not possible within the game itself, clients may choose to play on servers in geographical proximity to reduce latency, or servers may simply drop clients with high latencies in order to avoid having to deal with the resulting problems. However, these are hardly optimal solutions. Instead, games are often designed with lag compensation in mind.

Many problems can be solved simply by allowing the clients to keep track of their own state and send absolute states to the server or directly to other clients.[2] For example, a client can state exactly what position it is at or whom it shot. This solution works and will eliminate most problems related to lag. Unfortunately, it also relies on the assumption that the client is honest: nothing prevents a player from modifying the data they send, directly at the client or indirectly via a proxy, in order to ensure they will always hit their targets. In online games, the risk of cheating may make this solution infeasible, and clients will be limited to sending relative states (i.e. which vector they moved or shot in).
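A hypothetical sketch of the difference, with made-up field names, might look like this: the absolute form asserts an outcome the server must trust, while the relative form only reports inputs that the server validates and simulates itself.

```python
# Hypothetical message formats contrasting the two approaches described above.

# Absolute state: the client asserts the outcome; trivially spoofed by cheaters.
absolute_update = {
    "position": (104.2, 55.0, 3.1),   # "I am exactly here"
    "hit_player_id": 17,              # "I shot player 17"
}

# Relative state: the client only reports its inputs; the server computes the
# outcome (where the player ends up, whether the shot actually hit).
relative_update = {
    "move_direction": (0.0, 1.0, 0.0),  # vector the player moved in
    "fire_direction": (0.7, 0.1, 0.7),  # vector the shot was fired in
}
```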

Client-side

As clients are normally not allowed to define the main game state, but rather receive it from the server, the main task of the client-side compensation is to render the virtual world as accurately as possible. As updates come with a delay and may even be dropped, it is sometimes necessary for the client to predict the flow of the game. Since the state is updated in discrete steps, the client must be able to estimate a movement based on available samples. Two basic methods can be used to accomplish this: extrapolation and interpolation.[2]

Extrapolation is an attempt to estimate a future game state. As soon as a packet from the server is received, the position of an object is updated to the new position. While awaiting the next update, the client extrapolates the next position based on the current position and the movement at the time of the update. Essentially, the client assumes that a moving object will continue in the same direction. When a new packet is received, the position may be corrected slightly.
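A minimal dead-reckoning sketch of this idea, assuming each server packet carries a position, a velocity and a timestamp (the field names are illustrative):

```python
# Minimal extrapolation (dead reckoning) sketch; field names are assumptions.

def extrapolate_position(last_update, now):
    """Estimate where the object is now, assuming it kept moving the same way
    it was moving when the last server packet was sent."""
    dt = now - last_update.timestamp
    return (last_update.x + last_update.vx * dt,
            last_update.y + last_update.vy * dt)

def on_server_packet(obj, packet):
    # Replace the estimate with the authoritative data; blending toward it
    # instead of snapping hides the small "warping" correction.
    obj.last_update = packet
```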

Interpolation works by essentially buffering a game state and rendering the game state to the player with a slight, constant delay. When a packet from the server arrives, instead of updating the position of an object immediately, the client will start to interpolate the position, starting from the last known position. Over an interpolation interval, the object will be rendered moving smoothly between the two positions. Ideally this interval should exactly match the delay between packets, but due to loss and variable delay, this is rarely the case.
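A minimal interpolation sketch along these lines, assuming the client keeps a small buffer of timestamped position snapshots and renders the world a fixed delay in the past (values and field names are illustrative):

```python
# Minimal interpolation sketch; snapshots are objects with fields t, x, y.

INTERP_DELAY = 0.1  # seconds; ideally close to the interval between server packets


def interpolated_position(snapshots, now):
    """Render the object at (now - INTERP_DELAY), blending between the two
    snapshots that bracket that time.  `snapshots` is sorted by timestamp."""
    render_time = now - INTERP_DELAY
    for older, newer in zip(snapshots, snapshots[1:]):
        if older.t <= render_time <= newer.t and newer.t > older.t:
            alpha = (render_time - older.t) / (newer.t - older.t)
            return (older.x + (newer.x - older.x) * alpha,
                    older.y + (newer.y - older.y) * alpha)
    # Buffer exhausted (lost or late packets): freeze at the last known
    # snapshot, or fall back on extrapolation as described below.
    last = snapshots[-1]
    return (last.x, last.y)
```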

Both methods have advantages and drawbacks.

  • Interpolation ensures that objects will move between valid positions only and will produce good results with constant delay and no loss. Should dropped or out-of-order packets exhaust the interpolation buffer, the client will have to either freeze the object in position until a new packet arrives, or fall back on extrapolation instead. The downside of interpolation is that it causes the world to be rendered with additional latency, increasing the need for some form of lag compensation to be implemented.
  • The problem with extrapolating positions is fairly obvious: it is impossible to accurately predict the future. It will render movement correctly only if the movement is constant, but this will not always be the case. Players may change both speed and direction at random. This may result in a small amount of "warping" as new updates arrive and the estimated positions are corrected, and also cause problems for hit detection as players may be rendered in positions they are not actually in.

Often, in order to allow smooth gameplay, the client is allowed to make soft changes to the game state. While the server may ultimately keep track of ammunition, health, position etc., the client may be allowed to predict the new server-side game state based on the player's actions, for example by letting a player start moving before the server has responded to the command. These changes will generally be accepted under normal conditions and make the delay mostly transparent. Problems arise only in the case of high delays or losses, when the client's predictions are very noticeably undone by the server. Sometimes, in the case of minor differences, the server may even allow "incorrect" changes to the state based on updates from the client.
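One common way to structure this kind of prediction is sketched below, under the assumption that the server acknowledges the last input sequence number it has processed (a common but not universal design); `simulate` and `transport` are placeholders supplied by the caller.

```python
# Rough sketch of client-side prediction with reconciliation.

class PredictingClient:
    def __init__(self, initial_state, simulate, transport):
        self.predicted_state = initial_state
        self.simulate = simulate        # pure function: (state, input) -> state
        self.transport = transport      # sends inputs to the server
        self.pending_inputs = []        # inputs sent but not yet acknowledged
        self.next_seq = 0

    def on_local_input(self, user_input):
        # Apply the input immediately so the player sees an instant response...
        self.predicted_state = self.simulate(self.predicted_state, user_input)
        # ...and remember it so it can be replayed if the server corrects us.
        self.pending_inputs.append((self.next_seq, user_input))
        self.transport.send(self.next_seq, user_input)
        self.next_seq += 1

    def on_server_update(self, authoritative_state, last_acked_seq):
        # Start over from the server's authoritative state, then re-apply every
        # input it has not yet seen.  Under normal conditions the result matches
        # the prediction, so the correction is invisible to the player.
        self.pending_inputs = [(seq, inp) for seq, inp in self.pending_inputs
                               if seq > last_acked_seq]
        state = authoritative_state
        for _, inp in self.pending_inputs:
            state = self.simulate(state, inp)
        self.predicted_state = state
```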

Server-side

Unlike the clients, the server knows the exact current game state, and as such prediction is unnecessary. The main purpose of server-side lag compensation is instead to provide accurate effects for client actions. This is important because by the time a player's command arrives, time will have moved on and the world will no longer be in the state the player saw when issuing the command. A very explicit example of this is hit detection for weapons fired in first-person shooters, where margins are small and improper handling can cause significant problems.

Do nothing

One potential "solution" is to simply ignore the problem. For hit detection in first-person shooters this means leading one's target, aiming at the position where it will be by the time the shot reaches the server. With variable latency this may be frustrating even under ideal conditions; with higher latency and random player movement it can make playing virtually impossible. For example, if a remote player passes by a window in a period shorter than the client's latency it will be impossible for the local player to hit them even if they fire immediately.
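How far a player must lead is roughly the target's velocity multiplied by the total delay; the sketch below illustrates that back-of-the-envelope calculation (an assumption-laden simplification for an instant-hit weapon and a constant-velocity target).

```python
# Back-of-the-envelope lead calculation for the "do nothing" case.

def required_lead(target_velocity, one_way_latency, interp_delay=0.0):
    """Per-axis distance to aim ahead of where the target is rendered."""
    dt = one_way_latency + interp_delay
    return tuple(v * dt for v in target_velocity)

# A target strafing at 5 m/s seen with 100 ms one-way latency must be led ~0.5 m.
print(required_lead((5.0, 0.0, 0.0), 0.100))  # (0.5, 0.0, 0.0)
```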

However, doing nothing does have the advantage of giving players the truest possible picture of what is happening to their input. In games where the player can only exert indirect control, such as RTS games, it is considered acceptable for the local player's troops to be lagged as long as his or her direct inputs (typically cursor position, unit selection, and camera position) are responsive.

Rewind time

Another way to address the issue is to store past game states for a certain length of time, then rewind player locations when processing a command.[2] The server uses the latency of the player (including any inherent delay due to interpolation; see above) to rewind time by an appropriate amount in order to determine what the shooting client saw at the time the shot was fired. This will usually result in the server seeing the client firing at the target's old position and thus hitting. In the worst case a player will be so far behind that the server runs out of historic data and they have to start leading their targets.
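A simplified sketch of this rewinding, assuming the server keeps a short history of timestamped position snapshots (the data layout and the `hit_test` callback are illustrative assumptions):

```python
# Simplified server-side rewind sketch.
import bisect

HISTORY_SECONDS = 1.0  # how far back the server is willing to rewind


def rewound_positions(history, shooter_latency, interp_delay, now):
    """Return player positions as the shooter saw them when firing.

    `history` is a time-ordered list of (timestamp, snapshot) tuples, where a
    snapshot maps player id -> position."""
    shot_time = now - shooter_latency - interp_delay
    shot_time = max(shot_time, now - HISTORY_SECONDS)  # clamp to stored history

    times = [t for t, _ in history]
    i = bisect.bisect_right(times, shot_time) - 1  # last snapshot at/before shot_time
    return history[max(i, 0)][1]


def process_shot(history, shooter, shot_ray, now, hit_test):
    # Test the shot against where everyone *was*, not where they are now.
    past_positions = rewound_positions(history, shooter.latency,
                                       shooter.interp_delay, now)
    return hit_test(shot_ray, past_positions)
```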

This is a WYSIWYG solution that allows players to aim directly at what they are seeing. But the price is an aggravation of the effects of latency when a player is under fire: not only does their own latency play a part, but their attacker's too. In many situations this is not noticeable, but players who have just taken cover will notice that they carry on receiving damage/death messages from the server for longer than their own latency can justify. This can lead more often to the (false) impression that they were shot through cover and the (not entirely inaccurate) impression of "laggy hitboxes".[2]


One design issue that arises from rewinding is whether to stop rewinding a dead player's lagged commands as soon as they die on the server, or to continue running them until they "catch up" to the time of death. Cutting compensation off immediately prevents victims from posthumously attacking their killers, which meets expectations, but preserves the natural advantage of moving players who round a corner, acquire a target and kill them in less time than a round trip to the stationary victim's client.

Rewinding can be criticised for allowing the high latency of one player to negatively affect the experience of low-latency players. Servers with lag compensation will sometimes reduce the length of player history stored, or enforce ping limits, to reduce this problem.

Make clients extrapolate

A third lag solution is to do nothing on the server and to have each client extrapolate (see above) to cover its latency.[3] This produces incorrect results unless remote players maintain a constant velocity, granting an advantage to those who dodge back and forth or simply start/stop moving.

Extended extrapolation also results in remote players becoming visible (though not vulnerable) when they should not be: for example if a remote player sprints up to a corner then stops abruptly at the edge, other clients will render them sprinting onward, into the open, for the duration of their own latency. On the other side of this problem, clients have to give remote players who just started moving an extra burst of speed in order to push them into a theoretically-accurate predicted location.

Design

It is possible to reduce the perception of lag through game design. Techniques include playing client-side animations as if the action took place immediately, reducing/removing built-in timers on the host machine, and using camera transitions to hide warping.[4]

Cloud gaming

Cloud gaming is a type of online gaming where the entire game is hosted on a game server in a data center, and the user runs only a thin client locally that forwards game controller actions upstream to the game server. The game server then renders the next frame of the game video, which is compressed using low-lag video compression, sent downstream and decompressed by the thin client. For the cloud gaming experience to be acceptable, the round-trip lag of all elements of the cloud gaming system (the thin client, the Internet and/or LAN connection to the game server, the game execution on the game server, the video and audio compression and decompression, and the display of the video on a display device) must be low enough that the user perceives the game as running locally.[5][1] Because of such tight lag requirements, the speed of light through optical fiber comes into play, currently limiting the distance between a user and a cloud gaming server to approximately 1000 miles, according to OnLive, the only company thus far operating a cloud gaming service.[6]
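The rough arithmetic behind that distance limit (figures approximate, for illustration only): light in optical fiber travels at roughly two thirds of its speed in vacuum, so propagation alone consumes a noticeable share of the round-trip budget before encoding, rendering, decoding and display are even counted.

```python
# Illustrative arithmetic: propagation delay through optical fiber over
# roughly 1000 miles, before any processing time is counted.

SPEED_IN_FIBER_KM_S = 200_000        # roughly two thirds of the speed of light
distance_km = 1000 * 1.609           # ~1000 miles, the OnLive figure quoted above

one_way_ms = distance_km / SPEED_IN_FIBER_KM_S * 1000
round_trip_ms = 2 * one_way_ms
print(f"propagation alone: ~{round_trip_ms:.0f} ms round trip")  # ~16 ms

# Capture, encoding, server-side rendering, decoding and display latency all
# add to this before the player sees a response to their input.
```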

There is also much controversy about the lag associated with cloud gaming. In typical multiplayer games the player's computer renders the game's graphics locally, and only information about the player's in-game actions is sent to the server. For example, when the player presses a button, the character on-screen instantly performs the corresponding action, whereas the consequences of the action, such as an enemy being killed, are only seen after a short delay due to the time taken for the action to reach the server. This is acceptable because, as long as the player's character responds instantly, the player still feels in complete control.

With cloud gaming, when the player presses a button, nothing appears to happen for a short while. The button press must first be transmitted to the remote server, which takes time. The server must then render the graphics of the action being performed and stream the video back to the player over the network, which again takes time. The player thus experiences a noticeable delay between pressing a button and seeing something happen on-screen. Depending on the skill and experience of the player, this can cause disorientation and confusion similar to Delayed Auditory Feedback, and it hampers navigation and aiming in the game world. When rapidly inputting a long combination move, the on-screen character will not be synchronized with the button presses, which usually causes severe confusion in the player and results in the failure of the combination move.

The extra input lag can also make it very difficult to play certain single player games. For example, if an enemy takes a swing at the player and the player is expected to block, then by the time the player's screen shows that the enemy has commenced attacking, the enemy would have already struck and killed the player on the server.

See also

  • Client-side prediction

References

  1. "The Process of Invention: OnLive Video Game Service". The Fu Foundation School of Engineering & Applied Science (Columbia University). Retrieved 2010-01-23.
  2. Bernier, Yahn (2001). "Latency Compensating Methods in Client/Server In-game Protocol Design and Optimization". Valve Corporation. Retrieved 17 September 2011.
  3. Gibson, John (5 December 2010). "Re: Will HoS present the netcode disadvantages of UE3?". Tripwire Interactive. Retrieved 18 September 2011.
  4. Aldridge, David (2011). "I Shot You First: Networking the Gameplay of HALO: REACH". Game Developers Conference 2011. GDC Vault.
  5. "D8 Video: OnLive demoed on iPad, PC, Mac, Console, iPhone". Wall Street Journal (2010-08-09). Retrieved 2010-08-19.
  6. "Beta Testing at the Speed of Light". OnLive (2010-01-21). Retrieved 2010-01-23.
This page uses content from Wikipedia. The original article was at Lag. The list of authors can be seen in the page history. As with Muds Wiki, the text of Wikipedia is available under the Creative Commons Attribution-Share Alike License 3.0 (Unported) (CC-BY-SA).