Latency
Introduction
The definition of latency depends on the system being observed and the nature of the stimulation; in general, latency is the time delay between a stimulus and the response of the system being observed.
Like any physical system, the player exhibits latency as well, generated mainly by its internal rendering engine; in most cases, this rendering latency is not noticeable to humans.
Rendering latency
In order to render the content, the player must first perform a multitude of tasks, such as executing the SVG & JavaScript code, decoding the media resources, preparing the output frames, etc.; each of these tasks takes a variable amount of time. To ensure a smooth output at the frame rate configured under display settings, the player needs to buffer the output frames in advance, which results in a rendering latency.
The player's buffering model is similar to a queue: the frame buffers are filled by the rendering engine on one side and emptied on the other side as the content is displayed on the screen. The buffering duration has a default upper limit of 1 second. When the content is complex and the player detects that buffering cannot keep up with the frame output, it will either lower the frame rate or drop some frames.
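For example, if most frames render in a few milliseconds but one occasionally takes a few hundred milliseconds, the default 1-second buffer gives the player enough queued frames to absorb the spike without a visible stutter; with a much smaller buffer, the same spike would force the player to lower the frame rate or drop frames.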
In most cases, the rendering latency is not noticeable and the default settings should be kept to prevent frames from being dropped. However, the rendering latency does have a noticeable effect in the case of interactive content and when playing real-time streaming media. In these cases, you might need to reduce the rendering latency for faster response to interactive events or to lower the delay relative to the streaming source. Note that there are also cases when the rendering latency should not be reduced.
How to reduce the rendering latency

The two settings that control the rendering latency are found on Control Center → "Advanced Applications" → Interactivity page.
Maximum rendering latency
The player's rendering latency can be reduced by changing the value of the "Maximum rendering latency" option from 1000ms to 500ms or 250ms.
Reducing the maximum rendering latency should be done with care, as the available buffer becomes smaller. A lower latency leaves the player less headroom to render complex content and makes dropped frames more likely.
- Testing can be done by gradually reducing the latency while checking the info.log file to make sure that the player is not dropping any frames (factor 1-1, Nb Pic) and that the maximum rendering time of any frame stays within the allowed overhead (note the maximum time "Max: XXXms" value).
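As a rough, illustrative calculation (the frame rate is an assumption, not a value stated above): at 25 fps, a 1000 ms buffer corresponds to about 25 queued frames of headroom, while a 250 ms buffer corresponds to only about 6, so a single frame that takes longer than roughly 250 ms to render can no longer be absorbed and is likely to be dropped.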
Interactivity boost
To improve interactivity even further, the player can temporarily ignore the value of the "Maximum rendering latency" option and reduce its rendering latency to only 60ms when the "Limit / Reduce latency to 60ms when events are received" option is enabled (which it is by default). In this case, the following happens:
- The first interactive event is processed after the "regular" latency period.
- After an interactive event is detected, the player reduces its rendering latency to 60ms, so that subsequent interactive events are processed very quickly.
- If no interactive event is received within the following 90 seconds, the player reverts the rendering latency back to the "regular" value.
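In practice, with the default settings this means the first interactive event can be delayed by up to about 1 second, subsequent events are reflected within roughly 60 ms, and the player returns to the 1-second latency once no events have been received for 90 seconds.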
Ultra-low latency
Starting with the 4.0.2 firmware release, it is possible to configure the DSOS players to operate using an ultra-low latency mode.
To reduce the player rendering latency to ultra-low values, follow these steps:
- Download and extract the archive on the right, containing configuration files for setting the player rendering latency to 50ms, 100ms, or 200ms.
- Apply the desired configuration file on the player.
When not to reduce the rendering latency
- When using audio in your project, the "Reduce latency to 60ms when events are received" option must be disabled. Moreover, the "Maximum rendering latency" option should be set to at least 500ms.
- When the content is very complex, the "Reduce latency to 60ms when events are received" option should be disabled, because temporarily reducing the latency to such a low value might have a negative impact on the quality of the output. Also, the "Maximum rendering latency" option can be lowered only if there are no performance issues (see the explanation above).
- When some interactive events are skipped (i.e., the first interactive event might appear to be "lost"), the "Reduce latency to 60ms when events are received" option should be disabled.
Streaming latency
For the special case of streaming, the latency is computed as follows:
Total latency = Encoding latency + Network latency + Stream buffering latency + Rendering latency + Screen latency
- Encoding latency
- This is the time required by the streaming hardware to prepare and send the stream packets, usually around one second. It cannot be controlled by the player, although some encoders allow this latency value to be adjusted.
- Network latency
- This is the time for the network packets to travel from the streaming source to the player. This cannot be controlled by the player, but is usually negligible when the streaming server is on the same network as the player.
- Stream buffering latency
- This is the time required by the player to buffer the incoming network packets; it is needed to avoid image freezing when network packets are not received in time. The behavior differs depending on the firmware version, as follows:
- Starting with firmware 4.0.0, the buffering delay depends on the source type:
- For RTP sources, the player buffers 750 ms of video and 350 ms of audio by default; this buffer can be increased (or reduced, starting with firmware 4.0.2) using the spx:buffering SVG attribute (see the example after this list).
- For MPEG2 transport stream sources, the buffering delay is extracted from the embedded DTS and PTS timestamps of each stream, and is thus under control of the streaming source.
- For firmware versions lower than 4.x, the player buffers at least 10 frames of video (equal to 333 ms at 30 fps or 400 ms at 25 fps). Note that large variations in the bitrate of the streaming media can increase the buffering up to 20 frames.
- Screen latency
- This is induced by the refresh frequency of the screen and should not be more than 30-40 ms. In some cases, image processing options might contribute to the screen latency; disabling them helps in such cases.
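As an illustration of the spx:buffering attribute mentioned above, a minimal SVG sketch follows. The spx namespace URI, the stream URL, and the millisecond value syntax are assumptions used for the example only and are not taken from this article; check the player documentation for the exact syntax supported by your firmware.

    <svg xmlns="http://www.w3.org/2000/svg"
         xmlns:xlink="http://www.w3.org/1999/xlink"
         xmlns:spx="http://www.spinetix.com/namespace/1.0"
         width="1920" height="1080" viewBox="0 0 1920 1080">
      <!-- Hypothetical RTP source; spx:buffering lowers the stream buffering
           below the 750 ms default (the "250ms" value syntax is an assumption). -->
      <video xlink:href="rtp://239.1.1.1:5004"
             width="1920" height="1080"
             spx:buffering="250ms"/>
    </svg>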

For example, in a typical setup the total latency might add up to:
Total latency = 1s + 0s + 0.75s + 1.5s + 0.04s ≈ 3.3s
How to reduce the streaming latency
Using the default values works for most cases, but in some cases a low latency might be required (for instance when security cameras are employed); to achieve that, follow these steps:
- Check if the encoding latency can be reduced - some devices might have a low-latency option which can be enabled.
- If the streaming server is not on the same network as the player, try to put them within the same network (even better, isolate that network from the rest of the infrastructure).
- Reduce the maximum rendering latency of the player from the default 1000ms down to 500ms or 250ms (when audio is not used), taking into account the complexity of the content.
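As a purely illustrative calculation (the encoder and buffering figures below are hypothetical): with a low-latency encoder at around 0.3 s, negligible network latency, stream buffering reduced to about 0.25 s via spx:buffering, a maximum rendering latency of 250 ms, and roughly 0.04 s of screen latency, the total latency would drop to approximately 0.84 s.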
