
Network stream input

Note  
Applies to all DSOS players except HMP300 and DiVA. It requires the SYSTEMS Feature Set.

Overview

Network Stream Input allows SpinetiX players to decode and display real-time streaming media received over an IP network. Supported protocols include RTSP, RTP, UDP, SRT, and SDP. This feature is ideal for integrating live feeds, IP cameras, or broadcast streams into your digital signage content.

Streaming media is multimedia that is constantly received by and presented to an end user while being delivered by a streaming server – a hardware/software product solely responsible for delivering streaming media. This is in contrast with a traditional web server, which delivers all forms of web content, including HTML text, images, etc. See the Web streaming (HTML5) page for more details.

The streaming media must be compliant with the player specifications, as detailed on the Video decoding page, and is subject to various types of latency. See also the collection of encoders and streamers that can be used for media streaming over the network and that should be compatible with the player.

If you are experiencing problems with a stream despite it conforming to the specifications, please see the Troubleshooting section below.

Configuration

Add a streaming layer

To include real-time streaming media inside your content, follow these steps:

  1. Create an Elementi project or open an existing one.
  2. Add a streaming layer using the "Add Streaming Layer" button on the toolbar.
  3. Enter the URI of that streaming media. See the Streaming formats section below for more details.
  4. Adjust size and position within your layout.
  5. Test playback on the player (a quick pre-check from a PC is sketched below).
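Before testing on the player, it can help to inspect the stream from a PC and compare its characteristics against the Video decoding page. Below is a minimal sketch in Python, assuming FFmpeg's ffprobe tool is installed on the PC; the stream URI is a placeholder taken from the examples on this page.

import json
import subprocess

# Placeholder stream URI - replace with the URI used in your streaming layer.
STREAM_URI = "rtsp://172.21.3.121/high"

# Ask ffprobe (part of FFmpeg) to describe every stream as JSON.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", STREAM_URI],
    capture_output=True, text=True, timeout=30, check=True,
)

# Print the codec and resolution of each stream found.
for stream in json.loads(result.stdout).get("streams", []):
    if stream.get("codec_type") == "video":
        print("video:", stream.get("codec_name"), stream.get("width"), "x", stream.get("height"))
    elif stream.get("codec_type") == "audio":
        print("audio:", stream.get("codec_name"))

If the reported codecs or resolution fall outside the player specifications, re-encode the stream before testing it on the player.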

Tutorials

Streaming formats

The RTSP, RTP, UDP, SRT, and SDP streaming protocols are natively supported by SpinetiX players.

These should refer to an MPEG-2 Transport Stream containing video streams (MPEG 1/2/4/H264/H.265) & audio streams (AAC, MP3, LPCM) within the player specifications (see the Video decoding page for more details).

  • RTSP and SDP can also refer to a raw bitstream of MPEG 1/2/4/H264.

RTSP

The Real Time Streaming Protocol (RTSP) is a network control protocol designed to control streaming media servers. The protocol is used for establishing and controlling media sessions between end points. The transmission of the streaming data itself is not a task of RTSP – the streaming is usually done with RTP over UDP.

Syntax:

rtsp://{Streaming_Server}/{Path}
rtsp://{user}:{password}@{Streaming_Server}/{Path}

For example:

  • local streaming server: rtsp://172.21.3.121/high or rtsp://172.21.3.120/axis-media/media.amp
  • external Wowza streaming server: rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4
Notes:
  • The streaming server determines by default whether unicast or multicast is used – to force the player to request one or the other, set the spx:transport attribute value to "multicast" or "unicast" from Layer Properties → Advanced tab.
  • When the stream source is not on the same network as the player / Elementi, the stream is usually blocked by a firewall or by a router performing network address translation (because the server uses UDP by default as the transport for the RTP packets). In such cases, make sure to activate the "Use TCP transport ... " option when adding the streaming layer, to force the RTP packets to be sent interleaved over the RTSP TCP socket (see the sketch below).
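To reproduce this TCP-interleaved behaviour from a PC while diagnosing firewall/NAT issues, here is a minimal sketch, assuming Python and FFmpeg's ffprobe are installed; the URL is the public Wowza example above.

import subprocess

# Public example URL from this page - replace with your own stream.
STREAM_URI = "rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4"

# "-rtsp_transport tcp" makes ffprobe request RTP interleaved over the RTSP
# TCP socket, which is what the "Use TCP transport ..." layer option does.
subprocess.run(
    ["ffprobe", "-v", "error", "-rtsp_transport", "tcp", "-show_format", STREAM_URI],
    check=True, timeout=60,
)

If the probe succeeds over TCP but fails with the default UDP transport, enabling the "Use TCP transport ... " option on the streaming layer is the likely fix.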

RTP

The Real-time Transport Protocol (RTP) defines a standardized packet format for delivering audio and video over IP networks. RTP is designed for end-to-end, real-time transfer of stream data.

Syntax:

rtp://{Multicast_Address}:{Port}
rtp://@:{Port}
  • The first form is used for a multicast RTP source, the second for a unicast RTP source.
  • Example: rtp://239.192.1.21:5000

UDP

The User Datagram Protocol (UDP) uses a simple transmission model with a minimum of protocol mechanism. UDP is suitable for purposes where error checking and correction is either not necessary or performed in the application, avoiding the overhead of such processing at the network interface level.

Syntax:

udp://{Multicast_Address}:{Port}
udp://@:{Port}
  • The first form is used for a multicast UDP source, the second for a unicast UDP source.
  • Example: udp://239.192.1.21:5000
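For testing RTP or UDP playback, a test MPEG-2 Transport Stream can be multicast from a PC. Below is a minimal sketch, assuming Python and FFmpeg are installed and that test.mp4 is a placeholder for a local file whose codecs are within the player specifications.

import subprocess

# Multicast a local file as an MPEG-2 Transport Stream; the player can then
# play it with udp://239.192.1.21:5000 (use "-f rtp_mpegts" instead of
# "-f mpegts" to target rtp://239.192.1.21:5000).
subprocess.run(
    [
        "ffmpeg",
        "-re",                      # read the input at its native frame rate
        "-i", "test.mp4",           # placeholder input file
        "-c:v", "libx264",          # H.264 video
        "-c:a", "aac",              # AAC audio
        "-f", "mpegts",             # wrap into an MPEG-2 Transport Stream
        "udp://239.192.1.21:5000?ttl=1",
    ],
    check=True,
)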

SRT

Secure Reliable Transport (SRT) is a video transport protocol that optimizes streaming performance across unpredictable networks, such as the Internet. It provides reliable and secure streaming with low latency, making it suitable for high-quality video transmission. SRT provides connection, control, and reliable transmission (similar to TCP but using UDP protocol as an underlying transport layer). It supports packet recovery while maintaining low latency (default: 120 ms). SRT supports end-to-end encryption with AES.

Syntax:

srt://{Streaming_Server}:{Port}?{Params}

where:

  • Streaming_Server ⇾ remote IP address or hostname to connect to (mandatory)
  • Port ⇾ remote port to connect to (mandatory)
  • Params ⇾ optional key-value parameters, such as: "mode", "passphrase", "streamid", "latency", etc. For the full list of parameters, refer to the SRT documentation.

Examples:

  • srt://10.10.10.100:5001
  • srt://remote.host.com:8888?passphrase=secretpassword
  • srt://channellink.vitec.com:42001?streamid=spinetix
Note:
Any special character in the URI must be properly percent-encoded; for instance, if the passphrase is "my@password", it must be provided as "my%40password".
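As a minimal sketch of the percent-encoding mentioned above, the URI can be built with Python's standard library:

from urllib.parse import quote

# Example passphrase from the note above; quote() percent-encodes "@" as "%40".
passphrase = "my@password"
uri = "srt://10.10.10.100:5001?passphrase=" + quote(passphrase, safe="")
print(uri)  # srt://10.10.10.100:5001?passphrase=my%40password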

SDP file

Note  
See the full article related to SDP file for more details.

The Session Description Protocol (SDP) is a format for describing the initialization parameters of streaming media sessions. SDP does not deliver media itself but is used for negotiation between end points of media type, format, and all associated properties.

The following are supported:

  • SDP that refers to an MPEG2TS via RTP or UDP (all valid codecs for use in an MPEG2TS are supported - MPEG 1/2/4/H264);
  • SDP that refers to a raw bitstream of MPEG 1/2/4/H264.
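For illustration, here is a minimal sketch of what such an SDP file might contain for an MPEG-2 Transport Stream multicast over RTP (payload type 33 is the static RTP type for MP2T); the multicast address, TTL, and port are placeholders, and the exact file for your setup may differ. The sketch writes it out with Python for convenience.

# Minimal example SDP describing an MPEG-2 TS multicast over RTP
# (placeholder address 239.192.1.21, TTL 1, port 5000).
SDP = """\
v=0
o=- 0 0 IN IP4 239.192.1.21
s=MPEG-2 TS over RTP
c=IN IP4 239.192.1.21/1
t=0 0
m=video 5000 RTP/AVP 33
a=rtpmap:33 MP2T/90000
"""

with open("session.sdp", "w") as f:
    f.write(SDP)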

MMS

Applies only to HMP200, HMP130, and HMP100 devices.

Microsoft Media Server (MMS), a Microsoft proprietary network-streaming protocol, serves to transfer unicast data in Windows Media Services. Microsoft deprecated MMS in favor of RTSP in 2003. The MMS protocol is supported by the player as a legacy feature, though its usage is strongly discouraged. To include an MMS source inside your project, set the URI of a media layer to mms://{Path} or to point to an ASX file.

Notes:
  • For MMS links, the layer mime type property must be manually set to "video/x-ms-asf".
  • The ASX file is a legacy feature for the support of the MMS format – an .ASX file is required when streaming from a Windows Media Server using a "publishing point"; a direct link to a WMS publishing point is not supported.

Unsupported

This list is not exhaustive!

Multiscreen streaming

Applies to HMP400/W, iBX410/W, iBX440 with SYSTEMS Feature Set, and HMP350 (firmware 4.1.0 or later).

When adding a streaming layer within a multiscreen project, Elementi shows a warning message saying that "Streaming source should be used in multiscreen projects only if displayed on a single screen or a sync variable is set". If that layer is positioned within a single screen, this is just a normal case of streaming. But if the layer spans multiple screens, you need to make the following changes in Elementi:

  1. Click the Layer Properties button to open the "Layer Properties" dialog.
  2. Go to "Advanced" tab.
  3. Click twice on the "Click to add..." field under the "Name" column until the selection box is expanded.
  4. Select "spx:syncVar" attribute from the list.
  5. Click on the right column next to it (under the "Value" column).
  6. Enter a unique name and press the "Enter" key to validate.
  7. Click "OK" button to save the changes.

Next, make sure the stream is an MPEG-2 Transport Stream multicast over RTP or UDP and follow these steps to synchronize the players.

ONVIF compliance

ONVIF is a "global open standard for the interface of physical IP-based security products". Some IP cameras mention ONVIF compliance. The SpinetiX players are not solely IP security products, so there are parts of ONVIF that are not relevant. The "real-time streaming" section of the ONVIF specification is largely compatible with the SpinetiX devices, including support for RTP, RTSP, H264 and AAC.

Troubleshooting

Checklist

When the stream is not showing on the player, please do the following:

  1. Check that the streaming server is properly set up and is sending the network packets.
  2. Check that there are no constraints within your network, such as firewalls or network switches/routers, that could block the packets (a quick multicast reception check is sketched after this list).
  3. Check that the stream can be played by a multimedia player (like VLC) from a PC.
  4. Check that the stream can be played by Elementi.
  5. Check that the stream characteristics are within the player specifications. See Video decoding page for more details.

Send the result of each step together with your enquiry to speed up the diagnostic process.
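For steps 1 and 2, when the stream is multicast, a minimal sketch in Python (using the example multicast address and port from this page as placeholders) can confirm from a PC on the same network whether any packets are arriving at all:

import socket
import struct

# Placeholder values - use the multicast address and port of your stream.
GROUP, PORT = "239.192.1.21", 5000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Ask the kernel to join the multicast group on the default interface.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

sock.settimeout(5)
try:
    data, sender = sock.recvfrom(2048)
    print("received", len(data), "bytes from", sender)
except socket.timeout:
    print("no packets received - check the server, firewalls and IGMP snooping")

If no packets arrive on the PC either, the problem is on the server or network side rather than on the player.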

Known errors

  • Black/green screen on connection to stream – this usually means that the video being streamed is outside the player specification (see Video decoding page for more details).
    • If all codecs are correct and the stream resolution is within range, then this can indicate that the stream quality is too low for the player to decode the video. Check that the network throughput is sufficient.
  • Audio stream not playing – when the streaming media contains only an audio stream, without a video stream, Elementi / the player may display an image with a red cross and a warning sign instead of playing the actual stream. To fix this, manually set the mime type of the streaming layer containing the stream to either audio/mpeg or audio/x-ms-asf, depending on the audio codec used by the audio stream.
  • Packets lost – the playback is not smooth and player.log is full of errors like the following (a log-scanning sketch follows this list):
    Continuity loss of video stream in MPEG2TS stream detected
    and/or
    Continuity loss of audio stream in MPEG2TS stream detected
    • This indicates that the player is either:
      • not receiving all the packets of the stream because of network congestion / insufficient throughput → consider lowering the bitrate or using multicast (see the Bandwidth management page for more tips).
      • receiving too many packets, so that the player is overloaded → check the usage data in player.log.
  • Image freeze or audio lost – this is the result of a non-optimal multiplexing of the audio and video streams that requires more than 3 MB of buffering of the audio or video stream to keep it in sync with the other. In this case, the player.log has the following error:
    Stream demuxing failed because more than 3145728 bytes of buffering is required
    • Reconnecting to the streaming media (as often as needed) should fix the image freeze or the audio loss. Alternatively, the spx:maxDemuxingBufferSize attribute could be increased.
  • MPEG-2 AAC audio with ADTS framing is not supported – MPEG-2 AAC audio is supported only for normal media, but not for streaming media. The solution is to check if the streaming media contains another audio stream that is supported (like MPEG-4 AAC, MP3, etc.) and select that one instead.
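As a minimal sketch, the messages quoted above can be counted in a downloaded copy of player.log to gauge how frequent the packet-loss or demuxing errors are (the file name is a placeholder):

from collections import Counter

# Error messages quoted in the "Known errors" list above.
PATTERNS = [
    "Continuity loss of video stream",
    "Continuity loss of audio stream",
    "Stream demuxing failed",
]

counts = Counter()
with open("player.log", errors="replace") as log:   # placeholder file name
    for line in log:
        for pattern in PATTERNS:
            if pattern in line:
                counts[pattern] += 1

for pattern in PATTERNS:
    print(counts[pattern], pattern)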

Stream capture

For streaming-related problems, you might be asked to capture your stream and send a sample to us for analysis. The capture can be done directly on the player or with Elementi; the procedure for each is detailed below.

Stream capture on the player

  1. Open Control Center → Network → Logging.
  2. Click on the "Capture stream packets" checkbox option to enable network packet capture.
  3. Re-publish your project, ideally containing only the stream layer(s), to allow for a clean log.
  4. Wait 2–3 minutes for streaming packets to be captured.
  5. Click again on the "Capture stream packets" option to disable network packet capture.
    • The page is reloaded and a link whose name starts with "CAP_", followed by a unique ID, appears above the option.
  6. Click the CAP_ link and download the file(s) found in that location – for instance, you can find two files named packets.dmp and session.sdp.
  7. Upload the file(s) to a sharing website (like OneDrive, Google Drive, Dropbox, etc.) and provide the access link.

Stream capture using Elementi

  1. Open Elementi.
  2. Click Menu > Help and select the "Capture Streaming Packets" option.
  3. Open the project containing the streaming source. Leave the stream running or try to connect for around 2 minutes.
  4. Close the project.
  5. Open the Local Application Data folder of Elementi by typing the following command into the Run utility (Windows key + R):
    shell:Local AppData\SpinetiX\Elementi\capture
  6. Upload the stream capture to a file-sharing website (Dropbox, etc.) and provide the access link.

See also