Read a stream
Compatibility matrix
Live streams can be read from the server with the following protocols and codecs:
| protocol | variants | video codecs | audio codecs |
|---|---|---|---|
| SRT | | H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3 |
| WebRTC | WHEP | AV1, VP9, VP8, H265, H264 | Opus, G722, G711 (PCMA, PCMU) |
| RTSP | UDP, UDP-Multicast, TCP, RTSPS | AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG and any RTP-compatible codec | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM and any RTP-compatible codec |
| RTMP | RTMP, RTMPS, Enhanced RTMP | AV1, VP9, H265, H264 | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G711 (PCMA, PCMU), LPCM |
| HLS | Low-Latency HLS, MP4-based HLS, legacy HLS | AV1, VP9, H265, H264 | Opus, MPEG-4 Audio (AAC) |
We provide instructions for reading with the following software:
- FFmpeg
- GStreamer
- VLC
- OBS Studio
- Unity
- Web browsers
Protocols
SRT
SRT is a protocol that allows publishing and reading live data streams, providing encryption, integrity and a retransmission mechanism. It is usually used to transfer media streams encoded with MPEG-TS. In order to read a stream from the server with the SRT protocol, use this URL:
srt://localhost:8890?streamid=read:mystream
Replace mystream with the path name.
If you need to use the standard stream ID syntax instead of the custom one in use by this server, see Standard stream ID syntax.
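For reference, a read request expressed with the standard stream ID syntax looks roughly like the following. This is a sketch assuming default settings (m=request selects reading in the SRT access control syntax); the exact set of keys depends on your configuration, so check the linked section for the authoritative format:
srt://localhost:8890?streamid=#!::m=request,r=mystream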
Some clients that can read with SRT are FFmpeg, GStreamer and VLC.
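For instance, a GStreamer pipeline along these lines can display the stream. This is a minimal sketch that assumes a recent GStreamer with the SRT plugin and a stream carrying H264 inside MPEG-TS:
gst-launch-1.0 srtsrc uri="srt://localhost:8890?streamid=read:mystream" \
! tsdemux ! h264parse ! avdec_h264 ! autovideosink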
WebRTC
WebRTC is an API that makes use of a set of protocols and methods to connect two clients together and allow them to exchange live media or data streams. You can read a stream with WebRTC and a web browser by visiting:
http://localhost:8889/mystream
WHEP is a WebRTC extension that allows reading streams by using a URL, without passing through a web page. This allows using WebRTC as a general purpose streaming protocol. If you are using software that supports WHEP, you can read a stream from the server by using this URL:
http://localhost:8889/mystream/whep
Be aware that not all browsers can read every codec; check Supported browsers.
Depending on the network, it may be difficult to establish a connection between the server and clients; read Solving WebRTC connectivity issues.
Some clients that can read with WebRTC and WHEP are FFmpeg, GStreamer, Unity and web browsers.
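Since WHEP signaling is plain HTTP, the exchange can be illustrated with curl. This is a sketch assuming offer.sdp is a local file containing a valid SDP offer; a real client also performs ICE negotiation and receives the media itself:
curl -X POST -H "Content-Type: application/sdp" --data-binary @offer.sdp \
http://localhost:8889/mystream/whep
The response body contains the SDP answer of the server.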
RTSP
RTSP is a protocol that allows publishing and reading streams. It supports different underlying transport protocols and encryption (see RTSP-specific features). In order to read a stream with the RTSP protocol, use this URL:
rtsp://localhost:8554/mystream
Some clients that can read with RTSP are FFmpeg, GStreamer and VLC.
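For instance, the stream can be played with FFmpeg's ffplay. Forcing the TCP transport is optional, but it helps when UDP packets are blocked or dropped by the network:
ffplay -rtsp_transport tcp rtsp://localhost:8554/mystream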
Latency
The RTSP protocol doesn't introduce any latency by itself. Latency is usually introduced by clients, which put frames in a buffer to compensate for network fluctuations. The best way to decrease latency is therefore to tune the client. For instance, in VLC, latency can be decreased by lowering the Network caching parameter, available in the Open network stream dialog, or alternatively it can be set from the command line:
vlc --network-caching=50 rtsp://...
RTMP
RTMP is a protocol that allows reading and publishing streams, but it is less versatile and less efficient than RTSP and WebRTC (it doesn't support UDP, doesn't support most RTSP codecs and doesn't support a feedback mechanism). Streams can be read from the server by using this URL:
rtmp://localhost/mystream
Some clients that can read with RTMP are FFmpeg, GStreamer and VLC.
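As an example, a GStreamer pipeline along these lines can display the stream. This is a sketch that assumes the rtmp2 plugin is installed and that the stream carries H264 video inside the FLV container:
gst-launch-1.0 rtmp2src location="rtmp://localhost/mystream" \
! flvdemux name=demux demux.video ! h264parse ! avdec_h264 ! autovideosink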
HLS
HLS is a protocol that works by splitting streams into segments, and by serving these segments and a playlist with the HTTP protocol. You can use MediaMTX to generate an HLS stream, which is accessible through a web page:
http://localhost:8888/mystream
and can also be accessed without a browser, by software that supports the HLS protocol (for instance VLC or MediaMTX itself), by using this URL:
http://localhost:8888/mystream/index.m3u8
Some clients that can read with HLS are FFmpeg, GStreamer, VLC and web browsers.
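For instance, the playlist URL above can be opened directly with ffplay:
ffplay http://localhost:8888/mystream/index.m3u8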
LL-HLS
Low-Latency HLS is a recently standardized variant of the protocol that greatly reduces playback latency. It works by splitting segments into parts, which are served before the segment is complete. LL-HLS is enabled by default. If the stream is not shown correctly, try tuning the hlsPartDuration parameter, for instance:
hlsPartDuration: 500ms
Codec support in browsers
The server can produce HLS streams with a variety of video and audio codecs (that are listed at the beginning of the README), but not all browsers can read all codecs due to internal limitations that cannot be overcome by this or any other server.
You can check what codecs your browser can read with HLS by using this tool.
If you want to support most browsers, you can re-encode the stream with the H264 and AAC codecs, for instance by using FFmpeg:
ffmpeg -i rtsp://original-source \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-c:a aac -b:a 160k \
-f rtsp rtsp://localhost:8554/mystream
Encryption required by Apple devices
In order to correctly display Low-Latency HLS streams in Safari running on Apple devices (iOS or macOS), a TLS certificate is needed and can be generated with OpenSSL:
openssl genrsa -out server.key 2048
openssl req -new -x509 -sha256 -key server.key -out server.crt -days 3650
Set the hlsEncryption, hlsServerKey and hlsServerCert parameters in the configuration file:
hlsEncryption: yes
hlsServerKey: server.key
hlsServerCert: server.crt
Keep also in mind that not all H264 video streams can be played on Apple devices due to some intrinsic properties (distance between I-frames, profile). If the video can't be played correctly, you can either:
- re-encode it by following the instructions in this README
- disable the Low-Latency variant of HLS and go back to the legacy variant:
hlsVariant: mpegts
Latency
In HLS, latency is introduced because a client must wait for the server to generate segments before downloading them. This latency amounts to about 500ms-3s when the Low-Latency HLS variant is enabled (as it is by default); otherwise, it amounts to 1-15 seconds.
To decrease the latency, you can:
- try decreasing the hlsPartDuration parameter (see the configuration sketch after this list)
- try decreasing the hlsSegmentDuration parameter
- The segment duration is influenced by the interval between the IDR frames of the video track. An IDR frame is a frame that can be decoded independently of the others. The server adjusts the segment duration in order to include at least one IDR frame in each segment; therefore, you need to decrease the interval between the IDR frames. This can be done in two ways:
  - if the stream is being hardware-generated (i.e. by a camera), there's usually a setting called Key-Frame Interval in the camera configuration page
  - otherwise, the stream must be re-encoded. It's possible to tune the IDR frame interval by using FFmpeg's -g option:
    ffmpeg -i rtsp://original-stream -c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k -max_muxing_queue_size 1024 -g 30 -f rtsp rtsp://localhost:$RTSP_PORT/compressed
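As a starting point, both duration parameters can be lowered in the configuration file. This is a minimal sketch: the values below are illustrative assumptions and must be tuned against your stream and network:
hlsPartDuration: 100ms
hlsSegmentDuration: 700ms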
Software
FFmpeg
FFmpeg can read a stream from the server in several ways. The recommended one is reading with RTSP.
FFmpeg and RTSP
ffmpeg -i rtsp://localhost:8554/mystream -c copy output.mp4
FFmpeg and RTMP
ffmpeg -i rtmp://localhost/mystream -c copy output.mp4
In order to read AV1, VP9, H265, Opus or AC-3 tracks, and in order to read multiple video or audio tracks, the -rtmp_enhanced_codecs flag must be present:
ffmpeg -rtmp_enhanced_codecs ac-3,av01,avc1,ec-3,fLaC,hvc1,.mp3,mp4a,Opus,vp09 \
-i rtmp://localhost/mystream -c copy output.mp4
FFmpeg and SRT
ffmpeg -i 'srt://localhost:8890?streamid=read:test' -c copy output.mp4
GStreamer
GStreamer can read a stream from the server in several ways. The recommended one is reading with RTSP.
GStreamer and RTSP
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/mystream latency=0 ! decodebin ! autovideosink
GStreamer and WebRTC
GStreamer also supports reading streams with WebRTC/WHEP, although track codecs must be specified in advance through the video-caps and audio-caps parameters. Furthermore, if audio is not present, audio-caps must be set anyway and must point to a PCMU codec. For instance, the command for reading a video-only H264 stream is:
gst-launch-1.0 whepsrc whep-endpoint=http://127.0.0.1:8889/stream/whep use-link-headers=true \
video-caps="application/x-rtp,media=video,encoding-name=H264,payload=127,clock-rate=90000" \
audio-caps="application/x-rtp,media=audio,encoding-name=PCMU,payload=0,clock-rate=8000" \
! rtph264depay ! decodebin ! autovideosink
While the command for reading an audio-only Opus stream is:
gst-launch-1.0 whepsrc whep-endpoint="http://127.0.0.1:8889/stream/whep" use-link-headers=true \
audio-caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111,clock-rate=48000,encoding-params=(string)2" \
! rtpopusdepay ! decodebin ! autoaudiosink
While the command for reading a H264 and Opus stream is:
gst-launch-1.0 whepsrc whep-endpoint=http://127.0.0.1:8889/stream/whep use-link-headers=true \
video-caps="application/x-rtp,media=video,encoding-name=H264,payload=127,clock-rate=90000" \
audio-caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111,clock-rate=48000,encoding-params=(string)2" \
! decodebin ! autovideosink
VLC
VLC can read a stream from the server in several ways. The recommended one is reading with RTSP:
vlc --network-caching=50 rtsp://localhost:8554/mystream
RTSP and Ubuntu compatibility
The VLC shipped with Ubuntu 21.10 doesn't support playing RTSP due to a license issue (see here and here). To fix the issue, remove the default VLC instance and install the snap version:
sudo apt purge -y vlc
snap install vlc
Encrypted RTSP compatibility
At the moment VLC doesn't support reading encrypted RTSP streams. However, you can use a proxy like stunnel or nginx, or a local MediaMTX instance, to decrypt streams before reading them.
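For instance, a stunnel configuration along these lines can terminate the TLS connection locally. This is a sketch that assumes the remote server exposes RTSPS on port 8322 (the MediaMTX default) and that local port 8555 is free; mediamtx-ip is a placeholder for the server address:
[rtsp-decrypt]
client = yes
accept = 127.0.0.1:8555
connect = mediamtx-ip:8322
VLC can then read the decrypted stream from rtsp://127.0.0.1:8555/mystream.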
OBS Studio
OBS Studio can read streams from the server by using the RTSP protocol.
Open OBS, click on Add Source, then Media Source, then OK. Uncheck Local file and insert into Input:
rtsp://localhost:8554/stream
Then press OK.
Unity
Software written with the Unity Engine can read a stream from the server by using the WebRTC protocol.
Create a new Unity project or open an existing one.
Open Window -> Package Manager, click on the plus sign, Add package by name... and insert com.unity.webrtc. Wait for the package to be installed.
In the Project window, under Assets, create a new C# script called WebRTCReader.cs with this content:
using System.Collections;
using UnityEngine;
using Unity.WebRTC;

public class WebRTCReader : MonoBehaviour
{
    // WHEP endpoint of the stream to read.
    public string url = "http://localhost:8889/stream/whep";

    private RTCPeerConnection pc;
    private MediaStream receiveStream;

    void Start()
    {
        UnityEngine.UI.RawImage rawImage = gameObject.GetComponentInChildren<UnityEngine.UI.RawImage>();
        AudioSource audioSource = gameObject.GetComponentInChildren<AudioSource>();

        pc = new RTCPeerConnection();
        receiveStream = new MediaStream();

        pc.OnTrack = e =>
        {
            receiveStream.AddTrack(e.Track);
        };

        receiveStream.OnAddTrack = e =>
        {
            if (e.Track is VideoStreamTrack videoTrack)
            {
                // Render incoming video frames onto the Raw Image.
                videoTrack.OnVideoReceived += (tex) =>
                {
                    rawImage.texture = tex;
                };
            }
            else if (e.Track is AudioStreamTrack audioTrack)
            {
                // Play incoming audio through the Audio Source.
                audioSource.SetTrack(audioTrack);
                audioSource.loop = true;
                audioSource.Play();
            }
        };

        // Declare receive-only audio and video transceivers.
        RTCRtpTransceiverInit init = new RTCRtpTransceiverInit();
        init.direction = RTCRtpTransceiverDirection.RecvOnly;
        pc.AddTransceiver(TrackKind.Audio, init);
        pc.AddTransceiver(TrackKind.Video, init);

        StartCoroutine(WebRTC.Update());
        StartCoroutine(createOffer());
    }

    private IEnumerator createOffer()
    {
        var op = pc.CreateOffer();
        yield return op;
        if (op.IsError) {
            Debug.LogError("CreateOffer() failed");
            yield break;
        }
        yield return setLocalDescription(op.Desc);
    }

    private IEnumerator setLocalDescription(RTCSessionDescription offer)
    {
        var op = pc.SetLocalDescription(ref offer);
        yield return op;
        if (op.IsError) {
            Debug.LogError("SetLocalDescription() failed");
            yield break;
        }
        yield return postOffer(offer);
    }

    private IEnumerator postOffer(RTCSessionDescription offer)
    {
        // Send the SDP offer to the WHEP endpoint and read the SDP answer.
        var content = new System.Net.Http.StringContent(offer.sdp);
        content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/sdp");
        var client = new System.Net.Http.HttpClient();

        var task = System.Threading.Tasks.Task.Run(async () => {
            var res = await client.PostAsync(new System.UriBuilder(url).Uri, content);
            res.EnsureSuccessStatusCode();
            return await res.Content.ReadAsStringAsync();
        });
        yield return new WaitUntil(() => task.IsCompleted);
        if (task.Exception != null) {
            Debug.LogError(task.Exception);
            yield break;
        }

        yield return setRemoteDescription(task.Result);
    }

    private IEnumerator setRemoteDescription(string answer)
    {
        RTCSessionDescription desc = new RTCSessionDescription();
        desc.type = RTCSdpType.Answer;
        desc.sdp = answer;
        var op = pc.SetRemoteDescription(ref desc);
        yield return op;
        if (op.IsError) {
            Debug.LogError("SetRemoteDescription() failed");
            yield break;
        }
        yield break;
    }

    void OnDestroy()
    {
        pc?.Close();
        pc?.Dispose();
        receiveStream?.Dispose();
    }
}
Edit the url variable according to your needs.
In the Hierarchy window, find or create a scene. Inside the scene, add a Canvas. Inside the Canvas, add a Raw Image and an Audio Source. Then add the WebRTCReader.cs script as a component of the canvas, by dragging it into the Inspector window. Then press the Play button at the top of the page.
Web browsers
Web browsers can read a stream from the server in several ways.
Web browsers and WebRTC
You can read a stream by using the WebRTC protocol by visiting the web page:
http://localhost:8889/mystream
This web page can be embedded into another web page by using an iframe:
<iframe src="http://mediamtx-ip:8889/mystream" scrolling="no"></iframe>
For more advanced setups, you can create and serve a custom web page by starting from the source code of the WebRTC read page. In particular, there's a ready-to-use, standalone JavaScript class for reading streams with WebRTC, available in reader.js.
Web browsers and HLS
Web browsers can also read a stream with the HLS protocol. Latency is higher, but there are fewer problems related to connectivity between server and clients; furthermore, the server load can be balanced by using a common HTTP CDN (like Cloudflare or CloudFront), which allows handling an unlimited number of readers. Visit the web page:
http://localhost:8888/mystream
This web page can be embedded into another web page by using an iframe:
<iframe src="http://mediamtx-ip:8888/mystream" scrolling="no"></iframe>
For more advanced setups, you can create and serve a custom web page by starting from the source code of the HLS read page.