mirror of https://github.com/aler9/rtsp-simple-server
synced 2025-09-27 03:56:15 +08:00

docs: update (#4994)

This commit is contained in:

README.md (+31 −31)

@@ -27,18 +27,19 @@ _MediaMTX_ is a ready-to-use and zero-dependency real-time media server and medi

<h3>Features</h3>

- [Publish](https://mediamtx.org/docs/usage/publish) live streams to the server with SRT, WebRTC, RTSP, RTMP, HLS, MPEG-TS, RTP
- [Read](https://mediamtx.org/docs/usage/read) live streams from the server with SRT, WebRTC, RTSP, RTMP, HLS
- Streams are automatically converted from one protocol to another
- Serve several streams at once in separate paths
- Reload the configuration without disconnecting existing clients (hot reloading)
- [Record](https://mediamtx.org/docs/usage/record) streams to disk in fMP4 or MPEG-TS format
- [Playback](https://mediamtx.org/docs/usage/playback) recorded streams
- [Authenticate](https://mediamtx.org/docs/usage/authentication) users with internal, HTTP or JWT authentication
- [Forward](https://mediamtx.org/docs/usage/forward) streams to other servers
- [Proxy](https://mediamtx.org/docs/usage/proxy) requests to other servers
- [Control](https://mediamtx.org/docs/usage/control-api) the server through the Control API
- [Extract metrics](https://mediamtx.org/docs/usage/metrics) from the server in a Prometheus-compatible format
- [Monitor performance](https://mediamtx.org/docs/usage/performance) to investigate CPU and RAM consumption
- [Run hooks](https://mediamtx.org/docs/usage/hooks) (external commands) when clients connect, disconnect, read or publish streams
- Compatible with Linux, Windows and macOS; does not require any dependency or interpreter, it's a single executable
- ...and many [others](https://mediamtx.org/docs/kickoff/introduction).

@@ -10,14 +10,15 @@ Main features:

- [Read](/docs/usage/read) live streams from the server with SRT, WebRTC, RTSP, RTMP, HLS
- Streams are automatically converted from one protocol to another
- Serve several streams at once in separate paths
- Reload the configuration without disconnecting existing clients (hot reloading)
- [Record](/docs/usage/record) streams to disk in fMP4 or MPEG-TS format
- [Playback](/docs/usage/playback) recorded streams
- [Authenticate](/docs/usage/authentication) users with internal, HTTP or JWT authentication
- [Forward](/docs/usage/forward) streams to other servers
- [Proxy](/docs/usage/proxy) requests to other servers
- [Control](/docs/usage/control-api) the server through the Control API
- [Extract metrics](/docs/usage/metrics) from the server in a Prometheus-compatible format
- [Monitor performance](/docs/usage/performance) to investigate CPU and RAM consumption
- [Run hooks](/docs/usage/hooks) (external commands) when clients connect, disconnect, read or publish streams
- Compatible with Linux, Windows and macOS; does not require any dependency or interpreter, it's a single executable

@@ -5,7 +5,7 @@

Live streams can be published to the server with the following protocols and codecs:

| protocol | variants | video codecs | audio codecs |
| --- | --- | --- | --- |
| [SRT clients](#srt-clients) | | H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3 |
| [SRT cameras and servers](#srt-cameras-and-servers) | | H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3 |
| [WebRTC clients](#webrtc-clients) | WHIP | AV1, VP9, VP8, H265, H264 | Opus, G722, G711 (PCMA, PCMU) |

@@ -15,8 +15,8 @@ Live streams can be published to the server with the following protocols and cod

| [RTMP clients](#rtmp-clients) | RTMP, RTMPS, Enhanced RTMP | AV1, VP9, H265, H264 | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G711 (PCMA, PCMU), LPCM |
| [RTMP cameras and servers](#rtmp-cameras-and-servers) | RTMP, RTMPS, Enhanced RTMP | AV1, VP9, H265, H264 | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G711 (PCMA, PCMU), LPCM |
| [HLS cameras and servers](#hls-cameras-and-servers) | Low-Latency HLS, MP4-based HLS, legacy HLS | AV1, VP9, H265, H264 | Opus, MPEG-4 Audio (AAC) |
| [MPEG-TS](#mpeg-ts) | MPEG-TS over UDP, MPEG-TS over Unix socket | H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3 |
| [RTP](#rtp) | RTP over UDP, RTP over Unix socket | AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG and any RTP-compatible codec | Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM and any RTP-compatible codec |

We provide instructions for publishing with the following devices:

@@ -48,7 +48,7 @@ If you need to use the standard stream ID syntax instead of the custom one in us

If you want to publish a stream by using a client in listening mode (i.e. with `mode=listener` appended to the URL), read the next section.

Some clients that can publish with SRT are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer) and [OBS Studio](#obs-studio).
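
As a sketch of the custom stream ID syntax mentioned above (host, port and path name are placeholders; the same `publish:<path>` syntax appears in the FFmpeg examples later in these docs):

```shell
# Compose a MediaMTX SRT publish URL using the custom streamid
# syntax (publish:<path name>); host, port and path are placeholders.
PATH_NAME=mystream
SRT_URL="srt://localhost:8890?streamid=publish:${PATH_NAME}&pkt_size=1316"
echo "$SRT_URL"
```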

### SRT cameras and servers

@@ -81,7 +81,7 @@ Be aware that not all browsers can read any codec, check [Supported browsers](we

Depending on the network, it might be difficult to establish a connection between server and clients; read [Solving WebRTC connectivity issues](webrtc-specific-features#solving-webrtc-connectivity-issues).

Some clients that can publish with WebRTC and WHIP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio), [Unity](#unity) and [Web browsers](#web-browsers).

### WebRTC servers

@@ -104,11 +104,11 @@ rtsp://localhost:8554/mystream

The resulting stream is available in path `/mystream`.

Some clients that can publish with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio) and [OpenCV](#opencv).

### RTSP cameras and servers

Most IP cameras expose their video stream by using an RTSP server that is embedded into the camera itself. In particular, cameras that are compliant with ONVIF profile S or T meet this requirement. You can use _MediaMTX_ to connect to one or several existing RTSP servers and read their media streams:

```yml
paths:
```

@@ -140,11 +140,11 @@ rtmp://localhost/mystream

The resulting stream is available in path `/mystream`.

Some clients that can publish with RTMP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer) and [OBS Studio](#obs-studio).

### RTMP cameras and servers

You can use _MediaMTX_ to connect to one or several existing RTMP servers and read their media streams:

```yml
paths:
```

@@ -157,7 +157,7 @@ The resulting stream is available in path `/proxied`.

### HLS cameras and servers

HLS is a streaming protocol that works by splitting streams into segments, and by serving these segments and a playlist over HTTP. You can use _MediaMTX_ to connect to one or several existing HLS servers and read their media streams:

```yml
paths:
```

@@ -182,24 +182,6 @@ paths:

Where `238.0.0.1` is the IP for listening packets, in this case a multicast IP.
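
The sentence above hinges on `238.0.0.1` being a multicast address. As a small illustrative helper (not part of MediaMTX), the IPv4 multicast range check can be sketched in shell:

```shell
# IPv4 multicast addresses occupy 224.0.0.0-239.255.255.255, so the first
# octet alone tells whether the listening IP will be treated as multicast.
is_multicast() {
  first_octet=${1%%.*}
  [ "$first_octet" -ge 224 ] && [ "$first_octet" -le 239 ]
}
is_multicast 238.0.0.1 && echo "238.0.0.1 is multicast"
is_multicast 192.168.3.5 || echo "192.168.3.5 is unicast"
```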

You can generate a UDP multicast MPEG-TS stream with GStreamer:

```sh
gst-launch-1.0 -v mpegtsmux name=mux alignment=1 ! udpsink host=238.0.0.1 port=1234 \
videotestsrc ! video/x-raw,width=1280,height=720,format=I420 ! x264enc speed-preset=ultrafast bitrate=3000 key-int-max=60 ! video/x-h264,profile=high ! mux. \
audiotestsrc ! audioconvert ! avenc_aac ! mux.
```

or with FFmpeg:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f mpegts udp://238.0.0.1:1234?pkt_size=1316
```

The resulting stream is available in path `/mypath`.

If the listening IP is a multicast IP, _MediaMTX_ will listen for incoming packets on the default multicast interface, picked by the operating system. It is possible to specify the interface manually by using the `interface` parameter:

@@ -216,7 +198,7 @@ paths:

```yml
source: udp+mpegts://0.0.0.0:1234?source=192.168.3.5
```

Some clients that can publish with UDP and MPEG-TS are [FFmpeg](#ffmpeg) and [GStreamer](#gstreamer).

Unix sockets are more efficient than UDP packets and can be used as transport by specifying the `unix+mpegts` scheme:

@@ -226,14 +208,6 @@

```yml
source: unix+mpegts:///tmp/socket.sock
```

FFmpeg can generate such streams:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f mpegts unix:/tmp/socket.sock
```

### RTP

The server supports ingesting RTP streams, shipped in two different ways (UDP packets or Unix sockets).

@@ -257,17 +231,7 @@ paths:

`rtpSDP` must contain a valid SDP, that is, a description of the RTP session.

FFmpeg can generate an RTP over UDP stream:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f rtp udp://238.0.0.1:1234?pkt_size=1316
```

The stream is available on path `/mypath`.
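
The hunk above truncates the example configuration. Purely as an illustrative sketch (the address, port and SDP values are placeholders, not taken from the diff), such a path could look like:

```yml
paths:
  mypath:
    source: udp+rtp://238.0.0.1:1234
    # rtpSDP describes the incoming RTP session (here, a single H264 track)
    rtpSDP: |
      v=0
      o=- 0 0 IN IP4 127.0.0.1
      s=RTP stream
      c=IN IP4 238.0.0.1
      t=0 0
      m=video 1234 RTP/AVP 96
      a=rtpmap:96 H264/90000
```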

Some clients that can publish with RTP over UDP are [FFmpeg](#ffmpeg) and [GStreamer](#gstreamer).

Unix sockets are more efficient than UDP packets and can be used as transport by specifying the `unix+rtp` scheme:

@@ -286,14 +250,6 @@ paths:

```yml
a=fmtp:96 profile-level-id=42e01e;packetization-mode=1;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg==
```

FFmpeg can generate such streams:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f rtp unix:/tmp/socket.sock
```

## Devices

### Raspberry Pi Cameras

@@ -458,23 +414,79 @@ The resulting stream is available in path `/cam`.

### FFmpeg

FFmpeg can publish a stream to the server in several ways (SRT, RTSP, RTMP, MPEG-TS over UDP or Unix sockets, WebRTC with WHIP, RTP over UDP or Unix sockets). The recommended one is publishing with RTSP.

#### FFmpeg and RTSP

```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp rtsp://localhost:8554/mystream
```

The resulting stream is available in path `/mystream`.

#### FFmpeg and RTMP

```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f flv rtmp://localhost:1935/mystream
```

#### FFmpeg and MPEG-TS over UDP

In the MediaMTX configuration, add a path with `source: udp+mpegts://238.0.0.1:1234`. Then:

```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f mpegts 'udp://238.0.0.1:1234?pkt_size=1316'
```

#### FFmpeg and MPEG-TS over Unix socket

In the MediaMTX configuration, add a path with `source: unix+mpegts:///tmp/socket.sock`. Then:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f mpegts unix:/tmp/socket.sock
```

#### FFmpeg and RTP over UDP

In the MediaMTX configuration, add a path with `source: udp+rtp://238.0.0.1:1234` and a valid `rtpSDP` (see [RTP](#rtp)). Then:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f rtp udp://238.0.0.1:1234?pkt_size=1316
```

#### FFmpeg and RTP over Unix socket

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-f rtp unix:/tmp/socket.sock
```

#### FFmpeg and SRT

```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f mpegts 'srt://localhost:8890?streamid=publish:stream&pkt_size=1316'
```

#### FFmpeg and WebRTC

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-f lavfi -i "sine=frequency=1000:sample_rate=48000" \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -b:v 600k \
-c:a libopus -ar 48000 -ac 2 -b:a 128k \
-f whip http://localhost:8889/stream/whip
```

WARNING: with FFmpeg 8.0, both a video track and an audio track must be present.

### GStreamer

GStreamer can publish a stream to the server in several ways (SRT, RTSP, RTMP, MPEG-TS over UDP, WebRTC with WHIP, RTP over UDP). The recommended one is publishing with RTSP.

#### GStreamer and RTSP

```sh
gst-launch-1.0 rtspclientsink name=s location=rtsp://localhost:8554/mystream \
```

@@ -490,23 +502,27 @@ gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \

```sh
d.video_0 ! rtspclientsink location=rtsp://localhost:8554/mystream
```

The resulting stream is available in path `/mystream`.

#### GStreamer and RTMP

```sh
gst-launch-1.0 -v flvmux name=mux ! rtmpsink location=rtmp://localhost/stream \
videotestsrc ! video/x-raw,width=1280,height=720,format=I420 ! x264enc speed-preset=ultrafast bitrate=3000 key-int-max=60 ! video/x-h264,profile=high ! mux. \
audiotestsrc ! audioconvert ! avenc_aac ! mux.
```

#### GStreamer and MPEG-TS over UDP

```sh
gst-launch-1.0 -v mpegtsmux name=mux alignment=1 ! udpsink host=238.0.0.1 port=1234 \
videotestsrc ! video/x-raw,width=1280,height=720,format=I420 ! x264enc speed-preset=ultrafast bitrate=3000 key-int-max=60 ! video/x-h264,profile=high ! mux. \
audiotestsrc ! audioconvert ! avenc_aac ! mux.
```

#### GStreamer and WebRTC

Make sure that the GStreamer version is at least 1.22 and that, if the codec is H264, the profile is baseline. Use the `whipclientsink` element:

@@ -518,7 +534,11 @@ gst-launch-1.0 videotestsrc \

### OBS Studio

OBS Studio can publish to the server in several ways (SRT, RTMP, WebRTC). The recommended one is publishing with RTMP.

#### OBS Studio and RTMP

In `Settings -> Stream` (or in the Auto-configuration Wizard), use the following parameters:

- Service: `Custom...`
- Server: `rtmp://localhost/mystream`

@@ -539,6 +559,8 @@ If you want to generate a stream that can be read with WebRTC, open `Settings ->

Then use the button `Start Recording` (instead of `Start Streaming`) to start streaming.

#### OBS Studio and WebRTC

Recent versions of OBS Studio can also publish to the server with the [WebRTC / WHIP protocol](#webrtc-clients). Use the following parameters:

- Service: `WHIP`

@@ -550,7 +572,7 @@ The resulting stream is available in path `/mystream`.

### OpenCV

Software that uses the OpenCV library can publish to the server through its GStreamer plugin, as an [RTSP client](#rtsp-clients). OpenCV must be compiled with support for GStreamer, by following this procedure:

```sh
sudo apt install -y libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-ugly gstreamer1.0-rtsp python3-dev python3-numpy
```

@@ -2,7 +2,7 @@

WebRTC is a protocol that can be used for publishing and reading streams. Features on this page are shared among both tasks. For the specific tasks, see [Publish](publish) and [Read](read).

## Codec support in browsers

The server can ingest and broadcast with WebRTC a wide variety of video and audio codecs (listed at the beginning of the README), but not all browsers can publish and read all codecs, due to internal limitations that cannot be overcome by this or any other server.

@@ -10,7 +10,32 @@ The RTSP protocol supports several underlying transport protocols, that are chos

- UDP-multicast: saves bandwidth when clients are all in the same LAN, by sending packets once to a fixed multicast IP.
- TCP: the most versatile.

To change the transport protocol, you have to tune the configuration of the client you are using to publish or read streams. In most clients, the default transport protocol is UDP.

For instance, FFmpeg allows changing the transport protocol with the `-rtsp_transport` flag:

```sh
ffmpeg -rtsp_transport tcp -i rtsp://localhost:8554/mystream -c copy output.mp4
```

GStreamer allows changing the transport protocol with the `protocols` property of `rtspsrc` and `rtspclientsink`:

```sh
gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! rtspclientsink location=rtsp://localhost:8554/mystream protocols=tcp
```

VLC allows using the TCP transport protocol with the `--rtsp-tcp` flag:

```sh
vlc --network-caching=50 --rtsp-tcp rtsp://localhost:8554/mystream
```

VLC allows using the UDP-multicast transport protocol by appending `?vlcmulticast` to the URL:

```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream?vlcmulticast
```

## Encryption

@@ -35,7 +60,22 @@ Streams can be published and read with the `rtsps` scheme and the `8322` port:

```
rtsps://localhost:8322/mystream
```

Some clients require additional flags for encryption to work properly.

When reading with GStreamer, set `tls-validation-flags` to `0`:

```sh
gst-launch-1.0 rtspsrc tls-validation-flags=0 location=rtsps://ip:8322/...
```

When publishing with GStreamer, set `tls-validation-flags` to `0` and `profiles` to `GST_RTSP_PROFILE_SAVP`:

```sh
gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! rtspclientsink location=rtsps://localhost:8322/mystream tls-validation-flags=0 profiles=GST_RTSP_PROFILE_SAVP
```

## Decreasing corrupted frames

In some scenarios, when publishing or reading from the server with RTSP, frames can get corrupted. This can happen for several reasons:

@@ -35,7 +35,7 @@ Replace `mystream` with the path name.

If you need to use the standard stream ID syntax instead of the custom one in use by this server, see [Standard stream ID syntax](srt-specific-features#standard-stream-id-syntax).

Some clients that can read with SRT are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer) and [VLC](#vlc).
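
As a sketch, the read-side counterpart of the custom stream ID syntax composes URLs like this (host, port and path name are placeholders):

```shell
# Compose a MediaMTX SRT read URL using the custom streamid
# syntax (read:<path name>); host, port and path are placeholders.
PATH_NAME=mystream
SRT_READ_URL="srt://localhost:8890?streamid=read:${PATH_NAME}"
echo "$SRT_READ_URL"
```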

### WebRTC

@@ -55,7 +55,7 @@ Be aware that not all browsers can read any codec, check [Supported browsers](we

Depending on the network, it may be difficult to establish a connection between server and clients; read [Solving WebRTC connectivity issues](webrtc-specific-features#solving-webrtc-connectivity-issues).

Some clients that can read with WebRTC and WHEP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [Unity](#unity) and [web browsers](#web-browsers).

### RTSP

@@ -65,7 +65,7 @@ RTSP is a protocol that allows to publish and read streams. It supports differen

```
rtsp://localhost:8554/mystream
```

Some clients that can read with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer) and [VLC](#vlc).

#### Latency

@@ -83,7 +83,7 @@ RTMP is a protocol that allows to read and publish streams, but is less versatil

```
rtmp://localhost/mystream
```

Some clients that can read with RTMP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer) and [VLC](#vlc).

### HLS

@@ -99,7 +99,7 @@ and can also be accessed without using the browsers, by software that supports t

```
http://localhost:8888/mystream/index.m3u8
```

Some clients that can read with HLS are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [VLC](#vlc) and [web browsers](#web-browsers).
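
As a sketch, the playlist URL for any path follows the pattern above (the path name is a placeholder; 8888 is the default HLS port):

```shell
# Build the HLS playlist URL for a given MediaMTX path (default HLS port 8888).
PATH_NAME=mystream
HLS_URL="http://localhost:8888/${PATH_NAME}/index.m3u8"
echo "$HLS_URL"
```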

#### LL-HLS

@@ -109,7 +109,7 @@ Low-Latency HLS is a recently standardized variant of the protocol that allows t

```yml
hlsPartDuration: 500ms
```

#### Codec support in browsers

The server can produce HLS streams with a variety of video and audio codecs (listed at the beginning of the README), but not all browsers can read all codecs, due to internal limitations that cannot be overcome by this or any other server.

@@ -170,19 +170,15 @@ To decrease the latency, you can:

### FFmpeg

FFmpeg can read a stream from the server in several ways (RTSP, RTMP, HLS, WebRTC with WHEP, SRT). The recommended one is reading with RTSP.

#### FFmpeg and RTSP

```sh
ffmpeg -i rtsp://localhost:8554/mystream -c copy output.mp4
```

The RTSP protocol supports several underlying transport protocols, each with its own characteristics (see [RTSP-specific features](rtsp-specific-features)). You can set the transport protocol by using the `rtsp_transport` flag:

```sh
ffmpeg -rtsp_transport tcp -i rtsp://localhost:8554/mystream -c copy output.mp4
```

#### FFmpeg and RTMP

```sh
ffmpeg -i rtmp://localhost/mystream -c copy output.mp4
```

@@ -194,25 +190,23 @@ In order to read AV1, VP9, H265, Opus, AC3 tracks and in order to read multiple

```sh
ffmpeg -rtmp_enhanced_codecs ac-3,av01,avc1,ec-3,fLaC,hvc1,.mp3,mp4a,Opus,vp09 -i rtmp://localhost/mystream -c copy output.mp4
```

#### FFmpeg and SRT

```sh
ffmpeg -i 'srt://localhost:8890?streamid=read:test' -f null -
```

### GStreamer

GStreamer can read a stream from the server in several ways (RTSP, RTMP, HLS, WebRTC with WHEP, SRT). The recommended one is reading with RTSP.

#### GStreamer and RTSP

```sh
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/mystream latency=0 ! decodebin ! autovideosink
```

The RTSP protocol supports several underlying transport protocols, each with its own characteristics (see [RTSP-specific features](rtsp-specific-features)). You can change the transport protocol by using the `protocols` property:

```sh
gst-launch-1.0 rtspsrc protocols=tcp location=rtsp://127.0.0.1:8554/mystream latency=0 ! decodebin ! autovideosink
```

If encryption is enabled, set `tls-validation-flags` to `0`:

```sh
gst-launch-1.0 rtspsrc tls-validation-flags=0 location=rtsps://ip:8322/...
```

#### GStreamer and WebRTC

GStreamer also supports reading streams with WebRTC/WHEP, although track codecs must be specified in advance through the `video-caps` and `audio-caps` parameters. Furthermore, if audio is not present, `audio-caps` must be set anyway and must point to a PCMU codec. For instance, the command for reading a video-only H264 stream is:

@@ -242,27 +236,13 @@ audio-caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111,clock-r
### VLC

VLC can read a stream from the server in several ways (RTSP, RTMP, HLS, SRT). The recommended one consists in reading with RTSP:

```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream
```

The RTSP protocol supports several underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)).

In order to use the TCP transport protocol, use the `--rtsp-tcp` flag:

```sh
vlc --network-caching=50 --rtsp-tcp rtsp://localhost:8554/mystream
```

In order to use the UDP-multicast transport protocol, append `?vlcmulticast` to the URL:

```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream?vlcmulticast
```

#### RTSP and Ubuntu compatibility

The VLC shipped with Ubuntu 21.10 doesn't support playing RTSP due to a license issue (see [here](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=982299) and [here](https://stackoverflow.com/questions/69766748/cvlc-cannot-play-rtsp-omxplayer-instead-can)). To fix the issue, remove the default VLC instance and install the snap version:

```sh
sudo apt purge -y vlc
snap install vlc

#### Encrypted RTSP compatibility

At the moment VLC doesn't support reading encrypted RTSP streams. However, you can use a proxy like [stunnel](https://www.stunnel.org) or [nginx](https://nginx.org/) or a local _MediaMTX_ instance to decrypt streams before reading them.
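
As a sketch of the last workaround, a local _MediaMTX_ instance can pull the encrypted stream and re-expose it without encryption; the remote host and the path names below are placeholders:

```yml
paths:
  decrypted:
    # pull the encrypted stream from the remote server
    source: rtsps://remote-server:8322/mystream
```

VLC can then read the unencrypted stream from `rtsp://localhost:8554/decrypted`.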
### Web browsers

Web browsers can read a stream from the server in several ways (WebRTC or HLS).

#### Web browsers and WebRTC

You can read a stream with the [WebRTC protocol](#webrtc) by visiting the web page `http://localhost:8889/mystream` (where `mystream` is the path name). This web page can be embedded into another web page by using an iframe.

For more advanced setups, you can create and serve a custom web page by starting from the [source code of the WebRTC read page](https://github.com/bluenviron/mediamtx/blob/{version_tag}/internal/servers/webrtc/read_index.html). In particular, there's a ready-to-use, standalone JavaScript class for reading streams with WebRTC, available in [reader.js](https://github.com/bluenviron/mediamtx/blob/{version_tag}/internal/servers/webrtc/reader.js).
#### Web browsers and HLS

Web browsers can also read a stream with the [HLS protocol](#hls). Latency is higher, but there are fewer problems related to connectivity between server and clients; furthermore, the server load can be balanced by using a common HTTP CDN (like Cloudflare or CloudFront), which allows handling an unlimited number of readers. Visit the web page:

```
http://localhost:8888/mystream
```
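
For custom pages, the same stream can be played with the third-party [hls.js](https://github.com/video-dev/hls.js) library. This is a minimal sketch, assuming the default HLS address used above:

```html
<video id="video" controls autoplay muted></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@1"></script>
<script>
  const url = 'http://localhost:8888/mystream/index.m3u8';
  const video = document.getElementById('video');
  if (Hls.isSupported()) {
    // MSE-based playback (most desktop browsers)
    const hls = new Hls();
    hls.loadSource(url);
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari plays HLS natively
    video.src = url;
  }
</script>
```
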

### Authentication

MediaMTX can be configured to ask clients for credentials, either in the form of a username and password pair or a token.

Credentials can be validated through one of these methods:

- Internal database: credentials are stored in the configuration file
- External HTTP server: an external HTTP URL is contacted to perform authentication
- External JWT provider: an external identity server provides signed tokens that are then verified by the server

Credentials can be passed through the `Authorization: Basic` header:

```
Authorization: Basic base64(user:pass)
```

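
The header value can be generated from a shell; `myuser` and `mypass` below are hypothetical credentials:

```sh
# base64-encode "user:pass" to obtain the Basic header value
printf 'Authorization: Basic %s\n' "$(printf 'myuser:mypass' | base64)"
```

This prints `Authorization: Basic bXl1c2VyOm15cGFzcw==`.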
When using a web browser, a dialog is first shown to users, asking for credentials, and then the header is automatically inserted into every request.

If the `Authorization: Basic` header cannot be used (for instance, in software like OBS Studio, which only allows providing a "Bearer Token"), credentials can be passed through the `Authorization: Bearer` header (i.e. the "Bearer Token" in OBS), where the value is the concatenation of username and password, separated by a colon:

```
Authorization: Bearer username:password
```

The authentication method is selected in the configuration file:

```yml
# Global settings -> Authentication

# Authentication method. Available values are:
# * internal: credentials are stored in the configuration file
# * http: an external HTTP URL is contacted to perform authentication
# * jwt: an external identity server provides authentication through JWTs
authMethod: internal
```