Mirror of https://github.com/aler9/rtsp-simple-server, synced 2025-09-26 19:51:26 +08:00.
docs: update (#5003)
Changed file: `.github/workflows/release.yml` (vendored, 2 changes).
```diff
@@ -63,7 +63,7 @@ jobs:
+ `## Security\n`
+ `\n`
+ `Binaries are compiled from source through the [Release workflow](https://github.com/${owner}/${repo}/actions/workflows/release.yml) without human intervention,`
+ ` preventing any external interference.\n`
+ `\n`
+ `You can verify that binaries have been produced by the workflow by using [GitHub Attestations](https://docs.github.com/en/actions/security-for-github-actions/using-artifact-attestations/using-artifact-attestations-to-establish-provenance-for-builds):\n`
+ `\n`
```

Some streaming protocols allow routing absolute timestamps, associated with each frame, which are useful for synchronizing several video or data streams together. In particular, _MediaMTX_ supports receiving absolute timestamps with the following protocols and devices:

- HLS (through the `EXT-X-PROGRAM-DATE-TIME` tag in playlists)
- RTSP (through RTCP reports, when `useAbsoluteTimestamp` is `true` in settings)
- WebRTC (through RTCP reports, when `useAbsoluteTimestamp` is `true` in settings)
- Raspberry Pi Camera

and supports sending absolute timestamps with the following protocols:

- HLS (through the `EXT-X-PROGRAM-DATE-TIME` tag in playlists)
- RTSP (through RTCP reports)
- WebRTC (through RTCP reports)

## Absolute timestamp in HLS

In the HLS protocol, absolute timestamps are routed by adding an `EXT-X-PROGRAM-DATE-TIME` tag before each segment:

```
#EXTM3U
#EXT-X-VERSION:9
#EXT-X-MEDIA-SEQUENCE:20
#EXT-X-TARGETDURATION:2
#EXT-X-PROGRAM-DATE-TIME:2015-02-05T01:02:00Z
#EXTINF:2,
segment1.mp4
#EXT-X-PROGRAM-DATE-TIME:2015-02-05T01:04:00Z
#EXTINF:2,
segment2.mp4
```

The `EXT-X-PROGRAM-DATE-TIME` value is the absolute timestamp that corresponds to the first frame of the segment. The absolute timestamp of following frames can be obtained by summing `EXT-X-PROGRAM-DATE-TIME` with the relative frame timestamp.
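
The summing step can be sketched in Python (the values below are hypothetical, taken from a playlist like the one above):

```python
from datetime import datetime, timedelta, timezone

# EXT-X-PROGRAM-DATE-TIME of the segment (hypothetical value)
program_date_time = datetime(2015, 2, 5, 1, 2, 0, tzinfo=timezone.utc)

# relative timestamp of a frame inside the segment, in seconds
frame_relative = 0.5

# absolute timestamp of the frame = EXT-X-PROGRAM-DATE-TIME + relative timestamp
frame_absolute = program_date_time + timedelta(seconds=frame_relative)
print(frame_absolute.isoformat())  # 2015-02-05T01:02:00.500000+00:00
```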

A library that can read absolute timestamps with HLS is [gohlslib](https://github.com/bluenviron/gohlslib).

## Absolute timestamp in RTSP and WebRTC

In RTSP and WebRTC, absolute timestamps are routed through periodic RTCP sender reports:

```
        0                   1                   2                   3
        0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
header |V=2|P|    RC   |   PT=SR=200   |             length            |
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
       |                         SSRC of sender                        |
       +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
sender |              NTP timestamp, most significant word             |
info   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
       |             NTP timestamp, least significant word             |
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
       |                         RTP timestamp                         |
       ...
```

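
The NTP timestamp in the sender report is a 64-bit fixed-point value: seconds since 1900-01-01, with a 32-bit fractional part. A minimal sketch of decoding the two words into a Unix timestamp (the input words below are hypothetical):

```python
NTP_UNIX_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01

def ntp_to_unix(msw: int, lsw: int) -> float:
    """Convert the two 32-bit NTP words of a sender report to a Unix timestamp."""
    return msw - NTP_UNIX_OFFSET + lsw / 2**32

# hypothetical words; lsw = 2**31 is exactly half a second
print(ntp_to_unix(3900830400, 2**31))  # 1691841600.5
```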
The sender report contains a reference absolute timestamp (NTP timestamp) and a reference relative timestamp (RTP timestamp). The absolute timestamp of each frame can be computed by using these values together with the RTP timestamp of the frame (shipped with each frame), through the formula:

```
frame_abs_timestamp = ref_ntp_timestamp + (frame_rtp_timestamp - ref_rtp_timestamp) / clock_rate
```

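
As a minimal sketch with hypothetical values (assuming the 90 kHz clock rate commonly used for video):

```python
# reference values taken from an RTCP sender report (hypothetical)
ref_ntp_timestamp = 1700000000.0  # absolute time, in seconds
ref_rtp_timestamp = 100000        # in RTP clock units
clock_rate = 90000                # Hz, common for video tracks

# RTP timestamp shipped with a frame (hypothetical)
frame_rtp_timestamp = 145000

# the RTP offset is converted to seconds and added to the NTP reference
frame_abs_timestamp = ref_ntp_timestamp + (frame_rtp_timestamp - ref_rtp_timestamp) / clock_rate
print(frame_abs_timestamp)  # 1700000000.5
```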
A library that can read absolute timestamps with RTSP is [gortsplib](https://github.com/bluenviron/gortsplib).

A browser can read absolute timestamps with WebRTC if it exposes the [estimatedPlayoutTimestamp](https://www.w3.org/TR/webrtc-stats/#dom-rtcinboundrtpstreamstats-estimatedplayouttimestamp) statistic.

The resulting stream is available in path `/proxied`.

The server supports any number of source streams (the count is only limited by available hardware resources); it's enough to add additional entries to the paths section:

```yml
paths:
  proxied1:
    source: rtsp://url1

  proxied2:
    source: rtsp://url1
```

It is possible to tune the connection by using some additional parameters:

```yml
paths:
  proxied:
    # url of the source stream, in the format rtsp://user:pass@host:port/path
    source: rtsp://original-url
    # Transport protocol used to pull the stream. Available values are "automatic", "udp", "multicast", "tcp".
    rtspTransport: automatic
    # Support sources that don't provide server ports or use random server ports. This is a security issue
    # and must be used only when interacting with sources that require it.
    rtspAnyPort: no
    # Range header to send to the source, in order to start streaming from the specified offset.
    # Available values:
    # * clock: absolute time
    # * npt: Normal Play Time
    # * smpte: SMPTE timestamps relative to the start of the recording
    rtspRangeType:
    # Available values:
    # * clock: UTC ISO 8601 combined date and time string, e.g. 20230812T120000Z
    # * npt: duration such as "300ms", "1.5m" or "2h45m"; valid time units are "ns", "us" (or "µs"), "ms", "s", "m", "h"
    # * smpte: duration such as "300ms", "1.5m" or "2h45m"; valid time units are "ns", "us" (or "µs"), "ms", "s", "m", "h"
    rtspRangeStart:
    # Size of the UDP buffer of the RTSP client.
    # This can be increased to mitigate packet losses.
    # It defaults to the default value of the operating system.
    rtspUDPReadBufferSize: 0
```

All available parameters are listed in the [configuration file](/docs/references/configuration-file).

Advanced RTSP features are described in [RTSP-specific features](rtsp-specific-features).

### RTMP clients

RTMP is a protocol that allows reading and publishing streams, but it is less versatile and less efficient than RTSP and WebRTC (it doesn't support UDP, doesn't support most RTSP codecs, and doesn't support feedback mechanisms). Streams can be published to the server by using the URL:

The resulting stream is available in path `/cam`.

The Raspberry Pi Camera can be controlled through a wide range of parameters, which are listed in the [configuration file](/docs/references/configuration-file).

If you want to run the server inside Docker, you need to use the `latest-rpi` image and launch the container with some additional flags:

```sh
docker run ... bluenviron/mediamtx:latest-rpi
```
Be aware that precompiled binaries and Docker images are not compatible with cameras that require a custom `libcamera` (like some ArduCam products), since they come with a bundled `libcamera`. If you want to use a custom one, you can [compile from source](/docs/other/compile#custom-libcamera).

Camera settings can be changed by using the `rpiCamera*` parameters:

```yml
paths:
  cam:
    source: rpiCamera
    rpiCameraWidth: 1920
    rpiCameraHeight: 1080
```

All available parameters are listed in the [configuration file](/docs/references/configuration-file).

#### Adding audio

In order to add audio from a USB microphone, install GStreamer and alsa-utils:

```sh
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
...
-f whip http://localhost:8889/stream/whip
```

WARNING: in case of FFmpeg 8.0, a video track and an audio track must both be present.

### GStreamer

In order to read AV1, VP9, H265, Opus or AC3 tracks, or to read multiple video or audio tracks, the `-rtmp_enhanced_codecs` flag must be present:

```sh
ffmpeg -rtmp_enhanced_codecs ac-3,av01,avc1,ec-3,fLaC,hvc1,.mp3,mp4a,Opus,vp09 \
-i rtmp://localhost/mystream -c copy output.mp4
```

#### FFmpeg and SRT

```sh
ffmpeg -i 'srt://localhost:8890?streamid=read:test' -c copy output.mp4
```

### GStreamer

## Usage

To record available streams to disk, set the `record` parameter in the configuration file:

```yml
pathDefaults:
  # Record streams to disk.
  record: yes
```

It's also possible to specify additional parameters:

```yml
pathDefaults:
  # Available variables are %path (path name), %Y %m %d (year, month, day),
  # %H %M %S (hours, minutes, seconds), %f (microseconds), %z (time zone), %s (unix epoch).
  recordPath: ./recordings/%path/%Y-%m-%d_%H-%M-%S-%f
  # Format of recorded segments.
  # Available formats are "fmp4" (fragmented MP4) and "mpegts" (MPEG-TS).
  recordFormat: fmp4
  # fMP4 segments are concatenations of small MP4 files (parts), each with this duration.
  # MPEG-TS segments are concatenations of 188-byte packets, flushed to disk with this period.
  # When a system failure occurs, the last part gets lost.
  # Therefore, the part duration is equal to the RPO (recovery point objective).
  recordPartDuration: 1s
  # Maximum size of each part. This prevents RAM exhaustion.
  recordMaxPartSize: 50M
  # Minimum duration of each segment.
  recordSegmentDuration: 1h
  # Delete segments after this timespan.
  # Set to 0s to disable automatic deletion.
  recordDeleteAfter: 1d
```
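
As an illustration of how the `recordPath` variables expand, here is a rough Python sketch (MediaMTX performs this expansion internally; the path name and timestamp below are hypothetical):

```python
from datetime import datetime

# hypothetical recording start time and path name
start = datetime(2023, 8, 12, 12, 0, 0)
path_name = "cam"

# expand the recordPath pattern: %path first, then the strftime-style variables
pattern = "./recordings/%path/%Y-%m-%d_%H-%M-%S-%f"
expanded = start.strftime(pattern.replace("%path", path_name))
print(expanded)  # ./recordings/cam/2023-08-12_12-00-00-000000
```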

All available recording parameters are listed in the [configuration file](/docs/references/configuration-file).

Be aware that not all codecs can be saved with all formats, as described in the compatibility matrix at the beginning of the README.

## Remote upload

To upload recordings to a remote location, you can use _MediaMTX_ together with [rclone](https://github.com/rclone/rclone), a command line tool that provides file synchronization capabilities with a huge variety of services (including S3, FTP, SMB, Google Drive):

2. Configure _rclone_:

   ```sh
   rclone config
   ```