rpi: support exposing a secondary stream from the same camera (#4426)

Alessandro Ros
2025-04-14 11:56:08 +02:00
committed by GitHub
parent 9579989eff
commit 8ce49727d6
18 changed files with 619 additions and 282 deletions


@@ -5,4 +5,4 @@
/apidocs/*.html
/internal/core/VERSION
/internal/servers/hls/hls.min.js
-/internal/staticsources/rpicamera/mtxrpicam_*
+/internal/staticsources/rpicamera/mtxrpicam_*/

.gitignore vendored

@@ -4,4 +4,4 @@
/apidocs/*.html
/internal/core/VERSION
/internal/servers/hls/hls.min.js
-/internal/staticsources/rpicamera/mtxrpicam_*
+/internal/staticsources/rpicamera/mtxrpicam_*/


@@ -88,6 +88,8 @@ _rtsp-simple-server_ has been rebranded as _MediaMTX_. The reason is pretty obvi
* [By device](#by-device)
* [Generic webcam](#generic-webcam)
* [Raspberry Pi Cameras](#raspberry-pi-cameras)
* [Adding audio](#adding-audio)
* [Secondary stream](#secondary-stream)
* [By protocol](#by-protocol)
* [SRT clients](#srt-clients)
* [SRT cameras and servers](#srt-cameras-and-servers)
@@ -274,7 +276,7 @@ The RTSP protocol supports multiple underlying transport protocols, each with it
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp -rtsp_transport tcp rtsp://localhost:8554/mystream
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
#### GStreamer
@@ -301,7 +303,7 @@ gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! rtspclientsink protocols=tcp name=s location=rtsp://localhost:8554/mystream
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
GStreamer can also publish a stream by using the [WebRTC / WHIP protocol](#webrtc). Make sure that the GStreamer version is at least 1.22 and that, if the codec is H264, the profile is baseline. Use the `whipclientsink` element:
@@ -350,7 +352,7 @@ Latest versions of OBS Studio can publish to the server with the [WebRTC / WHIP
Save the configuration and click `Start streaming`.
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
#### OpenCV
@@ -425,7 +427,7 @@ while True:
start = now
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
#### Unity
@@ -536,7 +538,7 @@ public class WebRTCPublisher : MonoBehaviour
In the _Hierarchy_ window, find or create a scene and a camera, then add the `WebRTCPublisher.cs` script as a component of the camera by dragging it inside the _Inspector_ window. Then press the _Play_ button at the top of the page.
-The resulting stream will be available in path `/unity`.
+The resulting stream is available in path `/unity`.
#### Web browsers
@@ -546,7 +548,7 @@ Web browsers can publish a stream to the server by using the [WebRTC protocol](#
http://localhost:8889/mystream/publish
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
This web page can be embedded into another web page by using an iframe:
@@ -584,7 +586,7 @@ Where `USB2.0 HD UVC WebCam` is the name of a webcam, that can be obtained with:
ffmpeg -list_devices true -f dshow -i dummy
```
-The resulting stream will be available in path `/cam`.
+The resulting stream is available in path `/cam`.
#### Raspberry Pi Cameras
@@ -611,7 +613,7 @@ If you want to run the standard (non-Docker) version of the server:
source: rpiCamera
```
-The resulting stream will be available in path `/cam`.
+The resulting stream is available in path `/cam`.
If you want to run the server inside Docker, you need to use the `latest-rpi` image and launch the container with some additional flags:
@@ -639,6 +641,8 @@ paths:
All available parameters are listed in the [sample configuration file](/mediamtx.yml).
##### Adding audio
In order to add audio from a USB microphone, install GStreamer and alsa-utils:
```sh
@@ -678,7 +682,42 @@ paths:
runOnInitRestart: yes
```
-The resulting stream will be available in path `/cam_with_audio`.
+The resulting stream is available in path `/cam_with_audio`.
##### Secondary stream
It is possible to enable a secondary stream from the same camera, with a different resolution, FPS and codec. Configuration is the same as that of a primary stream, with `rpiCameraSecondary` set to `true` and parameters adjusted accordingly:
```yml
paths:
# primary stream
rpi:
source: rpiCamera
# Width of frames.
rpiCameraWidth: 1920
# Height of frames.
rpiCameraHeight: 1080
# FPS.
rpiCameraFPS: 30
# secondary stream
secondary:
source: rpiCamera
# This is a secondary stream.
rpiCameraSecondary: true
# Width of frames.
rpiCameraWidth: 640
# Height of frames.
rpiCameraHeight: 480
# FPS.
rpiCameraFPS: 10
        # Codec. In the case of secondary streams, it defaults to M-JPEG.
rpiCameraCodec: auto
# JPEG quality.
rpiCameraJPEGQuality: 60
```
The secondary stream is available in path `/secondary`.
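Like any other path, the secondary stream can then be read with a standard client. A minimal usage sketch (assuming default ports, a server running on the same host, and `ffplay` installed; any RTSP-capable player works):

```sh
# read the secondary M-JPEG stream via RTSP (default port 8554)
ffplay rtsp://localhost:8554/secondary
```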
### By protocol
@@ -690,7 +729,7 @@ SRT is a protocol that allows to publish and read live data stream, providing en
srt://localhost:8890?streamid=publish:mystream&pkt_size=1316
```
-Replace `mystream` with any name you want. The resulting stream will be available in path `/mystream`.
+Replace `mystream` with any name you want. The resulting stream is available in path `/mystream`.
If credentials are enabled, append username and password to `streamid`:
@@ -723,7 +762,7 @@ WebRTC is an API that makes use of a set of protocols and methods to connect two
http://localhost:8889/mystream/publish
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
WHIP is a WebRTC extension that allows publishing streams by using a URL, without passing through a web page. This makes it possible to use WebRTC as a general-purpose streaming protocol. If you are using software that supports WHIP (for instance, latest versions of OBS Studio), you can publish a stream to the server by using this URL:
@@ -756,7 +795,7 @@ RTSP is a protocol that allows to publish and read streams. It supports differen
rtsp://localhost:8554/mystream
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
Known clients that can publish with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio).
@@ -771,7 +810,7 @@ paths:
source: rtsp://original-url
```
-The resulting stream will be available in path `/proxied`.
+The resulting stream is available in path `/proxied`.
The server supports any number of source streams (the count is only limited by available hardware resources); it's enough to add additional entries to the paths section:
@@ -792,7 +831,7 @@ RTMP is a protocol that allows to read and publish streams, but is less versatil
rtmp://localhost/mystream
```
-The resulting stream will be available in path `/mystream`.
+The resulting stream is available in path `/mystream`.
In case authentication is enabled, credentials can be passed to the server by using the `user` and `pass` query parameters:
@@ -813,7 +852,7 @@ paths:
source: rtmp://original-url
```
-The resulting stream will be available in path `/proxied`.
+The resulting stream is available in path `/proxied`.
#### HLS cameras and servers
@@ -826,7 +865,7 @@ paths:
source: http://original-url/stream/index.m3u8
```
-The resulting stream will be available in path `/proxied`.
+The resulting stream is available in path `/proxied`.
#### UDP/MPEG-TS
@@ -854,7 +893,7 @@ paths:
source: udp://238.0.0.1:1234
```
-The resulting stream will be available in path `/mypath`.
+The resulting stream is available in path `/mypath`.
Known clients that can publish with WebRTC and WHIP are [FFmpeg](#ffmpeg) and [GStreamer](#gstreamer).


@@ -364,6 +364,8 @@ components:
# Raspberry Pi Camera source
rpiCameraCamID:
type: integer
rpiCameraSecondary:
type: boolean
rpiCameraWidth:
type: integer
rpiCameraHeight:
@@ -436,6 +438,8 @@ components:
type: string
rpiCameraLevel:
type: string
rpiCameraJPEGQuality:
type: integer
# Hooks
runOnInit:


@@ -78,6 +78,7 @@ func TestConfFromFile(t *testing.T) {
RPICameraBitrate: 5000000,
RPICameraProfile: "main",
RPICameraLevel: "4.1",
RPICameraJPEGQuality: 60,
RunOnDemandStartTimeout: 5 * Duration(time.Second),
RunOnDemandCloseAfter: 10 * Duration(time.Second),
}, pa)


@@ -154,41 +154,48 @@ type Path struct {
SourceRedirect string `json:"sourceRedirect"`
// Raspberry Pi Camera source
-RPICameraCamID uint `json:"rpiCameraCamID"`
-RPICameraWidth uint `json:"rpiCameraWidth"`
-RPICameraHeight uint `json:"rpiCameraHeight"`
-RPICameraHFlip bool `json:"rpiCameraHFlip"`
-RPICameraVFlip bool `json:"rpiCameraVFlip"`
-RPICameraBrightness float64 `json:"rpiCameraBrightness"`
-RPICameraContrast float64 `json:"rpiCameraContrast"`
-RPICameraSaturation float64 `json:"rpiCameraSaturation"`
-RPICameraSharpness float64 `json:"rpiCameraSharpness"`
-RPICameraExposure string `json:"rpiCameraExposure"`
-RPICameraAWB string `json:"rpiCameraAWB"`
-RPICameraAWBGains []float64 `json:"rpiCameraAWBGains"`
-RPICameraDenoise string `json:"rpiCameraDenoise"`
-RPICameraShutter uint `json:"rpiCameraShutter"`
-RPICameraMetering string `json:"rpiCameraMetering"`
-RPICameraGain float64 `json:"rpiCameraGain"`
-RPICameraEV float64 `json:"rpiCameraEV"`
-RPICameraROI string `json:"rpiCameraROI"`
-RPICameraHDR bool `json:"rpiCameraHDR"`
-RPICameraTuningFile string `json:"rpiCameraTuningFile"`
-RPICameraMode string `json:"rpiCameraMode"`
-RPICameraFPS float64 `json:"rpiCameraFPS"`
-RPICameraAfMode string `json:"rpiCameraAfMode"`
-RPICameraAfRange string `json:"rpiCameraAfRange"`
-RPICameraAfSpeed string `json:"rpiCameraAfSpeed"`
-RPICameraLensPosition float64 `json:"rpiCameraLensPosition"`
-RPICameraAfWindow string `json:"rpiCameraAfWindow"`
-RPICameraFlickerPeriod uint `json:"rpiCameraFlickerPeriod"`
-RPICameraTextOverlayEnable bool `json:"rpiCameraTextOverlayEnable"`
-RPICameraTextOverlay string `json:"rpiCameraTextOverlay"`
-RPICameraCodec string `json:"rpiCameraCodec"`
-RPICameraIDRPeriod uint `json:"rpiCameraIDRPeriod"`
-RPICameraBitrate uint `json:"rpiCameraBitrate"`
-RPICameraProfile string `json:"rpiCameraProfile"`
-RPICameraLevel string `json:"rpiCameraLevel"`
+RPICameraCamID uint `json:"rpiCameraCamID"`
+RPICameraSecondary bool `json:"rpiCameraSecondary"`
+RPICameraWidth uint `json:"rpiCameraWidth"`
+RPICameraHeight uint `json:"rpiCameraHeight"`
+RPICameraHFlip bool `json:"rpiCameraHFlip"`
+RPICameraVFlip bool `json:"rpiCameraVFlip"`
+RPICameraBrightness float64 `json:"rpiCameraBrightness"`
+RPICameraContrast float64 `json:"rpiCameraContrast"`
+RPICameraSaturation float64 `json:"rpiCameraSaturation"`
+RPICameraSharpness float64 `json:"rpiCameraSharpness"`
+RPICameraExposure string `json:"rpiCameraExposure"`
+RPICameraAWB string `json:"rpiCameraAWB"`
+RPICameraAWBGains []float64 `json:"rpiCameraAWBGains"`
+RPICameraDenoise string `json:"rpiCameraDenoise"`
+RPICameraShutter uint `json:"rpiCameraShutter"`
+RPICameraMetering string `json:"rpiCameraMetering"`
+RPICameraGain float64 `json:"rpiCameraGain"`
+RPICameraEV float64 `json:"rpiCameraEV"`
+RPICameraROI string `json:"rpiCameraROI"`
+RPICameraHDR bool `json:"rpiCameraHDR"`
+RPICameraTuningFile string `json:"rpiCameraTuningFile"`
+RPICameraMode string `json:"rpiCameraMode"`
+RPICameraFPS float64 `json:"rpiCameraFPS"`
+RPICameraAfMode string `json:"rpiCameraAfMode"`
+RPICameraAfRange string `json:"rpiCameraAfRange"`
+RPICameraAfSpeed string `json:"rpiCameraAfSpeed"`
+RPICameraLensPosition float64 `json:"rpiCameraLensPosition"`
+RPICameraAfWindow string `json:"rpiCameraAfWindow"`
+RPICameraFlickerPeriod uint `json:"rpiCameraFlickerPeriod"`
+RPICameraTextOverlayEnable bool `json:"rpiCameraTextOverlayEnable"`
+RPICameraTextOverlay string `json:"rpiCameraTextOverlay"`
+RPICameraCodec string `json:"rpiCameraCodec"`
+RPICameraIDRPeriod uint `json:"rpiCameraIDRPeriod"`
+RPICameraBitrate uint `json:"rpiCameraBitrate"`
+RPICameraProfile string `json:"rpiCameraProfile"`
+RPICameraLevel string `json:"rpiCameraLevel"`
+RPICameraJPEGQuality uint `json:"rpiCameraJPEGQuality"`
+RPICameraPrimaryName string `json:"-"` // filled by Check()
+RPICameraSecondaryWidth uint `json:"-"` // filled by Check()
+RPICameraSecondaryHeight uint `json:"-"` // filled by Check()
+RPICameraSecondaryFPS float64 `json:"-"` // filled by Check()
+RPICameraSecondaryJPEGQuality uint `json:"-"` // filled by Check()
// Hooks
RunOnInit string `json:"runOnInit"`
@@ -245,6 +252,7 @@ func (pconf *Path) setDefaults() {
pconf.RPICameraBitrate = 5000000
pconf.RPICameraProfile = "main"
pconf.RPICameraLevel = "4.1"
pconf.RPICameraJPEGQuality = 60
// Hooks
pconf.RunOnDemandStartTimeout = 10 * Duration(time.Second)
@@ -272,6 +280,11 @@ func (pconf Path) Clone() *Path {
}
dest.Regexp = pconf.Regexp
dest.RPICameraPrimaryName = pconf.RPICameraPrimaryName
dest.RPICameraSecondaryWidth = pconf.RPICameraSecondaryWidth
dest.RPICameraSecondaryHeight = pconf.RPICameraSecondaryHeight
dest.RPICameraSecondaryFPS = pconf.RPICameraSecondaryFPS
dest.RPICameraSecondaryJPEGQuality = pconf.RPICameraSecondaryJPEGQuality
return &dest
}
@@ -350,6 +363,7 @@ func (pconf *Path) validate(
l.Log(logger.Warn, "parameter 'sourceProtocol' is deprecated and has been replaced with 'rtspTransport'")
pconf.RTSPTransport = *pconf.SourceProtocol
}
if pconf.SourceAnyPortEnable != nil {
l.Log(logger.Warn, "parameter 'sourceAnyPortEnable' is deprecated and has been replaced with 'rtspAnyPort'")
pconf.RTSPAnyPort = *pconf.SourceAnyPortEnable
@@ -377,6 +391,7 @@ func (pconf *Path) validate(
if err != nil {
return fmt.Errorf("'%s' is not a valid URL", pconf.Source)
}
if u.Scheme != "http" && u.Scheme != "https" {
return fmt.Errorf("'%s' is not a valid URL", pconf.Source)
}
@@ -420,12 +435,13 @@ func (pconf *Path) validate(
}
case pconf.Source == "rpiCamera":
for otherName, otherPath := range conf.Paths {
if otherPath != pconf && otherPath != nil &&
otherPath.Source == "rpiCamera" && otherPath.RPICameraCamID == pconf.RPICameraCamID {
return fmt.Errorf("'rpiCamera' with same camera ID %d is used as source in two paths, '%s' and '%s'",
pconf.RPICameraCamID, name, otherName)
}
if pconf.RPICameraWidth == 0 {
return fmt.Errorf("invalid 'rpiCameraWidth' value")
}
if pconf.RPICameraHeight == 0 {
return fmt.Errorf("invalid 'rpiCameraHeight' value")
}
switch pconf.RPICameraExposure {
@@ -433,43 +449,99 @@ func (pconf *Path) validate(
default:
return fmt.Errorf("invalid 'rpiCameraExposure' value")
}
switch pconf.RPICameraAWB {
case "auto", "incandescent", "tungsten", "fluorescent", "indoor", "daylight", "cloudy", "custom":
default:
return fmt.Errorf("invalid 'rpiCameraAWB' value")
}
if len(pconf.RPICameraAWBGains) != 2 {
return fmt.Errorf("invalid 'rpiCameraAWBGains' value")
}
switch pconf.RPICameraDenoise {
case "off", "cdn_off", "cdn_fast", "cdn_hq":
default:
return fmt.Errorf("invalid 'rpiCameraDenoise' value")
}
switch pconf.RPICameraMetering {
case "centre", "spot", "matrix", "custom":
default:
return fmt.Errorf("invalid 'rpiCameraMetering' value")
}
switch pconf.RPICameraAfMode {
case "auto", "manual", "continuous":
default:
return fmt.Errorf("invalid 'rpiCameraAfMode' value")
}
switch pconf.RPICameraAfRange {
case "normal", "macro", "full":
default:
return fmt.Errorf("invalid 'rpiCameraAfRange' value")
}
switch pconf.RPICameraAfSpeed {
case "normal", "fast":
default:
return fmt.Errorf("invalid 'rpiCameraAfSpeed' value")
}
switch pconf.RPICameraCodec {
case "auto", "hardwareH264", "softwareH264":
default:
return fmt.Errorf("invalid 'rpiCameraCodec' value")
if !pconf.RPICameraSecondary {
switch pconf.RPICameraCodec {
case "auto", "hardwareH264", "softwareH264":
default:
return fmt.Errorf("supported codecs for a primary RPI Camera stream are auto, hardwareH264, softwareH264")
}
for otherName, otherPath := range conf.Paths {
if otherPath != pconf &&
otherPath != nil &&
otherPath.Source == "rpiCamera" &&
otherPath.RPICameraCamID == pconf.RPICameraCamID &&
!otherPath.RPICameraSecondary {
return fmt.Errorf("'rpiCamera' with same camera ID %d is used as source in two paths, '%s' and '%s'",
pconf.RPICameraCamID, name, otherName)
}
}
} else {
switch pconf.RPICameraCodec {
case "auto", "mjpeg":
default:
return fmt.Errorf("supported codecs for a secondary RPI Camera stream are auto, mjpeg")
}
var primaryName string
var primary *Path
for otherPathName, otherPath := range conf.Paths {
if otherPath != pconf &&
otherPath != nil &&
otherPath.Source == "rpiCamera" &&
otherPath.RPICameraCamID == pconf.RPICameraCamID &&
!otherPath.RPICameraSecondary {
primaryName = otherPathName
primary = otherPath
break
}
}
if primary == nil {
return fmt.Errorf("cannot find a primary RPI Camera stream to associate with the secondary stream")
}
if primary.RPICameraSecondaryWidth != 0 {
return fmt.Errorf("a primary RPI Camera stream is associated with multiple secondary streams")
}
pconf.RPICameraPrimaryName = primaryName
primary.RPICameraSecondaryWidth = pconf.RPICameraWidth
primary.RPICameraSecondaryHeight = pconf.RPICameraHeight
primary.RPICameraSecondaryFPS = pconf.RPICameraFPS
primary.RPICameraSecondaryJPEGQuality = pconf.RPICameraJPEGQuality
}
default:


@@ -31,6 +31,7 @@ type pathParent interface {
pathReady(*path)
pathNotReady(*path)
closePath(*path)
AddReader(req defs.PathAddReaderReq) (defs.Path, *stream.Stream, error)
}
type pathOnDemandState int
@@ -174,6 +175,7 @@ func (pa *path) run() {
WriteTimeout: pa.writeTimeout,
WriteQueueSize: pa.writeQueueSize,
Matches: pa.matches,
PathManager: pa.parent,
Parent: pa,
}
pa.source.(*staticsources.Handler).Initialize()


@@ -18,6 +18,7 @@ import (
sssrt "github.com/bluenviron/mediamtx/internal/staticsources/srt"
ssudp "github.com/bluenviron/mediamtx/internal/staticsources/udp"
sswebrtc "github.com/bluenviron/mediamtx/internal/staticsources/webrtc"
"github.com/bluenviron/mediamtx/internal/stream"
)
const (
@@ -42,6 +43,10 @@ func resolveSource(s string, matches []string, query string) string {
return s
}
type handlerPathManager interface {
AddReader(req defs.PathAddReaderReq) (defs.Path, *stream.Stream, error)
}
type handlerParent interface {
logger.Writer
StaticSourceHandlerSetReady(context.Context, defs.PathSourceStaticSetReadyReq)
@@ -56,6 +61,7 @@ type Handler struct {
WriteTimeout conf.Duration
WriteQueueSize int
Matches []string
PathManager handlerPathManager
Parent handlerParent
ctx context.Context
@@ -298,3 +304,8 @@ func (s *Handler) SetNotReady(req defs.PathSourceStaticSetNotReadyReq) {
case <-s.ctx.Done():
}
}
// AddReader is called by a staticSource.
func (s *Handler) AddReader(req defs.PathAddReaderReq) (defs.Path, *stream.Stream, error) {
return s.PathManager.AddReader(req)
}


@@ -3,11 +3,15 @@
package rpicamera
import (
"debug/elf"
"fmt"
"os"
"os/exec"
"path/filepath"
"runtime"
"strconv"
"strings"
"sync"
"syscall"
"time"
"unsafe"
@@ -15,6 +19,18 @@ import (
"github.com/bluenviron/mediacommon/v2/pkg/codecs/h264"
)
const (
libraryToCheckArchitecture = "libc.so.6"
dumpPrefix = "/dev/shm/mediamtx-rpicamera-"
executableName = "mtxrpicam"
)
var (
dumpMutex sync.Mutex
dumpCount = 0
dumpPath = ""
)
func ntpTime() syscall.Timespec {
var t syscall.Timespec
syscall.Syscall(syscall.SYS_CLOCK_GETTIME, 0, uintptr(unsafe.Pointer(&t)), 0)
@@ -33,9 +49,147 @@ func multiplyAndDivide(v, m, d int64) int64 {
return (secs*m + dec*m/d)
}
func getArchitecture(libPath string) (bool, error) {
f, err := os.Open(libPath)
if err != nil {
return false, err
}
defer f.Close()
ef, err := elf.NewFile(f)
if err != nil {
return false, err
}
defer ef.Close()
return (ef.FileHeader.Class == elf.ELFCLASS64), nil
}
func checkArchitecture() error {
byts, err := exec.Command("ldconfig", "-p").Output()
if err != nil {
return fmt.Errorf("ldconfig failed: %w", err)
}
for _, line := range strings.Split(string(byts), "\n") {
f := strings.Split(line, " => ")
if len(f) == 2 && strings.Contains(f[1], libraryToCheckArchitecture) {
is64, err := getArchitecture(f[1])
if err != nil {
return err
}
if runtime.GOARCH == "arm" {
if !is64 {
return nil
}
} else {
if is64 {
return nil
}
}
}
}
if runtime.GOARCH == "arm" {
return fmt.Errorf("the operating system is 64-bit, you need the 64-bit server version")
}
return fmt.Errorf("the operating system is 32-bit, you need the 32-bit server version")
}
func dumpEmbedFSRecursive(src string, dest string) error {
files, err := mtxrpicam.ReadDir(src)
if err != nil {
return err
}
for _, f := range files {
if f.IsDir() {
err = os.Mkdir(filepath.Join(dest, f.Name()), 0o755)
if err != nil {
return err
}
err = dumpEmbedFSRecursive(filepath.Join(src, f.Name()), filepath.Join(dest, f.Name()))
if err != nil {
return err
}
} else {
buf, err := mtxrpicam.ReadFile(filepath.Join(src, f.Name()))
if err != nil {
return err
}
err = os.WriteFile(filepath.Join(dest, f.Name()), buf, 0o644)
if err != nil {
return err
}
}
}
return nil
}
func dumpComponent() error {
dumpMutex.Lock()
defer dumpMutex.Unlock()
if dumpCount > 0 {
dumpCount++
return nil
}
err := checkArchitecture()
if err != nil {
return err
}
dumpPath = dumpPrefix + strconv.FormatInt(time.Now().UnixNano(), 10)
err = os.Mkdir(dumpPath, 0o755)
if err != nil {
return err
}
files, err := mtxrpicam.ReadDir(".")
if err != nil {
os.RemoveAll(dumpPath)
return err
}
err = dumpEmbedFSRecursive(files[0].Name(), dumpPath)
if err != nil {
os.RemoveAll(dumpPath)
return err
}
err = os.Chmod(filepath.Join(dumpPath, executableName), 0o755)
if err != nil {
os.RemoveAll(dumpPath)
return err
}
dumpCount++
return nil
}
func freeComponent() {
dumpMutex.Lock()
defer dumpMutex.Unlock()
dumpCount--
if dumpCount == 0 {
os.RemoveAll(dumpPath)
}
}
type camera struct {
-params params
-onData func(int64, time.Time, [][]byte)
+params params
+onData func(int64, time.Time, [][]byte)
+onDataSecondary func(int64, time.Time, []byte)
cmd *exec.Cmd
pipeOut *pipe
@@ -181,7 +335,7 @@ outer:
case 'e':
return fmt.Errorf(string(buf[1:]))
-case 'b':
+case 'd':
dts := int64(buf[8])<<56 | int64(buf[7])<<48 | int64(buf[6])<<40 | int64(buf[5])<<32 |
int64(buf[4])<<24 | int64(buf[3])<<16 | int64(buf[2])<<8 | int64(buf[1])
@@ -204,6 +358,23 @@ outer:
ntp,
nalus)
case 's':
dts := int64(buf[8])<<56 | int64(buf[7])<<48 | int64(buf[6])<<40 | int64(buf[5])<<32 |
int64(buf[4])<<24 | int64(buf[3])<<16 | int64(buf[2])<<8 | int64(buf[1])
unixNTP := ntpTime()
unixMono := monotonicTime()
// subtract from NTP the delay from now to the moment the frame was taken
ntp := time.Unix(int64(unixNTP.Sec), int64(unixNTP.Nsec))
deltaT := time.Duration(unixMono.Nano()-dts*1e3) * time.Nanosecond
ntp = ntp.Add(-deltaT)
c.onDataSecondary(
multiplyAndDivide(dts, 90000, 1e6),
ntp,
buf[9:])
default:
return fmt.Errorf("unexpected data from pipe: '0x%.2x'", buf[0])
}


@@ -8,8 +8,9 @@ import (
)
type camera struct {
-params params
-onData func(int64, time.Time, [][]byte)
+params params
+onData func(int64, time.Time, [][]byte)
+onDataSecondary func(int64, time.Time, []byte)
}
func (c *camera) initialize() error {


@@ -1,165 +0,0 @@
//go:build (linux && arm) || (linux && arm64)
package rpicamera
import (
"debug/elf"
"fmt"
"os"
"os/exec"
"path/filepath"
"runtime"
"strconv"
"strings"
"sync"
"time"
)
const (
libraryToCheckArchitecture = "libc.so.6"
dumpPrefix = "/dev/shm/mediamtx-rpicamera-"
executableName = "mtxrpicam"
)
var (
dumpMutex sync.Mutex
dumpCount = 0
dumpPath = ""
)
func getArchitecture(libPath string) (bool, error) {
f, err := os.Open(libPath)
if err != nil {
return false, err
}
defer f.Close()
ef, err := elf.NewFile(f)
if err != nil {
return false, err
}
defer ef.Close()
return (ef.FileHeader.Class == elf.ELFCLASS64), nil
}
func checkArchitecture() error {
byts, err := exec.Command("ldconfig", "-p").Output()
if err != nil {
return fmt.Errorf("ldconfig failed: %w", err)
}
for _, line := range strings.Split(string(byts), "\n") {
f := strings.Split(line, " => ")
if len(f) == 2 && strings.Contains(f[1], libraryToCheckArchitecture) {
is64, err := getArchitecture(f[1])
if err != nil {
return err
}
if runtime.GOARCH == "arm" {
if !is64 {
return nil
}
} else {
if is64 {
return nil
}
}
}
}
if runtime.GOARCH == "arm" {
return fmt.Errorf("the operating system is 64-bit, you need the 64-bit server version")
}
return fmt.Errorf("the operating system is 32-bit, you need the 32-bit server version")
}
func dumpEmbedFSRecursive(src string, dest string) error {
files, err := component.ReadDir(src)
if err != nil {
return err
}
for _, f := range files {
if f.IsDir() {
err = os.Mkdir(filepath.Join(dest, f.Name()), 0o755)
if err != nil {
return err
}
err = dumpEmbedFSRecursive(filepath.Join(src, f.Name()), filepath.Join(dest, f.Name()))
if err != nil {
return err
}
} else {
buf, err := component.ReadFile(filepath.Join(src, f.Name()))
if err != nil {
return err
}
err = os.WriteFile(filepath.Join(dest, f.Name()), buf, 0o644)
if err != nil {
return err
}
}
}
return nil
}
func dumpComponent() error {
dumpMutex.Lock()
defer dumpMutex.Unlock()
if dumpCount > 0 {
dumpCount++
return nil
}
err := checkArchitecture()
if err != nil {
return err
}
dumpPath = dumpPrefix + strconv.FormatInt(time.Now().UnixNano(), 10)
err = os.Mkdir(dumpPath, 0o755)
if err != nil {
return err
}
files, err := component.ReadDir(".")
if err != nil {
os.RemoveAll(dumpPath)
return err
}
err = dumpEmbedFSRecursive(files[0].Name(), dumpPath)
if err != nil {
os.RemoveAll(dumpPath)
return err
}
err = os.Chmod(filepath.Join(dumpPath, executableName), 0o755)
if err != nil {
os.RemoveAll(dumpPath)
return err
}
dumpCount++
return nil
}
func freeComponent() {
dumpMutex.Lock()
defer dumpMutex.Unlock()
dumpCount--
if dumpCount == 0 {
os.RemoveAll(dumpPath)
}
}


@@ -7,4 +7,4 @@ import (
)
//go:embed mtxrpicam_32/*
-var component embed.FS
+var mtxrpicam embed.FS


@@ -7,4 +7,4 @@ import (
)
//go:embed mtxrpicam_64/*
-var component embed.FS
+var mtxrpicam embed.FS


@@ -1 +1 @@
-v2.3.7
+v2.4.0


@@ -38,4 +38,8 @@ type params struct {
Bitrate uint32
Profile string
Level string
SecondaryWidth uint32
SecondaryHeight uint32
SecondaryFPS float32
SecondaryQuality uint32
}


@@ -2,10 +2,16 @@
package rpicamera
import (
"context"
"errors"
"fmt"
"time"
"github.com/bluenviron/gortsplib/v4/pkg/description"
"github.com/bluenviron/gortsplib/v4/pkg/format"
"github.com/bluenviron/gortsplib/v4/pkg/format/rtph264"
"github.com/bluenviron/gortsplib/v4/pkg/format/rtpmjpeg"
"github.com/pion/rtp"
"github.com/bluenviron/mediamtx/internal/conf"
"github.com/bluenviron/mediamtx/internal/defs"
@@ -14,6 +20,10 @@ import (
"github.com/bluenviron/mediamtx/internal/unit"
)
const (
pauseBetweenErrors = 1 * time.Second
)
func paramsFromConf(logLevel conf.LogLevel, cnf *conf.Path) params {
return params{
LogLevel: func() string {
@@ -63,13 +73,40 @@ func paramsFromConf(logLevel conf.LogLevel, cnf *conf.Path) params {
Bitrate: uint32(cnf.RPICameraBitrate),
Profile: cnf.RPICameraProfile,
Level: cnf.RPICameraLevel,
SecondaryWidth: uint32(cnf.RPICameraSecondaryWidth),
SecondaryHeight: uint32(cnf.RPICameraSecondaryHeight),
SecondaryFPS: float32(cnf.RPICameraSecondaryFPS),
SecondaryQuality: uint32(cnf.RPICameraSecondaryJPEGQuality),
}
}
type secondaryReader struct {
ctx context.Context
ctxCancel func()
}
// Close implements reader.
func (r *secondaryReader) Close() {
r.ctxCancel()
}
// APIReaderDescribe implements reader.
func (*secondaryReader) APIReaderDescribe() defs.APIPathSourceOrReader {
return defs.APIPathSourceOrReader{
Type: "rpiCameraSecondary",
ID: "",
}
}
type parent interface {
defs.StaticSourceParent
AddReader(req defs.PathAddReaderReq) (defs.Path, *stream.Stream, error)
}
// Source is a Raspberry Pi Camera static source.
type Source struct {
LogLevel conf.LogLevel
-Parent defs.StaticSourceParent
+Parent parent
}
// Log implements logger.Writer.
@@ -79,6 +116,15 @@ func (s *Source) Log(level logger.Level, format string, args ...interface{}) {
// Run implements StaticSource.
func (s *Source) Run(params defs.StaticSourceRunParams) error {
if !params.Conf.RPICameraSecondary {
return s.runPrimary(params)
}
return s.runSecondary(params)
}
func (s *Source) runPrimary(params defs.StaticSourceRunParams) error {
var medias []*description.Media
medi := &description.Media{
Type: description.MediaTypeVideo,
Formats: []format.Format{&format.H264{
@@ -86,29 +132,88 @@ func (s *Source) Run(params defs.StaticSourceRunParams) error {
PacketizationMode: 1,
}},
}
-medias := []*description.Media{medi}
+medias = append(medias, medi)
var mediaSecondary *description.Media
if params.Conf.RPICameraSecondaryWidth != 0 {
mediaSecondary = &description.Media{
Type: description.MediaTypeApplication,
Formats: []format.Format{&format.Generic{
PayloadTyp: 96,
RTPMa: "rpicamera_secondary",
ClockRat: 90000,
}},
}
medias = append(medias, mediaSecondary)
}
var stream *stream.Stream
-onData := func(pts int64, ntp time.Time, au [][]byte) {
+initializeStream := func() {
if stream == nil {
res := s.Parent.SetReady(defs.PathSourceStaticSetReadyReq{
Desc: &description.Session{Medias: medias},
-GenerateRTPPackets: true,
+GenerateRTPPackets: false,
})
if res.Err != nil {
-return
+panic("should not happen")
}
stream = res.Stream
}
}
-stream.WriteUnit(medi, medi.Formats[0], &unit.H264{
-Base: unit.Base{
-PTS: pts,
-NTP: ntp,
-},
-AU: au,
-})
encH264 := &rtph264.Encoder{
PayloadType: 96,
PayloadMaxSize: 1460,
}
err := encH264.Init()
if err != nil {
return err
}
onData := func(pts int64, ntp time.Time, au [][]byte) {
initializeStream()
pkts, err2 := encH264.Encode(au)
if err2 != nil {
s.Log(logger.Error, err2.Error())
return
}
for _, pkt := range pkts {
pkt.Timestamp = uint32(pts)
stream.WriteRTPPacket(medi, medi.Formats[0], pkt, ntp, pts)
}
}
var onDataSecondary func(pts int64, ntp time.Time, au []byte)
if params.Conf.RPICameraSecondaryWidth != 0 {
encJpeg := &rtpmjpeg.Encoder{
PayloadMaxSize: 1460,
}
err = encJpeg.Init()
if err != nil {
panic(err)
}
onDataSecondary = func(pts int64, ntp time.Time, au []byte) {
initializeStream()
pkts, err2 := encJpeg.Encode(au)
if err2 != nil {
s.Log(logger.Error, err2.Error())
return
}
for _, pkt := range pkts {
pkt.Timestamp = uint32(pts)
pkt.PayloadType = 96
stream.WriteRTPPacket(mediaSecondary, mediaSecondary.Formats[0], pkt, ntp, pts)
}
}
}
defer func() {
@@ -118,10 +223,11 @@ func (s *Source) Run(params defs.StaticSourceRunParams) error {
}()
cam := &camera{
-params: paramsFromConf(s.LogLevel, params.Conf),
-onData: onData,
+params: paramsFromConf(s.LogLevel, params.Conf),
+onData: onData,
+onDataSecondary: onDataSecondary,
}
-err := cam.initialize()
+err = cam.initialize()
if err != nil {
return err
}
@@ -146,6 +252,93 @@ func (s *Source) Run(params defs.StaticSourceRunParams) error {
}
}
func (s *Source) runSecondary(params defs.StaticSourceRunParams) error {
r := &secondaryReader{}
r.ctx, r.ctxCancel = context.WithCancel(context.Background())
defer r.ctxCancel()
path, origStream, err := s.waitForPrimary(r, params)
if err != nil {
return err
}
defer path.RemoveReader(defs.PathRemoveReaderReq{Author: r})
media := &description.Media{
Type: description.MediaTypeVideo,
Formats: []format.Format{&format.MJPEG{}},
}
res := s.Parent.SetReady(defs.PathSourceStaticSetReadyReq{
Desc: &description.Session{Medias: []*description.Media{media}},
GenerateRTPPackets: false,
})
if res.Err != nil {
return res.Err
}
origStream.AddReader(
s,
origStream.Desc.Medias[1],
origStream.Desc.Medias[1].Formats[0],
func(u unit.Unit) error {
pkt := u.GetRTPPackets()[0]
newPkt := &rtp.Packet{
Header: pkt.Header,
Payload: pkt.Payload,
}
newPkt.PayloadType = 26
res.Stream.WriteRTPPacket(media, media.Formats[0], newPkt, u.GetNTP(), u.GetPTS())
return nil
})
origStream.StartReader(s)
defer origStream.RemoveReader(s)
select {
case err := <-origStream.ReaderError(s):
return err
case <-r.ctx.Done():
return fmt.Errorf("primary stream closed")
case <-params.Context.Done():
return fmt.Errorf("terminated")
}
}
func (s *Source) waitForPrimary(
r *secondaryReader,
params defs.StaticSourceRunParams,
) (defs.Path, *stream.Stream, error) {
for {
path, origStream, err := s.Parent.AddReader(defs.PathAddReaderReq{
Author: r,
AccessRequest: defs.PathAccessRequest{
Name: params.Conf.RPICameraPrimaryName,
SkipAuth: true,
},
})
if err != nil {
var err2 defs.PathNoStreamAvailableError
if errors.As(err, &err2) {
select {
case <-time.After(pauseBetweenErrors):
case <-params.Context.Done():
return nil, nil, fmt.Errorf("terminated")
}
continue
}
return nil, nil, err
}
return path, origStream, nil
}
}
// APISourceDescribe implements StaticSource.
func (*Source) APISourceDescribe() defs.APIPathSourceOrReader {
return defs.APIPathSourceOrReader{


@@ -517,64 +517,66 @@ pathDefaults:
###############################################
# Default path settings -> Raspberry Pi Camera source (when source is "rpiCamera")
-# ID of the camera
+# ID of the camera.
rpiCameraCamID: 0
-# Width of frames
+# Whether this is a secondary stream.
+rpiCameraSecondary: false
+# Width of frames.
rpiCameraWidth: 1920
-# Height of frames
+# Height of frames.
rpiCameraHeight: 1080
-# Flip horizontally
+# Flip horizontally.
rpiCameraHFlip: false
-# Flip vertically
+# Flip vertically.
rpiCameraVFlip: false
-# Brightness [-1, 1]
+# Brightness [-1, 1].
rpiCameraBrightness: 0
-# Contrast [0, 16]
+# Contrast [0, 16].
rpiCameraContrast: 1
-# Saturation [0, 16]
+# Saturation [0, 16].
rpiCameraSaturation: 1
-# Sharpness [0, 16]
+# Sharpness [0, 16].
rpiCameraSharpness: 1
# Exposure mode.
-# values: normal, short, long, custom
+# values: normal, short, long, custom.
rpiCameraExposure: normal
# Auto-white-balance mode.
-# values: auto, incandescent, tungsten, fluorescent, indoor, daylight, cloudy, custom
+# values: auto, incandescent, tungsten, fluorescent, indoor, daylight, cloudy, custom.
rpiCameraAWB: auto
# Auto-white-balance fixed gains. This can be used in place of rpiCameraAWB.
-# format: [red,blue]
+# format: [red,blue].
rpiCameraAWBGains: [0, 0]
# Denoise operating mode.
-# values: off, cdn_off, cdn_fast, cdn_hq
+# values: off, cdn_off, cdn_fast, cdn_hq.
rpiCameraDenoise: "off"
# Fixed shutter speed, in microseconds.
rpiCameraShutter: 0
# Metering mode of the AEC/AGC algorithm.
-# values: centre, spot, matrix, custom
+# values: centre, spot, matrix, custom.
rpiCameraMetering: centre
-# Fixed gain
+# Fixed gain.
rpiCameraGain: 0
-# EV compensation of the image [-10, 10]
+# EV compensation of the image [-10, 10].
rpiCameraEV: 0
-# Region of interest, in format x,y,width,height (all normalized between 0 and 1)
+# Region of interest, in format x,y,width,height (all normalized between 0 and 1).
rpiCameraROI:
# Whether to enable HDR on Raspberry Camera 3.
rpiCameraHDR: false
-# Tuning file
+# Tuning file.
rpiCameraTuningFile:
# Sensor mode, in format [width]:[height]:[bit-depth]:[packing]
# bit-depth and packing are optional.
rpiCameraMode:
-# frames per second
+# frames per second.
rpiCameraFPS: 30
-# Autofocus mode
-# values: auto, manual, continuous
+# Autofocus mode.
+# values: auto, manual, continuous.
rpiCameraAfMode: continuous
-# Autofocus range
-# values: normal, macro, full
+# Autofocus range.
+# values: normal, macro, full.
rpiCameraAfRange: normal
-# Autofocus speed
-# values: normal, fast
+# Autofocus speed.
+# values: normal, fast.
rpiCameraAfSpeed: normal
# Lens position (for manual autofocus only), will be set to focus to a specific distance
# calculated by the following formula: d = 1 / value
@@ -592,16 +594,18 @@ pathDefaults:
# Text that is printed on each frame.
# format is the one of the strftime() function.
rpiCameraTextOverlay: '%Y-%m-%d %H:%M:%S - MediaMTX'
-# Codec. Available values: auto, hardwareH264, softwareH264
+# Codec. Available values: auto, hardwareH264, softwareH264, mjpeg.
rpiCameraCodec: auto
-# Period between IDR frames
+# Period between H264 IDR frames.
rpiCameraIDRPeriod: 60
-# Bitrate
+# H264 Bitrate.
rpiCameraBitrate: 5000000
-# H264 profile
+# H264 profile.
rpiCameraProfile: main
-# H264 level
+# H264 level.
rpiCameraLevel: '4.1'
+# JPEG quality.
+rpiCameraJPEGQuality: 60
###############################################
# Default path settings -> Hooks