Update README for new config pattern

David Halls
2021-10-17 08:43:59 +01:00
parent c5441a760e
commit 00dd002076


@@ -35,6 +35,7 @@ You can also change various options:
** Convert your camera's video to greyscale.
** Lock the camera to portrait mode (where available, e.g. mobile phones).
** Zoom the camera to fill the page.
** Switch between HLS and DASH encoding.
** Select a different version of https://github.com/davedoesdev/ffmpeg.js[ffmpeg.js] to perform
the HLS or DASH encoding.
@@ -50,17 +51,94 @@ the `tpix()` function.
The page's functionality is defined in link:site/streamana.js[] and link:site/streamer.js[].
link:site/streamer.js[] exports a function, `get_default_config_from_url`, and a class,
`Streamer`, which does the heavy lifting.
You should first call `get_default_config_from_url`. It takes a single argument,
the URL of `ffmpeg-worker-hls.js` or `ffmpeg-worker-dash.js` in https://github.com/davedoesdev/ffmpeg.js[ffmpeg.js].
This allows your application (or the end user if required) to supply its own version,
in accordance with the LGPL. It can be a relative path (e.g. just `ffmpeg-worker-hls.js` or
`ffmpeg-worker-dash.js`).
`get_default_config_from_url` determines the streaming protocol (`hls` or `dash`) and returns
the default configuration for the protocol:
```js
{
    ffmpeg_lib_url, // the URL you passed to `get_default_config_from_url`
    protocol, // `hls` or `dash`
    video: { // properties of the video you will be supplying
        bitrate: 2500 * 1000,
        framerate: 30
    },
    audio: { // properties of the audio you will be supplying
        bitrate: 128 * 1000
    },
    media_recorder: { // default options for MediaRecorder if it ends up being used
        video: {
            codec: protocol === 'dash' ? 'vp9' : 'H264' // video codec
        },
        audio: {
            codec: 'opus' // audio codec
        },
        webm: true, // container format
        mp4: false // if true, requires ffmpeg-worker-hls.js or ffmpeg-worker-dash.js
                   // to be configured with MP4 support (which is not the default)
    },
    webcodecs: { // default options for WebCodecs if it ends up being used
        video: {
            // video codec and options
            ...(protocol === 'dash' ? {
                codec: 'vp09.00.10.08.01'
            } : {
                codec: 'avc1.42E01E' /*'avc1.42001E'*/,
                avc: { format: 'annexb' }
            })
        },
        audio: {
            codec: 'opus' /*'pcm'*/ // audio codec
        },
        webm_muxer: { // options for webm-muxer.js
            video: {
                codec: protocol === 'dash' ? 'V_VP9' : 'V_MPEG4/ISO/AVC'
            },
            audio: {
                codec: 'A_OPUS',
                bit_depth: 0 // 32 for pcm
            }
        }
    },
    ffmpeg: { // desired ffmpeg output codecs
        // Note: if the encoded stream already uses the desired codec then
        // `copy` is passed instead. For example, if your browser encodes
        // your video to H.264 already then `copy` will be used instead of
        // `libx264`. This means you can use a build of `ffmpeg-worker-hls.js`
        // or `ffmpeg-worker-dash.js` that doesn't contain an H.264 encoder.
        video: {
            codec: protocol === 'dash' ? 'libvpx-vp9' : 'libx264'
        },
        audio: {
            codec: protocol === 'dash' ? 'libopus' : 'aac'
        }
    }
};
```
Your application can modify the returned configuration before creating a `Streamer` object.
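For example, a minimal sketch (the relative worker path and the tweaked values here are illustrative, not required changes):

```js
import { get_default_config_from_url } from './streamer.js';

// Relative path: assumes ffmpeg-worker-hls.js is served alongside your page.
const config = get_default_config_from_url('ffmpeg-worker-hls.js');

// config.protocol is now 'hls'. Tweak the defaults before streaming,
// e.g. lower the video bitrate and frame rate for a constrained uplink.
config.video.bitrate = 1000 * 1000;
config.video.framerate = 25;
```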
Use the `Streamer` class as follows:
* The constructor takes the following arguments:
** The https://developer.mozilla.org/en-US/docs/Web/API/MediaStream[`MediaStream`]
containing your video and audio tracks. Note that link:site/streamana.js[] supplies
blank video when the camera is hidden and silent audio when the microphone is muted.
** An https://developer.mozilla.org/en-US/docs/Web/API/AudioContext[AudioContext] instance.
This is used to create a persistent audio generator for triggering updates to
avoid browser timer throttling. If you don't already use one in your application,
you can just use `new AudioContext()`.
** The ingestion URL.
** The configuration returned by calling `get_default_config_from_url` (see above),
optionally modified by your application.
** Whether the video is rotated.
* Call the `async start()` function to start streaming.
* Call the `end()` function to stop streaming.
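Putting it together, a browser-only sketch (it assumes the constructor takes its arguments in the order listed above, and that `my_stream` and `ingestion_url` are supplied by your application):

```js
import { get_default_config_from_url, Streamer } from './streamer.js';

const config = get_default_config_from_url('ffmpeg-worker-hls.js');

// my_stream: a MediaStream with your video and audio tracks;
// ingestion_url: where the encoded stream is sent.
const streamer = new Streamer(my_stream,
                              new AudioContext(),
                              ingestion_url,
                              config,
                              false /* video not rotated */);
await streamer.start(); // begin streaming
// ... later ...
streamer.end(); // stop streaming
```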