Mirror of https://github.com/blakeblackshear/frigate.git, synced 2025-09-26 11:31:28 +08:00
Nvidia TensorRT detector (#4718)
* Initial WIP dockerfile and scripts to add TensorRT support
* Add TensorRT detector
* WIP attempt to install TensorRT 8.5
* Updates to detector for cuda python library
* TensorRT CUDA library rework WIP. Does not run
* Fixes from rebase to detector factory
* Fix parsing output memory pointer
* Handle TensorRT logs with the python logger
* Use non-async interface and convert input data to float32. Detection runs without error.
* Make TensorRT a separate build from the base Frigate image.
* Add script and documentation for generating TRT models
* Add support for TensorRT devcontainer
* Add labelmap to TRT model script and docs. Cleanup of old scripts.
* Update detect to normalize input tensor using model input type
* Add config for selecting GPU. Fix async inference. Update documentation.
* Update some CUDA libraries to clean up version warning
* Add CI stage to build TensorRT tag
* Add note in docs for image tag and model support
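For context, the detector and GPU-selection options this commit describes are set in Frigate's own config file. A minimal sketch follows; the model path, dimensions, and device index are assumptions for illustration, and must match whatever engine file the documented TRT model-generation step actually produced:

```yaml
# Hypothetical Frigate config fragment for the new TensorRT detector.
detectors:
  tensorrt:
    type: tensorrt
    device: 0  # which GPU to use (the GPU-selection option this commit adds)

model:
  # Path, width, and height are assumptions; they must match the
  # TensorRT engine generated by the documented model script.
  path: /trt-models/yolov7-tiny-416.trt
  width: 416
  height: 416
```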
@@ -11,7 +11,15 @@ services:
    shm_size: "256mb"
    build:
      context: .
      # Use target devcontainer-trt for TensorRT dev
      target: devcontainer
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
    devices:
      - /dev/bus/usb:/dev/bus/usb
      # - /dev/dri:/dev/dri # for intel hwaccel, needs to be updated for your hardware
@@ -21,6 +29,8 @@ services:
      - /etc/localtime:/etc/localtime:ro
      - ./config/config.yml:/config/config.yml:ro
      - ./debug:/media/frigate
      # Create the trt-models folder using the documented method of generating TRT models
      # - ./debug/trt-models:/trt-models
      - /dev/bus/usb:/dev/bus/usb
  mqtt:
    container_name: mqtt
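The `deploy.resources.reservations` block shown in the devcontainer diff is the same Compose pattern an end user would add to give a production Frigate container GPU access. A sketch, assuming a hypothetical service definition; the `-tensorrt` image tag is an assumption inferred from the "CI stage to build TensorRT tag" item in the commit message:

```yaml
# Hypothetical end-user compose fragment; image tag is an assumption
# based on the TensorRT-specific tag this commit's CI stage builds.
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable-tensorrt
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```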