Run custom model on Ubuntu Docker Deepstack for GPU

durnovtsev
Posts: 3
Joined: Mon Jan 17, 2022 7:08 am


Post by durnovtsev »

I want to run a custom model with DeepStack for GPU in Docker on Ubuntu. What command should I write in the terminal?
To launch the built-in DeepStack models I currently use:

docker run --gpus all -e VISION-DETECTION=True -v localstorage:/datastore -p 80:5000 deepquestai/deepstack:gpu

This works great in Blue Iris.

I want to run a custom model and the built-in models (person, truck, car, ...) at the same time.

What commands do I need to write in the terminal?
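From the DeepStack documentation it looks like custom models are loaded by mounting a host folder containing the model .pt files to /modelstore/detection inside the container, so I assume something like this would run both at once (the host path here is just an example):

docker run --gpus all -e VISION-DETECTION=True \
  -v /home/user/custom-models:/modelstore/detection \  # example host folder with the custom .pt model
  -v localstorage:/datastore \
  -p 80:5000 deepquestai/deepstack:gpu

If I understand the docs correctly, the custom model would then be queried at /v1/vision/custom/<model-name> while the built-in detection stays at /v1/vision/detection. Is that right?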