
Custom Models

Posted: Wed Nov 17, 2021 7:10 am
by twoii
I am testing out BlueIris and am having trouble with custom models. I have trained DeepStack (running on a separate server) with a custom model, and it already works when I test it via the API. But when I try to integrate it with BlueIris, the custom model does not seem to be used.

I checked the logs of DeepStack and saw that BlueIris only calls the vision detection API and not the custom model API.

I have added the name of the model under "Custom Models" in the AI settings. Am I missing anything else?
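For reference, a quick way to see the difference is to hit the two endpoints by hand. This is only a minimal sketch using curl, where the server address, port, image file, and model name are placeholders for your own setup:

curl -X POST -F image=@test.jpg http://<deepstack-server>:<port>/v1/vision/detection
curl -X POST -F image=@test.jpg http://<deepstack-server>:<port>/v1/vision/custom/<model-name>

The first is the standard object detection endpoint; the second is what should show up in the DeepStack logs once the custom model is actually being called.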

Re: Custom Models

Posted: Wed Nov 17, 2021 9:12 pm
by YrbkMgr
twoii wrote: Wed Nov 17, 2021 7:10 am I checked the logs of DeepStack and saw that BlueIris only calls the vision detection API and not the custom model API.
Where do you find the DeepStack logs?

Re: Custom Models

Posted: Thu Nov 18, 2021 1:20 am
by twoii
I have DeepStack running in PowerShell, and all API calls can be seen there.

Re: Custom Models

Posted: Thu Nov 18, 2021 1:49 am
by YrbkMgr
Thanks for that.

Re: Custom Models

Posted: Thu Nov 18, 2021 4:29 am
by aesterling
Have you restarted the computer after adding the custom models?

Re: Custom Models

Posted: Fri Nov 19, 2021 1:24 pm
by twoii
Tested again today and it works now. There was some issue with the model and test data I used.

Re: Custom Models

Posted: Tue Jan 18, 2022 5:47 am
by durnovtsev
twoii wrote: Fri Nov 19, 2021 1:24 pm Tested again today and it works now. There was some issue with the model and test data I used.
Hi twoii, how did you solve the problem with launching custom models? I ran my own custom model, and also one from the forums (openlogo, for example); under Windows everything works fine. But when I run DeepStack through Docker, BlueIris does not find the custom model during analysis, and only a green bar appears at the top left.

sudo docker run --gpus all -e VISION-DETECTION=True -v localstorage:/datastore -p 80:5000 deepquestai/deepstack:gpu
With this command I run DeepStack with the standard models (car, person, dog, chair, ...), and everything works fine in the Library.

I run custom models with this command:

sudo docker run --gpus all -v /home/durnovtsev/DeepStack-Models:/modelstore/detection -p 80:5000 deepquestai/deepstack:gpu
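A single DeepStack container can serve both the built-in detection models and a custom model folder at the same time. The following is only a sketch combining the two commands above; the volume paths and port mapping are simply the ones already used in this thread and may need adjusting:

sudo docker run --gpus all -e VISION-DETECTION=True -v localstorage:/datastore -v /home/durnovtsev/DeepStack-Models:/modelstore/detection -p 80:5000 deepquestai/deepstack:gpu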
