Gotchas are issues discovered via past customer tickets. These articles help users resolve their own issues based on those learnings.
Gotcha 11: DeepStack is confirming alerts even though the motion is outside my area of interest.
Issue:
- Defined an area of interest (Zone B).
- Told BI to trigger only based on Zone B.
- Received an alert due to motion outside Zone B.
The problem is that DeepStack analyzes the entire visible image, not just your zone. Two fixes:
1. Add more intelligence to Object Crossing logic.
By improving the logic, BI object tracking will only trigger if there is movement from A-B or vice versa.
The camera would never trigger if there was motion only in Zone B, for example a piece of paper blowing in the wind.
Thus BI would never have triggered and DeepStack would never have the opportunity to analyze the frame where the car was driving by on the road.
2. Obscure the road completely by redefining Zone A.
By obscuring the road, it is completely out of the picture. DeepStack will never analyze the street.
This likely reduces DeepStack's CPU/GPU usage as well.
Gotcha 10: DeepStack temp directory filling up.
For reasons still unknown, certain CPU deployments result in DeepStack filling up the temp directory over a few days while the server is running.
We still do not know the root cause. Please share it with Support if you know the issue.
Below is a work around shared by one user.
Unfortunately what I have isn’t so much a fix as it is a workaround. It’s just using Windows Task Scheduler to call a batch file that runs ‘del C:\DeepStack\redis\temp-*.rdb’ every hour.
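The same cleanup can be expressed as a Python sketch that Task Scheduler could run instead of a batch file. This is a workaround, not a fix; the C:\DeepStack\redis path is the default install location from the workaround above, so adjust it if yours differs:

```python
import glob
import os

def clean_redis_temp(redis_dir):
    """Delete leftover temp-*.rdb files from DeepStack's redis
    directory and return the paths that were removed."""
    removed = []
    for path in glob.glob(os.path.join(redis_dir, "temp-*.rdb")):
        try:
            os.remove(path)
            removed.append(path)
        except OSError:
            # A file may still be held open by redis-server.exe; skip it.
            pass
    return removed

if __name__ == "__main__":
    # Default install location used in the batch workaround above.
    print(clean_redis_temp(r"C:\DeepStack\redis"))
```

Only the temp-*.rdb files are touched, so redis's normal dump.rdb is left alone.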
Gotcha 9: Server Error 100
See DeepStack: Server Error 100 section in Status Log Errors article for details.
Gotcha 8: DeepStack appears to be running but no images are being processed.
DeepStack is on (or believed to be on). However, no AI alerts appear in the clip list, and there are no errors in the Status->Log!
We are getting reports that users assume DeepStack is running because the DeepStack splash page pops up. However, no DeepStack annotations appear when they should. It's as if DeepStack is not turned on. The clip list looks like below, with only motion annotations.
Once DeepStack is activated, you should receive AI annotations. Even with no objects, the "nothing found" AI annotation appears!
If DeepStack were turned on, the clip list would look like below when objects were found.
First confirm the BI server and the DeepStack server are communicating. Global settings -> AI tab.
Also confirm you have DeepStack turned ON for the camera you are debugging! Camera settings -> Trigger tab -> Artificial Intelligence. Remember, if you have multiple Profiles, DeepStack needs to be set correctly (i.e. on or off) for each Profile.
- If you uncheck Object detection in the Advanced section of Camera settings -> Trigger tab -> Motion sensor Configure, you have, perhaps unknowingly, turned off DeepStack, because object detection is required for DeepStack to work.
- If all the trigger sources are inactive in the Alerts tab, you have, perhaps unknowingly, turned off DeepStack.
If you want to turn off all your alerts, it is best to do so by deleting or disabling all your entries in the action map.
The default setting is to have all your trigger sources active. At a minimum keep the Motion Zones trigger sources enabled so DeepStack is not affected.
- Uncheck Auto start/stop in Global settings -> AI.
- Restart Windows to make sure no DeepStack processes are still running. Stopping DeepStack manually can be unreliable because of the additional processes DeepStack spawns (server.exe, python.exe, redis-server.exe).
- Start DeepStack manually from the command line: "deepstack --VISION-DETECTION True --PORT 82". Port 82 is used by default, but if you chose a different port (Global settings -> AI tab), specify the same number on the command line.
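A quick way to sanity-check that something is actually listening on the configured port is a plain TCP connection test. This is just a sketch: an open port proves reachability, not that the detection endpoint itself is healthy.

```python
import socket

def deepstack_listening(host="127.0.0.1", port=82, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds,
    i.e. something (hopefully DeepStack's server.exe) is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False
```

Use the same host and port you entered in Global settings -> AI.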
Gotcha 7: DeepStack is running but BI sometimes reports the server as not running (2021-May-19 update).
The logs below show a successful DeepStack response at 1:09:15. Then at 1:09:46 BI states DeepStack is unreachable, even though the user confirmed DeepStack was still running.
5/12/2021 1:09:46 PM Oby1Cam: DeepStack: Not running or unreachable (84)
5/12/2021 1:09:45 PM Oby1Cam: MOTION (79)
5/12/2021 1:09:15 PM Ofy1Cam: DeepStack: Alert cancelled (nothing found) (83)
5/12/2021 1:09:09 PM Ofy1Cam: MOTION (75)
First confirm all Software Security Exemptions (e.g. Windows Defender) have been applied. The Windows Tuning Article provides details.
The unreachable message occurs if DS does not respond on the IP:port that you specify. If this is not due to issues with security software, perhaps DS is having other issues. I do see an occasional DS state where it is not responding.
BI is now detecting Deepstack not running or unreachable states and restarts DeepStack behind the scenes (assuming BI is set to start/stop DeepStack). This should fix the issue without any user interaction needed.
Gotcha 6: BI Status Log Error - DeepStack not running or unreachable
If BI cannot communicate with DeepStack, BI will let you know.
Assuming you have Auto start/stop with BI enabled, the simple fix is to stop DeepStack if it is running and then restart BI.
Gotcha 5: High CPU usage with DeepStack.
AI / DeepStack is very computationally heavy. If your CPU usage jumps and you don't have a GPU, the best you can do is throttle the calls to DeepStack.
- Turn on Deepstack only for outdoor cameras. BI motion settings are usually fine for indoor cameras.
- Reduce number of images sampled per alert.
- Tighten your BI motion sensor settings so cameras do not trigger an alert so frequently.
Gotcha 4: DeepStack missing obvious objects.
Shout out to our users for sharing issues and solutions!
For some reason, the BlueIris software is ignoring what the Deepstack server says in terms of confidence and the label. This seems to happen during the second, third or sometimes fourth image that Deepstack analyzes. I have confirmed with a packet capture that despite these additional images being sent back to the BlueIris software with a successful response, BlueIris ignores it and goes to cancel the alert. I confirm these images meet the criteria of the minimum confidence percentage and "to confirm". I would like to also mention that if DeepStack responds back with a prediction that is within the criteria on the first image, it works OK.
Fix: I just needed to change the keyframe interval to be the same value as the FPS so it is equal to 1.00 (as displayed from Blue Iris). The detection is working much better now. See Camera stream optimization article for details.
Alternative fix: If for some reason you cannot adjust the key frame interval on the camera, identify the current key frame interval (Status -> Cameras tab). If the value is 0.25, i.e. 1/4, i.e. 1 key frame every 4s, then set the pre-trigger buffer (Camera settings -> Record tab) greater than or equal to 4s. This will guarantee BI at least one key frame to process.
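The arithmetic behind the alternative fix can be sketched as follows (0.25 is the example key frame rate from above):

```python
def min_pretrigger_seconds(key_frame_rate):
    """Given the key frame rate shown in Status -> Cameras (key frames
    per second), return the smallest pre-trigger buffer in seconds
    that guarantees BI at least one key frame."""
    return 1.0 / key_frame_rate

# A rate of 0.25 means one key frame every 4 seconds, so the
# pre-trigger buffer (Camera settings -> Record tab) should be >= 4 s.
```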
Gotcha 3: DeepStack always returning nothing found. (Another shout out to the community for making everyone smarter)
This could mean the DeepStack server is returning server errors and BI is just reporting back "nothing found".
BI will eventually report DeepStack errors in the Status->Log to make users aware of such issues more easily.
Until then, one user creatively used Postman to identify the issue. If you are a developer, you are probably familiar with Postman; it's a very popular tool for testing and debugging APIs. If you are new to it, there is plenty of content on the internet explaining how to use it. Using Postman, the user discovered that object detection requests sent to DeepStack were returning 500 error codes. A 500 error means the server had a problem processing the request.
FYI, a 400 error means the request you are sending to DeepStack is malformed. Make sure your requests are correct before concluding the server is at fault.
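If you would rather script the check than use Postman, here is a minimal standard-library sketch. It assumes DeepStack's default detection endpoint, /v1/vision/detection on port 82; adjust the URL to your setup:

```python
import urllib.error
import urllib.request

def probe_detection(url="http://127.0.0.1:82/v1/vision/detection"):
    """POST an empty request to the detection endpoint and return the
    HTTP status code. A broken install tends to answer 500; a healthy
    server typically rejects the empty request with a 4xx instead."""
    req = urllib.request.Request(url, data=b"", method="POST")
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify_status(code):
    """Map a status code to the interpretation described above."""
    if code == 200:
        return "ok"
    if 400 <= code < 500:
        return "bad request - fix the request you are sending"
    if code >= 500:
        return "server error - DeepStack itself has a problem"
    return "unexpected status"
```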
Once the user realized their DeepStack installation was not working correctly, they reinstalled DeepStack. DeepStack does provide error logs at "C:\Users\[Username]\AppData\Local\DeepStack\logs" (%LOCALAPPDATA%\DeepStack\logs). Understanding the logs was challenging, so the best course of action was to try reinstalling DeepStack.
User also noted uninstalling Deepstack may not flush out all the files. The user went back into the installation directory (C:\DeepStack) and manually deleted any remaining files.
Furthermore, the user noted it's good to go into Task Manager and make sure no old DeepStack processes are still running. Kill any of the following processes if they exist: server.exe, python.exe, redis-server.exe.
FYI, installing / using AI Tool for debugging may be worthwhile as BI continues to build out the debugging capabilities. In AI Tool, error messages will pop up right away if old Deepstack versions are still running.
Another clue that DeepStack is not running properly: run DeepStack manually and observe the response times in the console. If DeepStack takes a minute to respond, something is probably wrong.
Gotcha 2: Front door camera
Front door view has the road and therefore a lot of unnecessary alerts. All you really want are people coming to the house or property.
So you create a zone in front of your house to reduce the noise from the road.
No alerts! The problem is the camera triggers when a person's feet cross Zone B. The image processed by DeepStack is just the feet! So DeepStack will often return nothing found.
The fix! Make sure to always define a Zone A which is the overall area of interest on the camera. The image processed by Deepstack is the overlap of all the zones, thus the entire person is sent to Deepstack for processing and the appropriate alert is sent.
One important point that I forgot to mention in the webinar, but which came up in the Q&A, is how to tell BI to only be concerned with motion in Zone B. Simple, see below. By setting "Object crosses zones" to B, I am telling BI to only worry about motion in Zone B. Notice that with AI, I uncheck the "Object travels" and "Object size exceeds" settings. With AI, the motion sensor settings can usually be simplified!
Gotcha 1: Person detection & Facial detection
Keep in mind the current implementation of DeepStack runs in series, not in parallel. So if you have both activated in DeepStack, DeepStack will alert on person detection first and never get to facial recognition. Most people would expect DeepStack to first detect the person and then proceed to recognize the face, but this does not seem to be the case.
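The series behavior can be illustrated with a toy sketch (hypothetical names, not BI's or DeepStack's actual code): the pipeline stops at the first detector that fires, so facial recognition is never reached when person detection succeeds.

```python
def run_in_series(frame, detectors):
    """Run detectors in order and return the name of the first one
    that fires, or None. Mirrors the observed series behavior."""
    for name, fires in detectors:
        if fires(frame):
            return name
    return None

# Toy stand-ins for DeepStack's modules: a "frame" here is just a
# set of labels that would be found in it.
pipeline = [
    ("person detection", lambda frame: "person" in frame),
    ("facial recognition", lambda frame: "face" in frame),
]
```

With both modules active, a frame containing a person and a face still only reports person detection; facial recognition fires only when person detection does not.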
Person detected in the image below even though both person detection and facial recognition were activated.
With just facial recognition activated.