Blue Iris integrated directly with the DeepStack open-source AI project in the 5.4.0 release. This is an on-premises-only AI solution requested by many users.
DeepStack (deepstack.cc) started as a private AI company (December 2018) located in Nigeria and recently open-sourced its work (March 2019).
If you prefer to listen instead of read, watch the DeepStack AI webinar associated with this article.
The DeepStack Fine Tuning article referenced in the Fine tuning section explains the DeepStack analysis feature, which makes troubleshooting AI alerts much easier.
The DeepStack Gotchas article collects all the lessons learned from past tickets.
The value of AI in surveillance is that it improves object-detection accuracy and reduces false alerts. AI draws attention to activity that matters by identifying objects a human could miss through inattention or because they are hard to see at night. AI also suppresses events that do not matter, such as shadows moving in the wind, changes in sunlight, flies, birds, squirrels, rain, etc. AI is becoming a very effective tool for outdoor cameras.
Alerts list Icons:
Green check: AI confirmation, i.e. the AI found an object, but there is no unique icon for that object type.
Flag: Same Flag function as before.
There is a user setting to automatically flag confirmed AI alerts.
Camera settings -> Trigger tab -> AI button -> Auto-flag confirmed alerts (Default is unselected)
Cancelled symbol: nothing found
Occupied State / Static Object
A static object is an object that is NOT moving. The most common use case is avoiding repeated triggers on a parked car.
BI already ships with advanced alerting that requires motion to overlap a detected object before sending an alert. This eliminates the majority of false alerts caused by static objects such as parked cars.
Static objects will only have a blue annotation.
There still exists some stubborn scenes where Occupied State provides even more value. For example, a parked car in the driveway underneath a tree for shade could be challenging. The shadows from the tree could trigger motion and the motion boxes could very well overlap with the car causing alerts. Occupied state should prevent such alerts based on logic described below.
Occupied state has the timer icon AND uses the blue annotation.
Logic behind Occupied State
Same object type: +/- 10% confidence level
Same location: +/- 5%
Same object color (removed because many cameras go black&white, i.e. IR, at night. v184.108.40.206)
Occupied state list: BI maintains two lists during an alert.
- Current alert list: objects identified in current alert
- Static object list: A list of objects that it perceives to be static, e.g. a parked car, based off of past alerts.
When an alert occurs, each object in the current alert list is first compared against all the objects in the static object list. If an object in the current alert matches any object in the static object list, then the object is considered static, e.g. a parked car, and BI will NOT fire an alert based on this object. The static object icon is displayed if all objects are determined to be static.
Identifying new objects
If an object on the current alert list does NOT match any object in the static object list an alert is fired. The new object will be placed in the static object list. This is how the software distinguishes between parked cars and cars that drove by. This is also how the static object list gets updated.
Removing static objects
Static objects continue to remain in the static object list if the object is matched. With every match, the timestamp on the object is updated.
If an alert arrives and the static object list has objects that were NOT matched by any of the current alert objects AND those static objects are older than 30s, those objects are purged from the static object list. We know AI is not perfect, which is why we created the 30s window. A static object detected in one frame may not be detected in another frame due to lighting changes, etc. To smooth over these inaccuracies, we assume that as the camera continues to alert, at least one frame within the 30s window will identify the static object correctly.
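The matching and purge rules above can be sketched in Python. This is a hypothetical illustration, not BI's actual code (the field names and data structures are mine); the ±10% confidence tolerance, ±5% location tolerance, and 30s purge window come from the description above.

```python
CONF_TOL = 0.10   # same object type: +/- 10% confidence
LOC_TOL = 0.05    # same location: +/- 5%
PURGE_AGE = 30    # seconds an unmatched static object survives

def matches(obj, static_obj):
    """An alert object matches a static object when the label agrees and
    confidence/position fall within the tolerances described above."""
    return (obj["label"] == static_obj["label"]
            and abs(obj["conf"] - static_obj["conf"]) <= CONF_TOL
            and abs(obj["x"] - static_obj["x"]) <= LOC_TOL
            and abs(obj["y"] - static_obj["y"]) <= LOC_TOL)

def process_alert(alert_objs, static_objs, now):
    """Return the objects that should fire an alert; update static_objs
    in place (refresh timestamps, add new objects, purge stale ones)."""
    fire = []
    for obj in alert_objs:
        match = next((s for s in static_objs if matches(obj, s)), None)
        if match is not None:
            match["seen"] = now              # timestamp refreshed on every match
        else:
            fire.append(obj)                 # new object -> fire an alert
            static_objs.append({**obj, "seen": now})
    # Purge static objects that were not matched within the 30s window.
    static_objs[:] = [s for s in static_objs if now - s["seen"] <= PURGE_AGE]
    return fire
```

With this sketch, a parked car fires once, is suppressed on subsequent alerts, and falls out of the static list 30s after it stops being matched.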
Static Objects is a setting that can be turned off if necessary (i.e. if you want more sensitivity). This may be a good idea in high-traffic areas, such as a camera facing a door or a car entry gate.
Annotation colors:
- Orange: High level of confidence (>67%).
- Yellow: Confidence level <67%, but meets your threshold, e.g. 50%.
- Blue: Static object/Occupied state. See above for details.
- Red: Cancelled alert based on "To cancel" field in Camera settings.
- Green: Objects of no interest.
For example, a boat appeared in the scene, but "boat" is not listed as an object of interest.
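As a rough sketch, the color rules above could be expressed as follows. This is a hypothetical helper, not BI code; the 67% cutoff and the per-camera threshold (e.g. 50%) come from the list above.

```python
def annotation_color(confidence, threshold=0.50, is_static=False,
                     cancelled=False, of_interest=True):
    """Map a detection to the annotation color described above."""
    if cancelled:
        return "red"      # cancelled via the camera's "To cancel" field
    if is_static:
        return "blue"     # static object / occupied state
    if not of_interest:
        return "green"    # detected, but not an object of interest
    if confidence > 0.67:
        return "orange"   # high confidence
    if confidence >= threshold:
        return "yellow"   # meets your threshold but below 67%
    return None           # below threshold: no annotation
```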
Alerts or No alerts on specific objects
The Action Map for alerts has been extended so you can trigger or skip alerts on specific objects. You can finally get the alert when the dog jumps on the couch!
Particularly handy if you are using facial recognition and want to be alerted only on unrecognized faces. Use the tag "unknown".
*** Checkpoint: Understand advantages of AI and the BI/Deepstack User Experience ***
DeepStack AI Installation
Depending on your machine, you can choose among the many Windows installation options. Below is the link to the Windows installation page. The CPU version is very easy to install: just run the installer.
Gotcha!: The DeepStack website needs some cleanup. There are pages on the site that point to a legacy installer that does not work! Stick to the URL above and you will be fine. In particular, the incorrect installer name is DeepStack.Installer.exe. My hunch is this version was created around 2019 before the company decided to open source their software. Another clue you have the wrong version is when you bring up the DeepStack splash page, Activation Key information is part of the page.
If you choose the wrong version, you will find that BI cannot start DeepStack automatically, nor can it communicate with the server; you will see errors like those below in Status -> Log.
Other useful DeepStack resources:
According to the documentation, the GPU installation also requires CUDA 10.1 and cuDNN. The latest version of CUDA is 11.2. I took a chance with CUDA 11.2 and it seems to be working fine for me. I will share issues if I come across any. For those who are curious, CUDA is an API created by NVidia to make parallel programming on NVidia GPUs easier.
Run DeepStack manually to confirm Installation.
I prefer to work in stages. For me a good first step is to install/run DeepStack manually. To start DeepStack manually, from a CMD prompt run the below command.
Code: Select all
deepstack --VISION-DETECTION True --PORT 82
In addition to the console, you also get confirmation via the DeepStack browser confirmation page. From a browser, go to localhost:82 (or whichever port you chose for the server).
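If you would rather script the check, DeepStack exposes a simple REST API: a POST of an image to /v1/vision/detection returns JSON with a list of predictions. Below is a minimal sketch of parsing such a response; the sample values are made up for illustration, not from a real run.

```python
import json

# Illustrative response from POST http://localhost:82/v1/vision/detection
sample = '''{
  "success": true,
  "predictions": [
    {"label": "car", "confidence": 0.86,
     "x_min": 120, "y_min": 310, "x_max": 540, "y_max": 620}
  ]
}'''

def summarize(body):
    """Return (label, confidence) pairs from a DeepStack detection response."""
    data = json.loads(body)
    if not data.get("success"):
        return []
    return [(p["label"], p["confidence"]) for p in data["predictions"]]

print(summarize(sample))  # [('car', 0.86)]
```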
If DeepStack is running correctly, connect BI to DeepStack to confirm BI and DeepStack can work together. To do so, edit Global settings -> AI tab. The below settings tell BI where the DeepStack server is running, i.e. on the same machine on port 82. With this information, BI is able to send AI requests to DeepStack.
Checklist: Confirm whether BI can communicate successfully with the DeepStack server.
- Global settings -> AI tab:
Select "Use DeepStack server on IP/port". (see above)
Specify which port to use for DeepStack; 82 is the default. As long as it is NOT the same port as the BI web server, you are fine. I'm assuming you installed DeepStack on your BI machine, thus IP Address = 127.0.0.1. BI does allow you to run DeepStack on a separate server.
- DO NOT check Auto start/stop with Blue Iris.
- Jump to Blue Iris AI Camera settings section below and turn on DeepStack for one camera.
- Check for any errors in Status -> log.
When working correctly, the motion event in the log should be followed by a DeepStack response as seen below.
- Check the DeepStack console as well. Are requests / response being processed by DeepStack?
BI Manages DeepStack
Once you confirmed BI/DeepStack communication, you can consider tightly connecting BI to DeepStack by activating "Auto start/stop" via Global settings -> AI tab as seen below.
Checklist: Now you can automate start/stop of DeepStack from BI.
- Check "Auto start/stop with Blue Iris".
Best to install DeepStack in default location (C:\DeepStack). If you choose a different location, please specify accordingly.
- Stop the DeepStack server you started manually. You may need to restart the machine to make sure all processes run by DeepStack are shut down.
- Hit Start now.
- Use the "Test in browser" link to confirm DeepStack is running.
- Similar to above, confirm one camera is processing DeepStack motion events properly.
- If you want to play with Facial recognition, you can activate the feature as well.
Bring up task manager. If you see the below processes running, you know DeepStack is running.
server.exe and redis-server.exe must both be running.
You will also see one or more python processes; whether one, two, or more are created appears to depend on load.
It is feasible to run BI on one machine and DeepStack on a completely separate machine. A separate machine could mean completely separate hardware, a VM, or a Docker container.
One user stated he did so successfully by running a Docker container in an Ubuntu 18.04 VM. In his implementation the IP Address for the Docker container was 10.32.1.9.
The docker command was:
Code: Select all
docker run -d \
  --name=deepstack \
  -p 80:5000 \
  -v /opt/deepstack-storage:/datastore:rw \
  -e "VISION-DETECTION=True" \
  -e "VISION-FACE=True" \
  deepquestai/deepstack
Once set up, similar to the non-distributed system above, check/confirm that you can access the DeepStack confirmation page from a browser on the BI server (the "Test in browser" link). In BI, the AI settings based on the Docker IP Address mentioned above would be:
If you see traffic on the AI machine but no objects in BI, you likely did not enable the vision detection when starting DeepStack.
*** Checkpoint: DeepStack is installed and can communicate with the BI server! ***
Blue Iris AI Camera settings
Camera settings -> Trigger tab -> Artificial Intelligence
List of Objects Available for detection
Best to check deepstack.cc for current list of objects.
Code: Select all
person, bicycle, car, motorcycle, airplane, bus, train, truck, boat, traffic light, fire hydrant, stop sign, parking meter, bench, bird, cat, dog, horse, sheep, cow, elephant, bear, zebra, giraffe, backpack, umbrella, handbag, tie, suitcase, frisbee, skis, snowboard, sports ball, kite, baseball bat, baseball glove, skateboard, surfboard, tennis racket, bottle, wine glass, cup, fork, knife, spoon, bowl, banana, apple, sandwich, orange, broccoli, carrot, hot dog, pizza, donut, cake, chair, couch, potted plant, bed, dining table, toilet, tv, laptop, mouse, remote, keyboard, cell phone, microwave, oven, toaster, sink, refrigerator, book, clock, vase, scissors, teddy bear, hair dryer, toothbrush.
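To illustrate how the object list interacts with alerting, here is a trivial sketch of an object-of-interest filter. This is a hypothetical helper, not BI code; in BI you configure the watched objects in the camera's AI settings.

```python
def objects_of_interest(predictions, watched):
    """Keep only detections whose label is in the watched set
    (case-insensitive, whitespace-tolerant)."""
    watched = {w.strip().lower() for w in watched}
    return [p for p in predictions if p["label"].lower() in watched]

detections = [{"label": "person", "confidence": 0.91},
              {"label": "boat", "confidence": 0.88}]

# Only the person survives; the boat would get a green annotation
# (detected, but not an object of interest).
print(objects_of_interest(detections, ["person", "car"]))
```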
Motion triggers vs Alerts
With Artificial Intelligence there is now a wider gap between Motion Triggers, e.g. the camera triggered because a deer walked across the lawn, and an Alert, i.e. the AI correctly did not recognize the deer as a person, and therefore cancelled the alert.
The AI setting "Hide cancelled alerts on timeline and 'all alerts'" is on by default, i.e. most users after setting up their Motion and AI, do not want to be bothered by the "nothing found" alerts, because that is why they installed AI in the first place.
However, I do see users in the countryside who like to record wildlife such as bears, coyotes, and deer when they approach their property. BI AI will cancel the Alert; however, motion triggers are still always recorded. This implementation was chosen so that, just in case the AI missed something, you still have the motion trigger recording. It turns out this implementation provides an added benefit for countryside users who like to capture recordings of wildlife. "Hide cancelled alerts on timeline and 'all alerts'" can be unselected if you want "nothing found" alerts still listed in the Alerts folder. Furthermore, cancelled alerts are always listed in the cancelled alerts folder regardless of this setting.
*** Checkpoint: DeepStack Configuration complete. BI server sending meaningful alerts based on AI. ***
See DeepStack Fine Tuning Settings article.
The example below illustrates the limitations of AI. With the "Burn label mark-up onto alert images" AI setting and Add to alerts list = Hi-res JPEG files in the Trigger tab, it is easy to get the trigger image, as seen below, from the Alerts folder. Right-click on the alert in the clips list -> Open containing folder. Windows Explorer should point to the file containing the trigger image.
You can see for yourself in the above image what is going on. BI motion sensors identified the moving object, no problem. DeepStack said this is not a car (no annotation).
If DeepStack saw a car (a different alert below), you would see the annotation. The overlap of the DeepStack object and the motion-detection object (turn on Motion Highlight: Rectangles or Highlight / Rectangle) further confirms the static object test: it is a moving car, not a parked car, because the motion box and the DeepStack object overlap.
Does this mean I am going to resign myself to missed alerts due to headlights? Heck no! I'm going to tweak my Motion settings in order to try capturing motion much further up the street. This will allow me to get more samples of the car for DeepStack to analyze and hopefully alert on cars with headlights on. The power of Motion sensors with AI!
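The overlap test described above (motion rectangle vs. DeepStack object rectangle) amounts to a simple axis-aligned intersection check. A sketch, with made-up coordinates:

```python
def rects_overlap(a, b):
    """True if two (x_min, y_min, x_max, y_max) rectangles intersect."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

# A motion box overlapping a DeepStack car box suggests a moving car;
# no overlap suggests the detected car is parked (static).
motion = (100, 200, 300, 400)
car = (250, 350, 500, 600)
print(rects_overlap(motion, car))  # True
```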
The Gotchas article shares lessons learned from past tickets to help you avoid the same mistakes. See the DeepStack Gotchas article.
Next steps / Submit a ticket
This article is largely about installing and running DeepStack.
It also explains the integration and the UI/annotations you will now see with your alerts.
If you are having issues with installing or running DeepStack, it is best to go to the Gotchas article to see if others have seen and resolved the same issue.
DeepStack is an open source 3rd party AI solution. Issues running it are probably better addressed by experts on the DeepStack forum.
If you have questions about DeepStack alerts and why alerts are being confirmed / cancelled, first review the Fine tuning article.
This article explains the DeepStack analysis feature and how to understand what BI is doing.
If you still wish to open a ticket, provide the following information:
- Describe the issue. Which installation did you choose? GPU? CPU? Docker?
- Status -> Log errors
- Screenshot of Global settings -> AI tab.
- Any steps you have taken so far to resolve the issue. This will help us gain insight as to the problem.
- Camera settings. Camera settings -> General tab -> Export
- (if relevant / optional) Status -> Camera tab. Need to know the health of the camera streams.