Blue Iris integrated directly with the DeepStack open-source AI project in the 5.4.0 release. This is an on-prem-only AI solution requested by many users.
DeepStack (deepstack.cc) was a private AI company (December 2018) located in Nigeria that recently open-sourced its work (March 2019).
If you prefer to watch the webinar associated with this article, check out our YouTube channel. Webinar name: Deepstack AI.
The value of AI in surveillance is that it improves object-detection accuracy and reduces false alerts. AI draws attention to activity that matters by identifying objects a human might miss through lapses in attention or poor visibility at night. AI also suppresses events that do not matter, such as shadows moving in the wind, changes in sunlight, flies, birds, squirrels, rain, etc. AI is becoming a very effective tool for outdoor cameras.
- Green check: AI confirmation, i.e. AI found an object, but there is no unique icon for the object type.
- Flag: Used to easily filter the clip list based on trigger events with AI confirmation.
- Cancelled symbol: nothing found.
- Occupied State
BI already comes with advanced alerting that requires motion to overlap detected objects before sending an alert. This eliminates the majority of false alerts caused by static objects such as parked cars. However, some stubborn scenes remain where Occupied State provides even more value. For example, a car parked in the driveway under a tree for shade can be challenging: shadows from the tree can trigger motion, and the motion boxes may well overlap the car, causing alerts. Occupied State should prevent such alerts based on the logic described below.
Logic behind Occupied State
Same object type: +/- 10% confidence level
Same location: +/- 5%
Time limit: 1 hour
Same object color
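BI's internal implementation is not public, but the heuristics listed above can be sketched roughly as follows. The `Detection` class and `looks_occupied` function are illustrative names of mine, not part of Blue Iris:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # object type, e.g. "car"
    confidence: float  # 0.0 - 1.0
    x: float           # box position as a fraction of frame width
    y: float           # box position as a fraction of frame height
    timestamp: float   # seconds since epoch

def looks_occupied(prev: Detection, new: Detection,
                   conf_tol: float = 0.10,   # +/- 10% confidence
                   pos_tol: float = 0.05,    # +/- 5% location
                   max_age: float = 3600.0   # 1 hour time limit
                   ) -> bool:
    """Return True if `new` is likely the same static object as `prev`."""
    return (new.label == prev.label
            and abs(new.confidence - prev.confidence) <= conf_tol
            and abs(new.x - prev.x) <= pos_tol
            and abs(new.y - prev.y) <= pos_tol
            and (new.timestamp - prev.timestamp) <= max_age)
```

A detection matching all four tests would be treated as an already-occupied position and not alerted on; object color (not shown) was added as a further check.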
Gotcha: Sometimes Occupied State results in false cancellations. In the image above, the second car is actually a different car, so an alert should have been sent. Object color was recently added as another heuristic, making Occupied State even smarter.
Static Objects is a setting that can be turned off if necessary. If you do turn it off, please let us know the issue as well so we can make adjustments or improve the software.
- Orange: High level of confidence.
- Yellow: Confidence level <67%
- Blue: Occupied state. See above for details.
- Red: Cancelled alert based on "To cancel" field in Camera settings.
Alerts or No alerts on specific objects
The Action Map for alerts has been extended so you can trigger or skip alerts on specific objects. You can finally get an alert when the dog jumps on the couch!
This is particularly handy if you are using facial recognition and want to be alerted only on unrecognized faces. Use the tag "unknown".
*** Checkpoint: Understand advantages of AI and the BI/Deepstack User Experience ***
Deepstack AI Installation
Depending on your machine, you can choose among the several Windows installation options. Below is the link to the Windows installation page. The CPU version is very easy to install: just run the installer.
Gotcha!: The DeepStack website needs some cleanup. Some pages on the site point to a legacy installer that does not work! Stick to the URL above and you will be fine. In particular, the incorrect installer is named DeepStack.Installer.exe. My hunch is this version was created around 2019, before the company decided to open-source its software. Another clue that you have the wrong version: when you bring up the DeepStack splash page, Activation Key information is part of the page.
If you choose the wrong version, you will find that BI cannot start DeepStack automatically, nor can it communicate with the server; you will see errors like those below in Status -> Log.
Other useful DeepStack resources:
According to the documentation, the GPU installation also requires CUDA 10.1 and cuDNN. The latest version of CUDA is 11.2. I took a chance with CUDA 11.2 and it seems to be working fine for me; I will share issues if I come across any. For those who are curious, CUDA is an API created by NVIDIA to make parallel programming on NVIDIA GPUs easier.
Run DeepStack Manually to Confirm Installation
I prefer to work in stages. For me a good first step is to install/run DeepStack manually. To start DeepStack manually, from a CMD prompt run the below command.
Code: Select all
deepstack --VISION-DETECTION True --PORT 82
In addition to the console, you also get confirmation via the DeepStack browser confirmation page. From a browser go to localhost:82 (or port chosen to run the server)
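If you prefer to script this check rather than open a browser, a minimal sketch follows. The `deepstack_is_up` helper is my own name, not part of BI or DeepStack; it simply fetches the splash page:

```python
import urllib.request

def deepstack_is_up(host: str = "127.0.0.1", port: int = 82,
                    timeout: float = 3.0) -> bool:
    """Return True if the DeepStack splash page responds with HTTP 200."""
    try:
        with urllib.request.urlopen(f"http://{host}:{port}/",
                                    timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, DNS failure, etc.
        return False
```

Run it with the port you chose when starting the server; a `False` result means DeepStack is not reachable at that address.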
If DeepStack is running correctly, connect BI to DeepStack to confirm BI and DeepStack can work together. To do so, edit Global settings -> AI tab. The below settings tell BI where the DeepStack server is running, i.e. on the same machine on port 82. With this information, BI is able to send AI requests to DeepStack.
Checklist: Confirm whether BI can communicate successfully with the DeepStack server.
- Global settings -> AI tab:
Select "Use DeepStack server on IP/port". (see above)
Specify which port DeepStack should use. 82 is the default; as long as it is NOT the same port as the BI web server, you are fine. I'm assuming you installed DeepStack on your BI machine, thus IP Address = 127.0.0.1. BI does allow DeepStack to run on a separate server.
- DO NOT check Auto start/stop with Blue Iris.
- Jump to Blue Iris AI Camera settings section below and turn on DeepStack for one camera.
- Check for any errors in Status -> log.
When working correctly, the motion event in the log should be followed by a DeepStack response as seen below.
- Check the DeepStack console as well. Are requests / response being processed by DeepStack?
BI Manages DeepStack
Once you have confirmed BI/DeepStack communication, you can consider tightly coupling BI to DeepStack by activating "Auto start/stop" via Global settings -> AI tab, as seen below.
Checklist: Now you can automate start/stop of DeepStack from BI.
- Check "Auto start/stop with Blue Iris".
It is best to install DeepStack in the default location (C:\DeepStack). If you chose a different location, please specify it accordingly.
- Stop the DeepStack server you started manually. You may need to restart the machine just to make sure all processes run by DeepStack are shut down.
- Hit Start now.
- Use the "Test in browser" link to confirm DeepStack is running.
- Similar to above, confirm one camera is processing DeepStack motion events properly.
- If you want to play with Facial recognition, you can activate the feature as well.
It is feasible to run BI on one machine and DeepStack on a completely separate machine. A separate machine could be completely separate hardware, a VM, or a Docker container.
One user stated he did so successfully by running a Docker container in an Ubuntu 18.04 VM. In his implementation the IP Address for the Docker container was 10.32.1.9.
The docker command was:
Code: Select all
docker run -d \
  --name=deepstack \
  -p 80:5000 \
  -v /opt/deepstack-storage:/datastore:rw \
  -e "VISION-DETECTION=True" \
  -e "VISION-FACE=True" \
  deepquestai/deepstack
Once setup, similar to the non-distributed system (above), check/confirm that you can access the Deepstack confirmation page from a browser on the BI server (Test in browser link). In BI, the AI settings based on the Docker IP Address mentioned above would be:
If you see traffic on the AI machine but no objects in BI, you likely did not enable the vision detection when starting DeepStack.
*** Checkpoint: Deepstack is installed and can communicate with BI server! ***
Blue Iris AI Camera settings
Camera settings -> Trigger tab -> Artificial Intelligence
List of Objects Available for detection
Best to check deepstack.cc for current list of objects.
Code: Select all
person, bicycle, car, motorcycle, airplane, bus, train, truck, boat, traffic light, fire hydrant, stop sign, parking meter, bench, bird, cat, dog, horse, sheep, cow, elephant, bear, zebra, giraffe, backpack, umbrella, handbag, tie, suitcase, frisbee, skis, snowboard, sports ball, kite, baseball bat, baseball glove, skateboard, surfboard, tennis racket, bottle, wine glass, cup, fork, knife, spoon, bowl, banana, apple, sandwich, orange, broccoli, carrot, hot dog, pizza, donut, cake, chair, couch, potted plant, bed, dining table, toilet, tv, laptop, mouse, remote, keyboard, cell phone, microwave, oven, toaster, sink, refrigerator, book, clock, vase, scissors, teddy bear, hair dryer, toothbrush.
Motion triggers vs Alerts
With Artificial Intelligence there is now a wider gap between a Motion Trigger (e.g. the camera triggered because a deer walked across the lawn) and an Alert (the AI correctly determined the deer was not a person and therefore cancelled the alert).
The AI setting "Hide cancelled alerts on timeline and 'all alerts'" is on by default; most users, after setting up their Motion and AI, do not want to be bothered by "nothing found" alerts, because avoiding them is why they installed AI in the first place.
However, I do see users in the countryside who like to record wildlife such as bears, coyotes, and deer when they approach the property. BI AI will cancel the Alert, but motion triggers are still always recorded. This implementation was chosen so that, if the AI misses something, you still have the motion-trigger recording. It turns out this also benefits the countryside users who want to capture recordings of wildlife. Unselect "Hide cancelled alerts on timeline and 'all alerts'" if you want "nothing found" alerts to still be listed in the Alerts folder. Furthermore, cancelled alerts are always listed in the Cancelled alerts folder regardless of that setting.
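BI's exact precedence rules between the "To confirm" and "To cancel" object lists are internal to the product, but the basic trigger-vs-alert decision can be sketched like this (function name, signature, and tie-breaking order are illustrative assumptions, not BI's actual logic):

```python
def decide_alert(predictions, to_confirm, to_cancel, min_conf=0.67):
    """Rough sketch: decide whether a motion trigger becomes an alert.

    predictions: list of (label, confidence) pairs from the AI server.
    to_confirm / to_cancel: object labels, mirroring the camera AI fields.
    """
    # Keep only detections above the confidence threshold.
    labels = {lbl for lbl, conf in predictions if conf >= min_conf}
    if labels & set(to_cancel):
        return "cancel"          # a cancelling object was seen
    if labels & set(to_confirm):
        return "alert"           # a confirming object was seen
    return "cancel"              # "nothing found" -> cancelled alert
```

In this sketch the motion trigger and its recording always exist; only the alert is confirmed or cancelled, matching the behavior described above.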
*** Checkpoint: DeepStack Configuration complete. BI server sending meaningful alerts based on AI. ***
Fine tuning / Trouble-shooting
Preferred DeepStack settings when Fine-tuning
- Save DeepStack analysis details turned on. *.dat files will now be created in the Alerts folder.
You can use the Status -> DeepStack tab to understand why alerts were sent or not sent. Especially helpful when trying to tune missed alerts, i.e. "nothing found".
Details further below.
- Turn off "Hide cancelled alerts on timeline and 'all alerts'". Camera settings -> Trigger tab -> Artificial Intelligence. It's much easier to find missed alerts through the clip list. Take advantage of BI functionality.
- Make sure "Burn label mark-up onto alert images" is on. Camera settings -> Trigger tab -> Artificial Intelligence. Default is on. Not sure why anyone would NOT want to see the AI annotations.
- Other optional settings I find useful at times
- Recording tab: Set recording to "When triggered". Uncheck "Combine or cut video each". This way a new BVR file is created for each trigger. Because you have a BVR per trigger event, it's easy to replay a missed or false alert, tweak your motion settings, and observe whether the tweaks improve your alerts. If you cannot figure out an issue, it is also easy to send the short BVR of the motion trigger to support for review!
- Trigger tab: Turn on Motion overlays. Camera settings -> Trigger tab -> Motion Sensor. Highlight: Rectangles or Highlight / Rectangle
With the motion objects and the DeepStack AI annotations, you can visually see the overlap and see why alerts were sent or not sent. Details below.
- Trigger tab: Set Add to alerts list = Hi-res JPEG files.
With the *.dat files now created with "Save DeepStack analysis details" option, I like seeing the files side by side in the Alerts folder. Makes identifying alerts of interest easier when viewing files in Windows Explorer.
Understanding why DeepStack did not alert has become a lot easier with the "Save DeepStack analysis" feature. Now BI can show you exactly which images (i.e. frames/samples) DeepStack processed when a motion trigger fired, so you can understand why an alert was or was not sent.
In order to get the feature to work:
- First check "Save DeepStack analysis" in Camera settings -> Trigger tab -> Artificial Intelligence. This setting will start creating *.dat files in the Alerts folder pertaining to the meta data for each motion trigger.
- Open the Status -> DeepStack window.
- Double click any motion trigger in the Clips List and the DeepStack Status window will populate with the DeepStack meta data making it much easier to understand what is going on.
This example highlights an alert that was cancelled by DeepStack. Double-clicking on an alert while the Status -> DeepStack window is open now populates the Status window with the data associated with the DeepStack analysis.
- The image has a wealth of information.
The black areas show the areas not covered by any zones.
The annotation shows what DeepStack identified.
The yellow motion highlight in the top right shows where the motion is coming from, i.e. trees blowing in the wind.
- The box to the left lists the 11 samples that were sent to DeepStack for processing. My BI settings are:
- Break time = 10s
- 10 images to sample
- 1s interval between samples
- Begin analysis with motion-leading edge
That is the motion-leading-edge image plus 10 more images starting at T + 0 (the trigger image), incrementing roughly every 1s (based on the settings), for 11 samples total.
The logs also provide data consistent with the DeepStack analysis data.
From the logs, motion was detected at 11:00:51.907. DeepStack cancelled the alert at 11:01:01.504 roughly 10s later.
The example below illustrates the limitations of AI. With the "Burn label mark-up onto alert images" AI setting and "Add to alerts list = Hi-res JPEG files" in the Trigger tab, it is easy to get the trigger image (seen below) from the Alerts folder. Right-click the alert in the clip list -> Open containing folder; Windows Explorer will point to the file containing the trigger image.
You can see for yourself in the above image what is going on. BI motion sensors identified the moving object, no problem. DeepStack, however, said this is not a car (no annotation).
If DeepStack saw a car (a different alert, below), you would see the annotation. The overlap of the DeepStack object and the motion-detection object (turn on Motion Highlight: Rectangles or Highlight / Rectangle) further confirms the static-object test: it is a moving car, not a parked car, because the motion and DeepStack objects overlap.
Does this mean I am going to resign myself to missed alerts due to headlights? Heck no! I'm going to tweak my Motion settings in order to try capturing motion much further up the street. This will allow me to get more samples of the car for DeepStack to analyze and hopefully alert on cars with headlights on. The power of Motion sensors with AI!
Analyze with DeepStack
Not as popular since the DeepStack analysis feature came out; however, if you want to see how DeepStack would examine a past recording, you can do so via the playback window. This feature is valuable if you are considering activating more of the objects provided by the DeepStack model, e.g. dog, and want to gauge the model's accuracy. It is also useful if you are creating your own custom models and want to gauge their accuracy.
- First remember to turn off any overlays which can cause confusion on the playback.
- Turn on Analyze with DeepStack in the playback window.
- Start playing the recording and see the DeepStack annotations on every frame.
A new debugging tool is now part of the arsenal for tech-savvy users. Ever wondered which program has a particular file or directory open? Now you can find out with Microsoft Process Explorer. Process Explorer shows you information about which handles and DLLs each process has opened or loaded.
https://docs.microsoft.com/en-us/sysint ... s-explorer
The link above provides instructions on how to download and install it (a zip file). The application comes with a Help file explaining the program. You can use this tool to see whether DeepStack is running: it should appear as "server.exe" beneath BlueIris.exe (assuming BI is set to start DeepStack) under services.
The Gotchas article shares learnings from past tickets to help you avoid the same mistakes. See the DeepStack Gotchas article.
Next steps / Submit a ticket
- Describe issue.
- Status -> Log errors
- Status -> Camera tab. Need to know the health of the camera streams.
- Any steps you have taken so far to resolve the issue. This will help us gain insight into the problem.
- Screenshot of Global settings -> AI tab.
- Screenshot of Alerts in Clips List so we understand what BI is sensing currently.
- Camera settings. Camera settings -> General tab -> Export