Advanced - Deepstack + Blue Iris


Post by varghesesa » Mon Apr 12, 2021 12:07 am


Blue Iris integrated directly with the DeepStack open-source AI project in the 5.4.0 release. This is an on-premises-only AI solution requested by many users.
DeepStack was a private AI company (December 2018) located in Nigeria that recently open-sourced their work (March 2019).
If you prefer to watch the webinar associated with this article, check out our YouTube channel. Webinar name: Deepstack AI.


The value of AI in surveillance is that it improves object-detection accuracy and reduces false alerts. AI draws attention to activity that matters by identifying objects a human might miss because of inattention or poor visibility at night. AI also suppresses events that do not matter, such as shadows moving in the wind, changes in sunlight, flies, birds, squirrels, rain, etc. AI is becoming a very effective tool for outdoor cameras.

User experience

ai-alerts_optimized.png

  • Person
  • Car
  • Green check: AI confirmation, i.e. AI found an object, but no unique icon for object.
  • Flag: Used to easily filter clip list based on trigger events with AI confirmation.
  • Cancelled symbol: nothing found
  • Occupied State
    occupied state.PNG
    BI already comes with advanced alerting by requiring motion to overlap detected objects before sending an alert. This eliminates the majority of false alerts due to static objects such as parked cars. However, some stubborn scenes remain where Occupied State provides even more value. For example, a car parked in the driveway underneath a tree for shade can be challenging: shadows from the tree can trigger motion, and the motion boxes may well overlap the car, causing alerts. Occupied State should prevent such alerts based on the logic described below.

    Logic behind Occupied State
    Same object type: +/- 10% confidence level
    Same location: +/- 5%
    Time limit: 1 hour
    Same object color

    Gotcha: Sometimes Occupied State results in false cancellations. In the image above, the second car is actually a different car, so it should have been alerted. Object color was recently added as another heuristic, so Occupied State is even smarter now.

    Static Objects is a setting that can be turned off if necessary. If you do, please let us know the issue as well so we can make adjustments or improve the software.
    static object.png
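The Occupied State logic listed above can be sketched as a simple comparison function. This is an illustrative approximation of the heuristics described, not BI's actual implementation; the `Detection` fields and function name are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str            # e.g. "car"
    confidence: float     # 0.0 - 1.0
    x: float              # normalized position, 0.0 - 1.0
    y: float
    color: str            # dominant object color
    timestamp: float      # seconds

def is_same_occupying_object(prev: Detection, cur: Detection) -> bool:
    """Rough sketch of the Occupied State heuristics described above."""
    if cur.label != prev.label:                       # same object type
        return False
    if abs(cur.confidence - prev.confidence) > 0.10:  # +/- 10% confidence level
        return False
    if abs(cur.x - prev.x) > 0.05 or abs(cur.y - prev.y) > 0.05:  # +/- 5% location
        return False
    if cur.timestamp - prev.timestamp > 3600:         # 1 hour time limit
        return False
    if cur.color != prev.color:                       # same object color
        return False
    return True
```

If the function returns True, the new detection is treated as the same parked object and the alert would be cancelled; note how the color check catches the "different car in the same spot" false cancellation described in the gotcha.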

Annotation colors
  • Orange: High level of confidence.
  • Yellow: Confidence level <67%
  • Blue: Occupied state. See above for details.
  • Red: Cancelled alert based on "To cancel" field in Camera settings.

Alerts or No alerts on specific objects

The Action Map for alerts has been extended so you can trigger or skip alerts on specific objects. You can finally get the alert when the dog jumps on the couch! :)
This is particularly handy if you are using facial recognition and want to be alerted only on unrecognized faces. Use the tag "unknown".

action map.PNG

*** Checkpoint: Understand advantages of AI and the BI/Deepstack User Experience ***


deepstack architecture.png

Deepstack AI Installation

Depending on your machine, you can choose among the many Windows installation options. Below is the link to the Windows installation page. The CPU version is very easy to install. Just run the installer.

Gotcha!: The DeepStack website needs some cleanup. Some pages on the site point to a legacy installer that does not work! Stick to the URL above and you will be fine. In particular, the incorrect installer is named DeepStack.Installer.exe. My hunch is this version was created around 2019, before the company decided to open source their software. Another clue that you have the wrong version: when you bring up the DeepStack splash page, Activation Key information is part of the page.

deepstack wrong activation page.PNG

If you choose the wrong version, you will find BI cannot start DeepStack automatically, and it cannot communicate with the server either, producing the errors below in Status -> Log.

deepstack error.PNG

Other useful DeepStack resources are available as well. I decided to use the GPU option because of my NVidia GPU.

According to the documentation, the GPU installation also requires CUDA 10.1 and cuDNN. The latest version of CUDA is 11.2. I took a chance with CUDA 11.2 and it seems to be working fine for me. I will share issues if I come across any. For those who are curious, CUDA is an API created by NVidia to make parallel programming on NVidia GPUs easier.

Run DeepStack Manually to Confirm Installation

I prefer to work in stages. For me a good first step is to install/run DeepStack manually. To start DeepStack manually, from a CMD prompt run the below command.

Code: Select all

deepstack --VISION-DETECTION True --PORT 82

The above command assumes you want to run DeepStack on port 82. Below is an image of the DeepStack console when the software is running correctly.

deepstack started.PNG

In addition to the console, you also get confirmation via the DeepStack browser confirmation page. From a browser, go to localhost:82 (or the port you chose when starting the server).
deepstack.PNG
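You can also exercise the detection endpoint outside of BI, e.g. `curl -X POST -F image=@test.jpg http://localhost:82/v1/vision/detection`. DeepStack answers with JSON containing a `success` flag and a list of `predictions`, each with a `label`, a `confidence`, and bounding-box coordinates. Below is a minimal sketch of filtering that response the way BI applies a minimum-confidence threshold; the response field names match DeepStack's documented format, but the helper function itself is my own illustration.

```python
import json

def confident_labels(response_text: str, min_confidence: float = 0.67):
    """Return (label, confidence) pairs above the confidence threshold."""
    body = json.loads(response_text)
    if not body.get("success"):
        return []  # server error or bad request: effectively "nothing found"
    return [(p["label"], p["confidence"])
            for p in body.get("predictions", [])
            if p["confidence"] >= min_confidence]

# Example response, shaped like DeepStack's detection output:
sample = '''{"success": true,
             "predictions": [
               {"label": "person", "confidence": 0.91,
                "x_min": 10, "y_min": 20, "x_max": 110, "y_max": 320},
               {"label": "dog", "confidence": 0.42,
                "x_min": 200, "y_min": 250, "x_max": 260, "y_max": 300}]}'''

print(confident_labels(sample))  # only the person clears the 67% bar
```

Being able to hit the endpoint directly like this is also useful later for the troubleshooting gotchas, where distinguishing "server down" from "server answering with errors" matters.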

If DeepStack is running correctly, connect BI to DeepStack to confirm BI and DeepStack can work together. To do so, edit Global settings -> AI tab. The below settings tell BI where the DeepStack server is running, i.e. on the same machine on port 82. With this information, BI is able to send AI requests to DeepStack.

deepstack connect.PNG

Checklist: Confirm whether BI can communicate successfully with the DeepStack server.
  • Global settings -> AI tab:
    Select "Use DeepStack server on IP/port". (see above)
    Specify which port to use for DeepStack. 82 is the default. As long as it is NOT the same port as the BI web server, you are fine. I'm assuming you installed DeepStack on your BI machine, so the IP address points to the local machine; BI does, however, allow running DeepStack on a separate server.
  • DO NOT check Auto start/stop with Blue Iris.
  • Jump to Blue Iris AI Camera settings section below and turn on DeepStack for one camera.
  • Check for any errors in Status -> log.
    When working correctly, the motion event in the log should be followed by a DeepStack response as seen below.
    deepstack status log.PNG
  • Check the DeepStack console as well. Are requests/responses being processed by DeepStack?
If connected correctly and you activated AI on one of your cameras (see below), you will see request/response messages in the DeepStack console as seen below. At this point you know DeepStack is installed and running correctly.
deepstack log.PNG

BI Manages DeepStack
Once you have confirmed BI/DeepStack communication, you can consider tightly connecting BI to DeepStack by activating "Auto start/stop" via Global settings -> AI tab, as seen below.

AI global settings.png

Checklist: Now you can automate start/stop of DeepStack from BI.
  • Check "Auto start/stop with Blue Iris".
    Best to install DeepStack in the default location (C:\DeepStack). If you choose a different location, specify it accordingly.
  • Stop the DeepStack server you started manually. You may need to restart the machine to make sure all processes run by DeepStack are shut down.
  • Hit Start now.
  • Use the Test in browser link to confirm DeepStack is running.
  • Similar to above, confirm one camera is processing DeepStack motion events properly.
  • If you want to play with Facial recognition, you can activate the feature as well.

Distributed Systems
It is feasible to run BI on one machine and DeepStack on a completely separate machine. A separate machine could mean completely separate hardware, a VM, or a Docker container.

One user stated he did so successfully by running a Docker container in an Ubuntu 18.04 VM, with BI pointed at the Docker container's IP address.

The docker command was:

Code: Select all

docker run -d \
        --name=deepstack \
        -p 80:5000 \
        -v /opt/deepstack-storage:/datastore:rw \
        -e "VISION-DETECTION=True" \
        -e "VISION-FACE=True" \
        deepquestai/deepstack

The image name on the last line (deepquestai/deepstack, the official Docker Hub image) was missing from the original command. /opt/deepstack-storage is the folder where Docker was allowed to permanently store things.

Once setup, similar to the non-distributed system (above), check/confirm that you can access the Deepstack confirmation page from a browser on the BI server (Test in browser link). In BI, the AI settings based on the Docker IP Address mentioned above would be:
deepstack-ai-settings.png

If you see traffic on the AI machine but no objects in BI, you likely did not enable the vision detection.

*** Checkpoint: Deepstack is installed and can communicate with BI server! ***

Blue Iris AI Camera settings

Camera settings -> Trigger tab -> Artificial Intelligence

ai-camera-settings_optimized.png

List of Objects Available for detection
Best to check the DeepStack documentation for the current list of objects.

Code: Select all

person, bicycle, car, motorcycle, airplane, bus, train, truck, boat,
traffic light, fire hydrant, stop sign, parking meter, bench, bird, cat,
dog, horse, sheep, cow, elephant, bear, zebra, giraffe, backpack, umbrella,
handbag, tie, suitcase, frisbee, skis, snowboard, sports ball, kite,
baseball bat, baseball glove, skateboard, surfboard, tennis racket, bottle,
wine glass, cup, fork, knife, spoon, bowl, banana, apple, sandwich, orange,
broccoli, carrot, hot dog, pizza, donut, cake, chair, couch, potted plant,
bed, dining table, toilet, tv, laptop, mouse, remote, keyboard, cell phone,
microwave, oven, toaster, sink, refrigerator, book, clock, vase, scissors,
teddy bear, hair dryer, toothbrush

Motion triggers vs Alerts

With Artificial Intelligence there is now a wider gap between a motion trigger (e.g. the camera triggered because a deer walked across the lawn) and an alert (e.g. the AI correctly did not recognize the deer as a person and therefore cancelled the alert).

The AI setting "Hide cancelled alerts on timeline and 'all alerts'" is on by default: most users, after setting up their motion and AI, do not want to be bothered by "nothing found" alerts; that is why they installed the AI in the first place.

However, I do see users in the countryside who like to record wildlife such as bears, coyotes, and deer when they approach their property. BI AI will cancel the alert; however, motion triggers are still always recorded. This implementation was chosen so that, in case the AI missed something, you still have the motion-trigger recording. It turns out this also benefits countryside users who want to capture recordings of wildlife. Uncheck "Hide cancelled alerts on timeline and 'all alerts'" if you want "nothing found" alerts still listed in the Alerts folder. Furthermore, cancelled alerts are always listed in the cancelled alerts folder regardless of this setting.

*** Checkpoint: DeepStack Configuration complete. BI server sending meaningful alerts based on AI. ***

Fine tuning / Trouble-shooting

Preferred DeepStack settings when Fine-tuning
deepstack-debug-settings_optimized.png
  • Save DeepStack analysis details turned on. *.dat files will now be created in the Alerts folder.
    You can use the Status -> DeepStack tab to understand why alerts were or were not sent. Especially helpful when trying to tune missed alerts, i.e. "nothing found".
    Details further below.
  • Turn off "Hide cancelled alerts on timeline and 'all alerts'". Camera settings -> Trigger tab -> Artificial Intelligence. It's much easier to find missed alerts through the clip list. Take advantage of BI functionality.
    deepstack debug clipslist.PNG
  • Make sure "Burn label mark-up onto alert images" is on. Camera settings -> Trigger tab -> Artificial Intelligence. Default is on. Not sure why anyone would NOT want to see the AI annotations.
  • Other optional settings I find useful at times
    • Recording tab: Set recording to "When triggered". Uncheck Combine or cut video each. This way a new BVR is created for each trigger. Because you have a BVR for each trigger event, it's easy to replay a missed alert/false alert and tweak your motion settings and observe if the tweaks improve your alerts. If you cannot figure out an issue, it is easy to send the bvr to support for review!
    • Trigger tab: Turn on Motion overlays. Camera settings -> Trigger tab -> Motion Sensor. Highlight: Rectangles or Highlight / Rectangle
      With the motion objects and the DeepStack AI annotations, you can visually see the overlap and see why alerts were sent or not sent. Details below.
    • Trigger tab: Set Add to alerts list = Hi-res JPEG files.
      With the *.dat files now created with "Save DeepStack analysis details" option, I like seeing the files side by side in the Alerts folder. Makes identifying alerts of interest easier when viewing files in Windows Explorer.
      deepstack alerts folder.PNG
Tracing DeepStack / Save DeepStack analysis

Understanding why DeepStack did not alert has become a lot easier with the "Save DeepStack analysis" feature. BI can now show you exactly which images (i.e. frames/samples) were processed by DeepStack when a motion trigger fired, so you can understand why an alert was or was not sent.

In order to get the feature to work:
  • First check "Save DeepStack analysis" in Camera settings -> Trigger tab -> Artificial Intelligence. This setting starts creating *.dat files in the Alerts folder containing the metadata for each motion trigger.
  • Open the Status -> DeepStack window.
  • Double-click any motion trigger in the Clips List and the DeepStack Status window will populate with the DeepStack metadata, making it much easier to understand what is going on.
save-deepstack-analysis_optimized.png

This example highlights an alert that was cancelled by DeepStack. Double-clicking on an alert while the Status -> DeepStack window is open now populates the Status window with the data associated with the DeepStack analysis.
  • The image has a wealth of information.
    The black areas show the areas not covered by any zones.
    The annotation shows what DeepStack identified.
    The yellow motion highlight in the top right shows where the motion is coming from, i.e. trees blowing in the wind.
  • The box to the left identifies 11 samples which were sent to DeepStack for processing. My BI settings are:
    • Break time = 10s
    • 10 images to sample
    • 1s interval between samples
    • Begin analysis with motion-leading edge
    T-1017ms is the motion-leading edge sample.
    10 more images starting at T + 0 (trigger image) incrementing roughly every 1s (based on settings).
Now you see exactly what BI does when making alert decisions based on your settings. Tweaking settings based on missed alerts has become much easier.
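The sample schedule above can be reproduced with a little arithmetic. The sketch below (my own helper, not a BI API) computes the sample offsets given the settings listed: one motion-leading-edge frame, then the configured number of images starting at the trigger and spaced by the sample interval.

```python
def sample_offsets_ms(leading_edge_ms: int, images: int, interval_s: float):
    """Offsets (ms, relative to the trigger at T+0) of frames sent to DeepStack.

    leading_edge_ms: how far before the trigger the motion-leading-edge
                     sample was captured (e.g. 1017 for T-1017ms).
    images:          the "images to sample" setting.
    interval_s:      the interval between samples, in seconds.
    """
    offsets = [-leading_edge_ms]  # motion-leading-edge sample
    # remaining images start at T+0 (trigger image), spaced by interval_s
    offsets += [round(i * interval_s * 1000) for i in range(images)]
    return offsets

print(sample_offsets_ms(1017, 10, 1.0))
# 11 offsets: -1017, then 0 through 9000 in 1000ms steps
```

This matches the 11 samples in the example: T-1017ms plus 10 images from T+0 onward at roughly 1s spacing.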

The logs also provide data consistent with the DeepStack analysis data.

deepstack log verification.PNG

From the logs, motion was detected at 11:00:51.907. DeepStack cancelled the alert at 11:01:01.504 roughly 10s later.

AI Limitations
The example below illustrates the limitations of AI. With the "Burn label mark-up onto alert images" AI setting and Add to alerts list = Hi-res JPEG files in the Trigger tab, it is easy to get the trigger image, as seen below, from the Alerts folder. Right-click on the alert in the clip list -> Open containing folder. Windows Explorer will open to the file containing the trigger image.

deepstack-headlights_optimized.png

You can see for yourself in the above image what is going on. BI motion sensors identified the moving object, no problem. DeepStack said this is not a car (no annotation).

deepstack miss.PNG

If DeepStack saw a car (a different alert below), you would see the annotation. The overlap of the DeepStack object and the motion-detection object (turn on Motion Highlight: Rectangles or Highlight / Rectangle) further confirms the static-object test: it's a moving car, not a parked car, because the motion and DeepStack objects overlap.
deepstack alert image confirmed.png

Does this mean I am going to resign myself to missed alerts due to headlights? Heck no! I'm going to tweak my Motion settings in order to try capturing motion much further up the street. This will allow me to get more samples of the car for DeepStack to analyze and hopefully alert on cars with headlights on. The power of Motion sensors with AI!

Analyze Image with Deepstack
  • Now you can find out what Deepstack recognizes when processing an alert. Great tool if you are wondering why an alert was missed.
  • Select an alert and start playback.
  • Stop the alert clip where you want Deepstack to analyze the image.
  • Take a snapshot. Saving to the clipboard directory (default) is fine.
  • From the clip list, select the jpg image that you just created. (may need to select clipboard icon on the clip list)
  • Right click -> Analyze image with Deepstack.
  • See the annotation.
    deepstack analyze.PNG
Microsoft Process Explorer

A new debugging tool is now part of the arsenal for tech-savvy users. Ever wondered which program has a particular file or directory open? Now you can find out with Microsoft Process Explorer. Process Explorer shows you information about which handles and DLLs processes have opened or loaded. ... s-explorer

The link above provides instructions on how to download and install it (a zip file). The application comes with a Help file explaining the program. You can use this tool to see if DeepStack is running. It should be shown as "server.exe" beneath BlueIris.exe (assuming BI is set to start DeepStack) under services.


Gotcha 8: DeepStack appears to be on but no images are being processed.
DeepStack is on (or believed to be on); however, no AI alerts appear in the clip list, and there are no errors in the Status -> Log!
We are getting reports from users who have DeepStack running, based on the DeepStack splash page popping up. However, NO DeepStack annotations appear when they should. It's as if DeepStack is not turned on. The clips list looks like below, with only motion annotations.
no deepstack cliplist.png

Once DeepStack is activated, you should receive AI annotations. Even with no objects, the "nothing found" AI annotation appears!
nothing found.PNG

If DeepStack were turned on, the clip list would look like below when objects were found.
deepstack running.png

First confirm the BI server and the DeepStack server are communicating. Global settings -> AI tab.
Also confirm you have DeepStack turned ON for the camera you are debugging! Camera settings -> Trigger tab -> Artificial Intelligence. Remember, if you have multiple Profiles, DeepStack needs to be set correctly (i.e. on or off) for each Profile.

Big gotchas:
  • If you uncheck Object detection in the Advanced section of Camera settings -> Trigger tab -> Motion sensor Configure, you have, unbeknownst to yourself, turned off DeepStack, because object detection is required for DeepStack to work.
    deepstack object detection gotcha.PNG
  • If all the trigger sources are inactive in the Alerts tab, you have, unbeknownst to yourself, turned off DeepStack.
    If you want to turn off all your alerts, it is best to do so by deleting or disabling all the entries in the action map.
    The default setting is to have all trigger sources active. At a minimum, keep the Motion Zones trigger source enabled so DeepStack is not affected.
    deepstack trigger sources gotcha.PNG
Assuming your settings are correct, run DeepStack manually to see if requests are being sent to DeepStack from BI.
  • Uncheck Auto start/stop in Global settings -> AI.
  • Restart BI to make sure no DeepStack processes are still running. Stopping DeepStack manually can be unreliable given the additional processes DeepStack spawns (server.exe, python.exe, redis-server.exe).

    deepstack processes.PNG
  • Start DeepStack manually from the command line. "deepstack --VISION-DETECTION True --PORT 82". Port 82 is used by default but if you chose a different port (Global settings -> AI tab), specify the same number in command line.
From the console, every time you trigger the camera or the camera triggers on its own, observe whether a new request appears in the command window.
deepstack log.PNG

Gotcha 7: 2021-May-19 Update:
DeepStack is running, but sometimes BI reports the server is not running.

The logs below show a successful DeepStack response at 1:09:15. Then at 1:09:46 BI states DeepStack is unreachable; however, the user confirmed DeepStack was still running.

Code: Select all

5/12/2021 1:09:46 PM Oby1Cam: DeepStack: Not running or unreachable (84)
5/12/2021 1:09:45 PM Oby1Cam: MOTION (79)
5/12/2021 1:09:15 PM Ofy1Cam: DeepStack: Alert cancelled (nothing found) (83)
5/12/2021 1:09:09 PM Ofy1Cam: MOTION (75)

The fix:
First confirm all software security exemptions (e.g. Windows Defender) have been applied. The Windows Tuning article provides details.

The unreachable message occurs if DS does not respond on the IP:port that you specify. If this is not due to issues with security software, perhaps DS is having other issues. I do see an occasional DS state where it is not responding.

BI is now detecting Deepstack not running or unreachable states and restarts DeepStack behind the scenes (assuming BI is set to start/stop DeepStack). This should fix the issue without any user interaction needed.

Gotcha 6: BI Status Log Error - DeepStack not running or unreachable
If BI cannot communicate with DeepStack, BI will let you know.
Assuming you have Auto start/stop with BI enabled, the simple fix is to stop DeepStack if it's running and restart BI.
BI cannot talk to Deepstack.png

Gotcha 5: High CPU usage with DeepStack.
AI / DeepStack is very computationally heavy. If your CPU usage jumps and you don't have a GPU, the best you can do is throttle the calls to DeepStack.
  • Turn on Deepstack only for outdoor cameras. BI motion settings are usually fine for indoor cameras.
  • Reduce number of images sampled per alert.
  • Tighten your BI motion sensor settings so cameras do not trigger an alert so frequently.

Gotcha 4: Deepstack missing obvious objects.

Shout out to our users for sharing issues and solutions!

For some reason, the BlueIris software is ignoring what the Deepstack server says in terms of confidence and the label. This seems to happen during the second, third or sometimes fourth image that Deepstack analyzes. I have confirmed with a packet capture that despite these additional images being sent back to the BlueIris software with a successful response, BlueIris ignores it and goes to cancel the alert. I confirm these images meet the criteria of the minimum confidence percentage and "to confirm". I would like to also mention that if DeepStack responds back with a prediction that is within the criteria on the first image, it works OK.

Fix: I just needed to change the keyframe interval to be the same value as the FPS so it is equal to 1.00 (as displayed from Blue Iris). The detection is working much better now. See Camera stream optimization article for details.

Alternative fix: If for some reason you cannot adjust the key frame interval on the camera, identify the current key frame interval (Status -> Cameras tab). If the value is 0.25, i.e. 1/4, i.e. 1 key frame every 4s, then set the pre-trigger buffer (Camera settings -> Record tab) greater than or equal to 4s. This will guarantee BI at least one key frame to process.
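The arithmetic behind the alternative fix can be expressed directly: the pre-trigger buffer must cover at least one key-frame period, which is the reciprocal of the key-frame rate shown in Status -> Cameras. A small sketch (a hypothetical helper, not a BI setting):

```python
import math

def min_pretrigger_buffer_s(key_frame_rate: float) -> int:
    """Smallest pre-trigger buffer (seconds) that guarantees at least one
    key frame, given the key-frame rate shown in Status -> Cameras.

    A rate of 0.25 means one key frame every 1 / 0.25 = 4 seconds, so the
    buffer must cover at least 4 seconds of video.
    """
    if key_frame_rate <= 0:
        raise ValueError("key frame rate must be positive")
    return math.ceil(1.0 / key_frame_rate)

print(min_pretrigger_buffer_s(0.25))  # a 0.25 rate needs a 4 second buffer
```

When the camera is configured as recommended (key-frame interval equal to FPS, i.e. a rate of 1.00), the required buffer drops to 1 second.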

Gotcha 3: DeepStack always returning nothing found. (Another shout out to the community for making everyone smarter)
This could mean the DeepStack server is returning server errors and BI is just reporting back "nothing found".
BI will eventually report the DeepStack errors in the Status -> Log to make users aware of the issue more easily.

Until then, one user creatively used Postman to identify the issue. If you are a developer, you are probably familiar with Postman. It's a very popular tool that helps test and debug APIs. If you are new to Postman, there is lots of content on the internet about how to use it. By using Postman, the user discovered that object-detection requests to DeepStack were returning 500 error codes. A 500 error means the server had a problem processing the request.

FYI, a 400 error means the request you sent to DeepStack is not correct. If you are new to Postman, first make sure you are building correct requests before drawing conclusions about the server.

Once the user realized their DeepStack installation was not working correctly, they reinstalled DeepStack. DeepStack does provide error logs at "C:\Users\[Username]\AppData\Local\DeepStack\logs" (%LOCALAPPDATA%\DeepStack\logs). Understanding the logs was challenging, so the best course of action was to try reinstalling DeepStack.

The user also noted that uninstalling DeepStack may not flush out all the files. The user went back into the installation directory (C:\DeepStack) and manually deleted any remaining files.

Furthermore, the user noted it's good to go into Task Manager and make sure no old DeepStack processes are still running. Kill any of the following processes if they exist: server.exe, python.exe, redis-server.exe.

FYI, installing/using AI Tool for debugging may be worthwhile while BI continues to build out its debugging capabilities. In AI Tool, error messages pop up right away if old DeepStack versions are still running.
ai tool error.PNG

Another clue that DeepStack is not running properly: run DeepStack manually and observe the response time in the console. If DeepStack is taking a minute to respond, there is probably something wrong.
deepstack response time.PNG

Gotcha 2: Front door camera

The front door view includes the road and therefore generates a lot of unnecessary alerts. All you really want is people coming to the house or property.

front door.jpg

So you create a zone in front of your house to reduce the noise from the road.

b_optimized.png

No alerts! The problem is the camera triggers when a person's feet cross Zone B. The image processed by DeepStack is just the feet! So DeepStack will often return nothing found.

The fix! Always define a Zone A covering the overall area of interest on the camera. The image processed by DeepStack is the overlap of all the zones; thus the entire person is sent to DeepStack for processing and the appropriate alert is sent.

a_optimized.png

One important point that I forgot to mention in the webinar, but that did come up in the Q&A: how do you tell BI to only be concerned with motion in Zone B? Simple, see below. By setting "Object crosses zones" to B, I am telling BI to only worry about motion in Zone B. Notice that with AI, I uncheck the "Object travels" and "Object size exceeds" settings. With AI, the motion sensor settings can usually be simplified!

Zone B setting.PNG

Gotcha 1: Person detection & Facial detection

Keep in mind the current implementation of DeepStack runs detections in series, not in parallel. So if you have both activated, DeepStack will alert on person detection first and never get to facial recognition. Most people would probably expect DeepStack to first detect the person and then proceed to recognize the face, but this does not seem to be the case.

Person detected in the image below even though both person detection and facial recognition were activated.
person and facial.jpg
With just facial recognition activated.
facial.jpg

Next steps / Submit a ticket
  • Describe issue.
  • Status -> Log errors
  • Status -> Camera tab. Need to know the health of the camera streams.
  • Any steps you have taken so far to resolve the issue. This will help us gain insight into the problem.
  • Screenshot of Global settings -> AI tab.
  • Screenshot of Alerts in Clips List so we understand what BI is sensing currently.
  • Camera settings. Camera settings -> General tab -> Export