This article goes under the covers of the Trigger tab. It is part of the Triggers / Recordings / Alerts series, where we go deep into one of the most popular features of Blue Iris (BI): the ability to define motion triggers -> instruct BI to record when cameras trigger -> send corresponding alerts.
If you prefer to listen instead of read, watch the webinar associated with this article, The Trigger Tab webinar.
We also have an overview Triggers and Alerts 101 webinar for those who are new to BI.
Blue Iris Model
[webinar discussion at 5:11]
One way to think about Blue Iris is a software engine (server) that processes inputs/sensors from the outside world (Triggers) and makes sense of them by sending the appropriate Alerts.
It is important to map the model onto the BI dialogs and features. The triggers in the mental image come straight from the Trigger sources and zones controls in the Trigger tab.
Furthermore, the Alerts list comes straight from the Action Set seen frequently throughout the application.
The other large component of BI functionality is connecting / managing cameras and recording moments of interest.
This article focuses on the most common scenario: setting your motion sensors to fire motion triggers, recording those moments in time, and sending alerts to yourself, i.e. push notifications to the mobile app, SMS messages and/or email alerts.
Triggers / Recordings / Alerts Model
[webinar discussion at 10:17]
The model below is very important for understanding what is really going on behind the scenes in the BI software logic. The image also maps the model to the BI user interface. Once the model (how BI thinks) is understood, it becomes very easy to tell what BI is doing from the visual effects in the user interface. This is one of the most powerful features of BI: the visual effects in the UI reflect your settings and provide feedback, so you can see for yourself whether your settings work. It makes tuning your system so much easier.
Trigger Tab Uncovered
[webinar discussion at 18:30]
Like most BI dialogs, the top half consists of settings pertinent to the topic being addressed by the dialog. In this case the Trigger settings are at the top of the dialog.
The bottom half of the dialog pertains to actions you can take based on the trigger settings.
[webinar discussion at 20:25]
Let's start with the bottom half of the dialog, the trigger actions. The Add to alerts list action is the most important setting.
- No: This option is rarely used. If for some reason you do not want a record in the database associated with a trigger, you would choose No.
- Database only: The default setting. The trigger details are written to the database, which populates the Alerts in the clips list and the timeline.
- Hi-res JPEG files: The first option in the list box that creates hard copies of trigger events. This selection saves the trigger's leading-edge image to the Alerts folder.
- Export to MP4 files: Saves the motion trigger as an MP4 file in the Alerts folder.
Small businesses such as 7-Eleven franchises use this option frequently. MP4 files are created immediately and saved to the Alerts folder, which is often synced to a cloud storage service like Google Drive. This way the MP4 files reach the cloud immediately, and the store owner is assured that if the store is robbed and the robber smashes the BI server, the evidence is still available in the cloud for retrieval.
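BI itself only writes the MP4 files; the off-site copy is done by whatever sync client is watching the Alerts folder. As a rough illustration of that step, here is a minimal Python sketch that mirrors new alert clips into a cloud-synced folder. The folder names and the one-way copy strategy are assumptions for the example, not part of BI:

```python
import shutil
from pathlib import Path

def mirror_new_alerts(alerts_dir: Path, cloud_dir: Path) -> list:
    """Copy any alert clips not yet present in the cloud-synced folder."""
    cloud_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for clip in sorted(alerts_dir.glob("*.mp4")):
        dest = cloud_dir / clip.name
        if not dest.exists():          # only new clips; already-synced ones are skipped
            shutil.copy2(clip, dest)   # copy2 preserves the clip's timestamps
            copied.append(clip.name)
    return copied
```

In practice the cloud provider's own desktop client (or a tool pointed at the Alerts folder) performs this job continuously, so the upload happens within seconds of the trigger.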
Motion sensor dialog
[webinar discussion at 25:27]
With the motion sensor, the most important concept to understand is Sense -> Motion -> Trigger.
- In terms of settings, Contrast corresponds to Sense, i.e. the camera sensitivity used to detect a change in the image.
- Min. object size corresponds to Motion, i.e. BI starts paying attention to an object once it meets the minimum object-size threshold.
- MAKE time is how long the minimum-size object must stay in motion before BI decides to trigger, i.e. create a motion trigger.
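The Sense -> Motion -> Trigger pipeline can be sketched as a tiny state machine. This is an illustrative model only, not BI's actual implementation; the thresholds, units, and frame format below are assumptions made for the example:

```python
from dataclasses import dataclass

@dataclass
class MotionSensor:
    contrast_threshold: int  # "Sense": minimum pixel-level change to notice at all
    min_object_size: int     # "Motion": minimum changed-pixel count (object size)
    make_time: float         # seconds the object must persist before triggering

    def process(self, frames):
        """frames: iterable of (timestamp, contrast, object_size) samples.
        Returns the timestamp of the first motion trigger, or None."""
        motion_start = None
        for t, contrast, size in frames:
            sensed = contrast >= self.contrast_threshold       # Sense
            in_motion = sensed and size >= self.min_object_size  # Motion
            if in_motion:
                if motion_start is None:
                    motion_start = t
                if t - motion_start >= self.make_time:          # Trigger
                    return t
            else:
                motion_start = None  # motion lapsed; the MAKE timer resets
        return None
```

Note how a brief blip resets the MAKE timer: this is why a well-chosen MAKE time filters out leaves, shadows, and other momentary noise.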
Advanced trigger options
There are additional settings in the Advanced section.
- Object travels
- Object size exceeds
Optimal Motion sensor settings with AI
[webinar discussion at 35:32]
With AI, some of the motion sensor settings can now be loosened or skipped, because AI can often do a better job of filtering events accurately. I no longer rely on Object travels and Zone crossing as much. I still often define Zone A as the camera's area of interest; there is no reason to waste resources processing activity occurring across the street from your house.
Case Study: Creating Triggers -> Alerts
I walk through a use case that appears more and more often in support tickets: users want to track two events from the same camera.
- Users want a record in their database when a person walks on the sidewalk or a car drives on the street.
- However, the user ONLY wants to be alerted when a car drives into the driveway OR a person enters their property.
- This use case introduces users to the Wait action command and its usage.
- It also builds a deeper understanding of how the motion sensor logic works.
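The split between "always record" and "only alert on-property" can be sketched as simple decision logic. This is an illustration of the intent, not of BI's internals; the object labels and the zone mapping are assumptions made for the example:

```python
def handle_trigger(object_type, zone):
    """Decide what to do with a confirmed trigger.
    Hypothetical zone mapping: zone 'A' covers the street and sidewalk,
    zone 'B' covers the driveway and the rest of the property."""
    record = object_type in ("person", "car")  # always write these to the database
    alert = record and zone == "B"             # push/SMS/email only when on-property
    return {"add_to_alerts_list": record, "send_alert": alert}
```

In BI terms, the "record" branch corresponds to Add to alerts list, while the "alert" branch corresponds to the Action set that fires your push notification, SMS, or email.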
Gotchas
The Gotchas section documents lessons learned from past tickets.
Gotcha 1: Motion triggers stopped
See Gotcha 1 from the Record Tab article: Recording off or skipped.
Reports that motion triggers stopped often arrive in the context of no recordings; the gotcha above takes both Trigger and Record settings into account.
Gotcha 2: Camera would not trigger
Issue: User has a 2MP+ camera. The camera would not trigger, even though the motion sensor seemed to be sensing fine in the playback window.
Fix: The user had the Anamorphic option selected under Camera settings -> Video tab.
This is a Trigger gotcha because the camera would never trigger. The reason: the anamorphic setting scaled the video down considerably before it reached the motion sensor, so the minimum object size was never met.
The user had essentially reduced the camera to a low-end USB camera, with the image drastically scaled down.
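Some hypothetical numbers make the effect concrete. The resolutions, object size, and Min. object size threshold below are illustrative assumptions, not values taken from the actual ticket:

```python
# Hypothetical numbers showing how heavy downscaling defeats Min. object size.
full_w, full_h = 1920, 1080      # native 2MP stream
scaled_w, scaled_h = 640, 480    # frame actually fed to the motion sensor
min_object_size = 10_000         # hypothetical Min. object size, in pixels

obj_full = int(0.02 * full_w * full_h)             # object covering 2% of the frame
scale = (scaled_w * scaled_h) / (full_w * full_h)  # fraction of pixels left (~0.148)
obj_scaled = int(obj_full * scale)

meets_full = obj_full >= min_object_size      # at full resolution: clears the bar
meets_scaled = obj_scaled >= min_object_size  # after scaling: never clears the bar
```

With these numbers, an object that occupies roughly 41,000 pixels at full resolution shrinks to about 6,000 pixels after scaling, well under the 10,000-pixel threshold, so the trigger never fires.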