How to ensure BI does not use NVIDIA CUDA

alitech
Posts: 5
Joined: Sat May 16, 2020 12:47 pm

How to ensure BI does not use NVIDIA CUDA

Post by alitech »

Hi

I have set every camera's hardware accelerated decode option to DEFAULT and the GPU to Any, yet my GPU usage is hitting 50%. I want to ensure that BI does not use hardware acceleration at all. Which option should I choose so that the GPU remains untouched?

Thank you
WmG
Posts: 44
Joined: Sat Jun 29, 2019 7:10 pm

Re: How to ensure BI does not use NVIDIA CUDA

Post by WmG »

Mine is set the same, but I still see GPU usage just to paint the screen. Won't the GPU be used for that with any app, playing a YouTube video for instance? I'm no expert, but I added a video card just to offload some graphics work from the CPU, and that worked perfectly for me. I see 15% to 50% with the BI UI running, depending on what's going on.
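
If you want to tell whether that GPU usage is just the screen being painted or actual hardware decode, a quick Python sketch like this (assuming an NVIDIA card with nvidia-smi on the PATH, which the driver installs) will dump the per-process numbers:

    import subprocess

    # Print one sample of per-process GPU utilization. The "sm" column is
    # 3D/render work (e.g. painting the BI window); "dec" is NVDEC video decode.
    print(subprocess.run(
        ["nvidia-smi", "pmon", "-c", "1", "-s", "u"],
        capture_output=True, text=True, check=True,
    ).stdout)

If the Blue Iris process shows numbers under sm but a dash under dec, the GPU is only drawing the console, not decoding video.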

As an aside, and maybe or maybe not related: when I first set up BI on its current PC, before I added the video card, I had my CPU usage all dialed in, then moved the PC to its permanent location with a larger, higher-resolution monitor. My CPU usage jumped significantly. Plugged it back into the old, smaller monitor and all was well again. Just increasing the size of the BI UI window would overload the i7-4790 CPU. I tweaked as needed to get things back under control, but it was a learning experience.
reddawg
Posts: 145
Joined: Sun Jun 30, 2019 11:29 am

Re: How to ensure BI does not use NVIDIA CUDA

Post by reddawg »

alitech wrote: Fri May 22, 2020 12:31 pm
I have set every camera's hardware accelerated decode option to DEFAULT and the GPU to Any, yet my GPU usage is hitting 50%. I want to ensure that BI does not use hardware acceleration at all. Which option should I choose so that the GPU remains untouched?
Go to Blue Iris Settings -> Cameras and set "Hardware Accelerated Decode" to "None". Any camera that has Video -> Accelerated Decode set to Default will then not use hardware acceleration. Alternatively, on a camera-by-camera basis, you can change Camera settings -> Video -> Accelerated Decode to None. The first method applies to all cameras, whereas the second applies to each individual camera.
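
If you want to verify the change took effect, here is a rough Python check (my assumptions: an NVIDIA card, nvidia-smi on the PATH, and the process name starting with "BlueIris"; check Task Manager for the exact name on your install):

    import subprocess

    # One sample of per-process GPU utilization; "dec" is NVDEC hardware decode.
    out = subprocess.run(
        ["nvidia-smi", "pmon", "-c", "1", "-s", "u"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    # The first header line ("# gpu pid type sm mem enc dec ... command") gives
    # the column order; look up "dec" rather than hardcoding its position.
    dec_col = out[0].lstrip("# ").split().index("dec")

    for line in out:
        if line.startswith("#"):   # skip the header lines
            continue
        fields = line.split()
        # The last field is the process name; "blueiris" is an assumption,
        # so check Task Manager for the exact executable name.
        if fields and fields[-1].lower().startswith("blueiris"):
            print(f"Blue Iris dec utilization: {fields[dec_col]}")

After setting Accelerated Decode to None and restarting Blue Iris, the dec value for its process should read "-" or 0.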
Blue Iris v5.3.9.10 | Win10 x64 version 22H2 | Dahua IPC-HFW2100, Amcrest IP2M-841W, Hikvision MINI PT DS-2CD2F52F-IS, Edimax IC-3030iWn | Intel i5-2500 CPU, 8GB Ram, Samsung 860 EVO 512GB SSD, WD Black 1TB HD.