How to ensure BI does not use NVIDIA Cuda

Post Reply
alitech
Posts: 5
Joined: Sat May 16, 2020 12:47 pm

How to ensure BI does not use NVIDIA Cuda

Post by alitech » Fri May 22, 2020 12:31 pm

Hi

I have an issue: I have set every camera's hardware accelerated decode to Default and GPU to Any, yet my GPU usage is hitting 50%. I want to ensure that BI does not use hardware acceleration at all. Which option should I choose so that the GPU remains untouched?

Thank you

WmG
Posts: 39
Joined: Sat Jun 29, 2019 7:10 pm

Re: How to ensure BI does not use NVIDIA Cuda

Post by WmG » Fri May 22, 2020 1:14 pm

Mine is set the same, but I still see GPU usage just to paint the screen. Won't the GPU be used for that with any app, playing a YouTube video for instance? I'm no expert, but I added a video card just to offload some graphics work from the CPU, and that worked perfectly for me. I see 15% to 50% with the BI UI running, depending on what's going on.

As an aside, and maybe or maybe not related: when I first set up BI on its current PC, before I added the video card, I had my CPU usage all dialed in. Then I moved the PC to its permanent location with a larger, higher-resolution monitor, and my CPU usage jumped significantly. Plugged it back into the old, smaller monitor and all was well again. Just increasing the size of the BI UI window would overload the i7-4790 CPU. I tweaked as needed to get things back under control, but it was a learning experience.

reddawg
Posts: 128
Joined: Sun Jun 30, 2019 11:29 am

Re: How to ensure BI does not use NVIDIA Cuda

Post by reddawg » Fri May 22, 2020 8:32 pm

alitech wrote:
Fri May 22, 2020 12:31 pm

I have an issue: I have set every camera's hardware accelerated decode to Default and GPU to Any, yet my GPU usage is hitting 50%. I want to ensure that BI does not use hardware acceleration at all. Which option should I choose so that the GPU remains untouched?

Go to Blue Iris Settings -> Cameras and set "Hardware Accelerated Decode" to "None". Then any camera that has Video -> Accelerated Decode set to Default will not use hardware acceleration. Alternatively, on a camera-by-camera basis, you can change Camera settings -> Video -> Accelerated Decode to None. The first method applies to all cameras, whereas the second applies to each individual camera.
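
As an extra sanity check (not mentioned in the thread, and assuming the NVIDIA driver is installed), you can list the processes holding CUDA compute contexts with the stock `nvidia-smi` tool. If the Blue Iris process no longer appears after you set decode to None, it is not using CUDA:

```shell
# List processes currently holding a GPU compute (CUDA) context,
# with their PID, executable name, and GPU memory used.
# If Blue Iris is absent from this list, it is not using CUDA.
nvidia-smi --query-compute-apps=pid,process_name,used_memory --format=csv
```

Note that pure video decode (NVDEC) usage can show up differently; on Windows, Task Manager's GPU "Video Decode" graph per process is another easy way to confirm decode activity has stopped.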
