

Reply to topic  [ 21 posts ]  Go to page 1, 2  Next
  
 Post subject: Geovision GPU Decoding
PostPosted: Tue Jul 24, 2012 11:18 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
Anyone know how to turn this on?

I can't find the settings for it.

Does the setting just not show up if your hardware isn't a Sandy Bridge chipset with onboard VGA?

Running 8.5.4.0 on Win7 64-bit with only 4 GB of memory (specs say 8 GB recommended, but not required).

The specs also mention Sandy Bridge with built-in VGA, and I'm kinda fuzzy on whether that is recommended or required. (Do you really need a Sandy Bridge for this? Will they be adding older chipsets for this feature?)

Pavel, you got GPU Decoding working, didn't you? Did you have to check a setting?

It would suck the donkey-donkey if it was limited to Sandy's only. :(



GPU Decoding Specifications
In V8.5 or later, support for GPU (Graphics Processing Unit) decoding is added to lower the CPU loading and to increase the total frame rate supported by a GV-System. GPU decoding supports only the following software and hardware specifications:

Software Specifications:

Supported Operating Systems:
Windows Vista (32-bit) / 7 (32 / 64-bit) / Server 2008 R2 (64-bit)

Resolution: 1 MP / 2 MP
Codec: H.264
Stream: Single
Note: To apply GPU decoding, the recommended memory (RAM) requirement is 8 GB or more for a 64-bit OS and 3 GB for a 32-bit OS.


Hardware Specifications:

Motherboard: Sandy Bridge chipset with onboard VGA
Ex: Intel® Q67, H67, H61, Q65, B65, Z68 Express Chipset.
Note: If you want to use an external VGA card, it is required to connect a monitor to the onboard VGA
to activate GPU decoding.
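For what it's worth, the RAM note above amounts to a simple threshold check per OS bitness. A minimal sketch in Python; `GV_MIN_RAM_GB` and `meets_ram_spec` are illustrative names I made up, not anything from GeoVision:

```python
import platform

# Minimum RAM (GB) from the spec note above; illustrative constant,
# not a GeoVision API.
GV_MIN_RAM_GB = {"64bit": 8, "32bit": 3}

def meets_ram_spec(installed_gb, arch=None):
    """True if installed RAM meets the recommended spec quoted above."""
    if arch is None:
        arch = platform.architecture()[0]  # e.g. '64bit' or '32bit'
    return installed_gb >= GV_MIN_RAM_GB.get(arch, 8)
```

By this reading, a 4 GB Win7 64-bit box (like the one described above) falls short of the recommendation, though the spec says recommended, not required.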


 Post subject: Re: Geovision GPU Decoding
PostPosted: Tue Jul 31, 2012 8:27 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
Looks like they added Ivy Bridge support in 8.5.5.0.

Wondering if they will add support for some of the older X## series chipsets.


 Post subject: Re: Geovision GPU Decoding
PostPosted: Tue Jul 31, 2012 8:39 pm 
Registered User

Joined: Oct 2011
Posts: 124

Offline
I don't think they will add support for older chipsets, since the new CPU-integrated graphics processors are so much faster. The new GPUs also support the OpenCL programming language, which I think is what is allowing Geo to tap into the GPU.

The bit in the user guide about connecting a monitor to the onboard VGA when using an external card is a bit confusing, but the reason is this: if you install an external VGA card and do not connect a monitor to the onboard VGA, it will shut the onboard GPU off, making it unavailable to the Geo software. (You can use a "dummy monitor": connect 75-ohm resistors between onboard VGA pins 1-6, 2-7, and 3-8.)

There is no setting to enable (unless I'm totally forgetting...).
I tried a Celeron G530 in an H61 system with (5) 1.3 MP and (3) 2 MP cams. In 8-channel live view on a 1920x1080 monitor, with the cams set to single-stream H.264 at max resolution, Geo was using maybe 20% CPU.


 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Aug 01, 2012 9:07 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
gb5102 wrote:
The bit in the user guide about connecting a monitor to the onboard VGA when using an external card is a bit confusing, but the reason is this: if you install an external VGA card and do not connect a monitor to the onboard VGA, it will shut the onboard GPU off, making it unavailable to the Geo software. (You can use a "dummy monitor": connect 75-ohm resistors between onboard VGA pins 1-6, 2-7, and 3-8.)


Glad I'm not the only one who was confused by that.

My current GV-NVR server is an older Intel Q6600 quad-core based system with an AMD HD 3800 series video card running Win7.

I thought decent GPUs alone could handle CUDA/OpenCL without running a newer chipset, but I really don't recall. I just read that the HD 3800 doesn't support OpenCL. I don't expect Geo to support the older chipsets/GPUs, but I was curious whether they would or not. Guess it's time to upgrade that old server.

I'm still unclear though, does the chipset really matter? Or is it just the GPU?

If only the GPU mattered, you'd think they would maybe support a few older chipsets, in which case I could just throw one of my Nvidia GTX 570s into the Q6600 system and be ready to use GPU decoding. But they make it sound like you need the right combo of chipset and GPU, not just a certain GPU. (Memory helps too; 16 GB recommended, I think, though I'm not sure what the minimum is.)

My current chipset is the Intel X38. The motherboard was made in the first part of 2008, so it's a good 4 years old. So, you think the chances of Geo supporting the X38 are pretty slim? What would the odds be of them supporting it?

I'm also curious if anyone on this forum with a properly spec'd server (Sandy/Ivy Bridge, decent GPU, lots of memory) has done a side-by-side comparison of GV-NVR with GPU decoding off and on. What kind of CPU savings were experienced?

My current GV-NVR runs at about 40-50% CPU on all 4 cores with Live view of 3 cams on the 4x4 panel display. If I make the panels smaller, the CPU goes down by half.

It goes up substantially more than 50% when a client connects.

I haven't turned on OnDemand yet either. That should help a bit. One thing is certain: these megapixel IP cameras in live view can eat your CPU for breakfast, and GPU decoding is going to be a necessity.

To quote Arnold:



Last edited by LittleScoobyMaster on Wed Aug 01, 2012 10:14 pm, edited 1 time in total.

 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Aug 01, 2012 10:13 pm 
Registered User

Joined: Oct 2011
Posts: 124

Offline
Quote:
I'm still unclear though, does the chipset really matter? Or is it just the GPU?


It's the chipset/CPU. It has to be an Intel HD Graphics GPU/IGP (Integrated Graphics Processor, inside the LGA1155 CPU).

The LGA1155 motherboard must have an on-board video output. (Some boards do not have video ports; these won't work because the IGP will be disabled.)

Also be aware that not all Sandy Bridge CPUs have an integrated GPU. For example, the i5-2550K does not, so it would not be useful for the GPU decoding feature.

Quote:
I'm also curious if anyone on this forum with a properly spec'd server (Sandy/Ivy Bridge, decent GPU, lots of memory) has done a side-by-side comparison of GV-NVR with GPU decoding off and on. What kind of CPU savings were experienced?


I would be interested to see a side by side as well.
I will be putting an analog DVR together in the next week or so; if I get some time before installing the GV-800B card, I will set it up as an NVR and do some testing. I will measure CPU first while using the Sandy Bridge IGP (Celeron G530), then swap in an HD 6450 and disable the Intel IGP.


 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Aug 01, 2012 10:17 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
gb5102 wrote:
Quote:
I'm still unclear though, does the chipset really matter? Or is it just the GPU?


It's the chipset/CPU. It has to be an Intel HD Graphics GPU/IGP (Integrated Graphics Processor, inside the LGA1155 CPU).

The LGA1155 motherboard must have an on-board video output. (Some boards do not have video ports; these won't work because the IGP will be disabled.)

Also be aware that not all Sandy Bridge CPUs have an integrated GPU. For example, the i5-2550K does not, so it would not be useful for the GPU decoding feature.

Quote:
I'm also curious if anyone on this forum with a properly spec'd server (Sandy/Ivy Bridge, decent GPU, lots of memory) has done a side-by-side comparison of GV-NVR with GPU decoding off and on. What kind of CPU savings were experienced?


I would be interested to see a side by side as well.
I will be putting an analog DVR together in the next week or so; if I get some time before installing the GV-800B card, I will set it up as an NVR and do some testing. I will measure CPU first while using the Sandy Bridge IGP (Celeron G530), then swap in an HD 6450 and disable the Intel IGP.


That would be great. Can't wait to see the results.

I read the part about the integrated GPU requirement as well, and that boggles me. Why limit the feature to an onboard integrated GPU when something like the latest Nvidia/AMD card would do so much better?


 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Aug 01, 2012 10:50 pm 
Registered User

Joined: Oct 2011
Posts: 124

Offline
I agree- the external cards are definitely much more powerful.

Sandy Bridge must have been the easiest starting point, but I think this feature is going to become a requirement as resolutions climb higher and higher, so I would imagine they will add support for other GPU platforms in the future. We'll just have to wait and see what the future brings [-o<


 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Aug 08, 2012 5:45 pm 
Registered User

Joined: Oct 2011
Posts: 124

Offline
Had some time to compare with and without GPU decoding. I only had 4 cams, but it still shows a noticeable difference.

GPU decoding (Sandy Bridge Celeron IGP): 5-15% CPU (10 min. avg: 8.79%)

PCI-E Radeon HD 4350 1 GB, IGP disabled: 25-35% CPU (10 min. avg: 28.48%)

CPU usage was measured in full-screen mode with the cams set to single-stream, max resolution, H.264 codec. Geovision panel size/monitor resolution is 1600x1200, with the DirectDraw Overlay and De-interlace Render options selected.

System Specs:
GV-NVR v8.5.4
Celeron G530
2x 4GB DDR3-1066 RAM
P8H61-M LE/CSM motherboard
Win7 Pro x64- Aero, UAC disabled

Cams:
BX140DW - 1MP
VD122D - 1.3MP
VD120D - 1.3MP
MFD110 - 1.3MP
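The 10-minute averages above boil down to sampling CPU usage roughly once a second and averaging the readings. A minimal sketch of that arithmetic; the sample values below are invented for illustration (in the 5-15% band reported for the IGP run), not gb5102's raw data:

```python
def ten_min_average(samples):
    """Average per-second CPU readings (percent); 600 samples
    at 1 Hz would cover the 10-minute window used above."""
    return round(sum(samples) / len(samples), 2)

# Invented readings for illustration only:
print(ten_min_average([5.0, 15.0, 8.0, 7.0]))  # 8.75
```

For real sampling on Windows you would feed this from something like psutil's `cpu_percent()` once a second, or from Performance Monitor logs.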


 Post subject: Re: Geovision GPU Decoding
PostPosted: Wed Oct 31, 2012 6:19 am 
Registered User

Joined: Oct 2012
Posts: 5

Offline
gb5102 wrote:
Had some time to compare with and without gpu decoding- only had 4 cams but it still shows a noticeable difference.

GPU Decoding- sandy bridge celeron IGP: 5-15% CPU (10min. avg: 8.79%)

PCI-E Radeon HD4350 1GB, IGP disabled: 25-35% CPU (10min. avg: 28.48%)

CPU usage measured in full-screen mode with cams set to single-stream, max resolution, H264 codec. Geovision panel size/monitor resolution is 1600x1200, DirectDraw Overlay and De-interlace Render options selected.

System Specs:
GV-NVR v8.5.4
Celeron G530
2x 4GB DDR3-1066 RAM
P8H61-M LE/CSM motherboard
Win7 Pro x64- Aero, UAC disabled

Cams:
BX140DW - 1MP
VD122D - 1.3MP
VD120D - 1.3MP
MFD110 - 1.3MP


Thanks for posting this. I have been looking for actual numbers and am glad to see someone took the time to post some!


 Post subject: Re: Geovision GPU Decoding
PostPosted: Tue Jul 14, 2015 11:22 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
gb5102 wrote:
Had some time to compare with and without gpu decoding- only had 4 cams but it still shows a noticeable difference.

GPU Decoding- sandy bridge celeron IGP: 5-15% CPU (10min. avg: 8.79%)

PCI-E Radeon HD4350 1GB, IGP disabled: 25-35% CPU (10min. avg: 28.48%)

CPU usage measured in full-screen mode with cams set to single-stream, max resolution, H264 codec. Geovision panel size/monitor resolution is 1600x1200, DirectDraw Overlay and De-interlace Render options selected.

System Specs:
GV-NVR v8.5.4
Celeron G530
2x 4GB DDR3-1066 RAM
P8H61-M LE/CSM motherboard
Win7 Pro x64- Aero, UAC disabled

Cams:
BX140DW - 1MP
VD122D - 1.3MP
VD120D - 1.3MP
MFD110 - 1.3MP


How do you enable or disable this feature in GV-NVR?

Does the feature only show up if your machine meets all the requirements of GPU decoding?

Let's say you had a system with 2 video cards, can you tell Geo to use the GPU decoding on one of the cards but not the other?

From what I've been reading, it seems to me that GPU decoding can only be used with the built-in motherboard GPUs. Is that still true?


 Post subject: Re: Geovision GPU Decoding
PostPosted: Fri Aug 28, 2015 5:10 am 
Registered User

Joined: Aug 2015
Posts: 18

Offline
LittleScoobyMaster wrote:
I read the part about the integrated GPU requirment as well, and that boggles me. Why limit the feature to an onboard integrated GPU when something like the latest Nvidia\AMD card would do so much better?


If my guess is correct, this is due to a limitation that comes from Intel's Quick Sync.
Based on this behavior, I would guess the GPU acceleration mainly comes from Intel's Quick Sync (maybe with a little help from OpenCL too).
The reason you see this limitation is that Intel will only activate the Quick Sync hardware when the integrated graphics are connected to a monitor output; when Intel's integrated graphics do not have a monitor connected, the Quick Sync hardware will not be activated. This behavior was seen when Intel first introduced Quick Sync 3 years ago.

I heard there are workarounds that will activate Quick Sync with discrete graphics cards. You may have to google for it.


 Post subject: Re: Geovision GPU Decoding
PostPosted: Fri Aug 28, 2015 5:16 am 
Registered User

Joined: Aug 2015
Posts: 18

Offline
LittleScoobyMaster wrote:
How do you enable or disable this feature in GV-NVR?

Does the feature only show up if your machine meets all the requirements of GPU decoding?

Let's say you had a system with 2 video cards, can you tell Geo to use the GPU decoding on one of the cards but not the other?

From what I've been reading it seems to me that GPU decoding can only be used with the built in motherboard GPU's, is that still true?

I would also like to see an icon indicating whether the acceleration is enabled or not. Unfortunately, that isn't available in the UI.
So far, if my guess is correct, the GPU acceleration only works with Intel's integrated graphics with Quick Sync.
You can check whether your Intel CPU has Quick Sync here: http://ark.intel.com/search/advanced?QuickSyncVideo=true&MarketSegment=DT
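As a rough pre-screen before consulting the ark.intel.com list, you can at least rule out non-Intel parts and the IGP-less models mentioned in this thread. A hedged sketch; `NO_IGP_MODELS` is an illustrative, far-from-exhaustive set I made up from this thread, so ark remains the authority:

```python
# Known exception from this thread: the i5-2550K has no integrated GPU.
NO_IGP_MODELS = {"i5-2550K"}  # illustrative, not exhaustive

def may_support_quicksync(cpu_brand):
    """Rough screen only: an Intel brand string that is not a known
    IGP-less model. Always confirm against ark.intel.com."""
    if "Intel" not in cpu_brand:
        return False
    return not any(model in cpu_brand for model in NO_IGP_MODELS)
```

On Windows you could feed this from the CPU brand string reported in System Properties or `platform.processor()`.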


 Post subject: Re: Geovision GPU Decoding
PostPosted: Thu Nov 26, 2015 12:15 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
stealth_lee wrote:
LittleScoobyMaster wrote:
How do you enable or disable this feature in GV-NVR?

Does the feature only show up if your machine meets all the requirements of GPU decoding?

Let's say you had a system with 2 video cards, can you tell Geo to use the GPU decoding on one of the cards but not the other?

From what I've been reading it seems to me that GPU decoding can only be used with the built in motherboard GPU's, is that still true?

I would also like to see an icon indicating whether the acceleration is enabled or not. Unfortunately, that isn't available in the UI.
So far, if my guess is correct, the GPU acceleration only works with Intel's integrated graphics with Quick Sync.
You can check whether your Intel CPU has Quick Sync here: http://ark.intel.com/search/advanced?Qu ... Segment=DT


Thanks stealth_lee.

Wow, that is quite a limitation. I mean, if I have a nice Nvidia GPU hanging around, it sounds like I can't use it with Geovision. That's a bummer. I wonder why they don't allow GPU decoding to work with Nvidia or AMD GPUs?


 Post subject: Re: Geovision GPU Decoding
PostPosted: Tue Dec 08, 2015 9:04 am 
Registered User

Joined: Aug 2015
Posts: 18

Offline
LittleScoobyMaster wrote:
stealth_lee wrote:
I would also like to see an icon indicating whether the acceleration is enabled or not. Unfortunately, that isn't available in the UI.
So far, if my guess is correct, the GPU acceleration only works with Intel's integrated graphics with Quick Sync.
You can check whether your Intel CPU has Quick Sync here: http://ark.intel.com/search/advanced?Qu ... Segment=DT


Thanks stealth_lee.

Wow, that is quite a limitation. I mean, if I have a nice Nvidia GPU hanging around, it sounds like I can't use it with Geovision. That's a bummer. I wonder why they don't allow GPU decoding to work with Nvidia or AMD GPUs?


Well, I guess that's how they chose to implement the GPU decoding function at first.
I heard through the grapevine that they will be supporting nVidia GPUs in the future.

My guess would be that their nVidia GPU decode performance won't match Intel's Quick Sync, since Intel put a lot of effort into Quick Sync and nVidia GPUs have hardware limitations when doing this kind of computation (especially context switching...).


 Post subject: Re: Geovision GPU Decoding
PostPosted: Mon Jun 27, 2016 9:32 pm 
Registered User

Joined: May 2004
Posts: 449
Location: Washington, DC

Offline
stealth_lee wrote:
Well, I guess that's how they chose to implement the GPU decoding function at first.
I heard through the grapevine that they will be supporting nVidia GPUs in the future.


Your grapevine was half-right. :)

Looks like they added Nvidia GPU decoding for GV-VMS, but not for GV-NVR.

I hope they add it to GV-NVR, because GV-VMS is an expensive upgrade from GV-NVR. It's cost prohibitive. (Third-party IP cameras.)

GV-VMS Version History
Version 15.10.1.0
Release date: 02/22/2016

New in Main System:

Support for Windows 10
Support for GV-ASManager / GV-LPR Plugin V4.3.5.0
Support for GV-Web Report V2.2.6.0
Support for GPU decoding with external NVIDIA graphics cards
Support for dual streaming of GV-Fisheye Camera
Support for H.265 codec
Support for sending alert notifications through SNMP protocol
Support for up to 20,000 client accounts in Authentication Server
New image orientation options for corridor scenes (supported GV-IP Cameras only)
Support for marking video bookmarks in live view
Connection log with Center V2 / Vital Sign Monitor / Failover Plugin registered in System Log
Support for compacting videos to key frame only when merging and exporting




The contents of this webpage are copyright © 2003-2016 CCTVForum.com. All Rights Reserved.