
100% usage Processor i7 9 Hikvision IP cams


AK357,

Absolutely not. One nVidia graphics card can drive one, two, three, or up to four monitors, but decoding 64 channels of 1080p at 30 FPS in real time will still be done on a single PC.

If an nVidia card can drive four monitors, then each monitor can be set to display 16 channels (4x4), or whatever display configuration you like to see. Each video is decoded from its own 1080p main compressed stream, not from its sub-stream, and the decoded frames are scaled down to compose the multi-channel display on each monitor. You can move the main display to any monitor of interest on the fly. For more info, you can mail me; out of respect for the forum, this thread should not be flooded with advertising. But I think I have to correct any misleading or wrong concepts.

Are you going to show this at ISC West in April?

What is the booth number?


We actually had a small booth at ISC West 2013 and at Shenzhen last October, showing 128 channels on two monitors. We did not get any good opportunities out of it; visitors did not seem to buy into us much, so this time the company will not be there. But I will be there myself, and you can call me then.


"Scaled down." That's not what we were talking about here at all. We were talking about just "zooming out" while still at 1080p. You're dumping information to accomplish what you're doing; you might not feel like it, but that's the only way to do what you're doing. I'm not downplaying your work, just the way you're selling it.


Drocer,

There seems to be a misunderstanding.

If you have decoded video data for 64 channels of 1080p from 64 compressed main bitstreams, you cannot display all of them at full size on a single monitor. You have to scale the decoded video down to a smaller size to fit a 4x4, 3x3, or whatever configuration, across however many monitors you have. That's why I used "scale down". I understand that zooming up might be needed if the video were decoded from sub-stream compressed files.
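To make the scale-down concrete, here is a quick sketch (my own illustration, not the vendor's code) of the tile size each channel gets in a 4x4 grid on one 1080p monitor:

# Tile geometry for a 4x4 grid on a single 1080p monitor: every decoded
# 1920x1080 frame must be shrunk 4x on each axis to fit its tile.
monitor_w, monitor_h = 1920, 1080
grid = 4
tile_w, tile_h = monitor_w // grid, monitor_h // grid
print(f"each channel is displayed at {tile_w}x{tile_h}")  # 480x270

The full 1080p frame is still decoded from the main stream first; only the display step discards pixels, which is exactly the point raised in the next reply.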


Ignore the extra monitors. You can get an i7 and one consumer nVidia card to play back 64 completely untouched, original 1080p streams? You have to be dumping information somewhere to display that many megapixels during that scale-down. That's four 8K video streams' worth of pixels being played back! With professional cards or SLI/dual SLI you can push things that far, but not with consumer cards.
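For what it's worth, the 8K comparison checks out arithmetically; a quick back-of-the-envelope calculation:

# Back-of-the-envelope check: 64 x 1080p vs. 4 x 8K in raw pixel throughput.
pixels_1080p = 1920 * 1080   # ~2.07 MP per frame
pixels_8k = 7680 * 4320      # ~33.2 MP per frame
fps = 30
mp_64ch = 64 * pixels_1080p * fps / 1e6  # megapixels/second, 64 channels
mp_4x8k = 4 * pixels_8k * fps / 1e6      # megapixels/second, four 8K streams
print(f"64 x 1080p @ 30 fps: {mp_64ch:,.0f} MP/s")
print(f" 4 x 8K    @ 30 fps: {mp_4x8k:,.0f} MP/s")
# Both come out to ~3,981 MP/s, so the comparison holds exactly
# (8K is 16 x 1080p in area, and 64 / 16 = 4).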


Yes, we do and we can, I repeat.

But I have no idea of the difference between consumer and professional cards; we use the G series from nVidia, nothing else. With 64 channels of 1080p bitstream, no problem. Around 100 channels of 1080p compressed stream could hit a bottleneck when reading out from an HDD, but not really, because there can be multiple HDDs, multiple Gigabit Ethernet ports, or SSDs (solid-state drives, newer and faster than hard disks). Currently, we are looking at a graphics card of about US$80. With that, we can decode almost 32 channels of 1080p, each at 30 FPS. By "almost" I mean that we are now trying to reduce the CPU load to less than 35%, because we have to let system makers and developers add their own value-added/enhanced GUIs or applications, say object tracking, face recognition, etc. I will let you know when it is on the market. I left you a PM last night; it should arrive soon.
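The vendor's decoder is proprietary, but if you wanted to gauge how many 1080p streams a given nVidia card can decode yourself, one rough approach (assuming an FFmpeg build with NVDEC/CUDA support; camera01.h264 is a placeholder for a recorded main-stream dump) is:

import subprocess

# Decode one 1080p H.264 stream on the GPU as fast as possible and time it.
# h264_cuvid is FFmpeg's NVDEC-backed decoder; frames are discarded (-f null).
cmd = [
    "ffmpeg", "-benchmark",
    "-c:v", "h264_cuvid",
    "-i", "camera01.h264",
    "-f", "null", "-",
]
subprocess.run(cmd, check=True)
# If one stream decodes at, say, 300 fps, the card has roughly a 10-stream
# real-time budget at 30 fps; run several processes in parallel to confirm.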


"We are only of 6 engineers who design the chip as well."

 

What chip is that? $80 USD is a consumer-level graphics card. I want to believe you, but red flags keep showing up. I wish you the best of luck if it's true.

 

Cheers.


Let's take another approach to this discussion.

Usually, if you have a quad-panel view on your monitor with four 1080p cameras running at full resolution and a full 30 fps, you are probably talking an average (on the low side) of about 5 Mbps of bandwidth per camera/stream, so a quad view could be pushing about 20 Mbps of bandwidth to the client's viewing station. Now take 64 2MP cameras, all running 30 fps at full resolution in their video panels: we are now talking 5 Mbps x 64 cameras = 320 Mbps of bandwidth to the client, and handling it with an $80 video card?

Umm, huh? You must be using a smaller-resolution stream or lower bandwidth or something for the client/video card to decompress all that video.
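His sums are easy to reproduce; for comparison, here is the same arithmetic next to a typical sub-stream figure (the 0.5 Mbps sub-stream bitrate is my assumption, not a measured value):

# Aggregate client bandwidth: full main streams vs. low-res sub-streams.
cams = 64
main_mbps = 5.0   # per-camera main-stream bitrate used in the post above
sub_mbps = 0.5    # assumed typical sub-stream bitrate (illustrative only)
print(f"main streams: {cams * main_mbps:.0f} Mbps")  # 320 Mbps
print(f"sub streams:  {cams * sub_mbps:.0f} Mbps")   # 32 Mbps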


Cliff369,

Your question seems directed at me. I already explained this in my previous posts: we make use of the processing power the graphics card carries. No extra hardware of any kind, pure software.

With a US$80 card, the maximum number of 1080p channels (2MP IP cameras) we can decode at 30 FPS for real-time display is currently limited to 32. Each compressed bitstream is about 8 Mbps, and it is the main compressed stream, not the sub-stream. For 32 channels, the total bit rate to be fed from the network port comes to around 250 Mbits per second, so Gigabit LAN is a must, or the streams must come from an HDD.

I guess some others may be able to do as much as we can, but they are most likely skipping the "de-blocking" procedure, which requires a lot of processing power. The result is slightly worse video quality that you can detect on the viewing monitor; some big branded companies do not accept this kind of trick.

The number of monitors depends on the graphics card installed in the PC. For 64 channels or more, we have to use a more expensive G series card from nVidia.
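As a sanity check on the Gigabit LAN claim (the ~95% usable-throughput figure below is my assumption for protocol overhead):

# Link budget for 32 main streams at ~8 Mbps each against one GigE port.
channels = 32
mbps_per_stream = 8
total = channels * mbps_per_stream  # 256 Mbps aggregate
usable = 1000 * 0.95                # ~950 Mbps usable on GigE (assumed)
print(f"aggregate: {total} Mbps, headroom: {usable - total:.0f} Mbps")
# 64 channels (~512 Mbps) still fits on one port; around 100 channels
# (~800 Mbps) starts to crowd a single Gigabit link.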


Hi everybody,

I have trouble with a CCTV system, as follows:

I have a server (Dell T20, Intel Xeon E3-1225 v3, 8 GB RAM, 8 MB cache) managing 20 full-HD cameras. However, when I display all 20 cameras (full HD) on a monitor, the server's CPU usage hits 100%.

So, how can I reduce the CPU load?

Note: displaying all 20 cameras at full HD resolution is a compulsory requirement.

Thank you very much!


From reading the rest of the thread, you need more processing power to do that. Consider upgrading to a 4790K and a separate graphics card with some serious processing power to take the heavy lifting away from the CPU.
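To see why the box is pegged, here is the raw decode throughput that 20 full-HD feeds represent (a rough illustration; the actual CPU cost depends heavily on codec settings and the VMS):

# Raw pixel throughput for 20 x 1080p @ 30 fps, all decoded in software.
cams, w, h, fps = 20, 1920, 1080, 30
mp_per_sec = cams * w * h * fps / 1e6
print(f"{mp_per_sec:,.0f} MP/s of H.264 to decode")  # ~1,244 MP/s
# That load, on top of the VMS itself, will saturate a quad-core Xeon E3,
# which is why GPU decode offload or sub-streams are the usual fixes.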

Why not keep the Xeon E3 with some good video card?

I have been using a GeForce GTX 750 video card in the system, but CPU usage is still at 100%. So what's the solution for me?


You need more CPU power or you need to view your streams at a lower resolution.

 

Why is it essential that you view them at full resolution?


My customer requires the display at full HD. I have tried to explain it to him, but he deliberately refuses to understand. I have recommended a solution using a PC client for display, but he doesn't agree.

If you have experience with this problem, please recommend a solution for me, for example: upgrade the video card, upgrade the CPU...?



Your client has backed you into a position where you need more CPU/GPU power. You already have a socket 1150 motherboard, so upgrade to a 4790K. That will be a significant performance increase over your E3:

 

https://www.cpubenchmark.net/compare.php?cmp[]=2275&cmp[]=1993

