
I have multiple systems running Genetec as well as multiple systems running Exacq. My Exacq systems I have to touch about once a year; my Genetec systems, however, are continually dropping cameras and video feeds. I only have IQinVision and Arecont cameras on these systems.

 

Anybody else out there having major reliability issues with Genetec?


I am the integrator. I am running Genetec 5.1 with Service Pack 8. It just seems that Genetec's software architecture was designed for analog cameras. As an example, Genetec has a software limitation of 300 Mbps of processing when running a standalone Archiver, but if you want to run the Archiver and Directory on the same machine, it has a 200 Mbps limitation. This limitation was causing my video pulls from the Archiver to fail even though my hardware could handle the load: the software would abort the extraction because it pushed total processing over the limit. Since I am running the Directory and Archiver on the same server, any time my gigabit network spiked over 20% utilization, video extraction would fail. This limit covers all processing, not just the bandwidth from the cameras to the Archiver, but also from the Archiver to the guard's viewing client and to video extraction, no matter which machine the video is being pulled from.
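For what it's worth, the arithmetic lines up with that 20% figure exactly. A minimal back-of-the-envelope sketch (the constants are just the numbers quoted above, not anything official from Genetec):

# 20% utilization of a 1 Gbps link is exactly the 200 Mbps combined
# Directory + Archiver limit, which would explain why extractions
# start failing right at that spike.

GIGABIT_LINK_MBPS = 1000        # 1 GbE link capacity
COMBINED_LIMIT_MBPS = 200       # Directory + Archiver on one machine
STANDALONE_LIMIT_MBPS = 300     # standalone Archiver

def utilization_at_limit(limit_mbps, link_mbps=GIGABIT_LINK_MBPS):
    """Fraction of the link in use when total processing hits the limit."""
    return limit_mbps / link_mbps

print(utilization_at_limit(COMBINED_LIMIT_MBPS))    # 0.2 -> the 20% spike
print(utilization_at_limit(STANDALONE_LIMIT_MBPS))  # 0.3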

 

Another example is the Security Desk client's 5 GB cache limit when viewing stored video directly from tiles. The software caches the video so it can be scrolled through faster, which makes sense; however, I am running 5-megapixel Arecont cameras (which are brutal on bandwidth and storage), so the customer hits that 5 GB cache limit quickly, at which point the client simply stops allowing any more video to be viewed. After my rounds with Genetec, the only way to clear it as of now is to close the program and reopen it.
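To put a rough number on how fast that 5 GB goes, here is a quick sketch; the 15 Mbps per-camera rate and four open tiles are assumptions (actual Arecont 5 MP bitrates vary a lot with scene activity and compression):

# Rough estimate of how quickly the Security Desk playback cache fills.

CACHE_LIMIT_GB = 5
STREAM_MBPS = 15          # assumed average bitrate of one 5 MP camera
TILES_OPEN = 4            # assumed number of tiles reviewing video at once

total_mb_per_sec = TILES_OPEN * STREAM_MBPS / 8    # megabits -> megabytes
seconds_to_fill = CACHE_LIMIT_GB * 1000 / total_mb_per_sec
print(f"Cache full after ~{seconds_to_fill / 60:.0f} minutes of review")
# -> roughly 11 minutes under these assumptions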

 

The system I am referring to has only 24 physical cameras, but because the 180° panoramic cameras present each sensor as a separate feed, I have 33 video feeds.

 

I have an Exacq system running 55 Arecont 5-megapixel cameras and have no problems with it. So I was just seeing whether anybody else is experiencing these issues with Genetec.

Genetec has a software limitation of 300 Mbps of processing when running a standalone Archiver, but if you want to run the Archiver and Directory on the same machine, it has a 200 Mbps limitation.

 

That doesn't sound unreasonable. Avigilon's limit is 256 Mbps of total traffic, which includes camera downstreams plus client upstreams.


 


 

Avigilon does not limit anything.

Avigilon suggests not exceeding 256 Mbps.

By the way, Exacq suggests the same.


I serviced an Exacq system with 88 5 MP Arecont cameras on one server... to my surprise, it worked.

 

I also think Exacq's limit on the Z-series is 400 Mbps.

Avigilon does not limit anything.

Avigilon suggests not exceeding 256 Mbps.

By the way, Exacq suggests the same.

 

 

They are much smarter than I am, so I do not plan on testing that limit.


We support two SEC colleges that are running Genetec Omnicast.

One campus has 960 cameras and over 50 archivers.

The only problem we ever have is with the junk GE cameras that haven't been replaced yet.

Never any trouble with Axis or Panasonic unless it's a network issue, and with that many cameras spread out over an entire campus, those issues do come up.

That campus is in the process of migrating to Genetec Security Center.

 

The other campus is about 300 cameras and has the same issue: GE cameras take a dump and we replace them with Axis or Panasonic.

 

We have a third customer with over 1,000 cameras (all Axis) that just converted to Avigilon. They couldn't be happier.

 

The other two customers above would be on Avigilon if Avigilon supported GE cameras, but it doesn't. The one with the 960 cameras was all set to pull the trigger on replacing their existing GE cameras and going with Avigilon, but Genetec gave them a deal they couldn't refuse.


Thanks for all the great information, gentlemen. Your comments pointed me in this direction, and correct me if I'm wrong, but basically older 7200 RPM drives have a 32 MB buffer, and 32 MB = 256 Mb.

 

All data needs to go through this buffer?

 

So that is why multiple archivers are needed?

 

But... if my hard drive has a 64 MB buffer, will the VMS process 512 Mb/s?

 

Just trying to understand this mystery limitation...


We have been talking about network traffic


I understand this is said to be a network limitation, but I do not fully understand where the limitation actually exists, so the closest thing I could find was a buffering limitation. I am hoping someone can explain it in a manner I can understand.

 

So if this is a network limitation, why is my network only at 20% utilization when I run into issues?

 

I have processing speed. I have RAM available. I have network capacity available. Are we saying the software processes data with a limit smaller than the capacity of the machine and network it resides on?

 

I'm confused; please explain.


Don't confuse hard drive buffers with stream recording capability. The maximum safe bit rate a system can record at is a function of its throughput, which is determined by network throughput, CPU speed, amount of RAM, and storage throughput. Each of these components limits the maximum total system throughput, and you still have to allow overhead for ancillary functions like reading files (viewing recordings) and worst-case system degradation due to factors like drive rebuilds and slow drive response times.
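In other words, the safe recording rate is the minimum of the component throughputs minus headroom. A minimal sketch of that idea (all of the component numbers here are made up for illustration):

# The usable recording rate is bounded by the slowest component, with
# headroom reserved for playback, exports, and degraded states such as
# RAID rebuilds. Figures are hypothetical.

component_throughput_mbps = {
    "network": 1000,    # 1 GbE NIC
    "cpu": 800,         # assumed processing capacity
    "storage": 400,     # assumed sustained write rate of the array
}

HEADROOM = 0.4  # reserve 40% for playback, exports, rebuilds, etc.

def safe_recording_rate_mbps(components, headroom=HEADROOM):
    """Bottleneck throughput minus reserved headroom."""
    return min(components.values()) * (1 - headroom)

print(safe_recording_rate_mbps(component_throughput_mbps))  # 240.0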


OK, so here are some additional notes from Genetec that make me think this "throughput" limitation is a software processing limitation:

 

1. A more powerful server than the high-end specification will not necessarily increase the maximum capacity.

2. The maximum capacity of a virtual machine with the exact same specifications as the proposed "metal box" is reduced by 20% (so a 300 Mbps standalone Archiver would be rated at roughly 240 Mbps as a VM).

3. A dedicated Network Interface Card (NIC) should be assigned per instance of the Archiver Role when using virtualization

4. The virtual machine must run on Windows Server 2008 R2 / 2012 and VMware Ready hardware.

 

Since they are digging into virtualization as a solution, I can't relate this to the hardware or network limitations. Or they could just be saying that if virtualization is used, follow the above guidelines.

 

I have this question in to Genetec support; I will update the post when I get something that makes sense to me.

 

Cheers!


 

Thanks survtech! That really helps, I posted my last message before reading your explanation.


Some food for thought here.

 

I have a 50 Mbps internet connection.

My residential Avigilon server was on a 100 Mbps local network port.

Using DU Meter, my total camera recording stream is about 30 Mbps.

When I ran a speed test to the internet on this machine, I was getting 15 Mbps.

I moved the machine to a 1 Gbps port and repeated the speed test. I got the full 50 Mbps.

This suggests that performance degrades drastically well before you get close to the ceiling of the specs.
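Naively you would expect far more headroom than that. A quick sketch of expected versus observed, using the numbers above:

# A 100 Mbps link carrying ~30 Mbps of camera traffic should leave
# ~70 Mbps for a speed test, yet only 15 Mbps got through.

LINK_MBPS = 100
CAMERA_TRAFFIC_MBPS = 30
OBSERVED_SPEEDTEST_MBPS = 15

naive_headroom = LINK_MBPS - CAMERA_TRAFFIC_MBPS
print(f"Naive headroom: {naive_headroom} Mbps")  # 70 Mbps
print(f"Usable fraction: {OBSERVED_SPEEDTEST_MBPS / naive_headroom:.0%}")  # ~21%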


This theory would imply that you have a gigabit modem, right?


I don't understand how software can be the limiting factor unless it's 32-bit; then I can see it hitting a wall at about 3.2 GB of RAM. The rest can be mitigated, as new rack-mount servers can support 24 HDDs, multiple disk controllers, multiple CPU sockets each with 8-10 cores, up to 2 TB of RAM, and four-port 10 GigE NICs. Do they have an explanation of what the bottleneck is?

 

With resolution growing, and with 8 MP cameras likely to be the norm alongside 4K TVs and monitors within two to three years, 250 Mbps is just an unrealistic bottleneck. Maybe per GigE NIC it's reasonable, but not per server.
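For scale, here is what a 250 Mbps per-server ceiling buys you under a few assumed per-camera bitrates (rough guesses, not vendor figures; real rates depend heavily on codec, frame rate, and scene):

# Camera count that fits under a 250 Mbps per-server ceiling.

SERVER_LIMIT_MBPS = 250

assumed_bitrates_mbps = {
    "1080p H.264": 6,
    "5 MP H.264": 15,
    "8 MP (4K) H.264": 25,
}

for label, rate in assumed_bitrates_mbps.items():
    print(f"{label}: {SERVER_LIMIT_MBPS // rate} cameras max")
# 1080p: 41, 5 MP: 16, 8 MP: 10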


 

I completely agree. I have been trying to resolve this issue with Genetec, and it is starting to feel like a sales gimmick put in place to sell more software.

 

Here is the question I asked Genetec...

 

Support,

I was hoping to get an explanation I can understand of the throughput limitation on what Genetec can process. I know the recommended throughput for 5.1 and 5.2 can be up to 300 cameras or 300 Mbps on a standalone Archiver, which goes down to 100 cameras or 200 Mbps on a machine running both the Directory and Archiver. So my question is: where does this limitation actually exist? Is this a limit on how much information the software can actually process?

When this limitation hits, I have processing power on the machine and availability on the network, with average utilization around 20% when the limit is reached (Directory and Archiver). So does that lead back to a limit on what the software can process? Where is the actual choke point?

Thanks!

Ryan

 

Here is the response...

 

Hello Ryan,

From experience we always recommend 300 Mbps, but that number can be surpassed in some cases depending on your setup, hardware, disks, and so on... That's why we have to mention 300 Mbps in our documentation.

 

We would advise contacting your Sales Engineer when it comes to calculating the maximum at which your system will stay stable, depending on your setup.

Thank you,


Running a system that close to its limits is asking for problems.

 

One thing I've run into is that when a RAID array is rebuilding, disk throughput drops significantly.

 

Being at a point where something like adding one more client (especially mobile, with transcoding) or other minor changes (frame rate, etc.) can take your whole system down is not a position you should be in.
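One way to frame that sizing rule as a sketch; the rebuild penalty and throughput figures are assumptions you would measure on your own array:

# Size against worst-case (degraded) throughput, not the healthy number.

HEALTHY_WRITE_MBPS = 400     # assumed, measured with all drives online
REBUILD_PENALTY = 0.5        # assume throughput halves during a rebuild
RECORDING_RATE_MBPS = 180    # assumed total camera ingest

degraded_mbps = HEALTHY_WRITE_MBPS * (1 - REBUILD_PENALTY)
if RECORDING_RATE_MBPS > degraded_mbps:
    print("Undersized: a rebuild will drop frames")
else:
    print(f"OK: {degraded_mbps - RECORDING_RATE_MBPS:.0f} Mbps spare during rebuild")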


 

 

What RAID cards are you using or recommend?


I started out with HighPoint a few years back; I wouldn't even recommend them to an enemy now (arrays and volumes disappearing, unrecoverable, without any log messages at all, etc.).

 

I've been using Areca 1882ix-series cards for a while now, and they have been working well.

 

I've heard reasonably good things about Adaptec, too, but I haven't tried them yet.

