I would like to know how I should adjust the supply voltage to account for the distance to my camera.

 

For example, if the camera is located 250m away and its power consumption is 4.7W, how do I calculate, based on 200mA, 300mA, or 500mA, the voltage required at the field end?

 

Does anyone know? Please help.


Every device has an operating voltage range. For the purpose of our discussion, I will assume the device requires 12VDC. Some companies state 12VDC ±10% in the equipment specifications, for example; the specifications may instead state 12VDC nominal with an operating voltage range of 9 to 15VDC. It's always important to obtain the operating voltage range from the OEM. I often deploy battery-operated systems, so the range of operating voltages matters to me. You can also bench-test the device with a variable DC power supply to find the cutoff point: set the voltage to 12VDC and turn it down until the camera stops working. I also place an ammeter in series with the load to observe current draw during the test. Keep in mind that devices typically have internal power-supply regulators, so the current draw will increase just before cutoff.
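For what it's worth, here is a tiny Python sketch of that "nominal ± tolerance" idea, and of why I watch the ammeter near the low end: a regulated load draws roughly constant power, so its current is highest at the minimum input voltage. The 12VDC ±10% and 4.7W figures are just the example numbers from this thread.

```python
# Sketch only: derive the acceptable voltage window from a spec like
# "12VDC +/- 10%", and the worst-case current of a regulated load.

def voltage_window(nominal_v: float, tolerance: float) -> tuple[float, float]:
    """Return (min_v, max_v) for a nominal voltage and a fractional tolerance."""
    return nominal_v * (1 - tolerance), nominal_v * (1 + tolerance)

def worst_case_current(power_w: float, min_v: float) -> float:
    """Regulated loads draw roughly constant power, so current peaks at the
    lowest allowed input voltage: I = P / V_min."""
    return power_w / min_v

lo, hi = voltage_window(12.0, 0.10)   # 10.8 V to 13.2 V at the camera
print(f"acceptable at camera: {lo:.1f} - {hi:.1f} V")
print(f"worst-case draw of a 4.7 W camera: {worst_case_current(4.7, lo):.2f} A")
```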

 

That being said, if you are using a constant power source the job is much easier. For your application, I will assume 12VDC ±10%, which gives us a voltage range at the camera of 10.8 to 13.2VDC. Converting 250m to feet gives approximately 820 ft. The power consumed by the load (camera) is 4.7W. From the power equation, I = P/V, so at our nominal 12VDC operating voltage the camera draws about 390mA, or 0.39A.

Now it's a game of wire size vs. power supply output voltage. The wire conductors connecting the voltage source to the load are part of the circuit and must be factored in. Wire is rated by American Wire Gauge (AWG), and manufacturers provide charts of AWG size versus ohms per 1000 ft (or per 1000 m). That figure is for one conductor only, and a complete circuit uses two conductors, so we double it. This is called loop resistance, and we need to account for it in our calculations.

Let's say you're considering Siamese cable with stranded 18/2. From the OEM cable chart, that is 6.9Ω per 1000 ft. By ratio, 6.9/1000 = x/820, so x, the resistance of one 820 ft conductor, is about 5.7Ω. For loop resistance we double this, for a total of about 11.3Ω. The load and the wire conductors form a series circuit, so the voltage drop in the cable is Vr = I × R: 0.39A × 11.3Ω ≈ 4.4 volts (ouch). If we had to use this cable, we would need a 16.4VDC power supply (12 + 4.4 = 16.4) to have 12VDC at the load.

You can start to see how you need to play with AWG wire sizes and power supply voltages to reach a final solution. In the case above, using 16AWG at about 4.4Ω per 1000 ft per conductor: 4.4/1000 = x/820 gives about 3.6Ω per conductor, or about 7.2Ω of loop resistance. Then 7.2Ω × 0.39A ≈ 2.8V of drop, so the power supply would need to put out about 12V + 2.8V = 14.8V. Many power supplies have output adjustments, so we could trim the voltage as needed. Also stay within 75 to 80% of the maximum rated current or power of the power supply. I hope this helps.
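In case it helps, here is a rough Python sketch of the same arithmetic so you can try different gauges quickly. The ohms-per-1000 ft figures are assumptions based on typical stranded-copper tables; substitute the values from your cable manufacturer's chart.

```python
# Minimal sketch of the voltage-drop math above, assuming a 12VDC nominal
# camera drawing 4.7W at the end of a 250m (~820 ft) run.

CAMERA_POWER_W = 4.7          # camera load
NOMINAL_V = 12.0              # nominal operating voltage at the camera
RUN_FT = 820.0                # one-way cable length (250 m ~= 820 ft)

# Approximate ohms per 1000 ft for ONE stranded conductor (assumed values;
# check the OEM cable chart for the real numbers).
OHMS_PER_1000FT = {18: 6.9, 16: 4.4, 14: 2.7, 12: 1.7}

load_current_a = CAMERA_POWER_W / NOMINAL_V   # I = P / V ~= 0.39 A

for awg, ohms_per_kft in OHMS_PER_1000FT.items():
    one_way_ohms = ohms_per_kft * RUN_FT / 1000.0
    loop_ohms = 2 * one_way_ohms              # out and back = loop resistance
    drop_v = load_current_a * loop_ohms       # Vr = I x R across the cable
    supply_v = NOMINAL_V + drop_v             # voltage needed at the head end
    print(f"{awg} AWG: loop {loop_ohms:.1f} ohm, "
          f"drop {drop_v:.1f} V, supply {supply_v:.1f} V")
```

Running it reproduces the 18/2 case above (about 11.3Ω loop, 4.4V of drop, 16.4V supply) and shows how quickly heavier gauge brings the required supply voltage back toward 12V.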

 

Normic

