Electrical – LED landscape lighting. Adding one more fixture causes all of them to strobe

electrical

I have installed outdoor landscape lighting. My transformer is 300 watts and I have only used 200 LED watts so far. When I go to add one more light, they all start to do a strobe-like effect. I remove it and the others go back to normal. I am working with 150 feet of 12/2 wire. Help please!

Best Answer

You are overloading the power supply. The problem is that low-voltage systems are susceptible to voltage drop over long wire runs, and your LED lights compensate for that drop by drawing more current.

The fact that you said "transformer" tells me you're dealing with a low-voltage system. It's actually an electronic power supply; you can tell because it's crowbarring: a protective circuit shuts it down on overload and then tries to restart, over and over. That is the blinking. These systems are generally either 12V or 24V. They are rated in watts (300W), but to understand the problem we need to work in amps. With 300W, a 12V supply has 25 amps of capacity; a 24V supply has 12.5 amps.

Your 200W of load is 16.7A at 12V and 8.3 amps at 24V, if voltage drop down the wires were not a factor. So let's look at the numbers, assuming your lights are on average 100 feet down the wire. (You can work it out individually per lamp, but it gets recursively complicated, because these loads draw more current at lower voltages.)
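For reference, here's that arithmetic as a quick sketch (plain I = P/V, using the wattages from this answer):

```python
# Convert the wattage ratings into amps (I = P / V).
# Numbers are the ones from this answer: 300 W supply, 200 W of LED load.
SUPPLY_W = 300
LOAD_W = 200

for volts in (12, 24):
    supply_amps = SUPPLY_W / volts
    load_amps = LOAD_W / volts
    print(f"{volts}V system: supply limit {supply_amps:.1f}A, "
          f"load {load_amps:.1f}A (ignoring voltage drop)")

# 12V system: supply limit 25.0A, load 16.7A (ignoring voltage drop)
# 24V system: supply limit 12.5A, load 8.3A (ignoring voltage drop)
```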

At 12 volts

If it flowed 16.66 amps, your average 100' of 12 AWG wire (round trip) will have about 5.8 volts of voltage drop (when using a voltage drop calculator, insert a big number for percentage loss permitted, otherwise it will tell you to use very big wire). At 240V, nobody cares about 5.8 volts. But at 12V it slaughters you, leaving only 6.2 volts for the light. Now remember: the LEDs will increase their current draw to still hit their 200W design target. 200 watts at 6.2 volts is about 32 amps. This far exceeds the 25A limit of the 300W power supply.
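Here's a rough sketch of that calculation. The 12 AWG resistance figure is an assumption (about 1.74 ohms per 1000 feet) picked to match the ~5.8 volt drop above; published tables and calculators vary a little with temperature and stranding.

```python
# Sketch of the 12 V case. Wire resistance is an assumed round number.
OHMS_PER_1000FT_12AWG = 1.74
ROUND_TRIP_FT = 200          # lights averaged at 100 ft out, so 200 ft of wire
LOAD_W = 200
SUPPLY_V = 12

wire_ohms = OHMS_PER_1000FT_12AWG * ROUND_TRIP_FT / 1000   # ~0.35 ohm
amps_nominal = LOAD_W / SUPPLY_V                            # 16.7 A
drop = amps_nominal * wire_ohms                             # ~5.8 V
volts_at_lamp = SUPPLY_V - drop                             # ~6.2 V
amps_needed = LOAD_W / volts_at_lamp                        # ~32 A

print(f"drop {drop:.1f} V, lamp sees {volts_at_lamp:.1f} V, "
      f"LEDs try to pull {amps_needed:.1f} A")   # well past the 25 A limit
```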

Adding that last lamp on the string, especially at the far end where the voltage drop is the worst, was the straw that broke the camel's back.

At 24 volts

If it flowed 8.33 amps, your average 100' of 12 AWG wire (round trip) will have 2.88 volts of drop, reducing the voltage to 21.1. Your lamps will increase their current draw in proportion, drawing 200W/21.1V or 9.47A. But at this current flow, we must recalculate, and we find the voltage drop will be 3.3 volts. 200 watts at 20.7 volts is actually 9.66 amps. A few more cycles of recalculation and we settle at roughly 9.7 amps at about 20.6 volts. The 300W power supply's capacity is 12.5 amps. This is workable at 24 volts, but you are near the practical limit of that wire run. You are wasting roughly 15% of your power making the wire warm, but it would be silly to run expensive 8 or 6 AWG wire.
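If you want to see that recalculation run to convergence, here's the same back-of-the-envelope loop (same assumed wire resistance as the 12V sketch above):

```python
# Same sketch at 24 V, but run the "recalculate" loop until it settles.
OHMS_PER_1000FT_12AWG = 1.74                     # assumed, as above
wire_ohms = OHMS_PER_1000FT_12AWG * 200 / 1000   # 100 ft out, round trip
LOAD_W = 200
SUPPLY_V = 24

volts = SUPPLY_V
for step in range(6):                  # converges in a handful of passes
    amps = LOAD_W / volts              # constant-power load draws more amps
    volts = SUPPLY_V - amps * wire_ohms
    print(f"pass {step + 1}: {amps:.2f} A, {volts:.2f} V at the lamps")

# pass 1: 8.33 A, 21.10 V ... settling near 9.7 A and 20.6 V,
# comfortably under the supply's 12.5 A limit.
```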

How to fix it

You never got back to us about the model of lamp, but many LEDs (especially the kind that increase their current draw in this way) will work on both 12V and 24V. That means the problem could be solved as easily as switching to a 24V power supply.

Another way would be to run 120V down the wire. At that voltage, 200W is only 1.67A, which will transmit easily down 150' of 12 AWG. Then, have local 12/24V supplies at each lamp or group of lamps. 12 or 14 AWG is fine for 12V if it isn't going very far.
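For a rough sense of why 120V transmits so easily, here's the same assumed wire resistance as above, now over the full 150 feet:

```python
# Rough check of the 120 V idea: same 200 W load, same assumed wire
# resistance, but now 150 ft out (300 ft round trip).
OHMS_PER_1000FT_12AWG = 1.74
wire_ohms = OHMS_PER_1000FT_12AWG * 300 / 1000   # ~0.52 ohm
amps = 200 / 120                                  # ~1.7 A
drop = amps * wire_ohms                           # ~0.9 V
print(f"{amps:.2f} A, only {drop:.2f} V of drop on a 120 V feed")
```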

P.S. If you think this is tough, try the guy who puts his 900-watt windmill 500' from his house and wants to make 12V to charge batteries. 900W at 12V is 75 amps. Go punch that into a voltage drop calculator and see what happens. Upshot: they don't make wire thick enough! 12V is really hard to carry any distance. That's why AC won the war of the currents.
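If you do punch it in, it looks roughly like this (the resistance figures here, especially the 4/0 AWG one, are approximate):

```python
# The windmill P.S.: 900 W at 12 V, 500 ft away = 1000 ft of wire round trip.
# Resistance figures are approximate (ohms per 1000 ft of copper).
amps = 900 / 12                                   # 75 A
for wire, ohms_per_kft in (("12 AWG", 1.74), ("4/0 AWG", 0.05)):
    drop = amps * ohms_per_kft                    # 1000 ft round trip
    print(f"{wire}: about {drop:.0f} V of drop on a 12 V system")

# 12 AWG: ~130 V of "drop", i.e. the current simply won't flow.
# 4/0 AWG: ~4 V of drop, a third of your 12 V gone, in welding-cable-size wire.
```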