What gauge wire/cord do I need for these outdoor lamps?


I have two 700 watt outdoor halogen lamps, each connected to a splitter, then back to a power source. Given the distances illustrated below, what gauge cord should I use?

(Diagram: two lamp runs from a splitter, plus a home run back to the power source, with distances.)

Best Answer

To determine what gauge wire you need, you need to know two things: how much current the load draws, and how much current each wire gauge can safely carry (its ampacity).

Ampacity

You can find ampacity charts online. These charts also show resistance per 1000 feet, which we'll need for the voltage drop calculations below.

(Ampacity chart: rated current and resistance per 1000 ft by AWG size)

There's another factor you must consider, which is voltage drop. Long lengths of wire will have an associated resistance (because copper is not a superconductor), so you will need to take into account what that resistance is (perhaps even using a larger wire to accommodate it if necessary). This resistance means that the load won't receive the full voltage supplied at the other end of the cord; this is also known as the "voltage drop".
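If you want to estimate a run yourself, here is a minimal Python sketch of that idea. The 2.525 Ω/1000 ft figure is the 14 AWG resistance from the chart above; the 100 ft length and 5.83 A current are example values matching the calculations below:

```python
# Round-trip resistance of a copper run: current flows out on the line
# conductor and back on the neutral, so the effective length doubles.
def wire_resistance(one_way_feet, ohms_per_1000ft):
    return 2 * one_way_feet * ohms_per_1000ft / 1000

# Voltage lost across the wire at a given load current (Ohm's law: E = I * R).
def voltage_drop(current_amps, resistance_ohms):
    return current_amps * resistance_ohms

# Example: 100 ft of 14 AWG (2.525 ohms per 1000 ft) carrying 5.83 A.
r = wire_resistance(100, 2.525)           # ~0.51 ohms
print(round(voltage_drop(5.83, r), 2))    # ~2.94 V lost in the cord
```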

Ohm's Law

So, let's first calculate how much current is needed. Ohm's law, combined with the power formula, lets you determine voltage (volts), current (amperes), resistance (ohms), or power (watts) from any two known values.

(Ohm's law formula wheel)

The splitter will connect both lamps in parallel, giving each 120 volts. You know that they are 700 watts, so you can find current by using the formula I = P/E.

I = 700 / 120 = 5.83 A

The ampacity chart shows that you would need at least 14 AWG (for power transmission) for the connections between your splitter and the lamps.

The current from the splitter to the source would of course be double (11.7 A), so you would need at least 10 AWG wire (11 AWG is uncommon, so I picked the next larger size).
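As a sanity check, here is the same arithmetic in a short Python sketch. The 16 and 10 AWG ratings are the ones quoted later in this answer; the 14 and 12 AWG figures (5.9 A and 9.3 A) are typical chart values, so verify them against your own chart:

```python
# Power-transmission ampacity by AWG size (amps). The 16 and 10 AWG values
# are quoted in this answer; 14 and 12 AWG are typical chart values.
AMPACITY = {16: 3.7, 14: 5.9, 12: 9.3, 10: 15.0}

def current(power_watts, volts=120):
    return power_watts / volts            # I = P / E

def smallest_gauge(amps):
    # Largest AWG number = thinnest wire that can still carry the load.
    ok = [awg for awg, rating in AMPACITY.items() if rating >= amps]
    return max(ok) if ok else None

per_lamp = current(700)                   # ~5.83 A per lamp
total = 2 * per_lamp                      # ~11.7 A through the home run
print(smallest_gauge(per_lamp))           # 14 (lamp runs)
print(smallest_gauge(total))              # 10 (home run)
```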

The Lamp Runs

In a 100 ft run of 14 AWG wire, you would have an additional resistance of 2 * 100 ft * 2.525Ω/1000 ft, or about 0.51Ω. (Remember the length of wire is actually double: one conductor for line and one for neutral.) You can calculate the voltage drop by treating the circuit as two series resistors, where the lamp is one resistor and the wire is the other, then using Ohm's law to determine the voltage across each. The lamp's resistance is (R = E^2 / P):

R = 120^2 / 700 = 20.6Ω

The total resistance (lamp plus wire) is:

20.6 + 0.51 = 21.1Ω

Now the total current with 120V applied is (I = E/R):

I = 120 / 21.1 = 5.7 A

This is less than the original current (5.83 A) because with more resistance in the circuit, less current can flow. You can also determine the voltage drop and how much power the wire itself dissipates (E = R * I, P = R * I^2): about 2.9 volts dropped and 16.6 watts dissipated. That isn't much (less than 3% of the supply voltage), so you could just use 14 AWG for these runs. "Upgrading" to thicker wire would present slightly less resistance, but the benefit would not outweigh the added cost of more expensive wire.
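Here is that series-circuit math as a Python sketch (again assuming the 100 ft run; tiny differences from the text come from rounding):

```python
# One lamp run modeled as two series resistances: the wire and the lamp.
SUPPLY = 120.0                       # volts
R_LAMP = SUPPLY ** 2 / 700           # ~20.6 ohms (R = E^2 / P)
R_WIRE = 2 * 100 * 2.525 / 1000      # ~0.51 ohms (100 ft of 14 AWG, doubled)

i = SUPPLY / (R_LAMP + R_WIRE)       # ~5.69 A through the run
drop = i * R_WIRE                    # ~2.9 V lost in the wire (E = R * I)
wasted = i ** 2 * R_WIRE             # ~16.4 W heating the wire (P = R * I^2)
print(round(i, 2), round(drop, 1), round(wasted, 1))
```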

The Home Run

Doing the math on the "home run" depends on what was selected for the runs to each lamp. We'll assume that you stuck with the 14 AWG for now, so the total current is 5.7 * 2 = 11.4 A. (We can treat the splitter and downstream wires and lamps as a load of 1368 watts, or a resistance of 10.5Ω.)

For the home run in 10 AWG wire (about 1Ω per 1000 ft), the doubled run length adds 2 * 100 ft * 1Ω/1000 ft, or about 0.2Ω. The total resistance is:

10.5 + 0.2 = 10.7Ω

The total current would be:

120 / 10.7 = 11.2 A

Why does the current keep dropping each time we calculate it? Because the wire resistance limits how much current can flow, just like any other resistor in the circuit. Two 700 watt lamps fed by superconducting cables would pull 5.83 A each, or 11.7 A in total; with the extra wire resistance, the whole configuration pulls about 0.5 A less.
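And the home run as a sketch, treating everything past the splitter as one equivalent load (same 100 ft assumption, with 10 AWG taken as roughly 1Ω per 1000 ft):

```python
# Everything downstream of the splitter collapses to one equivalent load.
SUPPLY = 120.0
R_LOAD = SUPPLY / 11.4               # ~10.5 ohms (two lamp runs in parallel)
R_HOME = 2 * 100 * 1.0 / 1000        # ~0.2 ohms (100 ft of 10 AWG, doubled)

i = SUPPLY / (R_LOAD + R_HOME)       # ~11.2 A through the home run
print(round(i, 1))                   # comfortably under 10 AWG's 15 A rating
```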

Because 10 AWG is rated for up to 15 A power transmission, it is sufficient.

If you were to use a cheap 16 AWG extension cord for the home run, rated for only 3.7 A, you would run into trouble.