Electrical Power Loss Over Distance – Causes and Solutions

Tags: circuit-breaker, electrical, power-tools, voltage

I am working on a project where I will install an electrical service by my property entrance, then I have to run a wire at least 700 feet. I want to deliver 240 VAC, and I know that voltage drop is a thing.

I've calculated 2 AWG aluminum wire at 850' carrying 200 amps, as if I wanted to power a table saw at the far end, for example. The voltage drop leaves me with about 156 VAC at the end.
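A rough check of that drop can be scripted. The resistance figure below (~0.26 ohms per 1000 ft for 2 AWG aluminum) is an assumed value from standard wire tables; verify it against your actual cable's spec sheet.

```python
# Assumed value: ~0.26 ohms per 1000 ft for 2 AWG aluminum conductor
# (typical wire-table figure; check your cable's datasheet).
OHMS_PER_KFT_AL_2AWG = 0.26

def voltage_drop(length_ft, amps, ohms_per_kft):
    """Drop over a single run: current flows out and back,
    so the resistive path is twice the one-way length."""
    resistance = 2 * length_ft / 1000 * ohms_per_kft
    return amps * resistance

drop = voltage_drop(850, 200, OHMS_PER_KFT_AL_2AWG)
print(f"Drop: {drop:.0f} V, leaving {240 - drop:.0f} V at the load")
```

With these assumed numbers the drop comes out near 88 V, in the same ballpark as the figures quoted in the question.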

My main question is: how can I convert that 151 VAC to 120 VAC?

FYI, I will actually install an outlet at the end of the 850' run and power things overnight. A generator won't be an option.

PS: I have made major edits. I am not going to use copper wire, and I will supply 240 VAC at 200 amps instead of 120.

Best Answer

You've touched on the reason why electric utilities use high voltage for their distribution lines and also why you see transformers either on poles at your location or on the ground in a (usually) green box.

They do this to avoid power loss in the wires, which increases with the square of the current. Mathematically, you have:

P = I^2 * R, where P is power in watts, I is current in amps, and R is resistance in ohms.

Let's take two hypothetical examples that demonstrate this:

  1. You want to transmit 1800W, which is 15A at 120V, through a distance of 500' using a wire that has a resistance of 1 ohm. The power lost to heat in the wires in this case will be 15^2 * 1 = 225W. So you put 1800W in at the source but you only get 1575W at the load.

  2. Now say you bump up the voltage to 480V. Your 1800W only requires 3.75A at that voltage. So the power lost in the wires in this case is 3.75^2 * 1 ≈ 14W. You save about 211W by using the higher voltage.
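The two cases above can be checked with a few lines of Python (the 1-ohm line resistance is the hypothetical figure from the examples):

```python
def line_loss(power_w, volts, line_resistance_ohms):
    """I^2 * R heat loss in the wire for a given load power and supply voltage."""
    current = power_w / volts
    return current ** 2 * line_resistance_ohms

loss_120 = line_loss(1800, 120, 1.0)   # 15 A through 1 ohm
loss_480 = line_loss(1800, 480, 1.0)   # 3.75 A through the same wire
print(f"At 120 V: {loss_120:.0f} W lost; at 480 V: {loss_480:.1f} W lost")
print(f"Savings from the higher voltage: {loss_120 - loss_480:.0f} W")
```

Quadrupling the voltage cuts the current to a quarter, so the wire loss falls by a factor of sixteen.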

For consumers this usually doesn't play out all that often, although it is one reason why high-power appliances like water heaters, HVAC systems, ovens, and ranges almost always use 240V rather than 120V. The current requirement is halved at the higher voltage.

Consider that the power utilities are transporting megawatts of power, often at very high voltages. The long-distance transmission towers you see may operate in the range of 750 kV to 1,000,000 volts.

Now there are some other considerations. Transformers are not lossless, so you lose some power, again as heat, when you step the voltage up and then back down. But the overall savings from using smaller, cheaper wire, plus the lower long-term operating costs from reduced power losses, may make it worthwhile.
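As a rough illustration of that trade-off: the sketch below compares sending the same 1800 W down the 1-ohm line directly at 120 V versus stepping up to 480 V and back down. The 98% transformer efficiency is an assumed ballpark figure, not a spec; real units vary.

```python
# Hypothetical numbers: 1800 W source, 1-ohm line, transformers
# assumed 98% efficient each way (a typical ballpark, not a spec).
XFMR_EFF = 0.98
LINE_R = 1.0
SOURCE_W = 1800.0

def delivered_direct(volts):
    """Push SOURCE_W down the line at the given voltage, no transformers."""
    current = SOURCE_W / volts
    return SOURCE_W - current ** 2 * LINE_R

def delivered_via_transformers(high_volts):
    """Step up, cross the same line at high voltage, step back down."""
    after_step_up = SOURCE_W * XFMR_EFF        # heat lost in step-up transformer
    current = after_step_up / high_volts
    after_line = after_step_up - current ** 2 * LINE_R
    return after_line * XFMR_EFF               # heat lost in step-down transformer

print(f"Direct at 120 V: {delivered_direct(120):.0f} W delivered")
print(f"Via 480 V with transformers: {delivered_via_transformers(480):.0f} W delivered")
```

Even after paying the transformer losses twice, the high-voltage path delivers noticeably more power in this hypothetical setup.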

As noted above, 2 AWG copper wire is quite expensive. For longer distances aluminum wire is usually a much better value, although you will typically need to go up a size or two (a lower AWG number) to get the same performance.

By the way, you mentioned "amp loss" in the wiring. That's not really a thing. Amps are not lost: for every amp you put in one end of the wire, you get a corresponding amp out the other end. What you lose is volts, and that's where the power loss comes in.
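A quick series-circuit sketch of that point: the same current flows through the wire and the load, and the source voltage divides between them in proportion to their resistances (the 7-ohm load is a made-up value for illustration).

```python
# Series circuit: source -> line resistance -> load resistance.
SOURCE_V = 120.0
LINE_R = 1.0   # wire resistance from the earlier examples
LOAD_R = 7.0   # hypothetical load resistance

current = SOURCE_V / (LINE_R + LOAD_R)   # same current everywhere in the loop
v_line = current * LINE_R                # volts "lost" across the wire
v_load = current * LOAD_R                # volts left for the load
print(f"Current: {current:.1f} A everywhere in the loop")
print(f"{v_line:.1f} V dropped in the wire, {v_load:.1f} V at the load")
```

The two voltage drops always sum back to the source voltage; no current disappears anywhere.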