My question is pretty simple. I've seen several forums suggest that the maximum power drawn from any outlet should be 80% of what the circuit is rated for (so, for example, on a 15 A 120 V circuit, you shouldn't draw more than 80% x 15 A x 120 V = 1440 watts).
Is this really the case? I ask because in my situation, I have a 240 volt outlet running on a 30 ampere dedicated circuit where I want to run some equipment continuously. I am certain that there will be no other equipment running on that circuit. Given this, should I expect to only run 80% x 30 A x 240 V = 5,760 watts on that circuit, or may I use the full 7,200 watts continuously?
Best Answer
Use the 80% rule for continuous loads (because the breaker will, even if you don't)
While 210.22 would seem to indicate that you have the full 30 A available to you, you have to consider that the breaker may have other ideas: 210.20(A) requires the overcurrent device on a circuit serving continuous loads to be rated at no less than 125% of the continuous load, which is the same 80% limit approached from the other direction.
The reason is that garden-variety breakers made to UL 489 (and their counterparts in fuse-land) will eventually trip (or blow) if you run 100% of their rated current through them for hours on end. There is such a thing as a 100%-rated breaker, but they're typically only found in industrial work.
Furthermore, the branch-circuit wiring needs the same 80% derate for continuous loads, per 210.19(A)(1), which requires the conductors to have an ampacity of at least 125% of the continuous load.
So, you're limited to 5760 W for an on-all-the-time load (the NEC treats any load expected to run at maximum current for 3 hours or more as continuous). Non-continuous loads (say, a large well pump motor), though, can pull the full 7200 W from the circuit, as the duty cycle of the load provides adequate time for things to cool off between runs.
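To make the arithmetic above concrete, here's a minimal sketch of the 80% continuous-load calculation; the function name is just illustrative, not from any standard or code library:

```python
def max_continuous_watts(breaker_amps: float, volts: float) -> float:
    """Continuous-load limit: 80% of the circuit's full rating.

    This is the load-side view of the NEC's 125% sizing rule for
    continuous loads (1 / 1.25 = 0.8).
    """
    return 0.8 * breaker_amps * volts

# 15 A / 120 V receptacle circuit from the question:
print(max_continuous_watts(15, 120))  # 1440.0
# 30 A / 240 V dedicated circuit:
print(max_continuous_watts(30, 240))  # 5760.0
```

Non-continuous loads skip the 0.8 factor and can use the full `breaker_amps * volts` (7200 W here).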