
Cable Deration vs. 125%


nightfox1925 (Electrical)
Hi electrical folks. I am having a discussion with one of my colleagues regarding cable sizing, and we are trying to resolve some technical differences.

Per the NEC, the conductors shall be sized not less than 125% of the full load current. In my way of sizing, I have maintained the 125% compliance after derating. That means I select a cable whose ampacity, after multiplying by the derating factor, is still higher than 125% x FLA. For example, assume FLA = 30 A and my overall derating (temperature + grouping) = 0.50; my required cable ampacity is 30 x 1.25 = 37.5 A. If I choose, say, #6 AWG (ampacity = 65 A), then applying the 0.5 derating to it results in 65 x 0.5 = 32.5 A, which is less than 37.5 A, i.e., it fails the 125% rule. Hence, #4 AWG (ampacity = 85 A) is selected, which gives a derated ampacity of 85 x 0.5 = 42.5 A.

On the other hand, my colleague has another way:

Required ampacity = 30 / 0.5 = 60 A; with this, he may choose #6 AWG, which is rated 65 A.

However, this will not satisfy the 125% rule, since with this size the cable is (60/65) x 100% = 92% loaded, which is less than the 125% rule requires.
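To lay the two calculations side by side, here is a rough sketch in Python (the #6/#4 ampacities are just the table values quoted above; confirm them against the applicable NEC table for your installation):

```python
# Rough sketch of the two sizing approaches described above.
# Ampacities are the illustrative figures quoted in this post.
FLA = 30.0                      # full load current, A
DERATE = 0.50                   # combined temperature + grouping factor
ampacity = {"#6 AWG": 65.0, "#4 AWG": 85.0}

# My method: derated ampacity must still exceed 125% of FLA
required_mine = 1.25 * FLA      # 37.5 A
for size, amp in ampacity.items():
    print(f"mine: {size}: {amp * DERATE:.1f} A derated "
          f"{'>=' if amp * DERATE >= required_mine else '<'} {required_mine} A")

# Colleague's method: table ampacity must exceed FLA divided by the derating factor
required_his = FLA / DERATE     # 60.0 A
for size, amp in ampacity.items():
    print(f"his:  {size}: {amp:.0f} A table "
          f"{'>=' if amp >= required_his else '<'} {required_his} A")
```

Under my reading only #4 AWG passes; under his, #6 AWG already passes.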

If I am not in violation of the code, then I would prefer the #6 AWG, which costs less. But my question is: must the 125% rule be maintained after conductor derating? And what is the significance of this 125% rule, for my understanding?

Thank you.



 

A feeder must be sized for 125% of the continuous load plus 100% of the non-continuous load. This is after any other required de-rating factors have been applied.

The (main) reason for the 125% requirement is that the circuit breaker used to protect the feeder is generally capable of carrying only 80% of its rating on a continuous basis. For example, a 100 A breaker can only carry a continuous load of 80 A. So for a continuous load of 100 A, the breaker rating must be 100 / 0.8 = 125 A = 100 A x 1.25. (In the NEC, "continuous" is defined as a load lasting for three hours or more.)

So if you have to de-rate for the number of conductors in a conduit, ambient temperature, etc., the final calculated conductor rating must still satisfy the 125% (of continuous load) requirement.
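As a quick numerical check of that 80% / 125% relationship (just the arithmetic above, nothing more):

```python
# The 125% factor is simply the reciprocal of the 80% continuous capability.
continuous_load = 100.0                     # A, load lasting 3 hours or more
min_breaker = continuous_load / 0.80        # sized so 80% of the breaker covers the load
print(min_breaker)                          # 125.0 A, i.e. 100 A x 1.25
assert abs(min_breaker - 1.25 * continuous_load) < 1e-9   # 1/0.8 is 1.25
```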

 
Given this rule, if I were to choose #6 AWG with an ampacity of 65 A, applying a derating of 0.5 to it would give me 32.5 A.

The breaker rating for a 30 A load is 1.25 x 30 A = 37.5 A, so I will choose a 100AF/40AT, 3P MCCB. The maximum continuous load for this breaker will then be 0.8 x 40 = 32 A, which is less than the 32.5 A derated conductor ampacity.

Obviously, the cable size, after applying the 0.5 derating, does not comply with or maintain the 125% rule. Is the 125% rule on the breaker carried over to the conductor sizing so that the resulting conductor ampacity will always be higher than the breaker trip rating?

If the derated ampacity is 32.5 A and, for some reason, the load current goes up to 34 A, will the breaker with a trip rating of 40 A, but which can only carry a continuous load of 32 A, trip through its O/L element? Will an 80%-rated breaker of 40 AT fail to protect the 32.5 A derated conductor against overloads from 32.5 A up to 39 A?
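Putting those numbers side by side (a small sketch of the figures above, assuming a standard, non-100%-rated breaker):

```python
# Figures from the example above: 40 AT breaker protecting #6 AWG at 0.5 derating.
FLA = 30.0
breaker_trip = 40.0                          # chosen above 1.25 x 30 = 37.5 A
continuous_capability = 0.80 * breaker_trip  # 32.0 A for a standard-rated breaker
derated_conductor = 65.0 * 0.50              # 32.5 A for #6 AWG

print(continuous_capability, derated_conductor)
# 32.0 vs 32.5 -- the derated conductor just clears the breaker's continuous
# capability, but a 34 A overload still sits below the 40 A trip setting.
```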

I apologize, but I am still getting mixed up with the logic of maintaining 125% in addition to the derating factors in cable sizing, as it relates directly to the breaker capability.

 
You also need to look at NEC Article 240. You are allowed to use the next larger size breaker (above the conductor ampacity). This applies up to 800 A breakers. So if you calculate a derated ampacity of 32.5 A, you can use a 35 A breaker, but not a 40 A. Article 240 also lists the recognized standard sizes.
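A rough sketch of that next-size-up selection, assuming the standard ratings listed in NEC 240.6(A) up to 800 A (verify the list against the code edition in use):

```python
# Standard breaker ratings per NEC 240.6(A), up to the 800 A limit of the
# next-size-up allowance in 240.4(B) -- confirm against the current code.
STANDARD_RATINGS = [15, 20, 25, 30, 35, 40, 45, 50, 60, 70, 80, 90, 100, 110,
                    125, 150, 175, 200, 225, 250, 300, 350, 400, 450, 500,
                    600, 700, 800]

def next_size_up(conductor_ampacity):
    """Smallest standard rating at or above the (derated) conductor ampacity."""
    return min(r for r in STANDARD_RATINGS if r >= conductor_ampacity)

print(next_size_up(32.5))   # 35 -- a 35 A breaker is permitted, a 40 A is not
```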

The 125% requirement is basically to ensure that the circuit breaker is large enough to carry the load. The conductor has to be protected by the breaker, so they go hand in hand.

In theory, if you have a 34 A load that lasts forever, a 40 A breaker could eventually trip.

Also, if you are derating the conductors for ambient temperature, give a thought to the circuit breaker - that equipment has temperature limits as well.





 
That 40 A breaker with a 34 A continuous load might or might not eventually trip, but the conductors into the breaker would not be able to carry off the heat from the breaker as fast as it is being generated (I²R), and you will eventually do thermal damage to the breaker and/or lugs. If that additional heat raises the temperature of the thermal trip element, it could tend to reduce the tripping current. Originally, all 100%-rated breakers had electronic trip units to avoid thermal effects; now there are apparently some 100%-rated thermal-magnetic breakers.
 
Thank you very much for the guidance.

I just read NEC 2008 Section 215.2(A)(1), which states that "...The minimum feeder circuit conductor size, before any application of any adjustment or correction factors, shall have an allowable ampacity not less than the non-continuous load plus 125% of the continuous load." This is exactly what dpc stated earlier.

Just an observation: I noticed as well that the CEC, particularly Section 8-104(6), states otherwise. That section says to compare the size calculated with the 125% rule and the size calculated from derating, and to choose whichever yields the greater conductor size. Any comments?
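For what it's worth, here is how I read that comparison, as a sketch only (my paraphrase of 8-104(6), using the #6/#4 figures from earlier; please correct me if I have the rule wrong):

```python
# My reading of the CEC comparison: take the larger of (a) 125% of the load and
# (b) the load divided by the derating factor, then pick the smallest conductor
# whose table ampacity meets it. Ampacities are the illustrative values above.
FLA, DERATE = 30.0, 0.50
candidates = {"#6 AWG": 65.0, "#4 AWG": 85.0}

required = max(1.25 * FLA, FLA / DERATE)    # max(37.5, 60.0) = 60.0 A
choice = min((size for size, amp in candidates.items() if amp >= required),
             key=candidates.get)
print(required, choice)                     # 60.0 #6 AWG under this reading
```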

 
If you have to meet NEC, it doesn't matter much what other Codes may say.
 