
Effect of conduit size on cable derating factor


L30N (Electrical)
Aug 7, 2006
Hi everyone,

I am pretty new to the field, and I am wondering if anyone can advise me on this. For an underground distribution cable installed in ducts, does the conduit/duct size being used affect the derating factor of the cable to be installed? If it does, what is the main cause, i.e. heating?
Thanks in advance.
 

Your cable rating will increase (because thermal resistivity will reduce) with the increase in duct diameter. The influences are:
(a) Thermal Resistivity through the air from the cable surface to the duct inside surface - mostly (but not totally) independent of duct size
(b) Thermal Resistivity through the duct wall - decreases as the ratio of the duct's outside to inside diameter decreases (i.e. increase the radius but keep the wall thickness the same) - more area (smaller resistivity) is available for heat flow as the radius is increased. The thermal resistivity will increase if the wall thickness is increased (see the sketch after this list).
(c) Thermal Resistivity from the duct to the surrounding material (backfill/concrete) - decreases as the duct size increases (a larger duct area is in contact with the surrounding material, so heat transfer is easier)
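To make item (b) concrete, here is a small Python sketch of the usual cylindrical-shell expression for the thermal resistance of the duct wall, T = (rho_T / 2*pi) * ln(D_outside / D_inside), which is the form used in IEC 60287 for the duct itself. The duct dimensions and the PVC resistivity (quoted later in this thread) are only illustrative assumptions, not recommendations.

```python
import math

# Thermal resistance of a cylindrical duct wall, T = (rho_T / (2*pi)) * ln(Do / Di).
# rho_T is the thermal resistivity of the duct material in K.m/W.
def duct_wall_resistance(rho_T, d_inside, wall_thickness):
    """Thermal resistance (K.m/W) of the duct wall."""
    d_outside = d_inside + 2 * wall_thickness
    return (rho_T / (2 * math.pi)) * math.log(d_outside / d_inside)

RHO_PVC = 6.0  # K.m/W, the PVC resistivity quoted later in this thread

# Same 5 mm wall thickness, two duct sizes: the larger duct has the lower resistance.
for d_in in (0.100, 0.150):  # 100 mm and 150 mm inside diameters (illustrative)
    t_wall = duct_wall_resistance(RHO_PVC, d_in, 0.005)
    print(f"{d_in * 1000:.0f} mm duct: {t_wall:.3f} K.m/W")
```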

You will need to use IEC 287 or finite element analysis packages if you want to calculate the exact influence for your application.
 
It may have an effect due to heat transfer characteristics of each medium. But in the US, you would not get any credit for increasing conduit size if you are talking about one circuit in one conduit - the NEC gives a certain ampacity for a certain configuration. If you are dealing with multiple conduits in close proximity, such as a duct bank, where the concern is mutual heating, the NEC recognizes use of the Neher-McGrath method for calculating cable heating and de-rating factors.

 
" ...Your cable rating will increase (because thermal resistivity will reduce) with the increase in duct diameter. ..."
First of all, we cannot change the THERMAL RESISTIVITY of materials, as this is one of their characteristics. We can change the thermal RESISTANCE.

(a) item - It suggests that if you increase the conduit/duct size the circuit ampacity increases (sorry if I misunderstood). If we increase the distance and air volume inside the duct, the thermal RESISTANCE increases, as the thermal resistivity of air is approx. 33.8 [K*m/W], which is many times bigger than that of, e.g., water at 1.65 [K*m/W] or even PVC at 6.0 [K*m/W]. This is why a material called bentonite is pumped into the ducts to expel the air.
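To put those resistivity figures in context, here is a rough, conduction-only sketch of the cable-to-duct gap treated as an annulus; the 50 mm cable in a 100 mm ID duct is just an assumed geometry. A real air gap also transfers heat by convection and radiation (see the Neher-McGrath discussion below), so this only illustrates why the air is expelled.

```python
import math

# Conduction-only thermal resistance of the annular gap between cable and duct,
# using the same cylindrical-shell form as for the duct wall. Real air gaps also
# transfer heat by convection and radiation, so treat this as an
# order-of-magnitude comparison only.
def gap_resistance(rho_T, d_cable, d_duct_inside):
    return (rho_T / (2 * math.pi)) * math.log(d_duct_inside / d_cable)

D_CABLE, D_DUCT = 0.050, 0.100  # 50 mm cable in a 100 mm ID duct (assumed geometry)

for name, rho in (("air", 33.8), ("PVC", 6.0), ("water", 1.65)):
    print(f"{name:>5}: {gap_resistance(rho, D_CABLE, D_DUCT):.2f} K.m/W")
```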

Using "derating factors" on buried cables is only approximation. I would suggest using IEC60287 standard or Neher-McGrath calculation which includes cyclical loading. In more complicated arrangements the mentioned finite element method is the best practical solution.

 
The thermal resistance between the cable and the duct is very complex. It comprises convection, conduction, and radiation. Simplifications used by Neher-McGrath for typical cable in duct situations result in an equation:

T'4 = n'·A / [1 + 0.1·(B + C·Tm)·D's]

where D's is the equivalent diameter of the cable. The simplified equation is independent of the duct diameter, as stated by KJvR.

The thermal resistance of the duct wall and from the duct wall to surrounding earth is lower for larger ducts as stated by KJvR.
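For anyone who wants to try the simplified equation above, here is a short Python sketch of it. The constants A, B and C depend on the duct type and are tabulated in the Neher-McGrath paper and in IEC 60287, so they are passed in rather than hard-coded here. Note that the duct inside diameter never appears, which is why this term alone cannot show a benefit from a larger duct.

```python
# Simplified cable-to-duct air-space thermal resistance, as quoted above:
#     T'4 = n' * A / (1 + 0.1 * (B + C * Tm) * D's)
# A, B, C are constants for the duct type, tabulated in Neher-McGrath / IEC 60287;
# they are deliberately not hard-coded here.
def t4_air_space(n_loaded, A, B, C, t_mean, d_equiv):
    """Thermal resistance of the air space between cable surface and duct wall.

    n_loaded : number of loaded cables in the duct (n')
    A, B, C  : duct-type constants from the standard's tables
    t_mean   : mean temperature of the air in the duct, deg C (Tm)
    d_equiv  : equivalent diameter of the cable(s), D's, in the units the
               constants assume (mm in the IEC 60287 form of this equation)
    """
    return n_loaded * A / (1 + 0.1 * (B + C * t_mean) * d_equiv)
```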
 
Thanks for the invaluable advice, everyone. Please correct me if I am wrong, but based on the inputs you have given me, my understanding is that the size of the duct used in an underground cable installation does have an effect on the cable rating, due to the change in thermal resistance associated with the air or filling substance (if one is used) between the cable and the duct. A larger duct size apparently allows for better heat dissipation, and hence a lower overall resistance, which in turn would improve the cable rating factor.

Thank you as well to jghrist, KJvR and cgrodzinski for providing me with the document and equation used to calculate this. I will try to do further research on this.
 
cgrodzinski, thanks for showing me the difference between resistivity and resistance. Hopefully I will not mix them up in the future. I agree with dpc that the influence of duct size on the overall ampacity is very small, and that the duct size is a function of the cable diameter rather than a way to increase the ampacity.
 
I took the liberty of checking what the difference would be between the ampacities of a transmission-type cable in a 6" duct and the same cable in an 8" duct. All other parameters remained constant. The difference was approx. 1%. I used Cyme software, which could also introduce some error. In most cases, the duct diameter is a function of the cable pulling tension and its sidewall pressure.

L30N,
if you have a project that involves more than one circuit in close proximity to the others, I would suggest asking an engineer specializing in u/g T&D circuits for help. Please note that once the circuit is buried we have very little control over it besides de-rating; that lesson can sometimes be learned the costly way. The only way to monitor its performance is to install a system called DTS, which measures the cable temperature along the entire length of the circuit. The sensor for this device is a common fiber-optic cable, which costs a fraction of a power cable.
 
I second what KJvR says. The influence of duct size on ampacity is small. For example, I calculate (using Neher-McGrath) the ampacity of #1/0 AWG Al 15 kV cable as 155.9 A with 4" conduit and 158.9 A with 6" conduit.

I wouldn't use this to increase the cable rating factor.
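As a rough sanity check on why these differences are so small: in the simplified steady-state model the rating varies roughly as the inverse square root of the total series thermal resistance, so a change confined to the duct-related terms moves the rating only a little. The resistance figures in the sketch below are made up for illustration; they are not the values behind the 155.9 A / 158.9 A results above.

```python
import math

# Roughly, I ~ sqrt(dTheta / (R_ac * T_total)), so the rating scales as 1/sqrt(T_total).
# The series terms below (insulation, duct air space, duct wall, surrounding earth)
# are invented for illustration only.
def relative_rating(t_terms):
    return 1.0 / math.sqrt(sum(t_terms))

small_duct = relative_rating([0.5, 0.1, 0.40, 1.50])
large_duct = relative_rating([0.5, 0.1, 0.35, 1.48])  # slightly lower duct-wall / earth terms

print(f"rating change: {100 * (large_duct / small_duct - 1):+.1f} %")
```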
 
