chuckd83
Electrical
- Oct 2, 2014
An underground cable ampacity study comes across my desk every month or so. The thermal resistivity of the soil (rho) has a fairly significant impact on the ampacity of the cable, and it's always a question of what value to use. With no soil test, I use a rho of 90 degC-cm/W unless the site is in the desert (then 120) or coastal (then 60), per NEC Annex B.2.
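To illustrate why rho matters so much, here's a rough sensitivity sketch in Python. It's a stripped-down, Neher-McGrath-style steady-state calculation: just the Kennelly earth-resistance term plus a lumped internal thermal resistance, with made-up cable parameters (r_ac, t_internal, depth, and OD below are placeholders, not from any real design):

```python
import math

def earth_thermal_resistance(rho_degC_cm_per_W, depth_m, cable_od_m):
    """External (earth) thermal resistance of a single isolated buried
    cable, Kennelly form: (rho / 2*pi) * ln(u + sqrt(u^2 - 1)), u = 2L/De.
    Input rho in degC-cm/W; returns K.m/W."""
    rho = rho_degC_cm_per_W / 100.0   # degC-cm/W -> K.m/W
    u = 2.0 * depth_m / cable_od_m
    return (rho / (2.0 * math.pi)) * math.log(u + math.sqrt(u * u - 1.0))

def ampacity(rho, t_cond=90.0, t_amb=20.0, r_ac=9.0e-5,
             t_internal=0.5, depth_m=0.9, cable_od_m=0.05):
    """Very simplified steady-state ampacity, ignoring dielectric and
    sheath losses: I = sqrt(dT / (Rac * (Tint + T4))).
    r_ac (ohm/m) and t_internal (K.m/W) are placeholder values."""
    t4 = earth_thermal_resistance(rho, depth_m, cable_od_m)
    return math.sqrt((t_cond - t_amb) / (r_ac * (t_internal + t4)))

for rho in (60, 90, 120):   # coastal / default / desert per NEC Annex B
    print(f"rho = {rho:>3} degC-cm/W -> ampacity ~ {ampacity(rho):.0f} A")
```

Even with everything else held constant, moving rho from 60 to 120 degC-cm/W knocks roughly 15-20% off the ampacity with these placeholder numbers, which is why the choice of rho keeps coming up.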
If I have a soil test, the geotechnical report provides thermal dryout curves, and the question becomes what moisture content to use. The NRCS publishes moisture content readings throughout the year for various locations around the U.S. To date, I use the worst-case (driest) moisture content and select the rho from the dryout curve at that value.
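For what it's worth, the curve lookup itself is just an interpolation. Here's a minimal sketch, assuming a dryout curve digitized from the geotech report (the numbers below are invented; only the shape, with rho climbing steeply as the soil dries, is typical):

```python
import numpy as np

# Hypothetical dryout curve digitized from a geotechnical report:
# moisture content (% by dry weight) vs. thermal resistivity (degC-cm/W).
moisture_pct = np.array([0.0, 2.0, 5.0, 10.0, 15.0, 20.0])
rho_curve    = np.array([250.0, 180.0, 120.0, 95.0, 85.0, 80.0])

def rho_at_moisture(m_pct):
    """Linearly interpolate rho from the dryout curve; np.interp clamps
    to the end points outside the measured range."""
    return float(np.interp(m_pct, moisture_pct, rho_curve))

# Worst-case (driest) NRCS reading for the site -- 4% is a placeholder.
print(rho_at_moisture(4.0))   # -> 140.0 with the curve above
```

The dry end of that curve is exactly where my question sits: whether to evaluate at the driest NRCS reading, or at something drier still to account for the cable itself driving moisture out of the soil.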
I'm not sure this is the best approach, since the NRCS data is measured with no cable installed. A cable in the earth will heat up and dry out the surrounding soil when in use, which raises the soil rho. So I'm back to what moisture content to select. It comes down to what temperature the cable will operate at (typically 90 or 105 degC) and how much that will dry out the surrounding soil. Are there any resources to help here?