chindy
Chemical
- Oct 19, 2022
- 2
Hi,
I'm doing work on a system to determine a control loop response time to protect a recovery skid. The scenario is low temperature tripping the skid, diverting the flow to the main flare. I need to find how quickly the downstream pipe (not designed for low temperature) will cool to its temperature limit of -20 F. The following are the assumptions:
1. The system is initially at equilibrium of 80 F (pipe temperature is the same as the fluid temperature).
2. A sudden rush of cold fluid displaces the warmer fluid, so that the pipe contents are now at temperature T.
3. There is no external heating (air convection or solar).
4. The flow rate of the cold fluid is constant.
5. It's a large pipe, so the fluid temperature is constant.
This setup assumes the temperature change is sudden, so the TT will instantly trigger the alarm. The key is finding the maximum time I need to design for between the alarm triggering and the downstream system tripping, which depends on how long the pipe takes to cool from 80 F to -20 F.

I've been having difficulty determining that cooling time, however. For a representative foot of pipe of mass m, I can say that m*(heat capacity of pipe)*(change in pipe temp) = Q, the total Btu needed to cool that one-foot stretch of pipe. But on the ethylene side, how do I determine the time needed to absorb that amount of heat Q? I've tried a few different approaches with wildly different answers, and I don't think any of my methods are actually sensible.
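One way to sketch the pipe-wall cooling time is a lumped-capacitance model: with the fluid held at a constant cold temperature (per assumption 5), the energy balance m*cp*dT/dt = -h*A*(T - T_fluid) gives an exponential decay with time constant tau = m*cp/(h*A). Every number below is a placeholder assumption (pipe size, mass, film coefficient h, and the cold ethylene temperature are not from the original post), so treat this as an illustration of the method, not a sizing calculation:

```python
import math

# Lumped-capacitance sketch of a pipe wall cooling toward a constant
# cold fluid temperature. All values are assumed placeholders.
T_fluid = -150.0   # F, assumed cold ethylene temperature (constant)
T_0     = 80.0     # F, initial pipe/fluid equilibrium temperature
T_limit = -20.0    # F, pipe low-temperature design limit

# Representative 1-ft length of pipe (assumed 6" sch 40 carbon steel)
m  = 19.0          # lbm, assumed pipe mass per foot
cp = 0.12          # Btu/(lbm*F), carbon steel specific heat
h  = 50.0          # Btu/(hr*ft^2*F), assumed inside-film coefficient
A  = 1.73          # ft^2, assumed inside surface area per foot

# Time constant and time to reach the design limit:
#   T(t) = T_fluid + (T_0 - T_fluid) * exp(-t / tau)
tau = m * cp / (h * A)  # hours
t_limit = tau * math.log((T_0 - T_fluid) / (T_limit - T_fluid))

print(f"time constant: {tau * 3600:.0f} s")
print(f"time to reach {T_limit} F: {t_limit * 3600:.0f} s")
```

The lumped model is only valid when the Biot number h*(wall thickness)/k is small, which usually holds for a thin steel wall (high k); the real sensitivity here is the inside-film coefficient h, which would need to be estimated from the ethylene flow rate (e.g. via a Dittus-Boelter-type correlation).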