Josh2008
Mechanical
- Sep 25, 2008
I'm new to compressed air systems, but I've done some research online to learn as much as I can about them.
I'm designing a system for an underground mine application to power pneumatic machinery and tools on different sub-levels of a mine. I have calculated pressure losses for hoses, fittings, and pipe friction, so I know the pressure required at the outlet of the receiver.
The air receiver is where I'm a bit stumped. I understand that the receiver provides surge capacity to the compressed air system. It also separates moisture and reduces pulsations.
Knowing the pressure inside the receiver, the volume of the receiver, and the atmospheric pressure, we can determine the volume of free air stored inside the receiver.
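For what it's worth, here's the quick calculation I've been using for that (just a rough sketch from Boyle's law; the 60 cu.ft. and 100 psig figures are made-up example numbers, not my actual design):

```python
# Rough sketch of the free-air capacity calculation described above.
# All numbers are example values, not real design data.

P_ATM = 14.7  # atmospheric pressure, psia

def free_air_volume(receiver_volume_cuft, gauge_pressure_psi, atm_psi=P_ATM):
    """Volume of free (atmospheric) air stored in a receiver,
    from Boyle's law P1*V1 = P2*V2 at constant temperature."""
    absolute_pressure = gauge_pressure_psi + atm_psi
    return receiver_volume_cuft * absolute_pressure / atm_psi

# Example: a 60 cu.ft. receiver at 100 psig holds roughly:
print(free_air_volume(60, 100))  # ~468 cu.ft. of free air
```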
If, for instance, I know that the pressure at the outlet of the receiver has to be at least 100 psi to achieve 90 psi at the work tool underground, does that mean it is safe to size the system for a 100 psi compressor? Is there any reason to size it at a higher pressure? E.g., say there were two compressors on the market, at 100 psi and 110 psi, to choose from. Obviously you want to size the compressor at the minimum required pressure rating so that more cfm is available.
The link above sizes a receiver using 100 psig from the compressor and 90 psig at the outlet of the receiver. What circumstances does that assume?
My assumption is that if one compressor is running a tool and the system surges (somebody demands a high cfm for a short period of time), the resulting pressure drop draws flow from the receiver to compensate and avoids a second compressor turning on (thereby saving energy).
Am I anywhere near correct? If I am correct, and I don't know exactly what the short-term surge is going to be, how do I size the receiver? Using the '1 cu.ft. of receiver volume per cfm' rule of thumb? How does the article above know to use a 10 psi band? Is this something preset? (See my back-of-envelope attempt below.)
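To make my question concrete, here's how I understand the draw-down calculation would work if the pressure band is given. This is only a sketch based on the usual t = V*(P1 - P2)/(C*Pa) relation; the 10 psi band and the 50 scfm / 1 minute surge are assumptions on my part, not known values:

```python
# Sketch of the receiver draw-down sizing I think the article is using.
# The 10 psi band and the surge figures below are assumed example values.

P_ATM = 14.7  # atmospheric pressure, psia

def receiver_volume(surge_scfm, duration_min, p_high_psig, p_low_psig, atm_psi=P_ATM):
    """Receiver volume (cu.ft.) needed to supply `surge_scfm` of free air
    for `duration_min` minutes while pressure falls from p_high to p_low.
    Rearranged from the draw-down relation t = V*(P1 - P2)/(C*Pa)."""
    return surge_scfm * duration_min * atm_psi / (p_high_psig - p_low_psig)

# Example: cover a 50 scfm surge (beyond the compressor's output) for 1 minute
# while the receiver falls from 100 psig to 90 psig:
print(receiver_volume(50, 1.0, 100, 90))  # ~73.5 cu.ft.
```

If that's roughly right, then the smaller the allowable pressure band, the bigger the receiver has to be for the same surge, which is why I'm asking where the 10 psi figure comes from.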
Thanks everyone.