MacMcMacmac
Aerospace
- Sep 8, 2010
Good day, folks. I am trying to get some buy-in for an energy reclamation project at our gas turbine engine test facility. We have a 2500 hp centrifugal blower that provides air for the test cell; it also drives an 1800 hp turbo expander. By mixing the hot and cold streams, we can get the air temperature required for the engine under test.
The cell is evacuated by a compressor/exhauster that creates the necessary altitude conditions.
Because the turbo expander has to run down to -100°F without forming ice, the air must be very dry, so we draw all of the air for the blower through two desiccant wheel dehydration units in series.
These do a very good job, except under the absolute highest flow conditions at the highest summer humidity and temperature.
What has bothered me about this system for the 10 years I've been working here is that a very large amount of air from the blower is constantly being discharged to atmosphere. It seems like a huge waste of very expensive air: the blower is almost always producing more flow than the test demands. I do not know why it isn't throttled via IGVs to match flow to demand, but I've met the engineering crew who designed the complete system and they are no fools, so I assume there was a very good reason for operating it this way. Still, the blow-off air has been filtered, chilled, dried, filtered again, chilled again and dried again, so it is pretty high quality by the time it goes to process. Add in the cost of compression, plus the two 1 MBtu direct-fired heaters constantly regenerating the desiccant wheels, and it soon adds up to a lot of money being blown away to no good effect.
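As a sanity check on that last point, here is a minimal back-of-the-envelope sketch in Python; the run hours and gas price are placeholders I made up, so substitute your own:

```python
# Rough annual fuel cost for the two regen heaters (ratings per this post).
# Run hours and gas price are assumptions -- replace with facility numbers.
heater_rating_mbtu_hr = 1.0     # each heater, MBtu/hr (from this post)
n_heaters = 2
run_hours_per_year = 6000       # assumed annual operating hours
gas_price_per_mbtu = 8.00       # assumed $/MBtu; check your gas contract

annual_fuel_cost = (heater_rating_mbtu_hr * n_heaters
                    * run_hours_per_year * gas_price_per_mbtu)
print(f"Regen heater fuel: ${annual_fuel_cost:,.0f}/yr")  # $96,000/yr with these guesses
```

Even with conservative inputs, the regen gas alone is a big annual number, before counting a single kilowatt of compression.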
I was wondering how I could quantify the cost of the air being blown to atmosphere. I assume there is a simple pressure/velocity/pipe size calculation to estimate the mass flow of air being discharged from the silencer on the roof. I also have the inlet and discharge air temperature and pressure of the blower. Is there a way to calculate the energy required to raise that mass of air through the blower's temperature rise? That would give us the compression energy being squandered, though it still would not account for the energy expended drying the air in the first place. If there is an easier method, I'm all ears.
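For what it's worth, here is the arithmetic I have in mind as a minimal Python sketch; the duct diameter, velocity, and blower temperatures/pressures below are all placeholder assumptions to be replaced with measured values:

```python
import math

# --- Placeholder measurements: replace with actual stack and blower data ---
D = 0.6              # discharge duct diameter at the silencer, m (assumption)
V = 30.0             # measured air velocity in the duct, m/s (assumption)
T_stack = 320.0      # air temperature at the silencer, K (assumption)
P_stack = 101_325.0  # static pressure at the silencer, Pa (~atmospheric)

R = 287.05           # specific gas constant of air, J/(kg*K)
cp = 1005.0          # specific heat of air, J/(kg*K)
gamma = 1.4          # ratio of specific heats of air

# Mass flow from pipe size, velocity, and ideal-gas density
rho = P_stack / (R * T_stack)   # air density, kg/m^3
A = math.pi * D**2 / 4.0        # duct cross-sectional area, m^2
m_dot = rho * A * V             # mass flow blown to atmosphere, kg/s

# Ideal (isentropic) compression power for that mass flow
T1 = 300.0           # blower inlet temperature, K (assumption)
P1 = 101_325.0       # blower inlet pressure, Pa
P2 = 130_000.0       # blower discharge pressure, Pa (assumption)
W_ideal = m_dot * cp * T1 * ((P2 / P1) ** ((gamma - 1.0) / gamma) - 1.0)

# Actual shaft work per the measured temperature rise across the blower,
# which folds the blower's inefficiency in as extra heat
T2 = 330.0           # blower discharge temperature, K (assumption)
W_actual = m_dot * cp * (T2 - T1)

print(f"Mass flow to atmosphere: {m_dot:.1f} kg/s")
print(f"Ideal compression power: {W_ideal / 1e3:.0f} kW")
print(f"Power from temp rise:    {W_actual / 1e3:.0f} kW")
```

Multiply the wasted kilowatts by run hours and the plant's electricity rate and you have a first dollar figure, before the drying energy is even counted.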
My proposal is to direct the excess air from the blower back to the inlet of the first dehydration unit to dilute the incoming ambient air and lower the load on the desiccant wheels. There is a cost to chilling the hot air back down to an acceptable approach temperature for the first chiller coil so we do not trip the refrigeration system, but I think the payback on installing a water-to-air heat exchanger would be fairly quick, given the reduced gas requirements for the two regen heaters. At the very least, it would help us reach some currently unattainable test points during the hottest and wettest parts of the summer. However, until I have some numbers, this is just a guess.
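To put a first number on that heat exchanger, the duty is just sensible heat on the recirculated stream; a minimal sketch, with the flow and temperatures below being assumptions to be swapped for real data:

```python
cp = 1005.0          # specific heat of air, J/(kg*K)
m_dot_recirc = 9.0   # recirculated mass flow, kg/s (assumption; use the blow-off figure)
T_hot = 330.0        # blower discharge temperature, K (assumption)
T_approach = 305.0   # acceptable inlet temperature for the first chiller coil, K (assumption)

Q_hx = m_dot_recirc * cp * (T_hot - T_approach)      # required cooling duty, W
print(f"Water-to-air HX duty: {Q_hx / 1e3:.0f} kW")  # ~226 kW with these guesses
```

Weighing that duty (and its cooling-water cost) against the regen gas saved by feeding the wheels pre-dried air would give the payback period I'm after.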
Any suggestions are greatly appreciated.