Turbo inlet vs discharge temp

Status
Not open for further replies.

Jon421 (Automotive) - Dec 16, 2003
I've been working with a formula that is supposed to calculate a compressor's discharge temperature when given the inlet temp, pressure ratio and volumetric efficiency of the compressor. I found that when the inlet temperature is lowered, the discharge temp is affected by about 120%-130% of the change in the inlet temp, depending on the volumetric efficiency of the compressor.

For example, with a 1.80 pressure ratio and 70% V/E, when the inlet temp is dropped by 40 deg, the discharge temp is lowered by 49.583 deg, a change of 123.956%.

Has anyone else ever come across this before? What's causing the drastic change in the discharge temp? Is this an accurate calculation, or is there a flaw in the formula?

I found the formula here; it's about halfway down the page under "How hot is the air coming out of the compressor".

 
First, it's compressor efficiency, not volumetric efficiency; and second, there is a linear relationship between inlet and outlet (absolute) temperature.

Do the case where inlet is 300 Kelvins:

(1.8^0.283 - 1) * 300 / 0.70 = 77.6 K rise

so you have a discharge temp of 377.6 Kelvins.

Compare that with 260 Kelvins at the inlet:

(1.8^0.283 - 1) * 260 / 0.70 = 67.2 K rise

giving a discharge temp of 327.2 K.

The temperature ratios are the same, i.e.,

300/260 == 377.6/327.2

just as you'd expect from examining the equation.
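
A minimal numeric check of that arithmetic (a Python sketch, using the same assumed figures as above: pressure ratio 1.8, exponent 0.283, compressor efficiency 0.70):

def discharge_temp_k(t_inlet_k, pr=1.8, exponent=0.283, eff=0.70):
    # inlet temp plus the ideal temperature rise divided by compressor efficiency
    return t_inlet_k + t_inlet_k * (pr**exponent - 1.0) / eff

warm = discharge_temp_k(300.0)      # ~377.6 K
cold = discharge_temp_k(260.0)      # ~327.2 K
print(warm, cold)
print(300.0 / 260.0, warm / cold)   # both ~1.154, i.e. the same temperature ratio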
 
Eric,
You're right, I meant compressor efficiency, not volumetric efficiency, my mistake.

So am I correct to assume that it is more efficient to reduce the temp of the air going into a compressor than to try to reduce the temp of the air coming out? As in using a refrigeration-based intercooler to cool the air coming into the compressor.
 
Your observation about inlet temperatures vs. outlet temperatures is essentially correct.

However -- it doesn't follow that it is "more efficient" to cool the inlet air than the exit air. The reason is -- it's a lot easier to drop the temperature of 250 deg air by 100 degrees (to 150) than it is to drop the air temperature from 80 degrees to zero. The first case can be done with an inter (after) cooler; the second requires some type of refrigeration device, which will need power for its own compressor.

Now if you're talking about drag racing, where you can use chilled water in the cooler, and only need it for 10 or 15 seconds, you have a different scenario.
 
SBBlue,
That's just what I was going to ask next. How do I calculate the heat transfer at the different temps? I know it's impossible to calculate an exact answer, but roughly how much more heat would be removed by an inter/aftercooler filled with ice water (or a refrigerant at -30 deg) with 300 deg air blowing through it vs. a "precooler" with 80 deg ambient air blowing through it going into the compressor?

I'm thinking about using my existing A/C refrigeration system and placing an evaporator just before the supercharger. The way the kit is designed makes it very difficult to fabricate any type of inter/aftercooler for it, so this would be much easier.
 
Jon

The relationship between inlet (T1) and outlet (T2) temperature is linear:

T2=T1*beta^((k-1)/k)

where beta=p2/p1 is the pressure ratio and k is the polytropic coefficient (reasonably constant if the two temperatures are not too different).

Note that the formula works with absolute temperatures (Kelvin), not Celsius. For example, assuming an ideal (isentropic, k=1.4) compression with a pressure ratio of 1.8, reducing the inlet temp from 40°C (313 K) to 20°C (293 K) with a "pre-cooler" reduces the ideal outlet temp from about 97°C (370 K) to about 74°C (347 K): the ratio of absolute temperatures stays the same. For a real compression, given a compressor efficiency of 0.7, you calculate the ideal deltaT as shown above and divide it by 0.7. Then you add this "real" deltaT to the inlet temp to obtain the real outlet temp.
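
A short Python sketch of that two-step procedure (a rough illustration only, assuming k = 1.4, a pressure ratio of 1.8 and a compressor efficiency of 0.70; Celsius inputs are converted to Kelvin for the calculation):

def real_outlet_temp_c(t_inlet_c, pr=1.8, k=1.4, eff=0.70):
    t1 = t_inlet_c + 273.15                         # Celsius -> Kelvin (absolute temp)
    dt_ideal = t1 * (pr**((k - 1.0) / k) - 1.0)     # ideal (isentropic) temperature rise
    return t1 + dt_ideal / eff - 273.15             # add the real rise, back to Celsius

print(real_outlet_temp_c(40.0))   # ~122 deg C
print(real_outlet_temp_c(20.0))   # ~97 deg C

Note that a 20°C drop at the inlet comes out as roughly a 25°C drop at the outlet, which is the 120%-130% effect described in the first post.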
As the formula shows, in terms of temperature gain it makes no difference whether you cool the inlet flow or the outlet flow.
The fact is that, as SBBlue says, it's simpler to cool the outlet flow than the inlet one.
The heat rate to be removed from the flow is Q=M*cp*deltaT, where M is the mass flow rate, cp the specific heat (at constant pressure) and deltaT the desired temperature difference. For example, if you want to cool the flow by 20°C, and assume cp and M are the same at inlet and outlet, the heat rate to be removed is the same in either location.
The problem is that cooling the outlet flow is much simpler: smaller exchangers and/or fewer restrictions on the coolant temp. For a given coolant temp, the log-mean deltaT of an aftercooler is greater than that of a precooler (and for a precooler the coolant clearly has to be colder than the ambient air!). Assuming an identical heat transfer coefficient, this means the precooler requires a greater exchange area to remove the same heat from the flow. The advantage of precooling, however, is a reduced compression work done by the compressor.
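
To put rough numbers on that exchanger comparison, here is a small sketch; the coolant temperature (0°C, e.g. ice water), the air temperatures and the 20°C drop are assumed figures for illustration only:

import math

def lmtd(t_air_in, t_air_out, t_coolant):
    # log-mean temperature difference, treating the coolant as roughly constant-temperature
    dt_in = t_air_in - t_coolant
    dt_out = t_air_out - t_coolant
    return (dt_in - dt_out) / math.log(dt_in / dt_out)

aftercooler = lmtd(120.0, 100.0, 0.0)   # hot compressed air: ~110 deg C driving force
precooler = lmtd(30.0, 10.0, 0.0)       # near-ambient intake air: ~18 deg C driving force

# Same heat duty Q = M*cp*deltaT and same transfer coefficient: area scales as 1/LMTD
print(aftercooler / precooler)          # the precooler needs roughly 6x the exchange area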
 