BronYrAur
Mechanical
- Nov 2, 2005
- 798
In the manufacturing of fuel injectors, I have been told (as an HVAC guy) that a relative humidity of 50% or less is needed to prevent rust. I have found that most people (including those in the HVAC industry, who should know better) don't understand relative humidity (%RH). So I am questioning whether that 50% RH figure is valid. Is RELATIVE humidity a good indicator of when rust will form, or should an absolute parameter (e.g. dew point) be used?
The trouble I usually run into is that someone will specify a desired setpoint of something like 75 degrees F and 50% RH. Then they decide that they want 70 degrees F and 50%. To those folks, 50% is 50%. In reality, of course, the amount of moisture in the air is quite different for those two scenarios, as the quick sketch below shows.
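To put rough numbers on that, here is a small Python sketch using the Magnus approximation for dew point (an approximation I'm assuming for illustration, not a psychrometric-chart lookup; the coefficients and function name are my own choice):

```python
import math

def dew_point_f(temp_f, rh_percent):
    """Approximate dew point (deg F) from dry-bulb temperature (deg F) and %RH,
    using the Magnus formula with coefficients b=17.62, c=243.12 (deg C)."""
    t_c = (temp_f - 32.0) * 5.0 / 9.0
    gamma = math.log(rh_percent / 100.0) + (17.62 * t_c) / (243.12 + t_c)
    td_c = 243.12 * gamma / (17.62 - gamma)
    return td_c * 9.0 / 5.0 + 32.0

for temp_f in (75.0, 70.0):
    print(f"{temp_f} F at 50% RH -> dew point ~ {dew_point_f(temp_f, 50.0):.1f} F")
```

Running this gives a dew point of roughly 55 F for 75 F / 50% RH versus roughly 50 F for 70 F / 50% RH, so the air in the second case holds noticeably less moisture even though the %RH reads the same.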
So I am posting the question here, as opposed to the HVAC forum, to try to understand when the metal actually starts oxidizing. Is a relative humidity parameter really the best indicator?