Eng-Tips is the largest engineering community on the Internet

Steam "Wetness" in Benchtop Steriliser - Measure?

Status
Not open for further replies.

SteriliserGuru

Electrical
Aug 15, 2006
7
Hi,
Excuse the ignorance, but my job is mainly electrical and it's 20 years since my thermodynamics subject, so I need help. The Standards Committee in Australia is proposing that steam in benchtop sterilisers (often made by boiling water in the chamber, by a small external boiler, or by a dry heated block with water injected into a "maze") be a maximum of 3% wet. I would expect about 5% from boiling it in the chamber. QUESTION: Is there a readily accessible and practical method for measuring this? (The chamber is generally about 20 litres and has a 1/4" threaded test port, BTW.) Calibration usually involves measurement of pressure and temperature: temperature to +/- 0.5 degC (at 134-135 degC) and pressure to (supposedly) 0.5% at about 2.0-2.3 bar. I think wetness is going to be impractical to measure, and would prefer them to allocate a pressure range by which it could vary from saturated conditions, as our main problems are insufficient air removal giving high pressures, or radiant heat off the wall giving high temperature measurements...
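One sanity check that needs nothing beyond the pressure and temperature readings the steriliser already provides is to compare the measured temperature against the saturation temperature for the measured pressure. The sketch below does that with a few saturation points from standard steam tables; the linear interpolation, the assumed 1.013 bar atmosphere, and the +/- 0.5 degC acceptance band are illustrative choices, not anything from a standard.

```python
# Sketch: flag whether a chamber reading sits on, above, or below the
# saturation curve, using only gauge pressure and temperature.
# Saturation points are from standard steam tables; linear
# interpolation is adequate over this narrow range.

SAT_TABLE = [  # (absolute pressure, bar), (saturation temperature, degC)
    (2.7026, 130.0),
    (3.1308, 135.0),
    (3.6138, 140.0),
]

ATM_BAR = 1.013  # assumed atmospheric pressure

def t_sat(p_gauge_bar):
    """Saturation temperature (degC) for a gauge pressure in bar."""
    p_abs = p_gauge_bar + ATM_BAR
    for (p1, t1), (p2, t2) in zip(SAT_TABLE, SAT_TABLE[1:]):
        if p1 <= p_abs <= p2:
            return t1 + (t2 - t1) * (p_abs - p1) / (p2 - p1)
    raise ValueError("pressure outside tabulated range")

def classify(p_gauge_bar, t_meas_c, band_c=0.5):
    """Compare a measured temperature with saturation at that pressure."""
    ts = t_sat(p_gauge_bar)
    if t_meas_c > ts + band_c:
        return "superheated (or radiant heat on the sensor)"
    if t_meas_c < ts - band_c:
        return "below saturation: wet steam or residual air likely"
    return "on the saturation curve (within +/- %.1f degC)" % band_c
```

For example, `classify(2.03, 134.0)` reports "on the saturation curve", consistent with the 134 degC / 2.03 bar figure quoted later in the thread, while a reading of 132 degC at the same pressure would flag likely air or wetness.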
 
It's going to be real hard to determine any specific degree of 'wetness', or quality, without inferring it from an enthalpy change.

I think rather you need to turn the problem around and ensure that the quality is greater than 97% by keeping the steam fully dry, i.e. keep your system slightly in the superheat region. Or is this not possible with your equipment?

Use a reflective shield on your temperature sensor.
 
Maybe I misunderstood, but I thought that he was talking about water content in air, not steam quality.

I2I
 
I went on the basis of the title: "Steam wetness...", but you made me actually look at the tables. 134 degC is saturated at about 2.03 bar (gauge). Sounds like barely saturated to superheated steam to me.
 
TO QUOTE as4815:
"Sterilisers shall be calibrated to the correct point on the phase boundary line. Deviation from this phase boundary line represents states of superheated or wet steam or air and steam mixtures which may lead to sterilisation process failure...."
AND
"The maximum allowable wetness is 3%, which is equivalent to 97% dry, saturated steam"
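The "3% wet = 97% dry" equivalence is just the dryness fraction x = m_vapour / (m_vapour + m_liquid). The sketch below uses steam-table values at 3.0 bar absolute (roughly 2 bar gauge, i.e. the steriliser range, an assumed example state) to show why this is so hard to verify in practice: 97%-dry and 100%-dry steam share the same temperature and pressure, and only mass-weighted properties such as enthalpy and specific volume differ.

```python
# Dryness fraction x: mixture properties are mass-weighted averages of
# the saturated-liquid (f) and saturated-vapour (g) values. Table data
# below is for 3.0 bar absolute (Tsat approx. 133.5 degC). Note that T
# and P are identical for every x in the two-phase region, which is
# why T and P alone cannot reveal wetness.

HF, HFG = 561.5, 2163.8      # kJ/kg: sat. liquid enthalpy, evaporation enthalpy
VF, VG = 0.001073, 0.6058    # m3/kg: sat. liquid / sat. vapour specific volume

def mixture_enthalpy(x):
    return HF + x * HFG

def mixture_volume(x):
    return VF + x * (VG - VF)

h97, h100 = mixture_enthalpy(0.97), mixture_enthalpy(1.0)
v97, v100 = mixture_volume(0.97), mixture_volume(1.0)

print("enthalpy gap: %.1f kJ/kg" % (h100 - h97))   # ~65 kJ/kg
print("volume gap:   %.4f m3/kg" % (v100 - v97))   # ~3% of vg
```

So meeting the 3% limit would have to be inferred from an energy or mass balance (a calorimeter), not read off the existing temperature and pressure instruments.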

I perceive this as being daft, and not a thing we can do about it, but we need to be able to make a national standards body see reason... Obviously we can do things to alter air removal (longer/deeper vacuum pulses, purging, etc.) but the steam quality is not going to be alterable with the process and heating fixed.
 
That's hard, and I'm overextended here. I googled "measuring steam quality"; bench test kits tend to condense the moisture out of a fixed volume of steam. I just looked at a couple of patents that cover measuring steam quality in-process, but both seem pretty sophisticated.

My feeling is that you need to control the heating to back away from the sudden increase in temperature you get when you move past the saturation curve. Guaranteeing this won't fall below x = 97% would be a controller-specification problem plus some bench testing and trial-and-error work to get it right. A fine-tuning heater might be required.
 
Hmm, again the problem is that the control system and temperature setpoints are fixed (we actually aren't ALLOWED to modify the units, as they fall under "medical devices"). For an example of what we are talking about, try or (two Italian manufacturers)
 
Calibration, see first post. The standards are asking for what I believe is impossible.
 
Basically calibration is done with a thermocouple and pressure gauge to align measured with indicated values. Units vary in being pressure- or temperature-controlled, and in setpoint: from 134.5 to about 136 degC, 2.1 to 2.2 bar.
The units work; this is more an argument about where they want to set the bar for proving the process.
 
We have come across this many times. There are two products to measure steam quality:
1) a condensing steam "calorimeter"
2) a very expensive high-temperature humidity sensor

The first one is relatively cheap but has a very wide margin for operator error.

That said, we have found the best solution in our hospital:

bring the highest-pressure steam possible to the sterilizer location, ensure that line is properly trapped,
step down using a PRV as close to the point of use as possible
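For reference, the classic bench instrument of this kind is the throttling calorimeter, and its principle is a one-line energy balance: steam throttled through an orifice to atmospheric pressure keeps its enthalpy, and because the downstream steam is superheated, one temperature reading fixes that enthalpy. The sketch below assumes a 3.0 bar absolute supply line as an example state; table values and the cp approximation are from steam tables, not from any product's manual.

```python
# Throttling-calorimeter energy balance (sketch). Upstream wet steam at
# line pressure is throttled to atmosphere at constant enthalpy; the
# exit steam is superheated, so measuring its temperature gives h, and
# the upstream dryness fraction follows from h = hf + x * hfg.

HF_LINE, HFG_LINE = 561.5, 2163.8   # kJ/kg at 3.0 bar absolute (assumed line state)
HG_ATM = 2675.6                      # kJ/kg, saturated vapour at 1.013 bar
CP_STEAM = 2.0                       # kJ/(kg K), approx. for low-pressure superheated steam
T_SAT_ATM = 100.0                    # degC

def dryness_fraction(t_downstream_c):
    """Upstream dryness fraction, from the calorimeter exit temperature."""
    if t_downstream_c <= T_SAT_ATM:
        raise ValueError("exit steam must be superheated for the method to work")
    h = HG_ATM + CP_STEAM * (t_downstream_c - T_SAT_ATM)  # downstream enthalpy
    return (h - HF_LINE) / HFG_LINE                        # h conserved across orifice

print("x = %.3f" % dryness_fraction(105.0))
```

The guard clause also shows the method's well-known limitation, which fits the "wide margin for operator error" comment above: if the steam is too wet, the throttled state is not superheated and the single temperature reading tells you nothing.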
 
If you have both temperature and pressure, you should be able to locate the state with a proper EOS or the steam tables as long as you are slightly superheated. With only temperature and pressure to work with, I think you need to stay on the superheated side of the curve, because inside the two-phase region T and P alone will not determine the steam quality.

I2I
 
Steam has a quality of 1.0 in the superheat region. Quality is the measure of where along the latent region the state is.

You can calibrate to be 'on the curve' using only the present apparatus.

For this unit to be calibrated, you first need something to adjust. The independent variable has to be separately adjustable from the controlled variable; i.e. if the temp is controlled to setpoint, you need to be able to fine-adjust the pressure to get this calibrated, and vice versa. Clearly, adjusting a setpoint alone can put you anywhere from saturated liquid to saturated gas.

To calibrate the steriliser: turn it on and let the unit stabilise at the temperature setpoint, at the saturation temperature of your intended pressure, trending both temperature and pressure. Read the tables to see if you are above or on/below the saturation curve. Adjust the pressure until you are below the curve and let it stabilise, then fine-adjust upwards until you see a temperature spike. Go back and forth over this point until you are sure that you are on it.

Better to control temp to setpoint and adjust pressure. Faster to calibrate, more accurate.

 
CinciMace: Temperature and pressure are not independent in the two-phase region, i.e. adjusting one affects both.

I2I
 
Oh, and quality is undefined in the superheated and subcooled regions. A quality of 1.0 is the quality of saturated steam only.

I2I
 
Nice use of precision. Keep your knickers on, son.

I think I've been pretty clear about the dependency issue.
Sure, they'll both vary, but you can adjust one and let the other control to the setpoint. The point is, calibrate to the saturated-vapour line, not to the region within.
 
The point is that you cannot calibrate to the saturation line, because when you are at that temperature and pressure there is no way to tell if you are on the vapour line or inside the region; therefore you will need to calibrate to a temperature and pressure that is barely superheated to ensure that you are within the 3%.

I2I
 