Typically, pressure transducers are built around a bonded-foil strain gage in a Wheatstone bridge configuration, bonded to the non-wetted side of a diaphragm. There is also an amplification/compensation electronics board assembly contained within a metal housing that is hermetically sealed via a fusion (TIG) weld.
The amplifier module is typically evacuated and sealed (to prevent outgassing). Evacuating the electronics side naturally tends to bow the diaphragm in a direction that makes the output look as if pressure had been applied to the wetted side. There is a linear relationship between the output change and the evacuation level.
If there happens to be a leak in the seal (or a poor weld), you would notice a negative zero shift over time after the evacuation/sealing process, as the electronics side returns to atmospheric pressure. One quick and inexpensive way to detect such a leak is a gaseous helium "bomb": helium forced inside the transducer back-pressures the diaphragm and produces a large negative zero shift (relative to the pressure range of the transducer).
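To put some rough numbers on that, here is the simple model I have been using. It is only a sketch: it assumes the leak behaves like a fixed conductance equal to the rated leak rate, that the cavity starts fully evacuated, and that the apparent zero shift is just the internal pressure expressed as a percentage of the span. The 0.5 cc free volume, 3 atm bomb, 4 hour dwell, and 100 psi range below are placeholder numbers, not real part data.

import math

def internal_pressure_atm(leak_scc_s, bomb_atm, dwell_s, cavity_cc):
    # Exponential fill toward the bomb pressure; the rated leak (in scc/s at
    # 1 atm differential) is treated as a fixed conductance in cc/s.
    return bomb_atm * (1.0 - math.exp(-leak_scc_s * dwell_s / cavity_cc))

def zero_shift_pct_fs(p_internal_atm, fs_range_atm):
    # Internal pressure pushes the diaphragm the "wrong" way, so the
    # apparent zero shift is negative and scales with the span.
    return -100.0 * p_internal_atm / fs_range_atm

# Placeholder example: allowed leak of 1e-7 scc/s, 3 atm gHe bomb held for
# 4 hours, 0.5 cc free internal volume, 100 psi (~6.8 atm) full-scale unit.
p_int = internal_pressure_atm(1e-7, 3.0, 4 * 3600, 0.5)
print(zero_shift_pct_fs(p_int, 6.8))   # about -0.13 %FS

Even at the allowed leak rate, a long enough dwell at a high enough bomb pressure produces a shift you could mistake for a reject, which is the concern behind the question below.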
My question is this:
Assuming that a leak rate of 1x10^-7 scc/s is allowed…
As the bomb pressure (gHe) and the duration of the test increase, is it possible to make a good unit appear bad?
How do you calculate the optimum test duration and pressure, based on a percentage of full-scale output and the pressure range?
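For reference, inverting that same fill model gives one way to estimate the dwell time needed for an allowed leak to produce a given zero shift, which is really what I am trying to pin down. Again, the volume and range values below are placeholders, and the linear-conductance assumption may not hold for a real weld defect.

import math

def dwell_time_s(target_pct_fs, fs_range_atm, bomb_atm, leak_scc_s, cavity_cc):
    # Invert the exponential fill model:
    #   t = -(V / L) * ln(1 - P_target / P_bomb)
    p_target = (target_pct_fs / 100.0) * fs_range_atm
    if p_target >= bomb_atm:
        raise ValueError("bomb pressure can never produce that large a shift")
    return -(cavity_cc / leak_scc_s) * math.log(1.0 - p_target / bomb_atm)

# Placeholder example: how long does an allowed 1e-7 scc/s leak need under a
# 3 atm gHe bomb to show a 0.05 %FS shift on a 100 psi (~6.8 atm) unit with
# 0.5 cc of free volume?
t = dwell_time_s(0.05, 6.8, 3.0, 1e-7, 0.5)
print(t / 3600.0)   # about 1.6 hours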
Thanks for your time
Qu1nn