packdad
Mechanical
- Mar 7, 2001
- 71
When you test a relief valve to verify its setpoint, do you account for instrument accuracy and setpoint tolerance simultaneously? For example, if you (or the OEM) have a relief valve with a rated setpoint tolerance of +/- 3%, and you measure inlet pressure with a gauge that is accurate to +/- 0.5%, do you look for the actual popping pressure to be within +/- 3% or +/- 2.5%?
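To make the two readings of that question concrete, here is a small sketch of the acceptance-window arithmetic (Python, using a hypothetical 100 psig setpoint purely for illustration; the function name and numbers are mine, not from any code or standard). "Guard-banding" subtracts the gauge accuracy from the setpoint tolerance, so a reading inside the window guarantees the true pop pressure meets the +/- 3% tolerance:

```python
def acceptance_window(setpoint, setpoint_tol, gauge_acc=0.0, guard_band=False):
    """Return (low, high) acceptable gauge readings for a pop test.

    setpoint_tol and gauge_acc are fractions (0.03 means +/- 3%).
    With guard_band=True, the window is tightened by the gauge accuracy
    so instrument error cannot push a passing reading outside tolerance.
    """
    tol = setpoint_tol - gauge_acc if guard_band else setpoint_tol
    return setpoint * (1 - tol), setpoint * (1 + tol)

# Hypothetical 100 psig valve, +/- 3% setpoint tolerance, +/- 0.5% gauge:
print(acceptance_window(100.0, 0.03))                          # (97.0, 103.0)
print(acceptance_window(100.0, 0.03, 0.005, guard_band=True))  # (97.5, 102.5)
```

So the question is really whether the acceptance criterion should be the 97.0-103.0 window or the guard-banded 97.5-102.5 window.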
If the code were to specify separate tolerances for the setpoint AND for the test instrument, it would seem reasonable that you would not "add" the tolerances together during the test. But is that a valid assumption? And what if the code does NOT specify a separate test instrument tolerance?