Pressure Gauge for B31.1 Hydro Test

Status
Not open for further replies.

timmckee (Mechanical)
Oct 19, 2005
I'm confused about the requirements for pressure gauges used for hydrotesting pressure piping manufactured to either B31.1 or B31.3.

Neither B31.1 nor B31.3 states any requirements for gauge calibration or gauge ranges (at least none that I can find; maybe I'm wrong), nor can I find references to other ASME codes. However, Sec V Article 10, Leak Testing, does state a minimum range of 1.5 times the test pressure up to a maximum of 4 times the test pressure. It also states that the gauges must be calibrated annually and provide accuracy equal to the manufacturer's stated accuracy.

Besides being a good idea is annual calibration mandatory?

Are the pressure ranges mandatory? I plan on buying a digital gauge which is very accurate over its entire range and would like to limit the number of gauges I buy.
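For concreteness, here's how I read that Article 10 window in numbers (a sketch only; the 1.5x/4x factors are from Sec V as quoted above, while the function name and the example pressures are just mine):

```python
def gauge_range_ok(full_scale_psi, test_pressure_psi,
                   lo_factor=1.5, hi_factor=4.0):
    """True if the gauge full scale falls inside the Sec V Article 10
    window: 1.5x to 4x the test pressure (as I read it)."""
    return (lo_factor * test_pressure_psi
            <= full_scale_psi
            <= hi_factor * test_pressure_psi)

# Illustrative 1125 psi hydro test:
print(gauge_range_ok(2000, 1125))  # inside the 1687.5-4500 psi window
print(gauge_range_ok(1500, 1125))  # below the 1.5x floor
```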

Thanks in advance for your help.
 

Calibration is mandatory; see ASME B31.1 Appendix J-1.2.10.

I don't know about mandatory ranges. I know our QA department has rules about range, but we seem to have forgotten what the driver was.
 
I should have been more specific - we are installing Non Boiler External Piping manufactured to B31.1.
 
Those are standard ranges for Hydro gauges per ASME Section VIII
 
Industry standard for gauge range is 1.5 to 4 times hydro pressure. You don't weigh your wife on a truck scale, nor do you weigh yourself on a kitchen scale.
 
You need a gage that can accurately measure the pressure for the test being conducted. The 1.5 to 4x range depends on the type of test (hydro, leak, etc.) being done, with the factor varying to compensate for differences in material strength between the hydro test temperature and the service temperature, among other factors.

The code specifies a minimum pressure to test. If your gage is only accurate to +/-20 psi at that pressure, a diligent inspector will make you test to the test pressure plus 20 psi to ensure you've met the limit. If you haven't calibrated the gage in over a year, or whatever the recommended interval is, you have no idea whether it is a +/-20 psi gage or what... so calibrate it, or buy a new one certified by the manufacturer to some accuracy.
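In other words (a toy sketch with made-up numbers; the code minimum and tolerance here are purely illustrative):

```python
def required_indicated_pressure(min_test_psi, gauge_tol_psi):
    """Gauge reading needed to guarantee the actual pressure meets the
    code minimum when the gauge is only good to +/- gauge_tol_psi."""
    return min_test_psi + gauge_tol_psi

# e.g. a 1125 psi code minimum measured with a +/-20 psi gauge:
print(required_indicated_pressure(1125, 20))  # 1145
```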
 
B31.3 sets no rules for this. Haven't read this section of B31.1 but suspect it's no different. Trouble is, any inspector who also does ASME VIII work will try to enforce the ASME VIII rules when witnessing hydrotests on B31.3 piping.

btrueblood got a little confused there: the 1.5x to 4x referred to by the previous poster is the requirement for the range of the gauge relative to the calculated hydrotest pressure per ASME VIII. The idea behind this is that you want your gauge to be reasonably accurate (though an accuracy requirement for this testing is NOT actually stipulated by the code), and hence using a gauge to measure at say 1/10th of its full scale is a bad idea (if it's a bourdon-type gauge, which the code assumes).

Today, with industrial pressure transmitters commonly available with accuracies of 0.075% at 100:1 turndown, I see no reason that you couldn't buy ONE pressure transmitter, calibrate it yearly, and use it for just about every hydrotest you ever do. Unfortunately the code hasn't caught up to that as most shops still have a forest of pressure gauges to provide the necessary ranges, and these are all sent out yearly for calibration- a waste of money and time. Our shop uses one of these, calibrated every three years, as a means to check our forest of gauges yearly or whenever we suspect one of having been over-pressured.

The reason stipulated for not using a gauge beyond the 2/3 mark is so that you know how far OVER the pressure you've gone during a test which goes wrong. This is a weak argument in my opinion, as you'd better have very much tighter controls on your source of pressure than that! I see no reason you shouldn't be permitted to use a gauge to 80% or even 90% of its full scale under controlled test conditions, as its accuracy as a fraction of measured value will be greatest there.

 
Pressure gauge ranges were previously discussed here: thread378-267446

Basically, the 1.5X to 4X limit applies only to analog gauges, and that's because they reach their best accuracy (as a percentage of full scale) within this range. That's a feature of the Bourdon tube.
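That full-scale behavior is easy to see numerically (a sketch; the 1% FS accuracy and 3000 psi full-scale gauge are assumed values, not from any code):

```python
def error_pct_of_reading(full_scale_psi, reading_psi, accuracy_pct_fs=1.0):
    """For a gauge specified as a % of full scale, express the fixed
    error band as a percentage of the actual reading."""
    error_psi = accuracy_pct_fs / 100.0 * full_scale_psi
    return 100.0 * error_psi / reading_psi

# Assumed 1% FS, 3000 psi full-scale gauge: the error as a % of
# reading shrinks as you read higher on the dial.
for reading in (300, 1000, 2000, 2700):
    print(reading, round(error_pct_of_reading(3000, reading), 2))
```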

The code does have different rules for digital gauges, or at least Section III does: "Digital type pressure gages may be used without range restriction provided the combined error due to calibration and readability does not exceed 1% of the test pressure."
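The quoted Section III criterion can be sketched like this (assuming the two error terms simply add; an RSS combination would be another possible reading of "combined", and the example numbers are mine):

```python
def digital_gauge_ok(test_pressure_psi, cal_error_psi, readability_psi):
    """Section III style check as quoted above: combined error due to
    calibration and readability must not exceed 1% of test pressure.
    Assumes straight addition of the two error terms."""
    return (cal_error_psi + readability_psi) <= 0.01 * test_pressure_psi

# e.g. a 1500 psi test gives a 15 psi combined-error budget:
print(digital_gauge_ok(1500, 5.0, 1.0))   # within budget
print(digital_gauge_ok(1500, 14.0, 2.0))  # over budget
```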

Pressure gauge accuracies are normally specified as a percentage of full scale, so I'm not fully comfortable with moltenmetal's shop practice. I would first check to make sure that that quoted accuracy is a percentage of turndown range, not just full scale. I'm concerned that the turndown might just improve the electronic resolution (i.e. more digits) without a proportionate improvement in the sensor's repeatability. But even if it is accurate to 0.075% of range, you would still need separate calibrations for each range you plan to use. Given the added cost of electronic gauges and the batteries, I'm not convinced that this is more cost effective than a bunch of analog gauges. But maybe I'm just retro.
 