Electrical equipment calibration


carlosmelim (Electrical), Aug 16, 2002
Dear fellows,

In our company we are implementing an ISO 9002 quality assurance plan. We have to calibrate all our electrical measuring equipment. That part is easy, and we have sent the instruments to a calibration laboratory.

Our consultant told us that we have to decide the maximum admissible error for each instrument. If, for example, the laboratory says that our insulation tester has a 1% error, we can keep using it as long as we have defined the maximum admissible error as, let's say, 3%.

How can we choose these figures? Our consultant doesn't know either. Which standards apply?

Our company is a common electrical installer, and we have equipment for insulation measurement, dielectric strength, earth resistance, RCD testing, etc.

Can anyone help me choose the correct standards, so that we can set technically sound figures? Portuguese regulations are extremely old and very vague on these subjects.

What people in Portugal usually do is define the maximum admissible error of an instrument as the same value stated in the calibration sheet. This doesn't make any sense, because you are changing the rules during the game: if you update the admissible error with each calibration, your equipment will always appear to be in good shape, and that is, of course, untrue.
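
To make that concrete, here is a minimal sketch (in Python) of the acceptance logic I mean; the instrument names and limits are hypothetical examples, not values from any standard:

```python
# Maximum admissible errors, in percent, decided in advance per instrument.
# These limits are hypothetical examples, not values from any standard.
ADMISSIBLE_ERROR = {
    "insulation_tester": 3.0,
    "earth_resistance_meter": 5.0,
}

def instrument_usable(instrument: str, calibrated_error_pct: float) -> bool:
    """Compare the error reported on the calibration sheet against the
    fixed admissible limit. The limit must not be redefined to match each
    new calibration sheet, or every instrument will always 'pass'."""
    return abs(calibrated_error_pct) <= ADMISSIBLE_ERROR[instrument]

print(instrument_usable("insulation_tester", 1.0))  # True: 1% is within the 3% limit
print(instrument_usable("insulation_tester", 4.2))  # False: repair, replace, or downgrade
```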

Looking forward to hearing from you,

Carlos Melim

 
It is not clear if your consultant is an electrical-metrology consultant or an ISO 9002 consultant.

If your met lab is providing you with traceability/conformance data associated with the work they do for you, then there should not be any need for further documentation from them, unless there is specific cause to suspect their services.

It may be worthwhile to ask your consultant what he bases his recommendation on. Usually numbers of this type are developed from national or international consensus standards. You may want to ask your consultant for a formal/consensus/published definition of “admissible error.” At times terminology becomes ‘buzzwords’ without legally defined meaning.
 
Busbar,

Our consultant is not an electrical engineer, but I understand his recommendation.
If you are using an instrument to, let's say, measure the voltage in a substation, maybe you won't need excellent accuracy.
But if you are using it to measure small voltage deviations on a laboratory bench, then excellent accuracy may be mandatory.
That's where the maximum admissible error concept comes from: each company must decide, based on its actual work, what error it can accept in an instrument.
This is of course theoretical, but we must address the issue.
If I knew which standards apply for insulation measurement in transformers, electrical motors, panels and cables, maybe I could decide what error is acceptable in our meter.
The same applies to the rest of the equipment. I was wondering if someone has faced this issue and could direct us to standards (IEC, CENELEC, etc.) that are usually followed in the industry.

Nowadays we use our equipment in a rather amateur way. We don't follow standards, just our experience of many years.

So, even more important than the error issue, I think we should start by adopting a set of standards that will give us a more professional and safer footing.

I am looking forward to hearing from you,

Best regards

Carlos Melim


 

Sorry, I'm not very familiar with the implementation of ISO 9000-series standards, but it is possible that your met lab is, at least with respect to instrument calibration. It may be time well spent having a discussion with met-lab personnel about what they provide and their understanding of practical field-use limitations.

{I would be a little concerned that your consultant may be learning previously unfamiliar procedures “on your dime.”}

I am not certain about this, but what may matter most is that you have defined (or will define) limits and adhere to them, not so much that you have precise measurement capability. It may be that you merely have to document your measurement capabilities and limitations, keep records specific to that matter, and not be extremely accurate in your day-to-day work.
 
Busbar

Thanks for your cooperation. I followed your suggestion and talked to the lab guys. I think they will help us with what we need.
I'll keep you informed.
 
It seems they would calibrate the equipment (making any adjustments necessary) and provide you with a test report showing the errors. The percent error would determine the proper application, based on any relevant standards. Generally, they tune the equipment to the highest accuracy obtainable; it is what it is. If it is not within tolerance, then a new piece of equipment should be purchased to meet the standard, or the old one should be moved to an application that requires less accuracy. I am only used to getting lab equipment calibrated, and we pay a little extra for a report. I have never had to specify a standard to calibrate to. Maybe that is common in the power industry.
 

Prove it for yourself, but let me comment that modern AC instruments seem to show surprising agreement in readings during informal bench testing: agreement on the order of three nines (99.9%) is routinely achievable for most purposes.

Their manufacturers assign very conservative accuracy numbers. Given, for example, a portable relay test set, a portable power analyzer and a true-RMS handheld multimeter with a ‘clothespin’ CT, the readings seem to be very close, or else conspiratorially inaccurate by an oddly similar amount.

 
There are a number of issues here that shouldn't be confused.

Firstly, most of the discussion here is referring to instrument accuracy, which describes the accuracy of the instrument under controlled lab conditions, with respect to a known "standard". This is easily taken care of by the calibration lab: they compare the instrument to their known standard, and if the errors are outside the specifications of the unit, they can tweak it so the unit measures within the specs.
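
As a rough sketch of what that comparison amounts to (the readings and the 1% spec below are invented for illustration):

```python
# Paired readings: (reference standard value, unit-under-test reading),
# both in the same units. The figures are invented for illustration.
readings = [
    (100.0, 100.4),
    (250.0, 251.1),
    (500.0, 501.8),
]

SPEC_PCT = 1.0  # assumed manufacturer accuracy spec: +/-1% of reading

for reference, measured in readings:
    error_pct = 100.0 * (measured - reference) / reference
    print(f"ref={reference}: error={error_pct:+.2f}%  within spec: {abs(error_pct) <= SPEC_PCT}")
# If any point falls outside the spec, the lab adjusts the unit and
# repeats the comparison.
```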

The second issue is the measurement uncertainty, which relates to the confidence that your measured value is accurate. This is a calculation which combines all possible sources of error in the measurement "system" (e.g., for measuring high voltage, you would have a voltage divider and a voltmeter). It accounts for variations in the reading due to things like (in the above example) temperature, frequency, drift and linearity, together with the known limitations of the system (e.g., the accuracy from the calibration, ±half the smallest display digit for a digital instrument, parallax error for an analog instrument), and it also takes into account the uncertainty of the calibration laboratory's own measurements (because they don't have a perfect standard!).

Estimation of the errors and calculation of the uncertainty (and the confidence level) can be done with finger-in-the-air figures, or the values can be found from experience or experimentation. It is not a simple exercise, but in the long run it can give you the confidence that you require to do your work. The overall idea is to give you confidence that what you read on the dial is a true indication of the measurand.
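
For what it's worth, here is a minimal sketch of that combination step, assuming independent error sources; the root-sum-of-squares rule and the k = 2 coverage factor follow the usual GUM approach, while the component values are invented for illustration:

```python
import math

# Standard uncertainty of each source, as a fraction of reading.
# The values are invented for illustration.
components = {
    "voltmeter calibration": 0.002,
    "divider ratio drift": 0.0015,
    "temperature effect": 0.001,
    "display resolution (half a digit)": 0.0005,
}

# Combined standard uncertainty: root sum of squares, valid when the
# sources are independent of one another.
u_combined = math.sqrt(sum(u**2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2, corresponding to
# roughly 95% confidence for a normal distribution.
U = 2 * u_combined

print(f"combined standard uncertainty: {100 * u_combined:.3f}% of reading")
print(f"expanded uncertainty (k=2):    {100 * U:.3f}% of reading")
```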

I don't think ISO 9000 accreditation requires the uncertainty-estimation work; however, if you are working to ISO/IEC 17025, this is spelt out quite clearly. Your cal lab should be able to help with questions about uncertainties, but if you really want to get into it (read: love stats), do a course run by a good metrology company.
 
