
Old New Guy

Status
Not open for further replies.

edpaq (Mechanical) · Apr 29, 2008 · US
Hi All,

I'm new to this forum and have been out of QC for about 15 years. Now I find myself back in, with a lot of catching up to do.

My first of many questions to come is this: in ISO 10012:2003, 7.4.1.f (designated maximum permissible error), I'm not sure I understand this. Are they talking about the accuracy stated by the instrument manufacturer, or am I to determine this number using other factors? If other factors come into play, what do I need to consider, and is there some standard rule or procedure that I need to follow?

Thanks, from the Old/New guy

Eddie
 

I am unfamiliar with the standard, though it looks to be something along the lines of gage R&R in addition to traceability uncertainties. Would it be possible to expand (without violating copyright) on the section you have questions about? It has been a couple of days and I have noticed no responses; more info would be helpful.

Regards,
 
Thanks for the response. You are correct in your assumption; this is about gage calibration requirements, and oops, I should have cited 7.1.4.f instead of 7.4.1.f. It states that calibration records shall contain certain items, such as "the uncertainties involved in calibrating equipment". Further on, the spec requires an estimation and analysis of these uncertainties (documented, of course), and there must be a statement of uncertainty determinations.

I have seen procedures where basically you give a micrometer to someone and have them measure the same attribute several times to come up with the variance of an individual, then do the same with several others and see what the range is in the group. If I understand this correctly, part of the uncertainty is a margin of error due to different individual techniques. Then you have to add in environmental factors, instrument limitations, or anything else that might affect the measuring process.
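The kind of study described above can be sketched in a few lines of Python. All readings, operator names, and tolerances here are made-up numbers for illustration only, not data from any real study:

```python
import statistics

# Hypothetical readings: three operators each measure the same
# feature five times with the same micrometer (values in mm).
readings = {
    "operator_A": [10.002, 10.004, 10.001, 10.003, 10.002],
    "operator_B": [10.005, 10.007, 10.004, 10.006, 10.005],
    "operator_C": [10.001, 10.002, 10.000, 10.003, 10.001],
}

# Repeatability: the spread of each individual's own measurements.
repeatability = {
    op: statistics.stdev(vals) for op, vals in readings.items()
}

# Reproducibility: the spread between the operators' averages,
# i.e. the margin of error due to different individual techniques.
means = [statistics.mean(vals) for vals in readings.values()]
reproducibility = max(means) - min(means)

for op, s in repeatability.items():
    print(f"{op}: stdev = {s:.4f} mm")
print(f"range of operator means = {reproducibility:.4f} mm")
```

A full gage R&R study partitions these effects more formally (e.g. via ANOVA), but this shows the basic idea of separating within-operator scatter from between-operator scatter.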

Hope this explains my dilemma a little better. By the way, I am a one-man show, at least for now, so I can't even start by using my own example above.

All help is greatly appreciated

Eddie
 
edpaq,

Since you are a one-man show, I would suggest using an SPC chart to show your gage/technique stability. Environmental factors can either be captured by a temperature/humidity meter or perhaps listed as an assumption within the calibration documentation. Naturally, data would be preferred, as you can use it to determine or estimate the environmental drift of both the measuring device and the master sample. The master sample you use also has an uncertainty based upon its traceability to a national standards body. Where I work, we use a 10% uncertainty (listing both maximum and RMS) for each "generation" your master is away from the national standard.
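As a minimal sketch of the SPC idea for a one-person shop: track repeated checks of a reference piece over time and flag any new reading outside 3-sigma control limits. The history values below are invented for illustration:

```python
import statistics

# Hypothetical daily check readings of a reference gage block (mm).
history = [25.4001, 25.3999, 25.4002, 25.4000, 25.3998,
           25.4001, 25.4000, 25.3999, 25.4002, 25.4000]

center = statistics.mean(history)
sigma = statistics.stdev(history)

# Shewhart-style 3-sigma limits for an individuals chart.
ucl = center + 3 * sigma
lcl = center - 3 * sigma

def in_control(reading: float) -> bool:
    """True if a new check reading falls inside the control limits."""
    return lcl <= reading <= ucl

print(f"center = {center:.5f}, UCL = {ucl:.5f}, LCL = {lcl:.5f}")
```

A point outside the limits (or a run of points trending one way) suggests the gage or technique has drifted and the calibration should be revisited.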

As for the ISO portion of it, I would think that as long as you capture and document what you plan to track as significant for a measurement device, an auditor would find that acceptable.

Hope this helps. Again, I am unfamiliar with this particular standard, but my feeling is that you are already working on the correct things to consider, per your post yesterday.

Regards,
 
Thanks for the help. I think I have a general understanding of it now. Last time I worked in QC, ISO did not even exist or at least was not well known. Quite a bit has changed since then.

I only have basic measuring tools, nothing more than OD Micrometers, Calipers, Depth Micrometers, and ID Micrometers. Can anyone reference a source where I might get some public data to use? I would assume that it's been done about a million times.

Thanks,

Eddie
 
edpaq,
ISO 10012 has in essence been replaced by ISO/IEC 17025. However, your question is best understood with a good read of the "Guide to the Expression of Uncertainty in Measurement" (affectionately known as "the GUM").
Basically, you need to account for all significant sources of error. You need to identify sources such as repeatability, ambient temperature, humidity, pressure, gravity, magnetic influences, the person doing the calibration, etc. This often requires multiple tests by multiple technicians at random times. The test data is combined according to its uncertainty distributions (rectangular, Student-t, normal, etc.) and then added to the uncertainties of the equipment, standards, cables, and so on.
After that is analyzed, you come up with a "total uncertainty" for your measurement.
That drives your "maximum permissible error", which is usually stated at a 95% confidence level, or 2-sigma.
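The combination step described above can be sketched numerically. This follows the usual GUM recipe (convert each component to a standard uncertainty, root-sum-square them, then apply a coverage factor k=2 for roughly 95% confidence); every budget value below is a made-up example, not a real calibration:

```python
import math

# Hypothetical uncertainty budget for a micrometer calibration (mm).
# Each component is first converted to a standard (1-sigma) uncertainty.
u_repeatability = 0.0010                 # type A: stdev of repeated readings
u_reference     = 0.0008 / 2             # certificate quotes k=2, so divide by 2
u_resolution    = 0.001 / math.sqrt(12)  # rectangular: half-width / sqrt(3)
u_temperature   = 0.0003 / math.sqrt(3)  # rectangular distribution

# Combined standard uncertainty: root-sum-of-squares of the components.
u_c = math.sqrt(u_repeatability**2 + u_reference**2
                + u_resolution**2 + u_temperature**2)

# Expanded uncertainty at roughly 95 % confidence (coverage factor k=2).
U = 2 * u_c
print(f"combined standard uncertainty u_c = {u_c:.5f} mm")
print(f"expanded uncertainty U (k=2)     = {U:.5f} mm")
```

The expanded uncertainty U is the number you would compare against the instrument's maximum permissible error when judging whether the device is fit for use.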
 