
Rounding dimensions for part inspection

Status
Not open for further replies.

TommD

Mechanical
Jul 25, 2007
Looking for feedback on rounding during part inspection. I've read some older posts directed at the drafting standards (which is what this forum is for), but I cannot find much regarding inspection and rounding. Is a +/- .005" tolerance a hard limit with no rounding (+/- .005 = +/- .005000), or does the next digit get rounded (so a measured deviation of .0054 rounds to .005")? I hope I've been able to express this clearly.
 
I was hoping it would be obvious that my example of 13.0000000001 was exaggerated. I was simply trying to explain an academic point, and stated that we "don't really have any business measuring that many digits out."

John-Paul Belanger
Certified Sr. GD&T Professional
Geometric Learning Systems
 
3DDave,

Your post pretty much sums up where I was going.

J-P,

We have to draw the line in the sand somewhere, and the line is the limit specified on the drawing. If we start accepting parts that are known to be over the limit by a tiny amount (like the 13.000000000001 part), then we've taken the first step onto a slippery slope. How tiny an amount over the limit is acceptable? If I were a designer, I wouldn't want to be worried about how much extra tolerance the inspection people were going to add to my spec.

Evan Janeshewski

Axymetrix Quality Engineering Inc.
 
Evan -- I completely agree. But bear with me while I push my point once again. [smile]

You wrote that it's problematic if we "start accepting parts that are known to be over the limit by a tiny amount." And I agree.
But what is your standard for "a tiny amount"? Therein lies the gray area! It's something which Y14.5 cannot address -- nor should it, since it's clearly not a gaging standard. That's all that I've been trying to say.

The best guidance we have in that regard is the old 10% rule: measure to one extra decimal place. (This was already mentioned above by steveapathy, so my apologies for stringing things out.)


John-Paul Belanger
Certified Sr. GD&T Professional
Geometric Learning Systems
 
Guys, let me throw in my 2¢.

It looks to me like you are trying to measure an imperfect part using an imperfect instrument while assuming you live in a perfect world.

In the real world, things are both more complicated and simpler at the same time.

Let's say we have to measure a part according to a print that says DIA 1.000±.005.
We have a caliper that can display measurements to .001, but its actual measurement uncertainty is .002.
We also have a micrometer that can display measurements to .001, but its actual measurement uncertainty is .0005.

To be certain that our part is good, we SUBTRACT our measurement uncertainty from both sides of the tolerance zone.
So, while the part print still says DIA 1.000±.005, our QC SHEET (or whatever you call that document) will say:
“DIA 1.000±.003 measure with Caliper” or
“DIA 1.0000±.0045 measure with Micrometer”

So, nothing has to be rounded, you just read what your instrument says:
Caliper: 1.003-good, 1.004-bad.
Micrometer: 1.004-good, 1.005-bad.

So, this is the real-world approach – the numbers on your print are absolute, you measure with the best tool available to you, and you compensate for measurement uncertainty.
Now, an interesting question: will using a more accurate tool (micrometer vs. caliper) result in acceptance of more parts? Maybe, maybe not. If your machine is capable of producing parts to DIA 1.000±.003 without effort, you may actually save money by not buying a micrometer. (Naturally this is purely a thought experiment; any shop worth its salt has several measuring instruments. :))
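
A minimal Python sketch of the guard-banding arithmetic above (the numbers are the ones from this example; the function name is only illustrative):

# Guard-banding sketch: shrink the drawing tolerance by the instrument's
# measurement uncertainty to get the acceptance limits for the QC sheet.

def acceptance_limits(nominal, tol, uncertainty):
    """Subtract the measurement uncertainty from both sides of the tolerance zone."""
    return nominal - tol + uncertainty, nominal + tol - uncertainty

nominal, tol = 1.000, 0.005                        # print: DIA 1.000 +/- .005
for instrument, u in [("Caliper", 0.002), ("Micrometer", 0.0005)]:
    lo, hi = acceptance_limits(nominal, tol, u)
    print(f"{instrument}: accept from {lo:.4f} to {hi:.4f}")

# Output: Caliper: accept from 0.9970 to 1.0030     (the QC-sheet 1.000 +/- .003)
#         Micrometer: accept from 0.9955 to 1.0045  (the QC-sheet 1.0000 +/- .0045)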
 
Mark Foster (Applied Geometrics on LinkedIn)
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""
All measurements made for any reason using any technique at all (whether CMM, open set-up, tape measure, ruler, calipers, holding your thumb up in front of your line of sight, etc.) are only estimates of the actual value. Some estimating techniques are better than others; some simply have more uncertainty built in. In the same way, the algorithm that we choose for evaluating the data we have collected (through whatever means) also contributes to our measurement uncertainty. So the choice of measurement technique, as well as our choice of algorithm, will of course have an effect on how well we are able to estimate the actual values.

Having said all that, one needs to understand the *theory*, that is, the standardized definition of what the actual value of a given specification is, in order to make good decisions about the best measurement techniques and algorithms to select. One would also need to factor in the manufacturing method(s) and their inherent likelihood of producing certain types of errors (or lack thereof) to make sound decisions about how best to measure a given feature/characteristic.

We often take "short-cuts" (e.g. using a few data points and averaging them) for expediency's sake, but we need to have our eyes wide open when we take such short-cuts so that we understand the potential ramifications for our results and subsequent decisions.

The only "conditions" that I would place on the derivation of the datum plane are those imposed by the Y14.5 standard, which is what I thought we were trying to prove or disprove. The standard states that you are allowed to optimize the plane, or "best fit" to use a common (but not standardized) term. So to use the plane that you derived to "disprove" that the two surfaces are parallel within 0.4 means to me that you simply have not yet found the correct plane. In other words, as Bill said, you have a set-up error.

Your point about the fact that we do not always take the opportunity to adjust the measurement system is a perfectly valid one. We often do take the inspection results from the CMM (or other methods) as "gospel" without knowing the details of how those results were derived. And that has been my point all along with not only this thread, but many others -- i.e. that we must understand the *theory* first, and then come up with *practical* ways of approximating that theory.

Your example is one (of many that could be used) that demonstrates a high degree of measurement uncertainty. If we understand the theory, then we would know that those results from that setup were suspect at best, and would hopefully seek a measurement system that had a higher degree of confidence (less uncertainty).
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
 
CH,
Out of curiosity, why are you saying that in your example of DIA 1.000±.005 a caliper measurement of 1.004 means the part is bad?

Assuming that everything else, besides the inspection equipment (caliper), has no impact on the accuracy of the measurement results, I would think the only thing we can say for certain based on your numbers is that for measurements within <.997; 1.003> the part will undoubtedly be good.

However, measurement results of .993, .994, .995, .996, 1.004, 1.005, 1.006, or 1.007 merely mean the part may be good or bad; they do not automatically mean it is bad.
 
The goal of inspection is not to decide if a part is OK or NG. The goal of inspection is to GUARANTEE THAT ONLY GOOD PARTS ARE ACCEPTED! As CheckerHater says, an inspection that doesn't take into account the accuracy/repeatability/uncertainty of the measuring instrument is not worth the dingleberry hanging from a stray dog's flea-infested hind end.

There is a difference between a bad part and a rejected part. The results you mention may not signify bad parts. As you said, they signify unknown parts. They don't automatically mean the part is bad, but they most certainly do mean that the part must be rejected. If you are inspecting with that instrument, you must reject unknown parts if the goal is to guarantee only good parts are accepted. That is the criterion. If you have a marginal process and you want to reject fewer unknown parts, get a more accurate measuring instrument that will allow you to be certain about parts that are closer to the line.

If the upper spec limit is 13 and you measure 13.0, 13.00, 13.000, or 13.00000000, you still have to reject the part because you have no idea whether the next digit is going to be a 1. So if you take 5 seconds and measure 13.00 with calipers, you must reject it because you don't know it's good. Maybe good, maybe bad. But if you really don't want to scrap the part and you can take 3 hours to set up and measure 12.999999710 with some ridiculously accurate piece of measurement equipment, you can now accept it because you know it's good. However, if you end up measuring 13.000000000, you STILL HAVE TO REJECT.
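
A minimal Python sketch of that accept/reject rule (the lower limit of 0.0 and the uncertainty values below are placeholders for illustration; only the upper limit of 13 comes from the example):

# Sketch: accept a part only when the reading, plus or minus the instrument's
# uncertainty, lies entirely inside the spec limits; anything that *might* be
# out of spec is rejected, even though it may in fact be a good part.

def disposition(reading, lsl, usl, uncertainty):
    lo, hi = reading - uncertainty, reading + uncertainty  # where the actual size could be
    if lsl <= lo and hi <= usl:
        return "accept (certainly in spec)"
    if hi < lsl or lo > usl:
        return "reject (certainly out of spec)"
    return "reject (unknown - might be good, might be bad)"

USL = 13.0          # upper limit from the example; the LSL of 0.0 is just a placeholder
print(disposition(13.00, 0.0, USL, 0.01))           # calipers, ~.01 uncertainty: reject (unknown)
print(disposition(12.999999710, 0.0, USL, 1e-7))    # very accurate setup: accept
print(disposition(13.000000000, 0.0, USL, 1e-7))    # still unknown, so still reject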

-handleman, CSWP (The new, easy test)
 
I'm reminded that the assigned cause of several 737 crashes was parts that were just on the edge of drawing requirements.

A chart I like to draw has a 2 X 2 grid. On one edge is Acceptable/Not Acceptable; on the neighboring edge Useable/Not Useable.

I think it is up to engineering to ensure that the criteria for acceptance selects as many of the Useable parts as possible while separating out the Not Useable ones as Not Acceptable. Those parts that are Useable and Not Acceptable are waste and those that are Not Useable and are Acceptable are a nightmare. Since no part lives alone, the borders can seem fuzzy. If the pin is bigger than spec is it OK 'cause the mating hole in this assy is on the high side? That sort of thing.

I think it is up to manufacturing to make parts that are in the Acceptable range (preferably away from the borders) and up to QA/QC to tell the difference.

Of course, the usual statements about cooperating, throwing things over walls, and all of that apply.

There has to be a confidence in the measurements, but there is also uncertainty in any measuring. Were it possible, the variability in measurement would be included in tolerance analysis when the values were assigned. Holding the tolerances as absolute should push the discussion into a tradeoff between manufacturing precision and QC measurement errors.
 
I agree that the goal of inspection is to guarantee that only good parts are accepted; however, it should not be the role of inspection to influence the functional acceptance criteria by selecting one piece of measurement equipment or method over another. Quite the contrary – it is the functional requirements that should drive the selection of inspection equipment and methods. In other words, it is design engineering's role to define the absolute minimum required to qualify the part as good or bad.

So, in my opinion, an inspection plan containing statements like
“DIA 1.000±.003 measure with Caliper” or
“DIA 1.0000±.0045 measure with Micrometer”
isn't the way to go. The inspection plan, again in my opinion, should rather state that, from the design engineering point of view, it will be sufficient if the dimensional requirements are checked to an accuracy of at least (N) digits after the decimal separator, with a maximum acceptable measurement uncertainty of (u).

And then it is up to inspection to choose the adequate instruments and methods to meet these demands.
 
pmarc said:
it is design engineering's role to define the absolute minimum required to qualify the part as good or bad.
Design Engineering specifies the functional requirements by releasing the drawing.
The rest is none of their business. Manufacturing will pick tools accurate enough to make good parts and cheap enough to turn a profit.
pmarc said:
it should not be the role of inspection to influence the functional acceptance criteria by selecting one piece of measurement equipment or method over another.
It is their job to pick the tools to do the work right. Out of curiosity – who else would select the measurement equipment?
And to set things straight, I did not suggest that the inspection plan should say exactly this:
“DIA 1.000±.003 measure with Caliper” or
“DIA 1.0000±.0045 measure with Micrometer”
Company A may decide to go with a micrometer, while company B has not invested in "micrometers" yet. Nevertheless, both may be capable of producing acceptable parts.
In fact, "Caliper" and "Micrometer" are just used to illustrate the concept – ambiguity of any kind is quantifiable and should be considered when proving conformance.
 
CH,
I agree, it is inspection's job to pick the tools to do the work right. But what does "right" actually mean? And will it always mean the same to you, to me, and to other people?

If I am a design engineer producing a drawing with a feature DIA 1.000±.005, it should be my job to define what "right" really is. For example, it should be my duty to specify that the feature needs to be measured to an accuracy of at least N=4 decimal digits and with a measurement uncertainty not greater than u=.0005. This is the absolute minimum I require. And it is up to inspection to choose the proper tools to meet these requirements, that is, to do the job "right". I am not telling them: "Use a micrometer" or "Use a caliper" or "Use a CMM" or "Use a tape measure". I am telling them: "Use whatever tool you like as long as you are able to satisfy my demands".

If they follow my instructions, there will not be a situation where (for N and u as above) a reading of 1.004 is rejected in one place and accepted in another. This could easily happen if company A had micrometers and company B did not. But in my case, 1.004 will always be OK, because the interval of unknown readings on this side of the tolerance limits will always be clearly defined, <1.0046;1.0055>, and because company B will never be allowed to measure this feature due to lack of an appropriate instrument.
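
A minimal Python sketch of how that <1.0046;1.0055> interval falls out of N=4 and u=.0005 (the variable names and the integer-count representation are only illustrative, not from the thread):

# For an upper limit of 1.0050" and uncertainty of .0005", find which 4-digit
# readings are "unknown" (could be either in or out of spec). Working in
# integer counts of .0001" keeps the boundary comparisons exact.

USL, U = 10050, 5                            # 1.0050" and .0005" in .0001" units

unknown = []
for reading in range(10040, 10071):          # scan readings 1.0040" .. 1.0070"
    lo, hi = reading - U, reading + U        # the actual size could be anywhere in [lo, hi]
    if lo <= USL < hi:                       # the limit falls inside the band: can't be sure
        unknown.append(reading / 10000)

print(unknown[0], "to", unknown[-1])         # 1.0046 to 1.0055; readings at or below 1.0045
                                             # are certainly good, 1.0056 and up certainly bad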

And the same logic applies to manufacturing. By specifying tolerance limits on a drawing, I am (in most cases) not dictating any particular manufacturing method. I do not care how the feature will be produced. It must be produced to meet my requirements – that is all I need. If they are not able to make it, I will search for another company that can satisfy me. But I will certainly not change my functional needs just because someone uses a manufacturing process that can't produce what I need.

Do you see my point?
 
Well, pmarc, it's hard to get to your point when you say it's not inspection's job to select measuring equipment, but it is the designer's job to specify measurement uncertainty.
It's really sad that you don't see what my point is.
As soon as you have specified DIA 1.000±.005, your job is done. It will always mean the same to you, to me, and to other people. It is not your job to specify the process. Or maybe I didn't get the memo?
pmarc said:
company B will never be allowed to measure this feature due to lack of an appropriate instrument
As long as company B ships you parts that are within tolerance, you will accept them – you have a contract. It's not your job to tell them how to do theirs.
Now let's go to company C, which is using C-gages made to a gage-maker's tolerance of 10%. The "good" parts falling within that 10% will be rejected. Are you OK with that?
The point is, neither your drawing nor your measurements are perfect. As long as you know how imperfect they are, and you take that into consideration, you are in control.
 
Sorry, not a member.
I'll take your word for it - must be interesting.
 