
Resolution, precision, and accuracy of a ruler


Hiviz

Mechanical
Jan 17, 2011
28
Hi

For testing purposes I have to measure the change of a distance (about 50mm). Easy. I would like to use a steel ruler instead of a dial gauge with rod for it, but the standard I am working to says that the instrument should have a resolution of 0.5mm and a precision of 0.1mm. I can get a ruler with 0.5mm increments, but I am struggling to tell what the precision of my instrument (the ruler) is. And then there is accuracy, but I guess I have that sorted if I have the ruler calibrated...
I understand precision from the definition point of view (precision is how close the measured values are to each other), but am unable to apply this to my ruler problem. How can an instrument have a precision of 0.1mm when the resolution is 0.5mm?

Hope somebody understands my problem...
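
To see where the clash sits, here is a rough Python sketch; every number in it is assumed purely for illustration (a hypothetical 50.24 mm gap, a guessed ±0.1 mm eye/parallax error, readings taken to the nearest 0.5 mm mark):

import random

# Assumed scenario: the true gap happens to sit near the midpoint between two
# 0.5 mm marks, and each reading is the true value plus a small observer error,
# then taken to the nearest mark (the rule's 0.5 mm resolution).
random.seed(0)
division = 0.5          # mm, graduation spacing (resolution)
true_value = 50.24      # mm, hypothetical length

readings = []
for _ in range(10):
    eye_error = random.uniform(-0.1, 0.1)   # assumed observer error, mm
    readings.append(round((true_value + eye_error) / division) * division)

print(readings)                                           # values hop between 50.0 and 50.5
print("spread =", max(readings) - min(readings), "mm")    # 0.5 mm, well outside the 0.1 mm asked for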

 

If your ruler's markings are every 0.5 mm then you can't read, for example, 0.1, 0.2, 0.3, 0.4, 0.6 and so on.
 
i've understood that precision was 1/2 the smallest unit marked on the scale (which would make resolution the smallest unit, no?)

does the standard define its terms ?

 
Yes, I understand that. But why does the standard state that the instrument needs to have a resolution of 0.5mm AND a precision of 0.1mm? I can't see how that is possible with any instrument, let alone the ruler.
But maybe it means they want the measured result with one decimal place???
 
Hi rb

Yes, that's how I first understood it, but how can you get a precision of 0.1mm with a 0.5mm resolution? If you take the precision as 1/2 the smallest unit, the best I can achieve is 0.25mm. But according to the standard it seems possible to get a precision of 0.1mm with a 0.5mm resolution.
There are no further definitions given in the standard.
 
which is the higher requirement ? ... precision of 0.1mm reflects a resolution of 0.2mm, which exceeds the requirement of 0.5mm (so who'd complain ?)
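
Put as plain arithmetic, taking rb1957's rule of thumb that achievable precision is about half the smallest division (a sketch, not anything the standard itself states):

required_precision = 0.1                    # mm, what the standard asks for
division_needed = 2 * required_precision    # 0.2 mm graduations would support it
rule_division = 0.5                         # mm, the rule actually specified
best_precision = rule_division / 2          # 0.25 mm, the best such a rule can offer
print(division_needed, best_precision)      # 0.2 0.25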
 
Is this standard public or specific to a company? If it is a company standard then it is just stupid (either the resolution needs to be 0.2 mm or the precision needs to be 0.25 mm). That kind of inconsistency does creep into public standards but it is much less common. Last time I looked at the NIST definitions and standards (nearly 20 years ago now), they were pretty definite that the precision could not be smaller than half the smallest increment of resolution on any analog instrument.

David
 
you also wonder if precision refers to the whole range ... i.e. does 0.1 mean ±0.05?
 
Precision is to some extent repeatability.

So, if you measure the same 'gold standard' (or maybe this time platinum iridium in a climate controlled environment standard) say 10 times, are all 10 readings within .1 of each other?

You may not be able to verify the 'precision' of your instrument using the instrument itself.

I guess the point in your situation is you don't want a part that is measured initially and 'rounded up' to the next .5mm increment, to be measured again and 'rounded down' to the lower .5mm increment.

(Or something like that; you want to ask the guy sitting opposite me, he loves to go on a rant bemoaning people's lack of understanding of the difference between accuracy and precision in a metrology context)

Posting guidelines faq731-376 (probably not aimed specifically at you)
What is Engineering anyway: faq1088-1484
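
A minimal sketch of the check KENAT describes, with invented readings (the 0.5 mm steps are simply what a rule graduated every 0.5 mm will return):

# Measure one reference block ten times; do all readings agree within 0.1 mm?
readings = [50.0, 50.5, 50.0, 50.0, 50.5, 50.0, 50.5, 50.0, 50.0, 50.5]   # mm, invented
spread = max(readings) - min(readings)
print("spread =", spread, "mm")             # 0.5 mm
print("within 0.1 mm?", spread <= 0.1)      # False: the graduation size dominates the spread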
 
My old Engineering Measurements textbook describes "precision" as an inverse measure of the scatter of the data. I interpret this to mean that a series of identical inputs will yield outputs within +/- 0.1 mm of that input in the present example.

It defines "resolution" as the minimum change in input for which there will be a change in output, in this case, 0.5 mm.

It seems possible to me that the precision can be smaller than the resolution.

This textbook is about instrumental measurements and instrument performance, so I am not certain these definitions have the same meaning for something like a steel scale.

Regards,

Mike
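
Putting a number on the textbook wording Mike quotes, assuming "scatter" is read as the sample standard deviation of repeat readings (the data are invented):

import statistics

readings = [50.0, 50.5, 50.0, 50.5, 50.0]   # mm, invented repeat readings of one gap
scatter = statistics.stdev(readings)        # about 0.27 mm
print("scatter =", round(scatter, 2), "mm")
# By the same book's definition, the resolution stays at one 0.5 mm division:
# the smallest change of input guaranteed to move the reading to the next mark.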
 
The current usage of "precision" is equivalent to "repeatability," and would be applicable to your dial gauge, i.e., that repeated measurements of the same quantity all fall within a ±0.1mm spread. Likewise, since you are essentially the "dial" when using a steel rule, you'd need to be repeatable to within ±0.1mm, which you obviously can't be, so you need a rule with higher resolution to support the required precision. If you can't get one, then you need to use something else to meet the requirement.

TTFN

FAQ731-376
Chinese prisoner wins Nobel Peace Prize
 
Thanks for your thoughts!

So either the standard is wrong
or I need an instrument that has a resolution of 0.5mm and a precision of 0.1mm. I am still not sure how that is going to work, since the resolution does not allow me (or the dial) to read in 0.1mm steps, so my results would spread over more than 0.1mm. Can anybody come up with an instrument that fulfills the above requirements (or should I go down the line that there is a typo in the standard...?)
 
Perhaps I'm looking at this too simply, but I would take this to mean your ruler needs to be scaled in 0.5mm increments and these 0.5mm increments need to be precise to within 0.1mm.
 
Tuckabag, I took it the same way as well. My rule is in 0.5mm increments (the first 100mm anyway) with a stated accuracy of "+/- 0.2mm at 20 degrees C"

Obviously it would be foolish to measure something to 52.3mm but I at least know if something comes out at 52.5mm it is between 52.3 and 52.7mm (except I'm not standing in the metrology lab so it's going to be wrong)

Designer of machine tools - user of modified screws
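
The bracket NinjaNobody quotes, spelled out (his figures; only the arithmetic is added):

reading = 52.5            # mm, value read off the rule
stated_accuracy = 0.2     # mm, the "+/- 0.2mm at 20 degrees C" figure
print(reading - stated_accuracy, "to", reading + stated_accuracy, "mm")   # 52.3 to 52.7 mm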
 
if precision is interpreted as repeatability then it isn't solely dependent on the scale 'cause the measurer has an impact, no?
 
Hence IRstuff's point: it's the 'system'/'method' repeatability, not just the rule.

Posting guidelines faq731-376 (probably not aimed specifically at you)
What is Engineering anyway: faq1088-1484
 
Thanks for all your input!

IRstuff
Yes, I agree, but why would a standard state a resolution of 0.5mm
and a precision of 0.1mm???

Tuckabag/Ninja
If the standard asked for an accuracy of 0.1mm I would go with you, but accuracy and precision are two different things.

Guess I have to find out from whoever wrote the standard. Wish me luck!

 
Is this an industry standard?

Standards are like specs; even in the umpteenth rev, errors and omissions can still be found, if you look hard enough.

We can also play the game of what do the numbers and their precisions mean in the standard. Does the standard call out ±0.1mm, or ±0.10mm? If the former, then your steel rule might still work out, since your measurement uncertainty is ±0.25mm, and your repeatability ought to be about ±0.125mm, which rounds down to ±0.1mm if only 1 decimal place is used. While this approach is a bit cheesy, it's not an implausible position to take.

TTFN

FAQ731-376
Chinese prisoner wins Nobel Peace Prize
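
IRstuff's rounding argument run as bare arithmetic (the factor of two between the reading uncertainty and the repeatability is his estimate, not a given):

import math

division = 0.5                                 # mm graduations on the rule
reading_uncertainty = division / 2             # +/-0.25 mm, half a division either way
repeatability = reading_uncertainty / 2        # +/-0.125 mm, IRstuff's estimate
quoted = math.floor(repeatability * 10) / 10   # rounding down to one decimal place, as he suggests
print(quoted)                                  # 0.1 -> can be read as a "+/-0.1 mm" figure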
 
OK, sure - the "book" has a very mushy definition that cannot be met in practice.

But have you thought about the "carbon-based-ruler-interface" here? What is scribed on the ruler is almost irrelevant compared to how the reader is actually looking at the "ruler" and determining exactly where he is reading: any parallax? Is he looking dead-on perpendicular to the surfaces? How is he determining what he is reading? If I stick a ruler down a hole, but the hole is drilled out, is my ruler tiny enough to go partially "into" the 118 degree tip of the hole?

Is this a thickness where you are laying the ruler across a rounded edge or an irregular edge? Or a machined square corner? Do you have a sliding gage or "stop" that you move to the surface/edge and then read?
 