Hiviz
Mechanical
- Jan 17, 2011
Hi
For testing purposes I have to measure the change of a distance (about 50mm). Easy. I would like to use a steel ruler instead of a dial gauge with rod for it, but the standard I am working to says that the instrument should have a resolution of 0.5mm and a precision of 0.1mm. I can get a ruler with 0.5mm increments, but I am struggling to tell what the precision of my instrument (the ruler) is. And then there is accuracy, but I guess I have that sorted if I have the ruler calibrated...
I understand precision from the definition point of view (precision is how close repeated measured values are to each other), but I am unable to apply this to my ruler problem. How can an instrument have a precision of 0.1mm when the resolution is 0.5mm?
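To make the definition concrete for myself, here is a minimal Python sketch with made-up repeated readings of the same ~50mm distance (the values are purely illustrative, not real data). It just quantifies precision as the spread (sample standard deviation) of the repeats, while resolution is the smallest increment the scale can show:

```python
import statistics

# Hypothetical repeated readings (mm) of the same distance, taken with
# an instrument whose resolution is 0.5 mm: every reading is therefore
# quantized to the nearest 0.5 mm graduation.
readings = [50.0, 50.5, 50.0, 50.0, 50.5, 50.0]

resolution = 0.5  # smallest increment the scale displays (mm)

# Precision is commonly expressed as the spread of repeated readings,
# e.g. the sample standard deviation.
precision = statistics.stdev(readings)

print(f"resolution = {resolution} mm")
print(f"precision  = {precision:.2f} mm (std dev of repeats)")
```

With these made-up numbers the standard deviation of the repeats comes out smaller than the 0.5mm graduation, which is part of what confuses me: the two quantities are clearly not the same thing.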
Hope somebody understands my problem...