How to properly use a digital caliper?

Status
Not open for further replies.

BreadboardPerson

Mechanical
Apr 12, 2016
Hi, I zero the caliper at the beginning of each use. I've noticed that I get different values depending on where I measure, and even at the same location the reading changes with how much force I use to close the caliper, so I get different readings all the time. What is the proper way to take a measurement? How much force should I use to close the jaws? Even closing the jaws lightly gives different values.
 
By how much? It is possible the fixed jaw is not as fixed as you expected, or that something else is loose. On micrometers there is usually a clutch mechanism on the adjustment knob to set the correct force, but I've never noticed anything similar on calipers. I have several from Harbor Freight, and cheap as they are, they don't act like that.
 
Why not send a note to the manufacturer? You might get an answer, or a replacement.
 
Have you cleaned the jaws?
Have you cleaned the part?
Are the jaws straight?
Are the jaws undamaged?
Do you have a known reference item that you can measure several times with different ways of holding the caliper?
It is best to use the thumbwheel to close the jaw on the part without forcing anything.
By applying the force at the thumbwheel, carefully, you can apply the same force to the outside jaws as to the inside jaws, thereby getting consistent measurements.

Reference:
Krar, Oswald, Technology of Machine Tools

 
I'd have to say that calipers gained rather more resolution than accuracy when they went digital. Micrometers have a built-in torque limiter for that reason; otherwise people would use them as G-clamps.

Cheers

Greg Locock


 
I still use a 'very-near' (vernier) caliper.

Something is worn or loose if your zero is that variable.

Ted
 
Don't squeeze the part. Gently close it and take a reading.

Also, on top of what other people have said, watch out for Abbe error. Grip the object as close as possible to the bottom of the jaw.
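
To put a rough number on it (a sketch with assumed values, not figures from this thread): the Abbe error for a contact point offset a distance $h$ from the scale, when the slide rocks by a small angle $\theta$, is approximately

$$\varepsilon \approx h \tan\theta \approx h\,\theta$$

so a barely visible tilt of about $3\times10^{-4}$ rad (roughly one arcminute) with the part held 1" out from the scale already gives an error near 0.0003". Measuring out at the jaw tips increases $h$, and the error grows with it.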
 
How stiff is the object you are measuring? Sometimes in a pinch I use calipers to measure O-rings; I close them until there is just a tiny bit of drag on the part.

 
Are the jaws parallel? Close them and hold them up in front of a light. Make sure no light passes through along the length of the jaws. You could also measure a .001" shim at various points along the jaw length.

I agree with Ted that the movable jaw is loose if the readings vary. The jaw can rock depending on where the object is in the jaw and where you are grabbing the movable jaw.

Ideally you would develop a "feel" for closing force by measuring a known round or flat object. If you are measuring a round object then first measure a calibrated round pin of about the same diameter. If you are measuring a flat object then first measure a known, flat gauge block.

Put the part in the jaw at the same location as your calibrated gauge. Apply the load to the jaw at the same point.
 
Get some certified gage blocks and practice. If you know that the block is 2.00000" then see what it feels like to measure that. You should check zero, about 25% of scale, and about 75% of scale. So if it is a 6" caliper, you should have a block in the 1-2" range and one in the 4.5-5.5" range. I had a digital caliper with a damaged sensor strip; it only worked out to 3.5".
If you are measuring parts that are supposed to be square and flat, then make sure that the jaws are fitting flush, with no gaps. Microfiber lens cloths are nice for cleaning the jaws.
And unless you are in a lab, 0.0005" is the limit of what you can measure, regardless of what the tool says.
To go finer than that you have to worry about temperature control and a bunch of other stuff.
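
To see why (a rough worked example with textbook values, assuming a steel part): linear thermal expansion follows

$$\Delta L = \alpha\,L\,\Delta T$$

so a 6" steel part with $\alpha \approx 6.4\times10^{-6}/°\mathrm{F}$ that warms just 10°F grows by about $6 \times 6.4\times10^{-6} \times 10 \approx 0.0004$", which is already close to that 0.0005" floor. Handling a part, or measuring it fresh off a machine, can easily swing it that much.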

= = = = = = = = = = = = = = = = = = = =
P.E. Metallurgy, consulting work welcomed
 
If I read it right, zero is consistent; it is just the part measurement that varies.
Does the part "get smaller" when you apply extra force?

Sometimes the first contact of the caliper is skewed or tipped, which would not give a true diameter, etc. In that case a little extra pressure might "align" the caliper better on the part.

A mechanical engineer I worked with would get vastly larger-than-life readings with his name-brand digital caliper. His technique always struck me as "clamp and go."

Measuring tools are supposed to be immune to operator "feel" but it seems most are not.
 
BreadboardPerson,

Find some gauge blocks and measure them. Whatever technique gets you the correct measurement of the gauge block is the correct technique.

--
JHG
 
Your problem is most likely a lack of a tight fit between the movable jaw and the slide it is mounted on. There are adjustable gib screws to assure a tight fit without too much friction. Even then, these parts are not infinitely rigid, and you will change the reading by pressing too hard with the thumb wheel. This can be eliminated by squeezing the jaws together with your fingers on the outside of the jaws, with the object being measured between your fingers. With this technique the jaw clamping force does not go through any flexible parts of the caliper.
 
The pipe wrench of precision measurement.
 
One trick we used to use was to put a thumbprint on the surface blocks, and then adjust the vernier so it just smudged the thumbprint.

Cheers

Greg Locock


 
Quote: "Measuring tools are supposed to be immune to operator 'feel' but it seems most are not."

Take an introductory course in either metrology or machine ops; nothing could be further from the truth. This is why hiring experienced inspection staff is critical, as is keeping precision measuring tools away from most engineers. Everything deflects, everything is affected by heat, and accuracy means nothing without repeatability. That said, calipers are only intended to measure within 0.002-0.003". If the OP isn't repeatable to that level, then they should spend some time practicing on gages or other parts of known size.
 
There's a formal method called gage R&R which attempts to apportion the sources of variation in measurements. I've regularly seen non-metrology setups with an R&R of 30%; that is, 30% of the measurement variation was due to things other than the thing being measured.
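
For anyone curious how that split is computed, here is a minimal sketch in Python (all names and numbers are invented for illustration, and this is a crude variance split rather than the full AIAG ANOVA method):

import numpy as np

# Toy gage R&R study: 3 operators each measure 5 parts 3 times (inches).
# Parts are spread to mimic real process variation; all values assumed.
rng = np.random.default_rng(0)
parts = np.array([0.998, 1.000, 1.001, 1.003, 1.005])   # assumed true sizes
op_bias = rng.normal(0.0, 0.0005, size=(3, 1, 1))       # operator "feel" offsets
noise = rng.normal(0.0, 0.0010, size=(3, 5, 3))         # repeatability scatter
readings = parts[None, :, None] + op_bias + noise       # shape (op, part, trial)

# Crude variance split:
repeatability = readings.var(axis=2, ddof=1).mean()     # within operator/part
cell_means = readings.mean(axis=2)                      # (operator, part) averages
reproducibility = cell_means.mean(axis=1).var(ddof=1)   # between operators
part_to_part = cell_means.mean(axis=0).var(ddof=1)      # between parts

grr = repeatability + reproducibility
total = grr + part_to_part
print(f"%GRR = {100.0 * (grr / total) ** 0.5:.0f}% of total standard deviation")

The usual rule of thumb is that a %GRR under 10% is acceptable, and anything over 30% means the measurement system, not the parts, is what you are really measuring.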

Cheers

Greg Locock


 