roberto1brazil
Mechanical
- Apr 3, 2011
- 50
Hello
Please, I would like to get an idea of what could be happening when comparing inspection results between a CMM and a conventional micrometer. Here at my company we have a part (casing) with a hole that was repaired by installing a bush in it. To clarify my question, I have attached a sketch showing the main dimensions according to the BP drawing.

Due to a special internal requirement, the operator was asked to leave the final size of the bush, after machining, at an internal diameter of 59,995 mm. He worked very carefully and at the final machining cut reached 59,995 mm, inspected with a three-point internal micrometer (refer to the photo to see the kind of micrometer in use). A dial gauge (0,001 mm resolution) was used to check the roundness of the hole after machining, and it showed no more than 0,001 mm at a depth of approximately 8 mm. The temperature of the shop floor where the jig bore machine is located is 21 °C ± 1 °C.

It was then necessary to check the internal diameter with another inspection process to be sure the required dimension had been achieved, so the part was sent to a CMM (DEA Global). The room temperature there is also controlled (21 °C ± 1 °C). The probe is calibrated at each shift change or whenever the probe type is changed. The probe touches eight points on the internal diameter at a depth of 8,0 mm. The final result was an average diameter of 59,998 mm, with a maximum diameter of 60,001 mm and a minimum diameter of 59,996 mm.

My questions are: What could cause this difference between the two results (it is small, but it is there)? Is it due to the measurement uncertainty of each inspection instrument? Is the roundness indicated by the CMM (60,001 − 59,996 = 0,005 mm) real, or is it also due to the machine's uncertainty? I apologize if I am asking something foolish, but answers to this kind of question are very difficult to find.
Thanks and regards.
Roberto1Brazil
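For illustration only, here is a minimal Python sketch of how a CMM-style least-squares circle fit over eight probed points produces an "average" diameter together with max/min values. This is not the DEA Global software, and the probe coordinates and the 3-lobe form error amplitude are hypothetical; it only shows why the diameter reported from a small number of probed points need not coincide exactly with a three-point micrometer reading on a bore with a few micrometres of form error.

```python
# Minimal sketch (hypothetical data, not the DEA Global software): a least-squares
# circle fit over eight probed points gives an average diameter, and the max/min
# radial deviations give an out-of-roundness estimate.

import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: returns centre (cx, cy) and radius r."""
    # Circle model: x^2 + y^2 = 2*cx*x + 2*cy*y + c, with r^2 = cx^2 + cy^2 + c
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

# Eight equally spaced probe hits on a slightly tri-lobed bore (values in mm).
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
nominal_radius = 59.998 / 2.0
lobing = 0.0025 * np.cos(3.0 * angles)   # hypothetical 3-lobe form error
x = (nominal_radius + lobing) * np.cos(angles)
y = (nominal_radius + lobing) * np.sin(angles)

cx, cy, r = fit_circle(x, y)
radial = np.hypot(x - cx, y - cy)

print(f"Least-squares (average) diameter: {2 * r:.4f} mm")
print(f"Largest probed 'diameter'       : {2 * radial.max():.4f} mm")
print(f"Smallest probed 'diameter'      : {2 * radial.min():.4f} mm")
print(f"Out-of-roundness estimate       : {radial.max() - radial.min():.4f} mm")
```

The point of the sketch is that a least-squares diameter computed from a handful of discrete points responds to form error differently than a three-point contact instrument, so a difference of a few micrometres between the two methods can appear even with both rooms held at 21 °C.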