ljester133
Automotive
- Jan 24, 2002
- 3
Does anyone know of a standard or best practice for tolerancing a visual gage? The gage is used in conjunction with an individual's eyesight to determine a pass or fail condition. Even assuming perfect eyesight, can you actually differentiate between 0.1mm, 0.01mm, 0.001mm, and 0.0001mm?
What can the human eye actually see accurately?
We would like to correctly tolerance these sight gages on their prints, keeping an accuracy ratio of 10:1. We don't want to over-tolerance the gage and add additional cost to our process.
There has to be some type of standard or best practice for this type of measurement.
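For reference, the 10:1 accuracy ratio mentioned above is commonly applied as the "gage maker's rule": the tolerance on the gage is taken as one tenth of the part tolerance it checks. A minimal sketch of that arithmetic, with a hypothetical helper name:

```python
def gage_tolerance(part_tolerance_mm: float, ratio: float = 10.0) -> float:
    """Gage tolerance under an accuracy ratio (10:1 by default):
    the gage is held to 1/ratio of the part tolerance it verifies."""
    return part_tolerance_mm / ratio

# e.g. a feature toleranced at +/-0.5 mm would call for a gage
# held to +/-0.05 mm under a 10:1 ratio
print(gage_tolerance(0.5))
```

The open question in this thread is what happens when the "instrument" is the human eye: if the eye cannot reliably resolve below some limit, holding the gage to 1/10 of a very tight part tolerance adds cost without adding real discrimination.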