Instrument Validation

josephn16 (Chemical), Sep 5, 2002
Hi all,

Here's my situation: I have to measure the density of a liquid in my process. The specification is 1525 g/L +/- 2.5 g/L. The instrument that I have been given to measure the density reads from 1500-1600 g/L, in increments of 2 g/L.

I do not feel this instrument is right for the process, because its graduations are 40% of the total spec range (2 g/L increments against a 5 g/L band). My biggest problem is that I need solid proof: both the spec and the instrument came from another of our plants that "invented" the process, and they will not consider a change request without some kind of evidence. Are there any tests that I can perform to prove my claim one way or another?
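To put numbers on the 40% figure, here is the arithmetic (Python used purely for illustration):

```python
# Arithmetic behind the 40% figure. Spec: 1525 +/- 2.5 g/L; graduations: 2 g/L.
tolerance_band = 2 * 2.5   # total spec width, g/L
graduation = 2.0           # smallest instrument step, g/L

print(f"Graduation is {graduation / tolerance_band:.0%} of the spec band")  # -> 40%
```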
 

The GENERAL rule of thumb is this:

If your ruler measures down to 0.01, then treat it as accurate to about 0.1. Put the other way around: to hold a tolerance of 0.1, you want graduations of about 0.01, i.e. one tenth of the tolerance.

Having spent years in a quality lab as a metrologist, I consider that a pretty safe rule to live by.
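Applying the one-tenth reading of that rule to the spec above (the 10:1 ratio is the rule-of-thumb assumption here, not a hard requirement):

```python
# 10:1 rule of thumb applied to the original poster's spec.
tolerance_band = 5.0                       # +/- 2.5 g/L -> 5 g/L total
required_graduation = tolerance_band / 10  # ~0.5 g/L by the rule of thumb
actual_graduation = 2.0                    # g/L, what the instrument offers

print(f"Rule of thumb wants ~{required_graduation} g/L steps; "
      f"the instrument reads in {actual_graduation} g/L steps.")
```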
 
You could start with ASTM E 29-02, Standard Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications.

Regards,

Cory

Please see FAQ731-376 for tips on how to make the best use of Eng-Tips Fora.
 
The proper way to do this, if you wish to provide statistical proof that you have the wrong tool for the job (which you do), is a gage repeatability and reproducibility (GR&R) study. Basically it involves having different operators test the same samples several times. I imagine Google will find a description.
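A minimal sketch of what the calculation looks like; the densities, operator biases, noise level, and the simplified variance split below are all invented for illustration, and a real study would follow the AIAG MSA procedure:

```python
import numpy as np

# Illustrative GR&R sketch: three hypothetical operators each measure the
# same five samples three times on the 2 g/L density meter.
rng = np.random.default_rng(0)
true_density = np.array([1523.0, 1524.5, 1525.0, 1526.0, 1527.5])  # g/L
operator_bias = np.array([-0.3, 0.0, 0.4])                          # g/L

# readings[operator, sample, trial]
readings = (true_density[None, :, None]
            + operator_bias[:, None, None]
            + rng.normal(0.0, 0.8, size=(3, 5, 3)))
readings = 2.0 * np.round(readings / 2.0)   # 2 g/L graduations quantize everything

repeatability = readings.std(axis=2, ddof=1).mean()       # within-cell scatter
reproducibility = readings.mean(axis=(1, 2)).std(ddof=1)  # operator-to-operator
grr = np.sqrt(repeatability**2 + reproducibility**2)

tolerance_band = 5.0   # +/- 2.5 g/L
print(f"%GRR of tolerance (6-sigma spread): {100 * 6 * grr / tolerance_band:.0f}%")
```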



Cheers

Greg Locock
 
A GR&R will not tell you whether you have the right tool for the job; it tells you whether you can repeat the same measurement consistently with that tool. For example, I can measure the width of a pencil tip with a meter stick and get the same number every time (good GR&R), yet still be using the wrong piece of equipment for the value (a caliper or micrometer would be a better choice).
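A toy version of the meter-stick point (all numbers invented): a coarse instrument can repeat perfectly and still be the wrong tool.

```python
import numpy as np

rng = np.random.default_rng(1)
true_width = 0.7                                   # pencil tip, mm
readings = true_width + rng.normal(0.0, 0.05, size=10)

meter_stick = np.round(readings)                   # 1 mm graduations
print(meter_stick)              # every reading comes back 1.0
print(meter_stick.std(ddof=1))  # 0.0: perfect repeatability, useless resolution
```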
 
You will probably use the density to describe something else, for example concentration. Do an inverse calibration, insert the required confidence limits, and you will see what your instrument can really do. The precision given in the specifications is the best the instrument can do under ideal conditions; in a plant it may well be worse.
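A rough sketch of that inverse-calibration check (the calibration points are invented, and the interval is simplified; a full treatment would include the 1/n and leverage terms of the inverse-prediction formula):

```python
import numpy as np
from scipy import stats

# Invented calibration data: instrument readings against reference standards.
reference = np.array([1505.0, 1515.0, 1525.0, 1535.0, 1545.0])  # g/L
reading = np.array([1504.0, 1516.0, 1524.0, 1536.0, 1546.0])    # g/L

slope, intercept, r, p, se = stats.linregress(reference, reading)
resid = reading - (intercept + slope * reference)
s = np.sqrt(resid @ resid / (len(reference) - 2))   # residual standard error

# Rough 95% half-width of an inverse prediction near mid-range.
t = stats.t.ppf(0.975, df=len(reference) - 2)
print(f"~95% inverse-prediction half-width: {t * s / abs(slope):.1f} g/L")
```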
m777182
 
I've found the best way to justify the measurement precision you need is to show how much money is being left on the table.

For example, perform a gage R&R study with your current device and with another one that has better resolution, then compare the two. You should see cases where the current device rejected formulations that were actually good, or vice versa. You can then convert this into the $$$$ the business is losing, which is definitely something that will be listened to.
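A back-of-the-envelope version of that comparison (the process sigma and batch count are invented; spec and graduations are as given above):

```python
import numpy as np

# Toy comparison: how many genuinely good batches get rejected purely
# because of coarse graduations.
rng = np.random.default_rng(2)
true_density = rng.normal(1525.0, 1.0, size=100_000)   # g/L, in-control process
in_spec = np.abs(true_density - 1525.0) <= 2.5

for step in (2.0, 0.5):                                # graduation sizes, g/L
    reading = step * np.round(true_density / step)
    accepted = np.abs(reading - 1525.0) <= 2.5
    false_reject = np.mean(in_spec & ~accepted)
    print(f"{step} g/L graduations: {false_reject:.1%} of good batches rejected")
```

Multiply the difference in those rates by your throughput and batch value and you have the dollar figure.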

Good luck!
 