
4-20mA Loop Powered Sensor/Transmitter Calibration

Status
Not open for further replies.

MangaMech (Mechanical), Dec 3, 2004, AU
Does a calibrated 4-20mA loop-powered sensor/transmitter wired into its circuit 'fresh out of the box' require tweaking to allow it to turn the input pressure/temp/flow value into a reliable 4-20mA signal proportional to the measured variable (as per calibration)? This signal being read at either the transducer output or at the control cabinet.
Does each installation require some adjustment, i.e. zero point and span, to account for local conditions? This seems to imply recalibrating the instrument. If so, why have an instrument calibrated at the factory?

Thanks in advance.
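For reference, the proportionality the question describes is just a linear map between the calibrated range and 4-20mA. A rough sketch (function and variable names are illustrative, not from any vendor library):

```python
# Minimal sketch of the linear 4-20 mA mapping a calibrated transmitter
# is expected to produce (names and ranges are illustrative).

def mv_to_ma(value, lrv, urv):
    """Convert a process value to the proportional 4-20 mA signal.

    lrv/urv: lower/upper range values of the calibrated span.
    """
    return 4.0 + 16.0 * (value - lrv) / (urv - lrv)

def ma_to_mv(ma, lrv, urv):
    """Invert the loop signal back to engineering units."""
    return lrv + (urv - lrv) * (ma - 4.0) / 16.0

# A 0-100 psig transmitter at mid-scale should read 12 mA:
print(mv_to_ma(50.0, 0.0, 100.0))   # 12.0
print(ma_to_mv(12.0, 0.0, 100.0))   # 50.0
```

Mid-range always sits at 12 mA regardless of engineering units, which is a handy sanity check whether you read the loop at the transducer or at the control cabinet.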
 
It is calibrated at the factory to verify that it meets its advertised specifications.

You have to calibrate it in YOUR application because not all installations work out to exactly match the nice round numbers in the published ranges. You may have standing head on one leg of the sensor, and the maximum point of interest to you will probably not match the factory calibrated value. Or the factory calibrates it for water and you use it on a chemical with different specific gravity.
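The standing-head case mentioned above can be put into numbers. A minimal sketch of zero suppression for a transmitter mounted below its tap (all names and figures are illustrative assumptions):

```python
# Illustrative zero-suppression arithmetic: a transmitter mounted below
# the lower tap sees a constant static head from the filled impulse
# line, so the calibrated range is shifted up by that head.

def suppressed_range(span_inh2o, head_in, fill_sg):
    """Return (LRV, URV) in inH2O after adding the static head.

    span_inh2o: desired measurement span, inches of water column
    head_in:    vertical height of the filled impulse line, inches
    fill_sg:    specific gravity of the fill/process fluid
    """
    head = head_in * fill_sg
    return (head, span_inh2o + head)

# 100 inH2O span with 50 inches of SG 1.0 fluid standing in the line:
print(suppressed_range(100.0, 50.0, 1.0))   # (50.0, 150.0)
```

A different specific gravity (the "water vs chemical" case above) scales the head term the same way, which is exactly why the nice round factory range rarely matches the installed one.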
 
It depends.

The manufacturer must be told the calibrated range. They usually calibrate as required at no additional charge, provided that someone specified the calibrated range when buying the instrument. If you specify the calibrated range, you have a reasonable chance of receiving the transmitter calibrated as required. Buying calibration certificates provides documentation that you got what you asked for.

With most of the latest smart models (I am a Rosemount 3051S kinda guy) the zero point and span will be highly accurate and should not be adjusted in the field to account for local conditions. However, this may not be true if the range is very low. A draft range such as -0.1 to +0.1 inches water would require verification at the site, whereas a 0-1000 psig range would not. For most models, even range changes can be made with the hand-held communicator (calibrator) without messing up the highly accurate calibration. This will accommodate the situations where no range was specified, provided that the range needed is well within the transmitter specification as shipped.

If it is not a highly accurate smart transmitter then all bets are off.


 
Thanks,
Do factors such as the powering of the loop, shielding, or wiring resistance affect the need to tweak the transducer for local conditions; or should the local setup be built to a standard that removes the need for adjustment for these factors?
 
I agree with JLSeagull: provided you tell the vendor what range you want, there is no need to re-calibrate. Gone are the days when every transmitter was re-calibrated before installation. No, your wiring or system voltage should have zero effect on a good transmitter. Mind you, the time saved in calibration is taken up by the designer figuring out the exact calibration required.
You might need to re-zero something like a flange-mount level transmitter (sometimes they shift a little when bolted up), or a pressure transmitter on liquid if it's mounted below the tapping point.
Regards
Roy
 
I just ran into a case where we calibrated some 0.1% accurate pressure transmitters here in our lab. When installed they were immediately out of calibration because of small deflections in the diaphragms. This may have been caused by some rough handling or even the threading of the transmitters into the system.
We then went out and re-calibrated them in the field....

Joe Lambert
 
Sounds like calibration should remain unaffected providing:
- local wiring and powering of the loop are within the manufacturer's limits of voltage and loop resistance;
- the range of the instrument has been specified and the instrument calibrated to it;
- handling and fit-up do not strain the sensor body.
These should all be achievable within expected/standard industry norms.
The end result is that today an instrumentation engineer/technician commissioning an installation would expect an installed sensor to provide a 4-20mA signal according to the calibration certificate.
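The first condition, loop voltage and resistance, is easy to check on paper. A small sketch (the 10.5 V minimum terminal voltage is an assumed example figure; check the datasheet for the actual model):

```python
# Rough check that wiring plus sense resistance leave a loop-powered
# transmitter its minimum lift-off voltage at full-scale (20 mA).
# The minimum terminal voltage used below is an assumed example.

def max_loop_resistance(supply_v, transmitter_min_v, i_max_a=0.020):
    """Ohms available for wiring and the sense resistor at 20 mA."""
    return (supply_v - transmitter_min_v) / i_max_a

# 24 V supply, assumed 10.5 V minimum at the transmitter terminals:
print(max_loop_resistance(24.0, 10.5))   # about 675 ohms
```

If the installed wiring plus a typical 250-ohm sense resistor comes in well under that figure, the loop itself gives the transmitter no reason to misbehave.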

Would you bet the house or just the car on this one??
 
I would be willing to grandstand with this position on a lump sum turnkey project for most transmitters. This could be a sort of 80/20 thing.

Shop calibrate the analyzers and specialty instruments. Check them again as installed during commissioning.

Diaphragm seal (capillary) level transmitters are also problematic. If the level range is too low (under 20 inches) and the ambient temperature variation is wide, the level signal may include a lot of bias from heating of the capillaries, etc. This is in addition to any diaphragm dents that might happen during installation.
 
No, I strongly disagree with shop calibration, for several reasons.
I have seen more damage done than faults found.
The manufacturer's calibration equipment is far superior to anything the contractor may have.
The contractor's instrument tech is usually an unknown quantity; you don't know what experience he/she has.
The contractor should never power up an instrument or control system before the commissioning engineer has checked it out; it's not in his interest or the client's.

Twenty years ago with pneumatic or analog instruments I would have agreed with you.

The contractor should make sure he has the correct tag, check the model, and make sure it's installed correctly. Checking calibration is a commissioning task.

Regards
Roy
 
I generally agree with RoyDMatson. A lot of plant people prefer the term validate over calibrate.

Calibration is a service that requires traceability based upon client, laboratory, NIST requirements, etc. This takes the thread toward QA standards that go well beyond the initial question and in more ways than just semantics. I have no personal interest in starting that thread.



 
I agree also. Unless the site has an elaborate calibration lab, with NIST-standard comparison equipment, dead weight air testers, etc., as well as well-trained technicians, more harm than good can come as a result of "calibration". I've seen it first hand.

I agree with Mr Seagull, Validation is a better approach. This is even more true with fieldbus instruments that have a more directly derived digital signal.

Usually when there is a problem it's because the instrument is completely dead, so go/no-go tests are helpful. Power up an indicating transmitter to see if a reasonable value is displayed, for example. Check a temperature transmitter with ice water, etc.

Flow instruments are the most difficult of all, and one usually just has to rely on the manufacturer until validation can be done in the process by comparison with other instrumentation.
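Those go/no-go checks can be reduced to a few current thresholds. A sketch using the common NAMUR NE43 convention (exact limits vary by vendor, so treat these numbers as assumptions):

```python
# Go/no-go classification of a 4-20 mA reading, using NAMUR NE43-style
# thresholds as an assumed convention (vendors differ on exact values):
# normal signal band roughly 3.8-20.5 mA, fault signals outside 3.6-21 mA.

def loop_status(ma):
    if ma < 3.6:
        return "fail-low (open loop or downscale fault signal)"
    if ma > 21.0:
        return "fail-high (upscale fault signal)"
    if ma < 3.8 or ma > 20.5:
        return "saturated (out of range)"
    return "ok"

print(loop_status(12.0))   # ok
print(loop_status(2.0))    # fail-low (open loop or downscale fault signal)
```

A completely dead instrument shows up immediately as fail-low, which matches the observation that most real problems are of the dead-loop kind.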
 
I guess that what we are all trying to achieve is getting a true measurement from the transmitter?
I agree that manufacturer calibration equipment is at least one order of magnitude more accurate than any local site standard. But the manufacturer is unable to predict installation effects. These vary from positioning of remote seals to the weight of process fluid in impulse pipework. They may be insignificant on high-range devices, but they can totally distort the readings of low-range transmitters.
As with most things in life, a degree of common sense is required, and a zeroing of the transmitter during commissioning is necessary to remove installation effects.
With regard to hand-held communicators: beware! They do not calibrate the transmitter; they just re-range the output. This means that the digital signal (accessed via digital communication) and the 4-20mA signal would give you two different readings until you calibrate the transmitter by injecting a reference pressure and recalibrating. As discussed above, this is not recommended without suitable traceable calibration test equipment.


I trust this helps,
Mlv
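The re-range-versus-calibrate distinction above can be pictured with a toy model (behaviour simplified and assumed; real transmitters separate sensor trim, output trim, and range configuration):

```python
# Toy model of why re-ranging is not calibration: re-ranging only
# changes the digital-value -> mA mapping, while an uncorrected
# sensor trim error rides through into every reading.

def digital_reading(true_value, trim_error=0.0):
    """Transmitter's internal digital value, including any trim error."""
    return true_value + trim_error

def output_ma(digital, lrv, urv):
    """Map the digital value onto 4-20 mA over the configured range."""
    return 4.0 + 16.0 * (digital - lrv) / (urv - lrv)

# Re-ranging 0-100 down to 0-50 rescales the output, but a 2-unit
# trim error remains until a reference input is applied and trimmed:
d = digital_reading(25.0, trim_error=2.0)
print(d)                           # 27.0, not 25.0
print(output_ma(d, 0.0, 50.0))     # about 12.64 mA instead of 12.0
```

Only a trim against a known reference input removes the error term; changing lrv/urv from the communicator never touches it.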
 
Let's just say that prudence, if nothing else, says that you should always verify your instrumentation, regardless of how you expect to calibrate. You can call it commissioning if you so desire, but unless you've done a large number of such installations and know their characteristics intimately: calibrate, or validate, or commission.

TTFN

FAQ731-376
 
I think we all agree then: there is no point in doing a shop "calibration" before installing in the field, as this will not correct the "installation effects", which are usually taken care of by re-zeroing in situ.
Standingback is right: re-ranging is not a calibration.
Some transmitters still require calibration, e.g. analyzers, conveyor weigh scales, and nuclear density gauges, to name a few; most others are just re-ranged or zeroed.
Roy
 
Actually, I would consider lab calibration as being also required, depending on the degree of accuracy required. You need to verify that the instrument is meeting its design specifications, which requires laboratory accuracy level and stimuli. Normally, one would expect this to be done by the supplier, as part of its production testing.

Installation effects then go on top of that. If you have an uncalibrated instrument, it's equally likely that your field "calibrations" may result in scaling or offset corrections that do not properly account for all usage conditions.

TTFN

FAQ731-376
 