The superiority of the 4-20 mA signal versus 0-10 V has been debated for decades. The common wisdom is that the current loop is superior.
My experience is the opposite. The voltage signal is easier to use in field installations (my experience is mostly with paper machines and steel mills, where distances are in the 10 - 300 m range) and it is less prone to noise pickup. Of the roughly ten "problem installations" I have dealt with over the decades, not a single one has been a 0-10 V installation, but many have been 0-20 or 4-20 mA installations.
The reason is simply that you are no longer dependent on wire resistance at all. That dependence was the prime reason why mA signals were introduced a long time ago. With mechanical instrumentation, where the receiver was a d'Arsonval coil, well-calibrated and compensated cabling was needed to keep wire resistance from influencing the transmission. RF and transient pickup was not a problem at all at that time; it was simply a question of voltage drop in the wires.
Introducing the 0-20 mA signal (or 0-60 mA, as it was in the very beginning) overcame that problem. The transmitters were designed as a Thevenin equivalent with an internal resistance high enough to "laugh at" loop resistances in the 100 - 1000 ohm range, and that worked very well. Insulation was not a problem since all receivers were floating (coils are inherently isolated) and could easily be connected in series.
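To put numbers on why a stiff source shrugs off loop resistance, here is a minimal sketch. The values are illustrative assumptions of my own, not from any particular transmitter:

```python
# Illustrative sketch: a high source impedance makes the loop current
# nearly independent of the wire/loop resistance.
# All numbers are assumptions for illustration, not from a datasheet.

R_SOURCE = 1e6               # transmitter internal (Thevenin) resistance, ohms
I_IDEAL = 20e-3              # intended full-scale loop current, A
V_OPEN = I_IDEAL * R_SOURCE  # equivalent Thevenin source voltage, V

for r_loop in (100, 500, 1000):  # loop resistances in the range the text names
    i_actual = V_OPEN / (R_SOURCE + r_loop)
    error_pct = (I_IDEAL - i_actual) / I_IDEAL * 100
    print(f"R_loop = {r_loop:5d} ohm -> I = {i_actual*1e3:.4f} mA "
          f"(error {error_pct:.3f} %)")
```

Even at 1000 ohms of loop resistance, the current error stays around 0.1 %, which is why the old coil receivers could simply be daisy-chained.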
The 0-10 V receivers of today have input impedances in the megohm range. The current in the cables is therefore seldom more than about ten microamperes, which makes voltage drop a non-problem. Even 100 ohms of wire resistance, which corresponds to a rather long cable, does not introduce an error larger than 1 mV, or 0.01 % of the signal range. The common installation has much lower resistance and, hence, a lower error.
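A quick back-of-the-envelope check of those figures, assuming a 1 Mohm input impedance (which is where the ten-microampere figure comes from):

```python
# Worked example of the voltage-drop argument above.
# Assumed values: 1 Mohm receiver input, 10 V full scale, 100 ohm of wire.

R_IN = 1e6        # receiver input impedance, ohms
V_SIGNAL = 10.0   # full-scale signal, V
R_WIRE = 100.0    # total wire resistance, ohms

i_cable = V_SIGNAL / (R_IN + R_WIRE)   # ~10 microamperes
v_drop = i_cable * R_WIRE              # ~1 mV lost in the cable
error_pct = v_drop / V_SIGNAL * 100    # ~0.01 % of the signal range

print(f"cable current: {i_cable*1e6:.1f} uA")
print(f"voltage drop:  {v_drop*1e3:.2f} mV ({error_pct:.3f} % of range)")
```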
What about RF pickup, then? No problem. A low-pass filter in the transmitter output (typically 47 ohms, then 10 nF to GND, and another 47 ohms just before the output terminal) takes care of "back feed" from the field wiring into the last base/emitter junction of the driver stage. Input filtering at the receiver is also easy, since the impedance is high and the bandwidth of process signals is seldom more than a few tens of Hz.
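For the curious, the corner frequency of one such RC section works out like this (component values from above; the script itself is just illustrative):

```python
import math

# One RC section of the output filter described above:
# 47 ohm in series, 10 nF to ground.
R = 47.0      # ohms
C = 10e-9     # farads

f_c = 1 / (2 * math.pi * R * C)   # ~340 kHz corner per section
print(f"-3 dB corner: {f_c/1e3:.0f} kHz")

# A process signal of a few tens of Hz passes essentially untouched,
# while RF is attenuated at 20 dB/decade per section.
for f in (30.0, 10e6):            # 30 Hz process signal vs 10 MHz RFI
    atten_db = -10 * math.log10(1 + (f / f_c) ** 2)
    print(f"{f/1e3:10.3f} kHz -> {atten_db:6.1f} dB")
```

With two cascaded sections the RF attenuation roughly doubles in dB, while the process signal still sees no measurable loss.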
So, the bad reputation of the 0-10 V signal is not because it is bad, but because its predecessors used current levels that were about 10 000 times higher and therefore caused voltage drops in the installation. Today's systems do not have that problem.
The current loop system, on the other hand, actually has problems today. A very recent case is a test stand for large diesel engines. There, the vibration transducers output 4-20 mA signals that are sent over cables around 20 m long to the instrumentation, which sits in the same space as the VFDs (around 1000 kW) that regenerate braking power. As soon as the VFDs are started, the vibration level (or rather, the transducers' output) goes to nil - not exactly what one would expect.
The reason is that HF RFI from the generator cables introduces interference into the vibration transmitter outputs, pulling the signal below 4 mA, which the system, of course, evaluates as a wire break. The system has been heavily filtered, but there are still occasions where "wire breaks" occur. And the calibration of these transducers is something that must be questioned all the time. The truck manufacturer is now contemplating moving over to the 0-10 V signal system, which has been tried for some time in parallel with the 4-20 mA system.
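For readers unfamiliar with how an RFI dip turns into a "wire break": here is a hypothetical sketch of typical receiver-side logic. The 3.6 mA threshold follows the common NAMUR NE43 convention for failure detection; the function names and the mm/s scaling are made up for illustration:

```python
# Hypothetical sketch of receiver-side 4-20 mA evaluation.
# 3.6 mA follows the common NAMUR NE43 underrange convention;
# names and scaling are illustrative, not from the actual test stand.

WIRE_BREAK_MA = 3.6   # below this, most systems assume a broken loop

def loop_to_engineering(i_ma: float, lo: float, hi: float) -> float:
    """Scale a 4-20 mA reading to engineering units, or raise on underrange."""
    if i_ma < WIRE_BREAK_MA:
        raise ValueError(f"wire break suspected: {i_ma:.2f} mA")
    return lo + (hi - lo) * (i_ma - 4.0) / 16.0

# An RFI-induced dip below the threshold looks exactly like a broken wire:
for reading in (12.0, 4.1, 3.2):   # mA, last one dragged down by RFI
    try:
        value = loop_to_engineering(reading, 0.0, 50.0)
        print(f"{reading:5.1f} mA -> {value:.1f} mm/s")
    except ValueError as err:
        print(f"{reading:5.1f} mA -> ALARM: {err}")
```

The point is that the receiver cannot tell an interference-induced undershoot from a genuine open loop; both land below the threshold.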
I think it is about time to abandon the knee-jerk automatic answer that the current loop is less sensitive to HF and transient interference. It simply isn't true. And it never was, actually.
Gunnar Englund
--------------------------------------
100 % recycled posting: Electrons, ideas, finger-tips have been used over and over again...