
Calibration of Hi Pot Meters


rkvbobby123 (Electrical), Nov 18, 2014
I was in a lab where the CEO was using a hi-pot tester that was not calibrated, driving it like a variac, and using a calibrated multimeter with some combination of resistors to read the output for the hi-pot test. He raised the output to 1500 volts on his hi-pot tester and used all kinds of intermediate measurements to show that the unit under test was receiving the 1500 volts.

My question is: it is not only the voltage but also the current magnitude that plays a part in the process. Is this practice OK?
 

It is quite standard to use resistors and a voltmeter in a high-voltage measurement system.

The resistors divide the high voltage down so that the portion applied to the voltmeter (or multimeter here) is within its range; the voltage displayed on the meter is then multiplied by the divider ratio to get the actual applied voltage. Many use a 1000:1 resistive divider (or a capacitive one for AC applications), so the meter reading is a direct display of kilovolts.
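As a rough illustration of that arithmetic, here is a minimal sketch assuming hypothetical resistor values and a hypothetical meter reading (the 999 Mohm / 1 Mohm split and the 1.5 V reading are made-up numbers chosen to give a 1000:1 ratio, not values from the thread):

```python
# Hypothetical worked example of the divider arithmetic described above.
# Resistor values and the meter reading are illustrative assumptions.

R_high = 999e6   # high-voltage arm, 999 Mohm (assumed value)
R_low = 1e6      # low-voltage arm, 1 Mohm (assumed value)

# Divider ratio: applied voltage divided by the voltage across R_low
ratio = (R_high + R_low) / R_low   # -> 1000:1 in this example

meter_reading = 1.5  # volts shown on the multimeter (assumed)
applied_voltage = meter_reading * ratio

print(f"Divider ratio: {ratio:.0f}:1")            # 1000:1
print(f"Applied voltage: {applied_voltage:.0f} V")  # 1500 V
```

With a 1000:1 ratio, a 1.5 V meter reading corresponds to the 1500 V test level mentioned in the question, which is why such dividers give a direct kilovolt readout.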

From a calibration point of view, you have a number of options:

1 - you can calibrate the system as a whole, i.e. calibrate the divider and the voltmeter together, directly against a traceable standard.
2 - you can calibrate the voltmeter separately, and calibrate the resistors separately as well.

The first is the cleaner option (especially if you need to determine measurement uncertainty figures). The second option allows you to use the multimeter for other purposes as well, as you have calibrated it separately from the system. It really depends on the use.
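To see why the first option is cleaner for uncertainty work, here is a minimal sketch of how the component uncertainties combine under option 2, assuming uncorrelated components so the relative uncertainties add in quadrature (the 0.5 % and 0.2 % figures are illustrative assumptions, not real calibration data):

```python
import math

# Minimal sketch: combining separately calibrated components (option 2).
# Uncertainty figures below are illustrative assumptions.
u_divider = 0.005  # 0.5 % relative standard uncertainty of the divider ratio (assumed)
u_meter = 0.002    # 0.2 % relative standard uncertainty of the multimeter (assumed)

# For uncorrelated components, relative uncertainties combine in quadrature.
u_system = math.sqrt(u_divider**2 + u_meter**2)

print(f"Combined relative uncertainty: {u_system * 100:.2f} %")  # ~0.54 %
```

Calibrating the system as a whole replaces this combination step with a single measured figure for the complete chain.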

If I were in that lab, I would expect to see calibrations for the resistors used, as well as for the multimeter.

In answer to your specific question, the current does come into play. The best place to measure the voltage is at the terminals of the test object, as that is most representative of the voltage actually applied to it. Some setups instead measure the primary voltage of the test transformer, or the voltage off a tertiary winding, and multiply by the winding ratio to the output winding. The disadvantage, as you alluded to with current playing a part, is that if the test object draws a lot of current, the output voltage of the test transformer will be lower than the input voltage multiplied by the turns ratio, because of the voltage drop within the transformer. You then end up applying a different voltage to the test object from the one displayed on the meter.
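The numbers below are a hypothetical sketch of that effect: the primary voltage, turns ratio, referred impedance, and test current are all assumed values chosen only to make the drop visible, and the simple series-drop model ignores phase angle:

```python
# Illustrative numbers showing why metering the transformer primary can mislead.
# All values are assumptions for the sake of the arithmetic; the simple
# magnitude-only drop model is a simplification.

V_primary = 120.0   # volts applied to the test transformer primary (assumed)
turns_ratio = 12.5  # primary-to-output step-up ratio (assumed)
Z_output = 2000.0   # transformer impedance referred to the output, ohms (assumed)
I_test = 0.05       # current drawn by the test object, amps (assumed)

V_indicated = V_primary * turns_ratio           # what primary-side metering implies
V_actual = V_indicated - I_test * Z_output      # after the internal voltage drop

print(f"Indicated (primary x ratio): {V_indicated:.0f} V")  # 1500 V
print(f"Actual at the test object:   {V_actual:.0f} V")     # 1400 V here
```

In this made-up case the primary-side reading implies 1500 V while the test object only sees 1400 V, which is exactly the error a divider at the output avoids.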

If you use a divider on the output of the test transformer, the voltage you measure is correct (within uncertainties) irrespective of the test current. This is certainly the preferred method of voltage measurement in high-voltage work.


Ausphil
 
Great job. In essence, then, what we are saying is that resistors cannot be taken for granted on their built-in tolerances; they need to be calibrated. In accreditation audits, people can feel invaded or attacked when such questions are raised, but I will make sure he understands. Thanks.
 