akrus
Electrical
- Sep 13, 2013
I have been reading a lot about the distinctions between Dielectric Withstand (Hipot) and Insulation Resistance (IR) testing. There is a lot of conflicting information out there about the merits of each, especially when the response comes from a manufacturer of a particular piece of test equipment.
I understand that generally hipot is a pass/fail test to determine whether the equipment can withstand the test voltage and IR is a quantitative test to measure the resistance at the test voltage. I can see that often hipot is run at a higher AC voltage and IR is run at a relatively lower DC voltage.
Now for my question...
In my application, we are often requested to run both of these tests at the production level. The requirement is to run hipot at 500 VAC and IR at 500 VDC. The capacitance in the unit under test is minimal because there are no capacitive components and the wire lengths are short. I am struggling to understand how there is any difference between these tests. In both cases, ~500 V is applied (I know the AC value is RMS) and the test equipment is measuring the leakage current. How would any defect (creepage, clearance, damaged insulation, etc.) be detected by one and not the other? My thought is that IR testing covers everything unless the hipot is run at a much higher test voltage. Am I missing something here? Any discussion is welcome. Thanks!
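To put some numbers on my intuition, here is a quick Python sketch. The capacitance, insulation resistance, and test frequency values are pure assumptions for illustration, not measurements from my unit, but they show how even a small stray capacitance makes the AC (hipot) leakage reading differ from the DC (IR) reading:

```python
import math

# Assumed illustrative values, not measured from any real unit:
C = 100e-12   # 100 pF of stray wiring capacitance
R = 100e6     # 100 Mohm insulation resistance
V = 500.0     # test voltage (RMS for AC, steady-state for DC)
f = 60.0      # assumed AC hipot test frequency in Hz

# DC (IR) test: once the stray capacitance has charged,
# only the resistive leakage current flows.
i_dc = V / R

# AC (hipot) test: the capacitive charging current flows continuously
# and adds in quadrature with the resistive leakage.
i_res = V / R
i_cap = V * 2 * math.pi * f * C
i_ac = math.hypot(i_res, i_cap)

print(f"DC leakage: {i_dc * 1e6:.2f} uA")
print(f"AC leakage: {i_ac * 1e6:.2f} uA "
      f"(resistive {i_res * 1e6:.2f} uA, capacitive {i_cap * 1e6:.2f} uA)")
```

With these assumed numbers the DC test sees 5 uA while the AC test sees almost 20 uA, most of it capacitive, so the two tests are not measuring quite the same thing even at the same nominal voltage.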