Rakestraw
Electrical
- Jan 2, 2003
I have been asked to measure the actual input impedance of an analog circuit that is designed to accept microvolt-level signals, which are fed directly into a high-impedance op-amp buffer stage. The typical input frequency is about 0.1 Hz, so I am treating it as DC, which means I am really just trying to measure the input resistance. The maximum input voltage to this circuit is 2 Vdc.

To measure the impedance, I used a laboratory DC reference as the voltage source with a resistor in series between the reference and the input of my circuit. The plan was to measure the voltage drop across the series resistor to calculate the current, measure the voltage at the circuit input, and then use those two numbers (R = V/I) to calculate the input resistance.
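A quick sketch of the arithmetic I had in mind; the series resistor value and the readings below are made up purely to illustrate the calculation, not actual measurements:

```python
# Series-resistor method for estimating DC input resistance.
# All values here are hypothetical examples, just to show the math.

V_source = 2.0   # lab DC reference, volts (max allowed input is 2 Vdc)
R_series = 1e9   # series resistor, ohms (example value)
V_in = 1.9       # voltage measured across the circuit input, volts (example reading)

I = (V_source - V_in) / R_series   # current through the series resistor
R_in = V_in / I                    # apparent input resistance (R = V/I)

print(f"Current through series resistor: {I:.3e} A")
print(f"Apparent input resistance: {R_in:.3e} ohms")
```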
The problem is that the input op-amp, an LMC6462, has an input resistance specified at greater than 10 teraohms, while our best voltmeter has an input impedance of only 10 gigaohms, so the meter loads the circuit and my measurements are incorrect.
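To put numbers on the loading problem, here is a rough calculation assuming the spec'd 10 TΩ for the op-amp input and the meter's 10 GΩ input in parallel:

```python
# Why the meter dominates: the DVM's 10 Gohm input appears in parallel
# with the op-amp's >10 Tohm input, so the measurement mostly sees the meter.

R_meter = 10e9    # 10 Gohm DVM input resistance
R_opamp = 10e12   # 10 Tohm op-amp input resistance (spec'd minimum)

R_parallel = (R_meter * R_opamp) / (R_meter + R_opamp)
error = (R_opamp - R_parallel) / R_opamp   # fraction of the true value lost

print(f"Parallel combination: {R_parallel:.3e} ohms")  # ~9.99e9, essentially the meter
print(f"Relative error vs. the op-amp alone: {error:.1%}")
```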
Does anybody have any idea how I can overcome this problem, or is there a better way to measure the input impedance without renting or purchasing additional equipment?
Thanks!