cmb042
Geotechnical
- Apr 28, 2008
- 39
I have an amplifier circuit I am trying to get more gain out of. When I increase Rf, the signal is attenuated instead, and I don't understand why. My first thought was that if Rf gets too much bigger than Ri, the capacitor can't discharge fast enough for the input to register the next pulse, but I tried increasing and then decreasing the capacitor C and then Ri and didn't notice any effect. The negative supply rail of the op-amp is grounded, and the noninverting input is held at about 1 V DC. The input signal consists of 30-200 negative pulses per second, each of microsecond duration.
Maybe some sort of phase problem? Not sure where to start on a test for that.
crappy sketch:
        ---Rf---
       |        |
       |  |\    |
---C--Ri--|-\   |
          |  >------
+1VDC-----|+/
          |/
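For anyone wanting to sanity-check the capacitor theory: the ideal numbers for this topology are easy to compute. The values below are placeholders (I haven't given my actual parts), so substitute real ones. Midband gain of the inverting stage is -Rf/Ri, and the series C and Ri form a high-pass filter with corner 1/(2*pi*Ri*C):

```python
import math

# Placeholder component values -- substitute the actual parts.
Rf = 100e3   # feedback resistor (ohms)
Ri = 10e3    # input resistor (ohms)
C  = 100e-9  # input coupling capacitor (farads)

# Ideal inverting-amp midband gain (op-amp assumed ideal).
gain = -Rf / Ri

# The series C and Ri form a high-pass filter; input content
# below this corner frequency is attenuated.
f_hp = 1.0 / (2 * math.pi * Ri * C)

print(f"midband gain: {gain:.1f}")
print(f"input high-pass corner: {f_hp:.1f} Hz")
```

If the pulse rate (30-200 Hz) is well above the computed corner, the coupling cap is probably not the limiting factor and the problem lies elsewhere.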