Hi all! This is my first post here. I noticed some non-ideal behavior while messing with an inverting amplifier, and I'm having trouble pinpointing the cause. Here is a brief description of the setup: a 30 kHz sine wave is connected to one end of a pot and also drives the input of an inverting amplifier with a gain of 1. The output of the inverting amplifier is connected to the other end of the pot, and the signal is taken from the wiper to the oscilloscope. By adjusting the pot appropriately I would expect to be able to null the signal to essentially zero; what actually happens is that the signal reduces to some minimum voltage, 40 mV pp in my case. Turning the pot further causes the 40 mV signal to phase shift by about 180 degrees and then grow larger again, but it never gets smaller than that minimum amplitude.
The only explanation I can think of is that the inverting amplifier has a small phase error, so the inverted signal is actually a few degrees more or less than 180 out of phase, instead of exactly 180. Doing the maths confirms that summing two signals slightly off from a 180 degree phase shift produces the same effect I am observing. Looking at the input and inverted signals together on the oscilloscope, they look perfectly 180 degrees out of phase, but I can't measure the phase difference to degree accuracy since I'm using an analog scope. I used a TL072 op amp to do the inverting, but in case this was some weird bandwidth-related thing I also tried the fastest op amp I had, an AD817. In either case the output was the same. Any ideas what would cause this?
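For anyone who wants to check the maths I mentioned, here's a rough numerical sketch of what I did (Python/NumPy). It models the wiper voltage as a weighted sum of the input and the inverted copy, with an assumed 1 degree phase error on the inverter (the 1 degree figure is just for illustration, not a measured value):

```python
import numpy as np

# Model the null pot as a weighted sum of the input and the (slightly
# imperfect) inverted copy:
#   v(k) = (1 - k) * sin(wt) + k * sin(wt + pi + phi)
# where k is the wiper position (0..1) and phi is the inverter's phase error.
phi = np.deg2rad(1.0)  # assumed 1 degree phase error, for illustration
t = np.linspace(0, 2 * np.pi, 10000, endpoint=False)  # one full cycle

def residual_pp(k):
    """Peak-to-peak amplitude at the wiper for wiper position k."""
    v = (1 - k) * np.sin(t) + k * np.sin(t + np.pi + phi)
    return v.max() - v.min()

ks = np.linspace(0.0, 1.0, 2001)
amps = np.array([residual_pp(k) for k in ks])
k_min = ks[amps.argmin()]

print(f"minimum residual: {amps.min():.4f} pp (input is ~2.0 pp)")
print(f"best null at wiper position k = {k_min:.3f}")
```

With a 1 degree error the residual bottoms out around 1% of the input instead of going to zero, and the residual's phase flips by about 180 degrees as the wiper passes through the null point, which is exactly the behavior I'm seeing on the scope.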
I did notice that in both cases there was a slight ripple on the inverting input of the amplifier, so it was not holding a perfect virtual ground. I checked the noninverting input and it was firmly at ground, with no ripple.