I generated a single 100 MHz sine wave, and even though the entire signal was complete within 20 ns, I still see a minimum and a maximum point plotted 100 µs apart from each other. I have attached the actual sine wave being generated, as well as what Peak Detect mode shows me when I am zoomed out to 10 kSa/s.
Yes, this is a very nice demonstration of Peak Detect mode – let’s walk through it for those who can’t quite picture what’s going on:
That single signal period is 10 ns wide and the original sample rate is 2 GSa/s, i.e. 500 ps sample interval. No problem capturing a 10 ns period.
Now you lower the effective sample rate to just 10 kSa/s by increasing the time base to 200 ms/div while limiting the record length to 20 kpts at the same time; a quick sanity check of that figure is sketched below. This is where Peak Detect has to come into play.
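To see where the 10 kSa/s comes from, assume the usual 10 horizontal divisions on screen (my assumption about this particular scope's graticule, not something stated above):

```python
# Back-of-the-envelope check of the effective sample rate.
timebase_per_div = 200e-3        # 200 ms/div
num_divisions = 10               # assumed full-screen width
record_length = 20_000           # 20 kpts

acquisition_time = timebase_per_div * num_divisions   # 2 s on screen
effective_rate = record_length / acquisition_time     # samples per second
print(f"{effective_rate:.0f} Sa/s")                   # -> 10000 Sa/s
```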
Of course, the ADC still samples at 2 GSa/s, so the original data are an accurate representation of the input signal. These data at a 500 ps sample interval now have to be decimated to a 100 µs sample interval, which means keeping only one out of every 200k samples.
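Just to illustrate why simply keeping one out of 200k samples won't do on its own, here is a small sketch (my own illustration with a made-up burst position, not the scope's actual firmware): the single 10 ns period only spans 20 raw samples, so a plain stride of 200k almost certainly steps right over it.

```python
import numpy as np

fs = 2e9                                   # 2 GSa/s ADC sample rate
raw = np.zeros(400_000)                    # one 200 µs slice of raw data
# A single 10 ns, 0.5 V sine period (20 raw samples) somewhere inside it.
raw[150_000:150_020] = 0.5 * np.sin(2 * np.pi * 100e6 * np.arange(20) / fs)

print(raw[::200_000])                      # plain decimation -> [0. 0.], burst lost
```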
Now we work on two decimated sample intervals at the same time, by finding the minimum and maximum value within a 200 µs interval. Most likely we will get one such min/max pair of [-0.5 V, +0.5 V]. This is then split again to reach the final effective sample rate with its 100 µs sample interval, by generating two samples: -0.5 V, followed by +0.5 V 100 µs later. For this to work, we also need to consider the order of the two extrema, i.e. know which one comes first, the minimum or the maximum.
If the transition between two 200 µs sample intervals happens to fall exactly in the middle of the signal period, then we would get [-0.5 V, 0 V] for the first 200 µs interval and [0 V, +0.5 V] for the following one. In this case, the same scheme as above still yields the correct decimated samples at the 100 µs interval: 0 V, -0.5 V, +0.5 V, 0 V.
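For the ones who prefer code, here is a minimal sketch of this min/max decimation, assuming a plain software model of the acquisition rather than the scope's actual hardware; `peak_detect_decimate` and the burst placement are my own illustrative choices:

```python
import numpy as np

def peak_detect_decimate(raw: np.ndarray, window: int) -> np.ndarray:
    """Emit the minimum and maximum of every `window` raw samples, in the
    order in which they occur, so each output sample covers window/2 inputs."""
    n_windows = len(raw) // window
    out = np.empty(2 * n_windows)
    for i in range(n_windows):
        chunk = raw[i * window:(i + 1) * window]
        i_min, i_max = np.argmin(chunk), np.argmax(chunk)
        # Preserve the time order of the two extrema within the window.
        pair = (chunk[i_min], chunk[i_max]) if i_min < i_max else (chunk[i_max], chunk[i_min])
        out[2 * i:2 * i + 2] = pair
    return out

fs = 2e9                                          # 2 GSa/s ADC sample rate
raw = np.zeros(800_000)                           # two 200 µs windows of raw data
# One 10 ns, 0.5 V period, negative half-cycle first, so the minimum precedes the maximum.
raw[150_000:150_020] = -0.5 * np.sin(2 * np.pi * 100e6 * np.arange(20) / fs)

print(peak_detect_decimate(raw, window=400_000))  # -> [-0.5  0.5  0.   0. ]
```

With two output samples per 200 µs window, the -0.5 V and +0.5 V points end up plotted 100 µs apart, even though the whole event was over within 20 ns – exactly the effect described in the question.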