Peak detect is nothing more than keeping the minimum and maximum values from the ADCs. It doesn't even require storing the samples and processing them later. Even with an FPGA this is very doable at 2 GS/s.
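As a minimal sketch of that idea (plain Python with hypothetical names, not any vendor's actual firmware), each output point is just a running min/max over one decimation block, so a narrow spike survives any decimation factor as long as at least one raw sample lands on it:

```python
def peak_detect(samples, decimation):
    """Min/max peak-detect decimation: for every `decimation` raw ADC
    samples, keep only the (min, max) pair. No sample storage or
    post-processing is needed; min/max can be updated on the fly."""
    out = []
    for i in range(0, len(samples) - decimation + 1, decimation):
        block = samples[i:i + decimation]
        out.append((min(block), max(block)))
    return out

# A one-sample-wide "pulse" survives heavy decimation:
raw = [0] * 1000
raw[437] = 100                    # one fast spike
pairs = peak_detect(raw, 250)
# the block containing index 437 records max = 100
```

In hardware the same thing is two comparators and two registers per channel, which is why it runs at the ADC's native rate so easily.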
Exactly.
For marmad.
I just tested this with my old digital scopes that have peak mode and roll, and also with an entry-level modern oscilloscope that has peak detect and roll mode. Also peak detect with the Siglent SDS2304 (using normal mode, because currently it does not have a peak detect function while in roll mode).
Example: Owon SDS7102, 1 GSa/s (dual channel 500 MSa/s). It has a single-chip dual ADC with two 500 MSa/s ADCs that can be merged together with a chip-internal function, where the same clock is inverted internally (180 degrees).
Tested:
Roll mode, peak detect, 5 s/div. The stored sampling speed is 10 S/s when the 1k sampling buffer is selected. I used this setting.
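For a sense of scale, a quick back-of-envelope calculation (assuming the ADC really keeps running at its native 500 MSa/s while only 10 S/s is stored; the 500 MSa/s figure is my assumption here, not a measurement):

```python
adc_rate = 500e6    # assumed native ADC speed, Sa/s (assumption, not measured)
stored_rate = 10    # stored sample rate at 5 s/div with the 1k buffer, Sa/s
raw_per_point = adc_rate / stored_rate
print(int(raw_per_point))   # raw conversions behind each stored min/max point
```

So each stored min/max point would summarize 50 million raw conversions, and a pulse only needs to hit one of them.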
Pulse: 3 ns wide (note the Owon's analog risetime: it is a 100 MHz model). Pulse period ~1 s.
It does not lose any pulse. There is some "aliasing" in the peak levels. I do not know what mode this ADC chip runs in, but it looks like it uses the ADC at its native speed. The inaccuracy is because I do not know the exact real shape of the signal at the ADC input. (But I know what the same scope shows on the display at 2 ns/div, though one needs to remember there is sin(x)/x interpolation.)
So, judging from the amount of level aliasing, from what the signal looks like at the scope's full speed, and from the trick of showing real sample points in stop mode, one can conclude it uses at least 500 MSa/s for this peak mode (a 1 ns or 2 ns sample interval). The signal level on the display is around 7 div and the aliasing p-p is around 1.5 div, and this p-p value does not change between 2-channel and 1-channel mode, which "nearly" proves that it uses its ADC at 500 MSa/s.
I have watched many times, as long as my eyes last... I cannot find any missing pulse. I cannot even see any pulse whose peak drops below half of the maximum displayed peak.
Same test with the Siglent SDS2304 using 50 ms/div (not roll mode) and a 7k buffer, giving as low as 10 kSa/s stored. Peak mode.
As far as I can tell, it has not missed a single 3 ns pulse. Level aliasing is less, due to the faster ADC and the faster analog front-end rise and fall times. (The pulse itself has around 1 ns risetime; in the HP specification, 1.3 ns.)
If the ADC were running at much less than its full native speed, the result would be totally different.
This test at least proves the minimum sample rate feeding peak detect in these cases, thanks to the known pulse width and level, and the fact that the scope does not lose pulses.
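That reasoning can be checked numerically with a small simulation (triangular test pulse and ideal sampler; the pulse shape and rates are illustrative assumptions, not measurements of these scopes): a 2 ns sample interval can never lose a 3 ns pulse, the captured peak still wanders from pulse to pulse (the level "aliasing" seen on screen), while a much slower effective rate drops most pulses entirely.

```python
import random

def pulse(t, width):
    """Triangular test pulse of the given width (ns), peak amplitude 1.0."""
    half = width / 2.0
    if 0.0 <= t <= width:
        return 1.0 - abs(t - half) / half
    return 0.0

def captured_peak(width_ns, interval_ns, phase_ns):
    """Largest value an ideal min/max sampler sees for one pulse,
    sampling every interval_ns with arbitrary phase. 0.0 = pulse lost."""
    peak = 0.0
    t = phase_ns % interval_ns
    while t <= width_ns:
        peak = max(peak, pulse(t, width_ns))
        t += interval_ns
    return peak

random.seed(1)
fast = [captured_peak(3.0, 2.0, random.uniform(0, 100)) for _ in range(5000)]
slow = [captured_peak(3.0, 100.0, random.uniform(0, 100)) for _ in range(5000)]

print(min(fast) > 0.0)                        # fast sampling never loses a pulse
print(max(fast) - min(fast))                  # captured-peak spread = level "aliasing"
print(any(p == 0.0 for p in slow))            # slow sampling drops pulses
```

With adjustable pulse width and period, sweeping `interval_ns` until pulses start disappearing would bracket the real sample rate going into the min/max stage, which is exactly the experiment proposed below.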
If I can produce reliable pulses down to 1 ns width, with known and adjustable periods, the result can prove more about the real sample rate going into min/max peak detect.