Here is a comparison with the RTB2000 and the 1 MSa/1 MOhm data.
But what do the graphs actually show, and what is the practical benefit?
Peter
Thanks, it's great to see them plotted together! The main takeaway from these plots is that at the 1 V scale with 20 MHz bandwidth, the HDO does better than the RTB because of its higher ADC resolution.
But I personally don't agree with the set of parameter choices proposed here earlier for recording such data:
a) For 1-Msample files the sample rate is low (50 MS/s), so full-bandwidth data are meaningless because of aliasing. Even the 20 MHz data can have some aliasing, depending on how steep the filter is.
b) Open-input data are, to me, not that interesting or useful. An open input is sensitive to interference, and the only practically relevant case is using a 10X probe on weak signals, which is a bad combination anyway.
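On point a), a quick back-of-envelope calculation shows why the 20 MHz setting can still alias at 50 MS/s. This assumes the bandwidth limit behaves like a single-pole (first-order) filter, which is an assumption on my part; real scope BW-limit filters vary:

```python
import numpy as np

# Back-of-envelope: with a 50 MS/s record the Nyquist frequency is
# 25 MHz.  If the 20 MHz bandwidth limit were a single-pole
# (first-order) filter -- an assumption, real BW-limit filters vary --
# a large share of the passed noise power would still sit above
# Nyquist and fold back into the record.
f = np.linspace(0.0, 500e6, 2_000_001)   # Hz; integrate well past cutoff
df = f[1] - f[0]
f3db = 20e6                              # 20 MHz bandwidth limit
nyquist = 25e6                           # 50 MS/s / 2

h2 = 1.0 / (1.0 + (f / f3db) ** 2)       # single-pole power response
total = h2.sum() * df                    # noise power passed by the filter
above = h2[f > nyquist].sum() * df       # the part of it above Nyquist

print(f"fraction of passed noise power above Nyquist: {above / total:.1%}")
```

With these assumed first-order numbers, roughly 40% of the passed noise power lies above Nyquist and aliases back into the band; a steeper (higher-order) filter would reduce that sharply.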
When we did this exercise a few years ago (https://www.eevblog.com/forum/testgear/oscilloscope-input-noise-comparison/), I suggested always taking data with a 50 Ohm terminator on the input and comparing the 50 Ohm and 1 MOhm scope settings. This shows whether the amplifier paths are different, and it is relevant when recording most relatively low-output-impedance signals.
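For the comparison itself, the recorded traces can be reduced to a noise spectrum and an RMS figure, for example with Welch's method. The traces below are synthetic white-noise stand-ins with invented 100/150 uV levels, just to show the shape of the computation, not measured values:

```python
import numpy as np
from scipy.signal import welch

rate = 100e6                   # sample rate; 100 MS/s is an assumption
rng = np.random.default_rng(0)

# Stand-ins for 50-ohm-terminated recordings; the 100/150 uV levels
# are made up for illustration, not measured values.
traces = {
    "50 Ohm setting": rng.normal(0.0, 100e-6, 1_000_000),
    "1 MOhm setting": rng.normal(0.0, 150e-6, 1_000_000),
}

for name, x in traces.items():
    freq, psd = welch(x, fs=rate, nperseg=65536)   # PSD in V^2/Hz
    # Integrate the PSD back to total noise power -> RMS voltage
    rms = np.sqrt(psd.sum() * (freq[1] - freq[0]))
    print(f"{name}: {rms * 1e6:.0f} uV RMS")
```

If the two settings share the same amplifier path, the spectra should overlay; a difference in level or shape points to different front-end paths.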
I also suggested always using the maximum scope sampling rate. Perhaps there is a middle ground where the files are ~10-30 Mbyte and still manageable, even though such short records won't reach down to very low frequencies. For investigating low-frequency 1/f noise, one could take data at 100 MS/s with the 20 MHz bandwidth filter.
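The record-length trade-off behind that suggestion is simple arithmetic; the 1 GS/s maximum rate below is illustrative, not any particular scope's spec:

```python
def lowest_bin_hz(n_samples: int, rate_sps: float) -> float:
    """Lowest non-zero spectral bin = 1 / record duration."""
    return rate_sps / n_samples

# 1 Msample at an (assumed) 1 GS/s maximum rate: a 1 ms record,
# so nothing below ~1 kHz is resolved.
print(lowest_bin_hz(1_000_000, 1e9))    # 1000.0
# The same 1 Msample at 100 MS/s: a 10 ms record, down to ~100 Hz.
print(lowest_bin_hz(1_000_000, 100e6))  # 100.0
```

So dropping from the maximum rate to 100 MS/s buys an order of magnitude at the low-frequency end for the same file size, at the cost of needing the 20 MHz filter to keep aliasing in check.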