Everything said by our RF guru G0HZU should be taken seriously. Up to now, he has mainly talked about the (close to) +6 dB boost of the distortion products in the case of two clearly (and inevitably) correlated distortion sources, but of course this assumes zero phase shift, hence zero propagation delay – or at least test signals whose wavelengths are long compared to the physical dimensions of the test setup. Once the distance between distortion sources happens to be half the wavelength, we get phase inversion, and instead of a boost, the distortion components will cancel out.
EDIT: Doh! Nobody notified me of my mistake. Of course the runtime is almost irrelevant, since the distortion products in question are at almost the same frequency as the test signals!
In my test, the frequency was 450 MHz and half its wavelength is about 33 cm. While it is highly unlikely that such massive distances will occur between potential distortion sources within the DSO frontend, it would be perfectly possible that an odd multiple of this distance exists (as electrical length) between the external generator and any non-linear stage within the DSO. As a consequence, there is indeed a chance of cancelling out distortion products in this scenario.
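The half-wavelength figure above is quick to verify. A minimal sketch (assuming free-space propagation; in coax, the velocity factor would shorten the electrical length further):

```python
# Half-wavelength at the 450 MHz test frequency (free-space assumption).
C = 299_792_458.0  # speed of light in m/s

def half_wavelength_cm(f_hz: float) -> float:
    """Half the free-space wavelength of frequency f_hz, in centimeters."""
    return (C / f_hz) / 2 * 100

print(round(half_wavelength_cm(450e6), 1))  # ~33.3 cm
```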
My claim that I cannot guarantee the IMD performance of the test tones to be better than -74 dBc came from the fact that I’ve measured this with my SA. It’s pure coincidence that the measurement result with the DSO was about the same. Since I don’t happen to own an R&S FSEA30 boat anchor with 110 dB third-order dynamic range, I could not know whether I had hit the limit of the generator or of the analyzer. With two 10 dB pads at the external power combiner, hence a total isolation of 26 dB, any IMD products generated in the sources should be suppressed reliably. Now I could indeed measure an IMD of at least -85 dBc and can be confident that my source is vastly better than this, since it already measured -74 dBc IMD with only 6 dB isolation.
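The 26 dB figure is just the sum of the isolation path between the two generators. A sketch of that budget (assumed values: ~6 dB port-to-port isolation for the resistive combiner, plus one 10 dB pad in each branch):

```python
# Isolation budget between the two generators (assumed values).
pad_db = 10.0          # attenuation of each pad
combiner_iso_db = 6.0  # port-to-port isolation of the combiner

# A signal leaking from one generator into the other passes its own pad,
# the combiner's isolation path, and the other generator's pad:
total_iso_db = pad_db + combiner_iso_db + pad_db
print(total_iso_db)  # 26.0 dB
```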
With the 10 dB pads in place, I cannot maintain a 0 dBm signal level anymore. Even though I could have managed about -3 dBm, I opted to repeat the test at a signal level of -10 dBm. This totally changes the scope settings and hence its internal nonlinear transfer curves.
Originally, I had 0 dBm signals at 500 mV/div. This means that the internal 20 dB attenuator is active and the PGA (Programmable Gain Amplifier) might have to deliver about 6 dB of gain. For an equivalent setting at -10 dBm, I would have to set the channel gain to 160 mV/div. With this, the attenuator is still in place, but the PGA now runs at 16 dB gain, which might cause more distortion. In general, I think the PGA is likely the main source of nonlinearity in a wide-bandwidth scope frontend, whereas the unity-gain input buffer as well as the ADC should be rather benign in this regard.
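The level/gain arithmetic behind this can be sketched as follows (my assumptions: 50 Ω input, sine wave, and the 20 dB internal attenuator as described above; the actual frontend gain distribution may well differ):

```python
import math

def dbm_to_vpp(p_dbm: float, r_ohm: float = 50.0) -> float:
    """Peak-to-peak voltage of a sine wave with power p_dbm into r_ohm."""
    p_w = 10 ** (p_dbm / 10) / 1000
    v_rms = math.sqrt(p_w * r_ohm)
    return 2 * math.sqrt(2) * v_rms

# 0 dBm is ~0.632 Vpp; -10 dBm is ~0.2 Vpp.
print(round(dbm_to_vpp(0), 3))    # ~0.632
print(round(dbm_to_vpp(-10), 3))  # ~0.2

# Dropping the level by 10 dB while also reducing the scale from
# 500 mV/div to 160 mV/div (~10 dB) keeps the attenuator engaged,
# so the PGA must make up the difference: roughly 6 dB + 10 dB = 16 dB.
pga_gain_db = 6 + 10
print(pga_gain_db)  # 16
```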
Look at the attached screenshot for -10 dBm at 160 mV/div. The IMD isn’t quite as good, but it still happens to be another sweet spot:
SDS2504X HD_IMD_160mV_C450MHz_O200kHz_-10dBm
We can also try to replicate our former sweet spot without the internal 20 dB attenuator. For this, we need a -20 dBm test signal at 50 mV/div; see the third screenshot:
SDS2504X HD_IMD_50mV_C450MHz_O200kHz_-20dBm
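A quick sanity check that this setup really reproduces the original operating point: lowering the level by 20 dB and making the vertical scale 20 dB more sensitive cancel out, so the stages after the (now bypassed) attenuator see an equivalent drive. A sketch (assumption: the vertical scale tracks 20·log10 of the V/div ratio):

```python
import math

def rel_db(v_per_div_a: float, v_per_div_b: float) -> float:
    """Relative sensitivity change between two V/div settings, in dB."""
    return 20 * math.log10(v_per_div_a / v_per_div_b)

# Original: 0 dBm at 500 mV/div with the 20 dB attenuator.
# New:    -20 dBm at  50 mV/div without the attenuator.
level_delta_db = 0 - (-20)          # signal is 20 dB lower
scale_delta_db = rel_db(0.5, 0.05)  # scale is 20 dB more sensitive
print(round(scale_delta_db), level_delta_db)  # 20 20
```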
With this setup, we can be confident that the generator signal is way better than the measured IMD, so we need not speculate about where the distortion comes from.