Why would you expect the 3rd order dynamic range to be any different from e.g. the SDS2000X Plus?
Simply because of the 12 bit ADC in some of the SDS6000 variants!
With a higher resolution ADC, you get less granular noise, hence a wider first order dynamic range.
Agree, you should see less granular noise and thus better DR.
In contrast, the third order dynamic range is determined solely by the linearity of the ADC. There is no reason why an 8 bit ADC could not provide the same linearity as a 12 bit variant.
Completely Wrong!! The linearity is dictated by the entire analog chain including attenuators (MOS switches), preamps, VGA, buffers and ADC, not solely by the ADC as you state.
Of course, if you imagine a constant +/-1 LSB INL specification, then the higher resolution ADC will be more linear. But the ADCs in modern scopes are calibrated – have you ever asked yourself why self-cal takes so long? – and consequently even the 8 bit models have a fairly good linearity. This is also absolutely necessary, otherwise any resolution enhancement measures like ERES or long FFT would not yield any sensible results.
Self calibration is not a "fix" for a poorly performing system, and it can do little to correct the non-linearity of the entire channel before the ADC, which is usually frequency, waveform and amplitude dependent. When small, sensitive signals are present alongside much larger ones, one has to ask: are all the waveform details real, or an artifact of the system nonlinearity??
Finally, the linearity and consequently the third order dynamic range is not determined by the ADC alone, but also the frontend. It should be pretty obvious that a 1 GHz or even 2 GHz frontend might not be able to provide the absolutely best linearity, as there are other challenges as well. Figures around 60 dB aren’t bad at all and vastly exceed the standards for hifi audio equipment 😉
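For context on what such a third order figure implies in a two-tone test: for a weakly nonlinear (third-order) channel, the IMD3 products at 2f1−f2 and 2f2−f1 improve by 2 dB relative to the carriers for every 1 dB the tones are backed off. A sketch of that arithmetic with hypothetical numbers (not SDS6000 specs):

```python
# Two-tone third-order intermod arithmetic for a weakly nonlinear channel:
# IMD3 products fall 3 dB per 1 dB of tone backoff in absolute terms,
# i.e. 2 dB per dB relative to the tones themselves.
def imd3_dbc(imd3_fullscale_dbc: float, backoff_db: float) -> float:
    return imd3_fullscale_dbc - 2 * backoff_db

# e.g. a hypothetical -60 dBc at full scale improves to -80 dBc
# when both tones are backed off by 10 dB
print(imd3_dbc(-60, 10))  # -80.0
```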
Good you corrected your earlier erroneous statement!!
I don't work with audio so I can't say, but for the things we've developed during my career, 60 dB isn't nearly good enough. We've done various custom dedicated-application real-time spectrum analyzers with custom chips, high dynamic range wide-band receivers, and many other systems, often pushing 100 dB or more.
Ever wonder why Real Time Spectrum Analyzers don't use 8 bit ADCs?
They use the highest resolution ADCs available that meet the intended bandwidth, to help achieve the necessary dynamic range. No one is going to buy an RTSA based on an 8 bit ADC.
These ADCs are quite expensive, which is fundamentally why RTSAs (and 12 bit scopes) cost more. There are some new types of ADCs, which we were fortunate to be involved with back in ~2010, that may prove highly beneficial for applications like Electronic Warfare and high-DR wide-band receivers, and will likely filter down to scopes and RTSAs later. Waveform information is quantized simultaneously in time and amplitude, which offers a unique means of achieving higher DR at higher frequencies; the anti-aliasing filter also sits after the conversion. The actual input waveform creates this ADC's unique properties, and the waveform dictates its own "Nyquist" requirement rather than it being fixed as in conventional ADCs. Anyway, that's well outside this thread on the new SDS6000.
Best,