What should we see in practice, then?
For example, with the BNC channel input shorted by a 50 ohm terminator at the 1 mV range (at a specific sample rate and BW).
What AC RMS (or standard deviation, or peak-to-peak) should we expect with 8-bit and 12-bit scopes?
Or what 12-bit/8-bit noise ratio should we see?
So far, based on the videos, I do not see the ratio really being 12-bit/8-bit..
PS: for example, my borrowed 20-year-old DS1062CA shows 440 uVpp of noise terminated with 50 ohm (1 mV range, BW limit ON, 100 ms/div)..
This is something that was explained in the 12-bit Siglent postings but somehow gets forgotten when discussed here..
Firstly, there is the input-referred noise of the input amplifiers (the analog front end). Analog noise is tied to bandwidth: more BW, more noise.
Then there is ADC noise, which is partly analog noise (input buffers) and partly quantization noise (noise introduced while digitizing). Quantization noise is related to ENOB (effective number of bits).
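To put rough numbers on the quantization part (my own arithmetic, not from the thread): for an ideal ADC, quantization noise is one LSB divided by sqrt(12). A quick sketch, assuming a hypothetical 800 mVpp full scale (100 mV/div over 8 divisions):

```python
import math

def quantization_noise_rms(full_scale_vpp, bits):
    """Ideal ADC quantization noise (RMS): one LSB / sqrt(12)."""
    lsb = full_scale_vpp / 2**bits
    return lsb / math.sqrt(12)

fs = 0.8  # assumed 800 mVpp full scale, purely for illustration
for bits in (8, 12):
    uv = quantization_noise_rms(fs, bits) * 1e6
    print(f"{bits}-bit ideal quantization noise: {uv:.0f} uV RMS")
```

For ideal converters the 12-bit noise is 16x (2^4) lower; real ENOB figures shrink that gap, as below.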
You can have an 8-bit scope with lower noise than a 12-bit scope at low ranges, for instance if you pair a very low-noise front end with a very high-ENOB (for 8-bit) ADC, compared to a high-noise front end connected to a 12-bit ADC with a not-so-stellar ENOB.
ENOB is always less than the architectural resolution. The point here is that good 8-bitters will have around 7.5 bits of ENOB, but good 12-bitters will have 10-10.5 bits.
And here comes the interesting part. Noise is about the lower part of the signal, hence the term noise floor. Basically, the noise floor is a measure of the smallest signal we can measure reliably at the current RANGE of vertical sensitivity.
At large V/div, where the analog front end contributes little noise, the noise will be dominated by the quantization noise of the ADC and basically set by its ENOB. As you go more sensitive, the front end will contribute more of the noise, to the point of it being the dominant source...
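A minimal sketch of that crossover, under my own assumptions (a hypothetical 100 uV RMS input-referred preamp noise, 7.5-bit ENOB, 8 vertical divisions): uncorrelated noise sources combine as root-sum-of-squares, and the input-referred quantization noise scales with the vertical range.

```python
import math

def rss(*sources_rms):
    """Uncorrelated noise sources add as root-sum-of-squares."""
    return math.sqrt(sum(s**2 for s in sources_rms))

def quant_noise_rms(v_per_div, enob, divisions=8):
    """Input-referred quantization noise for a given vertical range."""
    lsb = divisions * v_per_div / 2**enob
    return lsb / math.sqrt(12)

FRONTEND_RMS = 100e-6  # assumed 100 uV RMS preamp noise, illustrative only

for v_div in (1.0, 0.05, 0.001):
    q = quant_noise_rms(v_div, enob=7.5)
    total = rss(FRONTEND_RMS, q)
    print(f"{v_div*1e3:7.1f} mV/div: quant {q*1e6:8.1f} uV, "
          f"total {total*1e6:8.1f} uV RMS")
```

With these made-up numbers, quantization noise dominates at 1 V/div, but at 1 mV/div it is tens of times smaller than the preamp noise, so the total barely changes with ADC resolution.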
At 1 mV/div (or less) there won't be the 4X noise-floor difference between 8 and 12 bits that the ADC ENOB difference would suggest.
So for measuring very low-level signals, preamp quality comes first.
What 12 bits gives you is not ultimate low noise at the low end (that is a function of the preamp) but dynamic range (that is how ADC datasheets call it, anyway). If, for instance, in FFT an 8-bit scope can measure a 1 mV signal clearly at 100 mV/div, a 12-bit scope can do the same at, say, 400 mV/div.
That means that at the same time, on the same screen, the 8-bit scope can show both a 1 mV and a 390 mV peak, but a 12-bit one will be able to show a 1 mV and a 1.58 V signal at the same time. But that is FFT, the frequency-domain view.
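To put the ENOB figures from above into dB (my own arithmetic, using the standard ideal-ADC rule of thumb SNR = 6.02*ENOB + 1.76 dB):

```python
def snr_db(enob):
    """Ideal ADC SNR in dB for a given effective number of bits."""
    return 6.02 * enob + 1.76

# Ballpark ENOB figures quoted earlier in the post (assumptions, not
# measurements of any particular scope):
for label, enob in (("good 8-bitter", 7.5), ("good 12-bitter", 10.5)):
    print(f"{label}: ENOB {enob} -> SNR {snr_db(enob):.1f} dB")

# 3 bits of extra ENOB is 2**3 = 8x amplitude dynamic range; the
# 400 mV/div vs 100 mV/div example corresponds to a 2-bit (4x) advantage.
print(f"Amplitude ratio for 3 extra bits: {2**(10.5 - 7.5):.0f}x")
```

So the achievable advantage sits somewhere between the 4x of the two-range example and the 8x that the raw ENOB gap would suggest, depending on the actual converters.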
In the time domain (standard waveform display mode), a 12-bitter compared to a good low-noise 8-bitter will show the same trace at high V/div, a nicer, thinner trace in some intermediate ranges, and increasingly the same trace thickness as you go to sensitive ranges (where preamp noise dominates), provided the preamps are similar.