David, older Rigols computed RMS over highly decimated screen-pixel data, which meant the measurements were being made on filtered/distorted data.
That is true of any DSO and any sampling RMS instrument. Decimation does not alter the expected RMS value; it only increases the measurement uncertainty. This is easy enough to understand by considering what happens to the standard deviation of a set of data points when they are decimated: the standard deviation does not change, but the uncertainty of the estimate does.
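Here is a quick numerical sketch of that point (all values are illustrative): decimating a Gaussian noise record by 100:1 leaves the mean RMS estimate unchanged but makes it much noisier from trial to trial.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms(x):
    return np.sqrt(np.mean(x**2))

# Repeat the measurement many times to estimate the spread (uncertainty)
# of the full-record and decimated-record RMS estimators.
full_rms = []
decimated_rms = []
for _ in range(2000):
    x = rng.normal(0.0, 1.0, 10000)      # true RMS = 1.0
    full_rms.append(rms(x))
    decimated_rms.append(rms(x[::100]))  # keep 1 sample in 100

print(f"full:      mean={np.mean(full_rms):.4f}  std={np.std(full_rms):.4f}")
print(f"decimated: mean={np.mean(decimated_rms):.4f}  std={np.std(decimated_rms):.4f}")
```

Both means come out at 1.0; only the standard deviation of the estimate (the uncertainty) grows, roughly by the square root of the decimation ratio.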
Note that since decimation has no effect on the RMS value, aliasing does not either. The noise depends on the bandwidth, not the sample rate. The record length has a minor effect because low frequency noise components with periods longer than the record length are not captured, and some measurements take advantage of this to limit low frequency noise.
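A sketch of the record-length effect, using deterministic sine components instead of noise for repeatability (frequencies and amplitudes are illustrative): a component much slower than the record looks like a DC offset within that record, so the mean subtraction in an AC-coupled RMS measurement removes it.

```python
import numpy as np

fs = 100000
t = np.arange(10 * fs) / fs                # 10 s record
slow = 1.0 * np.sin(2 * np.pi * 0.1 * t)   # 0.1 Hz component, period 10 s
fast = 0.5 * np.sin(2 * np.pi * 100 * t)   # 100 Hz component
x = slow + fast

def ac_rms(x):
    # AC-coupled RMS: subtract the record mean, then compute RMS.
    return np.sqrt(np.mean((x - np.mean(x))**2))

print(ac_rms(x))              # full 10 s record sees both: sqrt(1/2 + 0.25/2) ~ 0.79
print(ac_rms(x[:fs // 100]))  # 10 ms record: slow component is ~DC, removed -> ~0.354
```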
A DSO whose display record is a histogram, like a DPO (digital phosphor oscilloscope), can also make an accurate RMS measurement of the display record. So whatever Rigol is doing to produce a display record, it is not this.
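To show why a histogram record is sufficient: the RMS is just the count-weighted mean square of the bin centers, so it agrees with the direct sample computation to within the bin quantization. A minimal sketch (bin count and signal statistics are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(0.0, 0.5, 100000)   # stand-in for an acquisition record

# Build a histogram "display record" of the samples.
counts, edges = np.histogram(samples, bins=256)
centers = 0.5 * (edges[:-1] + edges[1:])

# RMS from the histogram alone: count-weighted mean of center^2.
rms_hist = np.sqrt(np.sum(counts * centers**2) / np.sum(counts))
rms_direct = np.sqrt(np.mean(samples**2))
print(rms_hist, rms_direct)  # agree to within the bin quantization
```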
What do you think would be a proper protocol to verify that RMS works correctly this time?
When I did this test on my DSOs, I used one of my analog oscilloscopes to measure its own RMS noise using the tangential method and then I measured the same noise using the RMS function of my DSO. They agreed to better than 5%. I was alerted to the problem with Rigol's DSOs when someone posted a self noise measurement which was more than an order of magnitude too high and not consistent with the displayed noise.
In theory any Gaussian noise source with a controlled measurement bandwidth can be used; however, how do you calibrate the source? I used the tangential measurement method with an analog oscilloscope for lack of anything better, and that may be the most accessible method. In theory a DSO can use this method too, but only if its display accurately duplicates the response of an analog display. Most of the old Tektronix TDS series DSOs can do it, as can an analog sampling oscilloscope, but the display processing in newer DSOs prevents it.
My informal test when evaluating a DSO, when I do not have a calibrated noise source handy, is to have the DSO measure its own front-end noise. That only works because I know approximately what it should be; if it measures an order of magnitude or more too high, I know the RMS function is broken.
Something is really messed up with the front end of the MSO5000. I am using a 10 kHz sine wave generated by the built-in AWG, BNC to crocodile clips, 10x Rigol probe. In Normal acquisition mode I cannot get a stable signal; it triggers on both edges, rising and falling. When I switch to Average mode with 2 averages, the signal is attenuated by more than 50%; Hi-Res acquisition mode brings the signal back to full amplitude and the double triggering disappears. I use the same cable and Rigol probe on my Keysight EDUX1002G and it works perfectly, as expected.
Can anyone try the same setup and verify?
I do not need to verify this as I know exactly what the problem is. I think Dave even discusses and illustrates it in one of his videos, but I know w2aew covered it; see the video below.
DDS/AWG outputs contain significant glitch energy from the quantized DAC steps, which means any part of the waveform has high frequency rising and falling edges. Depending on exactly how sensitive the trigger is and how it is implemented, the oscilloscope's trigger may see both edges. This is much less of a problem with analog function generators, which have a continuous (non-quantized) output, but if their output is noisy enough, it can happen with them also.
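The mechanism can be sketched numerically. Here high-frequency glitch energy is modeled as additive noise on a clean sine (a simplification of real DAC glitches, and all values are illustrative): a naive comparator trigger sees many spurious crossings near the trigger level, while a comparator with hysteresis does not.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100000) / 10000.0                    # 10 s at 10 kS/s
clean = np.sin(2 * np.pi * t)                      # 1 Hz sine, 10 cycles
glitchy = clean + 0.05 * rng.normal(size=t.size)   # added glitch/noise energy

def count_rising(x, level):
    # Naive edge trigger: count every upward crossing of the level.
    above = x > level
    return int(np.count_nonzero(~above[:-1] & above[1:]))

def count_rising_hyst(x, level, hyst):
    # Trigger with hysteresis: arm below (level - hyst), fire above level.
    count, armed = 0, False
    for v in x:
        if v < level - hyst:
            armed = True
        elif armed and v > level:
            count += 1
            armed = False
    return count

print(count_rising(clean, 0.0))              # 10: one per cycle
print(count_rising(glitchy, 0.0))            # many spurious extra triggers
print(count_rising_hyst(glitchy, 0.0, 0.5))  # hysteresis restores ~one per cycle
```

This is essentially what trigger coupling/hysteresis settings do: keep the high-frequency glitch energy from re-crossing the trigger threshold.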
For DSOs which implement an analog trigger path, the trigger coupling can sometimes be adjusted to prevent this. For DSOs with a digital trigger path like the Rigol, trigger coupling depends on a digital filter and Rigol has had problems with this in the past. Did they ever get AC trigger coupling working on their high end DSOs? It was broken on the DS1000Z series for a long time.
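For illustration of what a digital trigger coupling filter has to do, here is a minimal sketch of AC coupling as a one-pole DC-blocking filter, y[n] = x[n] - x[n-1] + a*y[n-1] (a standard DC-blocker form; the pole position a and signal values are assumptions, not Rigol's actual implementation):

```python
import numpy as np

def dc_block(x, a=0.999):
    # One-pole DC blocker: zero at DC, pole at a sets the low-frequency cutoff.
    y = np.zeros_like(x)
    prev_x = prev_y = 0.0
    for n, xn in enumerate(x):
        y[n] = xn - prev_x + a * prev_y
        prev_x, prev_y = xn, y[n]
    return y

t = np.arange(50000) / 10000.0          # 5 s at 10 kS/s
x = 2.5 + np.sin(2 * np.pi * 10 * t)    # 10 Hz sine riding on a 2.5 V offset

y = dc_block(x)
# After the filter settles, the offset is gone, so a 0 V trigger level works.
print(np.mean(x[10000:]), np.mean(y[10000:]))
```

If the filter coefficients or the state handling are wrong, AC trigger coupling misbehaves exactly the way users reported on the DS1000Z series.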