It's an 8-bit DAC with a very sensitive, very high impedance input and a slight DC offset error. It can only ever be about 5% accurate.
The reading looks entirely reasonable to me when the wave isn't clipped.
@Fungus
I'm sorry, but you are looking at this the completely wrong way...
Fair enough, but the point about the 8-bit DAC, etc., still stands.
If you only turn on a single channel or if you only measure on channels 1 and 3 then are the RMS readings reasonable?
If you own a decent true RMS multimeter then make the RMS measurements with that.
Use the scope to look for distortion, clipping, etc., in the output (i.e. what it's meant to be used for).
You are repeating yourself... It is tiresome, and you are not right...
First, let's look at the specs from the manual:
- DC Gain Accuracy: <10 mV/div: ±4% of full scale; >10 mV/div: ±3% of full scale
- DC Offset Accuracy: ±0.1 div ±2 mV ±1% of offset value
- Channel-to-Channel Isolation: DC to maximum bandwidth: >40 dB
This means the following: no, 5% is not right. It has to be better than ±4% of full scale at sensitivities below 10 mV/div, and better than ±3% of full scale above 10 mV/div...
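To put that spec into volts, here is a quick sketch (ASSUMPTION: 8 vertical divisions of full scale, as on my Rigol; check your own scope's manual):

```python
# Worked example: what the DC gain-accuracy spec means in volts.
# ASSUMPTION: 8 vertical divisions make up full scale.
DIVISIONS = 8

def max_gain_error_volts(volts_per_div, pct_of_full_scale):
    """Worst-case DC gain error, in volts, at a given sensitivity."""
    full_scale = volts_per_div * DIVISIONS
    return full_scale * pct_of_full_scale / 100.0

# At 1 V/div with the +/-3% spec: up to 0.24 V of allowed gain error.
print(max_gain_error_volts(1.0, 3.0))
```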
And an 8-bit ADC can be 0.00001% accurate... It has a resolution of 256 discrete steps from 0 to the max value, but every single step can in theory be 1 ppm accurate. It would need fantastic linearity, but yes, it can be that accurate... A Kelvin-Varley divider with 3 decades has a resolution of one part in a thousand, but can be sub-ppm accurate on each of its steps...
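To make the resolution-vs-accuracy distinction concrete, a small sketch (the 1 V full scale is hypothetical, just for illustration):

```python
# Resolution vs. accuracy of an ideal 8-bit converter.
# Resolution says how FINE the steps are; accuracy says how close each
# step threshold sits to its ideal position -- two independent things.
BITS = 8
FULL_SCALE = 1.0  # hypothetical 1 V full scale

steps = 2 ** BITS          # 256 discrete codes
lsb = FULL_SCALE / steps   # ~3.9 mV per step at 1 V full scale

print(steps, lsb)
# Each of those 256 thresholds can in principle sit within 1 ppm of its
# ideal value; the step count does not limit per-step accuracy.
```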
And I checked: each channel individually does RMS fairly well... 2.09 V RMS applied to the input (that is the amplitude of the scope cal output on my Rigol, measured with a separate 6.5-digit true-RMS voltmeter) reads 2.09 to 2.06 on all channels, each by itself... So the accuracy of the RMS measurement is not a problem and is well within the scope's specs...
Offset is also quite a bit better than the guaranteed specs..
The problem is in the third spec: channel separation... A separation of 40 dB is a voltage ratio of 100:1, meaning that 10 V (10000 mV) connected to one channel should not induce (yes, induce, through parasitic capacitances and inductances inside the scope) more than 100 mV of phantom signal in the other channels, and that's from DC to 100 MHz...
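For reference, this is just the standard 20·log10 convention for converting an isolation figure between dB and a voltage ratio (not tied to any particular scope):

```python
import math

def db_to_voltage_ratio(db):
    """Voltage ratio corresponding to an isolation figure in dB."""
    return 10 ** (db / 20.0)

def voltage_ratio_to_db(ratio):
    """Isolation in dB for a given voltage ratio."""
    return 20.0 * math.log10(ratio)

# The >40 dB spec is a 100:1 voltage ratio, so 10 V on one channel
# should leak at most 100 mV into a neighbour.
print(db_to_voltage_ratio(40.0))
```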
And looking at the traces, it seems the scope does that better than the specs too, electrically at least...
Except in RMS measurements.
There, channel separation between consecutive channels seems to be only about 7.8 dB (about 2.45× in voltage), because of a stupid software bug...
You connect 2.09 V RMS to CH1, and on CH2, with nothing connected, you measure 854 mV. That is 0.854 of a division, a 10.6% error induced by a software bug..
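Plugging my measured numbers in shows how far this is from the spec (same 20·log10 convention; the 2.09 V and 854 mV figures are from my screenshots):

```python
import math

applied = 2.09    # V RMS driven into CH1
phantom = 0.854   # V RMS shown on an open, undriven CH2

# Effective isolation of the RMS readout between adjacent channels.
isolation_db = 20.0 * math.log10(applied / phantom)
print(round(isolation_db, 1))   # about 7.8 dB, vs. the >40 dB isolation spec
```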
So there you are... No, it is not within the scope's specs.
And on my scope it happens from CH1 to CH2, from CH2 to CH3, and from CH3 to CH4. CH4 does not interfere with the other channels.
So if you need to measure RMS, you can only do it with two channels simultaneously: CH1 and CH3, CH1 and CH4, or CH2 and CH4. All other combinations will have errors..
And if you work on switching PSU design and repair, RMS on a scope is used a lot... especially at several MHz and up, and on waveforms with high crest factors and complicated shapes, where no multimeter has enough bandwidth. There, a scope with a good RMS implementation and 3-5% error is much better than the 30% errors you will get with a multimeter..
And I don't have one of those fancy thermal-RMS-converter-based meters with 100 MHz bandwidth that could do the job..
So none of your reasons for downplaying this have merit... I know you just want to make sure everything is right, precise, relevant and such..
Thank you for a nice discussion, which I hope has resulted in a good explanation of what is wrong, why it is important, and why it needs to be fixed... Much more so than that HUUGe "pulses/pluses" bug
Attached are photos of the screen and the RMS measurements, with voltage applied to each channel separately... Note the spillover into the following channel... All channels were set identically...