I am not talking about pressing the STOP or SINGLE button. The screen is left in a limbo state when trigger events suddenly stop arriving and the WAIT icon in the upper left corner flashes green.
When the trigger condition disappears and the scope sits waiting for more trigger events, a screen full of traces suddenly freezes with
- sometimes many traces drawn on the screen
- sometimes fewer traces
- sometimes only a single trace on the display.
If this were a circular buffer with the last n traces mixed on the display, you would never see a single trace; you would always see a mix of the last n traces. But that is not the case.
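To illustrate why, here is a minimal Python sketch of how a true ring buffer of the last n traces would behave when triggers stop (the depth N is a made-up number, and trace IDs stand in for actual waveform data):

```python
from collections import deque

N = 8  # hypothetical ring-buffer depth (number of traces blended on screen)

display = deque(maxlen=N)  # ring buffer: appending past N evicts the oldest

# simulate a run of triggered acquisitions, then the trigger goes away
for trace_id in range(100):
    display.append(trace_id)

# when triggers stop, the frozen image is always a mix of the last N traces
frozen = list(display)
print(len(frozen))  # → 8 (always N, never a single trace)
```

However the trigger signal is interrupted, a scheme like this always leaves exactly N traces on screen, which is not what the scope shows.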
Steps to reproduce:
- generate 5 Vpp noise with a signal generator and feed it into CH1
- set the trigger to "Normal", "Edge", "Rising", "500mV"
- set the acquire Mode to "Normal", Mem Depth to "Auto"
- set the timebase to 5 ns/div
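For those who prefer to script the setup, the same settings can be sent over SCPI. This is only a sketch assuming a DS1000Z-class command set; check the programming guide for your exact model:

```
:ACQuire:TYPE NORMal
:ACQuire:MDEPth AUTO
:TRIGger:MODE EDGE
:TRIGger:EDGe:SOURce CHANnel1
:TRIGger:EDGe:SLOPe POSitive
:TRIGger:EDGe:LEVel 0.5
:TRIGger:SWEep NORMal
:TIMebase:MAIN:SCALe 5e-9
```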
At this moment you should see a live image of the noise (note the green T'D triggered icon in the upper left corner of the display), looking like the first attached picture.
Let it run for a few seconds, then turn off the signal from the signal generator, so the oscilloscope will find no more trigger events. Do not touch the oscilloscope.
The "WAIT" icon will now appear in the upper left corner and will start to flash green, indicating the oscilloscope is waiting for the next trigger event. At this point the display is frozen.
What would you prefer to see on the display at this moment?
Some will say they expect the image above, with the many overlapped traces frozen on the display; others might prefer to see only the last trace. On an analog oscilloscope you would simply see a blank screen.
No matter your preference, if you repeatedly turn the generator on and off a few times (leaving a few seconds in between), you will see the display freeze in an inconsistent (random) state: sometimes with many traces mixed, sometimes with fewer, and sometimes with only one trace frozen, as in the last 3 pictures attached.
I'd say this is a bug in the UltraVision ring buffer that is supposed to blend many traces into one displayed image.
What I'm afraid of is that this is not a bug, but that Rigol cut some corners to reduce the computing power and memory requirements, so it will never be fixed.
I assume (my speculation only) that instead of a running average, which would require a circular buffer to implement, they simply add each new trace to the displayed image (no circular buffer), then after a while clear the display and start adding new traces to it again.
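That add-then-wipe scheme would explain the randomness. A toy Python model (the clear period is a made-up number) shows how the number of frozen traces depends only on where in the cycle the trigger happens to stop:

```python
CLEAR_PERIOD = 8  # hypothetical: the display is wiped after every 8 traces

def frozen_trace_count(total_traces):
    """Traces left overlapped on screen when triggers stop after
    `total_traces` acquisitions, under the add-then-wipe scheme."""
    count = total_traces % CLEAR_PERIOD
    return count if count else CLEAR_PERIOD

# depending on where the trigger stops in the cycle, the frozen image
# holds anywhere from 1 trace to a full screen:
print(frozen_trace_count(17))  # → 1 (stopped right after a wipe plus one trace)
print(frozen_trace_count(23))  # → 7
print(frozen_trace_count(24))  # → 8 (a full accumulation)
```

Unlike the ring-buffer model, this reproduces all three observed outcomes: many traces, fewer traces, or a single trace, purely at random.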