One second of no feedback can be pretty long when I'm e.g. trying to adjust something. At 200 ms/div it's two seconds …
Sorry, but I cannot see a valid argument here.
I’ve rarely used a DSO for very slow signals, but I’ve just played a little with a 1 Hz cardiac signal to confirm my theory.
You want to adjust something. One would normally use a faster time base for this, so I have to assume that the repetition frequency of the event you want to “adjust” is pretty low – just like my 1 Hz cardiac pulse. Now I’m inclined to say that adjusting a waveform with a slow repetition rate will always be a pain, no matter how the DSO displays it.
If, for instance, I want to adjust the amplitude of the cardiac signal, I only get feedback every second, no matter what. And of course that makes it hard to adjust precisely, but please don’t blame the DSO for that.
I’ve tried both normal acquisition and Roll mode, and for my personal taste, 1-second updates in normal mode are still bearable, whereas at 200 ms/div Roll mode may indeed be the better option – as long as you don’t need a stable signal position.
Now what would we gain from Scan mode?
Assuming a 100 ms/div time base (i.e. a 1-second sweep across ten divisions), we can say:
• Normal mode provides a consistent snapshot of the signal every second.
• Roll mode provides a consistent window into the continuously moving signal, without any blind time.
• Scan mode shows an inconsistent mix of old and new data.
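The three bullets above can be sketched as a toy timing model. This is purely illustrative – it is not any scope’s firmware, and the function names are mine – it just computes which time span of the input signal is on screen at a given instant, for a 1 s window (100 ms/div × 10 divisions):

```python
import math

WINDOW = 1.0  # seconds on screen: 100 ms/div * 10 divisions


def normal_span(t):
    """Normal mode: full redraw after each completed acquisition.
    The screen holds the last *complete* 1 s window, so the data
    is consistent but up to one second old."""
    k = math.floor(t / WINDOW) * WINDOW  # end of last completed sweep
    return (k - WINDOW, k)


def roll_span(t):
    """Roll mode: the screen always shows the most recent 1 s,
    scrolling continuously. Consistent and current, but the
    signal position on screen is not stable."""
    return (t - WINDOW, t)


def scan_spans(t):
    """Scan mode: the trace is overwritten left to right. Left of
    the sweep position is fresh data from the current pass; right
    of it is leftover data from the previous pass - a mix of old
    and new separated by a seam."""
    x = t % WINDOW            # sweep position within the window
    s = t - x                 # time the current sweep started
    new = (s, t)              # left part of the screen
    old = (t - WINDOW, s)     # right part, from the previous pass
    return new, old
```

For example, at t = 2.5 s, normal mode still shows the 1.0–2.0 s window, roll mode shows 1.5–2.5 s, and scan mode shows 2.0–2.5 s on the left of the sweep with stale 1.5–2.0 s data to its right – which is exactly the inconsistent mix described above.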
Scan mode certainly cannot speed up the update rate of the signal, and the adjustment process won’t get any easier.
Since display modes have to be implemented in hardware, an additional mode would take additional resources which, if available at all, would be much better spent elsewhere.