The DSP in DSOs is a major fail, even from the A list. It's not even good enough to get a passing grade as a DSP 101 homework exercise. The first chapter of any DSP text explains aliasing and why you cannot decimate data by simply throwing away samples: you *have* to low-pass filter the data first. That's not hard to do and doesn't require a lot of resources, so I'm agog that DSOs are decimating data by discarding samples. The results of my tests with a 5 ns pulse at 1 s intervals showed that "peak detection" is necessary to offset the absolute bodge of downsampling by decimation.
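To put a number on the aliasing complaint, here is a minimal sketch (the rates and signal are made up for illustration, not from the tests above): a 90 MHz sine sampled at 1 GS/s is "decimated" 10:1 by keeping every tenth sample, dropping the effective Nyquist limit to 50 MHz, so the tone reappears as a clean 10 MHz alias.

```python
import numpy as np

# Hypothetical rates for illustration: 90 MHz sine sampled at 1 GS/s.
fs = 1e9
f_in = 90e6
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * f_in * t)

# "Decimate" by discarding samples -- no anti-alias filter.
decimated = x[::10]
fs_dec = fs / 10          # 100 MS/s, so Nyquist is now 50 MHz

spectrum = np.abs(np.fft.rfft(decimated))
freqs = np.fft.rfftfreq(len(decimated), d=1 / fs_dec)
f_apparent = freqs[np.argmax(spectrum)]
# The 90 MHz input now shows up at 100 - 90 = 10 MHz: a pure alias.
```

Nothing in the decimated record distinguishes the alias from a genuine 10 MHz input, which is exactly why the low-pass filter has to come first.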
That illustrates the difference between theory and reality. There are good reasons that digital storage oscilloscopes work the way they do. As pointed out above, if you want a digitizer, then buy a digitizer, or an older DSO from LeCroy, who initially based their DSOs on digitizers.
On the practical side, for most of their history DSOs simply could not do more than the simplest processing during decimation short of a heroic effort and price. Indeed, they could not even store the undecimated acquisition because memory was neither deep nor fast enough. The best that could be done was to discard samples, or to apply the very simplest processing like boxcar averaging (high resolution mode) or peak detection. See below about implementing a filter based on decimation ratio or sweep speed.
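A rough sketch of those three decimation modes, assuming a 10:1 ratio and a unipolar record (a real peak-detect mode keeps both the minimum and the maximum per bucket; only the max is kept here for brevity):

```python
import numpy as np

# A narrow pulse, two samples wide, in an otherwise flat 1000-sample record.
x = np.zeros(1000)
x[503:505] = 1.0
ratio = 10
frames = x.reshape(-1, ratio)   # one row per output sample

discard = frames[:, 0]          # plain decimation: keep every Nth sample
hires   = frames.mean(axis=1)   # boxcar average ("high resolution" mode)
peak    = frames.max(axis=1)    # peak detect: keep the extreme per bucket

# discard misses the pulse entirely; hires smears it down to 0.2;
# peak detect preserves the full 1.0 amplitude.
```

This is why peak detection rescues the 5 ns pulse test above: of the three cheap options, it is the only one guaranteed not to lose a narrow event.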
Later when real time processing became feasible with custom logic, the DPO (digital phosphor oscilloscope) was invented. It has no need of decimation because it produces a histogram of the input in real time and every sample contributes to the acquisition record at any sweep speed.
The way a DSO *should* work when showing a time and/or frequency display:
ADC *always* samples at full speed. There is one ADC data stream, which is fed to two DSP pipelines.
Data for the time domain display is downsampled to a sample rate appropriate to the timebase setting using a folded LP filter.
Time domain display interpolation is by means of a minimum-phase interpolator which combines the passband of the anti-alias filter and the downsampling LP filter on the time domain data stream. The coefficients of that filter are optimized to minimize errors. A similar process is applied to interpolating the frequency domain data.
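As a crude stand-in for the "downsample through a LP filter" step above, here is a plain windowed-sinc FIR decimator. It is *not* the folded, minimum-phase design being described, just the simplest possible illustration of filtering to the new Nyquist band before discarding samples; the function names and the 80%-of-Nyquist cutoff are my own choices.

```python
import numpy as np

def design_lowpass(num_taps, cutoff):
    """Windowed-sinc low-pass FIR; cutoff as a fraction of the sample rate."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(2 * cutoff * n) * np.hamming(num_taps)
    return h / h.sum()          # normalize for unity DC gain

def decimate(x, ratio, num_taps=63):
    """Filter to the new Nyquist band, then keep every `ratio`-th sample."""
    h = design_lowpass(num_taps, 0.5 / ratio * 0.8)   # 80% of new Nyquist
    return np.convolve(x, h, mode="same")[::ratio]
```

With this in place, a component above the decimated Nyquist frequency is attenuated into the stopband instead of folding back as an alias, at the cost of one FIR per output sample.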
The problem with what you describe is that it results in a bandwidth which depends on the decimated sample rate, which in turn depends on sweep speed. That is not the case for analog oscilloscopes, where transition time does not depend on sweep speed.
It is also why I keep repeating that DSOs do not implement anti-aliasing filters, at least as commonly understood. If they did, then their bandwidth would vary with the time/div setting. You certainly could implement one now, but why? The display will be no different except for two factors:
1. Signal envelopes will be wrong as higher frequencies are attenuated. Of course, with aliasing it may not be apparent that a signal even has an envelope, like in the example photograph that I posted, which shows a fine envelope despite potential aliasing because peak detection was used.
2. The histogram for the signal will be corrupted. All of the characteristics and measurements which depend on the histogram will then vary with sweep speed; won't that be fun! Of course those who are used to Rigol's automatic measurements will not notice a difference, because Rigol already does this by making measurements on the display record, which has a corrupted histogram from the processing required to create it. This suggests that no uncontrolled processing which alters the histogram should be performed between acquisition and measurement.
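A toy demonstration of that histogram corruption (made-up numbers, with deliberately coarse sampling so the effect is visible): boxcar averaging a sine record collapses its characteristic bathtub-shaped amplitude histogram and shrinks the apparent peak-to-peak value.

```python
import numpy as np

# Deliberately coarse: 100 cycles across 2000 samples (20 samples/cycle),
# then 10:1 boxcar averaging, so each average spans half a cycle.
x = np.sin(np.linspace(0.0, 200.0 * np.pi, 2000))
smoothed = x.reshape(-1, 10).mean(axis=1)

h_raw, _ = np.histogram(x,        bins=20, range=(-1, 1), density=True)
h_avg, _ = np.histogram(smoothed, bins=20, range=(-1, 1), density=True)
# h_raw peaks in the outermost bins (the sine "bathtub"); h_avg has empty
# outer bins because the averaging shrank the apparent amplitude to ~0.64.
```

Any measurement derived from that histogram -- peak-to-peak, RMS, duty cycle, top/base levels -- now reflects the processing, not the signal.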
To give a concrete example of the above, the DSOs I use can measure RMS and peak-to-peak noise easily and accurately within the limits of their fixed input bandwidth. But if decimation were done as you describe, then noise measurements would vary with sweep speed and be essentially useless. Changing the sweep speed should not alter the input bandwidth.
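The noise example in numbers, assuming white input noise and with crude boxcar decimation standing in for any decimator whose bandwidth narrows as the timebase slows:

```python
import numpy as np

# White input noise with 1.0 RMS.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 100_000)

def boxcar_decimate(x, ratio):
    """Average non-overlapping blocks -- a bandwidth-limiting decimator."""
    return x[: len(x) // ratio * ratio].reshape(-1, ratio).mean(axis=1)

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

rms_full = rms(noise)                        # ~1.0
rms_10   = rms(boxcar_decimate(noise, 10))   # ~0.32, drops as 1/sqrt(ratio)
rms_100  = rms(boxcar_decimate(noise, 100))  # ~0.10
```

The same physical noise reads 1.0, 0.32, or 0.10 RMS depending only on the decimation ratio, i.e. on the sweep speed, which is exactly the uselessness being objected to.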
Another example: transition time would vary with sweep speed. Now you might think from using modern DSOs that this would not be a big issue, but how did old DSOs handle it? They could not report the correct transition time if too few samples were taken across an edge, and the designers knew this! Instead, they returned the questionable measurement and included a warning that the decimated sample rate was insufficient. Many modern DSOs just lie. Those old DSOs also adjusted the number of significant digits they returned to account for measurement precision.
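A sketch of that failure mode, using a hypothetical 10%-90% rise-time measurement on a tanh-shaped edge (the true value for this model is 2·atanh(0.8)·τ ≈ 2.2 ns for τ = 1 ns; all names and rates are mine, for illustration only):

```python
import numpy as np

TAU = 1e-9   # edge time constant; true 10-90 rise time ≈ 2.197 * TAU

def edge(t):
    """Model edge: 0 -> 1 transition with a tanh shape."""
    return 0.5 * (1.0 + np.tanh(t / TAU))

def rise_time_10_90(t):
    """10%-90% rise time from samples, linearly interpolating between points."""
    y = edge(t)   # monotonically increasing, as np.interp requires
    return np.interp(0.9, y, t) - np.interp(0.1, y, t)

rt_dense  = rise_time_10_90(np.arange(-10e-9, 10e-9, 0.01e-9))  # 100 GS/s
rt_sparse = rise_time_10_90(np.arange(-10e-9, 10e-9, 5e-9))     # 0.2 GS/s
# rt_dense lands near the true ~2.2 ns; rt_sparse reports several times
# that, an artifact of having only one or two samples on the edge.
```

The sparse measurement is not noise, it is systematically wrong, which is why the honest behavior is to flag it (or widen the reported uncertainty) rather than print it with full precision.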