Very early DSOs had abysmal memory depths, so for long time/div settings the sample rate was reduced: there was simply nowhere to store the number of samples that would result from capturing and displaying, say, 2 cycles of a 50Hz sine wave at the maximum sample rate.
A sine wave of that order can easily be accommodated at around 1kSa/s, but complex signals are another kettle of fish.
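The memory-depth arithmetic above can be sketched as follows. The maximum rate and record length are illustrative assumptions, not figures for any specific instrument:

```python
# Why early DSOs had to slow their sampling on long timebases.
# MAX_RATE and MEMORY_DEPTH are assumed, illustrative values.

MAX_RATE = 20e6          # assumed maximum sample rate: 20 MSa/s
MEMORY_DEPTH = 1024      # assumed record length of an early DSO, in samples

window = 2 / 50.0        # two cycles of 50 Hz = 40 ms on screen
samples_needed = MAX_RATE * window
print(f"Samples at full rate: {samples_needed:.0f}")   # far exceeds memory

# The rate the instrument must drop to so the record fits in memory:
reduced_rate = MEMORY_DEPTH / window
print(f"Reduced rate: {reduced_rate:.0f} Sa/s")
```

With these numbers the full-rate capture would need 800,000 samples, so the instrument drops to a few tens of kSa/s, which is still ample for a clean 50Hz sine.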
Analog video signals are particularly nasty, with components out to 5MHz or sometimes more.
Not only are they wideband, but the component frequencies within that bandwidth are also constantly changing.
TV people were usually very enthusiastic to see their first DSO, but their hopes were dashed by the unusable displays it presented, as the instrument could not represent the real waveform.
All the discussion so far seems to be drifting toward the problems of higher frequencies, where anti-aliasing filters come into play. But if sample rates are still reduced at much lower frequencies, alias components will still appear in exactly those lower-frequency signals that 'scopes are most commonly required to observe.
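That low-frequency trap can be sketched with the standard frequency-folding formula. The reduced sample rate and the interfering frequencies below are assumptions chosen for illustration:

```python
# Frequency folding at a reduced sample rate, with no anti-alias filter.
# fs and the test frequencies are illustrative assumptions.

def alias(f, fs):
    """Frequency to which a component at f folds when sampled at fs."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

fs = 25_000                       # reduced rate picked for a slow time/div
print(alias(50, fs))              # the 50 Hz signal itself: shown faithfully
print(alias(24_950, fs))          # nearby interference also folds to 50 Hz
print(alias(4_433_618.75, fs))    # PAL chroma subcarrier folds on-screen too
```

The point is that the genuine 50Hz component and a folded high-frequency component can land at the same displayed frequency, so the trace looks plausible while being wrong.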