1M tests used an open input. If I used an external 50 ohm termination, then that's effectively the same as the 50 ohm mode, unless there are actually different paths for the 50 ohm and 1M inputs, which is usually not the case for scopes.
So that graph you just posted is correct. I thought you wanted an open input?
In a good design, noise is dominated by the impedance converter at the input, but there are several noise sources there. The input FET has its inherent voltage noise, and usually horrible flicker noise because it is a UHF part.
In series with the FET gate, as part of the protection circuit, is a roughly 470 kilohm series resistance bypassed with something like 1000 picofarads, so the thermal noise from this resistor has a bandwidth of about 340 Hz, which is likely obscured by flicker noise from the FET. The shunt 1 megohm resistance is in parallel with about 15 picofarads, so its noise bandwidth is about 10 kHz and it contributes considerable low frequency noise; that is the increase in noise that you see when the 50 ohm termination is not present.
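You can sanity-check those numbers yourself. A minimal sketch, using the single-pole corner f = 1/(2*pi*R*C) for the bandwidth figures quoted above and sqrt(4kTRB) for the resistor's thermal noise (the exact R and C values are the rough ones from the paragraph, not measured):

```python
import math

def corner_hz(r_ohms, c_farads):
    """-3 dB corner of a single-pole RC: f = 1/(2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

def thermal_noise_vrms(r_ohms, bw_hz, temp_k=290.0):
    """Johnson noise of a resistor over bandwidth B: sqrt(4*k*T*R*B)."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4.0 * k * temp_k * r_ohms * bw_hz)

# ~470 kohm series protection resistor bypassed by ~1000 pF
f_series = corner_hz(470e3, 1000e-12)          # ~339 Hz
# 1 Mohm shunt resistance in parallel with ~15 pF
f_shunt = corner_hz(1e6, 15e-12)               # ~10.6 kHz

print(f"series corner: {f_series:.0f} Hz")
print(f"shunt corner:  {f_shunt / 1e3:.1f} kHz")
print(f"1 Mohm noise over {f_shunt/1e3:.1f} kHz: "
      f"{thermal_noise_vrms(1e6, f_shunt) * 1e6:.0f} uVrms")
```

The 1 megohm shunt alone works out to roughly 13 uVrms over its own corner frequency, which is why it dominates the low frequency noise when no 50 ohm termination is shunting it.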
As far as "low noise", I measured Tektronix 7000 series 200 MHz vertical amplifiers from the 1970s and 1980s with 18 microvolts RMS noise over 100 MHz, which is considerably better than the same noise level over 20 MHz. They perform better than most modern instruments because they use JFET instead of CMOS impedance converters, and have the advantage of being designed for a lower bandwidth; software bandwidth upgrades were for the future. Besides the use of CMOS, an 800 MHz design will be higher noise even with bandwidth limiting because of the required parts selection.
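To see why the same RMS number over a smaller bandwidth is worse, convert it to an equivalent white-noise density (noise voltage scales as the square root of bandwidth). A quick sketch using the 18 microvolt figure above; the 20 MHz comparison is illustrative, not a measured instrument:

```python
import math

def noise_density_nv_rthz(vrms_uv, bw_hz):
    """Equivalent white-noise density in nV/sqrt(Hz) from an RMS reading."""
    return vrms_uv * 1e-6 / math.sqrt(bw_hz) * 1e9

# 18 uVrms over 100 MHz (the Tektronix 7000-series figure quoted above)
d_100mhz = noise_density_nv_rthz(18.0, 100e6)   # ~1.8 nV/sqrt(Hz)
# the same 18 uVrms over only 20 MHz implies a much noisier front end
d_20mhz = noise_density_nv_rthz(18.0, 20e6)     # ~4.0 nV/sqrt(Hz)

print(f"{d_100mhz:.1f} nV/rtHz vs {d_20mhz:.1f} nV/rtHz")
```

An amplifier at 1.8 nV/sqrt(Hz) limited to 20 MHz would show only about 8 uVrms, so the wideband 1970s front end really is the quieter design per root hertz.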
The bandwidth is automatically switched at the highest sensitivity volts/div settings because high noise levels would lead to a largely meaningless display. (1) This feature is hardly a new thing; it was common in old oscilloscopes that supported x10 vertical magnification, where activating the vertical magnification deliberately engaged the bandwidth limit to control noise. (2) Another example of controlling noise is the Tektronix 7A13 differential comparator. It necessarily has high input noise, like 100 microvolts over 100 MHz, because of its bootstrapped differential input configuration, so to support even a 1 mV/div sensitivity, it has a 5 MHz bandwidth limit; even 20 MHz would have been too high for its noise level.
Some early DSOs had noise levels approaching the quantization noise of their 8-bit digitizer, which is a little weird when you first see it. It looks like the DSO is broken when there is just a straight line with an occasional peak-to-peak "bump" in it.
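For reference, the quantization noise floor of an ideal ADC is LSB/sqrt(12). A rough sketch with made-up but plausible settings (8 divisions of full scale at 1 mV/div; real DSOs map the ADC range to the graticule differently, so treat the numbers as illustrative):

```python
import math

def quantization_noise_vrms(full_scale_v, bits):
    """RMS quantization noise of an ideal ADC: one LSB / sqrt(12)."""
    lsb = full_scale_v / 2**bits
    return lsb / math.sqrt(12)

# hypothetical: 8-bit digitizer spanning 8 divisions at 1 mV/div
fs = 8 * 1e-3
lsb_uv = fs / 2**8 * 1e6
print(f"LSB = {lsb_uv:.1f} uV, "
      f"quantization noise = {quantization_noise_vrms(fs, 8) * 1e6:.1f} uVrms")
```

With a ~31 uV LSB, an analog front end with only a few microvolts of noise barely toggles the lowest bit, which is exactly the flat-line-with-occasional-bumps display described above.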
(1) This is also why you do not find vertical sensitivities greater than about 1 mV/div without bandwidth limiting or some type of noise reduction; the input noise is too high for it to make any sense. In the past, even 2 mV/div was considered questionable.
(2) Instruments like these did not even *have* a separate bandwidth control. If you wanted to limit the bandwidth, then you activated the x10 vertical magnification.