Another thing I wonder about: the RTM3000 and RTA4004 show low noise in the 1mV/div range compared to other scopes, but perhaps in the 1V/div range the picture changes dramatically.
All at 50 Ohm and 1 GHz bandwidth:
                 1mV/div     1V/div
RTA4004          0.11 mV     31.4 mV
HDO6104A         0.145 mV    4.9 mV
Tektronix MSO44  0.260 mV    13.0 mV
With an ideal noise-free frontend, the measured noise comes from the ADC (and subsequent signal processing) exclusively and will be strictly proportional to the V/div setting, no matter how the various input gains are implemented (PGA/Attenuator).
At 1V/div, for example, you will measure ten times as much noise as at 100mV/div.
Any real frontend will exhibit some base noise, hence there is a lower limit. In the case of the RTA4004 you should in theory get 31.4µV noise at 1mV/div (1/1000 of the 31.4mV measured at 1V/div), but the frontend noise dominates and you actually measure 110µV. I'd expect the noise to stay roughly constant up to about 3.5mV/div, where the scaled ADC noise (3.5 × 31.4µV ≈ 110µV) catches up with the frontend floor, and then to rise proportionally with the V/div setting.
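If it helps, here is a rough sketch of that model in Python. Adding the frontend floor and the V/div-proportional ADC noise in quadrature is my assumption (it treats the two sources as independent); the numbers are the RTA4004 figures from above:

```python
import math

# Rough model: V/div-proportional ADC noise plus a fixed frontend
# floor, summed in quadrature (assumes independent noise sources).
ADC_NOISE_PER_VDIV = 31.4e-3   # V rms of ADC noise per 1 V/div
FRONTEND_NOISE = 110e-6        # V rms, approx. floor seen at 1 mV/div

def total_noise(v_per_div):
    adc = ADC_NOISE_PER_VDIV * v_per_div   # scales with the V/div setting
    return math.hypot(adc, FRONTEND_NOISE)

for vdiv in (1e-3, 2e-3, 3.5e-3, 10e-3, 100e-3, 1.0):
    print(f"{vdiv * 1e3:7.1f} mV/div: {total_noise(vdiv) * 1e6:9.1f} uV rms")
```

Around 3.5mV/div the ADC term reaches the size of the floor, which is where the total starts to climb noticeably.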
What you see here is essentially the different quality of the individual ADCs:
LeCroy HDO6104A: 12 bit, and its 4.9mV noise at 1V/div is equivalent to about 2.5 LSB.
Tektronix MSO44: also 12 bit, but its 13.0mV noise is more like 6.5 LSB.
R&S RTA4004: 10 bit; 31.4mV is roughly 3.75 LSB, which would correspond to 15 LSB on a 12 bit ADC.
Now draw your conclusions. Assuming your tests are correct, the HDO6104A sets the benchmark and the MSO44 misses that by quite a margin, but is still better than the 10 bit RTA4004.
You can also look at it this way: if the HDO6104A were restricted to 10 bits, its noise would be only 0.625 LSB (1.62 LSB for the MSO44).
Just as a comparison, the 8 bit ADC in the Siglent SDS5104X exhibits 18.9mV noise at 1V/div and 1GHz bandwidth (1Mpts, 5GSa/s), which is just 0.6 LSB.
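For anyone who wants to redo the arithmetic, here is a short sketch. I'm assuming the ADC full scale spans 8 vertical divisions; that reproduces the HDO6104A, MSO44 and SDS5104X figures above, while the RTA4004 comes out closer to 4 LSB than 3.75, so that number may rest on a slightly different full-scale assumption:

```python
# Noise in LSB at 1 V/div, assuming the ADC full scale covers
# 8 vertical divisions (an assumption; the actual mapping of the
# ADC range to screen divisions differs between scopes).
SCOPES = [
    # name, ADC bits, rms noise at 1 V/div (volts)
    ("HDO6104A", 12, 4.9e-3),
    ("MSO44",    12, 13.0e-3),
    ("RTA4004",  10, 31.4e-3),
    ("SDS5104X",  8, 18.9e-3),
]
FULL_SCALE = 8 * 1.0            # 8 div at 1 V/div
LSB_12BIT = FULL_SCALE / 2**12  # common yardstick for comparison

for name, bits, noise in SCOPES:
    lsb = FULL_SCALE / 2**bits
    print(f"{name:9s} {noise / lsb:5.2f} LSB "
          f"({noise / LSB_12BIT:5.2f} LSB at 12 bit)")
```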
This is also where the ENOB discussion comes from. With a high advertised number of ADC bits you certainly get more data, but that alone does not say much about how useful the additional bits actually are.
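One way to put a number on that: if noise were the only impairment, an ideal N-bit quantizer exhibits 1/√12 LSB of rms noise, and every doubling of measured noise beyond that costs one effective bit. A rough sketch of this noise-limited ENOB (the proper SINAD-based ENOB also counts distortion, which this ignores):

```python
import math

def noise_limited_enob(bits, noise_lsb_rms):
    # Ideal N-bit quantization noise is 1/sqrt(12) LSB rms; any excess
    # measured noise reduces the effective resolution accordingly.
    return bits - math.log2(noise_lsb_rms * math.sqrt(12))

print(noise_limited_enob(12, 2.5))    # HDO6104A: ~8.9 effective bits
print(noise_limited_enob(12, 6.5))    # MSO44:    ~7.5
print(noise_limited_enob(10, 3.75))   # RTA4004:  ~6.3
print(noise_limited_enob( 8, 0.6))    # SDS5104X: ~6.9
```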