The noise is specified as 34 microvolts RMS for the 1 MΩ input over 100 MHz. My typical old analog oscilloscopes, with 200 MHz inputs, show about 28 microvolts RMS over the same 100 MHz.
So I will give them credit for achieving low noise for a 500 MHz or 1 GHz input, because I have nothing to compare it with, but they did not achieve low noise in comparison to obsolete, slower oscilloscopes. This may be because a design that supports 500 MHz or 1 GHz is inherently noisier at lower frequencies.
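As a rough sanity check, the two figures can be normalized to an equivalent noise density. This is a sketch only: it assumes a flat (white) noise spectrum over the measurement band, which real front ends only approximate.

```python
import math

def noise_density_nv_per_rt_hz(v_rms_uv: float, bw_hz: float) -> float:
    """Convert RMS noise in µV over bw_hz to nV/√Hz, assuming white noise."""
    return v_rms_uv * 1e3 / math.sqrt(bw_hz)

hd3    = noise_density_nv_per_rt_hz(34, 100e6)  # HD3 spec, 1 MΩ over 100 MHz
analog = noise_density_nv_per_rt_hz(28, 100e6)  # old analog scope, same span

print(f"HD3: {hd3:.1f} nV/sqrt(Hz), analog: {analog:.1f} nV/sqrt(Hz)")
# → HD3: 3.4 nV/sqrt(Hz), analog: 2.8 nV/sqrt(Hz)
```

On that (idealized) basis the old analog front end is about 20% quieter per √Hz than the HD3 spec.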
I'm preparing to make some measurements of noise for comparison and will publish when done.
But preliminarily, compared to the SDS3104xHD at 50 Ω, the HD3 has lower noise only up to 20 mV/div. At 50 mV/div they are similar, and from there up the Siglent has less noise.
At 1 MΩ the situation is not that clear-cut either: there, KS published data only at 500 MHz BW.
My SDS3104xHD has 1 GHz BW, and if I set its digital filter (ERES, the same type of thing the HD3 uses) to 500 MHz, I get much better results.
In fact, at the full 1 GHz BW I get better noise levels from 50 mV/div up, despite having 2x the bandwidth.
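To put a number on why "better noise at 2x the bandwidth" matters: if the noise were white, RMS noise would scale with the square root of bandwidth, so doubling BW from 500 MHz to 1 GHz should raise the RMS figure by about √2. Matching or beating the 500 MHz figure at 1 GHz therefore implies a meaningfully lower underlying noise density. A minimal sketch of that expectation:

```python
import math

bw_narrow = 500e6  # 500 MHz measurement bandwidth
bw_wide   = 1e9    # 1 GHz measurement bandwidth

# Expected RMS noise ratio for white noise when bandwidth doubles
scale = math.sqrt(bw_wide / bw_narrow)

print(f"expected RMS increase for 2x BW: {scale:.2f}x")
# → expected RMS increase for 2x BW: 1.41x
```

So a 1 GHz measurement that still beats a 500 MHz one is carrying roughly a 41% handicap and winning anyway, under the white-noise assumption.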
I also briefly tested an SDS200xHD, a 500 MHz scope, at 1 MΩ. From 500 µV/div to 5 mV/div I get 75 µV RMS (better), at 10 mV/div I get 125 µV RMS (slightly worse), at 50 mV/div slightly better, at 100 mV/div etc...
So basically, very comparable results...
Meaning, at 1 MΩ the 500 MHz version is no better than the SDS200xHD, noise-wise.
The advantage is only in the 50 Ω path.