I believe there are ten 40MHz ADCs clocked at 100MHz, giving 1GS/s interleaved and capping ADC conversions at a 500MHz Nyquist limit. The limiting factors would be the analog front end (probes, attenuator, and amplifier), clock jitter, and trigger noise. The analog front end is likely the largest contributor, though trigger noise would be close, especially when averaging periodic signals (like a sampling 'scope). Why do I mention this? The program may spit out values at 150MHz, but there is likely little real gain. In fact, the DSP algorithms (decimation and resampling, or cardinal-spline interpolation) may be compromised by the lack of prior anti-aliasing. Be careful!
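
To make the anti-aliasing point concrete, here's a minimal NumPy/SciPy sketch (not the scope's actual DSP; the sample rate, decimation factor, and tone frequencies are arbitrary picks for the demo). It compares naive sample-dropping against scipy.signal.decimate, which low-pass filters before downsampling:

```python
# Illustrative only: shows how out-of-band energy folds back into the
# passband when you decimate without a prior anti-aliasing filter.
import numpy as np
from scipy import signal

fs = 1e9            # pretend 1 GS/s capture (arbitrary for the demo)
decim = 8           # decimate to 125 MS/s -> new Nyquist = 62.5 MHz
t = np.arange(4096) / fs

# In-band tone (10 MHz) plus an out-of-band tone (90 MHz, above the
# post-decimation Nyquist of 62.5 MHz).
x = np.sin(2 * np.pi * 10e6 * t) + 0.5 * np.sin(2 * np.pi * 90e6 * t)

# Naive decimation: just drop samples. The 90 MHz tone aliases down to
# |125 - 90| = 35 MHz and corrupts the decimated record.
naive = x[::decim]

# Proper decimation: scipy.signal.decimate applies an anti-aliasing
# low-pass filter before downsampling.
filtered = signal.decimate(x, decim, ftype='fir')

def tone_level(y, f, fs_y):
    """Rough amplitude estimate at frequency f via an FFT bin lookup."""
    spec = np.abs(np.fft.rfft(y * np.hanning(len(y)))) / len(y)
    freqs = np.fft.rfftfreq(len(y), 1 / fs_y)
    return spec[np.argmin(np.abs(freqs - f))]

fs_d = fs / decim
print("alias at 35 MHz, naive   :", tone_level(naive, 35e6, fs_d))
print("alias at 35 MHz, filtered:", tone_level(filtered, 35e6, fs_d))
```

Run it and the naive path shows a clear spurious 35 MHz component that the filtered path suppresses, which is exactly the kind of artifact that can creep into resampled or interpolated traces.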
Would love to see someone characterize the [isolated] inputs with a 1GHz+ VNA.