Quote: "The frequencies are too low to have serious problems with skew (A 1ns delay needs about 30cm of trace length)."
OK, perhaps the question is solved, or at least we have a guess at what they do. (While I was writing this, @wraper's last message arrived and made my comment look too optimistic.)
IF they do "ping-pong" interleaving with two separate ADCs.
Do you really think a 1 ns delay on a real board is 30 cm of trace? On FR4 the signal travels at roughly half the speed of light, so 1 ns is closer to 15 cm... and perhaps I'm also Santa Claus.
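A quick back-of-envelope check of that number (a sketch; the effective dielectric constant of about 4 for FR4 is a typical assumption about the board material, not a measured value):

```python
# Trace length corresponding to a 1 ns propagation delay.
# Assumption: ordinary FR4 with effective relative permittivity ~4,
# so the signal travels at roughly c/2 on the board.
C = 3.0e8                  # speed of light in vacuum, m/s
er_eff = 4.0               # assumed effective dielectric constant of FR4

v_board = C / er_eff**0.5  # ~1.5e8 m/s on the board
delay = 1e-9               # 1 ns

print(f"1 ns in vacuum: {C * delay * 100:.0f} cm")        # ~30 cm
print(f"1 ns on FR4:    {v_board * delay * 100:.0f} cm")  # ~15 cm
```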
Time skew between two ping-pong interleaved ADCs is, how do I say this nicely, still critical.
1 ns is a long time! It is a whole 360 degrees, one full sample period, in a 1 GSa/s system.
Take an extremely simplified thought experiment, so that everyone can follow the calculation in their head without more math. Let's say the 1 GSa/s sample rate is made with two 500 MSa/s ping-pong interleaved ADCs, and that the system is otherwise ideal: the input signal reaches both ADCs identically, both are clocked ideally, their linearity, amplitude response, etc. are perfectly matched, and their clocks have an ideal 180 degree phase shift.
Now, a 200 MHz scope has a rise time of roughly 2 ns. To simplify, assume 256 steps full scale (8 bits) and an input edge that sweeps linearly through 200 ADC steps in those 2 ns. That is 100 steps in 1 ns, 10 steps in 100 ps, and so on... and this timing error translates directly into a level error at the ADC sample point.
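To put numbers on it, here is the same head-calculation as a minimal sketch (all values are the simplified ones from above, not measured from any real scope):

```python
# Amplitude error caused by sampling-time skew on a fast edge.
# Simplified model from above: 8-bit ADC (256 codes full scale),
# an edge that sweeps linearly through 200 codes in 2 ns.
rise_time = 2e-9               # s, rough rise time of a 200 MHz scope
edge_codes = 200               # ADC codes traversed during the edge
slew = edge_codes / rise_time  # 1e11 codes/s = 100 codes per ns

for skew in (1e-9, 100e-12, 10e-12, 1e-12):
    err = slew * skew
    print(f"skew {skew*1e12:7.1f} ps -> sample level error {err:6.2f} ADC codes")
```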
Do you still think time skew does not matter? If there is some kind of calibration for this, how does it adjust these picoseconds? A programmable delay line? A static skew is the simple case. But clocking two separate ADC chips without any meaningful jitter between them is also a bit challenging when it has to be done cheaply.
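I don't know what this particular scope actually does, but one common way to handle a static, calibrated skew without any physical delay line is to correct it purely digitally, with a fractional-delay FIR filter on one sub-ADC's sample stream. A toy sketch of the idea (the tone frequency, 20 ps skew, and filter length are all invented numbers for illustration):

```python
import numpy as np

# Toy demo: timing-skew error of a 2-way ping-pong ADC, then a purely
# digital correction with a windowed-sinc fractional-delay FIR.
fs   = 1e9        # interleaved sample rate, 1 GSa/s
f_in = 97e6       # test tone
skew = 20e-12     # ADC B samples 20 ps late (the error we calibrate out)
N    = 4096

n = np.arange(N)
t = n / fs
# Even samples come from ADC A (on time), odd samples from ADC B (late).
x   = np.sin(2*np.pi*f_in*(t + (n % 2)*skew))  # what the ADC pair records
ref = np.sin(2*np.pi*f_in*t)                   # ideal, skew-free samples

# --- digital skew correction on ADC B's stream (runs at fs/2) ---------
b = x[1::2]                    # ADC B's sub-stream
d = skew * fs / 2              # skew as a fraction of a sub-ADC period
M = 16
k = np.arange(-M, M + 1)
h = np.sinc(k - d) * np.hamming(2*M + 1)  # fractional-delay FIR, delay d
h /= h.sum()                              # unity DC gain
b_corr = np.convolve(b, h)[M:M + len(b)]  # b_corr[m] ~ b[m - d]

y = x.copy()
y[1::2] = b_corr               # re-interleave corrected ADC B samples

# Compare RMS error before/after (skip filter edge transients).
s = slice(4*M, N - 4*M)
for name, sig in (("uncorrected", x), ("corrected  ", y)):
    e = np.sqrt(np.mean((sig[s] - ref[s])**2))
    print(f"{name}: RMS error = {e:.2e} of full scale")
```

This only fixes a static, already-measured skew; the random jitter between two separately clocked ADC chips cannot be filtered away like this, which is exactly why the cheap-clocking part stays hard.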