Hi Janne,
Thanks for the reply and the description of your work. Repair work can be far trickier than simply finding what is faulty. Many old test instruments are still quite useful, so much of the repair work becomes restoration, and replacing obsolete output transistors requires careful confirmation that a pulse generator's rise and fall times are still OK. Many faults are intermittent, and the ability to get some idea of a problem, set a complex trigger and leave it armed for a day or so, maybe using sequence mode, is a great way to identify tricky faults. Even simple repair work is quite demanding when dealing with digital video.
On a more philosophical level, I feel that "digital scope" is an incorrect name. In the rush to sell ever more test equipment, the manufacturers have built high speed digitisers with complex triggering and a display and called them a "digital oscilloscope". I don't think that is a fair description of what they do; they are both much more and much less than an analogue scope. Both analogue and digital scopes have their place in electronic work. The classic ham radio application of the trapezoid display, RF on one axis and modulating audio on the other, will always be a problem for the digital instrument. I also know that many makers have struggled to make DSOs behave like analogue scopes to show things like eye diagrams of fast serial data. I don't think the analogue scope can be replaced by digital for some applications. In terms of results for your dollar, the analogue scope is a valuable instrument for many electronic jobs.
I wish makers would give up on trying to make DSOs look like analogue scopes and just call them something else. Unfortunately, the window of opportunity for that has passed and we are stuck with it.
You said, "Even with the relatively old HP 54645A we have at work (which is far below the "$30,000+" price you mention at ebay, although quite more expensive than Rigol), it is very difficult to get any aliasing (certainly not anything that would confuse the user), although sample rate is just 200 MS/s and specified bandwidth of 100 MHz."
I'm not surprised; the spec says it has 1M of acquisition memory per channel! Long memory sustains far higher digitiser speeds over the middle range of timebase settings. If a DSO has short memory then, starting from the fastest timebase settings and working down, you have to throttle back the digitiser to stay within the limits of the memory you have available. 1M is very long memory, and it will be the major factor in why you don't see aliasing as often.
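To put rough numbers on the memory argument, here is a little Python sketch of the arithmetic. The 1 GS/s maximum rate, the 10-division screen and the memory depths are figures I've assumed for illustration, not the 54645A's actual specs:

# How acquisition memory depth limits digitiser speed at a given timebase.
# All numbers below are assumed for illustration only.
MAX_RATE = 1e9        # ADC's maximum rate, samples per second
DIVISIONS = 10        # horizontal divisions on screen

def effective_sample_rate(memory_depth, time_per_div):
    """Fastest rate that still fits one full screen into memory."""
    capture_window = DIVISIONS * time_per_div      # seconds shown on screen
    return min(MAX_RATE, memory_depth / capture_window)

for depth in (10e3, 1e6):                          # short memory vs long memory
    for tdiv in (1e-6, 1e-3, 10e-3):               # 1 us/div, 1 ms/div, 10 ms/div
        rate = effective_sample_rate(depth, tdiv)
        print(f"{depth:9.0f} samples at {tdiv*1e3:7.3f} ms/div -> {rate/1e6:8.2f} MS/s")

At 1 ms/div the 10k memory scope has to drop to 1 MS/s, while the 1M memory scope can hold 100 MS/s - exactly the middle-range advantage I mean.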
You also said, "My working theory is that Agilent runs their ADC at full sample rate all the time, and then they just digitally decimate and filter the data to get the suitable "display sample rate". That makes it possible to avoid aliasing and reproduce the "analog look" even with long time/div-settings. So there is obviously a way to suppress the aliasing so that it does not confuse the user. I don't know why other scope manufacturers don't do the same thing."
I'm sorry to be disagreeable, but I don't believe that technique contributes anything towards avoiding aliasing. What is the difference between a single shot sampled at 1 GS/s and decimated 1000 to 1, so the effective sample rate is 1 MS/s, and a single shot taken at a digitiser speed of 1 MS/s? The sample gate usually has the same fixed aperture either way, so that is no different. There is no difference in the contents of the acquisition memory, so no edge against aliasing exists. I think this is another spurious argument put out by salesmen keen to empty your wallet.
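To make that concrete, here is a toy Python comparison of the plain, unfiltered decimation case I'm describing. The 10.7 MHz tone and the rates are arbitrary numbers I've picked, not anything measured or claimed by Agilent:

import math

F_SIGNAL = 10.7e6      # tone well above the 500 kHz Nyquist of a 1 MS/s record
FAST_RATE = 1e9        # digitiser running flat out
SLOW_RATE = 1e6        # effective rate after 1000:1 decimation
N = 50                 # number of stored samples to compare

def sample(rate, n):
    """Ideal sampler: the stored value depends only on the sample instant."""
    return [math.sin(2 * math.pi * F_SIGNAL * (i / rate)) for i in range(n)]

fast = sample(FAST_RATE, N * 1000)
decimated = fast[::1000]        # keep every 1000th point, throw the rest away
direct = sample(SLOW_RATE, N)   # single shot taken straight at 1 MS/s

print(max(abs(a - b) for a, b in zip(decimated, direct)))   # prints 0.0: identical records

The two acquisition memories end up holding the same numbers, and the same alias of the 10.7 MHz tone. Filtering before throwing samples away would change the picture, but that needs the kind of full-rate processing I question below.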
I don't wish to create discord; I just can't see how this can work. If you have a DSP chip that can keep a running record and min/max bin the data every nanosecond, then it might be a workable technique. I think this is another problem like the screen aliasing of low cost scopes that decimate the acquisition memory to squeeze it onto a 300 horizontal pixel display - you can't see a glitch in the middle of the memory, so how is that any different to a 300 word acquisition memory? I know you can take that single shot and zoom and scroll for a long time, but in reality most folk look at the screen, don't see any reason for further investigation, and move on to the next job. It might as well be a 300 word memory... Without actually processing the data and compressing it at the maximum sample rate, it is of little practical use. I have yet to see any DSP processor that can do this. The other difficulty is that makers are unwilling to describe how they do whatever magic is in their wondrous box. If they don't describe precisely how it works, how can you know it is good? I'm not going to trust anything said by someone who is paid on commission!
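Here is a small Python sketch of the min/max binning I have in mind, with made-up record sizes; it isn't how any particular scope's firmware works, just the idea:

SCREEN_PIXELS = 300
record = [0.0] * 300_000            # a quiet 300k-sample acquisition
record[150_123] = 5.0               # one-sample glitch buried in the middle

per_pixel = len(record) // SCREEN_PIXELS      # 1000 samples behind each pixel

# Naive decimation: keep one sample per pixel; the glitch almost always vanishes.
decimated = record[::per_pixel][:SCREEN_PIXELS]
print("glitch visible after naive decimation:", max(decimated) == 5.0)               # False

# Min/max binning: keep each pixel column's extremes; the glitch has to show up.
binned = [(min(record[i:i + per_pixel]), max(record[i:i + per_pixel]))
          for i in range(0, per_pixel * SCREEN_PIXELS, per_pixel)]
print("glitch visible after min/max binning:", max(hi for lo, hi in binned) == 5.0)   # True

The catch, as I said, is doing that min/max pass at the full digitiser rate in real time, which is the part I have yet to see a DSP chip keep up with.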
Even moderate price scopes like the Tek DPO2000 series don't use any such display algorithm. I was shocked to learn this recently. From a low cost scope you expect compromise, but from a Tek worth AU$3000 I would expect better. It makes no sense, so there must be some missing information - I hope someone can explain why LeCroy managed this over 15 years ago and modern Tektronix can't. *shakes head*
My best wishes, Colin
Melbourne, Australia