But to borrow and scale a phrase, a 100 MHz oscilloscope cannot track a 2.5 nanosecond edge but it should be able to measure a delay of 1.0 nanoseconds between two such edges.
Perhaps it 'should', but if the spec says the interchannel delays can be as much as 1-2 ns, using it for such measurements isn't something I'd care to rely on.
I took this specification to mean that when using multiple channels, the inputs are not simultaneously sampled. Do we know what ADC the DS1054Z uses?
And then I will look at the early LeCroy DSOs which were advertised as having digital triggers and find that they had timing resolution significantly higher than their real time sample rate would suggest.
That's true, but so what? I have a 9300-series LeCroy, and two 9400-series. And you are correct about their timing resolution/capabilities. But they all had ETS (well, RIS), so they got that for "free", because they had a clock (or facsimile thereof) that ran 40x-50x faster. I see interpolation capabilities in the ps range. Back about 50 years ago, when I was using LeCroy scopes in the Physics labs at the Uni, picosecond events were extremely important. But the current "affordable" scopes we're talking about were never intended for that purpose.
I used the LeCroy since it was already mentioned as a specific example where timing resolution is not limited by the ADC sample clock and digital triggering is used in place of ETS. Wasn't LeCroy the first to implement digital triggering? Maybe they just advertised it as such first. I remember their advertisements saying how much superior it is to older analog ETS implementations.
They certainly are not making picosecond measurements with 100 MHz bandwidths, but what I was trying to say earlier is that at these bandwidths, 100 picosecond resolution, if not accuracy, is reasonable either by using ETS or reconstruction.
So what was the facsimile of the clock which allowed high resolution delay measurements? RIS as they describe it sure sounds like what I described where transition midpoint timing (*) is derived after reconstruction.
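To make concrete what I mean by transition midpoint timing after reconstruction, here is a rough Python sketch (my own illustration, not anything from a LeCroy manual): upsample the record by band-limited (sinc) interpolation, then interpolate linearly to the 50% amplitude crossing. The waveform, sample rate, and upsample factor are all made-up numbers.

```python
import numpy as np

def midpoint_time(samples, fs, upsample=16):
    """Estimate the 50%-amplitude crossing time of a single rising edge.

    Reconstructs the record by FFT zero-padding (band-limited sinc
    interpolation), then linearly interpolates between the two fine
    points that straddle the amplitude midpoint.
    """
    n = len(samples)
    spec = np.fft.rfft(samples)
    padded = np.zeros(n * upsample // 2 + 1, dtype=complex)
    padded[:len(spec)] = spec
    fine = np.fft.irfft(padded, n * upsample) * upsample
    mid = 0.5 * (fine.min() + fine.max())
    i = int(np.argmax(fine >= mid))   # first fine point at/above the midpoint
    frac = (mid - fine[i - 1]) / (fine[i] - fine[i - 1])
    return (i - 1 + frac) / (fs * upsample)

# A smooth edge with roughly 2.2 ns rise time, sampled at a hypothetical
# 1 GS/s; the true midpoint falls exactly on sample 32, i.e. at t = 32 ns:
fs = 1e9
t = (np.arange(64) - 32) / fs
edge = 0.5 * (1 + np.tanh(t / 1e-9))
print(midpoint_time(edge, fs))   # resolves the crossing far below the 1 ns sample period
```

The point is that the crossing time comes out with resolution limited by noise and reconstruction error, not by the ADC sample clock.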
I haven't researched the mechanism they used to implement it, but the 9400 (surprisingly the older generation of the two) has an absolute time-base accuracy spec of +/-10ps, and can do relative interpolation as you're describing, down to 5ps. My 9300 may be better yet, but the manual is still packed away.
You would want low jitter in the timebase to prevent aliasing, or at least to not increase the distortion introduced by the digitizer. On oscilloscopes which use ETS, the timebase jitter needs to be comparable to or better than the ETS resolution, no matter what the ADC sample rate and sampling error are, and the same condition applies if digital triggering is used.
This leads to seemingly absurd implementations where a 20 MS/s ADC is paired with ETS with 500 picosecond resolution to yield a 2 GS/s equivalent time sample rate. No 20 MS/s ADC is likely to support that, however, so they include a low jitter sample-and-hold before the ADC, making the ADC clock jitter irrelevant. Something very similar if not identical is done in these integrated ADCs, so their sampling jitter depends only on their sample-and-hold.
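Just to spell out the arithmetic behind that pairing:

```python
# The equivalent-time sample rate is set by the ETS fine-delay resolution,
# not by the real-time ADC clock (example numbers from the text above):
ets_resolution = 500e-12            # 500 ps fine-delay step
adc_rate = 20e6                     # 20 MS/s real-time ADC
equiv_rate = 1 / ets_resolution     # 2 GS/s equivalent-time sample rate
print(equiv_rate / adc_rate)        # 100 interleaved delay steps per ADC period
```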
The ADCs being used by Rigol seem to have about 5 ps of aperture jitter, but the big unknown is their clock source, which I would expect to be more like 50 ps, though it could be worse. 50 ps or worse is typical for an FPGA-derived clock from a clean source, and it would be worse yet if clock multiplication were used, which I doubt they did. Photos of some other Rigol DSOs show that the digitizer clock is not derived from the FPGA, but we have no idea how good the integrated clock they used is except by measurement.
That then comes out to 100 picoseconds at 5 ns/div. Coincidentally, the delay calibration is *specified* in the user manual to be 100 picoseconds at 5 ns/div.
It's not a coincidence at all. But it would be an easy trap to fall into (as I suspect you are) to then assume this implies something about the timebase capabilities of the hardware. When instead it simply reflects a display-mapping capability.
I would actually expect it to be worse when clock jitter is taken into account but not by a whole lot so 100 picoseconds would be at best achievable after averaging.
That is also only insignificantly worse than what the oldest 100 MHz ETS DSOs I know of can do.
Probably true, and reflective of the fact that these are not ETS DSOs. Which is kind of what Marmad and I have been trying to tell you.
And what I have been trying to say is that the difference between an ETS measurement and a triggered measurement made after reconstruction, even linear reconstruction in some cases, is a distinction without a difference as far as timing accuracy goes, except insofar as aliasing has occurred.
If a pure sine source were used as a test signal and no aliasing occurred, then they would produce identical results. But aliasing degrades the trigger accuracy when digital triggering is used, and this happens whether a pure sine source is used or not, because significant aliasing occurs due to distortion in the DSO's analog signal chain and digitizer.
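To illustrate the folding I am talking about (the sample rate and harmonic here are made-up illustration numbers, not Rigol's actual figures): any distortion product the front end generates above Nyquist lands back in-band as an alias, and that energy corrupts the reconstructed edge that the digital trigger times against.

```python
# A hypothetical 1 GS/s digitizer with a 400 MHz fundamental: the
# front end's third harmonic sits above Nyquist and folds back in-band.
fs = 1e9            # sample rate
f0 = 400e6          # fundamental, below Nyquist (500 MHz)
h3 = 3 * f0         # 1.2 GHz third harmonic from front-end distortion

def alias(f, fs):
    """Frequency that a component at f appears at after sampling at fs."""
    f = f % fs
    return fs - f if f > fs / 2 else f

print(alias(h3, fs) / 1e6)   # the 1.2 GHz harmonic shows up at 200 MHz
```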
The test using the DSA815 tracking generator will not reveal the above because the source itself has more distortion than the DSO analog front end and digitizer produce.
Now maybe the DS1104Z cannot do the above with a single shot acquisition, but it sure should be able to because it is not difficult and the hardware is capable of supporting it.
"should"? "it is not difficult"? "the hardware is capable"? That's a lot of assertions for one sentence. Sadly there are many things that are not difficult, yet many DSO manufacturers leave them out. And while there are many technique that could be used to improve the temporal resolution of a scope, that doesn't mean that Rigol incorporated any of them... perhaps due to cost, or difficulty trying to merge them with intensity grading, which they felt was more important/valuable.
I do not disagree, but let's say I wanted to replace my good 100 MHz analog oscilloscope, which has trigger jitter in the 100 ps range, with a DS1104Z. Does it support the same timing resolution? This is not just an idle question; I make this sort of measurement all the time. As far as I can tell, the DS1104Z hardware should support it.
Take the best case scenario with a DS1104Z DSO. The signals are pure sine waves and averaging is used. What is the minimum change in delay that can be measured? What if square waves or fast edges are used instead of sine waves? The Tektronix application note that Dave linked says under these conditions, ETS and triggering after sin(x)/x interpolation produce virtually identical results and they back it up with a bunch of calculated graphs.
Incidentally, this application note also discusses what could be the difference between a Rigol DS1054Z upgraded to 100 MHz and a DS1104Z. For years (decades?) now, high end DSOs have implemented frequency and phase compensation after digitization, with the filter coefficients determined by calibration at the time of manufacture; if those calibration coefficients are lost, by say a backup battery going dead, the DSO becomes a doorstop unless you can get the manufacturer to do the calibration again. The filter coefficients are even adjusted for different input attenuator settings. As digital integration increases, it becomes less expensive in materials and time to do this than to adjust the analog signal path, and I have not noticed any analog adjustments in photos of the DS1000Z or DS2000A series analog sections.
If the DS1054Z is only calibrated this way for up to 50 MHz operation, then an upgraded DS1054Z should show transient response abnormalities compared to a true DS1104Z.
There is a guy on eBay who rebuilds 150 MHz Tektronix 2445 oscilloscopes by removing the hardware bandwidth filter, setting a jumper to make the firmware think it is a 2465, and changing the faceplate to that of a 300 MHz 2465 oscilloscope. These faux 2465s do indeed have 300 MHz bandwidth or higher, but because the original 2445 lacks the rather ingenious frequency and phase compensation network included in a true 2465, the transient response is severely compromised. Nobody noticed this on these eBay specials for a long time because they only checked the bandwidth.
The online reference I like to give for various TDC designs is currently down due to hosting issues, but the relevant part of the description for a transition midpoint timing TDC is "A resolution of around 10ps or so is possible when using a 16 bit pipeline ADC clocked at 80MHz or more." As I recall, these were popular in particle collision experiments because of their adequate resolution and accuracy and their very high measurement rate.
Yes, it was in the nuclear physics lab that I was doing particle collision experiments with the LeCroys, back in the olden days. Not really the 50 years I mentioned, but close enough that my recollections of details are extremely vague. But I've spent a lot of time working with my own (antique) LeCroys, so I know them pretty well. And I do rather like the 4000x4000 vector graphics displays (though the burn-in not so much).
I was not surprised to find out that transition midpoint timing or centroid timing TDCs using state of the art ADCs were developed or at least used in particle experiments. What I find neat now is that cheap DSOs use the same principle. The situation reminds me of years ago when it was predicted that 3 levels of cache memory used in workstations would migrate down to PCs and now they are not far from being in handheld devices as well.
We need to print up some stickers saying, "Nuclear Technology Inside!"
I have only played with LeCroy oscilloscopes briefly and never long enough for even a poor evaluation. The 4000x4000 vector graphics display sounds like something I would expect them to do and last time I checked, they still made 12 bit high bandwidth real time DSOs.
The highest display resolution DSO I have is a 7854 (it is sort of a DSO if you squint hard), which renders a 1024x1024 display and, oddly enough, uses 102.4 points per division instead of 100 like a sane DSO would. Since its digitizer is 10 bits, this actually makes sense, and it takes advantage of it! When I first used it a couple of years ago, I was very surprised (though I should not have been) that I could literally see signal characteristics, even on a non-index-graded display, which were invisible on other 8 bit DSOs no matter how I used them.
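The 102.4 figure falls straight out of the digitizer and display resolutions:

```python
# Tektronix 7854: a 10-bit digitizer gives 1024 codes, which map 1:1
# onto a 1024-point display axis spanning the usual 10 divisions.
codes = 2 ** 10             # 1024 ADC codes / display points
divisions = 10
print(codes / divisions)    # 102.4 points per division
```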