Author Topic: Some Important Points to Remember when Evaluating Waveform Update Rates  (Read 37945 times)


Online nctnico

  • Super Contributor
  • ***
  • Posts: 27565
  • Country: nl
    • NCT Developments
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #50 on: March 23, 2013, 02:03:19 am »
I agree. The waveform update rate is only of use in the persistent mode. Now everyone raise their hand if they used persistent mode for a real measurement in the last year.

Hmm... either you're not understanding waveform update rates - or I don't understand what you're saying by this. If you use a DSO - and you use it at any horizontal timebase setting < 1ms - then you're 'dealing' with your scope's waveform update rate whether you realize it or not. Maybe instead of speaking in terms of waveform update rates we should speak in terms of blind time. Digital scopes - by their very nature - become more and more blind to the waveform data as the timebase gets smaller. Even the Agilent 3000 X-series, with its 1M wfrm/s rate, is blind to 98% of the possible waveform at the 2ns timebase setting.
I understand very well. Unless you use the persistent mode you are only interested in what the scope triggers on and in that case it will show a trace from one acquisition (*). Actually you don't want many acquisition results added into one trace; the noise will be excessive. Another limit is the refresh rate of the screen and the maximum rate your eyes can keep up with.

(*) Some Tektronix DSOs with a monochrome screen like the TDS200 and some TDS500 will show the previous acquisition in grey which is a handy feature.
« Last Edit: March 23, 2013, 02:11:01 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #51 on: March 23, 2013, 02:09:08 am »
can you explain why scope #2 has less probability of seeing the glitches than scope #1?

Sure. As I mentioned, the calculation involves three variables - although two of them can just be thought of as a single one: the active acquisition time, which is the timebase setting * the number of divisions. The other variable is the frequency - the waveform update rate - the number of times per second that the DSO captures and displays that window of time (the active acquisition time). If you double either of those two variables - the window size that you're capturing, or the frequency at which you're capturing it - it stands to reason that you double the amount of time you're capturing in a second, thus doubling the probability of seeing a random event.
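To put rough numbers on it, here's a quick Python sketch (the 20ns/10-div/54k figures are just the Agilent example used elsewhere in this thread; the function itself is mine, purely for illustration):

Code: [Select]
# Seconds of signal actually acquired per second of real time:
# active window (timebase * divisions) times the update rate.
def covered_time_per_second(timebase_s_per_div, divisions, updates_per_s):
    window = timebase_s_per_div * divisions   # active acquisition time
    return window * updates_per_s

base          = covered_time_per_second(20e-9, 10, 54_000)
double_rate   = covered_time_per_second(20e-9, 10, 108_000)
double_window = covered_time_per_second(40e-9, 10, 54_000)

print(base)                  # -> 0.0108 s seen per second
print(double_rate / base)    # -> 2.0
print(double_window / base)  # -> 2.0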
« Last Edit: March 23, 2013, 03:36:19 am by marmad »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #52 on: March 23, 2013, 02:20:55 am »
I understand very well. Unless you use the persistent mode you are only interested in what the scope triggers on and in that case it will show a trace from one acquisition (*). Actually you don't want many acquisition results added into one trace; the noise will be excessive. Another limit is the refresh rate of the screen and the maximum rate your eyes can keep up with.

As mentioned by someone else responding to your post, fast update scopes like the Agilent X series, Rigol UltraVision, etc. have persistence on virtually all of the time - using a gradient of the display color to indicate intensity - almost exactly like phosphor in a CRT. These DPO/VPOs are not limited by refresh rates of the screen because they basically 'stack' the collected waveforms in a Z-buffer before display. The result is that you can see tiny flickering glitches, jitter error, etc. that would be invisible on slow waveform update scopes because of the blind time. Having less blind time does not mean you have more noise - it means you see more detail.
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #53 on: March 23, 2013, 02:26:32 am »
3). Again, timebase is an artifact. For a 10-div screen, I can place my own mask that has 20 divs and relabel all the timebase numbers with half of the original values. That way I just turned your scope#2 into scope#1 (assuming every other aspect is the same). Right?

Huh? No - a DSO has an active acquisition time - the timebase setting * the number of divisions. The only way to change this time is to change the timebase setting - you can't alter the number of divisions it collects. And changing this timebase setting, on most DSOs, also changes the frequency at which it captures the waveform. We're talking about REAL time here - not an artifact. How can you magically change 1us to 1ns?
« Last Edit: March 23, 2013, 02:29:06 am by marmad »
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #54 on: March 23, 2013, 02:35:53 am »
Ok, we have a reference for the 50k/s acquisitions on the Rigol topping out at 20ns per division:

Setting the memory to 14k was needed to achieve close to the banner acquisition rate according to that reviewer; what was the sample rate achieved?
Yes, that's a difference of 23% in the time taken to capture a glitch that isn't correlated to the trigger. Not a 23% difference in the dead time as you keep claiming.
What do you think is the difference in dead time?
It's 23%. The guy doesn't understand basic mathematical extrapolation.
We return to the original post of mine stating the dead time as measured in units of time:
dead_time = 1/update_rate - record_time

You provided the quote with the dead time expressed as a percentage of realtime:
%dead_time = 100 x (1 - (update_rate * record_time))

These are consistent with the terminology used in both the Rohde & Schwarz and Agilent documents on the topic.

So let's compare these for the two scopes at your magic number of 20ns:
1s/54,000 - 20ns * 10 = 18319 ns
1s/50,000 - 20ns * 14 = 19720 ns
the dead times are within 10%

How about calculating the dead time as a percentage of realtime:
100 x (1 - (54,000 * 20ns * 10)) = 98.92%
100 x (1 - (50,000 * 20ns * 14)) = 98.6%
expressed as a percentage of realtime they differ by 0.32 percentage points
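For anyone who wants to check the arithmetic, here is a minimal Python version of both formulas, using only the figures already stated above:

Code: [Select]
# Dead time per acquisition cycle, in seconds:
def dead_time(update_rate, record_time):
    return 1.0 / update_rate - record_time

# Dead time as a percentage of real time:
def dead_time_pct(update_rate, record_time):
    return 100.0 * (1.0 - update_rate * record_time)

for name, rate, record in [("Agilent", 54_000, 20e-9 * 10),
                           ("Rigol",   50_000, 20e-9 * 14)]:
    print(name,
          round(dead_time(rate, record) * 1e9, 1), "ns,",
          round(dead_time_pct(rate, record), 2), "%")
# -> Agilent 18318.5 ns, 98.92 %
# -> Rigol   19720.0 ns, 98.6 %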

Quoting differences in test time is completely different. Welcome to engineering, terminology is both terse and precise.
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #55 on: March 23, 2013, 02:59:52 am »
What does extra divs on my screen, or even extra sample memory, buy me? It might give me some more horizontal resolution, but this is unrelated to update rate
I think the point is that you're not limited by update rate, you're limited by blind time. Converting from wfm/s to blind time involves knowing how much time is represented by one waveform, which is a combination of s/division and divisions/waveform.

A 'division' is just an arbitrary amount of the screen, anyway. Any scope with the same number of pixels can potentially show the same detail if you can fine-adjust the timebase. But the number of divisions does come up in the math, even if it's not "buying" you anything.
Well, we can take it to silly extremes in terms of displayed data. The optimal point with the Agilent 2000 X seems to be 10us/division, which would fill the 100,000-point memory neatly across the screen at the lower 1GS/s non-interleaved rate. The dead time while capturing as fast as the scope can measure? Apparently 10% of realtime.

I don't have the details on the Rigol to make a similar comparison. It would be interesting to know where its true peak of recorded information is, since this thread is highlighting that waveforms/second, while relevant to many measurements, is not the single specification that matters when hunting glitches.
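For what it's worth, here's the memory-fill arithmetic behind that 10us/division figure as a Python sketch (the 100,000 points and 1GS/s are the assumed figures from above, not datasheet quotes):

Code: [Select]
# Longest timebase that still fills memory at the full sample rate,
# from: memory_points = sample_rate * timebase_per_div * divisions
memory_points = 100_000   # assumed memory depth
sample_rate   = 1e9       # assumed 1GS/s non-interleaved rate
divisions     = 10

timebase_per_div = memory_points / (sample_rate * divisions)
print(timebase_per_div)   # -> 1e-05, i.e. 10us/div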
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #56 on: March 23, 2013, 08:39:54 am »
You left out the part where I said to adjust the time base of scope A so it also displays 7µs. Otherwise, you're saying the same thing I was: it's the s/wfm that's important, not the number of divisions the scope displays it with.

You're right, but when I fine-tune the RIGOL I can measure that the wfm/s changes, too. So do we have the same wfm/s on both scopes if we follow your assumption? If not, the comparison again becomes inaccurate.

So, to become independent of that, I've assumed that both scopes are identical, with the same 500ns/DIV and the same wfm/s, except for the number of DIVs. More of a theoretical approach, then.

same dead time for both now, then again
repeated for 1 second in total, and assuming 100,000 wfm/s, scope B always shows 1µs per wfm extra; that is 100,000 wfm/s * 1µs = 0.1s in total more than scope A.

I'm not sure your algebra works. If the two scopes in your example have the same blind time, then I figure scope A will have the higher wfm/s. If scope B has a wfm/s of 100k, then its dead time is 3µs per waveform. Scope A will thus have a wfm/s of 111k (1/9µs), ya?

No, sorry, writing error - it was really late when I came home from work yesterday.
I meant the same wfm/s, not the same dead time. In this case I come to the following:

scope A: 6µs per wfm, 4µs dead-time
scope B: 7µs per wfm, 3µs dead-time

1/10µs = 100,000 wfm/s

This leads to a physical 100,000 wfm/s on both, but scope B shows more of the signal. A random event that falls into this extra 1µs appears on screen sooner than it otherwise would, so I come to the conclusion that the "weighted" wfm/s of scope B is higher. Correct me if I'm wrong.
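In Python, with the hypothetical numbers above:

Code: [Select]
# Two scopes with identical update rates but different display windows.
wfm_per_s = 100_000   # both scopes, by assumption
shown_a   = 6e-6      # scope A displays 6us per waveform
shown_b   = 7e-6      # scope B displays 7us per waveform

# Extra signal time scope B shows per second of real time:
print(wfm_per_s * (shown_b - shown_a))   # -> 0.1 s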

In the end these wfm/s discussions are not that important to me; as long as the scope has enough for my needs, I'm happy. My HMO has "only" up to 2500 wfm/s and up to now I could use it for everything, even if my RIGOL reveals jitter much better.


Have a nice weekend  :)
« Last Edit: March 23, 2013, 08:44:18 am by Gunb »
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #57 on: March 23, 2013, 11:27:35 am »
What I would find useful would be, rather than more technical details, a poll or survey of people's practical experience.

The questions would be along the lines:

In the last year my high refresh rate DSO allowed me to spot glitches that otherwise I would have missed on
0? 1-5? > 5? occasions.

In the last year my low refresh rate DSO meant that I missed glitches and ended up wasting time/losing my job/retiring from electronics on
0? 1-5? > 5? occasions.

The problem is trying to value a feature against other features where you have to make a choice. In my case I had a choice between a new 70MHz Agilent DSOX2000 series scope (4 channels) with high refresh but only 100k of memory, and an old-stock (very old 2006 vintage but never sold) LeCroy WaveJet 334 with a low refresh of 3,600 but with a bandwidth of 350MHz and 500k of memory for over £100 less money.

I went with the WaveJet, but I'm curious how much of a disadvantage the lower fps is. If you're looking for a glitch you can trigger off narrow pulses, spikes, missing pulses and so on, so I suspect the circumstances where 50,000 fps is good enough and 3,600 isn't are pretty rare, but I don't have much practical experience and am genuinely interested in knowing others' experience. (I know we all tend to defend whatever scope we've decided to buy, so it is difficult to get an objective assessment of any feature.)
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #58 on: March 23, 2013, 11:42:58 am »
The problem is trying to value a feature against other features where you have to make a choice. In my case I had a choice between a new 70MHz Agilent DSOX2000 series scope (4 channels) with high refresh but only 100k of memory, and an old-stock (very old 2006 vintage but never sold) LeCroy WaveJet 334 with a low refresh of 3,600 but with a bandwidth of 350MHz and 500k of memory for over £100 less money.

I went with the WaveJet, but I'm curious how much of a disadvantage the lower fps is. If you're looking for a glitch you can trigger off narrow pulses, spikes, missing pulses and so on, so I suspect the circumstances where 50,000 fps is good enough and 3,600 isn't are pretty rare, but I don't have much practical experience and am genuinely interested in knowing others' experience. (I know we all tend to defend whatever scope we've decided to buy, so it is difficult to get an objective assessment of any feature.)
The LeCroy WaveJet models are often considered closer to the next band of scopes, i.e. comparable to the Agilent and Tek 3000 series; you got yourself a huge bargain.

The practical difference between 3,600 and 50,000 captures per second is close to zero until you get that odd occasion with a high-repetition signal (you can't get the captures if you aren't even triggering that often). It adds some detail when looking at edges of digital logic and other high-repetition signals, but with clever use of the advanced hardware triggering in these mid-range scopes you can get away without it, especially since the WaveJet can use all that extra memory with its own take on segmented memory, included as standard.
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4263
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #59 on: March 23, 2013, 11:47:24 am »
I'm not aware of having missed things because of a limited waveform update rate, though the higher update rate I get on my Tek with DPO mode switched on does give a much better impression of what a non-repetitive waveform is actually doing. With a low update rate, such a signal (say, a microprocessor's chip select) can appear to be behaving quite randomly, while a faster update rate can reveal behaviour which is actually more predictable and deterministic.

It's the proper intensity graded display that makes all the difference. Without it, the image on the screen tells you what happened the last time the scope did a sweep, and nothing else. With it, the image tells you how often things are happening, and that's much more revealing. I suspect there's some minimum number of waveforms/sec that a scope needs to achieve for this effect to really become useful.

It's worth adding that I have most definitely missed things due to lack of scope bandwidth. Given the choice I'd go for higher bandwidth over higher waveform rate; it's better to have fewer sharp, detailed images of what your signal is doing than it is to have more blurry ones.

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #60 on: March 23, 2013, 12:04:46 pm »
It's worth adding that I have most definitely missed things due to lack of scope bandwidth. Given the choice I'd go for higher bandwidth over higher waveform rate; it's better to have fewer sharp, detailed images of what your signal is doing than it is to have more blurry ones.

That was my gut feel, that high bandwidth/fast rise time would be more generally useful than higher fps and a more modern design.

It is interesting though; this engineer:

http://www.embeddedrelated.com/showarticle/117.php

places much more emphasis on having a high-resolution mode than on having higher bandwidth. In fact he is downgrading from 1GHz to 200MHz because he found his fellow engineers kept borrowing his 1GHz scope!

The LeCroy WaveJet models are often considered closer to the next band of scopes, i.e. comparable to the Agilent and Tek 3000 series; you got yourself a huge bargain.

Yes, I thought it was a good bargain - though I was a bit disappointed when it arrived to find it was built in September 2006 and is the old pre-A series, so it doesn't have control via the rear USB.

But as the price I paid was roughly the same as the new list price of the 4 500MHz probes that came with it, I decided not to send it back. :)
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #61 on: March 23, 2013, 12:08:16 pm »
@Someone - sorry, I stand corrected - this proves I should never answer posts after midnight - especially ones involving math. :P I kept conflating acquisition time with blind time in my head - and I also realized that I had been thinking and referring in terms of blind times per second - while you have been referring to it per cycle. Anyway, with some sleep and coffee, it's obvious that a percentage increase in acquisition time could only cause an identical decrease in blind time if the two were identical to begin with (50/50 - which they hardly ever are), but it does cause an identical decrease in the average time in seconds to find a glitch.

So, given the Agilent/Rigol 20ns example, at that timebase setting:
Rigol acquisition time per second: 50,000 * 20ns * 14 = 14ms
Agilent acquisition time per second: 54,000 * 20ns * 10 = 10.8ms
A 23% difference, which leads to 23% less average time to capture a glitch (as mentioned in the previous post).
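Or as a quick sketch in Python (same assumed figures):

Code: [Select]
# Acquisition time per second of real time for each scope:
rigol   = 50_000 * 20e-9 * 14   # 0.014  s/s
agilent = 54_000 * 20e-9 * 10   # 0.0108 s/s

# Average time to catch a random glitch scales inversely with coverage,
# so the Rigol needs about 23% less time on average:
print(1 - agilent / rigol)      # -> 0.2286 (~23%)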

Once again, sorry for my obliviousness - no more late night posting!

BTW, when you posted that video in your previous post did you realize that I was the reviewer - or was it just a meta-joke?  ;)
« Last Edit: March 23, 2013, 01:27:36 pm by marmad »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #62 on: March 23, 2013, 01:12:19 pm »
It's worth adding that I have most definitely missed things due to lack of scope bandwidth. Given the choice I'd go for higher bandwidth over higher waveform rate; it's better to have fewer sharp, detailed images of what your signal is doing than it is to have more blurry ones.

Agreed. You should be getting the bandwidth you need for your work, preferably with a wfrm/s update rate >1k if you can.

In the last year my high refresh rate DSO allowed me to spot glitches that otherwise I would have missed on
0? 1-5? > 5? occasions.

This would be difficult to ever know since it's about probabilities. You could just be 'lucky' with a lower wfrm/s DSO and manage to capture a glitch in 5 seconds - which might have taken someone 'unlucky' 10 hours to find - leading to the loss of their job and retirement from electronics  :)

Seriously though, it's just something that's nice to have - and helpful in some specific circumstances - but the vast majority of DSO work can be performed just fine without it (as people have managed to do for years).

And as Someone pointed out, above a certain point, it's just gravy. Having used a few of the lower-end DSOs, it seems to me that the usefulness of the effect decreases as the wfrm/s rate increases. The most marked difference seems to be from <100 to 1k, then a bit less from 1k to 10k, then even less from 10k to 100k, etc.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #63 on: March 23, 2013, 01:29:28 pm »
What I would find useful would be, rather than more technical details, a poll or survey of people's practical experience.

I fully agree.

Quote
The questions would be along the lines:

In the last year my high refresh rate DSO allowed me to spot glitches that otherwise I would have missed on
0? 1-5? > 5? occasions.

In the last year my low refresh rate DSO meant that I missed glitches and ended up wasting time/losing my job/retiring from electronics on
0? 1-5? > 5? occasions.

At least in my case I can reply to both questions with '0'.

Quote
The problem is trying to value a feature  against other features where you have to make a choice.

Exactly. And at the end of the day, it's not that high wfm rates would have helped me to solve a problem that I otherwise couldn't. That's why it's very low on my priority list for a scope. Even if I don't see a glitch in real-time on the display, it's still recorded by the scope, and I can look at it and analyze it as I see fit. A high wfm rate would certainly be nice, but at least for me the lack of it is not a deal breaker. A lower bandwidth, for example, would be in most cases.

But one thing I should mention is that most of my work equipment is mid to upper high-end (which comes with adequate wfm rates), so I'm not talking about a 25-year-old CRT DSO which, if forced, shows 2 or 3 waveforms a second (in which case the low wfm rate would most certainly be a real problem). But these days even entry-level scopes should have reasonable screen update rates.

Quote
In my case I had a choice between a new 70MHz Agilent DSOX2000 series scope (4 channels) with high refresh but only 100k of memory, and an old-stock (very old 2006 vintage but never sold) LeCroy WaveJet 334 with a low refresh of 3,600 but with a bandwidth of 350MHz and 500k of memory for over £100 less money.

I went with the WaveJet, but I'm curious how much of a disadvantage the lower fps is. If you're looking for a glitch you can trigger off narrow pulses, spikes, missing pulses and so on, so I suspect the circumstances where 50,000 fps is good enough and 3,600 isn't are pretty rare, but I don't have much practical experience and am genuinely interested in knowing others' experience.

Look at it this way: with your WaveJet, the lower wfm rate could mean you would miss (i.e. not see) a small glitch in a signal. However, the glitch will probably still be in your scope's memory (not knowing the capabilities of the WaveJet range!), so you should still be able to look at it. On the other side, the very high wfm rate of the DSO-2000X is useless if you miss signal elements due to the more limited bandwidth.

Another thing to keep in mind (and which everyone seems to be ignoring) is that the wfm update rate does not necessarily mean that this is the rate that is actually shown on the LCD screen. It's merely the limit the display controller and the processing can handle. TFT LCDs have their own limit in their pixel response times (which, depending on the panel and lots of other factors, is roughly in the region of 1ms to, say, 25ms, and scopes rarely use the latest and most expensive panels). The response time also depends on the current and new pixel state (i.e. grey to grey, black to white). What this means is that even if your display controller can produce 1,000,000 wfm/s, the display will still only refresh at 50, 60, 100 or maybe 120Hz, but that's about it. So even if you pay for the high wfm rates it doesn't mean you are actually able to see that tiny rare glitch any better than on a scope with a lower wfm update rate.

« Last Edit: March 23, 2013, 01:34:19 pm by Wuerstchenhund »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #64 on: March 23, 2013, 01:46:50 pm »
Setting the memory to 14k was needed to achieve close to the banner acquisition rate according to that reviewer; what was the sample rate achieved?

2GSa/s. The memory depth (sample size) is adjustable on the Rigol - the sample rate is not. And at the maximum rate of 2GSa/s on the Agilent X2000 or Rigol DS2000 series, the DSO is capturing 2Sa/ns. So at a 20ns timebase setting, you are seeing 400 samples stretched over 640 pixels on the Agilent display and 560 samples stretched over 700 pixels on the Rigol.
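The arithmetic, for anyone following along (figures as in the text above):

Code: [Select]
# Samples acquired across the visible screen at a given timebase:
def samples_on_screen(sample_rate, timebase_per_div, divisions):
    return sample_rate * timebase_per_div * divisions

print(samples_on_screen(2e9, 20e-9, 10))   # Agilent: 400.0 samples, ~640 px
print(samples_on_screen(2e9, 20e-9, 14))   # Rigol:   560.0 samples, ~700 px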
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #65 on: March 23, 2013, 01:53:34 pm »
Even if I don't see a glitch in real-time on the display, it's still recorded by the scope, and I can look at it and analyze it as I see fit.

You've written this before in a previous post, but I don't understand your logic. Blind is blind - the scope does not record data when it's blind. If you don't see a glitch in real time, then the scope didn't capture it. Except, of course, if you mean you are just using it in single-shot mode - or you happened to look away from the scope when a single glitch happened :)
« Last Edit: March 23, 2013, 02:14:03 pm by marmad »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #66 on: March 23, 2013, 02:02:24 pm »
Another thing to keep in mind (and which everyone seems to be ignoring) is that the wfm update rate does not necessarily mean that this is the rate that is actually shown on the LCD screen. It's merely the limit the display controller and the processing can handle. TFT LCDs have their own limit in their pixel response times (which, depending on the panel and lots of other factors, is roughly in the region of 1ms to, say, 25ms, and scopes rarely use the latest and most expensive panels). The response time also depends on the current and new pixel state (i.e. grey to grey, black to white). What this means is that even if your display controller can produce 1,000,000 wfm/s, the display will still only refresh at 50, 60, 100 or maybe 120Hz, but that's about it. So even if you pay for the high wfm rates it doesn't mean you are actually able to see that tiny rare glitch any better than on a scope with a lower wfm update rate.

You are not understanding how DPO/VPO technology works. It's not hindered by refresh rates or response times of the display (at least, not up to a limit - see below) - it uses the equivalent of a Z-buffer to 'stack' waveforms for intensity grading. Each time the display is refreshed, it is refreshed with a composite image of the intensity from many, many waveforms.
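To make the idea concrete, here's a toy Python/numpy model - not any vendor's actual pipeline, just the principle: acquisitions are rasterised into an intensity buffer far faster than the screen refreshes, and each refresh simply displays the accumulated counts.

Code: [Select]
import numpy as np

# Toy intensity-graded display: many acquisitions accumulate into one
# buffer between screen refreshes; the display just shows hit counts.
W, H = 700, 400                        # display raster
intensity = np.zeros((H, W), dtype=np.uint32)

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, W)

wfm_per_refresh = 16_667               # e.g. 1,000,000 wfm/s at 60Hz refresh
for _ in range(wfm_per_refresh):
    y = np.sin(t) + rng.normal(0, 0.05, W)        # jittery repetitive signal
    rows = np.clip(((y + 2) / 4 * H).astype(int), 0, H - 1)
    intensity[rows, np.arange(W)] += 1            # 'Z-buffer' style stacking

# At refresh time, counts map to brightness: frequent paths show bright,
# rare excursions show dim - but nothing that was acquired is lost.
print(intensity.max(), (intensity > 0).sum())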

Edit: @Wuerstchenhund - OTOH, you're correct in the sense that there must be a theoretical limit to the amount of data which can be presented by a DPO and perceived by the human eye in a second. I don't know how to calculate that limit, but perhaps it is something like levels of intensity * refresh rate of the display? Anybody know? But it's certainly at least 100x greater than the refresh rate.

Ahhh... so I start to understand your logic from the previous post. You think that the refresh rate of the display is tied to the wfrm/s rate and the blind time. But that's not the case; the two are unrelated - except perhaps in low update rate DSOs. Perhaps we mean different things by 'not seeing'. My definition is that when the user doesn't see something, he's just missed it - but it still exists. When the DSO doesn't see something, it doesn't exist.
« Last Edit: March 23, 2013, 02:57:32 pm by marmad »
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #67 on: March 23, 2013, 03:07:27 pm »
You are not understanding how DPO/VPO technology works.

I'm sure I do quite well understand how DPO works, thanks.

Quote
Ahhh... so I start to understand your logic from the previous post. You think that the refresh rate of the display has to do with the wfrm/s rate and the blind time.

I'm sorry but you still don't understand. I never said the wfm update rate has anything to do with the screen refresh rate, and I thought that should be pretty clear from my posting.

The point I am making is exactly that: the wfm rate is NOT the same as the screen refresh rate. Therefore the idea that a scope with a very high wfm rate lets you see details that a scope with a lower wfm update rate cannot is simply not true in this generality (which is the main argument of those who consider a very high wfm rate to be the next best thing since sliced bread). With a DSO, as long as the glitch is in the sample memory, and as long as the other parameters (analog bandwidth, sample rate) are adequate, I will be able to look at that glitch, period. You won't necessarily be able to look at it in real-time, because especially on a scope with an LCD screen you're still limited by the screen update rate. But I will be able to find that glitch, and I will be able to analyze it, no matter if the wfm rate is 1,000,000 or 100.

You really should read more carefully before replying.

And I'm sorry to say, but it also shows that your thinking is very much stuck in the days of analog scopes. Sure, you can use persistence mode (no matter if monochrome or color-graded, which is what Tek sold under the 'DPO' name) to find sporadic glitches, and I agree that for this a very high wfm rate of the display controller may be advantageous, and persistence mode may very well be your only option if you use an entry-level scope which does not offer much else, but with a decent DSO there are much better tools available to find and analyze signal deviations. The only time I use persistence mode is when I want to produce some colourful pictures showing signal jitter for trainees, but that's about it.

I do understand that for someone who comes from an analog scope, persistence mode is the obvious thing, as that is essentially all that was available with an analog scope, but this is one of the reasons why I don't agree with the common recommendation that it's best for EE beginners to start with an analog scope. But that's just my own opinion.
« Last Edit: March 23, 2013, 03:11:19 pm by Wuerstchenhund »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #68 on: March 23, 2013, 03:29:11 pm »
I'm sure I do quite well understand how DPO works, thanks.

Well, I'm sorry if I made a mistake, but the following sentences of yours made it sound as if you didn't:

Another thing to keep in mind (and which everyone seems to be ignoring) is that the wfm update rate does not necessarily mean that this is the rate that is actually shown on the LCD screen. It's merely the limit the display controller and the processing can handle.
...
What this means is that even if your display controller can produce 1,000,000 wfm/s, the display will still only refresh at 50, 60, 100 or maybe 120Hz, but that's about it.

...since the display controller is not connected to the waveform update rate - or its limits.

Quote
With a DSO, as long as the glitch is in the sample memory...

And that is the crux of the issue, no? Finding glitches - and then getting them into sample memory - or do you just mean glitches that end up randomly in sample memory?
« Last Edit: March 23, 2013, 03:31:39 pm by marmad »
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #69 on: March 23, 2013, 03:46:23 pm »
...since the display controller is not connected to the waveform update rate - or its limits.

Sure it does. The whole processing architecture affects the wfm rate. You should have a look at some DSO hardware designs.

Quote
Quote
With a DSO, as long as the glitch is in the sample memory...
And that is the crux of the issue, no? Finding glitches - and then getting them into sample memory - or do you just mean glitches that end up randomly in sample memory?

Why do I get the increasingly strong feeling that you don't really understand how a DSO actually works?

I mean this is really basic stuff.
« Last Edit: March 23, 2013, 03:48:42 pm by Wuerstchenhund »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #70 on: March 23, 2013, 03:58:52 pm »
Sure it does. The whole processing architecture affects the wfm rate. You should have a look at some DSO hardware designs.
The wfrm/s rate is the inverse of the DSO's acquisition cycle time. Perhaps this is just a problem of semantics? Certainly you don't mean that the display controller IC affects the acquisition cycle?



...but with a decent DSO there are much better tools available to find and analyze signal deviations.
Could you elaborate on the better tools available for finding (or seeing) unknown glitches? I'm being serious - it's true that I'm reasonably new to DSOs - although I do think I understand how they work  :)
« Last Edit: March 23, 2013, 06:18:29 pm by marmad »
 

Offline onlooker

  • Frequent Contributor
  • **
  • Posts: 395
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #71 on: March 23, 2013, 04:14:33 pm »
I think you two are working from different assumptions about glitch hunting: one assumes the glitch can be triggered on, while the other assumes triggering on the glitch cannot be done reliably.
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #72 on: March 23, 2013, 04:30:16 pm »
I think you two are working from different assumptions about glitch hunting: one assumes the glitch can be triggered on, while the other assumes triggering on the glitch cannot be done reliably.

Maybe so. Whenever I discuss the wfrm/s rate (and blind time), I'm speaking of it purely as a tool that might help you notice a problem (glitch, jitter, etc.), which can then be captured and analyzed by the DSO. I realize that it is irrelevant in much of the normal usage of a DSO.
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #73 on: March 23, 2013, 10:47:05 pm »
I've just read an application note on Random Interleaved Sampling and this refers to a component of the DSO called the Time to Digital Converter (TDC).

The TDC is used to place the trigger point very accurately between the actual samples (so that on multiple triggers you don't introduce a jitter equal to the sample interval).

The reason that I'm raising this in this thread is that it suddenly struck me that the TDC will place an upper limit on the waveform update rate.

The TDC works by charging up from the start of the trigger to the next sample point, which will be a very short time - the maximum is the sample-to-sample time, which in the app note is at most 100 psecs for a 10GS/s scope.

The TDC then discharges over a much longer time, which is measured digitally using a clock - 100MHz in the app note. The discharge time in the app note is a maximum of 5 microseconds - so 100 psecs to charge and 5 microseconds to discharge. This allows the trigger time to be resolved to less than a picosecond (in the app note example).

The thing that struck me was that this very long discharge time will mean that, in the example given, the next waveform cannot be displayed for 5 microseconds, so the waveform update rate will be limited to 200,000. OK, that is quite a high upper limit, but it is only 1/5 of Agilent's 3000X series. Also, at slower sample rates the charging time, and thus the discharging time, would correspondingly increase.
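Putting the app note's example numbers into a small Python sketch (the figures are the app note's; the dual-slope arithmetic is my reading of it):

Code: [Select]
# Dual-slope time interpolator: a tiny interval is stretched by the
# discharge/charge ratio, then counted with a comparatively slow clock.
charge_time_max    = 100e-12   # trigger-to-next-sample gap, 10GS/s scope
discharge_time_max = 5e-6      # slow discharge, counted digitally
clock              = 100e6     # counting clock from the app note

stretch    = discharge_time_max / charge_time_max   # 50,000x
resolution = (1 / clock) / stretch                  # 10ns / 50,000
max_update = 1 / discharge_time_max                 # TDC busy time

print(resolution)    # -> 2e-13 s, i.e. 0.2 ps ("less than a picosecond")
print(max_update)    # -> 200000.0 wfm/s ceiling from the TDC alone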

I suppose it is possible to store and work on triggers and sample sets in parallel, but this is an added complication that most manufacturers wouldn't take on.

It might also be that there is a trade-off between wfps and timing accuracy.

« Last Edit: March 23, 2013, 11:01:50 pm by jpb »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #74 on: March 23, 2013, 11:56:14 pm »
The thing that struck me was that this very long discharge time will mean that, in the example given, the next waveform cannot be displayed for 5 microseconds, so the waveform update rate will be limited to 200,000. OK, that is quite a high upper limit, but it is only 1/5 of Agilent's 3000X series. Also, at slower sample rates the charging time, and thus the discharging time, would correspondingly increase.
Interesting. I wonder if the InfiniiVision X series uses a digital trigger - I did some quick Googling but couldn't find a definitive answer - though I would guess it probably does. I know the Rigol UltraVision line does. According to this Rohde & Schwarz paper, one of the advantages of their digital trigger is "No Masking of Trigger Event":

"An analog trigger requires some time after a trigger decision to rearm the trigger circuitry before they can trigger again. During this rearm time, the oscilloscopes cannot respond to new trigger events - trigger events occurring during the rearm time are masked.
In contrast the digital trigger system of the R&S RTO oscilloscopes can evaluate individual trigger events with the Time-to-Digital-Converters (TDC) within 400 ps intervals (Figure 12) with a resolution of 250 fs."

A 400ps interval would put the limit on trigger event evaluation at 2.5 billion per second.
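A quick sanity check in Python, using the figures from the paper and from jpb's post above:

Code: [Select]
# Trigger-event evaluation limit from the quoted 400ps interval,
# vs. the rearm-style TDC ceiling derived by jpb above:
print(1 / 400e-12)   # -> 2.5e9 trigger evaluations per second
print(1 / 5e-6)      # -> 200000.0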
« Last Edit: March 24, 2013, 12:01:03 am by marmad »
 

