Author Topic: Some Important Points to Remember when Evaluating Waveform Update Rates  (Read 37957 times)


Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #25 on: March 21, 2013, 09:53:29 pm »
Your statement doesn't make sense. How can any user fail to benefit from a DSO with less blind time? The blind time is an inherent fault of digital sampling oscilloscopes - not an attribute. That's like saying eyeglasses only benefit people who would like to read books.

It does, but I guess our disagreement is just because we come at this from different angles. If you look at a DSO simply as a modern variant of an analog scope, then you're right (for this specific situation).

However, I see it a bit differently (I used analog scopes quite a lot, say, 20 years ago, but for more than that time my primary scopes have been digital). An analog scope is essentially a device for displaying a waveform. A DSO is much, much more, and displaying a waveform is just one of the many things it can do. For my work, displaying a waveform is important, but at least as important (probably more so) are the tools a good modern DSO offers to analyze the waveform. The thing is that, once you have your waveform sampled in memory, you can do all kinds of things with it that are impossible with an analog scope. This is the real strength of a DSO.

Now, when you say that the display blind time of DSOs is a disadvantage compared to an analog scope, you may be right, but if you use a DSO the same way as an analog scope then you're not really using it to its strengths. I know of course that analog scopes are dead, and even 'analog' users have to buy a DSO these days because that's essentially all there is, and for these users a very high wfm rate may be important as it makes the DSO more similar to their old analog scopes. But this is not equally true for all DSO users.
« Last Edit: March 21, 2013, 09:55:01 pm by Wuerstchenhund »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #26 on: March 21, 2013, 10:12:01 pm »
The thing is that, once you have your waveform sampled in memory, you can do all kinds of things with it that are impossible with an analog scope. This is the real strength of a DSO.

Granted - but to get the waveform sampled into memory requires the DSO to first 'see' it, right? If your DSO is blind to a glitch that happens in the middle of a long stream of serial data, it doesn't matter how much searching and analyzing you do of the sampled data - you won't find it because the scope never saw it.

Quote
Now, when you say that the display blind time of DSOs is a disadvantage compared to an analog scope, you may be right, but if you use a DSO the same way as an analog scope then you're not really using it to its strengths. I know of course that analog scopes are dead, and even 'analog' users have to buy a DSO these days because that's essentially all there is, and for these users a very high wfm rate may be important as it makes the DSO more similar to their old analog scopes. But this is not equally true for all DSO users.

I would completely agree that the importance of less blind time varies from user to user - but I would only argue that, in any case, it benefits everyone - whether they're aware of it or not  :)
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #27 on: March 22, 2013, 12:55:18 am »
I disagree. Say we have two scopes, one 10 divs wide, the other 20 divs wide. One is set to 20 ns/div and does 50k waveforms/s, and the other does 50k waveforms/s at 10 ns/div. The blind time will be identical, but according to you the waveform update rate for the latter is higher? The first scope just compresses the same signal into fewer pixels, which has nothing to do with blind time.

Whether slower sweep speeds (or more divs) help catch rare events depends on your signal, however. If the signal period is much longer than your capture window (because you need to be at the fastest sweep speed to see whatever detail you were looking for in the rising edge), the extra horizontal resolution only improves the amount of detail you see per acquisition; it does not increase the number of trigger events per second. This means you're looking at fewer rising edges than the scope with less horizontal resolution and a faster waveform update rate. You can only trade horizontal resolution for update rate if the signal period is about the same as your capture window.

Claiming that the scope with more horizontal resolution has a faster 'true waveform update rate' is misleading; both should be stated as independent facts.

@Alm - your post threw me for a loop - and I had to think again about what I was originally trying to get at in the opening post. :)  Of course, the number of divisions displayed does not affect the number of trigger events that a DSO can respond to in a given period of time - and thus cannot cause an increase in the wfm/s rate. But what I was trying to express in the original post was the following idea:

Take 2 DSOs adjusted to an equivalent timebase setting - with equivalent waveform update rates. One DSO has 10 divisions, so its active acquisition time is 10 * the timebase. The other DSO has 14 divisions, so its active acquisition time is 14 * the timebase. All else being equal, as far as I can determine, the blind time of the second scope is ~30% less than the first. Am I wrong in this deduction?

Anyway, I was attempting to 'translate' this into a wfm/s rate (for the first DSO) which would yield the equivalent blind time. But perhaps the way that I expressed this idea was misleading and needs to be re-edited to more correctly reflect the notion of a reduction in blind time - as opposed to an increase in waveforms per second.
« Last Edit: March 22, 2013, 02:20:48 am by marmad »
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #28 on: March 22, 2013, 06:08:57 am »
Take 2 DSOs adjusted to an equivalent timebase setting - with equivalent waveform update rates. One DSO has 10 divisions, so its active acquisition time is 10 * the timebase. The other DSO has 14 divisions, so its active acquisition time is 14 * the timebase. All else being equal, as far as I can determine, the blind time of the second scope is ~30% less than the first. Am I wrong in this deduction?
Yes, you are wrong. The 10/14 ratio applies to the recorded time, not the dead time.

dead_time = 1/(update_rate) - record_length

Which, after correcting for your erroneous 23% bonus in the originally presented waveform rates, means we end up calculating that the Rigol has a blind time averaging a respectable 5 times longer than that of the competing (and more expensive) Agilent model.
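In Python, the formula looks like this (a minimal sketch; the example numbers are illustrative, not taken from any datasheet):

Code: [Select]
# Dead (blind) time per trigger cycle: total cycle time minus the record length,
# where record_length = timebase * divisions (seconds of signal per acquisition).
def dead_time(update_rate, timebase, divisions):
    record_length = timebase * divisions
    return 1.0 / update_rate - record_length

# Illustrative: 100,000 wfm/s at 20 ns/div on a 10-div screen.
print(dead_time(100_000, 20e-9, 10))  # 9.8e-06 -> 9.8 us blind per cycle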
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #29 on: March 22, 2013, 08:23:30 am »
After reading that, you begin to wonder why we bother so much to make DSOs into an easy tool every no-brainer can use (and at a huge decrease in price), despite all the disadvantages that most users will never know about or experience, because all they care about is digital stuff - and for just that, an analog (sampling) scope is rather useless.
Interesting links in this topic.

Exactly, you've said it all.

"It's digital, it must be better" - unfortunately often a wrong statement.

 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #30 on: March 22, 2013, 09:49:04 am »
Yes, you are wrong. The 10/14 ratio applies to the recorded time, not the dead time.

dead_time = 1/(update_rate) - record_length

Which, after correcting for your erroneous 23% bonus in the originally presented waveform rates, means we end up calculating that the Rigol has a blind time averaging a respectable 5 times longer than that of the competing (and more expensive) Agilent model.

My original assertion - that the number of divisions decreases the blind time - is correct, but the percentages are wrong. As you mentioned, the percentage is the increase in acquisition time, with a correspondingly smaller (depending on the initial ratio) decrease in blind time.

Rigol's blind time percentage, at a 50,000 wfm/s update rate with a 280ns acquisition time (14 divisions * 20ns/div), is 98.6%
Agilent's blind time percentage, at a 54,000 wfm/s update rate with a 200ns acquisition time (10 divisions * 20ns/div), is 98.9%
Agilent's blind time percentage, if it had a 70,000 wfm/s update rate with a 200ns acquisition time, would be 98.6% - exactly what Rigol's is at that timebase.
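For anyone who wants to check these percentages, here is a quick Python sketch using the rates and acquisition times above:

Code: [Select]
# Blind time as a percentage of real time: 1 - (update rate * acquisition time).
def blind_time_pct(update_rate, timebase, divisions):
    active = update_rate * timebase * divisions  # seconds of signal acquired per second
    return (1.0 - active) * 100.0

print(blind_time_pct(50_000, 20e-9, 14))  # Rigol:           98.6
print(blind_time_pct(54_000, 20e-9, 10))  # Agilent:         98.92 (~98.9)
print(blind_time_pct(70_000, 20e-9, 10))  # Agilent at 70k:  98.6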

Of course, as alm pointed out, this does not mean, for example, that the Rigol can respond to 70k trigger events per second at that timebase setting - it just means, given an equivalent timebase setting and wfm rate, that it's less blind than a 10 division scope.

I will re-edit the original post to reflect this data later today.
« Last Edit: March 23, 2013, 12:32:14 pm by marmad »
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #31 on: March 22, 2013, 12:44:14 pm »
it just means, given an equivalent timebase setting and wfm rate, that it's less blind than a 10 division scope.
Less blind as a percentage of time, which is important if the infrequent signal you are looking for is not correlated with the trigger. Interestingly, for the Agilent model this can be gotten around by switching to slower timebases and then zooming in after the fact, since the capture rate is synthetically limited. As you say, the calculation is valid when the comparison is made between two scopes with the same capture rate (and, importantly, the same memory depth and bandwidth, if you want to ensure the information captured is equivalent), but you tried in the original post to then reverse the calculation and create an equivalent waveform rate for this, which is sleight of hand, as the waveform rate is tied to the triggering. Fair enough to point out that the Rigol can capture a lot more data if those waveform rates are achieved with memory depths that are the same or larger, but it cannot be equated to a different capture rate; in that case, come up with a metric which describes the captured data (say, displayed (MHz*Samples)/second) or use the established blind time expressed as a percentage.

I'm not sure if the Rigol maintains the full memory depth at all horizontal settings, another of the big selling features of the Agilent when you're used to Tek (RIP, lol).
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #32 on: March 22, 2013, 01:06:38 pm »
...but you tried in the original post to then reverse the calculation and create an equivalent waveform rate for this, which is sleight of hand, as the waveform rate is tied to the triggering.

True - as alm first pointed out to me. When I wrote the original post I was thinking of blind times - and trying to come up with a quick and easy way to display the difference to people used to seeing waveform update rates advertised - but I can see now how it's misleading. I'll edit it to blind time percentages or something else a bit later.
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4263
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #33 on: March 22, 2013, 01:44:56 pm »
Take 2 DSOs adjusted to an equivalent timebase setting - with equivalent waveform update rates. One DSO has 10 divisions, so its active acquisition time is 10 * the timebase. The other DSO has 14 divisions, so its active acquisition time is 14 * the timebase. All else being equal, as far as I can determine, the blind time of the second scope is ~30% less than the first. Am I wrong in this deduction?

You're absolutely correct. The difference is that one waveform represents more data actually acquired and displayed on a 14-division scope as opposed to a 10-division one. To pretend otherwise is effectively to say we'll ignore any useful waveform information that's captured and displayed in 4 of the divisions on the wider scope.

I'm surprised the maths seems to be so confusing, because it's so easy to derive from first principles that even I can do it...

In 1 second, there is a theoretical maximum of 1 second's worth of information to be captured and shown. Any measurements we make should result in a figure which is a percentage of this amount.

The actual amount of information captured is the number of waveforms/sec times the amount of time each waveform represents. The amount of time each waveform represents equals the time base setting in seconds/division times the number of divisions.

So, for example, my old Tektronix TDS754D manages a very creditable 185,000 waveforms/sec at 25ns/div. Not bad at all for an instrument 15 years old, IMHO. (Switch off DPO mode and it's down to just 64 waveforms/sec!)

Each of those waveforms occupies 10 divisions at 25ns/div, for a total time of 250ns. Therefore, the total active time is 46.25ms per second. This makes the blind time 95.375%.
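Or, as a quick Python sketch of the same first-principles calculation (using the figures above):

Code: [Select]
# First-principles check of the TDS754D figures quoted above.
wfm_per_sec = 185_000
timebase = 25e-9    # s/div
divisions = 10

time_per_wfm = timebase * divisions                 # 250 ns of signal per waveform
active = wfm_per_sec * time_per_wfm                 # 46.25 ms of signal per second
print(f"active time: {active * 1e3:.2f} ms/s")      # 46.25
print(f"blind time:  {(1 - active) * 100:.3f} %")   # 95.375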

Offline cyr

  • Frequent Contributor
  • **
  • Posts: 252
  • Country: se
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #34 on: March 22, 2013, 02:35:43 pm »
I think a table with an adjusted *timebase* would be closer to the truth; forget about "divs" and think about the total amount of signal captured/displayed.

e.g. 10ns/div on a traditional scope is 100ns/screen, but it is 140ns/screen on the Rigol. If the traditional scope were set to 14ns/div (or the Rigol to ~7.1ns/div), they would capture the same information on the screen, and then the capture rates would be directly comparable.
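In code, the adjustment is trivial (a sketch):

Code: [Select]
# Match the on-screen time span instead of the per-div setting.
rigol_screen = 14 * 10e-9   # 140 ns at 10 ns/div on a 14-div screen
print(rigol_screen / 10)    # 1.4e-08  -> 14 ns/div on a 10-div scope
print(10 * 10e-9 / 14)      # ~7.1e-09 -> ~7.1 ns/div on the Rigol to show 100 ns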
 

alm

  • Guest
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #35 on: March 22, 2013, 06:37:17 pm »
Say I have a very rare problem where the power rail to my chip drops just as the /CS line of the SPI bus goes high (let's assume the repetition rate is high enough that the update rate of the scope is the limit). I set the scope to trigger on the rising edge of /CS, and attach the other channel to the Vcc pin. I want to witness as many /CS rising edges as possible so I don't have to wait forever to verify whether the problem is fixed. What do extra divs on my screen, or even extra sample memory, buy me? They might give me some more horizontal resolution, but this is unrelated to update rate. The power supply is not going to drop well after the /CS transition, so the extra information is useless to me.

I would argue this is a fairly common case; the demo of a square wave with a single glitch in it seems artificial to me.
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4263
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #36 on: March 22, 2013, 06:43:22 pm »
In that instance, the figure of merit would just be the number of waveforms/sec, and you'd be free to choose whatever time base setting gives you the highest rate consistent with capturing all of the event you're interested in.

What you describe sounds like a fairly deterministic case, though. If A causes B, then more often than not, A always causes B.

I must admit that I've struggled to think of real world cases where problems happen very infrequently and where I'd know so little about what's happening in the faulty case that I can't set up a trigger on it.

Sometimes I even resort to, say, putting some debug code in an FPGA to detect when the fault has occurred and wiggle a GPIO, which I can then trigger on. It seems like a much more predictable way to ensure I see what's happening in the interesting case.

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #37 on: March 22, 2013, 07:29:25 pm »
What do extra divs on my screen, or even extra sample memory, buy me? They might give me some more horizontal resolution, but this is unrelated to update rate.

It buys you exactly what was formulated above - less blind time. Simple as that. Whether that's of any use to you in whatever ways you use a DSO, I couldn't say. But that's what you get with extra divs.
 

Offline Galaxyrise

  • Frequent Contributor
  • **
  • Posts: 531
  • Country: us
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #38 on: March 22, 2013, 09:02:21 pm »
What do extra divs on my screen, or even extra sample memory, buy me? They might give me some more horizontal resolution, but this is unrelated to update rate.
I think the point is that you're not limited by update rate; you're limited by blind time. Converting from wfm/s to blind time involves knowing how much time is represented by one waveform, which is a combination of s/division and divisions/waveform.

A 'division' is just an arbitrary amount of the screen, anyway. Any scope with the same number of pixels can potentially show the same detail if you can fine-adjust the time base. But the number of divisions does come up in the math, even if it's not "buying" you anything.
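Going the other way, you can also invert the conversion to find what update rate a 10-div scope would need to match a given blind time (a sketch, using the 20 ns/div figures from earlier in the thread):

Code: [Select]
# Update rate needed to hit a target blind-time percentage at a given window.
def required_rate(blind_pct, timebase, divisions):
    active = 1.0 - blind_pct / 100.0        # fraction of real time acquired
    return active / (timebase * divisions)  # waveforms per second

print(required_rate(98.6, 20e-9, 10))  # ~70,000 wfm/s to match the 14-div scope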
I am but an egg
 

Offline Rufus

  • Super Contributor
  • ***
  • Posts: 2095
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #39 on: March 22, 2013, 09:38:29 pm »
I agree. The waveform update rate is only of use in the persistent mode. Now everyone raise their hand if they used persistent mode for a real measurement in the last year.

The whole point of Tektronix DPO and other implementations is to emulate the persistence of phosphor, so that the display gives more insight into the nature of a signal, like analog scopes do. Such scopes are always in 'persistent mode', and the higher the update rate the better.
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #40 on: March 22, 2013, 10:39:41 pm »
All else being equal, as far as I can determine, the blind time of the second scope is ~30% less than the first. Am I wrong in this deduction?

You're absolutely correct. The difference is that one waveform represents more data actually acquired and displayed on a 14-division scope as opposed to a 10-division one. To pretend otherwise is effectively to say we'll ignore any useful waveform information that's captured and displayed in 4 of the divisions on the wider scope.

I'm surprised the maths seems to be so confusing, because it's so easy to derive from first principles that even I can do it...
Argh, the 30% reduction is not in the blind time! It's still wrong. If they have the same capture rate, sample rate, bandwidth, and time per horizontal division, then you could say there is a 40% increase in the captured data for a scope with 14 vs 10 divisions, but this is in no way a 30% reduction of blind time, whether expressed in absolute units or as a percentage of real time.
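To put numbers on that distinction (a sketch using the 50k wfm/s, 20 ns/div case discussed earlier in the thread):

Code: [Select]
# Same 50k wfm/s capture rate at 20 ns/div: 14 divs vs 10 divs.
rate, tb = 50_000, 20e-9
active_10 = rate * tb * 10  # 0.010 s of signal acquired per second
active_14 = rate * tb * 14  # 0.014 s acquired per second
blind_10 = 1 - active_10    # 0.990 -> 99.0% blind
blind_14 = 1 - active_14    # 0.986 -> 98.6% blind
print((active_14 / active_10 - 1) * 100)  # 40.0 -> 40% more data captured
print((1 - blind_14 / blind_10) * 100)    # ~0.4 -> only ~0.4% less blind time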
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #41 on: March 22, 2013, 11:11:11 pm »
But number of divisions does come up in the math, even if it's not "buying" you anything.

Of course they do.

scope A: 12DIV * 500ns/DIV = 6µs
scope B: 14DIV * 500ns/DIV = 7µs -> shows 1µs more of the signal per screen

Assume the same dead time for both; then again:

scope A: 12DIV * 500ns/DIV = 6µs
scope B: 14DIV * 500ns/DIV = 7µs

Repeated for 1 second in total, and assuming 100,000 wfm/s, scope B always shows an extra 1µs per wfm; that is 100,000 wfm/s * 1µs = 0.1s more in total than scope A.

So, each scope by itself has 100,000 wfm/s, but for comparison the extra DIVs of scope B have to be taken into account -> which is what marmad has explained well.
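The same arithmetic as a quick sketch:

Code: [Select]
# Both scopes at 500 ns/div and 100,000 wfm/s; scope B shows 2 extra divs.
wfm_rate = 100_000
extra_per_wfm = 2 * 500e-9       # 1 us more signal shown per waveform
print(wfm_rate * extra_per_wfm)  # 0.1 -> scope B shows 0.1 s more signal per second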

It has nothing to do with the samples in sample memory, or the number of them that are moved to the display buffer to be displayed, depending on the limited number of pixels on the screen. That is something quite different.

I recommend studying my attachment from Agilent above; you're mixing up a few things.
« Last Edit: March 22, 2013, 11:14:04 pm by Gunb »
 

Offline Galaxyrise

  • Frequent Contributor
  • **
  • Posts: 531
  • Country: us
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #42 on: March 22, 2013, 11:23:09 pm »
Of course they do.

scope A: 12DIV * 500ns/DIV = 6µs
scope B: 14DIV * 500ns/DIV = 7µs -> shows 1µs more of the signal per screen
You left out the part where I said to adjust the time base of scope A so it also displays 7µs.  Otherwise, you're saying the same thing I was: it's the s/wfm that's important, not the number of divisions the scope displays it with.

Quote
Assume the same dead time for both; then again:
Repeated for 1 second in total, and assuming 100,000 wfm/s, scope B always shows an extra 1µs per wfm; that is 100,000 wfm/s * 1µs = 0.1s more in total than scope A.
I'm not sure your algebra works. If the two scopes in your example have the same blind time, then I figure scope A will have a higher wfm/s. If scope B has a wfm/s of 100k, then its dead time is 3µs per waveform. Scope A will thus have a wfm/s of 111k (1/9µs), ya?
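Checking my own algebra in code (a quick sketch):

Code: [Select]
# Both scopes share the same 3 us dead time per waveform.
dead = 3e-6
period_b = 7e-6 + dead  # scope B: 7 us acquired + 3 us dead = 10 us per cycle
period_a = 6e-6 + dead  # scope A: 6 us acquired + 3 us dead =  9 us per cycle
print(1 / period_b)     # 100000.0 -> 100k wfm/s
print(1 / period_a)     # ~111111  -> ~111k wfm/s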
I am but an egg
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #43 on: March 22, 2013, 11:28:05 pm »
Argh, the 30% reduction is not in the blind time! It's still wrong. If they have the same capture rate, sample rate, bandwidth, and time per horizontal division, then you could say there is a 40% increase in the captured data for a scope with 14 vs 10 divisions, but this is in no way a 30% reduction of blind time, whether expressed in absolute units or as a percentage of real time.

Yes, sorry, but it is (well, actually 28.6% - I rounded up in that post :) ). It seems odd that we're disputing this since it's so easy to prove with math.  Whoops - I was wrong - Someone is right. It would only be an equivalent reduction in blind time if the two scopes' ratios of acquisition time to blind time started out equal.

As far as the Agilent/Rigol comparison that I mentioned in the opening post - it is an acquisition-time difference of 23% (for the 20ns timebase setting - in this case not the full 28%, because the Agilent has a 54k wfm/s rate vs 50k for the Rigol, so they are not precisely equal). It's very easy to verify - just do the math from the Rohde & Schwarz or Agilent papers:

To capture a 1 p/s glitch at the 20ns timebase setting (with a probability of 99.9%), the average time for the Agilent is 10 minutes, 40 seconds.
To capture a 1 p/s glitch at the 20ns timebase setting (with a probability of 99.9%), the average time for the Rigol is 8 minutes, 13 seconds.

That's a difference of 23% - and it corresponds to 54k / 70k.
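Those times can be reproduced with a Poisson model of a glitch that is uncorrelated with the trigger (a sketch - I'm assuming this is the model the published papers use, since it reproduces their figures):

Code: [Select]
import math

# Mean observation time needed to capture a once-per-second glitch with
# probability p, where the per-second chance of catching it equals the
# scope's acquisition duty cycle (update rate * acquisition time).
def time_to_capture(update_rate, timebase, divisions, glitch_rate=1.0, p=0.999):
    duty = update_rate * timebase * divisions         # fraction of time not blind
    return -math.log(1.0 - p) / (glitch_rate * duty)  # seconds

print(time_to_capture(54_000, 20e-9, 10))  # ~640 s -> 10 min 40 s (Agilent)
print(time_to_capture(50_000, 20e-9, 14))  # ~493 s ->  8 min 13 s (Rigol)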

But for these two scopes, this figure is only for the 20ns timebase setting - and only because the waveform update rates are very close to each other. For other timebase settings, where the Agilent is way ahead of the Rigol in wfm/s, the percentage would obviously be in Agilent's favor.
« Last Edit: March 23, 2013, 12:19:30 pm by marmad »
 

Offline Someone

  • Super Contributor
  • ***
  • Posts: 4739
  • Country: au
    • send complaints here
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #44 on: March 23, 2013, 12:54:37 am »
To capture a 1 p/s glitch at the 20ns timebase setting (with a probability of 99.9%), the average time for the Agilent is 10 minutes, 40 seconds.
To capture a 1 p/s glitch at the 20ns timebase setting (with a probability of 99.9%), the average time for the Rigol is 8 minutes, 13 seconds.
The width of the glitch is not what determines the probability of capturing it in a certain time; it's the rate at which the glitch occurs that determines the time it will take to capture it. Also, you only get the benefit of the increased capture if the glitch is not correlated with the trigger.

That's a difference of 23% - and it corresponds to 54k / 70k.  If you don't believe it, the formulas are in the published papers - just use them.
Yes, that's a difference of 23% in the time taken to capture a glitch that isn't correlated with the trigger. Not a 23% difference in the dead time, as you keep claiming.

Do we need to get shouty yet? :P
 

Offline Galaxyrise

  • Frequent Contributor
  • **
  • Posts: 531
  • Country: us
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #45 on: March 23, 2013, 01:20:23 am »
Yes, that's a difference of 23% in the time taken to capture a glitch that isn't correlated with the trigger. Not a 23% difference in the dead time, as you keep claiming.
What do you think is the difference in dead time?
I am but an egg
 

Offline onlooker

  • Frequent Contributor
  • **
  • Posts: 395
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #46 on: March 23, 2013, 01:23:50 am »
Quote
Let's say we have two scopes - one has 20 divs and the other has 10 - but in every other respect they are equal.

The problem is that not every other respect can be equal in your example. For example, if both scopes have the same number of samples per waveform, and scope#1's horizontal time scale of the screen equals the time span of all the samples, then scope#2 has only half of its samples "showing". Therefore this can be regarded as an "unfair" comparison, since half of the samples for scope#2 are thrown away and counted towards dead time.

The more "fair" comparison is to have scope#2 double the timebase, so both scopes have the same horizontal time span on the screen (and so cover the same number of samples). This is the same as others have argued: the timebase is just an artifact, except that scope makers discretize it in their own particular ways.

Using this "fair" comparison (with every other respect equal), the dead times for both scopes are the same.
« Last Edit: March 23, 2013, 01:36:31 am by onlooker »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #47 on: March 23, 2013, 01:30:20 am »
The problem is that not every other respect can be equal in your example. For example, if both scopes have the same number of samples per waveform, and scope#1's horizontal time scale of the screen equals the time span of all the samples, then scope#2 has only half of its samples "showing". Therefore this can be regarded as an "unfair" comparison, since half of the samples for scope#2 are thrown away and counted towards dead time.

This is nonsense. I'm talking about a calculation of blind time - it has nothing to do with samples. It is not a "fair" or "unfair" process - it's math. The calculation of a DSO's blind time requires the timebase setting, the number of divisions, and the speed of the waveform capture - with those three variables you can solve either Agilent's or Rohde & Schwarz's blind-time equation. Read the literature.
« Last Edit: March 23, 2013, 01:31:55 am by marmad »
 

Offline marmadTopic starter

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #48 on: March 23, 2013, 01:40:13 am »
Therefore this can be regarded as an "unfair" comparison, since half of the samples for scope#2 are thrown away and counted towards dead time.

BTW, I don't think you quite understand how DSOs work if you think they accomplish decimation by first capturing samples and then throwing them away.  ;)
 

Offline onlooker

  • Frequent Contributor
  • **
  • Posts: 395
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #49 on: March 23, 2013, 01:49:56 am »
Quote
This is nonsense. I'm talking about a calculation of blind time

I guess correct math also needs to be applied correctly to make sense.

1) The correct time scale that needs to be the same is the full-screen time span, not the timebase.

2) We are talking about hunting glitches, right? Assuming scope#2 is on a 40ns timebase and everything else is equal (waveform rates, sample rates, total number of samples, software that processes the samples and shows the glitches...), can you explain why scope#2 has a lower probability of seeing the glitches than scope#1?

3) Again, the timebase is an artifact. For a 10-div screen, I can place my own mask that has 20 divs and relabel all the timebase numbers at half of the original values. That way I've just turned your scope#2 into scope#1 (assuming every other aspect is the same), right?
« Last Edit: March 23, 2013, 02:03:07 am by onlooker »
 

