Author Topic: Some Important Points to Remember when Evaluating Waveform Update Rates


Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
These issues came up recently in another thread, but since more and more potential oscilloscope buyers are considering (or concerned about) waveform update rates - and more and more DSOs are appearing with faster and faster specified update rates (thanks to Agilent's introduction of the InfiniiVision X-series) - it seemed worth starting a thread to mention a couple of things, and to offer an easy-to-find location for questions and discussion of the topic.

The two points (from the other thread) which I wanted to reiterate for members new to the discussion:

1) This seems rather obvious but perhaps isn't clear to some: the published maximum waveforms-per-second update rate is just that - a maximum - in other words, a best-case scenario with a single channel running, the lowest sample-size setting, no measurements or other firmware subroutines requiring overhead, and one (or a few) particular timebase setting(s). This maximum rate will then drop to something lower if you do ALMOST anything - which could include something as simple as pressing a panel button to bring up a menu.

Edit: I edited the following after discussion later in the thread prompted by alm - dispensing with the chart and trying to condense the pertinent information.

2) When considering blind times or average glitch-capture times of DSOs, one should factor in both the waveforms-per-second update rate and the number of horizontal divisions that the DSO displays, since both affect the time captured per second (rate × window size). Given two DSOs with equivalent wfrm/s rates at the same timebase, the DSO that displays more divisions captures more signal time per acquisition - and so needs a correspondingly shorter average time to capture a repeating fault. A quick sketch of that arithmetic follows below.
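To make the rate × window arithmetic concrete, here is a minimal Python sketch (my own illustration - the function name and the example numbers are invented, not taken from any vendor's documentation):

Code: [Select]
def capture_stats(wfm_per_s, s_per_div, divisions):
    """Fraction of real time a DSO actually captures, and its blind time."""
    window = s_per_div * divisions      # time captured per acquisition
    captured = wfm_per_s * window       # fraction of each second captured
    return captured, 1.0 - captured

# Two scopes with the same update rate at the same timebase setting:
for divs in (10, 14):
    cap, blind = capture_stats(50_000, 10e-9, divs)
    print(f"{divs} divs: captured {cap:.2%}, blind {blind:.2%}")

At 50,000 wfrm/s and 10 ns/div, the 10-division scope captures 0.50% of real time while the 14-division scope captures 0.70% - the same spec-sheet number, but a different amount of signal actually seen.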
« Last Edit: March 24, 2013, 12:38:29 am by marmad »
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
I worried about this when I was purchasing my oscilloscope, as it seems to be given a lot of prominence compared to things like bandwidth and number of channels, which I felt were more important. I ended up buying a LeCroy WaveJet which only has an update rate of 3,600 wfm/s (from the spec sheet - I've not been able to measure it yet), but it does have 350 MHz bandwidth and 4 channels, and it was at a very reduced price because it was 6 years old (though never used - it had been sitting in stock for a long time).

I'd like to gain a better understanding of the practical implications of having a refresh rate of only 3,600 (or 1,000 according to Agilent's app note comparing their scope to the WaveJet, though I suspect they didn't set the WaveJet to its optimal settings). To a certain extent it is a game of chance: unless your update rate is up at the 1,000,000 mark like the Agilent 3000 X-series, the percentage of blind time is still very large, so you're going to miss glitches anyway. (If you buy 20 lottery tickets instead of 1 you're more likely to win, but it's still very unlikely that you'll do so.)

In Agilent's app note on glitches, when testing the WaveJet (and others) they set the memory depth to give the maximum sample rate, which means that a lot of sample points that never appear on screen are being stored.

If I were looking for glitches I'd set the memory depth to 500 points to match the screen and maximise the update rate - if you see a glitch, you then know that glitches exist and what trigger condition to set to recapture one at maximum sample rate. In the app note, at 1 µs/division the WaveJet's rate drops to 625, but at 1 GS/s the number of samples captured per screen is 10,000, i.e. 20 screens' worth at 500 points each. Setting the depth down to 500 might give something like the stated rate of 3,600 - still only a tenth of Agilent's, but more respectable, and equivalent to capturing about 3.6% of real time. With the blind time being 96.4%, and events happening say 10 times a second over a period of 5 seconds, the chance of missing the event is only about 16%, so the WaveJet owner would have an 84% chance of spotting the glitch. The Agilent owner of course would do better: at 35,000 wfm/s, or 35% capture, his chance of missing the event is more or less zero.
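That arithmetic is easy to check with a couple of lines of Python (a quick sketch using the numbers above - the 3.6% capture duty and 10 events/s are the assumptions from this paragraph):

Code: [Select]
# Probability of missing every occurrence of a repeating glitch, assuming
# each occurrence is caught independently with probability = capture duty.
duty = 3600 * 10e-6      # 3,600 wfm/s x 10 us screen -> 3.6% of time captured
events = 10 * 5          # 10 glitches/s over a 5 s watch -> 50 chances
p_miss = (1 - duty) ** events
print(f"miss: {p_miss:.0%}, catch: {1 - p_miss:.0%}")   # miss: 16%, catch: 84%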

By the way, your scaled figures for the Rigol probably should be slightly adjusted. At present you have an update rate of ~3 at 50 ms/div, where the screen takes half a second to draw, and ~6 at 20 ms/div, where the screen takes a fifth of a second to draw! :)
« Last Edit: March 21, 2013, 11:58:35 am by jpb »
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
I worried about this when I was purchasing my oscilloscope, as it seems to be given a lot of prominence compared to things like bandwidth and number of channels, which I felt were more important.
That's true. And I suspect it probably has to do with Agilent's marketing push when they released the InfiniiVision X-series. OTOH, it's not a fabricated specification - and it's a feature (perhaps like anti-lock brakes on a car) which, if you don't have it, you might never even realize would have saved you a lot of trouble.

If I were looking for glitches I'd set the memory depth to 500 points to match the screen and maximise the update rate - if you see a glitch, you then know that glitches exist and what trigger condition to set to recapture one at maximum sample rate. In the app note, at 1 µs/division the WaveJet's rate drops to 625, but at 1 GS/s the number of samples captured per screen is 10,000, i.e. 20 screens' worth at 500 points each. Setting the depth down to 500 might give something like the stated rate of 3,600 - still only a tenth of Agilent's, but more respectable, and equivalent to capturing about 3.6% of real time. With the blind time being 96.4%, and events happening say 10 times a second over a period of 5 seconds, the chance of missing the event is only about 16%, so the WaveJet owner would have an 84% chance of spotting the glitch. The Agilent owner of course would do better: at 35,000 wfm/s, or 35% capture, his chance of missing the event is more or less zero.
Perhaps the most sensible way to think about it is the method outlined in this Rohde & Schwarz paper: it's really all about probabilities, since we're talking about random events. Using your specifications (and the R&S equations), the average test time (at the 1 µs scale) for your DSO to catch a repeating signal fault occurring 10 times per second (to a probability of 99.9%) is ~31.98 minutes. Again - this is about probabilities - your DSO might catch the glitch in 1 minute, but to be 99.9% sure of seeing a glitch that repeats 10 times per second, you would have to give it ~32 minutes.

Looking at the chart below (from the above-linked paper, showing test times needed to catch a 10/s glitch with a probability of 99.9%), my feeling is that, in terms of glitch-hunting, anything below 1k wfm/s would probably be severely problematic; 1k - 10k would be difficult but feasible; and anything over 10k would be optimal.

Acquisition rate      Test time
      100 wfm/s       19 h : 11 min : 08 s
    1,000 wfm/s        1 h : 55 min : 10 s
   10,000 wfm/s             11 min : 31 s
  100,000 wfm/s              1 min : 09 s
1,000,000 wfm/s                       7 s
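Those times follow from a simple probability model. Here is a Python sketch that reproduces them to within a few seconds; note that the 100 ns acquisition window is my inference from the paper's figures (e.g. 10 ns/div × 10 divisions), so treat the function as an illustration rather than the paper's actual method:

Code: [Select]
import math

def test_time(wfm_per_s, glitch_per_s=10.0, window_s=100e-9, confidence=0.999):
    duty = wfm_per_s * window_s    # fraction of real time captured
    # Chance of missing every glitch for T seconds is (1 - duty)^(glitch_per_s * T);
    # solve (1 - duty)^(glitch_per_s * T) = 1 - confidence for T.
    return math.log(1.0 - confidence) / (glitch_per_s * math.log(1.0 - duty))

for rate in (100, 1_000, 10_000, 100_000, 1_000_000):
    t = int(round(test_time(rate)))
    print(f"{rate:>9,} wfm/s -> {t // 3600}h {(t % 3600) // 60:02d}m {t % 60:02d}s")

The same function with the WaveJet's 3,600 wfm/s gives test_time(3_600) / 60 ≈ 31.97 minutes - the figure quoted above.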

Quote
By the way, your scaled figures for the Rigol probably should be slightly adjusted. At present you have an update rate of ~3 at 50mS where the screen takes half a second to draw and ~6 at 20mS where the screen takes a fifth of a second to draw! :)
Ha, ha... yes, nicely spotted. I just ran the math on the old numbers while rounding - forgetting about the actual timebase scale of the screen  ;D  Of course, at those slower speeds all DSOs are more or less equivalent. I'll correct that now.  ;)
« Last Edit: March 21, 2013, 04:23:12 pm by marmad »
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de
Hi,

A few years ago a professional article appeared in the German magazine "elektronik industrie", written by Agilent engineers.
It laid out the relation between the probability of seeing a rare outlier and the necessary observation time, depending on the wfm/s rate.

Later on I found an Agilent paper where these details are explained well, and which was the basis for the article in the magazine.
I've attached it to this comment.

I did a few attempts last week with the DS4012 and a counter connected to the trigger output. I could reach about 105,000 wfm/s at most.


Rgds
Gunb
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Hi Gunb,

Later on I found an Agilent paper where these details are explained well, and which was the basis for the article in the magazine.
I've attached it to this comment.

Thanks - yes, this is the paper most often quoted - and from which I took the Agilent figures listed above.

I did a few attempts last week with the DS4012 and a counter connected to the trigger output. I could reach about 105,000 wfm/s at most.

BTW, since your Rigol uses a 14-division screen (like mine), your normalized 10-division maximum speed (to compare it to the other scopes listed in the Agilent paper) would actually be 147,000 wfrm/s (105,000 × 14 / 10).
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Acquisition rate      Test time
      100 wfm/s       19 h : 11 min : 08 s
    1,000 wfm/s        1 h : 55 min : 10 s
   10,000 wfm/s             11 min : 31 s
  100,000 wfm/s              1 min : 09 s
1,000,000 wfm/s                       7 s



With those times I'd say only the 1,000,000 case is feasible for manual probing. For all other cases you're going to have to leave the scope connected and go away and do something else, it then comes down to how long you leave it.

I've been wondering whether ETS might be used to increase the probability of catching glitches. For example, the WaveJet has a 100 GS/s mode which presumably combines 100 trigger points, so that each screen is an amalgam of 100 screens - and since it doesn't have to display the 100 screens separately, it might refresh them a lot quicker.

When I eventually get myself an AWG I'll be able to experiment.

 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
With those times I'd say only the 1,000,000 case is feasible for manual probing. For all other cases you're going to have to leave the scope connected and go away and do something else, it then comes down to how long you leave it.

Well, a lot hinges on whether you're just checking for faults or actively trying to find one. Certainly the 1M rate is the one to have for people whose livelihood depends on seeing a potential problem right at the outset. Using infinite persistence, segmented capture, etc. can help the majority of us, who need to locate a glitch with a 1k - 100k rate DSO within a reasonable frame of time.
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
I've been wondering whether ETS might be used to increase the probability of catching glitches. For example, the WaveJet has a 100 GS/s mode which presumably combines 100 trigger points, so that each screen is an amalgam of 100 screens - and since it doesn't have to display the 100 screens separately, it might refresh them a lot quicker.

I don't think ETS would 'see' a glitch, since a glitch isn't a repetitive signal - hence it couldn't be reconstructed from samples taken over many cycles.
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de

With those times I'd say only the 1,000,000 case is feasible for manual probing. For all other cases you're going to have to leave the scope connected and go away and do something else, it then comes down to how long you leave it.

I've been wondering whether ETS might be used to increase the probability of catching glitches. For example, the WaveJet has a 100 GS/s mode which presumably combines 100 trigger points, so that each screen is an amalgam of 100 screens - and since it doesn't have to display the 100 screens separately, it might refresh them a lot quicker.

When I eventually get myself an AWG I'll be able to experiment.

ETS uses several periods of a signal to assemble a new record from them. The higher sample rate is achieved mathematically, so I would say it does not affect the wfm/s rate. The dead time between two "screenshots" is still there, and if the glitch occurs between two subsequent screens, you won't see it either. You will still have to wait. Or am I wrong?

For slower wfm/s, turn on persistence and have a coffee - the lower the wfm/s, the more coffees you will drink. Advantage: you can't fall asleep while staring at the screen  :)

The other question is whether a higher wfm/s rate is needed for your application at all.

 

Offline robrenz

  • Super Contributor
  • ***
  • Posts: 3035
  • Country: us
  • Real Machinist, Wannabe EE
Most of the instrument specification things we fret over as hobbyists we will never use. We want the latest and greatest gear to feel good, or just in case we ever build something - so we end up permanently building a lab in preparation for projects we have not even defined yet.

But this is a good thread :-+

Online nctnico

  • Super Contributor
  • ***
  • Posts: 27565
  • Country: nl
    • NCT Developments
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #10 on: March 21, 2013, 06:15:54 pm »
I agree. The waveform update rate is only of use in persistence mode. Now everyone raise their hand if they used persistence mode for a real measurement in the last year.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline jpb

  • Super Contributor
  • ***
  • Posts: 1771
  • Country: gb
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #11 on: March 21, 2013, 06:34:31 pm »
My understanding of equivalent time sampling is that it takes samples at the normal sample rate (say 1 GS/s) and repeats this over several trigger events, but each time with a small and random time offset.

Thus the points are interleaved.

If the glitch is big enough to be visible at the 1GS/s rate it will show up in some of the interleaved points even though it is missing for many of the triggers.
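As a toy illustration of the interleaving (a pure simulation - the rates and counts are invented for the example, not taken from the WaveJet):

Code: [Select]
import numpy as np

rng = np.random.default_rng(0)
f_sig = 50e6        # repetitive 50 MHz signal
fs = 1e9            # real-time sample rate, 1 GS/s
n_samples = 20      # samples per acquisition (20 ns window)
n_triggers = 100    # acquisitions to interleave

# Each trigger lands with a random sub-sample offset; keeping the offsets
# lets the sub-records be merged onto a much finer time grid.
t_all, v_all = [], []
for _ in range(n_triggers):
    offset = rng.uniform(0, 1 / fs)
    t = offset + np.arange(n_samples) / fs
    t_all.append(t)
    v_all.append(np.sin(2 * np.pi * f_sig * t))

t_all = np.concatenate(t_all)
v_all = np.concatenate(v_all)
order = np.argsort(t_all)   # interleave: sort the composite record by time
# t_all[order], v_all[order] now form an effective ~100 GS/s record
print(f"mean effective spacing: {np.mean(np.diff(t_all[order])) * 1e12:.1f} ps")

A glitch present in one trigger's sub-record would contribute its handful of real-time points to the merged display, which is the effect described above.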

Of course the ETS mode might be much slower anyway so any gain in merging 100 screens in one would be lost in update time.

It is just an extra avenue for exploration.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #12 on: March 21, 2013, 07:37:13 pm »
Most of the instrument specification things we fret over as hobbyists we will never use. We want the latest and greatest gear to feel good, or just in case we ever build something - so we end up permanently building a lab in preparation for projects we have not even defined yet.

You're probably right. Personally, I think the whole wfm/s thing is way overblown; at the end of the day it seems to be becoming what 'GHz' numbers were in the days of the dreadful Pentium 4. In my work I can catch most random small glitches with a flexible set of advanced triggers, or with large acquisition memories and a search function. Higher waveform rates are of course welcome, but at least for me it would not be a critical parameter when shopping for a work scope.

I guess a very high wfm rate may be good for people that are used to analog scopes, and use a DSO like an analog scope.
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #13 on: March 21, 2013, 07:45:50 pm »
I agree. The waveform update rate is only of use in persistence mode. Now everyone raise their hand if they used persistence mode for a real measurement in the last year.

Hmm... either you're not understanding waveform update rates, or I don't understand what you're saying. If you use a DSO at any horizontal timebase setting < 1 ms, then you're 'dealing' with your scope's waveform update rate whether you realize it or not. Maybe instead of speaking in terms of waveform update rates we should speak in terms of blind time. Digital scopes - by their very nature - become more and more blind to the waveform data as the timebase setting gets faster. Even the Agilent 3000 X-series, with its 1M wfrm/s rate, is blind to 98% of the possible waveform at the 2 ns timebase setting, as the quick check below shows.
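A quick check of that 98% figure (the same duty-cycle arithmetic as the sketch earlier in the thread; the 10-division display is an assumption):

Code: [Select]
# 1,000,000 wfm/s at 2 ns/div on an assumed 10-division display:
captured = 1_000_000 * (2e-9 * 10)   # fraction of real time captured = 0.02
print(f"captured {captured:.0%}, blind {1 - captured:.0%}")   # captured 2%, blind 98%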
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #14 on: March 21, 2013, 07:49:14 pm »
I guess a very high wfm rate may be good for people that are used to analog scopes, and use a DSO like an analog scope.

Perhaps the point is that, as wfrm/s rates get higher, it allows people to use the DSO more like an analog scope - if that's how they'd like to use it.
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #15 on: March 21, 2013, 08:28:13 pm »
Most of the instrument specification things we fret over as hobbyists we will never use. We want the latest and greatest gear to feel good, or just in case we ever build something - so we end up permanently building a lab in preparation for projects we have not even defined yet.

So you're assuming that most people in this forum either never build anything themselves or don't do any paid electronics work. Wow - I thought the exact opposite ;D - but I could certainly be wrong.

Well, I do electronics work for money (and build my own things  ;) ), but honestly, most of the paid work I do is in the audio field and could be handled with a relatively low-bandwidth analog scope - which is what I've been using for years. But when I decided to get a low-cost DSO to add to my lab for my own work, having tried a few of them over the last 2 years (to varying degrees of frustration), I decided to bite the bullet and spend a bit more for something with a higher update rate (and some other features I wanted).

And now, after having used it for a few months and experienced the benefits - not just less blind time, but also things like a snappier response at lower timebase settings and intensity grading of the waveform - all I can say is that I would never want to go back to a sub-1k wfrm/s DSO again. And I think most people who have used them would agree - even though we know that the vast majority of the time we use a DSO, we could make do with a slower one.
 

Offline Gunb

  • Regular Contributor
  • *
  • Posts: 221
  • Country: de
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #16 on: March 21, 2013, 08:28:55 pm »
I agree. The waveform update rate is only of use in persistence mode. Now everyone raise their hand if they used persistence mode for a real measurement in the last year.

Yes and no.

Correct, if searching for rare events. If persistence mode is on and the glitch is captured, the next screen will not erase the previous one, so it remains visible to the human eye.

No, if jitter is to be investigated. Scopes with higher wfm/s rates make it easier to judge how strong a signal's jitter is without persistence mode active. I had that case again yesterday, where the faster Rigol revealed more detail than the slower HMO. In the worst case the HMO drew its next screen just when the jittering edge had returned to where it started - so the signal looked very clean. The Rigol's higher wfm/s revealed a jittering edge.
« Last Edit: March 21, 2013, 08:32:41 pm by Gunb »
 

Offline PA4TIM

  • Super Contributor
  • ***
  • Posts: 1164
  • Country: nl
  • instruments are like rabbits, they multiply fast
    • PA4TIMs shelter for orphan measurement stuff
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #17 on: March 21, 2013, 08:39:23 pm »
Chapters 5 and 6 are interesting: http://w140.com/Handbook_of_Oscilloscope_Technology.pdf (I don't know if this is part 2 - I just picked a link, but there is an old and a newer version.) No commercial blah blah, just how it works. It is a bit dated, but hey, sampling scopes are too; the DSO is still a baby sampling scope just starting to mature a bit, and they still have a long way to go, e.g. in the field of signal integrity.

After reading it you begin to wonder why we bother so much with making DSOs anything other than an easy tool that anyone can use (and at a huge decrease in price), despite all the disadvantages - disadvantages most users will never know or experience, because all they care about is digital stuff, and for that an analog (sampling) scope is rather useless.

But I will not discuss the pros and cons of analog sampling scopes - I just wanted to share the link. (I have 5 analog sampling scopes and a nice DSO that I like very much, so I'm not anti-DSO.) And I measure only analog and RF signals, so I'm not the average user.

Interesting links in this topic.


www.pa4tim.nl my collection measurement gear and experiments Also lots of info about network analyse
www.schneiderelectronicsrepair.nl  repair of test and calibration equipment
https://www.youtube.com/user/pa4tim my youtube channel
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #18 on: March 21, 2013, 08:54:53 pm »
Chapters 5 and 6 are interesting: http://w140.com/Handbook_of_Oscilloscope_Technology.pdf (I don't know if this is part 2 - I just picked a link, but there is an old and a newer version.) No commercial blah blah, just how it works. It is a bit dated, but hey, sampling scopes are too; the DSO is still a baby sampling scope just starting to mature a bit, and they still have a long way to go, e.g. in the field of signal integrity.
Interesting stuff - thanks for posting this. :) I enjoyed reading the history part, although I had to chuckle a bit when I read the following - a clear sign of how fast technology is moving:

"A modern analog scope like the Tektronix 2467 easily displays a signal 500,000 times per second, even the oldest analog scopes achieved 100,000 times. Except for the Tektronix „DPO’s“ all DSOs capture the signal only some ten to some hundred times per second, several orders of magnitude less!"
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #19 on: March 21, 2013, 09:02:36 pm »
Perhaps the point is that, as wfrm/s rates get higher, it allows people to use the DSO more like an analog scope - if that's how they'd like to use it.

Sure, and if this is what they want, why not? But these users are probably more or less the only type of users who may actually benefit from a very high wfm update rate.

It's probably not of much use for users that use a DSO as what it is: a very versatile signal analyzer.
 

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #20 on: March 21, 2013, 09:18:09 pm »
Sure, and if this is what they want, why not? But these users are probably more or less the only type of users who may actually benefit from a very high wfm update rate.

Your statement doesn't make sense. How can any user fail to benefit from a DSO with less blind time? The blind time is an inherent fault of digital sampling oscilloscopes - not an attribute. That's like saying eyeglasses only benefit people who would like to read books.
 

Offline robrenz

  • Super Contributor
  • ***
  • Posts: 3035
  • Country: us
  • Real Machinist, Wannabe EE
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #21 on: March 21, 2013, 09:20:26 pm »
Most of the instrument specification things we fret over as hobbyists we will never use. We want the latest and greatest gear to feel good, or just in case we ever build something - so we end up permanently building a lab in preparation for projects we have not even defined yet.

So you're assuming that most people in this forum either never build anything themselves or don't do any paid electronics work. Wow - I thought the exact opposite ;D - but I could certainly be wrong.

Well, I do electronics work for money (and build my own things  ;) ), but honestly, most of the paid work I do is in the audio field and could be handled with a relatively low-bandwidth analog scope - which is what I've been using for years. But when I decided to get a low-cost DSO to add to my lab for my own work, having tried a few of them over the last 2 years (to varying degrees of frustration), I decided to bite the bullet and spend a bit more for something with a higher update rate (and some other features I wanted).

And now, after having used it for a few months and experienced the benefits - not just less blind time, but also things like a snappier response at lower timebase settings and intensity grading of the waveform - all I can say is that I would never want to go back to a sub-1k wfrm/s DSO again. And I think most people who have used them would agree - even though we know that the vast majority of the time we use a DSO, we could make do with a slower one.

I like good equipment, and I even buy a lot of things I can't justify for any reason other than that things are more enjoyable with equipment that works exceptionally well. Do I need a JBC soldering station? No, but I don't want to solder with anything else now. Do I need an 8846A bench meter? No, but I would not give it up either. So I am with you on nice equipment being great to work with. And thanks to Dave I have bought lots of it.

I also don't mean that this thread - or any thread that expounds on the fine details of specifications, instrument behavior, and their impact on measurement quality - is not valuable. They are actually my favorite type of thread.

My point was that, myself included, it is easy (for some of us) to get caught up in outfitting the lab with the latest and greatest, and stressing over which features make one instrument better or more useful than another, just because we saw an interesting video about it - when in reality we have no actual current need for, or experience with, that piece of equipment. We just want to be prepared in case that piece of equipment is ever needed, and we want an awesome lab. I think this forum is great, and I am having a lot of fun getting back into electronics, but it feeds this obsessive part of our human nature. I know there are many who do this for a living, and these performance issues have a real impact on how easy it is for them to do their job. But I think there are many who fit my lab-builder description, and there is nothing wrong with that - we all have to start somewhere.

Your contribution to this forum is excellent, and this thread is just another example. I didn't mean it was a waste of time to discuss this topic.

alm

  • Guest
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #22 on: March 21, 2013, 09:27:38 pm »
2) The second point is a little more subtle - but evident when pointed out: when considering the number of waveforms per second that the DSO can capture and display, one has to factor in the number of horizontal divisions that the DSO displays. The 'true' speed of the DSO is the wfrm/s rate * the number of divisions. For example, a DSO which captures 10 divisions of 10ns @49,000 times per second has exactly the same blind time as a DSO which captures 14 divisions of 10ns @ 35,000 times per second. So, when comparing published rates between scopes you have to factor in the divisional display.

I disagree. Say we have two scopes, one 10 divs wide, the other 20 divs wide. One is set to 20 ns/div and does 50k waveforms, and the other does 50k waveforms at 10 ns/div. The blind time will be identical, but according to you the waveform update rate for the latter is higher? The first scope just compresses the same signal in less pixels, which has nothing to do with blind time.

Whether slower sweep speeds (or more divs) help catching rare events depends on your signal, however. If the signal period is much longer than your sweep speed (because you need to be at the highest sweep speed to see whatever detail you were looking for in the rising edge), the extra horizontal resolution only improves the amount of detail you see per acquisition, it does not increase the number of trigger events per second. This means you're looking at less rising edges than the scope with less horizontal resolution and faster waveform update rates. You can only trade horizontal resolution for update rate if the signal period is about the same as your sweep speed.

Claiming that the scope with more horizontal resolution has a faster 'true waveform update rate' is misleading, both should be stated as independent facts.
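To put numbers on that trade-off, here is a small sketch (a hypothetical helper of my own; it assumes the scope re-arms instantly and that every acquisition is anchored on one trigger edge):

Code: [Select]
def edges_seen_per_s(wfm_per_s, window_s, signal_period_s):
    # Each acquisition always contains its trigger edge; additional edges
    # fit only if the capture window spans further signal periods.
    edges_per_acq = max(1, int(window_s / signal_period_s))
    return wfm_per_s * edges_per_acq

# Slow signal (1 kHz): the 200 ns window is far shorter than the period,
# so a wider display buys detail, not more inspected edges.
print(edges_seen_per_s(50_000, 200e-9, 1e-3))     # -> 50000 edges/s

# Fast signal (10 MHz): the same window spans two periods, so the extra
# width really does mean more edges inspected per second.
print(edges_seen_per_s(50_000, 200e-9, 100e-9))   # -> 100000 edges/s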
 

Offline AndyC_772

  • Super Contributor
  • ***
  • Posts: 4263
  • Country: gb
  • Professional design engineer
    • Cawte Engineering | Reliable Electronics
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #23 on: March 21, 2013, 09:48:03 pm »
My point was that, myself included, it is easy (for some of us) to get caught up in outfitting the lab with the latest and greatest, and stressing over which features make one instrument better or more useful than another, just because we saw an interesting video about it - when in reality we have no actual current need for, or experience with, that piece of equipment. We just want to be prepared in case that piece of equipment is ever needed, and we want an awesome lab. I think this forum is great, and I am having a lot of fun getting back into electronics, but it feeds this obsessive part of our human nature. I know there are many who do this for a living, and these performance issues have a real impact on how easy it is for them to do their job. But I think there are many who fit my lab-builder description, and there is nothing wrong with that - we all have to start somewhere.

+1

The sad fact is that, in a professional context, it's the engineers who recognise and understand the need for the right, good quality equipment to get the job done - but they're not the ones who get to sign the purchase orders. That job falls to people who won't use the equipment, who don't really understand why it's necessary, and for whom it's just a capital expense that has to be weighed against the revenue which will result.

It's hard to explain to someone with no electronics expertise why I wouldn't want an Atten or a Siglent anywhere near the lab, and why they should at least fork out for TTI if they won't buy me an Agilent or Tektronix. Mid-range kit should at least get the job done, though it will undoubtedly have interesting 'quirks' or 'character' which I'll discover down the line.

But when it comes to my own lab, I control the purse strings - and having equipment that's a pleasure to use is a worthwhile objective in its own right. Moreover, since I work for many different customers on many different kinds of product, I never quite know what the next job will require and so it pays to be prepared. So far, every item I've picked up because I figured it might come in handy, has done.

That's my excuse and I'm sticking to it :)

Offline marmad (Topic starter)

  • Super Contributor
  • ***
  • Posts: 2979
  • Country: aq
    • DaysAlive
Re: Some Important Points to Remember when Evaluating Waveform Update Rates
« Reply #24 on: March 21, 2013, 09:52:47 pm »
I like good equipment, and I even buy a lot of things I can't justify for any reason other than that things are more enjoyable with equipment that works exceptionally well. Do I need a JBC soldering station? No, but I don't want to solder with anything else now. Do I need an 8846A bench meter? No, but I would not give it up either. So I am with you on nice equipment being great to work with. And thanks to Dave I have bought lots of it.

Point taken - and very true. Once you use a nice piece of equipment of ANY sort (even hand tools) you don't want to go back  :)

Quote
My point was that myself included, it is easy (for some of us) to get caught up in focusing on outfitting the lab with the latest and greatest and stressing over which features make one instrument better or more usefull than another just because we saw a interesting video about it. When in reality we have no actual current need or experience with that piece of equipment. We just want to be prepared in case that piece of equipment is ever needed and we want an awesome lab.  I think this forum is great and I am having a lot of fun getting back into electronics but it feeds this obssesive part of our human nature.  I know there are many who do this for a living and these performance issues have a real impact on how easy it is for them to do thier job. But I think there are many who fit my lab builder description and there is nothing wrong with that, we all have to start somewhere.

Of course - and sorry, I wasn't implying there was anything wrong in any way with pursuing electronics as a hobby - or even with just outfitting a lab - because you might just love to own and tinker with lab equipment (I sure do :) ). And I'm fully aware of the compulsion to overspend on gear when you might not actually need it - my weakness in that respect has been more with computer stuff than lab equipment (since I tend to do slightly more programming than electronics) - so I have certainly spent tens of thousands of dollars I didn't necessarily need to over the last few decades.  :P
 

