Author Topic: How often do we need to have the oscilloscope and function generator calibrated?  (Read 12770 times)


Offline eeguy (Topic starter)

  • Regular Contributor
  • *
  • Posts: 181
  • Country: us
Hello, how often should these instruments be calibrated? Before purchase, can I ask the company to offer free annual calibrations for, say, 10 or more years?
 

Offline w2aew

  • Super Contributor
  • ***
  • Posts: 1780
  • Country: us
  • I usTa cuDnt speL enjinere, noW I aR wuN
    • My YouTube Channel
Quote
Hello, how often should these instruments be calibrated? Before purchase, can I ask the company to offer free annual calibrations for, say, 10 or more years?

Most manufacturers recommend an annual calibration cycle (or at least an annual performance verification cycle).  For hobby use, it is totally up to you.  I've got equipment in my home lab that hasn't been calibrated for more than a decade, and their accuracy is still more than sufficient for my use.
YouTube channel: https://www.youtube.com/w2aew
FAE for Tektronix
Technical Coordinator for the ARRL Northern NJ Section
 
The following users thanked this post: jancumps, Falkra

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3168
  • Country: gb
If the scope and generator are used as part of a business then a typical recalibration period would be every year or maybe every two years. This is to demonstrate an acceptable level of quality control.

However, for (the vast majority of) hobby use I don't think it makes much sense to send stuff off for formal calibration. I have never done this with any of my test gear and I've had some items for over 30 years. The annual calibration bill for all of the test gear I have today would be quite scary.

Quote
Before purchase, can I ask the company to offer free annual calibrations for, say 10 or more years?
I think 'big' customers like the company I work for can arrange special deals/schemes for calibration and repair costs, but I'm not sure you will get a free 10-year calibration deal on just two items. Good luck with that :)
« Last Edit: September 23, 2016, 09:39:01 pm by G0HZU »
 

Offline CatalinaWOW

  • Super Contributor
  • ***
  • Posts: 5432
  • Country: us
Businesses base their calibration cycles on risk.  What risk?  Well, if you do a calibration and find the instrument out of specification, then every item you produced since the last calibration is potentially bad.  They would love to reduce calibration (extend calibration intervals), but there are limits to how much product can be "bad".  What counts as "bad" depends on the industry, the warranty, the hazard to life and property from a product not tested properly, and so on.

More sophisticated companies watch the drift during calibration intervals and change the interval based on a prediction of how long the instrument will stay in tolerance, combined with their sensitivity to risk.  Another technique is to have an informal calibration at much more frequent intervals (daily, at shift changes or operator changes, whatever makes sense).  The informal calibration might mean measuring a precision voltage reference or some other quick and simple test appropriate to the test gear.  These informal tests are tracked to guide when a formal calibration is required.
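
As a rough sketch of that drift-tracking idea (all figures made up; a real lab would feed in its own cal-sheet history and risk policy), the interval prediction can be as simple as a straight-line fit in Python:
Code: [Select]
# Predict when an instrument drifts out of tolerance from past cal results.
# All numbers are hypothetical, for illustration only.
import numpy as np

# Deviation of a 10 V reference reading at each annual calibration (volts)
years = np.array([0.0, 1.0, 2.0, 3.0])
deviation = np.array([0.0002, 0.0009, 0.0015, 0.0023])
tolerance = 0.005  # spec limit, volts

# Fit a straight line to the drift and extrapolate to the spec limit
rate, offset = np.polyfit(years, deviation, 1)
years_to_limit = (tolerance - offset) / rate

# Guard band: only use part of the predicted margin, per risk appetite
guard = 0.5
print(f"drift rate: {rate * 1000:.2f} mV/year")
print(f"predicted time to spec limit: {years_to_limit:.1f} years")
print(f"suggested cal interval: {years_to_limit * guard:.1f} years")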

Those doing research may be depending on the last decimal place in the data.  They will base calibration intervals on the accuracy they require for their particular measurement.

Most of us don't ever need a formal, traceable to national standards calibration.
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27886
  • Country: nl
    • NCT Developments
Quote
Hello, how often should these instruments be calibrated? Before purchase, can I ask the company to offer free annual calibrations for, say, 10 or more years?
I don't think you'll find any equipment dealer crazy enough to offer 10 years of free calibrations. You'll have to pay for it, but I can imagine they'll throw in some discount for a multi-year subscription. Either way, you should always be aware that an instrument or cable/probe can break and show funny readings, so you'll probably need ways to check your own equipment more frequently, maybe even before each measurement.

You also have to ask yourself why you want/need your equipment checked and, if necessary, adjusted. Be aware there is some semantic ambiguity about whether 'calibration' includes adjusting; test houses interpret 'calibration' as 'checking'. However, Webster's English dictionary says the word 'calibration' means checking AND adjusting.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline bitseeker

  • Super Contributor
  • ***
  • Posts: 9057
  • Country: us
  • Lots of engineer-tweakable parts inside!
@eeguy: What is your use case? Are you working on things where the absolute value of quantities that you're measuring must match a certified standard?

If so, then you calibrate at the manufacturer recommended interval or an interval that your business has determined is necessary. That interval could be significantly different for each instrument, depending on what it's used for (e.g., a power supply used for production verification would be more critical than one for powering an Arduino you're tinkering with).

If not, then you'd probably only calibrate things when you repair them or find that they've drifted beyond a reasonable amount of error. Even then, you might only calibrate a minimum number of instruments that you could then refer to in order to adjust your other equipment yourself (e.g., a calibrated DMM could be used to adjust the amplitude of your function generator's output).

Of course, you could opt not to have anything calibrated externally, instead choosing specific instruments in your lab to be your "standards" and adjusting everything relative to them.

It depends on your or your business' needs.

And do take note of nctnico's caveat about what "calibration" means. Ask exactly what will be done before sending your equipment in so you're not surprised later.
TEA is the way. | TEA Time channel
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Quote
Be aware there is some semantic ambiguity about whether 'calibration' includes adjusting; test houses interpret 'calibration' as 'checking'. However, Webster's English dictionary says the word 'calibration' means checking AND adjusting.

Are you sure?
http://www.merriam-webster.com/dictionary/calibrate

"calibrate, calibrated [...]:  to standardize (as a measuring instrument) by determining the deviation from a standard so as to ascertain the proper correction factors"

Oxford Dictionaries has a similar definition of 'calibrate':
https://en.oxforddictionaries.com/definition/calibrate

"Correlate the readings of (an instrument) with those of a standard in order to check the instrument's accuracy."

This also conforms to my understanding that calibration is only the determination of the amount of deviation from a given standard, not the correction. It's also my experience that calibration does not include adjustment (although some labs may offer all-inclusive packages).
« Last Edit: September 23, 2016, 11:51:20 pm by Wuerstchenhund »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27886
  • Country: nl
    • NCT Developments
@Wuerstchenhund: you are quoting very selectively from the links you provided because the other descriptions also include adjusting  :box:
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9933
  • Country: us
Just a single price point, but $185 to calibrate a LeCroy scope:
https://www.custom-cal.com/TypeInfo.aspx?kn=147&srv=Oscilloscope_Calibration_Repair

Repair is extra.



 

Offline skipjackrc4

  • Regular Contributor
  • *
  • Posts: 239
  • Country: us
Typically, I've found that if equipment is out of tolerance during calibration, the cal lab will ask me whether I want them to bring it into tolerance or accept it with a "limited calibration".  A limited calibration means that the specifications have been relaxed to accommodate the drift in performance.  Sometimes a limited cal is sufficient, sometimes it is not.
 

Offline julian1

  • Frequent Contributor
  • **
  • Posts: 769
  • Country: au
Aren't most fast scopes using 8-bit ADCs? What would calibration involve - adjusting/checking the front-end analog filters?
 

Online Kleinstein

  • Super Contributor
  • ***
  • Posts: 14749
  • Country: de
For a scope, calibration usually means checking the scale factors for timing and amplitude. In addition, there may be a check that the upper frequency limit is still OK and that a square wave still shows nice, clean edges, to verify the compensation of the internal dividers.

Normally a scope is not really used for accurate measurements. A reference good enough for 8-bit, maybe 12-bit, resolution is neither that difficult nor failure-prone. Timing usually comes from a crystal oscillator, so no major drift is expected.

So unless the scope is used in a critical application, calibration is not that important. A cruder check can be done yourself: if the scope and generator agree on frequency and amplitude, it is usually good enough. For frequency, there are a few radio signals available to do a precision check against if needed.
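
For illustration, a cross-check like that can even be automated over the bus. This is only a sketch: the VISA addresses and SCPI commands below are hypothetical examples and differ per instrument model, so consult the programming manuals of your own gear:
Code: [Select]
# Sanity check: drive the scope from the generator and see whether the
# two instruments agree on frequency and amplitude. Addresses and SCPI
# commands are examples only and vary between instruments.
import pyvisa

rm = pyvisa.ResourceManager()
gen = rm.open_resource("USB0::0x0957::0x0407::MY44012345::INSTR")    # hypothetical
scope = rm.open_resource("USB0::0x0957::0x1796::MY50001234::INSTR")  # hypothetical

gen.write("APPL:SIN 1000, 1.0")  # 1 kHz sine, 1 Vpp (33120A-style syntax)
freq = float(scope.query("MEAS:FREQ? CHAN1"))
vpp = float(scope.query("MEAS:VPP? CHAN1"))

# A few percent agreement is plenty for a sanity check on an 8-bit scope
freq_ok = abs(freq - 1000.0) / 1000.0 < 0.01
vpp_ok = abs(vpp - 1.0) / 1.0 < 0.05
print(f"freq {freq:.1f} Hz ok={freq_ok}, Vpp {vpp:.3f} V ok={vpp_ok}")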

As for compensation, one usually checks probe compensation at the scope regularly anyway; this also serves as a first check of the scope itself.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Quote
@Wuerstchenhund: you are quoting very selectively from the links you provided because the other descriptions also include adjusting  :box:

The "adjusting" part in the other definitions refer to different kinds of 'adjustment, i.e. glass thermometers (where 'calibration' means identifying a thermometer's properties and putting the temperature markers on the tube), or the process of taking into account deviations from a known standard as an external process (i.e. taking test results and applying corrections post-experiment). The latter process is also applied to test gear.

Let me give you an example: say you were given a voltmeter in unknown condition. You try it out and it works. Then you connect it to a voltage reference, and you notice that it consistently shows a figure that is 0.5 V too high, a figure that is also outside the voltmeter's specifications. But other than that the voltmeter works fine, so you use it to do some measurements later on, which are now correct because you know that you have to subtract 0.5 V from the indicated value. The check against the voltage reference was the 'calibration', and the mental subtraction of 0.5 V from every reading is the 'adjustment' you apply based on the calibration results.

That is why, after calibration, your lab doesn't just give you a pass/fail list, but a list showing the exact deviation of the instrument for each required operating mode. That list is there so you can 'adjust' your measurement results accordingly, as in the example above. Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make. Scientists, on the other hand, tend to use the calibration table to perform 'adjustments' to their measurements. Different worlds, I guess.
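
Written out as code, that 'adjustment' step is nothing more than this (the cal-sheet values here are invented for illustration; the 10V entry matches the voltmeter example above):
Code: [Select]
# Apply the known deviations from a (hypothetical) cal sheet to raw readings.
cal_sheet = {       # range -> deviation measured at calibration (volts)
    "10V": +0.5,    # this meter reads 0.5 V high on the 10 V range
    "100V": -0.2,
}

def corrected(raw, rng):
    """Subtract the cal-sheet deviation for the selected range."""
    return raw - cal_sheet[rng]

print(corrected(7.5, "10V"))  # -> 7.0, the value you would actually report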

In general, however, 'calibration' doesn't include 'alignment' (i.e. bringing the readings of a test instrument in line with its specifications), a process that for some instruments is even impossible (i.e. if it's out of spec then it's defective and needs repair). On the other hand, some instruments (e.g. some torque wrenches used in aviation) require regular alignment, so the calibration facility will offer calibration and alignment in a single package, because calibration alone would be useless in those cases.
« Last Edit: September 24, 2016, 10:36:54 am by Wuerstchenhund »
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3168
  • Country: gb
Quote
That is why, after calibration, your lab doesn't just give you a pass/fail list, but a list showing the exact deviation of the instrument for each required operating mode. That list is there so you can 'adjust' your measurement results accordingly, as in the example above. Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make. Scientists, on the other hand, tend to use the calibration table to perform 'adjustments' to their measurements. Different worlds, I guess.

I'd like to think that a talented EE would use several of the tools at their disposal in order to minimise measurement uncertainty. For example, if I had to make a critical measurement of the linearity of the levelling of a signal source at a chosen frequency (e.g. a lab RF sig gen), then I wouldn't trust the cal data from a sig gen that had been 'calibrated' recently. It just isn't going to give the info you need, and the info it does give won't be reliable over time. Plus, it has probably been vibrating/bouncing in the back of a transit van for a day or two on the way back from 'calibration', and several engineers may have used/abused it since that date.

« Last Edit: September 24, 2016, 11:44:35 am by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27886
  • Country: nl
    • NCT Developments
Quote
Aren't most fast scopes using 8-bit ADCs? What would calibration involve - adjusting/checking the front-end analog filters?
DSOs need a lot of calibration. Nowadays the front-ends are digitally controlled, but in many DSOs you'll find trimmers to adjust the frequency compensation for the dividers. Then there are ADC gain, offset and interleaving, which (usually) are digitally adjustable. More modern DSOs usually have a way to deal with most of these errors themselves through a self-calibration procedure (which also adjusts any errors  >:D ).
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 
The following users thanked this post: julian1

Offline joeqsmith

  • Super Contributor
  • ***
  • Posts: 11850
  • Country: us
Quote
Hello, how often should these instruments be calibrated? Before purchase, can I ask the company to offer free annual calibrations for, say, 10 or more years?

For my home hobby use, when I get any used equipment I go through a complete checkout, repair, and alignment.  That's normally it.  For new equipment (rare in my case) I'll just do some basic checks and call it good.

If I'm concerned, I do have some items I keep for a sanity check. The timebase on most of my equipment (that needs time) runs from a GPS, which is way tighter than anything I need. The GPS runs 24/7. I have several precision low-drift resistors (0.1% to 0.005%) along with some film and ceramic caps that I had checked. My SOLT cal kit for the VNA is homemade; I verified it against a known cal kit. Also, for the VNA (because I use it to measure impedance) I have some other parts. For voltage, I have a very old Fluke reference standard.

So when I hook up my 555 timer I have some confidence in what it's doing!

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Quote
That is why, after calibration, your lab doesn't just give you a pass/fail list, but a list showing the exact deviation of the instrument for each required operating mode. That list is there so you can 'adjust' your measurement results accordingly, as in the example above. Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make. Scientists, on the other hand, tend to use the calibration table to perform 'adjustments' to their measurements. Different worlds, I guess.

I'd like to think that a talented EE would use several of the tools at their disposal in order to minimise measurement uncertainty.

I never said otherwise. All I said is that EEs in general tend to ignore the cal sheet.

Quote
For example, if I had to make a critical measurement of the linearity of the levelling of a signal source at a chosen frequency (e.g. a lab RF sig gen), then I wouldn't trust the cal data from a sig gen that had been 'calibrated' recently.

So I'd guess you'd then do your own 'calibration', like measuring the linearity yourself with a known-good SA or power meter that offers the required precision.

Quote
It just isn't going to give the info you need, and the info it does give won't be reliable over time. Plus, it has probably been vibrating/bouncing in the back of a transit van for a day or two on the way back from 'calibration', and several engineers may have used/abused it since that date.

That depends on the circumstances. Many of the labs I work in use in-house calibration, i.e. Keysight or whoever comes onsite to calibrate the equipment, and in cases where it makes sense we calibrate on 3-month or 6-month terms even though the manufacturer recommended calibration every year or two, just to demonstrate compliance. Especially for the equipment that is calibrated yearly or more often, I'm sure the calibration data can be trusted, but of course I'd always do a spot check when I use an instrument that's not "mine", because it could well be defective (shit happens).
 

Offline G0HZU

  • Super Contributor
  • ***
  • Posts: 3168
  • Country: gb
Quote
I never said otherwise. All I said is that EEs in general tend to ignore the cal sheet.
Er, no... you worded it in a way that suggested that all EEs are sloppy at reading calibration data. You implied they all just look at the pass/fail column on the sheet when they read it. See below.

Quote
Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make.

I'm afraid it only takes one diligent EE to prove you wrong, but in reality there will be a lot of EEs who do study calibration data. I rarely do it at my place of work, but when I do it will typically be for something like a noise source's ENR vs frequency, or the efficiency correction factors (across frequency) for a power meter head. But other EEs may study calibration data much more than me. I just do RF.

« Last Edit: September 25, 2016, 11:38:08 am by G0HZU »
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27886
  • Country: nl
    • NCT Developments
I don't look at calibration data other than to see if an instrument was within specification. The warranted accuracy is what I'm after because those are the numbers I can rely on. The behaviour during the calibration check is just one sample.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline Wuerstchenhund

  • Super Contributor
  • ***
  • Posts: 3088
  • Country: gb
  • Able to drop by occasionally only
Quote
I never said otherwise. All I said is that EEs in general tend to ignore the cal sheet.
Er, no... you worded it in a way that suggested that all EEs are sloppy at reading calibration data.

You should really stop reading stuff in other people's posts that simply isn't there.

There's really nothing 'sloppy' in not reading cal data on every occasion. Far from it. As an EE, if an instrument has passed calibration and the specified accuracy of that instrument is good enough for my measurement, why should I read the calibration data?

That doesn't change the fact that there's usually a certain difference between EEs and scientists in their adherence to the numbers.

Quote
You implied they all just look at the pass/fail column on the sheet when they read it. See below.

Yes, because that's my experience. YMMV of course; after all, who knows, maybe in your labs engineers spend half an hour going through the cal data (which just one post ago you said you don't trust anyway) before taking a simple 100MHz scope to do a basic measurement ;)

Quote
Quote
Of course, in real life EEs just look at the pass/fail column, and if the instrument passed then they'll just take readings as results, i.e. they ignore the known deviation and the 'adjustment' they could make.

I'm afraid it only takes one diligent EE to prove you wrong, but in reality there will be a lot of EEs who do study calibration data. I rarely do it at my place of work, but when I do it will typically be for something like a noise source's ENR vs frequency, or the efficiency correction factors (across frequency) for a power meter head. But other EEs may study calibration data much more than me. I just do RF.

Yes, you have to look at the calibration data for some simpler noise sources like the HP 346A, because they are hardly more than a simple noise diode in a solid housing. Each diode has a specific operating envelope which differs slightly from the next one, so each noise source is calibrated and the output/frequency response is printed on the source directly. And since a simple noise source has no compensation, controls or indications, you have to refer to that table to use it.

The same is true for old power meters which don't have frequency response compensation, so again you have to look at the sticker with the frequency response curve or table if you want to get useful data. Modern power meters store the calibration data in the power meter head, which is read out by the meter, so manual compensation is no longer required.
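
For the older gear, that manual compensation is just a table lookup with interpolation. A minimal sketch, assuming an invented cal-factor table (a real head has its own numbers printed on the label):
Code: [Select]
# Correct an indicated power reading using the head's cal-factor table.
# The table values here are made up for illustration.
import numpy as np

freq_ghz = np.array([0.05, 1.0, 2.0, 4.0, 8.0, 12.4, 18.0])
cal_factor = np.array([100.0, 99.0, 98.5, 97.0, 95.5, 93.0, 90.0])  # percent

def corrected_power(indicated_mw, f_ghz):
    """Interpolate the cal factor at f_ghz and correct the reading."""
    cf = np.interp(f_ghz, freq_ghz, cal_factor) / 100.0
    return indicated_mw / cf

print(f"{corrected_power(1.000, 10.0):.4f} mW")  # corrected reading at 10 GHz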

Both are pretty much edge cases which really require an engineer to read and apply some calibration data to get meaningful measurements. And even there you'd usually rely on the factory-provided table or graph showing the frequency response, and not on the latest calibration certificate that shows by how much the noise source or PM head deviates from that table.

For other pieces of test gear (RF generators, scopes, PSUs, AWGs, Spectrum/Signal/Network Analyzers, whatever), I guess even you don't bother checking the calibration data you already said you don't trust anyway, as long as the instrument has passed (i.e. is known to be working within spec). And despite not trusting calibration data, my guess is that you're not doing your own calibration of every piece of test equipment before using it in a measurement to ascertain its spec compliance (stuff like cable/adapter calibration on VNAs excluded, of course).

And frankly, there's nothing wrong with that.
« Last Edit: September 26, 2016, 10:59:42 am by Wuerstchenhund »
 

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 22384
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Quote
Businesses base their calibration cycles on risk.  What risk?  Well, if you do a calibration and find the instrument out of specification, then every item you produced since the last calibration is potentially bad.  They would love to reduce calibration (extend calibration intervals), but there are limits to how much product can be "bad".  What counts as "bad" depends on the industry, the warranty, the hazard to life and property from a product not tested properly, and so on.

More sophisticated companies watch the drift during calibration intervals and change the interval based on a prediction of how long the instrument will stay in tolerance, combined with their sensitivity to risk.  Another technique is to have an informal calibration at much more frequent intervals (daily, at shift changes or operator changes, whatever makes sense).  The informal calibration might mean measuring a precision voltage reference or some other quick and simple test appropriate to the test gear.  These informal tests are tracked to guide when a formal calibration is required.

This.

All this.

It's all about the Quality between manufacturer and customer: what assurance they offer, and what level they demand.

With all measurements tested regularly, and traceable to a standards body, there can be no question, between different manufacturers and different customers, about what their different measurements mean.

In contrast, there is no standard* for, say, pants size.  I can buy anything from "31" to "34", from different manufacturers, and get something that fits (or grossly does not).

*There might be one.  I just don't know of it.  And obviously, it's either such a loose tolerance, or so rarely used, as to be useless.  Also, this is the wild land of the USA.  Likely there is an ISO standard applicable, but we don't use those.

Ideally, a calibration step should produce no change in the device itself; at its most basic level, it's simply to confirm that the device is still within the range of error acceptable for it.  But real devices drift, and it frequently is necessary to correct systematic errors, which of course is best done at the same time.  So a calibration lab does both services at once.

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Offline eeguy (Topic starter)

  • Regular Contributor
  • *
  • Posts: 181
  • Country: us
In terms of the need for re-calibration, how do the GDS-2204E, the Keysight 2000 and 3000T series, and the inexpensive Rigol oscilloscopes compare with each other?
 

Online tautech

  • Super Contributor
  • ***
  • Posts: 29385
  • Country: nz
  • Taupaki Technologies Ltd. Siglent Distributor NZ.
    • Taupaki Technologies Ltd.
Quote
In terms of the need for re-calibration, how do the GDS-2204E, the Keysight 2000 and 3000T series, and the inexpensive Rigol oscilloscopes compare with each other?
So if you've read all the replies to your question, you should now be aware that most equipment whose accuracy needs to be good should be cal checked annually.

For the hobbyist, even less frequently unless precision is required.
Basic home checks for sanity's sake can be performed with just a referenced frequency and a precision voltage reference. Nothing more is needed for a quick sanity check.  ;)
Much scope use is so basic that the Cal output is often enough to see that all's in order.
Avid Rabid Hobbyist.
Some stuff seen @ Siglent HQ cannot be shared.
 

Offline eeguy (Topic starter)

  • Regular Contributor
  • *
  • Posts: 181
  • Country: us
Quote
In terms of the need for re-calibration, how do the GDS-2204E, the Keysight 2000 and 3000T series, and the inexpensive Rigol oscilloscopes compare with each other?
So if you've read all the replies to your question, you should now be aware that most equipment whose accuracy needs to be good should be cal checked annually.

For the hobbyist, even less frequently unless precision is required.
Basic home checks for sanity's sake can be performed with just a referenced frequency and a precision voltage reference. Nothing more is needed for a quick sanity check.  ;)
Much scope use is so basic that the Cal output is often enough to see that all's in order.

Yes. What I was trying to ask is whether cheaper oscilloscopes (e.g. the DS1054Z and GDS-2204E) have a higher chance of requiring re-calibration, due to the use of cheaper components, than the Keysight 2000/3000T ones.



 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27886
  • Country: nl
    • NCT Developments
In the case of the GDS-2204E you can do the adjustment yourself using SPC (signal path compensation) and self-calibration. I think some calibration companies also calibrate (check) GW Instek scopes. If the calibration fails (results outside specification), the scope is likely broken, but fortunately the GDS2000 series is covered by a limited lifetime warranty for the first owner; the warranty ends 5 years after production of the model stops.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 
The following users thanked this post: eeguy

