Author Topic: Taking care of a reference multimeter  (Read 3155 times)


Offline rodppTopic starter

  • Frequent Contributor
  • **
  • Posts: 307
Taking care of a reference multimeter
« on: June 01, 2019, 11:32:18 pm »
Hi!

I just bought an 8.5 digit multimeter (Keithley 2002) to use as a reference for my other multimeters.

The only calibration of this multimeter was done in 2012 at the Keithley factory, before shipment to the first owner. I have that report.

The multimeter was originally intended for a cal lab project in which a voltage measurement was related to another physical parameter. It was used for initial tests for two months, and after that the project was aborted. They stored the multimeter in its original box, with all papers, probes, calibration report, etc. Recently they advertised it and I bought it from them.

I'm looking for advice on taking care of it, and have some questions:

1- Should I leave it powered on 24/7, or only when needed?
2- I'll probably have it calibrated at a certified calibration lab. Should I ask them to adjust it back to near factory specifications (with before and after calibration reports), or only ask for calibration (without adjustment)?
3- Suggestions and advice are very welcome!

Thanks!

Regards,
Rodrigo.
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Taking care of a reference multimeter
« Reply #1 on: June 01, 2019, 11:53:54 pm »
1. Depends on what your needs are. I always run my important gear (8.5-digit meters, calibrators, DC voltage references) 24/7 so it's predictable and there's no need for hours or days of warm-up time to obtain good measurements. However, I'm also looking for performance better than annual specs.

2. Same as Q1. It depends what you want out of the unit. If you need results traceable to the SI volt/ohm, you need to send the meter for calibration every year (or more frequently). The alternative is to own a suitable calibrator (Datron 4708/4808 or Fluke 5700/5720/5730A + 5725, guarded by reference standards or a calibrated 3458A) to calibrate and maintain the K2002. You see where this is going :) If you just want relative measurements without any traceability/legal requirement, you can have somebody calibrate/test the meter for you, or do a calibration once to check it's still good and in spec. Adjusting the meter makes drift analysis difficult, but if you have no prior history, there is not much to lose anyway.

3. It would be nice if you could share the calibration report data points, just for our volt-nutty collection. I calibrate my 2002s using DIY software and a procedure similar to how Tek/Keithley does it, but more references can be helpful, as no two test protocols are the same.
 
The following users thanked this post: rodpp

Offline rodppTopic starter

  • Frequent Contributor
  • **
  • Posts: 307
Re: Taking care of a reference multimeter
« Reply #2 on: June 02, 2019, 02:20:22 am »
Hi TiN, I have already read a lot of the content in your posts and on your website. Thanks for sharing your knowledge!

1- My use will not be frequent, and currently I don't need maximum performance. So it seems that leaving it powered off is OK for my use.

2- Scary... I have no plans to become a Volt Nut, nor the necessary time ($$$$)! I guess the K2002 is as far as I'll go with that, hoping I'm not wrong... I'll probably do one calibration as a sanity check and compare the report with the old one. If the drift is small after those 7 years since the older report, I'll probably never calibrate it again. On the other hand, if it has drifted a lot, maybe I'll have to calibrate it more often.

3- Please see the attached report, photographed with my cell phone camera (unfortunately I don't have a scanner here). I only redacted the last three digits of the serial number. If the complete serial number is relevant to you, let me know.

Regards,
Rodrigo.
 
The following users thanked this post: TiN

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 20100
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: Taking care of a reference multimeter
« Reply #3 on: June 02, 2019, 02:24:18 am »
2- Scary... I have no plans to become a Volt Nut,

Neither did I....

Quote
I'll probably do one calibration as a sanity check and compare

... and that's the entrance to the rathole. It never takes long to find it.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline rodppTopic starter

  • Frequent Contributor
  • **
  • Posts: 307
Re: Taking care of a reference multimeter
« Reply #4 on: June 02, 2019, 05:16:04 pm »
Who knows... to be honest, I'm thinking of starting a stable 10 V reference project. It should be nice!
 

Offline rodppTopic starter

  • Frequent Contributor
  • **
  • Posts: 307
Re: Taking care of a reference multimeter
« Reply #5 on: June 06, 2019, 03:14:46 pm »
I just uploaded the noise measurements to TiN's FTP, together with data from three other meters (34461A, 3456A and Solartron 7150+).

Here is what I found:

Keithley 2002
      mean=-9.990000e-09
      rms=1.190785e-07
      sigma=1.187181e-07
Agilent 34461A
      mean=-5.777333e-07
      rms=6.591766e-07
      sigma=3.175512e-07
HP 3456A
      mean=-1.662000e-06
      rms=1.728583e-06
      sigma=4.753756e-07
Solartron 7150+
      mean=1.890000e-06
      rms=1.279453e-05
      sigma=1.266050e-05

In the attached image, the last graph has a different scale because of the poor Solartron performance.
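In case anyone wants to reproduce the numbers, here is a minimal sketch (assuming the readings are logged one per line, in volts; the file name is only a placeholder) of how the mean/rms/sigma values can be computed:

Code: [Select]
import numpy as np

# Hypothetical log file: one reading per line, in volts.
readings = np.loadtxt("k2002_noise_log.txt")

mean  = readings.mean()                  # average offset of the run
rms   = np.sqrt(np.mean(readings ** 2))  # RMS about zero
sigma = readings.std(ddof=1)             # standard deviation about the mean

print(f"mean={mean:e}  rms={rms:e}  sigma={sigma:e}")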
 

Online dietert1

  • Super Contributor
  • ***
  • Posts: 2258
  • Country: br
    • CADT Homepage
Re: Taking care of a reference multimeter
« Reply #6 on: June 07, 2019, 02:23:05 pm »
Interesting numbers. Yet another question: from the Keithley calibration certificate shown above, I got the impression that your instrument was checked but not adjusted. Is this really true? Is this general practice, also with other makes? I can't really explain why, but I was under the assumption that one needs a Josephson voltage standard to adjust an 8.5-digit DVM.
Why should I get an 8.5-digit DMM if I won't get it adjusted, but only checked to be better than +/- 5 ppm or so?

Regards, Dieter
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Taking care of a reference multimeter
« Reply #7 on: June 07, 2019, 02:32:01 pm »
dietert1, often you do NOT want the calibration house to start adjusting your meter, so you can track its drift over time without interruptions. Once you collect X years of calibration reports, you can then check whether the drift is stable and linear, and if it is, predict the uncertainty of the meter's readings in the future. Essentially this is how a calibration house maintains its own standards as well, if they are serious about going for the best uncertainty.
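Just to illustrate the idea, a minimal sketch (all numbers below are made up, not taken from any real report) of fitting a linear drift to yearly calibration errors and extrapolating one interval ahead:

Code: [Select]
import numpy as np

# Made-up history: error of the 10 V point (ppm) from yearly calibration
# reports, with the meter never adjusted in between.
years  = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
errors = np.array([0.5, 1.1, 1.8, 2.3, 3.0])

# Straight-line fit: error ~= rate * t + offset
rate, offset = np.polyfit(years, errors, 1)
residual_sigma = np.std(errors - (rate * years + offset), ddof=2)

t_next = years[-1] + 1.0
print(f"drift rate ~ {rate:.2f} ppm/year")
print(f"predicted error at year {t_next:.0f}: "
      f"{rate * t_next + offset:.2f} +/- {residual_sigma:.2f} ppm")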

You don't need a JVS system to calibrate or adjust 8.5-digit meters; a usual commercial calibrator such as a 57x0/4808, guarded by reference standards which are in periodic calibration, can be used to test and adjust 8.5-digit meters. Meters are not reference standards, even if some vendors like to put marketing smoke and mirrors in their promotions.
 

Online dietert1

  • Super Contributor
  • ***
  • Posts: 2258
  • Country: br
    • CADT Homepage
Re: Taking care of a reference multimeter
« Reply #8 on: June 07, 2019, 03:25:56 pm »
OK, I understand. Our HP 3456A precision DMMs also never went for calibration, because I was afraid of somebody detuning them without notice. Nowadays DMM calibration is digital, isn't it? Isn't it possible to make a Keithley 2002 dump all of its calibration data in some useful manner? Why would one lose track when adjusting the instrument?
Concerning precision, I understood from the numbers above that a Keithley 2002 can transfer about 0.1 uV, which would be 10 ppb on 10 V. But that appears to be the noise/resolution within an hour or so. After one day or one week it becomes a 5 or 6 digit meter and isn't any better than a Fluke 1 ppm standard. Is this the reason why it doesn't receive a better calibration?

Regards, Dieter
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Taking care of a reference multimeter
« Reply #9 on: June 07, 2019, 07:39:02 pm »
K2002 noise is about 2-3 uV at best, similar to the 3458A/8508A/1281. Anything below a 1 uV noise floor is pointless because of the meter's own reference stability and accuracy.
If you want sub-ppm measurements, there is no way around null meters or nanovoltmeters with ultra-stable verified references, or quantum standards. You can make a dump of the calibration values from a K2002, but why make your life difficult tracking which dump corresponds to which point in time, when you can just keep the meter unadjusted and measure its errors every time instead? Calibration is verification of an unknown DUT against known reference standards. Calibration is NOT adjustment.

To be fair, the 3456A is in a very different league from the 2002.
 

Online dietert1

  • Super Contributor
  • ***
  • Posts: 2258
  • Country: br
    • CADT Homepage
Re: Taking care of a reference multimeter
« Reply #10 on: June 07, 2019, 08:27:51 pm »
Yes, I can read in the calibration certificate above that the Keithley 2002 had a deviation of 0.8 uV = 0.080 ppm against the 10 V standard. That's where most of these DVMs shine. Our 3456As are within a few ppm in this range, also much better than specs.
It's a little suspicious in the above 2002 case, though, when using a standard with 59 uV = 5.9 ppm uncertainty. And then I don't quite get the meaning of a test tolerance of 129 uV = 12.9 ppm for the DVM (5 digits!). If the 2002 is really capable of +/- 0.1 ppm uncertainty, one would like to know that and calibrate against a quantum standard.
Otherwise I would rather keep our 3456As, as I know from experience that they fulfil a +/- 12.9 ppm spec as well, at least for some years.

Regards, Dieter
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Taking care of a reference multimeter
« Reply #11 on: June 07, 2019, 09:28:46 pm »
I don't know where you're getting this 0.1 ppm figure.  :-//
The K2002 specifications are clear on this; for example the best spec, on the 20 VDC range, is 10 ppm of reading + 0.15 ppm of range for 1 year. In that case using a standard with 5.9 ppm uncertainty is not much of a problem, but you'd want a 3 ppm standard to be sure, such as a calibrated 732B or 5720A.
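Worked out in numbers for a 10 V reading (just the arithmetic on the spec quoted above, nothing more):

Code: [Select]
# 1-year 20 VDC spec quoted above: 10 ppm of reading + 0.15 ppm of range.
reading_v, range_v = 10.0, 20.0
limit_v = 10e-6 * reading_v + 0.15e-6 * range_v   # 100 uV + 3 uV
print(limit_v * 1e6, "uV =", limit_v / reading_v * 1e6, "ppm of a 10 V reading")
# -> about 103 uV, i.e. roughly 10.3 ppm of the reading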

Sure, you can calibrate and adjust against a JVS system, with a transfer uncertainty of 0.1 ppm of reading + 0.05 ppm of range, but that calibration will be invalid 10 minutes after calibration, or if the meter temperature changes by more than 0.5 °C. Some meters may perform better and for longer, but without actually testing that with a properly stable source this is just speculation.
 

Online dietert1

  • Super Contributor
  • ***
  • Posts: 2258
  • Country: br
    • CADT Homepage
Re: Taking care of a reference multimeter
« Reply #12 on: June 07, 2019, 10:16:51 pm »
My numbers are from the Keithley 2002 certificate shown above, especially the 10 V check.
So, if I understand that correctly: a Keithley 2002 with a 12.9 ppm deviation in the 10 V DC test will come back from calibration with "test passed" and no adjustment made. That means it would effectively be a 5-digit DVM, and the display error would be something like 1290 counts. What a shame.

Regards, Dieter
 

Offline maginnovision

  • Super Contributor
  • ***
  • Posts: 1966
  • Country: us
Re: Taking care of a reference multimeter
« Reply #13 on: June 07, 2019, 10:22:26 pm »
My numbers are from the Keithley 2002 certificate shown above, especially the 10 V check.
So, if I understand that correctly: a Keithley 2002 with a 12.9 ppm deviation in the 10 V DC test will come back from calibration with "test passed" and no adjustment made. That means it would effectively be a 5-digit DVM, and the display error would be something like 1290 counts. What a shame.

Regards, Dieter

On the other hand, if you're monitoring something it doesn't matter how far off it is; you still see the change. If you get actual data back you can add or subtract the offset (error), since linearity should still be pretty good. If I know my meter is 10 ppm off I can account for that incredibly easily.
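A trivial sketch of what that correction looks like (the 10 ppm figure and the reading are only placeholder values):

Code: [Select]
# Known gain error from the last calibration report (placeholder value).
known_error_ppm = 10.0          # meter reads 10 ppm high on this range

def corrected(raw_v: float) -> float:
    """Remove the known gain error from a raw reading."""
    return raw_v / (1.0 + known_error_ppm * 1e-6)

print(corrected(10.000100))     # ~10.000000 V after correction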
 

Online dietert1

  • Super Contributor
  • ***
  • Posts: 2258
  • Country: br
    • CADT Homepage
Re: Taking care of a reference multimeter
« Reply #14 on: June 08, 2019, 05:34:12 am »
Now, if the OP takes this approach and subtracts the offsets documented in his calibration report, he will still end up with an uncertainty of +/- 5.9 ppm in the 10 V DC measurement. That is up to +/- 590 counts, effectively a 5.5-digit meter.
The documented 0.08 ppm agreement in the 10 V DC test may indicate that the calibration standard was 100x better than its specs as well. This could somehow explain the nice agreement (which may also be accidental). Or the meter got adjusted.
I think all that volt-nuttery results from the discrepancy between precision specs and the resulting habits of calibration labs. Many owners of precision DMMs know they can do better. I brought back a +/- 2 ppm calibration for our voltage standard from a recent metrology meeting, so our HP 3456As are in much better shape than official specs. And they are very stable due to their age.

Regards, Dieter
 

Offline TiN

  • Super Contributor
  • ***
  • Posts: 4543
  • Country: ua
    • xDevs.com
Re: Taking care of a reference multimeter
« Reply #15 on: June 08, 2019, 07:25:50 am »
It sounds like you expect the resolution of the meter and its accuracy to be the same. No meter can do this, but it does not mean that even an uncalibrated 8.5-digit meter is totally useless.
Take the 3458A/02 or the fancy 8588A, same story: they will not provide you even 7.5-digit accuracy. Yes, it is useless for measuring an unknown DUT source value to 8.5-digit accuracy, but it's still very useful for measuring an unknown DUT source's stability or the difference between DUTs.

The numbers you see on a calibration report are valid only at the time of the calibration. 0.08 ppm at the +10 V point only means that at the time of measurement, in the calibration lab, the average deviation between the source (+/-5.4 ppm) and the meter (+/-12.9 ppm) was that value. It does not mean the meter is 0.08 ppm accurate on its 20 V range. The 19 V point already reveals exactly that, where the difference becomes -0.36 ppm. There are multiple error sources inside the meter and the overall calibration procedure. Another important contributor is INL error. This is where metrology-grade 8.5-digit meters really shine, allowing you to transfer known external standards to unknown DUT voltages with sub-ppm accuracy. Perhaps the INL test I performed with a 2002 earlier this year says it better than a thousand words:



Based on this chart, we can expect my particular K2002-GPIB6 to be able to transfer VDC values on the 20 V range with better than +0.4 / -0.3 ppm error. Two 3458As in tandem can do the same transfers with better than +/-0.1 ppm error. You can also do such 0.1 ppm transfers using 1980s technology such as a Fluke 720A + null meter + DC reference. However, 8.5-digit meters and elaborate KVD setups are not the only ones that allow such transfers; even some selected and characterized 6.5-digit meters can reach similar performance. Fellow volt-nuts here on the forum have already demoed this. But it does not mean these meters are specified as such or legally calibrated to do so. ;) (This is also a reason why I don't understand Fluke's marketing of the 8508A and new 8588A/8558A, which do NOT specify linearity at all  :--).

If we verify that the 2002 is stable (by measuring a known stable source, such as a DC reference with known drift), one can then use such an uncalibrated 2002 to test two different DC references and obtain the difference between them with a better spec than if one just measured the DUT directly. Such an ability enables a volt-nut to build the infamous LTZ1000 reference, measure its stability with an uncalibrated K2002 (caveat: assuming the meter is also stable) and ship that unknown-value LTZ1000 reference for calibration, to obtain the total 20 VDC range error of the reference and the meter, because now you know the output value of the reference. If such a reference achieves better uncertainty than the K2002 spec, you can calibrate the K2002 for that voltage range as a result.
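A rough sketch of that transfer idea in numbers (every value below is hypothetical): the stable but uncalibrated meter is only used for the difference between two references, and a later calibration of one reference then pins down the other:

Code: [Select]
# Same range, same session: the meter's absolute error largely cancels
# in the difference between two nearly equal voltages.
meter_reads_A = 10.0000312   # raw reading of reference A, volts
meter_reads_B = 10.0000198   # raw reading of reference B, volts
diff_v = meter_reads_A - meter_reads_B
print(f"A - B = {diff_v * 1e6:.1f} uV")

# Later, reference A comes back from the cal lab with a known value:
A_calibrated = 10.0000270
B_estimated  = A_calibrated - diff_v
print(f"B ~ {B_estimated:.7f} V (plus the transfer uncertainty)")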

A single (point-in-time) calibration is not very useful; multiple periodic calibrations are required to figure out the DUT's stability (the real drift behaviour over the calibration interval). Given that calibration for top-end meters is expensive and only a few labs can do it up to the instrument's specification, volt-nuts often opt not to adjust their meters but to collect the error values (12.9 ppm in your example) and make sure the error stays below the unit's specification for the desired calibration period. The alternative is to do your own calibration or calibration adjustments against a higher-end standard, such as your +/-2 ppm standard. However, another quirk of the K2001/2002 units is that they will not let you easily adjust only one range, so you end up needing a high-end calibrator, even if you have a JVS system at home :)

Serious calibration labs will not adjust your device, even if it is out of spec. A simple example of why: you have a 732A DC standard that you have monitored every day over 5 years, so you have 5 years of data history. You ship it to a lab, they measure it, find it out of spec and adjust it (turning the trimpot) to their standard (whose history you do NOT know). Now this lab has invalidated 5 years of your effort, and you need to collect new data for many more months before you can figure out whether the drift rate remained the same as before the adjustment or changed (e.g. due to trimpot contact resistance variation, subtle current difference impact, etc.). Then you still need math juggling to align the "old" data with the "new" data. That is a lot of trouble for no clear benefit.

To be fair, most volt-nuts care less about calibration and traceability, because a lot of work and money is required to maintain them, and it makes little sense for a hobby-level lab at home. Unless you are crazy enough to own a 4708/4808/5700A/5720A and a bank of in-spec standards to calibrate the 5720A, which you use in turn to calibrate meters like the K2002. Then you can run tests 24 hours a day, every day, and it "saves" money on shipping the meter out for calibration. I treat my 2002s as secondary check meters and monitor them once every 6 months or so. If their readings deviate from my primary meters by more than the 24-hour spec, I get worried.  :)

And yes, calibration labs that just test your DUT with an uncalibrated source and print a sticker for the front panel do exist. Because I don't feel comfortable playing the shipping game with expensive meters, and there is a limited choice of capable local labs, doing in-house calibration is my choice in this story.
« Last Edit: June 08, 2019, 08:53:03 am by TiN »
 
The following users thanked this post: RandallMcRee, eplpwr, Hermann W

