Author Topic: Is the Fluke 5440B really an artifact CAL instrument?  (Read 15546 times)


Offline VintageNut

  • Frequent Contributor
  • **
  • Posts: 534
  • Country: 00
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #25 on: September 30, 2016, 02:00:41 am »
Hello Dr. Frank,

Firstly, there is no guarantee for Fluke specs. For instance, many 742A and 732B units are out of spec.

This is trivial, I think.
You always have a statistical distribution of the specified parameters. At a typical confidence level of 95%, about 5% of the instruments fall outside the limits.
Fluke published a study about 732A and 732B references showing exactly this; it also included the drift of the SZA263 (typically in the positive direction) and of the LTFLU (typically negative).
It's also trivial that calibration instruments may drift out of specification over time, even earlier than the specified 1 year.
If the design and verification are done correctly, this should also be covered by the 95% confidence level.
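That 5% figure follows directly from the normal distribution; a quick sketch in plain Python (my own illustration, assuming normally distributed parameters):

```python
import math

def fraction_outside(k_sigma):
    """Fraction of a normal distribution falling outside +/- k_sigma limits."""
    return math.erfc(k_sigma / math.sqrt(2))

# A 95 % confidence level corresponds to limits at about 1.96 sigma:
print(f"{fraction_outside(1.96):.3f}")  # prints 0.050, i.e. 5 % out of limits
# Tightening the limits to 3 sigma would leave only ~0.27 % outside:
print(f"{fraction_outside(3.0):.4f}")
```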


Secondly, a rule is a rule; whether people observe it or not is another matter. The head of the Division of Electricity and Magnetism at NIM (Mr. Shao) has written and spoken about that rule (calibrate at working voltage) many times.

Sorry, I don't know Mr. Shao, but he for sure also cannot overrule the laws of electrical engineering, or physics.
It's not possible to calibrate a 752A or other Hamon-type dividers at 'working level', for obvious reasons, e.g. that would destroy the divider resistors.

A different story would be to use a very different technique for calibrating reference / decade dividers, so that each of the resistors sees the same voltage burden during calibration as during usage.

Thirdly, at NIM (National Institute of Metrology, China's NIST equivalent), they don't use the 752A and 720A as serious dividers; they use the 4902S, and better still, they build their own true guarded divider, all calibrated at working voltage.

Well, I don't know the adjustment technique of the 4902S, as I did not yet find a manual, or a description of the circuitry.
If you have that available, please share.

If it's a Hamon-type calibration technique again, it's likewise not possible to do the adjustment at working voltage.
If it's more a technique like in the 720A, where each of the roughly 100 resistors may be adjusted at 10V by comparing it in a bridge configuration, you would only get the mediocre ratio specifications for 10:1 and 100:1, like the 720A, for sure.

Please share the NIM technique for achieving Hamon-type uncertainty (or better) with working-voltage adjustment. I really can't figure out how this works.

Fourthly, the input resistance of the 720A is 100k (at 1.0 input), which is not very large, so leakage is probably less significant. However, the divider string of the 5440B is 2 Meg (for the 1000V range), making it more vulnerable to leakage across the not very short bare PCB tracks and relay contacts, which easily get dirty because of the air circulation.
Lastly, if something is good, that doesn't necessarily mean it must be flawless.


Yes, that's also already known about the 720A; but in practical terms, the 720A is anyhow not suitable for very precise 10:1 or 100:1 ratios.
Leakage currents are taken into account in the 720A's specification as well, as in the 752A's.

The 5440B may suffer from leakage currents, but obviously it meets or exceeds its specifications anyhow.
It would be interesting, instead, if you had some experience or research results on older 5440B instruments: how and where to clean the instrument to mitigate such leakage effects and restore its initial performance.
In the end, the 5440B is not intended as a sub-ppm ratio instrument like the 752A, or like any of your superior NIM dividers, so your criterion of 'design flaws' in the 5440B is a bit academic in this case.
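To put rough numbers on the leakage argument (my own back-of-the-envelope sketch; the 100 Gohm leakage path is a made-up example figure):

```python
def leakage_error_ppm(r_string, r_leak):
    """Approximate relative error (in ppm) when a leakage resistance
    r_leak appears in parallel with a divider string of resistance r_string."""
    return r_string / (r_string + r_leak) * 1e6

# 720A-style 100 kohm input vs. the 5440B's 2 Mohm 1000 V string,
# each shunted by a hypothetical 100 Gohm leakage path:
print(leakage_error_ppm(100e3, 100e9))  # ~1 ppm
print(leakage_error_ppm(2e6, 100e9))    # ~20 ppm
```

The same leakage path is thus roughly 20 times more harmful on the higher-impedance string, which is the point being made about the 5440B's 1000V divider.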

To get to my point: calibrating at non-working voltage is a small flaw, and no guarding on a ppm-level divider for 1000V with a megohm string is another flaw.

BTW, there is a guarding circuit inside 752A, but not true guarding.

Do you have any references or documents on how such ratio divider standards are built so that they achieve Hamon-type uncertainty but were really adjusted at working-level voltages?

That would be really helpful.

Btw.: In the 752A manual, they summarized and evaluated all possible sources of error, which leads to the theoretically calculated specification; that again is good metrological practice.
I also assume that all these institutions make such an evaluation of their self-built dividers first.

I also know about several ratio divider comparison papers from different National Standards Institutes, where they really do a ring comparison of 752As vs. 4902S vs. self-built versions under working-voltage conditions, i.e. in 1000V:10V mode.
But that comparison again is a different story than the different adjustment methods, I think.


I really would appreciate your practical guidance about designing more precise ratio divider standards!

Frank

Hello Dr. Frank

I have looked at the 4902 circuit board and read the discussion by Zlymex and looked at the front panel of the Datron 4902.

It looks to me that the divider is set up so that the bottom section is calibrated to 10V, and then a succession of transfers happens up the ladder, using a null meter to null each higher section until it is identical to the golden bottom 10V. When all of the 10V ladder rungs are finished, there is a golden 100V at the top, made of 10V sections that are all identical to the bottom golden 10V.

The 100V section adjustments are a repeat of the procedure of 10V sections to arrive at 1000V.

I think golden 1000V, 100V and 10V references would be helpful for this to work, but may not be absolutely necessary.

Every section of every decade requires an adjustment potentiometer.

Your version of the 752A could be modified to do exactly this if you placed an adjustment pot between every stage of every decade, making exact voltages of 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800 and 900.

It would be meticulous to adjust but might be worth the effort as an academic exercise.
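The bootstrap transfer described above can be sketched as a small simulation (my own sketch; the +/-100 ppm initial mis-adjustment is an assumed figure, and the null meter and pots are treated as ideal):

```python
import random

# Simulated bootstrap adjustment of a 10-section ladder: the bottom section
# is the "golden" 10 V reference, and every other section is trimmed until
# a null meter reads zero against it.

random.seed(1)
golden = 10.0
# Ten sections with up to +/-100 ppm initial mis-adjustment:
sections = [golden * (1 + random.uniform(-100e-6, 100e-6)) for _ in range(10)]
sections[0] = golden  # the bottom section defines the reference

for i in range(1, len(sections)):
    null_reading = sections[i] - golden   # what the null meter would show
    sections[i] -= null_reading           # trim the pot until the null is zero

total = sum(sections)
print(total)  # 100.0 -- a "golden" 100 V at the top of the ten sections
```

Repeating the same procedure with the 100V sections then yields the golden 1000V, as described for the 4902.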
working instruments :Keithley 260,261,2750,7708, 2000 (calibrated), 2015, 236, 237, 238, 147, 220,  Rigol DG1032  PAR Model 128 Lock-In amplifier, Fluke 332A, Gen Res 4107 KVD, 4107D KVD, Fluke 731B X2 (calibrated), Fluke 5450A (calibrated)
 

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #26 on: December 22, 2017, 11:28:04 am »
I managed to bring my private Fluke 5440B to the calibrated Fluke 5440B and the two 3458As. I was very confident about the 10V of my Fluke 5440B, but I can't really verify the other voltages at home.

First measurements showed (after ACAL on every instrument):

10V -> +0.6 ppm
100V -> +3.6 ppm
1000V -> +8.4 ppm

All the readings agree with the second 3458A and both 3458A are within 1ppm of the calibrated 5440B.

I didn't expect something like that. The service manual says: "The principal function of External Calibration is to correct for any shift in the Calibrator's internal voltage reference. A second function is to correct for long-term drift in the internal voltage divider resistors used for the 2.0V and 0.2V divided output ranges. All other time- and temperature-dependent changes in the calibrator are corrected by the Internal Calibration procedure.

An abbreviated External Calibration procedure ("10V Cal") may be used following Internal Calibration to complete the calibration of all ranges except the 0.2V and 2.0V ranges...."

If this is really the case, why are 100V and 1000V not within the 24h specification against the 10V output? And for what reason is it possible to calibrate 20V, 100V and 1000V?

In the next step I ran the external calibration procedure through all the steps (including 100V and 1000V). That brought my 5440B exactly to the 3458A (no surprise). After that I ran the internal calibration. I wasn't sure what would happen then. I thought the internal calibration would bring all ratios back to the state before the external calibration, because 10V was already spot on. But that didn't happen. After the internal calibration everything stayed as it was after the external calibration.

I also found in the service manual that the internal ADC of the 5440B is able to switch its input gain. If the input is out of range at a gain of 1000x, the gain switches to 100x and the section is nulled; after that it runs again at 1000x. Therefore, I think this cannot be the reason why the calibrator can't find the right ratios. And even if it were, I would expect an error message.


The DIV output was spot on, by the way.
 
The following users thanked this post: Dr. Frank

Online Dr. Frank

  • Super Contributor
  • ***
  • Posts: 2402
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #27 on: December 22, 2017, 01:34:28 pm »
Hi e61_phil,
At first, well done, hope that you didn't break your back, 2 days before Christmas..

The 3458As are not reliable @ 1kV, as the divider is not compensated for self-heating effects. Mine measures about -6.5ppm, verified by a Hamon-type divider.

The ACAL feature of the 5440B works only for small deviations from a preceding external calibration, so you should be able to identify that by comparing the internal ratio coefficients, which can be printed via a certain front panel menu. If I remember correctly, these factors are near zero for my instrument. Actually, these are the gain shift values, all being +0.2 ppm.

Maybe your private unit was way off initially, that would explain the different behaviour after external calibration.

I also sometimes encounter deviations greater than 1ppm at 100V compared to my 3458A, but that's related to the ACAL uncertainty of both instruments, and also to sufficient warm-up time.
After another acal of both instruments,  this difference might vanish.

Using the Hamon divider, there's mostly agreement within 1ppm for 100V and 1kV, referenced to 10V.

Frank
« Last Edit: December 22, 2017, 01:39:07 pm by Dr. Frank »
 

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #28 on: December 22, 2017, 10:05:49 pm »
Hi Frank,

you're absolutely right, the 3458A isn't that great for 1kV. One of the two 3458As drifts several ppm after applying 1kV. To calibrate my own 5440B I connected the 3458A to the calibrated 5440B at 1kV. Afterwards, I waited some minutes to let the 3458A stabilize, then I transferred the shown value to my 5440B. A proper Hamon divider is on my list :)

Unfortunately, I built the serial cable after the external calibration. But I think +8ppm isn't way off. Therefore, I thought the 5440B should correct for this.


Perhaps I missed the table in the manuals, but I compared the serial output with the output of the GCAL command, and this is what I found out:

GCAL 0 -> 10V Range Gain
GCAL 1 -> 20V Range Gain
GCAL 2 -> 250V Range Gain
GCAL 3 -> 1000V Range Gain
GCAL 4 -> 2V Range Gain
GCAL 5 -> .2V Range Gain
GCAL 6 -> +10V Offset
GCAL 7 -> +20V Offset
GCAL 8 -> +250V Offset
GCAL 9 -> +1000V Offset
GCAL 10 -> -10V Offset
GCAL 11 -> -20V Offset
GCAL 12 -> -250V Offset
GCAL 13 -> -1000V Offset
GCAL 14 -> 10V Gain Shift (?)
GCAL 15 -> 20V Gain Shift (?)
GCAL 16 -> 250V Gain Shift (?)
GCAL 17 -> 1000V Gain Shift (?)
GCAL 18 -> Resolution Ratio
GCAL 19 -> A/D Gain

I think this is helpful if you don't want to make a serial connection (or write it all down by hand).
I'm not absolutely sure about the sequence of the Gain Shifts, because they all have the same value at the moment.

Now I should dive deeper into the meaning of those numbers. At the moment I have no idea why a gain is measured in volts, or what is meant by "Resolution Ratio". And I'm also not sure what is meant by Gain Shift. Is this the shift against the last external calibration?
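For reading the constants over the bus instead of copying them by hand, something along these lines should work with pyvisa (a sketch only: the GPIB address is a placeholder, and the exact GCAL query/response format should be checked against the 5440B remote programming manual):

```python
# The GCAL index-to-meaning table from the post above, as a Python list:
LABELS = [
    "10V Range Gain", "20V Range Gain", "250V Range Gain", "1000V Range Gain",
    "2V Range Gain", ".2V Range Gain",
    "+10V Offset", "+20V Offset", "+250V Offset", "+1000V Offset",
    "-10V Offset", "-20V Offset", "-250V Offset", "-1000V Offset",
    "10V Gain Shift", "20V Gain Shift", "250V Gain Shift", "1000V Gain Shift",
    "Resolution Ratio", "A/D Gain",
]

def read_gcal_constants(address="GPIB0::7::INSTR"):
    """Query all 20 calibration constants; the address is a placeholder
    and the GCAL response is assumed to come back as plain text."""
    import pyvisa  # needs a VISA backend and the instrument on the bus
    rm = pyvisa.ResourceManager()
    cal = rm.open_resource(address)
    return {label: cal.query(f"GCAL {n}").strip()
            for n, label in enumerate(LABELS)}
```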


Edit: I attached the calibration list. This list was read after the first internal calibration, after external calibration.
« Last Edit: December 22, 2017, 10:08:44 pm by e61_phil »
 

Offline Pipelie

  • Regular Contributor
  • *
  • Posts: 172
  • Country: cn
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #29 on: December 23, 2017, 05:17:02 am »
This is what I got from the RS-232 COM port after I repaired and calibrated my 5440.
« Last Edit: January 02, 2018, 06:18:00 am by Pipelie »
 

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #30 on: January 06, 2018, 10:05:53 pm »
After some reading in the service manual (thanks to Echo88) I eventually understood the internal calibration process of the Fluke 5440B.

I know many people here already know how it works, but I wanted to give the topic an answer. The answer is: NO, the 5440B isn't an artifact CAL instrument!

The 5440B internal calibration process is only able to measure a DRIFT in the resistor ratios and correct for that drift. To determine the absolute ratios, an external calibration is needed. This explains why one is able to calibrate all ranges, which isn't possible on a 3458A, for example (which is a real artifact CAL instrument). Therefore, it is possible to adjust some ranges off by several ppm; the internal calibration process will not notice that and will keep the wrong ratio stable.

For the internal calibration, the 5440B configures the feedback resistors of the range in use so that the resulting value is 20k. 20k is also the value of the "front" resistor of the voltage amplifier. The feedback loop is opened, and one of these 20k resistors is fed by the positive reference voltage while the other is fed by the negative reference voltage. If both values were exactly the same, the voltage at the junction between the resistors would be zero. Because the resistors aren't exactly equal, one can measure a voltage there with the internal ADC. This voltage is stored at every internal calibration and used to calculate the ratio shift of the divider. The calculated shift is added to the ratio stored from the external calibration. Therefore, it is necessary to run an internal calibration directly before the external calibration.

This is only the very short story. Before the ratios are determined, the offsets are nulled, and so on.

I played around with Excel to understand the process. I attached this Excel file. You can change the values in the yellow fields and the "internal calibration process" will calculate the correct drift.
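The same arithmetic as in the Excel sheet can be sketched in Python (my own model; the 6.5V reference value and the 3 ppm mismatch are assumed example numbers, and the bridge is treated as ideal):

```python
def bridge_null_voltage(v_ref, r_fb, r_in):
    """Midpoint voltage when r_fb is driven from +v_ref and r_in from -v_ref.
    It is zero only if both resistors are exactly equal."""
    return v_ref * (r_in - r_fb) / (r_in + r_fb)

def ratio_drift_ppm(v_ref, v_null):
    """Infer the resistor ratio drift (in ppm) from the measured null voltage."""
    return 2 * v_null / v_ref * 1e6

v_ref = 6.5                   # assumed reference voltage, volts
r_fb = 20_000 * (1 + 3e-6)    # feedback network, reconfigured to nominally 20k,
                              # drifted by an assumed +3 ppm
r_in = 20_000                 # "front" resistor of the voltage amplifier

v_null = bridge_null_voltage(v_ref, r_fb, r_in)
print(ratio_drift_ppm(v_ref, v_null))  # about -3 ppm drift of r_fb vs. r_in
```

A few microvolts at the junction thus resolve ppm-level ratio drift, which is all the internal calibration needs, since the absolute ratio itself comes from the external calibration.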
 
The following users thanked this post: Mickle T., alm

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #31 on: January 07, 2018, 08:01:06 pm »
I've attached the linearity measurement in the 11V Range against a HP3458A.

My Fluke 5440B is more linear than my best 34401A.
 
The following users thanked this post: Pipelie

Offline Pipelie

  • Regular Contributor
  • *
  • Posts: 172
  • Country: cn
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #32 on: January 08, 2018, 03:31:53 am »
I've attached the linearity measurement in the 11V Range against a HP3458A.

My Fluke 5440B is more linear than my best 34401A.

Thank you! Very impressive.
Are you controlling the 5440 via GPIB? If so, could you share the script? I would like to run some tests on my 5440.

thanks in advance!
 

Offline e61_philTopic starter

  • Frequent Contributor
  • **
  • Posts: 962
  • Country: de
Re: Is the Fluke 5440B really an artifact CAL instrument?
« Reply #33 on: January 08, 2018, 11:30:36 am »
Are you controlling the 5440 via GPIB? If so, could you share the script? I would like to run some tests on my 5440.

Yes, everything is controlled via GPIB (pyvisa). It is more or less the same script as used here: https://www.eevblog.com/forum/metrology/dmm-linearity-comparison/msg1352735/#msg1352735

The script simply takes 4 measurements, and if all of them are within 1µV of each other the group is accepted.

The data is stored in an Excel sheet, but you don't need Excel for that.

It would be very interesting to compare a few 5440B in linearity.
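The acceptance logic is simple enough to sketch (my own minimal version; `read_fn` stands in for whatever returns one DMM reading, e.g. a pyvisa query wrapper):

```python
def take_stable_reading(read_fn, n=4, limit=1e-6):
    """Take n readings and accept their mean only if the spread (max - min)
    is within limit (volts); otherwise take a fresh group and try again."""
    while True:
        readings = [read_fn() for _ in range(n)]
        if max(readings) - min(readings) <= limit:
            return sum(readings) / n

# Demo with a fake, noiseless "instrument":
print(take_stable_reading(lambda: 10.0000005))  # prints 10.0000005
```

A real script would of course add a retry cap and a settling delay between readings, but this is the core of the 4-readings-within-1µV criterion.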
 
The following users thanked this post: Pipelie

