From yesterday's calibration protocol:
for +10V the 3458A reading was 3.4 uV too high against the calibrator's actual value.
for -10V the 3458A reading was 4.1 uV too high (in magnitude).
The calibration uncertainty is 2.5 ppm (+/-25 uV); with the 24 h specification we get 2.6 ppm (+/-26 uV).
So the reading of 10V + 56.8 uV has to be corrected to 10V + 53.4 uV,
and the reading of -10V - 58.5 uV to -10V - 54.4 uV.
The difference to the 10V + 41 uV from the Fluke 7000 is then 12.4 or 13.4 uV, which is within the +/-26 uV +/-2.5 uV uncertainty.
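For anyone who wants to check the arithmetic above, here is a small sketch in Python (all values in microvolts; variable names are my own, not from the protocol):

```python
# Correction: subtract the 3458A's offset against the calibrator.
offset_pos = 3.4    # 3458A reads 3.4 uV too high at +10 V
offset_neg = 4.1    # 3458A reads 4.1 uV too high (in magnitude) at -10 V

raw_pos = 56.8      # raw deviation from nominal +10 V
raw_neg = 58.5      # raw deviation (magnitude) from nominal -10 V

corr_pos = raw_pos - offset_pos   # corrected deviation at +10 V: 53.4 uV
corr_neg = raw_neg - offset_neg   # corrected deviation at -10 V: 54.4 uV

fluke_7000 = 41.0   # Fluke 7000 deviation at 10 V

diff_pos = corr_pos - fluke_7000  # 12.4 uV
diff_neg = corr_neg - fluke_7000  # 13.4 uV

# Both differences stay inside the +/-26 uV +/-2.5 uV window.
limit = 26.0 + 2.5
print(corr_pos, corr_neg, diff_pos, diff_neg)
print(abs(diff_pos) <= limit and abs(diff_neg) <= limit)
```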
But which device is showing the truth? -> We need more references/DMMs.
Edit: what were the results between the Fluke 7000 and the other Flukes (732)?
Edit: And even more evil: I could not find in the calibration protocol whether the calibrator had already been set to the new volt.
with best regards
Andreas