Hello blackdog,
attached is the statistical comparison of the two systems.
I think the values are only the offset from the starting value
(showing the uV deviation over time).
HP34401A:
Plotted over time with the plotter program, the data already shows discrete voltage jumps.
A histogram with 200 classes (standard deviation 8.6 uV) shows that the resolution of the logged data is 1 uV.
But every 5 uV there is a maximum, most probably a rounding artefact
from the binary internal resolution to the decimal output at the interface.
All in all it is not a Gaussian distribution but two overlapping distributions.
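The histogram check can be sketched like this; the data below is synthetic (Gaussian noise quantized to 1 uV, matching what the plots suggest), since I am only illustrating how the quantization shows up in the class counts:

```python
import numpy as np

# Synthetic stand-in for the logged 34401A offsets (hypothetical data):
# Gaussian noise rounded to 1 uV steps, as the histogram suggests.
rng = np.random.default_rng(0)
readings_uv = np.round(rng.normal(0.0, 8.6, 100_000))

# Histogram with 200 classes, as in the comparison above.
counts, edges = np.histogram(readings_uv, bins=200)

# Quantization shows up as empty classes between populated ones:
# with 1 uV resolution, only about one bin per uV step carries counts.
occupied = np.count_nonzero(counts)
print(f"std dev: {readings_uv.std():.1f} uV, occupied bins: {occupied} of 200")
```

With truly continuous data all 200 classes would be populated; the large number of empty bins is what betrays the 1 uV resolution.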
Autocorrelation: (I don't know how to interpret it, but it looks strange).
The Allan deviation shows 0.8 uV stability.
Averaging multiple values does not seem to improve it.
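For reference, this is roughly how I understand the Allan deviation over the averaging factor; the data here is an assumed white-noise series, not the real log:

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapping Allan deviation for an averaging factor of m samples."""
    n = (len(x) // m) * m
    means = x[:n].reshape(-1, m).mean(axis=1)  # averages of m readings each
    d = np.diff(means)
    return np.sqrt(0.5 * np.mean(d ** 2))

# Hypothetical white-noise readings in uV (the real log would go here).
rng = np.random.default_rng(1)
x = rng.normal(0.0, 8.6, 100_000)

# For pure white noise the Allan deviation falls as 1/sqrt(m);
# a flat curve, as in the plots, instead hints at flicker noise.
for m in (1, 10, 100):
    print(f"m = {m:3d}: {allan_deviation(x, m):.2f} uV")
```

So the fact that averaging does not help is itself a diagnostic: it means the dominant noise is not white.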
HP34461A:
On the time diagram the resolution appears much finer, with better stability.
The standard deviation of 3.6 uV is a factor of 2 better than above. No discrete values are visible.
The distribution looks more Gaussian, although two overlapping distributions are still present.
The autocorrelation shows a repeating pattern every 20000 samples.
Perhaps interference (a beat frequency) with the mains frequency?
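To illustrate how such a beat would look in the autocorrelation, here is a sketch on synthetic data (white noise plus an assumed 2 uV sine with a 20000-sample period; the amplitude and period are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
t = np.arange(n)
# Hypothetical readings: 3.6 uV white noise plus a small 2 uV beat with a
# period of 20000 samples, the kind of pattern the autocorrelation suggests.
x = rng.normal(0.0, 3.6, n) + 2.0 * np.sin(2 * np.pi * t / 20_000)
x = x - x.mean()

# FFT-based autocorrelation (zero-padded), normalized to 1 at lag 0.
f = np.fft.rfft(x, 2 * n)
acf = np.fft.irfft(f * np.conj(f))[:n]
acf = acf / acf[0]

# The periodic component reappears as a peak at its own period and as a
# dip at half the period, while the uncorrelated noise averages away.
print(f"lag 20000: {acf[20_000]:+.3f}, lag 10000: {acf[10_000]:+.3f}")
```

A clean peak recurring at one fixed lag, like in the plot, is exactly what a periodic interference (e.g. a mains beat) would produce.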
The Allan deviation also shows 0.8 uV stability.
It gets a little better when averaging 10-20 values.
I cannot say for certain what that means. Usually the Allan deviation and the standard deviation should be equal for 1 sample.
And why both instruments end up at nearly the same value is not clear to me either.
But since it is the whole system that is being judged, including the reference voltage,
the 0.8 uV may also be the stability of the reference voltage, which would be the limiting factor.
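One way to make sense of the gap between the standard deviations (3.6/8.6 uV) and the Allan deviation (0.8 uV): for pure white noise the two agree at 1 sample, but slow drift inflates the standard deviation while leaving the Allan deviation almost untouched. A sketch with assumed noise levels (0.8 uV white noise plus a hypothetical random-walk drift):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
white = rng.normal(0.0, 0.8, n)              # assumed 0.8 uV white noise
drift = np.cumsum(rng.normal(0.0, 0.05, n))  # assumed slow random-walk drift

def adev1(x):
    """Allan deviation at the basic sample interval (m = 1)."""
    d = np.diff(x)
    return np.sqrt(0.5 * np.mean(d ** 2))

# White noise alone: std dev and Allan deviation agree.  Adding drift
# inflates the std dev but barely changes the Allan deviation, which would
# explain a 3.6...8.6 uV sigma next to a 0.8 uV Allan deviation.
for name, x in (("white only   ", white), ("white + drift", white + drift)):
    print(f"{name}: std {x.std():5.2f} uV, Allan dev {adev1(x):.2f} uV")
```

On that reading, the 0.8 uV would be the short-term noise floor (possibly the reference), and the larger standard deviations mostly drift.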
And finally, I wanted to know the resolution of the 34461A and got a big surprise when making the histogram with 1000 classes:
it seems that the instrument switches range or resolution above a +10 uV difference.
Very strange. Does the instrument cheat? (secretly switching ranges)
I would have expected better maths in an instrument of this class.
With best regards
Andreas