Hello, and thanks again for your fine work on the Uni-T, I have more comments but its time to say goodnight soon!
There's another thread here on the same topic:
https://www.eevblog.com/forum/index.php?topic=601.msg7082#msg7082
I think what you ask is a very important question for anyone without access to a calibration reference standard. Calibration, and adjustment if need be, is like sharpening your knives or tinning your soldering iron: it keeps your tools in shape. Sure, you can work with a dull knife, up to a point, just as you can with a cheap, inaccurate DMM.
https://www.eevblog.com/forum/index.php?topic=323.msg7053;topicseen#msg7053
I brought up the topic in this post, in relation to what you are now doing: calibrating, or at least checking, your new DMM's accuracy and precision, and later its reliability (i.e., its accuracy and precision over a period of time, like months to years). Your high-end Fluke probably ties its specifications to a period of days or months from turn-on to reflect its reliability (handhelds usually express reliability by specifying a recommended calibration cycle of 1-2 years).
The only other option to buying these references is making your own, using a voltage reference chip; see the link in my post. The band-gap variety is very stable, but to be sure you would still need to check that chip against a calibrated meter, so the Geller-style approach is, I think, the best for now. It would cost about $10-20 per calibration cycle to recalibrate your references: $5 as Geller and the others charge, and the rest for postage.
Another approach to reducing your cost is to buy two references 4 months apart (Geller's is the most accurate so far). Since they are stable for roughly 6 months, before one reference's period runs out you can check it against the one that is still within its cycle, and vice versa. If your high-end meters are also checked against these standards, you can theoretically go for some time before sending them back for recalibration, since you can cross-reference each high-end reference and meter against the others.
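The cross-check logic above can be sketched in a few lines. This is a hypothetical illustration, not anything from the post: the 10 V nominal value, the 50 ppm per-reference tolerance, and the function name are all my own assumptions. The idea is simply that two references agree if their readings differ by less than the sum of their absolute tolerances (a conservative worst-case bound).

```python
# Hypothetical sketch of cross-checking two voltage references.
# Assumed values: 10 V nominal, each reference specified to 50 ppm
# (these numbers are illustrative, not from any datasheet).

def within_combined_tolerance(reading_a, reading_b, tol_a, tol_b):
    """True if two readings agree within the sum of their absolute
    tolerances -- a conservative worst-case agreement check."""
    return abs(reading_a - reading_b) <= tol_a + tol_b

nominal = 10.0            # volts
tol = nominal * 50e-6     # 50 ppm -> 0.0005 V per reference

# Readings taken with the same meter, one reference after the other:
print(within_combined_tolerance(10.0003, 9.9999, tol, tol))  # True: still agree
print(within_combined_tolerance(10.0020, 9.9999, tol, tol))  # False: one has drifted
```

If the check fails, of course, it only tells you that *something* drifted (one of the references or the meter), not which one; that is why having a third point of comparison, or sending one unit out for recalibration, eventually becomes necessary.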
It's one reason it's not unreasonable to own several DMMs of the same class: you can use them to monitor different parts of a design simultaneously, and they also serve to check each other's calibration. Of course, owning several DMMs of the Fluke 8505 class, if you really have little use for them, is more costly than getting a Geller.
Finally, in my experience quality meters may not drift at all. You learn how a particular device behaves only through experience, and once you get a feel for it, the true calibration cycle can be prolonged.
Hi,
I'm using the cheap Malone voltage standard for my home lab:
http://www.voltagestandard.com/Home_Page_JO2U.html
It's supposed to be 0.01% accurate.
I know about the more expensive Malone standard:
http://www.voltagestandard.com/New_Products.html
as well as the Geller Labs standard:
http://www.gellerlabs.com/SVR%20Series.htm
Anyone have opinions about these standards? Know of any better
ones that a hobbyist could afford?
I'm always looking for better standards :-)
Scott