I've noticed on this forum that getting a voltage standard is often the first step to becoming a voltnut. For example, the LT1021-based DMMCheck's 5 V standard is popular. Some people also use it as a benchmark for multimeters, posting measurement results from all kinds of meters: if the screen reads "4.99 V", the meter is declared accurate.
But it only tests a single point on a single range, so I don't understand why it is considered a useful test.
1. How can you be sure the error is consistent across all input voltages? Isn't it possible for the error to be larger or smaller at a different voltage, even within the same range? (See the sketch after this list.)
2. It only checks a single range. How can you be sure the other ranges are accurate? As far as I know, the range switch selects taps on a voltage divider, so a drifted or even defective range could go unnoticed. (There's a rough calculation on this further down.)
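To make point 1 concrete, here's a minimal numerical sketch (Python; the error coefficients are completely made up): two hypothetical meters that both read about 4.999 V with 5 V applied, yet have quite different errors elsewhere on the same 10 V range.

```python
# Two hypothetical error models for a meter's 10 V range.
# All coefficients are invented purely for illustration.

def meter_a(v_true):
    # Pure gain error: reads 0.02 % low everywhere on the range
    return v_true * (1 - 0.0002)

def meter_b(v_true):
    # Small offset plus a quadratic non-linearity, tuned so it
    # also reads 4.999 V with 5 V applied
    return v_true - 0.0005 + 2e-5 * v_true * (v_true - 10.0)

for v in (1.0, 5.0, 9.0):
    print(f"{v:3.0f} V in -> A: {meter_a(v):.4f} V   B: {meter_b(v):.4f} V")
```

Both meters pass the "4.99 V" check, but at 1 V meter B's relative error is roughly three times worse than meter A's, and you'd never know from the single 5 V point.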
So I think the single-voltage-reference test, at best, tells you that the analog-to-digital converter is working, that the internal reference is accurate, and that the divider used on that one range is accurate, but nothing more; for example, it says nothing about linearity.
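It also says nothing about the attenuated ranges. To put a rough number on point 2, suppose a high range uses a 100:1 input attenuator (the values below are hypothetical). A 0.1 % drift in one resistor shifts every reading on that range by roughly 0.1 %, while the 5 V check on the 10 V range sees nothing:

```python
# Hypothetical 100:1 attenuator for a high range: 9.9 MΩ over 100 kΩ.
R_TOP, R_BOTTOM = 9.9e6, 100e3

def reading_on_high_range(v_in, top_drift=0.0):
    """Displayed value with R_TOP drifted by the given fraction."""
    r_top = R_TOP * (1 + top_drift)
    ratio = R_BOTTOM / (r_top + R_BOTTOM)   # nominally 1/100
    return v_in * ratio * 100               # firmware scales back by the nominal 100

for drift in (0.0, 0.001):                  # 0 % and +0.1 % drift of the top resistor
    print(f"R_TOP drift {drift:+.1%}: 100 V applied reads "
          f"{reading_on_high_range(100, drift):.3f} V")
```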
For a more complete test, I think you also need something like a Kelvin-Varley divider and a high-voltage standard, or just a multimeter calibrator. But since those are exotic and out of reach for most people, they're not commonly used, so a single reference voltage has become the most popular test.
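For what it's worth, the point of a Kelvin-Varley divider (or a calibrator) is exactly to turn one good reference into many known points, so each range gets checked at several values. Here's a rough sketch of the kind of test plan I mean, with made-up divider ratios and a made-up accuracy spec:

```python
# Multi-point check derived from a single 10 V reference and an (assumed ideal)
# Kelvin-Varley divider. The ratios and the spec are examples only.
REFERENCE_V = 10.0
KVD_RATIOS = (0.1, 0.25, 0.5, 0.75, 0.9, 1.0)

def allowed_error(reading, pct=0.05, counts=2, count_value=0.0001):
    """Tolerance from a typical '±(% of reading + counts)' style spec."""
    return reading * pct / 100 + counts * count_value

for ratio in KVD_RATIOS:
    v = REFERENCE_V * ratio
    print(f"apply {v:7.4f} V -> pass if reading within ±{allowed_error(v) * 1000:.2f} mV")
```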