Do DMMs and their components experience constant drift? Yes. The best-studied references are calibration lab standards. For example, here is data for the 10 V output of a Fluke 732A voltage standard, from "Prediction of the Output Voltage of DC Voltage Standards" (Ilić et al., 2009):
And here is the 1.018V output, which is a resistive divider driven from the 10V output:
And here is data for the ESI SR104 standard resistor, compiled by zlymex, that I grabbed from here:
It's even common to use linear regression to predict the value of the reference more accurately than simply assuming the last calibrated value, as described in the paper I cited. This would be a waste of time if the references did not drift at a roughly constant rate.
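For illustration, here's a minimal sketch of that approach: fit a straight line to past calibration values and extrapolate to today. The calibration dates and voltages below are invented for the example, not taken from the paper.

```python
import numpy as np

# Invented calibration history for a hypothetical 10 V reference:
# (years since first calibration, measured output in volts)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
v = np.array([10.0000000, 10.0000102, 10.0000198, 10.0000301, 10.0000400])

# Least-squares straight-line fit: v ~ intercept + slope * t
slope, intercept = np.polyfit(t, v, 1)

drift_ppm_per_year = slope / 10.0 * 1e6    # ~1 ppm/year for this made-up data
predicted_at_5y = intercept + slope * 5.0  # extrapolated value at t = 5 years

print(f"fitted drift: {drift_ppm_per_year:.2f} ppm/year")
print(f"predicted value at 5 years: {predicted_at_5y:.7f} V")
```

With only a handful of calibration points this is crude, but it beats assuming the reference is still exactly at its last calibrated value, which is the point of the paper's approach.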
A DMM will obviously drift less predictably, because more components affect its accuracy, each with its own drift characteristics. Here is data from an HP 34401A calibrated three times in 12 years; I believe I got this data off the PMEL forum. The DMM is clearly drifting, although based on this limited data the drift does not seem linear.
If the components in a handheld meter drifted less than those in a calibration standard, the engineers designing the standards would have done a horrible job. So clearly the handhelds will drift too. Is this drift enough to be noticeable given their resolution? Maybe not.
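A back-of-envelope check, with assumed numbers: on a 6000-count handheld reading near full scale, one display count is roughly 167 ppm of reading. If the meter's internal reference drifted at, say, 30 ppm/year (a made-up but plausible figure), drift alone would take several years to move the last digit.

```python
# All numbers here are assumptions for illustration, not from any datasheet.
counts = 6000                  # display counts of a typical handheld DMM
one_count_ppm = 1e6 / counts   # one count near full scale, in ppm of reading
drift_ppm_per_year = 30.0      # assumed internal reference drift

years_to_move_one_count = one_count_ppm / drift_ppm_per_year
print(f"one count = {one_count_ppm:.0f} ppm of reading")
print(f"years for drift to move one count: {years_to_move_one_count:.1f}")
```

Which would explain why a handheld can look "bang on" for years while still drifting underneath its resolution.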
Also, very high or low ranges will generally drift more, and ACV will drift more than DCV. I suspect the claims about a meter still being bang-on after decades are usually about something like a 9 V battery, not a 10 MOhm resistor or a 700 V, 20 kHz AC signal. On that last point: in another report on 34401A meters that I pulled from PMEL, that test point was found out of tolerance 8 times out of 250 tests, the highest number of OOT events of any point:
Do you have any data on the meters being 'bang on'? Number of meters, the ranges they were tested on, the uncertainty of the test? How big is a 'bang'?
Are we talking about agreement to the least significant digit? The 24-hour tolerance? The 1-year tolerance?