Hello,
I was reading the specs for some DMMs, and instead of the typical xx% + xx digits, I came across xx% + xx% of range.
So if the range is 1000.00 V, I'm measuring 1 V, and the accuracy is 0.1% + 0.01% of range, then I could get a reading of up to 11.001 V?!
(1 V × 1.001) + (1000 V (range) × 0.01) = 11.001 V
And that's assuming I ignore the digits after the decimal point in the range. If I count all of them (100000 counts), 1 V could become:
(1 V × 1.001) + (100000 × 0.01) = 1001.001 V!!!
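For comparison, here is my arithmetic written out as a small Python sketch, assuming the textbook interpretation where each percentage is divided by 100 before being applied (the function name is just something I made up; maybe this division is where I'm going wrong):

```python
def dmm_uncertainty(reading, dmm_range, pct_of_reading, pct_of_range):
    """Worst-case error for a '% of reading + % of range' spec.

    Percentages are plain numbers, e.g. 0.1 means 0.1%.
    Both terms are converted to fractions by dividing by 100.
    """
    return reading * pct_of_reading / 100 + dmm_range * pct_of_range / 100

# 1 V measured on the 1000 V range, spec: 0.1% of reading + 0.01% of range
err = dmm_uncertainty(1.0, 1000.0, 0.1, 0.01)
print("worst-case error:", err, "V")
print("highest possible reading:", 1.0 + err, "V")
```

If that convention is right, the range term contributes 0.1 V rather than 10 V, which doesn't match the numbers I got above.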
Now, I suspect I'm making a mistake somewhere; please point me in the right direction.
Thanks!