Hi all.
According to the datasheet, the Fluke 28* has a short-circuit current of 1 mA when measuring resistance on the 500-ohm range. Can someone check whether this value is exactly 1 mA, or whether it differs in practice?
I have a Fluke 287 that shows a significant deviation when measuring resistance below 500 ohms (on the 500-ohm range). Recalibrating the multimeter does not fix the issue. On the 5K range and above, the problem does not occur. I suspect the issue may be in the SL10327 chip, specifically in its current source.
What's interesting: the deviation from the actual value shrinks toward the top of the range and even changes sign at 500Ω. For example (resistor value, reading, difference):
10Ω - 10.54Ω (+0.54Ω)
20Ω - 21.06Ω (+1.06Ω)
30Ω - 31.56Ω (+1.56Ω)
40Ω - 42.03Ω (+2.03Ω)
50Ω - 52.49Ω (+2.49Ω)
60Ω - 62.91Ω (+2.91Ω)
70Ω - 73.32Ω (+3.32Ω)
80Ω - 83.70Ω (+3.70Ω)
90Ω - 94.06Ω (+4.06Ω)
100Ω - 104.14Ω (+4.14Ω)
200Ω - 205.95Ω (+5.95Ω)
300Ω - 305.43Ω (+5.43Ω)
400Ω - 402.64Ω (+2.64Ω)
500Ω - 497.61Ω (-2.39Ω)
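To make the pattern easier to see, here is a small sketch (the data is just the table above, typed in by hand) that prints the *relative* error at each point. It shows the error falling steadily from about +5.4% at 10Ω to slightly negative at 500Ω, which looks less like a simple gain error in the current source (that would give a constant percentage) and more like a gain error combined with some resistance-dependent effect, e.g. a leakage path. That interpretation is only my speculation, of course:

```python
# Nominal resistance vs. Fluke 287 reading on the 500-ohm range
# (data copied from the table in the post).
data = [
    (10, 10.54), (20, 21.06), (30, 31.56), (40, 42.03), (50, 52.49),
    (60, 62.91), (70, 73.32), (80, 83.70), (90, 94.06), (100, 104.14),
    (200, 205.95), (300, 305.43), (400, 402.64), (500, 497.61),
]

rel_errors = []
for r, reading in data:
    rel = (reading - r) / r * 100.0  # relative error in percent
    rel_errors.append(rel)
    print(f"{r:4d} ohm: {rel:+6.2f} %")

# The relative error decreases monotonically with R and crosses zero
# near the top of the range.
```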
On the 5K range, the same resistors read:
100Ω - 0.1000K
200Ω - 0.2000K
300Ω - 0.3000K
400Ω - 0.4000K
500Ω - 0.5000K