I have noticed several times that the YR1035 milliohmmeter (which uses a 1 kHz test signal) displays the actual DC resistance of the conductor under test, seemingly unaffected by its inductance. I would have expected the inductance to show up as an increased reading, with the added reactance being 2πfL.
I had never taken the time to properly verify this until now. This time I measured the center conductor of a 10 m piece of coax cable (about 9.8 m actual):
1. L = 18 µH (measured with ST42 tweezers)
2. Rdc = 0.354 Ω (measured by passing a known DC current and measuring the voltage drop across the wire)
The YR1035 shows 0.355 Ω. There is no apparent effect of the inductive reactance, which should be about 0.11 Ω at 1 kHz. Since R and XL add in quadrature, the expected reading would be about 0.37 Ω (|Z| = √(Rdc² + (2πfL)²)), clearly above the measured 0.355 Ω.
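For reference, here is a quick sanity check of those numbers (plain Python, using the measured values above):

```python
import math

Rdc = 0.354   # measured DC resistance, ohm
L = 18e-6     # measured inductance, H
f = 1e3       # YR1035 test frequency, Hz

XL = 2 * math.pi * f * L        # inductive reactance: ~0.113 ohm
Z = math.sqrt(Rdc**2 + XL**2)   # |Z| of the series R-L: ~0.372 ohm
print(f"XL = {XL:.3f} ohm, |Z| = {Z:.3f} ohm")
```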
I checked the signal across the conductor under test with an oscilloscope. There's a lot of background EMI, but a 1 kHz waveform is indeed visible.
So... What am I missing? How can it display the actual DC resistance even when the measured conductor's inductance is high enough that it can't be ignored?
p.s. The ST42 also displays ~350 mΩ regardless of the test-signal frequency (from 100 Hz to 10 kHz); see the sweep below. I guess I'm missing some fundamental effect here!
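To quantify how big the discrepancy gets, this is what a simple series R-L model would predict over the ST42's frequency range (a sketch using the measured values above):

```python
import math

Rdc, L = 0.354, 18e-6   # measured values from above (ohm, H)

for f in (100, 1_000, 10_000):
    XL = 2 * math.pi * f * L         # reactance at test frequency f
    Z = math.sqrt(Rdc**2 + XL**2)    # expected |Z| of the series R-L
    print(f"{f:>6d} Hz: XL = {XL:.3f} ohm, |Z| = {Z:.3f} ohm")
```

At 10 kHz the predicted |Z| is about 1.19 Ω, more than three times the displayed ~350 mΩ, yet the reading doesn't budge.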