Emphasis added by me:
If a meter has a sufficiently high impedance, it's "invisible" to any circuit under test, so an external termination of any given value could be connected. I live in an HP/Agilent dominated analog world where dBv is by far the most common comparison point. If I'm reporting amplifier gain or frequency response, it's always a dBv thing.
The meter impedance remains around 10 MΩ when in dBm mode; it does not use a 600 Ω load: it is, in fact, measuring dBu, not dBm.
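For anyone who wants the distinction spelled out: dBu is a pure voltage level (reference 0.7746 V = sqrt(0.6) V), while dBm is a power level (reference 1 mW) that only coincides with dBu when a real 600 Ω load is actually present. A minimal sketch in Python (helper names are mine, not anything the meter does):

import math

V_REF_DBU = math.sqrt(0.6)   # 0.7746 V: the voltage that dissipates 1 mW in 600 ohm

def dbu(v_rms):
    # dBu is a voltage ratio; no assumption about the load
    return 20 * math.log10(v_rms / V_REF_DBU)

def dbm(v_rms, r_load):
    # dBm is a power ratio; only meaningful if r_load is really there
    return 10 * math.log10((v_rms ** 2 / r_load) / 1e-3)

print(dbu(0.7746))        # ~0.0 dBu
print(dbm(0.7746, 600))   # ~0.0 dBm: same number, but only into 600 ohm
print(dbm(0.7746, 50))    # ~10.8 dBm: same voltage, very different power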
To report gain, one just uses dB: it's a pure (power) ratio, no m, u or V. A subtraction in dBx is all that is needed (easier to do in my mind than logs and ratios).
The value in dBV is also a simple subtraction away:

dBV ~= dBu - 2.22

I'm sure you don't need the calculation spelled out; that's one of the advantages of using a logarithmic measure!
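Both points in code form (Python again, my own helper names): gain is the same subtraction whichever dBx you pick, and the dBu-to-dBV offset is the constant 20*log10(1 V / 0.7746 V) = 2.218 dB:

import math

V_REF_DBU = math.sqrt(0.6)                   # 0.7746 V

def dbu(v): return 20 * math.log10(v / V_REF_DBU)
def dbv(v): return 20 * math.log10(v / 1.0)  # dBV: reference is 1 V

v_in, v_out = 0.100, 1.00
print(dbu(v_out) - dbu(v_in))   # 20.0: gain in dB, computed via dBu
print(dbv(v_out) - dbv(v_in))   # 20.0: identical gain computed via dBV
print(dbu(1.0) - dbv(1.0))      # 2.218: the fixed dBu/dBV offset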
So, to wrap it up, there are certainly points for improvement:
- The display should have read dBu rather than dBm. I think most people don't care, and it's a bit late now (LCD change).
- The Rel key should also work for the secondary display when in dBm mode (so it becomes a pure dB mode). This could definitely be done in firmware¹: show the difference in measured values on the main display, and the dB difference on the secondary. It would be slightly confusing (= expect someone to complain), as the main display will show a difference and the secondary a ratio², but still handy.
At the same time, I still don't see any major advantage in one dBx over the others (I would have been just as happy with dBV, really...).
¹: The internal DC reference and its programmability have nothing to do with it. The displayed dBm value is just a fixed calculation on the measured AC value. Storing the dBm reference value along with the AC value (or recalculating it from the latter) and doing a subtraction is all it takes.
²: e.g. measure (and display) 1.00 V / 2.218 dBm, hit Rel, then measure 2.00 V: the display now reads 1.00 V / 6 dBm.
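For what it's worth, the whole suggested Rel behaviour reduces to something like this (a hypothetical sketch, obviously not the meter's actual firmware):

import math

def dbu(v):
    return 20 * math.log10(v / math.sqrt(0.6))

def rel(v_measured, v_reference):
    # main display: a difference of volts; secondary: a pure dB ratio
    return v_measured - v_reference, dbu(v_measured) - dbu(v_reference)

print(rel(2.00, 1.00))   # (1.00, 6.02): the example from note 2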