Hi,
the reason why you should not rely on nude virgins but rather have a good injection transformer is measurement accuracy. It is correct that the transformer can be normalized out (that was the prime argument of the Texas Instruments people in their Bode papers), but:
- The problems occur at the band edges, where the S21 of the transformer deviates a lot from 1 (e.g., by 10 dB for the B-WIT100 at 1 Hz).
Let's stay at the low-frequency end for now (that is where it is most prominent).
- Your injection amplitude is now, say, 10 dB down, i.e. your measurement noise goes up by the same amount.
- If you try to increase the drive level, you will soon run into core saturation, which creates harmonics and all kinds of other dirt effects (the first sketch after this list puts a number on it). The lower the frequency gets, the more pronounced this problem becomes. Don't forget that the allowable DC plus AC current to avoid saturation is in the range of 10 mA.
- At the high-frequency edge, stray inductance and interwinding capacitance can make your measurements problematic, because these parasitic elements are normally not very well known (the second sketch after this list shows how they set the corner).
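To put a number on that saturation limit, here is a minimal sketch in Python of the volt-second constraint, with invented core parameters (N, A_e and B_sat below are illustration values, not data for any real transformer). For a sine drive the peak flux density is B_pk = V_pk / (2*pi*f*N*A_e), so the saturation-limited drive voltage shrinks proportionally with frequency:

```python
import numpy as np

# Assumed core parameters, for illustration only (not a real transformer):
N     = 20       # primary turns
A_e   = 50e-6    # effective core area, m^2
B_sat = 0.3      # usable peak flux density, T (with headroom to true saturation)

def v_pk_max(f_hz):
    """Saturation-limited peak sine drive: V_pk = 2*pi*f*N*A_e*B_sat."""
    return 2 * np.pi * f_hz * N * A_e * B_sat

for f in (1, 10, 100, 1000):
    print(f"{f:5d} Hz: V_pk_max = {v_pk_max(f) * 1e3:8.2f} mV")
```

With these made-up numbers the core tolerates only about 2 mV of peak drive at 1 Hz; pushing harder just produces harmonics.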
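And for the other band edge, a crude sketch (again Python, with invented parasitic values) of why the HF behaviour is hard to predict: model an ideal 1:1 transformer with a series leakage inductance driving the interwinding capacitance in parallel with the load. Since L_leak and C_w are rarely specified, the resulting corner can sit almost anywhere:

```python
import numpy as np

# Invented parasitic values, for illustration only:
L_leak = 2e-6     # leakage inductance, H
C_w    = 50e-12   # interwinding capacitance, F
R_load = 50.0     # load seen by the injection winding, ohms

f = np.logspace(5, 8, 7)          # 100 kHz .. 100 MHz
s = 2j * np.pi * f

# Crude model: ideal 1:1 transformer, then series L_leak into C_w || R_load.
Z_par = 1 / (s * C_w + 1 / R_load)
H = Z_par / (s * L_leak + Z_par)

for fi, h_db in zip(f, 20 * np.log10(np.abs(H))):
    print(f"{fi:12.0f} Hz: {h_db:7.2f} dB")
```

Rerun it with other values for L_leak or C_w and watch the corner move; that is exactly the uncertainty I mean.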
So, transformer imperfections can be calibrated out, but at the cost of dynamic range.
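To make that trade-off concrete, a minimal sketch in Python (the -10 dB roll-off, the noise floor and the DUT level are assumed numbers): normalization divides the DUT measurement by the thru calibration, which recovers the gain exactly, but additive receiver noise gets divided by the same small |S21|, so the effective floor rises by exactly the transformer's insertion loss.

```python
import numpy as np

# Assumed numbers, for illustration only (no real instrument data):
s21_xfmr_db    = -10.0   # transformer S21 at the band edge, e.g. 1 Hz
noise_floor_db = -80.0   # receiver noise floor, relative to full injection
dut_db         = -40.0   # example DUT response

s21_xfmr = 10 ** (s21_xfmr_db / 20)

cal  = s21_xfmr                        # thru calibration: transformer only
meas = 10 ** (dut_db / 20) * s21_xfmr  # DUT measured through the transformer

# Normalization recovers the DUT gain exactly:
print(f"normalized DUT: {20 * np.log10(meas / cal):.1f} dB")   # -> -40.0 dB

# ...but additive noise is divided by the same small |S21|:
floor_after_db = noise_floor_db - s21_xfmr_db
print(f"effective noise floor: {floor_after_db:.0f} dB (was {noise_floor_db:.0f} dB)")
```

At the 1 Hz example from above you therefore pay the full 10 dB in dynamic range, even though the gain reading itself is correct.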
I think what we've got here is about the max that can be expected from a single-transformer design.
When you look at the market, the vendors (Ridley, Picotest, Omicron, ...) offer special transformers with heavy cores for the low end; their high-end behaviour is poor, however. The small cores do it just the other way around. It simply depends on what you need.
A technique that could work (I never tried it, just saw some literature) is stacked transformers, each covering its own frequency band; a rough sketch of how the bands could be stitched together follows below.
Making this work is extremely tricky, involves extensive alignment and compensation, and you can be glad to get a factor of 10 in frequency range over normal cores.
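As a rough illustration of how the measurement side of such a scheme could look (purely hypothetical; frequencies, crossover point and the fake DUT response below are all invented), each band is swept through its own transformer and the traces are stitched at a crossover inside the overlap region:

```python
import numpy as np

# Hypothetical sweep data from two runs: heavy-core transformer for the
# low band, small-core for the high band. All values are invented.
f_lo = np.logspace(0, 4, 50)              # 1 Hz .. 10 kHz
f_hi = np.logspace(3, 7, 50)              # 1 kHz .. 10 MHz
g_lo = -20 * np.log10(1 + f_lo / 1e3)     # fake DUT response, dB
g_hi = -20 * np.log10(1 + f_hi / 1e3)

f_cross = 3e3   # crossover chosen inside the overlap, where both cores are good

# Keep each trace only where its transformer is trustworthy, then join:
f_all = np.concatenate([f_lo[f_lo < f_cross], f_hi[f_hi >= f_cross]])
g_all = np.concatenate([g_lo[f_lo < f_cross], g_hi[f_hi >= f_cross]])

print(f"stitched sweep: {len(f_all)} points, {f_all[0]:.0f} Hz .. {f_all[-1]:.0f} Hz")
print(f"gain spans {g_all.max():.1f} .. {g_all.min():.1f} dB")
```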