Hello,
I'm currently trying to fix a Rohde & Schwarz CMU200 that I purchased as a non-operational unit from eBay. Most of the problems have been solved: a lot of black dust was cleaned out of everywhere, and the alignments (Vref and reference frequency) were done. But now I've come to something I can't understand (at least not right away).
The unit has two RF generators, and both work fine; I can see the correct levels and frequencies on another (calibrated) spectrum analyzer. Signals from both the external and internal RF generators, connected to any of the CMU inputs, are measured correctly by the CMU's wideband power meter (0.1-0.2 dB difference, comparable to the expected losses in the test cable). But the CMU analyzers (both the spectrum analyzer and the power-vs-time analyzer) read low. At first the difference was about 4 dB; it slowly decreased and has now "stabilized" at about 2.3 dB at 100 MHz after the unit ran continuously for a few days. It varies between 2 and 3 dB across the whole band.
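For anyone who wants to log this kind of offset rather than read it off the screen, a rough sketch over the remote-control interface could look like the following. Note that the VISA address and all SCPI strings here are placeholders I have not checked against the CMU200 remote-control manual, so they would need to be replaced with the real function-group commands:

```python
import time
import pyvisa

# NOTE: the VISA address and all SCPI strings below are placeholders;
# the real CMU200 commands have to be taken from its remote-control manual.
rm = pyvisa.ResourceManager()
cmu = rm.open_resource("GPIB0::20::INSTR")   # hypothetical GPIB address
cmu.timeout = 10000

set_level_dbm = -20.0
freqs_mhz = [100, 400, 900, 1800, 2200]

for f in freqs_mhz:
    # Tune the internal generator and the analyzer to the same frequency
    # at a fixed level (placeholder commands, not verified).
    cmu.write(f"SOURce:RF:FREQuency {f} MHZ")
    cmu.write(f"SOURce:RF:LEVel {set_level_dbm}")
    cmu.write(f"SENSe:RF:FREQuency {f} MHZ")
    time.sleep(0.5)

    # Read the wideband power meter and the narrowband analyzer
    # (placeholder queries) and log the difference between them.
    wideband = float(cmu.query("READ:WPOWer?"))
    analyzer = float(cmu.query("READ:NPOWer?"))
    print(f"{f} MHz: wideband {wideband:+.2f} dBm, "
          f"analyzer {analyzer:+.2f} dBm, "
          f"delta {wideband - analyzer:+.2f} dB")
```

The "delta" column is where the 2-3 dB discrepancy shows up, and running the loop repeatedly during warm-up would show the drift over time as well.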
All self-tests in maintenance mode show a similar difference (about 2-3 dB), with both internal and external loopback. Any combination of inputs and outputs (when the internal RF generator is used) gives the same result, so it is not a front-end problem.
The unit was reset to factory defaults to rule out user attenuation data being loaded. Nothing changed.
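In case any user correction data somehow survives the reset, it could also be checked directly over the remote interface; again, the command names below are placeholders and would need to be looked up in the manual:

```python
import pyvisa

# Placeholder VISA address and query names; the actual correction /
# external-attenuation commands must be taken from the CMU200 manual.
rm = pyvisa.ResourceManager()
cmu = rm.open_resource("GPIB0::20::INSTR")

for connector in (1, 2, 3, 4):
    # Query whatever external input attenuation is currently stored
    # for this RF connector (hypothetical query string).
    loss = cmu.query(f"SENSe:CORRection:LOSS:INPut{connector}?")
    print(f"RF{connector} external input attenuation: {loss.strip()} dB")
```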
Can anyone advise where to look next?