-
Hello,
Always making sure everything is set for 50 Ohms: if I measure the output level of a simple sine wave from my RF signal generator or my arbitrary waveform generator over a wide frequency range using my spectrum analyzer, staying well within the BW limits of all three instruments to avoid analog rolloff, the SA measures almost exactly what the two generators say they're outputting. It even remains very close right up TO their specified limits. So their outputs seem to be well calibrated (the SA too).
But if I measure their outputs at the low end of either of two oscilloscopes' BW range, both oscilloscopes measure significantly higher. As I increase the frequency the measured level decreases, and I hit a point somewhere in the middle of their BW range that agrees. Then as I continue to increase the frequency the measurement continues to gradually decrease, to about the expected 3dB down AT the oscilloscope's advertised BW. The advertised BWs of the oscilloscopes are much lower than those of the RFSG or the SA (i.e. the SGs are not being taxed in the oscilloscope measurements).
This appears to be normal behavior for at least these two oscilloscopes (a new Siglent SDS3034X and an R&S MXO4). I expected the measurements to be accurate at low frequencies and stay that way until up near the oscilloscope's BW limit, then start to roll off to the expected 3dB down AT the specified BW limit.
What am I missing ?
Thanks, Russ -
Can you provide the actual numbers of the frequencies involved at each step? For the oscilloscopes there is no lower BW limit so we don't know whether "low" means 1MHz or 100Hz. The same goes for "low" and "high" and "significantly higher" with regard to amplitude. I would expect the scopes to be fairly flat from DC to about the middle of their range, with "fairly flat" meaning less than 2% deviation. But IDK if the specs actually support that.
-
The display on an SA is in dBm, so it is logarithmic; that of an oscilloscope is normally linear.
What looks negligible on the former will be quite obvious on the latter.
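To put rough numbers on it, here is a minimal sketch (the dB values below are arbitrary examples, not measurements from this thread) converting a level error in dB to the equivalent linear voltage error:

```python
# A small-looking dB error is a much larger-looking linear (voltage) error.
# The error values below are arbitrary examples for illustration.
for err_db in (0.1, 0.5, 1.0, 3.0):
    err_pct = (10 ** (err_db / 20) - 1) * 100  # voltage ratio, 20*log10 convention
    print(f"{err_db:3.1f} dB  ->  {err_pct:4.1f} % of voltage")
```

A 0.5 dB step that barely moves a dBm readout is a ~6% amplitude change on a linear oscilloscope display.
-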
Despite the OP claiming everything is set to 50 Ohm termination, it sounds like the oscilloscopes are not set to 50 Ohm input termination. And there might be probing / cabling issues but without knowing the exact setup, this is impossible to tell.
-
Right, thanks to all for the reply. I deliberately didn't state the actual values because I'm asking about this general behavior, not the values at any specific part of the range. The specific values don't really matter and I don't want to set off diversions about them.
The measurement is at some maximum value all the way at the bottom of the range, say 10 Hz or below, falls as the frequency rises, hits the actual value the gens are outputting about 1/3 of the way up the range, then slowly continues to decrease all the way to the top of (and beyond) the O-scope's BW range. As I stated, and as you stated, I would have expected the O-scopes to be pretty accurate from the bottom of their range all the way to the point where their responses start to roll down to the expected 3dB down AT the specified BW. Instead they only accidentally pass through the correct value as you roll the sine wave's frequency up from way below to way above. Not that it matters but the Siglent's BW is 350 MHz and the MXO4's is 1.5 GHz.
I'd suspect the O-scope except both behave the same way. I'd suspect the termination settings on both ends but they're definitely at 50 Ohms. I've checked over and over again because that's an obvious simple-minded explanation. It's not the cable (which is 50 Ohms of course) because several do the same thing, and the behavior starts way below where cable quality matters much anyway. I'd suspect the quality of the scopes themselves but one is a brand new R&S MXO4 which works perfectly in every other respect. And frankly, the Siglent ain't no slouch either. And again, they both do the same thing. And as I said, I'm seeing this with two different signal gens which measure correctly otherwise.
Weird. Anyway, thanks for the response.
Russ
-
Right, thanks to all for the reply. I deliberately didn't state the actual values because I'm asking about this general behavior, not the values at any specific part of the range. The specific values don't really matter and I don't want to set off diversions about them.
That is invalid reasoning.
What is the numerical value of the fall? 1%? 1dB? 10dB?
Is the upper frequency limit 350MHz or 1.5GHz?
What type of cable are you using, and what is its loss at those frequencies?
Numbers, not adjectives.
What are the measured values at different frequencies? Tabular data is sufficient. -
At the OP: what is the sample rate? Is high-res enabled? What happens when peak detect is enabled? And what memory length setting is used? What is the sweep speed?
What you are describing can also be a sample rate / aliasing related issue. Not just from how the oscilloscope is sampling, but also from how the data is decimated before display on the screen. A frequency sweep can play nasty tricks when it comes to decimating data before it is displayed.
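To illustrate the decimation point, a minimal sketch (assumed numbers, not either scope's actual decimation scheme): a sine captured at a high sample rate, then decimated for display. Naive keep-every-Nth-sample decimation can miss the crests, while peak-detect style decimation preserves the true amplitude:

```python
import numpy as np

# Assumed acquisition parameters for illustration only.
fs = 5e9                    # acquisition sample rate, Sa/s
f_sig = 99e6                # signal frequency, Hz
n = 1000                    # display decimation factor

t = np.arange(0, 2e-6, 1 / fs)            # 2 us capture window
v = 1.0 * np.sin(2 * np.pi * f_sig * t)   # true peak = 1.000 V

naive = v[::n]                            # keep every Nth sample
blocks = v[: len(v) // n * n].reshape(-1, n)
peak_detect = blocks.max(axis=1)          # max of each block, per display column

print(f"naive decimation : {naive.max():.3f} V")       # under-reads the peak
print(f"peak detect      : {peak_detect.max():.3f} V")  # ~1.000 V
```

-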
The measurement is at some maximum value all the way at the bottom of the range, say 10 Hz or below...
Which measurement? Most scopes have several (if not dozens of) measurements.
As the above posters have mentioned, specifics matter. You'd like a generic answer but there isn't one, it all comes back to the (huge number of) configurations/settings of the scope in the specific situation. -
Always making sure everything is set for 50 Ohms
For lower frequency oscilloscopes, like up to 500 MHz, the 50 ohm input setting just places a 50 ohm termination across the input, leaving the input capacitance of 10 to 20 picofarads in parallel with the 50 ohm input.
This appears to be normal behavior for at least these two oscilloscopes (a new Siglent SDS3034X and an R&S MXO4). I expected the measurements to be accurate at low frequencies and stay that way until up near the oscilloscope's BW limit, then start to roll off to the expected 3dB down AT the specified BW limit.
Maintaining low aberrations in an oscilloscope's transient response requires a compromise in passband flatness to achieve linear group delay, or something close to it. At 10% of bandwidth, the response is likely down by 0.5%, and down 2% by 20% of bandwidth.
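For a feel of how much flatness that compromise costs, a sketch assuming an idealized Gaussian voltage response normalized so the rated bandwidth sits exactly 3dB down (a common textbook model, not either scope's measured curve):

```python
import math

# Idealized Gaussian response: |H(f)| = 2 ** (-0.5 * (f / f_3dB) ** 2),
# which gives |H| = 1/sqrt(2) (-3 dB) exactly at the rated bandwidth.
for frac in (0.1, 0.2, 0.5, 1.0):
    h = 2 ** (-0.5 * frac ** 2)
    db = 20 * math.log10(h)
    print(f"f = {frac:3.1f} * BW -> |H| = {h:.4f} ({(1 - h) * 100:4.1f} % down, {db:5.2f} dB)")
```

This model gives roughly 0.3% down at 10% of BW and 1.4% at 20%, the same ballpark as the figures above; real front ends tuned for linear group delay can sag a little more.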
-
The measurement is at some maximum value all the way at the bottom of the range, say 10 Hz or below...
Which measurement? Most scopes have several (if not dozens of) measurements.
As the above posters have mentioned, specifics matter. You'd like a generic answer but there isn't one, it all comes back to the (huge number of) configurations/settings of the scope in the specific situation.
Just so.
Two advantages of analogue scopes: fewer subtly important settings hidden deep in a menu system, and a visceral appreciation of the limitations of all scopes' Y amplification chain. -
For lower frequency oscilloscopes, like up to 500 MHz, the 50 ohm input setting just places a 50 ohm termination across the input, leaving the input capacitance of 10 to 20 picofarads in parallel with the 50 ohm input.
Sort of, but there is almost always a significant series resistance with that 10-20pF, perhaps 1-2k. That is insignificant when the input is set to 1M, but at 50R it makes the capacitance far less of a load. A 20pF capacitor at 500MHz would have an impedance of ~16 ohms; directly in parallel with 50R, that would be unusable.
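As a quick sanity check of those impedances (the 1k series value is the estimate from above, not a datasheet figure):

```python
import math

# Shunt-branch magnitude at 500 MHz: bare 20 pF versus 20 pF with an
# assumed 1 kohm series resistance.
f, c, r_series = 500e6, 20e-12, 1000.0
xc = 1 / (2 * math.pi * f * c)        # capacitive reactance, ohms (~16)
z_rc = abs(complex(r_series, -xc))    # |R - jXc| of the series branch

print(f"20 pF alone      : {xc:6.1f} ohm across the 50 ohm input")
print(f"20 pF + 1k series: {z_rc:6.1f} ohm (negligible loading)")
```

-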
For lower frequency oscilloscopes, like up to 500 MHz, the 50 ohm input setting just places a 50 ohm termination across the input, leaving the input capacitance of 10 to 20 picofarads in parallel with the 50 ohm input.
Sort of, but there is almost always a significant series resistance with that 10-20pF, perhaps 1-2k. That is insignificant when the input is set to 1M, but at 50R it makes the capacitance far less of a load. A 20pF capacitor at 500MHz would have an impedance of ~16 ohms; directly in parallel with 50R, that would be unusable.
That greatly depends on where the capacitance is coming from. There isn't a 20pF capacitor in the front-end; the capacitance comes from all the connectors, cabling, components, etc. If you take a 50 Ohm microstrip trace or 50 Ohm coax cable and measure it with an LCR meter, you'll find a significant amount of capacitance. But that is in a non-impedance-matched condition. In case the impedance is matched, the capacitance doesn't matter.
-
That greatly depends on where the capacitance is coming from. There isn't a 20pF capacitor in the front-end; the capacitance comes from all the connectors, cabling, components, etc. If you take a 50 Ohm microstrip trace or 50 Ohm coax cable and measure it with an LCR meter, you'll find a significant amount of capacitance. But that is in a non-impedance-matched condition. In case the impedance is matched, the capacitance doesn't matter.
The physical construction between the BNC and high impedance input buffer is not a 50 ohm transmission line, or anything close to it, and the buffer itself with its protection circuits represents a lumped capacitance. The 50 ohm termination must be before the high impedance attenuator to support a useful input voltage range.
Some oscilloscopes use a transmission line relay immediately after the input to direct the signal between the high impedance and 50 ohm input amplifiers. The old Tektronix 485 did this, and I suspect most modern instruments faster than 500 MHz do as well.
-
In case the impedance is matched, the capacitance doesn't matter.
That's true too, I hadn't thought of that possibility. In a coax there are so many pF per meter, and that's measurable at LF but not a factor at HF, as you say. But that capacitance is distributed over some length, while in the front end of the scope it is all within a few cm. Perhaps that can be entirely a transmission line as well, IDK. I based my conclusions on looking at the input of scopes with a VNA and an LCR meter to determine VSWR, capacitance and ESR of that capacitance. My conclusions were that the input does not behave as if there were 20pF directly in parallel with the 50R termination and that a 400MHz scope couldn't possibly work if that capacitance was there. It is possible my series resistance measurement is reflecting some other characteristic.
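On the matched-line point, a minimal sketch of why the per-meter capacitance drops out when the line is terminated in its characteristic impedance, using typical RG58-ish values (assumed, not measured):

```python
import math

# For a (lossless) transmission line, Z0 = sqrt(L/C) of the distributed
# inductance and capacitance. Typical RG58-style values, assumed:
L_per_m = 250e-9    # H/m
C_per_m = 100e-12   # F/m

z0 = math.sqrt(L_per_m / C_per_m)
print(f"Z0 = {z0:.1f} ohm")  # ~50 ohm: terminated in Z0, the ~100 pF/m
                             # never appears as a lumped shunt load
```

-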
My conclusions were that the input does not behave as if there were 20pF directly in parallel with the 50R termination and that a 400MHz scope couldn't possibly work if that capacitance was there.
Why not?
If you are using a *1 passive probe, then the probe lead's capacitance will dominate the scope input capacitance.
If you are using a *10 "high" impedance passive probe, then that consists of RC network at the tip, a lossy transmission line, RLC network where the probe connects to the scope. The RLC network can compensate for the 20pF. FFI see Fig 2-12 from https://w140.com/tekwiki/images/6/62/062-1146-00.pdf
And for reference, here's the input attenuator of the Tek 2465. You can see the 50ohm termination is simply banged across the 1Mohm//15pF "high" impedance input to make it 50ohm//15pF. The HP1740 does the same.
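For what it's worth, here is what an ideal lumped 50R//15pF input would look like on paper (a naive model with assumed values; the measured behavior discussed above suggests real front ends do better than this):

```python
import math

# Naive lumped model: 50 ohm termination directly in parallel with 15 pF.
r, c = 50.0, 15e-12
for f in (50e6, 100e6, 300e6, 400e6):
    zc = complex(0, -1 / (2 * math.pi * f * c))  # capacitor impedance, -j/(2*pi*f*C)
    z = r * zc / (r + zc)                        # parallel combination
    print(f"{f/1e6:5.0f} MHz: |Zin| = {abs(z):4.1f} ohm")
```

If the capacitance really sat straight across the termination, the input would be well below 50 ohms by 300-400 MHz, which is exactly the puzzle being debated here.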
-
That's true too, I hadn't thought of that possibility. In a coax there are so many pF per meter, and that's measurable at LF but not a factor at HF, as you say. But that capacitance is distributed over some length, while in the front end of the scope it is all within a few cm. Perhaps that can be entirely a transmission line as well, IDK. I based my conclusions on looking at the input of scopes with a VNA and an LCR meter to determine VSWR, capacitance and ESR of that capacitance. My conclusions were that the input does not behave as if there were 20pF directly in parallel with the 50R termination and that a 400MHz scope couldn't possibly work if that capacitance was there. It is possible my series resistance measurement is reflecting some other characteristic.
The old Tektronix Circuit Concepts book about probes mentions that advanced termination schemes lower the input capacitance as frequency rises. The input capacitance specification is mostly significant for probe compensation.
-
In case the impedance is matched, the capacitance doesn't matter.
That's true too, I hadn't thought of that possibility. In a coax there are so many pF per meter, and that's measurable at LF but not a factor at HF, as you say. But that capacitance is distributed over some length, while in the front end of the scope it is all within a few cm. Perhaps that can be entirely a transmission line as well, IDK. I based my conclusions on looking at the input of scopes with a VNA and an LCR meter to determine VSWR, capacitance and ESR of that capacitance. My conclusions were that the input does not behave as if there were 20pF directly in parallel with the 50R termination and that a 400MHz scope couldn't possibly work if that capacitance was there. It is possible my series resistance measurement is reflecting some other characteristic.
A good oscilloscope spec also has the VSWR for 50 Ohm input mode. For example: the Tektronix TDS510A specifies < 1.3:1 in 50 Ohm mode even though this model -as far as my information goes- just switches a 50 Ohm load across the input to have 50 Ohm input termination.
-
A good oscilloscope spec also has the VSWR for 50 Ohm input mode. For example: the Tektronix TDS510A specifies < 1.3:1 in 50 Ohm mode even though this model -as far as my information goes- just switches a 50 Ohm load across the input to have 50 Ohm input termination.
Yes, the Tek 485, which is oft cited as the paragon of perfection in input circuit design, uses transmission line design all the way in and is rated for <1.25 VSWR to 350MHz. But they stopped doing that when they came out with the 2465 series and the spec there was slightly worse, <1.3 to 300MHz (later models 350 and 400MHz). My Siglent SDS2354X+ doesn't list a spec, but I've measured it and it seems to be comparable with every other scope we've discussed here, 1.43 @ 500MHz and quite a bit lower at 350MHz and below. -
A good oscilloscope spec also has the VSWR for 50 Ohm input mode. For example: the Tektronix TDS510A specifies < 1.3:1 in 50 Ohm mode even though this model -as far as my information goes- just switches a 50 Ohm load across the input to have 50 Ohm input termination.
Yes, the Tek 485, which is oft cited as the paragon of perfection in input circuit design, uses transmission line design all the way in and is rated for <1.25 VSWR to 350MHz. But they stopped doing that when they came out with the 2465 series and the spec there was slightly worse, <1.3 to 300MHz (later models 350 and 400MHz). My Siglent SDS2354X+ doesn't list a spec, but I've measured it and it seems to be comparable with every other scope we've discussed here, 1.43 @ 500MHz and quite a bit lower at 350MHz and below.
In a thread "Need 50 ohm input on an (old) oscilloscope that has only 1M ohm inputs: BNC TEE?", I wrote: I've just measured my 2465. The return loss is, to pick a single figure:
- 485 internal attenuator: -35dB (VSWR <1.07)
- 2465 internal attenuator: -18dB (VSWR <1.28, spec 1.3)
- 485 with inline terminator: -8dB
You can see the VSWR is much better than 1.25.
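For reference, the standard conversion between those return-loss and VSWR figures (nothing scope-specific here):

```python
# |Gamma| = 10 ** (-RL_dB / 20);  VSWR = (1 + |Gamma|) / (1 - |Gamma|)
for label, rl_db in [("485 internal attenuator", 35.0),
                     ("2465 internal attenuator", 18.0),
                     ("485 with inline terminator", 8.0)]:
    gamma = 10 ** (-rl_db / 20)
    vswr = (1 + gamma) / (1 - gamma)
    print(f"{label:26s} RL = -{rl_db:4.1f} dB -> VSWR = {vswr:.2f}")
```

-35dB works out to about 1.04, -18dB to about 1.29, and -8dB to about 2.32.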