I did some testing with the Lecroy 7300A but didn't get much wiser. First of all, it turns out my Mini-Circuits directional coupler is broken, and the other one I have is a cheap one from eBay which only goes to 200MHz. With the latter, the SWR trace (using an Anritsu MS4630B network analyser) isn't flat (from 1.00 @ 30MHz to 1.05 @ 200MHz) when connected to the 7300A's input. But I get the same result with the R&S RTB3004.
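For scale, here's a quick back-of-envelope (my own numbers, nothing from the analyser) converting SWR readings like those into reflection coefficient and return loss; 1.05 is a pretty small mismatch:

```python
import math

def swr_to_gamma(swr: float) -> float:
    """Magnitude of the reflection coefficient for a given SWR."""
    return (swr - 1.0) / (swr + 1.0)

def return_loss_db(swr: float) -> float:
    """Return loss in dB (larger = better match)."""
    return -20.0 * math.log10(swr_to_gamma(swr))

for swr in (1.01, 1.05, 1.10, 1.50):
    print(f"SWR {swr:4.2f}: |Gamma| = {swr_to_gamma(swr):.4f}, "
          f"RL = {return_loss_db(swr):5.1f} dB")
```

At SWR 1.05 only about 2.4% of the voltage is reflected (roughly 32dB return loss), so even at 200MHz the input looks like a decent 50 Ohm load.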
I get exactly the same trace on both the Lecroy 7300A and the R&S RTB3004 in 50 Ohm mode using two different generators. A 12dB inline attenuator doesn't change the signal shape at all on the Lecroy 7300A.
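If the scope input really were badly mismatched, a pad should help a lot, since the reflection crosses it twice and gets knocked down by double the pad value. A little sketch (the 1.5 SWR is an assumed worst case, not a measurement):

```python
def gamma_from_swr(swr: float) -> float:
    return (swr - 1.0) / (swr + 1.0)

def swr_from_gamma(gamma: float) -> float:
    return (1.0 + gamma) / (1.0 - gamma)

pad_db = 12.0                       # the inline attenuator used above
gamma_in = gamma_from_swr(1.5)      # hypothetical badly matched input

# The reflected wave crosses the pad twice (out and back), so it is
# attenuated by 2 * pad_db relative to the incident wave.
gamma_seen = gamma_in * 10 ** (-2.0 * pad_db / 20.0)

print(f"SWR at the input itself : {swr_from_gamma(gamma_in):.2f}")
print(f"SWR seen through the pad: {swr_from_gamma(gamma_seen):.3f}")
```

So 12dB of padding would turn even a 1.5 SWR into a near-perfect match; the fact that the waveform didn't change at all points away from the scope input.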
All in all nothing conclusive yet.
I did some further investigation. It seems the problem is not the oscilloscope input but the interaction between the signal source and the probe itself: if I put an attenuator or a 50 Ohm feed-through between the Lecroy 7300A's calibrator output and the probe, the signal improves.
More likely: "In that case the effect you see without the source 50 Ohm might be due to the cal out only behaving properly when correctly loaded."
But isn't the whole point of probing to see the signal as it is, as much as possible? Requiring a signal to be 'correctly loaded' is not always possible or even desirable (think of the good old PCI bus, for example: an unterminated high-speed bus).
Yes, and in your experiment you were seeing the signal correctly when it didn't have the 50 Ohm termination. It is just that the source's output was different when not loaded with 50 Ohms.
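In divider terms (a toy model assuming an ideal back-matched 50 Ohm cal output, which the real thing only approximates):

```python
V_EMF = 1.0      # internal source EMF, arbitrary units
Z_SRC = 50.0     # assumed ideal back-matched calibrator output

for name, z_load in [("matched 50 Ohm", 50.0),
                     ("1 MOhm scope input", 1e6),
                     ("open circuit", float("inf"))]:
    v = V_EMF if z_load == float("inf") else V_EMF * z_load / (Z_SRC + z_load)
    print(f"{name:>18}: {v:.4f} x EMF at the load")
```

Into 1 MOhm or an open the source delivers essentially its full EMF, twice the matched amplitude, and any imperfection in its back-match shows up directly in the edge shape.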
Here's the Tek 485's cal out when driving 1Mohm//20pF via a 2m lead. The timebase is 100ns/div. Not exactly a 1ns risetime - or rather, each step is a 1ns risetime.
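For what it's worth, a toy lattice-diagram sketch (all numbers assumed, not measured from the 485) reproduces that kind of staircase: with the far end effectively open, each cable round trip adds a step until the level settles:

```python
# An edge from an imperfectly matched source into a ~open 1Mohm//20pF
# scope input via 2m of coax arrives as a staircase, one fast step per
# cable round trip.
C = 3e8
length, vf = 2.0, 0.66               # lead length, assumed velocity factor
t_rt = 2 * length / (vf * C)         # round-trip time on the lead
print(f"round trip ~ {t_rt * 1e9:.0f} ns")  # ~20 ns: a few steps per div

Z0 = 50.0                            # lead impedance
gamma_load = 1.0                     # 1 Mohm looks open compared to Z0
gamma_src = 0.3                      # assumed residual cal-out mismatch
z_src = Z0 * (1 + gamma_src) / (1 - gamma_src)
fwd = Z0 / (Z0 + z_src)              # first wave, per unit of source EMF

v = 0.0
for n in range(5):
    v += fwd * (1 + gamma_load)      # arriving wave bumps the load voltage
    print(f"t = {n * t_rt * 1e9:5.0f} ns: v_load = {v:.3f} x EMF")
    fwd *= gamma_load * gamma_src    # re-reflect off load, then source
```

Each individual step still rises as fast as the source's edge; it's the settling between steps that stretches the apparent risetime across the screen.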
The point is to measure the signal in normal operation without disturbing it. If the signal is meant to be unterminated (e.g. a PCI bus), then of course you wouldn't terminate it, because that wouldn't reflect (ho ho) normal operation.