One of the common distinctions you run into is 75 vs 50 ohms. A half-wave dipole has a feedpoint impedance of roughly 73 ohms, so a lot of consumer coax and devices use 75 ohms.
"Professionals" tend to use 50 ohms. Honestly, I think that choice originated in the US, like their choice of 60Hz for electrical power. As close as I can tell in both cases, because it makes the basic math just a bit easier.
Anyway, the simple answer is that you need to match impedances.
There are many ways to do this, depending on the type of transmission line. If the line under test is reasonably well matched, a simple transformer will do. Unfortunately, with RF the match is often far from perfect, so going between 50 and 75 ohms becomes more problematic.
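To put rough numbers on how bad a raw 50-to-75 connection actually is, here's a quick sketch using the standard transmission-line formulas (plain Python; the function name `mismatch` is just mine, not from any library). It also computes the impedance a quarter-wave matching section would need, which is one classic narrowband fix:

```python
import math

def mismatch(z_load, z_source):
    """Reflection coefficient, VSWR, return loss, and mismatch loss
    for a load connected directly to a line of different impedance."""
    gamma = (z_load - z_source) / (z_load + z_source)  # reflection coefficient
    vswr = (1 + abs(gamma)) / (1 - abs(gamma))
    return_loss_db = -20 * math.log10(abs(gamma))
    mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)
    return gamma, vswr, return_loss_db, mismatch_loss_db

# Plugging 75-ohm coax straight into 50-ohm gear:
gamma, vswr, rl, ml = mismatch(75, 50)
# gamma = 0.2, VSWR = 1.5, return loss ~14 dB, mismatch loss ~0.18 dB

# A quarter-wave matching section needs an impedance of sqrt(Z1 * Z2);
# note it only works over a narrow band around the design frequency.
z_quarter_wave = math.sqrt(50 * 75)  # ~61.2 ohms
```

The ~0.18 dB mismatch loss is why a direct 50/75 connection is often tolerated in receive-only setups, but the 1.5:1 VSWR matters when your instrument assumes a matched line.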
If you want to do it the hard way, you recalibrate your instruments to the impedance of the line under test each time you change setups.