Thank you for the link and for pointing out that I did not provide enough information for the question to be answered. I tried not to overload the question with irrelevant information, to prevent readers from thinking "too long; didn't read". Thankfully, the situation is not as bad as in your quote. I mainly have two goals here:
The first and less critical goal is that I want to be able to verify the quality of BNC cables I buy or make. In the past, I have bought cables from sources I thought I could trust, only to discover later that the connectors were of terrible electrical quality or that the cables had abysmal frequency-dependent return loss. Some cables can also go bad unnoticed over time, e.g. foam-dielectric coax that has been bent too sharply. More than once this made me believe that certain measurement gear was bad, when in fact it was only a matter of bad cables. So being able to ballpark my cables would be nice, to find out which cables to throw away and which not to use for precision measurements.
The second goal is that I have repaired some vintage calibration gear. For a correct calibration of the gear, and of the devices that will later be calibrated with it, a precision 50Ω +/- 1% cable with a length of 36 inches is required. These cables are then paired with the calibration device so that precise statements can later be made about the calibration accuracy of devices calibrated with this combination. Unfortunately, the original cables used for this, e.g. the Tektronix 012-0482-00, are close to unobtainium nowadays, and I know from people who worked with these cables daily back in the day that they had a tendency to go out of spec after a while. So original vintage cables cannot be trusted without testing, and a way is needed to validate them.
Due to the poor availability of original cables, I tried to get information on their precise specifications to see if there is a way of making equivalent cables from parts that are available today. According to my current findings, the validation of a cable should at least include verifying that the impedance stays within 50Ω +/- 1% below 1GHz and that the VSWR is less than 1.3 below 3GHz.
A few years ago, Dennis Tillman did an in-depth TDR evaluation of an original cable (link) to reverse engineer its specifications using a Tektronix 7S12. This evaluation suggests that an impedance measurement accuracy of at least +/- 0.1Ω, or +/- 1 mρ, is possible, and it also allows a close look at what is going on inside the BNC plugs used (given that a high quality 50Ω termination is used). So 0.1Ω of deviation is certainly no big deal for my application as long as the 1% spec (i.e. +/- 0.5Ω) is met, but my impression is that roughly this resolution is the minimum required to see whether the 1% spec is met at all.
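To get a feel for these numbers, here is a small back-of-the-envelope Python sketch of my own (not taken from Dennis' write-up) that relates a resistive impedance deviation from a 50Ω reference to reflection coefficient, VSWR and return loss, using the standard formulas ρ = (Z - Z0)/(Z + Z0), VSWR = (1 + |ρ|)/(1 - |ρ|) and RL = -20·log10(|ρ|). It shows why 0.1Ω and 1 mρ are essentially the same statement, and what the 1% limit of +/- 0.5Ω corresponds to:

import math

Z0 = 50.0  # reference impedance in ohms

def reflection_coefficient(z: float, z0: float = Z0) -> float:
    """rho = (Z - Z0) / (Z + Z0) for a purely resistive mismatch."""
    return (z - z0) / (z + z0)

def vswr(rho: float) -> float:
    """VSWR = (1 + |rho|) / (1 - |rho|)."""
    return (1 + abs(rho)) / (1 - abs(rho))

def return_loss_db(rho: float) -> float:
    """Return loss in dB = -20 * log10(|rho|)."""
    return -20 * math.log10(abs(rho))

# 0.1 ohm = claimed TDR resolution, 0.5 ohm = the 1% limit at 50 ohm
for dz in (0.1, 0.5):
    rho = reflection_coefficient(Z0 + dz)
    print(f"dZ = {dz:+.1f} ohm -> rho = {rho*1000:.2f} mrho, "
          f"VSWR = {vswr(rho):.4f}, RL = {return_loss_db(rho):.1f} dB")

# The VSWR < 1.3 limit expressed as a reflection coefficient and return loss:
rho_limit = (1.3 - 1) / (1.3 + 1)
print(f"VSWR 1.3 -> |rho| = {rho_limit*1000:.0f} mrho, "
      f"RL = {return_loss_db(rho_limit):.1f} dB")

If these numbers are right, a TDR that resolves about 1 mρ (0.1Ω) has roughly a factor of five in hand against the +/- 5 mρ (+/- 0.5Ω) that the 1% spec allows, which matches my impression that this is about the minimum usable resolution for the job.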
Based on this, I managed to identify several combinations of available parts (BNC plugs and cables of different types) that could be suitable for making such cables. However, the quality of the finished cables depends not only on the parts but also on the manufacturing process, and the latter has some potential to be screwed up by me. So just buying several combinations of parts and making different types of cables would be useless without a way to measure and compare the results. At the moment, I do not have sufficient gear or knowledge to reliably take the required measurements, so I am now trying to find out what I need to know and whether there is a way to acquire the respective measurement gear. After all, merely owning measurement gear usually does not mean that you know how to use it correctly. So, time for some research. In the end, if it is possible at all, it would be nice to come up with a recipe that others can reliably use to create calibration cables of this type, ideally for a reasonable price.
You never state your requirements, what your goals are, or the reasons behind them. Reading your post, I get the impression that you feel that 50 vs 50.1 is a big deal for your application, like the quote below taken from the linked article. I want to understand why?
Perhaps now maybe the “50 Ohm” cable idea makes some sense and you are now a “50 ohm” systems zealot. You now strive for “perfect 50 ohms” in all your cabling, connections and devices. You have become so unreasonable that you insist that all systems be EXACTLY 50 ohms. Well now you are in trouble.
https://www.dsinstruments.com/support/understanding-characteristic-impedance-vswr-reflection-coefficient/