Does anyone have a law or conversion formula at hand for determining what level of interference an external field will induce on a shielded/coaxial transmission line, as in radiated susceptibility tests? The shield attenuation obviously comes into it, but how do you calculate that?
At a glance you can use the classic shielding attenuation formulas and calculate it from the coax insulation thickness (between the braid and the center conductor), the braid conductor thickness, and the magnetic permeability. But things may be a bit more complicated, because the result will depend on the cable's orientation relative to the field's polarization and direction vector.
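As a rough illustration of how braid thickness and permeability enter the calculation, here is a minimal sketch of the Schelkunoff absorption-loss term, which scales with shield thickness divided by skin depth. The default material constants assume copper; treat the numbers as order-of-magnitude estimates only, since a real braid is not a solid tube.

```python
import math

def skin_depth(f_hz, mu_r=1.0, sigma=5.8e7):
    """Skin depth (m) of a solid conductor; defaults are for copper."""
    mu0 = 4e-7 * math.pi
    return 1.0 / math.sqrt(math.pi * f_hz * mu_r * mu0 * sigma)

def absorption_loss_db(t_m, f_hz, mu_r=1.0, sigma=5.8e7):
    """Schelkunoff absorption loss A = 8.686 * t / delta, in dB."""
    return 8.686 * t_m / skin_depth(f_hz, mu_r, sigma)

# Example: a hypothetical 0.1 mm copper shield at 100 MHz
print(round(absorption_loss_db(0.1e-3, 100e6), 1))  # roughly 131 dB
```

For a braided shield the effective coverage (optical coverage, weave angle) reduces this figure substantially, which is why the thicker, denser braid mentioned below helps.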
You will get at least three attenuation values:
- for the plane-wave far field (when the RFI source is at a distance greater than about lambda/2*pi)
- for the E near-field component (electric field)
- for the H near-field component (magnetic field)
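The three regimes above differ mainly in the reflection-loss term, because the wave impedance of the source differs. A hedged sketch of the standard textbook approximations (after Ott), with conductivity and permeability given relative to copper, and `r_m` the assumed source-to-shield distance in metres:

```python
import math

def r_plane_wave_db(f_hz, sigma_r=1.0, mu_r=1.0):
    """Reflection loss (dB) for a plane wave, far-field source."""
    return 168.0 + 10.0 * math.log10(sigma_r / (mu_r * f_hz))

def r_electric_db(f_hz, r_m, sigma_r=1.0, mu_r=1.0):
    """Reflection loss (dB) for a high-impedance (E-field) near-field source."""
    return 322.0 + 10.0 * math.log10(sigma_r / (mu_r * f_hz**3 * r_m**2))

def r_magnetic_db(f_hz, r_m, sigma_r=1.0, mu_r=1.0):
    """Reflection loss (dB) for a low-impedance (H-field) near-field source."""
    return 14.6 + 10.0 * math.log10(f_hz * r_m**2 * sigma_r / mu_r)

# Example: copper shield, 10 MHz, source 1 m away
print(round(r_plane_wave_db(10e6), 1))   # plane wave
print(round(r_electric_db(10e6, 1.0), 1))  # E near field (highest)
print(round(r_magnetic_db(10e6, 1.0), 1))  # H near field (lowest)
```

Note how the magnetic near field gives the lowest reflection loss, which is why low-frequency H-field sources are the hardest to shield against.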
In short, a thicker coax cable with a thicker braid provides better shielding.