Author Topic: Ethernet/CAT5 signal integrity testing  (Read 8637 times)


Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Ethernet/CAT5 signal integrity testing
« on: October 30, 2015, 11:10:16 am »
I am in need of a piece of test equipment (that doesn't cost 5+ figures) that will help me verify signal integrity through various midspan Gigabit Ethernet power injection designs.

To provide some background for those who don't completely understand what I'm talking about: when you inject power for a modern Power over Ethernet (PoE) device, you generally use some sort of power injector.  Back before Gigabit Ethernet, if the power was being injected in the middle of the span (i.e. between the Ethernet switch and the powered device), you'd typically just use the spare pairs in the CAT5 cable, since 10/100 Ethernet only used 2 of the 4 pairs and the other two were free for power.  Validating and testing through one of these devices was rather simple, since the data pairs were electrically connected end-to-end and most high-quality cable testers would do a 2-pair test to validate the quality of the signal connections through the injector.

Gigabit Ethernet is a different beast.  Since all 4 pairs are needed for data, midspan power injectors use a set of Ethernet magnetics (i.e. a set of 4 isolation transformers) in the middle and inject a common-mode voltage on the center taps of the windings closest to the powered device.  This works well, but it also leaves the data pairs electrically isolated.  To date, I haven't found a cable tester which will even attempt a signal integrity test, since they notice that every pair has a DC short on it (the transformer winding for that pair) and/or is "open" to the far end (due to the isolation).  That leaves me testing these by passing a lot of data through them and hoping that "no data errors" corresponds to "no issues".

A similar set of issues occurs when testing surge suppression devices as well... but that's a different story.

What I need to be able to do is test a given injector design and somehow determine how much impact the injector has on the signal integrity of a Gigabit (or 10/100) Ethernet signal passing through it.  In an ideal world, I'd go buy something like a Tektronix DSA8200, but we're talking at least 20K for a used unit without the modules I'd probably need, and then I'd have to learn how to use it for this purpose (which doesn't appear to be a small task).  So I'm looking for something else.

So, I'd love to find one or more of the following:

  • A cable tester which doesn't care that the pairs are shorted and/or open (as described above) but will still do a fairly detailed cable validation for gigabit signal integrity.
  • Some other suitable piece of test equipment which will get me where I need to be (i.e. an inexpensive transmission-line tester for 100 ohm cable).
  • A document or description of some alternative method of validating signal integrity, i.e. sweeping the cable + injector and applying the result to gigabit signal integrity.  (I'm thinking something like a Bode plot here, but I'm not quite sure how to relate the results to Ethernet.)
  • Something else I haven't thought of.

Any ideas?   

I know someone will ask "what is your budget?".  20K is out.  Anything less than that will be considered ;)

One more note: I do have a fairly healthy test equipment collection, so there is every possibility I already have the right stuff if I just wire it together correctly.  (For example, I'm going to be playing with making a TDR out of the pulse generator and the oscilloscope in the next few days, unless someone comes up with a better solution.)
 

Offline jc101

  • Frequent Contributor
  • **
  • Posts: 671
  • Country: gb
Re: Ethernet/CAT5 signal integrity testing
« Reply #1 on: October 30, 2015, 12:41:41 pm »
I would consider looking at the packet level and see what, if any, degradation you get with the injector in line or not.  If the signals are bad on the wire, the packets won't get through or will be corrupted.

A couple of PCs and a copy of iperf will run a bi-directional network test and report missing packets, jitter, etc.  From memory you can alter the packet sizes, use jumbo frames (if the NICs support them), change fragmentation, and so on.  Run it at 10/100/1000 speeds, and test not just the single cable with the injector on it but also a bundle, to see if you introduce issues into adjacent cables (unlikely, but worth doing); use maximum 100 m runs and short 0.5 m ones too.  Also make sure you load the PoE from minimum to maximum.  You can leave iperf running for hours if needed.
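
Something like this would automate the runs; a rough sketch, assuming iperf3 (with JSON output) on both machines and a server already running on the far side, with a placeholder address:

Code:
#!/usr/bin/env python3
"""Rough sketch: drive iperf3 through the link under test and pull out the
loss/jitter numbers. Assumes iperf3 (>= 3.1, for --json) on both machines
and a server ('iperf3 -s') already running on the far side."""
import json
import subprocess

SERVER = "192.168.1.2"   # far side of the injector (placeholder address)

def run_udp_test(bitrate="950M", seconds=60, reverse=False):
    cmd = ["iperf3", "-c", SERVER, "-u", "-b", bitrate,
           "-t", str(seconds), "--json"]
    if reverse:
        cmd.append("-R")   # pull data server -> client instead
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    summary = json.loads(out.stdout)["end"]["sum"]
    return summary["lost_packets"], summary["packets"], summary["jitter_ms"]

for rev in (False, True):
    lost, total, jitter = run_udp_test(reverse=rev)
    print(f"{'reverse' if rev else 'forward'}: {lost}/{total} lost, "
          f"jitter {jitter:.3f} ms")

Run it in a loop overnight, at each speed and PoE loading, and log the numbers per configuration.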

If you get issues with the packet transfers then start to dig into the physical signals on the wire.  If nothing else it would give you an indication as to the performance of your injectors without costing a fortune.

You could also look at something like a Fluke OneTouch, which can sit in-line and monitor the data going through it; it understands PoE too, and will give you various stats on the power side as well as the data.
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6816
  • Country: nl
Re: Ethernet/CAT5 signal integrity testing
« Reply #2 on: October 30, 2015, 01:02:34 pm »
Would it be possible to make some kind of variable differential attenuator? By running BER tests at multiple attenuation factors you could find the eye size margins, without a high speed scope.
« Last Edit: October 30, 2015, 01:15:06 pm by Marco »
 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #3 on: October 30, 2015, 09:40:55 pm »
I would consider looking at the packet level and see what, if any, degradation you get with the injector in line or not.  If the signals are bad on the wire, the packets won't get through or will be corrupted.

That's what we do today... a 100 m length of Cat5 with patch cables and the injector between two Gigabit Ethernet cards, through which we run various packet-level tests under various loadings of the injector.  We make sure that 100% of the packets come through and are perfect.  The problem is that this tends to be a fairly easy test to pass, as there is a fair bit of margin built into Ethernet.  I've also gone up in cable distance, etc., and have never come up with something which gives me a good feel for how much the injection circuitry impacts signal quality.  It's like there's a cliff you fall off when it breaks, and it's very hard to get close enough to that cliff to discern differences in quality of injection circuitry.  I've even purposefully done things obviously wrong with an injector, with the intent of finding a cable arrangement which works with a well-designed injector and fails with a bad one.  So far, all I've done is fail miserably at producing a way to discern between a good and a poor design.

One thing which may help is to figure out some way to actually look at the bits being recovered and see how many symbols are being corrected.  I suspect the coding gain is hiding problems behind the automatic "ECC-like" functionality of the line coding.  Unfortunately the PHY hides all of this, and I haven't found a PHY with detailed enough reporting to be worth the hassle of building something custom around a specific part.
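
The closest I can get without custom hardware is whatever counters the NIC driver happens to export.  A minimal sketch of polling those, where the interface name and counter names are assumptions (they vary wildly by driver; check 'ethtool -S' on your own hardware first):

Code:
#!/usr/bin/env python3
"""Sketch: snapshot NIC error counters before/after a soak test. Which
counters exist (and whether anything symbol-level is exposed at all) depends
entirely on the driver; 'eth0' and the names below are assumptions."""
import subprocess
import time

IFACE = "eth0"   # NIC wired through the injector (placeholder)
# Names seen on some drivers; yours may differ ('ethtool -S eth0' to check).
INTERESTING = ("rx_errors", "rx_crc_errors", "rx_symbol_err", "rx_align_errors")

def read_stats(iface):
    out = subprocess.run(["ethtool", "-S", iface], capture_output=True,
                         text=True, check=True).stdout
    stats = {}
    for line in out.splitlines()[1:]:        # skip "NIC statistics:" header
        name, _, value = line.strip().partition(": ")
        if value.isdigit():
            stats[name] = int(value)
    return stats

before = read_stats(IFACE)
time.sleep(600)                              # run the traffic test meanwhile
after = read_stats(IFACE)
for name, value in after.items():
    delta = value - before.get(name, 0)
    if delta and name in INTERESTING:
        print(f"{name}: +{delta}")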

I also think that part of the problem is that the kinds of things a poorly designed injector will screw up won't necessarily cause signal degradation in every situation.  For instance, the effects of a strong impedance discontinuity are going to depend greatly on the length of the cable in relation to the wavelength of the gigabit signal, and on exactly where the other discontinuities sit along the cable.  The chances of a problem showing up when adding a length of cable are probably about the same as when subtracting one.  That's not a way to do a consistent test.
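
For a sense of scale, a quick back-of-envelope, where the velocity factor is an assumed typical value for Cat5:

Code:
# Back-of-envelope: how reflections relate to cable length for 1000BASE-T,
# which runs 125 Msymbols/s per pair. VF = 0.65 is a typical-Cat5 assumption.
C = 299_792_458          # m/s
VF = 0.65                # cable velocity factor (assumption)
SYMBOL_RATE = 125e6      # symbols/s per pair

v = C * VF                          # ~1.95e8 m/s in the cable
symbol_length = v / SYMBOL_RATE     # one symbol occupies ~1.56 m of cable
print(f"one symbol spans ~{symbol_length:.2f} m of cable")
# A reflection from a discontinuity d metres away arrives 2*d/v after the
# incident edge, so shifting the cable length by fractions of a symbol
# length moves the reflection around within the eye.
print(f"round trip delay: ~{2 / v * 1e9:.1f} ns per metre")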

As an aside... I just found a paper from a manufacturer of surge suppression semiconductors showing on an eye diagram how little impact their product has on Gigabit Ethernet.  The test setup they used includes an 81134A pulse generator (40K on eBay), an 86100C analyzer mainframe (10K on eBay), and an 86112A plugin (2K).  Just wow.
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6816
  • Country: nl
Re: Ethernet/CAT5 signal integrity testing
« Reply #4 on: October 30, 2015, 09:55:00 pm »
Some of the Pico sampling scopes have eye pattern testing, that's probably the cheapest you'll find new. Four figures in my currency, but I don't know about yours.

Check the mask testing video.
« Last Edit: October 30, 2015, 09:57:50 pm by Marco »
 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #5 on: October 30, 2015, 09:56:00 pm »
Would it be possible to make some kind of variable differential attenuator? By running BER tests at multiple attenuation factors you could find the eye size margins, without a high speed scope.

I sort of do that today with different lengths of Cat5 cable.  The two problems are how to run a good BER test (since the Ethernet PHYs like to hide as much of this as possible) and, as I mentioned in a reply I posted a few minutes ago, that I'm not sure attenuation is the main issue here.  My gut tells me the most likely issue is an impedance mismatch causing nasty signal reflections.  Also likely are things like transformer saturation (and I'm not even sure what that would look like on an eye diagram) and pair-length mismatches due to really poor routing in certain injectors.

I do feel that if I could find a way to run a good, actual, on-the-wire BER test, that would go a long way.  Along with a way to check for impedance mismatches, i.e. a TDR or something similar.

 

Offline jc101

  • Frequent Contributor
  • **
  • Posts: 671
  • Country: gb
Re: Ethernet/CAT5 signal integrity testing
« Reply #6 on: October 30, 2015, 10:01:32 pm »
That's what we do today... a 100 m length of Cat5 with patch cables and the injector between two Gigabit Ethernet cards, through which we run various packet-level tests under various loadings of the injector.  We make sure that 100% of the packets come through and are perfect.  The problem is that this tends to be a fairly easy test to pass, as there is a fair bit of margin built into Ethernet.

Well, you also have to remember that any switch will only forward complete frames, and some will also report the number of errors received on each port.  It might be worth picking up a cheap fully managed switch with good reporting on the ports (Cisco / HP come to mind).  Running the tests between machines through the switch, it should report any iffy frames.  I used to be able to find dodgy network cables just by looking at the switch stats, as a kind of preventative maintenance routine, before users reported problems with their PCs being slow.
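
If the switch speaks SNMP you can script the polling too.  A rough sketch using net-snmp's snmpget and the standard IF-MIB ifInErrors counter, where the switch address, community string, and ifIndex are placeholders for your setup:

Code:
#!/usr/bin/env python3
"""Sketch: poll a managed switch's per-port error counter over SNMP while a
test runs. Uses net-snmp's snmpget and the standard IF-MIB ifInErrors OID;
the switch address, community string and ifIndex are placeholders."""
import subprocess
import time

SWITCH = "192.168.1.250"    # switch management IP (placeholder)
COMMUNITY = "public"        # read community (placeholder)
IF_INDEX = 3                # ifIndex of the port under test (placeholder)
OID = f".1.3.6.1.2.1.2.2.1.14.{IF_INDEX}"    # IF-MIB::ifInErrors

def in_errors():
    out = subprocess.run(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Ovq", SWITCH, OID],
        capture_output=True, text=True, check=True)
    return int(out.stdout.strip())

start = in_errors()
time.sleep(3600)            # leave the traffic test running meanwhile
print(f"ifInErrors delta on ifIndex {IF_INDEX}: {in_errors() - start}")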
 

Offline German_EE

  • Super Contributor
  • ***
  • Posts: 2399
  • Country: de
Re: Ethernet/CAT5 signal integrity testing
« Reply #7 on: October 30, 2015, 10:02:34 pm »
This is way outside my field of experience, so please excuse me if these ideas sound a little crazy.

1) If a radio amateur wants to test the integrity of a 50 ohm line, they put a 50 ohm generator on one end, along with a device called an SWR meter that measures the ratio between the level of the sent signal and the return.  With a 50 ohm dummy load on the other end, the SWR should be 1:1.  So: build a generator with 100 ohm output impedance and a 100 ohm SWR meter; the 100 ohm dummy load is the easy part.

2) Again, build a 100 ohm generator, but send square waves down the cable.  Terminate the other end with 100 ohms and examine the signal with a scope.
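
To put rough numbers on the SWR idea in a 100 ohm system (the 85 ohm figure below is just an illustration):

Code:
# Rough numbers for the SWR idea in a 100 ohm system: a section that looks
# like 85 ohms (an illustrative figure) reflects a fraction rho of the wave.
import math

def reflection_coefficient(z_load, z0=100.0):
    return (z_load - z0) / (z_load + z0)

def vswr(rho):
    return (1 + abs(rho)) / (1 - abs(rho))

def return_loss_db(rho):
    return -20 * math.log10(abs(rho))

rho = reflection_coefficient(85.0)
print(f"rho = {rho:+.3f}, VSWR = {vswr(rho):.2f}:1, "
      f"return loss = {return_loss_db(rho):.1f} dB")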
 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #8 on: October 30, 2015, 10:14:35 pm »
Some of the Pico sampling scopes have eye pattern testing, that's probably the cheapest you'll find new. Four figures in my currency, but I don't know about yours.

Check the mask testing video.

Wow.  I want one.  A 9211A, specifically.

I looked at the TDR video, and that would help greatly in this case.  Along with the eye tests.  And it has the pulse generator in it as well...

I'm going to have to think hard about this... and keep a lookout on the used market in the meantime.  Not in the budget right now, unfortunately, but a much better cost:benefit ratio than a lot of the stuff I've been looking at.

 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6816
  • Country: nl
Re: Ethernet/CAT5 signal integrity testing
« Reply #9 on: October 30, 2015, 10:22:10 pm »
The two problems are how to run a good BER test (since the ethernet PHYs like to hide as much of this as possible), and as I mentioned in a reply I posted a few minutes ago, I'm not sure attenuation is the main issue here.  My gut tells me that the most likely issue is going to be an impedance mismatch which causes nasty signal reflection.

Do simultaneous bidirectional testing then, the reflections would cause errors the other way.

AFAICS Gigabit Ethernet has no FEC, so it won't be correcting errors.

PS. oops I'm wrong, the 4D-PAM5 encoding has FEC.
« Last Edit: October 30, 2015, 10:38:56 pm by Marco »
 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #10 on: October 30, 2015, 10:26:05 pm »
1) ... build a generator with 100 ohm output impedance and a 100 ohm SWR meter; the 100 ohm dummy load is the easy part.

2) Again, build a 100 ohm generator, but send square waves down the cable.  Terminate the other end with 100 ohms and examine the signal with a scope.

You're actually pretty much right on with one of the correct ways to do this... that's essentially all the whole "eye diagram" thing is.

Put a 250 MHz square wave in one side.  Terminate the other.  Look at the output signal on a scope.  It's a bit more complicated than that, since generally you let a real transceiver put the coding on the wire and trigger the scope with the bit clock running the transceiver, but the gist is the same: put a signal in one side and see what comes out the other.

The problem is that you're looking at the rise times of a 250 MHz signal for Gigabit Ethernet.  So even if you use the simple method (which will get you most of the way there) of putting a square wave of the correct amplitude in one end and looking at the other end, you're talking about needing or building a 250 MHz square wave generator capable of driving a 100 ohm line with very sharp edges, plus an oscilloscope with enough bandwidth and sampling rate to see the rise and fall times.  That gets you into 5 figures very quickly, although the post I just replied to pointed me toward something closer to what I could swallow: 12.5K USD for an all-in-one solution.
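
For what it's worth, the usual rule of thumb for how much scope you need is BW = 0.35 / t_rise, assuming a roughly Gaussian scope response:

Code:
# Rule of thumb: scope bandwidth needed to resolve an edge, BW ~ 0.35/t_rise
# (assumes a roughly Gaussian scope response).
for tr_ns in (8.0, 3.0, 1.0):
    bw_mhz = 0.35 / (tr_ns * 1e-9) / 1e6
    print(f"{tr_ns:4.1f} ns edge -> ~{bw_mhz:5.0f} MHz of bandwidth")
# An 8 ns PAM-5 edge needs only ~44 MHz; a 1 ns generator edge needs ~350 MHz.

So the required bandwidth depends heavily on how fast the edges you actually care about are, which matters for the budget question.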

 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #11 on: October 30, 2015, 11:07:19 pm »
Do simultaneous bidirectional testing then, the reflections would cause errors the other way.

AFAICS Gigabit Ethernet has no FEC, so it won't be correcting errors.

So, some built-in capabilities of Gigabit Ethernet:

1) It actually does do a form of FEC.  Between the scrambler, the trellis encoder, and the Viterbi decoder, bit errors end up being hidden inside the decoder and never appear in the output bitstream (the processing gain is around 6 dB); a rough tally of where that redundancy lives is sketched after this list.  It would be really nice to get a raw BER out of the Viterbi decoder, since you'd see errors much sooner.  My experience is that the on-the-wire coding is so robust that by the time you see errors, the decoder is taking so many raw bit errors that it can't make sense of anything, and the link effectively toggles from "perfect" to "nothing getting through at all".  I think some of this is also due to the fact that with enough errors the decoder loses sync with the master, and it simply can't decode until it regains that sync.  So a few well-placed errors (e.g. burst interference) can kill an entire packet, yet your longer-term BER will be better than that of a link which seems perfect but takes one raw bit error every once in a while.

2) The hybrid and echo-cancellation circuitry in most PHYs is rather advanced.  Like the Viterbi decoder, it hides a lot of cable sins, especially those related to reflections.
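
Here's the redundancy tally mentioned under point 1.  This is a back-of-envelope only; the real code spends the headroom on 4D set partitioning rather than literal extra bits:

Code:
# Back-of-envelope on where the trellis coding's headroom comes from. The
# real code uses 4D set partitioning rather than literal parity bits, so
# this is illustrative only.
import math

pairs, levels, symbol_rate = 4, 5, 125e6
raw = pairs * math.log2(levels)      # ~9.29 bits available per symbol period
payload = 8                          # 1000 Mb/s / 125 MBd = 8 bits per period
print(f"raw 4D-PAM5 capacity : {raw:.2f} bits/period")
print(f"payload              : {payload} bits/period")
print(f"redundancy           : {raw - payload:.2f} bits/period for the trellis")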

My standard tests have always been full-duplex gigabit (or 100 Mb/s where applicable) in both directions with various patterns.  Not IP, mind you, but raw Ethernet frames; a minimal version of that kind of test is sketched below.  It's amazing how robust Ethernet is across things which should greatly affect its performance.  I just need a way to dig in deeper: a tester which reports the raw BER instead of the post-decoding BER, impedance analysis, phase-difference analysis, etc.
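
For what it's worth, a minimal version of that raw-frame test looks roughly like this (Linux AF_PACKET, run as root; the interface names and EtherType are placeholders).  Note that frames mangled on the wire normally fail the FCS and get dropped by the NIC, so wire corruption mostly shows up as loss rather than as a corrupted payload:

Code:
#!/usr/bin/env python3
"""Sketch: raw Ethernet frame soak test through the injector (Linux
AF_PACKET, run as root). eth0/eth1 are the two NICs on either side of the
link and are placeholders, as is the experimental EtherType."""
import socket
import struct

TX_IF, RX_IF = "eth0", "eth1"       # placeholders: NICs through the injector
ETYPE = 0x88B5                      # IEEE experimental EtherType
N_FRAMES = 100_000
PAYLOAD = bytes(range(256)) * 5     # 1280-byte repeating test pattern
DST = b"\xff" * 6                   # broadcast keeps addressing simple
SRC = b"\x02\x00\x00\x00\x00\x01"   # locally administered source MAC

tx = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
tx.bind((TX_IF, 0))
rx = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETYPE))
rx.bind((RX_IF, 0))
rx.settimeout(1.0)

frame = DST + SRC + struct.pack("!H", ETYPE) + PAYLOAD
corrupted = lost = 0
for _ in range(N_FRAMES):
    tx.send(frame)
    try:
        got = rx.recv(2048)
        if got[14:14 + len(PAYLOAD)] != PAYLOAD:
            corrupted += 1          # arrived, but payload doesn't match
    except socket.timeout:
        lost += 1                   # never arrived (dropped somewhere)
print(f"{N_FRAMES} sent: {corrupted} corrupted, {lost} lost")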
 

Offline Marco

  • Super Contributor
  • ***
  • Posts: 6816
  • Country: nl
Re: Ethernet/CAT5 signal integrity testing
« Reply #12 on: October 30, 2015, 11:09:37 pm »
What kind of scope do you have BTW? 125 MHz isn't that much and with a computer it shouldn't be too hard to do digital persistence of a ton of waveforms to form an eye pattern.

Just need to make some high impedance taps on the signal pairs. A dual gate MOSFET will give you a couple k input impedance, AC couple the two signals into an AD8130 to get a single ended signal for the scope.

PS. oops, forgot about the dual duplex nature of the signalling.
« Last Edit: October 30, 2015, 11:15:09 pm by Marco »
 

Offline forrestcTopic starter

  • Supporter
  • ****
  • Posts: 674
  • Country: us
Re: Ethernet/CAT5 signal integrity testing
« Reply #13 on: October 31, 2015, 12:36:48 am »
What kind of scope do you have BTW? 125 MHz isn't that much and with a computer it shouldn't be too hard to do digital persistence of a ton of waveforms to form an eye pattern.

I have a month-old Tek MDO3024.   I'll admit I haven't given it a go on there yet, but hmmm...

The eye diagram of the PAM-5 signalling used by Gigabit Ethernet suggests the rise time is on the order of 8 ns, and the 3024 specs a rise time of 2 ns with the 200 MHz license.  It also says 1 ns/div is available as a display option, so I should be able to get it big enough.  (I'm not sitting by it, or else I'd just power it on and try it.)

Somewhere I had gotten the idea that a 1 GHz scope was not likely to be good enough, and that you'd need an even faster scope for this work.  But now that I look at the eye diagram, maybe I got some bad information.  Although, now that I think about it, I wonder how this is going to work with a full-duplex signal on the wire: you'd think both ends would be contributing to the signal and making a mess.  I guess some experimentation is in order.
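
A quick check of how much the scope's own rise time matters, using the usual root-sum-of-squares approximation:

Code:
# How much the scope's own rise time distorts an ~8 ns edge; rise times
# combine roughly as root-sum-of-squares.
import math

t_signal = 8.0    # ns, approximate PAM-5 edge
t_scope = 2.0     # ns, MDO3024 spec with the 200 MHz license

t_shown = math.hypot(t_signal, t_scope)
print(f"displayed edge ~{t_shown:.2f} ns "
      f"({(t_shown - t_signal) / t_signal * 100:.1f}% slow)")
# ~8.25 ns shown for an 8 ns edge: about 3% error, fine for eye-ball work.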

 

