zappedagain
Electrical
- Jul 19, 2005
There's a fairly common guideline for GigE/100M Ethernet PCB routing on the web (that I won't mention here, yet) that states to "keep the differential pair trace lengths matched to 5 mils (0.005 in, 0.127 mm)". Does anyone know the basis for that? It seems like a really nice design goal ("if you can hit that you'll have no problems"), but how much effort is it worth? It seems like overkill.
My thoughts:
125 MHz is the GigE carrier (on 4 pairs), so to keep the eye open I need to pass the 7th or 8th harmonic (1 GHz).
FR4 has a propagation rate of 170 ps/in, so 1 GHz has an on-board wavelength of 5.88 inches. If I want my auto-transformer cancellation better than -40 dB (1%), then I need a length match better than about 0.059 inches (59 mils). Did they throw another 10x on that to minimize the error effect (-60 dB)?
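Here's a quick back-of-the-envelope check of that arithmetic, using only the numbers above (the 170 ps/in and the 1 GHz / 1% targets are my assumptions, not datasheet figures):

```python
# Sanity check of the skew budget above (numbers from the post, not a spec).
F_HARMONIC = 1.0e9        # ~8th harmonic of the 125 MHz symbol rate, Hz
TPD = 170e-12             # assumed FR4 propagation delay, s/in

wavelength_in = 1.0 / (F_HARMONIC * TPD)   # 1/(1 GHz * 170 ps/in) = 5.88 in
skew_for_1pct = 0.01 * wavelength_in       # 1% of a wavelength ~ -40 dB residual

print(f"wavelength at 1 GHz:  {wavelength_in:.2f} in")        # ~5.88 in
print(f"1% length match:      {skew_for_1pct*1000:.1f} mils") # ~58.8 mils
print(f"10x tighter (-60 dB): {skew_for_1pct*100:.1f} mils")  # ~5.9 mils, close to the 5 mil rule
```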
It all seems a bit moot, as the CM rejection on most GigE transformers drops from -50 dB at 1 MHz to -15 dB at 200 MHz, so that extrapolates to negligible CM rejection (0 dB) around 1 GHz. The auto-transformer turns matching is only specified to ±2%, so worst case is only -34 dB. My EMI testing experience tells me you really aren't going to get much more attenuation than that with a single component.
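Checking the turns-tolerance floor and the CMRR roll-off the same way (the straight-line log-frequency extrapolation is purely illustrative, not a fit to any particular part):

```python
import math

# Worst-case rejection from the +/-2% turns-ratio tolerance quoted above
turns_tol = 0.02
print(f"turns mismatch floor: {20*math.log10(turns_tol):.1f} dB")   # ~ -34 dB

# Rough log-frequency extrapolation of the CMRR numbers quoted above
# (-50 dB at 1 MHz, -15 dB at 200 MHz)
f1, r1 = 1e6, -50.0
f2, r2 = 200e6, -15.0
slope = (r2 - r1) / math.log10(f2 / f1)          # dB per decade
cmrr_1ghz = r1 + slope * math.log10(1e9 / f1)
print(f"extrapolated CMRR at 1 GHz: {cmrr_1ghz:.1f} dB")   # ~ -4 dB, essentially none
```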
So matching to 50 mils (1.27 mm) keeps the length-mismatch residual at roughly -40 dB at 1 GHz, which doesn't degrade what the auto-transformer can deliver anyway (and 50 mils also happens to match another guideline on the web published by Intel). That seems more real-world.
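And the 50 mil case against the same 5.88 in wavelength (same assumptions as above):

```python
import math

# What a 50 mil (0.050 in) mismatch looks like as a fraction of a 1 GHz wavelength
wavelength_in = 5.88
skew_in = 0.050
fraction = skew_in / wavelength_in                           # ~0.85% of a wavelength
print(f"skew fraction: {fraction*100:.2f} %")
print(f"residual:      {20*math.log10(fraction):.1f} dB")    # ~ -41 dB at 1 GHz
```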
Did I miss something?
Thanks,
Z