EE Times-Asia

10GbE standard clears major hurdle for optical transceivers

Posted: 16 Jun 2002

Keywords: Ethernet, LAN, WAN, transceiver, OC-192

A new standard for Ethernet systems operating at 10Gbps, designated IEEE 802.3ae, has been under development and is on track for ratification this month. The elements of the standard that specify the performance of optical transmitters and receivers operating at 10Gbps have not been easy to establish. It is worthwhile to step back and examine the basic strategy behind this standard and the evolutionary process that leads to a final document.

Initially, the specifications and associated tests for these elements were based to a large degree upon extensions of the IEEE 802.3z standard developed a number of years earlier for transmission at 1Gbps. One of the key elements of that standard was "the Gigabit link model". This spreadsheet-type application represented the various blocks of the communication system with quantifiable parameters like extinction ratio of the transmitter, dispersion of the fiber and sensitivity of the receiver.

Within the model, these parameters could be adjusted and traded off each other while the resulting bit-error-ratio (BER) was monitored. There is a negotiation process to make sure that the performance burden is appropriately shared by each component of the link.
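The kind of trade-off the link model captures can be sketched in a few lines of Python. Everything below, from the function names to the numeric values, is an invented illustration of a spreadsheet-style power budget, not data from the actual 802.3z or 802.3ae link models:

```python
import math

# Illustrative optical link power budget, loosely in the spirit of the
# spreadsheet "link model" described above. All parameter names and
# values are hypothetical examples, not IEEE 802.3ae figures.

def extinction_ratio_penalty_db(er_db):
    """Power penalty (dB) for a finite transmitter extinction ratio."""
    er = 10 ** (er_db / 10)  # linear extinction ratio
    return 10 * math.log10((er + 1) / (er - 1))

def link_margin_db(tx_avg_power_dbm, rx_sensitivity_dbm,
                   fiber_loss_db_per_km, length_km,
                   connector_loss_db, er_db, dispersion_penalty_db):
    """Remaining margin after subtracting losses and penalties."""
    budget = tx_avg_power_dbm - rx_sensitivity_dbm
    losses = (fiber_loss_db_per_km * length_km + connector_loss_db +
              extinction_ratio_penalty_db(er_db) + dispersion_penalty_db)
    return budget - losses

# Trade-off example: a weaker transmitter can be offset by a more
# sensitive receiver or a shorter reach, and vice versa.
margin = link_margin_db(tx_avg_power_dbm=-1.0, rx_sensitivity_dbm=-12.6,
                        fiber_loss_db_per_km=0.4, length_km=10.0,
                        connector_loss_db=1.5, er_db=6.0,
                        dispersion_penalty_db=2.0)
print(f"link margin: {margin:.2f} dB")
```

Worsening any one parameter (say, a poorer extinction ratio) shrinks the margin, which must then be recovered from some other block of the link; that give-and-take is precisely the negotiation process described above.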

Building process

In building an equivalent 10Gb model, one must recognize that no model is perfect: the model must be verified, and the elements it describes must be physically buildable. For those involved in developing the standard, this created a circular problem: to verify the model and the specifications it generates, real devices must be tested, yet those devices are themselves designed to the specifications under verification.

Also, it is important to consider the overall philosophy and objectives of 10Gb Ethernet (10GbE) systems. The goal was that minimal engineering be required to design a working system from hardware that met the specifications of the standard. A "plug and play" mentality, allowing anybody's transmitter to be combined with anybody's receiver, was a key objective. To achieve this, the standard had to be designed to screen out inadequate components. Yet to maintain low cost, the specifications could not be so stringent that they rejected components that would otherwise be viable in a system. High production yields are essential to achieving low cost.

The standards development and hardware development occurred in parallel, and it is probably fair to say that the standard ran ahead of much of the hardware development. Thus, when the specifications and test methods were approaching a stable form, low-cost transceivers operating at 10Gbps were not readily available from several vendors. When the few vendors that had working parts began to test devices against the draft standard, some significant problems manifested themselves. It was apparent that either the standard was overly stringent, the parts were simply inadequate, or there were problems with the verification process. In the end, all three were true to some degree.

The committee had to take a step back and make some hard decisions. A choice needed to be made between redesigning the tests so that existing test equipment could be used, or retaining the original tests and waiting for test equipment capable of executing them. Both options represented the possibility of a significant delay in finalizing the standard. A decision was made to make two key changes to the test methodologies.

Final choice

The original test methodology for transmitter test involved a jitter bathtub test to assess the timing stability of transmitters. This test verifies the magnitude of the jitter as well as its deterministic and random elements, as these were basic elements of the link model. However, executing the jitter bathtub test at 10Gbps proved difficult. The test is based upon making a BER measurement with the transmitter signal measured directly by a bit-error-ratio test (BERT) set. By varying the decision time of the BERT error detector, the probability distribution of signal edges, and hence the jitter, can be quantified.
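To make the bathtub idea concrete, the sweep can be sketched numerically with the common dual-Dirac jitter model, in which deterministic jitter splits each edge into two positions and random jitter spreads each position as a Gaussian. All jitter values below are assumptions for illustration, not numbers from the 802.3ae draft:

```python
import math

# Sketch of a bathtub curve: bit-error ratio versus decision time across
# one unit interval (UI), using a dual-Dirac jitter model. The jitter
# magnitudes are invented for illustration only.

UI_PS = 97.0       # one unit interval at ~10.3 Gbps, in picoseconds
DJ_PS = 10.0       # peak-to-peak deterministic jitter (assumed)
RJ_SIGMA_PS = 1.5  # rms random jitter (assumed)

def edge_ber(t_ps, mean_ps):
    """BER contribution of one Gaussian-jittered edge at decision time t."""
    return 0.5 * math.erfc(abs(t_ps - mean_ps) / (math.sqrt(2) * RJ_SIGMA_PS))

def bathtub_ber(t_ps):
    """Total BER at decision time t: left and right edges, each split
    into two equally weighted Dirac positions by deterministic jitter."""
    left = 0.5 * (edge_ber(t_ps, -DJ_PS / 2) + edge_ber(t_ps, DJ_PS / 2))
    right = 0.5 * (edge_ber(t_ps, UI_PS - DJ_PS / 2) +
                   edge_ber(t_ps, UI_PS + DJ_PS / 2))
    return left + right

# Eye opening: the span of decision times where BER stays below 1e-12.
open_times = [t / 10 for t in range(0, int(UI_PS * 10))
              if bathtub_ber(t / 10) < 1e-12]
print(f"eye opening at BER 1e-12: {open_times[-1] - open_times[0]:.1f} ps")
```

Sweeping the decision time from one edge to the other traces the two steep "walls" of the bathtub; the slope of each wall reveals the random jitter, while the separation of the walls reveals the deterministic component.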

Although it has been shown that higher performance error detectors minimize the problem, the subcommittee in charge of this portion of the standard concluded that a new approach to transmitter test could be developed that would allow the use of existing test equipment. This test is now known as the transmitter dispersion penalty (TDP) test. In this test, a reference transmitter is fed through an attenuator to a reference receiver and then to a BERT. The BER is monitored while the attenuation is increased. The attenuator is set to achieve a specific BER, perhaps 1E-12. The reference transmitter is then replaced with the transmitter under test, which now feeds the specified test system fiber, attenuator, and reference receiver. The BER is monitored and the attenuator is adjusted to return to the BER achieved with the reference transmitter. To confirm that transmitter jitter is not excessive, the decision point of the error detector is varied in time over a 10ps span. The difference in attenuator settings represents the TDP result, with a maximum value allowed within the standard for each class of 10GbE transmitter.
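The TDP bookkeeping can be illustrated with a small simulation. The receiver's BER curve and all power levels below are invented for the example; only the procedure mirrors the text: calibrate the attenuator with the reference transmitter, re-attenuate with the device under test to reach the same BER, and take the difference:

```python
import math

# Illustrative simulation of the TDP procedure described above. The
# receiver is modeled with a simple Gaussian-noise BER curve; the
# sensitivity, launch powers, and penalty are invented, not 802.3ae values.

BER_TARGET = 1e-12

def ber_at_power(rx_power_dbm, sensitivity_dbm=-14.0):
    """Model BER versus received power: Q scales with the linear power
    margin above an assumed -14 dBm sensitivity (Q ~ 7 -> BER ~ 1e-12)."""
    q = 7.0 * 10 ** ((rx_power_dbm - sensitivity_dbm) / 10)
    return 0.5 * math.erfc(q / math.sqrt(2))

def attenuation_for_target(launch_dbm, extra_penalty_db=0.0):
    """Binary-search the attenuator setting that brings BER to target."""
    lo, hi = 0.0, 40.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if ber_at_power(launch_dbm - mid - extra_penalty_db) > BER_TARGET:
            hi = mid  # too much attenuation: BER above target
        else:
            lo = mid  # still clean: attenuate harder
    return lo

# Step 1: reference transmitter straight into the reference receiver.
ref_atten = attenuation_for_target(launch_dbm=0.0)
# Step 2: transmitter under test through the test fiber; its dispersion
# and jitter are lumped here into an assumed 2.3 dB effective penalty.
dut_atten = attenuation_for_target(launch_dbm=0.0, extra_penalty_db=2.3)

print(f"TDP = {ref_atten - dut_atten:.2f} dB")  # prints "TDP = 2.30 dB"
```

By construction the attenuator difference recovers the lumped penalty exactly; in the real measurement that difference is taken at the worst decision time found over the 10ps sweep, so excess jitter shows up as a larger TDP.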

A second major change was made for verifying receiver performance. In the early drafts of the standard, a "stressed eye receiver" test was implemented. In this test, an impaired transmitter signal, representative of the worst-case signal that compliant transmitters might generate, is sent to the receiver under test. Given the problems with the bathtub curve, it was decided to drop the random and deterministic jitter impairments. To produce an adequate test methodology, these impairments were replaced with a controlled interference signal.
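A toy version of such a stressed signal can be constructed numerically: a clean NRZ waveform, a crude low-pass filter to add intersymbol interference, and a sinusoidal interference tone. The impairment values below (filter coefficient, tone amplitude and frequency) are arbitrary choices for illustration; a real stressed-eye calibration is specified far more tightly:

```python
import math
import random

# Toy "stressed eye" style signal: clean NRZ, impaired by a one-pole
# low-pass (intersymbol interference) plus a sinusoidal interferer.
# All impairment values are invented for illustration.

random.seed(1)
SAMPLES_PER_BIT = 16
bits = [random.randint(0, 1) for _ in range(400)]

# Clean NRZ waveform (0/1 levels, 16 samples per bit).
clean = [float(b) for b in bits for _ in range(SAMPLES_PER_BIT)]

# One-pole low-pass filter to introduce intersymbol interference.
alpha, filtered, y = 0.35, [], 0.0
for x in clean:
    y += alpha * (x - y)
    filtered.append(y)

# Add a sinusoidal interference tone (10% of signal swing, assumed).
stressed = [s + 0.10 * math.sin(2 * math.pi * 0.013 * i)
            for i, s in enumerate(filtered)]

# Vertical eye opening at the bit centers: worst-case "one" level minus
# worst-case "zero" level, sampled mid-bit.
centers = [stressed[i * SAMPLES_PER_BIT + SAMPLES_PER_BIT // 2]
           for i in range(len(bits))]
ones = [v for v, b in zip(centers, bits) if b == 1]
zeros = [v for v, b in zip(centers, bits) if b == 0]
print(f"stressed vertical eye opening: {min(ones) - max(zeros):.3f}")
```

The point of the real test is that a receiver proven against such a deliberately degraded, but calibrated, eye can be trusted to work with any compliant transmitter, however marginal.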

Making changes to the test methodologies imposed substantial risk to the standard: the new techniques needed to be proven out in a very short time. In a combined effort by component manufacturers and test-and-measurement experts, the new techniques were confirmed as viable. The last big effort was to fine-tune the actual device parameter values used to specify minimum acceptable performance.

- Greg LeCheminant

Strategic Business Consultant

Agilent Technologies Inc. Lightwave Division
