For most 2.5 Gb/s communication systems, fiber dispersion testing is not necessary. However, as bit rates increase to 10 Gb/s and beyond (40 Gb/s and 100 Gb/s), fiber characterization, especially dispersion testing, becomes increasingly important: as the transmission speed increases, the acceptable dispersion tolerance shrinks dramatically.

Rather than installing expensive new optical cable, some service providers upgraded their existing networks with Dense Wavelength Division Multiplexing (DWDM) to increase bandwidth and handle higher data rates. In a DWDM system, multiple wavelengths are transmitted through a single optical fiber. Because of a phenomenon called dispersion, these faster data rates suffer higher bit error rates.

1. What is dispersion?

Light pulses widen, or spread, due to dispersion as they propagate through an optical fiber. This spreading can cause bits to overlap, making it difficult for the receiver to interpret the signal. Dispersion increases with distance and bit rate. It limits the network's ability to transmit at higher speeds, reduces the quality of the optical signal, and increases the bit error rate.

Receiver tolerance (based on a 1 dB power penalty):

  • 2.5 Gb/s: 16,000 ps/nm
  • 10 Gb/s: 1,000 ps/nm
  • 40 Gb/s: 60 ps/nm
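As a rough sanity check on the tolerances above, they can be translated into maximum uncompensated span lengths. The sketch below assumes standard G.652 single-mode fiber with a typical chromatic-dispersion coefficient of about 17 ps/(nm·km) at 1550 nm; that coefficient is an illustrative value, not one given in the text.

```python
# Rough estimate of the maximum uncompensated span length implied by a
# receiver's chromatic-dispersion tolerance. Assumes standard G.652 fiber
# with D ~ 17 ps/(nm.km) at 1550 nm; real limits depend on the transceiver.

FIBER_CD_COEFF = 17.0  # ps/(nm.km), typical for G.652 at 1550 nm

def max_span_km(tolerance_ps_per_nm, cd_coeff=FIBER_CD_COEFF):
    """Distance at which accumulated dispersion reaches the tolerance."""
    return tolerance_ps_per_nm / cd_coeff

for rate, tol in [("2.5 Gb/s", 16000), ("10 Gb/s", 1000), ("40 Gb/s", 60)]:
    print(f"{rate}: ~{max_span_km(tol):.0f} km")
```

The ~16x drop in tolerance at each step is why a span that was comfortable at 2.5 Gb/s can fail outright at 40 Gb/s.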

2. Compensation for dispersion

Fiber dispersion testing can tell you whether a legacy network can handle higher data rates before you invest heavily in equipment upgrades. The test results can also be used to design and implement a dispersion compensation plan. Such solutions may include dispersion-compensating fiber (DCF) and reverse dispersion fiber (RDF), fiber Bragg gratings, narrowing the pulse spectral width, and other remedial measures. Once the remedial measures are in place, the span must be re-verified.
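The arithmetic behind a DCF-based compensation plan is simple: the DCF's negative dispersion must cancel the dispersion accumulated over the span. The coefficients below (+17 ps/(nm·km) for the transmission fiber, −100 ps/(nm·km) for the DCF) are illustrative orders of magnitude, not values from the text.

```python
# Sketch of sizing a dispersion-compensating fiber (DCF) module so that
# the net accumulated chromatic dispersion over the span is ~zero.
# Coefficients are illustrative, not measured values.

def dcf_length_km(span_km, d_fiber=17.0, d_dcf=-100.0):
    """DCF length needed to cancel the span's accumulated dispersion."""
    accumulated = d_fiber * span_km   # ps/nm built up over the span
    return -accumulated / d_dcf       # km of DCF that cancels it

span = 80.0
print(f"{span} km span accumulates {17.0 * span:.0f} ps/nm;")
print(f"needs ~{dcf_length_km(span):.1f} km of DCF")
```

In practice the residual dispersion is tuned to the receiver's tolerance window rather than exactly zero, which is why re-verification after compensation is required.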

Existing test equipment can measure dispersion accurately and typically relies on PC software for documentation and certification reports.

3. Types of dispersion

1) Chromatic Dispersion (CD)

A laser pulse contains several wavelengths of light, which propagate through the fiber at different speeds. This speed difference causes the laser pulse to spread and the bits to overlap, making the signal difficult for the receiver to interpret. Chromatic dispersion increases with fiber link distance and bit rate.
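The spreading follows the simple relation Δτ = D × L × Δλ. A minimal sketch, using illustrative numbers (17 ps/(nm·km), a 100 km link, a 0.1 nm source linewidth) that are assumptions rather than values from the text:

```python
# Chromatic-dispersion pulse spreading: delta_t = D * L * delta_lambda.
# All input values are illustrative.

def cd_spread_ps(d_ps_nm_km, length_km, linewidth_nm):
    """Pulse broadening (ps) from chromatic dispersion over a link."""
    return d_ps_nm_km * length_km * linewidth_nm

spread = cd_spread_ps(17.0, 100.0, 0.1)  # ~170 ps
bit_period_ps = 1e12 / 10e9              # 100 ps bit period at 10 Gb/s
print(f"spread {spread:.0f} ps vs bit period {bit_period_ps:.0f} ps")
```

With the spread exceeding an entire 10 Gb/s bit period, adjacent bits overlap completely, which is exactly the distance/bit-rate limit the text describes.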

2) Polarization Mode Dispersion (PMD)

Light consists of two mutually perpendicular waves. When light travels through an optical fiber, these waves travel along two different paths (polarization modes).

Because of defects introduced during fiber manufacturing and stresses imposed during installation, light propagates at a different speed along each path. Polarization mode dispersion produces a slow axis and a fast axis, which distorts or broadens the pulse.

The broadened laser pulse causes interference at the receiver, limiting the signal quality of the network, limiting the network's ability to transmit at higher speeds, and mixing adjacent bits so that they are difficult for the receiver to interpret. PMD increases with fiber link distance and bit rate.

PMD is not constant; it changes over time with stress and environmental conditions. For example, an optical fiber installed along a railway track shows a different PMD value when a train is passing (due to vibration) than when there is none. It is therefore necessary to check PMD periodically and re-verify the PMD value of an installed fiber link.
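Unlike chromatic dispersion, which accumulates linearly with distance, mean PMD grows with the square root of distance. A small sketch, assuming an illustrative legacy-fiber PMD coefficient of 0.5 ps/√km (not a value from the text):

```python
# Mean differential group delay (DGD) from PMD scales as sqrt(distance):
# DGD ~ pmd_coeff * sqrt(L). The coefficient below is illustrative.
import math

def mean_dgd_ps(pmd_coeff_ps_sqrt_km, length_km):
    """Mean DGD (ps) accumulated over a fiber link of the given length."""
    return pmd_coeff_ps_sqrt_km * math.sqrt(length_km)

for km in (100, 400):
    print(f"{km} km -> {mean_dgd_ps(0.5, km):.1f} ps mean DGD")
```

A commonly cited rule of thumb keeps mean DGD under roughly 10% of the bit period, so at 10 Gb/s (100 ps bit period) this illustrative fiber would reach its budget near 400 km. Because PMD is statistical and time-varying, the measured value on any given day can differ from this mean, which is why the text recommends periodic re-verification.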

3) Bit Error Rate (BER)

Because dispersion can lead to a higher bit error rate (BER), you should include BER in your test procedures. The bit error rate is the ratio of bits received in error to the total number of bits received. It indicates how often data packets must be retransmitted due to errors.

A high bit error rate increases the number of retransmissions required, which in turn increases the time needed to send data through the system and reduces the effective transmission speed. Bit error rate testing involves sending simulated data through the communication system and comparing the input data with the output data.
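That comparison of input and output data can be sketched as follows; the test pattern and the positions of the flipped bits (which stand in for channel errors) are arbitrary choices for illustration:

```python
# Minimal BER-test sketch: send a known bit pattern, compare what comes
# back, and report the fraction of bits that differ.

def bit_error_rate(sent, received):
    """Fraction of bits that differ between sent and received streams."""
    assert len(sent) == len(received)
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

pattern = [1, 0] * 500        # 1000-bit test pattern
received = pattern.copy()
received[7] ^= 1              # simulate two channel bit errors
received[423] ^= 1
print(f"BER = {bit_error_rate(pattern, received):.1e}")
```

Real BER testers use standardized pseudo-random bit sequences and run long enough to observe a statistically meaningful number of errors, but the core comparison is the same.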