The impact of interferometric noise on the performance of optical communication networks
Interferometric noise, arising from the interference of data with parasitic crosstalk and common to many current and proposed fibre-optic communication networks, may induce unacceptable power penalties and bit-error-rate floors. This work addresses key aspects of this problem via experiment and theoretical analysis: the origin and characteristics of the noise, the resultant performance degradation of optical networks, and solution paths. The study of a single crosstalk interferer yields a classification of all forms of interferometric noise and reveals the key properties of its probability density function and power spectrum. Theoretical and experimental measures of the resulting performance degradation agree closely. The aggregation of multiple crosstalk terms is analysed, and the validity of Gaussian statistics, predicted by the Central Limit Theorem, is demonstrated. It is predicted that the total crosstalk level of noise-generating terms should be held below -25 dB for a penalty of less than 1 dB; a further 2 to 4 dB increase may lead to network failure. Optical TDM switching networks, constructed from discrete lithium niobate directional couplers with -15 dB isolation and delay lines, illustrate the importance of interferometric noise. Larger networks are modelled on a computer simulator (XFlatch) that tracks all crosstalk waveforms, calculates both interferometric and amplifier noise, and thus computes the bit-error-rate. A two-part approach is proposed to manage interferometric noise: crosstalk power is minimised, and the noise owing to the residual crosstalk is rejected in the RF domain. Several methods are critically discussed. A novel technique, exploiting the intra-bit frequency evolution of directly modulated DFB lasers in response to injection heating, is introduced and critically assessed.
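The claimed convergence of aggregated crosstalk toward Gaussian statistics can be illustrated with a short Monte Carlo sketch. This is not the thesis's own simulator; it is a minimal model assuming each interferer contributes a signal-crosstalk beat term 2*sqrt(eps)*cos(phi) with a uniformly random phase, so a single interferer gives a bounded, arcsine-distributed noise while many weaker interferers of the same total power sum toward a Gaussian, as the Central Limit Theorem predicts. All names and parameter values (eps_total, the 32-interferer split, the sample count) are illustrative assumptions.

```python
import numpy as np

# Illustrative Monte Carlo model (not the thesis's XFlatch simulator):
# the detected beat between a signal and an interferer of relative power
# eps is 2*sqrt(eps)*cos(phi), with phi uniform over [0, 2*pi).
rng = np.random.default_rng(42)
eps_total = 10 ** (-25 / 10)   # assumed total crosstalk of -25 dB
samples = 200_000

def beat_noise(n_interferers):
    """Sum of beat terms from n interferers sharing eps_total equally."""
    eps_each = eps_total / n_interferers
    phases = rng.uniform(0, 2 * np.pi, size=(samples, n_interferers))
    return (2 * np.sqrt(eps_each) * np.cos(phases)).sum(axis=1)

def excess_kurtosis(x):
    """Fourth-moment test of Gaussianity (0 for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

single = beat_noise(1)    # one strong interferer: arcsine-shaped PDF
many = beat_noise(32)     # many weak interferers: near-Gaussian PDF

# Both cases share the same variance, 2*eps_total, but differ in shape.
print(f"variance (1 interferer):   {single.var():.2e} (theory {2*eps_total:.2e})")
print(f"excess kurtosis, 1 term:   {excess_kurtosis(single):+.2f} (arcsine: -1.5)")
print(f"excess kurtosis, 32 terms: {excess_kurtosis(many):+.2f} (Gaussian: 0)")
```

The shared variance shows why the total crosstalk level, rather than the number of interferers, sets the penalty, while the kurtosis contrast shows why single-interferer and aggregate noise must be treated with different statistics.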