Machine Learning in Short-Reach Optical Systems: A Comprehensive Survey

Research output: Contribution to journal › Review article › Contributed › Peer-reviewed

Contributors

  • Chen Shao, Karlsruhe Institute of Technology (Author)
  • Elias Giacoumidis, VPIphotonics GmbH (Author)
  • Syed Moktacim Billah, Karlsruhe Institute of Technology (Author)
  • Shi Li, VPIphotonics GmbH (Author)
  • Jialei Li, VPIphotonics GmbH (Author)
  • Prashasti Sahu, Chemnitz University of Technology (Author)
  • André Richter, VPIphotonics GmbH (Author)
  • Michael Faerber, Karlsruhe Institute of Technology (Author)
  • Tobias Kaefer, Karlsruhe Institute of Technology (Author)

Abstract

Recently, extensive research has been conducted to explore the use of machine learning (ML) algorithms in various direct-detection and (self-)coherent short-reach communication applications. These applications span a wide range of tasks, including bandwidth request prediction, signal quality monitoring, fault detection, traffic prediction, and digital signal processing (DSP)-based equalization. As a versatile approach, ML can address stochastic phenomena in optical systems and networks where deterministic methods fall short. However, for DSP equalization algorithms such as feed-forward/decision-feedback equalizers (FFEs/DFEs) and Volterra-based nonlinear equalizers, the performance improvements are often marginal and the complexity prohibitively high, especially in cost-sensitive short-reach communication scenarios such as passive optical networks (PONs). Time-series ML models offer distinct advantages over frequency-domain models in specific contexts: they excel at capturing temporal dependencies, handle irregular or nonlinear patterns effectively, and accommodate variable time intervals. In this survey, we outline the application of ML techniques in short-reach communications, with particular emphasis on bandwidth-demanding PONs. We introduce a novel taxonomy for time-series methods employed in ML signal processing, providing a structured classification framework. The taxonomy categorizes current time-series methods into four distinct groups: traditional methods, Fourier convolution-based methods, transformer-based models, and time-series convolutional networks. Finally, we highlight prospective research directions in this rapidly evolving field and outline specific solutions to mitigate the complexity of hardware implementations. By addressing these complexity concerns, we aim to pave the way for more practical and efficient deployment of ML approaches in short-reach optical communication systems.
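
As a concrete illustration of the DSP-based equalization the abstract refers to (FFEs and Volterra-based nonlinear equalizers), the following is a minimal, self-contained sketch, not taken from the paper: a linear feed-forward equalizer with an optional second-order Volterra feature expansion, fitted by least squares on a known training sequence. The tap count, toy channel, and all function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation): a linear FFE
# fitted by least squares, with an optional second-order Volterra term.
import numpy as np

def build_features(rx, num_taps=11, volterra=False):
    """Stack delayed copies of the received samples into a feature matrix."""
    pad = num_taps // 2
    padded = np.pad(rx, (pad, pad), mode="edge")
    # Each row holds the num_taps received samples centred on one symbol.
    taps = np.stack([padded[i:i + len(rx)] for i in range(num_taps)], axis=1)
    if volterra:
        # Second-order Volterra kernel: pairwise products of the linear taps.
        cross = np.einsum("ni,nj->nij", taps, taps).reshape(len(rx), -1)
        taps = np.concatenate([taps, cross], axis=1)
    return taps

def train_ffe(rx_train, tx_train, num_taps=11, volterra=False):
    """Least-squares fit of the equalizer weights on a known training sequence."""
    X = build_features(rx_train, num_taps, volterra)
    w, *_ = np.linalg.lstsq(X, tx_train, rcond=None)
    return w

def equalize(rx, w, num_taps=11, volterra=False):
    return build_features(rx, num_taps, volterra) @ w

# Toy usage: recover a PAM-4 sequence distorted by a short channel plus noise.
rng = np.random.default_rng(0)
tx = rng.choice([-3.0, -1.0, 1.0, 3.0], size=5000)
rx = np.convolve(tx, [0.1, 0.8, 0.25], mode="same") + 0.05 * rng.normal(size=tx.size)
w = train_ffe(rx[:4000], tx[:4000], volterra=True)
print("MSE:", np.mean((equalize(rx[4000:], w, volterra=True) - tx[4000:]) ** 2))
```

Note how the second-order Volterra expansion grows the feature count roughly quadratically with the number of taps, which is the kind of complexity concern the abstract raises for cost-sensitive short-reach links.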

Details

Original language: English
Article number: 613
Journal: Photonics
Volume: 11
Issue number: 7
Publication status: Published - Jul 2024
Peer-reviewed: Yes
Externally published: Yes

External IDs

ORCID /0000-0001-5458-8645/work/199964125

Keywords

  • bit-error ratio, equalization, machine learning, modulation format identification, nonlinearities, optical communications, optical performance monitoring, optical signal-to-noise ratio, passive optical network