Machine Learning in Short-Reach Optical Systems: A Comprehensive Survey
Published: 2024-06-28
Volume: 11
Issue: 7
Page: 613
ISSN: 2304-6732
Container-title: Photonics
Short-container-title: Photonics
Language: en
Author:
Shao Chen (1), Giacoumidis Elias (2), Billah Syed Moktacim (1), Li Shi (2), Li Jialei (2), Sahu Prashasti (3), Richter André (2), Faerber Michael (1), Kaefer Tobias (1)
Affiliation:
1. Department of Economics and Management, Karlsruhe Institute of Technology (KIT), 76131 Karlsruhe, Germany
2. VPIphotonics GmbH, Hallerstraße 6, 10587 Berlin, Germany
3. Electronic and Information Engineering, Technical University of Chemnitz, Str. der Nationen 62, 09111 Chemnitz, Germany
Abstract
Recently, extensive research has been conducted to explore the utilization of machine learning (ML) algorithms in various direct-detection and (self-)coherent short-reach communication applications. These applications encompass a wide range of tasks, including bandwidth request prediction, signal quality monitoring, fault detection, traffic prediction, and digital signal processing (DSP)-based equalization. As a versatile approach, ML demonstrates the ability to address stochastic phenomena in optical systems and networks where deterministic methods may fall short. However, when it comes to DSP equalization algorithms such as feed-forward/decision-feedback equalizers (FFEs/DFEs) and Volterra-based nonlinear equalizers, their performance improvements are often marginal, and their complexity is prohibitively high, especially in cost-sensitive short-reach communication scenarios such as passive optical networks (PONs). Time-series ML models offer distinct advantages over frequency-domain models in specific contexts: they excel at capturing temporal dependencies, handle irregular or nonlinear patterns effectively, and accommodate variable time intervals. Within this survey, we outline the application of ML techniques in short-reach communications, specifically emphasizing their utilization in bandwidth-demanding PONs. We introduce a novel taxonomy for time-series methods employed in ML signal processing, providing a structured classification framework. Our taxonomy categorizes current time-series methods into four distinct groups: traditional methods, Fourier convolution-based methods, transformer-based models, and time-series convolutional networks. Finally, we highlight prospective research directions within this rapidly evolving field and outline specific solutions to mitigate the complexity associated with hardware implementations. We aim to pave the way for more practical and efficient deployment of ML approaches in short-reach optical communication systems by addressing complexity concerns.
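To illustrate the kind of DSP baseline against which the surveyed ML equalizers are compared, the sketch below implements a linear feed-forward equalizer (FFE) trained with the LMS algorithm in NumPy. The tap count, step size, and toy two-tap ISI channel are illustrative assumptions for a PAM-2 signal, not parameters taken from the survey.

```python
import numpy as np

def lms_ffe(rx, tx, n_taps=7, mu=0.01):
    """Train a feed-forward equalizer (FFE) with the LMS algorithm.

    rx: received (distorted) samples; tx: known training symbols.
    Returns the learned tap weights and the equalized output.
    """
    w = np.zeros(n_taps)
    w[0] = 1.0  # main-tap initialization (channel assumed causal)
    pad = np.concatenate([np.zeros(n_taps - 1), rx])
    out = np.zeros(len(tx))
    for k in range(len(tx)):
        x = pad[k:k + n_taps][::-1]  # [rx[k], rx[k-1], ..., rx[k-n_taps+1]]
        y = w @ x
        e = tx[k] - y                # error against the known training symbol
        w += mu * e * x              # LMS tap update
        out[k] = y
    return w, out

# Toy demo: PAM-2 symbols through a two-tap ISI channel plus Gaussian noise.
rng = np.random.default_rng(0)
tx = rng.choice([-1.0, 1.0], size=5000)
isi = 0.4 * np.concatenate([[0.0], tx[:-1]])     # hypothetical channel
rx = tx + isi + 0.3 * rng.standard_normal(tx.size)
w, out = lms_ffe(rx, tx)
ber_before = np.mean(np.sign(rx) != tx)
ber_after = np.mean(np.sign(out[1000:]) != tx[1000:])  # skip LMS convergence
```

In cost-sensitive PON scenarios, the per-symbol complexity of such a filter scales with the tap count, which is one reason the survey emphasizes the marginal gains of longer FFE/DFE or Volterra structures relative to their hardware cost.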
Funder
Federal Ministry of Education and Research