1. Attention is all you need; Vaswani; Advances in Neural Information Processing Systems, 2017
2. Improving neural machine translation for low resource languages through non-parallel corpora: a case study of Egyptian dialect to modern standard Arabic translation; Faheem; Scientific Reports, 2024
3. Sequence to point learning based on an attention neural network for nonintrusive load decomposition; Yang; arXiv preprint arXiv:2107.01457, 2021
4. DrBERT: Unveiling the potential of masked language modeling decoder in BERT pretraining; Liang; arXiv preprint arXiv:2401.15861, 2024
5. Improving neural machine translation for low resource languages through non-parallel corpora: a case study of Egyptian dialect to modern standard Arabic translation; Laghari; Scientific Reports, 2023