Authors: Dyer, Laurence; Hughes, Anthony; Can, Burcu
Publisher: Springer International Publishing
References (39 articles)
1. Al-Rfou, R., Choe, D., Constant, N., Guo, M., Jones, L.: Character-level language modeling with deeper self-attention. Proc. AAAI Conf. Artif. Intell. 33(01), 3159–3166 (2019). https://doi.org/10.1609/aaai.v33i01.33013159. https://ojs.aaai.org/index.php/AAAI/article/view/4182
2. Bakare, A.M., Anbananthen, K.S.M., Muthaiyah, S., Krishnan, J., Kannan, S.: Punctuation restoration with transformer model on social media data. Appl. Sci. 13(3), 1685 (2023). https://doi.org/10.3390/app13031685. https://www.mdpi.com/2076-3417/13/3/1685
3. Chay-intr, T., Kamigaito, H., Okumura, M.: Character-based Thai word segmentation with multiple attentions. In: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pp. 264–273. INCOMA Ltd., Held Online (2021). https://aclanthology.org/2021.ranlp-1.31
4. Conneau, A., Khandelwal, K., Goyal, N., Chaudhary, V., Wenzek, G., Guzmán, F., Grave, E., Ott, M., Zettlemoyer, L., Stoyanov, V.: Unsupervised cross-lingual representation learning at scale. Computing Research Repository arXiv:1911.02116 (2019). http://arxiv.org/abs/1911.02116. Version 2
5. Courtland, M., Faulkner, A., McElvain, G.: Efficient automatic punctuation restoration using bidirectional transformers with robust inference. In: Proceedings of the 17th International Conference on Spoken Language Translation, pp. 272–279. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.iwslt-1.33. https://aclanthology.org/2020.iwslt-1.33