Authors:
Anton Chernyavskiy, Dmitry Ilvovsky, Preslav Nakov
Publisher:
Springer International Publishing
References: 35 articles.
Cited by:
22 articles.