Publisher
Springer Nature Switzerland
References: 43 articles.
Cited by: 2 articles.