1. Alex, B., Haddow, B., & Grover, C. (2007). Recognising Nested Named Entities in Biomedical Text. In Biological, translational, and clinical language processing (BioNLP) (pp. 65–72).
2. Chiu, J. P. C., & Nichols, E. (2016). Named Entity Recognition with Bidirectional LSTM-CNNs. Transactions of the Association for Computational Linguistics, 4, 357–370.
3. Clark, K., Luong, M.-T., Le, Q. V., & Manning, C. D. (2020). ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators. In International conference on learning representations.
4. Cui, L., & Zhang, Y. (2019). Hierarchically-Refined Label Attention Network for Sequence Labeling. In Proceedings of the 2019 conference on empirical methods in natural language processing and the 9th international joint conference on natural language processing (pp. 4115–4128).
5. Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 conference of the North American chapter of the association for computational linguistics: Human language technologies, volume 1 (long and short papers) (pp. 4171–4186).