1. National Assembly: Law on Employment 2013, No. 38/2013/QH13 (2013)
2. National Assembly: Labor Code 2019, No. 45/2019/QH14 (2019)
3. Bui, T.V., Tran, O.T., Le-Hong, P.: Improving sequence tagging for Vietnamese text using transformer-based neural models. CoRR abs/2006.15994 (2020). https://arxiv.org/abs/2006.15994
4. Dale, R.: Law and word order: NLP in legal tech. Nat. Lang. Eng. 25(1), 211–217 (2019)
5. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota, June 2019