1. Bordes, A., Usunier, N., García-Durán, A., et al.: Translating embeddings for modeling multi-relational data. In: Advances in Neural Information Processing Systems 26 (2013)
2. Gururangan, S., Marasović, A., Swayamdipta, S., et al.: Don’t stop pretraining: adapt language models to domains and tasks. arXiv preprint arXiv:2004.10964 (2020)
3. Houlsby, N., Giurgiu, A., Jastrzebski, S., et al.: Parameter-efficient transfer learning for NLP. In: International Conference on Machine Learning. PMLR, pp. 2790–2799 (2019)
4. Devlin, J., Chang, M.W., Lee, K., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Yang, S., Wang, J., Meng, F., et al.: Text mining techniques for knowledge of defects in power equipment. In: 2021 10th IEEE International Conference on Communication Systems and Network Technologies (CSNT). IEEE, pp. 205–210 (2021)