1. Abid, A., 2022. Hugging Face Datasets. https://github.com/huggingface/datasets.
2. Cer, D., Yang, Y., Kong, S.Y., Hua, N., Limtiaco, N., St. John, R., Constant, N., Guajardo-Cespedes, M., Yuan, S., Tar, C., Strope, B., Kurzweil, R., 2018. Universal sentence encoder for English. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pp. 169–174. https://doi.org/10.18653/v1/D18-2029.
3. Chandrasekaran, V., Chaudhuri, K., Giacomelli, I., Jha, S., Yan, S., 2020. Exploring connections between active learning and model extraction. In: 29th USENIX Security Symposium (USENIX Security 20), pp. 1309–1326.
4. Cer, D., Yang, Y., 2022. Universal sentence encoder. https://tfhub.dev/google/universal-sentence-encoder/5.
5. Devlin, J., Chang, M.W., Lee, K., Toutanova, K., 2019. BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186.