1. Bert: Pre-training of deep bidirectional transformers for language understanding;Kenton,2019
2. On the sentence embeddings from pre-trained language models;Li,2020
3. Sbert-wk: A sentence embedding method by dissecting bert-based word models;Wang;IEEE/ACM Trans Audio Speech Lang. Process.,2020
4. An efficient framework for learning sentence representations;Logeswaran,2018
5. Representation learning with contrastive predictive coding;Oord,2018