Authors:
Gavali Prashantkumar M., Shirgave Suresh K.
Publisher:
Springer Nature Singapore
References (30 articles):
1. Radford A, Narasimhan K, Salimans T, Sutskever I (2018) Improving language understanding by generative pre-training
2. Devlin J, Chang M, Lee K, Toutanova K (2019) BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 conference of the North American chapter of the association for computational linguistics: human language technologies, vol 1, June 2019
3. Pennington J, Socher R, Manning CD (2014) GloVe: global vectors for word representation
4. Shen Y, Liu J (2021) Comparison of text sentiment analysis based on BERT and Word2vec. In: 2021 IEEE 3rd international conference on frontiers technology of information and computer (ICFTIC), pp 144–147
5. Turney PD, Pantel P (2010) From frequency to meaning: vector space models of semantics. J Artif Intell Res 37:141–188