Author:
Yu Shuiyuan, Zhang Zihao, Liu Haitao
Abstract
Word order is one of the most important grammatical devices and a basis for language understanding. However, the Transformer, one of the most popular NLP architectures, does not explicitly encode word order. A common solution to this problem is to incorporate position information by means of position encoding/embedding (PE). Although a variety of methods for incorporating position information have been proposed, the NLP community still lacks detailed statistical research on position information in real-life language. To understand in more detail how position information influences the correlation between words, we investigated the factors that affect the frequency of words and word sequences in large corpora. Our results show that absolute position, relative position, occurrence at either end of a sentence, and sentence length all significantly affect the frequency of words and word sequences. In addition, we observed that the frequency distribution of word sequences over relative position carries valuable grammatical information. Our study suggests that, to accurately capture word–word correlations, it is not enough to focus merely on absolute and relative position: Transformers should have access to more types of position-related information, which may require improvements to the current architecture.
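The core measurements the abstract describes are frequency counts of words and word sequences conditioned on position. Below is a minimal sketch of how such position-conditioned frequencies could be tabulated; the function name, toy corpus, and whitespace tokenization are illustrative assumptions, not details taken from the paper.

from collections import Counter

def position_frequency_tables(sentences):
    """Tabulate (a) how often each word occurs at each absolute
    position within a sentence and (b) how often each ordered word
    pair occurs at each relative distance. A minimal sketch; a real
    corpus study would also control for sentence length and
    normalize the raw counts."""
    abs_freq = Counter()  # (word, absolute position) -> count
    rel_freq = Counter()  # ((left word, right word), distance) -> count
    for tokens in sentences:
        for i, w in enumerate(tokens):
            abs_freq[(w, i)] += 1
        for i in range(len(tokens)):
            for j in range(i + 1, len(tokens)):
                rel_freq[((tokens[i], tokens[j]), j - i)] += 1
    return abs_freq, rel_freq

# Toy usage on a hypothetical two-sentence corpus:
corpus = [
    "the cat sat on the mat".split(),
    "the dog slept on the rug".split(),
]
abs_freq, rel_freq = position_frequency_tables(corpus)
print(abs_freq[("the", 0)])          # sentence-initial 'the': 2
print(rel_freq[(("the", "on"), 3)])  # 'the' ... 'on' at distance 3: 2

Tables of this kind make it possible to ask the questions the abstract raises: whether a word's frequency depends on its absolute position, on its distance to another word, or on its proximity to a sentence boundary.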
Publisher
Cambridge University Press (CUP)
Subject
Artificial Intelligence, Linguistics and Language, Language and Linguistics, Software