Abstract
Typically, user behaviour occurs continuously, and modelling this dynamic sequential correlation can lead to more accurate recommendations. Sequential recommendation systems have therefore become an important means of addressing network information overload. However, existing attention mechanisms remain insufficient for modelling users’ dynamic and diverse preferences. This paper presents a recommendation model based on a multi-head self-attention mechanism and multi-temporal embeddings of long- and short-term interests (MSMT-LSI). MSMT-LSI balances users’ long- and short-term interests through two multi-head self-attention networks and fuses them into a hybrid representation for recommendation. After identifying the most suitable parameter combinations for MSMT-LSI through parameter sensitivity analysis and verifying the advantages of the long- and short-term fusion strategy, experiments on five well-known benchmark datasets show that MSMT-LSI outperforms classical models on the same benchmarks.
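The abstract does not give the model’s equations, but a minimal sketch of the two-branch idea it describes might look like the following PyTorch module. All layer sizes, the short-window length `short_len`, the gated-fusion step, and every name here are illustrative assumptions, not the paper’s actual MSMT-LSI architecture:

```python
import torch
import torch.nn as nn

class LongShortTermFusion(nn.Module):
    """Sketch of a two-branch self-attention recommender: one multi-head
    attention network over the full history (long-term interests), one over
    the most recent items (short-term interests), fused by a learned gate
    into a hybrid user representation. Hypothetical, not the paper's model."""

    def __init__(self, num_items, d_model=64, n_heads=2, short_len=5):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(512, d_model)   # simple positional/temporal embedding
        self.long_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.short_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Linear(2 * d_model, 1)       # fusion gate over both interest vectors
        self.short_len = short_len

    def forward(self, seq):                          # seq: (batch, seq_len) item ids
        pos = torch.arange(seq.size(1), device=seq.device)
        x = self.item_emb(seq) + self.pos_emb(pos)   # (batch, seq_len, d_model)

        long_out, _ = self.long_attn(x, x, x)        # attend over the full sequence
        long_vec = long_out.mean(dim=1)              # pooled long-term interest

        recent = x[:, -self.short_len:, :]           # only the last few interactions
        short_out, _ = self.short_attn(recent, recent, recent)
        short_vec = short_out.mean(dim=1)            # pooled short-term interest

        g = torch.sigmoid(self.gate(torch.cat([long_vec, short_vec], dim=-1)))
        hybrid = g * long_vec + (1 - g) * short_vec  # hybrid representation
        return hybrid @ self.item_emb.weight.T       # scores over all items
```

The gated sum is one plausible way to “balance” the two interest signals; the paper may instead use concatenation, attention-based weighting, or another fusion operator.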
Funder
National Key Research and Development Program
National Natural Science Foundation of China
Publisher
Springer Science and Business Media LLC
Cited by
1 article.