A Multi-strategy-based Pre-training Method for Cold-start Recommendation

Authors:

Bowen Hao¹, Hongzhi Yin², Jing Zhang¹, Cuiping Li¹, Hong Chen¹

Affiliation:

1. Renmin University of China, Beijing, China

2. The University of Queensland, Brisbane, Australia

Abstract

The cold-start issue is a fundamental challenge in recommender systems. PT-GNN, a recent self-supervised learning (SSL) model built on Graph Neural Networks (GNNs), pre-trains the GNN to reconstruct cold-start embeddings and has shown great potential for cold-start recommendation. However, due to the over-smoothing problem, PT-GNN can only capture up to 3-order relations, which provide little useful auxiliary information for depicting the target cold-start user or item. Moreover, the embedding reconstruction task only considers the intra-correlations within the subgraph of users and items, while ignoring the inter-correlations across different subgraphs. To address these challenges, we propose a multi-strategy-based pre-training method for cold-start recommendation (MPT), which extends PT-GNN in both model architecture and pretext tasks to improve cold-start recommendation performance. Specifically, in terms of the model architecture, in addition to the short-range dependencies of users and items captured by the GNN encoder, we introduce a Transformer encoder to capture long-range dependencies. In terms of the pretext tasks, in addition to considering the intra-correlations of users and items through the embedding reconstruction task, we add an embedding contrastive learning task to capture the inter-correlations of users and items. We train the GNN and Transformer encoders on these pretext tasks under the meta-learning setting to simulate the real cold-start scenario, enabling the model to adapt easily and rapidly to new cold-start users and items. Experiments on three public recommendation datasets show the superiority of the proposed MPT model over vanilla GNN models and the pre-training GNN model on both user/item embedding inference and the recommendation task.
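The embedding contrastive learning task mentioned above can be illustrated with a minimal InfoNCE-style sketch over two augmented views of the same batch of user/item embeddings. This is a generic contrastive objective, not the paper's exact formulation; the function name, batch layout, and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce(view_a, view_b, temperature=0.2):
    """InfoNCE-style contrastive loss over two views of a batch of
    user/item embeddings; row i of each view is a positive pair,
    all other rows serve as in-batch negatives."""
    # L2-normalize rows so dot products are cosine similarities
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal; the loss is their mean negative log-likelihood
    return -np.mean(np.diag(log_prob))
```

Pulling matched views together (low loss) while pushing apart embeddings of different users or items is what lets the task capture inter-correlations across subgraphs rather than only reconstruction error within one subgraph.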

Funder

National Key Research & Development Plan

National Natural Science Foundation of China

Beijing Natural Science Foundation

Australian Research Council Future Fellowship

Discovery Project

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Science Applications; General Business, Management and Accounting; Information Systems

Cited by 6 articles.

1. Self-Supervised Learning for Recommender Systems: A Survey;IEEE Transactions on Knowledge and Data Engineering;2024-01

2. Community Preserving Social Recommendation with Cyclic Transfer Learning;ACM Transactions on Information Systems;2023-12-29

3. Contrastive Learning-Based Music Recommendation Model;Communications in Computer and Information Science;2023-11-13

4. Contrastive Self-supervised Learning in Recommender Systems: A Survey;ACM Transactions on Information Systems;2023-11-08

5. User Cold Start Problem in Recommendation Systems: A Systematic Review;IEEE Access;2023
