Multimodal Recipe Recommendation with Heterogeneous Graph Neural Networks
Published: 2024-08-19
Volume: 13, Issue: 16, Page: 3283
ISSN: 2079-9292
Container-title: Electronics
Short-container-title: Electronics
Language: en
Authors:
Ouyang Ruiqi (1), Huang Haodong (2), Ou Weihua (1,2) [ORCID], Liu Qilong (1) [ORCID]
Affiliations:
1. School of Mathematical Sciences, Guizhou Normal University, Guiyang 550025, China
2. School of Big Data and Computer Science, Guizhou Normal University, Guiyang 550025, China
Abstract
Recipe recommendation is the process of recommending suitable recipes to users based on factors such as user preferences and dietary needs. Recipes typically involve multiple modalities, text and images being the most common, yet most existing recipe recommendation methods rely on text alone. A single modality is often not expressive enough, and images carry rich semantic information; moreover, it is difficult to choose an appropriate feature-fusion granularity across the different modalities and to model the relationship between users and recipes. To address these problems, this paper proposes a Multimodal Heterogeneous Graph Neural Network Recipe Recommendation (MHGRR) architecture, which aims to fully fuse the modal information of recipes and to model the user–recipe relationship. We use embeddings and shallow Convolutional Neural Networks (CNNs) to extract features from raw text and images at a unified fusion granularity, and a Heterogeneous Graph Neural Network based on GraphSAGE to capture the complex relationships between users and recipes. To verify the effectiveness of the proposed model, we conduct comparative experiments on a real dataset; the results show that our method outperforms most popular recipe recommendation methods. An ablation study shows that adding image information to recipe recommendation is effective, and we additionally found that model performance varies little as the output dimension of GraphSAGE increases.
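The pipeline described in the abstract can be sketched in miniature: recipe text and image features fused at a uniform granularity (here by simple concatenation), followed by one GraphSAGE-style mean-aggregation step over a bipartite user–recipe graph, and scoring by inner product. All data, dimensions, and the weight matrix below are hypothetical toy values, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bipartite user-recipe graph (assumed sizes, not from the paper).
# Each recipe has a text embedding and an image embedding; the two
# modalities are fused here by concatenation into one recipe vector.
n_users, n_recipes, d_txt, d_img = 4, 5, 8, 8
d = d_txt + d_img
text_feat = rng.normal(size=(n_recipes, d_txt))
image_feat = rng.normal(size=(n_recipes, d_img))
recipe_feat = np.concatenate([text_feat, image_feat], axis=1)  # (5, 16)
user_feat = rng.normal(size=(n_users, d))

# Interaction edges of the heterogeneous graph: user -> recipes clicked.
edges = {0: [0, 1], 1: [1, 2], 2: [2, 3], 3: [3, 4]}

def relu(x):
    return np.maximum(x, 0.0)

def sage_layer(self_h, neigh_hs, W):
    """One GraphSAGE step: concatenate the node's own state with the
    mean of its neighbours' states, project, ReLU, and L2-normalise."""
    agg = neigh_hs.mean(axis=0)
    h = relu(W @ np.concatenate([self_h, agg]))
    return h / (np.linalg.norm(h) + 1e-8)

# Hypothetical shared projection for user nodes.
W_user = rng.normal(size=(d, 2 * d)) * 0.1
user_emb = np.stack([
    sage_layer(user_feat[u], recipe_feat[nbrs], W_user)
    for u, nbrs in edges.items()
])

# Rank all recipes for user 0 by inner product with normalised recipe vectors.
recipe_emb = recipe_feat / (np.linalg.norm(recipe_feat, axis=1, keepdims=True) + 1e-8)
scores = recipe_emb @ user_emb[0]
ranking = np.argsort(-scores)
print("top recipe for user 0:", int(ranking[0]))
```

A trained version would learn `W_user` (and a matching recipe-side projection) from interaction data; this sketch only illustrates the aggregate-project-normalise shape of a GraphSAGE layer on a heterogeneous graph.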