Abstract
Cross-Modal Hashing (CMH) retrieval methods have garnered increasing attention within the information retrieval research community because the computational efficiency of hash-based methods allows them to handle large amounts of data. To date, the focus of cross-modal hashing methods has been on training with paired data. Paired data refers to samples with a one-to-one correspondence across modalities, e.g., image and text pairs where the text sample describes the image. However, real-world applications produce unpaired data that cannot be utilised by most current CMH methods during training. Models that can learn from unpaired data are crucial for real-world applications such as cross-modal neural information retrieval, where paired data is limited or unavailable for training. This paper (1) provides an overview of CMH methods applied to unpaired datasets, (2) proposes a framework that enables pairwise-constrained CMH methods to train with unpaired samples, and (3) evaluates the performance of state-of-the-art CMH methods across different pairing scenarios.
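The computational efficiency mentioned above comes from comparing compact binary codes in Hamming space rather than dense real-valued embeddings. The following is a minimal illustrative sketch (not taken from the paper; the code length, collection size, and array names are assumptions) of how a cross-modal query can be resolved once image and text samples have been hashed to binary codes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary hash codes (hypothetical sizes and names, not from the paper):
# 64-bit codes for a text "database" and a single image query.
n_bits = 64
database_codes = rng.integers(0, 2, size=(100_000, n_bits), dtype=np.uint8)
query_code = rng.integers(0, 2, size=n_bits, dtype=np.uint8)

# The Hamming distance between the query code and every database code reduces
# to an element-wise comparison and a count, which is why hash-based retrieval
# scales to large collections.
distances = np.count_nonzero(database_codes != query_code, axis=1)

# Retrieve the 5 database items closest to the query in Hamming space.
top_k = np.argsort(distances)[:5]
print(top_k, distances[top_k])
```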
Subject
Electrical and Electronic Engineering; Computer Graphics and Computer-Aided Design; Computer Vision and Pattern Recognition; Radiology, Nuclear Medicine and Imaging