In recent years, several hashing methods have been proposed for multi-label image retrieval. However, most existing methods quantify the similarity of image pairs only coarsely, considering nothing beyond category labels. In addition, common pairwise loss functions are insensitive to the relative ranking of similar images. To address these problems, we present a deep semantic-aware ranking preserving hashing (DSRPH) method. First, we design a semantic-aware similarity quantization method that measures fine-grained semantic similarity beyond category labels, based on the cosine similarity of image captions, which carry high-level semantic descriptions. Second, we propose a novel weighted pairwise loss function with adaptive upper and lower bounds, which constructs a compact zero-loss interval to directly constrain the relative order of similar images. Extensive experiments show that our method generates high-quality hash codes and yields state-of-the-art performance.
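The two ideas in the abstract can be illustrated with a minimal, hypothetical sketch. It is not the DSRPH implementation: caption similarity is computed here with simple bag-of-words vectors rather than learned embeddings, and `interval_loss` (with the assumed parameters `n_bits` and `eps`) is only a toy analogue of a pairwise loss whose zero-loss interval is centered on a similarity-dependent target distance.

```python
import math
from collections import Counter

def caption_cosine(caption_a, caption_b):
    """Cosine similarity between bag-of-words vectors of two captions.

    A stand-in for the semantic-aware similarity quantization: captions
    sharing more descriptive words score higher than category labels alone
    would indicate.
    """
    va, vb = Counter(caption_a.lower().split()), Counter(caption_b.lower().split())
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def interval_loss(hamming_dist, semantic_sim, n_bits=16, eps=1.0):
    """Toy pairwise loss with a zero-loss interval (hypothetical).

    The target Hamming distance shrinks as semantic similarity grows;
    distances within eps of the target incur no loss, so pairs are
    penalized only when their relative order is violated.
    """
    target = (1.0 - semantic_sim) * n_bits
    return max(0.0, abs(hamming_dist - target) - eps)
```

For example, with `semantic_sim = 0.5` and 16 bits the target distance is 8, so any Hamming distance in [7, 9] yields zero loss, while a distance of 12 is penalized.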
Bibliographical note: Supported by National Key R&D Program of China (No. 2017YFB1402400), National Natural Science Foundation of China (No. 61762025), Guangxi Key Laboratory of Trusted Software (No. kx202006), Guangxi Key Laboratory of Optoelectronic Information Processing (No. GD18202), Natural Science Foundation of Guangxi Province, China (No. 2019GXNSFDA185007), National Natural Science Foundation of China Grant 61672443, Hong Kong GRF-RGC General Research Fund under Grant 9042322 (CityU 11200116), Grant 9042489 (CityU 11206317), and Grant 9042816 (CityU 11209819), in part by the Key Project of Science and Technology Innovation 2030 supported by the Ministry of Science and Technology of China under Grant No. 2018AAA0101301, and in part by the Guangxi Key Laboratory of Cryptography and Information Security under Grant GCIS201906.
- Deep supervised hashing
- Image retrieval
- Ranking preserving
- Similarity quantization