Deep hash method based on metric learning

A metric learning and hashing technology, applied in the field of computer vision and image processing, which solves the problems that the signs of same-category features are not encouraged to match, that misjudgments occur, and that hash codes have poor discrimination, achieving the effects of fast and accurate image retrieval, accurate hash coding, and small Hamming distance between codes of the same category.

Active Publication Date: 2020-09-01
BEIJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0007] The present invention addresses the shortcomings of the contrastive loss function used in existing deep hashing methods: it can only make the feature vectors of same-category images as close as possible before quantization, but cannot encourage their signs to be identical; it can only push the pre-quantization values of different-category images as far apart as possible, but cannot encourage their signs to be opposite. This ultimately leads to poor discrimination of the quantized hash codes, causing misjudgment and other problems. To solve these problems, a deep hashing method based on metric learning is provided.
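To make this limitation concrete, the short numpy sketch below (illustrative values only, not taken from the patent) shows two same-category feature vectors that are close before quantization yet quantize to different hash bits:

```python
import numpy as np

# Illustrative values only (not from the patent): two same-category features that are
# close in Euclidean distance before quantization but do not share the same signs.
f_i = np.array([0.05, -0.03, 0.9])
f_j = np.array([-0.05, 0.03, 0.8])

print(np.linalg.norm(f_i - f_j))    # ~0.15, i.e. the contrastive loss is satisfied
print(np.sign(f_i), np.sign(f_j))   # [ 1. -1.  1.] vs [-1.  1.  1.]

# After sign quantization the hash codes differ in 2 of 3 bits, so the Hamming
# distance is large even though the real-valued features were close.
```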




Embodiment Construction

[0026] This embodiment is described with reference to Figure 1 and Figure 2. The deep hashing method based on metric learning is implemented by the following steps:

[0027] 1. Construct training samples in the form of triplets;

[0028] Input data: the input for each training step is a triplet consisting of two images and the label relationship between them:

[0029] {X_i, X_j, S_ij}

[0030] where X denotes an image and S_ij denotes the label relationship between images X_i and X_j: it is 1 when the two images belong to the same category and 0 when they belong to different categories, that is:

[0031] S_ij = 1 if X_i and X_j belong to the same category, and S_ij = 0 otherwise.
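As a concrete illustration of this step, the minimal sketch below (not the patent's code; the (image, label) input structure and helper name are assumed) builds such training samples from a labeled image set:

```python
import random

def build_triplets(labeled_images, num_samples=10000):
    """Build training samples {X_i, X_j, S_ij} from (image, class_label) pairs.

    labeled_images: list of (image, label) tuples -- an assumed input structure.
    S_ij is 1 when the two images share a label and 0 otherwise.
    """
    triplets = []
    for _ in range(num_samples):
        (x_i, y_i), (x_j, y_j) = random.sample(labeled_images, 2)
        s_ij = 1 if y_i == y_j else 0
        triplets.append((x_i, x_j, s_ij))
    return triplets
```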

[0032] Before the triplet is fed to the neural network for feature extraction, X_i and X_j must undergo scaling (resize) and cropping (crop) operations so that X_i and X_j have the same image dimensions: each image is scaled to 256px by 256px, and a 227px by 227px content area is randomly cropped from it.
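A minimal preprocessing sketch for this step, using torchvision (an assumed implementation choice; the patent does not specify a library):

```python
from torchvision import transforms

# Resize to 256x256, then randomly crop a 227x227 content area, so that X_i and X_j
# entering the network have identical dimensions.
preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.RandomCrop(227),
    transforms.ToTensor(),
])

# x_i = preprocess(pil_image_i)   # applied independently to both images of the triplet
# x_j = preprocess(pil_image_j)
```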

[0033] 2. Build a deep neural network;

[0034] Referring to the existing deep convolutional n...
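This paragraph is truncated in the source. For context, deep hashing networks typically replace the classifier of a convolutional backbone with a fully connected hash layer of K output units. The sketch below is only an assumption based on the 227px by 227px input size, which matches an AlexNet-style network, and is not the patent's exact architecture:

```python
import torch
import torch.nn as nn
from torchvision import models

class DeepHashNet(nn.Module):
    """Convolutional backbone with a K-bit hash layer (illustrative sketch)."""

    def __init__(self, hash_bits=48):
        super().__init__()
        backbone = models.alexnet(weights=None)   # 227x227 input suggests an AlexNet-style net
        self.features = backbone.features
        self.avgpool = backbone.avgpool
        # Reuse the fully connected layers but drop the final classification layer.
        self.fc = nn.Sequential(*list(backbone.classifier.children())[:-1])
        self.hash_layer = nn.Linear(4096, hash_bits)   # real-valued feature f before quantization

    def forward(self, x):
        x = self.features(x)
        x = self.avgpool(x)
        x = torch.flatten(x, 1)
        return self.hash_layer(self.fc(x))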



Abstract

The invention discloses a deep hashing method based on metric learning and relates to the fields of computer vision and image processing. It addresses the shortcomings of the contrastive loss function used in existing deep hashing methods: the loss can only make the feature vectors of same-category images as close as possible before quantization, without encouraging their signs to be identical, and can only push the pre-quantization values of different-category images far apart, without encouraging their signs to be opposite; as a result, the quantized hash codes have poor discriminability, which causes misjudgment and other problems. The invention constructs a hash contrastive loss function that imposes a sign-bit constraint on the real-valued feature vector before quantization, so that the hash code obtained by quantizing the real-valued feature vector with the sign function represents the image more accurately. The signs are constrained through two control functions, f_sim(f_i·f_j) and f_diff(f_i·f_j), while the remaining terms of the expression make the feature values of same-category images close and the feature values of different-category images far apart. The method effectively improves classification precision and reduces the misjudgment rate.
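The abstract does not give the exact form of the loss. The sketch below is only one plausible reading of the described structure: the margin, the weighting factor, and the particular choices of f_sim and f_diff are assumptions for illustration, not the patent's definition.

```python
import torch
import torch.nn.functional as F

def hash_contrastive_loss(f_i, f_j, s_ij, margin=2.0, lam=0.1):
    """Illustrative hash contrastive loss with a sign-bit constraint (assumed form).

    f_i, f_j: real-valued feature vectors before quantization, shape (batch, K).
    s_ij:     1 for same-category pairs, 0 for different-category pairs.
    """
    d = torch.sum((f_i - f_j) ** 2, dim=1)    # squared Euclidean distance between features
    inner = torch.sum(f_i * f_j, dim=1)       # f_i . f_j, positive when signs tend to agree

    # Assumed control functions: f_sim penalizes disagreeing signs for similar pairs,
    # f_diff penalizes agreeing signs for dissimilar pairs.
    f_sim = F.relu(-inner)
    f_diff = F.relu(inner)

    loss_sim = s_ij * (d + lam * f_sim)                            # pull same-category features together
    loss_diff = (1 - s_ij) * (F.relu(margin - d) + lam * f_diff)   # push different-category features apart
    return (loss_sim + loss_diff).mean()
```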

Description

technical field [0001] The invention relates to the fields of computer vision and image processing, in particular to a deep hashing method based on metric learning. Background technique [0002] With the advent of the information age, information technology and storage technology have developed rapidly; massive data is generated every day, and the scale of image data in particular is growing explosively. Directly searching large-scale data by similarity inevitably leads to large time and space overhead. At the same time, because images have complex structure and high-dimensional features, guaranteeing both retrieval accuracy and retrieval efficiency on large-scale data sets is an urgent problem. [0003] The main process of the deep hashing method: first, a convolutional neural network is constructed from convolutional layers, pooling layers, etc. for feature extraction of images. Commonly used feature extra...
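After feature extraction, deep hashing methods quantize the real-valued features with the sign function and retrieve images by Hamming distance over the resulting binary codes. A minimal sketch of that quantization and lookup step (illustrative only, not the patent's code):

```python
import numpy as np

def to_hash_code(features):
    """Quantize real-valued feature vectors to {0, 1} hash codes with the sign function."""
    return (np.sign(features) > 0).astype(np.uint8)

def hamming_distance(code_a, code_b):
    """Number of bit positions in which two hash codes differ."""
    return int(np.count_nonzero(code_a != code_b))

# Example lookup: find the database image whose hash code is closest to the query's.
query_code = to_hash_code(np.array([0.3, -0.7, 0.1, -0.2]))
database_codes = to_hash_code(np.random.randn(1000, 4))
nearest = min(range(len(database_codes)),
              key=lambda k: hamming_distance(query_code, database_codes[k]))
```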

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F16/51G06F16/53G06N3/04G06N3/08
CPCG06F16/51G06F16/53G06N3/08G06N3/084G06N3/045Y02D10/00
Inventor 周蓝翔肖波王义飞王浩宇尹恒
Owner BEIJING UNIV OF POSTS & TELECOMM