
A multi-scale Hash retrieval method based on deep learning

A deep-learning, multi-scale technology applied in the field of image and multimedia signal processing. It addresses the problems of large quantization error, information loss, and long training time, achieving enhanced information retention, reduced quantization error, and faster encoding.

Pending Publication Date: 2019-06-28
SHANDONG UNIV
Cites: 8 · Cited by: 32

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems of excessively long training time, large quantization error, and information loss in the encoding process of deep-learning-based hash retrieval, the present invention proposes a multi-scale hash retrieval method based on deep learning.

Method used




Embodiment Construction

[0030] The specific process of the multi-scale hash retrieval method based on deep learning proposed by the present invention is shown in Figure 1. First, the data are preprocessed: a similarity matrix is obtained from the image labels and binary image pairs are generated. Next, the model is built and its parameters are adjusted, after which the model is trained with the backpropagation algorithm. Each image is then encoded to obtain a unique identifier for the data. Finally, the query image is encoded, the Hamming distance between the query image and each database image is calculated, the results are sorted in ascending order of Hamming distance, and the images whose Hamming distance is below a given threshold are returned as the retrieval result.
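The final retrieval step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names (`hamming_distance`, `retrieve`) and the toy 8-bit codes are our own, and we assume the trained network has already produced 0/1 code vectors.

```python
import numpy as np

def hamming_distance(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Number of differing bits between code `a` and each row of `b`."""
    return np.count_nonzero(a != b, axis=1)

def retrieve(query_code: np.ndarray, db_codes: np.ndarray, threshold: int):
    """Return database indices whose Hamming distance to the query is
    below `threshold`, sorted in ascending order of distance."""
    dists = hamming_distance(query_code, db_codes)
    order = np.argsort(dists, kind="stable")
    return [int(i) for i in order if dists[i] < threshold]

# Toy example with 8-bit codes (in practice the codes come from the network).
db = np.array([[0, 1, 1, 0, 1, 0, 0, 1],
               [0, 1, 1, 0, 1, 0, 1, 1],
               [1, 0, 0, 1, 0, 1, 1, 0]])
q = np.array([0, 1, 1, 0, 1, 0, 0, 1])
print(retrieve(q, db, threshold=3))   # → [0, 1]
```

The stable sort keeps ties in database order; only codes within the distance threshold are returned, matching the thresholded, ascending-order output described in [0030].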

[0031] The present invention will be further described below in conjunction with a specific embodiment (to which it is not limited) and the accompanying drawings.

[0032] (1) Data preprocessing

[00...



Abstract

Image pairing information and image classification information are optimized, and a hash-code quantization process is used to realize a simple, end-to-end deep multi-scale supervised hashing method. A new pyramid-connected convolutional neural network structure is designed, which takes paired images as training input and drives the output for each image toward a discrete hash code. In addition, the feature map of each convolutional layer is trained, and feature fusion is carried out during training, effectively improving the performance of the deep features. The neural network is constrained by a new binary-constraint loss function based on end-to-end learning, yielding hash codes with high feature-representation capability. High-quality multi-scale hash codes are learned dynamically and directly through an end-to-end network, improving the representation capability of the hash codes in large-scale image retrieval. Compared with existing hashing methods, the method achieves higher retrieval accuracy. Meanwhile, the network model is simple and flexible, can generate features with strong representation ability, and can be widely applied to other computer vision fields.
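The abstract's exact binary-constraint loss is not reproduced on this page. As a hedged sketch of the general idea, the following numpy function combines a pairwise similarity term (similar pairs should have close codes, dissimilar pairs far codes) with a quantization penalty that pulls the relaxed real-valued network outputs toward their discrete sign codes. The hinge margin `4.0` and the weight `lam` are illustrative assumptions, not values from the patent.

```python
import numpy as np

def binary_constraint_loss(U, S, lam=1.0):
    """Sketch of a binary-constraint objective.

    U   : n x k real-valued network outputs (relaxed hash codes).
    S   : n x n similarity matrix, S[i, j] = 1 for similar pairs, else 0.
    lam : weight of the quantization (binary-constraint) term (assumed).
    """
    n = U.shape[0]
    # Pairwise term: squared distances between all code pairs.
    D = np.square(U[:, None, :] - U[None, :, :]).sum(axis=2)
    # Similar pairs are pulled together; dissimilar pairs pushed past a margin.
    pair_loss = (S * D + (1 - S) * np.maximum(0.0, 4.0 - D)).sum() / (n * n)
    # Quantization term: distance from relaxed outputs to their sign codes.
    quant_loss = np.square(U - np.sign(U)).sum() / n
    return pair_loss + lam * quant_loss

# Toy check: already-binary (+/-1) outputs consistent with S give zero loss.
U = np.array([[1.0, 1.0], [1.0, 1.0], [-1.0, -1.0]])
S = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 1]], dtype=float)
print(binary_constraint_loss(U, S))   # → 0.0
```

In an end-to-end setting this objective would be minimized by backpropagation, so that the network outputs approach discrete codes as the abstract describes.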

Description

Technical Field

[0001] The invention relates to a multi-scale hash retrieval method based on deep learning, belonging to the technical field of image and multimedia signal processing.

Background

[0002] In recent years, the explosive growth of the number of images on the Internet has made fast and efficient image retrieval increasingly important. Among the many retrieval techniques, hash-based retrieval balances efficiency and accuracy, achieves good results, and has therefore received extensive attention.

[0003] A hash-based retrieval method represents each picture with a binary code that approximately preserves the neighbor relationships of the original picture space. A similarity criterion is used to calculate the similarity between the query picture and each image in the image feature library, and the retrieval results under a given threshold are output after sorting according to the simil...
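As a concrete illustration of why binary codes make retrieval fast (this is general background, not code from the patent): when each code is packed into a single integer, the Hamming distance reduces to one XOR followed by a bit count.

```python
def pack_bits(bits):
    """Pack a list of 0/1 bits into a single integer code."""
    code = 0
    for b in bits:
        code = (code << 1) | b
    return code

def hamming(a: int, b: int) -> int:
    """Hamming distance between two packed codes: XOR, then count set bits."""
    return bin(a ^ b).count("1")

q = pack_bits([0, 1, 1, 0, 1, 0, 0, 1])
x = pack_bits([0, 1, 1, 0, 1, 0, 1, 1])
print(hamming(q, x))   # → 1
```

XOR leaves a 1 exactly at the bit positions where the two codes disagree, so counting the set bits gives the number of differing bits; this is why comparing binary codes is far cheaper than comparing real-valued feature vectors.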

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/51; G06F16/53; G06K9/62; G06N3/04; G06N3/08
Inventors: 刘琚, 顾凌晨, 刘晓玺, 孙建德
Owner SHANDONG UNIV