
Compact visual descriptor deep neural network generation model in visual retrieval

A deep neural network and generative model technology, applied in the field of compact visual descriptor generation with deep neural networks, which addresses problems such as the inability of existing descriptors to adapt to application scenarios with limited computing and storage resources.

Publication Date: 2018-11-30 (Status: Inactive)
XIAMEN UNIV


Problems solved by technology

[0004] The purpose of the present invention is to solve the problem that the existing Fisher vector, being an ultra-high-dimensional real-valued vector, cannot adapt to application scenarios with limited computing and storage resources. To cope with large-scale image search and to overcome the retrieval problems caused by its computation and storage overhead, the invention provides a compact visual descriptor deep neural network generation model in visual retrieval.


Examples


Embodiment Construction

[0052] The following embodiments will describe the present invention in detail with reference to the accompanying drawings.

[0053] The present invention comprises the following steps:

[0054] 1) For the images in the image library, randomly select a part of the images as the training set, and extract the corresponding image local features;

[0055] 2) For the training set, randomly combine the image local feature sets into pairs for offline training of the deep neural network model;

[0056] 3) Train the deep neural network model with the backpropagation algorithm;

[0057] The specific method of training the deep neural network model with the backpropagation algorithm is as follows (see the sketch after this list):

[0058] a) For each batch in the set of image local feature pairs:

[0059] b) Input each batch of local feature sets to the deep neural network model, and compute the gradients of all model parameters using the backpropagation algorithm;

[0060] c) Update model parameters;

[0061] d) exit th...
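As a hedged illustration of steps [0054] to [0061], the sketch below shows a minimal offline training loop in a PyTorch style. The model FisherBinaryNet (standing in for the Fisher layer plus the grouping and secondary classification module), the contrastive-style max-margin loss, and the feature_pairs loader are assumptions made for illustration, not the patent's exact formulation.

```python
import torch

def train_offline(model, feature_pairs, epochs=10, lr=1e-3, margin=1.0):
    """Offline training of the deep neural network model (steps a-d).

    model:         a torch.nn.Module mapping a batch of local-feature sets
                   to global codes (hypothetical FisherBinaryNet).
    feature_pairs: iterable of (feats_a, feats_b, label) batches, where
                   label is 1 for a matching image pair and 0 otherwise.
    """
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for feats_a, feats_b, label in feature_pairs:    # a) each batch of local-feature set pairs
            code_a = model(feats_a)                       # forward pass: aggregate and embed
            code_b = model(feats_b)
            dist = torch.norm(code_a - code_b, dim=-1)
            # contrastive max-margin loss: pull matching pairs together,
            # push non-matching pairs at least `margin` apart
            loss = (label * dist.pow(2)
                    + (1 - label) * torch.clamp(margin - dist, min=0).pow(2)).mean()
            optimizer.zero_grad()
            loss.backward()                               # b) gradients of all parameters via backpropagation
            optimizer.step()                              # c) update model parameters
    return model                                          # d) exit training after the final epoch
```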



Abstract

The invention discloses a compact visual descriptor deep neural network generation model in visual retrieval, and relates to image retrieval. The model is implemented by the following steps: constructing a Fisher layer network; constructing a grouping and secondary classification module; and training a loss function based on a maximum margin condition. For the image library images and a query image, local features are first extracted; the trained network structure aggregates and binary-embeds the local features to obtain binary codes of the images; matching is performed in the image library according to the binary code of the query image, and images with high similarity are returned as a candidate set; the candidate set is then checked for geometric consistency using the local features for accurate matching, and the final query result is returned. In this model, the flexible Fisher network aggregates the local features of images to generate more efficient global Fisher vector features, and the grouping and secondary classification module binary-codes the Fisher vectors to obtain compact global binary features.
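To make the matching stage of the abstract concrete, the following is a minimal sketch of ranking library images by Hamming distance between binary codes, assuming the codes have already been produced by the trained network. The function name hamming_rank, the 0/1 code layout, and top_k are illustrative assumptions; the geometric consistency re-ranking on local features is not shown.

```python
import numpy as np

def hamming_rank(query_code, db_codes, top_k=100):
    """Return indices of the top_k library images closest to the query.

    query_code: (n_bits,) array of 0/1 values from the trained network.
    db_codes:   (n_images, n_bits) array of 0/1 values for the image library.
    The returned candidate set would then be refined by a geometric
    consistency check on the images' local features.
    """
    dists = np.count_nonzero(db_codes != query_code, axis=1)  # Hamming distance to each library code
    return np.argsort(dists)[:top_k]
```

For example, with 256-bit codes, hamming_rank(code_q, codes_db, top_k=50) would return the 50 nearest candidates to pass on to the re-ranking step.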

Description

Technical field

[0001] The invention relates to image retrieval, in particular to a compact visual descriptor deep neural network generation model in visual retrieval based on a Fisher network and binary embedding.

Background technique

[0002] With the rapid development of the Internet, multimedia data on the network is increasing geometrically, and image and video data are growing particularly rapidly. According to statistics, about 5 h of video content is uploaded to YouTube every second; according to Cisco's 2015 survey, by 2017 about 80% of Internet traffic will be video. The existing volume and growth rate of this data have far exceeded the processing capacity of current technology. Faced with such a rapid growth rate of data volume, making good use of these data raises several problems that need to be solved, namely how to use storage space more effectivel...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F17/30G06N3/08
CPCG06N3/08
Inventor 纪荣嵘林贤明钱剑强施明辉
Owner XIAMEN UNIV