Attention and generative adversarial network-based optical remote sensing image retrieval method

An optical remote sensing image retrieval technology using attention, applied in still image data retrieval, neural learning methods, biological neural network models, etc. It addresses the problems of large time consumption and quantization errors that affect retrieval accuracy, with the effect of reducing the quantization error and improving retrieval precision.

Active Publication Date: 2020-03-27
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, this method has a disadvantage: it relies solely on the bag-of-words model to encode image features into a feature vector, and using that feature vector to retrieve among a large number of optical remote sensing images is very time-consuming.
This method also has a shortcoming: because the hash function is discrete, a quantization error arises when mapping the deep image features to hash codes. The method provides no effective mechanism to reduce this error, which degrades the final retrieval precision.
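The quantization error mentioned above can be made concrete with a short numpy sketch (the feature values, dimensions, and the tanh relaxation are illustrative assumptions, not taken from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical continuous deep feature, squashed into (-1, 1) by tanh
# (a common relaxation in hashing networks; illustrative, not the patent's).
u = np.tanh(rng.normal(size=48))

# Discretize to a binary hash code with sign(); this step is non-differentiable.
b = np.sign(u)

# Quantization error: distance between the relaxed code and its binary version.
quant_error = np.linalg.norm(b - u)
print(round(float(quant_error), 4))
```

Because `sign()` snaps every coordinate to ±1, the error is strictly positive unless the network already outputs exactly binary values, which is why hashing methods add a mechanism to push `u` toward ±1 during training.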



Examples


Embodiment Construction

[0062] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0063] With reference to Figure 1, the steps of the present invention are described in further detail.

[0064] Step 1, construct a deep convolutional network.

[0065] Build an 11-layer deep convolutional network with the following structure: input layer → first convolutional layer → first pooling layer → second convolutional layer → second pooling layer → third convolutional layer → fourth convolutional layer → fifth convolutional layer → first fusion layer. The third convolutional layer is connected to the first fusion layer through the first residual layer, and the fourth convolutional layer is connected to the first fusion layer through the second residual layer.
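The layer ordering and the two skip connections described in paragraph [0065] can be sanity-checked with a small structural sketch. The layer names are illustrative labels for the patent's layers; channel counts are omitted because the excerpt is truncated before they are given:

```python
# Backbone sequence from paragraph [0065] (labels are illustrative).
backbone = [
    "input", "conv1", "pool1", "conv2", "pool2",
    "conv3", "conv4", "conv5", "fusion1",
]

# Skip connections feeding the first fusion layer: conv3 via the first
# residual layer, conv4 via the second residual layer.
residual_edges = {
    "residual1": ("conv3", "fusion1"),
    "residual2": ("conv4", "fusion1"),
}

def fusion_inputs(backbone, residual_edges):
    """Return every layer whose output reaches the fusion layer."""
    direct = backbone[backbone.index("fusion1") - 1]  # immediate predecessor
    skips = [src for src, dst in residual_edges.values() if dst == "fusion1"]
    return [direct, *skips]

print(fusion_inputs(backbone, residual_edges))  # → ['conv5', 'conv3', 'conv4']
```

The fusion layer therefore aggregates features from three depths (conv3, conv4, conv5), a multi-scale design consistent with the residual connections the paragraph describes.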

[0066] Set the parameters of each layer as follows:

[0067] Set the total number of input layer feature maps to 3.

[0068] The total number of feature maps of the fir...



Abstract

The invention discloses an attention and generative adversarial network-based optical remote sensing image retrieval method, and mainly solves the problem of low optical remote sensing image retrieval precision in the prior art. The method comprises the following specific steps: (1) constructing a deep convolutional network; (2) constructing an attention network; (3) constructing a generative adversarial network; (4) constructing a hash learning network; (5) training the network; (6) acquiring a hash coding vector for each optical remote sensing image; and (7) retrieving an optical remote sensing image. The attention network extracts discriminative features of the image, improving the expressive power of image features; the generative adversarial network extracts the image hash coding vector and reduces quantization errors; and finally, the retrieval precision of optical remote sensing images is improved.
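Step (7), retrieving with hash coding vectors, typically means ranking database images by Hamming distance to the query's code. A minimal numpy sketch under that assumption, with deterministic stand-in codes (the patent's actual codes come from its hash learning network):

```python
import numpy as np

# Hypothetical 32-bit hash codes for six database images: image i has its
# first 4*i bits set (deterministic stand-ins, not the patent's codes).
db_codes = np.array(
    [[1 if j < 4 * i else 0 for j in range(32)] for i in range(6)],
    dtype=np.uint8,
)

# Query: image 3's code with one bit flipped, so it is near but not identical.
query = db_codes[3].copy()
query[0] ^= 1

# Hamming distance = number of differing bit positions; smaller is more similar.
dists = np.count_nonzero(db_codes != query, axis=1)
ranking = np.argsort(dists, kind="stable")
print(int(ranking[0]), int(dists[ranking[0]]))  # → 3 1
```

Because distances are bit counts, they can be computed with XOR and popcount over packed codes, which is what makes hash-based retrieval far cheaper than comparing real-valued feature vectors.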

Description

technical field [0001] The invention belongs to the technical field of image processing, and further relates to an optical remote sensing image retrieval method based on an attention and generative adversarial network within the field of optical remote sensing image retrieval. The invention can quickly and accurately find the images a user is interested in among a large volume of optical remote sensing images.

background technique [0002] With the development of satellite and aerial remote sensing technology, the data volume and resolution of remote sensing images keep increasing, and more useful data and information can be obtained from them. Different applications place different requirements on the processing of remote sensing images; therefore, in order to effectively analyze and manage these remote sensing image data, it is necessary to quickly query and retrieve interesting images from the massive remot...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F16/583, G06F16/55, G06F16/532, G06N3/04, G06N3/08
CPC: G06F16/583, G06F16/55, G06F16/532, G06N3/08, G06N3/045
Inventor: 刘超, 马晶晶, 唐旭, 焦李成
Owner XIDIAN UNIV