
Optical Remote Sensing Image Retrieval Method Based on Attention and Generative Adversarial Networks

A technology involving optical remote sensing images and attention, applied to still-image data retrieval, neural learning methods, biological neural network models, and related fields. It addresses the problems of large quantization error, high time consumption, and reduced retrieval accuracy, thereby reducing quantization error and improving retrieval accuracy.

Active Publication Date: 2022-03-22
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, this method has a drawback: it relies solely on the bag-of-words model to encode image features into a feature vector, and using this feature vector to retrieve from a large collection of optical remote sensing images is very time-consuming.
However, this method still has a shortcoming: because the hash function is discrete, there is a quantization error between an image's deep features and its hash code. The method provides no effective mechanism to reduce this quantization error, which degrades the final retrieval precision.
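The quantization error mentioned above can be illustrated with a short, hypothetical sketch (not from the patent): a continuous deep feature vector is binarized with the sign function, and the gap between the continuous output and the resulting discrete code is the error that deep-hashing methods try to minimize.

```python
import numpy as np

rng = np.random.default_rng(0)
feature = rng.uniform(-1.0, 1.0, size=8)   # continuous network output in [-1, 1]

# Discrete hash code in {-1, +1}, obtained by thresholding at zero.
hash_code = np.where(feature >= 0, 1.0, -1.0)

# Quantization error: squared L2 distance between the continuous feature
# and its binarized code (a common formulation in deep hashing; the
# patent's exact loss is not given in this excerpt).
quant_error = np.sum((feature - hash_code) ** 2)
print(hash_code, quant_error)
```

A network whose outputs are pushed toward ±1 during training makes this distance small, which is exactly the effect the patent attributes to its adversarial component.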

Method used



Examples


Embodiment Construction

[0062] The present invention is described in further detail below with reference to the accompanying drawings.

[0063] With reference to Figure 1, the steps of the present invention are further described in detail.

[0064] Step 1, construct a deep convolutional network.

[0065] Build an 11-layer deep convolutional network with the following structure: input layer → first convolutional layer → first pooling layer → second convolutional layer → second pooling layer → third convolutional layer → fourth convolutional layer → fifth convolutional layer → first fusion layer. The third convolutional layer is connected to the first fusion layer through the first residual layer, and the fourth convolutional layer is connected to the first fusion layer through the second residual layer.

[0066] Set the parameters of each layer as follows:

[0067] Set the total number of input layer feature maps to 3.

[0068] The total number of feature maps of the fir...
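The layer sequence in [0065] can be sketched in PyTorch as follows. Only the input channel count (3) is stated in the text; every other channel count, the kernel sizes, the use of 1×1 convolutions as the residual layers, and the element-wise-sum fusion are illustrative assumptions, not the patent's actual parameters.

```python
import torch
import torch.nn as nn

class DeepConvNet(nn.Module):
    """Hedged sketch of the 11-layer network from paragraph [0065]."""

    def __init__(self):
        super().__init__()
        # Main path: conv1 -> pool1 -> conv2 -> pool2 -> conv3 -> conv4 -> conv5.
        # Input has 3 feature maps (stated); other widths are assumed.
        self.conv1 = nn.Conv2d(3, 64, kernel_size=3, padding=1)
        self.pool1 = nn.MaxPool2d(2)
        self.conv2 = nn.Conv2d(64, 128, kernel_size=3, padding=1)
        self.pool2 = nn.MaxPool2d(2)
        self.conv3 = nn.Conv2d(128, 256, kernel_size=3, padding=1)
        self.conv4 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        self.conv5 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        # Residual layers carrying conv3/conv4 outputs to the fusion layer;
        # 1x1 convolutions are assumed here to keep channel counts aligned.
        self.res1 = nn.Conv2d(256, 256, kernel_size=1)
        self.res2 = nn.Conv2d(256, 256, kernel_size=1)

    def forward(self, x):
        x = self.pool1(torch.relu(self.conv1(x)))
        x = self.pool2(torch.relu(self.conv2(x)))
        c3 = torch.relu(self.conv3(x))
        c4 = torch.relu(self.conv4(c3))
        c5 = torch.relu(self.conv5(c4))
        # First fusion layer: assumed element-wise sum of the main path
        # and the two residual branches from conv3 and conv4.
        return c5 + self.res1(c3) + self.res2(c4)

out = DeepConvNet()(torch.zeros(1, 3, 32, 32))
```

With a 32×32 input, the two pooling layers halve the spatial size twice, so the fused output here has shape (1, 256, 8, 8) under the assumed widths.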



Abstract

The invention discloses an optical remote sensing image retrieval method based on attention and generative adversarial networks, which mainly addresses the low retrieval accuracy of optical remote sensing images in the prior art. The specific steps of the invention are as follows: (1) construct a deep convolutional network; (2) construct an attention network; (3) construct a generative adversarial network; (4) construct a hash learning network; (5) train the networks; (6) generate the hash code vector of each optical remote sensing image; (7) retrieve optical remote sensing images. The invention constructs an attention network to extract discriminative image features and improve their expressive power, and a generative adversarial network to extract image hash code vectors and reduce quantization error, ultimately improving the retrieval accuracy of optical remote sensing images.
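The retrieval step at the end of the abstract can be sketched with a small, hypothetical example (not the patent's code): once every database image has a hash-code vector, retrieval ranks the database by Hamming distance to the query's code.

```python
import numpy as np

def hamming_distance(a, b):
    """Number of differing bits between two {-1, +1} code vectors."""
    return int(np.sum(a != b))

rng = np.random.default_rng(1)
# Assumed setup: 5 database images, each with a 16-bit hash code.
database = np.sign(rng.standard_normal((5, 16)))
query = database[2].copy()  # query code identical to image 2's code

# Rank database images by ascending Hamming distance to the query;
# the best match (distance 0) comes first.
ranking = sorted(range(len(database)),
                 key=lambda i: hamming_distance(query, database[i]))
print(ranking)
```

Because Hamming distance on short binary codes is far cheaper than comparing dense feature vectors, this is what makes hash-based retrieval fast on large image collections.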

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and further relates to an optical remote sensing image retrieval method based on attention and generative adversarial networks within the field of optical remote sensing image retrieval. The invention can quickly and accurately find the images a user is interested in among large volumes of optical remote sensing images.

Background Technique

[0002] With the development of satellite and aerial remote sensing technology, the data volume and resolution of remote sensing images keep increasing, and more useful data and information can be obtained from them. Different applications place different requirements on the processing of remote sensing images, so in order to effectively analyze and manage these image data, it is necessary to quickly query and retrieve images of interest from the massive remot...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC (8): G06F16/583; G06F16/55; G06F16/532; G06N3/04; G06N3/08
CPC: G06F16/583; G06F16/55; G06F16/532; G06N3/08; G06N3/045
Inventors: 刘超, 马晶晶, 唐旭, 焦李成
Owner: XIDIAN UNIV