Feature pyramid-based remote-sensing image time-sensitive target recognition system and method

A feature pyramid and remote sensing image technology, applied in the field of remote sensing image processing, which addresses the problems of poor multi-scale target detection, large network scale with many parameters, and high demand for computing resources, with the effect of reducing the number of parameters.

Active Publication Date: 2018-11-06
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] Existing deep learning target detection models usually use only a single-scale, top-level feature map. Although the semantic information of these top-level features is strong, their position information is weak, so detection of multi-scale targets is poor. At the same time, existing networks are large in scale, have many parameters, and place a high demand on computing resources.

Method used



Examples


Embodiment 1

[0047] The present invention provides a feature pyramid-based remote-sensing image time-sensitive target recognition system. The recognition system includes a target feature extraction sub-network, a feature layer sub-network, a candidate region generation sub-network, and a classification and regression sub-network. The target feature extraction sub-network has multiple hierarchical output terminals; the classification and regression sub-network has multiple hierarchical input terminals and RPN input terminals; and the feature layer sub-network and the candidate region generation sub-network each have multiple hierarchical input terminals and multiple hierarchical output terminals. Each hierarchical output terminal of the target feature extraction sub-network is connected to a hierarchical input terminal of the feature layer sub-network, and each hierarchical output terminal of the feature layer sub-network is connected to a hierarchical input terminal of the candidate region generation sub-network ...
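A minimal PyTorch-style sketch of how the four sub-networks could be wired, assuming a small convolutional backbone; the class names, channel widths, and anchor counts are illustrative assumptions, not values taken from the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetFeatureExtraction(nn.Module):
    """Multi-layer convolution; each stage's output is one feature layer."""
    def __init__(self, widths=(64, 128, 256, 512)):
        super().__init__()
        stages, in_ch = [], 3
        for w in widths:
            stages.append(nn.Sequential(
                nn.Conv2d(in_ch, w, 3, stride=2, padding=1), nn.ReLU()))
            in_ch = w
        self.stages = nn.ModuleList(stages)

    def forward(self, x):
        feats = []
        for stage in self.stages:
            x = stage(x)
            feats.append(x)
        return feats  # hierarchical outputs, shallow -> deep

class FeatureLayerSubNetwork(nn.Module):
    """Top-down fusion: each level is superimposed with the level above it."""
    def __init__(self, widths=(64, 128, 256, 512), out_ch=256):
        super().__init__()
        self.lateral = nn.ModuleList(nn.Conv2d(w, out_ch, 1) for w in widths)

    def forward(self, feats):
        fused = [self.lateral[-1](feats[-1])]  # topmost fused layer
        for i in range(len(feats) - 2, -1, -1):
            upper = F.interpolate(fused[0], size=feats[i].shape[-2:], mode="nearest")
            fused.insert(0, self.lateral[i](feats[i]) + upper)
        return fused

class CandidateRegionSubNetwork(nn.Module):
    """RPN-style head applied to every fused feature layer."""
    def __init__(self, ch=256, num_anchors=3):
        super().__init__()
        self.conv = nn.Conv2d(ch, ch, 3, padding=1)
        self.obj = nn.Conv2d(ch, num_anchors, 1)       # objectness per anchor
        self.reg = nn.Conv2d(ch, num_anchors * 4, 1)   # box deltas per anchor

    def forward(self, fused):
        outputs = []
        for f in fused:
            h = F.relu(self.conv(f))
            outputs.append((self.obj(h), self.reg(h)))
        return outputs

backbone = TargetFeatureExtraction()
feature_net = FeatureLayerSubNetwork()
rpn = CandidateRegionSubNetwork()

image = torch.randn(1, 3, 512, 512)
feature_layers = backbone(image)            # target feature extraction sub-network
fused_layers = feature_net(feature_layers)  # feature layer sub-network
proposals = rpn(fused_layers)               # candidate region generation sub-network
# A classification and regression sub-network would then pool each candidate
# region from the fused layers and output class scores and refined boxes.
```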

Embodiment 2

[0050] On the basis of Embodiment 1, the feature layer sub-network includes a plurality of feature layer sub-modules, denoted as the first feature layer sub-module, the second feature layer sub-module, ..., the i-th feature layer sub-module, ..., and the N-th feature layer sub-module. One input terminal of the i-th feature layer sub-module serves as a hierarchical input terminal of the feature layer sub-network, and the other input terminal of the i-th feature layer sub-module is connected to the output terminal of the (i+1)-th feature layer sub-module; the input terminal of the N-th feature layer sub-module serves as a hierarchical input terminal of the feature layer sub-network, with 1 ≤ i ≤ N-1. The first N-1 feature layer sub-modules superimpose the upper feature layer onto the current feature layer to obtain the current fused feature layer; the N-th feature layer sub-module outputs the current feature layer as the current fused feature layer, and the previous feature layer is ...
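The chaining of the N feature-layer sub-modules can be sketched as follows, assuming PyTorch tensors ordered shallow to deep; `fuse` is an illustrative stand-in for any of the first N-1 sub-modules (their internals are detailed under Embodiment 3), and the nearest-neighbour upsampling in the example fuse step is an assumption.

```python
import torch
import torch.nn.functional as F

def feature_layer_subnetwork(feature_layers, fuse):
    """feature_layers[0..N-1]: outputs of the target feature extraction sub-network.
    fuse(current, upper_fused): the superposition performed by sub-modules 1..N-1.
    """
    n = len(feature_layers)
    fused = [None] * n
    fused[n - 1] = feature_layers[n - 1]   # N-th sub-module: top layer passes through
    for i in range(n - 2, -1, -1):         # i-th sub-module, i = N-1 .. 1 (1-based)
        fused[i] = fuse(feature_layers[i], fused[i + 1])
    return fused

# Example with a trivial fuse: upsample the upper fused layer and superimpose it.
def naive_fuse(current, upper_fused):
    upper = F.interpolate(upper_fused, size=current.shape[-2:], mode="nearest")
    return current + upper

layers = [torch.randn(1, 256, s, s) for s in (64, 32, 16, 8)]
fused = feature_layer_subnetwork(layers, naive_fuse)
```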

Embodiment 3

[0052] As shown in Figure 2, on the basis of Embodiment 2, each of the first N-1 feature layer sub-modules includes an upper-layer processing subunit, a current processing subunit, and a superposition unit. The output terminal of the upper-layer processing subunit is connected to the first input terminal of the superposition unit, and the output terminal of the current processing subunit is connected to the second input terminal of the superposition unit. The upper-layer processing subunit performs upsampling on the upper feature layer and outputs the processed upper feature layer T2; the current processing subunit performs a 1×1 convolution on the current feature layer and outputs the processed current feature layer S2; and the superposition unit superimposes the processed upper feature layer and the processed current feature layer ...
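Under the same assumptions, a single feature-layer sub-module with its three subunits might look as follows; the channel widths and the choice of nearest-neighbour upsampling are illustrative, while T2 and S2 follow the labels used in this embodiment.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureLayerSubModule(nn.Module):
    def __init__(self, current_channels, fused_channels=256):
        super().__init__()
        # current processing subunit: 1x1 convolution on the current feature layer
        self.current_conv = nn.Conv2d(current_channels, fused_channels, kernel_size=1)

    def forward(self, current_layer, upper_fused_layer):
        # upper-layer processing subunit: upsample the upper fused layer -> T2
        t2 = F.interpolate(upper_fused_layer, size=current_layer.shape[-2:],
                           mode="nearest")
        # current processing subunit: 1x1 convolution -> S2
        s2 = self.current_conv(current_layer)
        # superposition unit: element-wise addition of T2 and S2
        return t2 + s2

# Usage: fuse a 256-channel upper fused layer into a 512-channel current layer.
module = FeatureLayerSubModule(current_channels=512, fused_channels=256)
current = torch.randn(1, 512, 32, 32)
upper = torch.randn(1, 256, 16, 16)
fused = module(current, upper)  # shape: (1, 256, 32, 32)
```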



Abstract

The invention discloses a feature pyramid-based remote-sensing image time-sensitive target recognition system and method. The system comprises a target feature extraction sub-network, a feature layer sub-network, a candidate region generation sub-network, and a classification and regression sub-network. The target feature extraction sub-network performs multiple layers of convolution processing on a to-be-processed image and outputs the convolution result of each layer as a feature layer. The feature layer sub-network superimposes the upper feature layer onto the current feature layer to obtain the current fused feature layer, the topmost fused feature layer being the topmost feature layer itself. The candidate region generation sub-network extracts candidate regions from the fused feature layers at different levels. The classification and regression sub-network maps the candidate regions onto the fused feature layers at different levels to obtain a plurality of mapped fused feature layers, performs target judgement on them, and outputs the result. The system and method exploit the hierarchical structure of the feature pyramid so that features at every scale carry rich semantic information.
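The abstract states that candidate regions are mapped onto fused feature layers at different levels but this excerpt does not give the mapping rule; a common choice for such a mapping is the feature-pyramid level-assignment heuristic sketched below, shown here purely as an assumption rather than as the patent's own rule.

```python
import math

def assign_pyramid_level(box_w, box_h, k0=4, canonical_size=224,
                         min_level=2, max_level=5):
    """Map a candidate region of size (box_w, box_h) pixels to a pyramid level."""
    k = k0 + math.log2(math.sqrt(box_w * box_h) / canonical_size)
    return int(min(max_level, max(min_level, math.floor(k))))

# Small boxes map to fine (low) levels, large boxes to coarse (high) levels.
print(assign_pyramid_level(32, 32))    # -> 2
print(assign_pyramid_level(448, 448))  # -> 5
```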

Description

Technical field

[0001] The invention belongs to the field of remote sensing image processing, and more specifically relates to a feature pyramid-based time-sensitive target recognition system and method for remote sensing images.

Background technique

[0002] Moving-target detection in large-format remote sensing images is an important component of remote sensing image analysis and is extremely challenging because of the multi-scale targets, wide image format, and special viewing angles involved. Its basic task is, for a given aerial or satellite image, to determine whether it contains objects of one or more target categories and to locate each such object precisely. As an important part of computer vision and remote sensing image analysis, time-sensitive target detection in large-format remote sensing images is of high research significance. With the rapid development of contemporary remote sensing technology, sensor technology and Internet technology, people ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06N3/04
CPC: G06V20/13; G06N3/045
Inventor: 杨卫东, 金俊波, 王祯瑞, 习思, 黄竞辉, 钟胜, 陈俊
Owner: HUAZHONG UNIV OF SCI & TECH