
Remote sensing target detection method based on boundary constraint CenterNet

A boundary constraint and target detection technology, applied to target detection in optical remote sensing images and based on boundary constraint CenterNet, in the field of image target detection. It addresses the problems of low detection accuracy and low recall rate for dense small targets, and achieves the effect of improving detection accuracy and recall rate.

Active Publication Date: 2019-12-03
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to address the deficiencies of the above-mentioned prior art and propose a target detection method based on boundary constraint CenterNet, which is used to solve the technical problems of low detection accuracy and low recall rate for dense small targets in the prior art.


Examples


Embodiment Construction

[0029] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0030] Referring to Figure 1, the implementation steps of the present invention are as follows:

[0031] Step 1) Get the training sample set:

[0032] Randomly select N images with a pixel size of W×H×c from the optical remote sensing image data set as the training sample set, where N=10000, W=H=511, and c=3;
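
A minimal sketch of this sampling step under stated assumptions: the data set is taken to be a directory of PNG files readable with Pillow, and images that do not match the expected 511×511×3 shape are skipped. None of these details are specified by the patent.

```python
# Hypothetical implementation of Step 1: randomly select N training images of size
# W x H x c from an optical remote sensing image directory. The directory layout,
# file format, and loading libraries (Pillow, NumPy) are assumptions.
import random
from pathlib import Path

import numpy as np
from PIL import Image

N, W, H, C = 10000, 511, 511, 3


def build_training_set(dataset_dir: str, n: int = N) -> list:
    """Randomly pick n images and keep only those with the expected shape."""
    paths = sorted(Path(dataset_dir).glob("*.png"))
    chosen = random.sample(paths, k=n)          # sampling without replacement
    samples = []
    for p in chosen:
        img = np.asarray(Image.open(p).convert("RGB"))
        if img.shape == (H, W, C):              # pixel size W x H x c = 511 x 511 x 3
            samples.append(img)
    return samples
```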

[0033] Step 2) Construct the boundary constraint CenterNet network:

[0034] (2a) Build a feature extraction network, a boundary-constrained convolutional network, and a keypoint generation network, where:

[0035] The feature extraction network includes a first input layer, a first downsampling convolutional layer, a first convolutional layer, a second downsampling convolutional layer, a second convolutional layer, a third downsampling convolutional layer, a fourth downsampling convolutional layer, a fifth down...
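
The description is cut off after the fifth downsampling layer, so the following is only a hedged sketch (in PyTorch, an assumed framework) of the layer ordering named so far; the input layer is taken to be the input tensor itself, and kernel sizes, strides, channel widths, and everything beyond the truncation are assumptions rather than the patented architecture.

```python
# Hedged sketch of the feature extraction network in (2a), reproducing only the layer
# order named in the text: a "downsampling convolutional layer" is taken to be a
# stride-2 3x3 convolution and a "convolutional layer" a stride-1 3x3 convolution;
# channel widths, normalization, and activations are assumptions.
import torch
import torch.nn as nn


class FeatureExtractor(nn.Module):
    def __init__(self, in_channels: int = 3, width: int = 64):
        super().__init__()

        def down(cin, cout):  # assumed "downsampling convolutional layer": stride 2
            return nn.Sequential(nn.Conv2d(cin, cout, 3, stride=2, padding=1),
                                 nn.BatchNorm2d(cout), nn.ReLU(inplace=True))

        def conv(cin, cout):  # assumed "convolutional layer": stride 1
            return nn.Sequential(nn.Conv2d(cin, cout, 3, stride=1, padding=1),
                                 nn.BatchNorm2d(cout), nn.ReLU(inplace=True))

        self.body = nn.Sequential(
            down(in_channels, width),        # first downsampling convolutional layer
            conv(width, width),              # first convolutional layer
            down(width, 2 * width),          # second downsampling convolutional layer
            conv(2 * width, 2 * width),      # second convolutional layer
            down(2 * width, 4 * width),      # third downsampling convolutional layer
            down(4 * width, 4 * width),      # fourth downsampling convolutional layer
            # the description is truncated after "fifth down..."; later layers are
            # omitted rather than guessed.
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)
```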

Abstract

The invention provides a remote sensing target detection method based on boundary constraint CenterNet, which solves the technical problems of relatively low detection precision and recall rate for dense small targets in the prior art. The method comprises the following implementation steps: obtaining a training sample set; constructing a boundary constraint CenterNet network; obtaining the prediction labels and embedding vectors of the training sample set; calculating the loss of the boundary constraint CenterNet network; training the boundary constraint CenterNet network; and obtaining the target detection result with the trained boundary constraint CenterNet network. By performing maximum pooling within the constrained pooling area through the corner constraint pooling layer, fine features around the target are extracted, effectively improving the detection precision and recall rate of dense small targets. Meanwhile, the boundary constraint labels generated by the boundary constraint convolutional network are used to constrain the prediction boxes, yielding more accurate target prediction boxes and further improving detection precision.
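
The abstract characterizes the corner constraint pooling layer only as maximum pooling within a constrained pooling area around the target. One plausible reading, sketched below in PyTorch, is CornerNet-style directional max pooling restricted to a bounded window instead of the full row or column; the window size `radius` and the exact constraint rule are assumptions not given in this text.

```python
# Hedged sketch of a "corner constraint pooling" operation: directional max pooling
# in the spirit of CornerNet-style corner pooling, but limited to a window of
# `radius` positions instead of running to the image border. `radius` and the
# constraint rule are assumptions; the patent text shown here does not specify them.
import torch


def constrained_horizontal_pool(feat: torch.Tensor, radius: int = 8) -> torch.Tensor:
    """For each position, take the max over itself and the next `radius` columns
    to its right.  feat: (B, C, H, W); returns a tensor of the same shape."""
    _, _, _, w = feat.shape
    pooled = feat.clone()
    for offset in range(1, radius + 1):
        shifted = torch.full_like(feat, float("-inf"))
        shifted[..., : w - offset] = feat[..., offset:]   # shift columns left by `offset`
        pooled = torch.maximum(pooled, shifted)
    return pooled
```

A top-left corner response could then combine this column-wise pool with an analogous row-wise pool, mirroring standard corner pooling but over the constrained area only.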

Description

Technical Field

[0001] The invention belongs to the technical field of machine vision and relates to an image target detection method, in particular to a boundary constraint CenterNet-based target detection method, which can be used for target detection in optical remote sensing images.

Background Technique

[0002] Target detection is one of the core research topics in machine vision. It is a technique that extracts and processes image features to regress and classify all targets of interest in an image, determining their positions and categories, and it is widely used for object detection in optical remote sensing images. The technical indicators of a target detection method include detection accuracy, recall rate, and detection speed. In remote sensing images, limited image resolution produces a large number of dense small targets; their small proportion of the whole image makes it difficult to accurately detect the existence of d...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V20/13; G06V20/53; G06V2201/07; G06F18/214
Inventors: 冯婕, 曾德宁, 李迪, 焦李成, 张向荣, 曹向海, 刘若辰, 尚荣华
Owner: XIDIAN UNIV