
Complex battlefield environment target efficient identification method based on improved Faster R-CNN

A target recognition method for battlefield environments, in the field of deep learning and image recognition, that solves the problems of feature loss and low feature-map resolution, and achieves the effects of improved accuracy and improved network performance.

Pending Publication Date: 2021-02-26
DALIAN JIAOTONG UNIVERSITY

AI Technical Summary

Problems solved by technology

The region proposal network must generate candidate regions. In the original convolutional network, the feature map output by the last convolutional layer of the ZF or VGG network is used directly to predict foreground locations. Because this feature map has low resolution, the features of small or occluded targets are easily lost; for example, at VGG16's feature stride of 16, a target of around 16×16 pixels occupies roughly a single feature-map cell.



Examples


Embodiment

[0050] S1: Build a two-way feature extraction network, specifically:

[0051] Because the candidate region generation network and the classification-regression network share the same feature extraction network, feature interference easily arises; the feature extraction process is therefore redesigned.

[0052] An independent feature extraction network is therefore set up for each of the candidate region generation network and the classification-regression network, and the residual network ResNet, which has fewer parameters, is used in place of the VGG16 network. As a result, the features learned by the candidate region generation network do not enter the classification-regression network, which improves network performance. A sketch of this two-way structure is given below.
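A minimal sketch of the two-way structure described in [0052], assuming PyTorch and torchvision; the names TwoWayFeatureExtractor and make_backbone are hypothetical, and ResNet-50 stands in for whichever ResNet variant the patent intends:

```python
# Two independent ResNet backbones: one feeds the proposal branch, one
# feeds the classification-regression branch, so neither branch's
# gradients update the other's features.
import torch
import torch.nn as nn
from torchvision.models import resnet50

def make_backbone():
    # Keep all convolutional stages; drop the average pool and fc head.
    net = resnet50(weights=None)
    return nn.Sequential(*list(net.children())[:-2])

class TwoWayFeatureExtractor(nn.Module):
    def __init__(self):
        super().__init__()
        self.rpn_backbone = make_backbone()   # features for proposals only
        self.cls_backbone = make_backbone()   # features for classification/regression only

    def forward(self, images):
        rpn_feats = self.rpn_backbone(images)
        cls_feats = self.cls_backbone(images)
        return rpn_feats, cls_feats

# Both maps are (N, 2048, H/32, W/32) for a ResNet-50 on input (N, 3, H, W).
rpn_f, cls_f = TwoWayFeatureExtractor()(torch.randn(1, 3, 608, 800))
```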

[0053] S2: The battlefield environment feature map output by one feature extraction network is input into the candidate region generation network, which distinguishes the background from the targets in the battlefield environment by fusing shallow target position information with deep high-order semantic features, as sketched below.
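The patent does not spell out the fusion operator here; the following is one plausible reading, an FPN-style lateral fusion in PyTorch where the upsampled deep (semantic) map is added to the projected shallow (positional) map. The class name and channel counts (512 shallow, 2048 deep) are illustrative assumptions:

```python
# Fuse a shallow, high-resolution map (precise positions) with a deep,
# low-resolution map (high-order semantics) before the proposal head.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShallowDeepFusion(nn.Module):
    def __init__(self, shallow_ch=512, deep_ch=2048, out_ch=256):
        super().__init__()
        self.lat_shallow = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)  # project positions
        self.lat_deep = nn.Conv2d(deep_ch, out_ch, kernel_size=1)        # project semantics
        self.smooth = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, shallow, deep):
        # Upsample the deep map to the shallow map's resolution, then add,
        # so the fused map keeps both location detail and semantics.
        deep_up = F.interpolate(self.lat_deep(deep), size=shallow.shape[-2:],
                                mode="nearest")
        return self.smooth(self.lat_shallow(shallow) + deep_up)

# Hypothetical ResNet stage outputs for a 608x800 input:
fused = ShallowDeepFusion()(torch.randn(1, 512, 76, 100),
                            torch.randn(1, 2048, 19, 25))  # -> (1, 256, 76, 100)
```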



Abstract

The invention discloses an efficient method for identifying targets in a complex battlefield environment based on an improved Faster R-CNN. The method comprises the following steps: constructing two feature extraction networks; inputting the battlefield environment feature map output by one feature extraction network into a candidate region generation network, which distinguishes the background from the targets in the battlefield environment by fusing shallow target position information with deep high-order semantic features; resetting the anchor sizes and aspect ratios for the battlefield environment feature map using a K-Means clustering method; and inputting the feature map output by the other feature extraction network, together with the candidate regions output by the candidate region generation network, into a classification-regression network that classifies the targets and regresses their positions. In a complex battlefield environment, the method improves the accuracy of battlefield target identification, so that the model has better theoretical guidance significance than previous models.
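The abstract's anchor step, resetting anchor sizes and aspect ratios by K-Means over the training boxes, could look like the following NumPy sketch; the function name, the choice of k=9 (Faster R-CNN's default anchor count), and the plain Euclidean distance in (width, height) space are assumptions, not the patent's stated settings:

```python
# K-Means over ground-truth box shapes to derive anchor sizes/ratios.
import numpy as np

def kmeans_anchors(wh, k=9, iters=100, seed=0):
    """Cluster ground-truth (width, height) pairs into k anchor shapes."""
    wh = np.asarray(wh, dtype=float)
    rng = np.random.default_rng(seed)
    centers = wh[rng.choice(len(wh), size=k, replace=False)]
    for _ in range(iters):
        # Assign each box to its nearest center, then recompute centers.
        dists = np.linalg.norm(wh[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = wh[labels == j].mean(axis=0)
    return centers[np.argsort(centers.prod(axis=1))]  # sorted by box area

# Each returned row is an anchor (width, height); its aspect ratio is w / h.
# anchors = kmeans_anchors(gt_boxes_wh, k=9)   # gt_boxes_wh: (N, 2) array
```

A 1-IoU distance (as popularized by YOLO's anchor clustering) is a common alternative to the Euclidean distance used above.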

Description

Technical field

[0001] The invention belongs to the field of deep learning and image recognition, and in particular relates to an efficient method for identifying targets in a complex battlefield environment based on an improved Faster R-CNN.

Background technique

[0002] In a complex battlefield environment, effectively and accurately identifying battlefield targets is the key for clusters of unmanned equipment to achieve precise control, obstacle avoidance, and execution of attack tasks. Identifying enemy targets ensures that friendly unmanned equipment can complete reconnaissance and combat missions on the battlefield. Traditional object recognition methods mainly include Cascade+HOG, DPM+Haar, SVM, and their improved and optimized variants. The disadvantage of these methods is that features must be designed manually, which entails a huge workload, and such algorithms tend to perform poorly under large changes in target shape, complex backgrounds, or insufficient light...


Application Information

IPC(8): G06K 9/00; G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06V 20/10; G06V 2201/07; G06N 3/045; G06F 18/23213; G06F 18/24; G06F 18/253
Inventors: 王运明, 彭超亮, 初宪武
Owner: DALIAN JIAOTONG UNIVERSITY