
A Weak Target Detection Method Based on Feature Mapping Neural Network

A weak-target detection and neural-network technology, applied in the field of weak target detection, which solves the problem of low detection accuracy and achieves high detection accuracy, improved network stability, and strong robustness.

Active Publication Date: 2021-11-05
江西诚安科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The object of the present invention is to provide a weak target detection method based on a feature mapping neural network, which solves the problem of low detection accuracy caused by noise and interference affecting existing weak targets.



Examples


Embodiment 1

[0096] The training data of the network is shown in Figure 3: the small circles represent one class of samples, and the x points represent the other class. The training samples are linearly inseparable and therefore difficult to distinguish. After the deep network is trained on these samples, the outputs of the second and third layers are shown in Figure 4: once the linearly inseparable two-dimensional features are mapped into three dimensions by the neural network, they become linearly separable in the three-dimensional feature space. After the three-dimensional features are re-encoded to two dimensions, as shown in Figure 5, the originally nonlinearly inseparable low-dimensional features have been mapped into a linearly separable feature space, and the blue x-point samples are distributed in a compact region. As shown in Figure 2, the entire network contains 6 layers in total and can be divided into 2 parts. The input layer t...
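The mapping just described can be illustrated with a reduced sketch. The code below is a toy, not the patent's exact 6-layer network: the layer sizes, sigmoid activations, optimizer, and the two-ring data set are all assumptions chosen for illustration. It builds a spindle-shaped net whose decoding layer maps 2-D features up to 3-D and whose encoding layer re-encodes them back to 2-D before a softmax output, trained on linearly inseparable data.

```python
# Reduced sketch (assumed sizes and training details, not the patent's exact
# 6-layer network): a spindle-shaped net that decodes 2-D features up to 3-D
# and re-encodes them to 2-D before a softmax classifier, mirroring the
# Figure 3-5 illustration of linearly inseparable data becoming separable.
import torch
import torch.nn as nn

class SpindleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.decode = nn.Sequential(nn.Linear(2, 3), nn.Sigmoid())  # 2-D -> 3-D
        self.encode = nn.Sequential(nn.Linear(3, 2), nn.Sigmoid())  # 3-D -> 2-D
        self.classifier = nn.Linear(2, 2)   # softmax output layer (2 classes)

    def forward(self, x):
        z3 = self.decode(x)                 # 3-D features (cf. Figure 4)
        z2 = self.encode(z3)                # re-encoded 2-D features (cf. Figure 5)
        return self.classifier(z2)          # logits for softmax / cross-entropy

# Toy usage: two concentric rings are linearly inseparable in 2-D.
torch.manual_seed(0)
n = 200
theta = torch.rand(n) * 6.2832
r = torch.cat([torch.full((n // 2,), 1.0), torch.full((n // 2,), 3.0)])
X = torch.stack([r * torch.cos(theta), r * torch.sin(theta)], dim=1)
y = torch.cat([torch.zeros(n // 2, dtype=torch.long),
               torch.ones(n // 2, dtype=torch.long)])
model, loss_fn = SpindleNet(), nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=0.05)
for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print("final training loss:", loss.item())
```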

Embodiment 2

[0106] Step 1 includes the following steps:

[0107] Step 1.1: Construct the structure of a spindle-type deep neural network including an input layer, a decoding layer, an encoding layer and a softmax output layer;

[0108] Step 1.2: using the cross-validation method to determine the hyperparameters of the spindle-shaped deep neural network to obtain the spindle-shaped deep neural network;

[0109] Step 1.3: Construct the training data set;

[0110] Step 1.4: Input the training data set into the spindle-type deep neural network and train in an unsupervised manner to obtain initialized network weights to complete the training.
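A minimal sketch of Steps 1.3 and 1.4 follows. It assumes greedy layer-wise autoencoder pretraining as the unsupervised scheme and uses illustrative layer widths, data shapes, and optimizer settings; the patent only specifies that unsupervised training of the spindle-shaped network yields the initialized weights.

```python
# Hedged sketch of Steps 1.3-1.4. Layer widths, epochs, and the optimizer are
# assumptions; the patent only requires unsupervised training that produces
# initialized network weights. Each layer is pretrained greedily as a
# one-layer autoencoder and its encoder initializes the deep network.
import torch
import torch.nn as nn

def pretrain_layer(X, in_dim, hidden_dim, epochs=200, lr=0.05):
    """Unsupervised pretraining of one decode/encode pair on inputs X."""
    enc = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
    dec = nn.Sequential(nn.Linear(hidden_dim, in_dim), nn.Sigmoid())
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(dec(enc(X)), X)        # reconstruct the layer input
        loss.backward()
        opt.step()
    with torch.no_grad():
        H = enc(X)                            # output fed to the next layer
    return enc, H

# Step 1.3: training data set (random patches stand in for real image data).
X = torch.rand(512, 64)

# Step 1.4: greedy layer-wise pretraining; the widening-then-narrowing widths
# below are an assumption meant to evoke the spindle shape.
widths, layers, H, in_dim = [96, 128, 96], [], X, X.shape[1]
for w in widths:
    enc, H = pretrain_layer(H, in_dim, w)
    layers.append(enc)
    in_dim = w

# The pretrained encoders supply the initialized weights of the spindle net.
spindle_body = nn.Sequential(*layers)
print(spindle_body)
```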

[0111] The decoding layer and the encoding layer are trained with the following computation:

[0112] h_k = σ(W_k X + b_k)

[0113] where W_k denotes the weight matrix, b_k denotes the bias vector, σ denotes the activation function, X = {x_1, x_2, ..., x_m} denotes the input of the current layer, and h_k denotes the output of the current layer...
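As a quick illustration of the formula above, the numpy sketch below evaluates h_k = σ(W_k X + b_k) for a single layer; the sigmoid choice of σ and the dimensions are assumptions for demonstration only.

```python
# Tiny sketch of the per-layer computation h_k = sigma(W_k X + b_k) above.
# The sigmoid activation and all dimensions are illustrative assumptions.
import numpy as np

def layer_forward(X, W_k, b_k):
    """One decoding/encoding layer: affine map followed by the activation."""
    sigma = lambda z: 1.0 / (1.0 + np.exp(-z))   # assumed sigmoid activation
    return sigma(W_k @ X + b_k)

# X holds m input samples as columns; W_k maps n_in features to n_out.
rng = np.random.default_rng(0)
n_in, n_out, m = 4, 6, 3
X = rng.random((n_in, m))
W_k = rng.standard_normal((n_out, n_in))
b_k = rng.standard_normal((n_out, 1))
h_k = layer_forward(X, W_k, b_k)
print(h_k.shape)  # (6, 3): one n_out-dimensional output per sample
```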



Abstract

The invention discloses a weak and small target detection method based on a feature mapping neural network, and relates to the field of weak and small target detection. The method comprises: Step 1, constructing and training a spindle-type deep neural network; Step 2, obtaining from the spindle-shaped deep neural network an amplitude map with target enhancement and background suppression; Step 3, applying the constant false alarm rate method to the amplitude map to complete the detection of weak and small targets. The invention adopts a spindle-shaped network structure to strengthen the representation ability of the network, solves the problem of low detection accuracy caused by noise and interference affecting weak and small targets, and achieves the effect of improving target detection accuracy.
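Step 3 names the constant false alarm rate (CFAR) method, but the abstract gives no parameters. The sketch below is a generic cell-averaging CFAR over a 2-D amplitude map, with the guard/training window sizes and threshold factor chosen purely for illustration.

```python
# Hedged sketch of Step 3: cell-averaging CFAR over a 2-D amplitude map.
# The patent only names the constant false alarm rate method; the guard and
# training window sizes and the threshold factor below are assumptions.
import numpy as np

def ca_cfar_2d(amp, guard=2, train=6, alpha=5.0):
    """Boolean detection map: a cell under test is declared a target when it
    exceeds alpha times the mean of the surrounding training cells
    (guard cells around the cell under test are excluded)."""
    h, w = amp.shape
    det = np.zeros_like(amp, dtype=bool)
    r = guard + train
    for i in range(r, h - r):
        for j in range(r, w - r):
            window = amp[i - r:i + r + 1, j - r:j + r + 1]
            guard_block = amp[i - guard:i + guard + 1, j - guard:j + guard + 1]
            noise = (window.sum() - guard_block.sum()) / (window.size - guard_block.size)
            det[i, j] = amp[i, j] > alpha * noise
    return det

# Toy usage: a flat noise background with one bright 3x3 "weak target".
rng = np.random.default_rng(0)
amp = rng.random((64, 64))
amp[30:33, 40:43] += 4.0
hits = ca_cfar_2d(amp)
print("detections at:", np.argwhere(hits))
```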

Description

Technical Field

[0001] The invention relates to the field of weak and small target detection, in particular to a weak and small target detection method based on a feature mapping neural network.

Background Technique

[0002] Passive millimeter-wave (PMMW) and infrared imaging have the excellent characteristics of emitting no radiation and having strong penetrating ability, and their application in the military field has attracted increasing attention. It is therefore of great significance to study weak and small target detection under millimeter-wave and infrared imaging. Weak and small target detection technology has developed rapidly in recent years. Weak and small targets refer to targets with a diameter of 3-5 pixels. However, high-precision detection of weak and small targets under millimeter-wave and infrared imaging conditions still faces great difficulties: first, the imaging distance of the target is generally far, so the detected target area is small and the signal-to-noise ratio is low, a...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62, G06K9/40
CPC: G06V10/30, G06V2201/07, G06F18/213, G06F18/214
Inventor: 谢春芝, 高志升
Owner: 江西诚安科技有限公司