Underwater target feature extraction method based on convolutional neural network (CNN)

A convolutional neural network applied to underwater targets, in the field of underwater target feature extraction. It addresses problems such as degraded feature-extraction quality, spatial features that cannot be restored, and reduced classification accuracy.

Active Publication Date: 2017-09-22
HARBIN ENG UNIV


Problems solved by technology

[0007] The feature information in two-dimensional images is often highly concealed: changes in the position of the observed target and differences in observation angle cause the target to appear deformed, displaced, or even distorted. In this process spatial information is lost and cannot be restored at the SoftMax layer, which lowers classification accuracy and, as the network is repeatedly fed back and adjusted, indirectly degrades the quality of feature extraction.




Embodiment Construction

[0090] The following examples describe the present invention in more detail.

[0091] A time-frequency conversion is performed on the original noise signal to generate a LoFAR spectrogram representing its time-frequency information. The specific processing is as follows:

[0092] 1. Define S(n) as the sampling sequence of the original radiated-noise signal and divide it into 25 consecutive segments of 25 sampling points each. Adjacent segments are allowed to overlap, with the degree of overlap set to 50%.
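The segmentation in step 1 can be sketched as follows. The helper name and the zero-padding of a short final segment are assumptions for illustration; the patent only specifies 25 segments of 25 samples with 50% overlap.

```python
import numpy as np

def segment_signal(s, n_segments=25, seg_len=25, overlap=0.5):
    """Split a 1-D sampled signal into fixed-length segments.

    Hypothetical helper for step 1: the sequence S(n) is cut into
    `n_segments` consecutive parts of `seg_len` samples each, where
    adjacent parts share `overlap` (here 50%) of their samples.
    """
    hop = int(seg_len * (1.0 - overlap))      # 50% overlap -> hop of 12 samples
    segments = []
    for j in range(n_segments):
        start = j * hop
        seg = s[start:start + seg_len]
        if len(seg) < seg_len:                # zero-pad a short final segment (assumption)
            seg = np.pad(seg, (0, seg_len - len(seg)))
        segments.append(seg)
    return np.stack(segments)                 # shape: (n_segments, seg_len)

# Example: a noise-like signal long enough for 25 half-overlapping segments
rng = np.random.default_rng(0)
s = rng.standard_normal(400)
M = segment_signal(s)
print(M.shape)  # (25, 25)
```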

[0093] 2. Define M_j(n) as the sampled data of the j-th segment, and normalize and center it. The purpose is to distribute the amplitude of the radiated-noise signal uniformly in time and to remove the DC component so that the sample mean is zero.

[0094] Normalization processing:

[0095]

[0096] In order to facilitate the calculation of the Fourier transform, the val...
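The normalization and centering of [0093]–[0095] and the per-segment Fourier transform can be sketched as below. The exact normalization formula in [0095] is not reproduced in the source, so peak normalization followed by mean removal is assumed here; the Hann window is likewise an assumption, used only to reduce spectral leakage.

```python
import numpy as np

def normalize_center(M):
    """Hedged sketch of [0093]-[0095]: per-segment peak normalization
    (amplitudes mapped into [-1, 1]) followed by mean removal (DC removal,
    so each segment averages to zero). The patent's exact formula is not
    shown in the source; this is one conventional choice."""
    M = M / np.max(np.abs(M), axis=1, keepdims=True)   # peak-normalize each segment
    return M - M.mean(axis=1, keepdims=True)           # remove the DC component

def lofar(M):
    """Stack the magnitude spectra of the windowed segments row by row,
    producing a LoFAR-style time-frequency image."""
    w = np.hanning(M.shape[1])                         # taper (assumption)
    return np.abs(np.fft.rfft(M * w, axis=1))          # shape: (n_segments, seg_len//2 + 1)

rng = np.random.default_rng(1)
M = rng.standard_normal((25, 25)) + 0.3                # segments with a DC offset
L = lofar(normalize_center(M))
print(L.shape)  # (25, 13)
```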



Abstract

The invention provides an underwater target feature extraction method based on a convolutional neural network (CNN):

1. The sampling sequence of an original radiated-noise signal is divided into 25 consecutive segments, each with 25 sampling points.
2. Normalization and centering are applied to the sampled data of the j-th segment.
3. A short-time Fourier transform is applied to obtain a LoFAR spectrogram.
4. A vector is assigned to the existing 3-dimensional tensor.
5. The resulting feature vector is input to a fully connected layer for classification, and its error against the label data is computed; if the loss error is below the error threshold, network training stops, otherwise step 6 is entered.
6. The gradient descent method is used to adjust the network parameters layer by layer from back to front, and the process returns to step 2.

Compared with traditional convolutional neural network algorithms, the method performs a multi-dimensional weighted operation of spatial information on the feature-map layer, to compensate for the loss of spatial information caused by the one-dimensional vectorization of the fully connected layer.
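The train-until-threshold loop of steps 5 and 6 can be sketched as follows. A one-layer logistic classifier on toy features stands in for the patent's CNN (the network architecture is not reproduced here); the data, learning rate, and threshold values are all assumptions for illustration.

```python
import numpy as np

# Toy stand-in for the extracted feature vectors and their labels
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 13))
true_w = rng.standard_normal(13)
y = (X @ true_w > 0).astype(float)            # toy binary labels

w = np.zeros(13)
lr, threshold = 0.5, 0.05                     # learning rate and error threshold (assumed)
for step in range(50_000):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))        # forward pass through the classifier
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    if loss < threshold:                      # step 5: stop once the error is below threshold
        break
    grad = X.T @ (p - y) / len(y)             # step 6: gradient of the loss w.r.t. w
    w -= lr * grad                            # gradient-descent parameter update
print(round(loss, 3))
```

In the patented method the update in step 6 would propagate layer by layer from the back of the CNN to the front; the skeleton above shows only the stop-or-update control flow.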

Description

Technical field

[0001] The invention relates to an underwater target feature extraction method.

Background technique

[0002] At present, there are two main approaches to underwater target feature extraction: time-domain and frequency-domain. Time-domain features are extracted from the waveform structure and are reflected in the shape of the echo: the more distinct the targets, the more pronounced the differences in waveform structure. In addition, differences in the echo's receiving angle and in the target's attitude also strongly affect the time-domain waveform, and these differences likewise conceal the distinguishing characteristics of the targets; that is, classification features of the target are extracted from the waveform structure. Frequency-domain features refer to the spectral features obtained after signal processing: the target is identified by spectrum estimation, and the target feature pa...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V10/454, G06N3/045, G06F18/217
Inventor: 王红滨, 何鸣, 宋奎勇, 周连科, 王念滨, 郎泽宇, 王瑛琦, 顾正浩, 李浩然, 迟熙翔
Owner HARBIN ENG UNIV