
Modulation identification method based on convolutional neural network

A convolutional neural network based modulation recognition technology, applied in the field of signal modulation recognition, which addresses the problem of poor modulation recognition performance.

Publication Date: 2021-02-02 (status: Inactive)
成都悦鉴科技有限公司
Cites: 9 | Cited by: 25


Problems solved by technology

[0004] The purpose of the present invention is to overcome the shortcoming of poor modulation recognition performance in the prior art when processing signals, and to provide a modulation recognition method based on a convolutional neural network. By combining the temporal feature extraction ability of the temporal convolutional network with the feature-expression enhancement of the attention mechanism, a parallel network is proposed that fuses the spatial features extracted by the convolutional neural network with the temporal features extracted by the temporal convolutional network, further improving modulation recognition performance.
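
To make the parallel-fusion idea concrete, here is a minimal Keras sketch, not the patented implementation: the name build_parallel_net, the 128×2 I/Q input shape (typical of RML2016.10a), the dilation rates, and the soft-attention form are all illustrative assumptions.

```python
# Sketch of the parallel architecture described in [0004] (assumptions noted):
# a CNN branch extracts spatial features, stacked dilated causal Conv1D layers
# stand in for the temporal convolutional network (TCN), and a simple soft
# attention reweights temporal features before the two branches are fused.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_parallel_net(num_classes=11, timesteps=128, channels=2):
    inp = layers.Input(shape=(timesteps, channels))  # I/Q frame (assumed shape)

    # CNN branch: spatial feature extraction.
    c = layers.Conv1D(32, 3, padding="same", activation="relu")(inp)
    c = layers.Conv1D(32, 3, padding="same", activation="relu")(c)
    c = layers.GlobalAveragePooling1D()(c)

    # TCN branch: dilated causal convolutions for temporal features.
    t = inp
    for d in (1, 2, 4):  # dilation schedule is an assumption
        t = layers.Conv1D(32, 3, padding="causal", dilation_rate=d,
                          activation="relu")(t)

    # Soft attention over time steps (illustrative form of the mechanism).
    score = layers.Dense(1, activation="tanh")(t)    # (batch, T, 1)
    weight = layers.Softmax(axis=1)(score)           # attention weights over T
    t = layers.Lambda(lambda xs: tf.reduce_sum(xs[0] * xs[1], axis=1))(
        [t, weight])                                 # attention-weighted sum

    # Fuse spatial and temporal features, then classify.
    fused = layers.Concatenate()([c, t])
    out = layers.Dense(num_classes, activation="softmax")(fused)
    return Model(inp, out)
```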



Examples


Embodiment 1

[0075] As shown in Figures 1-8, this embodiment relates to a modulation recognition method based on a convolutional neural network, comprising the following steps:

[0076] S1: Select the modulated signal data set and design the convolutional neural network model structure;

[0077] S2: Construct residual units via residual connections in the convolutional neural network model;

[0078] S3: Perform batch normalization on the data in each network layer of the convolutional neural network model;

[0079] S4: setting convolutional neural network parameters;

[0080] S5: Train the convolutional neural network, randomly dropping data from the training set;

[0081] S5.1: Initialize network parameters;

[0082] S5.2: Calculate the feature maps output by the convolutional layer, pooling layer, and fully connected layer;

[0083] S5.3: Obtain the error between the predicted value and the actual value;

[0084] S5.4: Determine whether the error has converged... A minimal sketch of steps S5.1-S5.4 follows below.
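
The following is a minimal sketch of the training procedure in S5.1-S5.4, assuming the build_parallel_net model from the earlier sketch and synthetic placeholder data; the convergence tolerance is an assumption, since the excerpt does not state the criterion.

```python
# Sketch of S5.1-S5.4: initialize, forward pass, error computation, and a
# convergence check (the 1e-4 tolerance is an illustrative assumption).
import numpy as np
import tensorflow as tf

model = build_parallel_net()                       # S5.1: initialize parameters
model.compile(optimizer="adam",
              loss="categorical_crossentropy",     # S5.3: predicted-vs-actual error
              metrics=["accuracy"])

# Placeholder data standing in for the modulated signal training set.
x = np.random.randn(256, 128, 2).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, 11, 256), 11)

prev_loss = np.inf
for epoch in range(120):
    # S5.2: compute layer outputs (forward pass) and update weights.
    hist = model.fit(x, y, batch_size=64, epochs=1, verbose=0)
    loss = hist.history["loss"][0]
    if abs(prev_loss - loss) < 1e-4:               # S5.4: stop once error converges
        break
    prev_loss = loss
```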

Embodiment 2

[0115] As shown in Figures 1-8, the specific parameter settings of the convolutional neural network proposed in this embodiment, built on the basis of Embodiment 1, are shown in Table 2, which lists the parameters of the first residual unit Unit1, the second residual unit Unit2, the global average pooling layer GAP, and the Softmax classification layer; the remaining three residual units use the same parameters as the second residual unit. The number of convolution kernels is set to 32. The first convolution kernel of each residual unit is 1×1; the second convolution kernel is 2×3 in the first residual unit and 1×3 in the remaining residual units. It is worth noting that edge padding is performed in both convolutional layers. Dropout layers are connected after the second and fourth residual units to improve the generalization performance...
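
Since Table 2 itself is not reproduced in this excerpt, the following is a hedged Keras reconstruction of the topology as described: five residual units of 32 kernels each, a 1×1 first convolution per unit, a 2×3 second convolution in Unit1 and 1×3 elsewhere, "same" padding in both convolutional layers, dropout after the second and fourth units, then GAP and Softmax. The skip-connection form, dropout rate, and input shape are assumptions.

```python
# Hedged reconstruction of the Embodiment 2 topology (not Table 2 itself).
from tensorflow.keras import layers, Model

def residual_unit(x, second_kernel):
    shortcut = x
    y = layers.Conv2D(32, (1, 1), padding="same", activation="relu")(x)
    y = layers.BatchNormalization()(y)               # S3: batch normalization
    y = layers.Conv2D(32, second_kernel, padding="same", activation="relu")(y)
    y = layers.BatchNormalization()(y)
    if shortcut.shape[-1] != 32:                     # match channels for the add
        shortcut = layers.Conv2D(32, (1, 1), padding="same")(shortcut)
    return layers.Add()([shortcut, y])               # S2: residual connection

def build_resnet(num_classes=11):
    inp = layers.Input(shape=(2, 128, 1))            # 2x128 I/Q frame (assumed)
    x = residual_unit(inp, (2, 3))                   # Unit1: 2x3 second kernel
    x = residual_unit(x, (1, 3))                     # Unit2: 1x3 second kernel
    x = layers.Dropout(0.5)(x)                       # dropout after Unit2 (rate assumed)
    x = residual_unit(x, (1, 3))                     # Unit3
    x = residual_unit(x, (1, 3))                     # Unit4
    x = layers.Dropout(0.5)(x)                       # dropout after Unit4
    x = residual_unit(x, (1, 3))                     # Unit5
    x = layers.GlobalAveragePooling2D()(x)           # GAP layer
    out = layers.Dense(num_classes, activation="softmax")(x)  # Softmax classifier
    return Model(inp, out)
```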

Embodiment 3

[0120] In this embodiment, a 64-bit Ubuntu 16.04 LTS system is used as the experimental environment, the Keras deep learning framework with a TensorFlow backend is used to build the model, and an NVIDIA GTX 1070Ti graphics card accelerates the training process. The public modulation signal dataset RML2016.10a is used as the experimental data; 60% of the data is used as the training set, 20% as the test set, and the remaining 20% as the validation set. During training, the cross-entropy loss function and the Adam optimizer are used, and the number of training iterations is set to 120. An early stopping method controls whether the model continues to iterate: if the recognition rate changes little over 10 consecutive training rounds, iteration is stopped. During training of the convolutional neural network, different batch sizes may have a certain impact on the recognition rate and training time...
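
A minimal sketch of this training configuration, reusing the build_resnet model from the previous sketch; the learning rate, min_delta, batch size, and the pre-split arrays x_train/y_train/x_val/y_val are assumptions, and the EarlyStopping patience of 10 mirrors the 10-round rule stated above.

```python
# Sketch of the Embodiment 3 setup: cross-entropy loss, Adam optimizer,
# 120 iterations, early stopping on validation accuracy.
import tensorflow as tf

model = build_resnet()
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),   # learning rate assumed
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Stop when validation accuracy changes little over 10 consecutive epochs.
early_stop = tf.keras.callbacks.EarlyStopping(monitor="val_accuracy",
                                              patience=10, min_delta=1e-3,
                                              restore_best_weights=True)

# x_train/y_train (60%) and x_val/y_val (20%) assumed prepared from
# RML2016.10a beforehand; the split itself is not shown in the excerpt.
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=120, batch_size=128, callbacks=[early_stop])
```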



Abstract

The invention discloses a modulation identification method based on a convolutional neural network. The method comprises the following steps: S1, selecting a modulation signal data set and designing a convolutional neural network model structure; S2, constructing residual units in the convolutional neural network model by means of residual connections; S3, performing batch normalization on the data in the network layers of the convolutional neural network model; S4, setting convolutional neural network parameters; S5, training the convolutional neural network and randomly dropping data from the training set; S6, feeding the signal into the trained convolutional neural network and performing modulation identification. A parallel network is provided that combines the temporal feature extraction capability of the temporal convolutional network with the feature-expression enhancement capability of the attention mechanism, so that the spatial features extracted by the convolutional neural network and the temporal features extracted by the temporal convolutional network are fused, further improving modulation recognition performance.

Description

Technical field

[0001] The invention relates to the field of signal modulation recognition, in particular to a modulation recognition method based on a convolutional neural network.

Background technique

[0002] Modulation identification, also known as modulation classification, refers to accurately identifying the modulation type of a received signal without prior information about the signal's modulation mode, laying the foundation for subsequent demodulation. Modulation recognition plays a critical role in both the civilian and military fields. In the civilian field, with the emergence of various communication methods and devices, radio spectrum resources are becoming increasingly scarce. With the development of the communication industry, the current wireless spectrum is divided according to service into civil radio and television, wireless communication, satellite communication, and other frequen...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06N3/04; G06N3/08; H04L27/00
CPC: G06N3/08; H04L27/0012; G06N3/045; G06F18/241
Inventor: 张航, 陈宇林
Owner: 成都悦鉴科技有限公司