
Optimizing training method of neural network equalizer

A neural-network training technology, applied to biological neural network models, their physical realization, etc. It addresses problems such as reduced equalizer performance, non-convergence of the training process, and failure to achieve the equalizer's training quality target, with the effect of improving usability and reducing training time overhead.

Inactive Publication Date: 2007-04-11
ZTE CORP
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] However, the application of neural networks to nonlinear equalization is still restricted by one key factor: the convergence of the training process.
Unlike traditional analytical optimization methods, neural-network training relies mainly on local information. Local optima in the solution space interfere with the training process's convergence to the global optimum, introducing considerable non-determinism and variability into training; that is, the probability distribution of the training time overhead has heavy-tailed characteristics.
In adaptive nonlinear equalization specifically, the heavy tail of the neural network equalizer's training time overhead means that in some time periods the training quality target for the equalizer cannot be achieved, i.e., the training process does not converge. During these periods the equalizer operates in a lower-performance state, which seriously degrades equalization quality.

Method used




Embodiment Construction

[0018] Since the distribution of neural-network training time cost has heavy tails, and heavy-tailed computational time cost distributions are common in NP optimization problems, methods for suppressing the heavy tails of NP optimization problems can be applied to the convergence of neural-network training. Following this idea, the present invention proposes an optimized training method for improving the training efficiency of feedforward neural networks.
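The restart idea above can be illustrated with a small simulation. The heavy-tailed draw below is a hypothetical stand-in for one training run's convergence time (the patent's premise is that real training times behave this way); the cutoff value and distribution parameters are invented for illustration, not taken from the patent.

```python
import random

random.seed(0)

def attempt_time():
    # Stand-in for one training run's time to converge:
    # a Pareto draw with alpha near 1, i.e. a very heavy tail.
    return random.paretovariate(1.1)

def time_to_converge(cutoff):
    """Total time spent when every attempt exceeding `cutoff` is
    abandoned and training is restarted from scratch."""
    total = 0.0
    while True:
        t = attempt_time()
        if t <= cutoff:
            return total + t      # this attempt converged in time
        total += cutoff           # give up at the cutoff, restart

runs = 2000
no_restart = sum(attempt_time() for _ in range(runs)) / runs
with_restart = sum(time_to_converge(5.0) for _ in range(runs)) / runs
print(f"mean time, no restarts:  {no_restart:.2f}")
print(f"mean time, cutoff = 5.0: {with_restart:.2f}")
```

Because the tail is heavy, abandoning slow attempts and restarting lowers the mean convergence time, which is exactly the suppression effect the paragraph describes.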

[0019] The present invention selects the Time-Lagged Feedforward Neural Network (TLFN) as an embodiment. This network topology has both long-range and short-range memory, so it can flexibly trade off between the long-range correlation and the short-range burstiness of the modeled object's dynamic behavior; moreover, the network has a feedforward structure, so the computational complexity of its training can be kept low. The optimized training method of the present invention can al...
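A minimal sketch of the TLFN idea follows: a tapped delay line supplies short-range memory to an ordinary one-hidden-layer feedforward net. The layer sizes, tap count, and initialization here are illustrative assumptions, not the patent's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

class TLFN:
    """Minimal focused time-lagged feedforward network: a tapped
    delay line feeding a one-hidden-layer feedforward net.
    Hyperparameters are illustrative only."""

    def __init__(self, taps=5, hidden=8):
        self.taps = taps
        self.W1 = rng.normal(0, 0.5, (hidden, taps))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0, 0.5, hidden)
        self.b2 = 0.0

    def forward(self, x):
        """x: 1-D signal; returns one output per time step, each
        computed from the `taps` most recent input samples."""
        pad = np.concatenate([np.zeros(self.taps - 1), x])
        out = np.empty(len(x))
        for n in range(len(x)):
            window = pad[n:n + self.taps][::-1]  # delay-line contents
            h = np.tanh(self.W1 @ window + self.b1)
            out[n] = h @ self.W2 + self.b2
        return out

signal = rng.normal(size=100)
y = TLFN().forward(signal)
print(y.shape)  # one output per input sample
```

Because the memory lives entirely in the delay line, training reduces to ordinary feedforward backpropagation, which is why the paragraph notes the training complexity stays low.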



Abstract

The invention provides an optimized training method for a neural network equalizer: first, the training time overhead of the neural network equalizer under typical channel conditions is recorded, and an optimized restarting point for the equalizer's training is calculated from it; the neural network equalizer is then trained on the training sequence, restarting the training procedure at the restart point; after the equalizer's training converges, the distorted wireless signal is compensated through its output.
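The abstract's pipeline (record training times under a typical channel, then compute an optimized restart point) can be sketched as follows. The estimator below is the standard empirical one for fixed-cutoff restarts, not necessarily the patent's exact formula, and the toy `record` data is invented for illustration.

```python
def optimal_restart_point(recorded_times):
    """Estimate the restart cutoff minimizing expected total training
    time, from training times recorded under typical channel
    conditions. Uses the fixed-cutoff identity
        E[total] = E[min(T, c)] / P(T <= c),
    evaluated empirically at each recorded time as candidate cutoff."""
    times = sorted(recorded_times)
    n = len(times)
    best_c, best_cost = None, float("inf")
    for i, c in enumerate(times):
        # E[min(T, c)]: converged runs contribute their own time,
        # runs longer than c contribute the cutoff c.
        e_min = (sum(times[:i + 1]) + (n - i - 1) * c) / n
        p_converge = (i + 1) / n
        cost = e_min / p_converge
        if cost < best_cost:
            best_c, best_cost = c, cost
    return best_c, best_cost

# Toy record: most runs converge quickly, a few take very long.
record = [1.0, 1.2, 1.1, 0.9, 1.3, 50.0, 80.0, 1.0, 1.1, 120.0]
cutoff, expected = optimal_restart_point(record)
print(cutoff, round(expected, 2))  # → 1.3 1.64
```

With this record, restarting any run that exceeds 1.3 time units gives an expected total training time of about 1.64 units, versus 25.76 with no restarts, which is the "reduced training time overhead" effect the abstract claims.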

Description

Technical field

[0001] The invention relates to the field of wireless communication, and in particular to an optimized training method for a neural network equalizer.

Background technique

[0002] Equalization is the core method by which wireless communication systems combat multipath fading, nonlinear amplification distortion, and similar impairments. When neither the channel nor the transmitter power amplifier introduces amplitude distortion, a linear equalizer based on the mean-square-error criterion can achieve optimal performance as measured by the symbol error probability. However, complex wireless environments and high-speed wireless modulation techniques usually cause significant amplitude distortion, which severely limits the applicability of linear equalizers. Against this background, nonlinear equalization algorithms have been proposed, such as the decision feedback method and maximum-likelihood sequence estimation. However, the above nonlinear methods still have li...
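For concreteness, a minimal example of the MSE-criterion linear equalization the background describes is the classic LMS adaptive filter below. The channel taps, step size, and BPSK setup are invented for illustration; this is a generic textbook scheme, not the patent's method.

```python
import numpy as np

rng = np.random.default_rng(1)

def lms_equalize(received, training, taps=7, mu=0.01):
    """Minimal LMS adaptive linear equalizer: adapts FIR weights to
    minimize squared error against a known training sequence.
    Parameters are illustrative."""
    w = np.zeros(taps)
    pad = np.concatenate([np.zeros(taps - 1), received])
    for n in range(len(training)):
        x = pad[n:n + taps][::-1]   # current delay-line contents
        e = training[n] - w @ x     # error vs. known symbol
        w += mu * e * x             # LMS weight update
    return w

# Toy setup: BPSK symbols through a mild ISI channel plus noise.
symbols = rng.choice([-1.0, 1.0], size=2000)
channel = np.array([1.0, 0.4, 0.2])
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.01 * rng.normal(size=len(symbols))

w = lms_equalize(received, symbols)
est = np.convolve(received, w)[:len(symbols)]
error_rate = np.mean(np.sign(est[100:]) != symbols[100:])
print(f"symbol error rate after training: {error_rate:.3f}")
```

This linear scheme works here because the toy channel introduces no amplitude nonlinearity; under the significant amplitude distortion the paragraph describes, its performance degrades, which motivates the nonlinear (neural network) equalizers the invention targets.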

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/06
Inventors: 侯越先, 王宁
Owner: ZTE CORP