
Bidirectional and parallel decoding method of convolutional Turbo code

A decoding method for convolutional Turbo codes, applied in the field of channel coding and Turbo code decoding, which addresses the long decoding delay of the component decoders and reduces the required storage space

Status: Inactive; Publication Date: 2012-02-01
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

Although the multiplication operations are reduced, a component decoder using the Log-MAP algorithm or the Max-Log-MAP algorithm still suffers from a long decoding delay.




Embodiment Construction

[0027] The present invention improves the decoding process inside the component decoders of convolutional Turbo decoding; the other processing steps remain unchanged. The convolutional Turbo decoding process includes:

[0028] The data to be decoded and the a priori likelihood ratio information are input to an iterative decoder in which two component decoders are connected in parallel;

[0029] When the preset maximum number of iterations has not been reached, the component decoder outputs the a posteriori likelihood ratio information after decoding, converts it into extrinsic information, and, after interleaving or deinterleaving, feeds it to the other component decoder as the a priori likelihood ratio information;

[0030] When the preset maximum number of iterations is reached, the component decoder that works last decodes and outputs the a posteriori likelihood ratio information, and the decoding result is obtained after deinterleaving and a hard decision.
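A minimal Python sketch of this outer iterative loop is given below for the classic single-binary case with LLR-valued extrinsic information; the patent's DB-CTC exchanges double-binary (symbol-level) likelihood ratios instead, and the names component_decode, perm and max_iters are hypothetical stand-ins, not anything defined in the patent.

```python
# Sketch of the iterative loop in paragraphs [0027]-[0030]; single-binary,
# LLR-based simplification. component_decode, perm and max_iters are
# hypothetical names, not the patent's.
import numpy as np

def turbo_decode(sys_llr, par1_llr, par2_llr, perm, component_decode, max_iters=8):
    """Iterate two parallel-concatenated component decoders.

    sys_llr, par1_llr, par2_llr -- channel LLRs (systematic, parity 1, parity 2)
    perm             -- interleaver permutation (decoder 2 / parity 2 order)
    component_decode -- callable (sys, par, apriori) -> a posteriori LLRs
    """
    apriori = np.zeros(len(sys_llr))                 # a priori LLRs start at zero
    for it in range(max_iters):
        # Component decoder 1 works in natural order.
        post1 = component_decode(sys_llr, par1_llr, apriori)
        ext1 = post1 - sys_llr - apriori             # extrinsic info: strip known parts
        apriori2 = ext1[perm]                        # interleave -> a priori for decoder 2

        # Component decoder 2 works in interleaved order.
        post2 = component_decode(sys_llr[perm], par2_llr, apriori2)

        if it == max_iters - 1:
            # Maximum iteration count reached: deinterleave the a posteriori
            # LLRs of the last decoder and take a hard decision ([0030]).
            post = np.empty_like(post2)
            post[perm] = post2
            return (post > 0).astype(int)

        # Otherwise convert to extrinsic info, deinterleave, and feed it back
        # as the a priori information of decoder 1 ([0029]).
        ext2 = post2 - sys_llr[perm] - apriori2
        apriori = np.empty_like(ext2)
        apriori[perm] = ext2
```

A component decoder such as the bidirectional Max-Log-MAP sketch shown after the Abstract below could be passed in as component_decode.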

[0031] This embodiment improv...



Abstract

The invention provides a decoding method for convolutional Turbo codes that reduces decoding delay and saves memory. The method comprises the following steps: carrying out the forward recursion and the backward recursion simultaneously in the component decoding process; dividing the forward and backward recursions into two stages of equal computational load; and computing the a posteriori likelihood ratio information on the fly from the beginning of the second stage. The delay from the start of the recursion operations to the end of the a posteriori likelihood ratio computation is roughly halved compared with the traditional decoding process. Furthermore, whereas the traditional a posteriori likelihood ratio computation is serial, here it is carried out bidirectionally and in parallel, so its computation time overlaps with the recursion time and no additional computation time needs to be allocated. In addition, the bidirectional parallel structure halves the memory used for storing the state metrics, and by splitting the branch metric computation, redundant computation is reduced and the space for storing the branch metrics is also halved.
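As a concrete illustration of the bidirectional, two-stage schedule described above, here is a minimal Python sketch of a Max-Log-MAP component decoder for a generic single-binary trellis. The trellis description (gamma, next_state, n_states), the state-0 termination assumption and the function names are illustrative assumptions rather than the patent's notation; the sketch only shows how the two recursions run concurrently over opposite halves of the block, how the a posteriori LLRs are produced on the fly from the start of the second stage, and why only half of each state-metric sequence ever needs to be stored.

```python
# Sketch of the two-stage bidirectional Max-Log-MAP schedule; trellis layout,
# names and the state-0 termination assumption are illustrative, not the patent's.
import numpy as np

NEG = -1e30  # stands in for minus infinity in the max-log domain

def bidir_max_log_map(gamma, next_state, n_states):
    """gamma[k, s, b]: branch metric at step k, state s, input bit b (shape N x n_states x 2).
    next_state[s, b]: trellis successor of state s under input bit b.
    Returns the a posteriori LLRs; assumes the trellis starts/ends in state 0 and N is even."""
    N = gamma.shape[0]
    half = N // 2

    # Only alpha[0..half] and beta[half..N] are ever written; beta is allocated
    # full length here purely for index clarity. Keeping only these halves is
    # the state-metric memory saving referred to in the abstract.
    alpha = np.full((half + 1, n_states), NEG)
    beta = np.full((N + 1, n_states), NEG)
    alpha[0, 0] = 0.0
    beta[N, 0] = 0.0
    llr = np.zeros(N)

    def fwd(a_prev, g):                       # one forward recursion step
        a = np.full(n_states, NEG)
        for s in range(n_states):
            for b in (0, 1):
                ns = next_state[s, b]
                a[ns] = max(a[ns], a_prev[s] + g[s, b])
        return a

    def bwd(b_next, g):                       # one backward recursion step
        bt = np.full(n_states, NEG)
        for s in range(n_states):
            for b in (0, 1):
                bt[s] = max(bt[s], b_next[next_state[s, b]] + g[s, b])
        return bt

    def step_llr(a_here, g, b_after):         # max-log LLR of the bit at one step
        best = [NEG, NEG]
        for s in range(n_states):
            for b in (0, 1):
                best[b] = max(best[b], a_here[s] + g[s, b] + b_after[next_state[s, b]])
        return best[1] - best[0]

    # Stage 1: the forward and backward recursions each cover half the block
    # (concurrently in hardware); their state metrics are stored.
    for i in range(half):
        alpha[i + 1] = fwd(alpha[i], gamma[i])
        beta[N - 1 - i] = bwd(beta[N - i], gamma[N - 1 - i])

    # Stage 2: both recursions continue into the other half. The metric needed
    # from the opposite direction is already stored, so the LLR of each step is
    # produced immediately, overlapping with the recursions instead of needing
    # a separate serial pass.
    a_run, b_run = alpha[half], beta[half]
    for i in range(half):
        k_fwd = half + i                      # forward side handles steps half .. N-1
        k_bwd = half - 1 - i                  # backward side handles steps half-1 .. 0
        llr[k_fwd] = step_llr(a_run, gamma[k_fwd], beta[k_fwd + 1])
        llr[k_bwd] = step_llr(alpha[k_bwd], gamma[k_bwd], b_run)
        a_run = fwd(a_run, gamma[k_fwd])
        b_run = bwd(b_run, gamma[k_bwd])
    return llr
```

In this software sketch the two directions are interleaved within one loop; in the hardware the patent targets they would run truly in parallel, which is where the claimed delay reduction comes from.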

Description

Technical Field

[0001] The invention belongs to the field of communications and mainly relates to channel coding, in particular to techniques related to Turbo code decoding.

Background Technique

[0002] Since the concept of iterative decoding was proposed, Turbo codes have been widely studied and applied. The convolutional Turbo code (CTC) has developed rapidly in recent years because of its higher coding efficiency, faster encoding speed and larger free distance, and it has been selected as the forward error correction scheme of the physical layer in the 802.16e and 802.16m standards.

[0003] The 802.16m standard chooses the double-binary convolutional Turbo code (DB-CTC) as one of its channel coding schemes. In the systematic DB-CTC, 2 information bits are input in parallel at each encoding step and 6 bits are output. Because the data are encoded twice (circular, tail-biting encoding), the encoder state before and after encoding is the same, so no tail bits are needed. However, these characteri...
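To make the remark about tail bits concrete, below is a toy Python illustration of circular (tail-biting) encoding for a double-binary recursive systematic encoder. The 8-state structure, the tap choices and the brute-force search for the circulation state are invented for illustration; they are not the 802.16e/m CTC polynomials or the standard circulation-state lookup. The point is only that encoding can start from a state that the encoder returns to at the end of the block, so no tail bits are required.

```python
# Toy illustration of circular (tail-biting) encoding for a double-binary
# recursive systematic encoder. The 8-state structure and taps are made up;
# they are NOT the 802.16e/m CTC polynomials, and the brute-force search below
# replaces the standard circulation-state lookup table.

def encode_from(state, pairs):
    """Encode the (a, b) input pairs starting from 'state'; return (output, end_state)."""
    out = []
    for a, b in pairs:                          # 2 information bits per step
        s0, s1, s2 = (state >> 2) & 1, (state >> 1) & 1, state & 1
        fb = a ^ b ^ s1 ^ s2                    # toy feedback bit
        y = fb ^ s0                             # toy parity bit 1
        w = fb ^ s0 ^ s2                        # toy parity bit 2
        out.append((a, b, y, w))                # systematic pair + 2 parity bits
        state = (fb << 2) | (s0 << 1) | s1      # shift the registers
    return out, state

def circulation_state(pairs, n_states=8):
    """Brute force: find the start state that the encoder comes back to at the end."""
    for s in range(n_states):
        if encode_from(s, pairs)[1] == s:
            return s
    raise ValueError("no circulation state for this block length")

pairs = [(1, 0), (0, 1), (1, 1), (0, 0)]        # toy data block
sc = circulation_state(pairs)
coded, end_state = encode_from(sc, pairs)
assert end_state == sc                          # same state before and after: no tail bits
```

A second constituent encoder fed with an interleaved copy of the input pairs would contribute two further parity bits per step, which is where the 6 output bits per step mentioned above come from.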


Application Information

Patent Type & Authority: Application (China)
IPC(8): H03M13/29, H03M13/23, H03M13/27
Inventor: 王臣 (Wang Chen), 周亮 (Zhou Liang), 詹明 (Zhan Ming), 曾黎黎 (Zeng Lili)
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA