
System level testing of entropy encoding

A technology relating to entropy coding and codewords, applied in the field of system-level testing of entropy coding

Active Publication Date: 2019-04-26
IBM CORP

AI Technical Summary

Problems solved by technology

Computational resources are consumed during compression, and often in the reversal (expansion) of the compression process

Method used



Examples


Embodiment Construction

[0016] Embodiments described herein provide a system-level test of entropy coding in a computer system implementing two-stage compression. One or more embodiments provide a method of testing a two-stage compression process, and its corresponding expansion, using pseudo-random test generation. The two-stage compression / expansion process may include Lempel-Ziv type coding as the first stage and entropy coding based on Huffman compression techniques as the second stage. According to one or more embodiments, compression and expansion are the two main components used to test entropy coding. For compression, a Huffman tree is generated for all possible combinations based on the input data, and a Symbol Translation Table (STT) with left-justified codewords is built. Additionally, an Entropy Encoding Descriptor (EED) defining the Huffman tree is generated that includes an indication of the number of bits in the STT (used during compression) and the number of input bits for symbol ind...
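
To make the data structures named above concrete, the following Python sketch derives Huffman code lengths from symbol frequencies and packages them as an EED (a count of codewords per bit length) and an STT of left-justified codewords. It is an illustration under assumptions made here: the function names, the 16-bit codeword field, and the canonical code assignment are choices for clarity, not details taken from the patent.

    # Hypothetical sketch: derive Huffman code lengths from symbol frequencies,
    # then build an EED (codeword count per bit length) and an STT that maps
    # each symbol to a left-justified codeword.
    import heapq
    from collections import Counter

    def code_lengths(freqs):
        """Return {symbol: Huffman code length} for the given frequency table."""
        if len(freqs) == 1:                      # degenerate single-symbol case
            return {next(iter(freqs)): 1}
        heap = [(f, i, [s]) for i, (s, f) in enumerate(freqs.items())]
        heapq.heapify(heap)
        lengths = {s: 0 for s in freqs}
        tie = len(heap)
        while len(heap) > 1:
            f1, _, syms1 = heapq.heappop(heap)
            f2, _, syms2 = heapq.heappop(heap)
            for s in syms1 + syms2:
                lengths[s] += 1                  # each merge deepens its member symbols
            heapq.heappush(heap, (f1 + f2, tie, syms1 + syms2))
            tie += 1
        return lengths

    def build_eed_and_stt(freqs, word_bits=16):
        """Return (eed, stt): eed[L] counts codewords of length L; stt maps
        symbol -> (left-justified codeword, code length)."""
        lengths = code_lengths(freqs)
        eed = [0] * (max(lengths.values()) + 1)
        for L in lengths.values():
            eed[L] += 1
        stt, code, prev_len = {}, 0, 0
        # canonical assignment: shorter codes first, ties broken by symbol order
        for sym, L in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
            code <<= (L - prev_len)
            stt[sym] = (code << (word_bits - L), L)   # left-justify in a fixed field
            code, prev_len = code + 1, L
        return eed, stt

    if __name__ == "__main__":
        eed, stt = build_eed_and_stt(Counter(b"abracadabra"))
        print("EED:", eed)                       # e.g. [0, 1, 0, 4]
        for sym, (cw, L) in sorted(stt.items()):
            print(chr(sym), format(cw, "016b")[:L])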



Abstract

An aspect includes receiving a symbol translation table (STT) that includes input symbols and their corresponding codewords. An entropy encoding descriptor (EED) that specifies how many of the codewords have each of the different lengths is also received. Contents of one or both of the STT and the EED are modified to generate a test case, and an entropy encoding test is executed. The executing includes performing a lossless data compression process based on contents of an input data string that includes one or more of the input symbols, and on contents of the STT and the EED; or performing a data expansion process based on contents of an input data string that includes one or more of the codewords, and on contents of the STT and the EED. A result of the entropy encoding test is compared to an expected result.
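
The test flow in the abstract (perturb the STT and/or EED to form a test case, run compression or expansion on an input string, and compare against an expected result) can be outlined as below. This is a minimal sketch under assumptions: the mutation choices, the function names, and the use of a software reference encoder to produce the expected result are illustrative, not details from the patent.

    # Illustrative test-driver sketch (names and mutation strategy are assumptions):
    # perturb the STT/EED to form a test case, run the encoder under test, and
    # compare its output with an independently computed expected result.
    import copy
    import random

    def make_test_case(stt, eed, rng, word_bits=16):
        """Return a perturbed copy of (stt, eed) for one pseudo-random test case."""
        stt, eed = copy.deepcopy(stt), list(eed)
        if stt and rng.random() < 0.5:
            sym = rng.choice(sorted(stt))
            codeword, length = stt[sym]
            stt[sym] = (codeword ^ (1 << rng.randrange(word_bits)), length)  # flip one codeword bit
        elif len(eed) > 1:
            i = rng.randrange(1, len(eed))
            eed[i] = max(0, eed[i] + rng.choice([-1, 1]))                    # skew a length count
        return stt, eed

    def run_entropy_test(encode_under_test, reference_encode, data, stt, eed, seed=0):
        """Execute one entropy-encoding test and compare actual vs. expected output."""
        rng = random.Random(seed)
        test_stt, test_eed = make_test_case(stt, eed, rng)
        actual = encode_under_test(data, test_stt, test_eed)
        expected = reference_encode(data, test_stt, test_eed)
        return actual == expected, actual, expected

How the expected result is obtained (a reference model, precomputed data, or a compress-then-expand round trip) is left open here; the essential step is the final comparison of actual versus expected output.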

Description

Background [0001] The present invention relates generally to computer systems, and more particularly to system-level testing of entropy coding. [0002] In signal processing, data compression involves reducing the size of a data file by encoding information so that it uses fewer bits than the original representation of the information. Compression is performed using lossy or lossless compression. Lossless compression reduces bits by identifying and removing statistical redundancies, and no information is lost when lossless compression is performed. In contrast, lossy compression reduces bits by removing unnecessary or less important information, and that information may be lost when lossy compression is performed. Data compression is useful because it reduces the resources required to store and transmit data. Computational resources are consumed during the compression process, and often in the reversal (expansion) of the compression process. The design of a data compre...

Claims


Application Information

IPC(8): H03M7/40, H03M7/30
CPC: H03M7/4037, H03M7/6082, H03M7/6076, H03M7/4031, H03M7/3088, H03M7/40
Inventor: A. Duale, D. Wittig, S. Gami
Owner: IBM CORP