
Deep learning feature compression and decompression method, system and terminal

A deep learning feature compression technology applied in the fields of image processing and computer vision, which addresses the absence of comparable existing techniques or reports and achieves the effects of saving transmission bandwidth resources and providing flexibility.

Active Publication Date: 2021-07-23
SHANGHAI UNIV

AI Technical Summary

Problems solved by technology

[0004] At present, no description or report of technology similar to the present invention has been found, and no comparable data have been collected either domestically or abroad.



Examples


Embodiment Construction

[0060] The embodiments of the present invention are described in detail below. This embodiment is implemented on the premise of the technical solution of the present invention and provides detailed implementation methods and specific operating procedures. It should be noted that those skilled in the art can make several modifications and improvements without departing from the concept of the present invention, all of which fall within the protection scope of the present invention.

[0061] Figure 1 is a flow chart of the deep learning feature compression and corresponding decompression method provided by an embodiment of the present invention.

[0062] As shown in Figure 1, an embodiment of the present invention provides a deep learning feature compression method, which may include the following steps:

[0063] On the encoding side:

[0064] S100, performing a compact space transformation on the input original features to obtain a compact feature expression...
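The text available here does not specify how the compact space transformation of S100 is realized. As a minimal sketch, assuming it is a learned 1x1 convolution that reduces the channel dimension of the original feature map (the channel sizes below are hypothetical), it could look like this:

```python
import torch
import torch.nn as nn

class CompactTransform(nn.Module):
    """Sketch of S100: map the original C-channel features to a smaller,
    more compact channel space with a learned 1x1 convolution."""

    def __init__(self, in_channels: int = 256, compact_channels: int = 64):
        super().__init__()
        self.reduce = nn.Conv2d(in_channels, compact_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.reduce(x)

# Example: backbone features of shape (batch, 256, 56, 56)
original = torch.randn(1, 256, 56, 56)
compact = CompactTransform()(original)   # -> (1, 64, 56, 56)
```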



Abstract

The invention provides a deep learning feature compression and decompression method, system and terminal. At the encoding end, the method performs a compact space transformation on the input original features to obtain a compact feature expression of the original features; computes an importance coefficient for each channel of the features from the compact feature expression and adaptively allocates a quantization parameter to each channel; performs non-uniform quantization of the different channels based on the allocated quantization parameters to obtain a quantized multi-channel feature map; and performs feature coding on the quantized multi-channel feature map to complete compression. The decoding end performs decoding to obtain an inversely quantized multi-channel feature map, applies adaptive performance compensation to the different quantization levels of the inversely quantized feature map, and reconstructs the original features from the compensated multi-channel feature map to complete feature decompression. Because the encoding end adaptively allocates quantization parameters based on the image content and the decoding end adaptively performs performance compensation based on the quantization level, the performance of non-uniform quantization for feature compression is improved.
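As a rough illustration of the encoder and decoder steps summarized above, the following Python/NumPy sketch quantizes each channel of a feature map with a bit depth allocated from a per-channel importance coefficient and inversely quantizes it at the decoder. The importance measure (normalized mean absolute activation), the bit-depth range and the compensation hook are illustrative assumptions, not details disclosed in the patent text shown here.

```python
import numpy as np

def compress_features(features, qp_min=2, qp_max=8):
    """Encoder-side sketch: importance-driven per-channel quantization.

    features: (C, H, W) float feature map.
    The importance coefficient is assumed to be each channel's normalized
    mean absolute activation; the actual formula is not disclosed here.
    """
    c = features.shape[0]
    importance = np.abs(features).reshape(c, -1).mean(axis=1)
    importance /= importance.max() + 1e-12

    # Adaptive allocation: more important channels receive more bits.
    bits = np.round(qp_min + importance * (qp_max - qp_min)).astype(int)

    quantized, params = [], []
    for ch, b in zip(features, bits):
        levels = 2 ** int(b) - 1
        offset, scale = ch.min(), (ch.max() - ch.min()) / max(levels, 1)
        q = np.round((ch - offset) / (scale + 1e-12)).astype(np.uint16)
        quantized.append(q)          # would be entropy-coded in practice
        params.append((offset, scale, int(b)))
    return quantized, params

def decompress_features(quantized, params, compensation=None):
    """Decoder-side sketch: inverse quantization plus an optional
    per-quantization-level compensation function (left abstract here)."""
    recon = []
    for q, (offset, scale, b) in zip(quantized, params):
        ch = q.astype(np.float32) * scale + offset
        if compensation is not None:
            ch = compensation(ch, b)
        recon.append(ch)
    return np.stack(recon)

# Usage on a random 64-channel feature map:
feats = np.random.randn(64, 28, 28).astype(np.float32)
code, params = compress_features(feats)
recon = decompress_features(code, params)
```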

Description

Technical Field

[0001] The present invention relates to the technical fields of image processing and computer vision, and in particular to a deep learning feature compression and decompression method, system and terminal based on adaptive quantization allocation and compensation.

Background Technique

[0002] Traditional image compression technology is designed around human visual characteristics. With the superior performance of deep neural networks in various machine vision tasks, such as image classification, object detection and semantic segmentation, a large number of machine-vision-oriented AI applications have emerged. To ensure that the performance of machine vision tasks is not degraded by the image compression process, the Analysis-Then-Compression mode is adopted to meet the needs of machine vision: feature extraction is performed directly on the lossless image by a neural network at the image acquisition end, and the features are then encoded and tr...
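For context, the following schematic (not taken from the patent) contrasts how the Analysis-Then-Compression mode reorders the traditional pipeline; `codec`, `backbone`, `feature_codec` and `task_head` are placeholder callables standing in for an image codec, a feature extractor, a feature codec and a task network.

```python
def compression_then_analysis(image, codec, backbone, task_head):
    """Traditional mode: compress/decompress the image for human viewing,
    then run the machine vision task on the reconstructed image."""
    bitstream = codec.encode(image)
    decoded = codec.decode(bitstream)
    return task_head(backbone(decoded))

def analysis_then_compression(image, backbone, feature_codec, task_head):
    """ATC mode: extract features from the lossless image at the acquisition
    end, compress and transmit only the features, analyze at the server."""
    features = backbone(image)                  # acquisition end
    bitstream = feature_codec.encode(features)  # feature compression (this invention)
    recovered = feature_codec.decode(bitstream) # server end
    return task_head(recovered)
```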


Application Information

IPC(8): H04N19/42, H04N19/189, H04N19/124, G06N3/08, G06N3/04
CPC: H04N19/42, H04N19/189, H04N19/124, G06N3/04, G06N3/084
Inventor: 安平, 王维茜
Owner: SHANGHAI UNIV