
Self-adaptive CU splitting decision-making method based on deep learning and multi-feature fusion

A self-adaptive CU splitting decision method that combines deep learning with multi-feature fusion, applied in the field of image processing. It addresses the problem of high coding complexity, reducing computational complexity and saving coding time.

Active Publication Date: 2020-07-17
ZHENGZHOU UNIVERSITY OF LIGHT INDUSTRY

AI Technical Summary

Problems solved by technology

[0007] In view of the deficiencies in the background art described above, the present invention proposes a self-adaptive CU splitting decision method based on deep learning and multi-feature fusion, which combines the two to solve the technical problem of high coding complexity.




Embodiment Construction

[0049] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0050] As shown in Figure 1, the embodiment of the present invention provides a self-adaptive CU splitting decision method based on deep learning and multi-feature fusion. First, the standard deviation SD is used to measure the texture complexity of the CU; then a threshold model, expressed as a function of the quantization parameter QP and the CU depth and designed to improve segmentation accuracy, is established to distinguish complex CUs from uniform CUs; an adaptive CNN s...
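The first two steps above (SD-based texture complexity and a QP/depth threshold) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the coefficients `a` and `b` in `split_threshold` are hypothetical placeholders, since the patent's fitted threshold model is not given in this excerpt.

```python
import numpy as np

def texture_sd(cu: np.ndarray) -> float:
    """Texture complexity of a CU, measured as the standard
    deviation (SD) of its luma samples; higher SD means a more
    complex texture."""
    return float(np.std(cu))

def split_threshold(qp: int, depth: int, a: float = 2.0, b: float = 0.5) -> float:
    """Hypothetical threshold as a function of QP and CU depth.
    a and b are illustrative constants, not the patent's values."""
    return a * qp / (depth + 1) + b

def classify_cu(cu: np.ndarray, qp: int, depth: int) -> str:
    """Label a CU 'complex' (a splitting candidate) or 'uniform'
    (skipped, saving the RDO check)."""
    return "complex" if texture_sd(cu) > split_threshold(qp, depth) else "uniform"
```

A flat 8x8 block has SD 0 and is classified as uniform; a high-contrast block exceeds the threshold and is classified as complex.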



Abstract

The invention provides a self-adaptive CU splitting decision method based on deep learning and multi-feature fusion. The method comprises the following steps: firstly, the texture complexity SD of the current CU is calculated via the standard deviation, a threshold model is built as a function of the quantization parameter and the CU depth, and the current CU is classified as either a complex CU or a uniform CU; secondly, if the complex CU is an edge CU, a CNN structure based on multi-feature fusion judges whether it is split; otherwise, a self-adaptive CNN structure makes the split decision. By combining deep learning with multi-feature fusion, the method addresses the problem of coding complexity. The multi-feature-fusion CNN structure and the self-adaptive CNN structure can successfully process the training samples, and rate-distortion optimization (RDO) calculation over all CUs and complex CUs is avoided, so that computational complexity is reduced and coding time is saved.
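The overall decision flow described in the abstract can be sketched as below. This is an assumption-laden outline: `fusion_cnn` and `adaptive_cnn` stand in for the trained networks (any callable returning a boolean works here), and the default `threshold_fn` is an illustrative placeholder for the patent's QP/depth threshold model.

```python
import numpy as np

def cu_split_decision(cu, qp, depth, is_edge_cu, fusion_cnn, adaptive_cnn,
                      threshold_fn=lambda qp, depth: 2.0 * qp / (depth + 1)):
    """Sketch of the decision flow from the abstract:
      - uniform CU (SD at or below the threshold): do not split,
        skipping the RDO check entirely;
      - complex CU lying on an edge: the multi-feature-fusion CNN
        decides the split;
      - any other complex CU: the self-adaptive CNN decides."""
    if float(np.std(cu)) <= threshold_fn(qp, depth):
        return False                 # uniform CU: no split
    if is_edge_cu:
        return bool(fusion_cnn(cu))  # complex edge CU
    return bool(adaptive_cnn(cu))    # complex non-edge CU
```

In use, a flat CU is never split regardless of the networks' outputs, while a complex CU is routed to one of the two CNNs depending on whether it is an edge CU.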

Description

Technical field

[0001] The present invention relates to the technical field of image processing, in particular to a self-adaptive CU splitting decision method based on deep learning and multi-feature fusion.

Background technique

[0002] With higher demands on video compression, it becomes more important to develop more efficient video coding standards. JVET has developed the next-generation video coding standard, H.266/VVC. The H.266/VVC Test Model (VTM) implements many novel techniques that significantly improve coding efficiency. H.266/VVC uses a quadtree with nested multi-type tree (QTMT) coding block architecture for block partitioning, which shows better coding performance but results in enormous computational complexity, possibly 5 times that of HEVC. H.266/VVC also contains 67 intra-frame prediction modes for intra prediction, of which the planar mode and DC mode are the same as in H.265/HEVC. The prediction modes become denser, so more accurate prediction can ...
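The QTMT architecture mentioned above allows five split types per CU, which is what makes exhaustive RDO so expensive. A minimal sketch of those split types and the sub-CU sizes they produce (the split names and 1:2:1 ternary geometry are standard H.266/VVC facts; the helper function itself is illustrative):

```python
from enum import Enum

class Split(Enum):
    """The five split types of the H.266/VVC quadtree with nested
    multi-type tree (QTMT) block partitioning."""
    QT = "quadtree"              # 4 equal square sub-blocks
    BT_H = "binary horizontal"   # 2 halves, split horizontally
    BT_V = "binary vertical"     # 2 halves, split vertically
    TT_H = "ternary horizontal"  # 1:2:1 horizontal split
    TT_V = "ternary vertical"    # 1:2:1 vertical split

def child_sizes(w: int, h: int, split: Split):
    """(width, height) of the sub-CUs produced by each split type."""
    if split is Split.QT:
        return [(w // 2, h // 2)] * 4
    if split is Split.BT_H:
        return [(w, h // 2)] * 2
    if split is Split.BT_V:
        return [(w // 2, h)] * 2
    if split is Split.TT_H:
        return [(w, h // 4), (w, h // 2), (w, h // 4)]
    return [(w // 4, h), (w // 2, h), (w // 4, h)]
```

Because every complex CU can in principle try all five splits recursively, pruning uniform CUs early (as the invention does) removes entire subtrees from the RDO search.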

Claims


Application Information

IPC(8): G06T7/40, G06T7/136, G06T7/13, G06K9/62, G06N3/04
CPC: G06T7/40, G06T7/13, G06T7/136, G06N3/045, G06F18/241, G06F18/253, G06F18/214, Y02D10/00
Inventor 赵进超张秋闻王兆博王祎菡崔腾耀赵永博郭睿骁王晓蒋斌黄立勋张伟伟钱晓亮吴庆岗常化文魏涛孙丽君
Owner ZHENGZHOU UNIVERSITY OF LIGHT INDUSTRY