
Video compression method based on deep neural network

A deep-neural-network-based video compression technology, applied in the field of video coding, that achieves good scalability

Active Publication Date: 2017-11-24
NANJING UNIV

AI Technical Summary

Problems solved by technology

Raw video data stored frame by frame in YUV420 format is so large that no existing wired network can sustain real-time transmission of such uncompressed video content.
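To put this claim in numbers, the short sketch below computes the raw data rate of an uncompressed YUV420 stream; the 1080p30 resolution and frame rate are illustrative assumptions, not figures from the patent.

```python
# Back-of-the-envelope raw bitrate of uncompressed YUV420 video.
# YUV420 stores 12 bits per pixel: Y at full resolution, U and V subsampled 2x2.
width, height, fps = 1920, 1080, 30          # assumed 1080p30 source

bytes_per_frame = width * height * 3 // 2    # 12 bits/pixel
mbit_per_sec = bytes_per_frame * fps * 8 / 1e6

print(f"{bytes_per_frame / 1e6:.1f} MB per frame, {mbit_per_sec:.0f} Mbit/s raw")
# -> 3.1 MB per frame, 746 Mbit/s -- hence the need for strong compression
```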




Detailed Description of the Embodiments

[0027] To make the purpose, technical solution, and advantages of the present invention clearer, the implementation of the present invention is described in further detail below in conjunction with the accompanying drawings.

[0028] The video compression method based on a deep neural network of this embodiment comprises the following steps:

[0029] (1) First, collect and organize the required high-definition images (including the Kodak lossless image library, the ImageNet image library, etc.) into a standardized video image data set, and construct the neural network training set, test set, and cross-validation set.
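A minimal sketch of this splitting step is shown below. The 80/10/10 ratio and the file list are assumptions for illustration; the patent names Kodak and ImageNet as sources but does not fix the split.

```python
import random

def split_dataset(paths, train=0.8, test=0.1, seed=0):
    """Shuffle image paths and cut them into train / test / cross-validation sets."""
    rng = random.Random(seed)
    paths = paths[:]                          # keep the caller's list intact
    rng.shuffle(paths)
    n_train = int(len(paths) * train)
    n_test = int(len(paths) * test)
    return (paths[:n_train],                  # training set
            paths[n_train:n_train + n_test],  # test set
            paths[n_train + n_test:])         # cross-validation set

# Hypothetical file names, standing in for the organized image data set.
train_set, test_set, val_set = split_dataset(
    [f"img_{i:05d}.png" for i in range(5000)])
```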

[0030] (2) Establish a multi-layer prediction neural network and a residual neural network: divide each image into non-overlapping M×N blocks, and train the video coding prediction model in two main modes, intra-frame prediction and inter-frame prediction.
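The block partition can be sketched as follows. Padding frames whose dimensions are not multiples of the block size is an assumption here, as is the 32×32 block size; the excerpt does not specify either.

```python
import numpy as np

def partition_blocks(frame: np.ndarray, m: int = 32, n: int = 32) -> np.ndarray:
    """Cut a single-channel frame into non-overlapping M x N blocks."""
    h, w = frame.shape[:2]
    pad_h, pad_w = (-h) % m, (-w) % n         # pad so dimensions divide evenly
    frame = np.pad(frame, ((0, pad_h), (0, pad_w)), mode="edge")
    h, w = frame.shape[:2]
    blocks = frame.reshape(h // m, m, w // n, n).swapaxes(1, 2)
    return blocks.reshape(-1, m, n)           # one row per M x N block

luma = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # dummy Y plane
print(partition_blocks(luma).shape)           # -> (2040, 32, 32)
```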

[0031] (3) For the inter-frame prediction mode, use the motion estimation ...
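The excerpt breaks off here; per the abstract, this step searches for the optimal matching block and computes the residual and its mean square error. A minimal full-search sketch follows; the exhaustive search strategy and the ±16-pixel window are illustrative choices, as the excerpt does not name a specific algorithm.

```python
import numpy as np

def motion_estimate(block, ref, top, left, r=16):
    """Full-search block matching: return motion vector, residual block, and MSE."""
    m, n = block.shape
    blk = block.astype(np.float64)
    best_mv, best_res, best_mse = (0, 0), None, np.inf
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            y, x = top + dy, left + dx
            if not (0 <= y <= ref.shape[0] - m and 0 <= x <= ref.shape[1] - n):
                continue                      # candidate outside the reference frame
            cand = ref[y:y + m, x:x + n].astype(np.float64)
            mse = np.mean((blk - cand) ** 2)
            if mse < best_mse:
                best_mv, best_res, best_mse = (dy, dx), blk - cand, mse
    return best_mv, best_res, best_mse
```

Per the abstract, the residual returned here then serves as new training data for the residual coding network.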



Abstract

The invention discloses a video compression method based on a deep neural network. The method includes the following steps: a video image data set is collected and organized, and a neural network training set, a test set, and a cross-validation set are constructed; a multi-layer deep neural network is set up; for inter-frame prediction, a motion estimation algorithm searches for the optimal matching block, and the residuals and the mean square error of inter-frame prediction are calculated; the residuals obtained after prediction are used as new training data to train a residual coding network, where the residual network model covers both intra-frame and inter-frame residuals; the prediction data, together with the output of the residual neural network after quantization and lossless entropy coding, forms the compressed fixed-length code stream; the decoding terminal restores the compressed data through a neural network symmetric with the coding terminal, and the compressed image is reconstructed and recovered. Compared with the traditional H.264 video coding method on a large set of test video sequences, the method saves about 26% of the bit rate at equal quality.
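The quantize-then-entropy-code stage named in the abstract can be illustrated with a small sketch. The uniform quantizer step size and the ideal-entropy code-length estimate below are stand-ins for the patent's unspecified quantizer and lossless entropy coder.

```python
import numpy as np

def quantize(features: np.ndarray, step: float = 0.1) -> np.ndarray:
    """Uniform scalar quantization of a feature map to integer symbols."""
    return np.round(features / step).astype(np.int32)

def entropy_bits(symbols: np.ndarray) -> float:
    """Total bits an ideal lossless entropy coder would need for these symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / symbols.size
    return float(symbols.size * -(p * np.log2(p)).sum())

feats = np.random.randn(64, 8, 8)             # dummy residual-network output
q = quantize(feats)
print(f"~{entropy_bits(q) / feats.size:.2f} bits per symbol after quantization")
```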

Description

Technical field

[0001] The invention relates to the field of video coding, and in particular to a video compression method based on a deep neural network.

Background technique

[0002] In recent years, artificial neural networks have developed to the stage of deep learning. Deep learning attempts to perform high-level abstraction of data using a series of algorithms built from complex structures, or from multiple processing layers composed of multiple nonlinear transformations. Its powerful expressive ability allows it to achieve state-of-the-art results in a variety of machine learning tasks, and its performance on video and image processing currently exceeds that of other methods.

[0003] Deep learning uses the idea of hierarchical abstraction: high-level concepts are learned through low-level concepts. This hierarchical structure is usually built with a greedy layer-by-layer training algorithm, from which features that are effective for the learning task are selected. Many ...

Claims


Application Information

IPC(8): H04N19/42; H04N19/503; H04N19/124; H04N19/91; G06N3/04; G06N3/08
CPC: H04N19/124; H04N19/42; H04N19/503; H04N19/91; G06N3/08; G06N3/045
Inventors: 马展 (Ma Zhan), 陈彤 (Chen Tong), 刘浩杰 (Liu Haojie)
Owner: NANJING UNIV