
Signal processing method based on deep neural network

A deep-neural-network signal-processing technology, applied in the field of video compression that combines optical flow information and depth information for frame prediction

Active Publication Date: 2021-01-08
苏州天必佑科技有限公司
Cites 4 · Cited by 6

AI Technical Summary

Problems solved by technology

Still, it is worth noting that VVC's coding improvements may come at the cost of a multiplicative increase in encoding/decoding complexity.



Examples


Embodiment Construction

[0032] The specific embodiments of the present invention are further described below in conjunction with the drawings and examples. The following examples only illustrate the technical solution of the present invention more clearly; they do not limit its protection scope.

[0033] As shown in Figures 1 to 3, the specific technical scheme of the present invention is as follows:

[0034] 1. Build the development environment: Python 3.6 + PyTorch 1.4 + CUDA 9.0 + cuDNN 7.0.

[0035] 2. Download and preprocess the training data set. The training set uses Vimeo-90K; the data set reaches 80 GB and consists of 89,800 video clips downloaded from vimeo.com, covering a large number of scenes and actions. It is mainly used for four video processing tasks: temporal frame interpolation, video denoising, video deblocking, and video super-resolution.

[0036] 3. Establish a video compression projec...
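The dataset handling in step 2 can be sketched as a small loader. Vimeo-90K distributes clips as `sequence/clip` folders with frames named `im1.png` through `im7.png`, listed in split files such as `sep_trainlist.txt`; the exact paths and file names below are illustrative assumptions, not part of the patent.

```python
import os

def load_clip_list(list_path):
    # Parse a Vimeo-90K-style split file: one "sequence/clip" entry per line.
    with open(list_path) as f:
        return [line.strip() for line in f if line.strip()]

def clip_frame_paths(root, clip, n_frames=7, ext="png"):
    # Septuplet clips store frames as im1.png ... im7.png inside the clip folder.
    return [os.path.join(root, clip, f"im{i}.{ext}") for i in range(1, n_frames + 1)]
```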



Abstract

The invention discloses a signal processing method based on a deep neural network. The method first divides video frames into key frames and non-key frames according to a threshold on the mean squared error between the current frame and the previous frame, and trains compression network models for both. For a key frame, an auto-encoder with a context- and hyperprior-based entropy model performs intra-frame prediction. For a non-key frame, optical flow information and depth information are extracted and combined to generate motion information for frame reconstruction; the residual between the reconstructed frame and the real frame is then extracted and coded, and the decoding end finally generates the current frame from the transmitted motion and residual information combined with the previous frame. This end-to-end video compression method makes full use of the strong nonlinear expressive power of deep neural networks and the advantages of joint training, and its compression performance exceeds H.264.
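The key/non-key split described in the abstract can be sketched with a simple MSE threshold rule. The threshold value and frame representation here are assumptions for illustration, not figures from the patent.

```python
import numpy as np

def is_key_frame(prev_frame, cur_frame, threshold):
    # Classify cur_frame as a key frame when its mean squared error against
    # the previous frame exceeds the threshold (large motion / scene change).
    diff = cur_frame.astype(np.float64) - prev_frame.astype(np.float64)
    return float(np.mean(diff ** 2)) > threshold

def split_frames(frames, threshold):
    # The first frame is always a key frame; each later frame is
    # labeled by comparing it to its immediate predecessor.
    labels = ["key"]
    for prev, cur in zip(frames, frames[1:]):
        labels.append("key" if is_key_frame(prev, cur, threshold) else "non-key")
    return labels
```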

Description

technical field

[0001] The invention relates to the field of video compression, in particular to a video compression method that combines optical flow information and depth information for frame prediction.

Background technique

[0002] Image/video coding generally refers to computing techniques that compress images/videos into binary codes for storage and transmission. Compression can be divided into lossless coding and lossy coding according to whether the image/video can be reconstructed perfectly from the bits. For natural images/videos, the compression efficiency of lossless coding is usually lower than required, so most work focuses on lossy coding. Lossy image/video coding solutions are mainly evaluated from two aspects: one is compression efficiency, usually measured by the number of bits (coding rate), the lower the better; the other is reconstruction quality, measured by how close the reconstruction is to the original image/video, the higher the better.

[0003] Image/v...
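The two evaluation axes described above, coding rate and reconstruction quality, are conventionally measured as bits per pixel and PSNR. A minimal sketch of both metrics:

```python
import numpy as np

def psnr(original, reconstructed, max_val=255.0):
    # Peak signal-to-noise ratio in dB; higher means better reconstruction quality.
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    mse = float(np.mean(diff ** 2))
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

def bits_per_pixel(n_bits, width, height, n_frames=1):
    # Coding rate: total bitstream size divided by the number of coded pixels.
    return n_bits / (width * height * n_frames)
```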

Claims


Application Information

Patent Timeline
no application
IPC(8): H04N19/147; H04N19/159; H04N19/172; H04N19/42; H04N19/85; H04N19/91; G06T9/00; G06T7/269
CPC: H04N19/159; H04N19/172; H04N19/42; H04N19/85; H04N19/91; H04N19/147; G06T9/002; G06T7/269; G06T2207/10016; G06T2207/20081; G06T2207/20084
Inventor: 侯兴松, 李瑞敏
Owner 苏州天必佑科技有限公司