
360-degree panoramic video coding method based on motion attention model

A 360-degree panoramic video coding method based on a motion attention model, applied in the field of panoramic video coding, which addresses the problem that existing coding schemes cannot reduce the compression bit rate of 360-degree panoramic video.

Active Publication Date: 2019-11-22
PLEX VR DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] As users' requirements for the authenticity of virtual reality continue to rise, current common video coding schemes can no longer reduce the compression bit rate of 360-degree panoramic video while maintaining the same subjective quality.


Examples


Embodiment Construction

[0040] The present invention will now be further described in conjunction with the accompanying drawings.

[0041] Referring to Figure 1, which shows the HEVC coding framework of an embodiment of the present invention. The purpose of this embodiment is to provide a 360-degree panoramic video coding method based on a motion attention model, in which the attention model is added to the coding framework. The method mainly includes the following steps:

[0042] Step 1: Extract motion vectors to obtain a motion vector field, and calculate the reliability of each motion vector.

[0043] In this embodiment, motion vectors are extracted from the HEVC reference encoder HM16.0 to obtain a motion vector field. Referring to Figure 2, each single arrow represents a motion vector, i.e., the relative displacement of a coding block with respect to its reference frame within a certain search range; the motion vectors densely distributed over the video together constitute the motion vector field.
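The visible text does not give the exact reliability formula or the filter used in the subsequent pre-processing step, so the following Python sketch is only a rough illustration: it assumes a blockwise motion vector field (one (dx, dy) pair per coding unit, e.g., exported from HM16.0), uses a hypothetical agreement-with-neighbours reliability measure, and applies the reliability-weighted filtering mentioned in the abstract.

```python
import numpy as np

def motion_vector_reliability(mv_field):
    """Per-block reliability of a motion vector field of shape (H, W, 2).

    NOTE: the agreement-with-neighbours measure used here is an illustrative
    assumption; the patent text shown does not give the exact formula.
    """
    H, W, _ = mv_field.shape
    reliability = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - 1), min(H, y + 2)
            x0, x1 = max(0, x - 1), min(W, x + 2)
            neighbours = mv_field[y0:y1, x0:x1].reshape(-1, 2)
            # Vectors that agree with their local neighbourhood are treated as reliable.
            mean_diff = np.linalg.norm(neighbours - mv_field[y, x], axis=1).mean()
            reliability[y, x] = 1.0 / (1.0 + mean_diff)
    return reliability


def reliability_weighted_filter(mv_field, reliability):
    """Reliability-weighted 3x3 smoothing of the motion vector field,
    corresponding to the noise-reducing pre-processing step."""
    H, W, _ = mv_field.shape
    filtered = np.zeros_like(mv_field, dtype=float)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - 1), min(H, y + 2)
            x0, x1 = max(0, x - 1), min(W, x + 2)
            w = reliability[y0:y1, x0:x1]
            v = mv_field[y0:y1, x0:x1]
            # Unreliable (noisy) vectors contribute less to the smoothed field.
            filtered[y, x] = (w[..., None] * v).sum(axis=(0, 1)) / w.sum()
    return filtered
```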

[0044] S...



Abstract

The present invention relates to a 360-degree panoramic video coding method based on a motion attention model, comprising: extracting motion vectors to obtain a motion vector field and calculating the reliability of each motion vector; performing reliability-weighted filtering pre-processing according to the reliability to reduce noise; performing global motion compensation on the motion vector field; constructing a motion attention model to obtain the motion attention of each coding block; and adaptively allocating codewords according to the motion attention of the coding blocks. The beneficial effects of the present invention are that the motion attention is computed on the basis of the motion vector field and therefore requires no additional computational complexity; the impact of noise is reduced; and the motion attention model jointly considers the strength of the motion vectors, the contrast of spatial-domain motion vectors, and the contrast of temporal-domain motion vectors. Relatively many codewords are allocated to regions of motion interest and relatively few to unchanging video regions, reducing the codewords spent on unused regions while improving the video quality of the scenes of motion interest.
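The abstract describes the attention model as combining motion vector strength with spatial-domain and temporal-domain contrast, then allocating codewords according to the resulting attention. A minimal Python sketch of that idea follows; the weights, the normalisation, and the proportional bit allocation are illustrative assumptions rather than the patent's actual formulas.

```python
import numpy as np

def motion_attention(mv_field, prev_mv_field,
                     w_mag=0.4, w_spatial=0.3, w_temporal=0.3):
    """Combine motion strength, spatial contrast and temporal contrast of a
    motion vector field (shape (H, W, 2)) into a per-block attention map.
    The weights and contrast definitions are illustrative assumptions."""
    magnitude = np.linalg.norm(mv_field, axis=-1)

    # Spatial contrast: deviation of each block's vector from its 3x3 local mean.
    padded = np.pad(mv_field, ((1, 1), (1, 1), (0, 0)), mode='edge')
    local_mean = sum(padded[dy:dy + mv_field.shape[0], dx:dx + mv_field.shape[1]]
                     for dy in range(3) for dx in range(3)) / 9.0
    spatial = np.linalg.norm(mv_field - local_mean, axis=-1)

    # Temporal contrast: change of the vector relative to the previous frame.
    temporal = np.linalg.norm(mv_field - prev_mv_field, axis=-1)

    def normalise(a):
        rng = a.max() - a.min()
        return (a - a.min()) / rng if rng > 0 else np.zeros_like(a)

    return (w_mag * normalise(magnitude)
            + w_spatial * normalise(spatial)
            + w_temporal * normalise(temporal))


def allocate_codewords(attention, frame_bit_budget):
    """Distribute a frame's bit budget in proportion to block attention, so that
    moving regions of interest receive more codewords than static regions."""
    weights = attention + 1e-6          # keep zero-attention blocks from starving
    return frame_bit_budget * weights / weights.sum()
```

In practice the resulting per-block budget would steer the encoder's rate control (e.g., by lowering the quantization parameter for high-attention blocks); the proportional split shown here is just one simple way to express "more bits where the motion attention is high".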

Description

Technical field

[0001] The invention relates to the technical field of 360-degree panoramic video coding, and in particular to a 360-degree panoramic video coding method based on a motion attention model.

Background technique

[0002] Traditional live broadcasting brings real-time enjoyment of an event to the audience. With the addition of 360-degree panoramic live broadcast technology, it can not only create a stronger on-site viewing atmosphere but also break through the limitation of physical seats and greatly broaden the audience. 360-degree panoramic live broadcasting can be used not only for event-based broadcasts such as concerts and sports events, but also in the medical field, real-estate on-site viewing and sales, and so on. In most cases this is an outdoor live broadcast, and in such an environment the network at the acquisition end is extremely unstable, which affects the quality of users watching 360-degree panoramic li...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04N19/513, H04N19/527, H04N19/176
CPC: H04N19/176, H04N19/513, H04N19/527
Inventor: 虞晶怡, 胡强
Owner: PLEX VR DIGITAL TECH CO LTD