A Fast Multi-View Depth Video Coding Method

A depth video fast coding technology, applied in digital video signal modification, electrical components, image communication, etc., which addresses problems such as the increased complexity of coding algorithms, the limited texture detail of depth video, and the resulting obstacle to real-time application of multi-view video systems.

Active Publication Date: 2016-06-08
NINGBO UNIV

AI Technical Summary

Problems solved by technology

However, the complex prediction relationships make multi-view video coding under the HBP (hierarchical B picture) prediction structure highly complex.
The coding complexity under the HBP prediction structure mainly comes from the B frames. In JMVC, every macroblock of a B frame must traverse the SKIP mode, the intra prediction modes and the inter prediction modes, and the rate-distortion optimization technique is then used to select the optimal coding mode. Because each inter prediction mode requires complex motion estimation over multiple reference frames with bidirectional search, this further increases the complexity of an already complex coding algorithm, which is very unfavorable for the real-time application of multi-view video systems and runs counter to the low-latency, low-complexity requirements placed on the coding algorithm.
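To make the cost of this exhaustive search concrete, the following minimal sketch (in Python, for illustration only; JMVC itself is C++ and is not quoted here) mimics a rate-distortion optimized mode decision for one macroblock of a B frame. The mode list, the lambda value and the dummy evaluator are assumptions for the example, not JMVC internals.

```python
# Illustrative sketch (not JMVC source code): exhaustive rate-distortion
# optimized mode decision for one macroblock of a B frame. Every candidate
# mode is fully evaluated; inter modes additionally pay for multi-reference,
# bidirectional motion estimation, which dominates the run time.
import random

CANDIDATE_MODES = [
    "SKIP",                                                   # direct/skip, cheapest
    "INTER_16x16", "INTER_16x8", "INTER_8x16", "INTER_8x8",   # motion compensated
    "INTRA_16x16", "INTRA_4x4",                               # spatial prediction
]

def rd_cost(distortion, bits, lam):
    """Lagrangian rate-distortion cost J = D + lambda * R."""
    return distortion + lam * bits

def choose_mode(macroblock, evaluate_mode, lam):
    """Return the mode with the smallest RD cost.

    `evaluate_mode(macroblock, mode)` is assumed to return (distortion, bits);
    for inter modes a real encoder would run motion estimation over multiple
    reference frames in both prediction directions inside this call.
    """
    best_mode, best_cost = None, float("inf")
    for mode in CANDIDATE_MODES:          # the exhaustive traversal described above
        distortion, bits = evaluate_mode(macroblock, mode)
        cost = rd_cost(distortion, bits, lam)
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode, best_cost

# Toy demonstration with a dummy evaluator (repeatable random costs).
random.seed(0)
dummy_eval = lambda mb, mode: (random.uniform(100.0, 200.0), random.uniform(50.0, 150.0))
print(choose_mode("mb_0", dummy_eval, lam=0.85))
```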
[0004] Aiming at the high complexity of multi-view video coding, a great deal of research at home and abroad has addressed fast multi-view color video coding methods, but all of these methods were proposed for multi-view color video. Depth video has characteristics different from those of color video, and its role is not final display but assisting virtual viewpoint rendering, so existing fast multi-view color video coding methods cannot be applied directly to multi-view depth video coding.



Examples


Detailed Description of the Embodiments

[0040] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0041] The present invention proposes a fast multi-view depth video coding method. Starting from the spatial content correlation of the depth video, the temporal correlation, and the coding mode correlation of adjacent macroblocks, a coding mode complexity factor is defined for each macroblock. According to this complexity factor, the depth video is divided into simple mode regions and complex mode regions, and different fast coding mode selection strategies are applied to the different regions: macroblocks in simple mode regions search only simple coding modes, while macroblocks in complex mode regions undergo a relatively finer and more complex search process.
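The following minimal sketch illustrates the two-region idea described above: a complexity factor is computed from the modes of spatially and temporally neighbouring macroblocks, and the candidate mode list of the current macroblock is restricted accordingly. The per-mode complexity scores, the weights and the threshold are placeholders for illustration; the patent defines its own complexity factor and decision rule.

```python
# Illustrative sketch of the two-region fast mode selection idea: macroblocks
# whose neighbours were coded with cheap modes are treated as a "simple mode
# region" and search only cheap modes; the rest fall back to the full search.
# The mode scores, weights and threshold below are assumptions for the example.

MODE_COMPLEXITY = {  # hypothetical per-mode complexity scores
    "SKIP": 0, "INTER_16x16": 1, "INTER_16x8": 2, "INTER_8x16": 2,
    "INTER_8x8": 3, "INTRA_16x16": 3, "INTRA_4x4": 4,
}
SIMPLE_MODES = ["SKIP", "INTER_16x16"]          # searched in simple mode regions
ALL_MODES = list(MODE_COMPLEXITY)               # searched in complex mode regions
THRESHOLD = 1.0                                 # assumed region decision threshold

def complexity_factor(spatial_neighbour_modes, temporal_neighbour_modes,
                      w_spatial=0.6, w_temporal=0.4):
    """Weighted average of the neighbouring macroblocks' mode complexities."""
    def avg(modes):
        return sum(MODE_COMPLEXITY[m] for m in modes) / len(modes) if modes else 0.0
    return (w_spatial * avg(spatial_neighbour_modes)
            + w_temporal * avg(temporal_neighbour_modes))

def candidate_modes(spatial_neighbour_modes, temporal_neighbour_modes):
    """Pick the candidate mode set for the current macroblock."""
    if complexity_factor(spatial_neighbour_modes, temporal_neighbour_modes) <= THRESHOLD:
        return SIMPLE_MODES     # simple mode region: cheap modes only
    return ALL_MODES            # complex mode region: full mode search

# Example: neighbours are mostly SKIP-coded, so only the cheap modes are searched.
print(candidate_modes(["SKIP", "SKIP", "INTER_16x16"], ["SKIP"]))
```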

[0042] The overall implementation block diagram of the fast multi-view depth video coding method of the present invention is shown in Figure 3; it specifically includes the following steps: …



Abstract

The invention discloses a fast multi-view depth video coding method. A coding mode complexity factor is defined for each macroblock; according to this factor the macroblock is assigned to a simple mode region or a complex mode region, i.e. the depth video is divided into simple mode regions and complex mode regions, and different fast coding mode selection strategies are adopted for the different regions. For a macroblock in a simple mode region only simple coding modes are searched, whereas for a macroblock in a complex mode region complex coding modes are also searched. Time-consuming coding mode searches that contribute little to the coding of the current frame are thereby avoided. As a result, on the premise that virtual viewpoint rendering quality is preserved and the depth video coding bit rate is not affected, the computational complexity of multi-view depth video coding is effectively reduced and coding time is saved.

Description

Technical Field

[0001] The invention relates to video signal coding technology, and in particular to a fast multi-view depth video coding method.

Background Art

[0002] With the continuous development of 3D display and related technologies, multi-view video systems such as 3D TV and free-viewpoint TV have attracted increasing attention from scholars and industry at home and abroad. The 3D scene representation based on multi-view color video plus depth video (Multiview Video plus Depth, MVD) is well suited to multi-view autostereoscopic display; especially for scenes with a wide viewing-angle range and rich depth levels it provides abundant video information, and it has become the mainstream data format of multi-view video systems. In an MVD-based multi-view video system, the depth information effectively represents the geometric information of the 3D scene and reflects the relative distance from the captured scene to the camera; it is a grayscale i…
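As background to how a grayscale depth map encodes distance, the sketch below uses the inverse-depth quantization commonly adopted for MVD depth maps (an assumption here, not a formula quoted from this patent): an 8-bit grey value v in [0, 255] is mapped to a physical depth Z between the near and far clipping planes, with v = 255 at the nearest plane. The clipping-plane values in the example are placeholders.

```python
# Sketch of the inverse-depth quantization commonly used for MVD depth maps
# (an assumption here, not quoted from the patent): grey level v in [0, 255]
# maps to a physical depth Z between z_near and z_far, with v = 255 at z_near.

def grey_to_depth(v, z_near, z_far):
    """Convert an 8-bit depth-map value to a physical distance."""
    inv_z = (v / 255.0) * (1.0 / z_near - 1.0 / z_far) + 1.0 / z_far
    return 1.0 / inv_z

# Example with placeholder clipping planes: nearby objects get large grey values.
print(round(grey_to_depth(255, z_near=1.0, z_far=100.0), 2))  # -> 1.0 (nearest)
print(round(grey_to_depth(0,   z_near=1.0, z_far=100.0), 2))  # -> 100.0 (farthest)
```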

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N19/597; H04N13/00
Inventors: 彭宗举, 王叶群, 蒋刚毅, 郁梅, 陈芬
Owner: NINGBO UNIV