
Method for sensing stereoscopic video coding based on parallax just-noticeable difference model

A stereoscopic video coding technology based on a perceptual error model, applied in the field of video processing, which addresses problems such as reduced coding efficiency in coding software

Active Publication Date: 2016-02-03
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, this method needs to calculate the depth map sequence of the stereoscopic video first, or use the stereoscopic video with a known depth map, which reduces the coding efficiency of the coding software.




Embodiment Construction

[0081] As shown in Figure 1, the implementation steps of the present invention are as follows:

[0082] Step 1. Disparity estimation

[0083] 1a) Read in each frame image I_iL and I_iR corresponding to the left and right viewpoints of the binocular stereo video, respectively, and preprocess them using mean-shift color segmentation to obtain the segmented images I'_iL and I'_iR;

[0084] (1a1) Perform mean-shift filtering on each frame of the left and right viewpoints, respectively, to obtain the convergence points of all subspaces;

[0085] (1a2) Obtain the segmented regions by merging clusters of pixels that converge to the same mode under mean-shift filtering;
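The segmentation preprocessing in steps (1a1)-(1a2) can be illustrated with a crude joint spatial-range mean-shift filter followed by mode grouping. This is only a sketch of the general technique, not the patent's actual implementation; the bandwidths, iteration count, and the binning-based region merging are all illustrative choices.

```python
import numpy as np

def mean_shift_filter(img, spatial_bw=2, range_bw=10.0, n_iter=5):
    """Crude joint spatial-range mean-shift filtering (sketch).

    Each pixel is repeatedly replaced by the mean of neighbouring
    pixels whose values lie within range_bw, so pixel values drift
    toward local modes (the "subspace convergence points")."""
    h, w = img.shape
    out = img.astype(np.float64).copy()
    for _ in range(n_iter):
        new = out.copy()
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - spatial_bw), min(h, y + spatial_bw + 1)
                x0, x1 = max(0, x - spatial_bw), min(w, x + spatial_bw + 1)
                win = out[y0:y1, x0:x1]
                mask = np.abs(win - out[y, x]) <= range_bw  # range kernel
                new[y, x] = win[mask].mean()
        out = new
    return out

def segment_by_modes(filtered, range_bw=10.0):
    """Group pixels whose converged values fall into the same
    range_bw-wide bin, giving one label per region (illustrative
    stand-in for cluster merging)."""
    return np.round(filtered / range_bw).astype(int)
```

On a two-region test image with mild noise, pixels in each flat region converge to that region's mode and receive a single shared label, while the two regions stay separated.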

[0086] 1b) Perform stereo matching on I'_iL and I'_iR to obtain the disparity d(x,y) between the left and right viewpoints. The specific steps are as follows:

[0087] (1b1) Using local stereo matching, we can get:

[0088] d(x,y)=a x+...
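A generic local stereo matcher of the kind referenced in (1b1) can be sketched as winner-take-all block matching with a sum-of-absolute-differences (SAD) cost. The window size, search range, and cost function here are illustrative assumptions, not the specific matcher used by the patent.

```python
import numpy as np

def block_match_disparity(left, right, max_disp=16, block=2):
    """Winner-take-all local stereo matching with SAD cost (sketch).

    For each pixel (x, y) of the left view, search horizontal offsets
    d in the right view and pick the one minimising the sum of
    absolute differences over a (2*block+1)^2 window; that offset is
    the disparity d(x, y)."""
    h, w = left.shape
    L = left.astype(np.float64)
    R = right.astype(np.float64)
    disp = np.zeros((h, w), dtype=np.int64)
    for y in range(block, h - block):
        for x in range(block, w - block):
            patch = L[y - block:y + block + 1, x - block:x + block + 1]
            best_cost, best_d = np.inf, 0
            # only offsets that keep the window inside the right image
            for d in range(min(max_disp, x - block) + 1):
                cand = R[y - block:y + block + 1,
                         x - d - block:x - d + block + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

For a synthetic pair where the left view is the right view shifted by a constant number of pixels, the recovered disparity equals that shift away from the image borders.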


Abstract

The invention belongs to the field of video processing technology, and specifically discloses a perceptual stereoscopic video coding method based on a parallax just-noticeable difference (JND) model. The method comprises the following steps: (1) estimating the parallax; (2) estimating a parallax-based JND model; (3) calculating a luminance-, texture- and time-weighted JND model; (4) combining the parallax-based JND model with the spatial-temporal JND model through a nonlinear additive model to obtain a parallax-based binocular stereoscopic JND model; and (5) applying the parallax-based binocular stereoscopic JND model in a stereoscopic residual preprocessor to reset the residual. The method can effectively eliminate the temporal, spatial and inter-view redundancy of binocular stereoscopic videos while preserving a very natural visual effect in luminance, texture regions and object edges. The method can therefore greatly reduce the stereoscopic video bit rate without affecting perceived stereoscopic visual quality.
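Steps (4) and (5) of the abstract can be sketched with a nonlinear-additivity-style combination of two JND threshold maps and a simple residual reset. The overlap factor, the soft-threshold form of the reset, and all parameter values below are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def namm_combine(jnd_st, jnd_disp, gain=0.3):
    """Nonlinear-additive combination of two JND maps (sketch).

    The spatial-temporal threshold and the parallax-based threshold
    add, minus an overlap term that avoids double-counting masking
    effects shared by the two models; `gain` is a hypothetical
    overlap factor."""
    return jnd_st + jnd_disp - gain * np.minimum(jnd_st, jnd_disp)

def reset_residual(residual, jnd):
    """Step (5) sketch: residual components below the JND threshold
    are zeroed, and the remainder is shrunk toward zero by the
    threshold (a soft-threshold variant, assumed here)."""
    return np.sign(residual) * np.maximum(np.abs(residual) - jnd, 0.0)
```

With uniform thresholds of 4 and 2 and an overlap factor of 0.3, the combined threshold is 4 + 2 - 0.3*2 = 5.4; residual samples with magnitude below 5.4 are discarded and larger ones are reduced by 5.4.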

Description

Technical Field

[0001] The invention belongs to the technical field of video processing, and in particular relates to a perceptual stereoscopic video coding method, specifically a perceptual stereoscopic video coding method based on a parallax just-noticeable difference model.

Background Technique

[0002] Due to the growing demand for realistic visual experience, 3D TV technology has developed rapidly in recent years. Multi-viewpoint video, generated by capturing the same scene from different viewpoints with multiple cameras, can bring users a more vivid visual experience. However, as the number of cameras increases, the storage space and bandwidth required to store and transmit 3D stereoscopic videos grow rapidly if video image quality is to be maintained. Therefore, an effective stereoscopic video coding method is very necessary.

[0003] Stereoscopic video coding is to eliminate the space, time a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N19/597, H04N19/147, H04N19/137, H04N19/186, H04N13/00
Inventor 郑喆坤焦李成薛飞孙天乔伊果
Owner XIDIAN UNIV