Stereoscopic video visual comfort evaluation method based on region segmentation

A stereoscopic video and region segmentation technology, applied to stereoscopic systems, televisions, and other electrical components, which addresses the lack of a unified evaluation standard and helps accelerate the development of stereoscopic video systems.

Active Publication Date: 2013-05-08
JILIN UNIV
Cites: 3 · Cited by: 40

AI-Extracted Technical Summary

Problems solved by technology

At present, there is no unified judgment standard for evaluating the visual comfort of stereoscopic images. It is therefore of great significance to study the factors that affect the comfort of stereoscopic video viewing.

Abstract

The invention discloses a stereoscopic video visual comfort evaluation method based on region segmentation, belonging to the technical field of three-dimensional picture evaluation. The method adaptively extracts the motion regions in a stereoscopic video sequence and the salient regions in the background; models the visual comfort feature quantities by combining depth perception theory with spatial merging technology; and uses linear regression, together with the logical relationship between human subjective evaluation test results and the parallax depth, to select undetermined coefficients and feature weights that conform to the subjective evaluation results, thereby realizing adaptive evaluation of stereoscopic video comfort. The method calibrates the projection equipment and experimental conditions, assigns separate weights to the background part and the moving targets within the salient region to optimize the comfort evaluation results, and builds an objective stereoscopic image quality evaluation model that agrees with human subjective perception, which plays an important role in accelerating the development of stereoscopic video systems.


Examples

  • Experimental program(1)

Example Embodiment

[0040] The present invention is further described in detail below in conjunction with the accompanying drawings. The following embodiments are described in two parts: the calibration of the projection equipment parameters and the evaluation of stereoscopic video comfort.
[0041] 1. Calibration of projection equipment parameters
[0042] Because different types of experimental equipment and experimental conditions influence the observation of stereoscopic video, it is difficult to guarantee the accuracy of the comfort evaluation, so the parameters of the projection equipment must be calibrated first. Here, the present invention uses a human subjective evaluation experiment to compare the characteristic curves of the various factors that affect the visual comfort rating (such as visual fatigue, physical discomfort, and inattention) against the subjective evaluation results, establishes a visual comfort evaluation model, and combines it with the logical relationship between the subjective evaluation results and the parallax depth d to determine the functional form of the model.
[0043] Participants in this subjective evaluation experiment are required to meet the following conditions: aged 20-35, with normal eyesight (naked or corrected vision of 1.0 and no eye disease that would affect the experimental data), with experience observing stereoscopic video, able to correctly and thoroughly understand the judging criteria, and free of subjective bias during scoring. The stereoscopic video sequences are the standard sequences issued by the international video organization, and the playback interval is set to 20 s. The experimental scene is laid out as shown in figure 1, where L is the distance between the observer and the screen. According to the research results of related laboratories abroad, a better visual effect is obtained when observing at a distance of 3 m from the screen, so L is set to 3 m in this experiment; the distance e between the eyes of an adult is generally 65 mm, so e is set to 65 mm here.
[0044] When watching a stereoscopic video, the observer obtains a stronger stereoscopic visual impression because the real convergence point of the eyes lies in front of or behind the screen. As shown in figure 2, when the right pixel point r of an object point lies to the right of the left pixel point l, the object point has positive parallax and the reproduced stereoscopic depth d1 is negative, meaning the viewer sees the object point behind the display, such as object point Q1. Conversely, as shown in image 3, when the right pixel point r lies to the left of the left pixel point l, the object point has negative parallax and the reproduced stereoscopic depth d2 is positive, meaning the viewer sees the object point in front of the display screen, such as object point Q2. In figure 2 and image 3, p1 and p2 are the horizontal disparities of the corresponding matched points in the two views.
[0045] According to the definition of horizontal parallax and the schematic diagram of stereoscopic effect, the relational expression of parallax depth d can be obtained:
[0046] d = Lp / (p − e)
[0047] Here L and e are known quantities, and p is the horizontal parallax between the two views of the same frame (vertical parallax is ignored), from which the parallax depth d of the corresponding target region can be obtained.
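As a quick illustration of this relation, the following Python snippet (a minimal sketch; the parallax values are assumed for illustration and are not taken from the patent) evaluates d = Lp / (p − e) with the L = 3 m and e = 65 mm settings described above.

```python
# Minimal sketch of the parallax-depth relation d = L*p / (p - e),
# using the viewing distance and interocular distance given in the text.
# The parallax values below are illustrative assumptions, not from the patent.

L_MM = 3000.0  # viewing distance L = 3 m, in millimetres
E_MM = 65.0    # interocular distance e = 65 mm


def parallax_depth(p_mm: float, L: float = L_MM, e: float = E_MM) -> float:
    """Reproduced stereoscopic depth d for a horizontal parallax p (all in mm)."""
    return L * p_mm / (p_mm - e)


if __name__ == "__main__":
    # Positive parallax (right image point to the right of the left one):
    # d is negative, i.e. the point appears behind the screen (object point Q1).
    print(parallax_depth(10.0))   # approx. -545.5 mm

    # Negative parallax: d is positive, the point appears in front of the
    # screen (object point Q2).
    print(parallax_depth(-10.0))  # +400.0 mm
```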
[0048] 2. Evaluation of stereoscopic video comfort
[0049] For each type of region, the MOS values obtained from the subjective evaluation experiment and the corresponding parallax depths d are plotted in the coordinate system shown in Figure 4, and the marked points are connected into smooth curves to obtain the corresponding functional relationships. Here the present invention proposes, for the first time, to treat the background objects in the salient region separately from the moving objects: considering the degree of attention of the human eye, moving objects inside the salient region and moving objects in the original view are both classified into the motion region, while the remaining background objects are classified as the background part of the salient region. First, the observation points (d_M, MOS_M) and (d_S, MOS_S) on the smooth curves in the Figure 4 coordinate system are selected for the left and right matching views of the current frame, and the standard disparity maps of these views are obtained. The GBVS algorithm is then applied to the left and right views to obtain their respective salient-region distribution maps; each salient-region distribution map is combined with the standard disparity map in a 1:1 ratio and thresholded at 0.5, and the result is segmented to extract the corresponding salient region (that is, the region that attracts most of the human eye's attention and can cause a large deviation in the experimental results). Next, a higher-order detection method is applied to two adjacent frames to obtain the motion regions of the two views while suppressing noise. The regions obtained above are segmented and extracted and their parallax depths d are calculated; spatial merging is used to process the parallax depth values within each region, yielding a relatively stable parallax depth. Finally, combining this with the mathematical expression of the functional relationship obtained from Figure 4 gives the expression for the stereoscopic video visual comfort score:
[0050] VC = f(d̄_M, d̄_S) = w1·(a1·d̄_M³ + a2·d̄_M² + a3·d̄_M + a4) + w2·(b1·d̄_S³ + b2·d̄_S² + b3·d̄_S + b4)
[0051] where w1 and w2 are assigned the values 0.8 and 0.2 respectively, d̄_M and d̄_S are the merged parallax depths of the motion region and of the background part of the salient region, and the coefficients a1, a2, a3, a4, b1, b2, b3, b4 are calculated from the results of the subjective evaluation experiments.
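For illustration, the following Python sketch shows one way the undetermined coefficients could be obtained, by least-squares fitting of cubic curves to subjective data, and then combined into the comfort score above. The (d, MOS) sample values are hypothetical placeholders, not results from the patent's experiment.

```python
# Hedged sketch: fit the cubic coefficients (a1..a4) and (b1..b4) to
# assumed (parallax depth, MOS) pairs and evaluate the comfort score VC.
import numpy as np

# Hypothetical subjective data: parallax depth (mm) vs. mean opinion score.
d_motion = np.array([-400.0, -200.0, 0.0, 200.0, 400.0])
mos_motion = np.array([2.8, 3.6, 4.5, 3.9, 3.0])
d_background = np.array([-400.0, -200.0, 0.0, 200.0, 400.0])
mos_background = np.array([3.1, 3.9, 4.6, 4.1, 3.3])

# Cubic least-squares fits give the coefficient sets (highest power first).
a = np.polyfit(d_motion, mos_motion, deg=3)
b = np.polyfit(d_background, mos_background, deg=3)

W1, W2 = 0.8, 0.2  # weights for the motion region and the salient background


def visual_comfort(d_m: float, d_s: float) -> float:
    """VC = w1 * cubic_a(d_m) + w2 * cubic_b(d_s)."""
    return W1 * np.polyval(a, d_m) + W2 * np.polyval(b, d_s)


print(visual_comfort(-150.0, 80.0))
```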
[0052] Image 6 shows the extraction of salient regions in an image by GBVS. When looking at the image, the observer's attention is drawn first to the car 4, then to the passer-by 3 walking on the road and the old man 2 standing in front of the door. In addition, the door 1 behind the old man, whose color is inconsistent with the rest of the background, also attracts some of the observer's attention; but because it belongs only to the background, it receives much less attention than the car 4, the walking passer-by 3, and the standing old man 2, so it is given only a small proportion when the weights are assigned.
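The region-extraction pipeline of paragraph [0049] could be sketched roughly as follows, assuming the GBVS saliency map and the normalised disparity map are already available as NumPy arrays in [0, 1]. The frame-difference motion mask and the mean-based spatial merging below are simplified stand-ins for the higher-order detection and spatial merging steps described in the text; the 0.5 threshold and the 1:1 blend follow the patent.

```python
# Hedged sketch of the region-extraction and depth-merging steps.
import numpy as np


def salient_region_mask(saliency: np.ndarray, disparity: np.ndarray,
                        threshold: float = 0.5) -> np.ndarray:
    """Blend saliency and disparity maps 1:1 and threshold at 0.5
    to obtain the salient-region mask."""
    combined = 0.5 * saliency + 0.5 * disparity
    return combined > threshold


def motion_region_mask(frame_t: np.ndarray, frame_t1: np.ndarray,
                       diff_threshold: float = 0.1) -> np.ndarray:
    """Crude motion mask from two adjacent frames (a stand-in for the
    higher-order detection method named in the patent)."""
    return np.abs(frame_t1.astype(float) - frame_t.astype(float)) > diff_threshold


def mean_region_depth(depth_map: np.ndarray, mask: np.ndarray) -> float:
    """Merge the per-pixel parallax depths inside a region into one
    relatively stable value (here simply the mean over the masked pixels)."""
    return float(depth_map[mask].mean()) if mask.any() else 0.0
```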


