
Three-dimensional image quality objective evaluation method based on visual fidelity

A stereoscopic image visual-fidelity technology, applied in the fields of image enhancement, image analysis, and image data processing, which can solve problems such as high computational complexity and inapplicability to practical applications.

Inactive Publication Date: 2015-03-11
NINGBO UNIV


Problems solved by technology

At present, existing methods build the evaluation model through machine learning, but their computational complexity is high, and training the model requires the subjective evaluation value of each evaluation image to be known, which is not suitable for practical applications and has certain limitations.


Embodiment Construction

[0071] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0072] The stereoscopic image quality objective evaluation method based on visual fidelity proposed by the present invention has an overall implementation block diagram as shown in Figure 1. It comprises two processes, a training phase and a testing phase, and the training phase includes the following steps:

[0073] ①-1. Select N original undistorted stereoscopic images to form a training image set, denoted as {S_i,org | 1 ≤ i ≤ N}, where N > 1, S_i,org denotes the i-th original undistorted stereoscopic image in {S_i,org | 1 ≤ i ≤ N}, and the symbol "{}" denotes a set.

[0074] In specific implementation, the number of original undistorted stereoscopic images selected should be appropriate: the larger the value of N, the higher the accuracy of the visual dictionary table obtained by training, but the computational complexity i...
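The trade-off above concerns the unsupervised construction of a visual dictionary table from the N training images. As a minimal sketch (not the patented method), dictionary learning over image patches might look like the following; the patch size, dictionary size, sparsity level, and the random stand-in data are all assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Stand-in for flattened 8x8 patches extracted from the N training
# images (real patch extraction is omitted; values are random).
rng = np.random.default_rng(0)
patches = rng.standard_normal((2000, 64))
patches -= patches.mean(axis=1, keepdims=True)  # remove the DC component

# Unsupervised dictionary learning: a larger training set (larger N,
# hence more patches) gives a more accurate dictionary at higher cost.
learner = MiniBatchDictionaryLearning(
    n_components=128,              # dictionary size (assumed)
    transform_algorithm="omp",     # sparse coding via orthogonal matching pursuit
    transform_n_nonzero_coefs=5,   # sparsity level (assumed)
    random_state=0,
)
dictionary = learner.fit(patches).components_  # shape (128, 64): 128 atoms
print(dictionary.shape)
```

Monocular and binocular dictionary tables would be trained the same way, just on patches drawn from the occlusion regions and the matching regions respectively.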



Abstract

The invention discloses an objective evaluation method for three-dimensional (stereoscopic) image quality based on visual fidelity. In the training stage, multiple original undistorted stereoscopic images are selected to form a training image set; region detection determines whether each pixel in the undistorted stereoscopic images belongs to an occlusion region or a matching region, and a monocular visual dictionary table and a binocular visual dictionary table are constructed from the training image set through unsupervised learning. In the testing stage, for a test stereoscopic image and the corresponding original undistorted stereoscopic image, the sparse coefficient matrix of each sub-block belonging to the occlusion region or the matching region is estimated according to the monocular and binocular visual dictionary tables; a monocular image quality objective evaluation prediction value and a binocular image quality objective evaluation prediction value are computed from the sparse coefficient matrices, and the two are finally combined to obtain the image quality objective evaluation prediction value. The method has the advantage that the obtained objective evaluation prediction value is highly consistent with subjective evaluation values.
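The testing-stage idea sketched in the abstract — sparse-coding corresponding reference and distorted sub-blocks over a learned dictionary and comparing their sparse coefficients — can be illustrated as follows. The greedy OMP coder, all dimensions, and the fidelity formula are assumptions for illustration, not the patent's actual definitions.

```python
import numpy as np

def omp(D, x, k):
    """Greedy orthogonal matching pursuit: code x over dictionary D
    (columns are unit-norm atoms) with at most k nonzero coefficients."""
    residual, idx = x.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # best-correlated atom
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        residual = x - D[:, idx] @ coef
    s = np.zeros(D.shape[1])
    s[idx] = coef
    return s

# Toy dictionary and patches (all values assumed for illustration)
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                    # unit-norm atoms
x_ref = D[:, [3, 40]] @ np.array([1.5, -0.8])     # reference sub-block
x_dis = x_ref + 0.05 * rng.standard_normal(64)    # distorted sub-block

s_ref = omp(D, x_ref, k=5)
s_dis = omp(D, x_dis, k=5)

# A simple fidelity-style score on the sparse coefficient vectors:
# near 1 when the distorted coefficients match the reference ones.
fidelity = (2 * (s_ref @ s_dis) + 1e-6) / (s_ref @ s_ref + s_dis @ s_dis + 1e-6)
print(round(float(fidelity), 3))
```

In the patented method, such per-sub-block scores for occlusion regions and matching regions would feed the monocular and binocular prediction values that are then combined; the combination weights are not specified in this excerpt.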

Description

Technical field

[0001] The invention relates to an image quality evaluation method, and in particular to an objective evaluation method for stereoscopic image quality based on visual fidelity.

Background technique

[0002] With the rapid development of image coding technology and stereoscopic display technology, stereoscopic image technology has received increasing attention and application and has become a current research hotspot. Stereoscopic image technology exploits the binocular parallax of the human visual system: the two eyes independently receive the left and right viewpoint images of the same scene, and the brain fuses them into binocular parallax, producing a stereoscopic perception with a sense of depth and realism. Compared with single-channel images, stereoscopic images must guarantee the image quality of both channels simultaneously, so evaluating their quality is very important. However, there is currently no effective objective evaluation method to evaluate t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
CPC: G06T7/0002; G06T2207/10012; G06T2207/30168
Inventors: 邵枫, 李柯蒙, 李福翠
Owner: NINGBO UNIV