
Professional stereoscopic video visual comfort classification method based on attention and recurrent neural network

A technology combining recurrent neural networks and stereoscopic video, applied to biological neural network models, stereoscopic systems, neural architectures, etc.; it addresses problems such as existing comfort-evaluation methods' failure to consider child audiences.

Active Publication Date: 2020-10-30
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to propose a professional stereoscopic video visual comfort classification method based on attention and a recurrent neural network. It solves the problem that current stereoscopic video comfort evaluation algorithms do not consider children as an audience, and can effectively distinguish whether a professional stereoscopic video is suitable for children.




Embodiment Construction

[0068] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0069] As shown in Figure 1 and Figure 2, the present embodiment provides a professional stereoscopic video visual comfort classification method based on attention and a recurrent neural network, comprising the following steps:

[0070] Step S1: perform scene segmentation on the training video set and the video set to be predicted, and obtain disparity maps through preprocessing. This step specifically includes the following sub-steps:

[0071] Step S11: use a multimedia video processing tool to split each video into individual frames;

[0072] Step S12: use a shot-segmentation algorithm to divide the stereoscopic video into non-overlapping video segments, each of which is called a shot;

[0073] Step S13: split each frame into its left and right views, and use the SiftFlow algorithm to calculate the horizontal displacement of corresponding pixels between the two views, yielding the disparity map (a preprocessing sketch follows this list).
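The patent text does not spell out how these preprocessing sub-steps are implemented, so the following is only a minimal sketch under stated assumptions: the video is assumed to be stored in side-by-side stereoscopic format, OpenCV stands in for the unnamed "multimedia video processing tool", a simple histogram-difference threshold stands in for the unnamed shot-segmentation algorithm, and Farnebäck dense optical flow is used in place of SiftFlow (which has no standard Python package) to approximate the horizontal displacement between the left and right views. All function names and the `hist_thresh` parameter are illustrative, not from the patent.

```python
import cv2
import numpy as np

def split_frames(video_path):
    """Step S11: split the video into individual frames (a stand-in for the
    'multimedia video processing tool' mentioned in the patent)."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()
    return frames

def split_into_shots(frames, hist_thresh=0.5):
    """Step S12: a simple histogram-difference shot-boundary detector;
    the patent does not specify which shot-segmentation algorithm is used."""
    shots, current = [], [frames[0]]
    prev_hist = cv2.calcHist([frames[0]], [0], None, [64], [0, 256])
    prev_hist = cv2.normalize(prev_hist, prev_hist).flatten()
    for frame in frames[1:]:
        hist = cv2.calcHist([frame], [0], None, [64], [0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        # A large histogram change is treated as a shot boundary.
        if cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA) > hist_thresh:
            shots.append(current)
            current = []
        current.append(frame)
        prev_hist = hist
    shots.append(current)
    return shots

def disparity_map(frame):
    """Step S13: split a side-by-side frame into left/right views and take the
    horizontal displacement of corresponding pixels as the disparity.
    Farneback optical flow is used here only as a stand-in for SiftFlow."""
    h, w, _ = frame.shape
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    left_g = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    right_g = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(left_g, right_g, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return flow[..., 0]  # horizontal component only
```

In this sketch the horizontal flow component returned by `disparity_map` plays the role of the disparity map that step S1 passes on to the later frame-level and shot-level processing.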



Abstract

The invention relates to a professional stereoscopic video visual comfort classification method based on attention and a recurrent neural network. The method comprises the following steps: 1, performing scene segmentation on a training video set and a to-be-predicted video set, and performing preprocessing to obtain disparity maps; 2, performing frame-level processing to obtain frame-level features; 3, performing shot-level processing to obtain a set of hidden states; 4, performing dual-stream fusion, using an attention network to fuse the hidden-state set output by the previous step into a final hidden state; 5, passing the final hidden state through a classification network to output a classification probability, classifying the professional stereoscopic video as suitable for children to watch or suitable only for adults to watch; 6, inputting the left views and the corresponding disparity maps of the stereoscopic videos in the to-be-tested video set into the trained model for classification. Whether a professional stereoscopic video is suitable for children to watch can thus be effectively distinguished.
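Reading steps 2 to 5 of the abstract together, the model appears to be a two-stream recurrent network (one stream for the left views, one for the disparity maps) whose shot-level hidden states are fused by an attention network and then classified into two categories. The PyTorch sketch below is only one plausible reading of that description: the GRU cell, the feature and hidden dimensions, the additive attention form, and the two-class softmax output are assumptions rather than details taken from the patent.

```python
import torch
import torch.nn as nn

class AttentionFusionClassifier(nn.Module):
    """Hypothetical reading of steps 3-5: shot-level recurrence on two streams
    (left-view features and disparity-map features), attention fusion of the
    resulting hidden states, then a two-class output
    (suitable for children / adults only)."""

    def __init__(self, feat_dim=2048, hidden_dim=256, num_classes=2):
        super().__init__()
        self.rnn_view = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.rnn_disp = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        # Additive attention scoring over the pooled hidden-state set.
        self.att_score = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, view_feats, disp_feats):
        # view_feats, disp_feats: (batch, num_shots, feat_dim) shot-level features.
        h_view, _ = self.rnn_view(view_feats)   # (batch, num_shots, hidden_dim)
        h_disp, _ = self.rnn_disp(disp_feats)
        hidden_set = torch.cat([h_view, h_disp], dim=1)   # pooled hidden-state set
        weights = torch.softmax(self.att_score(hidden_set), dim=1)
        final_hidden = (weights * hidden_set).sum(dim=1)  # attention-fused state
        # Probabilities for illustration; training would normally use raw
        # logits with CrossEntropyLoss.
        return torch.softmax(self.classifier(final_hidden), dim=-1)

# Example usage with random shot-level features for a single video.
model = AttentionFusionClassifier()
probs = model(torch.randn(1, 8, 2048), torch.randn(1, 8, 2048))
print(probs)  # e.g. tensor([[p_children, p_adults_only]])
```

A call such as `model(view_feats, disp_feats)` on tensors of shape `(batch, num_shots, feat_dim)` then returns the classification probability of step 5, i.e. the probability that the video is suitable for children versus suitable only for adults.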

Description

Technical Field

[0001] The invention relates to the fields of image and video processing and computer vision, and in particular to a method for classifying the visual comfort of professional stereoscopic video based on attention and recurrent neural networks.

Background Technique

[0002] Stereoscopic video, also known as 3D video, differs from 2D video in that its most important feature is depth information, so the presentation of scenes in the video is no longer confined to the screen plane. The vigorous development of stereoscopic technology has given people a better viewing experience, but it has also brought some problems. For example, watching uncomfortable stereoscopic video for a long time can cause dizziness, dry eyes, nausea and other symptoms. These adverse reactions discourage viewers from watching and may even affect their physical health. Therefore, how to evaluate the visual comfort quality of stereoscopic images has become a concern...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/62; G06N 3/04; H04N 13/10; H04N 13/161
CPC: G06N 3/04; H04N 13/00; H04N 2013/0074; G06N 3/044; G06F 18/24; G06F 18/253; G06F 18/214
Inventor: 牛玉贞, 彭丹泓
Owner: FUZHOU UNIV