Scene Flow Estimation Method Based on Local Rigidity Assumption in RGBD Video

A scene flow estimation technology for video, applied in computing, computer components, image analysis, etc. It addresses the problem that errors in local-area scene flow estimation lead to inaccurate estimation of the global scene flow, and achieves the effect of improving accuracy.

Active Publication Date: 2022-03-04
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a scene flow estimation method based on a local rigidity assumption in RGBD video, which solves the problem in existing research methods that errors in local-area scene flow estimation lead to inaccurate estimation of the scene flow in the global area.



Examples


Embodiment

[0109] The implementation process of the scene flow estimation method based on the local rigidity assumption in RGBD video is illustrated by an operation example on a set of simulation data.

[0110] (1) First execute step 1: collect two consecutive frames of RGB and depth images, then calculate the optical flow information based on the two consecutive RGB frames. Figure 4 shows the two consecutive frames of original RGB images and depth images; Figure 5 shows the optical flow from RGB image 1 to RGB image 2 and the optical flow from RGB image 2 to RGB image 1, both calculated from the two RGB frames.
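The excerpt does not name the optical flow algorithm used in step 1. Below is a minimal sketch, assuming OpenCV's Farneback dense optical flow as a stand-in; the function name and parameter values are illustrative only.

```python
# Sketch of step 1 (assumed): forward and backward dense optical flow between two
# consecutive RGB frames, using OpenCV's Farneback method as a stand-in for the
# unspecified algorithm in the patent. Parameter values are illustrative defaults.
import cv2

def forward_backward_flow(rgb1, rgb2):
    """Return flow from frame 1 to frame 2 and from frame 2 to frame 1.

    rgb1, rgb2: BGR images of identical size (as loaded by cv2.imread).
    Each returned flow has shape (H, W, 2) holding per-pixel (dx, dy) displacements.
    """
    gray1 = cv2.cvtColor(rgb1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(rgb2, cv2.COLOR_BGR2GRAY)
    # Arguments: prev, next, flow, pyr_scale, levels, winsize, iterations,
    # poly_n, poly_sigma, flags.
    flow_12 = cv2.calcOpticalFlowFarneback(gray1, gray2, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    flow_21 = cv2.calcOpticalFlowFarneback(gray2, gray1, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    return flow_12, flow_21
```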

[0111] (2) Execute step 2 to obtain the repaired depth images; the result is shown in Figure 6, which presents the two frames of depth images repaired according to the corresponding RGB image information.
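The exact RGB-guided repair procedure is not detailed in this excerpt. A minimal sketch of step 2 is given below, assuming depth holes are encoded as zeros and using OpenCV's Telea inpainting plus a median filter as a stand-in for hole repair and noise removal.

```python
# Sketch of step 2 (assumed): fill zero-valued holes in the depth map and suppress
# noise points. Telea inpainting on an 8-bit rescaled copy is a simplification,
# not necessarily the patent's exact repair method.
import cv2
import numpy as np

def repair_depth(depth):
    """depth: uint16 depth map registered to the RGB image; returns a repaired map."""
    hole_mask = (depth == 0).astype(np.uint8)            # 1 where depth is missing
    # cv2.inpaint works on 8-bit images, so rescale the depth for filling.
    scale = 255.0 / max(int(depth.max()), 1)
    depth_8u = cv2.convertScaleAbs(depth, alpha=scale)
    filled_8u = cv2.inpaint(depth_8u, hole_mask, 3, cv2.INPAINT_TELEA)
    # Map the filled values back to the original depth range.
    filled = (filled_8u.astype(np.float32) / scale).astype(depth.dtype)
    repaired = np.where(depth == 0, filled, depth)
    # Median filter to eliminate isolated noise points.
    return cv2.medianBlur(repaired.astype(np.float32), 5).astype(depth.dtype)
```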

[0112] (3) Execute step 3 to obtain the layered information of the depth image; the segmentation result of the depth imag...
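Step 3 layers the repaired depth image with the K-means algorithm so that pixels with close depth values fall into the same layer. A minimal sketch follows, assuming a hypothetical layer count k; the value of k and the termination criteria are illustrative.

```python
# Sketch of step 3: cluster pixels by depth value into k layers with OpenCV's
# K-means. The number of layers k is a hypothetical choice for illustration.
import cv2
import numpy as np

def layer_depth(depth, k=4):
    """Cluster depth values into k layers; returns a label map and the layer centers."""
    samples = depth.reshape(-1, 1).astype(np.float32)    # one 1-D sample per pixel
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 0.2)
    _compactness, labels, centers = cv2.kmeans(samples, k, None, criteria,
                                               10, cv2.KMEANS_PP_CENTERS)
    return labels.reshape(depth.shape), centers.flatten()
```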



Abstract

The scene flow estimation method based on the local rigidity assumption in RGBD video disclosed by the present invention first inputs two consecutive frames of RGB images and depth images and calculates the optical flow information between the two consecutive RGB frames. Secondly, it registers the input depth image and RGB image according to a coordinate transformation, repairs the holes in the depth image, and eliminates noise points. Then, it uses the K-means algorithm to layer the repaired depth image, dividing pixels with close depth values into the same layer. Next, under the assumption of local rigidity and global non-rigidity, each layer is divided into many blocks and the motion information of each block is calculated. Finally, the final scene flow information is obtained from the optical flow information and the layered information of the depth image. Compared with traditional methods for calculating scene flow, the method disclosed by the invention has higher precision.
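The per-block motion model is not spelled out in this excerpt. A common way to estimate a rigid motion (rotation R and translation t) for a block from its 3D point correspondences is SVD-based Kabsch alignment; the sketch below uses it as an assumption, with correspondences presumed to come from the optical flow and back-projected depth.

```python
# Sketch of the per-block rigid motion step (assumed): estimate a rigid transform
# per block from corresponding 3D points via SVD-based Kabsch alignment, then take
# the induced per-point displacement as that block's scene flow. This is a standard
# technique, not necessarily the patent's exact formulation.
import numpy as np

def rigid_motion(points1, points2):
    """Estimate rotation R (3x3) and translation t (3,) such that R @ p1 + t ~ p2."""
    c1 = points1.mean(axis=0)
    c2 = points2.mean(axis=0)
    # Cross-covariance of the centered point sets.
    h = (points1 - c1).T @ (points2 - c2)
    u, _s, vt = np.linalg.svd(h)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:             # correct a possible reflection
        vt[-1, :] *= -1
        r = vt.T @ u.T
    t = c2 - r @ c1
    return r, t

def block_scene_flow(points1, points2):
    """Per-point scene flow of one block under the local rigidity assumption."""
    r, t = rigid_motion(points1, points2)
    return (points1 @ r.T + t) - points1
```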

Description

Technical field

[0001] The invention belongs to the technical field of computer digital image processing, and in particular relates to a scene flow estimation method based on a local rigidity assumption in RGBD video.

Background technique

[0002] Optical flow refers to the instantaneous velocity of the pixel motion of a spatially moving object on the observation plane, and it expresses the change of the image. Since it contains information about the target's motion, it can be used by the observer to determine the motion of the target. The optical flow field is derived from the definition of optical flow: it is the two-dimensional (2D) instantaneous velocity field composed of all pixels in the image, where each two-dimensional velocity vector is the projection onto the imaging surface of the three-dimensional velocity vector of the corresponding visible point in the scene. Therefore, optical flow not only contains the motion information of the observed object, but also contains rich information ab...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/207, G06T5/00, G06K9/62, G06T7/55
CPC: G06T7/207, G06T5/002, G06T5/005, G06T7/55, G06T2207/10016, G06T2207/10024, G06T2207/20028, G06F18/23213
Inventors: 李秀秀, 刘沿娟, 金海燕, 蔡磊
Owner XIAN UNIV OF TECH