
Kinect Depth Reconstruction Method Based on Camera Motion and Image Shading

A technique based on camera motion and image shading, applied in image analysis, image enhancement, and image data processing, which addresses the problem of insufficient depth detection accuracy.

Active Publication Date: 2020-05-22
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] The object of the present invention is to overcome the insufficient depth detection accuracy of existing civilian depth cameras by providing a Kinect depth reconstruction method based on camera motion and image shading. The method requires no physical modification of the depth camera, no complex combination of devices, and none of the complex, condition-sensitive illumination calibration steps common in traditional depth reconstruction methods, which are generally confined to laboratory conditions and have little practical value. Compared with traditional methods, it therefore has greater practical application value and significance.

Method used




Embodiment Construction

[0084] The present invention will be further described below in conjunction with specific examples.

[0085] The Kinect depth reconstruction method based on camera motion and image shading described in this embodiment comprises the following steps:

[0086] 1) With the Kinect depth camera and the RGB camera calibrated and aligned, upload the data collected by the Kinect to the computer through a third-party interface.
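Step 1 presupposes that the depth and RGB cameras are calibrated and aligned. As a minimal sketch of what that alignment means in practice, a depth pixel can be reprojected into the RGB frame via the pinhole model; the intrinsics and extrinsics below are illustrative placeholders, not values from the patent.

```python
import numpy as np

# Hypothetical pinhole intrinsics for the depth and RGB cameras
# (placeholder values; real ones come from calibration).
K_depth = np.array([[580.0, 0.0, 320.0],
                    [0.0, 580.0, 240.0],
                    [0.0, 0.0, 1.0]])
K_rgb = np.array([[525.0, 0.0, 320.0],
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
# Depth-to-RGB extrinsics: identity rotation, small baseline along x (meters).
R = np.eye(3)
t = np.array([0.025, 0.0, 0.0])

def depth_pixel_to_rgb(u, v, z):
    """Back-project depth pixel (u, v) with depth z (meters) to 3D,
    transform into the RGB camera frame, and project onto the RGB image."""
    p3d = z * np.linalg.inv(K_depth) @ np.array([u, v, 1.0])  # 3D in depth frame
    p_rgb = R @ p3d + t                                       # 3D in RGB frame
    uvw = K_rgb @ p_rgb                                       # homogeneous pixel
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

u_rgb, v_rgb = depth_pixel_to_rgb(320.0, 240.0, 1.0)
```

With these placeholder parameters, a pixel at the depth camera's principal point maps to a point shifted by the baseline in the RGB image.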

[0087] 2.1) When the system is initialized, an RGB image is read and used as the key frame, and a depth map is bound to it. The depth map has the same dimensions as the grayscale image; traverse the depth map and assign a random value to each pixel position.
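Step 2.1 can be sketched in NumPy as follows. This assumes "random value" means a uniform draw over a plausible depth range; the range and the grayscale conversion are illustrative choices, not specified by the patent.

```python
import numpy as np

def init_keyframe(rgb_image, depth_range=(0.5, 4.0), seed=0):
    """Bind a randomly initialized depth map (same height/width as the
    grayscale image) to the first RGB frame, used as the key frame."""
    gray = rgb_image.mean(axis=2)                       # grayscale, same H x W
    rng = np.random.default_rng(seed)
    depth = rng.uniform(*depth_range, size=gray.shape)  # random depth per pixel
    return {"keyframe_gray": gray, "depth": depth}

kf = init_keyframe(np.zeros((480, 640, 3)))
```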

[0088] 2.2) Every time an RGB image is read, the following cost function is constructed:

[0089] E = \sum_{p} \left\| \frac{r_p^2}{\sigma_{r_p}^2} \right\|_{\delta}

[0090] where ‖·‖_δ is the Huber operator, r_p denotes the photometric error, and σ²_{r_p} denotes the variance of the error.
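A cost of this form (Huber norm applied to variance-normalized squared errors) can be sketched in NumPy. The Huber threshold δ and the residual/variance values below are illustrative placeholders, and the Huber form used is the standard robust norm from the direct visual-odometry literature, an assumption rather than the patent's own (not fully visible) definition.

```python
import numpy as np

DELTA = 1.0  # Huber threshold (illustrative)

def huber(x, delta=DELTA):
    """Standard Huber norm ||x||_delta: quadratic near zero, linear in the tail."""
    a = np.abs(x)
    return np.where(a <= delta, 0.5 * a**2 / delta, a - 0.5 * delta)

def photometric_cost(residuals, variances):
    """Sum over pixels of the Huber norm of the variance-normalized
    squared photometric error r_p^2 / sigma_{r_p}^2."""
    return float(np.sum(huber(residuals**2 / variances)))

cost = photometric_cost(np.array([0.1, -0.2, 2.0]), np.array([1.0, 1.0, 1.0]))
```

The quadratic-near-zero, linear-in-the-tail shape is what makes the cost robust to occlusions and specular outliers that violate the shading model.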

The Huber operator is defined as follows:

[0...
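The definition paragraph is not fully visible here. A standard form of the Huber operator ‖·‖_δ consistent with the symbols above (an assumption based on common usage in direct methods, not recovered from the patent text) is:

```latex
\left\| r^2 \right\|_{\delta} =
\begin{cases}
\dfrac{r^2}{2\delta}, & |r| \le \delta \\[4pt]
|r| - \dfrac{\delta}{2}, & \text{otherwise}
\end{cases}
```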



Abstract

The invention discloses a Kinect depth reconstruction algorithm based on camera motion and image shading. The algorithm comprises the steps of: 1) with the Kinect depth camera and the RGB camera calibrated and aligned, uploading the data collected by the Kinect to a computer through a third-party interface; 2) recovering the three-dimensional scene structure and the motion trajectory of the Kinect RGB camera from the RGB video sequence, obtaining the relationship between the point cloud and the camera motion; and 3) reconstructing image depth using the shading information of the image in combination with the point cloud and camera motion relationship obtained in step 2). The algorithm requires no physical modification of the depth camera, no complex combination of devices, and none of the illumination calibration steps common in conventional depth reconstruction methods, which are complex, impose strict conditions, and are generally confined to laboratory settings with little practical value. Compared with conventional methods, the algorithm therefore has greater practical application value and significance.

Description

Technical field

[0001] The invention relates to the field of depth reconstruction in computer image processing, in particular to a Kinect depth reconstruction method based on camera motion and image shading.

Background technique

[0002] With the introduction and promotion in recent years of relatively cheap civilian depth cameras, such as the Microsoft Kinect and the Asus Xtion Pro, depth information has come into wide use in somatosensory games, real-time 3D reconstruction, augmented reality, virtual reality, and other fields, and has become an important support for the development of new human-computer interaction methods. However, most of the popular civilian depth cameras currently on the market suffer from insufficient depth detection accuracy and excessive interference noise, which seriously affect the quality of application products based on depth information. Therefore, how to obtain more accurate depth information is of great signi...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/593; G06T7/277
CPC: G06T2207/10021; G06T2207/10024; G06T2207/10028
Inventors: 青春美, 黄韬, 袁书聪, 徐向民
Owner: SOUTH CHINA UNIV OF TECH