Camera motion and image brightness-based Kinect depth reconstruction algorithm

A camera motion and image brightness technology, applied in image enhancement, image analysis, image data processing, etc., which can solve the problem of insufficient depth detection accuracy

Active Publication Date: 2017-05-31
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to overcome the insufficient depth detection precision of existing civilian depth cameras by providing a Kinect depth reconstruction algorithm based on camera motion and image brightness. The algorithm requires no physical modification of the depth camera, no complex combination of devices, and none of the complex and demanding illumination calibration steps commonly used in traditional depth reconstruction methods, which are generally limited to laboratory conditions and have little practical application value. Compared with traditional methods, it therefore has greater practical application value and significance.

Method used



Examples


Embodiment Construction

[0084] The present invention will be further described below in conjunction with specific examples.

[0085] The Kinect depth reconstruction algorithm based on camera motion and image brightness described in this embodiment comprises the following steps:

[0086] 1) With the Kinect depth camera and RGB camera calibrated and aligned, upload the data collected by the Kinect to the computer through a third-party interface.
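
The patent leaves the choice of third-party interface open. As a rough illustration of step 1 only, the sketch below assumes the libfreenect Python bindings (the freenect module) as that interface; any equivalent Kinect SDK would serve the same purpose.

```python
import freenect
import numpy as np

def grab_kinect_frame():
    """Fetch one RGB frame and one depth frame from the Kinect; the two
    streams are assumed to be already calibrated and aligned (step 1)."""
    rgb, _ = freenect.sync_get_video()    # H x W x 3, uint8 RGB image
    depth, _ = freenect.sync_get_depth()  # H x W, uint16 raw depth
    return np.asarray(rgb), np.asarray(depth)
```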

[0087] 2.1) When the system is initialized, an RGB image is read and used as the key frame, and a depth map is bound to this key frame. The depth map has the same dimensions as the grayscale image; the depth map is traversed and a random value is assigned to each pixel position.
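
The patent only states that each pixel of the bound depth map receives a random value; the uniform range and helper name in the sketch below are illustrative assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def init_keyframe(rgb, d_min=0.5, d_max=8.0, seed=None):
    """Use the first RGB image as the key frame and bind to it a randomly
    initialized depth map with the same dimensions as its grayscale image."""
    gray = cv2.cvtColor(rgb, cv2.COLOR_RGB2GRAY).astype(np.float32)
    rng = np.random.default_rng(seed)
    # one random depth hypothesis per pixel (range d_min..d_max is assumed)
    depth = rng.uniform(d_min, d_max, size=gray.shape).astype(np.float32)
    return gray, depth
```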

[0088] 2.2) Every time an RGB image is read, the following cost function is constructed:

[0089] E = \sum_{p} \left\| \frac{r_p^2}{\sigma_{r_p}^2} \right\|_\delta

[0090] where \|\cdot\|_\delta is the Huber operator, r_p denotes the photometric error at pixel p, and \sigma_{r_p}^2 denotes the variance of that error.

[0091] The Huber operator is defined as follows:

[0092] ...
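
The exact equations behind paragraphs [0089] and [0092] are not fully reproduced in this extraction, so the sketch below is an assumption rather than the patent's precise formulation: it evaluates a Huber norm over variance-normalized photometric residuals, the usual robust cost in direct methods such as LSD-SLAM. The function names and the particular Huber convention are illustrative.

```python
import numpy as np

def huber(x, delta):
    """One common Huber-norm convention: quadratic for |x| <= delta,
    linear (with matched value and slope) beyond the threshold."""
    a = np.abs(x)
    return np.where(a <= delta, 0.5 * x * x, delta * (a - 0.5 * delta))

def photometric_cost(residuals, variances, delta):
    """E = sum_p || r_p^2 / sigma_{r_p}^2 ||_delta, cf. [0089]-[0090].
    residuals: per-pixel photometric errors r_p between the current RGB
    frame and the key frame; variances: the corresponding error variances."""
    r = np.asarray(residuals, dtype=np.float64)
    s2 = np.asarray(variances, dtype=np.float64)
    return float(np.sum(huber(r * r / s2, delta)))
```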



Abstract

The invention discloses a camera motion and image brightness-based Kinect depth reconstruction algorithm. The algorithm comprises the steps of: 1) with the Kinect depth camera and the RGB camera calibrated and aligned, uploading the data collected by the Kinect to a computer through a third-party interface; 2) recovering the three-dimensional scene structure and the motion track of the Kinect RGB camera from the RGB video sequence, and obtaining the relationship between the point cloud and the camera motion; and 3) reconstructing the image depth by using the brightness information of the image in combination with the relationship between the point cloud and the camera motion obtained in step 2). The algorithm requires no physical modification of the depth camera, no complex combination of devices, and no illumination calibration step of the kind commonly used in conventional depth reconstruction methods, which is complex, imposes strict conditions, is generally limited to laboratory settings and has little practical application value. Compared with conventional methods, the algorithm therefore has greater practical application value and significance.
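
This excerpt does not detail how step 2) recovers the camera motion track from the RGB video sequence. The sketch below shows one common, feature-based way to estimate the relative pose between two RGB frames with OpenCV (ORB features plus the essential matrix); the patent's own method may differ, and K here stands for the calibrated RGB-camera intrinsic matrix.

```python
import cv2
import numpy as np

def relative_pose(gray1, gray2, K):
    """Estimate rotation R and unit-scale translation t from frame 1 to frame 2."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(gray1, None)
    k2, d2 = orb.detectAndCompute(gray2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K)
    return R, t
```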

Description

technical field

[0001] The invention relates to the field of depth reconstruction in computer image processing, and in particular to a Kinect depth reconstruction algorithm based on camera motion and image shading.

Background technique

[0002] With the introduction and promotion of relatively cheap civilian depth cameras in recent years, such as the Microsoft Kinect and the Asus Xtion Pro, depth information has come to be widely used in somatosensory games, real-time 3D reconstruction, augmented reality, virtual reality and other fields, and its application has become an important support for the development of new human-computer interaction methods. However, most of the popular civilian depth cameras currently on the market suffer from insufficient depth detection accuracy and excessive interference noise, which seriously affects the quality of application products based on depth information. Therefore, how to obtain more accurate depth information is of great significance...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/593; G06T7/277
CPC: G06T2207/10021; G06T2207/10024; G06T2207/10028
Inventor: 青春美, 黄韬, 袁书聪, 徐向民
Owner: SOUTH CHINA UNIV OF TECH