No-distortion integrated imaging three-dimensional displaying method based on Kinect

An integrated imaging three-dimensional display technology, applied in image enhancement, image analysis and image data processing, that addresses problems such as optical distortion, failure to consider the effect of the color image on depth image restoration, and the inability to obtain high-quality continuous depth images.

Active Publication Date: 2017-07-04
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

The first key problem is that, owing to reflection, illumination and occlusion at the surface material of objects in the 3D scene, the depth image acquired by the Kinect sensor contains a large number of black holes.
Especially in complex backgrounds with a large depth of field, the infrared sensor of the Kinect cannot capture the light reflected from the object surface, so high-quality, continuous depth values cannot be obtained in the black-hole regions of the depth image.
The second key issue is the edge misalignment between the depth image and the color image caused by the inherent optical distortion between the depth camera and the color camera of the Kinect sensor.
For example, Telea proposed the fast marching method in [An image inpainting technique based on the fast marching method. Journal of Graphics, GPU, and Game Tools, 9(1):23-34, 2004], ...
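Telea's fast marching inpainting is exposed in OpenCV as cv2.INPAINT_TELEA, so the black-hole filling step it is cited for can be sketched as below. This is a minimal illustration of the referenced method, not the patent's exact implementation; the 8-bit rescaling and the 5-pixel inpainting radius are assumptions.

    # Minimal sketch: fill Kinect depth "black holes" with Telea's fast
    # marching inpainting via OpenCV (illustrative, not the patent's code).
    import cv2
    import numpy as np

    def fill_depth_holes(depth_u16, radius=5):
        # Kinect reports unmeasured pixels (black holes) as depth value 0.
        hole_mask = (depth_u16 == 0).astype(np.uint8)
        # cv2.inpaint needs 8-bit input, so rescale the 16-bit depth map.
        depth_8u = cv2.convertScaleAbs(depth_u16, alpha=255.0 / max(int(depth_u16.max()), 1))
        return cv2.inpaint(depth_8u, hole_mask, radius, cv2.INPAINT_TELEA)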

Embodiment Construction

[0075] To make the specific implementation of the present invention clearer, the four steps of the invention are described in detail and completely below, in combination with the technical solution and the accompanying drawings.

[0076] As shown in Figure 1, the distortion-free integrated imaging three-dimensional display method based on the Kinect sensor comprises the following steps:

[0077] Step 1: calibrate the Kinect sensor and register the depth image with the color image it acquires. The specific implementation includes the following two parts:

[0078] 1. Calibrate the Kinect sensor

[0079] Ordinary cameras are usually calibrated by shooting a checkerboard calibration board and applying Zhang's calibration method or similar techniques, but the checkerboard pattern cannot be seen in the Kinect depth image, so it is impossible to directly calibrate the infrared ...
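Although the excerpt above is truncated, the Zhang-style checkerboard calibration it refers to is the standard cv2.calibrateCamera workflow, sketched below. The board geometry, square size, and the use of raw infrared captures named "ir_*.png" (so the corners are visible to the depth camera's imager) are assumptions for illustration, not details taken from the patent.

    # Hedged sketch: Zhang checkerboard calibration with OpenCV. Board
    # geometry and the "ir_*.png" infrared captures are assumed examples.
    import glob
    import cv2
    import numpy as np

    PATTERN = (9, 6)     # inner-corner count of the assumed checkerboard
    SQUARE = 0.025       # assumed square size in metres

    # Planar reference points of the board (z = 0).
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

    obj_pts, img_pts, size = [], [], None
    for path in glob.glob("ir_*.png"):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_pts.append(objp)
            img_pts.append(corners)

    # Intrinsic matrix K and distortion coefficients of the (infrared) camera.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    print("reprojection RMS:", rms)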

Abstract

The invention discloses a no-distortion integrated imaging three-dimensional display method based on a Kinect sensor. The method comprises: acquiring a depth image of a three-dimensional scene with the Kinect sensor and filling the black holes in that depth image using the fast marching method; performing joint bilateral filtering on the filled depth image, guided by the corresponding color image, to obtain a depth image with continuous depth variation and smooth edges; and finally obtaining a primitive image array of a large-depth-of-field complex scene by means of a light field model, thereby realizing no-distortion three-dimensional integrated imaging display. Because the primitive image array obtained by the method is generated in an ideal state free from scattering and refraction, its image quality is better than that of a primitive image array recorded through a microlens array. The method effectively overcomes the physical limitations of a microlens array in recording large-depth-of-field scenes and realizes no-distortion three-dimensional integrated imaging display of large-depth-of-field complex scenes.
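The joint bilateral filtering step described above can be sketched with cv2.ximgproc.jointBilateralFilter from opencv-contrib-python, as shown below; the function name is the library's, but the filter parameters and the helper name refine_depth are illustrative assumptions, not values from the patent.

    # Hedged sketch: refine the hole-filled depth map with a joint bilateral
    # filter guided by the registered color image (requires opencv-contrib).
    import cv2

    def refine_depth(color_bgr, filled_depth_8u, d=9, sigma_color=25, sigma_space=9):
        # The color image is the guidance ("joint") image, so depth edges are
        # pulled into alignment with color edges while flat regions are smoothed.
        return cv2.ximgproc.jointBilateralFilter(
            color_bgr, filled_depth_8u, d, sigma_color, sigma_space)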

Description

technical field

[0001] The invention relates to the technical fields of computer vision, three-dimensional reconstruction and stereoscopic display, and in particular to an integrated imaging three-dimensional display method adaptable to large-depth-of-field, complex scenes.

Background technique

[0002] With the development of optoelectronic devices and video processing technology, integrated imaging 3D display has become a research hotspot in next-generation true 3D display technology. Compared with holographic three-dimensional display, it requires a relatively small amount of data, no coherent light source and no harsh environmental conditions, and it has become one of the most advanced three-dimensional display methods as well as the most promising way to realize three-dimensional TV. There are two implementation schemes for glasses-free true 3D display: one is the multi-view scheme, and the other is the "texture + depth" sch...

Application Information

IPC(8): G06T7/80, G06T7/521
CPC: G06T2207/10024
Inventors: 朴永日, 张淼, 王晓慧
Owner: DALIAN UNIV OF TECH