Kinect v2-based real-time three-dimensional reconstruction method for complete objects

A real-time, complete three-dimensional reconstruction technology in the field of computer vision. It addresses the problems that existing methods are computationally intensive, struggle to guarantee real-time performance and accuracy at the same time, and that professional three-dimensional scanning equipment is expensive, thereby improving the quality of complete real-time object reconstruction.

Inactive Publication Date: 2019-07-23
XIDIAN UNIV


Problems solved by technology

[0003] Existing 3D reconstruction generally uses multiple photos or monocular or binocular cameras for depth acquisition. This process is computationally intensive, and it is difficult to guarantee real-time performance and accuracy at the same time.



Examples


Embodiment 1

[0064] In the first group we used a blue disc with a frosted surface. In the depth map, the edge of the disc is surrounded by small black cavities, and the three-dimensional plane generated by fusion is rough rather than smooth, showing the influence of structural noise. The second group used a smooth white reflective disc. The depth map shows a black hole in its upper part, and this region is exactly where indoor light reflects off the disc, so the corresponding part of the three-dimensional plane is missing after fusion. The third group used a matte black disc; fine black dots appear in the upper region of the disc in the depth map, and the fused three-dimensional model shows poor plane fusion and serious depth loss.
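The "black cavities" and holes described above correspond to pixels where the sensor returns no depth reading, conventionally stored as zero. As a minimal sketch (numpy, with a synthetic frame standing in for real Kinect v2 data; the sizes and values are illustrative assumptions, not from the patent), the extent of missing depth can be quantified like this:

```python
import numpy as np

# Hypothetical 424x512 Kinect v2 depth frame in millimetres; zeros mark
# pixels where the sensor returned no depth (the "black cavities" above).
depth = np.full((424, 512), 1500, dtype=np.uint16)
depth[200:220, 240:260] = 0  # simulate a reflective patch with no return

hole_mask = depth == 0          # True where depth is missing
hole_fraction = hole_mask.mean()
print(f"missing-depth pixels: {hole_mask.sum()}, fraction: {hole_fraction:.4f}")
```

Tracking this fraction per frame is one simple way to compare how surface material (frosted, reflective, matte black) affects depth acquisition, as the three disc experiments do qualitatively.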

[0065] From the above experiments, it is found that due to the influence of the Kinect 2.0 device itself and the acquisition environment...

Embodiment 2

[0073] Depth Completion Network Design

[0074] Kinect 2.0 is usually unable to capture depth for bright or transparent regions and for distant object surfaces. When the measured object or environment is too complex, filter-based denoising of the acquired depth image has limited effect and cannot repair the missing regions of most depth images. With traditional inpainting algorithms reaching a bottleneck, and considering that the high-resolution color image captured by Kinect 2.0 contains rich detail, we turn to deep learning, aiming to train, on a large number of data samples, a network that can predict and patch the depth image. To this end, we construct training data from existing databases and design a deep network that can be trained and evaluated end-to-end on color and depth images; it predicts local differential properties of the color image and combines them with the original data col...
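Whatever network produces the predicted depth, a common final step in such pipelines (assumed here; the patent's exact fusion scheme is not shown in this excerpt) is to keep valid sensor measurements and fill only the missing pixels with the prediction. A minimal numpy sketch:

```python
import numpy as np

def complete_depth(observed, predicted):
    """Fill missing (zero) pixels of the sensor depth map with network
    predictions, leaving valid sensor measurements untouched.
    This is a generic observed-plus-predicted fusion step, not
    necessarily the exact scheme used in the patent."""
    observed = np.asarray(observed, dtype=np.float32)
    hole = observed == 0                      # mask of missing pixels
    return np.where(hole, predicted, observed)

# Hypothetical 2x2 example (millimetres): two pixels are missing (0).
observed = np.array([[1200., 0.], [0., 1300.]])
predicted = np.array([[1190., 1210.], [1250., 1280.]])
print(complete_depth(observed, predicted))
# valid pixels keep sensor values; zero pixels take the predicted values
```

Keeping raw measurements where they exist matters because the sensor depth, when present, is typically more accurate than the network's prediction.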



Abstract

The invention relates to a Kinect v2-based method for complete, real-time three-dimensional reconstruction of objects. The method comprises data acquisition, depth completion, point cloud processing, ICP point cloud registration, point cloud fusion, and surface reconstruction. The beneficial effects of the invention are as follows: the method improves complete real-time three-dimensional reconstruction of objects, and overcomes the defects of traditional three-dimensional reconstruction, which generally uses multiple pictures or monocular or binocular cameras for depth acquisition, involves a large amount of calculation, has difficulty guaranteeing real-time performance and precision at the same time, and relies on professional high-precision three-dimensional scanning equipment that is too expensive, limiting the application and popularization of three-dimensional reconstruction technology.
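Of the pipeline steps listed above, ICP point cloud registration has a well-known core: given paired corresponding points, the best rigid transform is obtained in closed form via SVD (the Kabsch solution). The sketch below shows only that solve on synthetic data; a full ICP, as the method would need, re-pairs points by nearest neighbour and iterates this step. All data here are illustrative, not from the patent.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form (Kabsch/SVD) rigid transform mapping point set `src`
    onto `dst`, assuming correspondences are already paired row-by-row.
    This is the inner solve of one ICP iteration."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: rotate a small cloud by 30 degrees, translate it,
# and recover the transform.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
src = np.random.default_rng(0).normal(size=(50, 3))
dst = src @ R_true.T + np.array([0.1, -0.2, 0.3])
R, t = best_rigid_transform(src, dst)
print(np.allclose(R, R_true))  # True
```

In the reconstruction pipeline this solve aligns each newly captured point cloud to the accumulated model before point cloud fusion.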

Description

technical field

[0001] The invention relates to the technical field of computer vision, in particular to a Kinect 2.0-based complete, real-time three-dimensional reconstruction method for objects.

Background technique

[0002] With the advancement of technology and the diversification of everyday needs, computer vision technology has been continuously iterated and updated, helping us obtain more information from digital images and videos. Among them, 3D reconstruction has become a continuing focus of attention in recent years; it transforms image analysis from two-dimensional to three-dimensional space and provides more optimized solutions from a more spatial perspective.

[0003] Existing 3D reconstruction generally uses multiple photos or monocular or binocular cameras for depth acquisition. This process is computationally intensive, and it is difficult to guarantee real-time performance and accuracy at the same time. Professional high-precision 3D scann...

Claims


Application Information

IPC(8): G06T17/20
CPC: G06T17/20; G06T2200/08
Inventors: 卢朝阳, 郑熙映
Owner: XIDIAN UNIV