
Real-time 3D reconstruction method based on depth map

A real-time three-dimensional reconstruction technology based on depth maps, applied in the field of computer vision. It addresses the problems of slow reconstruction speed, reconstruction scale limited by memory, and poor reconstruction quality in existing methods, and achieves the effect of improving reconstruction efficiency.

Active Publication Date: 2018-12-07
HUAZHONG UNIV OF SCI & TECH
Cites: 9 | Cited by: 20

AI Technical Summary

Problems solved by technology

[0004] In view of the defects and improvement needs of the prior art, the present invention provides a real-time 3D reconstruction method based on a depth map, aiming to solve the problems of poor reconstruction quality, slow reconstruction speed, and reconstruction scale limited by memory that exist in current 3D reconstruction methods.




Detailed Description of the Embodiments

[0052] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict with one another.

[0053] As shown in Figure 1, the real-time three-dimensional reconstruction method based on a depth map provided by the present invention includes the following steps:

[0054] (1) Use a depth camera to obtain the depth map and RGB color map of the shooting scene; the depth map represents the distance from the scene surface to the depth camera, and the color map is used to gi...
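As an illustration of how a depth measurement of this kind is typically turned into the three-dimensional coordinate points and per-pixel normal vectors mentioned in the abstract, the following C++ sketch back-projects a pixel with a pinhole camera model. The intrinsics (fx, fy, cx, cy) and the neighbor-based normal estimate are assumptions for a typical consumer depth camera, not the patent's exact formulation.

```cpp
// Minimal sketch (assumed details, not the patent's exact formulation):
// back-projecting a depth pixel into a camera-space 3D point with a pinhole
// model, and estimating a per-pixel normal from neighboring points.
#include <cmath>

struct Vec3 { float x, y, z; };

// Intrinsics fx, fy, cx, cy are hypothetical values for a consumer depth camera.
Vec3 backproject(int u, int v, float depth,
                 float fx, float fy, float cx, float cy) {
    // depth is the distance along the camera's optical axis (Z), in meters
    return { (u - cx) * depth / fx,
             (v - cy) * depth / fy,
             depth };
}

Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

Vec3 normalize(const Vec3& v) {
    float n = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / n, v.y / n, v.z / n };
}

// Normal from the cross product of vectors to the right and lower neighbors;
// a full implementation would also handle pixels with missing depth.
Vec3 pixelNormal(const Vec3& p, const Vec3& right, const Vec3& down) {
    Vec3 du = { right.x - p.x, right.y - p.y, right.z - p.z };
    Vec3 dv = { down.x  - p.x, down.y  - p.y, down.z  - p.z };
    return normalize(cross(du, dv));
}
```

Repeating this over every valid pixel yields what the abstract calls the first type of three-dimensional coordinate points; applying the camera pose then gives the second type in world coordinates.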



Abstract

The invention discloses a real-time 3D reconstruction method based on a depth map. The method comprises the following steps: depth maps and an RGB color map of a shooting scene are obtained using a depth camera; for each frame of the depth maps, the depth information is completed, the pixel points are converted into a first type of three-dimensional coordinate points, and a normal vector for each pixel point is calculated; the first type of three-dimensional coordinate points corresponding to the depth map are converted into a second type of three-dimensional coordinate points; each of the second type of three-dimensional coordinate points corresponding to the depth map is assigned a voxel block, and a hash table is used to index the voxel blocks; the sdf value of each voxel in the voxel blocks is updated through weighted fusion, and the scene surface is extracted; the texture information of the scene surface is obtained, and a surface normal vector for each voxel of the scene surface is calculated. The method can effectively improve reconstruction speed and reconstruction quality, and is applicable to large-scale scene reconstruction.
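To make the voxel-block indexing and weighted fusion steps more concrete, here is a minimal C++ sketch of the two data-structure ideas named in the abstract: a hash table keyed on voxel-block coordinates, and a running weighted average of the signed distance (sdf) stored in each voxel. The 8x8x8 block size, the hash constants, and the weight cap are assumptions drawn from common voxel-hashing systems, not values taken from the patent.

```cpp
// Sketch under assumptions: hash-table indexing of sparse voxel blocks and
// weighted SDF fusion. Block size, hash primes, and weight cap are illustrative.
#include <algorithm>
#include <cstddef>
#include <unordered_map>

struct Voxel {
    float sdf    = 0.0f;   // truncated signed distance to the nearest surface
    float weight = 0.0f;   // accumulated fusion weight
};

constexpr int kBlockSize = 8;   // assumed 8x8x8 voxels per block
struct VoxelBlock { Voxel voxels[kBlockSize * kBlockSize * kBlockSize]; };

struct BlockCoord {
    int x, y, z;
    bool operator==(const BlockCoord& o) const {
        return x == o.x && y == o.y && z == o.z;
    }
};

// Spatial hash over block coordinates (the common three-prime XOR scheme);
// the patent may define a different hash function.
struct BlockHash {
    std::size_t operator()(const BlockCoord& c) const {
        return (static_cast<std::size_t>(c.x) * 73856093u) ^
               (static_cast<std::size_t>(c.y) * 19349663u) ^
               (static_cast<std::size_t>(c.z) * 83492791u);
    }
};

// Only blocks near observed surface points are ever inserted, so memory grows
// with the reconstructed surface rather than with the full scene volume.
using BlockMap = std::unordered_map<BlockCoord, VoxelBlock, BlockHash>;

// Weighted fusion of a new sdf observation into a voxel; capping the weight
// keeps the model responsive to newer measurements.
void fuse(Voxel& v, float sdfObs, float wObs, float wMax = 128.0f) {
    v.sdf    = (v.sdf * v.weight + sdfObs * wObs) / (v.weight + wObs);
    v.weight = std::min(v.weight + wObs, wMax);
}
```

Once the sdf field is fused, the scene surface mentioned in the abstract would typically be extracted as the zero level set of the sdf (for example with marching cubes), though the patent text available here does not specify the extraction algorithm.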

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and more specifically relates to a real-time three-dimensional reconstruction method based on a depth map.

Background technique

[0002] 3D reconstruction technology has long been a hot topic in computer graphics and computer vision. Early 3D reconstruction techniques usually used 2D images as input to reconstruct a 3D model of the scene; limited by the input data, the reconstructed model was usually incomplete and not very realistic. In recent years, with the emergence of various consumer depth cameras, 3D reconstruction technology based on depth cameras has developed rapidly. Depth cameras are inexpensive, modest in size, easy to operate, and easy for researchers and engineers to develop with. Recent work has focused on using such consumer depth cameras for real-time surface reconstruction. 3D reconstruction technology is also the basis of augmented reality...


Application Information

IPC(8): G06T17/00, G06T15/00, G06F17/30
CPC: G06T15/005, G06T17/00
Inventors: 李丹, 胡迎松, 邹春明, 袁凌, 谭琪蔚, 孙钰可, 刘爽
Owner: HUAZHONG UNIV OF SCI & TECH