Sparse multi-view-angle three-dimensional reconstruction method for indoor scene

A method for three-dimensional reconstruction of indoor scenes, applied in the fields of computer vision and computer graphics, offering good scalability and easy implementation.

Active Publication Date: 2019-09-27
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

Simultaneous Localization and Mapping (SLAM) (M. G. Dissanayake, P. Newman, S. Clark, H. F. Durrant-Whyte, and M. Csorba. A solution to the simultaneous localisation and map building (SLAM) problem. IEEE Trans. Robotics & Automation, 17(3): 229–241, 2001.) techniques and Structure from Motion (SfM) (N. Snavely, S. M. Seitz, and R. Szeliski. Photo tourism: exploring photo collections in 3D. …

Method used




Embodiments

[0030] The invention uses color images captured from sparse viewing angles to reconstruct an indoor three-dimensional scene. First, existing methods are used to compute the depth map and semantic map corresponding to each color image; then the proposed global-local registration method is used to fuse the 3D point cloud models of the sparse viewpoints. Figure 1 shows the flow chart of the indoor 3D scene reconstruction from color images according to an embodiment of the present invention. The specific implementation is as follows:
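Fusing the per-view point clouds hinges on rigid registration. The patent does not spell out its global-local scheme here, but the core operation, estimating a rotation R and translation t that align corresponding points, is the standard SVD-based (Kabsch) solution. The sketch below uses a synthetic point set and assumes known correspondences; it is an illustration, not the patented algorithm.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= R @ src + t,
    computed by the SVD-based Kabsch method on corresponding points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

# synthetic check: rotate a point set 90 degrees about z and shift it
src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_transform(src, dst)
```

With noise-free correspondences the recovered transform matches the ground truth; in practice such a step would sit inside an iterative scheme (e.g. ICP-style correspondence refinement).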

[0031] 1) Capture 3 to 5 images of the indoor scene from sparse viewpoints, ensuring a certain degree of overlap between every two adjacent views. Compared with tracking-based scanning, this gives the photographer more freedom of movement and is easier to operate.

[0032] 2) Use existing methods to estimate the depth map and semantic map corresponding to each color image.

[0033] 3) Fil...
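Step 3 is truncated in this copy; per the abstract, the depth map is filtered and smoothed with a single-view patch-based method. As a generic illustration of patch-based, edge-aware depth smoothing (not the patented algorithm), one can average each pixel over the neighbours in a small patch whose depths are similar, which smooths surfaces while preserving depth discontinuities:

```python
import numpy as np

def smooth_depth(depth, patch=3, max_diff=0.1):
    """Generic patch-based smoothing: replace each pixel with the mean of
    neighbours in a `patch` x `patch` window whose depth differs from the
    center by at most `max_diff`, preserving depth discontinuities."""
    h, w = depth.shape
    r = patch // 2
    out = depth.copy()
    for i in range(h):
        for j in range(w):
            win = depth[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            mask = np.abs(win - depth[i, j]) <= max_diff
            out[i, j] = win[mask].mean()
    return out

noisy = np.full((5, 5), 1.0)
noisy[2, 2] = 1.05   # small noise on a flat surface: averaged toward 1.0
noisy[0, 0] = 3.0    # a true discontinuity: excluded by the mask, preserved
sm = smooth_depth(noisy)
```

The window size and similarity threshold here are illustrative; a real pipeline would pick them from the depth noise statistics.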



Abstract

The invention belongs to the fields of computer vision and computer graphics, and aims to ensure the general applicability of the technology while generating a relatively accurate three-dimensional model. The technical scheme adopted by the invention is as follows: the sparse multi-view three-dimensional reconstruction method for indoor scenes comprises the following steps: obtaining depth information and semantic information with a deep-learning-based method, then fusing the models of the sparse view angles with a global-local registration method. The global-local registration method specifically comprises filtering and smoothing the depth map with a single-view patch-based method, converting the depth map under each view angle into a point cloud, and fusing the point clouds. The method is mainly applied to image-processing scenarios.
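The "depth map into point cloud" conversion named in the abstract is the standard pinhole back-projection. A minimal sketch, assuming a metric depth map and known camera intrinsics (focal lengths fx, fy and principal point cx, cy, which are assumptions, not values from the patent):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, metric units) into an N x 3 point
    cloud with the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]        # drop invalid (zero-depth) pixels

# toy example: a fronto-parallel plane 2 m from the camera
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=100.0, fy=100.0, cx=2.0, cy=2.0)
```

Each view's cloud lives in its own camera frame; registration then brings the clouds into a common frame before fusion.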

Description

technical field

[0001] The invention belongs to the fields of computer vision and computer graphics, and in particular relates to a method for three-dimensional reconstruction of an indoor scene captured from sparse multiple viewing angles.

Background technique

[0002] With the increasing demand for indoor navigation, home or office design, and augmented reality, 3D reconstruction and understanding of indoor scenes has become a hot topic in computer vision and graphics. Today's reconstruction methods broadly fall into two categories. The first is to scan the scene with a time-of-flight or structured-light depth camera. KinectFusion (R. A. Newcombe, S. Izadi, O. Hilliges, D. Molyneaux, D. Kim, A. J. Davison, P. Kohli, J. Shotton, S. Hodges, and A. Fitzgibbon. KinectFusion: Real-time dense surface mapping and tracking. In ISMAR, pages 127–136, 2011.) showed in detail the process of using a Kinect for indoor 3D reconstruction, after which ElasticFusion (T. Whelan, R. F. Salas-Moreno, B. Glocker, A. …

Claims


Application Information

IPC(8): G06T17/20, G06T7/30
CPC: G06T17/20, G06T7/30, G06T2207/10028
Inventor: 杨敬钰, 徐吉, 李坤, 吴昊, 岳焕景
Owner TIANJIN UNIV