Three-dimensional reconstruction method of large-scene object

A technology for the 3D reconstruction of large scenes, applied in the field of 3D reconstruction of large-scene objects. It addresses problems such as sparse point clouds, erroneous 3D point-cloud coordinates, and weak matching constraints, and achieves high reconstruction accuracy, reduced computational complexity, and a denser point cloud.

Active Publication Date: 2016-11-09
GUANGXI UNIV

Problems solved by technology

[0003] The 3D reconstruction method most commonly used at present for large-scene objects is SFM (Structure From Motion). Its disadvantage is that feature detection and matching rely on the epipolar geometric constraint: it can only be determined that a matching point lies on the corresponding epipolar line, not where on that line it lies, so the constraint is relatively weak.
Once mismatched points cannot be removed, the fundamental matrix is solved incorrectly, the obtained 3D point-cloud coordinates are also wrong, and the point-cloud coordinates corresponding to other image ...
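
For context, the weak constraint referred to here is the standard two-view epipolar relation (generic multi-view geometry, not notation taken from this patent):

% Standard two-view epipolar constraint, shown for context only.
% x and x' are homogeneous image coordinates of a putative match, F is the fundamental matrix.
\mathbf{x}'^{\top} F \, \mathbf{x} = 0, \qquad \mathbf{l}' = F\mathbf{x}
% The constraint only forces x' onto the epipolar line l' = Fx; its position along that
% line is undetermined, so a wrong match lying on (or near) the line is not rejected.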

Embodiment Construction

[0025] Specific embodiments of the present invention are further described below in conjunction with the accompanying drawings:

[0026] As shown in Figure 1, a three-dimensional reconstruction method of a large-scene object comprises the following steps:

[0027] 1) Use a camera to collect several sequence photos of the scene object from different viewing angles and positions, and number the sequence photos I_i (i = 1, 2, 3, ..., n) in the order of collection;

[0028] 2) Divide the sequence photos into several units N_i (i = 1, 2, 3, ..., n) in numbering order. Each unit contains three sequence photos collected consecutively, and the third sequence photo of unit N_i serves as the first sequence photo of the adjacent next unit N_{i+1}, so that this sequence photo is shared by the two adjacent units (a sketch of this partitioning, together with the per-unit SIFT matching, is given after step 3) below);

[0029] 3) By using the SIFT algorithm for each u...
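
A minimal sketch of steps 1)-3), assuming OpenCV (version 4.4 or later, where SIFT is in the main module); the helper names make_units and match_pair are illustrative and not taken from the patent. It partitions the ordered photos into units of three, where the last photo of unit N_i is reused as the first photo of unit N_{i+1}, and runs SIFT detection and matching inside each unit:

import cv2

def make_units(photos):
    """photos: list of images in capture order; returns overlapping triplet units."""
    units = []
    i = 0
    while i + 2 < len(photos):
        units.append(photos[i:i + 3])  # unit = (photo_i, photo_{i+1}, photo_{i+2})
        i += 2                         # third photo of this unit opens the next unit
    return units

def match_pair(img_a, img_b, sift, ratio=0.75):
    """SIFT detection plus Lowe ratio-test matching between two grayscale images."""
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_a, des_b, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    return good, kp_a, kp_b

sift = cv2.SIFT_create()
# Within each unit, the three photo pairs (1-2, 2-3, 1-3) are matched; only matches
# consistent across all three views would then be kept (the three-view constraint).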

Abstract

The invention discloses a three-dimensional reconstruction method for a large-scene object. The method comprises the following steps: 1) acquiring a plurality of sequence photos of the scene object from different viewing angles and positions with a camera; 2) dividing the sequence photos into a plurality of units N_i; 3) performing feature detection and matching on the three sequence photos of each unit with the SIFT algorithm; 4) initializing the first sequence photo of the first unit N_1 and calculating the fundamental matrix F and essential matrix E of the other two sequence photos; 5) solving the camera parameters R_i and T_i and the three-dimensional point-cloud coordinates M_j of each sequence photo in the other units; 6) obtaining seed patches of the scene object; 7) expanding the seed patches; and 8) filtering the patches. By using three-view constraints, mismatched points are effectively removed and an accurate fundamental matrix is obtained, so the reconstruction accuracy is high; the units require no coordinate conversion, which reduces the computational complexity; and performing dense reconstruction on the basis of the structure-from-motion result makes the point cloud of the obtained three-dimensional model denser.
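
As context for steps 4) and 5) above, the following is a minimal two-view initialization sketch using standard OpenCV routines; it assumes the camera intrinsic matrix K is known and is a generic structure-from-motion baseline, not the patent's exact three-view procedure:

import cv2
import numpy as np

def initialize_two_view(pts1, pts2, K):
    """pts1, pts2: Nx2 float arrays of matched pixel coordinates; K: 3x3 intrinsics."""
    # Fundamental matrix with RANSAC to reject remaining mismatches.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    inl1 = pts1[mask.ravel() == 1]
    inl2 = pts2[mask.ravel() == 1]

    # Essential matrix from F and the intrinsics: E = K^T F K.
    E = K.T @ F @ K

    # Recover the relative camera pose (R, t) from E.
    _, R, t, _ = cv2.recoverPose(E, inl1, inl2, K)

    # Triangulate 3D points M_j from the two projection matrices.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    M_h = cv2.triangulatePoints(P1, P2, inl1.T, inl2.T)  # 4xN homogeneous points
    M = (M_h[:3] / M_h[3]).T                             # Nx3 Euclidean points
    return F, E, R, t, M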

Description

Technical field

[0001] The invention relates to the fields of computer vision and computer graphics, and in particular to a method for the three-dimensional reconstruction of large-scene objects.

Background technique

[0002] The ultimate goal of 3D reconstruction is to recover the 3D structure of a target scene, and 3D reconstruction based on image sequences is one of the main means of obtaining that structure. The method can be regarded as the inverse process of photography: its cost is low, it requires only an ordinary camera, and it is easy to operate and to carry. Feature-point matching and multi-view stereo reconstruction based on image sequences are fundamental and central topics of computer vision research.

[0003] The 3D reconstruction method most commonly used at present for large-scene objects is SFM (Structure From Motion). Its disadvantage is that feature detection and matching rely on the epipolar geometric constraint, and it can only be determined that t...
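
As context for why unremoved mismatches corrupt the fundamental matrix, the classical linear (eight-point style) estimate can be written as follows; this is standard multi-view geometry shown for illustration, not the patent's own derivation:

% Each putative match (x_i, x'_i) contributes one linear equation to the estimate of F:
\mathbf{x}_i'^{\top} F \, \mathbf{x}_i = 0 \;\;\Rightarrow\;\; A\,\mathbf{f} = \mathbf{0}
% Here f stacks the nine entries of F and each row of A is built from one match.
% F is taken as the unit-norm least-squares solution (the singular vector of A with the
% smallest singular value), so even a single mismatched row can noticeably bias F.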

Application Information

IPC(8): G06T17/00
CPC: G06T17/00
Inventor: 林靖宇, 郑恩
Owner: GUANGXI UNIV