
A high-precision 3D reconstruction method and system for depth cameras

A three-dimensional reconstruction technology for depth cameras, applied in the field of high-precision 3D reconstruction methods and systems. It addresses the problems that traditional methods struggle to reconstruct 3D scene models and cannot recover high-precision 3D scene models, and achieves the effect of improved accuracy.

Active Publication Date: 2021-08-27
HEFEI UNIV OF TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Due to factors such as illumination changes in the scene, occlusion between objects, scene clutter, repetitive structures, and noise in the collected data, it is difficult for traditional manual active 3D modeling methods to reconstruct a realistic 3D scene model, and they cannot recover the high-precision 3D scene model contained in a large image sequence.




Embodiment Construction

[0044] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.

[0045] The purpose of the present invention is to provide a depth camera-oriented multi-view 3D reconstruction method and system, capable of recovering a high-precision 3D scene model from image sequences collected by the depth camera.

[0046] In order to make the above objects, features and advantages of the present invention more comprehensible, the present invention will be further described in detail below in conjunction with the accompanying drawings and ...



Abstract

The invention discloses a depth camera-oriented multi-view three-dimensional reconstruction method and system. The method includes: acquiring a plurality of pairs of images of a real scene from different angles with a depth camera, each pair comprising a color image and a depth image; obtaining two-dimensional feature points and feature descriptors on each color image with a feature point detection algorithm, and computing the confidence between the feature descriptors of any two color images to obtain feature matching relationships; for each feature matching relationship, back-projecting the 2D feature points into the camera coordinate system according to the corresponding depth image to obtain the 3D points; computing the absolute pose of the camera from the 2D feature points and the 3D points; and, according to the absolute pose of the camera, mapping the 3D points into a common coordinate system to obtain the initial 3D model of the scene. By adopting the method or system of the present invention, a high-precision three-dimensional scene model can be restored from the image sequence collected by the depth camera.
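The back-projection and world-frame mapping steps described in the abstract can be sketched as follows. This is a minimal illustration assuming a standard pinhole camera model and hypothetical intrinsics for a 640x480 depth camera; it is not the patent's actual implementation.

```python
import numpy as np

def back_project(u, v, depth, K):
    """Back-project a 2D pixel (u, v) with measured depth into camera
    coordinates via the pinhole model: X = depth * K^-1 @ [u, v, 1]."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def to_world(X_cam, R, t):
    """Map a camera-frame point into the world frame given the camera's
    absolute pose (R, t), under the convention X_cam = R @ X_world + t."""
    return R.T @ (X_cam - t)

# Hypothetical intrinsics (focal length 525 px, principal point at center).
K = np.array([[525.0,   0.0, 319.5],
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])

# A feature at the principal point with 2 m depth lies on the optical axis.
X_cam = back_project(319.5, 239.5, 2.0, K)
```

In a full pipeline, the absolute pose (R, t) would come from a 2D-3D pose solver such as PnP over the matched feature points, and the back-projected points from all views would be merged in the world frame to form the initial scene model.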

Description

Technical Field

[0001] The invention relates to the fields of computer vision and computer graphics, and in particular to a high-precision three-dimensional reconstruction method and system for depth cameras.

Background

[0002] High-precision 3D reconstruction has long been one of the hot research topics in computer vision and graphics; its purpose is to efficiently recover a 3D model of a scene from an input image sequence. 3D reconstruction technology also plays an irreplaceable role in the game industry, military simulation, agriculture and industry, augmented reality, virtual reality, scene monitoring, map navigation, and autonomous driving. Due to factors such as illumination changes in the scene, occlusion between objects, scene clutter, repetitive structures, and noise in the collected data, it is difficult for traditional manual active 3D modeling methods to reconstruct a realistic 3D scene model...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T17/00
CPC: G06T17/00
Inventors: 贾伟 (Jia Wei), 曹明伟 (Cao Mingwei), 陈缘 (Chen Yuan), 李国庆 (Li Guoqing), 刘晓平 (Liu Xiaoping)
Owner: HEFEI UNIV OF TECH