A texture fusion method for real-time 3D reconstruction of RGB-D camera

An RGB-D texture fusion technology, applied to 3D modeling, image data processing, image enhancement, and related areas. It addresses the problems that existing methods are hard to run in real time on an ordinary PC or mobile terminal, hard to integrate, and computationally expensive, and achieves low computing cost, improved texture accuracy, and reduced texture blurring.

Active Publication Date: 2021-08-24
ZHEJIANG UNIV


Problems solved by technology

Such methods usually require good initial reconstruction results and incur high computing costs; they are difficult to run in real time on an ordinary PC or mobile terminal, and not easy to integrate into existing three-dimensional reconstruction frameworks.



Examples


Embodiment Construction

[0078] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0079] The application mode of the present invention is described by taking a voxel-based RGB-D reconstruction framework as an example, as shown in Figure 2. The voxel-based RGB-D reconstruction framework mainly comprises five parts: input of RGB-D data, data preprocessing, camera pose estimation, fusion and update of voxel information, and surface mesh extraction. The present invention is mainly applied in the fusion and update of voxel information, whose input includes the original RGB-D image, the estimated camera pose, and the initial / fused glob...
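To make the framework concrete, the following is a minimal runnable sketch of the five parts listed above. It is an illustrative stub pipeline, not the patent's implementation: part 4 is done as a plain per-voxel running weighted average of color (the very step the invention targets), and all function names, shapes, and values are assumptions.

```python
import numpy as np

# Illustrative stub pipeline (NOT the patent's implementation): the five
# parts of a voxel-based RGB-D framework, with part 4 reduced to a plain
# per-voxel running weighted average of color.

def preprocess(depth):
    # Part 2: stand-in for depth filtering/denoising (here a range clamp).
    return np.clip(depth, 0.1, 5.0)

def estimate_pose():
    # Part 3: identity-pose stub; real systems align the frame to the model.
    return np.eye(4)

def integrate(color_vol, weight_vol, frame_color, w_new=1.0):
    # Part 4: fuse the new frame's color into the voxel volume.
    num = color_vol * weight_vol[..., None] + frame_color * w_new
    weight_vol = weight_vol + w_new
    return num / weight_vol[..., None], weight_vol

color_vol = np.zeros((2, 2, 2, 3))   # per-voxel RGB
weight_vol = np.zeros((2, 2, 2))     # per-voxel accumulated weight
for level in (0.2, 0.8):             # Part 1: two synthetic frames
    depth = preprocess(np.full((2, 2, 2), 8.0))
    pose = estimate_pose()
    frame_color = np.full((2, 2, 2, 3), level)
    color_vol, weight_vol = integrate(color_vol, weight_vol, frame_color)
# Part 5 (surface mesh extraction) is omitted from this sketch.
print(color_vol[0, 0, 0])  # equal-weight average of 0.2 and 0.8
```

The patent's contribution slots into `integrate`: instead of always averaging, it compares confidence weights and chooses between replacement, fusion, and preservation.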



Abstract

The invention discloses a texture fusion method for real-time three-dimensional reconstruction with an RGB-D camera. The RGB-D data stream is processed to obtain the sharpness of each color image; key frames are selected for foreground extraction; the depth image is filtered and denoised; point-cloud normal vectors are computed from the depth image; and a reconstruction data stream is constructed. A method combining a probabilistic approach with a heuristic approach quantitatively establishes an adaptive weight field for the color image, serving as the confidence distribution that describes the color data of the real-time frame. By comparing the confidence weights in the adaptive weight field of the real-time frame with the latest confidence weights of the reference point cloud, one of three operations (replacement, fusion, or preservation) is selected to update the texture result and realize texture fusion for 3D reconstruction. The invention can extract high-quality data, effectively reduce blurring in texture fusion, achieve sharp texture reconstruction results, and be embedded into an RGB-D reconstruction framework at low computational cost, thereby significantly improving texture reconstruction accuracy.
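The three-way update named in the abstract can be sketched as follows. This is a hedged reading, not the patent's actual decision rule: the thresholds `t_hi` and `t_lo` and the ratio test are hypothetical stand-ins for whatever criteria the claims specify.

```python
import numpy as np

# Hypothetical sketch of the replace / fuse / preserve choice: compare the
# real-time frame's confidence weight against the reference point cloud's
# latest confidence weight. Thresholds t_hi / t_lo are assumptions.

def update_texture(ref_color, ref_conf, new_color, new_conf,
                   t_hi=1.5, t_lo=0.5):
    ref_color = np.asarray(ref_color, dtype=float)
    new_color = np.asarray(new_color, dtype=float)
    if new_conf > t_hi * ref_conf:        # new data clearly better: replace
        return new_color, new_conf
    if new_conf < t_lo * ref_conf:        # new data clearly worse: preserve
        return ref_color, ref_conf
    total = ref_conf + new_conf           # comparable quality: fuse
    fused = (ref_color * ref_conf + new_color * new_conf) / total
    return fused, total

# A sharp new observation dominates a low-confidence reference: replace.
c, w = update_texture([0.1, 0.1, 0.1], 0.2, [0.9, 0.9, 0.9], 1.0)
print(c, w)
```

The point of the three-way choice, per the abstract, is that only comparable-confidence data is averaged, so a sharp observation is never blended with a blurry one.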

Description

Technical field

[0001] The invention belongs to the fields of computer vision and computer graphics image processing, and more specifically relates to a texture fusion method for real-time three-dimensional reconstruction with an RGB-D camera.

Background technique

[0002] With the increasing importance of 3D reconstruction technology in fields such as autonomous driving, virtual reality, and robot positioning and navigation, how to use low-cost RGB-D sensors to obtain high-quality 3D reconstructions of scenes in real time has become a hot issue. However, most current real-time RGB-D-based 3D reconstruction generates the surface texture of an object by pixel-wise weighted averaging of sequential frame images. This approach easily destroys the structural information of the texture and causes texture blurring. [0003] At present, there are two main ways to improve the quality of texture reconstruction. The first way is to select high-quality data as m...
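The blurring problem described in [0002] can be seen in a toy numeric example (not from the patent): when two views of the same sharp edge are misregistered by even one pixel, per-pixel weighted averaging turns the step edge into a ramp.

```python
import numpy as np

# Toy 1-D illustration of the blur problem: averaging two slightly
# misregistered observations of a sharp intensity edge destroys the edge.
edge_a = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # edge between pixels 2 and 3
edge_b = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 1.0])  # same edge, shifted 1 px
fused = 0.5 * edge_a + 0.5 * edge_b                 # per-pixel weighted average
print(fused)  # the maximum per-pixel gradient drops from 1.0 to 0.5
```

Each input has a full-contrast step, but the fused result only reaches half contrast at the transition, which is exactly the structural loss the patent's confidence-driven selection aims to avoid.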


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T17/00; G06T19/20; G06T5/00; G06T5/50; G06T7/11; G06T7/194; G06T7/41; G06T7/66
CPC: G06T5/002; G06T5/50; G06T17/00; G06T19/20; G06T2207/10024; G06T2207/10028; G06T2207/20221; G06T2219/2012; G06T7/11; G06T7/194; G06T7/41; G06T7/66
Inventor: 李基拓, 刘鑫琦, 陆国栋
Owner: ZHEJIANG UNIV