Reconstruction method and device of three-dimensional scene

A technology involving 3D scenes and position information, applied in the field of image processing, which can solve problems such as deviation of the camera pose trajectory from the real trajectory, cumulative error in the camera pose trajectory, and inaccuracy of the reconstructed 3D scene.

Inactive Publication Date: 2017-07-07
HANGZHOU HUAWEI DIGITAL TECH

AI Technical Summary

Problems solved by technology

[0004] Although Kintinuous solves Kinect Fusion's problem of limited 3D scene size, the camera pose in Kintinuous is estimated from the pose trajectory of the previous frame. When the system runs for a long time, an error in the camera's pose trajectory at any one frame therefore accumulates into the pose trajectories of all subsequent frames, so the camera's pose trajectory deviates from the real trajectory and the reconstructed 3D scene becomes inaccurate.
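The cumulative-error problem described above can be illustrated numerically: if each frame-to-frame pose estimate carries even a tiny constant bias, composing the estimates frame after frame makes the orientation error grow without bound. A minimal sketch, assuming an illustrative true motion of 1° yaw per frame and an estimation bias of 0.05° per frame (these numbers are not from the patent):

```python
import numpy as np

def rot_z(theta):
    """3x3 rotation matrix for a yaw of `theta` radians about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# True per-frame motion: 1 degree of yaw per frame.
# Estimated per-frame motion: 1.05 degrees (a small constant bias).
true_step = np.deg2rad(1.0)
est_step = np.deg2rad(1.05)

R_true = np.eye(3)
R_est = np.eye(3)
errors = []  # orientation error in degrees, per frame
for frame in range(200):
    # Frame-to-frame composition: each pose builds on the previous estimate,
    # which is exactly how the bias accumulates over the trajectory.
    R_true = R_true @ rot_z(true_step)
    R_est = R_est @ rot_z(est_step)
    # Angular distance between estimated and true orientation.
    cos_err = (np.trace(R_est.T @ R_true) - 1.0) / 2.0
    errors.append(np.degrees(np.arccos(np.clip(cos_err, -1.0, 1.0))))

# The error grows linearly: 0.05 deg/frame, i.e. 10 degrees after 200 frames.
```

Because every frame's pose is chained to the previous estimate, there is no mechanism to cancel the bias, which is exactly the drift that motivates fusing an independent second pose estimate.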

Method used




Embodiment Construction

[0101] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0102] The present invention is applied to three-dimensional scene reconstruction. Figure 1 is the application scenario diagram of the present invention. As shown in Figure 1, the scenario includes: an indoor scene, a camera based on a color-depth (RGB-D) sensor (such as a Kinect camera), a graphics processing unit (GPU), and a central processing unit (CPU). Scan, an...



Abstract

The present invention provides a method and device for reconstructing a three-dimensional scene. The method comprises the following steps: a first camera pose of the current frame is obtained from the depth image of the current frame and the depth image of a reference frame; a second camera pose of the current frame is estimated from the gray-scale image of the current frame and the gray-scale image of the reference frame; a fused camera pose of the current frame is obtained from the first camera pose and the second camera pose of the current frame; and a three-dimensional scene model corresponding to the current frame is generated from the depth image of the current frame and the fused camera pose. By integrating depth-based first pose estimation with gray-scale-based second pose estimation, the reconstruction method provided by the embodiments of the invention reduces the cumulative error in camera pose estimation, performs more stably across different scenes, and yields a more accurate reconstructed three-dimensional scene.
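This excerpt does not disclose the exact rule the patent uses to fuse the first (depth-based) and second (gray-scale-based) camera poses. As an illustrative stand-in only, the sketch below blends two 4x4 pose matrices with a weighted normalized quaternion average (nlerp) for rotation and linear interpolation for translation; the weight `w_depth` and all function names are assumptions, not the patent's method:

```python
import numpy as np

def rot_to_quat(R):
    """Convert a 3x3 rotation matrix to a unit quaternion (w, x, y, z)."""
    w = np.sqrt(max(0.0, 1.0 + R[0, 0] + R[1, 1] + R[2, 2])) / 2.0
    x = np.sqrt(max(0.0, 1.0 + R[0, 0] - R[1, 1] - R[2, 2])) / 2.0
    y = np.sqrt(max(0.0, 1.0 - R[0, 0] + R[1, 1] - R[2, 2])) / 2.0
    z = np.sqrt(max(0.0, 1.0 - R[0, 0] - R[1, 1] + R[2, 2])) / 2.0
    x = np.copysign(x, R[2, 1] - R[1, 2])
    y = np.copysign(y, R[0, 2] - R[2, 0])
    z = np.copysign(z, R[1, 0] - R[0, 1])
    q = np.array([w, x, y, z])
    return q / np.linalg.norm(q)

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) back to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def fuse_poses(T_depth, T_gray, w_depth=0.5):
    """Blend two 4x4 camera poses (hypothetical fusion rule for illustration).

    Rotations are combined by a weighted quaternion average (nlerp),
    translations by linear interpolation, with weight `w_depth` on the
    depth-based estimate.
    """
    q1 = rot_to_quat(T_depth[:3, :3])
    q2 = rot_to_quat(T_gray[:3, :3])
    if np.dot(q1, q2) < 0.0:
        q2 = -q2  # keep both quaternions in the same hemisphere
    q = w_depth * q1 + (1.0 - w_depth) * q2
    q /= np.linalg.norm(q)
    T = np.eye(4)
    T[:3, :3] = quat_to_rot(q)
    T[:3, 3] = w_depth * T_depth[:3, 3] + (1.0 - w_depth) * T_gray[:3, 3]
    return T
```

With `w_depth=1.0` the fused pose reduces to the depth-based estimate and with `w_depth=0.0` to the gray-scale-based one; any intermediate weight trades off the two sources, which is the general idea the abstract describes.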

Description

technical field

[0001] Embodiments of the present invention relate to image processing technologies, and in particular to a method and device for reconstructing a three-dimensional scene.

Background technique

[0002] Three-dimensional reconstruction refers to establishing a mathematical model of three-dimensional objects that is suitable for computer representation and processing; it is the key technology for building, inside a computer, a virtual reality that expresses the objective world. The main technology for achieving 3D reconstruction is Kinect Fusion, a real-time simultaneous localization and mapping (SLAM) technique based on color and depth images: a Kinect camera is moved around the 3D object to be reconstructed, so that a 3D model of the object is reconstructed in real time. However, the working scene size and resolution of Kinect Fusion are limited by the video memory of the graphics processor (GPU), and the calculation ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/55
CPC: G06T2207/10016; G06T2207/30244
Inventors: 陈子冲, 章国锋, 吕朝阳, 吕培
Owner: HANGZHOU HUAWEI DIGITAL TECH