Two-stage three-dimensional scene modeling method

A 3D scene modeling technology applied to the three-dimensional reconstruction of indoor dynamic scenes. It addresses the problem that existing reconstruction methods cannot guarantee accuracy and speed at the same time, and achieves good temporal ordering and continuity.

Active Publication Date: 2021-07-23
ARMY ENG UNIV OF PLA

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to propose a two-stage three-dimensional scene reconstruction method that addresses the inability of existing three-dimensional reconstruction methods to guarantee both accuracy and speed at the same time. The method uses an RGBD camera and an optical camera to perform three-dimensional reconstruction of an indoor environment.

Method used


Image

  • Two-stage three-dimensional scene modeling method

Examples


Embodiment Construction

[0042] The present invention is achieved through the following technical solutions:

[0043] As shown in Figure 1, the present invention discloses a two-stage three-dimensional scene modeling method. The equipment used in the three-dimensional reconstruction method mainly comprises a PC with a GPU, an RGBD camera, and an optical camera. The specific steps are as follows:

[0044] 1) Fix the devices: mount the RGBD camera and the optical camera rigidly on a robot (or hold them by hand) so that data can be collected smoothly and continuously;

[0045] 2) Calibrate the RGBD camera and the optical camera: obtain the RGB intrinsics and depth intrinsics of the RGBD camera, the transfer matrices between the two cameras, and the intrinsic parameters of the optical camera;
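The calibration products named in step 2) — the two sets of intrinsics plus a depth-to-RGB transfer matrix — are typically used to map a depth pixel into the RGB image. The sketch below illustrates that mapping with hypothetical calibration values (`K_depth`, `K_rgb`, and the 25 mm baseline are placeholders, not values from the patent):

```python
import numpy as np

# Hypothetical calibration values; the real ones come from step 2).
K_depth = np.array([[580.0, 0.0, 320.0],
                    [0.0, 580.0, 240.0],
                    [0.0,   0.0,   1.0]])   # depth-camera intrinsics
K_rgb = np.array([[525.0, 0.0, 319.5],
                  [0.0, 525.0, 239.5],
                  [0.0,   0.0,   1.0]])     # RGB-camera intrinsics
T_depth_to_rgb = np.eye(4)                  # depth -> RGB transfer matrix
T_depth_to_rgb[0, 3] = 0.025                # e.g. a 25 mm baseline along x

def depth_pixel_to_rgb_pixel(u, v, z):
    """Back-project depth pixel (u, v) with depth z (metres) to a 3D point,
    move it into the RGB camera frame, and reproject it into the RGB image."""
    p_depth = z * np.linalg.inv(K_depth) @ np.array([u, v, 1.0])
    p_rgb = (T_depth_to_rgb @ np.append(p_depth, 1.0))[:3]
    uv = K_rgb @ p_rgb
    return uv[:2] / uv[2]

print(depth_pixel_to_rgb_pixel(320, 240, 1.0))
```

A pixel at the depth camera's principal point maps to a point offset from the RGB principal point by the baseline scaled by focal length over depth, which is the usual disparity relation.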

[0046] 3) Collect scene images: use ROS to connect to the RGBD camera, capture each RGB frame and depth frame, align the two via timestamps, and collect optical images at the same t...
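Step 3) pairs RGB and depth frames by timestamp. One minimal way to do this (a sketch, not the patent's implementation; in ROS this is typically handled by `message_filters.ApproximateTimeSynchronizer`) is a nearest-neighbour match on sorted timestamps with a tolerance:

```python
import bisect

def align_by_timestamp(rgb_stamps, depth_stamps, max_dt=0.02):
    """For each RGB timestamp, find the nearest depth timestamp and keep
    the pair only if they differ by at most max_dt seconds.
    Both input lists are assumed sorted ascending."""
    pairs = []
    for t in rgb_stamps:
        i = bisect.bisect_left(depth_stamps, t)
        # The nearest depth stamp is either just before or just after t.
        candidates = depth_stamps[max(0, i - 1):i + 1]
        if not candidates:
            continue
        best = min(candidates, key=lambda d: abs(d - t))
        if abs(best - t) <= max_dt:
            pairs.append((t, best))
    return pairs

print(align_by_timestamp([0.00, 0.10, 0.20], [0.01, 0.12, 0.35]))
```

Unmatched frames are simply dropped, which keeps the paired stream temporally consistent at the cost of a slightly lower frame rate.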



Abstract

The invention discloses a two-stage three-dimensional scene modeling method, belonging to the field of computer vision. The method comprises the following steps: RGB data and depth data are collected simultaneously through an RGBD camera and a monocular camera, and modeling is conducted at different speeds through a three-dimensional reconstruction algorithm based on visual simultaneous localization and mapping (SLAM) and a three-dimensional reconstruction algorithm based on structure from motion. In the first stage, the SLAM reconstruction progress is recorded in real time; image frame registration, pose estimation, point cloud matching, loop-closure detection, and model fusion are carried out; and a sparse reconstruction model is generated in a short time, which supports real-time positioning and navigation and enhances the temporal continuity of the data. In the second stage, the structure-from-motion reconstruction algorithm is applied to a large number of high-definition RGB images to generate a dense, complete reconstruction model with rich details. The method meets the requirement that a reconstruction result assist positioning within a short time, has good timeliness, and can provide a dense reconstruction result with high reconstruction precision.

Description

Technical field

[0001] The invention belongs to the field of three-dimensional imaging, and in particular relates to a three-dimensional reconstruction method for indoor dynamic scenes based on an RGBD camera.

Background technique

[0002] The emergence of Simultaneous Localization and Mapping (SLAM) technology has largely solved the problems of image recognition, mapping, and positioning for intelligent robots: sensors mounted on the robot (such as binocular and monocular cameras) acquire information about the environment, which is used to construct a map and to estimate the robot's own position and attitude.

[0003] RGBD cameras support better reconstruction because their depth sensors can acquire depth information in real time or store it for later use; after further computation, this yields higher-quality 3D reconstructions.

[0004] Structure from Motion (SFM) is a 3D reconstruction method that derives 3D information from time-series 2D images. The advantage is that as many pictu...
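The structure-from-motion approach mentioned in [0004] recovers 3D points from 2D observations in multiple views. The core step is triangulation; below is a minimal linear (DLT) triangulation sketch for one point seen in two views, using made-up camera matrices (`K`, the 0.1 m baseline, and `X_true` are illustrative assumptions, not values from the patent):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: pixel coordinates (u, v)."""
    # Each observation contributes two linear constraints on the
    # homogeneous point X; stack them and take the SVD null vector.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

# Two hypothetical views: a reference camera and one shifted 0.1 m along x.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Project a known point into both views, then recover it by triangulation.
X_true = np.array([0.2, 0.1, 2.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))
```

With noise-free observations the triangulated point matches `X_true` to numerical precision; a full SFM pipeline additionally estimates the camera poses themselves and refines everything with bundle adjustment.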

Claims


Application Information

IPC(8): G06T17/00; G06T7/80; G06K9/46
CPC: G06T17/00; G06T7/85; G06T2207/10028; G06T2207/10024; G06V10/462; G06V10/44
Inventor 芮挺王新晴刘昕晖王东徐飞翔杨成松王继新赵华琛郑南蒋群艳
Owner ARMY ENG UNIV OF PLA