
Real-time three-dimensional scene reconstruction system and method based on inertia and depth vision

A real-time 3D, depth-vision technology, applied to processing-step details, image data processing, 3D modeling, etc. It addresses problems such as reduced system robustness, reconstruction difficulty, and algorithm failure, achieving improved accuracy, strong fault tolerance, and high efficiency.

Status: Inactive | Publication date: 2016-10-26
武汉盈力科技股份有限公司
Cites: 3 | Cited by: 53

AI Technical Summary

Problems solved by technology

If the shape of the surrounding objects changes too much, the algorithm fails.
[0008] At present, real-time 3D environment reconstruction systems based on RGB-D sensors can achieve good reconstruction results using visual odometry and ICP when the surrounding environment is rich in feature information. However, when environmental conditions change, for example through lighting changes or changes in the surrounding features, RGB-D real-time 3D environment reconstruction runs into difficulty.
Once real-time 3D environment reconstruction is interrupted by such changes in environmental conditions, it is hard for the system to resume, which reduces the robustness of the entire system.
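
As an illustration of the ICP step referred to above (a sketch using the open-source Open3D library, not the patent's own implementation; the voxel size and distance threshold are assumptions), frame-to-frame point-cloud alignment can be performed as follows:

```python
import numpy as np
import open3d as o3d  # open-source library, used here only for illustration


def icp_relative_motion(source_pcd, target_pcd, voxel=0.02, max_dist=0.05):
    """Estimate the relative transform between two successive depth-frame
    point clouds with point-to-plane ICP (parameters are assumptions)."""
    src = source_pcd.voxel_down_sample(voxel)
    tgt = target_pcd.voxel_down_sample(voxel)
    for pcd in (src, tgt):
        # point-to-plane ICP needs surface normals
        pcd.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=30))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation, result.fitness  # 4x4 pose delta, overlap score
```

When the scene geometry is too uniform or changes sharply between frames (the failure modes described above), the overlap score drops and the estimated transform becomes unreliable.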


[Drawings: three figures illustrating the real-time three-dimensional scene reconstruction system and method based on inertia and depth vision]


Embodiment Construction

[0045] The present invention will be further described below in conjunction with the accompanying drawings and embodiments. The following examples help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that the sensor combination described here is only one particular case; the invention is not limited to IMU, RGB, and depth (D) sensors. Based on the concept of the invention, various modifications and improvements can be made, such as using a binocular vision system instead of a monocular camera, or using an IMU with higher precision. Inspired by the algorithmic idea of the present invention, those skilled in the fields of computer vision and multi-sensor fusion can also make improvements, such as using better feature tracking algorithms to enhance the accuracy with which visual odometry estimates relative motion. These improvements...
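
As an example of the feature-tracking step mentioned above (an OpenCV-based sketch, not the patented method; the detector choice and RANSAC parameters are assumptions), the relative rotation and translation direction between two RGB frames can be recovered from matched features via the essential matrix:

```python
import cv2
import numpy as np


def relative_motion_from_features(img1, img2, K):
    """Estimate relative rotation R and unit-scale translation t between two
    grayscale frames from ORB feature matches; K is the 3x3 camera matrix."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t
```

The monocular translation is only known up to scale, which is one reason depth and IMU measurements are fused with the visual estimate.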



Abstract

The invention discloses a real-time three-dimensional scene reconstruction system and method based on inertia and depth vision. Over a short time interval, the relative displacement and relative attitude change obtained by integrating the IMU measurements are relatively accurate, so the IMU estimate ΔP_IMU can be used as an approximate true value against which ΔP_RGB and ΔP_D are compared. If the difference between ΔP_RGB and ΔP_IMU exceeds a certain threshold, the relative displacement and attitude change computed by the RGB camera from feature tracking across successive frames is judged insufficiently accurate; similarly, if the difference between ΔP_D and ΔP_IMU exceeds a certain threshold, the relative displacement and attitude change obtained by the ICP algorithm from matching point clouds of successive frames is judged insufficiently accurate. The data are then fused based on these comparison results, and the relative displacement and attitude change of the device are estimated in real time, improving three-dimensional scene reconstruction precision and device fault tolerance.
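
A minimal sketch of the comparison-and-fusion logic described in this abstract, assuming a hypothetical 6-vector pose-delta representation (translation in metres, rotation vector in radians) and illustrative thresholds; the actual thresholds and fusion rule are design choices of the patent:

```python
import numpy as np


def delta_error(dP_a, dP_b):
    """Translation and rotation differences between two relative-pose estimates,
    each a hypothetical 6-vector (tx, ty, tz, rx, ry, rz)."""
    return (np.linalg.norm(dP_a[:3] - dP_b[:3]),
            np.linalg.norm(dP_a[3:] - dP_b[3:]))


def fuse_deltas(dP_imu, dP_rgb, dP_d, t_thresh=0.05, r_thresh=0.05):
    """Treat the short-term IMU integration as the approximate truth, discard
    visual/depth estimates that disagree with it beyond a threshold, and
    average whatever survives (averaging is an illustrative fusion rule)."""
    kept = [dP_imu]
    for dP in (dP_rgb, dP_d):
        t_err, r_err = delta_error(dP, dP_imu)
        if t_err < t_thresh and r_err < r_thresh:
            kept.append(dP)  # consistent with the IMU: use it
    return np.mean(kept, axis=0)


# Made-up numbers: the RGB estimate agrees with the IMU, the ICP estimate drifted.
dP_imu = np.array([0.10, 0.00, 0.02, 0.00, 0.01, 0.00])
dP_rgb = np.array([0.11, 0.01, 0.02, 0.00, 0.01, 0.00])
dP_d = np.array([0.40, 0.20, 0.05, 0.10, 0.00, 0.02])
print(fuse_deltas(dP_imu, dP_rgb, dP_d))
```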

Description

Technical field
[0001] The invention relates to the fields of computer vision and multi-sensor fusion for navigation and mapping, and in particular to a system and method for real-time three-dimensional scene reconstruction based on inertial sensors and depth vision sensors.
Background technique
[0002] Around 2010, the Israeli company PrimeSense made a breakthrough in the miniaturization and modularization of structured-light depth sensors and cooperated with Microsoft to develop the Kinect sensor. The Kinect integrates a color RGB camera and a depth (D) camera and can quickly acquire point cloud data of object surfaces within roughly 0.5 to 4 meters of the device. It was introduced as a somatosensory device for capturing changes in players' movements. Once the technology reached the market, it attracted the attention of both industry and academia, and in the following years similar sensors were launched one after another, such as the Structure Sensor from Occipital ...
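
The depth camera in such a sensor produces a per-pixel depth image; back-projecting it through pinhole intrinsics gives the surface point cloud mentioned above. A minimal sketch, with hypothetical Kinect-like intrinsics and the 0.5-4 m working range used as a validity filter:

```python
import numpy as np


def depth_to_point_cloud(depth, fx, fy, cx, cy, z_min=0.5, z_max=4.0):
    """Back-project an H x W depth image (metres) into an N x 3 point cloud,
    keeping only depths inside the sensor's useful range."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = (depth > z_min) & (depth < z_max)
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x[valid], y[valid], depth[valid]], axis=-1)


# Synthetic example: a flat surface 1.5 m away, hypothetical intrinsics
depth = np.full((480, 640), 1.5)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```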


Application Information

IPC(8): G06T17/00
CPC: G06T17/00; G06T2200/08
Inventors: 巨辉, 杨斌, 曹顺
Owner: 武汉盈力科技股份有限公司