
Low-texture plane scene reconstruction method based on a sparse SLAM framework

A scene-reconstruction technology for low-texture environments, applied in image data processing, 3D image processing, instruments, etc. It addresses problems such as tracking that is easily lost, insufficient algorithm robustness, and poor reconstruction quality, improving robustness and achieving good real-time performance.

Active Publication Date: 2018-11-09
BEIHANG UNIV
Cites: 4 · Cited by: 11

AI Technical Summary

Problems solved by technology

[0005] A SLAM system based on sparse feature points performs poorly in low-texture planar areas such as walls and cabinets: the reconstruction quality is poor, tracking is easily lost, and the algorithm is not robust enough. To solve these problems, the invention provides a reconstruction method that integrates plane detection into a feature-point-based SLAM system, allowing this type of SLAM system to be applied to more everyday scenes.




Embodiment Construction

[0022] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0023] The low-texture planar scene reconstruction method based on the sparse SLAM framework of the present invention, as shown in Figure 1, includes the following steps:

[0024] Step 1: Read the RGB image and depth image of each frame of image data through an RGB-D camera (Kinect).
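Each RGB-D frame pairs a color image with a per-pixel depth map, which the later steps back-project into 3D points. The sketch below shows the standard pinhole back-projection; the intrinsic parameters (`FX`, `FY`, `CX`, `CY`) are hypothetical Kinect-v1-like values for illustration, not values given in the patent, and a real system would use calibrated intrinsics.

```python
import numpy as np

# Hypothetical pinhole intrinsics (fx, fy, cx, cy); real values come from
# camera calibration, not from the patent text.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def backproject(depth_mm: np.ndarray) -> np.ndarray:
    """Convert a depth image (millimetres) into an H x W x 3 point cloud
    in the camera frame using the standard pinhole model."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0   # mm -> metres
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack([x, y, z])

# Usage: a flat wall 1 m in front of the camera fills the whole frame
cloud = backproject(np.full((480, 640), 1000, dtype=np.uint16))
print(cloud[240, 320])   # pixel near the principal point -> approx [0, 0, 1]
```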

[0025] Step 2: Each acquired frame is processed in one of two cases, carried out in parallel, according to whether a sufficient number of feature points can be extracted from it:

[0026] Case 1: Under the sparse SLAM framework, feature points are obtained from the grayscale differences between pixels. First, for each frame acquired in Step 1, extract the ORB feature points in the image. Complex texture in the scene to be reconstructed is reflected in the image as a suffici...
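ORB's detection stage rests on exactly this kind of grayscale-difference test: a FAST-style check of whether enough pixels on a ring around a candidate are all brighter or all darker than the center. The toy function below illustrates the idea; it is a simplified stand-in (it ignores FAST's contiguous-arc requirement and ORB's orientation and descriptor stages), not the patent's implementation.

```python
import numpy as np

def is_fast_corner(gray: np.ndarray, r: int, c: int,
                   t: int = 20, needed: int = 12) -> bool:
    """Simplified FAST-style corner test: the pixel (r, c) is a corner
    candidate if at least `needed` of the 16 pixels on a radius-3 ring
    are all brighter or all darker than the center by threshold t."""
    center = int(gray[r, c])
    # 16-point Bresenham circle offsets used by FAST-16
    ring = [(-3, 0), (-3, 1), (-2, 2), (-1, 3), (0, 3), (1, 3), (2, 2),
            (3, 1), (3, 0), (3, -1), (2, -2), (1, -3), (0, -3), (-1, -3),
            (-2, -2), (-3, -1)]
    vals = [int(gray[r + dr, c + dc]) for dr, dc in ring]
    brighter = sum(v > center + t for v in vals)
    darker = sum(v < center - t for v in vals)
    return brighter >= needed or darker >= needed

# Usage: a uniform (low-texture) patch yields no corner; an isolated
# bright pixel against a dark background does.
flat = np.zeros((7, 7), dtype=np.uint8)
spot = flat.copy()
spot[3, 3] = 200
print(is_fast_corner(flat, 3, 3), is_fast_corner(spot, 3, 3))  # False True
```

This is also why low-texture planes defeat a sparse system: on a uniform wall almost every pixel fails the test, so too few feature points survive for tracking.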



Abstract

The invention discloses a low-texture plane scene reconstruction method based on a sparse SLAM framework, comprising the following steps: 1) read the RGB image and depth image of each frame of image data through an RGB-D camera (Kinect); 2) for each acquired image frame, a) obtain the feature points in the image through the grayscale differences between pixels under the sparse SLAM framework, and b) extract the planar areas of the image where sufficient feature points cannot be obtained and use them as plane landmarks; 3) take the coordinates of the feature points in the world coordinate system as map points to build a local map; 4) perform bundle adjustment to optimize the local map and carry out loop detection on it; 5) use both the feature points and the planes as landmarks for loop detection, and perform global bundle adjustment to optimize the pose and trajectory of the camera. The method not only solves the quality problem of low-texture region reconstruction, but also improves the robustness of a sparse feature-point SLAM system.
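Step 2b turns a textureless planar region into a plane landmark, which requires estimating the plane's parameters from the region's back-projected depth points. A common way to do this, sketched below, is a least-squares fit via SVD; this is a generic technique chosen for illustration, and the patent's actual plane-detection procedure may differ.

```python
import numpy as np

def fit_plane(points: np.ndarray):
    """Least-squares plane fit to an N x 3 point set via SVD.
    Returns (n, d) with unit normal n and offset d such that
    n . p + d = 0 for points p on the plane."""
    centroid = points.mean(axis=0)
    # The normal is the direction of least variance of the centred points.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

# Usage: noisy samples of the plane z = 2 (e.g. a wall facing the camera)
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1, 1, (200, 2)),                    # x, y spread
    np.full(200, 2.0) + rng.normal(0, 1e-3, 200),    # z with small noise
])
n, d = fit_plane(pts)   # normal approx (0, 0, +/-1), offset approx -/+2
```

Representing such a region by four plane parameters instead of thousands of raw depth points is what lets the plane serve as a compact landmark in the bundle adjustment of steps 4 and 5.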

Description

Technical Field

[0001] The invention relates to the fields of computer vision, digital image processing, and simultaneous localization and mapping, in particular to a low-texture planar scene reconstruction method based on a sparse SLAM framework.

Background

[0002] In the field of computer graphics, 3D reconstruction refers to recovering the shape and position of objects in 3D space from information such as depth and texture in 2D images captured by cameras. 3D reconstruction based on visual images has the advantages of low cost and a high degree of automation.

[0003] Vision-based 3D reconstruction can be categorized in many ways. In recent years, whether based on RGB-D images or on 3D reconstruction from other monocular cameras, SLAM frameworks have been widely used. The Simultaneous Localization and Mapping (SLAM) problem is to carry out localization and reconstruction at the same time. Compared ...


Application Information

IPC(8): G06T15/04, G06T7/80
CPC: G06T7/85, G06T15/04
Inventors: 赵永嘉, 陈昌杰, 雷小永, 戴树岭
Owner: BEIHANG UNIV