Visual inertial navigation SLAM method based on ground plane hypothesis

A ground-plane-based technology, applied in image data processing, instrumentation, computing, etc., to solve problems such as the decreased accuracy and coherence of pure vision-based SLAM

Active Publication Date: 2018-10-30
NORTHEASTERN UNIV
Cites 7 references; cited by 49


Problems solved by technology

However, the accuracy and coherence of traditional pure vision-based SLAM are degraded, for example by lighting changes and pedestrian occlusion.



Examples


Embodiment Construction

[0143] The present invention will be further elaborated below in conjunction with the accompanying drawings of the description.

[0144] The present invention proposes a visual-inertial SLAM method (VIGO) based on ground plane assumptions, adding feature points on the ground and feature points on planar road signs as map features to realize SLAM. To make positioning more robust and continuous, and to recover the true scale, the method adds an inertial sensor and incorporates the IMU pre-integration data into the optimization framework. In this way, the estimate of the camera pose can be constrained globally, which greatly improves accuracy. In addition, in the reconstructed 3D map, the ground area can be clearly constructed, providing richer information for subsequent AR or robot applications. The overall framework is shown in Figure 1.
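The IMU pre-integration mentioned above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes zero IMU biases and ignores gravity compensation and noise propagation, simply accumulating the relative rotation, velocity, and position deltas between two keyframes from raw gyroscope and accelerometer samples.

```python
import numpy as np

def skew(w):
    """Skew-symmetric (hat) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    """Rodrigues' formula: rotation matrix from a rotation vector."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3) + skew(phi)  # first-order approximation
    a = phi / theta
    A = skew(a)
    return np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * (A @ A)

def preintegrate(gyro, accel, dt):
    """Accumulate relative rotation dR, velocity dv, and position dp
    between two keyframes from per-sample IMU readings (biases assumed
    zero; gravity and noise covariance omitted for brevity)."""
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(np.asarray(w) * dt)
    return dR, dv, dp
```

In a full pipeline, these deltas (and their Jacobians with respect to the bias) become a single relative-motion factor between consecutive keyframes in the optimization graph, which is what lets the IMU constrain the camera pose estimate globally.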

[0145] The visual inertial navigation SLAM method based on the ground plane hypothesis of the present invention comprises...



Abstract

The invention relates to a visual inertial navigation SLAM method based on the ground plane hypothesis. According to the method, feature points are extracted from the image and IMU pre-integration is performed; a camera projection model is established, and intrinsic calibration of the camera and extrinsic calibration between the IMU and the camera are performed. The system is initialized: the visually observed point cloud and the camera pose are aligned to the IMU pre-integration, and the ground equation and the camera pose are recovered. The ground is initialized to obtain a ground equation; the ground equation under the current camera pose is determined and back-projected into the image coordinate system, yielding a more accurate ground region. Based on the state estimate, observation models for all sensors are derived; camera observations, IMU observations, and ground feature observations are fused for state estimation using a graph optimization model, with sparse graph optimization and gradient descent realizing the overall optimization. Compared with previous algorithms, the precision of the method is greatly improved: the estimate of the camera pose can be constrained globally, and accuracy is therefore greatly improved.
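The back-projection of the ground equation into the image coordinate system, described above, can be illustrated with a minimal sketch (an assumption-laden example, not the patented algorithm): given camera intrinsics K and a ground plane n·X + d = 0 expressed in the camera frame, each pixel's back-projected ray either intersects the plane in front of the camera or not, which delimits a candidate ground region.

```python
import numpy as np

def ground_point(u, v, K, n, d):
    """Intersect the back-projected ray of pixel (u, v) with the ground
    plane n^T X + d = 0 (all quantities in the camera frame).
    Returns the 3D ground point, or None if the ray misses the plane
    or hits it behind the camera. K, n, d are assumed known from
    calibration and ground initialization."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # direction, depth 1
    denom = n @ ray
    if abs(denom) < 1e-9:
        return None          # ray parallel to the plane
    s = -d / denom           # depth along the ray at the intersection
    if s <= 0:
        return None          # intersection behind the camera
    return s * ray           # 3D point on the ground plane
```

Scanning this test over all pixels marks the image region whose rays hit the plane with positive depth, which is the rough ground mask the method then refines.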

Description

technical field

[0001] The invention relates to positioning and mapping technology, in particular to a visual inertial navigation SLAM method based on ground plane assumptions.

Background technique

[0002] SLAM stands for simultaneous localization and mapping. The camera collects images in real time, estimates the camera's motion trajectory from frame-by-frame images, and reconstructs a map of the scene through which the camera moves. Traditional visual SLAM uses points and lines with obvious color changes in the scene as map landmarks; these have no practical meaning and no contextual semantics, and are seriously affected by lighting and pedestrian occlusion in environments such as shopping malls. To allow a robot to move freely in indoor and outdoor environments, and to integrate AR applications more realistically into the scene, SLAM has become a research hotspot in recent years; the monocular camera is small, low-cost, and can be easily em...

Claims


Application Information

IPC(8): G06T7/73, G06T7/80
CPC: G06T7/73, G06T7/80
Inventors: Yu Ruiyun (于瑞云), Yang Shuo (杨硕), Shi Jia (石佳)
Owner: NORTHEASTERN UNIV