Outdoor large-scale scene three-dimensional mapping method fusing multiple sensors

A multi-sensor fusion technology for large outdoor scenes, applied in instruments, image analysis, and image enhancement, addressing problems such as inaccurate positioning, inability to build 3D maps, and non-uniform motion distortion.

Active Publication Date: 2021-04-09
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0003] However, pure visual SLAM requires moderate lighting conditions and distinct image features, and cannot construct 3D maps outdoors. Laser SLAM is prone to non-uniform motion distortion during motion, and its positioning is inaccurate in degraded scenes.
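To illustrate the non-uniform motion distortion problem, the sketch below (not from the patent; the function name, the yaw-only planar motion model, and the normalized per-point timestamps are my assumptions) removes distortion from one LiDAR sweep by interpolating the sensor pose across the sweep — the role a high-frequency odometry pose plays in the fusion described later:

```python
import numpy as np

def undistort_scan(points, timestamps, pose_start, pose_end):
    """Remove motion distortion from one LiDAR sweep by linearly
    interpolating the sensor pose over the sweep duration.

    points     : (N, 3) raw points in the sensor frame
    timestamps : (N,) per-point times normalized to [0, 1] over the sweep
    pose_start : (translation 3-vector, yaw) at sweep start
    pose_end   : (translation 3-vector, yaw) at sweep end
    Returns the points re-expressed in the end-of-sweep frame.
    (Planar, yaw-only model for brevity; a real system interpolates
    full SE(3) poses, e.g. with quaternion slerp.)
    """
    t0, yaw0 = pose_start
    t1, yaw1 = pose_end
    out = np.empty_like(points)
    for i, (p, s) in enumerate(zip(points, timestamps)):
        # Interpolate the pose at the instant this point was measured.
        t = (1 - s) * np.asarray(t0) + s * np.asarray(t1)
        yaw = (1 - s) * yaw0 + s * yaw1
        c, sn = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -sn, 0], [sn, c, 0], [0, 0, 1]])
        p_world = R @ p + t
        # Re-express the point in the end-of-sweep frame.
        c1, s1 = np.cos(yaw1), np.sin(yaw1)
        R1 = np.array([[c1, -s1, 0], [s1, c1, 0], [0, 0, 1]])
        out[i] = R1.T @ (p_world - np.asarray(t1))
    return out
```

A point measured at the very end of the sweep (timestamp 1) is left unchanged, while earlier points are shifted to compensate for the motion that occurred after they were captured.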

Method used



Examples


Embodiment Construction

[0056] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0057] It should be pointed out that the following detailed description is exemplary and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0058] It should be noted that the terminology used here is only for describing specific implementations, and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular is intended to include the plural. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, devices, components and/or combina...



Abstract

The invention provides an outdoor large-scale scene three-dimensional mapping method fusing multiple sensors. The implementation is divided into two modules: a visual-inertial odometry module, and a laser odometry and mapping module. The visual-inertial odometry module comprises optical flow tracking, IMU pre-integration, initialization, sliding-window optimization, marginalization, and bag-of-words model construction. The laser odometry and mapping module comprises point cloud segmentation, point cloud distortion removal, feature extraction with inter-frame matching, loop closure detection, and mapping. Compared with a single-LiDAR mapping scheme, the method fuses the high-frequency pose from visual-inertial odometry and offers better point cloud distortion removal, higher loop closure detection precision, and higher mapping precision. It addresses the low precision of outdoor large-scene three-dimensional maps and provides a breakthrough for the further development of unmanned driving.
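The IMU pre-integration step named in the abstract can be sketched as follows. This is a minimal illustration under my own assumptions, not the patent's implementation: the function name is hypothetical, and bias estimation and noise covariance propagation (which a real sliding-window optimizer needs) are omitted.

```python
import numpy as np

def preintegrate_imu(accels, gyros, dt):
    """Accumulate the relative rotation, velocity change, and position
    change between two keyframes from raw IMU samples -- the quantities
    a sliding-window optimizer consumes as a single constraint.

    accels : list of (3,) body-frame accelerations (gravity-free here)
    gyros  : list of (3,) body-frame angular velocities (rad/s)
    dt     : sample period in seconds
    """
    dR = np.eye(3)    # accumulated relative rotation
    dv = np.zeros(3)  # accumulated velocity change
    dp = np.zeros(3)  # accumulated position change
    for a, w in zip(accels, gyros):
        # Integrate position and velocity using the current rotation.
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        # Rotation increment via Rodrigues' formula for the angle w*dt.
        theta = w * dt
        angle = np.linalg.norm(theta)
        if angle < 1e-12:
            R_inc = np.eye(3)
        else:
            k = theta / angle
            K = np.array([[0, -k[2], k[1]],
                          [k[2], 0, -k[0]],
                          [-k[1], k[0], 0]])
            R_inc = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
        dR = dR @ R_inc
    return dR, dv, dp
```

The point of pre-integration is that these accumulated quantities depend only on the IMU samples between the two keyframes, so they need not be re-integrated each time the optimizer adjusts the keyframe states.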

Description

Technical Field

[0001] The invention relates to the technical field of unmanned driving, in particular to a multi-sensor fusion three-dimensional mapping method for outdoor large scenes.

Background

[0002] The application of Simultaneous Localization and Mapping (SLAM) to unmanned vehicles has attracted the attention of more and more researchers. Its purpose is to enable an unmanned vehicle, once GPS fails and without prior information, to complete pose estimation and navigation on its own, using its on-board sensors within the environmental map built by SLAM. Mainstream SLAM methods can be divided into two types according to sensor: image-based visual SLAM and LiDAR-based laser SLAM. In addition, visual SLAM integrating an inertial measurement unit (IMU) is also a current research hotspot.

[0003] However, pure visual SLAM requires moder...
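To make the visual side concrete, the bag-of-words ("word bag") loop detection mentioned in the abstract can be sketched as a cosine-similarity search over visual-word histograms. This is an illustrative minimum under my own assumptions (hypothetical function names and threshold); production systems typically use a vocabulary tree such as DBoW2 and exclude temporally adjacent keyframes:

```python
import numpy as np

def bow_similarity(hist_a, hist_b):
    """Cosine similarity between two bag-of-words histograms,
    where each bin counts occurrences of one visual word."""
    a = hist_a / (np.linalg.norm(hist_a) + 1e-12)
    b = hist_b / (np.linalg.norm(hist_b) + 1e-12)
    return float(a @ b)

def detect_loop(query_hist, database, threshold=0.8):
    """Return the index of the best-matching past keyframe whose
    similarity exceeds the threshold, or None if no loop is found.
    (Real systems also skip keyframes close in time to the query.)"""
    best_i, best_s = None, threshold
    for i, h in enumerate(database):
        s = bow_similarity(query_hist, h)
        if s > best_s:
            best_i, best_s = i, s
    return best_i
```

A detected loop supplies a relative-pose constraint that the mapping module can use to correct accumulated drift.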

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/05; G06K9/62; G06T7/80; G06K9/46
CPC: G06T17/05; G06T7/80; G06T2207/10028; G06V10/44; G06F18/22; G06F18/23213
Inventors: 彭育辉 (Peng Yuhui), 林晨浩 (Lin Chenhao), 马中原 (Ma Zhongyuan), 钟聪 (Zhong Cong)
Owner FUZHOU UNIV