Simultaneous positioning and mapping method for autonomous mobile platform in rescue scene

A technology for autonomous mobile platforms, applied to image enhancement, image analysis, and image data processing, that addresses problems such as smoke interference, uneven illumination, and random noise, and achieves the effect of improving robustness.

Active Publication Date: 2020-08-25
SOUTH CHINA UNIV OF TECH

Problems solved by technology

[0004] For positioning and mapping in rescue scenarios based on multi-camera omni-directional visi...




Embodiment Construction

[0062] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0063] Step S1: acquire omnidirectional images with the four-eye camera module and preprocess them to obtain defogged images.

[0064] Smoke in the rescue scene increases image brightness and reduces saturation. A deep learning algorithm is used for image defogging, usi...
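As a minimal sketch of what such an end-to-end defogging step could look like, the snippet below applies an AOD-Net-style recovery J(x) = K(x)·I(x) − K(x) + b. This formulation, the constant bias b, and the stubbed K map are assumptions for illustration; in the actual method K(x) would be produced per pixel by the trained convolutional network, which is not reproduced here.

```python
import numpy as np

def dehaze_aod_style(image, K, b=1.0):
    """Apply the AOD-Net-style recovery J(x) = K(x) * I(x) - K(x) + b.

    image : float array in [0, 1], shape (H, W, 3), the hazy input I(x).
    K     : per-pixel map that a trained CNN would output (stubbed here).
    b     : constant bias term of the recovery formula.
    """
    J = K * image - K + b
    return np.clip(J, 0.0, 1.0)  # keep the result a valid image

# Illustration with a synthetic "smoky" image: bright and low-contrast.
hazy = np.full((4, 4, 3), 0.8)   # hypothetical hazy input
K = np.full((4, 4, 1), 1.5)      # stand-in for the CNN's learned output
clear = dehaze_aod_style(hazy, K)
# 1.5 * 0.8 - 1.5 + 1.0 = 0.7, i.e. the bright haze bias is reduced
```

Note that with K > 1 the transform darkens the uniformly bright input, consistent with removing the brightness bias that smoke introduces.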



Abstract

The invention discloses a simultaneous positioning and mapping method for an autonomous mobile platform in a rescue scene. The method is applied to a search-and-rescue autonomous mobile platform and can be used for positioning and mapping in extreme environments such as fire rescue scenes and complex, dangerous accident scenes. It comprises three parts: sensor information processing, pose estimation, and pose correction. First, operations such as deblurring and feature extraction are performed on the omnidirectional image information acquired by a four-eye camera module; second, the result is fused with measurements from the inertial navigation sensor to estimate the pose; finally, global pose optimization and correction are carried out. A fully automatic end-to-end defogging method is designed based on a convolutional neural network; it requires no post-processing step, has a wide application range, and can defog both indoor and natural scenes. Meanwhile, a relatively stable binocular visual SLAM method based on the feature-point method is adopted, fusing IMU sensor data to perform positioning and mapping of the autonomous mobile platform.
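The fusion stage of the pipeline above can be sketched very loosely as follows. The linear blend, the weight alpha, and the example pose vectors are all hypothetical; the patent's actual method performs a tightly coupled visual-inertial optimization, not this simple complementary-filter-style average, which is shown only to make the "fuse visual and IMU estimates" step concrete.

```python
import numpy as np

def fuse_pose(visual_pose, imu_pose, alpha=0.9):
    """Complementary-style blend of a visual pose estimate with an
    IMU-propagated pose (translation only, for illustration).
    alpha weights the visual estimate; a real system would instead
    solve a joint optimization over both sensors' residuals."""
    return alpha * visual_pose + (1.0 - alpha) * imu_pose

vis = np.array([1.0, 0.0, 0.5])   # hypothetical stereo-SLAM translation (m)
imu = np.array([1.1, 0.1, 0.4])   # hypothetical IMU dead-reckoned translation (m)
fused = fuse_pose(vis, imu)        # -> [1.01, 0.01, 0.49]
```

The blend pulls the drift-prone IMU estimate toward the visual one while still letting the IMU smooth frames where visual tracking is weak, which is the qualitative role IMU fusion plays in the described method.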

Description

Technical field [0001] The invention belongs to the technical field of positioning and navigation for autonomous mobile platforms, and relates to a positioning and mapping method for an autonomous mobile platform in a rescue scene based on multi-camera stereo vision and an IMU sensor. Background technique [0002] In recent years, safety issues in special environments such as fire and the chemical industry have attracted increasing attention. Extreme environmental accidents, such as fire rescue sites and complex dangerous accident sites, are characterized by complexity and high risk, and are prone to light pollution and smog. Considering the complexity and danger of the environment, the present invention proposes a positioning and mapping method for an autonomous mobile platform in a rescue scene, which explores the environment through autonomous positioning and mapping and, to a certain extent, improves the robustness and precision of positioning and mapping in this environment. ...

Claims


Application Information

IPC(8): G06T5/00; G06T7/73; G06T7/246; G06K9/62
CPC: G06T5/003; G06T7/73; G06T7/246; G06T2207/10028; G06T2207/20164; G06F18/253
Inventor: 向石方, 陈安, 吴忻生, 刘海明, 杨璞光
Owner SOUTH CHINA UNIV OF TECH