All-weather unmanned autonomous working platform in unknown environment

An unmanned autonomous technology for unknown environments, applied in the field of artificial intelligence and visual navigation. It addresses problems such as poor accuracy, insufficient autonomy, and poor environmental adaptability, and achieves improved obstacle-avoidance accuracy, reduced computing time, and high accuracy.

Active Publication Date: 2020-02-21
JILIN UNIV


Problems solved by technology

[0009] To overcome the defects of poor precision, insufficient autonomy, and poor environmental adaptability of existing tools for unknown-environment detection and disaster-area search and rescue described in the background above, the present invention provides an all-weather unknown-environment unmanned autonomous working platform based on V-SLAM technology and multi-spectral image fusion technology. The platform can take the form of a drone, an unmanned rover, or a detection robot.



Embodiment Construction

[0042] The technical scheme of the present invention is described in detail below with reference to the accompanying drawings:

[0043] An unmanned autonomous working platform for all-weather unknown environments comprises a visual positioning module, a multi-spectral image fusion module, an image recognition module, a map construction module, and a loop-closure and return detection module. The visual positioning module uses a GCN (graph convolutional neural network) to select key frames from the video stream, generates binary feature-point descriptors, and solves for the camera pose. The map construction module receives sparse feature point cloud data from the positioning module and draws a local map. The multi-spectral image fusion module fuses the key-frame images and passes the result to the image recognition module, which classifies the multi-spectral fused image, finds target objects, and constructs a semantic map. After the ...
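The keyframe-selection step above can be sketched as follows. This is a minimal illustrative stand-in, not the patent's method: the patent uses a GCN to pick key frames and emit binary descriptors, whereas here hypothetical 256-bit binary descriptors (represented as Python integers) are matched by Hamming distance, and a frame is promoted to keyframe when its descriptor overlap with the previous keyframe drops below a threshold. The functions `hamming`, `match_fraction`, and `select_keyframes`, and the thresholds, are all assumptions for illustration.

```python
# Hypothetical sketch of keyframe selection on binary feature descriptors.
# NOT the patented GCN pipeline -- a simple matching heuristic that shows
# the decision rule: keep a frame when it overlaps too little with the
# last keyframe.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match_fraction(desc_a, desc_b, max_dist=64):
    """Fraction of descriptors in frame A that have a close match in B."""
    if not desc_a:
        return 0.0
    hits = sum(
        1 for d in desc_a
        if min(hamming(d, e) for e in desc_b) <= max_dist
    )
    return hits / len(desc_a)

def select_keyframes(frames, threshold=0.5):
    """frames: list of descriptor lists, one list per video frame.
    Returns indices of selected keyframes (frame 0 always included)."""
    keyframes = [0]
    for i in range(1, len(frames)):
        if match_fraction(frames[i], frames[keyframes[-1]]) < threshold:
            keyframes.append(i)
    return keyframes
```

For example, two identical frames followed by a completely different one would yield keyframes at indices 0 and 2, since the third frame shares no close descriptors with the first.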



Abstract

The invention discloses an unmanned autonomous working platform based on an all-weather unknown environment, and belongs to the field of artificial intelligence and visual navigation. The platform comprises five modules: a stereoscopic vision positioning module, an infrared visible light fusion module, an image recognition module, a map construction module and a loop-back and return detection module. The visual positioning module and the image recognition module share a graph convolutional neural network framework, the visual positioning module selects key frames to perform feature matching and visual positioning, the image recognition module performs semantic classification on a point cloud local map, and the map construction module performs point cloud splicing to form a global depth dense semantic map. The deep neural network is introduced to improve the feature extraction effect and save the extraction time. Monocular vision distance measurement is adopted, so that the multi-parallax registration time is saved. Multi-spectral fusion of key frame images is carried out, all-weather efficient work is achieved, and the detection rate of shielded targets is increased.
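The infrared-visible fusion step described in the abstract can be illustrated with a toy example. The patent does not publish its exact fusion rule, so the per-pixel weighted average below (function `fuse` and weight `w_ir` are assumptions) only demonstrates the idea: a hot infrared pixel can reveal an occluded or low-light target that the visible channel misses, which is how fusion raises the detection rate of shielded targets.

```python
# Illustrative infrared-visible image fusion (assumed weighted scheme,
# not the patented method). Images are nested lists of grayscale values
# with identical shapes.

def fuse(visible, infrared, w_ir=0.5):
    """Per-pixel weighted average of a visible and an infrared image.
    w_ir controls how strongly the infrared channel contributes."""
    fused = []
    for row_v, row_i in zip(visible, infrared):
        fused.append([(1 - w_ir) * v + w_ir * i
                      for v, i in zip(row_v, row_i)])
    return fused
```

With `w_ir=0.5`, a pixel that is dark in the visible image (e.g. 0) but bright in infrared (e.g. 200) fuses to a visible mid-gray (100), so the downstream recognizer can still detect it.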

Description

Technical Field

[0001] The invention relates to the technical field of artificial intelligence and visual navigation, and in particular to an all-weather unknown-environment unmanned autonomous working platform based on V-SLAM (stereo-vision simultaneous localization and mapping) and multi-spectral image fusion technology.

Background Technique

[0002] For complex and unknown environments whose geography changes greatly in a short period of time, such as disaster areas and fire scenes after earthquakes, there is currently no efficient search-and-rescue and map-construction equipment.

[0003] In view of the above situation, current map construction is mainly carried out with helicopters in coordination with satellite positioning systems. The helicopter solution has poor maneuverability and high search costs, and requires ground base stations to plan paths in real time; the satellite mapping method involves a large amount of data and high calcu...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00, G06T5/50, G06N3/04
CPC: G06T19/003, G06T5/50, G06T2207/10028, G06N3/045, Y02A90/10
Inventors: 张旺, 黄康, 齐昊罡, 蔡炜烔, 赵风尚, 夏希林, 郭相坤
Owner: JILIN UNIV