Autonomous navigation method and system and map modeling method and system

An autonomous navigation and map modeling technology in the field of navigation, which addresses the problem that unmanned aerial vehicles cannot effectively navigate and fly autonomously in complex indoor scenes

Active Publication Date: 2014-07-23
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

However, in complex indoor scenes, due to factors such as building occlusion, unmanned aerial vehicles cannot effectively use GPS positioning technology for spatial positioning, which makes it impossible for them to perform effective autonomous navigation flight in such scenes.
[0004] Secondly, in complex indoor scenes, autonomous navigation technology requires a more accurate environmental map, whereas existing SLAM algorithms can only construct a sparse map model with an error of less than 5% of the overall environmental scale.
At the same time, laser scanning systems, which can build high-precision map models with an error of less than 1% of the overall environmental scale, are not suitable for unmanned aerial vehicles flying indoors, so the method for building a high-precision map model in an indoor environment also needs to be improved.

Examples

Embodiment Construction

[0034] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0035] In describing the present invention, it is to be understood that the orientation or positional relationship indicated by the terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", etc. is based on the orientation or positional relationship shown in the drawings, and is only for the convenience of describing the present invention and simplifying the description...

Abstract

The invention discloses an autonomous navigation method and system and a map modeling method and system for an unmanned aerial vehicle. The autonomous navigation method comprises the steps of: controlling the unmanned aerial vehicle to take off, and collecting videos of the scenes corresponding to the unmanned aerial vehicle at all collecting time points; obtaining characteristic points in the videos of the scenes corresponding to all the collecting time points; generating a flight path of the unmanned aerial vehicle according to the characteristic points in the videos of the scenes corresponding to all the collecting time points; generating a first map model according to the flight path of the unmanned aerial vehicle and the videos of the scenes corresponding to all the collecting time points; and carrying out autonomous navigation on the unmanned aerial vehicle according to the first map model. The method analyzes and recognizes the videos of the scenes collected at all the collecting time points to carry out autonomous navigation, so that the unmanned aerial vehicle can navigate autonomously in an indoor scene.
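The abstract outlines a video-driven pipeline: extract feature points from the scene video at each collecting time point, derive the flight path from those feature points, and then build a map model from the path and the video. The sketch below only illustrates that general idea with off-the-shelf components; it is not the patent's disclosed implementation. The OpenCV ORB features, the essential-matrix pose recovery, the assumed camera intrinsics K, and the pose-chaining convention are all assumptions made for illustration, and a monocular pipeline like this recovers translation only up to an unknown scale.

```python
# Minimal monocular visual-odometry sketch (illustrative only, not the patented method).
# Assumptions: frames are 8-bit grayscale images sampled at the collecting time points,
# and K is a made-up pinhole camera intrinsic matrix.
import cv2
import numpy as np

K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_pose(img_prev, img_curr):
    """Estimate rotation R and unit-length translation t between two consecutive frames."""
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.match(des1, des2)
    if len(matches) < 8:
        return None  # not enough correspondences for an essential matrix
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    if E is None or E.shape != (3, 3):
        return None
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

def flight_path(frames):
    """Chain frame-to-frame poses into a coarse, scale-ambiguous trajectory."""
    R_w = np.eye(3)
    t_w = np.zeros((3, 1))
    path = [t_w.ravel().copy()]
    for prev, curr in zip(frames, frames[1:]):
        pose = relative_pose(prev, curr)
        if pose is None:
            continue  # skip pairs where motion could not be estimated
        R, t = pose
        t_w = t_w + R_w @ t   # accumulate translation (one common convention)
        R_w = R @ R_w         # accumulate rotation
        path.append(t_w.ravel().copy())
    return np.array(path)     # N x 3 array of estimated positions
```

Feeding frames sampled at the collecting time points into flight_path yields a coarse, scale-ambiguous trajectory; turning that trajectory and the matched feature points into the "first map model" described in the abstract would require additional triangulation and optimization steps not shown here.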

Description

Technical field

[0001] The invention relates to the technical field of navigation, and in particular to an autonomous navigation method and system and a map modeling method and system.

Background technique

[0002] Autonomous navigation technology uses the built-in sensors of an unmanned aerial vehicle to detect the scene the vehicle passes through, and completes the autonomous positioning of the vehicle in the scene and the analysis of its flight trajectory according to the detection results. The technology is therefore widely used in the military and scientific research fields. Today, with the introduction of low-cost sensors and the improvement of embedded computing technology, autonomous navigation technology is gradually expanding from the military and scientific research fields to the civilian and commercial fields. However, the application of existing autonomous navigation technologies in indoor environments still has the...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/10
CPC: B64C39/024; G06T17/05; G05D1/0094; B64U2101/30; B64U2201/10; G08G5/0034; G08G5/0069
Inventor: 倪凯 (Ni Kai), 王延可 (Wang Yanke), 王亮 (Wang Liang), 陶吉 (Tao Ji), 余凯 (Yu Kai)
Owner: BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD