Indoor autonomous navigation method for micro unmanned aerial vehicle

An autonomous navigation and unmanned aerial vehicle technology, applied in the fields of navigation, surveying and navigation, and navigation through speed/acceleration measurement. It addresses the problems that commonly used methods do not consider the reliability factors of positioning and state estimation, find it difficult to satisfy the requirements of autonomous flight, and are difficult to fully adapt to indoor autonomous flight, and achieves the effects of improving autonomous environment perception, avoiding real-time modeling, and improving overall modeling accuracy.

Active Publication Date: 2014-12-24
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

However, currently common methods do not consider the reliability factors of positioning and state estimation in the planning process, so they are difficult to fully adapt to the requirements of indoor autonomous flight.
[0011] Therefore, the navigation, environment modeling and path planning methods of current micro-UAV systems have certain limitations and find it difficult to meet the special requirements of autonomous flight in indoor environments.


Examples


Embodiment

[0058] S1: Motion state estimation of the micro UAV based on RGB-D and MEMS sensors, as shown in Figure 2, specifically includes the following steps:

[0059] S11: Obtain image and depth data of the environment through the RGB-D camera, consisting of a two-dimensional color (RGB) image, formed by a series of color pixels, and the corresponding three-dimensional depth data.

[0060] For a point p_i in the three-dimensional environment, the information obtained by the RGB-D camera has the form p_i = (x_i, y_i, z_i, r_i, g_i, b_i), where x_i, y_i, z_i are the 3D depth data of the point, representing its 3D position relative to the center of the RGB-D camera, and r_i, g_i, b_i are the color (RGB) values corresponding to that point. Therefore, at a given moment, all the environmental points in the field of view of the RGB-D camera, each described in the p_i form, constitute a frame of two...
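
A minimal sketch of this per-point representation is given below. It is an illustrative Python example rather than code from the patent: the pinhole back-projection and the intrinsic parameters fx, fy, cx, cy are assumptions, since the text above only specifies the form p_i = (x_i, y_i, z_i, r_i, g_i, b_i).

    # Illustrative sketch (not from the patent): building the p_i tuples
    # described above from one RGB-D frame. The pinhole camera model and
    # the intrinsics fx, fy, cx, cy are assumptions.
    from dataclasses import dataclass

    @dataclass
    class RGBDPoint:
        x: float  # 3D position relative to the RGB-D camera center
        y: float
        z: float
        r: int    # color (RGB) values of the corresponding pixel
        g: int
        b: int

    def back_project(u, v, z, rgb, fx, fy, cx, cy):
        """One pixel (u, v) with depth z and color rgb -> one RGBDPoint."""
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return RGBDPoint(x, y, z, *rgb)

    def frame_to_points(depth, color, fx, fy, cx, cy):
        """All valid pixels of one frame, each described in the p_i form."""
        points = []
        for v, row in enumerate(depth):
            for u, z in enumerate(row):
                if z > 0:  # skip pixels without a valid depth return
                    points.append(back_project(u, v, z, color[v][u],
                                               fx, fy, cx, cy))
        return points

A frame at a given moment is then simply the list returned by frame_to_points for the current depth and color images.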



Abstract

The invention relates to an indoor autonomous navigation method for a micro unmanned aerial vehicle, belonging to the technical field of autonomous navigation of micro unmanned aerial vehicles. The method comprises the following steps: estimating the motion state of the micro unmanned aerial vehicle based on an RGB-D camera and a MEMS inertial sensor; performing real-time three-dimensional environment modeling based on fusion of the RGB-D camera and the MEMS inertial sensor; and performing real-time trusted path planning and path tracking control. Through these three steps, position control and attitude control are implemented and the unmanned aerial vehicle is guided to fly along the preset path. The method improves the positioning and motion state estimation accuracy of the unmanned aerial vehicle, enhances its indoor autonomous environmental perception capability, ensures that the generated path effectively preserves positioning accuracy while remaining feasible, and effectively improves the autonomy of the micro unmanned aerial vehicle.
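
To make the three-step structure of the abstract easier to follow, here is a hedged Python skeleton of how the steps could be chained in one navigation cycle. All names and stub bodies are invented for illustration; the patent does not define these interfaces, and the actual method fills each step with its own estimation, modeling, and planning algorithms.

    # Hedged skeleton (an assumption, not the patented algorithm) showing
    # how the three steps described in the abstract could be chained.
    from dataclasses import dataclass, field

    @dataclass
    class UAVState:
        position: tuple = (0.0, 0.0, 0.0)   # x, y, z
        attitude: tuple = (0.0, 0.0, 0.0)   # roll, pitch, yaw

    @dataclass
    class EnvironmentModel:
        points: list = field(default_factory=list)  # accumulated 3D points

    def estimate_state(prev, rgbd_frame, imu_sample):
        # Step 1: motion state estimation fusing RGB-D and MEMS inertial data.
        return prev  # stub: a real system runs its fusion filter here

    def update_model(model, rgbd_frame, state):
        # Step 2: real-time 3D environment modeling from the fused data.
        model.points.extend(rgbd_frame)
        return model

    def plan_and_track(model, state, goal):
        # Step 3: trusted path planning plus position/attitude tracking commands.
        return {"position_cmd": goal, "attitude_cmd": state.attitude}  # stub

    def navigation_cycle(state, model, rgbd_frame, imu_sample, goal):
        state = estimate_state(state, rgbd_frame, imu_sample)
        model = update_model(model, rgbd_frame, state)
        command = plan_and_track(model, state, goal)
        return state, model, command

Calling navigation_cycle repeatedly with fresh RGB-D frames and IMU samples gives the closed loop the abstract describes; in the actual method, each stub would be replaced by the corresponding estimation, modeling, and planning step.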

Description

Technical field
[0001] The invention belongs to the technical field of autonomous navigation of micro UAVs, and in particular relates to an indoor navigation method and system for micro UAVs based on RGB-D cameras and MEMS sensors.
Background technique
[0002] Micro UAVs have the characteristics of small size, strong mobility, flexible operation, and low cost, and can perform tasks in dangerous and complex environments. Therefore, in recent years they have been widely applied in military and civilian fields such as reconnaissance, disaster relief, and environmental detection. How to realize the autonomous flight of micro UAVs in complex indoor environments (such as high-rise building fire scenes, post-earthquake buildings, and landslide mines) is an important research issue in this field, and the corresponding autonomous navigation, planning, and control are the key technologies for realizing autonomous flight of micro UAVs.
[0003] The indoor environment is a typical complex ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G01C21/18
CPCG01C21/165G01C21/18
Inventor 李大川李清唐良文杨盛程农
Owner TSINGHUA UNIV