
Unmanned plane and barrier data fusion method

A data fusion technology for unmanned planes and obstacles, applied to the reflection/re-radiation of radio waves, computer components, radio wave measurement systems, etc. It addresses problems such as drone collisions, damage to drones, and delays in quickly understanding the on-site disaster-relief situation, and achieves the effect of reducing uncertainty and decision-making risk.

Active Publication Date: 2018-03-09
DALIAN ROILAND SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] Because the post-disaster scene environment is complex and unknown, a UAV shooting and recording footage after a disaster may collide with obstacles, which damages the UAV and delays a rapid understanding of the on-site relief situation. It is therefore necessary to ensure the flight safety of UAVs during the post-disaster rescue process.



Examples


Embodiment 1

[0058] This embodiment provides a data fusion method between a drone and an obstacle, including: a data fusion layer, a feature layer, a decision layer, and a detection device;

[0059] The detection device includes:

[0060] A radar height sensor that measures the vertical distance from the UAV to the ground;

[0061] A GPS/Beidou positioning sensor that provides real-time positioning, supporting tasks such as fixed-point hovering of the drone, and that can also measure the drone's altitude and relative speed;

[0062] An AHRS module that collects the flight attitude and navigation information of the UAV; the AHRS module includes a MEMS three-axis gyroscope, an accelerometer and a magnetometer, and outputs three-dimensional acceleration, three-dimensional angular velocity and three-dimensional geomagnetic field strength;

[0063] A millimeter wave radar sensor that adopts a chirped triangular-wave modulation scheme to realize long-distance measurement from obstacles to th...
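To make the composition of the detection device concrete, the sketch below groups its raw outputs into one record per measurement cycle. This is a minimal illustration in Python; every field name is hypothetical and none of it comes from the patent text.

from dataclasses import dataclass

@dataclass
class DetectionFrame:
    """One synchronized set of raw readings from the detection device (illustrative names)."""
    radar_altitude_m: float                    # radar height sensor: vertical distance to the ground
    gps_altitude_m: float                      # GPS/Beidou: altitude of the drone
    gps_horizontal_speed_mps: float            # GPS/Beidou: horizontal (relative) speed
    accel_xyz: tuple                           # AHRS: three-dimensional acceleration
    gyro_xyz: tuple                            # AHRS: three-dimensional angular velocity
    mag_xyz: tuple                             # AHRS: three-dimensional geomagnetic field strength
    mmw_range_m: float                         # millimeter wave radar: distance to the obstacle
    mmw_speed_mps: float                       # millimeter wave radar: relative speed
    mmw_azimuth_deg: float                     # millimeter wave radar: azimuth angle to the radar normal
    mmw_pitch_deg: float                       # millimeter wave radar: pitch angle to the radar normal

The data fusion layer then consumes such frames; Embodiment 2 lists the quantities it extracts from each sensor.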

Embodiment 2

[0070] As a further limitation of Embodiment 1, the data fusion layer processes the data collected by each sensor as follows:

[0071] 1) The output data of the millimeter wave radar sensor is the relative distance R1, the relative speed V1, and the angle between the obstacle and the radar normal, including the azimuth angle θ1 and the pitch angle ψ1;

[0072] 2) The ultrasonic radar sensor provides the relative distance R2 between the UAV and the obstacle;

[0073] 3) The binocular vision sensor outputs the object area S, azimuth angle θ2 and relative distance R3;

[0074] 4) The radar height sensor outputs the height value R4 between the drone and the ground;

[0075] 5) GPS / Beidou positioning sensor mainly obtains the altitude H2 and horizontal speed V2 of the drone;

[0076] GPS data follows the NMEA0183 protocol, and the output information is in a standard and fixed format. Among them, GPGGA and GPVTG sentences are closely related to UAV navigation. Their data format is specified as follows:

...
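The GPGGA/GPVTG field listing is truncated in this extract, so the sketch below relies on the publicly documented NMEA0183 layout rather than the patent's own specification: it pulls the altitude H2 from a GPGGA sentence and the horizontal speed V2 from a GPVTG sentence. A minimal Python illustration, with invented example values:

def parse_gpgga_altitude(sentence):
    """Return the antenna altitude in metres from a standard $GPGGA sentence, or None without a fix."""
    fields = sentence.split(",")
    # Field 6 is the fix quality (0 = invalid); field 9 is the altitude above mean sea level.
    if fields[6] != "0" and fields[9]:
        return float(fields[9])        # altitude H2
    return None

def parse_gpvtg_speed(sentence):
    """Return the horizontal ground speed in m/s from a standard $GPVTG sentence."""
    fields = sentence.split(",")
    # Field 7 is the speed over ground in km/h; convert to m/s.
    if fields[7]:
        return float(fields[7]) / 3.6  # horizontal speed V2
    return None

# Illustrative sentences (values invented for the example):
print(parse_gpgga_altitude("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
print(parse_gpvtg_speed("$GPVTG,054.7,T,034.4,M,005.5,N,010.2,K*48"))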

Embodiment 3

[0083] As a supplement to Embodiment 1 or 2, the feature layer fuses the relative distance between the drone and the obstacle, the relative height of the drone above the ground, and the relative speed of the drone with respect to the obstacle, and it obtains the size, shape and other attributes of the obstacle;

[0084] The data fusion of the relative distance between the drone and the obstacle is processed according to the distance range:

[0085] A. The ultrasonic radar sensor, the binocular vision sensor and the millimeter wave radar sensor all perform detection within the range of 0 m to 10 m, but their relative accuracies differ; at short range the ultrasonic sensor is the most accurate. To improve the accuracy of the calculated distance, a weighted average is used, that is, weight values α and β are introduced to take the weighted average of the ultrasonic radar sensor, the binocular vision sensor and...
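The sentence above is truncated, so the sketch below assumes the third term of the weighted average is the millimeter wave radar reading and that the weights sum to one; the particular values of α and β are illustrative only, not taken from the patent.

def fuse_short_range_distance(r2_ultrasonic, r3_vision, r1_mmw, alpha=0.5, beta=0.3):
    """Weighted average of the three 0-10 m distance measurements.

    alpha weights the ultrasonic reading, beta the binocular vision reading, and the
    remaining weight goes to the millimeter wave radar (an assumption, since the
    original paragraph is cut off).
    """
    gamma = 1.0 - alpha - beta
    return alpha * r2_ultrasonic + beta * r3_vision + gamma * r1_mmw

# At short range the ultrasonic sensor is the most accurate, so it gets the largest weight.
print(fuse_short_range_distance(r2_ultrasonic=4.8, r3_vision=5.1, r1_mmw=5.0))  # 4.93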



Abstract

The invention relates to an unmanned plane and barrier data fusion method. The method comprises the following steps: the data acquired by each sensor are processed by a data fusion layer; the output data of a millimeter wave radar sensor are the relative distance R1 between the unmanned plane and a barrier, the relative speed V1, and the angles between the barrier and the radar normal, comprising an azimuth angle θ1 and a pitch angle ψ1; a relative distance R2 between the unmanned plane and the barrier is provided by an ultrasonic radar sensor; an object area S, an azimuth angle θ2 and a relative distance R3 are output by a binocular vision sensor; and a height value R4 between the unmanned plane and the ground is output by a radar height sensor. The method is advantaged in that incomplete data on the local environment provided by the same or different types of sensors at different positions are fused, possible redundancy and contradictory data among the sensors are eliminated, complementation is carried out, and uncertainty is reduced.

Description

Technical field

[0001] The invention belongs to the technical field of UAV obstacle avoidance, and in particular relates to a data fusion method between a UAV and obstacles.

Background technique

[0002] In recent years, UAV technology has quickly become a new hot spot for research and development at home and abroad. Owing to their high mobility, flexible operation, low cost, real-time image transmission and high resolution, UAVs are used in many fields of society, such as disaster rescue, power inspection, forestry fire prevention, agricultural spraying, vegetation protection and aerial photography.

[0003] In post-disaster rescue scenes, because traditional methods have many limitations, drone technology has gradually been adopted. When the environment is harsh, the on-site situation cannot be assessed in time and the rescue is urgent, post-disaster rescue drones offer the fastest and most convenient means to observe and intervene in the rescue scene from the air. Th...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01S13/86, G01S13/88, G01S13/93, G06K9/00
CPC: G01S13/862, G01S13/867, G01S13/882, G01S13/933, G06V20/188, G06V20/13
Inventor: 田雨农, 王鑫照
Owner: DALIAN ROILAND SCI & TECH CO LTD