Decision fusion method applied to obstacle avoidance system
A decision fusion and obstacle avoidance technology, applied to radio wave measurement systems and to the utilization and reflection/re-radiation of radio waves, which can solve problems such as delayed awareness of the on-site disaster-relief situation, UAV collisions, and UAV damage, and achieves the effect of reducing decision-making risk.
Examples
Embodiment 1
[0058] This embodiment provides a decision fusion method applied to an obstacle avoidance system, comprising a data fusion layer, a feature layer, a decision layer and a detection device;
[0059] The detection device includes:
[0060] A radar altitude sensor, which measures the vertical distance from the UAV to the ground;
[0061] A GPS/Beidou positioning sensor for real-time positioning, which enables fixed-point hovering and other UAV tasks, and which also provides measurement of the UAV's altitude and of its relative speed;
[0062] An AHRS module, which collects the UAV's flight attitude and navigation information; the AHRS module comprises a MEMS three-axis gyroscope, an accelerometer and a magnetometer, and its output data are the three-dimensional acceleration, three-dimensional angular velocity and three-dimensional geomagnetic field strength.
[0063] A millimeter-wave radar sensor, which adopts a triangular-wave chirp scheme to realize long-distance me...
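The layered structure and detection-device outputs described in this embodiment can be sketched in Python as below; the class names, field names and placeholder logic are illustrative assumptions and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DetectionDeviceReadings:
    """Raw outputs of the detection device (field names are illustrative)."""
    radar_altitude_m: float                        # radar altitude sensor: vertical distance to ground
    gps_altitude_m: float                          # GPS/Beidou: UAV altitude
    gps_horizontal_speed_ms: float                 # GPS/Beidou: UAV horizontal (relative) speed
    ahrs_accel: Tuple[float, float, float]         # AHRS: three-dimensional acceleration
    ahrs_angular_rate: Tuple[float, float, float]  # AHRS: three-dimensional angular velocity
    ahrs_magnetic: Tuple[float, float, float]      # AHRS: three-dimensional geomagnetic field
    mmwave_range_m: float                          # millimeter-wave radar: long-distance measurement

class DecisionFusionPipeline:
    """Data fusion layer -> feature layer -> decision layer (placeholders only)."""

    def data_fusion_layer(self, readings: DetectionDeviceReadings) -> dict:
        # Gather and time-align the per-sensor measurements.
        return {"readings": readings}

    def feature_layer(self, fused: dict) -> dict:
        # Fuse distances, heights and speeds; extract obstacle attributes.
        return {"features": fused}

    def decision_layer(self, features: dict) -> str:
        # Map fused features to an avoidance decision.
        return "maintain_course"  # placeholder
```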
Embodiment 2
[0070] As a further limitation of Embodiment 1, the data fusion layer processes the data collected by each sensor as follows:
[0071] 1) The output data of the millimeter-wave radar sensor are the relative distance R1 between the UAV and the obstacle, the relative velocity V1, and the angles between the obstacle and the radar normal, namely the azimuth θ1 and the pitch ψ1;
[0072] 2) The ultrasonic radar sensor outputs the relative distance R2 between the UAV and the obstacle;
[0073] 3) The binocular vision sensor outputs the object area S, azimuth θ2 and relative distance R3;
[0074] 4) The radar altitude sensor outputs the height value R4 of the UAV above the ground;
[0075] 5) The GPS / Beidou positioning sensor mainly obtains the altitude H2 and horizontal speed V2 of the UAV;
[0076] GPS data follow the NMEA 0183 protocol, so the output information is standardized and has a fixed format. Among the sentences, the GPGGA and GPVTG sentences are the most closely related to UAV navigation. The...
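To illustrate how the altitude H2 and horizontal speed V2 mentioned above could be read out of GPGGA and GPVTG sentences, a minimal parsing sketch follows; it assumes well-formed NMEA 0183 sentences, skips checksum validation, and the function name is not from the patent.

```python
def parse_nmea_sentence(sentence: str):
    """Return ("H2", altitude_m) for a GPGGA sentence or ("V2", speed_kmh) for GPVTG.

    Minimal sketch: assumes a well-formed sentence and ignores the checksum.
    """
    fields = sentence.strip().lstrip("$").split(",")
    talker = fields[0]
    if talker.endswith("GGA"):
        # In GPGGA, field 9 is the antenna altitude above mean sea level (metres).
        return ("H2", float(fields[9]))
    if talker.endswith("VTG"):
        # In GPVTG, field 7 is the speed over ground in km/h.
        return ("V2", float(fields[7]))
    return None

# Example with standard illustrative sentences:
print(parse_nmea_sentence("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
print(parse_nmea_sentence("$GPVTG,054.7,T,034.4,M,005.5,N,010.2,K*48"))
```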
Embodiment 3
[0083] As a supplement to Embodiment 1 or 2, the feature layer performs data fusion of the relative distance between the UAV and the obstacle, data fusion of the relative height of the UAV above the ground, and data fusion of the relative speed between the UAV and the obstacle, and obtains attribute characteristics of the obstacle such as its size and shape;
[0084] The data fusion of the relative distance between the UAV and the obstacle is processed according to the distance range:
[0085] A. The distance is within the range of 0 m to 10 m. The ultrasonic radar sensor, binocular vision sensor and millimeter-wave radar sensor are all used for detection, but their relative accuracies differ; at short range, the ultrasonic sensor is more accurate. To improve the accuracy of the distance calculation, a weighted average is adopted, that is, weight values α and β are introduced to take a weighted average of the ultrasonic radar sensor, bino...
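Since the paragraph above is truncated, the exact weighting rule is not visible; the sketch below shows one weighted-average distance fusion consistent with the text, under the assumption that α weights the ultrasonic reading R2, β weights the binocular reading R3, and the remainder 1 − α − β weights the millimeter-wave reading R1.

```python
def fuse_short_range_distance(r1_mmwave: float, r2_ultrasonic: float, r3_binocular: float,
                              alpha: float = 0.5, beta: float = 0.3) -> float:
    """Weighted-average fusion of obstacle distance in the 0 m to 10 m range.

    Assumption: alpha weights R2 (ultrasonic), beta weights R3 (binocular),
    and the remaining weight 1 - alpha - beta goes to R1 (millimeter-wave).
    The default weight values are illustrative, not from the patent.
    """
    gamma = 1.0 - alpha - beta
    if gamma < 0.0:
        raise ValueError("alpha + beta must not exceed 1")
    return alpha * r2_ultrasonic + beta * r3_binocular + gamma * r1_mmwave
```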