
Multi-camera fusion unmanned aerial vehicle in-building navigation and positioning method

An in-building navigation and positioning technology, applicable to navigation, mapping, and navigation calculation tools. It addresses problems such as existing methods being difficult to popularize and unable to meet positioning requirements, and achieves fewer positioning failures, fast data processing, and strong robustness.

Pending Publication Date: 2021-11-30
BEIJING INSTITUTE OF TECHNOLOGY
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, these methods cannot meet long-term, high-altitude, and large-scale positioning requirements, and are difficult to promote and use.

Method used



Examples

Experimental program
Comparison scheme
Effect test

Embodiment approach

[0088] According to a preferred embodiment of the present invention, after a camera obtains an observation of a passive optical marker, the observation needs to be corrected.

[0089] Assume that camera j observes an optical marker [i, T_W_Codei, Σ_Codei]; its observed value is {T_Cj_Codei, Σ_Cj_Codei}, i ∈ [1, n].

[0090] Correcting the observed value means correcting the uncertainty of the observed value.

[0091] The inventors found that in the prior art the uncertainty of an observation is usually a fixed value, or is assigned from the residual in the optimization process, and therefore cannot accurately reflect the accuracy of the current measurement. Observations then easily accumulate errors, the subsequent pose-fusion results are unsatisfactory, and the positioning accuracy of the drone is significantly reduced. Therefore, the uncertainty of the observed value is redesigned in the present invention, which ...
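To illustrate why per-observation uncertainty matters for pose fusion, the following is a minimal sketch of inverse-covariance (information-form) weighting of several position estimates of the same body, so that observations with lower uncertainty dominate the fused result. The function name and the restriction to 3-D positions are illustrative assumptions, not the patent's actual formulation.

```python
import numpy as np

def fuse_observations(estimates, covariances):
    """Fuse several estimates of the same 3-D position by
    inverse-covariance weighting: each estimate contributes
    proportionally to its precision (inverse covariance)."""
    info = np.zeros((3, 3))   # accumulated information matrix
    info_vec = np.zeros(3)    # accumulated information vector
    for x, cov in zip(estimates, covariances):
        w = np.linalg.inv(cov)          # precision of this observation
        info += w
        info_vec += w @ np.asarray(x)
    fused_cov = np.linalg.inv(info)     # covariance of the fused estimate
    return fused_cov @ info_vec, fused_cov
```

With a fixed uncertainty, a poor observation pulls the fused pose just as strongly as a good one; a measurement-dependent covariance, as proposed here, suppresses that effect.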

Embodiment 1

[0189] UAV type: quadrotor UAV;

[0190] Number of cameras: 5

[0191] Camera installation position: the cameras are installed around the drone and below its bottom bracket; the surrounding cameras are spaced at 60° intervals, and the field of view of the bottom camera points vertically downward;
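The camera layout above can be sketched as body-frame extrinsics: side-looking cameras at evenly spaced yaw angles plus one downward camera. The yaw convention, axis conventions, and function names below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def ring_camera_yaws(spacing_deg=60.0, count=4):
    """Yaw angle of each side-looking camera, assuming even
    spacing starting from 0 degrees (illustrative convention)."""
    return [i * spacing_deg for i in range(count)]

def yaw_rotation(yaw_deg):
    """Body-frame rotation about the vertical axis for a side camera
    whose optical axis points horizontally at the given yaw."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
```

A fifth, downward camera would use a fixed pitch of -90° instead of a yaw from this ring, so that at least one camera covers the space below the drone.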



Abstract

The invention discloses a multi-camera fusion in-building navigation and positioning method for unmanned aerial vehicles, comprising the following steps: step 1, setting markers in a building and establishing a marker map of the known environment; step 2, during flight, the unmanned aerial vehicle obtains the pose of the vehicle body from its observations of the markers. The method meets the navigation requirements of arbitrarily long flight times under energy limitations, with no accumulated error. It is highly robust: with multiple cameras observing simultaneously, at least one optical marker is guaranteed to be visible, which reduces positioning failures. Moreover, the single-measurement error is minimized, significantly improving positioning accuracy.
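Step 2 above rests on a standard transform identity: given a marker's known pose in the map and the camera's observation of that marker, the camera's world pose follows by composition. This is a minimal sketch with 4x4 homogeneous transforms; the variable names follow the notation T_W_Codei / T_Cj_Codei used in the embodiments, but the function itself is an illustrative assumption.

```python
import numpy as np

def camera_pose_in_world(T_w_marker, T_cam_marker):
    """Given the marker's known map pose T_w_marker and the camera's
    observation of the marker T_cam_marker (both 4x4 homogeneous
    transforms), recover the camera pose in the world frame:

        T_w_cam = T_w_marker @ inv(T_cam_marker)
    """
    return T_w_marker @ np.linalg.inv(T_cam_marker)
```

Because every pose is re-derived from a marker with a known, fixed map pose, errors do not accumulate between observations, which is the basis for the "no accumulated error" claim.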

Description

Technical field

[0001] The invention relates to the technical field of drone navigation and positioning, in particular to a navigation and positioning method for drones inside buildings, and more specifically to a multi-camera fusion in-building navigation and positioning method for drones.

Background technique

[0002] Most existing drones rely on the GPS system for positioning. The receivable GPS signal power is generally measured in milliwatts and is easily shielded by metal, water, cement walls, and the like, so a drone used indoors cannot rely on the GPS system for positioning.

[0003] At present, in-building drone navigation is mostly realized by pure vision systems, fusion of vision and inertial navigation components, laser radar, and similar methods. However, none of these can meet long-term, high-altitude, and large-scale positioning requirements, and they are difficult to promote and use.

[0004] Therefore...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/20
CPC: G01C21/206; Y02T10/40
Inventors: 宋韬, 吕军宁, 莫雳, 金忍, 王江, 何绍溟
Owner: BEIJING INSTITUTE OF TECHNOLOGY