Positioning method and device adopting visual inertial data deep fusion

A positioning method and positioning device technology, applied to measuring devices, instruments, and navigation by speed/acceleration measurement. It addresses problems such as poor estimation performance, scale ambiguity, and the high state-update frequency demanded of inertial navigation equipment, achieving accurate estimation and an enhanced tracking effect.

Active Publication Date: 2019-01-18
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0004] Inertial navigation equipment has a high state-update frequency, is sensitive to attitude angles, and handles large rotations well. It is sufficiently accurate over short time periods, is little affected by the environment, and can recover absolute scale information. However, its position information is obtained through acceleration integration, t...
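The drift mentioned above can be illustrated with a minimal sketch (not from the patent): dead-reckoning position by double-integrating accelerometer samples. A small constant bias in the measured acceleration grows quadratically in the position estimate, which is why pure inertial positioning degrades over time and needs a complementary sensor such as a camera.

```python
def dead_reckon(accels, dt):
    """Integrate 1-D acceleration samples into a position estimate."""
    v = 0.0
    p = 0.0
    for a in accels:
        v += a * dt          # first integration: velocity
        p += v * dt          # second integration: position
    return p

# True acceleration is zero; the sensor reports a constant 0.01 m/s^2 bias.
dt = 0.01                            # 100 Hz IMU update rate (assumed)
p_1s = dead_reckon([0.01] * 100, dt)     # position error after 1 s
p_10s = dead_reckon([0.01] * 1000, dt)   # position error after 10 s
# The 10 s error is roughly 100x the 1 s error: quadratic growth in time.
```

The update rate and bias magnitude here are illustrative choices, not values from the patent.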




Embodiment Construction

[0027] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0028] The following describes the positioning method and device for deep fusion of visual-inertial data according to the embodiments of the present invention with reference to the accompanying drawings. First, the positioning method for deep fusion of visual-inertial data according to the embodiments of the present invention will be described with reference to the accompanying drawings.

[0029] Figure 1 is a flow chart of a positioning method for deep fusion of visual-inertial data according to an embodiment of the present i...



Abstract

The invention discloses a positioning method and device adopting deep fusion of visual-inertial data, wherein the method comprises the following steps: S1: with the system in a static state at an initial moment, acquire measured values of an accelerometer and a gyroscope over an initialization time period so as to estimate an initial state of the system; S2: after obtaining the initial state, propagate the system state according to the measured values of the accelerometer and the gyroscope, and update the covariance matrix of the system; S3: after acquiring an image, track feature points using an IMU (Inertial Measurement Unit)-assisted outlier elimination method; and S4: for feature points whose tracking has failed, construct visual measurements from the visual measurement information and update the system state. The method makes full use of the various sensor data in a visual-inertial fusion system, can effectively use IMU data to improve tracking quality and efficiency, and can accurately estimate the pose of the system in real time.
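Steps S1 and S2 can be sketched as follows. This is a hedged illustration under our own assumptions, not the patent's implementation: S1 averages accelerometer and gyroscope samples over a static window to estimate the gravity direction and gyroscope bias, and S2 then propagates the translational state with each bias/gravity-compensated IMU sample. The patent's full method additionally propagates attitude and a covariance matrix, which this sketch omits.

```python
def static_initialize(accel_samples, gyro_samples):
    """S1 (sketch): estimate gravity vector and gyro bias by averaging
    3-axis samples collected while the system is static."""
    n = len(accel_samples)
    gravity = [sum(a[i] for a in accel_samples) / n for i in range(3)]
    m = len(gyro_samples)
    gyro_bias = [sum(w[i] for w in gyro_samples) / m for i in range(3)]
    return gravity, gyro_bias

def propagate(pos, vel, accel, gravity, dt):
    """S2 (sketch, translation only): integrate gravity-compensated
    acceleration into velocity and position over one IMU step."""
    a = [accel[i] - gravity[i] for i in range(3)]
    vel = [vel[i] + a[i] * dt for i in range(3)]
    pos = [pos[i] + vel[i] * dt for i in range(3)]
    return pos, vel

# Usage: a static window reading only gravity yields zero motion on propagation.
g, b = static_initialize([[0.0, 0.0, 9.81]] * 200, [[0.001, 0.0, 0.0]] * 200)
pos, vel = propagate([0.0] * 3, [0.0] * 3, [0.0, 0.0, 9.81], g, 0.01)
```

The sample values (9.81 m/s² gravity, a small gyro bias) are illustrative assumptions; the patent does not publish numeric parameters or source code.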

Description

Technical Field

[0001] The invention relates to the technical field of navigation and positioning of unmanned aerial vehicles (UAVs), and in particular to a positioning method and device for deep fusion of visual-inertial data.

Background

[0002] Due to its small size, high maneuverability and convenient hovering, the UAV is becoming an ideal platform for monitoring, exploration, rescue and other tasks.

[0003] To achieve autonomous navigation of UAVs without GPS (Global Positioning System), the main problems that must be faced include environment perception, state estimation, mission planning, and flight control. Among these, the state estimation function, often called the positioning function, is the most basic and arguably the most important. On the one hand, due to the high maneuverability of the UAV and the unknown environment, we must determine the position and attitude of the aircraft; on the other...

Claims


Application Information

IPC(8): G01C21/16
CPC: G01C21/165
Inventors: 程农 (Cheng Nong), 李建 (Li Jian), 李清 (Li Qing)
Owner TSINGHUA UNIV