Visual/inertial integrated navigation method based on online calibration of camera internal parameters

A visual/inertial integrated navigation method, applied to navigation and surveying through speed/acceleration measurement, which addresses the problem of reduced navigation-system accuracy caused by changes in the camera's internal parameters

Active Publication Date: 2020-05-15
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] In order to solve the technical problems mentioned in the above background, the present invention proposes a visual/inertial integrated navigation method based on online calibration of camera internal parameters.


Image

  • Visual/inertial integrated navigation method based on online calibration of camera internal parameters


Embodiment Construction

[0058] The technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0059] The present invention designs a visual/inertial integrated navigation method based on online calibration of camera internal parameters. As shown in figure 1, the steps are as follows:

[0060] Step 1: Collect visual sensor data S(k), accelerometer data, and gyroscope data at time k;

[0061] Step 2: Use the visual sensor data S(k) to perform feature detection and matching between two adjacent image frames;
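The patent excerpt does not name a specific detector or matcher for Step 2. As a minimal, hypothetical sketch (the function name and ratio threshold are assumptions, not from the patent), frame-to-frame feature association can be done by nearest-neighbour descriptor matching with a ratio test:

```python
import numpy as np

def match_features(desc_prev, desc_curr, ratio=0.8):
    """Nearest-neighbour descriptor matching with a ratio test.

    desc_prev: (N, D) descriptors from the previous image frame
    desc_curr: (M, D) descriptors from the current image frame
    Returns a list of (i, j) index pairs (previous -> current).
    """
    matches = []
    for i, d in enumerate(desc_prev):
        # Euclidean distance to every descriptor in the current frame
        dists = np.linalg.norm(desc_curr - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Keep only unambiguous matches: best clearly beats second-best
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

In a real pipeline the descriptors would come from a feature extractor and the matches would additionally be filtered by an outlier-rejection step (e.g. RANSAC) before entering the estimator.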

[0062] Step 3: Use the inertial sensor data to perform pre-integration between two adjacent image frames;
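Step 3's pre-integration is not spelled out in this excerpt. The sketch below illustrates the standard idea: accumulating relative rotation, velocity, and position increments from the raw IMU samples that fall between two image frames (bias correction and noise propagation are omitted for brevity; all names are illustrative):

```python
import numpy as np

def _exp_so3(phi):
    """Rodrigues formula: axis-angle vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate IMU increments between two image frames.

    gyro:  (N, 3) angular rates [rad/s]
    accel: (N, 3) specific forces [m/s^2]
    dt:    IMU sampling interval [s]
    Returns (dR, dv, dp) expressed in the first frame's body axes.
    """
    dR = np.eye(3)      # accumulated rotation increment
    dv = np.zeros(3)    # accumulated velocity increment
    dp = np.zeros(3)    # accumulated position increment
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ _exp_so3(w * dt)   # first-order rotation update
    return dR, dv, dp
```

Because these increments depend only on the IMU samples, they can be computed once per frame pair and reused across optimizer iterations, which is what makes pre-integration attractive for graph-based estimators.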

[0063] Step 4: Combine visual reprojection error and inertial pre-integration error to optimize and solve carrier navigation information and camera internal parameters;
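Step 4 jointly estimates the navigation states and the camera's internal parameters. As a simplified illustration of treating the intrinsics (fx, fy, cx, cy) as free variables, the sketch below defines a pinhole reprojection residual and recovers the intrinsics by linear least squares from landmark/pixel correspondences with the poses held fixed. The actual method would optimize poses, velocities, biases, and intrinsics together with the pre-integration error; the function names here are hypothetical:

```python
import numpy as np

def reprojection_residual(intrinsics, p_cam, uv_obs):
    """Pinhole reprojection error with the intrinsics as free variables.

    intrinsics: (fx, fy, cx, cy), estimated jointly with the pose
    p_cam:      3-D landmark coordinates in the camera frame
    uv_obs:     matched pixel observation (u, v)
    """
    fx, fy, cx, cy = intrinsics
    x, y, z = p_cam
    u = fx * x / z + cx   # pinhole projection
    v = fy * y / z + cy
    return np.array([u - uv_obs[0], v - uv_obs[1]])

def calibrate_intrinsics(points_cam, uv_obs):
    """Least-squares estimate of (fx, fy, cx, cy) from correspondences.

    With known camera-frame points, u = fx*(x/z) + cx and
    v = fy*(y/z) + cy are linear in the intrinsics, so each image
    axis reduces to an independent 2-parameter linear system.
    """
    pts = np.asarray(points_cam, dtype=float)
    uv = np.asarray(uv_obs, dtype=float)
    xn = pts[:, 0] / pts[:, 2]   # normalised image coordinates
    yn = pts[:, 1] / pts[:, 2]
    ones = np.ones(len(pts))
    fx, cx = np.linalg.lstsq(np.stack([xn, ones], axis=1), uv[:, 0], rcond=None)[0]
    fy, cy = np.linalg.lstsq(np.stack([yn, ones], axis=1), uv[:, 1], rcond=None)[0]
    return fx, fy, cx, cy
```

The key point the patent exploits is that the reprojection residual constrains the intrinsics as well as the pose, so adding the intrinsics to the state vector lets the optimizer track their drift online instead of relying on a one-time offline calibration.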

[0064] Step 5: Output carrier navigation information and camera internal parameters, and return to step 1.

[0065] In this embodiment, ...



Abstract

The invention discloses a visual/inertial integrated navigation method based on online calibration of camera internal parameters. The method comprises the steps of collecting visual sensor data S(k), accelerometer data, and gyroscope data at time k; performing feature matching and detection between two adjacent image frames using the visual sensor data S(k); carrying out pre-integration between two adjacent image frames using the inertial sensor data; optimizing and solving carrier navigation information and camera internal parameters by combining the visual reprojection error and the inertial pre-integration error; and outputting the carrier navigation information and the camera internal parameters. According to the invention, calibration of the camera internal parameters can be completed within the visual/inertial navigation framework, effectively solving the problem of reduced navigation-system accuracy caused by changes in the camera internal parameters.

Description

Technical field

[0001] The invention belongs to the field of robot navigation, and in particular relates to a visual/inertial integrated navigation method.

Background technique

[0002] The visual/inertial integrated navigation system has become a research hotspot in the field of robot autonomous navigation because of its good robustness. The visual sensor can suppress the drift of the inertial sensor, and the inertial sensor can compensate for the visual sensor's inability to work in environments with little texture or insufficient light. Therefore, the visual/inertial integrated navigation system has broad development prospects.

[0003] Most current visual/inertial integrated navigation methods treat the camera's internal parameters as fixed during operation, obtaining them through traditional offline calibration. In practice, however, the internal parameters of the camera may change due to mechanical shock or ...


Application Information

IPC(8): G01C21/16; G01C21/00
CPC: G01C21/00; G01C21/165
Inventor: 杨子寒, 赖际舟, 吕品, 刘建业, 袁诚
Owner NANJING UNIV OF AERONAUTICS & ASTRONAUTICS