
Visual and inertial integrated navigation method fusing semantic features

A technology combining visual-inertial navigation with semantic features, applied in the field of robot navigation, addressing problems such as the difficulty of robustly positioning UAVs, the limited perception dimension of a single visual sensor, and the resulting impact on navigation-system accuracy and reliability.

Pending Publication Date: 2021-07-30
JIANGSU FRONTIER ELECTRIC TECH +1
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

Due to the limited perception dimension of a single visual sensor, it is difficult to achieve robust positioning of UAVs in complex indoor environments. These limitations affect the accuracy and reliability of the navigation system.




Embodiment Construction

[0054] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0055] In order to make the above objects, features, and advantages of the present invention more comprehensible, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

[0056] As shown in Figure 1, the present invention provides a visual-inertial integrated navigation method fusing semantic features, comprising the following steps:

[0057] Step 1: Collect RGBD visual sensor data S(k), accelerometer data, and gyroscope data at time k...
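The embodiment text is truncated here. As a hedged illustration of the inertial pre-integration between adjacent image frames that the abstract describes, below is a minimal numpy sketch; the constant-bias assumption, the first-order rotation update, and all function names are illustrative assumptions rather than the patent's exact formulation.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector, used for rotation updates."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def preintegrate_imu(accel, gyro, dt, ba=np.zeros(3), bg=np.zeros(3)):
    """Pre-integrate accelerometer/gyroscope samples between frames k and k+1.

    accel, gyro : (N, 3) arrays of IMU samples in the body frame
    dt          : sample period in seconds
    ba, bg      : accelerometer/gyroscope biases, assumed constant here
    Returns the relative rotation dR, velocity dv, and position dp increments,
    expressed in the body frame at time k (gravity is typically folded in
    later, inside the optimization residual, as in common pre-integration
    formulations).
    """
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for a, w in zip(accel, gyro):
        a_corr = a - ba              # remove (assumed constant) accel bias
        w_corr = w - bg              # remove (assumed constant) gyro bias
        dp += dv * dt + 0.5 * dR @ a_corr * dt**2
        dv += dR @ a_corr * dt
        # first-order exponential map for the incremental rotation
        dR = dR @ (np.eye(3) + skew(w_corr * dt))
    return dR, dv, dp
```

The returned increments (dR, dv, dp) are what an inertial pre-integration residual, such as the one in the abstract, compares against the pose change predicted by the optimized states.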



Abstract

The invention discloses a visual-inertial integrated navigation method fusing semantic features. The method comprises the steps of: collecting RGBD visual sensor data S(k), accelerometer data, and gyroscope data at time k, and computing the current camera pose T(k) from the visual sensor data S(k) using a visual odometer; extracting and matching semantic plane features between two adjacent image frames using the visual sensor data S(k); performing pre-integration between two adjacent image frames using the inertial sensor data; combining the semantic-plane observation residual, the visual-odometer relative-pose observation residual, and the inertial pre-integration residual to optimize and solve for the carrier navigation information; and outputting the carrier navigation information and the camera intrinsic parameters. The method can effectively improve the positioning accuracy and robustness of the navigation system.
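To make the final optimization step concrete, the sketch below stacks the three residual types named in the abstract (semantic-plane observation, visual-odometer relative pose, and inertial pre-integration) into a single least-squares problem. This is a minimal sketch, not the patent's formulation: the translation-only state, the residual forms, the weights, and all measurement values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def stacked_residuals(x, plane_obs, vo_rel_pose, imu_delta, w=(1.0, 1.0, 1.0)):
    """Residual vector for one frame pair (k, k+1).

    x holds the state being solved: position p (3) and velocity v (3) at
    k+1, with the pose at k treated as fixed. Rotation is omitted to keep
    the sketch linear; a real implementation would optimize on SE(3).
    """
    p, v = x[:3], x[3:6]

    # Semantic-plane residual: mismatch between the displacement predicted
    # along the matched plane's normal and the observed displacement
    # (plane matching is assumed already done for this sketch).
    n, d_obs = plane_obs
    r_plane = np.array([n @ p - d_obs])

    # Visual-odometer residual: difference from the VO relative translation.
    r_vo = p - vo_rel_pose

    # Inertial residual: difference from the pre-integrated position and
    # velocity increments (gravity and dt folded into imu_delta for brevity).
    dp, dv = imu_delta
    r_imu = np.concatenate([p - dp, v - dv])

    return np.concatenate([w[0] * r_plane, w[1] * r_vo, w[2] * r_imu])

# Hypothetical measurements for one frame pair:
plane_obs = (np.array([0.0, 0.0, 1.0]), 0.0)        # no motion along floor normal
vo_rel_pose = np.array([0.10, 0.02, 0.00])          # VO translation estimate
imu_delta = (np.array([0.11, 0.01, 0.00]),          # pre-integrated dp
             np.array([1.00, 0.10, 0.00]))          # pre-integrated dv

sol = least_squares(stacked_residuals, x0=np.zeros(6),
                    args=(plane_obs, vo_rel_pose, imu_delta))
print("optimized position:", sol.x[:3], "velocity:", sol.x[3:])
```

A full implementation would optimize poses on SE(3), include gravity and IMU biases in the state, and weight each residual block by the information matrix of the corresponding sensor noise model rather than by fixed scalars.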

Description

Technical Field

[0001] The invention belongs to the technical field of robot navigation, and in particular relates to a visual-inertial integrated navigation method fusing semantic features.

Background Technique

[0002] The visual SLAM algorithm has become a research hotspot in the field of robot autonomous navigation because of its rich perception information. Traditional visual SLAM methods extract features such as points and lines to describe the environment and compute the pose. These features are described and matched through low-level brightness relationships, so the redundant structural information in the scene is not fully exploited. Moreover, because the perception dimension of a single visual sensor is limited, it is difficult to achieve robust positioning of UAVs in complex indoor environments. These shortcomings affect the accuracy and reliability of the navigation system.

Contents of the Invention

[0003] Aiming at the deficiencies of the prior art, the...
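The description is cut off here. As a rough illustration of the semantic plane features the invention builds on (step 2 of the abstract), the sketch below back-projects a depth image into a point cloud and fits a dominant plane with RANSAC; the camera intrinsics, thresholds, and function names are illustrative assumptions, not the patent's method.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into a 3-D point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    return np.stack([x, y, z], axis=1)[valid]

def ransac_plane(points, iters=200, tol=0.02, rng=np.random.default_rng(0)):
    """Fit a dominant plane (n, d) with n.X + d = 0 by RANSAC."""
    best_count, best_plane = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample, skip
            continue
        n /= norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_count:
            best_count, best_plane = inliers.sum(), (n, d)
    return best_plane, best_count
```

A full pipeline would segment multiple planes, attach a semantic label (e.g., floor, wall) from an RGB segmentation network, and match planes across adjacent frames by normal and distance similarity, which is roughly the role of the semantic plane features in the abstract.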


Application Information

IPC(8): G01C21/34; G01C21/16; G01S11/12; G01C22/00
CPC: G01C21/3407; G01C21/165; G01S11/12; G01C22/00; Y02T10/40
Inventor 黄郑王红星雍成优朱洁刘斌吕品陈玉权何容吴媚赖际舟
Owner JIANGSU FRONTIER ELECTRIC TECH