
Invariant central difference filter method for autonomous mobile robot visual SLAM

A technology of robot vision and autonomous movement, applied in the field of aerospace system information processing; it solves the problem of high computational complexity and achieves the effects of high computational efficiency, improved computational accuracy, and fast computation speed.

Active Publication Date: 2022-02-08
ZHENGZHOU UNIVERSITY OF LIGHT INDUSTRY
Cites 4 · Cited by 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Aiming at the technical problem of high computational complexity in the visual SLAM of an autonomous mobile robot equipped with an inertial measurement unit (IMU) and a monocular vision sensor, the present invention proposes an invariant central difference Kalman filter (CDKF) method for autonomous mobile robot visual SLAM. The invariant central difference filter is computed over the SLAM matrix Lie group vector to perform optimal filtering of the state variables of the motion model of the autonomous mobile robot visual SLAM system; the method has high computational efficiency, fast computation speed, and good practical application value.
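To illustrate the idea of central difference (Sigma point) sampling on a matrix Lie group state, the following is a minimal sketch, not the patent's exact algorithm. The restriction to SO(3) attitude, the helper names exp_so3 and lie_group_sigma_points, the step size h and the example covariance are all illustrative assumptions.

```python
# Minimal sketch: CDKF-style sigma points for a state on a matrix Lie group.
# Only SO(3) attitude is shown; the step size h and the covariance are assumptions.
import numpy as np

def skew(v):
    """Map a 3-vector to its skew-symmetric matrix."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def exp_so3(phi):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    angle = np.linalg.norm(phi)
    if angle < 1e-9:
        return np.eye(3) + skew(phi)
    axis = phi / angle
    K = skew(axis)
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def lie_group_sigma_points(R_mean, P, h=np.sqrt(3.0)):
    """Perturb the group mean by +/- h * sqrt(P) columns in the Lie algebra."""
    S = np.linalg.cholesky(P)            # matrix square root of the error covariance
    points = [R_mean]
    for j in range(P.shape[0]):
        delta = h * S[:, j]
        points.append(exp_so3(delta) @ R_mean)   # left (invariant) perturbation
        points.append(exp_so3(-delta) @ R_mean)
    return points                        # 2n + 1 sigma points on the group

# Example: 7 sigma points around the identity attitude.
pts = lie_group_sigma_points(np.eye(3), 0.01 * np.eye(3))
print(len(pts))
```

Perturbing the mean through the exponential map, rather than adding error vectors directly to the state, is what keeps the sampled points consistent with the Lie group structure of the state.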

Method used



Examples


Specific Embodiment

[0119] Specific embodiment: consider an autonomous mobile robot VSLAM system in which the robot is equipped with a monocular camera and an IMU measurement unit. The IMU provides the 3-dimensional attitude, velocity and position vector information of the robot system, and the monocular camera observes p fixed landmark positions, yielding p observation equations, so the VSLAM system model of the autonomous mobile robot can be expressed as:

[0120]

[0121] Here the VSLAM system state vector is a mixed state vector composed of a matrix Lie group variable formed from the 3-dimensional attitude, velocity and position vectors, together with the 3-dimensional gyroscope and accelerometer bias vectors; the 3-dimensional gyroscope and accelerometer outputs ω and a in the motion model equation form the control input variable u = [ω^T, a^T]^T. The system noise vector w_n is the Gaussia...
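As a hedged illustration of the kind of IMU-driven motion model described above (not the patent's exact equations), the sketch below propagates attitude, velocity and position from the bias-corrected gyroscope and accelerometer outputs ω and a. The Euler discretization, the gravity constant and the function names are assumptions.

```python
# Sketch of strap-down IMU kinematics: attitude R, velocity v, position p,
# driven by gyro/accel outputs with bias vectors bg, ba subtracted.
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, -9.81])   # assumed world-frame gravity

def propagate_imu(R, v, p, bg, ba, omega, a, dt):
    """One Euler step of the IMU-driven motion model."""
    omega_c = omega - bg                 # bias-corrected angular rate
    a_c = a - ba                         # bias-corrected specific force
    acc_world = R @ a_c + GRAVITY        # acceleration expressed in the world frame
    R_next = R @ Rotation.from_rotvec(omega_c * dt).as_matrix()  # attitude on SO(3)
    v_next = v + acc_world * dt
    p_next = p + v * dt + 0.5 * acc_world * dt ** 2
    return R_next, v_next, p_next

# Example: one 5 ms step with zero biases and a slow rotation about the z axis.
R0, v0, p0 = np.eye(3), np.zeros(3), np.zeros(3)
R1, v1, p1 = propagate_imu(R0, v0, p0, np.zeros(3), np.zeros(3),
                           np.array([0.0, 0.0, 0.1]),
                           np.array([0.0, 0.0, 9.81]), 0.005)
```

In the patent's formulation these quantities are collected into a matrix Lie group state variable, so the filter propagates sigma points of that group element rather than separate attitude, velocity and position blocks.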



Abstract

The present invention proposes an invariant central difference filter method for the visual SLAM of an autonomous mobile robot, used to solve the problem of the high computational complexity of filtering the state variables of the motion model of the autonomous mobile robot visual SLAM system. The present invention designs a central difference filter calculation method oriented to the matrix Lie group vector space together with the invariant Kalman filter. The filter state variable is composed of the SE(3) Lie group vector of the robot pose, velocity and 3-D landmark position vectors, together with the accelerometer and gyroscope bias vectors. The mean and error values of the Sigma sampling points of the CDKF filter are designed in the matrix Lie group space, an inverse-depth observation model of moving image feature points is constructed from the monocular vision camera, the CDKF prediction and update iterations are designed, and the robot positioning and map construction calculation tasks are carried out. Compared with the EKF algorithm of the conventional robot system model, the present invention has high calculation efficiency, fast calculation speed and good practical application value.
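The abstract mentions an inverse-depth observation model for image feature points seen by the monocular camera. The following is a minimal sketch of one common inverse-depth parameterization with a pinhole projection; the anchor/bearing/inverse-depth encoding, the parameter names and the calibration constants are assumptions, not the patent's exact model.

```python
# Sketch: inverse-depth landmark parameterization and monocular pinhole projection.
import numpy as np

def inverse_depth_to_point(anchor, theta, phi, rho):
    """World point from an anchor position, bearing angles and inverse depth rho."""
    bearing = np.array([np.cos(phi) * np.sin(theta),
                        -np.sin(phi),
                        np.cos(phi) * np.cos(theta)])
    return anchor + bearing / rho          # point = anchor + (1 / rho) * unit ray

def project_pinhole(point_w, R_wc, p_wc, fx, fy, cx, cy):
    """Project a world point into pixel coordinates of a calibrated camera."""
    point_c = R_wc.T @ (point_w - p_wc)    # world frame -> camera frame
    u = fx * point_c[0] / point_c[2] + cx
    v = fy * point_c[1] / point_c[2] + cy
    return np.array([u, v])

# Example: a landmark 5 m straight ahead of a camera at the world origin.
pt = inverse_depth_to_point(np.zeros(3), 0.0, 0.0, 0.2)
print(project_pinhole(pt, np.eye(3), np.zeros(3), 500.0, 500.0, 320.0, 240.0))
```

Distant features have inverse depth near zero, which keeps the observation model well behaved while depth is still uncertain; this is a common reason inverse-depth parameterizations are paired with monocular cameras.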

Description

Technical field
[0001] The present invention relates to the technical field of aerospace system information processing, and in particular to an invariant central difference filter method for autonomous mobile robot visual SLAM (Visual Simultaneous Localization And Mapping, VSLAM), that is, a novel invariant central difference filter model for the visual odometry of a robot real-time positioning and map construction system.
Background technique
[0002] In recent years, the visual real-time localization and mapping (VSLAM) technology of autonomous mobile robots has developed rapidly. As a multi-sensor system configuration, the data fusion methods of autonomous mobile robot systems include filtering methods, gradient descent optimization methods, and Bundle Adjustment (BA) optimization technology, among others. Among them, the gradient descent optimization method has good calculation efficiency, but compared with the filtering method, the cal...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC(8): G06T7/277
CPC: G06T7/277; G06T2207/30252
Inventors: 吴艳敏, 丁国强, 田英楠, 娄泰山, 张铎, 方洁
Owner: ZHENGZHOU UNIVERSITY OF LIGHT INDUSTRY