
Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method

A technology relating to robot motion and mobile robots, applied in instrumentation, navigation, and surveying.

Inactive Publication Date: 2012-07-04
ZHEJIANG UNIV
Cites: 2 · Cited by: 148

AI Technical Summary

Problems solved by technology

Overcomes the accuracy problem of traditional dead reckoning caused by cumulative error.



Examples


Embodiment

[0072] 1. Synchronously collect mobile robot binocular camera images and three-axis inertial navigation data

[0073] The setup uses a Pioneer 3 mobile robot, a NaiWei NV100 strapdown inertial navigation unit, and a Bumblebee2 binocular stereo camera. The inertial navigation unit samples at 100 Hz and is mounted at the geometric center of the robot, with the Z axis perpendicular to the ground, the X axis pointing directly ahead of the robot, and the Y axis pointing to the robot's right, perpendicular to both the X and Z axes. The binocular stereo camera is mounted directly in front of the robot at a pitch angle of 45 degrees and samples at 1 Hz.
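Because the IMU samples at 100 Hz and the camera at only 1 Hz, "synchronous" collection in practice means associating each camera frame with the nearest-in-time IMU sample. The patent does not give its synchronization code; the following is a minimal pure-Python sketch of that pairing step (the function name `pair_frames_with_imu` and the example timestamps are illustrative, not from the patent):

```python
import bisect

def pair_frames_with_imu(cam_ts, imu_ts):
    """For each camera timestamp, return the index of the nearest IMU sample.

    cam_ts: sorted camera timestamps in seconds (~1 Hz)
    imu_ts: sorted IMU timestamps in seconds (~100 Hz)
    """
    pairs = []
    for t in cam_ts:
        i = bisect.bisect_left(imu_ts, t)
        # compare the neighbours on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_ts)]
        best = min(candidates, key=lambda j: abs(imu_ts[j] - t))
        pairs.append(best)
    return pairs

# Example: IMU at 100 Hz, camera at roughly 1 Hz
imu_ts = [k * 0.01 for k in range(300)]   # 0.00 .. 2.99 s
cam_ts = [0.0, 1.003, 2.004]
print(pair_frames_with_imu(cam_ts, imu_ts))  # -> [0, 100, 200]
```

A real system would additionally buffer the ~100 IMU samples between consecutive frames so they can be integrated over the inter-frame interval, rather than keeping only the single nearest sample.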

[0074] 2. Extract the features of the front and rear frame image pairs and match and estimate the motion pose

[0075] For the left and right images collected by the binocular camera, scale-invariant feature transform (SIFT) features are extracted, including calculating the extreme...
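The patent text is truncated here, but SIFT-based motion estimation conventionally matches descriptors between frames by nearest-neighbour search with Lowe's ratio test, keeping only matches whose best distance is clearly smaller than the runner-up's. A minimal pure-Python sketch of that matching rule (the function name `ratio_test_match` and the toy descriptors are illustrative, not from the patent):

```python
import math

def ratio_test_match(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.

    desc_a, desc_b: lists of equal-length descriptor vectors.
    Returns a list of (index_in_a, index_in_b) accepted matches.
    """
    def dist(u, v):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(u, v)))

    matches = []
    for i, d in enumerate(desc_a):
        scored = sorted(range(len(desc_b)), key=lambda j: dist(d, desc_b[j]))
        if len(scored) < 2:
            continue
        best, second = scored[0], scored[1]
        # accept only if the best match is clearly closer than the runner-up
        if dist(d, desc_b[best]) < ratio * dist(d, desc_b[second]):
            matches.append((i, best))
    return matches

# Toy 2-D "descriptors": each point in desc_a has one obvious partner in desc_b
desc_a = [[0.0, 0.0], [5.0, 5.0]]
desc_b = [[0.1, 0.0], [5.0, 5.1], [10.0, 10.0]]
print(ratio_test_match(desc_a, desc_b))  # -> [(0, 0), (1, 1)]
```

Real SIFT descriptors are 128-dimensional and matching is normally done with an optimized matcher (e.g. OpenCV's `BFMatcher.knnMatch`), but the acceptance rule is the same.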



Abstract

The invention discloses a mobile robot motion attitude estimation method based on the fusion of machine vision and inertial navigation, comprising the following steps: synchronously acquiring mobile robot binocular camera images and triaxial inertial navigation data; extracting features from front/back frame image pairs and estimating the motion attitude by matching; computing the pitch and roll angles from inertial navigation; building a Kalman filter model to fuse the vision and inertial navigation attitude estimates; adaptively adjusting the filter parameters according to the estimation variance; and carrying out attitude-corrected cumulative dead reckoning. The method provides a real-time extended Kalman filter attitude estimation model that uses inertial navigation combined with the gravity acceleration direction as a supplement, decouples the three-axis attitude estimation of the visual odometer, and corrects the cumulative error of the attitude estimation. The filter parameters are adjusted by fuzzy logic according to the motion state, realizing adaptive filtering estimation, reducing the influence of acceleration noise, and effectively improving the positioning accuracy and robustness of the visual odometer.
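The abstract's step "computing a pitch angle and a roll angle by inertial navigation" exploits the fact that, when the robot is not accelerating, the accelerometer measures the gravity direction, from which pitch and roll (but not yaw) can be recovered. A minimal sketch of that standard computation, assuming the axis convention from the embodiment (X forward, Y right, Z up; the function name is illustrative):

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from the measured gravity direction.

    Assumes the accelerometer reads approximately (0, 0, g) when level and
    the robot is not otherwise accelerating. Yaw is unobservable from
    gravity alone, which is why vision must supply the third axis.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

print(pitch_roll_from_accel(0.0, 0.0, 9.81))  # level robot -> (0.0, 0.0)
```

This is why the patent fuses the two sensors: gravity pins down pitch and roll without cumulative error, while the visual odometer contributes the yaw that gravity cannot observe.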

Description

Technical Field

[0001] The invention relates to a method for estimating the motion attitude of a mobile robot based on the fusion of machine vision and inertial navigation, suitable for motion attitude estimation and positioning of autonomous robots.

Background

[0002] Accurate land positioning systems have important applications in the autonomous navigation, path planning, and terrain reconstruction of mobile robots. Traditional robot positioning methods include GPS and inertial navigation. GPS is widely used in vehicle positioning, but it cannot be used in occluded areas or indoors; an inertial measurement unit (IMU) performs dead reckoning by integrating angular velocity and linear acceleration, but it is easily affected by noise, which causes "drift" in the positioning results. In addition, more and more studies use visual localization methods as a supplement to traditional localization methods. For example, visual odometry (VO), by t...
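The "drift" the background describes follows directly from integration: any constant sensor bias in the angular-rate measurement accumulates linearly with time. A minimal sketch illustrating this with yaw dead reckoning (the function name, bias value, and sample data are illustrative, not from the patent):

```python
def integrate_yaw(gyro_z, dt, bias=0.0):
    """Dead-reckon yaw (radians) by integrating angular-rate samples (rad/s).

    A constant sensor bias accumulates linearly: after N steps the heading
    error is N * dt * bias -- the 'drift' problem the background describes.
    """
    yaw = 0.0
    for w in gyro_z:
        yaw += (w + bias) * dt
    return yaw

# Robot truly stationary (true rate 0), but the gyro has a 0.01 rad/s bias:
samples = [0.0] * 1000            # 10 s of data at 100 Hz
print(integrate_yaw(samples, dt=0.01, bias=0.01))  # ~0.1 rad of pure drift
```

Visual odometry suffers an analogous per-frame accumulation, which is why the invention corrects the integrated estimate with an absolute reference (the gravity direction) inside a Kalman filter rather than relying on integration alone.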

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/00; G01C21/16
Inventors: 路丹晖, 马丽莎, 杨飞, 刘济林
Owner: ZHEJIANG UNIV