
Vision and laser radar fused outdoor mobile robot pose estimation method

A mobile robot and laser radar technology, applied in the fields of instrumentation, computing, image data processing, etc.

Active Publication Date: 2021-02-23
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0004] In view of this, the purpose of the present invention is to provide a method for estimating the pose of an outdoor mobile robot based on the fusion of vision and laser radar, which overcomes the shortcomings of single-sensor methods and achieves higher-precision and more robust pose estimation for the mobile robot in outdoor environments.




Embodiment Construction

[0086] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0087] Please refer to Figure 1. The present invention provides a method for estimating the pose of an outdoor mobile robot that fuses vision and laser radar, comprising the following steps:

[0088] Step S1: obtain point cloud data and visual image data;

[0089] Step S2: use an iterative fitting algorithm to accurately estimate the ground model of each frame's point cloud and extract the ground points;
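
The patent does not disclose the exact iterative-fitting procedure, but a common sketch of this kind of ground-model estimation alternates a least-squares plane fit with inlier reselection. The seed heuristic (taking the lowest points), the thresholds, and the function name below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def fit_ground_plane(points, n_iters=5, dist_thresh=0.2):
    """Iteratively fit a plane n.X + d = 0 to an (N, 3) point cloud and
    return (normal, d, inlier_mask).  Seed with the lowest points, then
    alternate least-squares refit / inlier reselection."""
    # Seed: take the lowest 30% of points by z as the initial ground guess.
    z_cut = np.percentile(points[:, 2], 30)
    mask = points[:, 2] <= z_cut
    for _ in range(n_iters):
        pts = points[mask]
        centroid = pts.mean(axis=0)
        # Plane normal = right singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(pts - centroid)
        normal = vt[-1]
        if normal[2] < 0:            # orient the normal upward
            normal = -normal
        d = -normal @ centroid
        # Reselect inliers: points within dist_thresh of the current plane.
        mask = np.abs(points @ normal + d) < dist_thresh
    return normal, d, mask
```

The inlier mask returned on the last iteration is then the set of extracted ground points used by the later steps.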

[0090] Step S3: extract ORB feature points from the lower half of the visual image, and estimate depth for the corresponding visual feature points from the extracted ground points;
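
Restricting features to the lower half of the image makes it likely that they lie on the ground, so a feature's depth can be recovered by intersecting its back-projected ray with the fitted ground plane. The sketch below assumes a pinhole intrinsic matrix K and a ground plane n.X + d = 0 already expressed in the camera frame; the function name and frame conventions are illustrative assumptions:

```python
import numpy as np

def ground_point_depth(u, v, K, normal, d):
    """Depth (along the optical axis) of pixel (u, v), assuming it lies on
    the ground plane normal . X + d = 0 in the camera frame.  The pixel's
    back-projected ray K^-1 [u, v, 1] is intersected with the plane:
    X = s * ray, with  n . X + d = 0  =>  s = -d / (n . ray)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    s = -d / (normal @ ray)
    return s * ray[2]    # z-component = depth (ray[2] == 1 for a pinhole K)
```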

[0091] Step S4: according to the number of scan lines and the angular resolution of the laser radar, obtain the depth image formed from the depth information of the point cloud;
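
A depth (range) image of this kind can be built by binning each point by elevation into one of the sensor's scan lines and by azimuth at the sensor's angular resolution. The 16-line, ±15° field of view and 0.2° resolution defaults below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cloud_to_depth_image(points, n_lines=16, vert_fov=(-15.0, 15.0), horiz_res=0.2):
    """Project an (N, 3) lidar cloud into a range image whose rows are scan
    lines (binned by elevation) and whose columns are azimuth bins at
    horiz_res degrees.  Cells with no return stay 0."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    elev = np.degrees(np.arcsin(z / r))
    azim = np.degrees(np.arctan2(y, x))          # in [-180, 180)
    rows = np.round(
        (elev - vert_fov[0]) / (vert_fov[1] - vert_fov[0]) * (n_lines - 1)
    ).astype(int)
    cols = ((azim + 180.0) / horiz_res).astype(int)
    n_cols = int(360.0 / horiz_res)
    img = np.zeros((n_lines, n_cols))
    valid = (rows >= 0) & (rows < n_lines) & (cols >= 0) & (cols < n_cols)
    img[rows[valid], cols[valid]] = r[valid]      # store range as the "depth"
    return img
```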

[0092] Step S5: according to the obtained depth image, calculate...



Abstract

The invention relates to a vision and laser radar fused outdoor mobile robot pose estimation method, comprising the following steps: S1, obtain point cloud data and visual image data; S2, use an iterative fitting algorithm to accurately estimate the ground model and extract ground points; S3, extract ORB feature points from the lower half of the visual image, and estimate depth for the visual feature points from the ground points; S4, obtain a depth image formed from the depth information of the point cloud; S5, extract edge features, plane features and ground features; S6, match the visual features using the Hamming distance and a RANSAC algorithm, and preliminarily compute the relative pose of the mobile robot with the iterative closest point method; S7, obtain the final pose of the robot from the vision-derived relative pose, the point-to-plane and normal-vector constraints provided by the ground point cloud, and the point-to-line and point-to-plane constraints provided by the non-ground point cloud. The invention achieves higher-precision and more robust pose estimation for the mobile robot in outdoor environments.
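
The preliminary relative-pose computation in step S6 rests on the classical closed-form alignment used inside the iterative closest point method. A minimal sketch of that inner step (the Kabsch/Umeyama solution via SVD, with matched 3D point pairs P and Q assumed already established by the descriptor matching) is:

```python
import numpy as np

def relative_pose_svd(P, Q):
    """Closed-form rigid transform (R, t) minimising sum ||R @ P_i + t - Q_i||^2
    over matched (N, 3) point sets -- the alignment step at the core of the
    iterative closest point method (Kabsch/Umeyama, via SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t
```

In a full ICP loop this solve alternates with re-association of closest points; here a single solve suffices because the correspondences come from the feature matching.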

Description

Technical Field

[0001] The invention relates to the field of autonomous navigation of mobile robots, in particular to a method for estimating the pose of an outdoor mobile robot by fusing vision and laser radar.

Background Technique

[0002] In recent years, mobile robots built around autonomous navigation technology have shown great development prospects in many fields and have been widely applied in everyday scenarios: household sweeping robots, service robots and warehouse UGVs working in indoor environments, and, in outdoor scenes, agricultural surveying, unmanned logistics transportation, power and safety inspection, and other operations.

[0003] As the primary module of the entire robot navigation system, the positioning module is also the basic module that ensures the robot can carry out navigation tasks. The positioning module provides the real-time location of the mobile robot, answering the question of "where" the robot is. The ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73, G06T7/77, G06T7/50
CPC: G06T7/74, G06T7/77, G06T7/50, Y02T10/40
Inventors: 何炳蔚, 刘宸希, 朱富伟, 张立伟, 林立雄, 陈彦杰
Owner: FUZHOU UNIV