
Multi-source information fusion robot positioning method and system for unstructured environment

A technology combining multi-source information fusion and robot positioning, applicable to radio wave measurement systems, instruments, and the utilization of re-radiation; it ensures stable operation and achieves tight coupling of the sensor data.

Pending Publication Date: 2022-07-01
SHANDONG YOUBAOTE INTELLIGENT ROBOTICS CO LTD

AI Technical Summary

Problems solved by technology

[0004] In order to solve the above problems, the present invention proposes a multi-source information fusion robot positioning method and system for unstructured environments. The invention solves the problem of mobile robot positioning in dynamic and complex environments, offering strong resistance to interference, good adaptability in geometrically degraded environments, and high stability in rainy and snowy weather conditions.



Examples


Embodiment 1

[0041] As shown in Figure 1, the present invention provides a multi-source information fusion robot positioning method for unstructured environments, comprising the following steps.

[0042] Perform point cloud registration between the real-time laser point cloud emitted by the multi-line lidar and the pre-built point cloud map to calculate the current pose of the robot body.
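The patent does not name the registration algorithm; as one hedged illustration, the sketch below aligns the live scan to the prior map with Open3D's point-to-point ICP and returns the body pose as a 4x4 transform. The function name register_scan_to_map and the correspondence threshold are assumptions, not from the patent.

```python
# Minimal sketch: lidar scan-to-map registration via ICP (Open3D).
# The patent does not specify the registration algorithm; ICP is an assumption.
import numpy as np
import open3d as o3d

def register_scan_to_map(scan_points, map_points, init_pose=np.eye(4)):
    """Align a live lidar scan to the prior point cloud map.

    scan_points, map_points: (N, 3) numpy arrays of XYZ points.
    init_pose: 4x4 initial guess (e.g., the last known pose).
    Returns a 4x4 transform = estimated robot body pose in the map frame.
    """
    scan = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(scan_points))
    pcd_map = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(map_points))
    result = o3d.pipelines.registration.registration_icp(
        scan, pcd_map,
        max_correspondence_distance=1.0,  # metres; tune to scan density
        init=init_pose,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    )
    return result.transformation
```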

[0043] Use the binocular camera to acquire images of the current environment, extract ORB visual feature points from each frame to form visual key frames, and match the current key frame against the pre-built visual feature map through relocalization to obtain the robot's current position.
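A minimal OpenCV sketch of this visual step: extract ORB features from the current frame and match them against descriptors stored in the prebuilt visual feature map. The stored-map layout, the Hamming-distance gate, and the match threshold are illustrative assumptions.

```python
# Minimal sketch: ORB feature extraction and key-frame matching (OpenCV).
# The stored-map format and thresholds are illustrative assumptions.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract_keyframe(frame_bgr):
    """Return ORB keypoints and descriptors for one camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    return keypoints, descriptors

def relocalize(descriptors, map_descriptors, min_matches=30):
    """Match the current key frame against the prebuilt visual feature map.

    Returns the good matches if the frame is confidently recognized, else None.
    """
    matches = matcher.match(descriptors, map_descriptors)
    good = [m for m in matches if m.distance < 50]  # Hamming distance gate
    return good if len(good) >= min_matches else None
```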

[0044] At the same time, during the movement of the robot, the acceleration measured by the inertial measurement unit is integrated to output odometry information.
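Integrating acceleration once gives velocity and twice gives position, which serves as the odometry output. The dead-reckoning integrator below is a bare sketch; gravity compensation and bias estimation are assumed to happen upstream, and the class name is illustrative.

```python
# Minimal sketch: dead-reckoning odometry by integrating IMU acceleration.
# Gravity compensation and bias estimation are assumed done upstream.
import numpy as np

class ImuOdometry:
    def __init__(self):
        self.velocity = np.zeros(3)
        self.position = np.zeros(3)

    def update(self, accel_world, dt):
        """accel_world: (3,) gravity-compensated acceleration in the world frame.
        dt: time step in seconds. Returns the integrated position."""
        self.velocity += accel_world * dt      # first integral: velocity
        self.position += self.velocity * dt    # second integral: position
        return self.position.copy()
```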

[0045] Based on the pose information obtained by the above three sensors, the real-time state of the robot is estimat...
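The paragraph above is truncated in the source, so the exact estimator is not visible; Embodiment 2's "positioning information prediction module ... filter" wording suggests a filtering scheme. Below is a minimal Kalman-style position fuser as one hedged possibility; PoseFuser and its noise parameters Q and R are illustrative assumptions.

```python
# Minimal sketch: fusing the three sources with a Kalman-style filter over
# position. The filter form is an assumption: the source paragraph is
# truncated and does not spell out the estimator actually used.
import numpy as np

class PoseFuser:
    def __init__(self, pos0):
        self.x = np.asarray(pos0, dtype=float)  # fused position estimate
        self.P = np.eye(3)                      # estimate covariance

    def predict(self, delta_odom, Q=np.eye(3) * 0.05):
        """Propagate the state with an IMU odometry increment."""
        self.x = self.x + np.asarray(delta_odom, dtype=float)
        self.P = self.P + Q

    def correct(self, z, R=np.eye(3) * 0.1):
        """Fuse an absolute position fix z (from lidar ICP or visual
        relocalization) with measurement covariance R (H = identity)."""
        K = self.P @ np.linalg.inv(self.P + R)  # Kalman gain
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.x)
        self.P = (np.eye(3) - K) @ self.P

    def estimate(self):
        return self.x.copy()
```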

Embodiment 2

[0060] This embodiment provides a multi-source information fusion robot positioning system for an unstructured environment, including:

[0061] The laser point cloud processing module is configured to: obtain the real-time laser point cloud emitted by the robot, register it against the preset point cloud map, and calculate the current pose of the robot;

[0062] The image information processing module is configured to: obtain images of the robot's current environment, extract ORB visual feature points from each frame to form visual key frames, and match the current key frame against the preset visual feature map through relocalization to obtain the robot's current position;

[0063] The acceleration processing module is configured to: obtain the robot's acceleration and integrate it to obtain odometry information;

[0064] The positioning information prediction module is configured to: filter the ob...
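Taken together, these modules suggest a pipeline in which IMU odometry drives prediction and the lidar and vision fixes drive correction. The skeleton below is one hedged way to wire them up; all class and method names (PositioningSystem, register, relocalize, and so on) are illustrative, not from the patent, and the predictor is assumed to expose the predict/correct interface sketched under Embodiment 1.

```python
# Minimal sketch: how the modules named in Embodiment 2 might be wired
# together. Class and method names are illustrative, not from the patent.
import numpy as np

class PositioningSystem:
    def __init__(self, laser_module, image_module, accel_module, predictor):
        self.laser = laser_module    # laser point cloud processing module
        self.vision = image_module   # image information processing module
        self.imu = accel_module      # acceleration processing module
        self.predictor = predictor   # positioning information prediction module
        self._last_odom = np.zeros(3)

    def step(self, scan, frame, accel, dt):
        """One fusion cycle: predict with IMU odometry, then correct with
        whichever absolute fixes (lidar, vision) are available."""
        odom = self.imu.update(accel, dt)            # integrated IMU position
        self.predictor.predict(odom - self._last_odom)
        self._last_odom = odom
        for fix in (self.laser.register(scan), self.vision.relocalize(frame)):
            if fix is not None:                      # vision may fail to relocalize
                self.predictor.correct(fix)
        return self.predictor.estimate()
```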

Embodiment 3

[0067] This embodiment provides a control system for a robot based on multi-source information fusion positioning in an unstructured environment, including:

[0068] The multi-line laser radar positioning module is configured to: emit a multi-line laser point cloud to scan the physical environment around the robot, and perform registration and positioning between the real-time laser point cloud and the existing environment point cloud map; the laser generates real-time point cloud information of the robot's surroundings, from which the current robot pose is calculated by the point cloud registration algorithm;

[0069] The binocular vision positioning module is configured to: acquire real-time image data around the robot and relocalize it against the existing environmental visual feature point map; acquire real-time images in front of the robot and extract visual ORB feature points from the image pixel information; according to the visua...
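The paragraph is truncated before naming the pose solver, but once live ORB features are matched to 3D landmarks in the feature map, a standard way to recover the camera (and hence robot) pose is RANSAC PnP, sketched here as an assumption; the function name and argument layout are illustrative.

```python
# Minimal sketch: recovering the robot pose from 2D-3D matches between the
# live image and the visual feature map. PnP + RANSAC is an assumption; the
# source paragraph is truncated before naming the pose solver.
import cv2
import numpy as np

def pose_from_matches(pts_3d_map, pts_2d_image, camera_matrix):
    """pts_3d_map: (N, 3) landmark positions from the feature map.
    pts_2d_image: (N, 2) matched ORB keypoint locations in the image.
    camera_matrix: 3x3 intrinsics. Returns (R, t) or None on failure."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts_3d_map.astype(np.float32),
        pts_2d_image.astype(np.float32),
        camera_matrix, distCoeffs=None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    return R, tvec
```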



Abstract

The multi-source information fusion robot positioning method and system for unstructured environments provided by the invention fuse multi-source data acquired in real time, such as laser point clouds, environment images, and acceleration, enabling highly robust real-time positioning in severe weather, in dynamic unstructured environments, and under drastic illumination changes. In geometrically degraded roadway areas, where laser point cloud registration fails for lack of sufficient external features, the visual positioning module can still operate normally and complete robot relocalization by detecting and matching image information; in environments lacking sufficient illumination, the laser point cloud and acceleration information can still supply enough fused positioning data to the computing unit.

Description

Technical field

[0001] The invention belongs to the technical field of position prediction, and in particular relates to a multi-source information fusion robot positioning method and system for unstructured environments.

Background technique

[0002] Mobile robots can be divided into wheeled, tracked, and legged robots according to their mode of locomotion. Whatever the form of motion, a robot operating in a complex and changeable workspace must perceive the pose and size of surrounding objects in real time and obtain its own pose information, so as to realize real-time interaction between the robot and the working environment and to estimate the motion state of the robot body.

[0003] The inventor found that when a mobile robot moves in an outdoor environment without typical structural features, real-time and accurate acquisition of body pose information is faced with exter...


Application Information

IPC(8): G01C21/00; G01C21/16; G01C21/20; G01S17/86
CPC: G01C21/005; G01C21/1652; G01C21/1656; G01C21/165; G01C21/20; G01S17/86
Inventor: 范永, 刘大宇, 马德盛, 葛怀国, 陈彬
Owner: SHANDONG YOUBAOTE INTELLIGENT ROBOTICS CO LTD