
An Indoor Robot Localization Method Fusion of Visual Odometry and Physical Odometry

A visual odometry and indoor robot technology, applied in the field of autonomous positioning of indoor mobile robots, which solves the problem of superimposed, accumulated error that cannot otherwise be eliminated, and achieves the effects of meeting accuracy requirements, ensuring efficiency and real-time performance, and solving the problem of error accumulation.

Active Publication Date: 2020-06-16
QINGDAO KRUND ROBOT CO LTD

AI Technical Summary

Problems solved by technology

[0004] Using an improved Monte Carlo particle filter together with physical-odometer positioning, the existing technology can satisfy robot positioning in an indoor environment that has a simple structure and a small area. However, the physical odometer is computed from the displacement increment between two time instants and considers only local motion, so its error accumulates continuously until the drift becomes too large to be eliminated; the positioning error grows even larger when the wheels slip or tilt.
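A toy illustration of this accumulation effect (not from the patent): dead reckoning integrates a noisy displacement increment at every step, so the estimate drifts without bound. The step size and noise level below are arbitrary assumptions.

```python
import random

# Integrate 1000 odometry increments; each measurement carries a small
# zero-mean error, yet the integrated estimate drifts steadily.
x_true, x_est = 0.0, 0.0
for _ in range(1000):
    dx = 0.01                                # true displacement per step (m)
    x_true += dx
    x_est += dx + random.gauss(0.0, 0.001)   # odometer noise added each step
print(f"accumulated drift after 1000 steps: {abs(x_est - x_true):.4f} m")
```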



Examples


Embodiment Construction

[0018] The present invention will be described in detail below with reference to the accompanying drawings and examples.

[0019] As shown in accompanying figures 1 and 2, the present invention provides an indoor robot positioning method that fuses a visual odometer and a physical odometer, comprising the following steps:

[0020] Step 1. Use an ASUS Xtion depth camera to obtain color and depth images;
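A minimal sketch of this capture step, assuming OpenCV's OpenNI2 backend as the driver stack (the patent does not specify one):

```python
import cv2

# Open the first OpenNI2-compatible device (e.g. an ASUS Xtion).
cap = cv2.VideoCapture(cv2.CAP_OPENNI2)
if not cap.isOpened():
    raise RuntimeError("no OpenNI2 depth camera found")

cap.grab()  # grab one synchronized color/depth frame pair
ok_d, depth = cap.retrieve(flag=cv2.CAP_OPENNI_DEPTH_MAP)  # 16-bit depth (mm)
ok_c, color = cap.retrieve(flag=cv2.CAP_OPENNI_BGR_IMAGE)  # 8-bit BGR image
```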

[0021] Step 2. Extract ORB features from the two consecutive captured images, compute the descriptor of each ORB feature point, and estimate the camera pose change through feature matching between adjacent images: 1) combine the depth image to obtain valid feature points; 2) match the ORB descriptors and depth values of the feature points, using the RANSAC algorithm to eliminate wrong point pairs; 3) obtain the rotation matrix R and translation matrix T between adjacent images, and estimate the pose transformation of the camera;
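The following sketch illustrates step 2 with OpenCV, back-projecting matched ORB features through the depth image and recovering R and T via RANSAC-based PnP. The camera intrinsics and the millimetre depth scale are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

# Assumed pinhole intrinsics for a VGA depth camera (not from the patent).
fx, fy, cx, cy = 570.3, 570.3, 320.0, 240.0
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)

def estimate_pose(color1, depth1, color2):
    """Estimate rotation R and translation T from frame 1 to frame 2."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(color1, None)
    kp2, des2 = orb.detectAndCompute(color2, None)

    # Brute-force Hamming matching of the binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts3d, pts2d = [], []
    for m in matches:
        u, v = map(int, kp1[m.queryIdx].pt)
        z = depth1[v, u] / 1000.0        # depth in mm -> metres
        if z == 0:                       # keep only points with valid depth
            continue
        pts3d.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
        pts2d.append(kp2[m.trainIdx].pt)

    # RANSAC-based PnP discards wrong point pairs and yields the pose change.
    ok, rvec, T, inliers = cv2.solvePnPRansac(
        np.asarray(pts3d), np.asarray(pts2d), K, None)
    R, _ = cv2.Rodrigues(rvec)           # rotation vector -> rotation matrix
    return R, T, inliers
```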

[0022] Step 3. During the movement of the robot, select from several adjacent frames the image with the most common feature points and the best match as the key ...
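A hedged sketch of the keyframe choice just described, scoring each recent frame by how many matched feature points it shares with the current frame; the window size and the scoring function `count_matches` are assumptions for illustration:

```python
def select_keyframe(current, recent_frames, count_matches, window=5):
    """Pick the recent frame sharing the most common feature points."""
    best_frame, best_score = None, -1
    for frame in recent_frames[-window:]:
        score = count_matches(current, frame)   # e.g. RANSAC inlier count
        if score > best_score:
            best_frame, best_score = frame, score
    return best_frame
```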


Abstract

The invention discloses an indoor robot positioning method that combines a visual odometer and a physical odometer. Images are collected, and ORB features are extracted for image matching, camera pose estimation, and closed-loop detection, so that the robot is positioned accurately. The visual odometer is added in a known environment to perform closed-loop detection on the robot: the accumulated error of the particle-filter-based physical odometer is eliminated, the global error of the odometer is changed to segmented accumulation, and a closed loop is established through the above process. Combining the visual odometer effectively solves the problem of error accumulation of the physical odometer and realizes autonomous positioning and accurate re-positioning of the robot in a known environment. The added computational burden is small, so efficiency and real-time performance are guaranteed and indoor navigation accuracy requirements can be precisely satisfied; the method effectively solves the prior-art problem of inaccurate robot positioning in large environments.
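As a rough sketch of the fusion logic described above (assuming 4x4 homogeneous pose matrices and a hypothetical `match_keyframe` lookup; the patent does not give this interface): the robot keeps integrating the physical odometer, and whenever closed-loop detection recognizes a stored keyframe, the pose snaps back to the mapped pose, resetting the accumulated error.

```python
import numpy as np

def fuse_pose(odom_pose, image, keyframe_db, match_keyframe):
    """Return odometry pose, corrected whenever a loop closure fires."""
    hit = match_keyframe(image, keyframe_db)  # None or (map_pose, rel_T)
    if hit is None:
        return odom_pose                      # keep integrating odometry
    map_pose, rel_T = hit
    return map_pose @ rel_T                   # snap back to the map frame
```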

Description

Technical Field

[0001] The invention relates to the autonomous positioning accuracy of indoor mobile robots, and in particular to an indoor robot positioning method that combines a visual odometer and a physical odometer.

Background Technique

[0002] In research on the intelligent navigation of autonomous mobile robots, simultaneous localization and mapping (SLAM) in unknown environments is a key technology with both engineering and academic value, and it is a research hotspot in the field. Under this trend, scholars have proposed a variety of methods to solve the SLAM problem and have applied a variety of sensors to address the environmental perception problem in SLAM.

[0003] The first problem SLAM technology must solve is selecting an appropriate sensor system to realize real-time positioning of the robot. In practical applications, sensors based on laser radar, which offer high accuracy in ranging range and azimuth, are the...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/20
CPC: G01C21/206
Inventors: 周唐恺, 江济良, 王运志
Owner: QINGDAO KRUND ROBOT CO LTD