
Vision navigation method of mobile robot based on hand-drawing map and path

A mobile robot visual navigation technology, applied in road network navigators, two-dimensional position/course control, etc. It addresses problems such as limited sensing ability, navigation failure, and susceptibility to mismatching, achieving a highly efficient and robust effect.

Inactive Publication Date: 2011-06-08
SOUTHEAST UNIV


Problems solved by technology

However, this method requires the robot to have strong perception ability: if the robot cannot perceive at least two landmarks at any given time, navigation easily loses control.
Chronis, Skubic et al. have done extensive work on navigation based on hand-drawn maps: "Extracting navigation states from a hand-drawn map" (Skubic M., Matsakis P., Forrester B., and Chronis G., in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Seoul, Korea, vol. 1, 2001: 259-264), "Generating Multi-Level Linguistic Spatial Descriptions from Range Sensor Readings Using the Histogram of Forces" (Skubic M., Matsakis P., Chronis G., and Keller J., Autonomous Robots, 2003, 14(1): 51-69), and "Qualitative Analysis of Sketched Route Maps: Translating a Sketch into Linguistic Descriptions" (Skubic M., Blisard S., Bailey C., et al., IEEE Transactions on Systems, Man and Cybernetics, 2004, 34(2): 1275-1282). Their navigation method is limited to sonar as the only sensor, and its implementation consists of three main steps: manually drawing the map and path, extracting key points (QLS) from the drawn map, and detecting in real time whether the corresponding QLS is matched. Simulation and physical experiments show that the method achieves good results in simple environments; however, constrained by the detection ability of its sensor, in complex environments it is prone to mismatching during real-time matching, which may lead to navigation failure.

Method used




Embodiment Construction

[0034] 1 Drawing and representation of hand-drawn maps

[0035] Suppose the actual environment map is M, composed of the elements described below.

[0036] Here landmarks(size, position) denote the key landmarks set up for navigation; static obstacles(size, position) denote objects that remain stationary for long periods but, lacking distinctive features, cannot serve as navigation references, although the robot must still avoid them while moving; dynamic obstacles(size, position) denote objects whose positions keep changing while the robot moves; task area(object, position, range) denotes the target or task operation area; and the initial pose of the mobile robot(size, position) is also recorded.
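The map elements above can be collected into a simple data structure. This is a minimal sketch; the class and field names below are hypothetical, since the patent specifies only the tuple contents (size, position, etc.), not a concrete layout.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Landmark:
    size: float                      # approximate footprint of the landmark
    position: Tuple[float, float]    # (x, y) in map coordinates

@dataclass
class Obstacle:
    size: float
    position: Tuple[float, float]

@dataclass
class TaskArea:
    obj: str                         # target object of the task
    position: Tuple[float, float]
    range_: float                    # extent of the operation area

@dataclass
class HandDrawnMap:
    landmarks: List[Landmark] = field(default_factory=list)
    static_obstacles: List[Obstacle] = field(default_factory=list)
    dynamic_obstacles: List[Obstacle] = field(default_factory=list)
    task_area: Optional[TaskArea] = None
    robot_pose: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # x, y, heading
```

A map drawn in the interactive interface would then populate one `HandDrawnMap` instance that the navigation loop reads from.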

[0037] The drawing of hand-drawn maps is relatively simple. Open the interactive drawing interface. Since the image information of the key landmarks in the environment is saved in the system in adva...
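The first step summarized in the abstract, extracting key guide points that split the drawn path into segments according to a minimal-deviation ("less bias") principle, can be sketched as a polyline simplification pass. This is an assumed interpretation: the recursive keep-the-largest-deviation scheme below (in the style of Douglas-Peucker) is one plausible reading, and `tol` is a hypothetical deviation threshold.

```python
import math

def _point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def extract_guide_points(path, tol=0.5):
    """Keep only points whose deviation ('bias') from the chord between
    the current endpoints exceeds tol; recurse on both halves."""
    if len(path) < 3:
        return list(path)
    dists = [_point_segment_distance(p, path[0], path[-1]) for p in path[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [path[0], path[-1]]          # whole stretch is one segment
    left = extract_guide_points(path[:i + 1], tol)
    right = extract_guide_points(path[i:], tol)
    return left[:-1] + right                # drop the duplicated split point
```

A nearly straight stretch of the drawn path thus collapses to its two endpoints, while a corner survives as a guide point separating two segments.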



Abstract

The invention relates to a vision navigation method for a mobile robot based on a hand-drawn map and path. The method comprises the following steps: first, extracting key guide points from the operation path according to a minimal-deviation principle and dividing the original path into a number of segments; then, during each segment of operation, matching the corresponding reference images from the pre-drawn environment map against the information collected in real time by the robot's camera, to estimate the image most likely present in the current field of view; detecting image features with the SURF (Speeded-Up Robust Features) algorithm and rapidly finding matching points with a KD-tree (k-dimensional tree) search; solving the projective transformation matrix between the reference image and the real-time image with the RANSAC (Random Sample Consensus) algorithm, thereby obtaining the position of the reference image within the real-time image, and fusing odometer data to obtain the robot's reference position; and finally, computing the heading for the next segment from the obtained reference position, until the robot reaches the last reference point. The robot can thus travel to a designated region without an accurate environment map or an accurate operation path, while avoiding dynamic obstacles.
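The RANSAC stage of the pipeline can be illustrated with a minimal sketch. The patent fits a full projective (homography) model to SURF/KD-tree correspondences; the pure-Python version below estimates only a 2-D translation so the same hypothesize-and-verify loop fits in a few lines. All names and the threshold value are illustrative, not from the patent.

```python
import math
import random

def ransac_translation(src, dst, iters=200, thresh=1.0, seed=0):
    """Estimate a 2-D translation mapping src[i] -> dst[i] despite outlier
    correspondences: repeatedly hypothesize from a minimal sample (here a
    single correspondence), count inliers, and keep the best hypothesis."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), -1
    n = len(src)
    for _ in range(iters):
        i = rng.randrange(n)                       # minimal random sample
        tx, ty = dst[i][0] - src[i][0], dst[i][1] - src[i][1]
        inliers = sum(
            1 for (sx, sy), (dx, dy) in zip(src, dst)
            if math.hypot(sx + tx - dx, sy + ty - dy) < thresh
        )
        if inliers > best_inliers:                 # keep best consensus set
            best_inliers, best_t = inliers, (tx, ty)
    return best_t, best_inliers
```

In the real method the minimal sample is four correspondences (enough to fix a homography), and the winning transform localizes the reference image inside the live camera frame before odometer fusion.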

Description

Technical field

[0001] The present invention relates to the field of intelligent robot navigation. Guiding a robot, through human-computer interaction with a hand-drawn map, to navigate autonomously in a dynamic unknown environment using vision and other sensors is of great significance for bringing service robots into human households as soon as possible.

Background technique

[0002] "Global" magazine reported (http://tekbots.eefocus.com/article/10-01/1688061264400769.html) that Bill Gates once published an article in "Scientific American" on the future of the robot industry. In his view, once the robot industry develops past a critical point, it may completely change the world, just as the computer industry did. The Indian-British scholar Guptara pointed out in his article "In 2020, Japanese Robots Rule the World" that by 2020, Japan will be the undisputed leader in the field of robotics; by 2020, the world robot market will reac...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/34, G01C21/32, G05D1/02
Inventors: Li Xinde (李新德), Wu Xuejian (吴雪建), Zhu Bo (朱博), Dai Xianzhong (戴先中)
Owner: SOUTHEAST UNIV