Vision navigation method of mobile robot based on hand-drawn outline semantic map

A mobile-robot and semantic-map technology, applied in surveying, navigation, and road-network navigation, which addresses the problems that robot operation is easily affected by external interference, that the environment description is incomplete, and that actual dimensions are not clearly expressed.

Publication Date: 2012-01-11 (Status: Inactive)
SOUTHEAST UNIV
Cites: 7 · Cited by: 49

AI Technical Summary

Problems solved by technology

A metric map represents precise coordinate information about the environment, so when the environment is large it strains both the computer's storage capacity and the measurement of the environment. A topological map represents the connectivity of key points in the environment but does not clearly express actual dimensions, so its description of the environment is incomplete. A hybrid map, obtained by combining the two, represents the environment as a whole with a topological map and builds metric maps of local areas of interest to enrich the environmental information; however, during real navigation, the robot's motion between topological nodes is still easily affected by external interference.
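
As a concrete illustration of the hybrid representation discussed above, the following is a minimal sketch (not taken from the patent) of a topological graph whose nodes are named key places, each optionally carrying a small local metric occupancy grid; all class and field names are assumptions chosen for illustration.

```python
# Minimal hybrid-map sketch: a topological graph of named places, where each
# node may carry a small local metric occupancy grid for a region of interest.
# All class and field names here are illustrative assumptions, not the patent's.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


@dataclass
class MetricPatch:
    """Local occupancy grid (row-major, 1 = occupied)."""
    origin_xy: Tuple[float, float]   # world coordinates of cell (0, 0), in metres
    resolution: float                # metres per cell
    cells: List[List[int]]


@dataclass
class PlaceNode:
    """Topological node: a named key point, optionally enriched with metric detail."""
    name: str
    neighbors: List[str] = field(default_factory=list)
    local_map: Optional[MetricPatch] = None


class HybridMap:
    def __init__(self) -> None:
        self.nodes: Dict[str, PlaceNode] = {}

    def add_place(self, name: str, local_map: Optional[MetricPatch] = None) -> None:
        self.nodes[name] = PlaceNode(name=name, local_map=local_map)

    def connect(self, a: str, b: str) -> None:
        # Topological edges record only connectivity, not precise distances.
        self.nodes[a].neighbors.append(b)
        self.nodes[b].neighbors.append(a)


if __name__ == "__main__":
    m = HybridMap()
    m.add_place("hallway")
    m.add_place("kitchen", MetricPatch(origin_xy=(0.0, 0.0), resolution=0.05,
                                       cells=[[0, 0], [1, 0]]))
    m.connect("hallway", "kitchen")
    print(m.nodes["kitchen"].local_map.resolution)  # 0.05
```

This keeps the global description lightweight (connectivity only) while paying the storage cost of metric detail only where it is needed, which is the trade-off the paragraph above describes.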


Detailed Description of the Embodiments

[0065] This work was supported by the National Natural Science Foundation of China (Youth Fund) (60804063); the Natural Science Foundation of Jiangsu Province (BK2010403); the Open Fund of the Key Laboratory of Image Information Processing and Intelligent Control, Ministry of Education (200902); the Southeast University Excellent Young Teachers Teaching and Research Support Program (3208001203); and the Southeast University Innovation Fund (3208000501).

[0066] The present invention is further described below in conjunction with the accompanying drawings and specific embodiments. It should be understood that these embodiments are intended only to illustrate the present invention and not to limit its scope; after reading the present invention, modifications by those skilled in the art to various equivalent forms of the present invention all fall within the scope defined by the appended claims of this application.

...



Abstract

The invention discloses a vision navigation method for a mobile robot based on a hand-drawn outline semantic map. The method comprises the following steps: drawing the hand-drawn outline semantic map; selecting a corresponding sub-database; designing and identifying labels; performing object segmentation; matching images included in the sub-database with the segmented regions; performing coarse positioning of the robot; and navigating the robot. Unified labels are attached to possible reference objects in a complex environment; the robot's monocular camera is used as the main sensor for guiding its operation according to the hand-drawn outline semantic map, sonar assists the robot in obstacle avoidance, and odometer information is further fused for coarse positioning; the navigation task is finally completed through the mutual coordination of these components. With the method disclosed by the invention, the robot can navigate smoothly without a precise environment map or a precise operating path, and can effectively avoid dynamic obstacles in real time.
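
To make the coarse-positioning step above more concrete, the following is a minimal runnable sketch of fusing odometer dead reckoning with a position implied by a matched reference object; the weighting scheme, function names, and numbers are assumptions for illustration and are not specified by the patent.

```python
# Illustrative sketch only: one simple way to fuse odometer dead reckoning with
# a landmark-based fix for coarse positioning. The blending weight and all
# numbers below are assumptions, not values from the patent.
import math


def dead_reckon(x, y, theta, distance, delta_theta):
    """Advance the odometry pose by a travelled distance and heading change."""
    theta += delta_theta
    x += distance * math.cos(theta)
    y += distance * math.sin(theta)
    return x, y, theta


def fuse_coarse_position(odom_xy, landmark_xy, landmark_weight=0.7):
    """Weighted blend of the odometry position and a landmark-derived position.

    A higher landmark_weight trusts the visual landmark match more, which
    counteracts the unbounded drift of pure dead reckoning.
    """
    ox, oy = odom_xy
    lx, ly = landmark_xy
    return (landmark_weight * lx + (1 - landmark_weight) * ox,
            landmark_weight * ly + (1 - landmark_weight) * oy)


if __name__ == "__main__":
    # Drifted odometry; a matched reference object implies the robot is near (2.00, 0.20).
    x, y, theta = dead_reckon(0.0, 0.0, 0.0, distance=2.12, delta_theta=0.17)
    coarse = fuse_coarse_position((x, y), (2.00, 0.20))
    print(f"odometry: ({x:.2f}, {y:.2f})  coarse fix: ({coarse[0]:.2f}, {coarse[1]:.2f})")
```

In a full system this blend would typically be replaced by a probabilistic filter, but even a simple weighting of this kind bounds the drift of pure dead reckoning between landmark sightings.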

Description

Technical Field

[0001] The invention belongs to the technical field of intelligent robot navigation, and in particular relates to a mobile robot visual navigation method based on a hand-drawn outline semantic map.

Background Technique

[0002] As more and more home robots enter human households, a robot facing a new environment, that is, a home environment with personalized and differentiated interior decoration, starts with a blank "mind"; in other words, the robot faces an unknown home environment. Because the placement of indoor items is not fixed and people move about indoors at random, the environment is also dynamic. For non-experts, that is, ordinary family members (novice robot users) who want to operate the robot easily so that it learns the surrounding environment, the traditional approach relies on drawing an accurate map to guide the robot's navigation. Due to the inherent limitations of the robot's positioning accuracy, with the increase in the complexity of the en...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G01C21/00G01C21/32
Inventor 李新德金晓彬张秀龙吴雪建
Owner SOUTHEAST UNIV