Method for adaptively correcting visual homing of a mobile robot based on landmarks

A mobile-robot and adaptive-correction technology, applicable to instruments, motor vehicles, navigation computing tools, etc., achieving the effects of improving autonomous navigation capability, simplifying the homing route, and reducing the weight given to unreliable landmarks.

Pending Publication Date: 2021-08-10
DALIAN MARITIME UNIVERSITY

AI Technical Summary

Problems solved by technology

[0003] Regarding the problems in existing visual homing technology: most visual homing algorithms calculate the homing vector from landmarks extracted from two panoramic images and then determine the homing direction from it. Although visual sensors can capture a large amount of rich environmental information, extracting stable and accurate navigation-related information from it remains an open problem.
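As an illustration of the thresholded matching step that such algorithms rely on, the following sketch applies a nearest-neighbour ratio test to descriptor sets from the two panoramic images. The toy descriptor arrays, the function name, and the threshold value r = 0.7 are all illustrative assumptions, not the patent's exact SURF pipeline:

```python
import numpy as np

def ratio_test_matches(des_home, des_curr, r=0.7):
    """Match descriptors by nearest neighbour, keeping only matches
    whose best distance is below r times the second-best distance.
    Lowering r keeps fewer but more reliable landmark pairs."""
    matches = []
    for i, d in enumerate(des_home):
        dists = np.linalg.norm(des_curr - d, axis=1)
        order = np.argsort(dists)
        best, second = dists[order[0]], dists[order[1]]
        if best < r * second:
            matches.append((i, int(order[0]), float(best)))
    return matches

# toy descriptors: home row 0 matches current row 1, home row 1 matches current row 2
des_home = np.array([[1.0, 0.0], [0.0, 5.0]])
des_curr = np.array([[3.0, 3.0], [1.1, 0.0], [0.0, 4.9]])
print(ratio_test_matches(des_home, des_curr, r=0.7))  # two reliable pairs survive
```

Varying r, as the abstract describes, trades off the number of landmarks against their accuracy: a looser threshold admits more matches, including mismatches that the weighting scheme then has to down-weight.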



Examples


Embodiment Construction

[0047] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0048] It should be noted that the terms "first" and "second" in the description, claims, and drawings of the present invention are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate circumstances ...



Abstract

The invention provides a method for adaptively correcting the visual homing of a mobile robot based on landmarks. The method comprises the following steps: acquire a panoramic image L_H of the target position and a panoramic image L_C of the current position; based on these two images, vary the matching threshold r of the SURF image-matching algorithm to obtain landmarks with different accuracy rates and adaptively assign each landmark a weight w_i; based on the distribution of the landmarks across the image, adaptively assign each landmark a weight η_i; combine the two into a final landmark weight φ_i = w_i · η_i; then, using the landmarks with their final weights, calculate the homing vector h and control the robot's movement to complete homing. According to the two constraint conditions used to extract landmarks, the method adaptively assigns different weights to landmarks of different accuracy, improving the precision of the landmark set as a whole without eliminating mismatched landmarks. A more accurate homing vector is therefore calculated, which improves both the homing precision and the autonomous navigation capability of the robot.
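The weighting scheme described above can be sketched as follows. The abstract does not give the exact formulas for w_i and η_i, so this sketch assumes a simple accuracy weight (lower descriptor matching distance gives higher w_i) and a simple distribution weight (bearings farther from the mean bearing give higher η_i), and uses the classic average-landmark-vector rule as a stand-in for the patent's homing-vector computation; all function and variable names are hypothetical:

```python
import numpy as np

def weighted_homing_vector(theta_home, theta_curr, match_dist):
    """Sketch of the weighted homing-vector step (assumed formulas).
    theta_home / theta_curr: landmark bearings (radians) in the target
    and current panoramic images; match_dist: descriptor distances
    from feature matching."""
    th, tc = np.asarray(theta_home), np.asarray(theta_curr)
    d = np.asarray(match_dist, dtype=float)
    w = 1.0 / (1.0 + d)                          # accuracy weight w_i
    spread = np.abs(tc - tc.mean())
    eta = 1.0 + spread / (spread.max() + 1e-9)   # distribution weight eta_i
    phi = w * eta                                # final weight phi_i = w_i * eta_i
    unit = lambda t: np.stack([np.cos(t), np.sin(t)], axis=1)
    # weighted average-landmark-vector: each landmark contributes the
    # shift of its unit bearing vector between current and home views
    return (phi[:, None] * (unit(tc) - unit(th))).sum(axis=0) / phi.sum()

# two identical views: the homing vector vanishes
print(weighted_homing_vector([0.0, np.pi / 2], [0.0, np.pi / 2], [0.0, 0.0]))
```

The point of the φ_i = w_i · η_i combination is that a mismatched landmark need not be discarded outright; a low accuracy weight already suppresses its contribution to h.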

Description

Technical Field

[0001] The present invention relates to the technical field of visual homing of mobile robots, and in particular to a method for adaptively correcting the visual homing of a mobile robot based on landmarks.

Background

[0002] In current robotics research, autonomous navigation for mobile robots is an important research hotspot. Inspired by biology, visual homing, which developed from the homing behaviour of insects, can serve as an effective technique for steering a mobile robot to a desired target because it obtains rich and complete environmental information. Compared with SLAM technology, which requires the mobile robot to localize itself and build a map at the same time, visual homing only needs the panoramic images of the current position and the target position as input; it calculates the homing vector, determines the homing direction, completes the homing task, and thereby dispenses with complex positioning and ...
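One reason panoramic images suit this task is that a landmark's horizontal pixel position maps directly to its bearing from the robot, which is what a homing-vector computation consumes. A minimal sketch of that standard relation (not specific to this patent; the function name is illustrative):

```python
import numpy as np

def pixel_to_bearing(x, image_width):
    """In a full 360-degree panoramic image of width image_width pixels,
    a landmark at horizontal pixel position x lies at bearing
    2*pi*x/image_width radians from the robot's reference direction."""
    return 2.0 * np.pi * np.asarray(x, dtype=float) / image_width

print(pixel_to_bearing([0, 250, 500], 1000))  # bearings 0, pi/2 and pi radians
```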

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D 1/02; G01C 21/20; G01C 11/04
CPC: G05D 1/0246; G01C 21/20; G01C 11/04; G05D 2201/0217
Inventors: 纪勋, 孙国松, 余明裕
Owner: DALIAN MARITIME UNIVERSITY