
Mobile robot visual positioning and navigating method based on ArUco code

A mobile-robot visual positioning technology, applied in surveying, mapping, and navigation and in navigation calculation tools. It addresses problems such as the inability to follow curved paths, high environmental requirements, and inflexibility, overcoming the strong dependence on auxiliary physical infrastructure, reducing environmental requirements, and allowing convenient, flexible marker layout.

Inactive Publication Date: 2019-02-22
SHANDONG UNIV OF SCI & TECH
6 Cites · 29 Cited by

AI Technical Summary

Problems solved by technology

Relative displacement can be measured with inertial navigation elements, but their measurements drift, so they must be fused with other sensors to achieve adequate accuracy, and the fusion algorithms are complex.
LiDAR can perform precise ranging and positioning to complete navigation tasks, but reflectors must be arranged to provide navigation information, and LiDAR is expensive.
Electromagnetic and line-tracking methods can meet the accuracy requirements, but they require tracks or road markers to be laid out; when the robot's task changes, the path must be re-laid, giving poor flexibility and high cost.
[0004] In addition, there are visual navigation methods based on ribbons, barcodes, and two-dimensional (QR) codes. Such a method typically uses colored ribbons laid on the ground as a guide and uses the location information encoded in the QR codes for positioning. However, ribbons and QR codes on the ground are easily stained, and when the task path changes they must be laid and pasted again, giving poor flexibility and high environmental requirements.
Moreover, these methods often require the camera's optical axis to be perpendicular to the plane in which the QR codes are laid, and the camera to be very close to that plane; otherwise the robot cannot recognize the codes reliably, which is a serious limitation.
Patent CN106969766A discloses an indoor autonomous navigation method based on monocular vision and two-dimensional-code road signs; paths can be rearranged when the task changes, but the method is still not flexible enough.
Its shortcomings are that the QR codes must be arranged in advance according to the robot's driving path, their layout must strictly follow certain rules (generally a grid), and the relative pose between the codes must be measured accurately. The robot's route is constrained by the code layout and the navigation information the codes carry, so its motion pattern is limited and it cannot follow complex curved paths.
Furthermore, because this method still lays the QR codes on the ground, it cannot obtain the robot's three-dimensional coordinates: it is limited to planning planar paths and cannot plan three-dimensional paths for robots, such as drones, operating in space, so its application scenarios are limited.



Embodiment Construction

[0048] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0049] Figure 1 is a functional schematic diagram of using ArUco codes to realize robot visual positioning and navigation according to the present invention. In this embodiment, a distribution map of the ArUco codes in the environment and the navigation path of the robot are created in advance. A monocular camera observes the randomly distributed ArUco codes in the room, and the three-dimensional coordinates and attitude of the robot in the world coordinate system are obtained in real time. From the robot's current coordinate point and the direction vector of the navigation path, the yaw angle α of the robot during travel is determined. The robot adjusts its direction of motion in real time according to the yaw angle α, so as to reach the target position along the preset navigation path.
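The yaw-angle step described above can be sketched as follows. This is a minimal illustration, not the patent's exact algorithm: the function name `yaw_angle` and the representation of the path segment by two waypoints are assumptions for the sketch, and the path direction vector is taken from the active segment of the preset route.

```python
import math

def yaw_angle(robot_heading, path_start, path_end):
    """Signed yaw angle alpha (radians) between the robot's current heading
    and the direction vector of the active navigation-path segment."""
    dx = path_end[0] - path_start[0]
    dy = path_end[1] - path_start[1]
    path_heading = math.atan2(dy, dx)
    alpha = path_heading - robot_heading
    # wrap into [-pi, pi) so the robot always turns the short way
    return (alpha + math.pi) % (2 * math.pi) - math.pi
```

A simple controller could feed α into a steering loop, e.g. turning left while α > 0 and right while α < 0, re-evaluating it every time a new pose estimate arrives.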


Abstract

The invention discloses a mobile robot visual positioning and navigation method based on ArUco codes, belonging to the fields of computer visual positioning and mobile robot navigation. The method comprises the following steps: map creation, route creation, and positioning and navigation. The method not only overcomes the defects of traditional visual navigation, such as strong dependence on auxiliary physical conditions, high environmental requirements, and inflexibility, but also extends traditional visual navigation, previously suitable only for ground mobile robots, to three-dimensional space, making it well suited to positioning and navigating aerial unmanned vehicles. The method offers high precision, high flexibility, and a wide application range, and can be used for autonomous positioning and navigation of intelligent mobile equipment such as mobile robots and unmanned aerial vehicles in indoor environments, thereby widening the application scenarios.
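The positioning step, recovering the robot's world pose from one observed marker, amounts to composing rigid-body transforms: the marker's known pose from the pre-built ArUco map with the inverse of the camera-to-marker pose estimated from the image. The sketch below, using 4x4 homogeneous matrices in NumPy, is an assumed formulation for illustration only; the marker detection and pose estimation themselves (e.g. with OpenCV's ArUco module) are omitted, and the frame names are hypothetical.

```python
import numpy as np

def invert(T):
    """Invert a 4x4 rigid-body transform using R^T, without a general inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def robot_world_pose(T_world_marker, T_camera_marker, T_robot_camera):
    """World pose of the robot from a single observed marker.

    T_world_marker  : marker pose from the pre-built ArUco map
    T_camera_marker : pose of the marker in the camera frame (from the image)
    T_robot_camera  : fixed camera mounting on the robot (extrinsic calibration)
    """
    T_world_camera = T_world_marker @ invert(T_camera_marker)
    return T_world_camera @ invert(T_robot_camera)
```

For example, a marker at world position (1, 0, 0) seen 2 m straight ahead of a camera mounted at the robot origin places the robot at world position (1, 0, -2), assuming aligned axes throughout.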

Description

Technical Field

[0001] The invention belongs to the fields of computer vision positioning and mobile robot navigation, and in particular relates to a mobile robot visual positioning and navigation method based on ArUco codes.

Background Technique

[0002] At present, driven by related technologies such as computers and sensors, mobile robot technology has developed rapidly and has been widely applied in the medical, military, industrial, and space fields. The primary task of a mobile robot is to achieve flexible autonomous positioning and navigation based on its own sensors. However, sensors are limited by their operating environments; to complete autonomous positioning and navigation tasks, different solutions must be applied in different environments.

[0003] Because the application environments of mobile robots differ, some environments, such as factory workshops and indoor spaces, demand high positioning and navigation accuracy, and the positioning accuracy us...


Application Information

IPC(8): G01C21/20
CPC: G01C21/206
Inventor 张治国李旭杰李宝祥王海霞卢晓盛春阳崔玮李玉霞
Owner SHANDONG UNIV OF SCI & TECH