
Vision-based AGV patrolling navigation and positioning method

A vision-based positioning and navigation method, applied in surveying and navigation, two-dimensional position/course control, photo interpretation, etc. It addresses problems such as high maintenance costs, accumulated odometer deviation, and navigation failure, and achieves the effects of reduced retrofitting and maintenance costs and stable positioning and navigation.

Inactive Publication Date: 2017-09-01
NANJIANG ROBOT
Cites: 8 · Cited by: 23
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0002] Traditional line-following navigation methods, such as those based on magnetic guidance or electromagnetic induction, require laying magnetic strips or guide wires, so the existing environment must be modified and maintenance costs are high.
In addition, in laser-based trackless navigation, the laser sensor is usually very expensive, which prevents wide adoption, and positioning is easily lost when the surrounding environment changes over a large area, causing navigation to fail.
[0003] The traditional positioning method relies on the odometer. As the travel distance of the mobile device increases, sideslip of the device, slipping of the casters, and other factors cause the odometer deviation to accumulate gradually, so the fed-back pose of the mobile device becomes inaccurate.
Due to technical and cost constraints, there is currently no low-cost, stable solution for robot trajectory-line navigation and positioning.



Examples


Embodiment 1

[0035] A vision-based AGV line tracking navigation and positioning method, comprising steps:

[0036] Step 101: Obtain an image ahead of the current course;

[0037] In this embodiment, the image is acquired by a camera fixed on the AGV. The optical axis of the camera is parallel to the longitudinal axis of the AGV, and the direction of the longitudinal axis is consistent with the AGV's heading. The camera is calibrated for intrinsic and extrinsic parameters, including the focal length, distortion coefficients, camera height, pitch angle, and the relative position between the camera and the center of the AGV body, so as to obtain the conversion relationship between the image and the global coordinate system, the conversion relationship between the camera coordinates and the AGV body coordinates, and so on. The center of the AGV body is used as the feature point for describing the trajectory motion of the AGV.
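As a rough illustration of the camera-to-body conversion mentioned above, the sketch below back-projects a pixel of the navigation line onto the ground plane and expresses it in AGV body coordinates. It is not taken from the patent: the intrinsic matrix K, the camera pose (R, t), and the assumption that the body frame origin lies on the ground plane with the z-axis pointing up are illustrative assumptions.

```python
# Minimal illustrative sketch (not from the patent): back-project a pixel of
# the navigation line onto the ground plane and express it in AGV body
# coordinates, assuming calibration has produced the intrinsic matrix K and
# the camera pose (R, t) in a body frame whose origin lies on the ground
# plane with the z-axis pointing up.
import numpy as np

def pixel_to_body(u, v, K, R, t):
    """Return the (x, y) ground-plane coordinates of pixel (u, v) in the body frame.

    K : 3x3 intrinsic matrix (focal length, principal point), distortion already removed
    R : 3x3 rotation of the camera frame expressed in the body frame
    t : 3-vector position of the camera in the body frame; t[2] is the camera height
    """
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray in camera coords
    ray_body = R @ ray_cam                              # same ray in body coords
    s = -t[2] / ray_body[2]                             # scale that reaches z = 0
    p = t + s * ray_body
    return p[0], p[1]
```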

[0038] Among ...

Embodiment 2

[0054] In this embodiment, the specific process of step 102 includes:

[0055] Step 201: use a line extraction algorithm to extract line segments from the preprocessed image;

[0056] Each line segment has the following attributes: a start point, an end point, and a length. Both the start point and the end point lie within the image area; if a segment extends beyond the image, the intersection of the segment with the image boundary is used as its end point.
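One possible in-code representation of these attributes is sketched below; the class name and fields are illustrative and not defined by the patent.

```python
# Illustrative data structure for an extracted line segment (not from the patent).
from dataclasses import dataclass
import math

@dataclass
class LineSegment:
    x1: float  # start point (clipped to the image area)
    y1: float
    x2: float  # end point, or the intersection with the image boundary
    y2: float

    @property
    def length(self) -> float:
        return math.hypot(self.x2 - self.x1, self.y2 - self.y1)
```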

[0057] Step 202: select two line segments as a parent line and a child line respectively, and use a set threshold to determine whether the parent line and the child line can be fused; if the deviation is less than the set threshold, the two segments are judged to lie on the same straight line and can be fused;

[0058] Specifically, a line segment is selected as the parent straight line, and the ...
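Because paragraph [0058] is truncated, the exact fusion criterion is not reproduced here. The sketch below assumes one common choice, comparing the perpendicular distance from the child segment's endpoints to the parent line against the threshold, purely for illustration.

```python
# Illustrative fusion test (assumption: the criterion is the perpendicular
# distance of the child segment's endpoints to the parent line).
import math

def can_fuse(parent, child, threshold):
    """parent, child: ((x1, y1), (x2, y2)) endpoint pairs; returns True if fusable."""
    (px1, py1), (px2, py2) = parent
    # Parent line in Ax + By + C = 0 form.
    A, B = py2 - py1, px1 - px2
    C = px2 * py1 - px1 * py2
    norm = math.hypot(A, B)

    def dist(point):
        x, y = point
        return abs(A * x + B * y + C) / norm

    # Fuse only if both endpoints of the child lie close enough to the parent line.
    return max(dist(child[0]), dist(child[1])) < threshold
```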

Embodiment 3

[0063] In this embodiment, the LSD line segment extraction algorithm is preferably selected as the line extraction algorithm described in step 102 / step 201.

[0064] The LSD line extraction algorithm includes the following sub-steps:

[0065] Step S301: compute the gradient of each pixel, and define the level-line direction as the direction perpendicular to the gradient;

[0066] Step S302: use a region-growing algorithm to divide the image into several connected regions, such that the maximum difference in level-line direction between any two pixels within a connected region is τ;

[0067] Step S303: for each connected region, find the smallest enclosing rectangle; the direction of the rectangle's major axis is the direction of the line segment, and the two endpoints of the line segment are returned. The straight-line equation Ax + By + C = 0 is used to represent each line segment obtained in step 201 and in this step, wherein A, B, and C are ...
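As a rough companion to steps S301–S303, the sketch below runs an LSD-style detector via OpenCV and converts each segment's endpoints into the Ax + By + C = 0 form mentioned above. It is not the patent's implementation, and the availability of cv2.createLineSegmentDetector depends on the OpenCV build.

```python
# Illustrative use of an LSD-style detector and the Ax + By + C = 0 representation.
import cv2
import numpy as np

def detect_segments(gray):
    lsd = cv2.createLineSegmentDetector()
    lines, _, _, _ = lsd.detect(gray)        # each row: [x1, y1, x2, y2]
    return np.empty((0, 4)) if lines is None else lines.reshape(-1, 4)

def to_line_equation(x1, y1, x2, y2):
    # Coefficients of the line through the two endpoints, normalized so A^2 + B^2 = 1.
    A, B = y2 - y1, x1 - x2
    C = x2 * y1 - x1 * y2
    n = np.hypot(A, B)
    return A / n, B / n, C / n

# Example: segments = detect_segments(cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE))
```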



Abstract

The invention discloses a vision-based AGV line-patrolling navigation and positioning method. The method comprises the steps of: acquiring an image in front of the current heading; recognizing, according to first preset features of a navigation reticule, a navigation reticule object meeting the first preset features in the image; and acquiring the relative position relationship between the AGV and the navigation reticule, which is used to correct the position and heading of the AGV. An image acquisition device replaces the magnetic guidance sensor, and colored adhesive tape is used to lay the auxiliary path, so the cost of modifying and maintaining the navigation path is greatly reduced. To compensate for the small field of view of the image acquisition device, visual features and odometer information are fused so that the real-time pose of the AGV is tracked accurately. In addition, to handle the accumulated odometer error caused by sideslip of the mobile device, slipping of the casters, and similar factors, the odometer information of the AGV is corrected by setting position correction marks with known global coordinates, so that high-precision and stable AGV positioning and navigation can be realized.

Description

Technical field
[0001] The invention relates to an AGV navigation and positioning method, in particular to a vision-based AGV line-patrolling navigation and positioning method.

Background technique
[0002] Traditional line-following navigation methods, such as those based on magnetic guidance or electromagnetic induction, require laying magnetic strips or guide wires, so the existing environment must be modified and maintenance costs are high. In addition, in laser-based trackless navigation, the laser sensor is usually very expensive, which prevents wide adoption, and positioning is easily lost when the surrounding environment changes over a large area, causing navigation to fail.
[0003] The traditional positioning method relies on the odometer. As the travel distance of the mobile device increases, sideslip of the device, slipping of the casters, and other reas...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G05D1/02; G01C11/00; G01C11/04
CPC: G05D1/0217; G01C11/00; G01C11/04
Inventors: 刘金勇, 戴舒炜
Owner: NANJIANG ROBOT