
Visual navigation method for moving robot

A technology of robot vision and navigation methods, applied in neural learning methods, instruments, computer components, etc., can solve problems such as low precision, increased cumulative error of inertial navigation, and sensitivity to light sources

Inactive Publication Date: 2021-03-30
ZHENJIANG COLLEGE

AI Technical Summary

Problems solved by technology

Magnetic strip navigation requires laying magnetic strips along the robot's moving path; its accuracy is low, and the strips, which protrude from the ground, are easily damaged. The cumulative error of inertial navigation grows over time, so additional equipment is needed to correct it, and high-precision inertial navigation devices are costly. Laser navigation requires reflectors on both sides of the moving path, demands high installation accuracy for those reflectors, is sensitive to other light sources, is difficult to operate outdoors, and is also costly.




Embodiment Construction

[0089] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0090] As shown in figure 1, the method of the invention comprises the following steps:

[0091] 1. Calibrate the cameras using Zhang's plane calibration method to obtain the internal parameter matrix, distortion coefficient matrix, essential matrix, fundamental matrix, rotation matrix and translation matrix of each camera; correct the cameras with these parameters, then collect and store video;
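
For illustration only, step 1 can be sketched with OpenCV's standard Zhang-style calibration routines for a stereo pair. This is a minimal sketch; the chessboard size, file names and flags below are assumptions, not values from the patent.

```python
# Hypothetical sketch of step 1 (Zhang's plane calibration for a stereo pair).
# Chessboard size, file patterns and flags are assumptions for illustration.
import glob
import cv2
import numpy as np

BOARD = (9, 6)  # inner-corner count of the assumed calibration chessboard
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gl, BOARD)
    ok_r, corners_r = cv2.findChessboardCorners(gr, BOARD)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(corners_l)
        right_pts.append(corners_r)

size = gl.shape[::-1]  # (width, height) of the last image pair
# Per-camera internal parameter matrix and distortion coefficients
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
# Stereo calibration yields rotation R, translation T, essential E, fundamental F
_, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
# "Correct the camera": undistort a frame before storing the video
corrected = cv2.undistort(cv2.imread("left_000.png"), K1, d1)
```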

[0092] 2. Process the video to obtain frame images, and preprocess each image with an image enhancement method based on active lighting, comprising the following steps (an illustrative sketch of steps ① and ② follows the list):

[0093] ① Use the depth of field to divide the image into foreground and background regions;

[0094] ② On the basis of the depth of field, separate the object from the background according to the gradient information of the object and the background;

[0095] ③ Select the pixels at infinity that have lo...
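
As a rough illustration of steps ① and ② only (step ③ is truncated above and is not shown), the split can be sketched as a depth threshold refined by image gradients. The helper name, threshold values and the morphological refinement below are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of steps ① and ②: depth-of-field split plus gradient refinement.
# Threshold values and the helper name are assumptions, not from the patent.
import cv2
import numpy as np

def split_foreground(gray, depth, depth_thresh=2.0, grad_thresh=40.0):
    """Return a binary foreground mask for one preprocessed frame."""
    # ① depth-of-field split: pixels closer than depth_thresh become foreground
    fg = (depth < depth_thresh).astype(np.uint8)

    # ② gradient information: strong gradients mark object/background boundaries
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    edges = (cv2.magnitude(gx, gy) > grad_thresh).astype(np.uint8)

    # close small holes in the depth-based mask, then add strong-gradient pixels
    refined = cv2.morphologyEx(fg, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
    refined[edges > 0] = 1
    return refined * 255
```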



Abstract

The invention discloses a visual navigation method for a moving robot. The method comprises the following steps: acquiring images from the left and right cameras and calibrating the cameras; processing the video to obtain frame images and preprocessing them with an image enhancement method based on active illumination; performing target identification on the image with a YOLOv3 network target detection method; performing stereo matching on the image, within the obtained bounding box where the target is located, using an SGBM algorithm optimized by least-squares fitting interpolation; and obtaining the category and position information of the target. The method can detect the target object and frame the boundary pre-selection box where the target is located, laying a foundation for subsequent stereo matching to obtain depth information and avoid obstacles. Because matching is carried out only on the pre-selected box image, the size of the stereo-matching image is reduced, the computational efficiency of stereo matching is greatly improved, and the real-time performance of target detection benefits.
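
As a hedged sketch of the ROI-limited matching idea described in the abstract, the SGBM disparity can be computed only inside the detected bounding box (widened to the left by the maximum disparity so every pixel in the box can still find its match), assuming rectified left/right frames. The box format, SGBM parameters and function name are illustrative assumptions; the least-squares fitting interpolation refinement mentioned in the abstract is not shown.

```python
# Hedged sketch: SGBM disparity restricted to the YOLOv3 bounding box.
# Box format (x, y, w, h), SGBM parameters and the function name are assumptions.
import cv2
import numpy as np

def roi_disparity(left, right, box, num_disp=64):
    """Disparity for the detected box only, on rectified left/right frames."""
    x, y, w, h = box
    x0 = max(0, x - num_disp)            # widen the crop to the left so every pixel
    left_roi = left[y:y + h, x0:x + w]   # in the box can still find its match
    right_roi = right[y:y + h, x0:x + w]

    sgbm = cv2.StereoSGBM_create(
        minDisparity=0, numDisparities=num_disp, blockSize=5,
        P1=8 * 3 * 5 ** 2, P2=32 * 3 * 5 ** 2,
        uniquenessRatio=10, speckleWindowSize=100, speckleRange=2)
    disp = sgbm.compute(left_roi, right_roi).astype(np.float32) / 16.0
    return disp[:, x - x0:]              # keep only the columns of the original box
```

Because matching runs only over the cropped box rather than the full frame, the stereo-matching image can be much smaller; depth then follows from focal length × baseline / disparity, using the calibration results, wherever the disparity is positive.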

Description

technical field [0001] The invention relates to a visual navigation method for a moving robot and belongs to the technical field of robots. Background technique [0002] Factories often use robots to carry workpieces and goods. To further realize intelligent movement of the handling robot, various sensing devices are needed so that the robot can recognize and judge the current road conditions and motion state and make its own decisions, allowing the handling robot to operate completely unmanned: it either runs along a prescribed route during work, or, given a task target, identifies the environment and plans a reasonable movement trajectory by itself to complete the operation intelligently. With the development of moving robots, their application fields are becoming more and more extensive. At present, the navigation methods of moving robots mainly include magnetic stripe navigation, inertial nav...

Claims


Application Information

IPC (8): G06T7/80; G06T7/194; G06T7/246; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/80; G06T7/194; G06T7/248; G06N3/084; G06T2207/20081; G06T2207/20084; G06T2207/30244; G06V2201/07; G06N3/045; G06F18/22; G06F18/23213; G06F18/25
Inventors: 徐沛, 黄海峰
Owner: ZHENJIANG COLLEGE