Autonomous robot navigation method and system based on multi-angle visual perception

A technology of autonomous robots and visual perception, applied in the field of robot navigation, which can solve problems such as redundant image input information, increased difficulty of model training, and over-reliance caused by directly fusing all camera features, and achieve the effect of improving path planning and obstacle avoidance capabilities.

Active Publication Date: 2022-05-27
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

The inventors found that current methods that use neural networks to construct a direct mapping from multiple camera images and route maps to behavior have two problems: increasing the number of image inputs may cause information redundancy and increase the difficulty of model training, while directly integrating the features of all cameras can lead to over-dependence.

Method used

Figure 1 is a flow chart of the autonomous robot navigation method based on multi-angle visual perception provided by the present invention; figure 2 is a structural diagram of the multi-task network.


Examples


Embodiment 1

[0037] Referring to figure 1, the autonomous robot navigation method based on multi-angle visual perception of this embodiment includes:

[0038] Step 1: Acquire the image of the robot's forward direction and the images on its left and right sides in real time and input them to the multi-task network.

[0039] In the specific implementation process, cameras mounted at the front of the robot and on both of its sides may be used to acquire the corresponding visual information.
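As an illustration only, here is a minimal Python sketch of such three-view acquisition using OpenCV; the camera device indices, the frame size, and the stacking layout are assumptions made for the example, not details given in the patent.

```python
import cv2
import numpy as np

# Hypothetical device indices for the front, left, and right cameras.
CAMERA_IDS = {"front": 0, "left": 1, "right": 2}
FRAME_SIZE = (320, 240)  # assumed network input size, not specified by the patent

# Keep the captures open so frames can be grabbed in real time.
captures = {name: cv2.VideoCapture(dev) for name, dev in CAMERA_IDS.items()}

def grab_three_views():
    """Capture one frame from each camera and stack them for the network."""
    frames = []
    for name in ("front", "left", "right"):
        ok, frame = captures[name].read()
        if not ok:
            raise RuntimeError(f"camera '{name}' returned no frame")
        frames.append(cv2.resize(frame, FRAME_SIZE))
    return np.stack(frames)  # shape (3, H, W, C): one image per view
```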

[0040] Step 2: Predict the robot's free drivable area, intersection position, and intersection steering through the multi-task network.

[0041] Referring to figure 2, it can be seen that the multi-task network of this embodiment includes an encoder, a bottom point detection network, and a corner and intersection inference network; the encoder is used to extract features from the image of the robot's forward direction and the images on the left and right sides...
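The patent text specifies only that a shared encoder feeds a bottom point detection network and a corner and intersection inference network. A minimal PyTorch sketch of that shared-encoder, two-head layout follows; the ResNet-18 backbone, the per-column bottom-point regression, the 40-column resolution, and the three-way turn classification are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class MultiTaskNav(nn.Module):
    """Shared encoder with two task heads, as described in this embodiment."""

    def __init__(self, num_turn_classes=3):
        super().__init__()
        # Shared encoder applied to each view (backbone choice is assumed).
        backbone = resnet18(weights=None)
        self.encoder = nn.Sequential(*list(backbone.children())[:-2])  # -> (B, 512, h, w)
        # Head 1: bottom point detection, modeled here as a per-column
        # regression of the free-drivable-area boundary (one row per column).
        self.bottom_point_head = nn.Sequential(
            nn.Conv2d(512, 64, 1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 40)),   # 40 columns, assumed resolution
            nn.Flatten(),
            nn.Linear(64 * 40, 40), nn.Sigmoid(),  # normalized row per column
        )
        # Head 2: corner and intersection inference (e.g. none/left/right).
        self.intersection_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(512, num_turn_classes),
        )

    def forward(self, x):
        # x: (B, 3, H, W); the three views share the same encoder weights.
        feat = self.encoder(x)
        return self.bottom_point_head(feat), self.intersection_head(feat)
```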

Embodiment 2

[0095] This embodiment provides an autonomous robot navigation system based on multi-angle visual perception, which includes:

[0096] (1) An image acquisition module, which is used to acquire the image of the robot's forward direction and the images of its left and right sides in real time and input them to the multi-task network.

[0097] In the specific implementation process, cameras mounted at the front of the robot and on both of its sides may be used to acquire the corresponding visual information.

[0098] (2) A navigation prediction module, which is used to predict the robot's free drivable area, intersection position, and intersection steering through the multi-task network.

[0099] Referring to figure 2, it can be seen that the multi-task network of this embodiment includes an encoder, a bottom point detection network, and a corner and intersection inference network; the encoder is used to extract features from the image of the robot's forward direction and the images on the left and right sides...
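Purely to illustrate how the modules of this embodiment might be composed, the sketch below wires an image acquisition module and a navigation prediction module into one system object; the class and method names are hypothetical, and grab_three_views and MultiTaskNav refer to the earlier sketches.

```python
import torch

class NavigationSystem:
    """Illustrative composition of the modules in this embodiment."""

    def __init__(self, model: "MultiTaskNav"):
        self.model = model.eval()

    @torch.no_grad()
    def step(self):
        # (1) Image acquisition module: front, left, and right views.
        views = grab_three_views()                        # (3, H, W, C) uint8
        batch = torch.from_numpy(views).permute(0, 3, 1, 2).float() / 255.0
        # (2) Navigation prediction module: run the shared network on all views.
        bottom_points, turn_logits = self.model(batch)
        return bottom_points, turn_logits
```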

Embodiment 3

[0135] This embodiment provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it implements the steps of the above-described autonomous robot navigation method based on multi-angle visual perception.



Abstract

The invention belongs to the field of robot navigation and provides an autonomous robot navigation method and system based on multi-angle visual perception. The method includes: acquiring, in real time, the image of the robot's forward direction and the images of its left and right sides and inputting them to a multi-task network; predicting the robot's free drivable area, intersection position, and intersection steering with the multi-task network; using the free drivable area to generate local navigation indicators, and the intersection position and intersection steering to generate global navigation indicators; generating combined indicators from the local and global navigation indicators; and then combining these with the steering commands mapped in the pre-built map to obtain the robot control instructions.
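As a hedged sketch of the indicator-combination step summarized above: the function below derives a local steering indicator from the drivable-area boundary, a global one from the intersection prediction, blends the two into a combined indicator, and then mixes in the steering command mapped from the pre-built map. The weighting scheme and every numeric constant are assumptions, not values from the patent.

```python
import numpy as np

def combine_indicators(bottom_points, turn_probs, map_steering, w_local=0.6):
    """Blend local and global navigation indicators into one steering value.

    bottom_points: per-column drivable-area boundary in [0, 1] (1 = far away).
    turn_probs:    probabilities over (straight, left, right) at intersections.
    map_steering:  steering command mapped from the pre-built map, in [-1, 1].
    Returns a steering value in [-1, 1]; all blend weights are illustrative.
    """
    cols = len(bottom_points)
    # Local indicator: steer toward the column with the most free space.
    local = (np.argmax(bottom_points) / (cols - 1)) * 2.0 - 1.0
    # Global indicator: expected turn direction (-1 left, 0 straight, +1 right).
    global_ind = float(turn_probs[2] - turn_probs[1])
    combined = w_local * local + (1.0 - w_local) * global_ind
    # Combine with the steering command from the pre-built map.
    return float(np.clip(0.5 * combined + 0.5 * map_steering, -1.0, 1.0))
```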

Description

Technical field

[0001] The invention belongs to the field of robot navigation, and in particular relates to an autonomous robot navigation method and system based on multi-angle visual perception.

Background technique

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] For autonomous robot navigation, it is still challenging to design a navigation system that integrates goal-directed navigation with obstacle avoidance in an unstructured environment, such as a campus or a street crowded with pedestrians and cars. This requires robots to be able to handle different scenarios based on sufficient perception of the surrounding environment.

[0004] With deep learning showing state-of-the-art performance in various vision tasks, and given the low cost of RGB cameras, vision-based navigation methods have received extensive attention. There are two paradigms for vision-b...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16; G06N3/04; G06N3/08; G06Q10/04; G06T7/292
CPC: B25J9/16; B25J9/1697; B25J9/1666; B25J9/1664; G06N3/08; G06Q10/047; G06T7/292; G06N3/045
Inventor: 张伟, 陈伟, 朱坤岩, 宋然, 李贻斌
Owner: SHANDONG UNIV