Multi-robot collaborative path following method based on deep learning

A multi-robot, deep-learning-based technology in the computer field, addressing problems such as large recognition delay, the inability to recognize scene pictures beyond a limited viewing angle, and long request response times.

Active Publication Date: 2018-12-14
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

[0003] Multi-robot path following refers to the comprehensive use of visual information from the different viewing angles of multiple robots to compensate for the limited field of view of a single-robot system and to improve the accuracy with which the system recognizes the overall road extension direction. However, building a multi-robot path following system faces two major problems: (1) the demands on the fusion method are high, because the useful information in the multi-robot perspectives must be reasonably extracted and fused together; (2) the time delay is relatively large, because data fusion involves data collection, transmission, and processing, and each link contributes its own delay.
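The delay argument in point (2) can be made concrete with a toy latency budget. The stage names and numbers below are illustrative assumptions, not measurements from the patent:

```python
def round_trip_latency_ms(capture, uplink, inference, downlink):
    # Each stage of the collect -> transmit -> process -> reply pipeline
    # adds its own delay; one recognition request experiences their sum.
    return capture + uplink + inference + downlink

# Illustrative figures (assumed, not from the source):
total = round_trip_latency_ms(capture=30, uplink=80, inference=120, downlink=80)
print(total)  # 310 ms end-to-end for one cloud-assisted recognition
```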
[0005] To sum up, although path recognition and following based on a single robot requires fewer resources and has a shorter request response time, its viewing angle is limited and it cannot recognize scene pictures beyond that angle.
In contrast, multi-robot path following combines the perspectives of multiple robots at different angles to identify the road extension direction in the environment, which can improve overall recognition accuracy, but its recognition delay is large and its request response time is long.

Method used




Embodiment Construction

[0072] Figure 1 shows the multi-robot environment constructed in the first step of the present invention, which consists of ground robot nodes, aerial robot nodes, and cloud server nodes. Ground/aerial robot nodes are robot hardware devices (such as unmanned vehicles/drones) that can perceive environmental information and receive speed commands. Cloud server nodes are resource-controllable computing devices with substantial computing power that can run computing-intensive or knowledge-intensive robotics applications.
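The three node types described in paragraph [0072] can be sketched as a minimal data model; all names and fields here are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    kind: str               # "ground", "aerial", or "cloud"
    has_camera: bool        # ground/aerial robots perceive the environment
    accepts_velocity: bool  # ground/aerial robots receive speed commands
    can_compute: bool       # cloud servers run compute-intensive applications

def build_environment():
    # One node of each type, mirroring the composition in Figure 1.
    return [
        Node("ugv-1", "ground", has_camera=True, accepts_velocity=True, can_compute=False),
        Node("uav-1", "aerial", has_camera=True, accepts_velocity=True, can_compute=False),
        Node("cloud-1", "cloud", has_camera=False, accepts_velocity=False, can_compute=True),
    ]
```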

[0073] Figure 2 shows the software deployment on the ground and aerial robot nodes and the cloud server nodes of the present invention. A robot computing node is a robot hardware device that can move in the environment and receive speed commands; it carries a camera sensor. The cloud server nodes run the Ubuntu operating system and the Caffe deep learning framework. In addition, the ground and aerial robot nodes are also equipped with a percepti...



Abstract

The invention discloses a multi-robot collaborative path following method based on deep learning, which improves the accuracy of path recognition while balancing accuracy against recognition time. According to the technical scheme, an environment composed of aerial and ground robot nodes and a cloud server node is established. The cloud server node obtains environment pictures shot by a ground robot over WiFi and recognizes and classifies the path extension direction from those pictures. For situations in which recognition fails or is likely to be wrong, it requests the aerial robot node to send back a high-altitude view and obtains a final decision through decision fusion. It then converts the final decision into a corresponding velocity vector and sends it to the aerial and ground robot nodes, so that the robot group drives collaboratively. By adopting the multi-robot collaborative path following method, the accuracy of path recognition can be increased, striking an appropriate balance between improving recognition accuracy and shortening recognition time.
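The workflow in the abstract — classify from the ground view, fall back to the aerial view when the result is unreliable, fuse the two, then emit a velocity command — can be sketched as below. The confidence threshold, fusion weights, class set, and velocity mapping are assumptions for illustration; the patent's actual classifier is a deep network (Caffe), stubbed here by precomputed class probabilities:

```python
import numpy as np

DIRECTIONS = ["left", "straight", "right"]  # assumed class set

def fuse(ground_probs, aerial_probs, w_ground=0.5):
    # Simple late fusion: weighted average of the two class posteriors.
    return w_ground * np.asarray(ground_probs) + (1 - w_ground) * np.asarray(aerial_probs)

def decide(ground_probs, request_aerial, threshold=0.7):
    # request_aerial() stands in for asking the aerial robot for a
    # high-altitude view and classifying it; here it returns class
    # probabilities directly.
    p = np.asarray(ground_probs, dtype=float)
    if p.max() < threshold:          # recognition unreliable: ask for help
        p = fuse(p, request_aerial())
    return DIRECTIONS[int(p.argmax())]

def to_velocity(direction, v=0.3, w=0.5):
    # Map the final decision to a (linear, angular) velocity command.
    return (v, {"left": w, "straight": 0.0, "right": -w}[direction])
```

For example, `decide([0.8, 0.1, 0.1], lambda: [0, 0, 1])` keeps the confident ground-view decision, while a low-confidence ground view triggers the aerial request and fusion.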

Description

technical field
[0001] The invention relates to image processing and distributed robot technology in the computer field, and in particular to a method for realizing vision-based multi-robot collaborative path following that uses cloud computing as back-end support and mixes the perspectives of multiple robots.
Background technique
[0002] The vision-based robot path following problem is to let a robot, taking only visual images as input, automatically follow the extension direction of an artificial path. There are mainly two existing vision-based robot path following methods: the first uses image segmentation technology; the second transforms the path recognition problem into an image classification problem and then solves it with deep learning. Image segmentation techniques aim to separate path and background using image saliency. Image saliency means that, in the input visual image, the features of the path part will be more prominent than t...
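The saliency idea in [0002] — the path region stands out from the background — can be illustrated with a toy segmentation. Real methods compute a proper saliency map; this sketch is only an assumed stand-in that normalizes intensities and thresholds them:

```python
import numpy as np

def saliency_segment(gray, threshold=0.6):
    # Normalize intensities to [0, 1] and keep the "salient" (bright) pixels.
    sal = (gray - gray.min()) / (np.ptp(gray) + 1e-9)
    return sal > threshold  # boolean path mask

road = np.array([[0.10, 0.90],
                 [0.20, 0.95]])
print(saliency_segment(road))  # marks the bright right-hand column as path
```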

Claims


Application Information

IPC(8): G05D1/02
CPC: G05D1/0221; G05D1/0223; G05D1/0253; G05D1/0276
Inventor: 王怀民, 丁博, 刘惠, 耿铭阳, 史佩昌, 周星, 李艺颖
Owner: NAT UNIV OF DEFENSE TECH