
Robot man-machine interaction method

A human-computer interaction technology for robots, applied in the fields of language interaction, radar obstacle avoidance, deep learning, and active robot visual perception. It addresses the lack of research and progress in knowledge question answering for navigation, with the effect of improving robot intelligence and navigation accuracy.

Active Publication Date: 2021-06-01
TSINGHUA UNIV
Cites: 9 · Cited by: 0

AI Technical Summary

Problems solved by technology

Although great achievements have been made in computer vision and navigation, there has been little progress in fusing vision and LiDAR information for map-free navigation, and few studies use such fused information for both navigation and knowledge question answering.

Embodiment Construction

[0017] The robot human-computer interaction method proposed by the present invention includes:

[0018] Capture an RGB image and a depth map of the environment, and detect obstacle information to obtain a lidar array; normalize the acquired data; construct a question encoding network for human-computer interaction to encode the question; construct an image feature extraction network to extract the RGB image and depth image information into a feature matrix; concatenate the lidar data, the question code and the feature matrix to obtain a feature fusion matrix; apply a convolutional network to obtain the data fusion matrix of the surrounding environment; and train a recurrent neural network as the navigator, which takes the data fusion matrix as input and outputs one of the actions "forward", "left", "right" or "stop" to control the direction of the robot's movement.
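
The patent publishes no source code; purely as an illustration of the pipeline in paragraph [0018], here is a minimal PyTorch-style sketch. Every module name, layer size and architectural choice below (GRUs for the question encoder and the navigator, a linear layer standing in for the fusion convolutional network) is an assumption made for this example, not a detail disclosed in the patent.

# Hypothetical sketch of the [0018] pipeline. All names, dimensions and
# layer choices are assumptions; the patent discloses no implementation.
import torch
import torch.nn as nn

class QuestionEncoder(nn.Module):
    """Encodes a tokenized question into a fixed-length question code."""
    def __init__(self, vocab_size=10000, embed_dim=128, q_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, q_dim, batch_first=True)

    def forward(self, tokens):                    # tokens: (B, T) int64
        _, h = self.gru(self.embed(tokens))       # h: (1, B, q_dim)
        return h.squeeze(0)                       # (B, q_dim)

class RGBDFeatureExtractor(nn.Module):
    """Extracts a feature vector from a stacked 4-channel RGB + depth image."""
    def __init__(self, img_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(4, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, img_dim)

    def forward(self, rgbd):                      # rgbd: (B, 4, H, W), normalized
        return self.fc(self.conv(rgbd).flatten(1))  # (B, img_dim)

class Navigator(nn.Module):
    """Fuses lidar, question code and image features; a GRU picks the action."""
    def __init__(self, lidar_dim=360, q_dim=256, img_dim=256, hidden=512):
        super().__init__()
        # Linear stand-in for the patent's fusion convolutional network.
        self.fuse = nn.Sequential(
            nn.Linear(lidar_dim + q_dim + img_dim, hidden), nn.ReLU())
        self.rnn = nn.GRUCell(hidden, hidden)
        self.head = nn.Linear(hidden, 4)          # forward, left, right, stop

    def forward(self, lidar, q_code, img_feat, h):
        fused = self.fuse(torch.cat([lidar, q_code, img_feat], dim=-1))
        h = self.rnn(fused, h)                    # recurrent navigator state
        return self.head(h), h                    # action logits, next state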

[0019] An embodiment of the method of the invention is introduced below...

Abstract

The invention belongs to the technical fields of robot active visual perception, language interaction, radar obstacle avoidance and deep learning, and particularly relates to a robot man-machine interaction method. The method comprises the steps of: shooting an RGB image and a depth map of the environment and detecting obstacle information to obtain a laser radar array; normalizing the acquired data; constructing a question encoding network for man-machine interaction to encode the question; constructing an image feature extraction network that extracts the RGB image and depth image information into a feature matrix; concatenating the laser radar data, the question code and the feature matrix to obtain a feature fusion matrix; applying a convolutional network to obtain the data fusion matrix of the surrounding environment; and training a recurrent neural network as a navigator that takes the data fusion matrix as input, outputs a navigation result, and controls the motion direction of the robot. The method realizes self-navigation, self-exploration, man-machine interaction and other functions of the robot, and improves the robot's intelligence.
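
As a concrete, equally hypothetical usage example, the loop below drives the Navigator sketch shown earlier in a closed loop, mapping its output to the four motion commands named above. The read_sensors() and send_motion_command() functions are placeholders invented for this example, not a real robot API.

# Hypothetical closed-loop use of the Navigator sketch above; the sensor
# and actuator hooks are invented placeholders.
import torch

ACTIONS = ["forward", "left", "right", "stop"]

def read_sensors():
    """Placeholder: return normalized lidar scan, question code, image features."""
    return torch.rand(1, 360), torch.rand(1, 256), torch.rand(1, 256)

def send_motion_command(action):
    """Placeholder: pass the chosen action to the robot's motion controller."""
    print("command:", action)

navigator = Navigator()                           # the class sketched earlier
h = torch.zeros(1, 512)                           # initial recurrent state
for _ in range(200):                              # bound the episode length
    lidar, q_code, img_feat = read_sensors()
    with torch.no_grad():
        logits, h = navigator(lidar, q_code, img_feat, h)
    action = ACTIONS[logits.argmax(dim=-1).item()]
    send_motion_command(action)
    if action == "stop":                          # the navigator ends the run
        break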

Description

Technical field

[0001] The invention belongs to the technical fields of robot active visual perception, language interaction, radar obstacle avoidance and deep learning, and in particular relates to a robot human-computer interaction method.

Background technique

[0002] Today, autonomous robots can operate independently to complete specific tasks without human intervention. Autonomous locomotion, a major attribute of autonomous robots, depends primarily on accurate motion estimation and high-level environmental perception. However, in some cases artificial landmarks are unknown, or the robot is in a GPS-denied environment, so that ego-motion estimation or the acquisition of scene information becomes very difficult. Technically, a mobile robot gradually constructs a map consistent with the overall environment by sensing that environment, and at the same time uses the map for self-localization. For a long time, robot navigation problems have been basically solved...

Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J 9/16; G01C 21/00; G01S 17/86; G01S 17/89
CPC: B25J 9/1664; B25J 9/1694; B25J 9/1697; B25J 9/1689; G01S 17/89; G01S 17/86; G01C 21/005
Inventors: 刘华平, 陆升阳, 张新钰, 袁小虎, 赵怀林
Owner: TSINGHUA UNIV