
A method of robot human-computer interaction

A human-computer interaction and robotics technology, applied in the fields of language interaction, radar obstacle avoidance, deep learning, and robot active visual perception. It addresses the problem that little or no progress has been made in knowledge question answering research in this setting, and achieves the effect of improving robot intelligence and navigation accuracy.

Active Publication Date: 2022-03-11
TSINGHUA UNIV
Cites: 9 · Cited by: 0

AI Technical Summary

Problems solved by technology

Although great achievements have been made in computer vision and navigation, there has been little progress in fusing vision and LiDAR information for map-free navigation, and few studies use this information for both navigation and knowledge question answering.




Embodiment Construction

[0017] The robot human-computer interaction method proposed by the present invention includes:

[0018] Capture an RGB image and a depth map of the environment, and detect obstacle information to obtain a laser radar (lidar) array; normalize the acquired data. Construct a question-encoding network for human-computer interaction to encode the question, and an image-feature-extraction network that extracts the RGB-image and depth-image information into a feature matrix. Splice the lidar data, the question code and the feature matrix to obtain a feature fusion matrix, and pass it through a convolutional network to obtain a data fusion matrix of the surrounding environment. Train a recurrent neural network as the navigator: it takes the data fusion matrix as input and outputs one of the actions "forward", "left", "right" or "stop" to control the robot's direction of movement.
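The splicing step above can be sketched as follows. This is a minimal numpy illustration, not the patent's implementation: the encoder functions are hypothetical placeholders, and all dimensions (a 36-beam lidar array, a 64-dimensional question code, a 128-dimensional image feature) are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the learned encoders described above
# (dimensions are illustrative assumptions, not taken from the patent).
def encode_question(question_tokens, dim=64):
    """Placeholder question encoder: mean of random token embeddings."""
    emb = rng.standard_normal((len(question_tokens), dim))
    return emb.mean(axis=0)

def extract_image_features(rgb, depth, dim=128):
    """Placeholder feature extractor over the stacked RGB + depth input."""
    stacked = np.concatenate([rgb, depth[..., None]], axis=-1)  # H x W x 4
    return stacked.reshape(-1)[:dim] / 255.0  # crude flatten-and-truncate

# 1. Capture RGB image, depth map, and a lidar range array.
rgb   = rng.integers(0, 256, size=(32, 32, 3)).astype(np.float64)
depth = rng.integers(0, 256, size=(32, 32)).astype(np.float64)
lidar = rng.uniform(0.1, 10.0, size=36)        # 36 range readings

# 2. Normalize the lidar readings to [0, 1].
lidar_norm = (lidar - lidar.min()) / (lidar.max() - lidar.min())

# 3. Encode the question and extract image features.
q_code   = encode_question(["where", "is", "the", "chair"])
img_feat = extract_image_features(rgb, depth)

# 4. Splice lidar data, question code and image features into one
#    fused feature vector (the "feature fusion matrix" of the method).
fusion = np.concatenate([lidar_norm, q_code, img_feat])
print(fusion.shape)  # (36 + 64 + 128,) = (228,)
```

In the patent's pipeline this fused vector would then pass through a convolutional network to yield the data fusion matrix fed to the navigator.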

[0019] Introduce an embodiment of the inventive method bel...



Abstract

The invention belongs to the technical fields of robot active visual perception, language interaction, radar obstacle avoidance and deep learning, and in particular relates to a robot human-computer interaction method. The method captures an RGB image and a depth map of the environment and detects obstacle information to obtain a laser radar (lidar) array; the acquired data are normalized. A question-encoding network for human-computer interaction encodes the question, and an image-feature-extraction network extracts the RGB-image and depth-image information into a feature matrix. The lidar data, question code and feature matrix are spliced to obtain a feature fusion matrix, and a convolutional network produces a data fusion matrix of the surrounding environment. A recurrent neural network is trained as the navigator: it takes the data fusion matrix as input, outputs the navigation result, and controls the robot's direction of movement. The method realizes self-navigation, self-exploration, human-computer interaction and other functions, improving the intelligence of the robot.
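The recurrent navigator described above can be sketched as a single GRU step over the fused features. This is an illustrative numpy sketch, not the patent's trained network: the GRU formulation, weight shapes, and the 228-dimensional input / 32-dimensional hidden state are all assumptions for the example; only the four-way action set comes from the text.

```python
import numpy as np

ACTIONS = ["forward", "left", "right", "stop"]

def gru_navigator_step(x, h, params):
    """One step of a minimal GRU navigator: maps the fused feature
    vector x and hidden state h to a new state and action scores."""
    Wz, Wr, Wh, Wo = params
    xh = np.concatenate([x, h])
    z = 1.0 / (1.0 + np.exp(-(Wz @ xh)))          # update gate
    r = 1.0 / (1.0 + np.exp(-(Wr @ xh)))          # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([x, r * h]))
    h_new = (1 - z) * h + z * h_tilde
    logits = Wo @ h_new                            # one score per action
    return h_new, logits

rng = np.random.default_rng(1)
x_dim, h_dim = 228, 32          # fused-feature and hidden sizes (assumed)
params = (rng.standard_normal((h_dim, x_dim + h_dim)) * 0.1,
          rng.standard_normal((h_dim, x_dim + h_dim)) * 0.1,
          rng.standard_normal((h_dim, x_dim + h_dim)) * 0.1,
          rng.standard_normal((len(ACTIONS), h_dim)) * 0.1)

h = np.zeros(h_dim)
x = rng.standard_normal(x_dim)  # one fused observation
h, logits = gru_navigator_step(x, h, params)
action = ACTIONS[int(np.argmax(logits))]  # forward / left / right / stop
print(action)
```

At each time step the robot would execute the selected action and feed the next fused observation back into the recurrent state, stopping when "stop" is emitted.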

Description

Technical field

[0001] The invention belongs to the technical fields of robot active visual perception, language interaction, radar obstacle avoidance and deep learning, and in particular relates to a robot human-computer interaction method.

Background technique

[0002] Today, autonomous robots can operate independently to complete specific tasks without human intervention. Autonomous locomotion, a major attribute of autonomous robots, depends primarily on accurate motion estimation and high-level environmental perception. However, in some cases artificial landmarks are unavailable, or the robot is in a GPS-denied environment, so that ego-motion estimation or the acquisition of scene information becomes very difficult. Technically, a mobile robot gradually constructs a map consistent with the overall environment by sensing that environment, and at the same time uses this map for self-localization. For a long time, robot navigation problems have been basically solved...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16, G01C21/00, G01S17/86, G01S17/89
CPC: B25J9/1664, B25J9/1694, B25J9/1697, B25J9/1689, G01S17/89, G01S17/86, G01C21/005
Inventor: 刘华平, 陆升阳, 张新钰, 袁小虎, 赵怀林
Owner: TSINGHUA UNIV