
Depth information-based gesture recognition system and method for robot

A depth-information and gesture-recognition technology, applied in biometric recognition, neural learning methods, and character and pattern recognition. It addresses problems such as limited recognition accuracy and images being easily affected by illumination, achieving improved accuracy, convenient gesture recognition, and good real-time performance.

Pending Publication Date: 2022-04-08
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0006] In view of the related technologies described above, the inventor believes that recognition methods based on absolute gestures have a shortcoming: the image is easily affected by illumination and other background factors, which reduces recognition accuracy.




Embodiment Construction

[0039] The present invention is described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit the present invention in any form. It should be noted that those skilled in the art can make several changes and improvements without departing from the concept of the present invention, all of which fall within the protection scope of the present invention.

[0040] An embodiment of the present invention discloses a gesture recognition system based on depth image information for service robots. As shown in figure 1 and figure 2, the system includes a radar, a storage unit, and a computing unit. The radar includes a LiDAR (a LiDAR image generating unit, i.e., a unit in which the LiDAR generates a depth image). The LiDAR generates depth image information of objects using the time-of-flight method. The LiDAR mainly includes...
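As a rough illustration (not part of the patent text), the time-of-flight principle the LiDAR relies on converts a measured round-trip light travel time into a per-pixel depth, depth = c * t / 2. The constant and function names below are illustrative assumptions:

```python
# Illustrative sketch of time-of-flight depth estimation (assumed names,
# not the patent's implementation): depth = speed_of_light * t / 2,
# since the emitted pulse travels to the object and back.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_to_depth(round_trip_time_s: float) -> float:
    """Depth in meters from a round-trip time-of-flight in seconds."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m of depth.
print(tof_to_depth(20e-9))
```

Repeating this measurement over a scanned grid of directions yields the depth image the computing unit then processes.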



Abstract

The invention provides a depth information-based gesture recognition system and method for a robot. The system comprises a radar and a computing unit: the radar generates depth image information from an object, and the computing unit processes the depth image information to recognize a gesture. Using depth image information avoids the problem that conventional images are easily affected by illumination and complex backgrounds; the extracted gestures are classified and recognized by a deep learning neural network, effectively improving gesture recognition accuracy.
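The abstract describes a two-stage pipeline: extract the gesture region from the depth image, then classify it with a neural network. A minimal sketch of the first stage, assuming a simple depth-range threshold for foreground (hand) extraction (the threshold values and function names are illustrative, not taken from the patent):

```python
# Minimal sketch (assumptions, not the patent's actual pipeline):
# segment the near foreground (e.g. a hand) from a depth image by
# keeping only pixels within an assumed depth range, producing a
# binary mask that a downstream classifier could consume.
import numpy as np

def segment_hand(depth: np.ndarray, near: float = 0.3, far: float = 1.0) -> np.ndarray:
    """Boolean mask of pixels within the assumed hand depth range (meters)."""
    return (depth > near) & (depth < far)

# Toy 3x3 depth map: only the center pixel lies in the hand range.
depth = np.array([[2.0, 2.0, 2.0],
                  [2.0, 0.5, 2.0],
                  [2.0, 2.0, 2.0]])
mask = segment_hand(depth)
print(int(mask.sum()))  # count of foreground pixels
```

Because depth values are unaffected by scene illumination, this kind of thresholding is far more robust to lighting changes than color-based segmentation, which is the advantage the abstract claims.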

Description

technical field
[0001] The present invention relates to the technical field of human-computer interaction for robots, and in particular to a depth-information-based gesture recognition system and method for robots, more specifically to a gesture recognition method based on depth image information that can be used for service robots.
Background technique
[0002] In recent years, with the improvement of the computing power of computer hardware and advances in machine learning algorithms, the field of artificial intelligence has developed rapidly, and intelligent robots have gradually entered the public's field of vision. Human-computer interaction technology, as the foundation of interaction between humans and machines, has always been a research focus in both industry and academia. After more than half a century of development, human-computer interaction technology has evolved from traditional physical interaction...


Application Information

IPC(8): G06V10/26; G06V40/10; G06V10/82; G06T7/136; G06T7/155; G06T7/194; G06N3/04; G06N3/08
Inventor 夏海生李智军廖朴金阚震
Owner UNIV OF SCI & TECH OF CHINA