
Gesture identification method and system of robot, and robot

A gesture recognition and robotics technology, applied to neural learning methods, character and pattern recognition, and instruments. It addresses problems such as machine learning methods being prone to misjudgment, failure to achieve intelligent detection, and lighting that is too bright or too dark, and achieves the effects of improving sample diversity, reducing network size, and improving accuracy.

Publication status: Inactive · Publication date: 2018-02-23
NANJING AVATARMIND ROBOT TECH CO LTD
Cites: 4 · Cited by: 12

AI Technical Summary

Problems solved by technology

This method is generally suited to recognition against a single, uniform background. In real applications, however, gestures usually appear in complex environments: cluttered backgrounds, lighting that is too bright or too dark, and varying distances between the gesture and the acquisition device. Under such conditions the machine learning method is prone to misjudgment and requires manual screening, which defeats the purpose of intelligent detection.




Embodiment Construction

[0047] In order to illustrate the embodiments of the present invention and the technical solutions of the prior art more clearly, the specific implementations of the present invention are described below with reference to the accompanying drawings. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings and other implementations from them.

[0048] To keep the drawings concise, each drawing schematically shows only the parts related to the present invention; the drawings do not represent the actual structure of the product. In addition, to keep the drawings simple and easy to understand, where several components in a drawing share the same structure or function, only one of them is schematically shown or labeled. Herein, "a" means not only "only one" but also "more than one".

[0049] Such as f...



Abstract

The invention discloses a gesture identification method and system for a robot, and the robot. The method comprises: pre-acquiring pictures that contain different gestures and pictures that do not contain gestures to obtain a sample picture set; building a detection sample set and a filtering sample set from the sample picture set; training an AdaBoost cascaded gesture detector on the detection sample set; training a gesture identification convolutional neural network on the filtering sample set; detecting gestures in acquired pictures with the AdaBoost cascaded gesture detector to obtain gesture identification results; and filtering those results with the gesture identification convolutional neural network to obtain the correct gesture identification result. By filtering the detector's results through the convolutional neural network, the invention can identify gestures accurately against complex backgrounds.
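
The two-stage pipeline summarized above (an AdaBoost cascade proposes gesture windows, a convolutional neural network then filters out false detections) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the cascade file name `hand_cascade.xml`, the 64x64 input size, the class count, and the network layout are illustrative placeholders, and OpenCV's Haar/LBP cascades stand in here for the AdaBoost cascaded gesture detector.

```python
# Sketch of the detect-then-filter pipeline from the abstract (assumptions noted above).
import cv2
import torch
import torch.nn as nn

class GestureFilterCNN(nn.Module):
    """Small CNN that re-classifies candidate windows (layout is illustrative)."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        # One extra output class for "not a gesture", so false detections can be rejected.
        self.classifier = nn.Linear(32 * 16 * 16, num_classes + 1)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def detect_and_filter(image_bgr, cascade, cnn, num_classes: int):
    """Stage 1: cascade proposes gesture windows. Stage 2: CNN keeps only real gestures."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    candidates = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    accepted = []
    cnn.eval()
    with torch.no_grad():
        for (x, y, w, h) in candidates:
            patch = cv2.resize(gray[y:y + h, x:x + w], (64, 64))
            tensor = torch.from_numpy(patch).float().div(255.0).view(1, 1, 64, 64)
            label = cnn(tensor).argmax(dim=1).item()
            if label < num_classes:          # last class index means "not a gesture"
                accepted.append(((x, y, w, h), label))
    return accepted

# Usage (file name is a placeholder; a real cascade and CNN must be trained beforehand):
# cascade = cv2.CascadeClassifier("hand_cascade.xml")
# cnn = GestureFilterCNN(num_classes=5)
# results = detect_and_filter(cv2.imread("frame.jpg"), cascade, cnn, num_classes=5)
```

Giving the filtering network an extra "not a gesture" class is one simple way to realize the filtering step described in the abstract: windows that the cascade proposes but the network rejects are discarded, which is what allows the pipeline to cope with complex backgrounds.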

Description

Technical field

[0001] The invention relates to the fields of artificial intelligence and image processing, and in particular to a robot gesture recognition method, system, and robot.

Background technique

[0002] With the development of science and technology, intelligent robots are used more and more widely in daily life and industrial production. In the process of making robots intelligent, gesture recognition is an important mode of human-computer interaction, and its research and development affect the naturalness and flexibility of that interaction.

[0003] At present, many service robots can recognize user instructions from gestures and help people complete many tasks. Conventional image processing and machine learning approaches to gesture recognition usually include steps such as gesture segmentation, gesture analysis, and gesture recognition. This method is usually suitable for recognition in a single b...
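
For contrast with the invention, the conventional pipeline named in paragraph [0003] (gesture segmentation, gesture analysis, gesture recognition) is often built from hand-crafted steps such as skin-color segmentation and contour analysis. The sketch below is an assumed, simplified baseline of that kind, not the invention's method; the HSV thresholds and the convexity-defect finger-counting heuristic are common textbook choices rather than values from the patent.

```python
# Illustrative baseline for the segmentation -> analysis -> recognition steps of [0003].
# Thresholds and the defect-counting heuristic are assumptions, not the patented method.
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Gesture segmentation: crude skin-color mask in HSV space."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array([0, 30, 60], dtype=np.uint8),
                       np.array([20, 150, 255], dtype=np.uint8))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

def analyze_and_recognize(mask):
    """Gesture analysis + recognition: count convexity defects as raised fingers."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep defects roughly correspond to the gaps between extended fingers.
    fingers = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
    return fingers + 1 if fingers > 0 else 0

# This kind of baseline works against a plain background but, as the description notes,
# it misjudges easily with cluttered backgrounds, poor lighting, or varying distances.
```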

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/082, G06V40/107, G06V40/28, G06N3/045, G06F18/2413, G06F18/214
Inventor: 谢阳阳
Owner: NANJING AVATARMIND ROBOT TECH CO LTD