Depth motion map-scale invariant feature transform-based gesture recognition method

A gesture recognition technology based on scale-invariant features, applied in character and pattern recognition, instruments, computer parts, etc.; it addresses the problems of poor action recognition performance and high cost, achieves strong robustness, and requires few model parameters.

Inactive Publication Date: 2017-06-09
CHONGQING UNIV OF POSTS & TELECOMM


Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a gesture recognition method based on the depth motion map and scale-invariant feature transform. It addresses the high cost of gesture recognition algorithms on traditional color video, where the lack of the third (depth) dimension leads to poor recognition accuracy and poor real-time performance. The method efficiently handles high-dimensional nonlinear manifold feature quantities with few model parameters; the dimension-reduced data features are easy to interpret and visualize, and gesture sequences are recognized efficiently in real time.
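The dimension-reduction step described above can be illustrated with the unsupervised variant of locally linear embedding available in scikit-learn; this is a minimal sketch with synthetic data, not the patent's exact supervised (SLLE) algorithm, which additionally biases neighbor selection by class label. The feature dimensions and sample counts are illustrative assumptions.

```python
# Illustrative sketch: reducing high-dimensional nonlinear gesture feature
# vectors with locally linear embedding (LLE). scikit-learn ships the
# unsupervised LLE; the patent's supervised variant (SLLE) further uses
# class labels when choosing neighbors.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
# 200 synthetic gesture samples, each a 128-D feature descriptor
features = rng.normal(size=(200, 128))

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
embedded = lle.fit_transform(features)
print(embedded.shape)  # (200, 3)
```

Each sample is reconstructed from its local neighborhood, so the low-dimensional embedding preserves the local geometry of the feature manifold.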




Embodiment Construction

[0029] The preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.

[0030] Figure 1 is a schematic diagram of the depth motion map-scale invariant feature transform-based gesture recognition method of the present invention. As shown in Figure 1, the method comprises: data collection, extraction of human gesture motion features, and gesture recognition. Human gesture data collection is based on Kinect somatosensory technology. Compared with traditional color images, depth images provide a third dimension of depth data, are insensitive to lighting, and can effectively eliminate the influence of skin color, occlusion, and background on gesture recognition. The common recognition method based on three-dimensional skeleton nodes, because Kinect reconstructs human skeleton nodes ...
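The depth motion map at the heart of the feature-extraction stage can be sketched under its common definition: absolute frame-to-frame differences of a depth sequence accumulated into a single 2-D motion-energy map. Frame sizes and values here are illustrative, not taken from the patent.

```python
# Minimal sketch of a depth motion map (DMM): accumulate absolute
# differences between consecutive depth frames into one 2-D map that
# summarizes where motion occurred over the whole sequence.
import numpy as np

def depth_motion_map(depth_frames: np.ndarray) -> np.ndarray:
    """depth_frames: (T, H, W) depth sequence; returns an (H, W) motion map."""
    diffs = np.abs(np.diff(depth_frames.astype(np.float32), axis=0))
    return diffs.sum(axis=0)

frames = np.zeros((5, 4, 4), dtype=np.float32)
frames[2, 1, 1] = 10.0  # a single pixel that appears and disappears
dmm = depth_motion_map(frames)
print(dmm[1, 1])  # 20.0 (10 appearing + 10 disappearing)
```

SIFT keypoints would then be extracted from this map rather than from individual frames, making the descriptor compact and scale-invariant.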



Abstract

The invention relates to a depth motion map-scale invariant feature transform-based gesture recognition method. The method comprises three parts. For motion data acquisition, the original depth image provided by Kinect somatosensory technology is adopted as the input of the gesture recognition system. For human gesture feature construction, a depth motion map-scale invariant feature transform-based extraction method is adopted, and the extracted features are reduced in dimension by supervised locally linear embedding (SLLE) to represent the gesture motion feature quantity. For gesture classifier recognition, a discriminant-based support vector machine realizes sample training and modeling of the depth image sequence feature quantities, and classifies and predicts unknown gestures. The method adapts to different lighting environments, is strongly robust, and can recognize gesture sequences efficiently in real time, so it can be applied in real-time gesture recognition for human-machine interaction.
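The classifier stage described in the abstract can be sketched with a standard max-margin support vector machine; the data below is synthetic (two well-separated Gaussian clusters standing in for the dimension-reduced gesture features), so this illustrates only the training/prediction pattern, not the patent's trained model.

```python
# Sketch of the classifier stage: a discriminant (max-margin) support
# vector machine trained on low-dimensional gesture feature vectors.
# In the patent, the inputs would be DMM-SIFT features after SLLE.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Two synthetic gesture classes as 3-D feature clusters
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(4, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf").fit(X, y)
pred = clf.predict([[0, 0, 0], [4, 4, 4]])
print(pred)  # [0 1]
```

An RBF kernel handles nonlinearly separable feature distributions; for a real gesture system the kernel and its parameters would be selected by cross-validation.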

Description

Technical field

[0001] The invention belongs to the technical fields of gesture recognition, somatosensory technology, virtual reality and natural human-computer interaction, and relates to a gesture recognition method based on the depth motion map and scale-invariant feature transform.

Background technique

[0002] In recent years, with the rapid development of pattern recognition, artificial intelligence and computer vision, gesture recognition based on somatosensory technology has become a new research hotspot. People repeat a large number of complex activities every day; for example, besides natural language, communication with others often involves gestures. If machines and computers could understand the meaning of human gestures as humans do, complete various instructions according to the corresponding gestures, and interact with people in real time, a brand new world would open up. These studies have shown advantages in a large number of app...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62
CPC: G06V40/28, G06F18/2411
Inventor 蔡林沁崔双杰虞继敏刘晓林
Owner CHONGQING UNIV OF POSTS & TELECOMM