
Gesture classification method based on tensor decomposition

A classification method based on tensor decomposition technology, applied in the field of gesture classification, which addresses the problem of low or unsatisfactory classification accuracy and achieves a result with a clear physical meaning.

Inactive Publication Date: 2017-07-11
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

At the same time, this representation is combined with a support vector machine (Support Vector Machine, SVM) classification scheme for action recognition, but the classification accuracy of this method is low when distinguishing jogging from running actions.
[0004] Scovanner et al. [2] propose a three-dimensional (3D) scale-invariant feature transform (SIFT) descriptor for video and 3D images. Specifically, videos are represented with a bag-of-words approach, and a method is proposed to discover the spatio-temporal relationships between words in order to better describe the video data. Although the average accuracy of this method is high, the classification accuracy is still not ideal for certain actions.



Examples


Embodiment 1

[0032] A gesture classification method based on tensor decomposition, see Figure 1; the gesture classification method includes the following steps:

[0033] 101: Model gesture videos with third-order tensors; decompose each gesture video using a modified high-order singular value decomposition method;

[0034] 102: Visually present and analyze the results of tensor decomposition;

[0035] 103: Classify the gesture videos with canonical angles using a K-nearest-neighbor classifier and a support vector machine classifier, respectively; change the number of column vectors in the factor matrices to conduct comparative experiments.

[0036] In step 101, the gesture video is modeled with a third-order tensor as follows:

[0037] The first order of the tensor indicates the horizontal direction, the second order indicates the vertical direction, and the third order indicates the time axis;

[0038] Each image read from a sample is a matrix, and these matrices are concatenated along the...
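As an illustration of this modelling step, the following is a minimal sketch in Python/NumPy; the directory layout, file extension, and helper names are assumptions for illustration and are not specified in the patent.

```python
# Minimal sketch: stack the grayscale frames of one gesture sample into a
# third-order tensor. Paths and file extension are illustrative assumptions.
import glob

import numpy as np
from PIL import Image


def video_to_tensor(frame_dir: str) -> np.ndarray:
    """Read the frames of one gesture sample and stack them along the time axis.

    Following step 101: mode 1 = horizontal direction, mode 2 = vertical
    direction, mode 3 = time axis.
    """
    frame_paths = sorted(glob.glob(f"{frame_dir}/*.jpg"))
    # Each frame is read as a matrix; transpose so that axis 0 is horizontal.
    frames = [np.asarray(Image.open(p).convert("L"), dtype=np.float64).T
              for p in frame_paths]
    # Concatenate the frame matrices along the third mode -> W x H x T tensor.
    return np.stack(frames, axis=-1)


# Example usage (hypothetical path):
# tensor = video_to_tensor("CambridgeGestures/Set1/gesture_1/sample_01")
# print(tensor.shape)  # (width, height, number_of_frames)
```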

Embodiment 2

[0046] The scheme in Embodiment 1 is further described below in conjunction with specific calculation formulas and examples; see the following for details:

[0047] 201: Model gesture videos with third-order tensors;

[0048] Specifically, the Cambridge Gesture Database is used in this experiment and contains 9 types of gestures, as shown in Figure 2. A gesture video is represented as a third-order tensor, where the first order of the tensor represents the horizontal direction, the second order represents the vertical direction, and the third order represents the time axis. The pictures in a sample are first read as matrices, and these matrices are then concatenated along the third order of the tensor, forming a third-order tensor that represents the video.

[0049] 202: Decompose each gesture video using a modified higher-order singular value decomposition (HOSVD) [7] method;

[0050] First, for an N-th-order (N-dimensional) tensor, carry out the matrix calcul...
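For orientation, the following is a minimal sketch of a standard truncated HOSVD built from mode-n unfoldings and their SVDs; the patent uses a modified HOSVD whose specific modification is not detailed in this excerpt, so the sketch shows only the baseline decomposition that such a modification would start from.

```python
# Sketch of a standard (optionally truncated) HOSVD: factor matrices from the
# SVD of each mode-n unfolding, core tensor from projecting onto them.
import numpy as np


def unfold(tensor: np.ndarray, mode: int) -> np.ndarray:
    """Mode-n unfolding: fibres of mode `mode` become the rows of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)


def hosvd(tensor: np.ndarray, ranks=None):
    """Return the core tensor and the factor matrix of each mode."""
    factors = []
    for mode in range(tensor.ndim):
        # Left singular vectors of the mode-n unfolding give factor matrix U_n.
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        if ranks is not None:
            u = u[:, :ranks[mode]]  # keep only the leading column vectors
        factors.append(u)
    # Core tensor: S = X x_1 U_1^T x_2 U_2^T x_3 U_3^T (mode-n products).
    core = tensor
    for mode, u in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors


# Example usage with the tensor built above (ranks are illustrative):
# core, (U1, U2, U3) = hosvd(tensor, ranks=(10, 10, 5))
```

Varying `ranks` here corresponds to the comparative experiments of step 103, in which the number of column vectors kept in each factor matrix is changed.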

Embodiment 3

[0080] The feasibility of the schemes in Embodiments 1 and 2 is verified below in conjunction with specific calculation formulas and examples; see the following for details:

[0081] The database used in this experiment is the Cambridge Gesture Database, which contains a total of 900 samples. The samples are divided into 9 categories by gesture type and into 5 sets by the illumination level of the pictures, corresponding to Set1, Set2, Set3, Set4 and Set5 respectively. There are 9 types of gestures under each illumination level, namely: five fingers together moving left, five fingers together moving right, five fingers together contracting to a fist, five fingers spread moving left, five fingers spread moving right, five fingers spread closing together, V-shape moving left, V-shape moving right, and V-shape closing. Each gesture under each illumination level contains 20 samples. Each sample contains several pictures, and the number of pictures varies from sample to sample. One picture in a sample cor...
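As an illustration only, the dataset layout described above (5 illumination sets x 9 gesture classes x 20 samples = 900 samples) could be enumerated as in the sketch below; the directory and folder names are assumptions, not taken from the patent or the database documentation.

```python
# Illustrative enumeration of the Cambridge Gesture Database layout described
# above: 5 illumination sets x 9 gesture classes x 20 samples = 900 samples.
from pathlib import Path

SETS = [f"Set{i}" for i in range(1, 6)]   # five illumination levels
NUM_CLASSES = 9                           # nine gesture types
SAMPLES_PER_CLASS = 20                    # samples per gesture per set


def enumerate_samples(root: str):
    """Yield (set_name, class_index, sample_dir) for every sample folder."""
    for set_name in SETS:
        for class_index in range(1, NUM_CLASSES + 1):
            class_dir = Path(root) / set_name / f"gesture_{class_index}"
            for sample_dir in sorted(class_dir.iterdir())[:SAMPLES_PER_CLASS]:
                yield set_name, class_index, sample_dir


# total = sum(1 for _ in enumerate_samples("CambridgeGestures"))  # expect 900
```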



Abstract

The invention discloses a gesture classification method based on tensor decomposition. The gesture classification method comprises the following steps: modeling gesture videos with third-order tensors; decomposing each gesture video using a modified higher-order singular value decomposition method; visually presenting and analyzing the tensor decomposition results; classifying the gesture videos with canonical angles through a K-nearest-neighbor classifier and a support vector machine classifier, respectively; and changing the number of column vectors in the factor matrices to carry out comparative experiments. The method can visualize the tensor decomposition results and present them for classification, giving a better understanding of the physical meaning of the tensor decomposition.
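As a sketch of the classification step named in the abstract, the code below measures the canonical (principal) angles between the column spaces of corresponding factor matrices of two decomposed gesture videos and uses them in a simple K-nearest-neighbor rule. Averaging the per-mode angles into a single distance is an assumption of this sketch; the exact similarity measure, and how the angles feed the SVM classifier, are not given in this excerpt.

```python
# Sketch: canonical-angle distance between two HOSVD-decomposed gesture videos
# and a K-nearest-neighbor vote based on it. The way the per-mode angles are
# combined is an illustrative assumption.
import numpy as np


def canonical_angles(U: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Canonical (principal) angles between the column spaces of U and V."""
    Qu, _ = np.linalg.qr(U)
    Qv, _ = np.linalg.qr(V)
    # Singular values of Qu^T Qv are the cosines of the canonical angles.
    cosines = np.clip(np.linalg.svd(Qu.T @ Qv, compute_uv=False), -1.0, 1.0)
    return np.arccos(cosines)


def video_distance(factors_a, factors_b) -> float:
    """Mean canonical angle over the factor matrices of the three modes."""
    return float(np.mean([canonical_angles(Ua, Ub).mean()
                          for Ua, Ub in zip(factors_a, factors_b)]))


def knn_classify(query_factors, train_factors, train_labels, k: int = 1):
    """K-nearest-neighbor vote (labels must be non-negative integers)."""
    dists = np.array([video_distance(query_factors, f) for f in train_factors])
    nearest = np.argsort(dists)[:k]
    votes = np.bincount(np.asarray(train_labels)[nearest])
    return int(np.argmax(votes))
```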

Description

Technical field

[0001] The invention relates to the field of action recognition, and in particular to a gesture classification method based on tensor decomposition.

Background technique

[0002] In recent years, human-computer interaction and statistical learning have been developing continuously, and action recognition is an important problem to be solved. Due to the complexity of human actions, there can be great differences between instances of the same action. Therefore, it is crucial to effectively extract the essential differences between different actions, or to extract the invariant features shared by instances of the same action.

[0003] Schuldt et al. [1] capture local events in videos using local spatio-temporal features that can adapt to changes in the size, frequency, and velocity of moving patterns. This representation is combined with a support vector machine (Support Vector Machine, SVM) classification scheme for action recognition, but the class...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/107, G06F18/24147, G06F18/2411
Inventors: 苏育挺, 刘琛琛, 张静, 刘安安
Owner: TIANJIN UNIV