Human body motion classification method based on compressed sensing

A compressed-sensing technique for human motion classification, applied in the field of video analysis, which addresses the problem of low classification accuracy

Inactive Publication Date: 2016-10-26
北京九艺同兴科技有限公司


Problems solved by technology

Jiang et al. used features generated by the Action Bank detector as the feature representation of the video; this approach relies on pre-trained detectors, and its accuracy is not high.




Embodiment Construction

[0067] To enable those skilled in the art to better understand the technical solutions of the present invention, the present invention is described in further detail below in conjunction with specific embodiments.

[0068] A human action classification method based on compressed sensing: by treating the full set of action training samples as an over-complete dictionary, an action classification algorithm based on compressed sensing is designed. The method comprises four steps: spatio-temporal interest point detection, bag-of-words-based video feature expression, visual dictionary construction, and a compressed-sensing-based action classification algorithm. Step 1: spatio-temporal interest point detection, which uses spatio-temporal interest points to capture features that vary over time. For a video sequence, an interest point is located by three coordinates: the x and y axes mark its spatial position, and the t axis marks time. The i...
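The bag-of-words feature expression built on the detected interest points can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's exact procedure: `bow_histogram` is a hypothetical name, and the visual codebook is assumed to have been built beforehand (e.g. by k-means clustering of training descriptors).

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Quantize local spatio-temporal descriptors against a visual
    codebook and return an L1-normalized bag-of-words histogram.

    descriptors: (n, d) array, one row per detected interest point
    codebook:    (k, d) array of visual words (e.g. k-means centers)
    """
    # Euclidean distance from every descriptor to every codeword.
    dists = np.linalg.norm(
        descriptors[:, None, :] - codebook[None, :, :], axis=2
    )
    words = dists.argmin(axis=1)  # nearest visual word per interest point
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / max(hist.sum(), 1.0)  # normalize to a distribution
```

The resulting fixed-length histogram is what a video-level classifier would consume, regardless of how many interest points each video produces.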



Abstract

The invention relates to a human body motion classification method based on compressed sensing, comprising four steps: spatio-temporal interest point detection, bag-of-words-based video feature expression, visual dictionary construction, and a compressed-sensing-based motion classification algorithm. In step 1, training sample features are computed to obtain a training sample matrix A = [A1, A2, ..., AK] ∈ R^(m×n) over K categories, a test sample y ∈ R^m, and an error tolerance ε > 0. In step 2, a dictionary Z, a classifier W, and the coding coefficient matrix are solved for. For a new video motion sequence, the classifier W obtained in step 2 is applied to classification, finally yielding the category estimate of the video motion. The method fuses spatio-temporal interest detection, dictionary learning, and video feature expression in a single learning framework while simultaneously learning a linear classifier; it jointly learns a discriminative dictionary, discriminative coding coefficients, and a classifier through one optimization, is simple to compute, has good robustness, and enhances the capability to process non-linear data through compressed sensing.
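To ground the sparse-coding step the abstract describes, the classic sparse-representation classifier (SRC) over a dictionary of training samples A can be sketched as below. This is a generic illustration, not the patent's joint dictionary/classifier optimization; `omp` is a greedy orthogonal-matching-pursuit stand-in for the ℓ1 program min ‖x‖₁ s.t. ‖Ax − y‖₂ ≤ ε, and all function names here are hypothetical.

```python
import numpy as np

def omp(A, y, n_nonzero=5, tol=1e-6):
    """Greedy orthogonal matching pursuit: a simple sparse solver
    standing in for min ||x||_1 s.t. ||Ax - y||_2 <= eps."""
    residual = y.astype(float).copy()
    support, coef = [], np.zeros(0)
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        if np.linalg.norm(residual) <= tol:
            break
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit y on the selected atoms by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

def src_classify(A, labels, y, n_nonzero=5):
    """Code y sparsely over the training dictionary A, then pick the
    class whose atoms alone best reconstruct y (smallest residual)."""
    x = omp(A, y, n_nonzero)
    labels = np.asarray(labels)
    best_class, best_res = None, np.inf
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)  # keep only class-c coefficients
        res = np.linalg.norm(y - A @ xc)
        if res < best_res:
            best_class, best_res = c, res
    return best_class
```

Here each column of A is one training sample's feature vector (e.g. a bag-of-words histogram), so a test sample is expected to be reconstructed mainly from atoms of its own class.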

Description

technical field

[0001] The invention relates to a method for classifying human actions, in particular to a method for classifying human actions based on compressed sensing, and belongs to the field of video analysis.

Background technique

[0002] It is well known that extracting data from videos to build reasonable representations of actions is especially important for action classification. The action representation usually has to be chosen to match the classification method: for example, trajectory-based methods suit long-distance monitoring in open environments, while 3D models are often used in gesture recognition. Parameswaran et al. proposed four criteria for evaluating action representation methods: simplicity, completeness, continuity, and uniqueness.

[0003] The shape of the human body contour is the most intuitive way to represent actions, so there are also a large number of shape-based human action represent...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06K9/00
CPC: G06V40/23, G06F18/2453
Inventor: 张瑞萱, 汪成峰, 王庆, 张凯强
Owner: 北京九艺同兴科技有限公司