
Video human action recognition method based on sparse subspace clustering

A recognition and clustering technology applied in the fields of computer vision pattern recognition and video image processing. It addresses problems of existing human behavior recognition methods such as high cost, algorithmic complexity, and unsatisfactory results, so as to improve performance, improve accuracy, and alleviate the effects of overfitting and the gradient diffusion problem.

Status: Inactive; Publication Date: 2015-06-24
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 6; Cited by: 36

AI Technical Summary

Problems solved by technology

Although this method is robust to the specific orientation, bone size, and spatial position of the human body and has a certain degree of generalization ability, it is applicable only to human behavior recognition in relatively ideal environments and requires costly 3D acquisition technology. In addition, the algorithm is relatively complex, and its recognition results in more complex scenes are still unsatisfactory.



Examples


Embodiment Construction

[0032] The hardware configuration adopted by the present invention is a Dell server with an 8-core 2.60 GHz CPU and 128 GB of memory; the software configuration is the Windows Server 2003 operating system, the OpenCV open-source computer vision library, the Microsoft Visual Studio 2010 development environment, the Matlab simulation environment, etc.

[0033] The concrete implementation of the present invention comprises a training stage and a recognition stage; the concrete implementation steps are as follows:

[0034] A. Establish a model for video human behavior recognition:

[0035] A1: Establish three-dimensional spatio-temporal sub-frame cubes: Divide each frame of the same-category human behavior videos in the Hollywood2 human behavior database used for learning into sub-frames of the same size (16×16 pixels), and then take the time-series length of a number of consecutive frames (10 frames) of the corresponding human behavior video as the thickness to establish a three-dimensional space-ti...
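
As an illustration of step A1, here is a minimal sketch of cutting a grayscale video into 16×16-pixel sub-frames and stacking 10 consecutive frames into three-dimensional spatio-temporal sub-frame cubes. It is written in Python with NumPy, which the patent does not specify (the document lists OpenCV, Visual Studio 2010 and Matlab instead); the non-overlapping tiling, array shapes, and function name are assumptions for illustration, not the patent's exact procedure.

```python
# Hedged sketch: building 3-D spatio-temporal sub-frame cubes from a video that is
# already loaded as a NumPy array of shape (num_frames, height, width).
import numpy as np

def extract_subframe_cubes(frames, patch=16, depth=10):
    """Split each frame into patch x patch sub-frames and stack `depth`
    consecutive frames, so every cube has shape (depth, patch, patch)."""
    num_frames, height, width = frames.shape
    cubes = []
    for t in range(0, num_frames - depth + 1, depth):     # non-overlapping in time
        for y in range(0, height - patch + 1, patch):      # non-overlapping in space
            for x in range(0, width - patch + 1, patch):
                cubes.append(frames[t:t + depth, y:y + patch, x:x + patch])
    return np.stack(cubes)  # shape: (num_cubes, depth, patch, patch)

# Example with synthetic data standing in for a Hollywood2 clip.
video = np.random.rand(30, 240, 320).astype(np.float32)
cubes = extract_subframe_cubes(video)
print(cubes.shape)  # (900, 10, 16, 16)
```

Each resulting 10×16×16 block can then be vectorized into the human action feature space that the abstract describes, before the clustering step.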



Abstract

The invention belongs to computer vision pattern recognition and video image processing methods. The method comprises the following steps: establishing three-dimensional spatio-temporal sub-frame cubes in a video human action recognition model, establishing a human action feature space, performing clustering, updating labels, extracting the three-dimensional spatio-temporal sub-frame cubes of the video human action recognition model and the human actions from surveillance video, extracting human action features, determining the category of the human sub-actions in each video, and classifying and merging the videos carrying sub-category labels. With this method, the highest recognition accuracy on the international Hollywood2 human action database is improved by 16.5%. The video human action recognition method therefore has the advantages that human action features with higher discriminative ability, adaptability, universality and invariance can be extracted automatically, the overfitting phenomenon and the gradient diffusion problem in the neural network are alleviated, and the accuracy of human action recognition in complex environments is effectively improved; the method can be widely applied to on-site video surveillance and video content retrieval.

Description

technical field

[0001] The invention belongs to computer vision pattern recognition and video image processing methods, and in particular relates to a video human action recognition method based on a deep-learning neural network, which adopts sparse subspace clustering (SSC) to subdivide and split a deep network with a large number of layers into several shallower networks with fewer layers.

Background technique

[0002] Human behavior recognition based on video has been a hot issue in the field of computer vision in recent years. As a typical video understanding problem, it identifies and determines human behavior patterns by analyzing the characteristics of human motion in video image sequences. More specifically, feature information that can describe the behavior is extracted from the video image sequences, understood with machine learning and related techniques, and classified with a classifier so as to recognize the human behavior.

[0003] With the development of ...
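
For readers unfamiliar with sparse subspace clustering, the sketch below shows the generic SSC recipe the description alludes to: express each feature vector as a sparse linear combination of the other samples, turn the coefficients into an affinity matrix, and apply spectral clustering. It uses scikit-learn's Lasso and SpectralClustering as stand-ins; the patent's own optimization, parameter settings (e.g. `alpha`), and feature construction are not given here and are assumptions.

```python
# Hedged sketch of generic sparse subspace clustering (SSC), not the patented pipeline.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def sparse_subspace_clustering(X, n_clusters, alpha=0.01):
    """X: (n_samples, n_features) feature matrix. Returns one cluster label per sample."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        # Express sample i as a sparse combination of the other samples:
        # min_c ||x_i - D c||^2 + alpha * ||c||_1, with D the remaining samples.
        others = np.delete(X, i, axis=0)           # (n-1, n_features)
        lasso = Lasso(alpha=alpha, max_iter=5000)
        lasso.fit(others.T, X[i])                  # design matrix: (n_features, n-1)
        C[i] = np.insert(lasso.coef_, i, 0.0)      # re-insert zero self-coefficient
    affinity = np.abs(C) + np.abs(C).T             # symmetric affinity graph
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed").fit_predict(affinity)
    return labels

# Toy usage: points drawn near two different low-dimensional subspaces.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(20, 1)) * np.array([[1.0, 0.1, 0.0]]),
               rng.normal(size=(20, 1)) * np.array([[0.0, 0.1, 1.0]])])
print(sparse_subspace_clustering(X, n_clusters=2))
```

In the patented method, such clustering would operate on the human action features extracted from the spatio-temporal sub-frame cubes; this generic version only demonstrates the self-expressiveness idea behind SSC.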


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00; G06K 9/62
Inventor: 郝宗波, 桑楠, 陆霖霖, 吴杰, 杨眷玉, 万士宁, 赵俊, 朱前芳, 鄢宇烈
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA