Human body behavior identification method adopting non-supervision multiple-view feature selection

A feature selection and recognition method, applied in character and pattern recognition, instruments, computer components, etc. It addresses the problems that traditional feature selection methods cannot fully exploit the advantages of multi-view features and that this topic has received relatively little study, thereby improving recognition accuracy and robustness to noise and interference.

Status: Inactive; Publication Date: 2014-02-12
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Therefore, when performing feature selection, traditional feature selection methods cannot fully exploit the advantages of multi-view features.
[0005] On the other hand, in research on video-based human behavior recognition, although new types of visual features are continually being proposed, and some literature has discussed how to combine and apply multiple features, there are still relatively few studies on how to perform fast and effective feature selection across multiple feature types.



Examples


Embodiment 1

[0053] The public human behavior dataset KTH is used to test the human behavior recognition ability of this method. The KTH dataset contains a total of 600 human action videos covering 6 different motion types. Figure 1 shows some sample KTH behavior videos. Figures 2 and 3 compare the recognition performance of this method (denoted AUMFS) with existing feature selection methods (Max Variance, Laplacian Score, Feature Ranking, Multi-Cluster Feature Selection (MCFS) and Nonnegative Discriminative Feature Selection (NDFS)) as the number of selected feature dimensions varies. Figure 4 shows the sensitivity of the method's performance to different parameter settings on the KTH dataset, and Figure 5 shows the method's iteration convergence curve. The steps implemented in this example, in conjunction with the specific technical scheme described above, are as follows:

[0054] 1) First, 80% of the KTH data set is used as a p...
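Step 1) is truncated above; assuming the 80% mentioned there is the portion of the KTH clips kept as the pre-collected (training) data, a minimal sketch of that split together with the 1-nearest-neighbour evaluation implied by the abstract could look as follows. The helper names, the random split and the accuracy computation are illustrative assumptions, not the exact experimental protocol of this embodiment.

```python
import numpy as np

def split_kth(features, labels, train_frac=0.8, seed=0):
    """Randomly hold out 1 - train_frac of the clips for testing.
    features: (n_videos, n_features) array; labels: (n_videos,) array."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(labels))
    cut = int(train_frac * len(labels))
    tr, te = order[:cut], order[cut:]
    return features[tr], labels[tr], features[te], labels[te]

def evaluate(train_X, train_y, test_X, test_y, selected):
    """Recognition accuracy with 1-NN matching in the selected feature subspace.
    `selected` would be the index set produced by the feature selection step."""
    correct = 0
    for x, y in zip(test_X, test_y):
        j = np.argmin(np.linalg.norm(train_X[:, selected] - x[selected], axis=1))
        correct += int(train_y[j] == y)
    return correct / len(test_y)
```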



Abstract

The invention discloses a human behavior recognition method using unsupervised multi-view feature selection. First, several types of visual feature expression are extracted from pre-collected sets of video data covering different human behavior types, yielding a multi-view feature data matrix. Then, for each view, a visual similarity graph and its graph Laplacian matrix are built and used to construct an objective function that jointly solves for a multi-view feature selection matrix W and a data cluster-indicator matrix. The multi-view feature selection matrix is optimized by an iterative gradient descent method, and a binary feature selection matrix is obtained from the row-sorting result of W. Finally, the video data to be recognized are converted into the corresponding multi-view feature data, the distances between the feature-selected data to be recognized and the pre-collected multi-view feature data are compared, and the video to be recognized is assigned the human behavior type of the j-th pre-collected video, where j is the index of the column of pre-collected multi-view feature data with the minimum distance. The method is fast to compute and has high recognition accuracy and strong resistance to noise and interference.
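The abstract outlines the whole pipeline but not the exact objective function, which is only given in the full description. As a rough illustration, the sketch below assumes a graph-regularised objective of the same general shape: per-view k-NN Laplacians define a cluster-indicator matrix F, an l2,1-regularised regression of F onto the stacked features yields the selection matrix W, W is updated by gradient descent, and features are ranked by the row norms of W before 1-nearest-neighbour matching. All function names, the k-NN graph construction, the specific loss and the step sizes are illustrative assumptions, not the patented formulation.

```python
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalised graph Laplacian of a k-NN similarity graph.
    X: (n_samples, n_features) data of one view."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    sigma = np.median(d2) + 1e-12
    S = np.exp(-d2 / sigma)
    # keep only the k nearest neighbours of each sample, then symmetrise
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]
    mask = np.zeros_like(S, dtype=bool)
    mask[np.arange(n)[:, None], idx] = True
    S = np.where(mask | mask.T, S, 0.0)
    return np.diag(S.sum(1)) - S

def aumfs_like_selection(views, n_clusters, n_selected, alpha=1.0, beta=0.1,
                         iters=100, lr=1e-2):
    """views: list of (n_samples, d_v) matrices, one per visual feature type.
    Returns indices of the selected columns of the stacked feature space."""
    X = np.hstack(views)                       # (n, d) multi-view feature matrix
    L = sum(knn_laplacian(V) for V in views)   # combined per-view Laplacians
    # cluster-indicator matrix F: smallest eigenvectors of the combined Laplacian
    _, vecs = np.linalg.eigh(L)
    F = vecs[:, :n_clusters]
    # gradient descent on  alpha * ||X W - F||_F^2 + beta * ||W||_{2,1}
    # (the step size lr may need tuning for un-normalised features)
    W = np.zeros((X.shape[1], n_clusters))
    for _ in range(iters):
        row_norm = np.linalg.norm(W, axis=1, keepdims=True) + 1e-8
        grad = 2 * alpha * X.T @ (X @ W - F) + beta * W / row_norm
        W -= lr * grad
    # binary selection: keep the features with the largest row norms of W
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:n_selected]

def recognise(query_views, train_views, train_labels, selected):
    """1-nearest-neighbour matching in the selected feature subspace."""
    q = np.hstack(query_views)[selected]
    T = np.hstack(train_views)[:, selected]
    j = np.argmin(np.linalg.norm(T - q, axis=1))
    return train_labels[j]
```

Ranking features by the l2-norm of the corresponding rows of W and keeping the top entries is what turns the real-valued selection matrix into the binary feature selection matrix mentioned in the abstract.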

Description

Technical field
[0001] The present invention relates to the subjects of unsupervised learning, multi-view learning, feature selection and human behavior recognition in machine learning and computer vision research, and in particular to a human behavior recognition method using unsupervised multi-view feature selection.
Background technique
[0002] With the rapid improvement of modern computer computing performance and the development of computer vision technology, especially feature extraction technology, people extract different types of visual feature expressions from video and image processing objects. For example, for images, global features, such as color histograms, texture features, and contour features, and local features, such as SIFT, LBP, and GLOH, are often extracted. For video objects, in addition to appearance features (such as color, texture, edge) and motion features (such as motion history map and motion energy map features), local moment features (such a...
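The background paragraph lists colour histograms and texture/edge descriptors among typical global image features. Purely as an illustration of how such per-view feature data might be assembled for video clips, the sketch below computes two simplified views per clip, a joint colour histogram and a gradient-orientation histogram averaged over frames; the actual descriptors used by the invention are the ones enumerated in the description, and the helper names here are hypothetical.

```python
import numpy as np

def color_histogram(frame, bins=8):
    """Global colour view: joint RGB histogram of one frame (H, W, 3, uint8)."""
    hist, _ = np.histogramdd(frame.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    h = hist.ravel().astype(float)
    return h / (h.sum() + 1e-12)

def gradient_orientation_histogram(frame, bins=9):
    """Crude edge/texture view: histogram of gradient orientations of the
    grey-level frame (a simplified stand-in for HOG-style descriptors)."""
    gray = frame.mean(axis=2)
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)

def video_views(frames):
    """Average the per-frame descriptors over a clip, one vector per view."""
    color = np.mean([color_histogram(f) for f in frames], axis=0)
    texture = np.mean([gradient_orientation_histogram(f) for f in frames], axis=0)
    return [color, texture]   # list of per-view feature vectors for this video

# Stacking the per-view vectors of every pre-collected video row-wise gives the
# per-view feature data matrices that a multi-view feature selection method
# starts from.
```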


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/66
Inventor: 肖俊, 冯银付, 庄越挺, 计明明, 张鹿鸣
Owner: ZHEJIANG UNIV