
Human action classification method based on video local feature dictionary

A technology combining local features and human actions, applied in the fields of computing, character and pattern recognition, instruments, etc. It addresses the problems that predefined distance functions cannot guarantee that similar vectors are clustered together and that redundant local features increase the noise of the feature space, with the effects of avoiding memorization of initial values and improving classification accuracy.

Status: Inactive; Publication Date: 2016-09-07
WUHAN UNIV

AI Technical Summary

Problems solved by technology

Traditional dictionary generation methods have two problems. First, it is difficult to measure the semantic similarity of high-dimensional features with predefined distance functions such as the Euclidean or cosine distance, so there is no guarantee that semantically similar vectors are clustered together. Second, the local features contain a large number of background features and features irrelevant to category discrimination; these redundant features take part in the visual representation and add noise to the feature space, and traditional clustering methods cannot filter them out.




Embodiment Construction

[0062] In order to help those of ordinary skill in the art understand and implement the present invention, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the embodiments described here are only used to illustrate and explain the present invention, and are not intended to limit it.

[0063] Referring to Figure 1, the human action classification method based on a video local feature dictionary provided by the present invention comprises the following steps:

[0064] Step 1: Extract local feature vectors from video clips with action category labels; each video corresponds to a set of local feature vectors, each set serves as an instance bag, and the bag label is the action category;

[0065] The implementation of the present invention can adopt a dense-sampling-based method to extract the local feature descriptors in the vid...
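
As a rough illustration of Step 1, the sketch below turns one labeled clip into an instance bag. The patent only names dense sampling; the stride, patch size, and the placeholder descriptor here are illustrative assumptions, and describe_patch merely stands in for a real local descriptor:

    import numpy as np

    def describe_patch(patch):
        # Placeholder local descriptor: a normalized intensity histogram.
        # A real implementation would use a descriptor such as HOG/HOF.
        hist, _ = np.histogram(patch, bins=32, range=(0, 255), density=True)
        return hist.astype(np.float32)

    def extract_bag(frames, label, stride=8, patch=32):
        # Densely sample patches on a fixed grid over the (grayscale) frames
        # and describe each one; the set of descriptors is one instance bag
        # whose bag-level label is the action category of the clip.
        instances = []
        for t in range(0, len(frames), stride):
            frame = frames[t]
            h, w = frame.shape[:2]
            for y in range(0, h - patch + 1, stride):
                for x in range(0, w - patch + 1, stride):
                    instances.append(describe_patch(frame[y:y + patch, x:x + patch]))
        return np.stack(instances), label  # (num_instances, 32) bag and its label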



Abstract

The present invention discloses a human action classification method based on a video local feature dictionary. The method comprises: extracting local features from training videos with category labels, where the set of feature vectors from each video segment forms a feature bag; grouping the feature bags and learning local feature classifiers with a multi-instance learning method that uses cross-validation and, when updating the positive instances, marks the top-ranked instances in each bag as positive; taking the learned classifiers as the dictionary for feature encoding and pooling the local feature responses with max pooling to obtain a global vector representation of each video; and learning from the global feature vectors to obtain a classifier for each action category, which is then used to classify the actions in new videos. The invention helps improve estimation accuracy, prevents the classification from memorizing initial values, and at the same time ensures the accuracy of the estimated positive samples.
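
The positive-instance update inside the multi-instance learning and the subsequent encoding and pooling stage can be sketched roughly as below. This is a minimal illustration under assumptions the abstract does not spell out (linear classifiers as dictionary words, a score callable, hypothetical helper names), not the patent's exact procedure:

    import numpy as np

    def update_positive_instances(positive_bags, score, m=5):
        # One update step in the multi-instance learning loop: within each
        # positive bag, the m instances ranked highest by the current
        # classifier are re-marked as positive training examples.
        positives = []
        for bag in positive_bags:                 # bag: (N, D) local features
            top = np.argsort(score(bag))[::-1][:m]
            positives.append(bag[top])
        return np.concatenate(positives)          # new positive training set

    def encode_and_pool(bag, W, b):
        # Use the K learned local-feature classifiers as the dictionary:
        # every local feature responds to every classifier, and max pooling
        # keeps the strongest response per dictionary word, yielding one
        # fixed-length global vector for the whole video.
        responses = bag @ W.T + b                 # (N, K) responses
        return responses.max(axis=0)              # (K,) global vector

The pooled global vectors from all training videos would then train one classifier per action category; the abstract does not fix its type, and a linear SVM would be a natural choice.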

Description

Technical Field

[0001] The invention belongs to the technical field of automatic video analysis and relates to a method for automatically classifying or recognizing human actions in videos, and in particular to a method that pools the sets of local features of human actions or scenes extracted from videos through a dictionary and then uses the pooled global vector representation to classify the actions.

Background

[0002] Action recognition and classification can be applied in many fields, such as video surveillance event detection, video retrieval, human-computer interaction, and behavior understanding. However, factors such as camera movement, complex scene changes, viewpoint changes, variations in action speed, and occlusion of body parts make the visual differences within the same action class large while different action classes can look highly similar, which poses a great challenge. Due to the poor robustness of directly o...


Application Information

IPC(8): G06K9/00
CPC: G06V20/41; G06V20/46
Inventor: 胡瑞敏, 李红阳, 陈军, 陈华锋, 徐增敏, 吴华, 王晓, 冯铭
Owner: WUHAN UNIV