
Behavior identification method based on AP cluster bag of words modeling

A technology combining AP clustering with recognition methods, applied in character and pattern recognition, instruments, computer parts, etc. It solves problems such as low efficiency and low recognition rate, and achieves reduced clustering time, a better clustering effect, and an improved behavior recognition rate.

Publication Date: 2016-08-03 (status: Inactive)
Applicant: ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] In order to overcome the low efficiency and low recognition rate of the existing bag-of-words model when it is used for modeling and behavior recognition after multiple local features have been jointly described, the present invention proposes a behavior recognition method based on AP clustering bag-of-words modeling.

Examples

Detailed Description of the Embodiments

[0016] The present invention will be further described below with reference to the accompanying drawings and embodiments.

[0017] Referring to Figure 1, an action recognition method based on AP clustering bag-of-words modeling is verified on the KTH data set, a widely recognized benchmark for classic action recognition algorithms; its videos contain illumination changes, scale changes, noise, camera shake, and other disturbances. Experiments were carried out on all videos in the data set and compared with the traditional bag-of-words model based on K-Means clustering, whose visual dictionary capacity was set in turn to 300, 400, 500, 800, 1000, and 1500. A hold-out split is adopted for the behavior data set: for each action class, 80% of the videos are randomly selected as the training set, and the remaining 20% are used as the test set.
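The split protocol above can be sketched in a few lines of Python; this is a minimal illustration under the assumption that the per-class video lists are already available, and the function and variable names are placeholders rather than anything from the patent.

```python
import random

def split_per_class(video_paths_by_class, train_ratio=0.8, seed=0):
    """Randomly split each action class's videos into training and test sets.

    video_paths_by_class maps an action label (e.g. "boxing") to the list of
    its video files; 80% of each class goes to training, 20% to testing.
    """
    rng = random.Random(seed)
    train, test = [], []
    for label, paths in video_paths_by_class.items():
        paths = list(paths)
        rng.shuffle(paths)
        cut = int(round(train_ratio * len(paths)))
        train += [(p, label) for p in paths[:cut]]
        test += [(p, label) for p in paths[cut:]]
    return train, test

# Dictionary capacities used for the K-Means baseline comparison.
KMEANS_DICT_SIZES = [300, 400, 500, 800, 1000, 1500]
```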

[0018] The implementation process of the behavior recognition method...
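Because the paragraph above is truncated, the following is only a hedged sketch of the overall pipeline as listed in the Abstract below, using scikit-learn's AffinityPropagation and SVC; spatio-temporal interest point detection and 3D HOG/HOF extraction are assumed to have already produced per-video descriptor arrays, and all names are illustrative.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.svm import SVC

def bow_histogram(descriptors, vocabulary):
    """Re-describe one video's joint descriptors as a normalized visual-word histogram."""
    # Assign each descriptor to its nearest visual word (AP exemplar).
    dists = np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / (hist.sum() + 1e-12)

def train_pipeline(train_descriptors, train_labels):
    """train_descriptors: list of (n_points_i x d) arrays of joint 3D HOG + 3D HOF features."""
    # 1) Pool all training descriptors and cluster them with Affinity Propagation;
    #    the exemplars form the visual dictionary, whose capacity AP chooses itself.
    pooled = np.vstack(train_descriptors)
    ap = AffinityPropagation(random_state=0).fit(pooled)
    vocabulary = ap.cluster_centers_
    # 2) Re-describe each training video with the visual dictionary.
    X = np.array([bow_histogram(d, vocabulary) for d in train_descriptors])
    # 3) Learn a support vector machine on the bag-of-words histograms.
    clf = SVC(kernel="rbf").fit(X, train_labels)
    return vocabulary, clf

def classify(test_descriptors, vocabulary, clf):
    """Describe the test videos with the same dictionary and predict their behavior classes."""
    X = np.array([bow_histogram(d, vocabulary) for d in test_descriptors])
    return clf.predict(X)
```

The point mirrored here is that the visual dictionary is built once from the training descriptors and then reused unchanged to describe the test videos before SVM classification.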

Abstract

The invention discloses a behavior identification method based on AP clustering bag-of-words modeling. The method comprises the following steps: detecting spatio-temporal interest points in the videos; describing all detected spatio-temporal interest points with 3D HOG and 3D HOF descriptors to obtain joint feature vectors; performing AP clustering on all the feature vectors to generate a visual dictionary, and re-describing the feature vectors with the visual dictionary; describing the feature vectors of the test videos with the same visual dictionary; and learning and classifying the features obtained in the previous two steps with a support vector machine to obtain the behavior classes of the test videos. With the invention, a suitable visual dictionary capacity is obtained in a single pass, so the repeated trials required by the conventional bag-of-words model are unnecessary; the clustering time is greatly reduced, the clustering effect on jointly described local features is better, and the behavior recognition rate is improved.
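A small illustration of the capacity claim above, using scikit-learn on synthetic data (the data, the candidate sizes shown, and the parameters are assumptions for demonstration only): K-Means must be re-run for every candidate dictionary size, whereas Affinity Propagation fixes the number of visual words in a single run.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation, KMeans
from sklearn.datasets import make_blobs

# Synthetic stand-in for the pooled joint descriptors (illustrative only).
descriptors, _ = make_blobs(n_samples=2000, n_features=64, centers=30, random_state=0)

# Conventional bag-of-words: one clustering run per candidate capacity
# (the patent's comparison goes up to 1500).
for k in (300, 400, 500):
    KMeans(n_clusters=k, n_init=10, random_state=0).fit(descriptors)

# AP-based bag-of-words: a single run; the exemplar count is the capacity.
ap = AffinityPropagation(random_state=0).fit(descriptors)
print("dictionary capacity chosen by AP:", len(ap.cluster_centers_))
```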

Description

Technical field

[0001] The invention relates to the fields of image processing, video processing, and pattern recognition, and in particular to video-based human behavior recognition.

Background technology

[0002] At present, in the field of video-based human behavior recognition, methods based on local spatio-temporal interest points have become mainstream because of their good robustness to various disturbances. Such methods describe behavior by directly detecting spatio-temporal interest points on video sequences and extracting low-level features from them. In the classification and recognition phase, the classic bag-of-words model is generally used for behavior modeling and classification. To improve the behavior recognition rate, many current behavior recognition methods based on local features use multiple spatio-temporal interest point descriptors in the feature extraction stage, so the extracted local ...
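As a hedged sketch of the "joint description" of multiple descriptors mentioned above, the 3D HOG and 3D HOF descriptors computed around one spatio-temporal interest point can simply be concatenated into a single feature vector; the dimensions used below are illustrative and are not specified in this excerpt.

```python
import numpy as np

def joint_descriptor(hog3d, hof3d):
    """Concatenate a 3D HOG and a 3D HOF descriptor computed at one interest point."""
    return np.concatenate([np.ravel(hog3d), np.ravel(hof3d)])

# Example with illustrative descriptor sizes (not taken from the patent text).
hog3d = np.random.rand(72)   # e.g. a 72-dimensional 3D HOG descriptor
hof3d = np.random.rand(90)   # e.g. a 90-dimensional 3D HOF descriptor
feature = joint_descriptor(hog3d, hof3d)
print(feature.shape)         # (162,): the joint feature vector
```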

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/00
CPC: G06V40/20; G06V20/41; G06F18/23; G06F18/28; G06F18/214; G06F18/2411
Inventors: 宦若虹, 郭峰, 王楚
Owner: ZHEJIANG UNIV OF TECH