
Dense trajectory covariance descriptor-based behavior recognition method

A behavior recognition method and descriptor technology applied in the field of video processing. It addresses the problems that behavior recognition results have low accuracy, that the dynamic characteristics of the behavior subject are not considered, and that the subject's motion cannot be accurately described, and it achieves the effect of improving the description ability.

Active Publication Date: 2017-09-22
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

Although this method takes the relationships between different features into account, it considers only static features and ignores the dynamic features of the behavior subject, so it cannot accurately describe the subject's motion, which results in low behavior recognition accuracy.



Embodiment Construction

[0023] Embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

[0024] Referring to Figure 1, the steps of the behavior recognition method based on the dense trajectory covariance descriptor of the present invention are as follows:

[0025] Step 1: Densely sample the video sequence and calculate the dense optical flow f at each sampling point.

[0026] (1.1) Perform grid sampling on each video frame every w pixels to obtain the sampling points, where the parameter w is set to 5;

[0027] (1.2) Calculate the optical flow at the sampling points obtained in (1.1) using the Gunnar Farneback algorithm:

[0028] (1.2a) Express the neighborhood pixel values of each pixel in the image as a quadratic polynomial:

[0029] f(x) = x^T A x + b^T x + c,

[0030] Among them, f(x) represents the pixel value at coordinate x within the neighborhood, A is a symmetric matrix, b is a vector, and c is a scalar offset. These parameters can be estimated ...
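As an informal illustration of steps (1.1) and (1.2), the sketch below samples each frame on a grid with spacing w = 5 and computes the dense optical flow between consecutive frames using OpenCV's implementation of the Gunnar Farneback algorithm; the input file name and the Farneback pyramid/window parameters are illustrative assumptions, not values taken from the patent.

    import cv2
    import numpy as np

    W = 5  # grid sampling step; the patent sets w = 5


    def grid_sample_points(gray, w=W):
        """Step (1.1): sample the frame on a regular grid every w pixels."""
        h, width = gray.shape
        ys, xs = np.mgrid[0:h:w, 0:width:w]
        return np.stack([xs.ravel(), ys.ravel()], axis=1)  # (N, 2) array of (x, y)


    def farneback_flow(prev_gray, next_gray):
        """Step (1.2): dense optical flow via the Gunnar Farneback algorithm.
        The pyramid/window parameters below are common defaults, not patent values."""
        return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)


    cap = cv2.VideoCapture("video.avi")  # hypothetical input video
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = farneback_flow(prev_gray, gray)      # H x W x 2 flow field f
        pts = grid_sample_points(gray)              # dense sampling points
        flow_at_pts = flow[pts[:, 1], pts[:, 0]]    # optical flow f at each point
        prev_gray = gray
    cap.release()

OpenCV's calcOpticalFlowFarneback internally fits the quadratic polynomial expansion of step (1.2a), so the coefficients A, b and c do not need to be estimated by hand in this sketch.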


Abstract

The invention discloses a dense trajectory covariance descriptor-based behavior recognition method, which mainly aims to solve the problem that behavior recognition accuracy is low because the prior art does not consider the correlation between different features and cannot correctly describe the movements of behavior subjects. The method comprises the following steps: 1) extract the dense trajectories of a video and, for each pixel in a trajectory cube, compute the gradient, the spatial position, the time derivatives of the gradient, the optical flow and the motion boundary, and take these as the bottom-layer features; 2) form the bottom-layer feature set, compute its covariance matrix and project the covariance matrix onto a Euclidean space to obtain the descriptors of the trajectory sub-blocks; 3) concatenate the descriptors of the trajectory sub-blocks to obtain the dense trajectory-based covariance matrix descriptor; and 4) apply BOW coding to the covariance matrix descriptors and then perform behavior recognition with a linear SVM classification model. The method improves the behavior description ability and the recognition accuracy, and can be used in complex video monitoring environments.
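To make steps 2) and 3) of the abstract concrete, the sketch below computes the covariance matrix of the bottom-layer feature vectors of one trajectory sub-block, maps it to Euclidean space with the matrix logarithm (a standard log-Euclidean projection for covariance descriptors), vectorizes its upper triangle, and concatenates the sub-block descriptors of a trajectory; the feature dimensionality, sub-block count and regularization term are illustrative assumptions rather than values stated in the patent.

    import numpy as np
    from scipy.linalg import logm


    def covariance_descriptor(features, eps=1e-5):
        """features: (N, d) bottom-layer feature vectors (gradient, spatial position,
        time derivatives of the gradient, optical flow, motion boundary, ...) for the
        pixels of one trajectory sub-block. Returns the upper triangle of log(C)."""
        C = np.cov(features, rowvar=False)      # d x d covariance matrix
        C += eps * np.eye(C.shape[0])           # regularize so C stays positive definite
        L = logm(C).real                        # projection onto Euclidean space
        iu = np.triu_indices(L.shape[0])
        return L[iu]                            # d*(d+1)/2-dimensional sub-block descriptor


    def trajectory_descriptor(subblock_features):
        """Concatenate the sub-block descriptors of one dense trajectory (step 3)."""
        return np.concatenate([covariance_descriptor(f) for f in subblock_features])


    # toy usage: 3 sub-blocks of 200 pixels each, 10-dimensional bottom-layer features
    rng = np.random.default_rng(0)
    subblocks = [rng.normal(size=(200, 10)) for _ in range(3)]
    print(trajectory_descriptor(subblocks).shape)   # (165,) = 3 * 10*11/2

In the full method, the trajectory descriptors obtained this way would then be BOW-encoded and fed to a linear SVM classifier, as step 4) of the abstract describes.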

Description

Technical field

[0001] The invention belongs to the technical field of video processing, and in particular relates to a behavior recognition method, which can be used to describe video behavior in complex video monitoring environments.

Background technique

[0002] In the field of action recognition, hand-crafted local features have become an effective way of feature representation. Local features do not require specific algorithms to detect human body parts, and are robust to effects such as complex backgrounds, illumination changes, and video noise.

[0003] Typical local features include spatio-temporal interest points (STIP), cuboids, and dense trajectories, and they are often used in combination with descriptors such as the histogram of oriented gradients (HOG), the histogram of optical flow (HOF), the 3D histogram of oriented gradients (HOG3D), the motion boundary histogram (MBH), and extended speeded-up robust features (ESURF).

[0004] The extraction of local features mainly consists ...


Application Information

IPC(8): G06K9/00, G06K9/62, G06T7/20, G06T7/246
CPC: G06T7/20, G06T7/246, G06T2207/20081, G06T2207/10016, G06V20/40, G06F18/2411, G06F18/214
Inventor: 同鸣, 闫娜, 赵梦傲, 汪厚峄
Owner: XIDIAN UNIV