A human body behavior identification method based on multilayer depth features

A technology combining deep features and recognition methods, applied in character and pattern recognition, instruments, computer parts, etc. It addresses problems such as low algorithm recognition rates and complex recognition models, achieving the effects of a simpler model, stronger discriminative power, and improved accuracy.

Inactive Publication Date: 2019-04-02
NANJING UNIV OF POSTS & TELECOMM
Cites: 3 · Cited by: 8
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] In view of the above defects of, or needs for improvement in, the prior art, the present invention provides a human behavior recognition method based on multi-layer depth features. It simultaneously considers the classification ability of the fully connected layer of a deep learning model and the semantic context description ability of the convolutional layer to generate video feature expressions with strong discrimination, so as to improve the descriptive power of the visual features and solve the technical problems of complex human behavior recognition models and low recognition rates in the prior art.




Embodiment Construction

[0024] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments. Examples of the embodiments are shown in the accompanying drawings. The specific embodiments described below are only illustrative and are intended to explain the present invention, not to limit it.

[0025] The present invention provides a human behavior recognition method based on multi-layer depth features which, as shown in figure 1, includes the following steps:

[0026] S1, train a deep learning model on the target database;

[0027] S2, input the sample into the deep learning model, and extract the feature map of the top convolutional layer and the features of the top fully connected layer;

[0028] S3, perform a maximum pooling operation on each channel of the feature map ...
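The pooling-and-fusion steps (S2 onward) can be sketched in Python as follows. This is an illustrative reconstruction only: the function name, array shapes, and toy values are assumptions, not taken from the patent.

```python
import numpy as np

def fuse_multilayer_features(conv_map, fc_feat):
    """Fuse top conv-layer and top FC-layer features (sketch of the steps above).

    conv_map: (C, H, W) feature map from the top convolutional layer
    fc_feat:  (D,) feature vector from the top fully connected layer
    Returns a single column-style vector of length C + D.
    """
    # Max-pool each channel of the conv feature map down to one scalar,
    # giving a (C,) column vector of per-channel responses.
    pooled = conv_map.reshape(conv_map.shape[0], -1).max(axis=1)
    # Concatenate the pooled conv vector with the FC feature to form
    # the final video feature expression.
    return np.concatenate([pooled, fc_feat])

# Toy example: a 4-channel 2x2 feature map and an 8-dim FC feature.
conv_map = np.arange(16, dtype=float).reshape(4, 2, 2)
fc_feat = np.ones(8)
video_feat = fuse_multilayer_features(conv_map, fc_feat)
print(video_feat.shape)  # (12,) — 4 pooled channels + 8 FC dimensions
```

The fused vector would then be passed to a classifier such as a support vector machine, as the later steps describe.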



Abstract

The invention provides a human body behavior recognition method based on multilayer depth features. The method comprises the following steps: training a deep learning model on a target database; inputting a sample into the deep learning model and extracting the top convolutional layer feature map and the top fully connected layer features; performing a maximum pooling operation on each channel of the top convolutional layer feature map and connecting the pooling results into a column vector; and connecting the column vector formed by the pooling results with the top fully connected layer features to serve as the final video feature expression, completing the behavior recognition task in combination with a support vector machine. The method draws on features of both the top fully connected layer and the top convolutional layer of a deep learning model. By considering the classification capability of the fully connected layer and the semantic context description capability of the convolutional layer, and by fusing the complementarity and respective advantages of the multi-layer features, the discriminative power of the video feature expression is improved, raising both the recognition precision and the operating efficiency of the algorithm.
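The final step couples the fused feature with a support vector machine. As a minimal sketch of that stage, the following trains a hinge-loss linear SVM by sub-gradient descent — a pedagogical stand-in for an off-the-shelf SVM solver; the training data, hyperparameters, and all names here are synthetic assumptions, not from the patent.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Minimal hinge-loss linear SVM via sub-gradient descent.

    X: (n, d) feature matrix (e.g. fused video features), y: labels in {-1, +1}.
    Minimizes lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w + b))).
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # samples violating the margin
        grad_w = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy "fused video features": two well-separated behavior classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 12)),
               rng.normal(3.0, 1.0, (20, 12))])
y = np.array([-1] * 20 + [1] * 20)

w, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(acc)
```

In practice a library SVM (with a kernel, multi-class handling, and proper regularization tuning) would replace this sketch.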

Description

technical field

[0001] The invention relates to a human behavior recognition method, in particular to a human behavior recognition method based on multi-layer depth features, and belongs to the technical field of video behavior recognition.

Background technique

[0002] Video-based behavior recognition has a wide range of application scenarios and market demand in many fields, such as intelligent security monitoring, intelligent robots, human-computer interaction, and video-based retrieval. In recent years, although behavior recognition methods have emerged in an endless stream, problems such as background interference, occlusion, and intra-class variation mean that learning to express behavioral features with strong discrimination is still a focus and difficulty in the field of computer vision.

[0003] With the improvement in the performance of computing devices and the advent of the era of big data, deep learning has become an effective tool to solve the proble...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/20; G06V20/41
Inventors: 盛碧云, 肖甫, 李群, 沙乐天, 黄海平, 沙超
Owner: NANJING UNIV OF POSTS & TELECOMM