
Behavior identification method based on 3D convolution neural network

A convolutional neural network and recognition method technology, applied in the fields of feature matching, machine learning, pattern recognition and video image processing, which solves the problem of lacking classification ability for short-term simple actions and achieves the effects of high accuracy, avoidance of over-fitting, and low complexity.

Inactive Publication Date: 2015-01-14
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

Another method uses a storyline to describe the causal relationships between behaviors, with AND-OR graphs as the mechanism for representing the storyline model; this approach lacks the ability to classify short-term simple actions.



Examples


Embodiment Construction

[0030] Training uses the BP (back-propagation) algorithm, but the network structure of a CNN differs substantially from that of a traditional neural network, so the BP algorithm used for a CNN also differs from the traditional one. Since a CNN is composed mainly of alternating convolutional and down-sampling layers, the two layer types have different formulas for back-propagating the error δ.
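The alternating convolutional / down-sampling structure described above can be illustrated with a small shape-tracking sketch. The layer sizes here are assumptions for illustration only (the patent does not specify them), assuming valid convolution and non-overlapping pooling:

```python
# Illustrative alternating structure of a 3D CNN (kernel/window sizes are assumptions)
layers = [
    ("conv3d", {"kernel": (3, 3, 3)}),   # convolutional layer
    ("pool3d", {"window": (2, 2, 2)}),   # down-sampling layer
    ("conv3d", {"kernel": (3, 3, 3)}),
    ("pool3d", {"window": (2, 2, 2)}),
]

def output_shape(shape, layers):
    """Track a (frames, height, width) volume through alternating
    conv/pool layers (valid convolution, non-overlapping pooling)."""
    for kind, params in layers:
        if kind == "conv3d":
            shape = tuple(s - k + 1 for s, k in zip(shape, params["kernel"]))
        else:
            shape = tuple(s // w for s, w in zip(shape, params["window"]))
    return shape

print(output_shape((16, 32, 32), layers))  # → (2, 6, 6)
```

Each conv layer shrinks the volume by kernel−1 per dimension and each pooling layer divides it by the window size, which is why the backward δ propagation must be handled differently for the two layer types.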

[0031] Using the square error cost function, the formula for the output-layer δ is:

[0032] δ^L = (y − t) ∘ f′(u^L)

[0033] where y is the actual output vector of the network and t is the expected label vector, both with n components; f is the sigmoid function; ∘ is the Schur (element-wise) product, i.e. the corresponding elements of the two vectors are multiplied; and u is the weighted sum of the outputs of the previous layer's nodes, calculated as follows:

[0034] u^l = W^l x^(l−1) + b^l

[0035] That is, the output x^(l−1) of layer l−1 is multiplied by the weight matrix W^l of layer l, and the bias b^l is added.
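A minimal numerical sketch of the two formulas above, assuming a sigmoid output layer and the square-error cost (all array names and sizes are illustrative):

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Forward pass of one fully connected layer: u^l = W^l x^(l-1) + b^l
rng = np.random.default_rng(0)
x_prev = rng.random(4)          # output x of layer l-1
W = rng.random((3, 4))          # weights W of layer l
b = rng.random(3)               # bias b of layer l
u = W @ x_prev + b              # weighted sum u
y = sigmoid(u)                  # actual output vector y

# Output-layer delta with square-error cost:
# delta = (y - t) ∘ f'(u), where f'(u) = f(u)(1 - f(u)) for the sigmoid
t = np.array([1.0, 0.0, 0.0])   # expected label vector
delta = (y - t) * y * (1 - y)   # Schur (element-wise) product
```

The sigmoid derivative f′(u) = f(u)(1 − f(u)) lets δ be written entirely in terms of the already-computed output y, which is why the forward activations are kept during back-propagation.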

[0036] The...



Abstract

The invention discloses a behavior identification method based on a 3D convolutional neural network, relating to the fields of machine learning, feature matching, pattern recognition and video image processing. The method is divided into two phases: an off-line training phase and an on-line identification phase. In the off-line training phase, sample videos of various behaviors are input and different outputs are obtained through calculation, each output corresponding to one type of behavior; parameters in the calculation process are adjusted according to the error between the output vector and the label vector so that the error over all outputs is reduced, and once the errors meet the requirements, each output is labeled with the behavior name of its corresponding sample video. In the on-line identification phase, videos requiring behavior identification are input and processed by the same calculation as in the training phase to obtain outputs; each output is matched against the labeled sample vectors, and the name of the best-matching sample label is taken as the behavior name of the corresponding input video. The behavior identification method has the advantages of low complexity, small calculation amount, high real-time performance and high accuracy.
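The on-line matching step can be sketched as follows. This is a schematic illustration, not the patent's exact procedure: the behavior names, vector values, and the choice of nearest Euclidean distance as the matching criterion are all assumptions.

```python
import numpy as np

# Labeled sample vectors produced by the off-line training phase
# (names and values are illustrative)
labeled_samples = {
    "walking": np.array([0.9, 0.1, 0.0]),
    "running": np.array([0.1, 0.8, 0.1]),
    "waving":  np.array([0.0, 0.1, 0.9]),
}

def identify(output_vec):
    """Return the behavior name whose labeled sample vector best
    matches the network's output (nearest Euclidean distance)."""
    return min(labeled_samples,
               key=lambda name: np.linalg.norm(labeled_samples[name] - output_vec))

print(identify(np.array([0.85, 0.15, 0.05])))  # → walking
```

Because the same calculation is applied in both phases, an input video of a trained behavior type should produce an output vector close to that behavior's labeled sample vector.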

Description

Technical field

[0001] The present invention relates to the field of computer vision, in particular to methods of machine learning, feature matching, pattern recognition and video image processing.

Background technique

[0002] Computer-based behavior recognition is the understanding and description of human behavior from videos or image sequences containing people, and belongs to the category of image analysis and understanding. The ability to automatically detect people and understand their behavior is the core function of an intelligent video system. In recent years, driven by social needs including industrial security, interactive interfaces and games, interest in human behavior recognition has increased. The research content of human behavior recognition is very rich, mainly involving pattern recognition, machine learning, image processing, artificial intelligence and other disciplines. Three existing mainstream technical solutions used in behavior recognition are descri...

Claims


Application Information

IPC(8): G06K9/62; G06N3/02
CPC: G06V40/23; G06V20/46; G06V10/28; G06F18/2411
Inventor: Hao Zongbo, Sang Nan, Wu Jie, Yu Dong
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA