
Human behavior identification method based on three-dimensional convolutional neural network and transfer learning model

A three-dimensional convolutional neural network method in the field of image processing. It addresses the difficulty of distinguishing actions that differ between individuals in completeness, trajectory, range of motion, and speed, as well as incompletely captured actions, and achieves fast recognition speed, high detection accuracy, and improved overall recognition accuracy.

Active Publication Date: 2017-12-22
BEIHANG UNIV
Cites: 10 · Cited by: 93

AI Technical Summary

Problems solved by technology

(1) Considering the distribution of behavior across the time and space dimensions, different individuals may perform the same type of action differently. These differences often stem from each individual's personal understanding of the action, so the same action can vary widely in completeness, motion trajectory, range of motion, and speed, making actions hard to distinguish. (2) Dynamic video contains a great deal of interfering information that does not arise in static image processing: for example, occlusion between individuals or between an individual and the background environment, relative motion between individuals or between an individual and the background during occlusion, changes in light intensity and contrast across the video sequence, camera movement and zooming during shooting, and actions that appear only partially within a video sequence.
All of these difficulties pose great challenges for behavior recognition research in video sequences.

Method used




Embodiment Construction

[0033] To make the purpose, technical solution, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0034] As shown in Figure 1, the present invention is implemented in the following steps:

[0035] Step 1. Read the video, decompose it into a sequence of consecutive single-frame images, and stack those frames to obtain the cube structure required by the neural network, determining the corresponding behavior class label for each cube. That is, starting from the raw video data, the video is disassembled by frame-by-frame sampling into a series of consecutive frame images, which are stacked along the time dimension to obtain many cubes of size w×h×d, each able to fully present an action, where w represents the width of the image, h represents the height of the image,...
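Step 1 above can be sketched as follows. This is a minimal illustration, not the patent's implementation: frames are represented as plain 2-D lists, and the depth `d` and stride values are assumptions (the source fixes only that cubes have size w×h×d).

```python
# Sketch of Step 1: stack d consecutive single-frame images into
# w x h x d input cubes, pairing each cube with its behavior label.
# Frame format, d, and stride are hypothetical choices for illustration.

def frames_to_cubes(frames, label, d=16, stride=16):
    """Group consecutive frames into depth-d cubes.

    frames : list of 2-D arrays (h rows x w cols), one per video frame
    label  : behavior class shared by every cube cut from this video
    d      : temporal depth of each cube (assumed value)
    stride : step between cube start frames (non-overlapping by default)
    """
    cubes = []
    for start in range(0, len(frames) - d + 1, stride):
        cube = frames[start:start + d]        # d frames, each h x w
        cubes.append((cube, label))
    return cubes

# Toy usage: 40 dummy 4x4 frames yield 2 non-overlapping depth-16 cubes
dummy_frames = [[[0] * 4 for _ in range(4)] for _ in range(40)]
pairs = frames_to_cubes(dummy_frames, label="waving")
print(len(pairs))         # 2
print(len(pairs[0][0]))   # 16
```

In practice each cube would be fed to the 3D convolutional network as a single training sample together with its label.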



Abstract

The present invention relates to a human behavior identification method based on a three-dimensional convolutional neural network and a transfer learning model. The method comprises: sampling the video frame by frame, stacking the resulting consecutive single-frame images along the time dimension into an image cube of fixed size, and taking the image cube as the input of the three-dimensional convolutional neural network; training a basic multi-class three-dimensional convolutional neural network model, selecting certain classes of input samples from the test results to construct a sub-dataset, training several binary (dichotomy) models on that sub-dataset, and keeping the models with the best binary results; and finally, using transfer learning to transfer the knowledge learned by these models back into the original multi-class model and retraining the transferred multi-class model. The multi-class identification accuracy is thereby improved, realizing human behavior identification with high accuracy.
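The sub-dataset selection step in the abstract (picking the classes the multi-class model confuses, to train binary models on) can be illustrated with a confusion-matrix analysis. This is a hedged sketch: the selection criterion and the matrix values below are assumptions for illustration, not taken from the patent.

```python
# Sketch: from a multi-class test result (confusion matrix), pick the
# most mutually confused class pairs; the binary (dichotomy) models
# would then be trained on sub-datasets of just those classes.

def most_confused_pairs(conf, top_k=2):
    """conf[i][j] = count of class-i samples predicted as class j."""
    scored = []
    n = len(conf)
    for i in range(n):
        for j in range(i + 1, n):
            # symmetric confusion between classes i and j
            scored.append((conf[i][j] + conf[j][i], (i, j)))
    scored.sort(reverse=True)
    return [pair for _, pair in scored[:top_k]]

# Toy 3-class confusion matrix: classes 0 and 2 confuse each other most
conf = [
    [50,  2,  8],
    [ 1, 55,  4],
    [ 9,  3, 48],
]
print(most_confused_pairs(conf))   # [(0, 2), (1, 2)]
```

After the binary models are trained on these pairs, the abstract's final step transfers their learned weights back into the multi-class network before retraining.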

Description

Technical Field

[0001] The invention relates to image processing technology in video, in particular to a human behavior recognition method based on a three-dimensional convolutional neural network and a transfer learning model.

Background

[0002] In today's society, with the rapid development of storage devices, Internet technology, and social networks, large-scale video data is being generated, and using these data for target recognition and behavior analysis has become a growing demand. Intelligent security monitoring, customer shopping-behavior analysis, smart home systems, somatosensory games, and the recognition of pedestrians' actions during unmanned driving all rely on efficient, high-precision human behavior recognition systems. The purpose of human behavior recognition is to classify and recognize the behavior or actions of one or more people in a video; the research object is often a series of video sequence...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/08; G06N99/00
CPC: G06N3/08; G06N20/00; G06V40/20; G06F18/241
Inventors: 王田 (Wang Tian), 陈阳 (Chen Yang), 乔美娜 (Qiao Meina), 陶飞 (Tao Fei)
Owner BEIHANG UNIV