3D (three-dimensional) convolutional neural network based human body behavior recognition method

A convolutional neural network recognition technology, applied in the field of human behavior recognition based on 3D convolutional neural networks, achieving the effects of highly representative extracted features, fast extraction speed, and strong anti-interference capability

Inactive Publication Date: 2015-12-16
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The present invention solves the problem of human behavior recognition by adopting a 3D convolutional neural network.



Examples


Detailed Description of the Embodiments

[0044] The specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings.

[0045] With reference to Figure 1, the specific steps of the present invention are described as follows:

[0046] Step 1, video input.

[0047] Input the videos of the six action classes in the KTH dataset (walking, jogging, running, boxing, handwaving, and handclapping) into the computer, read the videos frame by frame, and obtain the image information.
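The patent does not specify how this frame-by-frame reading is implemented. The following is a minimal sketch, assuming OpenCV and a hypothetical local directory layout in which each KTH action class has its own folder of .avi files; the function names and paths are illustrative only.

```python
# Sketch of the video-input step: read KTH videos frame by frame.
# Assumptions: OpenCV is available; videos are stored as <root>/<action>/*.avi
# (hypothetical layout, not specified by the patent).
import os
import cv2

ACTIONS = ["walking", "jogging", "running", "boxing", "handwaving", "handclapping"]

def read_frames(video_path):
    """Read one video file frame by frame and return its frames as a list."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Convert to a single grayscale channel (KTH footage is effectively grayscale).
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    return frames

def load_kth(root_dir):
    """Collect the frames of every video for each of the six action classes."""
    data = {}
    for action in ACTIONS:
        action_dir = os.path.join(root_dir, action)
        data[action] = [read_frames(os.path.join(action_dir, name))
                        for name in sorted(os.listdir(action_dir))
                        if name.endswith(".avi")]
    return data
```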

[0048] Step 2, preprocessing.

[0049] In the first step, images with obvious human behavior characteristics are screened from the image information, and the screened images are saved. The image information of each human behavior is inspected, and blank images containing no human body, as well as images showing less than two-thirds of the body's limbs, are manually deleted.

[0050] In the second step, the screened images are resized to a uniform size of 120×160 pixels to obtain an image of the ...
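As a sketch of this size-normalization step, again assuming OpenCV (the patent does not name a library), the screened frames can be resized to the stated 120×160 pixels as follows; note that cv2.resize takes its target size as (width, height).

```python
import cv2

def normalize_size(frames, height=120, width=160):
    """Resize each screened frame to a uniform 120x160 pixels."""
    # cv2.resize expects (width, height); INTER_AREA is a reasonable choice for downscaling.
    return [cv2.resize(f, (width, height), interpolation=cv2.INTER_AREA) for f in frames]
```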



Abstract

The present invention discloses a 3D (three-dimensional) convolutional neural network based human body behavior recognition method, which is mainly used to solve the problem of recognizing specific human body behaviors in the fields of computer vision and pattern recognition. The method is implemented in the following steps: (1) video input; (2) preprocessing to obtain a training sample set and a test sample set; (3) construction of a 3D convolutional neural network; (4) feature vector extraction; (5) classification training; and (6) output of the test result. In the disclosed method, human body detection and motion estimation are implemented using an optical flow method, so that a moving object can be detected without any prior knowledge of the scene. The method performs especially well when the network input is a multi-dimensional image, and allows images to be used directly as the network input, thereby avoiding the complex feature extraction and data reconstruction of conventional recognition algorithms and making recognition of human body behavior more accurate.
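The abstract names the key stages (constructing a 3D convolutional network, extracting a feature vector, classification) but this page does not give the concrete architecture. The sketch below is a minimal illustrative 3D CNN for clips of stacked grayscale frames, written with PyTorch as an assumption; the clip length, channel counts, and layer sizes are hypothetical and are not the exact network claimed by the patent.

```python
# Minimal illustrative 3D CNN for action clips (not the patent's exact architecture).
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    def __init__(self, num_classes=6, clip_len=9, height=120, width=160):
        super().__init__()
        self.features = nn.Sequential(
            # A 3D convolution mixes information across frames (time) as well as space.
            nn.Conv3d(1, 8, kernel_size=(3, 5, 5)),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),
            nn.Conv3d(8, 16, kernel_size=(3, 5, 5)),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),
        )
        # Infer the flattened feature-vector size from a dummy clip.
        with torch.no_grad():
            dummy = torch.zeros(1, 1, clip_len, height, width)
            feat_dim = self.features(dummy).numel()
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        # x: (batch, 1 channel, frames, height, width)
        f = self.features(x).flatten(1)   # feature vector per clip
        return self.classifier(f)         # scores for the six KTH actions

# Usage example on a random batch of two 9-frame, 120x160 clips.
model = Simple3DCNN()
clip = torch.randn(2, 1, 9, 120, 160)
scores = model(clip)  # shape: (2, 6)
```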

Description

Technical Field
[0001] The invention belongs to the technical field of image processing, and further relates to a human behavior recognition method based on a 3D convolutional neural network in the technical field of computer vision. The invention can be used in intelligent monitoring systems to identify abnormal human behaviors in the environment, and can also be used in sports training to standardize the actions of athletes.
Background Technique
[0002] Previous human action recognition methods rely on harsh assumptions about the application scenario, such as small changes in the scale and viewing angle of the target. These assumptions are difficult to satisfy in the real world.
[0003] Currently, most methods in this area follow two steps: (1) extract complex hand-crafted features from the original input; (2) learn a classifier on the acquired features.
[0004] In real-world scenarios, it is difficult to know in advance which features are important for a ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/20, G06F18/2411
Inventor: 韩红, 焦李成, 叶旭庆, 张鼎, 王伟, 李阳阳, 马文萍, 王爽
Owner: XIDIAN UNIV