Human body behavior identification method based on convolutional neural network

A convolutional neural network and recognition method technology, applied to biological neural network models, neural architectures, and character and pattern recognition, to achieve accurate results and reduce the computational complexity.

Active Publication Date: 2019-08-16
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0006] Technical problem: The technical problem to be solved by the invention is to provide a system that applies convolutional neural networks to human behavior recognition from input human posture sequences, improves the accuracy of human behavior recognition, and reduces the computational complexity of the learning model.




Embodiment Construction

[0039] In a specific implementation, Figure 1 shows the flow of the human behavior recognition method based on a convolutional neural network.

[0040] This example uses the MSRAction3D dataset, which was captured with a Microsoft Kinect v1 depth camera and contains 20 action classes.

[0041] First, the system sequentially acquires the human skeleton sequences in the dataset. In the received joint pose sequence, given N frames [F_1, F_2, ..., F_N] of the human skeleton sequence s, let (x_i, y_i, z_i) be the 3D coordinates of the i-th human joint in the n-th frame F_n of s, where n ∈ [1, N].
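As an illustration of this representation (not part of the patent text), the following Python sketch stores a skeleton sequence as an array of shape (N, J, 3); the joint count J = 20 and the helper name load_skeleton_sequence are assumptions made only for the example.

```python
import numpy as np

# Minimal sketch: a skeleton sequence s with N frames, where each frame F_n
# holds the (x, y, z) coordinates of J = 20 joints (the joint count used by
# MSRAction3D skeletons), is stored as an array of shape (N, J, 3).

def load_skeleton_sequence(frames):
    """frames: iterable of per-frame joint lists, each [(x, y, z), ...]."""
    s = np.asarray(frames, dtype=np.float32)            # shape (N, J, 3)
    assert s.ndim == 3 and s.shape[2] == 3, "expected (N, J, 3) coordinates"
    return s

# Example: a toy sequence of N = 4 frames with 20 joints each.
toy = np.random.rand(4, 20, 3)
s = load_skeleton_sequence(toy)
print(s.shape)  # (4, 20, 3)
```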

[0042] Next, the 3D coordinates (x_i, y_i, z_i) of each human joint in s are transformed into the 3D joint coordinates (x′_i, y′_i, z′_i) in the normalized space s′. The normalized coordinates of all frames are stacked to form a time series [F′_1, F′_2, ..., F′_N] representing the entire action sequence, and these elements are quantized into the RGB color space, in the order of the two arms, the torso, and...
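The excerpt does not spell out the exact normalization or quantization, so the following Python sketch is illustrative only: it assumes per-axis min-max scaling over the whole sequence for the normalized space s′ and 8-bit quantization of (x′, y′, z′) into (R, G, B), turning the stacked frames into an N × J color image. The function name sequence_to_descriptor_image is hypothetical.

```python
import numpy as np

# Illustrative sketch of the normalization and RGB quantization step.
# Assumptions (not stated verbatim in the excerpt): the normalized space s'
# is obtained by per-axis min-max scaling over the whole sequence, and each
# normalized (x', y', z') triple is quantized to an 8-bit (R, G, B) pixel,
# so the stacked frames [F'_1, ..., F'_N] become an N x J color image.

def sequence_to_descriptor_image(s):
    """s: array of shape (N, J, 3) with raw joint coordinates."""
    mins = s.reshape(-1, 3).min(axis=0)              # per-axis minimum
    maxs = s.reshape(-1, 3).max(axis=0)              # per-axis maximum
    s_norm = (s - mins) / (maxs - mins + 1e-8)       # normalized space s'
    rgb = np.round(s_norm * 255).astype(np.uint8)    # quantize to RGB
    return rgb                                       # shape (N, J, 3) image

img = sequence_to_descriptor_image(np.random.rand(4, 20, 3))
print(img.shape, img.dtype)  # (4, 20, 3) uint8
```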



Abstract

The invention provides a human behavior recognition method based on a convolutional neural network, the method comprising the following steps: collecting images of different human actions, each image having a human skeleton sequence; forming a mobile skeleton descriptor image and a mobile joint descriptor image of the human body image according to the human skeleton sequence; training one convolutional neural network on the mobile skeleton descriptor images and another on the mobile joint descriptor images of the different human body actions; inputting the image of the human body action to be identified into each of the two trained convolutional neural networks to obtain a score for each human body action from each network; and adding the two scores corresponding to the same human body action and taking the action with the highest summed score as the human body behavior identification result. With this method, a convolutional neural network can be used for human body behavior recognition on the input human body posture sequence, the accuracy of human body behavior recognition is improved, and the computational complexity of the learning model is reduced.
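As a rough illustration of the final fusion step only (the two CNNs themselves are omitted), the sketch below assumes each trained network outputs one score per action class; the scores for the same action are added and the highest-scoring action is returned. The name fuse_scores and the action labels are placeholders, not part of the patent.

```python
import numpy as np

# Minimal sketch of the late score fusion described in the abstract: two
# separately trained CNNs each produce a per-class score vector for the
# action to be identified, the scores for the same action are added, and
# the highest-scoring action is returned. `skeleton_scores` and
# `joint_scores` stand in for the two networks' outputs.

def fuse_scores(skeleton_scores, joint_scores, action_names):
    total = np.asarray(skeleton_scores) + np.asarray(joint_scores)
    return action_names[int(np.argmax(total))]

actions = ["wave", "clap", "kick"]                    # illustrative labels
print(fuse_scores([0.2, 0.5, 0.3], [0.1, 0.7, 0.2], actions))  # "clap"
```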

Description

Technical field

[0001] The invention relates to a human behavior recognition method based on a convolutional neural network, belonging to the interdisciplinary technical fields of behavior recognition, deep learning, and machine vision.

Background technique

[0002] Human behavior recognition is an important research topic in the field of computer vision, with important theoretical significance and practical application value.

[0003] With the development of science and technology, there are currently two main ways to obtain the human skeleton: joint point estimation from RGB images, or direct acquisition through depth cameras (such as Kinect). The depth camera is becoming one of the most commonly used sensors for human behavior recognition.

[0004] At present, the use of depth maps and human skeleton data for human action recognition has become very popular, but there are still some limitations in the existing technology. First of all, the traditional depth ma...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V40/20, G06N3/045, G06F18/214
Inventor: 赵立昌, 陈志, 岳文静, 吴宇晨, 孙斗南, 周传
Owner: NANJING UNIV OF POSTS & TELECOMM