
Human body action recognition method based on cyclic convolutional neural network

A human action recognition method based on neural network technology, applied in the fields of image classification, pattern recognition, and machine learning, which addresses the problem of low accuracy in human action recognition.

Active Publication Date: 2019-11-26
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 7, Cited by: 15

AI Technical Summary

Problems solved by technology

[0004] In view of the above-mentioned research problems, the object of the present invention is to provide a human action recognition method based on a cyclic convolutional neural network, which solves the prior-art problem of low human action recognition accuracy caused by variation within and between action categories and by the fact that a video is composed of continuous frames.


Image

  • Human body action recognition method based on cyclic convolutional neural network (three drawings)

Examples


Detailed Description of the Embodiments

[0052] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0053] A human action recognition method based on a cyclic convolutional neural network, which can be widely applied to video-based category-similarity recognition, comprises the following steps:

[0054] S1. Construct a data set: randomly select sequence pairs of equal length from a public data set, where each frame in each sequence includes an RGB image and an optical flow image. The public data set is the UCF101-split1, HMDB51, UCFSPORT, or UCF11 data set, and the two action clips in a sequence pair come either from the same action category or from different action categories. Specifically: first cut the video sequences in the public data set into fixed-length action segments, obtaining multiple sequences, then randomly select a pair of segments, i.e., a sequence pair, which may come from the same action category (a positive pair) or from different action categories (a negative pair).
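As an illustration of step S1, the following is a minimal Python sketch of how such equal-length sequence pairs might be sampled; the segment length, the in-memory layout of the data set, and the helper names are assumptions made for this example rather than details fixed by the patent.

```python
import random

def cut_into_segments(frames, seg_len):
    """Cut one video (a list of (rgb, flow) frame pairs) into
    non-overlapping fixed-length segments; assumes len(frames) >= seg_len."""
    return [frames[i:i + seg_len]
            for i in range(0, len(frames) - seg_len + 1, seg_len)]

def sample_sequence_pair(dataset, seg_len=16, positive=None):
    """Randomly draw a sequence pair of equal length.

    `dataset` is assumed to map an action category name to a list of
    videos, each video being a list of (rgb_image, optical_flow_image)
    frames. If `positive` is None, the pair is positive or negative
    with equal probability.
    """
    if positive is None:
        positive = random.random() < 0.5
    categories = list(dataset.keys())
    if positive:
        cat_a = cat_b = random.choice(categories)      # same action category
    else:
        cat_a, cat_b = random.sample(categories, 2)    # different categories
    seg_a = random.choice(cut_into_segments(random.choice(dataset[cat_a]), seg_len))
    seg_b = random.choice(cut_into_segments(random.choice(dataset[cat_b]), seg_len))
    return seg_a, seg_b, int(positive)  # label 1 = positive pair, 0 = negative
```

A training set can then be built by drawing pairs repeatedly, e.g. `pairs = [sample_sequence_pair(dataset) for _ in range(n_pairs)]`.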



Abstract

The invention discloses a human body action recognition method based on a cyclic convolutional neural network, belonging to the fields of image classification, pattern recognition and machine learning, which solves the problem of low human body action recognition accuracy caused by variation within and between action categories and by video being composed of continuous frames. The method comprises: constructing a data set, namely randomly selecting sequence pairs of the same length from a public data set, with each frame in each sequence comprising an RGB image and an optical flow image; constructing a twin network, wherein each branch of the twin network sequentially comprises a CNN layer, an RNN layer and a Temporal Pooling layer; constructing an "identification-verification" joint loss function; training the constructed deep convolutional neural network with the "identification-verification" joint loss function on the data set; and passing the to-be-recognized human body action sequence pair through the trained deep convolutional neural network and the trained "identification-verification" joint loss function to obtain the action category recognition result of the sequence pair. The method is used for human body action recognition in images.
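To make the described architecture concrete, below is a minimal PyTorch sketch of one branch of such a twin network (per-frame CNN, then RNN, then temporal pooling, with weights shared between the two branches) together with an "identification-verification" joint loss. The backbone depth, feature sizes, choice of GRU, contrastive form of the verification term, and loss weight `lam` are illustrative assumptions, not the patent's exact configuration, and for brevity only the RGB stream is fed in (the patent also uses an optical flow image per frame).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Branch(nn.Module):
    """One branch of the twin network: per-frame CNN -> RNN -> temporal pooling.
    The same Branch instance is applied to both sequences, so weights are shared."""
    def __init__(self, feat_dim=128, num_classes=101):
        super().__init__()
        self.cnn = nn.Sequential(              # toy per-frame feature extractor
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(64, feat_dim, batch_first=True)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, seq):                    # seq: (B, T, 3, H, W)
        b, t = seq.shape[:2]
        frame_feats = self.cnn(seq.flatten(0, 1)).view(b, t, -1)
        hidden, _ = self.rnn(frame_feats)      # (B, T, feat_dim)
        feat = hidden.mean(dim=1)              # temporal (average) pooling
        return feat, self.classifier(feat)

def joint_loss(branch, seq_a, seq_b, label_a, label_b, same, margin=1.0, lam=0.5):
    """'Identification-verification' joint loss: cross-entropy on each
    sequence's action category (identification) plus a contrastive term on
    the pair's feature distance (verification); `same` is 1.0 for a
    positive pair and 0.0 for a negative pair."""
    feat_a, logits_a = branch(seq_a)
    feat_b, logits_b = branch(seq_b)
    ident = F.cross_entropy(logits_a, label_a) + F.cross_entropy(logits_b, label_b)
    dist = F.pairwise_distance(feat_a, feat_b)
    verif = same * dist.pow(2) + (1.0 - same) * F.relu(margin - dist).pow(2)
    return ident + lam * verif.mean()
```

At inference time, the identification head of either branch gives each sequence's action category, while the feature distance between the two branches indicates whether the pair belongs to the same category.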

Description

Technical field

[0001] A human action recognition method based on a cyclic convolutional neural network is used for human action recognition in images, and belongs to the fields of image classification, pattern recognition and machine learning.

Background technique

[0002] Human action recognition is one of the hotspots and cutting-edge research topics in the fields of computer vision and machine learning. It has broad application prospects in intelligent video surveillance, intelligent human-computer interaction, and content-based video analysis.

[0003] The main problem to be solved in video-based human action recognition is to process and analyze, by computer, the original image or image-sequence data collected by a sensor (camera), and to learn and understand the human actions and behaviors in it. Human action recognition mainly includes the following three steps: first, detect the appearance and motion information from the image frames and extract the underlying features...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/23, G06N3/045
Inventors: 程建, 高银星, 汪雯, 苏炎洲, 白海伟
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA