Action recognition method based on neural network and action recognition device based on neural network

A technology for neural-network-based action recognition, applied in the field of video action recognition. It addresses problems such as increased computing consumption, a reduced temporal receptive field of the neural network, and the difficulty of separating actions from the background, with the effects of improving recognition efficiency and reducing computation.

Active Publication Date: 2018-03-06
TSINGHUA UNIV


Problems solved by technology

When detecting people, these algorithms consider only the appearance and motion features within a single frame, which greatly reduces the temporal receptive field of the neural network and makes it difficult to separate small-amplitude actions from the background.
In addition, when judging the detection box of each detected person, ...
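To illustrate the temporal-receptive-field problem mentioned above, the following is a minimal sketch (not from the patent) of the standard receptive-field recursion: a per-frame 2D detector sees only 1 frame in time, while stacking temporal (3D) convolutions grows the receptive field with each layer. The function name and layer configurations are hypothetical.

```python
def temporal_receptive_field(kernel_sizes, strides):
    """Temporal receptive field of a stack of temporal convolutions.

    Standard recursion: rf grows by (k - 1) * jump per layer, where
    jump is the cumulative stride of all preceding layers.
    """
    rf, jump = 1, 1
    for k, s in zip(kernel_sizes, strides):
        rf += (k - 1) * jump
        jump *= s
    return rf

# A per-frame 2D detector has no temporal kernels: it sees 1 frame.
print(temporal_receptive_field([], []))            # 1
# Three stacked 3D convs with temporal kernel 3, stride 1: 7 frames.
print(temporal_receptive_field([3, 3, 3], [1, 1, 1]))  # 7
# Adding temporal stride 2 per layer widens it further: 15 frames.
print(temporal_receptive_field([3, 3, 3], [2, 2, 2]))  # 15
```

This is why a per-frame approach struggles with small-amplitude actions: one frame of context often cannot distinguish a subtle motion from a static background.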



Examples


Embodiment Construction

[0069] Various exemplary embodiments, features, and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. The same reference numbers in the figures indicate functionally identical or similar elements. While various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless specifically indicated.

[0070] The word "exemplary" is used herein exclusively to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.

[0071] In addition, in order to better illustrate the present disclosure, numerous specific details are set forth in the following detailed description. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, componen...



Abstract

The invention relates to a neural-network-based action recognition method and device. The method comprises the following steps: a video to be recognized is input into a trained first three-dimensional neural network model to obtain an action extraction result; an action instance detection result is determined from that action extraction result; the video is input into a trained second three-dimensional neural network model to obtain an action class discrimination result; and the action class of the video is determined from the action instance detection result together with the action class discrimination result. By combining the different recognition results of the two three-dimensional neural network models, recognition efficiency is improved and the computational burden on any single three-dimensional neural network model is reduced.
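The two-model pipeline in the abstract can be sketched as follows. This is a toy illustration, not the patent's implementation: the arrays stand in for the outputs of the two 3D networks (the first producing per-frame "actionness" scores, the second producing per-frame class scores), and the function names are hypothetical. The sketch shows how an instance detection result and a class discrimination result combine into final labels.

```python
import numpy as np

def detect_instances(actionness, thresh=0.5):
    """Group consecutive above-threshold frames into action instances
    (start, end) -- a stand-in for the action instance detection result."""
    instances, start = [], None
    for t, score in enumerate(actionness):
        if score >= thresh and start is None:
            start = t
        elif score < thresh and start is not None:
            instances.append((start, t - 1))
            start = None
    if start is not None:
        instances.append((start, len(actionness) - 1))
    return instances

def classify_instances(instances, class_scores):
    """Average the second model's per-frame class scores over each
    detected instance and take the argmax as its action class."""
    return [int(np.argmax(class_scores[s:e + 1].mean(axis=0)))
            for s, e in instances]

# Toy outputs standing in for the two 3D networks (7-frame video, 3 classes)
actionness = np.array([0.1, 0.9, 0.8, 0.2, 0.7, 0.9, 0.1])  # first model
class_scores = np.tile([0.2, 0.6, 0.2], (7, 1))             # second model

instances = detect_instances(actionness)               # [(1, 2), (4, 5)]
labels = classify_instances(instances, class_scores)   # [1, 1]
```

The division of labor is the point: the first model only localizes (cheap, class-agnostic), the second only discriminates classes, so neither network carries the full burden of joint localization and classification.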

Description

Technical Field

[0001] The present disclosure relates to the technical field of neural networks, and in particular to a neural-network-based action recognition method and device.

Background

[0002] Action localization generally falls into two types: localization in space only, and simultaneous localization in time and space. In a long video in which multiple performers act at the same time, different action instances affect and overlap one another. Because a neural network learns generalized representations of categories, traditional neural-network-based localization methods have difficulty distinguishing these overlapping actions.

[0003] Among traditional action localization methods, a typical two-dimensional-plus-time framework detects moving people in each frame and then links the detected people across frames to form an action instance. These algorithms can only consider the a...
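The traditional frame-linking baseline described in [0003] can be sketched as a greedy IoU-based linker. This is a minimal illustration of the prior-art technique, not the patent's method; the function names and the IoU threshold are assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    if inter == 0:
        return 0.0
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def link_detections(frames, iou_thresh=0.3):
    """Greedily link per-frame person boxes into action tubes:
    each tube extends to its best-overlapping unused box in the
    next frame, as in the traditional 2D-plus-time framework."""
    tubes = [[box] for box in frames[0]]
    for boxes in frames[1:]:
        used = set()
        for tube in tubes:
            best, best_iou = None, iou_thresh
            for i, box in enumerate(boxes):
                overlap = iou(tube[-1], box)
                if i not in used and overlap > best_iou:
                    best, best_iou = i, overlap
            if best is not None:
                tube.append(boxes[best])
                used.add(best)
    return tubes

# A person drifting slightly across two frames links into one tube.
tubes = link_detections([[(0, 0, 10, 10)], [(1, 1, 11, 11)]])
print(len(tubes), len(tubes[0]))  # 1 2
```

Because each linking decision looks at only two adjacent frames, the approach inherits exactly the single-frame limitation the disclosure criticizes: the temporal context per decision is minimal.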


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/20; G06V20/41; G06F18/214
Inventors: 季向阳, 吴嘉林, 杨武魁, 王谷
Owner TSINGHUA UNIV