
Method for recognizing actions on basis of deep feature extraction asynchronous fusion networks

A feature-extraction and fusion-network technology, applied to neural learning methods, character and pattern recognition, and biological neural network models, that addresses problems such as the limited ability of existing techniques to distinguish ambiguous action classes.

Inactive Publication Date: 2018-07-13
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

However, due to the large variations across video scenes and the interference of noisy content unrelated to a video's subject, automatic feature learning with convolutional networks has made relatively little progress in action recognition. Moreover, most prior work focuses either on learning features that directly describe the behavior of action classes, or on introducing more information streams and strengthening the correlations between streams. Existing techniques are therefore limited in their ability to resolve ambiguity between action classes.




Embodiment Construction

[0032] It should be noted that, provided there is no conflict, the embodiments of the present application and the features within those embodiments may be combined with one another. The present invention is described in further detail below in conjunction with the drawings and specific embodiments.

[0033] Figure 1 is a system framework diagram of the action recognition method based on a deep feature extraction asynchronous fusion network of the present invention. The system mainly comprises a coarse-grained-to-fine-grained network, an asynchronous fusion network, and the overall deep feature extraction asynchronous fusion network. First, each spatial frame of the input video's appearance stream and each short-term optical-flow stack of its motion stream are fed into a coarse-grained-to-fine-grained network, which integrates deep features at multiple action-class granularities to create a more accurate feature representation. The extracted features are then input into the asynchronous fusion network, which integrates infor...
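As a concrete illustration of this pipeline, the following is a minimal PyTorch sketch of the two-stream, coarse-grained-to-fine-grained feature extraction stage. The layer sizes, the number of granularity levels, and the concatenation-based integration are illustrative assumptions, not the patent's actual architecture.

```python
import torch
import torch.nn as nn

class CoarseToFineNet(nn.Module):
    """Extracts deep features at several action-class granularities from one
    input (an RGB frame or a short-term optical-flow stack) and integrates
    them into a single representation by concatenation."""
    def __init__(self, in_channels, feat_dim=128, num_granularities=3):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One projection head per granularity level (coarse -> fine).
        self.heads = nn.ModuleList(
            [nn.Linear(64, feat_dim) for _ in range(num_granularities)]
        )

    def forward(self, x):
        h = self.backbone(x)
        # Integrate the multi-granularity deep features.
        return torch.cat([head(h) for head in self.heads], dim=-1)

# Appearance stream: 3-channel RGB frames. Motion stream: a short-term
# stack of, e.g., 5 optical-flow fields with 2 channels each.
appearance_net = CoarseToFineNet(in_channels=3)
motion_net = CoarseToFineNet(in_channels=10)

frame = torch.randn(1, 3, 112, 112)        # one spatial frame
flow_stack = torch.randn(1, 10, 112, 112)  # one optical-flow stack
print(appearance_net(frame).shape)   # torch.Size([1, 384])
print(motion_net(flow_stack).shape)  # torch.Size([1, 384])
```

Each stream thus yields one feature vector per time step; these per-step features are what the asynchronous fusion network consumes.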



Abstract

The invention provides a method for recognizing actions on the basis of deep feature extraction asynchronous fusion networks. The method's main components are a coarse-grained-to-fine-grained network, an asynchronous fusion network, and the overall deep feature extraction asynchronous fusion network. The method includes: inputting each spatial frame of the input video's appearance stream and each short-term optical-flow stack of its motion stream into the coarse-grained-to-fine-grained network; integrating deep features at a plurality of action-class granularities; creating an accurate feature representation; inputting the extracted features into the asynchronous fusion network, which integrates information-stream features from different time points; acquiring the prediction results for each action class; combining the different action predictions with one another through the deep feature extraction asynchronous fusion network; and determining the ultimate action class label of the input video. The method has the advantages that deep features can be extracted at multiple action-class granularities and integrated to obtain an accurate action representation, that complementary information in multiple information streams can be effectively exploited by means of asynchronous fusion, and that action recognition accuracy can be improved.
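To make the fusion and label-decision steps of the abstract concrete, below is a minimal sketch of pairing appearance features with motion features taken at shifted time points ("asynchronous" fusion) and combining the resulting per-pair predictions into one video-level label. The time offsets, feature dimensions, and single linear classifier are assumptions for illustration; the patent's asynchronous fusion network learns this combination rather than averaging fixed pairs.

```python
import torch
import torch.nn as nn

T, feat_dim, num_classes = 8, 384, 10
# Hypothetical per-time-step features from the two information streams.
app_feats = torch.randn(T, feat_dim)  # appearance features, one per frame
mot_feats = torch.randn(T, feat_dim)  # motion features, one per flow stack

classifier = nn.Linear(2 * feat_dim, num_classes)

scores = []
for t in range(T):
    for offset in (-1, 0, 1):               # asynchronous time shifts
        s = min(max(t + offset, 0), T - 1)  # clamp to the valid range
        pair = torch.cat([app_feats[t], mot_feats[s]])
        scores.append(classifier(pair).softmax(-1))

# Combine the per-pair action predictions into the final class label.
video_scores = torch.stack(scores).mean(0)
print("predicted action class:", video_scores.argmax().item())
```

Pairing each appearance feature with motion features from neighboring time points is what lets cross-stream complementary information be exploited even when the two streams are not temporally aligned.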

Description

Technical field
[0001] The invention relates to the field of computer vision analysis, and in particular to an action recognition method based on a deep feature extraction asynchronous fusion network.
Background technique
[0002] Action recognition aims to identify the action class labels of input videos. Owing to its importance in many applications, action recognition has attracted the attention of many researchers and has become a popular direction in computer vision analysis. Action recognition technology can meet the needs of intelligent video surveillance, content-based video analysis, and other tasks for automatic, intelligent analysis, and can promote social development and progress. It can be applied to intelligent monitoring to improve monitoring quality and save substantial human resources; it can also be used in smart homes to monitor human body movements in real time, predict dangerous movements, and avoid accidental inj...


Application Information

IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/049, G06N3/08, G06V40/20, G06N3/045
Inventor: 夏春秋 (Xia Chunqiu)
Owner: SHENZHEN WEITESHI TECH