
Behavior identification method based on space-time context association

A spatio-temporal context technology and recognition method, applied in the fields of deep learning and pattern classification and recognition. It can solve problems such as the unsatisfactory long-term correlation feature extraction of existing methods, models too bloated for wearable devices with limited computing power and battery capacity, and damage to the forward-backward dependency characteristics of time-series data, achieving the effects of increased recognition accuracy, reduced loss, and increased speed.

Pending Publication Date: 2020-06-05
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

With respect to the spatio-temporal characteristics of behavior, current deep-learning-based methods have certain defects. RNN-based methods can learn long-term feature dependencies but cannot describe local features well, while CNN-based methods perform poorly at extracting long-term correlation features.
The hybrid CNN+RNN approach can further improve the learning of spatio-temporal features, but extracting spatial features from the data with a CNN first and then modeling the temporal relations damages, to a certain extent, the latent context-dependency characteristics of the time-series data.
In addition, such models are very bloated, which makes them difficult to deploy on wearable devices with limited computing power and battery capacity.

Method used



Embodiment Construction

[0027] In order to describe in detail the technical content, structural features, objectives, and effects of the present invention, it is described below in conjunction with embodiments and the accompanying drawings.

[0028] The present invention proposes a behavior recognition method based on a deep network model with spatio-temporal context features, which achieves good results in human behavior recognition. The overall algorithm is shown schematically in Figure 1 and includes the following training phases and steps:

[0029] The training steps include:

[0030] Step A1: Import the sensor-perception data X of user behavior into the deep network model and perform a convolution mapping operation to obtain convolution-mapped data;
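The excerpt does not specify the convolution mapping of step A1 any further. As a minimal sketch only (the filter shapes, the ReLU activation, and the `conv_map` helper below are illustrative assumptions, not taken from the patent), a 1-D convolution of multi-channel sensor data along the time axis could look like:

```python
import numpy as np

def conv_map(X, W, b):
    """Illustrative 1-D 'valid' convolution along the time axis.

    X: sensor-perception data, shape (T, C) -- T time steps, C sensor channels
    W: filters, shape (K, C, F)             -- kernel width K, F output features
    b: bias, shape (F,)
    Returns convolution-mapped data of shape (T - K + 1, F).
    """
    T, C = X.shape
    K, _, F = W.shape
    out = np.empty((T - K + 1, F))
    for t in range(T - K + 1):
        window = X[t:t + K]  # (K, C) slice of the time series
        # Contract the (time, channel) axes of the window against each filter.
        out[t] = np.tensordot(window, W, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)  # assumed ReLU non-linearity

# Toy example: 8 time steps, 3 sensor channels, 4 filters of width 3
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))          # stand-in for sensor data X
W = rng.standard_normal((3, 3, 4)) * 0.1
b = np.zeros(4)
print(conv_map(X, W, b).shape)           # (6, 4)
```

A 'valid' convolution shortens the time axis by K - 1 steps; a real model would typically stack several such layers (or use padding) before the recurrent stage.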

[0031] In the above technical solution, user behaviors are mainly divided into periodic behaviors and sporadic behaviors;

[0032] Periodic behaviors: such as walking, running, cycling, etc.;

[0033] Sporadic behaviors: such as drinking tea, taking a cup, cooking, etc. …


Abstract

The invention belongs to the technical field of recognition and provides a behavior recognition method based on a deep network model with spatio-temporal context association. It aims to solve the problem that the spatial-feature learning range of a CNN model is limited by the size of its receptive field, to reduce the model's loss of behavior-feature representation, and to improve the accuracy of behavior recognition. In the main scheme, user behavior data is imported for a convolution mapping operation; a behavior spatio-temporal feature map TSF is then obtained using a grid LSTM neural network. The TSF is imported into an attention gate module to learn different temporal feature weights, yielding a behavior feature map, which is input into a softmax classifier to compute the probability distribution D of the behavior category. A cross-entropy loss is computed between D and the training-set behavior labels Y to obtain a loss Loss0, and an l2 loss term is introduced to form the final total loss function L. According to the total loss function L, the values of the model parameters are updated by back-propagation to obtain the deep network model M.
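The tail of the pipeline described in the abstract (attention-weighted temporal pooling, the softmax distribution D, the cross-entropy term Loss0 against labels Y, and a total loss L that adds an l2 term) can be sketched numerically. This is an illustration only: the shapes, the weighting coefficient `lam`, and the `total_loss` helper are assumptions, not the patented implementation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def total_loss(TSF, attn_scores, W_cls, Y, params, lam=1e-4):
    """Sketch of the loss construction described in the abstract.

    TSF:         (T, F) behavior spatio-temporal feature map
    attn_scores: (T,)   raw attention-gate scores per time step
    W_cls:       (F, n_classes) classifier weights
    Y:           (n_classes,)   one-hot training label
    Returns (L, D): total loss L = Loss0 + lam * l2, class distribution D.
    """
    a = softmax(attn_scores)        # temporal feature weights (sum to 1)
    feat = a @ TSF                  # attention-pooled behavior feature
    D = softmax(feat @ W_cls)       # probability distribution over classes
    loss0 = -np.sum(Y * np.log(D + 1e-12))    # cross-entropy Loss0
    l2 = sum(np.sum(p ** 2) for p in params)  # l2 regularization term
    return loss0 + lam * l2, D

# Toy example: 5 time steps, 6 features, 3 behavior classes
rng = np.random.default_rng(1)
TSF = rng.standard_normal((5, 6))
scores = rng.standard_normal(5)
W_cls = rng.standard_normal((6, 3)) * 0.1
Y = np.array([0.0, 1.0, 0.0])
L, D = total_loss(TSF, scores, W_cls, Y, params=[W_cls])
```

In the patent's scheme, L would then drive the back-propagation updates of the model parameters; gradients are omitted here for brevity.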

Description

technical field

[0001] The invention belongs to the technical field of deep learning and pattern classification and recognition, and in particular relates to a behavior recognition method based on a deep network model with spatio-temporal context association.

Background technique

[0002] Deep learning methods bring strong feature-extraction capabilities to behavior recognition (behaviors are mainly divided into periodic behaviors, such as walking, running, and cycling, and sporadic behaviors, such as drinking tea, taking a cup, and cooking), reducing the difficulty of hand-crafted feature modeling. Behavioral data from multiple sensors is not only time-sequential but also exhibits certain local spatial characteristics between sensors. With respect to these spatio-temporal characteristics, current deep-learning-based methods have certain defects. For example, RNN-based methods can learn long-term feature dependencies but cannot describe local features well. However, CNN…

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/049; G06N3/084; G06V40/20; G06N3/045; G06F18/2415
Inventors: 席瑞, 范淑焕, 侯孟书, 宋元凤
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA