
Behavior recognition system and method based on multi-position sensor feature fusion

A feature-fusion and recognition system, applied to sensors, neural learning methods, and character and pattern recognition. It addresses problems such as ignoring spatial dependencies, failing to consider the spatial and temporal dependence of axial data, and the inability to effectively extract spatial features from multi-dimensional time-series data, achieving the effect of accurately identifying human behavior.

Pending Publication Date: 2022-08-09
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

A one-dimensional convolution kernel convolves along the time axis. For single-sensor, low-dimensional time-series data it can effectively mine temporal dependencies. However, for multi-sensor, multi-dimensional time-series data, a one-dimensional kernel ignores the spatial dependence between different sensor types and between different axial sensor channels, and therefore cannot effectively extract the spatial features of multi-dimensional time series.
A two-dimensional convolution kernel can address this, but existing approaches construct the two-dimensional spatio-temporal input by simply splicing the axial data, without considering the spatio-temporal dependence between axes.
Likewise, the two-level post-fusion strategy still simply feeds the sensor information into a convolutional neural network for feature extraction, and fails to design an effective method to further extract the spatial features of the different sensor signals.
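To make the distinction above concrete, here is a minimal NumPy sketch (not from the patent; the channel count, kernel values, and shapes are illustrative) showing that a one-dimensional kernel slides only along the time axis of a single channel, while a two-dimensional kernel also spans neighbouring sensor channels and can therefore mix cross-channel (spatial) structure:

```python
import numpy as np

# Hypothetical multi-sensor stream: 6 channels (two triaxial sensors), 8 time steps.
x = np.arange(48, dtype=float).reshape(6, 8)

# 1-D convolution: a length-3 kernel slides along the time axis of ONE channel,
# so cross-channel (spatial) structure is never mixed.
k1 = np.array([1.0, 0.0, -1.0])
out_1d = np.array([[x[c, t:t + 3] @ k1 for t in range(8 - 2)]
                   for c in range(6)])
print(out_1d.shape)  # (6, 6): per-channel temporal features only

# 2-D convolution: a 2x3 kernel spans two adjacent channels as well as time,
# so it can capture dependence BETWEEN sensor channels.
k2 = np.ones((2, 3))
out_2d = np.array([[np.sum(x[c:c + 2, t:t + 3] * k2) for t in range(8 - 2)]
                   for c in range(6 - 1)])
print(out_2d.shape)  # (5, 6): mixes neighbouring channels
```

Note that the 2-D kernel only mixes channels that happen to be adjacent in the spliced layout, which is exactly the limitation the patent attributes to naive splicing.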

Method used




Embodiment Construction

[0057] When processing multi-sensor, multi-dimensional data, a two-dimensional convolution kernel jointly considers the temporal dependence of the data and the spatial dependence between different sensor types. However, most researchers simply list and splice the channels of multiple sensors, and the spatial features extracted this way are limited. Theoretically, the different axial channels of the same sensor are mutually independent and only weakly correlated, and there is no inherent correlation between different axial channels of different sensors; simple splicing can therefore only mine these weak dependencies, so the spatial features it extracts are limited. From a data perspective, we select five triaxial sensors from the public SKODA dataset and compute the correlation between the channels of each axis, as shown in Figure 4 to Figure 6; the weak c...
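The correlation analysis described above can be sketched as follows. Since the SKODA recordings are not reproduced here, the snippet uses synthetic stand-in signals in which the x axes of two sensors share the same underlying motion (an assumption for illustration only); the pattern it demonstrates, strong same-axis correlation versus near-zero cross-axis correlation, is the one the embodiment relies on:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 500)

# Synthetic stand-in for two triaxial sensors on the same limb: the x axes of
# both sensors follow the same motion (plus noise), while y/z are independent.
sensor_a = np.stack([np.sin(t), rng.normal(size=500), rng.normal(size=500)])
sensor_b = np.stack([np.sin(t) + 0.1 * rng.normal(size=500),
                     rng.normal(size=500), rng.normal(size=500)])

channels = np.vstack([sensor_a, sensor_b])  # rows: a_x, a_y, a_z, b_x, b_y, b_z
corr = np.corrcoef(channels)                # 6x6 Pearson correlation matrix

same_axis = corr[0, 3]    # a_x vs b_x: strong correlation expected
cross_axis = corr[0, 4]   # a_x vs b_y: near-zero correlation expected
print(f"same-axis |r| = {abs(same_axis):.2f}, cross-axis |r| = {abs(cross_axis):.2f}")
```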



Abstract

The invention belongs to the technical field of behavior recognition and provides a behavior recognition system and method based on multi-position sensor feature fusion. The system comprises a data module and a model module: the data module stores and preprocesses the collected raw data to obtain action pictures for subsequent model training, and the model module comprises a feature-extraction sub-module and a classification sub-module that produce the final prediction result. Rather than modelling the spatial dependence among all channel pairs, the method selects the more strongly correlated spatial dependence among the same-axis data of different sensors: the data are extracted along the three axial directions (x, y, z) and fused into three groups of action pictures, and feature fusion is performed through a two-stage post-fusion model, improving the accuracy of data fusion. The temporal and spatial features of the multi-dimensional time-series data are thereby effectively extracted, achieving the aim of accurately identifying human behavior.
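A minimal sketch of the axis-wise grouping described in the abstract, assuming S triaxial sensors sampled over T time steps (the names, shapes, and random data are illustrative, not the patent's actual preprocessing):

```python
import numpy as np

# Hypothetical window of raw data: S = 5 triaxial sensors, T = 64 time steps.
S, T = 5, 64
data = np.random.rand(S, 3, T)  # middle axis ordered (x, y, z) per sensor

# Regroup by axis: each "action picture" stacks the SAME axis from all
# sensors, giving three S x T images instead of one 3S x T splice, so the
# strongly correlated same-axis channels become spatial neighbours.
pictures = [data[:, axis, :] for axis in range(3)]  # x-, y-, z-picture
assert all(p.shape == (S, T) for p in pictures)
```

A two-stage post-fusion model would then feed each picture into its own CNN branch and fuse the extracted features before classification.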

Description

technical field [0001] The invention relates to the technical field of behavior recognition, in particular to a behavior recognition system and method based on multi-position sensor feature fusion. Background technique [0002] Human behavior is an indispensable part of daily life, and each behavior has its own special meaning and function. Accurately recognizing human behavior is a basic function of many health-monitoring, smart-factory, and sports scenarios, and is of great significance for advancing the field of artificial intelligence. At present, human behavior recognition is divided by data type into recognition based on video time-series data and recognition based on sensor time-series data. Compared with video data, wearable-sensor data is not limited by the scene and is fast to store and transmit, making real-time recognition easier to realize. For example, the elderly can wear sma...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06K 9/62; G06N 3/04; G06N 3/08; G06V 10/764; G06V 10/80; G06V 10/82; G06V 40/20; G01C 21/16; A61B 5/11
CPC: G06N 3/084; G06V 10/806; G06V 10/764; G06V 10/82; G06V 40/20; G01C 21/16; A61B 5/1117; G06N 3/045; G06F 18/2415; G06F 18/253
Inventor: 邓诗卓, 郭珠宝, 林博谦, 陈东岳, 贾同
Owner NORTHEASTERN UNIV