
Human body posture recognition method based on time and channel double attention

A human body posture recognition technology, applied in character and pattern recognition, neural learning methods, instruments, etc. It addresses problems such as too few convolution layers, insufficiently rich feature-map information, and the inability to accurately locate the time at which a target action occurs, achieving the effect of reducing the complexity of manually labeling training data.

Pending Publication Date: 2020-10-30
NANJING NORMAL UNIVERSITY
Cites: 0 | Cited by: 5

AI Technical Summary

Problems solved by technology

Compared with traditional machine learning methods such as logistic regression, decision trees and Markov models, shallow deep learning methods significantly improve accuracy, but because they use only a few convolutional layers the resulting feature maps are not sufficiently rich.
At the same time, plain convolution alone cannot accurately locate the time at which a target action occurs, nor accurately identify the type of the target action, within a long data sequence.




Detailed Description of the Embodiments

[0039] The technical solution and effects of the present invention are described in detail below with reference to the accompanying drawings and specific embodiments.

[0040] The present invention proposes a human body posture recognition method based on time and channel double attention, comprising the following steps:

[0041] Step 1: Recruit volunteers and have them wear motion sensors on different body parts (such as the wrist, chest and legs) to record the three-axis acceleration data of their movements, and attach the corresponding action category label to each of these action signal segments;
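
As a rough illustration of Step 1, the sketch below shows one way the labeled three-axis recordings could be organized in memory; the sensor placements, action names and sampling rate are assumptions made for the example, not details fixed by the patent.

```python
import numpy as np

SENSOR_SITES = ["wrist", "chest", "leg"]               # assumed sensor placements
ACTIONS = {"walking": 0, "sitting": 1, "running": 2}   # assumed action category labels

def make_recording(n_samples: int, action: str) -> dict:
    """One recording: an (n_samples, 3) triaxial acceleration array per site plus a label."""
    return {
        "signals": {site: np.zeros((n_samples, 3), dtype=np.float32)
                    for site in SENSOR_SITES},
        "label": ACTIONS[action],
    }

# Example: a 10-second recording at an assumed 50 Hz sampling rate.
recording = make_recording(n_samples=10 * 50, action="walking")
```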

[0042] Step 2: Perform data cleaning and noise removal on the collected three-axis acceleration data, apply frequency resampling to the cleaned data, normalize it, and then divide the data into a training set and a test set. The frequency resampling and normalization are as follows: the data is subjected to time-series signal frequency downsampling ...
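
A minimal preprocessing sketch along the lines of Step 2, using a median filter for noise removal, polyphase resampling for frequency downsampling, and per-axis z-score normalization before a random train/test split; the filter choice, the sampling rates and the split ratio are assumptions rather than details given in the patent.

```python
import numpy as np
from scipy.signal import medfilt, resample_poly

def preprocess(x: np.ndarray, orig_hz: int = 100, target_hz: int = 25) -> np.ndarray:
    """Clean, downsample and normalize one (n_samples, 3) triaxial acceleration signal."""
    x = np.apply_along_axis(medfilt, 0, x, kernel_size=5)      # simple noise removal per axis
    x = resample_poly(x, up=target_hz, down=orig_hz, axis=0)   # frequency downsampling
    x = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)          # per-axis z-score normalization
    return x

def split(windows: np.ndarray, labels: np.ndarray, train_frac: float = 0.8):
    """Shuffle windowed samples and divide them into a training set and a test set."""
    idx = np.random.permutation(len(windows))
    cut = int(train_frac * len(windows))
    return (windows[idx[:cut]], labels[idx[:cut]],
            windows[idx[cut:]], labels[idx[cut:]])
```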



Abstract

The invention discloses a human body posture recognition method based on time and channel double attention. The method comprises the following steps: acquiring raw data of various human body actions using the built-in sensors of a mobile device, attaching an attribute label for each action, performing sliding-window and normalization processing, segmenting the data into a training sample set and a test sample set, establishing a deep convolutional neural network model based on time and channel double attention, and importing the training and test samples for training and tuning to obtain the recognition result for the human action. By superimposing channel attention and temporal attention, the type of the target action and the time at which it occurs can be accurately located after training on a large amount of coarse-grained training data, which greatly reduces the complexity of manually labeling training data; the method therefore plays an important role in sports, interactive games, medical care, universal monitoring systems and the like.
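
To make the "time and channel double attention" idea concrete, here is a minimal PyTorch-style sketch of a 1-D convolutional block that applies squeeze-and-excitation-style channel attention followed by temporal (time-step) attention over windowed sensor data; the layer sizes, kernel sizes and overall composition are assumptions for illustration, not the network defined in the patent.

```python
import torch
import torch.nn as nn

class DualAttentionBlock(nn.Module):
    """1-D conv block with channel attention followed by temporal attention (illustrative)."""
    def __init__(self, in_ch: int, out_ch: int, reduction: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=5, padding=2),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(),
        )
        # Channel attention: global pooling over time -> per-channel weights.
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(out_ch, out_ch // reduction, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(out_ch // reduction, out_ch, kernel_size=1),
            nn.Sigmoid(),
        )
        # Temporal attention: per-time-step weights from the feature map.
        self.temporal_att = nn.Sequential(
            nn.Conv1d(out_ch, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) windows of sensor data
        f = self.conv(x)
        f = f * self.channel_att(f)    # emphasize informative channels
        f = f * self.temporal_att(f)   # emphasize time steps where the action occurs
        return f

# Example: a batch of 8 windows, 3 acceleration axes, 128 time steps (assumed shape).
y = DualAttentionBlock(in_ch=3, out_ch=64)(torch.randn(8, 3, 128))
```

Stacking the two attentions in this order lets the channel weights decide which convolutional feature maps matter before the temporal weights localize when the action happens, which matches the localization goal described in the abstract.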

Description

Technical Field

[0001] The invention belongs to the field of intelligent monitoring with wearable devices, and in particular relates to a human body gesture recognition method based on time and channel dual attention.

Background Technique

[0002] In recent years, with the development of computer technology and the popularization of smart technology, we have entered a new round of global technological change, and technologies such as large-scale cloud computing, the Internet of Things, big data and artificial intelligence are developing rapidly. Among them, human gesture recognition is an important research direction in computer-vision-related fields. It has a wide range of applications and can be used in fields such as health monitoring, motion detection, human-computer interaction, film and television production, and game entertainment. People can use sensors worn on the human body to collect the motion trajectory data of human joint points to reali...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K9/00, G06K9/40, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/20, G06V10/30, G06N3/045, G06F18/24, G06F18/214
Inventors: 张雷, 高文彬, 刘悦
Owner: NANJING NORMAL UNIVERSITY