Human activity gesture recognition method based on multi-level end-to-end neural network

A neural network and human activity recognition technology, applied in the medical field. It addresses the problems that neural networks require large amounts of training data, that human activities with similar characteristics are difficult to distinguish, and that recognition accuracy is limited, with the effects of reducing computational complexity and power consumption and improving recognition accuracy.

Active Publication Date: 2020-11-27
周军

AI Technical Summary

Problems solved by technology

[0005] First, a large number of complex features usually must be extracted, which introduces substantial power consumption.

[0006] Second, when new behaviors appear, finding suitable new features may take considerable effort.

[0007] Third, because human knowledge and experience are limited, the recognition accuracy is often limited. In addition, some behaviors have similar characteristics under existing schemes, making the corresponding human activities difficult to distinguish. Moreover, because a neural network is involved, the computational complexity and power consumption are relatively high, and the neural network also requires a large amount of training data.
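The complexity and power concerns above motivate staging the computation: a cheap first level handles easy cases and a heavier second level runs only when needed. A minimal Python sketch of such a cascade, assuming hypothetical function names and a simple variance-of-acceleration gate (this illustrates the general multi-level idea, not the patent's disclosed design):

```python
import numpy as np

# Hypothetical two-level cascade: level 1 separates static from dynamic
# postures using a cheap statistic; the heavier level-2 classifier runs
# only on dynamic windows, which lowers average compute and power.

def level1_static(window, thresh=0.5):
    # standard deviation of the acceleration magnitude as a cheap
    # activity proxy (first 3 channels assumed to be the accelerometer)
    mag = np.linalg.norm(window[:, :3], axis=1)
    return mag.std() < thresh

def level2_classify(window):
    # placeholder for the heavier second-level neural network
    return "walking"

def classify(window):
    if level1_static(window):
        return "static"          # standing / sitting lumped together here
    return level2_classify(window)

# a perfectly still window: constant gravity on the z axis
still = np.tile([0.0, 0.0, 9.8, 0.0, 0.0, 0.0], (128, 1))
print(classify(still))   # static
```

The gate threshold and channel layout are illustrative; in practice they would be tuned on labeled data.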

Method used



Examples


Embodiment

[0045] As shown in Figure 1 and Figure 2, this embodiment provides a human activity gesture recognition method based on a multi-level end-to-end neural network. Several motion sensors arranged on the surface of the human body collect raw data and labeled data of human activity gestures. The motion sensors include, but are not limited to, accelerometers, gyroscopes, and magnetometers. It should be noted that ordinals such as "first" and "second" in this embodiment are used only to distinguish similar components or terms. The types of human activity gestures in this embodiment include, but are not limited to, going upstairs, going downstairs, walking, jogging, standing, and sitting. The specific steps are as follows:

[0046] Step 1, multi-level end-to-end neural network training: use the motion sensors to collect labeled data of human activity gestures, cut the labeled data with a sliding window, and obtain the first labeled ...
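The sliding-window cutting described in this step can be sketched as follows. The function name, window length, stride, and majority-vote labeling are illustrative assumptions, since the excerpt does not give the patent's exact parameters:

```python
import numpy as np

def sliding_windows(data, labels, window_len, stride):
    """Cut a (T, C) sensor stream into fixed-length labeled windows.

    data:    (T, C) array: T samples, C channels (e.g. 3-axis accel + 3-axis gyro)
    labels:  (T,) per-sample activity labels
    Returns windows of shape (N, window_len, C) and one label per window,
    chosen by majority vote over the per-sample labels (an assumption).
    """
    windows, win_labels = [], []
    for start in range(0, len(data) - window_len + 1, stride):
        windows.append(data[start:start + window_len])
        lab = labels[start:start + window_len]
        # majority vote: the most frequent per-sample label in the window
        vals, counts = np.unique(lab, return_counts=True)
        win_labels.append(vals[np.argmax(counts)])
    return np.stack(windows), np.array(win_labels)

# toy example: a 6-channel stream of 500 samples, two activities back to back
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 6))
y = np.repeat([0, 1], 250)
w, wl = sliding_windows(x, y, window_len=128, stride=64)
print(w.shape)   # (6, 128, 6)
```

A stride smaller than the window length (here 64 vs. 128) gives overlapping windows, a common choice in sensor-based activity recognition.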



Abstract

The invention discloses a human activity gesture recognition method based on a multi-level end-to-end neural network, comprising the following steps: collecting labeled data of human activity gestures and cutting the labeled data with a sliding window to obtain several equidistant first labeled data windows; processing the first labeled data windows with a gait-based data augmentation algorithm to obtain several second labeled data windows; training the multi-level end-to-end neural network with the first and second labeled data windows; collecting raw data of any human activity gesture and cutting the raw data with a sliding window to obtain several consecutive motion data windows to be recognized; and importing the motion data windows to be recognized, in sequence, into the trained multi-level end-to-end neural network to identify the type of human activity gesture. The invention has the advantages of high recognition accuracy, low computational complexity, and low power consumption, and has broad market prospects in fields such as medical technology and behavior supervision.
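The abstract names a gait-based data augmentation algorithm but this excerpt does not disclose its details. As a hedged stand-in, a generic time-series augmentation (per-window amplitude scaling plus jitter, common in sensor-based activity recognition) might look like this; the function name and parameter ranges are assumptions:

```python
import numpy as np

def augment_windows(windows, rng, scale_range=(0.9, 1.1), jitter_std=0.02):
    """Generic time-series augmentation for (N, T, C) sensor windows.

    Each window is scaled by a random per-window amplitude factor and
    perturbed with small Gaussian jitter; labels are unchanged. This is
    a common stand-in, not the patent's gait-based algorithm.
    """
    n, _, _ = windows.shape
    scales = rng.uniform(*scale_range, size=(n, 1, 1))   # one factor per window
    noise = rng.normal(0.0, jitter_std, size=windows.shape)
    return windows * scales + noise

rng = np.random.default_rng(0)
w = np.ones((4, 128, 6))          # four toy windows of all-ones
aug = augment_windows(w, rng)
print(aug.shape)   # (4, 128, 6)
```

The augmented windows keep the labels of the originals, which matches the abstract's use of second labeled data windows alongside the first.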

Description

Technical Field

[0001] The invention relates to the field of medical technology, and in particular to a human activity posture recognition method based on a multi-level end-to-end neural network.

Background

[0002] Human behavior recognition has long been a popular research field. Its purpose is to analyze and identify human action types and behavior patterns from a series of observations and to describe them in natural language. With breakthroughs in machine learning algorithms, the accuracy of human behavior recognition has steadily improved, bringing it into many aspects of daily life. Human behavior recognition technology has broad application prospects and considerable economic value; its main application fields include video surveillance, medical diagnosis and monitoring, intelligent human-computer interaction, and virtual reality. In the field of video surveillance, traditional video surveillance mainl...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/25, G06V40/20, G06N3/045, G06F18/24, G06F18/214
Inventors: 周军, 黄家辉
Owner: 周军