
Human body posture recognition method based on convolutional neural network

A convolutional-neural-network-based recognition method, applied in the field of intelligent monitoring with wearable devices. It addresses the problems of long computing time, heavy computational load, and low recognition accuracy, and achieves the effect of shortening network training time.

Active Publication Date: 2020-09-29
NANJING NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

[0006] Purpose of the invention: In response to the above problems, the purpose of the present invention is to provide a human body posture recognition method for wearable devices based on a convolutional neural network with smaller filters, so as to overcome the problems of heavy computational load, long computing time, and low recognition accuracy.




Embodiment Construction

[0032] The technical solutions and effects of the present invention will be described in detail below in conjunction with the drawings and specific embodiments.

[0033] The present invention provides a human body posture recognition method based on a convolutional neural network, which includes the following steps:

[0034] Step 1. Recruit volunteers and have them wear mobile sensors on different body parts (such as the wrist, chest, and legs) to record the three-axis acceleration data of their actions (such as standing, sitting, going up stairs, going down stairs, jumping, and walking), and attach the corresponding action category label to each of these action signals;
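Only as an illustration of how such labeled recordings might be organized (the patent does not prescribe a storage format; the column names, label set, and the write_session helper below are hypothetical), one recording session could be stored as rows of timestamped three-axis samples with their action label:

```python
# A minimal sketch of one possible recording layout; the column names,
# label set, and the write_session helper are hypothetical.
import csv

ACTION_LABELS = ["standing", "sitting", "upstairs", "downstairs", "jumping", "walking"]

def write_session(path, samples):
    """samples: iterable of (timestamp, acc_x, acc_y, acc_z, label) tuples."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "acc_x", "acc_y", "acc_z", "label"])
        for t, ax, ay, az, label in samples:
            if label not in ACTION_LABELS:
                raise ValueError(f"unknown action label: {label}")
            writer.writerow([t, ax, ay, az, label])

# Example: two samples from a wrist-worn sensor recorded while walking.
write_session("wrist_walking.csv",
              [(0.00, 0.12, -0.03, 9.78, "walking"),
               (0.02, 0.15, -0.01, 9.74, "walking")])
```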

[0035] Step 2. Traverse the collected three-axis acceleration data and remove the null values that the sensor failed to record correctly. The traversed data is then subjected to frequency down-sampling, normalized, and divided into a training set and a test set. The frequency down-samp...
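A minimal preprocessing sketch under the same assumptions (the recordings are loaded into a pandas DataFrame with columns acc_x, acc_y, acc_z, and label; the down-sampling factor and split ratio are illustrative, since the paragraph above is truncated):

```python
# Sketch of Step 2: drop incorrectly recorded samples, down-sample in
# frequency, normalize, and split into training and test sets.
# Column names and parameter values are assumptions, not from the patent.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

def preprocess(df, downsample_factor=2, test_size=0.2):
    # Remove null values the sensor failed to record correctly.
    df = df.dropna(subset=["acc_x", "acc_y", "acc_z"]).reset_index(drop=True)

    # Frequency down-sampling: keep every k-th sample.
    df = df.iloc[::downsample_factor].reset_index(drop=True)

    # Normalize each axis to zero mean and unit variance.
    axes = ["acc_x", "acc_y", "acc_z"]
    df[axes] = (df[axes] - df[axes].mean()) / df[axes].std()

    # Divide into training and test sets.
    X = df[axes].to_numpy(dtype=np.float32)
    y = df["label"].to_numpy()
    return train_test_split(X, y, test_size=test_size, stratify=y)

# X_train, X_test, y_train, y_test = preprocess(raw_df)
```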



Abstract

The invention discloses a human body posture recognition method based on a convolutional neural network. The method comprises the steps that original data of a mobile sensor are collected and labeled, data frequency down-sampling and normalization processing are conducted, a training set and a test set are divided, the convolutional neural network is trained, and the model is transplanted to an Android terminal for human body posture recognition. According to the method, a Split-Transform-Merge strategy is introduced into the implementation: a group of Lego convolution kernels with a smaller channel number is provided, the convolution kernels are stacked according to a random-mapping and circulant-matrix method to realize the convolution operation, and finally the generated Lego feature maps are vertically combined and sent through a full connection layer to a classifier for sensor data identification. The method has the advantages of high recognition speed, high recognition accuracy, small calculation amount, strong generalization capability and the like, and it also plays a very important role in smart home, health detection, motion tracking and the like.
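For orientation only, here is a rough PyTorch sketch of the Split-Transform-Merge idea described above: the input channels are split into groups, each group is transformed by a small bank of low-channel ("Lego") one-dimensional convolution kernels, and the resulting feature maps are merged and sent through a fully connected layer to the classifier. The patent's random-mapping and circulant-matrix kernel stacking is only approximated here by an ordinary grouped convolution; the class name and all parameter values are illustrative.

```python
# Rough sketch of a Split-Transform-Merge network for tri-axial sensor
# windows; it approximates, not reproduces, the patented Lego-kernel design.
import torch
import torch.nn as nn

class LegoStyleNet(nn.Module):
    def __init__(self, in_channels=3, groups=3, lego_channels=8,
                 window_len=128, num_classes=6):
        super().__init__()
        # Split + Transform: grouped 1-D convolution with a small channel count.
        self.lego_conv = nn.Conv1d(in_channels, groups * lego_channels,
                                   kernel_size=3, padding=1, groups=groups)
        self.bn = nn.BatchNorm1d(groups * lego_channels)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.MaxPool1d(2)
        # Merge: flatten the stacked feature maps and classify.
        self.fc = nn.Linear(groups * lego_channels * (window_len // 2), num_classes)

    def forward(self, x):            # x: (batch, 3 axes, window_len samples)
        x = self.relu(self.bn(self.lego_conv(x)))
        x = self.pool(x)
        x = torch.flatten(x, start_dim=1)
        return self.fc(x)

# Example: classify a batch of 16 windows of 128 tri-axial samples each.
logits = LegoStyleNet()(torch.randn(16, 3, 128))
```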

Description

Technical field

[0001] The invention belongs to the field of intelligent monitoring of wearable equipment, and in particular relates to a human body posture recognition method based on a convolutional neural network.

Background technique

[0002] In recent years, with the development of information technology and the popularization of intelligent technology, global technological change is advancing further, and technologies such as cloud computing, the Internet of Things, big data, and artificial intelligence are developing rapidly. Among them, human body posture recognition technology has begun to be widely used in computer-vision-related fields. Its application range is very wide: it can be used in fields such as human-computer interaction, film and television production, motion analysis, and games and entertainment. Human body posture recognition can be used to locate the motion trajectory of the human body's joint points and record their motion data, and re...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/045, G06F18/24, G06F18/214, Y02D10/00
Inventors: 张雷, 唐寅, 王嘉琦, 滕起
Owner: NANJING NORMAL UNIVERSITY