
Human body posture recognition method based on convolutional neural network of covariance matrix transformation

A technology combining a convolutional neural network with covariance matrix transformation, applied in the field of human posture recognition. It addresses the problems of degraded recognition accuracy and long running time, and achieves the effects of widening the scope of application and reducing the loss of recognition accuracy.

Pending Publication Date: 2021-11-12
NANJING NORMAL UNIVERSITY


Problems solved by technology

[0004] Purpose of the invention: The purpose of the present invention is to provide a human body posture recognition method using a convolutional neural network based on covariance matrix transformation, in response to the data-correlation problems that arise, and to solve, to a certain extent, the problems of degraded recognition accuracy and long running time caused by that correlation.




Embodiment Construction

[0040] The data collection process and network construction of the present invention are described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0041] As shown in the flow chart, the present invention proposes a human body posture recognition method using a convolutional neural network based on covariance matrix transformation, comprising the following steps:

[0042] Step 1: All subjects carry the same model of smartphone (an iPhone 11), fixed above the left and right wrists with a mobile phone strap. The subjects then perform daily actions (walking, running, going up and down stairs, jumping, etc.), while the accelerometer and gyroscope in the smartphone record the three-axis sensor data during the movement; 300 movements are collected in advance for each set of data.
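As an illustration of how such a recording could be prepared for the network, the following is a minimal sketch (not taken from the patent text) of the per-channel normalization and sliding-window segmentation that the abstract refers to. The window length of 128 samples, the stride of 64, and the majority-vote labelling are assumptions made here for the example.

    import numpy as np

    def sliding_windows(acc, gyro, labels, win_len=128, stride=64):
        """Segment synchronized accelerometer and gyroscope streams into windows.

        acc, gyro : arrays of shape (T, 3) with the three-axis samples
        labels    : array of shape (T,) with a per-sample activity label
        Returns windows of shape (N, win_len, 6) and one label per window
        (majority vote over the window).
        """
        data = np.concatenate([acc, gyro], axis=1)           # (T, 6)
        # per-channel normalization of the raw signals (zero mean, unit variance)
        data = (data - data.mean(axis=0)) / (data.std(axis=0) + 1e-8)

        X, y = [], []
        for start in range(0, data.shape[0] - win_len + 1, stride):
            seg = data[start:start + win_len]                # (win_len, 6)
            lab = labels[start:start + win_len]
            X.append(seg)
            y.append(np.bincount(lab).argmax())              # majority label
        return np.stack(X), np.array(y)

    # usage with stand-in data for one recording session
    T = 3000
    acc = np.random.randn(T, 3)
    gyro = np.random.randn(T, 3)
    labels = np.random.randint(0, 5, size=T)                 # e.g. walk / run / stairs up / stairs down / jump
    X, y = sliding_windows(acc, gyro, labels)
    print(X.shape, y.shape)                                   # (N, 128, 6) (N,)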

[0043] The data collection scheme is shown in the attached figure: the same subject wears a smartphone on both the left...


Abstract

The invention discloses a human body posture recognition method based on a convolutional neural network with covariance matrix transformation, and relates to the application field of artificial intelligence. The method comprises the following steps: using a smartphone to collect sensor time series related to human body posture; normalizing the raw data, segmenting the data set with operations such as a sliding window, and fixing an action attribute label for each segment; establishing a convolutional neural network that uses covariance matrix transformation, eliminating data correlation through the inverse transformation of a covariance matrix and adding a covariance operation before each convolution layer to avoid data correlation; training the network parameters with a training data set and testing on a test data set to verify the network accuracy; and finally solidifying the network model into a pb file and migrating it to the Android client through Android Studio, so that the mobile device can identify the current state of the user with its own accelerometer and gyroscope.
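The abstract does not spell out the exact form of the covariance matrix transformation, so the following is only a plausible, illustrative sketch: each segmented window is decorrelated with the inverse square root of its own channel covariance matrix (a whitening transform), which is one common way to eliminate data correlation via the inverse of a covariance matrix. The function name whiten_window and the epsilon regularizer are assumptions of this example, not the patented method itself.

    import numpy as np

    def whiten_window(window, eps=1e-6):
        """Decorrelate the channels of one sensor window with its covariance matrix.

        window : array of shape (win_len, channels), e.g. (128, 6)
        Returns a window of the same shape whose channels have approximately
        identity covariance: x_w = (x - mean) @ C^{-1/2}.
        """
        centered = window - window.mean(axis=0, keepdims=True)
        cov = np.cov(centered, rowvar=False)                  # (channels, channels)
        # inverse square root of the covariance matrix via eigendecomposition
        eigval, eigvec = np.linalg.eigh(cov)
        inv_sqrt = eigvec @ np.diag(1.0 / np.sqrt(eigval + eps)) @ eigvec.T
        return centered @ inv_sqrt

    # usage: whiten every window produced by the segmentation step
    X = np.random.randn(10, 128, 6)                           # stand-in for the segmented data
    X_white = np.stack([whiten_window(w) for w in X])
    print(np.round(np.cov(X_white[0], rowvar=False), 2))      # close to the identity matrix

In the pipeline described by the abstract, an operation of this kind would be applied to the input (and, per the abstract, a covariance operation before each convolution layer) before the windows are fed to the convolutional network.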

Description

Technical field

[0001] The invention belongs to the field of posture sensor data recognition, and in particular relates to a human body posture recognition method for wearable devices using a convolutional neural network based on covariance matrix transformation.

Background technique

[0002] With the advancement of science and technology and the development of society, artificial intelligence has reached an unprecedented level and has been integrated into all aspects of life. For smart devices, better detection of human activity and state is a necessary research direction: while making people's lives more convenient, it can also improve their daily behavior. More and more sensors are integrated into wearable devices and applied to various fields such as physical therapy, sports recognition, and social interaction. Taking the time series data obtained by these sensors as the research object, reasonable predictions can be made with different models. Get information about the ...


Application Information

IPC (IPC8): G06K9/00; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventor: 权威铭端越峰张雷
Owner: NANJING NORMAL UNIVERSITY