
Human body action recognition method and system

A technology for human action recognition based on a human coordinate system, applied in the field of human action recognition methods and systems. It addresses problems such as the inability of existing methods to meet fall-prevention needs, and achieves the effect of improved accuracy.

Pending Publication Date: 2022-06-03
SHANGHAI UNIV OF SPORT


Problems solved by technology

Traditional human motion recognition methods can no longer meet the needs of the fields of kinematic analysis, medical diagnosis, personnel care, and virtual reality for human motion state recognition, targeted exercise feedback, and humanized exercise and fall prevention.


Image

  • Human body action recognition method and system

Examples


Embodiment 1

[0072] As shown in figure 1, the human body motion recognition method provided by this embodiment converts the human body motion recognition result into feature information, where the feature information is 3D key point information. The method includes the following steps:

[0073] S1: Monitor the three-axis motion velocity, three-axis motion angular velocity and three-axis acceleration of human joints in the human coordinate system in real time, and construct the state variable prediction matrix calculation model X_{k+1|k} for the next moment;

[0074] S2: Construct the state vector error prediction value calculation model P_{k+1|k} for the next moment;

[0075] S3: Monitor the three-dimensional motion coordinate values of human joints in the human coordinate system in real time, and construct a fusion threshold calculation model;

[0076] S4: According to the next-moment state variable prediction matrix X_{k+1|k} constructed in step S1 and the state vector error predictio...
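Steps S1 and S2 describe the prediction step of a Kalman-style filter. The sketch below is illustrative only, not the patent's implementation: the transition matrix A, the process-noise matrix Q, and the 9-dimensional per-joint state (three-axis velocity, angular velocity and acceleration) are all assumptions.

```python
import numpy as np

def predict_state(X_k, P_k, A, Q):
    """One prediction step: from the state at time k, form the next-moment
    state prediction X_{k+1|k} (step S1) and the state-vector error
    prediction P_{k+1|k} (step S2)."""
    X_pred = A @ X_k                 # state variable prediction X_{k+1|k}
    P_pred = A @ P_k @ A.T + Q       # error covariance prediction P_{k+1|k}
    return X_pred, P_pred

# Assumed 9-dimensional joint state: 3-axis velocity, angular velocity, acceleration
dim = 9
A = np.eye(dim)                      # assumed (identity) transition model
Q = 0.01 * np.eye(dim)               # assumed process noise
X_k = np.zeros((dim, 1))             # current state estimate
P_k = np.eye(dim)                    # current error covariance
X_pred, P_pred = predict_state(X_k, P_k, A, Q)
```

With an identity transition model the prediction leaves the state unchanged and simply inflates the error covariance by Q, which is the expected behavior between corrections.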

Embodiment 2

[0079] As shown in figure 1, the human body motion recognition method provided by this embodiment converts the human body motion recognition result into feature information, where the feature information is 3D key point information. The method includes the following steps:

[0080] S1: Monitor the three-axis motion velocity, three-axis motion angular velocity and three-axis acceleration of human joints in the human coordinate system in real time, and construct the state variable prediction matrix calculation model X_{k+1|k} for the next moment;

[0081] S2: Construct the state vector error prediction value calculation model P_{k+1|k} for the next moment;

[0082] S3: Monitor the three-dimensional motion coordinate values of human joints in the human coordinate system in real time, and construct a fusion threshold calculation model;

[0083] S4: According to the next-moment state variable prediction matrix X_{k+1|k} constructed in step S1 and the state vector error predictio...

Embodiment 3

[0115] On the basis of Embodiment 2, as a preferred embodiment of the present invention, the fusion threshold condition in step S4 is 0.01 < D(X_i, X_j) < 0.3; if D(X_i, X_j) ≥ 0.3, then repeat steps S1-S4 to ensure smooth fusion of the human body motion parameters at each moment, so that the finally output human body motion state parameter results at the i-th moment and at the j-th moment are fused smoothly and without interruption, forming accurate human motion state data from the i-th time to the j-th time;

[0116] The fusion threshold D(X_i, X_j) is calculated as follows:

[0117]

[0118] where τ_q is the weight coefficient of the q-th joint across all state variable matrices from the i-th state variable matrix X_i to the j-th state variable matrix X_j, |X_i| is the rank of the state variable matrix X_i at the i-th time, and |X_j| is the rank of the state variable matrix X_j at the j-th time.
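The formula image at [0117] did not survive extraction, so the sketch below is only a plausible reading of [0118]: a τ-weighted per-joint deviation between the state matrices at times i and j, normalized by their ranks |X_i| and |X_j|. The function names and the exact combination of terms are assumptions, not the patent's formula.

```python
import numpy as np

def fusion_threshold(X_i, X_j, tau):
    """Hypothetical fusion threshold D(X_i, X_j): a tau-weighted per-joint
    deviation between state matrices at times i and j, normalized by the
    ranks |X_i| and |X_j|. Illustrative only; the patent's exact formula
    was not recoverable from the source."""
    per_joint = np.linalg.norm(X_i - X_j, axis=1)   # deviation of joint q's row
    weighted = float(np.dot(tau, per_joint))        # apply weights tau_q
    rank_i = np.linalg.matrix_rank(X_i)             # |X_i|
    rank_j = np.linalg.matrix_rank(X_j)             # |X_j|
    return weighted / (rank_i * rank_j)

def meets_fusion_condition(D):
    # threshold band from step S4: 0.01 < D < 0.3 permits smooth fusion
    return 0.01 < D < 0.3

# Toy 3-joint state matrices with equal joint weights
X_i = np.eye(3)
X_j = 1.1 * np.eye(3)
tau = np.ones(3) / 3
D = fusion_threshold(X_i, X_j, tau)
```

For these toy matrices each joint's deviation is 0.1, so D = 0.1 / (3 × 3) ≈ 0.011, which falls inside the fusion band and would be accepted.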

[0119] As a preferred embodiment of the present invention...



Abstract

The invention provides a human body action recognition method and system, and the method comprises the steps: monitoring the motion parameters of human body joints in a human body coordinate system in real time, and constructing a state variable prediction matrix calculation model at the next moment, a state vector error prediction value calculation model at the next moment, and a fusion threshold calculation model; according to the constructed state variable prediction model matrix at the next moment and the state vector error prediction value calculation model, performing gain correction on a state vector error prediction value at the next moment, and constructing a human body state variable matrix at the next moment and an accurate state vector prediction value calculation model at the next moment; judging whether the state variable predicted value deviation at two different moments meets a fusion threshold condition or not; and if yes, outputting a fused human body action recognition result, and if not, continuing to repeat the steps. According to the method, the human body motion parameters at each moment can be fused smoothly, the finally output state parameter results are fused smoothly and uninterruptedly, and accurate human body motion state data within the monitoring time range are formed.
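The predict, gain-correct, and fuse loop summarized in the abstract can be sketched as a conventional Kalman filter with a deviation gate between consecutive corrected states. Everything below (A, H, Q, R, the default gate band, the function name) is an assumption for illustration, not the patent's actual models.

```python
import numpy as np

def recognize_motion(measurements, A, H, Q, R, x0, P0, d_low=0.01, d_high=0.3):
    """Sketch of the abstract's pipeline, assuming a standard Kalman filter
    realizes the 'gain correction': predict the next-moment state, correct
    it with the measured joint coordinates, and accept a state into the
    fused output while the deviation from the previous fused state stays
    inside the assumed threshold band (d_low, d_high)."""
    x, P = x0, P0
    fused = [x]
    for z in measurements:
        # prediction of state and error covariance (S1/S2)
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # gain correction of the prediction (S4): Kalman gain K
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (z - H @ x_pred)
        P = (np.eye(len(x)) - K @ H) @ P_pred
        # fusion threshold check between successive corrected states
        d = np.linalg.norm(x - fused[-1])
        if d_low < d < d_high:
            fused.append(x)          # deviation is smooth: accept the state
        # otherwise skip this state and continue with the next measurement
    return fused

# Toy 2-dimensional run with identity models and one measurement
A = H = np.eye(2)
Q = 0.001 * np.eye(2)
R = 0.01 * np.eye(2)
fused = recognize_motion([np.array([0.1, 0.1])], A, H, Q, R,
                         np.zeros(2), np.eye(2))
```

The single measurement pulls the corrected state to roughly (0.099, 0.099); its deviation from the initial state is about 0.14, inside the gate band, so it is appended to the fused trajectory.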

Description

technical field

[0001] The invention belongs to the technical field of modern sports service support, and in particular relates to a method and a system for recognizing human body movements.

Background technique

[0002] Human motion state monitoring or recognition technology and devices are widely used in kinematic analysis, medical diagnosis, personnel care, virtual reality and other fields.

[0003] In terms of sports training for athletes, most wearable devices on the market, such as wristbands, only include data recording and analysis functions such as step counting, mileage and calorie consumption, and do not have motion recognition and counting functions for specific types of sports. In order to help coaches better analyze the performance of athletes in training and competition, it is desirable to record the number of technical movements performed by athletes on the field. In the past, human motion recognition and prediction methods such as Chinese...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06F3/01, A63F13/211, A63F13/212, G01S19/42
CPC: G06F3/011, A63F13/211, A63F13/212, G01S19/42, G06F2218/00
Inventor: 何磊, 何俊毅
Owner: SHANGHAI UNIV OF SPORT