
Human motion behavior recognition method

A human motion recognition technology, applied in the field of motion recognition, which addresses the problem that existing methods cannot represent the skeleton structure of the human body and the influence between joints well, and achieves the effect of reducing data storage and computing overhead

Pending Publication Date: 2022-08-05
国家体育总局体育信息中心
Cites: 0 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the above methods cannot represent the skeleton structure of the human body well. Human skeleton data is a graph structure in which each joint influences the joints it is connected to.
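
To make the graph-structure point concrete, the sketch below (not part of the patent; joint names and connections are illustrative) represents a skeleton as an adjacency list in Python, so that the joints directly influenced by any given joint can be looked up.

# Illustrative only (joint names and connections are not from the patent):
# the skeleton represented as a graph, so the joints directly influenced by a
# given joint can be looked up from an adjacency list.
skeleton_edges = [
    ("hip", "spine"), ("spine", "neck"), ("neck", "head"),
    ("spine", "l_shoulder"), ("l_shoulder", "l_elbow"), ("l_elbow", "l_wrist"),
    ("spine", "r_shoulder"), ("r_shoulder", "r_elbow"), ("r_elbow", "r_wrist"),
    ("hip", "l_knee"), ("l_knee", "l_ankle"),
    ("hip", "r_knee"), ("r_knee", "r_ankle"),
]

adjacency = {}
for a, b in skeleton_edges:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)

print(sorted(adjacency["spine"]))   # ['hip', 'l_shoulder', 'neck', 'r_shoulder']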

Method used


Image

Two drawings accompany the application, both labeled "Human motion behavior recognition method".

Examples


Specific Embodiment 1

[0014] Embodiment 1: As shown in Figure 1, the human motion behavior recognition method described in this embodiment includes the following steps:

[0015] Step 1: Use a motion capture system to collect data for multiple human movements. When collecting the data for each movement, 9-axis sensors are attached to all joints of the human body; as shown in Figure 2, each 9-axis sensor contains a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer, and the acquisition yields raw data comprising the three-axis accelerometer data, three-axis gyroscope data, three-axis magnetometer data and three-axis attitude angle data;
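
As a rough illustration of what the Step 1 acquisition might produce, the following Python sketch allocates a raw-data buffer with one 12-value record (3-axis accelerometer, gyroscope, magnetometer and attitude angles) per joint per frame. The joint count, frame rate and array layout are assumptions, not specified by the patent.

import numpy as np

# Sketch only: raw-data buffer for one recording.
# Each joint contributes 12 values per frame:
#   3-axis accelerometer + 3-axis gyroscope + 3-axis magnetometer + 3 attitude angles.
N_JOINTS = 17          # assumed joint count; the patent does not fix a number
CHANNELS = 12          # 3 acc + 3 gyro + 3 mag + 3 attitude angles

def new_recording(n_frames):
    """Allocate a raw-data buffer of shape (frames, joints, channels)."""
    return np.zeros((n_frames, N_JOINTS, CHANNELS), dtype=np.float32)

recording = new_recording(n_frames=300)   # e.g. 3 seconds at an assumed 100 Hz
# recording[t, j] = [ax, ay, az, gx, gy, gz, mx, my, mz, roll, pitch, yaw]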

[0016] Step 2: Construct a conditional random field (CRF) model of the type applied to text sequence data, λ = CRF(w1, w2, ..., wn), where w1 through wn are the model parameters;
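
The patent describes the model abstractly as λ = CRF(w1, ..., wn). One possible concrete realisation, shown here only as a hedged sketch, is a linear-chain CRF built with the open-source sklearn-crfsuite package; its learned feature weights play the role of the parameters w1 through wn, and the hyperparameter values below are illustrative rather than taken from the patent.

import sklearn_crfsuite   # third-party package: pip install sklearn-crfsuite

# A linear-chain CRF of the kind used for text sequence labelling. The feature
# weights the library learns play the role of the parameters w1..wn in the
# patent's lambda = CRF(w1, ..., wn). Hyperparameter values are illustrative.
crf = sklearn_crfsuite.CRF(
    algorithm="lbfgs",
    c1=0.1,                        # L1 regularisation strength (assumed)
    c2=0.1,                        # L2 regularisation strength (assumed)
    max_iterations=100,
    all_possible_transitions=True,
)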

[0017] Step 3: Convert the time series of raw human motion data covering all joints into feature vectors, label them with the corresponding motion types, and...
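
Continuing the Step 1 and Step 2 sketches above, the snippet below illustrates one way Steps 3 and 4 could look in practice: each frame of the raw time series is converted into a feature dictionary (the input format sklearn-crfsuite expects), sequences are paired with per-frame motion-type labels, and the CRF is trained and then used for recognition. The specific features, label names and per-frame labelling scheme are assumptions for illustration, not details from the patent.

import numpy as np
import sklearn_crfsuite

# Continues the Step 1 / Step 2 sketches: 'recording' stands in for one captured
# movement (frames x joints x 12 raw channels) and 'crf' is the CRF model.
recording = np.zeros((300, 17, 12), dtype=np.float32)    # placeholder raw data
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)

def frame_features(rec, t):
    """Turn one frame into a feature dict (the format sklearn-crfsuite expects)."""
    feats = {}
    for j in range(rec.shape[1]):
        ax, ay, az = rec[t, j, 0:3]
        roll, pitch, yaw = rec[t, j, 9:12]
        feats[f"j{j}_acc_norm"] = float((ax**2 + ay**2 + az**2) ** 0.5)
        feats[f"j{j}_roll"] = float(roll)
        feats[f"j{j}_pitch"] = float(pitch)
        feats[f"j{j}_yaw"] = float(yaw)
    return feats

def sequence_features(rec):
    return [frame_features(rec, t) for t in range(rec.shape[0])]

# Corpus: each sequence of feature dicts is paired with per-frame behavior labels.
X_train = [sequence_features(recording)]
y_train = [["walk"] * recording.shape[0]]   # illustrative label

crf.fit(X_train, y_train)          # Step 3: train the CRF on the corpus
y_pred = crf.predict(X_train)      # Step 4: recognize the motion behavior type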

Specific Embodiment 2

[0020] Embodiment 2: In this embodiment, a fixed number of latent sequences is defined. The CRF model is trained using, as the corpus, the time-series feature vectors of the raw human motion data, the latent sequences, and the human motion behavior types.

[0021] A fixed number of latent sequences is defined, and the semantic features of human motion behavior are assigned to these latent sequences; this greatly reduces the number of elements in the latent sequences and improves training efficiency.

[0022] For example, a latent sequence can be defined based on how much each joint changes between two frames. Suppose 10 joints are selected, and the variation of each joint in the X, Y and Z planes of space is quantized into 60-degree ranges, each combination of ranges being one element of the latent sequence; this defines a latent sequence containing 3³ × 10 = 270 elements.
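
A minimal sketch of how such a 270-element latent alphabet could be indexed is shown below, assuming each joint's per-frame angle change in the X, Y and Z planes is quantized into three 60-degree bins (covering 0 to 180 degrees); the bin edges and indexing scheme are illustrative, not taken from the patent.

import numpy as np

# Sketch of indexing a 270-element latent alphabet: for each of 10 selected
# joints, the per-frame angle change in the X, Y and Z planes is quantized into
# three 60-degree bins (covering 0-180 degrees), giving 3**3 = 27 states per
# joint and 27 * 10 = 270 elements overall. Bin edges are assumptions.
N_SELECTED_JOINTS = 10
BIN_EDGES = [60.0, 120.0]          # bins: [0, 60), [60, 120), [120, 180]

def latent_state(joint_idx, dx, dy, dz):
    """Map one joint's angle changes (degrees) to a latent element id in [0, 270)."""
    bx, by, bz = (int(np.digitize(abs(v), BIN_EDGES)) for v in (dx, dy, dz))
    per_joint = bx * 9 + by * 3 + bz      # 27 combinations per joint
    return joint_idx * 27 + per_joint

print(latent_state(joint_idx=4, dx=15.0, dy=75.0, dz=130.0))   # 4*27 + 0*9 + 1*3 + 2 = 113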

[0023] Latent sequences can also be defined based on the direction of motion of each joint. Assuming that 10 joints are selected, t...



Abstract

The invention relates to a human motion behavior recognition method, which comprises the following steps: step 1, using a motion capture system to acquire data for a plurality of human movements, obtaining raw data comprising triaxial accelerometer data, triaxial gyroscope data, triaxial magnetometer data and triaxial attitude angle data; step 2, constructing a CRF (Conditional Random Field) model of the type applied to text sequence data; step 3, converting the time series of the raw human motion data covering all joints into feature vectors, labeling them with the corresponding human motion behavior types, and using the feature vectors and behavior types as a corpus to train the CRF model; and step 4, given the time-series feature vector of the human motion data to be classified, performing human motion behavior recognition to obtain the corresponding human motion behavior type. Through the design of the feature vectors and the model structure, the invention reduces data storage overhead and time complexity while accurately recognizing human motions.

Description

Technical field

[0001] The invention belongs to the technical field of motion recognition, and in particular relates to a method for recognizing human motion behavior.

Background technique

[0002] Motion capture systems based on inertial measurement units (IMUs) are the most promising for commercial development: they can be used almost anywhere and allow motion data to be collected without being constrained by the scene. Several full-body human motion capture systems are already used in the computer graphics and animation industries. Data obtained with motion capture is more accurate than data collected from video. Human motion capture data, as a time series, records the spatiotemporal information of every node while the human body is in motion, and the information of the nodes at each moment describes a human body posture.

[0003] After the IMU unit collects the motion data, subsequent processing is required to...

Claims


Application Information

IPC (IPC-8): G06V40/20; G06V10/764; G06V10/774; G06V10/84; G06N7/00
CPC: G06V40/20; G06V10/764; G06V10/774; G06V10/84; G06N7/01
Inventor: 邱旭东, 崔利荣, 刘文浩, 孙瑜
Owner: 国家体育总局体育信息中心