
Action prediction method based on human skeleton sequence

An action-prediction technology based on human skeleton sequences, applied to neural learning methods, instruments, biological neural network models, and the like. It addresses the problems that existing methods ignore the potential of a student model to learn from multiple teacher models and fail to capture the different importance of each teacher model, and achieves the effect of improving the model's prediction classification.

Pending Publication Date: 2022-06-03
SHENYANG AEROSPACE UNIVERSITY

AI Technical Summary

Problems solved by technology

However, such action prediction methods only consider a model pre-trained on full videos as the teacher. Regardless of the observation rate of the student model's current input sample, they impose strong alignment constraints on the features or soft labels output by that full-video pre-trained teacher. They do not make full and reasonable use of the prior knowledge carried by the observation rate, ignore the potential of a student model to learn from multiple teacher models, and fail to capture the different importance of different teacher models for each sample instance.



Detailed Description of the Embodiments

[0050] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.

[0051] This embodiment provides an action prediction method based on a human skeleton sequence (see Figure 1). The method includes the following steps:

[0052] S1: Use a graph convolutional network model to extract spatio-temporal features from the human skeleton sequence data to be predicted, and input these features into the trained multiple generations of student models to obtain multiple classification results (a code sketch of this step is given after step S2);

[0053] S2: Integrate multiple classification results to obtain the final prediction result.
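A minimal sketch of step S1, assuming PyTorch and illustrative names, shapes and hyperparameters (`SkeletonGCNBlock`, `StudentClassifier`, a 25-joint skeleton, 64 feature channels); this is not the patented implementation, only an illustration of graph-convolutional spatio-temporal feature extraction followed by several independent student classifiers:

```python
import torch
import torch.nn as nn

class SkeletonGCNBlock(nn.Module):
    """One spatial graph-convolution layer: joint features are mixed along the
    skeleton graph with a normalized adjacency matrix A, then linearly projected."""
    def __init__(self, in_channels, out_channels, adjacency):
        super().__init__()
        self.register_buffer("A", adjacency)           # (V, V) normalized adjacency
        self.proj = nn.Linear(in_channels, out_channels)

    def forward(self, x):                               # x: (N, T, V, C)
        x = torch.einsum("ntvc,vw->ntwc", x, self.A)    # aggregate neighboring joints
        return torch.relu(self.proj(x))

class StudentClassifier(nn.Module):
    """Lightweight student head: pools spatio-temporal features and classifies."""
    def __init__(self, channels, num_classes):
        super().__init__()
        self.fc = nn.Linear(channels, num_classes)

    def forward(self, feats):                           # feats: (N, T, V, C)
        return self.fc(feats.mean(dim=(1, 2)))          # global average pooling -> logits

# Toy usage: 25-joint skeleton, 30 observed frames, 3D coordinates, 60 classes,
# three generations of student models (all numbers are illustrative).
V, T, C, num_classes, num_students = 25, 30, 3, 60, 3
A = torch.eye(V)                                        # placeholder adjacency (self-loops only)
backbone = SkeletonGCNBlock(C, 64, A)
students = [StudentClassifier(64, num_classes) for _ in range(num_students)]

x = torch.randn(8, T, V, C)                             # batch of partially observed sequences
feats = backbone(x)
logits_per_student = [s(feats) for s in students]       # multiple classification results (step S1)
```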

[0054] The result fusion in step S2 may adopt weighted fusion, which can...
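As a hedged illustration of the weighted fusion in step S2 (the weights here are hand-picked placeholders; this excerpt does not disclose how the weights are determined), the student outputs can be combined as a convex weighted sum of their softmax probabilities, with the highest fused score taken as the final class:

```python
import torch

def weighted_fusion(logits_list, weights):
    """logits_list: list of (N, num_classes) tensors, one per student model.
    weights: list of non-negative floats, one per student model."""
    w = torch.tensor(weights, dtype=torch.float32)
    w = w / w.sum()                                          # normalize to a convex combination
    probs = torch.stack([l.softmax(dim=-1) for l in logits_list])  # (S, N, K)
    fused = torch.einsum("s,snk->nk", w, probs)              # weighted sum over the S students
    return fused.argmax(dim=-1)                              # final predicted class per sample

# Example with the three student outputs from the sketch above (weights are placeholders):
# prediction = weighted_fusion(logits_per_student, weights=[0.2, 0.3, 0.5])
```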



Abstract

The invention discloses an action prediction method based on a human skeleton sequence. The method makes full use of the topological graph structure of the human body, employs a graph convolutional network to extract rich and representative spatio-temporal features, and uses multi-teacher knowledge distillation so that low-observation-rate sequences learn prior knowledge from complete and high-observation-rate sequences, thereby achieving prediction on human skeleton sequences. Specifically, each sample instance adaptively selects the corresponding pre-trained teacher model according to its observation rate, and knowledge transfer learning is carried out. The multi-teacher model adaptively guides the generated student model, and a born-again (regenerative) network technique is used to iteratively train and generate multiple student models. Because classification performance is weak at low observation rates, the cross-entropy loss function is weighted by the observation rate of each sample instance during born-again training, thereby optimizing the student models. Finally, a weighted fusion strategy fuses the prediction results of the multiple student models into the final prediction classification result. The method has advantages such as high prediction accuracy.
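The sketch below illustrates, under stated assumptions, the two training ingredients named in the abstract: a cross-entropy term weighted by the sample's observation rate and a soft-label distillation term from the teacher matched to that observation rate. The weighting form `1 + (1 - obs_rate)`, the temperature, and the mixing factor `alpha` are illustrative assumptions, not values taken from the patent:

```python
import torch
import torch.nn.functional as F

def prediction_loss(student_logits, teacher_logits, labels, obs_rate,
                    temperature=2.0, alpha=0.5):
    """student_logits, teacher_logits: (N, K); labels: (N,); obs_rate: (N,) in (0, 1]."""
    # Per-sample cross-entropy, weighted so low-observation-rate samples count more
    # (assumed weighting form: weights range from 1 for full sequences up to ~2).
    ce = F.cross_entropy(student_logits, labels, reduction="none")
    ce = ((1.0 + (1.0 - obs_rate)) * ce).mean()

    # Soft-label distillation from the teacher selected for this observation rate.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    return alpha * ce + (1.0 - alpha) * kd
```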

Description

Technical Field

[0001] The present invention relates to the technical field of action prediction in computer vision, and in particular to an action prediction method based on human skeleton sequences.

Background Technique

[0002] Human actions usually take place in 3D space, and 3D skeleton sequences provide more comprehensive information than RGB videos captured by 2D cameras. With the rapid development of low-cost depth sensors such as Microsoft Kinect and Asus Xtion, accurate skeleton data can now be generated directly in real time, and capturing human skeleton motion has become more accurate, simple and convenient. Therefore, action prediction based on human skeleton sequences has attracted increasing attention in computer vision. This research direction is cutting-edge and challenging, and it has very broad practical application prospects, such as human-computer interaction, automatic driving, intelligent...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V40/20; G06K9/62; G06N3/04; G06N3/08; G06V10/774; G06V10/82
CPC: G06N3/08; G06N3/045; G06F18/214
Inventors: 刘翠微, 赵晓雪, 杜冲, 李照奎, 石祥滨
Owner: SHENYANG AEROSPACE UNIVERSITY