
An Action Evaluation Method Based on Human 3D Joint Detection

An action evaluation method based on human 3D joint-point detection, in the field of computer vision. It addresses the low accuracy of 2D pose estimation, its susceptibility to occlusion, and the lack of mature algorithms and products for action evaluation, and mitigates the problem of unequal action durations across performers.

Active Publication Date: 2022-07-01
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0006] However, there is still a lack of effective solutions for using sports videos to analyze and evaluate the actions of target people.
One difficulty is that two-dimensional human pose estimation is easily affected by occlusion and has low accuracy on unconventional movements such as crossed limbs. A second is that individuals differ physically, being heavier or thinner, taller or shorter, so evaluating action accuracy by the Euclidean distance between joint points performs poorly. A third is that the same action is performed by different people at different speeds, so the videos cannot be compared and analyzed frame by frame.
[0007] To sum up, there is currently a lack of mature algorithms and products for the standard evaluation of motion in sports videos.
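To make the second difficulty concrete, here is a minimal sketch, not taken from the patent text (the joint coordinates and helper names are illustrative), of why Euclidean distance between joints penalizes body-size differences while cosine similarity of limb vectors does not, the scale-invariance property that direction-based features rely on:

```python
import numpy as np

def limb_vector(joint_a, joint_b):
    """Direction and length of a limb as the vector between two 3D joints."""
    return np.asarray(joint_b, dtype=float) - np.asarray(joint_a, dtype=float)

# The same straight-up arm pose performed by a taller and a shorter person:
# identical direction, different limb length (coordinates are illustrative).
arm_tall = limb_vector([0.0, 0.0, 0.0], [0.0, 0.6, 0.0])
arm_short = limb_vector([0.0, 0.0, 0.0], [0.0, 0.4, 0.0])

# Euclidean distance between the end joints is nonzero even though the pose
# is identical, so it penalizes fat/thin and tall/short differences.
euclid = np.linalg.norm(arm_tall - arm_short)  # ≈ 0.2

# Cosine similarity depends only on direction, so it rates the two poses
# as identical regardless of limb length.
cosine = float(arm_tall @ arm_short /
               (np.linalg.norm(arm_tall) * np.linalg.norm(arm_short)))  # ≈ 1.0
```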



Embodiment Construction

[0050] The embodiments of the present invention are described below through specific examples, and those skilled in the art can readily understand other advantages and effects of the present invention from the contents disclosed in this specification. The present invention may also be implemented or applied through other specific embodiments, and the details in this specification may be modified or changed from different viewpoints and for different applications without departing from the spirit of the present invention. It should be noted that the drawings provided with the following embodiments only illustrate the basic idea of the present invention schematically, and the embodiments and the features in the embodiments may be combined with one another provided there is no conflict.

[0051] The accompanying drawings are for illustrative description only; they are schematic diagrams rather than physical drawings, and should not be co...


Abstract

The invention relates to an action evaluation method based on detection of three-dimensional human joint points, belonging to the field of computer vision, comprising the steps: S1, detecting the three-dimensional joint points of the human body in each single-frame picture of the framed video; S2, extracting a specified number of key frames from the video; S3, constructing motion-vector features and joint kinetic-energy features and extracting their feature values; S4, fusing the features to build a key-frame action-similarity comparison model: the sub-features from step S3 are integrated to build personalized models for different types of actions; a motion-vector similarity function is constructed from cosine similarity, and a joint kinetic-energy similarity function from a weighting function; the two similarity functions yield the key-frame action-similarity comparison model, which compares the key-frame sets of the action to be evaluated and the standard action and finally gives the similarity score for the motion video. The method is more accurate and scientific, and can be used for fitness action correction and physical education.
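As a rough illustration of steps S3 and S4, the sketch below pairs a cosine-similarity function over limb motion vectors with a weighting-function similarity over joint kinetic energy, then fuses the two. The limb list, the unit-mass kinetic-energy definition, and the fusion weight `w` are assumptions for illustration, not the patent's exact formulas:

```python
import numpy as np

# Illustrative limb list; the patent does not publish its exact feature set.
LIMBS = [("shoulder", "elbow"), ("elbow", "wrist"),
         ("hip", "knee"), ("knee", "ankle")]

def motion_vector_similarity(frame_a, frame_b):
    """Step S4, sub-feature 1: mean cosine similarity over corresponding
    limb vectors of two key frames (each frame maps joint name -> 3D point)."""
    sims = []
    for a, b in LIMBS:
        u = np.asarray(frame_a[b], float) - np.asarray(frame_a[a], float)
        v = np.asarray(frame_b[b], float) - np.asarray(frame_b[a], float)
        sims.append(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
    return float(np.mean(sims))

def joint_kinetic_energy(prev_frame, frame, dt=1.0):
    """Sub-feature 2: a kinetic-energy proxy, summing squared joint speeds
    between consecutive frames (unit joint masses assumed)."""
    return sum(float(np.sum(((np.asarray(frame[j], float) -
                              np.asarray(prev_frame[j], float)) / dt) ** 2))
               for j in frame)

def kinetic_energy_similarity(e_a, e_b):
    """An assumed weighting function: 1.0 when the energies match,
    decaying toward 0 as they diverge."""
    return 1.0 - abs(e_a - e_b) / max(e_a, e_b, 1e-9)

def key_frame_similarity(frame_a, frame_b, e_a, e_b, w=0.7):
    """Fused key-frame score; the weight w is an assumption, not the
    patent's personalized per-action-type parameter."""
    return (w * motion_vector_similarity(frame_a, frame_b)
            + (1.0 - w) * kinetic_energy_similarity(e_a, e_b))
```

Comparing identical key frames with equal kinetic energies yields a score of approximately 1.0; averaging `key_frame_similarity` over the matched key-frame sets of the test video and the standard video would then give the final video-level similarity.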

Description

technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to an action evaluation method based on detection of three-dimensional human joint points.

Background technique

[0002] With advances in artificial-intelligence algorithms and computer image-processing performance, pose estimation and behavior understanding of subjects in video have become a hot topic in computer vision, with applications in many fields such as sports-assisted training, abnormal-behavior detection, and gesture and gait recognition.

[0003] Human pose estimation can be widely applied to sports. In physical education and fitness teaching, human motion recognition captures and analyzes movements to produce a personalized technical diagnosis report, providing an auxiliary training tool for athletes and coaches and helping to improve athletes' performance.

[0004] The standard degree of ...
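The description is truncated here, but the CPC classification listed under Application Information includes G06F18/23213 (clustering with k-means), which suggests step S2 may select key frames by clustering per-frame features. A hedged sketch under that assumption, using a deterministic evenly spaced initialization rather than any initialization the patent may specify:

```python
import numpy as np

def extract_key_frames(frame_features, k, iters=50):
    """Step S2 (assumed): pick k representative frames by k-means over
    per-frame feature vectors, returning the index of the frame nearest
    each centroid. Initial centroids are frames evenly spaced in time,
    which is deterministic and a reasonable prior for video."""
    X = np.asarray(frame_features, dtype=float)
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        # Assign every frame to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid; keep the old one if its cluster empties.
        centroids = np.array([X[labels == c].mean(axis=0)
                              if np.any(labels == c) else centroids[c]
                              for c in range(k)])
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    # One key frame per cluster: the frame closest to that cluster's centroid.
    return sorted({int(dists[:, c].argmin()) for c in range(k)})
```

With a toy one-dimensional feature per frame forming three bursts of activity, the routine returns the central frame of each burst, giving a fixed number of key frames regardless of how fast the action was performed.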


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V40/20, G06V20/40, G06K9/62
CPC: G06V40/20, G06V20/42, G06V20/46, G06F18/23213
Inventor 许国良李轶玮李万林文韬雒江涛
Owner CHONGQING UNIV OF POSTS & TELECOMM