Action evaluation method based on human body three-dimensional articulation point detection

An action evaluation method based on three-dimensional human joint point detection, applied in the field of computer vision. It addresses the low accuracy of existing evaluation approaches, the lack of mature algorithms and products, and the problem of comparing actions performed over unequal time spans.

Active Publication Date: 2020-05-12
CHONGQING UNIV OF POSTS & TELECOMM
View PDF · 5 Cites · 31 Cited by

AI Technical Summary

Problems solved by technology

[0006] However, there is still a lack of effective solutions for using sports videos to analyze and evaluate the actions of target people.
One difficulty is that two-dimensional human posture estimation is easily affected by occlusion, and its accuracy is low for unconventional movements such as crossed limbs. A second difficulty is that physical differences between individuals, such as fat or thin, tall or short, make the Euclidean distance between joint points an inaccurate measure of action quality. A third difficulty is that the same action can be performed by different people at different speeds, so the videos cannot be compared and analyzed frame by frame.
[0007] To sum up, there is currently a lack of mature algorithms and products for the standard evaluation of motion in sports videos.
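The second difficulty above can be made concrete with a small sketch. This is an illustrative toy example (the skeleton coordinates and limb pairs are invented, not from the patent): the Euclidean distance between corresponding joints grows with body scale, while the cosine similarity of limb direction vectors is unaffected by it.

```python
import numpy as np

def limb_directions(joints, limbs):
    """Unit direction vector for each (parent, child) joint pair."""
    vecs = np.array([joints[c] - joints[p] for p, c in limbs], dtype=float)
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# Toy skeleton fragment: hip, knee, ankle (3-D coordinates, hypothetical).
pose = np.array([[0.0, 1.0, 0.0], [0.1, 0.5, 0.0], [0.1, 0.0, 0.1]])
limbs = [(0, 1), (1, 2)]            # hip->knee, knee->ankle

taller = 1.5 * pose                 # same pose performed by a larger person

# Euclidean distance between corresponding joints is substantial...
euclid = np.linalg.norm(pose - taller, axis=1).mean()

# ...but every limb direction is identical, so cosine similarity stays at 1.
d1 = limb_directions(pose, limbs)
d2 = limb_directions(taller, limbs)
cos_sim = np.sum(d1 * d2, axis=1).mean()

print(euclid, cos_sim)
```

This is why direction-based (vector) features, rather than raw joint distances, are better suited to comparing people of different builds.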


Image

  • Action evaluation method based on human body three-dimensional articulation point detection

Examples


Embodiment Construction

[0050] Embodiments of the present invention are described below through specific examples, and those skilled in the art can easily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other different specific implementation modes, and various modifications or changes can be made to the details in this specification based on different viewpoints and applications without departing from the spirit of the present invention. It should be noted that the diagrams provided in the following embodiments are only schematically illustrating the basic concept of the present invention, and the following embodiments and the features in the embodiments can be combined with each other in the case of no conflict.

[0051] Wherein, the accompanying drawings are for illustrative purposes only, and represent only schematic diagrams, rather than physical drawings, and should...



Abstract

The invention relates to an action evaluation method based on human body three-dimensional articulation point detection, which belongs to the field of computer vision and comprises the following steps:
  • S1, performing human body three-dimensional articulation point detection on each single-frame picture after video framing;
  • S2, extracting key frames, a specified number of frames, from the video;
  • S3, constructing motion vector features and joint kinetic energy features, and extracting feature values;
  • S4, constructing a key frame action similarity comparison model through multi-feature fusion: fusing the sub-features of step S3 and constructing a personalized model for different types of actions; constructing a motion vector feature similarity function based on cosine similarity and a joint kinetic energy similarity function based on a weighting function; and obtaining the key frame action similarity comparison model from the two similarity functions, comparing the action to be evaluated against the key frame set of the standard action, and finally obtaining the action similarity of the motion video.

The method is more accurate and scientific, and can be used for physical fitness action correction and teaching.
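The fusion in step S4 can be sketched as follows. This is a minimal illustration under stated assumptions: the fusion weight `ALPHA`, the per-joint mass weights, and the exact form of the kinetic-energy weighting function are all hypothetical choices, not the patent's actual model.

```python
import numpy as np

ALPHA = 0.6  # assumed fusion weight between the two sub-similarities

def motion_vector_similarity(vecs_a, vecs_b):
    """Average cosine similarity between corresponding motion vectors."""
    dots = np.sum(vecs_a * vecs_b, axis=1)
    norms = np.linalg.norm(vecs_a, axis=1) * np.linalg.norm(vecs_b, axis=1)
    return float(np.mean(dots / norms))

def joint_kinetic_energy(velocities, masses):
    """Per-joint kinetic energy 0.5 * m * |v|^2."""
    return 0.5 * masses * np.sum(velocities ** 2, axis=1)

def kinetic_energy_similarity(vel_a, vel_b, masses):
    """Weighted similarity that penalises the relative energy gap per joint."""
    ea = joint_kinetic_energy(vel_a, masses)
    eb = joint_kinetic_energy(vel_b, masses)
    gap = np.abs(ea - eb) / (np.maximum(ea, eb) + 1e-9)
    weights = masses / masses.sum()     # heavier joints count more (assumed)
    return float(np.sum(weights * (1.0 - gap)))

def keyframe_similarity(vecs_a, vecs_b, vel_a, vel_b, masses):
    """Multi-feature fusion of the two sub-similarities (step S4 sketch)."""
    s_vec = motion_vector_similarity(vecs_a, vecs_b)
    s_energy = kinetic_energy_similarity(vel_a, vel_b, masses)
    return ALPHA * s_vec + (1.0 - ALPHA) * s_energy

# Toy key frames: 4 joints with 3-D motion vectors and velocities.
rng = np.random.default_rng(0)
vecs = rng.normal(size=(4, 3))
vels = rng.normal(size=(4, 3))
masses = np.array([1.0, 2.0, 2.0, 1.5])

# A key frame compared with itself scores 1 under this fusion.
print(keyframe_similarity(vecs, vecs, vels, vels, masses))
```

Comparing a frame against a faster copy of itself lowers only the kinetic-energy term, which matches the intent of fusing a direction feature with an energy feature.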

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to an action evaluation method based on detection of three-dimensional joint points of the human body.

Background technique

[0002] With advances in artificial intelligence algorithms and computer image processing performance, pose estimation and behavior understanding of objects in videos has become a hot issue in the field of computer vision, and has been applied in many fields, such as sports auxiliary training, abnormal behavior detection, and gesture and gait recognition.

[0003] Human posture estimation can be widely used in various sports events. Human body motion recognition is used in physical education and fitness teaching: by capturing and analyzing motions, a personalized technical diagnosis report can be obtained, providing auxiliary training tools for athletes and coaches and improving the athletes' level.

[0004] The standard degree of...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/20, G06V20/42, G06V20/46, G06F18/23213
Inventor: 许国良, 李轶玮, 李万林, 文韬, 雒江涛
Owner: CHONGQING UNIV OF POSTS & TELECOMM