
Fitness action recognition and evaluation method based on machine vision and deep learning

A machine-vision and deep-learning technology applied to character and pattern recognition, instruments, and computer components. It addresses problems such as physical injuries, home fitness failing to achieve the desired effect, and irregular fitness movements, improving scoring accuracy, scoring efficiency, and overall fitness efficiency, with strong reference value.

Inactive Publication Date: 2021-06-18
SHANGHAI UNIV OF ENG SCI

AI Technical Summary

Problems solved by technology

[0003] Owing to time and economic constraints, many people choose to exercise at home. Without a coach's real-time guidance and judgment, however, home exercise cannot achieve good fitness results and may even cause physical injuries through irregular movements.



Examples


Embodiment 1

[0038] A fitness action recognition and evaluation method based on machine vision and deep learning, as shown in Figure 1, includes the following steps:

[0039] S1: Obtain the video of the fitness action to be evaluated, and set the action type.

[0040] In this embodiment, the user can either start the computer camera to record the fitness movements or upload a video of the corresponding action, which the computer writes to the corresponding file. The video is stored in .MP4 format using the MJPG encoder at a frame rate of 30 fps; after cropping by the computer, the frame size is 640×480.
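The cropping step can be sketched as a simple center crop. This is a minimal pure-Python illustration: the 640×480 target comes from the text, while the centered crop policy is an assumption (a real system would use OpenCV's VideoCapture and resizing/cropping routines).

```python
def center_crop(frame, target_w=640, target_h=480):
    """Center-crop a frame (a list of rows of pixels) to target_w x target_h.

    Assumes the frame is at least as large as the target; the centered
    crop policy is an assumption, not stated in the patent text.
    """
    h, w = len(frame), len(frame[0])
    top = (h - target_h) // 2
    left = (w - target_w) // 2
    return [row[left:left + target_w] for row in frame[top:top + target_h]]

# Toy usage: crop an 8x6 "frame" of (row, col) pixels down to 4x2.
frame = [[(r, c) for c in range(8)] for r in range(6)]
small = center_crop(frame, target_w=4, target_h=2)
```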

[0041] S2: Establish a human skeleton model for each frame of video image.

[0042] Step S2 specifically includes:

[0043] S21: Select the COCO human body model and use the CMU human pose dataset to obtain the skeletal key points of each video frame. The skeletal key points include the nose...
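The COCO body model used by OpenPose-style estimators has 18 keypoints. The list below follows the common OpenPose ordering (the patent text only names "nose" explicitly, so the full ordering is an assumption), and the helper pairs raw detector output with keypoint names:

```python
# 18 keypoints of the OpenPose COCO body model (ordering follows the
# common OpenPose convention; the patent excerpt only names "nose").
COCO_KEYPOINTS = [
    "nose", "neck",
    "right_shoulder", "right_elbow", "right_wrist",
    "left_shoulder", "left_elbow", "left_wrist",
    "right_hip", "right_knee", "right_ankle",
    "left_hip", "left_knee", "left_ankle",
    "right_eye", "left_eye", "right_ear", "left_ear",
]

def build_skeleton(detections, min_conf=0.1):
    """Map raw detector output [(x, y, confidence), ...] onto named
    keypoints, dropping low-confidence points (threshold is an assumption)."""
    skeleton = {}
    for name, (x, y, conf) in zip(COCO_KEYPOINTS, detections):
        if conf >= min_conf:
            skeleton[name] = (x, y)
    return skeleton

sk = build_skeleton([(i, i, 1.0) for i in range(18)])
```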

Embodiment 2

[0086] In this embodiment, the present invention further includes step S6: if the action score is less than 60, prompting the wrong position of the action.

[0087] The specific steps of step S6 include:

[0088] S61: Determine whether the action score is less than 60 points; if yes, go to step S62, otherwise output the evaluation score.

[0089] S62: Obtain the scoring-standard type a_{m,n} corresponding to the failing evaluation score G_m, and obtain the error-action prompt corresponding to that scoring-standard type.

[0090] Taking the push-up action as an example: in Embodiment 1 the acquired action score is 65.1; since it is greater than or equal to 60, it is output directly.

[0091] Taking plank support as an example: the score thresholds 70, 80, 90, and 100 correspond to angle bounds of 150, 165, 180, and 195 and of 165, 175, 185, and 195, respectively. The identified angular feature 2-3-4 is 67°, i.e. x_{7,1} = 67°, which gives a_{7,1} = 0.72; the recognized angle feature 16-2...
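Embodiment 2's branch (output the score when it passes, otherwise prompt the wrong position) can be sketched as follows. The rule of selecting the lowest-scoring feature as the "wrong position" and the prompt strings are assumptions for illustration; the patent keys the prompt to the failing scoring-standard type.

```python
def evaluate(total_score, feature_scores, prompts, passing=60.0):
    """Step S6 sketch: if the action score is >= passing, output it;
    otherwise return the error prompt for the worst-scoring feature.
    (The worst-feature selection rule is an assumption -- the patent
    keys the prompt to the failing scoring-standard type.)"""
    if total_score >= passing:
        return total_score
    worst = min(feature_scores, key=feature_scores.get)
    return prompts[worst]

# Push-up example from Embodiment 1: 65.1 passes and is output as-is.
passed = evaluate(65.1, {"2-3-4": 0.72}, {"2-3-4": "straighten your elbows"})

# A failing plank: the weakest angle feature drives the prompt
# (feature names and prompt wording are hypothetical).
prompt = evaluate(
    52.0,
    {"2-3-4": 0.72, "16-2-3": 0.40},
    {"2-3-4": "straighten your elbows", "16-2-3": "raise your hips"},
)
```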



Abstract

The invention relates to a fitness action recognition and evaluation method based on machine vision and deep learning, comprising the following steps: obtaining a fitness action video to be evaluated and setting the action type; establishing a human skeleton model for each video frame; extracting the frame whose action is closest to the standard action as the image to be evaluated; establishing scoring standards for obtaining the action score; and calculating the distance and angle features of the human skeleton model in the image to be evaluated, then scoring the action according to the scoring standard corresponding to the action type. Compared with the prior art, the method adopts a bottom-up human pose recognition algorithm to recognize fitness actions and obtain the corresponding scores, improving the accuracy and efficiency of scoring, effectively prompting wrong actions, and improving fitness efficiency.
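The frame-extraction step (picking the frame whose action is closest to the standard action) can be sketched as a nearest-neighbour search over per-frame feature vectors. The Euclidean metric is an assumption, since the abstract does not name the distance measure:

```python
import math

def closest_frame(frame_features, standard_features):
    """Return the index of the frame whose feature vector is nearest
    to the standard action's features (Euclidean distance; the metric
    is an assumption, not specified in the abstract)."""
    def dist(f):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, standard_features)))
    return min(range(len(frame_features)), key=lambda i: dist(frame_features[i]))

# Toy usage: three frames' feature vectors vs. a standard pose vector.
idx = closest_frame([[0.0, 0.0], [0.9, 1.1], [2.0, 2.0]], [1.0, 1.0])
```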

Description

Technical field

[0001] The present invention relates to the fields of machine vision and sports health, in particular to a fitness action recognition and evaluation method based on machine vision and deep learning.

Background technique

[0002] As people pay more attention to physical health, more and more people improve their health through fitness. Traditional fitness guidance relies on the supervision and correction of a coach, requiring professionals to guide the exercise in a specific environment.

[0003] Owing to time and economic constraints, many people choose to exercise at home. Without a coach's real-time guidance and judgment, however, home exercise cannot achieve good fitness results and may even cause physical injuries through irregular movements.

Contents of the invention

[0004] The purpose of the present invention is to provide a fitness action recognition and evaluation method based...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/23; G06V20/48; G06N3/045; G06F18/22
Inventor 崔嘉亮钟倩文郑树彬彭乐乐文静林湧
Owner SHANGHAI UNIV OF ENG SCI