
Standing long jump evaluation method based on deep learning pose estimation

A pose estimation and deep learning technology, applied to neural learning methods, computing, computer components, etc., which addresses the low judgment accuracy of existing approaches and achieves high accuracy

Pending Publication Date: 2020-10-30
四川中科凯泽科技有限公司

AI Technical Summary

Problems solved by technology

Traditional human body posture recognition mainly acquires and recognizes postures from videos or still images; for judging standing long jump and similar sports, its accuracy is low.



Examples


Embodiment 1

[0028] A standing long jump evaluation method based on deep learning pose estimation, comprising the following steps:

[0029] A. Collect the image before take-off and scale the image. In this step, the image is preferably scaled to 640x480.

[0030] B. Define the long jump detection area. The long jump detection area spans the maximum width from the take-off point of the long jump to the landing point; there is no limit on its height, as long as it covers the detection range of the long jump area. Detect the feature points of the human body and complete the limb connections through the deep convolutional neural network, mainly covering the key points of the limbs below the head: the neck, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left thigh, right thigh, left knee, right knee, left ankle and right ankle.

[0031] C. Collect the final landing point of the human body and detect whether the final landing point of the human body is within the detection area.
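Together with step D stated in the abstract below (calculating the jump distance from the coordinates of the take-off point and the final landing point), these steps form a simple pipeline. The following is a minimal Python sketch under stated assumptions: detect_keypoints() is a hypothetical stand-in for the trained deep convolutional network, the jump is assumed to proceed left to right, and px_per_meter is an assumed pixel-to-meter calibration; the patent does not disclose a concrete implementation.

```python
import cv2

TARGET_SIZE = (640, 480)  # step A: images are scaled to 640x480


def detect_keypoints(image):
    """Hypothetical stand-in for the trained deep convolutional network.

    Expected to return a dict mapping keypoint names (e.g. "left_ankle")
    to (x, y) pixel coordinates, or None for undetected points.
    """
    raise NotImplementedError("plug in the trained pose-estimation network here")


def evaluate_jump(frame_before_takeoff, frame_at_landing, detection_area, px_per_meter):
    """Steps A-D as one pass (assumes the jump proceeds left to right)."""
    # Step A: scale captured frames to the working resolution.
    before = cv2.resize(frame_before_takeoff, TARGET_SIZE)
    landing = cv2.resize(frame_at_landing, TARGET_SIZE)

    # Step B: detect human body key points with the deep CNN.
    kps_before = detect_keypoints(before)
    kps_landing = detect_keypoints(landing)

    x0, y0, x1, y1 = detection_area  # rectangle from take-off line to farthest landing point

    def in_area(pt):
        return pt is not None and x0 <= pt[0] <= x1 and y0 <= pt[1] <= y1

    # Step C: keep only landing ankles that fall inside the detection area.
    landing_ankles = [kps_landing.get(k) for k in ("left_ankle", "right_ankle")]
    landing_ankles = [p for p in landing_ankles if in_area(p)]
    if not landing_ankles:
        return None  # final landing point outside the detection area: no valid result

    takeoff_ankles = [p for p in (kps_before.get("left_ankle"), kps_before.get("right_ankle")) if p]
    takeoff_x = max(p[0] for p in takeoff_ankles)  # toes at the take-off line
    landing_x = min(p[0] for p in landing_ankles)  # rearmost landing mark decides the result

    # Step D: convert the pixel separation into meters.
    return (landing_x - takeoff_x) / px_per_meter
```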

Embodiment 2

[0034] Based on the above embodiment, this embodiment specifically discloses the structure and construction method of the deep convolutional neural network used in the above embodiment, described in conjunction with the standing long jump.

[0035] Deep Convolutional Neural Network Training Dataset:

[0036] The deep convolutional neural network requires a large number of data sets as the basis for training, and the data set is a very important factor in evaluation accuracy. The human body key point training data set in this embodiment uses two kinds of data: COCO2016 and self-labeled data. COCO2016 is an open-source competition data set whose Keypoint Evaluation task covers the evaluation of human body key points. For the self-labeled data set, we mainly collect a large number of posture images, such as human dances, and use LabelMe to complete the labeling of the human body key points. The purpose o...
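As an illustration of how a self-labeled set can be merged with COCO-style keypoint annotations, here is a hedged Python sketch that converts one LabelMe file into a COCO-style keypoint record. It assumes each key point was annotated as a LabelMe "point" shape whose label matches a COCO keypoint name; the patent does not describe its labeling conventions, so these names and the file layout are assumptions.

```python
import json
from pathlib import Path

# COCO person keypoint order (17 points); the self-labeled data is mapped onto it.
COCO_KEYPOINTS = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]


def labelme_to_coco_keypoints(labelme_json_path):
    """Convert one LabelMe annotation file into a COCO-style keypoint record.

    Assumes each key point was labeled as a LabelMe 'point' shape whose label
    matches a COCO keypoint name; missing points get visibility flag 0.
    """
    data = json.loads(Path(labelme_json_path).read_text(encoding="utf-8"))
    points = {s["label"]: s["points"][0] for s in data["shapes"] if s["shape_type"] == "point"}

    keypoints = []
    for name in COCO_KEYPOINTS:
        if name in points:
            x, y = points[name]
            keypoints += [float(x), float(y), 2]  # v=2: labeled and visible
        else:
            keypoints += [0.0, 0.0, 0]            # v=0: not labeled
    return {
        "image": data["imagePath"],
        "keypoints": keypoints,
        "num_keypoints": sum(1 for i in range(2, len(keypoints), 3) if keypoints[i] > 0),
    }
```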

Embodiment 3

[0051] Based on the principle of Embodiment 1 and the deep convolutional neural network trained in Embodiment 2, this embodiment discloses a specific example.

[0052] A. Collect the image before take-off and scale the image to 640x480.

[0053] B. Designate the long jump detection area, such as area A in Figure 1. The feature points of the human body are detected through the deep convolutional neural network and the limbs are connected, mainly covering the key points of the limbs below the head: the neck, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left thigh, right thigh, left knee, right knee, left ankle and right ankle, as shown in Figure 1.

[0054] C. When the human body lands, judge whether the landing point is within the detection area. If so, determine which limb feature points of the human body are within the detection area. Delay for a certain period of time, such as 1 second or 2 seconds, t...


Abstract

The invention discloses a standing long jump evaluation method based on deep learning pose estimation. The standing long jump evaluation method comprises the following steps: A, collecting an image before take-off and scaling the image; B, defining a long jump detection area, detecting human body feature points through a deep convolutional neural network, and completing the limb connection; C, collecting the final landing point of the human body and detecting whether it is in the detection area; and D, calculating the standing long jump distance according to the coordinates of the take-off point and the final landing point. According to the scheme, the coordinates of the human body at these two points are obtained with a take-off point and landing point separation algorithm, giving the standing long jump distance; the human body pose estimation adopts a deep convolutional neural network, so the accuracy is high.
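The distance in step D follows from the take-off and landing coordinates once the camera view is calibrated. Below is a small hedged sketch of one plausible pixel-to-meter calibration, assuming the real-world length of the detection area along the jump direction is known; the patent does not disclose its exact calibration procedure, so the helper names and figures are illustrative only.

```python
def calibrate_scale(area_start_px, area_end_px, area_length_m):
    """Pixels per meter along the jump direction, from the known length of the detection area."""
    return abs(area_end_px - area_start_px) / area_length_m


def jump_distance_m(takeoff_x_px, landing_x_px, px_per_meter):
    """Standing long jump distance from the take-off and landing x-coordinates."""
    return abs(landing_x_px - takeoff_x_px) / px_per_meter


# Example: the detection area spans pixels 80..600 and is 3.0 m long on the ground.
scale = calibrate_scale(80, 600, 3.0)               # about 173.3 px per meter
print(round(jump_distance_m(100, 475, scale), 2))   # -> 2.16 (meters)
```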

Description

Technical Field

[0001] The present invention relates to the field of pose evaluation, and more specifically to a standing long jump evaluation method based on deep learning pose estimation.

Background Technique

[0002] The popularization and upgrading of the Internet have brought rapid changes to how people work and live. On the Internet, people can work, study, communicate with each other, make friends, and engage in entertainment. The Internet has brought many conveniences to people's lives, but its potential has not been fully tapped, so people hope to explore it further and raise living standards to a higher level. A typical example is the field of image processing, of which human body pose recognition is an important branch. Traditional human body posture recognition is mainly obtained and recognized through the ...


Application Information

IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/23; G06V10/44; G06N3/045; G06F18/22; G06F18/214
Inventor: 唐成
Owner: 四川中科凯泽科技有限公司