Stadium auxiliary training method and system

A technology for sports venues and training methods, applied in the field of sports training methods and systems, which can solve the problem that such auxiliary training equipment is generally lacking in current stadiums and achieves the effect of convenient training.

Inactive Publication Date: 2020-02-04
BEIJING INSTITUTE OF TECHNOLOGY
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

There is a general lack of such equipment in current stadiums



Examples


Embodiment 1

[0043] Embodiment 1 provides a stadium auxiliary training method, as shown in figure 1, comprising the following steps:

[0044] S1. Use the camera to collect the video data of the athlete;

[0045] S2. Detect the skeletal joint points of the human body from the video data, and generate posture data of the athlete;

[0046] S3. Compare preset standard skeletal joint point posture data with the athlete's skeletal joint point posture data to detect whether the athlete's movement is standard (a minimal comparison sketch follows these steps);

[0047] S4. When the movement of the athlete is not standard, replay the movement of the athlete on the display screen, and mark the nonstandard position.
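The source does not specify how the comparison in S3 or the marking in S4 is computed; the sketch below is one plausible realization, assuming the posture data is an array of joint coordinates and that "standard" is judged by per-joint angle deviation against a tolerance. The joint map and the 15-degree tolerance are illustrative assumptions, not values from the patent.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by keypoints a-b-c; each point is (x, y) or (x, y, z)."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def flag_nonstandard_joints(athlete_pose, standard_pose, joint_triples, tol_deg=15.0):
    """Compare the athlete's joint angles with the preset standard pose (step S3).
    joint_triples maps a joint name to the (a, b, c) keypoint indices defining its angle.
    Returns the joints whose deviation exceeds tol_deg, to be highlighted in the replay (step S4)."""
    flagged = {}
    for name, (ia, ib, ic) in joint_triples.items():
        athlete = joint_angle(athlete_pose[ia], athlete_pose[ib], athlete_pose[ic])
        standard = joint_angle(standard_pose[ia], standard_pose[ib], standard_pose[ic])
        deviation = abs(athlete - standard)
        if deviation > tol_deg:
            flagged[name] = deviation
    return flagged
```

The returned dictionary can drive the replay display: each flagged joint name identifies where to draw a marker on the played-back frame.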

[0048] Traditional human skeleton key point detection uses template matching to detect the skeletal joint points and limb structure of the human body. However, if the detected skeletal joint points do not contain depth information, it is difficult to perform accurate analysis and ev...
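The patent does not name a specific keypoint detector for step S2. Purely as an illustration, the sketch below extracts 2D skeletal joint points per video frame with the open-source MediaPipe Pose model; the library choice and the video path are assumptions, not part of the source.

```python
import cv2
import mediapipe as mp

def extract_joint_points(video_path):
    """Yield, for each frame of the athlete's video, a list of
    (x, y, visibility) skeletal joint points in normalized image coordinates."""
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                yield [(lm.x, lm.y, lm.visibility)
                       for lm in result.pose_landmarks.landmark]
    cap.release()
```

These per-frame 2D joint points are what Embodiments 2 and 3 then augment with depth information.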

Embodiment 2

[0052] Embodiment 2 and Embodiment 3 provide two different methods for obtaining depth information of skeletal joint points. In Embodiment 2, at least two cameras are arranged in the stadium, and step S22 calculates the depth data of the skeletal joint points from video data captured from two or more viewing angles.

[0053] Depth estimation generally requires binocular positioning in three-dimensional space, because the image seen by a single eye is two-dimensional and two-dimensional information cannot represent three-dimensional space: a point far from the cameras produces a small offset between the two views, while a point close to the cameras produces a larger offset. This difference is caused by the three-dimensional structure of the scene. The geometric relationship when two cameras on the same horizontal line capture video is shown in image 3.

[0054] The depth of the scene is represented by z; the three-dimensional scene is mappe...
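The derivation above is truncated, but the behaviour it describes (small offset for far points, larger offset for near points) is the standard rectified-stereo disparity relation z = f * B / d. A minimal sketch under that assumption follows; the focal length and baseline values in the usage line are illustrative, not from the source.

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth z of a skeletal joint point seen by two horizontally aligned cameras:
    z = f * B / d, where d = x_left - x_right is the horizontal disparity in pixels,
    f is the focal length in pixels and B is the camera baseline in meters.
    A far point yields a small disparity, a near point a large one."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity

# example: 1000 px focal length, 0.2 m baseline, 25 px disparity -> 8 m depth
print(stereo_depth(640.0, 615.0, focal_px=1000.0, baseline_m=0.2))
```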

Embodiment 3

[0058] Embodiment 3 deploys only one camera and calculates the depth data of skeletal joint points from monocular video. The specific method is:

[0059] S221. Use the image from one side of a binocular video as the training input and the image from the other side as the reference standard to establish a neural network;

[0060] S222. Train the neural network on the binocular video data to obtain a neural network function that predicts the corresponding other-side image from an input image;

[0061] S223. Input monocular video data and use the neural network function to obtain the other-side image corresponding to each frame, thereby converting the monocular video data into binocular video data;

[0062] S224. Calculate the depth data of the skeletal joint points from the binocular video data.

[0063] Suppose we have a very powerful function F that takes the left image as the training input and the right image as the corresponding...
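The description of function F is cut off, but the training scheme of S221 and S222 (left image as input, right image as reference standard) can be sketched as a simple supervised image-to-image network. The tiny architecture, L1 photometric loss, and tensor sizes below are illustrative assumptions and not the patent's actual network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as Fn

class LeftToRightNet(nn.Module):
    """Stand-in for the 'powerful function F': predicts the right-view frame
    from the left-view frame (a real system would use a much deeper encoder-decoder)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, left):
        return self.net(left)

def train_step(model, optimizer, left, right):
    """One S221/S222-style step: the left frame is the input and the right
    frame is the reference standard; an L1 photometric loss drives the fit."""
    optimizer.zero_grad()
    loss = Fn.l1_loss(model(left), right)
    loss.backward()
    optimizer.step()
    return loss.item()

# usage with random stand-in frames (batch of 2 RGB frames, 128x128)
model = LeftToRightNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
left, right = torch.rand(2, 3, 128, 128), torch.rand(2, 3, 128, 128)
print(train_step(model, opt, left, right))
```

At inference time (S223), each monocular frame is passed through the trained network to synthesize the missing view, after which the stereo relation shown earlier recovers the depth of each joint point (S224).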



Abstract

The invention relates to a stadium auxiliary training method and system. The stadium auxiliary training method comprises the steps of collecting video data of an athlete through a camera; detecting skeletal joint points of the human body and generating posture data of the athlete; comparing preset standard skeletal joint point posture data with the skeletal joint point posture data of the athlete to detect whether the movement of the athlete is standard; and, when the movement of the athlete is not standard, playing back the movement of the athlete on a display screen and marking the non-standard position. According to the stadium auxiliary training method and system provided by the invention, the actions of the athlete are analyzed, evaluated and corrected by shooting video of the athlete, which greatly facilitates the athlete's training.

Description

Technical field

[0001] The invention relates to a physical training method and system, in particular to an auxiliary training method and system for stadiums.

Background technique

[0002] Nowadays, more and more people choose to engage in various sports such as basketball, badminton and table tennis in stadiums. Athletes generally learn various movements by watching teaching videos or by inviting coaches on site. Athletes very much hope that auxiliary training methods and equipment can be provided in the stadium: by taking videos of the athletes, their movements are analyzed, and evaluation and correction suggestions are given. Such equipment is generally lacking in present stadiums.

Contents of the invention

[0003] The object of the present invention is to address the deficiencies of the prior art and provide an auxiliary training method for stadiums, which can analyze, evaluate and correct the movements of the athletes by taking videos of t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/23, G06V20/42
Inventor: 黄天羽, 朱文涛
Owner: BEIJING INSTITUTE OF TECHNOLOGY