
Multi-person rope skipping analysis method based on video-audio multi-mode deep learning

A deep learning analysis technology, applied in the field of deep learning, that can solve problems such as the difficulty of accurately recording rope skipping results and the inability to rule out failed skipping attempts, and achieves the effect of accurate statistical analysis of rope skipping.

Pending Publication Date: 2022-05-10
开望(杭州)科技有限公司

AI Technical Summary

Problems solved by technology

At present, most solutions that use video to analyze rope skippers' performance rely on a single image stream to analyze the up-and-down fluctuation of the person in the frame. Such schemes cannot rule out failed attempts during the skipping process, so it is difficult to record rope skipping results accurately.



Examples


Embodiment Construction

[0031] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0032] The embodiment of the present invention discloses a multi-person rope skipping analysis method based on video-audio multimodal deep learning which, as shown in Figure 1, includes the following steps:

[0033] S1. Obtain the audio-visual file of the rope skipping process, and separate the video and audio in the audio-visual file to obtain a video image signal and a stereo audio signal;

[0034] S2. Carry out portrait detection and extraction for the video ...
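
As a rough illustration of steps S1 and S2, the following Python sketch separates the audio track from the recording with ffmpeg and collects skeleton keypoint coordinates for every tracked person in the video. It assumes ffmpeg and OpenCV are available and that a multi-person pose estimator with tracking exists; the names separate_audio, read_frames, analyse_video and pose_estimator are hypothetical placeholders, not code from the patent.

import subprocess
import cv2

def separate_audio(video_path, audio_path="skipping.wav"):
    # S1: strip the audio track from the audio-visual file (PCM WAV output).
    subprocess.run(
        ["ffmpeg", "-y", "-i", video_path, "-vn", "-acodec", "pcm_s16le", audio_path],
        check=True,
    )
    return audio_path

def read_frames(video_path):
    # S1: yield the video image signal frame by frame.
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        yield frame
    cap.release()

def analyse_video(video_path, pose_estimator):
    # S2 (sketch): detect and track people in each frame and collect the
    # skeleton feature point coordinates of every target portrait.
    keypoint_tracks = {}  # track_id -> list of per-frame keypoint arrays
    for frame in read_frames(video_path):
        # pose_estimator is an assumed callable returning {track_id: keypoints}
        # per frame, e.g. a multi-person pose model combined with a tracker.
        for track_id, keypoints in pose_estimator(frame).items():
            keypoint_tracks.setdefault(track_id, []).append(keypoints)
    return keypoint_tracks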


Abstract

The invention discloses a multi-person rope skipping analysis method based on video-audio multi-mode deep learning. The method comprises the following steps: obtaining an audio-video file of the rope skipping process and separating the video from the audio; carrying out portrait detection and extraction on the video image signal, tracking each target portrait, extracting the coordinates of the skeleton feature points of the target portrait, and preprocessing them; acquiring a single-channel audio signal, slicing and intercepting it, carrying out time-frequency transformation to obtain a frequency spectrum signal, and preprocessing the spectrum signal; fusing the preprocessed video signal and audio signal to obtain a video-audio fusion signal; passing the video-audio fusion signal through a bidirectional long short-term memory recurrent convolutional neural network and a cascaded fully connected network to obtain an output signal flow; and converting the output signal flow into a square wave signal, filtering the square wave signal, and carrying out statistical analysis on its rising or falling edges. Interference from people who are not taking the rope skipping test can be effectively filtered out, and more accurate rope skipping statistics are achieved.
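
To make the counting stage described in the abstract concrete, the sketch below shows one possible shape of the pipeline's tail end in Python/PyTorch: a short-time Fourier transform of the single-channel audio, concatenation-based fusion with the per-frame skeleton features, a bidirectional LSTM followed by cascaded fully connected layers emitting the output signal flow, thresholding into a square wave, a simple glitch filter, and rising-edge counting. The layer sizes, STFT window, 0.5 threshold, de-glitch rule, and the fusion-by-concatenation scheme are all assumptions for illustration and are not taken from the patent.

import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

def audio_spectrogram(mono_audio, sample_rate, hop=512):
    # Time-frequency transform of the sliced single-channel audio signal
    # (an assumed STFT configuration).
    _, _, Z = stft(mono_audio, fs=sample_rate, nperseg=1024, noverlap=1024 - hop)
    return np.abs(Z).T  # (time, freq) magnitude spectrum slices

def fuse_signals(pose_seq, spec_seq):
    # Fuse the preprocessed video features with the time-aligned audio
    # spectrum slices; plain concatenation is an assumed fusion scheme.
    fused = np.concatenate([pose_seq, spec_seq], axis=-1)
    return torch.from_numpy(fused).float().unsqueeze(0)  # (1, time, feat_dim)

class SkipCounterNet(nn.Module):
    # Bidirectional LSTM over the fused video-audio sequence, followed by
    # cascaded fully connected layers emitting one score per time step.
    def __init__(self, feat_dim, hidden=128):
        super().__init__()
        self.bilstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Sequential(nn.Linear(2 * hidden, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, fused_seq):                 # (batch, time, feat_dim)
        out, _ = self.bilstm(fused_seq)
        return self.head(out).squeeze(-1)         # (batch, time) output signal flow

def count_jumps(scores, threshold=0.5):
    # Convert the output signal flow into a square wave, filter out
    # single-sample glitches, and count rising edges as completed jumps.
    square = (scores.squeeze(0).detach().numpy() > threshold).astype(int)
    for i in range(1, len(square) - 1):
        if square[i] != square[i - 1] and square[i] != square[i + 1]:
            square[i] = square[i - 1]
    return int(np.sum((square[1:] == 1) & (square[:-1] == 0)))

In use, the per-person keypoint sequence and the spectrogram slices would first be resampled to a common frame rate so that fuse_signals can align them, and count_jumps would plausibly be applied to each tracked person's output separately, which mirrors how per-person analysis allows interference from non-participants to be filtered out.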

Description

Technical field

[0001] The present invention relates to the technical field of deep learning, and more specifically to a multi-person rope skipping analysis method based on video-audio multimodal deep learning.

Background technique

[0002] Rope skipping has a long history as a sport and game, and in today's fast-paced life it has also become a popular, fast and effective form of exercise.

[0003] At present, there are mainly two solutions on the market for intelligent rope skipping counting: one is built into the skipping rope itself, and the other uses video surveillance. Intelligence built into the skipping rope requires purchasing additional smart hardware, which makes it hard to promote, whereas video capture equipment is already very widespread. At present, most of the schemes that use video to analyze the performance of rope skippers use a single image to analyze the fluctuation of the portrait's up and down...

Claims


Application Information

IPC(8): G06V40/10; G06K9/62; G06V10/774; G10L19/008; G10L25/45
CPC: G10L25/45; G10L19/008; G06F18/214
Inventors: 朱亮亮, 熊杰
Owner: 开望(杭州)科技有限公司