An action video extraction and classification method based on moving object detection

A moving-target detection and classification technology, applied in character and pattern recognition, instruments, computer components and related fields. It addresses the problems of being unable to effectively locate the starting frame of an action and of the increased computation and recognition difficulty that frame-level subdivision would bring, achieving real-time video extraction and classification across different scenes.

Pending Publication Date: 2019-06-14
CHONGQING INST OF GREEN & INTELLIGENT TECH CHINESE ACADEMY OF SCI

AI Technical Summary

Problems solved by technology

For example, to recognize actions in a 5-minute 25 fps video stream, the stream is first divided into segments of 25 frames each, and several RGB or optical-flow frames are randomly sampled from each segment to represent it as input to a CNN feature-extraction network, which determines the action start point and identifies the action type. The sampled features, however, only represent the overall action information of the segment, while locating the starting point of an action with the highest accuracy requires examining every frame of the segment; such a method therefore cannot effectively locate the starting frame of an action, nor recognize the action type of each individual frame.
Of course, in theory the segments could be subdivided further, down to one frame per segment, to judge the starting point of the action, but this would greatly increase both the amount of computation and the difficulty of action recognition.
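The clip-level sampling scheme criticized above can be sketched as follows. This is a minimal illustration, not the patent's method; the function name, parameters, and the choice of 3 sampled frames per clip are hypothetical.

```python
import random

def clip_frame_samples(total_frames, clip_len=25, samples_per_clip=3, seed=0):
    """Split a frame sequence into fixed-length clips and randomly sample
    a few representative frame indices from each clip, as in the
    clip-level recognition scheme described in the text."""
    rng = random.Random(seed)
    samples = []
    for start in range(0, total_frames - clip_len + 1, clip_len):
        clip_indices = list(range(start, start + clip_len))
        samples.append(sorted(rng.sample(clip_indices, samples_per_clip)))
    return samples

# A 5-minute 25 fps stream has 5 * 60 * 25 = 7500 frames -> 300 clips,
# each represented by only a handful of frames.
clips = clip_frame_samples(5 * 60 * 25)
```

Because each 25-frame clip is collapsed into a few sampled frames, the start of an action can be localized only to clip granularity (here, one second), which is the limitation the passage points out.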

Method used


Examples


Embodiment

[0030] When a figure-skating coach teaches athletes the essentials of a movement or analyzes an opponent's movements, it is often necessary to compile a video collection for the athletes and edit out the movements of interest, such as forward outside jumps. This embodiment provides an action video extraction and classification method based on moving object detection; with reference to figure 1, the method consists of the following steps:

[0031] Step One:

[0032] Obtain figure-skating videos from event cameras and online video sources, covering different scenes, resolutions, frame rates, contrast levels, shooting angles, numbers of people filmed, shooting distances and other factors; establish a sports video database; identify the human body movements in the database as jumping, spinning, lifting, footwork, twisting, etc.; and store them, categorized and labeled, in the sports action library;
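Step One amounts to grouping labeled clips by action category together with their capture metadata. A minimal in-memory sketch of such an action library follows; the category names come from the text, but the function, file paths, and metadata keys are illustrative assumptions.

```python
from collections import defaultdict

# Action categories named in Step One of the embodiment.
ACTION_CATEGORIES = {"jump", "spin", "lift", "footwork", "twist"}

def build_action_library(labeled_clips):
    """Group labeled clips into an action library.

    labeled_clips: iterable of (clip_path, category, metadata) tuples,
    where metadata records capture conditions (fps, resolution, etc.).
    """
    library = defaultdict(list)
    for clip_path, category, metadata in labeled_clips:
        if category not in ACTION_CATEGORIES:
            raise ValueError(f"unknown action category: {category}")
        library[category].append({"path": clip_path, **metadata})
    return dict(library)

# Hypothetical entries, for illustration only.
library = build_action_library([
    ("videos/fs_001.mp4", "jump", {"fps": 25, "resolution": "1080p"}),
    ("videos/fs_002.mp4", "spin", {"fps": 30, "resolution": "720p"}),
])
```

Keeping capture conditions alongside each clip supports the stated goal of training a model that generalizes across scenes, resolutions, and frame rates.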

[0033] Step Two: Combine ...



Abstract

The invention discloses an action video extraction and classification method based on moving object detection, and belongs to the field of big data artificial intelligence deep learning. The method comprises the following steps of S1, collecting motion videos of various scenes and different qualities, establishing a motion video database, classifying and marking human body actions in the motion video database, and establishing a motion action library; S2, establishing a detection model of the moving object based on the video image by utilizing a deep learning technology; S3, training the moving target detection model by using the moving action library; and S4, taking the video shot by the user in real time as the input of the moving target detection model, judging the human body action category, and judging whether the action video needs to be extracted and stored in combination with the user demand. The method provided by the invention can automatically, accurately and quickly complete judgment of the motion category and the motion starting moment, and can adapt to video extraction and classification tasks in different scenes in real time.
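Step S4 judges the human action category of a live video and decides which segments to extract. A minimal sketch of the segment-extraction logic follows, assuming the trained model has already produced one action label per frame; the function name and the "none" background label are illustrative assumptions, not the patent's implementation.

```python
def action_segments(frame_labels, background="none"):
    """Given per-frame action labels, return (category, start, end)
    frame-index runs. Per-frame judgment lets the action start moment
    be located exactly, which clip-level sampling cannot do."""
    segments = []
    start = None
    current = background
    for i, label in enumerate(frame_labels):
        if label != current:
            if current != background:
                segments.append((current, start, i - 1))
            start = i
            current = label
    if current != background:
        segments.append((current, start, len(frame_labels) - 1))
    return segments

# Hypothetical per-frame predictions for a 50-frame video.
labels = ["none"] * 10 + ["jump"] * 15 + ["none"] * 5 + ["spin"] * 20
print(action_segments(labels))  # [('jump', 10, 24), ('spin', 30, 49)]
```

Each returned run gives both the action category and its exact starting frame, so the corresponding video interval can be extracted and stored according to the user's demand.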

Description

Technical Field

[0001] The invention relates to an action video extraction and classification method based on moving target detection, which belongs to the field of big data and artificial intelligence, and is especially suitable for human action recognition, intelligent sports, video editing and the like.

Background Technique

[0002] Action recognition technology is widely used in competitive sports, health examination, medical research, pedestrian navigation, rescue and other fields. For example, an infant's crawling action can be an important indicator of infantile cerebral palsy, so action recognition technology is particularly important for problems such as the analysis of infantile cerebral palsy and the correction of infant crawling actions.

[0003] Human action recognition usually relies on visual recognition technology. Traditionally, the extraction and classification of action videos has relied heavily on manual editing, which is inefficient.

[0004] In the pr...

Claims


Application Information

IPC(8): G06K9/00; G06K9/32; G06K9/62
Inventor: 张学睿, 张帆, 姚远, 郑志浩
Owner CHONGQING INST OF GREEN & INTELLIGENT TECH CHINESE ACADEMY OF SCI