
Action classifying method for small sample videos

A small-sample video technology applied in the computer field. It addresses the problem that using all video-frame information greatly increases computational cost, and achieves improved small-sample video action recognition.

Active Publication Date: 2019-08-30
FUDAN UNIV

AI Technical Summary

Problems solved by technology

However, this method of using all video-frame information greatly increases computational cost even as it improves accuracy.
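To illustrate the cost concern above: processing cost grows roughly linearly with the number of frames fed to the network, so a fixed sampling budget is a common alternative to using every frame. The sketch below shows plain uniform frame sampling; it is an illustrative example only, not necessarily the sampling strategy of this patent, and all names and numbers are hypothetical.

```python
# Uniformly sample a fixed budget of frame indices from a video, instead
# of processing all frames. Cost then depends on `budget`, not video length.

def uniform_sample(num_frames, budget):
    """Indices of `budget` frames spread uniformly over the video."""
    if budget >= num_frames:
        return list(range(num_frames))
    step = num_frames / budget
    # Pick the midpoint of each of `budget` equal-length spans.
    return [int(step * k + step / 2) for k in range(budget)]

print(uniform_sample(30, 5))  # -> [3, 9, 15, 21, 27]
```

With a budget of 5, a 30-frame and a 3000-frame video incur the same per-video network cost; only the stride changes.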




Embodiment Construction

[0027] The present invention is further described below through specific embodiments and accompanying drawings.

[0028] Figure 1 shows a comparison between the small-sample video action recognition setting proposed by the present invention, based on intelligent virtual human bodies, and the classic setting. Black denotes real-world videos; magenta denotes virtual-world videos. Classic small-sample video action recognition transfers from real training-set videos to real test-set videos of different actions; the proposed setting transfers from virtual training-set videos, generated with intelligent virtual human bodies, to real test-set videos of the same actions.
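The transfer setting in paragraph [0028] can be sketched as a nearest-prototype few-shot classifier: class prototypes are built from virtual support videos and a real query video is assigned to the closest prototype. This is a minimal illustration of the setting, not the patent's actual network; the feature vectors and all names below are hypothetical stand-ins for embeddings a deep network would produce.

```python
# Few-shot transfer sketch: prototypes from virtual (simulated) support
# videos, classification of a real query video by nearest prototype.

def prototype(embeddings):
    """Mean of a list of equal-length feature vectors."""
    n = len(embeddings)
    return [sum(v[i] for v in embeddings) / n for i in range(len(embeddings[0]))]

def classify(query, prototypes):
    """Return the label whose prototype is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda label: dist2(query, prototypes[label]))

# Toy 2-D features of virtual support videos for two actions.
support = {
    "wave": [[1.0, 0.1], [0.9, 0.2]],
    "jump": [[0.1, 1.0], [0.2, 0.9]],
}
protos = {label: prototype(vecs) for label, vecs in support.items()}

real_query = [0.95, 0.15]            # feature of a real test video
print(classify(real_query, protos))  # -> wave
```

The key point of the setting is that support and query share the same action classes, even though the support features come from virtual videos and the query from a real one.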

[0029] Figure 2 shows a schematic diagram of real test videos and the corresponding generated virtual training videos of the present invention. The real test videos come from real human actions such as waving, ...



Abstract

The invention belongs to the technical field of computers, and in particular relates to an action-classification method for small-sample videos. To address the limitation of the existing small-sample video action-classification setting, in which the classes of the training set and the test set do not overlap, the invention proposes a new small-sample video recognition paradigm based on intelligent virtual human bodies: a large number of virtual videos of the same actions are generated through the interaction of a 3D intelligent virtual human body with a virtual environment, providing training samples for a deep neural network. In addition, the invention provides a data-augmentation method based on video-segment replacement, which expands a limited data set by replacing a segment of an original video with video segments of similar semantics. Experiments show that the method greatly promotes small-sample video action recognition, is robust, and transfers well across algorithms.
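The segment-replacement augmentation described in the abstract can be sketched as follows. Here "similar semantics" is approximated simply by taking the donor segment from another video of the same action class; the patent's actual similarity criterion is not reproduced, and all names are hypothetical. Videos are represented as lists of frame identifiers.

```python
import random

def replace_segment(video, donor, seg_len, rng=random):
    """Return a copy of `video` with one contiguous segment replaced by a
    same-length segment taken from `donor` (a video of the same class)."""
    assert seg_len <= len(video) and seg_len <= len(donor)
    i = rng.randrange(len(video) - seg_len + 1)   # where to cut the original
    j = rng.randrange(len(donor) - seg_len + 1)   # where to read the donor
    return video[:i] + donor[j:j + seg_len] + video[i + seg_len:]

rng = random.Random(0)  # seeded for reproducibility
orig = ["a0", "a1", "a2", "a3", "a4", "a5"]
donor = ["b0", "b1", "b2", "b3", "b4", "b5"]
aug = replace_segment(orig, donor, seg_len=2, rng=rng)
print(aug)  # same length as orig, with a 2-frame donor segment spliced in
```

Repeating the call with different random positions (or different donor videos) yields many augmented variants from one original, which is the point of expanding a limited data set.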

Description

Technical field
[0001] The invention belongs to the technical field of computers, and in particular relates to a method for classifying actions in small-sample videos.
Background technique
[0002] With the rapid development of deep learning, many tasks in the field of computer vision have achieved good results, and video action recognition has gradually become a hot research topic for researchers at home and abroad. Many models can now achieve high recognition accuracy on existing video action recognition data sets, but most of them rely on large amounts of manually labeled data. In practical applications, it is more common for the video to be predicted to have only one or a few labeled samples. Small-sample video action recognition studies how to enable a network model to quickly learn video feature representations, and then perform action recognition, under the condition of very little labeled data.
[0003] The existing ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00; G06K9/62
CPC: G06V40/23; G06V20/40; G06F18/241
Inventors: 姜育刚, 傅宇倩, 付彦伟, 汪成荣
Owner FUDAN UNIV