
Efficient short video content intelligent classification method based on deep learning and attention mechanism

A classification method based on deep learning technology, applied in the field of computer vision, which takes both time performance and prediction accuracy into account while improving model performance.

Pending Publication Date: 2020-02-18
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] In order to address the shortcomings of automatic video content classification methods in the prior art, the present invention proposes an efficient short video content intelligent classification method based on deep learning and an attention mechanism. The method combines the advantages of two-dimensional and three-dimensional convolutional neural networks, designs a series deep learning structure with an attention mechanism, uses a pseudo-3D convolutional neural network in place of a traditional 3D convolutional neural network, and quickly classifies short video content.
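The key substitution named here, replacing a conventional 3D convolution with a pseudo-3D (factorized) convolution, can be illustrated with a minimal sketch. The PyTorch block below is a generic rendering of the factorization idea (a 1x3x3 spatial convolution followed by a 3x1x1 temporal convolution); the module name, channel counts, and layer arrangement are illustrative assumptions, not the patent's actual configuration.

```python
import torch
import torch.nn as nn

class PseudoConv3d(nn.Module):
    """Factorized stand-in for a k x k x k 3D convolution:
    a 1 x 3 x 3 spatial convolution followed by a 3 x 1 x 1 temporal one."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.spatial = nn.Conv3d(in_channels, out_channels,
                                 kernel_size=(1, 3, 3), padding=(0, 1, 1))
        self.temporal = nn.Conv3d(out_channels, out_channels,
                                  kernel_size=(3, 1, 1), padding=(1, 0, 0))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):          # x: (batch, channels, frames, height, width)
        x = self.relu(self.spatial(x))
        return self.relu(self.temporal(x))

# A (2, 64, 8, 56, 56) clip keeps its temporal and spatial size;
# only the channel dimension changes.
clip = torch.randn(2, 64, 8, 56, 56)
print(PseudoConv3d(64, 128)(clip).shape)   # torch.Size([2, 128, 8, 56, 56])
```

Factorizing the kernel this way replaces one large 3D kernel with two much smaller ones, which is where the efficiency gain over a traditional 3D convolutional neural network comes from.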

Method used

Figure 1 shows the overall framework of the network, a series connection of two convolutional neural networks; Figure 2 is a flow chart of the classification process; Figure 3 shows the detailed structure and convolution kernel parameters of the two-dimensional convolutional neural network.


Examples


Embodiment Construction

[0029] The present invention is further illustrated below with reference to specific embodiments.

[0030] As shown in Figure 1, the overall framework of the network is a series connection of two convolutional neural networks, which finally outputs the predicted probability of each category.
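A minimal sketch of such a two-stage series model is given below, assuming a small generic 2D backbone applied frame by frame, a pseudo-3D stage built from factorized spatial and temporal convolutions, and a softmax output over categories. All layer sizes and names are illustrative assumptions; the patent's actual networks are described only in its figures.

```python
import torch
import torch.nn as nn

class CascadeClassifier(nn.Module):
    """2D CNN over individual frames -> pseudo-3D CNN over the stacked
    per-frame feature maps -> predicted probability for each category."""
    def __init__(self, num_classes, feat_channels=64):
        super().__init__()
        # Shallow 2D stage: extracts spatial features frame by frame.
        self.cnn2d = nn.Sequential(
            nn.Conv2d(3, feat_channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_channels, feat_channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Pseudo-3D stage: factorized spatial (1x3x3) + temporal (3x1x1) convolutions.
        self.cnn3d = nn.Sequential(
            nn.Conv3d(feat_channels, 128, (1, 3, 3), padding=(0, 1, 1)), nn.ReLU(inplace=True),
            nn.Conv3d(128, 128, (3, 1, 1), padding=(1, 0, 0)), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),
        )
        self.fc = nn.Linear(128, num_classes)

    def forward(self, frames):                          # frames: (batch, N, 3, H, W)
        b, n = frames.shape[:2]
        f = self.cnn2d(frames.flatten(0, 1))            # (batch*N, C, h, w)
        f = f.view(b, n, *f.shape[1:]).transpose(1, 2)  # (batch, C, N, h, w)
        logits = self.fc(self.cnn3d(f).flatten(1))
        return torch.softmax(logits, dim=1)             # probability per category

probs = CascadeClassifier(num_classes=10)(torch.randn(2, 8, 3, 112, 112))
print(probs.shape, probs.sum(dim=1))                    # (2, 10), each row sums to 1
```

The softmax at the end corresponds to the normalized exponential function mentioned in the abstract; the predicted class would be the category with the highest probability.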

[0031] As shown in Figure 2, the specific process is as follows. First, frames are extracted uniformly from the short video, with the number of extracted frames set to N: all video frames are divided into N equal shares, one frame is randomly drawn from each share, and the selected frames are arranged in chronological order and fed into a two-dimensional convolutional neural network. The attention mechanism adopts the Squeeze-and-Excitation module.
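The segment-based sampling described above (split the frames into N shares, randomly pick one frame per share, keep chronological order) can be sketched as follows. The function name and the handling of clips shorter than N frames are assumptions added for completeness.

```python
import random

def sample_frames(num_video_frames: int, n: int) -> list[int]:
    """Split the frame indices into n contiguous shares and randomly pick one
    index from each share, so the result stays in chronological order."""
    if num_video_frames < n:            # assumption: very short clips repeat frames
        return [i % num_video_frames for i in range(n)]
    indices = []
    for k in range(n):
        start = k * num_video_frames // n
        end = (k + 1) * num_video_frames // n
        indices.append(random.randrange(start, end))
    return indices

print(sample_frames(300, 8))   # e.g. [12, 61, 80, 131, 160, 212, 255, 287]
```

Because each share covers a contiguous, non-overlapping span of the video, the sampled indices are automatically in chronological order while still varying from run to run.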

[0032] As shown in Figure 3, the details of the two-dimensional convolutional neural network are described, including the detailed network structure diagram and the convolution kernel parameters. Here, the two-dimensional convo...
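For reference, a standard Squeeze-and-Excitation block of the kind named in [0031] can be sketched as below; the reduction ratio and the exact position of the block inside the two-dimensional network are assumptions, since the excerpt is truncated before those parameters appear.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: global-average-pool the feature map, pass the
    channel descriptor through a small bottleneck, and rescale each channel."""
    def __init__(self, channels: int, reduction: int = 16):   # reduction ratio assumed
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                   # x: (batch, channels, H, W)
        w = self.fc(x.mean(dim=(2, 3)))     # squeeze -> per-channel weights in (0, 1)
        return x * w[:, :, None, None]      # excite: rescale each channel map

x = torch.randn(2, 64, 28, 28)
print(SEBlock(64)(x).shape)                 # torch.Size([2, 64, 28, 28])
```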



Abstract

The invention discloses an efficient short video content intelligent classification method based on deep learning and an attention mechanism, and relates to an efficient method for intelligently classifying short videos according to their content. The core algorithm model consists of a two-dimensional convolutional neural network and a pseudo three-dimensional convolutional neural network connected in series, which extract shallow spatial information and high-dimensional spatial and temporal information respectively. Finally, the probability that a video belongs to each category is obtained through a normalized exponential function, and the final predicted classification is derived from these probabilities. Time performance and prediction accuracy are both taken into account, so the method can be used for real-time content supervision and classification of short videos, and the results can serve as a reference for short video recommendation.

Description

Technical field
[0001] The invention belongs to the technical field of computer vision, and in particular relates to an intelligent classification method for short video content based on deep learning and an attention mechanism.
Background technique
[0002] As the fastest-growing form of Internet content dissemination in recent years, short videos have diverse content and a low threshold for production and distribution. The sheer volume of short videos makes supervision difficult, and illegal videos involving violence or pornography are easily mixed in. The present invention uses deep learning to automatically classify short video content, which can assist short video platforms in reviewing and supervising the videos uploaded by users. The classified short videos can also serve as reference factors for short video recommendation, allowing associated short videos to be recommended based on viewing history and enhancing the competitiveness of the short video platform.
[0003] Deep learning has become one of the ma...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/41, G06N3/048, G06N3/045
Inventors: 包秀平, 袁家斌, 陈蓓
Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS