
Model training method, video category detection method and device, electronic device and computer readable medium

A model training and classification technology applied in the field of computers. It addresses the problems of heavy data computation, large storage space, and low efficiency in video category detection, and achieves the effects of less data computation, a smaller storage footprint, and improved detection efficiency.

Active Publication Date: 2019-08-13
BEIJING QIYI CENTURY SCI & TECH CO LTD
Cites: 12 · Cited by: 28

AI Technical Summary

Problems solved by technology

[0004] The embodiments of the present application propose a model training method, a video category detection method, a device, an electronic device, and a computer-readable medium, in order to solve the technical problem in the prior art that a three-dimensional convolutional neural network occupies a relatively large amount of storage space and requires a large amount of computation when performing category detection on a video, resulting in low efficiency of video category detection.




Embodiment Construction

[0029] The application will be further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the related invention, not to limit it. In addition, it should be noted that, for ease of description, only the parts related to the relevant invention are shown in the drawings.

[0030] It should be noted that the embodiments in the application and the features in the embodiments can be combined with each other if there is no conflict. Hereinafter, the present application will be described in detail with reference to the drawings and in conjunction with embodiments.

[0031] Please refer to Figure 1, which shows a process 100 of an embodiment of the model training method according to the present application. The model training method includes the following steps:

[0032] Step 101: Obtain a sample set.

[0033] In this embodiment, the execution subject of the mo...
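
The remainder of the embodiment text is truncated above, but the steps already listed (obtain a sample set, then train a detection model on key-frame feature sequences, per the abstract) suggest the general flow. The sketch below is a minimal, hypothetical illustration of such a training loop; PyTorch, the GRU-based detection model, and all shapes, class counts, and hyperparameters are assumptions rather than the patent's actual implementation.

```python
# Hypothetical sketch of the training flow suggested by Step 101 onward:
# obtain a sample set of (key-frame feature sequence, category label) pairs,
# then fit a lightweight sequence classifier on top of the extracted features.
# PyTorch, the GRU head, and all shapes/hyperparameters are assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 10      # assumed number of video categories
FEATURE_DIM = 512     # assumed per-key-frame feature size
SEQ_LEN = 16          # assumed number of key frames per video


class VideoCategoryDetector(nn.Module):
    """Sequence classifier over per-key-frame feature vectors (illustrative)."""

    def __init__(self, feature_dim=FEATURE_DIM, num_classes=NUM_CLASSES):
        super().__init__()
        self.rnn = nn.GRU(feature_dim, 256, batch_first=True)
        self.head = nn.Linear(256, num_classes)

    def forward(self, feature_seq):            # (batch, seq_len, feature_dim)
        _, last_hidden = self.rnn(feature_seq)
        return self.head(last_hidden[-1])      # (batch, num_classes)


# Step 101: obtain a sample set (random tensors stand in for real features/labels).
features = torch.randn(32, SEQ_LEN, FEATURE_DIM)
labels = torch.randint(0, NUM_CLASSES, (32,))

model = VideoCategoryDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):                         # tiny loop just to show the flow
    optimizer.zero_grad()
    loss = criterion(model(features), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```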



Abstract

The embodiment of the invention discloses a model training method, a video category detection method and device, an electronic device and a computer readable medium. An embodiment of the video category detection method comprises the steps of extracting a key frame of a target video, and generating a key frame sequence; inputting the key frame sequence into a feature extraction model to obtain a feature information sequence corresponding to the key frame sequence; and inputting the feature information sequence into a video category detection model to obtain a category detection result of the target video. According to the method and the device, the video category detection efficiency is improved.
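
As a rough illustration of the three steps named in the abstract (key frame extraction, feature extraction, category detection), the following sketch wires them together end to end. The uniform frame sampling via OpenCV, the ResNet-18 backbone, the GRU classifier, and the category count are all assumptions; the patent does not specify these components here.

```python
# Illustrative end-to-end pipeline matching the abstract: extract key frames,
# run a per-frame feature extraction model, then feed the feature sequence
# into a video category detection model. The frame-sampling strategy,
# ResNet-18 backbone, GRU classifier, and category count are assumptions.
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms


def extract_key_frames(video_path, num_frames=16):
    """Uniformly sample frames as a simple stand-in for key-frame extraction."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    frames = []
    for i in range(num_frames):
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * total / num_frames))
        ok, frame = cap.read()
        if ok:
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    cap.release()
    return frames


preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Feature extraction model: a 2D CNN applied to each key frame (assumed backbone).
backbone = models.resnet18(weights=None)
backbone.fc = nn.Identity()                  # expose 512-d per-frame features

# Video category detection model: consumes the feature sequence (assumed head).
classifier = nn.GRU(512, 256, batch_first=True)
head = nn.Linear(256, 10)                    # 10 categories is an assumption


def detect_category(video_path):
    frames = extract_key_frames(video_path)               # key frame sequence
    batch = torch.stack([preprocess(f) for f in frames])  # (T, 3, 224, 224)
    with torch.no_grad():
        feats = backbone(batch).unsqueeze(0)              # (1, T, 512)
        _, hidden = classifier(feats)
        return head(hidden[-1]).argmax(dim=1).item()      # predicted category id
```

Operating on a handful of key frames with a 2D backbone and a small sequence head, rather than convolving over full frame stacks, keeps both model size and per-video computation modest, which is consistent with the efficiency effect claimed above.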

Description

Technical field

[0001] The embodiments of the present application relate to the field of computer technology, and specifically to model training methods, video category detection methods, devices, electronic equipment, and computer-readable media.

Background technique

[0002] With the development of computer technology, video applications have emerged. Users can use video applications to upload and publish videos. In order to ensure video quality and facilitate pushing videos to other users, it is usually necessary to determine the category of the content involved in a video uploaded by a user.

[0003] The related method usually extracts video features using a three-dimensional convolutional neural network and then classifies the video based on those features. However, because a three-dimensional convolutional neural network occupies a large amount of storage space and requires a large amount of computation, the efficiency of video category detection with this method is low...
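
For contrast with the related method described in the background, a minimal three-dimensional convolution baseline is sketched below; convolving jointly over time and space means every frame in the stacked clip passes through every 3D kernel, which is where the larger storage and computation costs arise. The layer sizes, clip shape, and category count are illustrative assumptions.

```python
# Rough sketch of the related 3D-CNN approach for contrast: convolving jointly
# over time and space means every frame in the stacked clip passes through
# every 3D kernel, which is where the larger storage and computation costs
# come from. Layer sizes, clip shape, and category count are assumptions.
import torch
import torch.nn as nn

baseline_3d = nn.Sequential(
    nn.Conv3d(3, 64, kernel_size=(3, 3, 3), padding=1),  # time x height x width kernels
    nn.ReLU(),
    nn.AdaptiveAvgPool3d(1),
    nn.Flatten(),
    nn.Linear(64, 10),                                    # assumed 10 categories
)

clip = torch.randn(1, 3, 16, 112, 112)    # (batch, channels, frames, height, width)
print(baseline_3d(clip).shape)            # torch.Size([1, 10])
```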


Application Information

IPC(8): G06K9/62, G06F16/75
CPC: G06F16/75, G06F18/24, G06F18/214
Inventors: 刘洁, 王涛, 蔡东阳, 刘倩
Owner: BEIJING QIYI CENTURY SCI & TECH CO LTD