
A Video Segment Recommendation Method Based on Graph Convolutional Network

A technology relating to video clips and graph convolutional networks, applied in neural learning methods, biological neural network models, instruments, etc. It addresses problems such as cold-start items and achieves the effect of avoiding data sparsity.

Active Publication Date: 2021-05-04
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] In order to overcome the deficiencies of the prior art, the present invention proposes a video segment recommendation method based on a graph convolutional network, so as to make more accurate recommendations for users, especially for user groups with scarce historical data, and thereby better address the cold-start item problem.



Examples


Embodiment

[0070] In order to verify the effectiveness of the method of the present invention, a large number of video clips were collected from the video clip sharing platform Gifs.com as a data set. Each clip is described by a quadruple <u, v, t_s, t_e>, where u is the user id, v is the id of the clip's source video, t_s is the start time of the clip, and t_e is its end time. The original dataset includes 14,000 users, 119,938 videos, and 225,015 clip annotations. In this experiment, all clips are processed into a fixed duration of 5 s, and a threshold θ is set: when the overlap between a user's actual interaction clip and a clip in the data set exceeds θ, the user is considered to have given positive feedback on that clip. After this cutting, every clip has a fixed duration of 5 s, and the final data set D is obtained.
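Paragraph [0070] describes this preprocessing only in prose. The sketch below illustrates one possible implementation, assuming a fixed 5 s window centered on each annotated clip and an overlap ratio measured against that fixed length; the record layout, the helper names, and the value of θ are hypothetical and not taken from the patent.

```python
FIXED_LEN = 5.0  # fixed clip duration in seconds, as stated in [0070]

def trim_to_fixed_length(t_s, t_e, length=FIXED_LEN):
    """Center a fixed-length window on the annotated span (t_s, t_e)."""
    center = (t_s + t_e) / 2.0
    return center - length / 2.0, center + length / 2.0

def overlap_ratio(a_start, a_end, b_start, b_end):
    """Fraction of the fixed-length clip covered by the user's interaction span."""
    inter = max(0.0, min(a_end, b_end) - max(a_start, b_start))
    return inter / FIXED_LEN

def build_dataset(records, theta=0.5):
    """records: iterable of (u, v, t_s, t_e, user_start, user_end) tuples.
    theta is a placeholder; the patent does not give its value."""
    dataset = []
    for u, v, t_s, t_e, u_start, u_end in records:
        c_start, c_end = trim_to_fixed_length(t_s, t_e)
        positive = overlap_ratio(c_start, c_end, u_start, u_end) > theta
        dataset.append((u, v, c_start, c_end, int(positive)))
    return dataset
```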

[0071] The present invention uses five indicators, including MAP (Mean Average Precision), NMSD (Normalized M...
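Of the five indicators, only MAP (Mean Average Precision) is fully spelled out in the text above, and it has a standard definition. The minimal sketch below computes MAP over per-user ranked recommendation lists; the exact evaluation protocol (cut-offs, handling of users without positives) is an assumption, not taken from the patent.

```python
def average_precision(ranked_items, relevant):
    """AP for one user: ranked_items is the recommended order,
    relevant is the set of clips the user actually gave positive feedback on."""
    hits, precisions = 0, []
    for rank, item in enumerate(ranked_items, start=1):
        if item in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(precisions) if precisions else 0.0

def mean_average_precision(rankings, ground_truth):
    """rankings: {user: [clip ids in ranked order]};
    ground_truth: {user: set of clip ids with positive feedback}."""
    scores = [average_precision(items, ground_truth.get(u, set()))
              for u, items in rankings.items()]
    return sum(scores) / len(scores) if scores else 0.0
```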



Abstract

The invention discloses a method for recommending video clips based on a graph convolutional network, including: 1. constructing a user–clip rating matrix; 2. processing the user set and the video clip set to obtain a user embedding matrix and a video clip embedding matrix; 3. constructing a content-attribute-based bipartite graph from the user rating matrix; 4. feeding the constructed bipartite graph into the graph convolutional network and continuously updating the user embedding matrix; 5. using the graph convolutional network to compute predicted values of the users' preferences for clips, and recommending clips to users accordingly. The present invention can make more accurate recommendations for users, especially for user groups with scarce historical data, and thus better solves the cold-start item problem.
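The five steps above are described only at a high level. The following is a hedged sketch of how such a pipeline might look, assuming a LightGCN-style propagation over the user–clip bipartite graph built from the rating matrix; the number of layers, the exact propagation rule, and how content attributes enter the graph are assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class ClipGCNRecommender(nn.Module):
    def __init__(self, num_users, num_clips, dim=64, num_layers=2):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)   # step 2: user embedding matrix
        self.clip_emb = nn.Embedding(num_clips, dim)   # step 2: clip embedding matrix
        self.num_layers = num_layers

    def forward(self, norm_adj):
        # norm_adj: normalized sparse adjacency of the user–clip bipartite
        # graph built from the rating matrix (steps 1 and 3).
        x = torch.cat([self.user_emb.weight, self.clip_emb.weight], dim=0)
        layers = [x]
        for _ in range(self.num_layers):                # step 4: graph convolution
            x = torch.sparse.mm(norm_adj, x)
            layers.append(x)
        x = torch.stack(layers, dim=0).mean(dim=0)
        users, clips = torch.split(x, [self.user_emb.num_embeddings,
                                       self.clip_emb.num_embeddings])
        return users, clips

    def predict(self, norm_adj, user_ids):
        users, clips = self.forward(norm_adj)
        return users[user_ids] @ clips.T                # step 5: preference scores
```

In use, norm_adj would be a torch sparse tensor holding the symmetrically normalized bipartite adjacency; training (for example with a pairwise ranking loss) is omitted from this sketch.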

Description

Technical Field

[0001] The invention relates to the field of video recommendation, and in particular to a video segment recommendation method based on a graph convolutional network.

[0002] Background Technology

[0003] With the popularity of online videos, the number of videos has exploded in recent years. Faced with this large number of videos, how to effectively edit them and show each user the clips of greatest interest, so as to make more accurate video recommendations, has become a very urgent need.

[0004] Regarding techniques for extracting video clips, the most popular approach is to extract the most representative clips of a video based on the visual characteristics of its content, so that users can better preview the video. In order to better incorporate the interest preferences of user groups, for example, in 2016 Gygli et al. proposed using a neural network model to learn the characteristics of popular GIF animations on the Int...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F16/735, G06F16/783, G06N3/04, G06N3/08
CPC: G06F16/735, G06F16/783, G06N3/08, G06N3/045
Inventors: 吴乐, 杨永晖, 汪萌, 洪日昌
Owner: HEFEI UNIV OF TECH