Video clip recommendation method based on graph convolution network

A technology relating to video clips and graph convolutional networks, applied in neural learning methods, biological neural network models, special data processing applications, etc. It addresses problems such as the item cold-start problem, achieves good video highlight recommendations, and avoids the data-sparsity problem.

Active Publication Date: 2020-02-25
HEFEI UNIV OF TECH
View PDF · 6 Cites · 9 Cited by

AI Technical Summary

Problems solved by technology

[0007] To overcome the deficiencies of the prior art, the present invention proposes a video segment recommendation method based on a graph convolutional network, so as to make more accurate recommendations for users, especially for user groups with scarce historical data, and thereby better alleviate the item cold-start problem.

Method used



Examples


Embodiment

[0070] To verify the effectiveness of the method of the present invention, a large number of video clips were crawled from the clip-sharing platform Gifs.com as a data set. Each interaction record is a quadruple <u, v, t_s, t_e>, where u is the user id, v is the id of the clip's source video, t_s is the start time of the clip, and t_e is its end time. The original data set includes 14,000 users, 119,938 videos, and 225,015 segment annotations. In this experiment, all segments are processed into a fixed duration of 5 s, and a threshold θ is set: when the overlap between a user's actual interaction segment and a candidate segment in the data set exceeds θ, the user is considered to have given positive feedback on that segment. After cutting the data so that every segment has a fixed length of 5 s, the final data set D is obtained.
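The preprocessing described in [0070] can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the function names, segment length of 5 s, and the choice θ = 0.5 (interpreted as a fraction of segment length) are assumptions for the example.

```python
# Hypothetical sketch of the segment labeling in [0070]: cut a source
# video into fixed 5-second segments and mark a segment as positive
# feedback when its temporal overlap with the user's annotated clip
# (t_s, t_e) exceeds a threshold theta. theta = 0.5 is illustrative.

def overlap(seg_start, seg_end, clip_start, clip_end):
    """Length (in seconds) of the intersection of two time intervals."""
    return max(0.0, min(seg_end, clip_end) - max(seg_start, clip_start))

def label_segments(video_duration, clip_start, clip_end, seg_len=5.0, theta=0.5):
    """Cut a video into fixed-length segments and label positives.

    A segment counts as positive feedback when its overlap with the
    user's clip exceeds theta * seg_len.
    """
    labels = []
    t = 0.0
    while t + seg_len <= video_duration:
        ov = overlap(t, t + seg_len, clip_start, clip_end)
        labels.append((t, t + seg_len, ov > theta * seg_len))
        t += seg_len
    return labels

# e.g. a 20 s video whose user clip is annotated at [6 s, 13 s]:
for start, end, positive in label_segments(20.0, 6.0, 13.0):
    print(f"[{start:4.1f}, {end:4.1f}) positive={positive}")
```

Only the two middle segments overlap the clip [6 s, 13 s] by more than half a segment length, so only they are labeled positive.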

[0071] The invention uses five evaluation indicators, including MAP (Mean Average Precision) and NMSD (Normalized Meaningful Summary Duration...



Abstract

The invention discloses a video clip recommendation method based on a graph convolution network. The method comprises the steps of: 1, constructing a scoring matrix of users for video clips; 2, processing the user set and the video clip set to obtain a user embedding matrix and a video clip embedding matrix; 3, constructing a bipartite graph based on content attributes according to the scoring matrix; 4, inputting the constructed bipartite graph into a graph convolution network and continuously updating the user embedding matrix; and 5, calculating a preference prediction value of each user for each clip by means of the graph convolution network, thereby recommending clips to the user. The method makes more accurate recommendations for users, especially user groups with little historical data, and thus better alleviates the item cold-start problem.
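The five steps in the abstract can be sketched end to end. This is a minimal, assumption-laden sketch in plain NumPy, not the patented implementation: matrix sizes, the symmetric degree normalization, the single propagation layer, and the inner-product scoring are all illustrative choices.

```python
# Minimal sketch of the abstract's five steps: (1) user-clip scoring
# matrix, (2) embedding matrices, (3) normalized bipartite adjacency,
# (4) one graph-convolution propagation step, (5) preference scores.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_clips, d = 4, 6, 8

# Step 1: binary scoring/interaction matrix R (users x clips).
R = (rng.random((n_users, n_clips)) < 0.3).astype(float)

# Step 2: initial user and clip embedding matrices.
U = rng.normal(scale=0.1, size=(n_users, d))
V = rng.normal(scale=0.1, size=(n_clips, d))

# Step 3: symmetrically normalized bipartite adjacency D_u^{-1/2} R D_v^{-1/2}
# (degrees clamped to 1 to avoid division by zero for isolated nodes).
du = np.maximum(R.sum(axis=1), 1.0)   # user degrees
dv = np.maximum(R.sum(axis=0), 1.0)   # clip degrees
A = R / np.sqrt(du)[:, None] / np.sqrt(dv)[None, :]

# Step 4: one propagation layer updates each side from its neighbors.
U_new = U + A @ V        # users aggregate their clips' embeddings
V_new = V + A.T @ U      # clips aggregate their users' embeddings

# Step 5: predicted preference of every user for every clip.
scores = U_new @ V_new.T
print(scores.shape)      # (4, 6): one score per user-clip pair
```

A trained system would stack several such layers with learned weights and rank clips per user by their score; here one unweighted layer shows the data flow.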

Description

Technical field

[0001] The invention relates to the field of video recommendation, in particular to a video segment recommendation method based on a graph convolutional network.

[0002] Technical background

[0003] With the popularity of online video, the number of videos has exploded in recent years. Faced with a massive number of videos, how to effectively edit them and show each user the clips of greatest interest, so as to provide more accurate video recommendations, has become an urgent need.

[0004] Regarding techniques for extracting video segments, a popular approach is to extract the most representative segments of a video based on the visual features of its content, so that users can better preview the video. To better incorporate the interest preferences of user groups, Gygli et al., for example, proposed in 2016 to use neural network models to learn the features of popular GIFs on the Internet, so as to achieve automatic extraction of...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Application (China)
IPC(8): G06F16/735; G06F16/783; G06N3/04; G06N3/08
CPC: G06F16/735; G06F16/783; G06N3/08; G06N3/045
Inventor: 吴乐 (Le Wu), 杨永晖 (Yonghui Yang), 汪萌 (Meng Wang), 洪日昌 (Richang Hong)
Owner HEFEI UNIV OF TECH