Video style migration method based on time domain consistency constraint

A video style transfer technology based on temporal consistency, applied in image data processing, instrumentation, computing, etc., can solve the problems that the time-domain correlation between frames is not considered, that the stylized video lacks long-term consistency, and that the coherence of the stylized video is poor, so as to achieve improved training speed, real-time performance, and better visual effects.

Active Publication Date: 2019-08-27
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

Although this method improves the speed of video style conversion, it does not consider the time-domain correlation between frames that are far apart, so the stylized video lacks long-term consistency and its coherence is poor.




Detailed Description of Embodiments

[0030] Embodiments and effects of the present invention will be further described below in conjunction with the accompanying drawings.

[0031] Referring to figure 1, the implementation steps of the present invention are as follows:

[0032] Step 1. Obtain the video training set V and the style image a, and calculate the optical flow information offline.

[0033] (1a) Obtain the video training set V and the style image a, where V contains N groups of video sequences, and each group of video sequences I_n includes four images, namely the first frame I_1, the fourth frame I_4, the sixth frame I_6 and the seventh frame I_7 of a video, where n ∈ {1, 2, ..., N};

[0034] (1b) Calculate, by an existing variational optical flow method, the optical flow information W_n = {w_(1,7), w_(4,7), w_(6,7)} between the different frame images and the optical flow confidence information C_n = {c_(1,7), c_(4,7), c_(6,7)}, where w_(i,7) denotes the optical flow from frame i to frame 7 and c_(i,7) denotes the optical flow confidence matrix between frame i and frame 7;
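The excerpt names an existing variational optical flow method without identifying it. As a rough sketch of step 1, the code below substitutes OpenCV's Farnebäck dense flow for the variational method and derives a binary confidence matrix from a forward-backward consistency check, one common way such matrices are built; the file layout, threshold, and helper names are assumptions, not the patent's.

```python
# Sketch of step 1 for one video group I_n = {I_1, I_4, I_6, I_7}.
# The patent uses a variational optical flow method; OpenCV's Farneback
# dense flow stands in for it here, and a forward-backward consistency
# check yields a stand-in confidence matrix c_(i,7).
import cv2
import numpy as np

def dense_flow(src_gray, dst_gray):
    # Positional args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    return cv2.calcOpticalFlowFarneback(src_gray, dst_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

def sample_with_flow(field, flow):
    # Sample `field` at positions displaced by `flow` (backward sampling).
    h, w = flow.shape[:2]
    gx, gy = np.meshgrid(np.arange(w), np.arange(h))
    mx = (gx + flow[..., 0]).astype(np.float32)
    my = (gy + flow[..., 1]).astype(np.float32)
    return cv2.remap(field, mx, my, cv2.INTER_LINEAR)

def flow_and_confidence(frame_i, frame_7, thresh=1.0):
    gi = cv2.cvtColor(frame_i, cv2.COLOR_BGR2GRAY)
    g7 = cv2.cvtColor(frame_7, cv2.COLOR_BGR2GRAY)
    fwd = dense_flow(gi, g7)                    # w_(i,7): frame i -> frame 7
    bwd = dense_flow(g7, gi)                    # frame 7 -> frame i
    # Where flow is reliable, following fwd then bwd should return to the start.
    err = np.linalg.norm(fwd + sample_with_flow(bwd, fwd), axis=-1)
    conf = (err < thresh).astype(np.float32)    # c_(i,7) as a binary matrix
    return fwd, conf

frames = {k: cv2.imread(f"group_n/frame_{k}.png") for k in (1, 4, 6, 7)}
W_n, C_n = {}, {}
for i in (1, 4, 6):
    W_n[(i, 7)], C_n[(i, 7)] = flow_and_confidence(frames[i], frames[7])
```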



Abstract

The invention discloses a video style transfer method based on a temporal consistency constraint, which mainly solves the prior-art problems of flickering and incoherence in video style transfer. The implementation scheme comprises the following steps: 1) obtaining a video data set and a style image, and calculating the optical flow information offline; 2) constructing a video style conversion network based on the temporal consistency constraint; 3) training the style conversion network model with the video data set, the style image and the optical flow information, updating the weights and biases of each layer; and 4) inputting a test video into the trained style conversion network model and outputting the stylized video as the result. The present invention fits the nonlinear mapping between the input video and the stylized video by training the style conversion network model, and renders real video in different styles under the guidance of this mapping, thereby improving the temporal consistency and visual quality of the stylized video. The method can be used in video processing scenes such as video rendering and style conversion.
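For concreteness, a minimal sketch of how the temporal consistency constraint described above could be imposed during training: each earlier stylized frame is warped toward stylized frame 7 with the precomputed flow and compared under the confidence mask. The patent excerpt does not specify a framework, a flow convention, or loss weights, so the PyTorch code, the backward-warping convention, and all names here are assumptions.

```python
# Sketch of a confidence-weighted temporal consistency loss over one group,
# assuming precomputed flows w_(i,7) and confidence matrices c_(i,7).
import torch
import torch.nn.functional as F

def backward_warp(frame, flow):
    # frame: (N,C,H,W); flow: (N,2,H,W) in pixels. Treats the flow as the
    # per-pixel sampling offsets for grid_sample (an assumed convention).
    n, _, h, w = frame.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    gx = (xs.to(frame) + flow[:, 0]) / (w - 1) * 2 - 1    # normalize to [-1, 1]
    gy = (ys.to(frame) + flow[:, 1]) / (h - 1) * 2 - 1
    grid = torch.stack((gx, gy), dim=-1)                  # (N,H,W,2)
    return F.grid_sample(frame, grid, align_corners=True)

def temporal_consistency_loss(stylized, flows, confs):
    # stylized: dict frame index -> stylized frame tensor; keys 1, 4, 6, 7.
    loss = stylized[7].new_zeros(())
    for i in (1, 4, 6):
        warped = backward_warp(stylized[i], flows[(i, 7)])
        diff = (stylized[7] - warped) ** 2
        loss = loss + (confs[(i, 7)].unsqueeze(1) * diff).mean()
    return loss
```

During training this term would be added to the usual content and style losses, so that pixels the confidence matrix marks as reliably tracked are pushed to keep the same stylization across near and far frames alike.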

Description

Technical field

[0001] The invention belongs to the technical field of video image processing, and in particular relates to a video style transfer method which can be used in video processing occasions such as video rendering and style conversion.

Background technique

[0002] Since the nineteenth century, not only artists have been exploring how to create more attractive works of art, but relevant personnel in the field of image processing have also been thinking about this issue. With the development of deep learning, in 2015 Leon A. Gatys et al., in "A Neural Algorithm of Artistic Style [J]. Computer Science, 2015.", proposed an algorithm that uses a deep neural network model to extract the style features of one image and transfer them to another image, which is called an "image style transfer algorithm". The style characteristics of artworks specifically include texture characteristics, color characteristics, brushstroke characteristics, contrast characteristics and changes ...
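For background: the Gatys et al. algorithm cited above represents an image's style as Gram matrices of feature maps from a pretrained convolutional network (VGG in the original paper), and transfers style by matching those statistics. A minimal sketch of that representation, with the layer cut and variable names chosen here for illustration:

```python
# Gram-matrix style representation from Gatys et al. (2015): channel-wise
# feature correlations computed on pretrained VGG feature maps. The single
# layer cut used here is illustrative; the paper matches several layers.
import torch
from torchvision.models import vgg19

def gram_matrix(feat):
    # (N,C,H,W) -> (N,C,C): correlations between channels, size-normalized.
    n, c, h, w = feat.shape
    f = feat.reshape(n, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

features = vgg19(weights="IMAGENET1K_V1").features[:9].eval()  # downloads weights
with torch.no_grad():
    style_a = torch.rand(1, 3, 256, 256)          # placeholder for style image a
    gram_target = gram_matrix(features(style_a))  # style statistics to match
```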

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/00
CPC: G06T3/04
Inventors: Dong Weisheng, Zhang Zhen, Xie Xuemei, Shi Guangming, Sun Lu
Owner: XIDIAN UNIV