Image style transfer method based on convolutional neural network

A technology relating to convolutional neural networks and style transfer, applied to graphic image conversion, image data processing, 2D image generation, etc.

Active Publication Date: 2017-07-14
UNIV OF ELECTRONICS SCI & TECH OF CHINA


Problems solved by technology

[0004] In view of the above existing problems or deficiencies, in order to solve the problem of efficiently performing styl...




Embodiment Construction

[0025] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0026] Figure 2 is the target content image and Figure 3 is the target style image. The goal is to generate Figure 4 so that it fuses the content of Figure 2 with the style of Figure 3.

[0027] Step 1. Select the deep convolutional neural network VGG-19, which achieved excellent results in the 2014 ImageNet image classification competition, as the high-level semantic feature extraction model Φ. Select Figure 2 as the target content image X_C and Figure 3 as the target style image X_S. Select ReLU2_2 as the content constraint layer and ReLU1_1, ReLU2_1, ReLU3_1, ReLU4_1 and ReLU5_1 as the style constraint layers, and set the convergence threshold ε = 5e-3 and the maximum number of iterations th = 200. A short illustrative setup for this step is sketched below.
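The following is a minimal sketch of Step 1, assuming a PyTorch/torchvision implementation (the patent does not specify a framework). The numeric indices mapping layer names such as ReLU2_2 to entries of torchvision's vgg19 feature stack are assumptions.

```python
import torch
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained VGG-19 used as the high-level semantic feature extractor Phi;
# its weights are frozen and never updated during style transfer.
vgg = models.vgg19(pretrained=True).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

# Assumed mapping from the patent's layer names to torchvision feature indices:
# relu1_1=1, relu2_1=6, relu2_2=8, relu3_1=11, relu4_1=20, relu5_1=29
CONTENT_LAYERS = {8: "relu2_2"}                      # content constraint layer
STYLE_LAYERS = {1: "relu1_1", 6: "relu2_1", 11: "relu3_1",
                20: "relu4_1", 29: "relu5_1"}        # style constraint layers

EPSILON = 5e-3    # convergence threshold from the embodiment
MAX_ITERS = 200   # highest number of iterations th
```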

[0028] Step 2. Input the target content image X_C into the convolutional neural network VGG-19 to calculate the filter response Φ(X ...
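Step 2 is truncated above, but the computation of the filter response Φ(X_C), together with a Gram-matrix style representation for the style image (a common choice that is assumed here, not quoted from the patent), can be sketched as a continuation of the previous snippet. The tensors x_c and x_s standing for the preprocessed content and style images are likewise assumptions.

```python
import torch

def extract_features(x, vgg, layer_ids):
    """Run x through the VGG feature stack and keep the filter responses
    Phi(x) at the requested layer indices."""
    feats = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layer_ids:
            feats[i] = x
    return feats

def gram_matrix(feat):
    """Channel-wise correlations of a feature map; a common style representation."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

# x_c and x_s are assumed to be the preprocessed content and style images,
# shaped (1, 3, H, W) and normalized with the ImageNet statistics VGG expects.
content_targets = {i: f.detach()
                   for i, f in extract_features(x_c, vgg, CONTENT_LAYERS).items()}
style_targets = {i: gram_matrix(f).detach()
                 for i, f in extract_features(x_s, vgg, STYLE_LAYERS).items()}
```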



Abstract

The invention belongs to the field of image processing and computer vision, and more particularly discloses an image style transfer method based on a convolutional neural network. Based on a high-level semantic representation in the convolutional neural network, an image content model and an image style model are established. An initial image is then optimized so that, within the same convolutional neural network, its content representation is similar to that of a content image and its style representation is similar to that of a style image. An image that fuses the content of the content image with the style of the style image is thereby generated, realizing the style transfer function. The method can achieve style transfer for any style image.
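As a rough illustration of the optimization described in the abstract, the sketch below (continuing the earlier snippets) minimizes a weighted sum of a content loss and a style loss with respect to the pixels of an initial image. The weights alpha and beta, the Adam optimizer, and the learning rate are illustrative assumptions, not values taken from the patent.

```python
import torch
import torch.nn.functional as F

def style_transfer(x_init, content_targets, style_targets, vgg,
                   alpha=1.0, beta=1e3, max_iters=MAX_ITERS, eps=EPSILON):
    """Optimize the pixels of x_init so its VGG content responses match the
    content targets and its Gram matrices match the style targets."""
    x = x_init.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([x], lr=0.05)   # illustrative optimizer choice
    prev = float("inf")
    for _ in range(max_iters):
        optimizer.zero_grad()
        c_feats = extract_features(x, vgg, CONTENT_LAYERS)
        s_feats = extract_features(x, vgg, STYLE_LAYERS)
        content_loss = sum(F.mse_loss(c_feats[i], content_targets[i])
                           for i in CONTENT_LAYERS)
        style_loss = sum(F.mse_loss(gram_matrix(s_feats[i]), style_targets[i])
                         for i in STYLE_LAYERS)
        loss = alpha * content_loss + beta * style_loss
        loss.backward()
        optimizer.step()
        if abs(prev - loss.item()) < eps:        # stop once the loss change falls below eps
            break
        prev = loss.item()
    return x.detach()

# Usage: start from a copy of the content image and let the optimization
# pull its style toward the style image.
# result = style_transfer(x_c, content_targets, style_targets, vgg)
```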

Description

Technical field

[0001] The invention belongs to the field of image processing and computer vision, relates to related technologies such as deep learning and image generation, and specifically relates to an image style transfer method based on a convolutional neural network.

Background technique

[0002] In daily life, whether taking pictures or painting, people often hope to give the result a certain style through post-editing. However, image editing and painting require high skill and rich experience, and it is difficult for ordinary people to realize style transfer without training.

[0003] The currently existing image style transfer methods are mainly realized by non-parametric algorithms. These methods can effectively transfer texture-primitive structures of the style image, such as color and thin edges, to the content image. However, these methods can only extract the low-level semantic features of the image and realize the primary style transfer of...


Application Information

IPC(8): G06T3/00, G06T11/00
CPC: G06T3/0012, G06T11/001
Inventors: 朱策, 夏志强, 向俊曌, 文宏雕, 虢齐, 王征韬
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA