
Image color expression mode migration method based on deep convolutional neural networks

A deep convolutional neural network technology, applied in the field of deep learning, addressing problems such as incomplete separation of image content and style, structural distortion of the result image, and destruction of natural-image structure information.

Active Publication Date: 2018-10-26
XI AN JIAOTONG UNIV


Problems solved by technology

[0003] In current style transfer methods, even with the powerful image feature representation ability of deep convolutional neural networks, the content and style of an image still cannot be completely separated, so the stylized image retains some structural information from the style image. For example, when performing style transfer between a natural image and an impressionist painting, the output image contains a large number of strokes resembling the painting, which destroys the structural information of the original natural image.
The main reason is that current mainstream style transfer methods use a VGG-19 (or VGG-16) network trained on the ImageNet dataset to extract the content and style representations of an image. That network is trained on a 1000-category object classification task, and the features a deep neural network extracts are task-oriented; therefore, even when statistical measures of the feature representations extracted by the classification-trained VGG-19 (or VGG-16) are used, structural information in small regions is still preserved, resulting in structurally distorted output images.
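The "statistical measure of the feature representation" referred to above is, in mainstream methods such as Gatys et al., the Gram matrix of a layer's feature maps. As a minimal illustration (not the patent's own network — the feature map below is a random stand-in for one VGG layer's activations), a Gram matrix can be sketched as:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a C x H x W feature map: channel-wise inner
    products, which discard spatial arrangement but, per the patent's
    critique, still leak small-region structural information."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten spatial dimensions
    return f @ f.T / (c * h * w)     # normalized C x C matrix

# Toy stand-in for one convolutional layer's activations.
rng = np.random.default_rng(0)
feat = rng.standard_normal((4, 8, 8))
g = gram_matrix(feat)
print(g.shape)  # (4, 4)
```

The style loss in such methods is then a squared difference between the Gram matrices of the generated image and the style image.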




Embodiment Construction

[0096] The present invention is further described in detail below in conjunction with specific embodiments, which illustrate rather than limit the invention.

[0097] The present invention extracts the style features of an image using a deep convolutional neural network pre-trained on an image style recognition task; this network extracts image color features more effectively. By contrast, the VGG-19 (or VGG-16) network introduces too much structural information when extracting image style features, so the generated result image destroys the structural information of the content image and suffers from distortion.

[0098] The deep convolutional neural network pre-trained for the image style recognition task is trained to distinguish natural images from impressionist painting images, and can therefore represent image color pattern features more effectively. The structure of the specific deep convolutional neural network proposed in the p...
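The patent's actual network architecture is truncated above. As a purely illustrative toy (not the disclosed network), the underlying idea — features learned from a natural-vs-impressionist discrimination task emphasize color statistics rather than structure — can be sketched with a logistic classifier over per-channel color means on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)

def color_features(img):
    """Per-channel mean color: a structure-free image statistic."""
    return img.mean(axis=(1, 2))

# Synthetic stand-ins: 'natural' images skew green/blue,
# 'impressionist' images skew warm (red/yellow).
natural = rng.random((50, 3, 8, 8)) * np.array([0.4, 0.8, 0.9]).reshape(1, 3, 1, 1)
paintings = rng.random((50, 3, 8, 8)) * np.array([0.9, 0.6, 0.3]).reshape(1, 3, 1, 1)

X = np.array([color_features(i) for i in np.concatenate([natural, paintings])])
y = np.array([0] * 50 + [1] * 50)

# Logistic regression trained by plain gradient descent.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
```

Because the discrimination task is solvable from color alone, the learned weights carry color-pattern information and no spatial structure — the property the patent attributes to its style-recognition network.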



Abstract

The invention provides a deep convolutional neural network for an image style recognition task, and an image color expression mode migration method. In the method, content representation features of the content image to be processed and of an initialized image are extracted by a deep convolutional neural network pre-trained on an object recognition task, and a content loss function is computed; style representation features of the initialized image and of the style image are extracted by the deep convolutional neural network pre-trained on the image style recognition task, and a style loss function is computed; the two are combined into a total loss function. A gradient descent algorithm then starts from the initialized image and iteratively optimizes in the image domain according to the total loss function, and the image that minimizes the total loss function is taken as the final result image. The method completes image color expression mode migration while avoiding masses of strokes resembling impressionist paintings in the output image and preserving the structural information of the original natural image.
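The optimization described in the abstract — gradient descent in the image domain on a weighted sum of a content term and a style term — can be sketched with toy losses (the real method uses deep-network features; here the content term compares pixels directly and the "style" term matches a simple color statistic, both hypothetical stand-ins):

```python
import numpy as np

def total_loss_and_grad(x, content, style_mean, alpha=1.0, beta=10.0):
    """Toy total loss: a content term pulling x toward the content
    image, plus a 'style' term matching the per-channel mean color
    of the style image. Returns (loss, gradient w.r.t. x)."""
    content_grad = 2.0 * (x - content)
    style_diff = x.mean(axis=(1, 2), keepdims=True) - style_mean
    style_grad = 2.0 * style_diff / (x.shape[1] * x.shape[2]) * np.ones_like(x)
    loss = alpha * np.sum((x - content) ** 2) + beta * np.sum(style_diff ** 2)
    return loss, alpha * content_grad + beta * style_grad

rng = np.random.default_rng(1)
content = rng.random((3, 16, 16))                       # content image
style_mean = np.array([0.8, 0.2, 0.1]).reshape(3, 1, 1) # style color statistic
x = content.copy()                                      # initialized image

for _ in range(200):
    loss, grad = total_loss_and_grad(x, content, style_mean)
    x -= 0.05 * grad   # gradient descent step in the image domain
```

The image `x` after the loop is the analogue of the abstract's "final result image": it stays close to the content image while its color statistics move toward those of the style image.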

Description

Technical field

[0001] The present invention relates to deep learning, and in particular to a deep convolutional neural network and an image color expression pattern migration method based on a deep convolutional neural network.

Background technique

[0002] With the development of deep learning, deep convolutional neural networks have shown a strong ability to separate and recombine the content and style of images, and image style transfer has made great progress. However, in the image style transfer task there is no precise definition of image style: the style of an image can be any visual attribute, such as texture or color, so there is no uniform standard for evaluating style transfer results.

[0003] In current style transfer methods, even with the powerful image feature representation ability of deep convolutional neural networks, the content and style of an image still canno...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/00, G06K9/62
CPC: G06F18/214, G06F18/24, G06T3/04
Inventors: 牟轩沁, 张保成
Owner: XI AN JIAOTONG UNIV