Style migration method for target edge sharpening

A style transfer technology for target edge sharpening, applied in the field of image processing, which can solve problems such as limited retention of the semantic content and spatial layout of images, blurred boundaries, and image distortion, and achieves the effects of reducing the number of parameters and increasing the depth of the network.

Active Publication Date: 2020-11-24
LANZHOU JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

[0003] In view of the problems that the image layout is destroyed, the boundaries between the foreground, the background, and other objects become blurred, the colors of the generated image are mixed with colors that do not appear in the style image, the preservation of the semantic content and spatial layout of the image is limited, and the generated image is distorted, the present invention proposes a style transfer method for target edge sharpening, which realizes structural constraint of the style image, texture synthesis, and parameter sharing between images with similar styles.

Examples

Embodiment 1

[0034] The present invention proposes a style transfer method for target edge sharpening. The principle of the invention is as follows. First, an encoder-decoder deep neural network is built: the encoder converts the input into a down-sampled feature map through successive convolutional layers and max-pooling layers, and the decoder up-samples the feature map using unpooling layers and convolutional layers. Second, while keeping the receptive field unchanged, large convolution kernels are replaced with small convolution kernels, which increases the depth of the network, reduces the number of parameters, and yields a more accurate matting mask. Third, a normalization layer is added to the conventional convolutional layers of the transfer network; this normalization layer learns affine parameters that match the statistics of the content image and the style image, realizing parameter sharing between images with similar styles and reducing the cost of the transfer model. ...
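To make the paragraph above concrete, the following is a minimal PyTorch-style sketch of such an encoder-decoder: small 3x3 kernels, max pooling that keeps its indices, unpooling in the decoder, and a statistics-matching normalization step. The module names, channel sizes, and the specific normalization (rescaling content features with the per-channel mean and standard deviation of the style features) are assumptions made for illustration only, not the patented implementation.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, in_ch=3, ch=64):
        super().__init__()
        # Two stacked 3x3 convolutions cover the same receptive field as one
        # 5x5 convolution, with fewer parameters and greater depth.
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        # Max pooling that also returns the indices needed for unpooling.
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2, return_indices=True)

    def forward(self, x):
        feat = self.block(x)
        down, idx = self.pool(feat)
        return down, idx

class Decoder(nn.Module):
    def __init__(self, ch=64, out_ch=3):
        super().__init__()
        # Up-sample by placing values back at the positions recorded by max pooling.
        self.unpool = nn.MaxUnpool2d(kernel_size=2, stride=2)
        self.block = nn.Sequential(
            nn.Conv2d(ch, ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, out_ch, kernel_size=3, padding=1),
        )

    def forward(self, x, idx):
        return self.block(self.unpool(x, idx))

def match_statistics(content_feat, style_feat, eps=1e-5):
    # Normalize the content features, then rescale them with the per-channel
    # mean and standard deviation of the style features.
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return (content_feat - c_mean) / c_std * s_std + s_mean

# Usage sketch with random tensors standing in for real content/style images.
encoder, decoder = Encoder(), Decoder()
content = torch.randn(1, 3, 256, 256)
style = torch.randn(1, 3, 256, 256)
c_feat, c_idx = encoder(content)
s_feat, _ = encoder(style)
stylized = decoder(match_statistics(c_feat, s_feat), c_idx)

The two stacked 3x3 convolutions in the sketch illustrate the kernel-replacement idea described above: the same receptive field as a single 5x5 kernel, but with fewer parameters and a deeper network.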

Abstract

The invention relates to the field of image processing, in particular to a style migration method for target edge sharpening, which comprises the following steps: establishing a deep neural matting network; performing feature map extraction, style image reconstruction, and content image reconstruction on an original style image and an original content image; optimizing a loss function generated from the style image and the content image; calculating a Euclidean distance between the original content image and an extracted content image feature map; calculating a mean square error between the original style image and an extracted style image feature map; determining a total loss function over the style image, the original style image, and the content image; and iteratively updating the error calculated by the total loss function and outputting the stylized image. According to the invention, structural constraint and texture synthesis of style images, as well as parameter sharing among images with similar styles, are realized.
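As a reading aid, the loss terms listed in the abstract can be sketched as follows. This is an assumed formulation for illustration only: the feature maps are taken to be tensors already extracted by the network, and the loss weights alpha and beta are illustrative values, not taken from the patent.

import torch
import torch.nn.functional as F

def content_loss(generated_feat, content_feat):
    # Euclidean distance between the generated and content feature maps.
    return torch.norm(generated_feat - content_feat, p=2)

def style_loss(generated_feat, style_feat):
    # Mean square error between the generated and style feature maps.
    return F.mse_loss(generated_feat, style_feat)

def total_loss(generated_feat, content_feat, style_feat, alpha=1.0, beta=10.0):
    # Weighted total loss that is minimized by iterative updates.
    return (alpha * content_loss(generated_feat, content_feat)
            + beta * style_loss(generated_feat, style_feat))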

Description

Technical field

[0001] The invention relates to the field of image processing, in particular to a style transfer method for target edge sharpening.

Background technique

[0002] Image artistic style rendering is an important research direction in the field of computer vision. Image artistic stylization has a large number of applications in the film industry, animation production, game rendering, and other fields. Artistic image stylization means that the semantic content of one image is represented in the style of another image. Traditional style transfer methods mostly rely on manual modeling, which requires professional experience and complex mathematical formulas, and they preserve semantic content and spatial constraints poorly. Owing to their powerful image representation capabilities, deep neural networks have quickly become a popular tool for image stylization, driving the development of many neural style transfer methods in recent years. Therefore, an efficient and accurate image styli...

Application Information

IPC(8): G06T3/00; G06T5/50; G06N3/08; G06N3/04
CPC: G06T5/50; G06N3/08; G06T2207/20221; G06T2207/20081; G06T2207/20084; G06N3/045; G06T3/04; Y02T10/40
Inventor: 沈瑜, 杨倩, 吴亮, 张泓国, 王霖, 梁丽, 王海龙, 李丹丹
Owner LANZHOU JIAOTONG UNIV