Image style migration model training method and image style migration method

A model training and image-processing technology in the field of image processing, which solves problems such as filters being unable to truly transfer image style and the limited variety of available filters, achieving a controllable and fast style-transfer effect.

Active Publication Date: 2018-09-28
GUOXIN YOUE DATA CO LTD

Problems solved by technology

[0003] In the prior art, image style transfer is usually achieved by adding a filter corresponding to the desired image-processing effect. However, a filter merely adds a layer (mask) over the image without modifying the image's pixels, so the image itself does not truly undergo style transfer. Moreover, the variety of filters is limited, and style transfer cannot be performed for styles other than those the filters provide.



Examples


Example 1

[0066] Example 1: Subtract the pixel values of the pixels at corresponding positions in the R channel of the second feature map and the third feature map. On the second feature map, the values of pixels A, B, C, D, and E in the R channel are 235, 233, 232, 230, and 240, respectively. The values in the R channel of the pixels A', B', C', D', and E' on the third feature map, at positions corresponding to those pixels on the second feature map, are 125, 127, 124, 130, and 132. Subtracting the pixel values at corresponding R-channel positions of the second and third feature maps gives channel difference values of 110, 106, 108, 100, and 108 for the respective pixels.

[0067] Noise-elimination processing is then performed on the channel differences. The process is: detect whether the channel difference corresponding to each pixel in each channel is greater than 1; if it is greater than 1, calculate the channel loss of the...
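The subtraction in paragraph [0066] and the threshold check in paragraph [0067] can be sketched in plain Python. This is a minimal illustration using the R-channel values quoted above; the threshold of 1 is taken from the excerpt, while the variable names and the treatment of the truncated noise-elimination step are assumptions.

```python
# Per-channel difference between the second and third feature maps (Example 1).
# R-channel samples quoted in paragraph [0066]:
second_r = [235, 233, 232, 230, 240]  # pixels A..E on the second feature map
third_r = [125, 127, 124, 130, 132]   # corresponding pixels A'..E' on the third map

# Subtract the pixel values at corresponding positions.
diff_r = [s - t for s, t in zip(second_r, third_r)]
print(diff_r)  # [110, 106, 108, 100, 108]

# Paragraph [0067] (truncated in the excerpt) keeps only differences greater
# than 1 when computing the channel loss; smaller ones are treated as noise.
THRESHOLD = 1
contributing = [d for d in diff_r if d > THRESHOLD]
```

Here every difference exceeds the threshold, so all five pixels contribute to the channel loss.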

Example 2

[0081] Example 2: The second feature map includes three pixels A, B, and C. The pixel values of pixel A on the R, G, and B channels are 255, 167, and 220, respectively; the pixel values of pixel B on the R, G, and B channels are 250, 162, and 221, respectively; the pixel values of pixel C on the R, G, and B channels are 240, 150, and 190, respectively.

[0082] Then the pixel mean value of pixel A over the three color channels R, G, and B is: (255 + 167 + 220) / 3 = 214;

[0083] The pixel mean value of pixel B over the three color channels R, G, and B is: (250 + 162 + 221) / 3 = 211;

[0084] The pixel mean value of pixel C over the three color channels R, G, and B is: (240 + 150 + 190) / 3 = 193.

[0085] Assume that the pixels on the third feature map corresponding to the positions of pixels A, B, and C are A', B', and C', and that the pixel mean value of pixel A' over the three color cha...
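The per-pixel channel means in paragraphs [0082]-[0084] can be reproduced as follows. This is a sketch, not the patented implementation; note that integer (floor) division is assumed, since the excerpt reports 193 for pixel C rather than 580 / 3 ≈ 193.33.

```python
# Per-pixel mean over the R, G, B channels (Example 2, paragraphs [0081]-[0084]).
pixels = {
    "A": (255, 167, 220),
    "B": (250, 162, 221),
    "C": (240, 150, 190),
}

# Floor division reproduces the integer values quoted in the excerpt.
means = {name: sum(rgb) // 3 for name, rgb in pixels.items()}
print(means)  # {'A': 214, 'B': 211, 'C': 193}
```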

Example 3

[0106] Example 3: Assume that the first feature map includes three pixels A, B, and C. The pixel values of pixel A on the R, G, and B channels are 255, 167, and 220, respectively; the pixel values of pixel B on the R, G, and B channels are 250, 162, and 221, respectively; the pixel values of pixel C on the R, G, and B channels are 240, 150, and 190, respectively.

[0107] Normalizing the pixel values of each pixel in the first feature map across the different color channels means dividing each of those pixel values by 255.

[0108] For example, in Example 3 the normalized pixel values of pixel A on the R, G, and B channels are 255 / 255, 167 / 255, and 220 / 255, respectively; the normalized pixel values of pixel B on the three channels are 250 / 255, 162 / 255, and 221 / 255; the normalized pixel values of pixel C on the R, G, and B channels are 240 / 255, 150 / 255, and 190 / 2...
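The normalization step of paragraphs [0107]-[0108] can be sketched as follows; a minimal illustration using the pixel values from paragraph [0106], with the variable names being assumptions.

```python
# Normalization to [0, 1] by dividing each channel value by 255 (Example 3).
first_map = {
    "A": (255, 167, 220),
    "B": (250, 162, 221),
    "C": (240, 150, 190),
}

normalized = {
    name: tuple(v / 255 for v in rgb) for name, rgb in first_map.items()
}
# Pixel A normalizes to (1.0, 167/255, 220/255), and so on for B and C.
```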



Abstract

The invention provides an image style migration model training method and an image style migration method. The image style migration model training method comprises the following steps: obtaining a style reference image and a content image; inputting the style reference image and the content image into a first neural network, extracting a first feature vector from the content image and a second feature vector from the style reference image; on the basis of the first feature vector, reconstructing the content image to obtain the migration image of the content image; inputting the style reference image and the migration image into the first neural network, and extracting a third feature vector from the migration image; on the basis of the second feature vector and the third feature vector, calculating a hue loss between the style reference image and the migration image; and training the first neural network according to the hue loss. By use of this method, a trained image style migration model can be obtained at a higher speed.
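The hue-loss step of the abstract can be illustrated with a hypothetical sketch in plain Python. This is not the patented implementation: the feature extraction performed by the first neural network is stubbed out as lists of (R, G, B) tuples, and the hue loss is assumed here to be the mean absolute difference of per-pixel channel means, in the spirit of Example 2.

```python
def channel_means(feature_map):
    """Mean of R, G, B per pixel; feature_map is a list of (r, g, b) tuples."""
    return [sum(rgb) / 3 for rgb in feature_map]

def hue_loss(second_map, third_map):
    """Assumed hue loss: mean absolute difference between the per-pixel
    channel means of the style reference's feature map (second) and the
    migration image's feature map (third)."""
    m2, m3 = channel_means(second_map), channel_means(third_map)
    return sum(abs(a - b) for a, b in zip(m2, m3)) / len(m2)

# Toy feature maps standing in for the second and third feature vectors.
style_features = [(255, 167, 220), (250, 162, 221), (240, 150, 190)]
migration_features = [(125, 130, 120), (127, 131, 122), (124, 128, 119)]
loss = hue_loss(style_features, migration_features)
```

In training, this loss would be minimized so that the migration image's color statistics approach those of the style reference; a loss of zero means the two feature maps agree in per-pixel channel means.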

Description

Technical field

[0001] The present application relates to the technical field of image processing, and in particular to an image style transfer model training method and an image style transfer method.

Background technique

[0002] The purpose of image style transfer is to directionally change the texture, color, content, etc. of an image, so that the image changes from one style to another; for example, performing style transfer on landscape photos taken under dim light conditions to obtain images under bright light conditions.

[0003] In the prior art, image style transfer is usually achieved by adding a filter corresponding to the desired image-processing effect. However, a filter merely adds a layer (mask) over the image without modifying the image's pixels, so the image itself does not truly undergo style transfer. Moreover, the variety of filters is limited, and style transfer cannot be performed for styles other than those the filters provide.

Contents of the ...

Claims


Application Information

Patent Timeline: no application
Patent Type & Authority: Application (China)
IPC(8): G06T3/00, G06N3/08
CPC: G06N3/08, G06T3/0012
Inventor: 孙源良, 刘萌, 樊雨茂, 李彩虹
Owner: GUOXIN YOUE DATA CO LTD