
A Method for Propagation of Image Editing Based on Improved Convolutional Neural Networks

A method combining convolutional neural networks with image editing technology, applied in the field of image edit propagation, addressing problems such as poor model generalization, color overflow, and poor image colorization quality.

Active Publication Date: 2021-06-08
ZHEJIANG UNIV OF TECH


Problems solved by technology

[0005] The present invention aims to overcome the problems of demanding stroke requirements and poor model generalization in image edit propagation, and proposes a method for image edit propagation based on an improved convolutional neural network. The method can extract more reasonable image features and, at the same time, alleviates the color overflow that occurs during edit propagation.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more

Embodiment Construction

[0041] The present invention is further illustrated with reference to the accompanying drawings:

[0042] A method for image edit propagation based on an improved convolutional neural network, comprising the following steps:

[0043] 1) For an image to be processed, add color strokes to the image interactively to obtain the stroke map of Figure 1;

[0044] 2) Extract a training set and a test set from the image of step 1), used for model training and testing respectively;

[0045] 3) Use the combined convolution structure of Figure 2 to construct the two-branch convolutional neural network of Figure 5, and train it on the training set; the combined convolution is composed of the deformable convolution of Figure 3 and the separable convolution of Figure 4;

[0046] 4) Test the trained model to achieve the edit propagation effects shown in Figure 6;
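The separable half of the combined convolution in step 3) can be illustrated with a depthwise-separable convolution, which is what reduces parameter count relative to a standard convolution (C_in·k² + C_out·C_in parameters instead of C_out·C_in·k²). The sketch below is illustrative only, not the patent's implementation; the function name and the valid-padding, stride-1 choices are assumptions.

```python
import numpy as np

def depthwise_separable_conv(x, depthwise_k, pointwise_k):
    """Depthwise-separable convolution (valid padding, stride 1).

    x:            (C_in, H, W) input feature map
    depthwise_k:  (C_in, kH, kW) one spatial filter per input channel
    pointwise_k:  (C_out, C_in) 1x1 filters that mix channels
    """
    c_in, h, w = x.shape
    _, kh, kw = depthwise_k.shape
    oh, ow = h - kh + 1, w - kw + 1

    # Depthwise stage: each channel is filtered independently,
    # so spatial filtering costs C_in * k^2 parameters, not C_out * C_in * k^2.
    dw = np.zeros((c_in, oh, ow))
    for c in range(c_in):
        for i in range(oh):
            for j in range(ow):
                dw[c, i, j] = np.sum(x[c, i:i + kh, j:j + kw] * depthwise_k[c])

    # Pointwise stage: a 1x1 convolution mixes the filtered channels.
    out = np.tensordot(pointwise_k, dw, axes=([1], [0]))  # (C_out, oh, ow)
    return out
```

For a 3x3 kernel this factorization cuts the spatial-filtering parameters by roughly a factor of k² while keeping the receptive field, which matches the abstract's claim of fewer parameters and convolution operations.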

[0047] The function of this method is basically the same as that of existing edit propagation methods. Its improve...


Abstract

A method for realizing image edit propagation based on an improved convolutional neural network. First, combined convolution is introduced to replace traditional convolution; this structure extracts more reasonable image features while reducing the number of model parameters and convolution operations. At the same time, a biased loss function that up-weights misclassified background pixels is introduced to prevent the background class from being mis-colored and causing color overflow. The method comprises the following steps: adding strokes to an image to be processed interactively; extracting a training set and a test set from the image according to the strokes; training the model with the improved convolutional neural network; and testing with the trained model to finally achieve image colorization.
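The biased loss described above can be illustrated as a cross-entropy that up-weights background pixels the model mis-colors. This is a minimal NumPy sketch under assumed names (`biased_cross_entropy`, `bg_weight`); the patent's exact weighting scheme is not given here and may differ.

```python
import numpy as np

def biased_cross_entropy(probs, labels, background_class=0, bg_weight=2.0):
    """Cross-entropy with an extra penalty on misclassified background pixels.

    probs:  (N, K) predicted class probabilities per pixel
    labels: (N,)   true class index per pixel

    Pixels whose true class is `background_class` but whose argmax
    prediction is a foreground class are up-weighted by `bg_weight`,
    discouraging the model from colouring background regions
    (the "color overflow" the abstract mentions).
    """
    n = probs.shape[0]
    eps = 1e-12  # numerical guard against log(0)
    ce = -np.log(probs[np.arange(n), labels] + eps)

    pred = probs.argmax(axis=1)
    weights = np.ones(n)
    misclassified_bg = (labels == background_class) & (pred != background_class)
    weights[misclassified_bg] = bg_weight
    return np.mean(weights * ce)
```

With `bg_weight > 1`, a background pixel that the network confidently assigns to a stroke class contributes more to the loss than an ordinary error, biasing training away from bleeding stroke colors into the background.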

Description

Technical Field

[0001] The invention relates to a method for image edit propagation, in particular to a method for realizing image edit propagation based on an improved convolutional neural network.

Background Technology

[0002] With the development of digital multimedia hardware and the rise of software technology, the demand for image color processing continues to increase, and fast, efficient image color processing on display devices is particularly important. Edit propagation refers to editing an image through user interaction: the user draws different color strokes on different objects in the image, features are extracted and recognized, and the edits are propagated across the image.

[0003] At present, there are many edit propagation algorithms based on a single image, mainly divided into two categories. The first type transforms the edit propagation problem into an optimization...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T11/80, G06T11/40, G06N3/08, G06N3/04
CPC: G06N3/08, G06T11/40, G06T11/80, G06N3/045
Inventors: 刘震 (Liu Zhen), 陈丽娟 (Chen Lijuan), 汪家悦 (Wang Jiayue)
Owner: ZHEJIANG UNIV OF TECH