
Network training method and device, image processing method and device, storage medium and electronic equipment

An image processing and network training technology, applied in the field of image processing, which addresses the difficulty of collecting large amounts of high-quality training data and of labeling image processing parameters.

Active Publication Date: 2018-06-29
BEIJING SENSETIME TECH DEV CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, in image effect processing tasks such as image aesthetic enhancement, it is extremely difficult to collect a considerable amount of high-quality training data and label image processing parameters, so the existing strongly supervised learning methods have great limitations.



Examples


Embodiment 1

[0066] Figure 1 is a flow chart of the training method of the image processing neural network according to Embodiment 1 of the present invention.

[0067] Embodiments of the present invention propose a weakly supervised learning method based on generative adversarial networks. The method exploits the relationship between image effect classification and image effect transformation to train an image processing neural network with strong image effect transformation capability. The image processing neural network learns the parameters for image effect transformation using only the labeled image effect classification data as training supervision; no fine-grained annotation of effect enhancement parameters needs to be carried out on the sample images, so the image effect transformation parameters are learned under weak supervision. For example, in the task of image aesthetic enhancement processing, it is only necessary to pe...
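To make the adversarial setup concrete, the following is a minimal PyTorch-style sketch of one training step, written under assumptions: the parameter generation network gen, the effect classification network cls, and the optimizers are placeholders; apply_effect is a hypothetical differentiable routine that applies the predicted transformation parameters to an image (a purely illustrative version is sketched later, in Embodiment 3); and only binary effect labels (effect present / absent) are used as supervision. It is an illustration of the weakly supervised scheme, not the patented implementation.

    import torch
    import torch.nn.functional as F

    def train_step(gen, cls, opt_g, opt_d, first_images, effect_labels):
        # first_images: batch of first sample images, shape (N, 3, H, W)
        # effect_labels: float tensor of shape (N, 1); 1.0 = desired effect
        # present, 0.0 = absent (the only annotation this scheme requires)

        # Generator side: predict effect transformation parameters and apply
        # them to obtain the second sample images.
        params = gen(first_images)
        second_images = apply_effect(first_images, params)  # hypothetical differentiable transform

        # Discriminator side: the classification network should reproduce the
        # labeled effect class on real images and output "no effect" for the
        # transformed (generated) images.
        d_real = cls(first_images)
        d_fake = cls(second_images.detach())
        loss_d = F.binary_cross_entropy_with_logits(d_real, effect_labels) \
               + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake))
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()

        # Generator update: transformed images should be classified as having
        # the desired effect, which is what drives the transformation parameters.
        loss_g = F.binary_cross_entropy_with_logits(cls(second_images),
                                                    torch.ones_like(d_fake))
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()
        return loss_d.item(), loss_g.item()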

Embodiment 2

[0085] An exemplary way of constructing a parameter generation neural network as the generator and a classification neural network as the discriminator is described below.

[0086] According to the second embodiment of the present invention, the parameter generation neural network is obtained by transforming a general classification neural network. The general classification neural network may be, for example, a general neural network for image classification or a general neural network for classifying certain image effects.

[0087] The general classification neural network can be pre-trained using an applicable machine learning method, or an already trained general classification neural network can be used. Since a general classification neural network used for image effect classification already extracts features that are relevant to the expected image effect, the parameter generation neural network (and the classification neural network) can be constructed ba...
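A sketch, under assumptions, of how a parameter generation network and an effect classification network might both be derived from one general classification neural network: the pre-trained feature extractor is retained and only the final classification layer is replaced. ResNet-18 with ImageNet weights is only a stand-in for "a trained general classification neural network", and num_params (the number of image effect transformation parameters to regress) is an illustrative argument, not something specified by the text above.

    import torch.nn as nn
    from torchvision.models import resnet18

    def build_param_generator(num_params: int) -> nn.Module:
        # Reuse the pre-trained feature extractor of a general classifier and
        # replace its classification layer with a parameter-regression layer.
        net = resnet18(weights="IMAGENET1K_V1")
        net.fc = nn.Linear(net.fc.in_features, num_params)
        return net

    def build_effect_classifier() -> nn.Module:
        # Same backbone, but ending in a single "effect present / absent" logit.
        net = resnet18(weights="IMAGENET1K_V1")
        net.fc = nn.Linear(net.fc.in_features, 1)
        return net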

Embodiment 3

[0096] According to an exemplary embodiment of the present invention, the image effect transformation parameters may include, but are not limited to, at least one of the following parameters: a first parameter for image cropping and a second parameter for image color enhancement. It should be pointed out that the training method proposed by the present invention is applicable to training an image processing neural network with any existing or potentially applicable image effect transformation parameters, and is not limited to the above two parameters.

[0097] Correspondingly, a first output branch for outputting the first parameter for image cropping and a second output branch for outputting the second parameter for image color enhancement may be respectively provided at the end of the parameter generation neural network.
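A minimal sketch of such a two-branch output head, under assumptions: the shared feature dimension feat_dim, the 4-value normalized cropping box, and the 3 per-channel color gains are illustrative choices, not parameterizations specified by the text above.

    import torch
    import torch.nn as nn

    class TwoBranchHead(nn.Module):
        # Replaces the single output layer at the end of the parameter
        # generation network with two parallel output branches.
        def __init__(self, feat_dim: int):
            super().__init__()
            self.crop_branch = nn.Linear(feat_dim, 4)   # first parameter: cropping box (cx, cy, w, h)
            self.color_branch = nn.Linear(feat_dim, 3)  # second parameter: per-channel color gains

        def forward(self, features: torch.Tensor) -> torch.Tensor:
            crop = torch.sigmoid(self.crop_branch(features))          # keep the box normalized to [0, 1]
            color = 0.5 + torch.sigmoid(self.color_branch(features))  # constrain gains to (0.5, 1.5)
            return torch.cat([crop, color], dim=1)                    # concatenated transformation parameters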

[0098] A detailed description is given below for exemplary training of the classification neural network and the parameter generation neural network for generating the f...
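The text shown here does not specify how the generated parameters are applied to the image during training. One possible, purely illustrative way to keep both the cropping and the color enhancement differentiable, so that gradients can flow back into the parameter generation network, is sketched below: an affine sampling grid for the crop and a per-channel multiplication for the color gains. The parameter layout matches the hypothetical TwoBranchHead above.

    import torch
    import torch.nn.functional as F

    def apply_effect(images: torch.Tensor, params: torch.Tensor) -> torch.Tensor:
        # images: (N, 3, H, W); params: (N, 7) = normalized crop box (cx, cy, w, h)
        # followed by three per-channel color gains, as produced by TwoBranchHead.
        cx, cy, w, h = params[:, 0], params[:, 1], params[:, 2], params[:, 3]
        color = params[:, 4:7]

        # Differentiable crop: sample a (w x h) window centered at (cx, cy)
        # through an affine grid instead of hard, non-differentiable indexing.
        zeros = torch.zeros_like(w)
        theta = torch.stack([
            torch.stack([w, zeros, 2 * cx - 1], dim=1),
            torch.stack([zeros, h, 2 * cy - 1], dim=1),
        ], dim=1)                                             # (N, 2, 3) affine matrices
        grid = F.affine_grid(theta, images.size(), align_corners=False)
        cropped = F.grid_sample(images, grid, align_corners=False)

        # Differentiable color enhancement: per-channel gain.
        return cropped * color.view(-1, 3, 1, 1)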


Abstract

Embodiments of the invention provide a network training method and device, an image processing method and device, a storage medium and electronic equipment. The neural network training method comprises the following steps: obtaining, through a parameter generation neural network, effect transformation parameters of a first sample image, and transforming the first sample image into a second sample image; obtaining effect classification detection data of the second sample image through a classification neural network; and training the parameter generation neural network according to the effect classification detection data of the second sample image and the image effect classification labeling information of the first sample image. On the basis of a generative adversarial network, simple and objective labeling data of image effect classification is taken as the supervision information for training, so that accurate labeling of image effect parameter data does not need to be carried out on the selected sample images, and the parameter generation neural network for generating image effect transformation parameters is trained in a weakly supervised learning manner.

Description

Technical field

[0001] Embodiments of the present invention relate to image processing technologies, and in particular to a training method of an image processing neural network, an image processing method, a device, a storage medium, and electronic equipment.

Background technique

[0002] In various image-related applications, captured images need to be processed according to the needs of the application scene, for example, beautification, background blurring, and cropping of portraits captured by mobile phones.

[0003] Image aesthetic enhancement is an image enhancement technology that attempts to improve image quality or image aesthetics based on computational aesthetics. Existing image aesthetic enhancement technology mainly uses a small number of sample images for image feature extraction, and a model for generating enhancement parameters is trained based on a strongly supervised machine learnin...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06K9/46
CPC: G06V10/56, G06F18/24, G06F18/214
Inventor: 邓煜彬, 吕健勤, 汤晓鸥
Owner: BEIJING SENSETIME TECH DEV CO LTD