
Natural image matting method based on deep learning

A natural image matting technology based on deep learning, applied to neural learning methods, image enhancement, image analysis, etc. It addresses problems such as hindered acquisition of high-level semantic information, lack of holistic features, and reduced accuracy, with the effects of expanding the scope of context information and improving matting quality.

Active Publication Date: 2020-05-15
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

However, if the downsampling factor is simply reduced, the receptive field becomes too small and the feature extractor can only learn local features within a small range. This hinders the acquisition of high-level semantic information and also reduces accuracy, so there is a contradiction between preserving detail and extracting high-level semantic information.
As for problems such as missing regions and low prediction accuracy on difficult samples: the foreground objects in image matting are generally relatively large and can be regarded as being composed of blocks of different sizes. If only convolutions with the same kernel size, and therefore the same receptive field, are used, the differing internal details of a large block easily cause different features to be extracted for its parts, so higher-level holistic features are missing; conversely, several adjacent small blocks may be recognized as a single whole. The multi-scale problem of objects therefore also needs to be solved in the matting task. In addition, to address missing regions in large transparent foregrounds and the difficulty of hard samples, the natural image matting model also needs to introduce a way to expand the scope of context information.
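
The contradiction above is commonly relieved with dilated (atrous) convolutions, which enlarge the receptive field of a fixed-size kernel without further downsampling. The snippet below is a small illustrative calculation, not code from the patent; the function and the chosen dilation rates are assumptions used only to show the effect.

```python
# Effective receptive field of a stack of convolution layers (standard formula).
# Illustrates how dilation widens a 3x3 kernel's coverage without extra downsampling.
def receptive_field(kernel_sizes, dilations, strides=None):
    strides = strides or [1] * len(kernel_sizes)
    rf, jump = 1, 1
    for k, d, s in zip(kernel_sizes, dilations, strides):
        k_eff = d * (k - 1) + 1          # a dilated kernel spans d*(k-1)+1 pixels
        rf += (k_eff - 1) * jump
        jump *= s
    return rf

# Three plain 3x3 convolutions vs. three 3x3 convolutions with dilations 2, 4, 8:
print(receptive_field([3, 3, 3], [1, 1, 1]))   # -> 7
print(receptive_field([3, 3, 3], [2, 4, 8]))   # -> 29
```

Running several such branches with different dilation rates in parallel, as in atrous spatial pyramid pooling, is one way to cover blocks of different sizes and widen the context available to the model.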




Detailed Description of Embodiments

[0043] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments. It should be noted that these descriptions are only exemplary and not intended to limit the scope of the present invention.

[0044] Figure 1 is a flow chart of a natural image matting method based on deep learning according to an embodiment of the present invention. Some specific implementation processes of the present invention are described below by way of example with reference to Figure 1. The specific steps are as follows:

[0045] Step S1: Obtain a matting data set, construct a training set and a test set for training and testing respectively, and perform data augmentation. Here, the Adobe Deep Matting data set released by Xu et al. in 2017 is taken as an example.
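
The excerpt does not list the exact augmentation operations; the sketch below shows augmentations commonly used with the Adobe Deep Matting data set (a random crop centered on a pixel in the unknown region of the trimap, a random horizontal flip, and a resize), assuming NumPy/OpenCV and 8-bit trimaps where the unknown region is labeled 128. The function name and the crop sizes are illustrative, not the patent's recipe.

```python
import numpy as np
import cv2  # assumed available for resizing

def augment(image, alpha, trimap, crop_size=480, out_size=320):
    """Random crop centered on the trimap's unknown region, then flip and resize."""
    h, w = trimap.shape
    ys, xs = np.where(trimap == 128)               # unknown pixels (128 by convention)
    if len(ys) > 0:
        i = np.random.randint(len(ys))
        cy, cx = int(ys[i]), int(xs[i])
    else:
        cy, cx = h // 2, w // 2                    # fallback: image center
    y0 = int(np.clip(cy - crop_size // 2, 0, max(h - crop_size, 0)))
    x0 = int(np.clip(cx - crop_size // 2, 0, max(w - crop_size, 0)))
    image = image[y0:y0 + crop_size, x0:x0 + crop_size]
    alpha = alpha[y0:y0 + crop_size, x0:x0 + crop_size]
    trimap = trimap[y0:y0 + crop_size, x0:x0 + crop_size]
    if np.random.rand() < 0.5:                     # random horizontal flip
        image, alpha, trimap = (np.ascontiguousarray(a[:, ::-1]) for a in (image, alpha, trimap))
    image = cv2.resize(image, (out_size, out_size), interpolation=cv2.INTER_LINEAR)
    alpha = cv2.resize(alpha, (out_size, out_size), interpolation=cv2.INTER_LINEAR)
    trimap = cv2.resize(trimap, (out_size, out_size), interpolation=cv2.INTER_NEAREST)
    return image, alpha, trimap
```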

[0046] The Adobe Deep Matting dataset only provides the foreground images and alpha mattes of the training set, and the foreground images, alpha mattes, and trimaps of the ...
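
Because the dataset supplies foregrounds and alpha mattes rather than full photographs, training images are typically synthesized by compositing each foreground onto background images with the standard matting equation I = αF + (1 − α)B. A minimal sketch follows, assuming float images in [0, 1] and a background already resized to the foreground's size; the function name and the choice of background source are illustrative, not quoted from the patent.

```python
import numpy as np

def composite(foreground, background, alpha):
    """Standard matting composite: I = alpha * F + (1 - alpha) * B.

    foreground, background: float32 arrays in [0, 1], shape (H, W, 3)
    alpha:                  float32 array in [0, 1], shape (H, W)
    """
    a = alpha[..., None]            # broadcast alpha over the color channels
    return a * foreground + (1.0 - a) * background
```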


Abstract

The invention discloses a natural image matting method based on deep learning. The method comprises the following steps: obtaining a matting data set and carrying out data augmentation; establishing a natural image matting model with an encoder-decoder structure; in order to preserve detail information, designing the encoder so that the downsampling factor is 4; in order to compensate for the reduction of the receptive field caused by the smaller downsampling factor, introducing dilated (atrous) convolution to expand the receptive field, and storing the positions of the maximum pixels in the max-pooling operations so as to provide position information for the upsampling stage; in order to solve the multi-scale problem, connecting an atrous spatial pyramid pooling module to the top of the encoder; designing a global context module in the decoder to fuse the corresponding high-level features of the encoder and the decoder; and finally training and testing. According to the method, more detail information is retained in the feature extraction process while multi-scale features are associated, so that the model can capture global information. This facilitates the handling of fine details and large transparent objects, and improves the matting quality.
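
As a rough orientation only, the sketch below arranges the components named in the abstract (a 4x-downsampling encoder, dilated convolutions, max pooling that stores indices for unpooling, an ASPP-style block on top of the encoder, and a decoder that fuses high-level encoder features) into a runnable PyTorch module. Layer widths, dilation rates, the input format (RGB image concatenated with a trimap), and the simple concatenation used in place of the patent's global context module are assumptions, not the patent's specification.

```python
import torch
import torch.nn as nn

def conv_bn_relu(cin, cout, dilation=1):
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=dilation, dilation=dilation, bias=False),
        nn.BatchNorm2d(cout),
        nn.ReLU(inplace=True),
    )

class MattingNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: RGB image + trimap (4 channels); only two 2x poolings -> 4x downsampling.
        self.enc1 = conv_bn_relu(4, 64)
        self.pool1 = nn.MaxPool2d(2, return_indices=True)    # indices kept for unpooling
        self.enc2 = conv_bn_relu(64, 128)
        self.pool2 = nn.MaxPool2d(2, return_indices=True)
        # Dilated convolutions compensate for the small downsampling factor.
        self.enc3 = nn.Sequential(conv_bn_relu(128, 256, dilation=2),
                                  conv_bn_relu(256, 256, dilation=4))
        # ASPP-style multi-scale context on top of the encoder.
        self.aspp = nn.ModuleList([conv_bn_relu(256, 64, dilation=r) for r in (1, 6, 12, 18)])
        self.aspp_fuse = nn.Conv2d(64 * 4, 256, 1)
        # Decoder: concatenation stands in for the patent's global context module (assumption).
        self.fuse = conv_bn_relu(256 + 256, 256)
        self.dec3 = conv_bn_relu(256, 128)                    # channel count must match idx2
        self.unpool2 = nn.MaxUnpool2d(2)
        self.dec2 = conv_bn_relu(128, 64)                     # channel count must match idx1
        self.unpool1 = nn.MaxUnpool2d(2)
        self.dec1 = conv_bn_relu(64, 64)
        self.head = nn.Conv2d(64, 1, 3, padding=1)            # predicts the alpha matte

    def forward(self, x):
        e1 = self.enc1(x)
        p1, idx1 = self.pool1(e1)
        e2 = self.enc2(p1)
        p2, idx2 = self.pool2(e2)
        e3 = self.enc3(p2)
        ctx = self.aspp_fuse(torch.cat([b(e3) for b in self.aspp], dim=1))
        d = self.fuse(torch.cat([ctx, e3], dim=1))            # fuse context with encoder features
        d = self.unpool2(self.dec3(d), idx2)                  # 1/4 -> 1/2 resolution
        d = self.unpool1(self.dec2(d), idx1)                  # 1/2 -> full resolution
        return torch.sigmoid(self.head(self.dec1(d)))
```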

Description

Technical field

[0001] The invention relates to the technical field of image processing, and in particular to a natural image matting method based on deep learning.

Background technique

[0002] Matting is an image processing technique for digital images. It originated in the film and television industry and has become a crucial technology in the production of visual effects. Using matting technology, producers in fields such as film, advertising, and poster design can seamlessly embed a desired character or object into a specified scene. However, such special effects are mostly produced with blue-screen matting, which requires the characters or objects to be embedded to be shot against a solid-color background; this greatly limits the application of matting technology. With the development of computer technology, users have an increasingly strong demand for extracting objects of interest from natural images, and at the sa...


Application Information

IPC (8): G06T 7/11; G06T 7/194; G06N 3/04; G06N 3/08
CPC: G06T 7/11; G06T 7/194; G06T 2207/10004; G06T 2207/20016; G06T 2207/10024; G06T 2207/20081; G06T 2207/20084; G06N 3/084; G06N 3/048; G06N 3/045
Inventor: 赖剑煌; 邓卓爽
Owner: SUN YAT SEN UNIV