
Semantic segmentation method based on improved full convolutional neural network

A convolutional neural network and semantic segmentation technology, applied in the field of computer vision. It addresses the problems of lost sensitivity to image detail and reduced feature resolution, and achieves the effects of enriching feature-map detail, expanding the receptive field, and improving segmentation accuracy.

Inactive Publication Date: 2018-11-30
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0005] The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art by providing a semantic segmentation method based on an improved fully convolutional neural network. It addresses the problem that, in existing fully convolutional networks, successive max-pooling and downsampling operations sharply reduce feature resolution, so that the feature map restored by the final upsampling loses sensitivity to image detail.
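As background for that claim, dilated (atrous) convolution enlarges a filter's receptive field without adding parameters. A minimal PyTorch demonstration, illustrative only and not taken from the patent:

```python
# Illustrative sketch: a standard 3x3 convolution vs. a dilated (atrous) 3x3
# convolution. Both have identical parameter counts, but dilation=2 widens the
# effective receptive field from 3x3 to 5x5 while preserving resolution.
import torch
import torch.nn as nn

x = torch.randn(1, 64, 56, 56)  # example feature map: 64 channels, 56x56

standard = nn.Conv2d(64, 64, kernel_size=3, padding=1)             # 3x3 receptive field
dilated = nn.Conv2d(64, 64, kernel_size=3, padding=2, dilation=2)  # 5x5 receptive field

print(standard(x).shape, dilated(x).shape)  # both stay (1, 64, 56, 56)
count = lambda m: sum(p.numel() for p in m.parameters())
print(count(standard) == count(dilated))    # True: no extra parameters
```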



Examples


Embodiment Construction

[0020] The specific implementation of the present invention is described in further detail below in conjunction with the accompanying drawings of the embodiments, so that the technical solution of the present invention is easier to understand and grasp, and so that the scope of protection is defined and supported more clearly.

[0021] As shown in figure 1, the present invention designs a semantic segmentation method based on an improved fully convolutional network: an atrous ("multi-hole") fully convolutional network is obtained by improving a fully convolutional neural network and is trained end to end. A rough architecture sketch is given below, followed by the specific steps of the method.
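The patent publishes no source code, so the sketch below only illustrates the kind of atrous fully convolutional network this paragraph describes; the channel widths, dilation rates, pooling depth, and single bilinear upsampling step are assumptions, not the patented configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AtrousFCN(nn.Module):
    """Illustrative atrous FCN: standard convolution + pooling stages reduce
    resolution, dilated convolutions then extract denser features without
    further downsampling, and a 1x1 classifier scores every pixel."""
    def __init__(self, num_classes=21):
        super().__init__()
        # Standard convolution + pooling stages (output stride 4 here).
        self.stem = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # Atrous stage: dilation grows, spatial size is preserved.
        self.atrous = nn.Sequential(
            nn.Conv2d(128, 256, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, 3, padding=4, dilation=4), nn.ReLU(inplace=True),
        )
        self.classifier = nn.Conv2d(256, num_classes, 1)  # per-pixel class scores

    def forward(self, x):
        size = x.shape[-2:]
        x = self.classifier(self.atrous(self.stem(x)))
        # One bilinear upsample back to the input resolution.
        return F.interpolate(x, size=size, mode="bilinear", align_corners=False)

out = AtrousFCN()(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 21, 224, 224])
```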

[0022] Step 1. Obtain training image data.

[0023] Since the network is relatively deep and has a large number of trainable parameters, the amount of training data to be prepared must reach a certain order of magnitude. The PASCAL VO...
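The paragraph is cut off, but it evidently refers to the PASCAL VOC benchmark. Purely as an illustration (the patent does not specify any tooling), the torchvision package can download the VOC 2012 segmentation split:

```python
# Illustrative only: one common way to obtain PASCAL VOC 2012 segmentation data.
from torchvision import datasets, transforms

train_set = datasets.VOCSegmentation(
    root="./data", year="2012", image_set="train",
    download=True, transform=transforms.ToTensor(),
)
image, mask = train_set[0]
print(image.shape, mask.size)  # image as a CxHxW tensor, mask as a PIL image
```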


Abstract

The invention discloses a semantic segmentation method based on an improved fully convolutional neural network. The method comprises the steps of: acquiring training image data; inputting the training image data into an atrous (porous) fully convolutional neural network and obtaining a size-reduced feature map through standard convolution and pooling layers; extracting denser features while maintaining the feature-map size through atrous convolutional layers; predicting the feature map pixel by pixel to obtain a segmentation result; training the parameters of the atrous convolutional neural network with the stochastic gradient descent method (SGD); and acquiring image data that needs semantic segmentation, inputting it into the trained network, and obtaining the corresponding semantic segmentation result. The invention alleviates the problem that the feature map recovered by the final upsampling in a fully convolutional network loses sensitivity to image detail, and effectively expands the receptive field of a filter without increasing the number of parameters or the amount of computation.
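The abstract names stochastic gradient descent (SGD) and pixel-wise prediction. A minimal training step consistent with that description might look like the following; the stand-in model, learning rate, momentum, and the ignored VOC "void" label (255) are all assumptions, not the patent's values.

```python
import torch
import torch.nn as nn

# Stand-in 1x1-convolution model; in practice this would be the atrous FCN
# sketched earlier. All hyperparameters below are illustrative assumptions.
model = nn.Conv2d(3, 21, kernel_size=1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss(ignore_index=255)  # skip VOC void pixels

images = torch.randn(2, 3, 224, 224)           # stand-in image batch
targets = torch.randint(0, 21, (2, 224, 224))  # stand-in pixel labels

logits = model(images)       # (N, classes, H, W)
loss = criterion(logits, targets)  # pixel-wise cross-entropy
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())
```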

Description

Technical field

[0001] The invention relates to a semantic segmentation method based on an atrous (porous) fully convolutional neural network, and belongs to the field of computer vision.

Background technique

[0002] Image semantic segmentation is a key technology for image understanding, widely used in autonomous driving systems (street-scene recognition and understanding), drone applications (landing-site judgment), and wearable devices. Image semantic segmentation classifies every pixel in an image. Before deep learning was applied to image semantic segmentation, many methods existed, such as simple pixel-level thresholding, segmentation based on pixel clustering, and graph-partitioning methods. Shi et al. proposed the Normalized cut (N-cut) method based on graph partitioning, which takes into account the connection weights between the different parts of the partition and the nodes of the whole graph, to achieve the purpose of c...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/214
Inventor: 霍智勇, 戴伟达
Owner: NANJING UNIV OF POSTS & TELECOMM