
A Deep Neural Network Method Based on Spatial Fusion Pooling

A deep neural network and spatial fusion technology, applied to neural learning methods, biological neural network models, neural architectures, etc. It solves the problem that the pooling layer of existing networks cannot effectively extract deep-level features: by fusing spatial information it extracts more representative features and reduces the number of feature channels, improving classification efficiency and accuracy and promoting wide application.

Publication Date: 2021-02-19 (Inactive)
Applicant: TIANJIN UNIV
Cites: 4 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0011] The purpose of the present invention is to solve the problem that the pooling layer of existing deep convolutional neural networks cannot effectively extract deep-level features, and to propose a deep convolutional neural network method based on spatial fusion pooling that is suitable for image classification. The method uses the spatial information between channels to extract more representative features; at the same time, by fusing spatial information, it reduces the number of feature channels and further improves the efficiency of the neural network.




Embodiment Construction

[0029] The present invention will be further described below in conjunction with the accompanying drawings.

[0030] Figure 1 describes the traditional pooling operation. Traditional pooling operates on a single feature map: a neighborhood P_j is selected on the feature map, and a single value within it, such as a, replaces the entire neighborhood (a, b, c, d) as the output of the pooling. Its main function is to down-sample within a channel, reducing the spatial dimension and the computational complexity. However, because it does not take the information between channels into account, the extracted features have weak representational ability, and deep-level features cannot be extracted.
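
For concreteness, the following is a minimal NumPy sketch of this traditional per-channel pooling; the 2x2 window, stride of 2, and max operator are illustrative assumptions, not parameters taken from the patent:

    import numpy as np

    def max_pool2d(fmap, window=2, stride=2):
        """Traditional pooling: slide a window over a single feature map
        and keep one value (here the max) per neighborhood."""
        h, w = fmap.shape
        out_h = (h - window) // stride + 1
        out_w = (w - window) // stride + 1
        out = np.empty((out_h, out_w), dtype=fmap.dtype)
        for i in range(out_h):
            for j in range(out_w):
                patch = fmap[i * stride:i * stride + window,
                             j * stride:j * stride + window]
                out[i, j] = patch.max()  # one value replaces the neighborhood
        return out

    # A single 2x2 neighborhood (a, b, c, d) collapses to one value.
    fmap = np.array([[4.0, 1.0],
                     [2.0, 3.0]])
    print(max_pool2d(fmap))  # [[4.]]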

[0031] Figure 2 describes the spatial fusion pooling operation proposed in this patent. It makes full use of the information between channels and within each channel, realizes the spatial fusion of information, and then extract...
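
Because the passage above is truncated, the exact fusion function cannot be recovered from this page. Purely as an illustrative sketch of the stated idea, fusing information across channels (which reduces the channel count) and then pooling spatially within each fused map, one possible PyTorch rendering follows; the group size, the fusion operator (a grouped 1x1 convolution), and the pooling parameters are all assumptions:

    import torch
    import torch.nn as nn

    class SpatialFusionPool(nn.Module):
        """Illustrative sketch only, not the patent's definition: fuse
        groups of adjacent channels (inter-channel information), then
        pool within each fused map (intra-channel information)."""

        def __init__(self, in_channels, group=2, window=2, stride=2):
            super().__init__()
            assert in_channels % group == 0
            out_channels = in_channels // group
            # Fusion step: each output channel mixes `group` adjacent input
            # channels, so the channel count shrinks by a factor of `group`.
            self.fuse = nn.Conv2d(in_channels, out_channels,
                                  kernel_size=1, groups=out_channels)
            # Pooling step: ordinary spatial down-sampling on the fused maps.
            self.pool = nn.MaxPool2d(window, stride)

        def forward(self, x):
            return self.pool(self.fuse(x))

    x = torch.randn(1, 64, 32, 32)
    print(SpatialFusionPool(64)(x).shape)  # torch.Size([1, 32, 16, 16])

In this sketch the channel count halves in the fusion step while the spatial resolution halves in the pooling step, consistent with the stated goal of reducing the number of feature channels.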



Abstract

The invention relates to a deep neural network method based on spatial fusion pooling for image classification, including: collecting images of various categories and labeling each image's category as its label information; dividing the collected images into a training set, a validation set, and a test set, where the training set is used to train the convolutional neural network; designing the deep neural network structure for image classification, including the number of convolutional layers and of spatial fusion pooling layers, the number of filters in each convolutional layer, the form of the fusion function and the spatial sliding step in the spatial fusion pooling layer, the pooling function together with the pooling window size and stride, and the structure of the convolution filters used for feature fusion; designing the number of training iterations and the final convergence conditions of the network, and initializing the network parameters; and feeding the training data in batches into the network for computation and training.
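
As a rough skeleton of that procedure, the sketch below reuses the hypothetical SpatialFusionPool module from the embodiment section above; the layer counts, filter numbers, input size, number of categories, optimizer, and iteration count are placeholders rather than values from the patent:

    import torch
    import torch.nn as nn

    # Placeholder architecture: convolutional layers interleaved with the
    # spatial fusion pooling layers the abstract describes. Assumes 3x32x32
    # inputs and 10 categories; SpatialFusionPool is the sketch given above.
    model = nn.Sequential(
        nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
        SpatialFusionPool(32),   # 32 -> 16 channels, 32x32 -> 16x16
        nn.Conv2d(16, 64, kernel_size=3, padding=1), nn.ReLU(),
        SpatialFusionPool(64),   # 64 -> 32 channels, 16x16 -> 8x8
        nn.Flatten(),
        nn.Linear(32 * 8 * 8, 10),
    )

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    criterion = nn.CrossEntropyLoss()

    # Stand-in batches; real training iterates over the labeled training set.
    train_loader = [(torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,)))
                    for _ in range(4)]

    max_epochs = 2  # placeholder for the designed iteration count
    for epoch in range(max_epochs):
        for images, labels in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()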

Description

Technical field

[0001] The invention relates to high-performance image recognition, classification, and object recognition in the field of computer vision, and in particular to a method for image recognition, classification, and object recognition using deep learning.

Background technique

[0002] In recent years, deep learning technology has been widely used in many computer vision tasks, such as image classification, semantic segmentation, object detection, and automatic driving. As an important implementation of deep learning, deep convolutional neural networks have achieved remarkable results in many of these tasks.

[0003] Deep convolutional neural networks are usually composed of multiple convolutional layers and pooling layers. The convolutional layers contain the filter parameters used for feature extraction, while the pooling layers maintain the translation invariance of the network and reduce the impact of data dis...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04; G06N3/08; G06K9/62
CPC: G06N3/08; G06N3/045; G06F18/241
Inventors: 庞彦伟 (Pang Yanwei), 李亚钊 (Li Yazhao)
Owner: TIANJIN UNIV