
Deep learning method for lightweight bottleneck attention mechanism

A deep learning and attention technology, applied to neural learning methods, neural architectures, biological neural network models, and similar fields, addressing the problem that existing attention mechanisms substantially increase model overhead when integrated into a network

Publication Date: 2022-03-01 (Pending)
JINAN UNIVERSITY

Problems solved by technology

[0005] In summary, existing attention mechanisms greatly increase model overhead when integrated into other network models. Reducing the overhead of the attention mechanism is therefore an urgent need.



Examples


Embodiment 1

[0031] This embodiment uses a real blood cell data set drawn from various organs. The data set covers three categories: white blood cells, red blood cells, and platelets, and contains 5000 images in total.

[0032] The deep learning method for the lightweight bottleneck attention mechanism includes the following steps:

[0033] T1. Process the data set: first, the data set is split into a training set of 4000 samples and a test set of 1000 samples, as sketched below.
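The split in T1 is a plain 4000/1000 partition of the 5000 images. Below is a minimal Python sketch of such a split; the directory layout, file names, and random seed are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the T1 data split (4000 train / 1000 test).
# File layout and seed are assumptions for illustration only.
import random

def split_dataset(image_paths, n_train, seed=0):
    """Shuffle a list of image paths and split it into train/test sets."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)
    return paths[:n_train], paths[n_train:]

# Example: 5000 blood-cell images -> 4000 training, 1000 test samples.
all_images = [f"bloodcells/img_{i:04d}.png" for i in range(5000)]
train_set, test_set = split_dataset(all_images, n_train=4000)
assert len(train_set) == 4000 and len(test_set) == 1000
```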

[0034] T2. Construct a convolutional neural network embedded with the lightweight bottleneck attention mechanism; its structure is shown in Figure 2. The specific structure is as follows:

[0035] An image of size 128×128×3 is input to the network of Figure 2. The network structure includes: 4 convolution layers, 4 max pooling layers, 1 residual unit layer, 1 average pooling layer and 1 lay...
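For concreteness, here is a hedged PyTorch sketch of a backbone matching the layer inventory of [0035]: 4 convolution layers, 4 max pooling layers, 1 residual unit, and 1 average pooling layer. The channel widths, the residual unit design, and the classifier head are assumptions (the patent text is truncated at this point); the `attention` slot marks where the lightweight bottleneck attention module, sketched under the Abstract below, would be embedded.

```python
# Hedged sketch of the Embodiment 1 backbone. Channel widths, the residual
# unit design, and the final classifier are assumptions, not patent text.
import torch
import torch.nn as nn

class ResidualUnit(nn.Module):
    """Assumed residual unit: two 3x3 convolutions plus a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(x + self.body(x))

class Embodiment1Net(nn.Module):
    def __init__(self, num_classes=3, attention=None):
        super().__init__()
        chans = [3, 32, 64, 128, 256]  # assumed channel progression
        blocks = []
        for cin, cout in zip(chans[:-1], chans[1:]):
            blocks += [nn.Conv2d(cin, cout, 3, padding=1),  # 4 conv layers
                       nn.ReLU(inplace=True),
                       nn.MaxPool2d(2)]                     # 4 max pool layers
        self.features = nn.Sequential(*blocks)
        # Slot for the lightweight bottleneck attention module.
        self.attention = attention if attention is not None else nn.Identity()
        self.residual = ResidualUnit(chans[-1])      # 1 residual unit layer
        self.avgpool = nn.AdaptiveAvgPool2d(1)       # 1 average pooling layer
        self.fc = nn.Linear(chans[-1], num_classes)  # assumed classifier head

    def forward(self, x):
        x = self.features(x)
        x = self.attention(x)
        x = self.residual(x)
        x = self.avgpool(x).flatten(1)
        return self.fc(x)

# 128x128x3 input; 3 classes (white blood cells, red blood cells, platelets).
net = Embodiment1Net()
print(net(torch.randn(1, 3, 128, 128)).shape)  # torch.Size([1, 3])
```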

Embodiment 2

[0047] This embodiment is based on a real cat and dog data set containing two categories, cats and dogs, with 3680 images in total.

[0048] The deep learning method for the lightweight bottleneck attention mechanism includes the following steps:

[0049] T1. Process the data set: first, the data set is split into a training set of 3000 samples and a test set of 680 samples.

[0050] T2. Construct a VGG network embedded with the lightweight bottleneck attention mechanism; its structure is shown in Figure 3. The specific structure is as follows:

[0051] T21. The model input is an image of size 224×224×3;

[0052] T22. The input enters the first convolution layer, which has 64 convolution kernels of size 3×3 with a stride of 1, producing an output feature map of 224×224×64;

[0053] T23. The output enters the first pooling layer; the pooling filter size is 2×2, and the ste...
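Steps T21 to T23 specify the VGG front end precisely enough for a small sketch. The 3×3 kernels, stride 1, 64 filters, and 2×2 pooling filter come from the text above; padding 1 and a pooling stride of 2 are assumptions (the text is truncated here) consistent with the standard VGG design.

```python
# Minimal sketch of steps T21-T23 (VGG front end). Padding and pooling
# stride are assumptions; the remaining parameters are from the patent text.
import torch
import torch.nn as nn

vgg_front = nn.Sequential(
    # T22: first convolution layer, 64 kernels of size 3x3, stride 1.
    nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1),
    nn.ReLU(inplace=True),
    # T23: first pooling layer, 2x2 filter (stride 2 assumed).
    nn.MaxPool2d(kernel_size=2, stride=2),
)

x = torch.randn(1, 3, 224, 224)  # T21: 224x224x3 model input
print(vgg_front[:2](x).shape)    # after conv1: torch.Size([1, 64, 224, 224])
print(vgg_front(x).shape)        # after pool1: torch.Size([1, 64, 112, 112])
```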


Abstract

The invention discloses a deep learning method for a lightweight bottleneck attention mechanism. The attention mechanism is divided into a channel attention branch and a spatial attention branch. The channel attention branch extracts the spatial information of the intermediate feature map through global average pooling and global max pooling; for each channel, the information of the k neighboring channels to its left and right is combined by convolution, and the two resulting feature maps are added element-wise to generate the channel attention. The spatial attention branch uses a convolution to reduce the channel dimension, captures context through two dilated convolutions, and finally uses a convolution to compress the number of channels to one, generating the spatial attention. To fuse the two branches, the channel attention is broadcast along the two spatial dimensions, the spatial attention is broadcast along the channel dimension, the two expanded attention maps are added element-wise, and a Sigmoid operation produces the lightweight bottleneck attention. The method has low computational cost and strong model learning ability.
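The abstract describes the module in enough detail for a hedged PyTorch sketch of the two branches and their fusion. The kernel size k, the channel reduction ratio, the dilation rate, whether the two pooled descriptors share one convolution, and how the final attention map is applied back to the feature map are all assumptions the abstract does not fix.

```python
# Hedged sketch of the lightweight bottleneck attention from the abstract.
# k, reduction, dilation, the shared 1D conv, and the final x * (1 + att)
# application are assumptions, not confirmed details of the patent.
import torch
import torch.nn as nn

class LightweightBottleneckAttention(nn.Module):
    def __init__(self, channels, k=3, reduction=16, dilation=4):
        super().__init__()
        # Channel branch: a 1D convolution combines each channel with its
        # k-channel left/right neighbourhood (sharing it across the two
        # pooled descriptors is an assumption).
        self.channel_conv = nn.Conv1d(1, 1, kernel_size=k,
                                      padding=k // 2, bias=False)
        self.avg_pool = nn.AdaptiveAvgPool2d(1)  # global average pooling
        self.max_pool = nn.AdaptiveMaxPool2d(1)  # global max pooling
        # Spatial branch: 1x1 conv reduces the channel dimension, two
        # dilated convs gather context, a final 1x1 conv compresses to 1.
        mid = max(channels // reduction, 1)
        self.spatial = nn.Sequential(
            nn.Conv2d(channels, mid, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=dilation, dilation=dilation),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=dilation, dilation=dilation),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, 1, 1),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention: conv over the channel axis of both pooled
        # descriptors, then element-wise addition. Shape: (B, C, 1, 1).
        avg = self.channel_conv(self.avg_pool(x).view(b, 1, c))
        mx = self.channel_conv(self.max_pool(x).view(b, 1, c))
        ch_att = (avg + mx).view(b, c, 1, 1)
        # Spatial attention. Shape: (B, 1, H, W).
        sp_att = self.spatial(x)
        # Fusion: broadcast both maps to (B, C, H, W), add element-wise,
        # and apply a sigmoid to obtain the bottleneck attention.
        att = torch.sigmoid(ch_att + sp_att)
        return x * (1 + att)  # assumed application, in the style of BAM

feats = torch.randn(2, 64, 32, 32)
print(LightweightBottleneckAttention(64)(feats).shape)  # (2, 64, 32, 32)
```

In this sketch the channel branch amounts to a 1D convolution over the channel axis, in the spirit of ECA-Net, and the two-branch bottleneck fusion mirrors BAM; both parallels are observations about the sketch, not claims about the patent's exact design.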

Description

Technical Field

[0001] The invention relates to the technical field of attention mechanism learning, and in particular to a deep learning method for a lightweight bottleneck attention mechanism.

Background

[0002] In recent years, Convolutional Neural Networks (CNNs) have rapidly advanced computer vision, demonstrating powerful performance in tasks such as image classification, object detection, and semantic segmentation. To further enhance the feature expression ability of CNNs, recent research has mainly focused on three important factors of the network model: network depth, network width, and network cardinality.

[0003] In addition to these three factors, many researchers have in recent years integrated attention modules into convolution modules, demonstrating that the attention module has great potential for improving the feature expression ability of a network.

[0004] The attention module improves the performance of CNN...


Application Information

IPC(8): G06N 3/08; G06N 3/04
CPC: G06N 3/082; G06N 3/045
Inventors: 邓玉辉, 李鸿
Owner: JINAN UNIVERSITY