Multi-scale self-attention target detection method based on weight sharing

A weight-sharing and target-detection technology applied in the field of computer vision; it addresses problems such as gradient dispersion and gradient vanishing, and achieves the effect of reduced time consumption.

Inactive Publication Date: 2020-11-20
SHANGHAI MARITIME UNIVERSITY

Problems solved by technology

[0007] The present invention alleviates gradient dispersion and gradient vanishing caused by increasing network depth by introducing a residual network structure. By introducing more detection branches of different sizes, the range of input images the network can adapt to is increased, and these branches can execute multiple convolution or pooling operations in parallel, obtaining more comprehensive information and image representations while reducing time consumption. A weight-sharing mechanism is adopted to reduce the redundancy of network parameters across targets of different sizes, and a multi-scale target detection method is designed on this basis. Finally, a self-attention mechanism is introduced so that the model pays more attention to key regions.
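The sketch below illustrates, in PyTorch, how these ideas can fit together: one residual block whose weights are shared across several input scales processed in parallel, gated by a simple spatial self-attention map. It is a minimal illustration only, not the patent's exact architecture; all module names, scale factors, and channel counts are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedResidualBlock(nn.Module):
    """One conv-bn-relu residual block reused (weight sharing) for every scale."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # residual shortcut eases gradient flow in deep nets

class SpatialAttention(nn.Module):
    """1x1-conv attention map that re-weights spatial positions (key regions)."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, 1)

    def forward(self, x):
        return x * torch.sigmoid(self.score(x))

class MultiScaleHead(nn.Module):
    """The same shared block processes the input at several scales in parallel."""
    def __init__(self, channels, scales=(1.0, 0.5, 0.25)):
        super().__init__()
        self.block = SharedResidualBlock(channels)  # one set of weights for all scales
        self.attn = SpatialAttention(channels)
        self.scales = scales

    def forward(self, x):
        h, w = x.shape[-2:]
        feats = []
        for s in self.scales:
            xs = F.interpolate(x, scale_factor=s, mode='bilinear', align_corners=False)
            ys = self.attn(self.block(xs))
            feats.append(F.interpolate(ys, size=(h, w), mode='bilinear', align_corners=False))
        return torch.cat(feats, dim=1)  # fused multi-scale representation
```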




Embodiment Construction

[0041] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments. However, the scope of the above-described subject matter of the present invention should not be interpreted as limited to the following embodiments; all technologies realized based on the contents of the present invention fall within the scope of the present invention.

[0042] The overall implementation process of the multi-scale self-attention target detection model based on weight sharing provided by the present invention is shown in figure 1 and is described in detail as follows:

[0043] The training set of PASCAL VOC 2012 is selected as the training data. The present invention removes pictures that are too large or too small and screens out the label data needed for target detection, giving a total of 3,000 training images of different backgrounds and different target categories, 500 validation images, and 500 test samples we...
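A possible way to carry out this filtering and splitting step is sketched below. The 3,000/500/500 split follows the description; the size thresholds, directory layout, and random seed are assumptions made for illustration.

```python
import random
from pathlib import Path
from PIL import Image

def filter_and_split(voc_jpeg_dir, min_side=200, max_side=800, seed=0):
    """Filter PASCAL VOC 2012 JPEGs by size, then split into train/val/test lists."""
    keep = []
    for img_path in sorted(Path(voc_jpeg_dir).glob("*.jpg")):
        w, h = Image.open(img_path).size
        # drop images that are too small or too large (thresholds are assumed)
        if min(w, h) >= min_side and max(w, h) <= max_side:
            keep.append(img_path.name)
    random.Random(seed).shuffle(keep)
    train, val, test = keep[:3000], keep[3000:3500], keep[3500:4000]
    return train, val, test
```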



Abstract

The invention provides a multi-scale self-attention target detection method based on weight sharing. The device for the method is composed of a weight-shared multi-scale convolutional network and a visual self-attention module. The method balances, to a certain extent, multi-scale processing, an attention mechanism, fine-grained feature extraction, and model lightweighting. Residual blocks and dilated (expansion) convolution are fused into the multi-scale convolutional network, enabling deep feature extraction while keeping the model lightweight. A visual attention mechanism is introduced into the multi-scale convolutional network so that the network attends to key areas of the image, saving computing resources. The method is widely adaptable, highly robust, and can be used for various target detection tasks. Experiments on a well-known data set show that the method achieves high accuracy while keeping the model lightweight, obtaining an average precision of 73.6 and demonstrating its effectiveness.
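For the fusion of residual blocks with dilated convolution mentioned above, the following hedged sketch shows one common way to combine the two: a dilated 3x3 convolution enlarges the receptive field without extra parameters, and the identity shortcut preserves gradient flow. Dilation rates and channel counts are illustrative assumptions, not the patent's specific values.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedResidualBlock(nn.Module):
    def __init__(self, channels, dilation=2):
        super().__init__()
        # padding = dilation keeps the spatial size unchanged for a 3x3 kernel
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))  # dilated conv: wider context, same parameter cost
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)                 # identity shortcut keeps gradients healthy

# quick shape check
x = torch.randn(1, 64, 56, 56)
print(DilatedResidualBlock(64)(x).shape)  # torch.Size([1, 64, 56, 56])
```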

Description

Technical field

[0001] The present invention relates to the field of computer vision in deep learning, in particular to a method for extracting fine-grained features in the complex situation where objects of different sizes in an image need to be recognized, specifically a multi-scale self-attention object detection network based on weight sharing.

Background technique

[0002] Object detection has always been a research hotspot in the field of computer vision and can be applied in many fields. With the continuous development of computer vision, the requirements placed on object detection are getting higher and higher. Traditional object detection networks use a single convolutional neural network model for feature extraction, and these methods are still widely used today. The purpose of image feature extraction is to obtain the information features in the image, so that the pixels in the image are divided into different categories such as points, curves or contin...


Application Information

IPC(8): G06K9/46; G06K9/32; G06K9/62; G06N3/08; G06N3/04
CPC: G06N3/084; G06V10/25; G06V10/40; G06N3/045; G06F18/214
Inventor: 刘晋, 李越
Owner: SHANGHAI MARITIME UNIVERSITY