
Class activation mapping target positioning method and system based on convolutional neural network

A target positioning technology based on convolutional neural networks and class activation mapping, applied in the field of class activation mapping target positioning methods and systems. It addresses the problem that coarse localization limits the performance of weakly supervised tasks, and achieves the effect of improving localization accuracy.

Active Publication Date: 2021-03-09
NANKAI UNIV

AI Technical Summary

Problems solved by technology

[0005] However, weakly supervised problems such as semantic segmentation usually require more accurate object localization information. The coarse object position information generated by class activation maps limits the upper bound of weakly supervised task performance.



Examples


Embodiment 1

[0043] This embodiment provides a class activation mapping target positioning method based on a convolutional neural network.

[0044] The convolutional neural network-based class activation mapping target positioning method includes:

[0045] S101: Input the image to be processed into the trained convolutional neural network, perform backpropagation according to the category information, and obtain the gradient corresponding to each feature map of each convolutional layer in the network; wherein each convolutional layer outputs a feature map, each feature map includes C sub-feature maps, C is a positive integer, and each sub-feature map has a one-to-one corresponding gradient;

[0046] S102: Select M convolutional layers from the convolutional neural network, and multiply the C sub-feature maps extracted by each of the M convolutional layers by the weights; wherein the weights are the gradients corresponding to the sub-feature maps;

[0047] Input the multiplication processing result into ...
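As a rough illustration, steps S101-S102 together with the ReLU and channel-wise summation described in the abstract can be sketched in NumPy. The patent publishes no reference code, so the function name and toy arrays below are hypothetical; the computation (gradient-weighted sub-feature maps, ReLU, sum over the C channels) follows the steps above.

```python
import numpy as np

def class_activation_map(feature_maps, gradients):
    """Per-layer class activation map, per steps S101-S102:
    multiply each of the C sub-feature maps by its corresponding
    gradient, apply the nonlinear ReLU, then sum over the channel
    dimension. Both inputs have shape (C, H, W)."""
    weighted = feature_maps * gradients        # element-wise gradient weighting
    activated = np.maximum(weighted, 0.0)      # nonlinear ReLU function
    cam = activated.sum(axis=0)                # sum over C channels -> (H, W)
    return cam

# Toy example: C = 2 sub-feature maps of size 2x2 (values are made up).
feats = np.array([[[1.0, 2.0], [3.0, 4.0]],
                  [[0.5, -1.0], [2.0, 0.0]]])
grads = np.array([[[1.0, 1.0], [1.0, 1.0]],
                  [[2.0, 2.0], [2.0, 2.0]]])
cam = class_activation_map(feats, grads)
print(cam)  # each pixel is sum over channels of ReLU(feature * gradient)
```

In a real network the gradients would come from backpropagating the class score, e.g. via backward hooks on the selected convolutional layers.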

Embodiment 2

[0109] This embodiment provides a class activation mapping target positioning system based on a convolutional neural network.

[0110] The convolutional neural network-based class activation mapping target positioning system includes:

[0111] A gradient calculation module, configured to: input the image to be processed into the trained convolutional neural network, perform backpropagation according to the category information, and obtain the gradient corresponding to each feature map of each convolutional layer in the network; wherein each convolutional layer outputs a feature map, each feature map includes C sub-feature maps, C is a positive integer, and each sub-feature map has a one-to-one corresponding gradient;

[0112] A class activation map acquisition module, configured to: select M convolutional layers from the convolutional neural network, and multiply the C sub-feature maps extracted by each of the M convolutional layers by the weights, then perform multi...

Embodiment 3

[0118] This embodiment also provides an electronic device, including: one or more processors, one or more memories, and one or more computer programs; wherein the processor is connected to the memory, and the one or more computer programs are stored in the memory. When the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the method described in Embodiment 1 above.

[0119] It should be understood that in this embodiment, the processor can be a central processing unit (CPU), or other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, o...


Abstract

The invention discloses a class activation mapping target positioning method and system based on a convolutional neural network. The method comprises the steps of: inputting a to-be-processed image into the trained convolutional neural network, carrying out back propagation according to the class information, and obtaining the gradient corresponding to each feature map of each convolution layer in the network, wherein each convolution layer outputs a feature map, each feature map comprises C sub-feature maps, and each sub-feature map has a one-to-one corresponding gradient; selecting M convolution layers from the convolutional neural network, and multiplying the C sub-feature maps extracted by each convolution layer in the M convolution layers by the weights, wherein the weight is the gradient corresponding to the sub-feature map; inputting the multiplication result into a nonlinear ReLU function, performing a summation operation over the channel dimension on the output value of the ReLU function, obtaining a corresponding class activation map for each selected convolution layer, and thereby obtaining M class activation maps; and fusing the M class activation maps to obtain a positioning map.
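The abstract's final step fuses the M per-layer class activation maps into a single positioning map, but the fusion operation itself is not specified here. The sketch below is therefore an illustrative assumption: maps are upsampled to a common resolution (nearest-neighbour), normalized to [0, 1] so layers at different scales are comparable, and fused by element-wise maximum. Any of these three choices could differ in the actual patent.

```python
import numpy as np

def fuse_cams(cams, out_size):
    """Fuse M per-layer class activation maps into one positioning map.
    ASSUMPTIONS (not from the patent): nearest-neighbour upsampling,
    per-map min-max normalization, and element-wise max fusion."""
    fused = np.zeros(out_size)
    for cam in cams:
        # Nearest-neighbour upsample to the common output size.
        ry = out_size[0] // cam.shape[0]
        rx = out_size[1] // cam.shape[1]
        up = np.kron(cam, np.ones((ry, rx)))
        # Normalize each map to [0, 1] so different layers are comparable.
        if up.max() > up.min():
            up = (up - up.min()) / (up.max() - up.min())
        fused = np.maximum(fused, up)  # element-wise max fusion
    return fused

# Toy example: M = 2 maps at different resolutions, fused at 4x4.
cam_a = np.array([[0.0, 2.0], [4.0, 8.0]])   # coarse 2x2 map
cam_b = np.arange(16.0).reshape(4, 4)        # finer 4x4 map
loc = fuse_cams([cam_a, cam_b], (4, 4))
print(loc.shape)  # (4, 4)
```

Averaging instead of max fusion would be an equally plausible reading of "fusing"; max fusion simply keeps the strongest evidence from any layer at each pixel.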

Description

Technical field

[0001] The present application relates to the technical field of image processing, and in particular to a convolutional neural network-based class activation mapping target positioning method and system.

Background technique

[0002] The statements in this section merely provide background information related to the present application and do not necessarily constitute prior art.

[0003] Currently, many attention models utilize convolutional neural network-based image classifiers to generate class activation maps. Given only image category labels, these maps can locate the target object region, and pixels with larger activation values are more likely to belong to the target object. Image-level labels only indicate whether the target object exists; they do not provide information about the location of the object in the image. Therefore, the localization ability of class activation maps can make up for this deficiency of image-level label...
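The background notes that pixels with larger activation values are more likely to belong to the target object. A simple way to turn that observation into a localization mask, common in the class-activation-map literature (and not a step taken from this patent), is to threshold the map at a fraction of its maximum activation:

```python
import numpy as np

def cam_to_mask(cam, ratio=0.5):
    """Binarize a class activation map: keep pixels whose activation
    exceeds `ratio` of the map's maximum. The 0.5 ratio is a common
    convention in the CAM literature, not a value from this patent."""
    return cam >= ratio * cam.max()

# Toy 3x3 activation map; the bright lower-right region is the "object".
cam = np.array([[0.1, 0.2, 0.1],
                [0.2, 0.9, 0.8],
                [0.1, 0.7, 0.3]])
mask = cam_to_mask(cam)
print(mask.astype(int))  # 1 where activation >= 0.45 (half of max 0.9)
```

A bounding box around the `True` pixels then gives the kind of coarse object location that image-level labels alone cannot provide.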

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73, G06K9/62, G06N3/04, G06N3/08
CPC: G06T7/73, G06N3/084, G06T2207/20081, G06T2207/20084, G06T2207/20104, G06T2207/20221, G06N3/045, G06F18/25, G06F18/241
Inventor: 程明明, 张长彬, 姜鹏涛
Owner: NANKAI UNIV