
Feature Visualization Method for Deep Neural Networks Based on Constrained Optimization-like Activation Mapping

A deep neural network feature visualization technology based on constrained-optimization class activation mapping. It addresses the weak class discrimination and high noise of existing visualization results, and achieves stronger class discrimination, less noise, and better visual quality.

Active Publication Date: 2022-05-03
ZHEJIANG UNIV
Cites: 3 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] To solve the problems described in the background, the present invention provides a deep neural network feature visualization method based on constrained-optimization class activation mapping, addressing the fact that current deep neural network feature visualization results are noisy and weakly class-discriminative.



Examples


Embodiment Construction

[0041] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0042] An example implementing the complete method of the present invention, together with its implementation details, is as follows:

[0043] This embodiment uses the deep neural network VGG19 trained on the ImageNet dataset as the target model, and is described in detail as follows:

[0044] 1) Obtain a pre-trained model by training or downloading. Torchvision provides a pre-trained VGG19 model on the ImageNet dataset, which can be directly loaded and used.
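A minimal sketch of this step, assuming torchvision's model API (older torchvision releases use `pretrained=True`; newer ones use the `weights=` argument):

```python
import torchvision.models as models

# Load a VGG19 model pre-trained on ImageNet; torchvision downloads the
# weights on first use.
model = models.vgg19(pretrained=True)  # newer torchvision: models.vgg19(weights="IMAGENET1K_V1")
model.eval()                           # inference mode
```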

[0045] 2) Select the feature map to be used, i.e., designate the output of a chosen layer of the VGG19 model as the feature map for subsequent visualization; for example, select the output "features.34" of the last convolutional layer of VGG19.
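One common way to capture the output of a chosen layer is a forward hook; the sketch below is an assumption about how this step could be realized, not a detail prescribed by the patent text. In torchvision's VGG19, `model.features[34]` is the last convolutional layer ("features.34"):

```python
feature_maps = {}

def save_feature_map(module, inputs, output):
    # Store the layer output; shape is [batch, channels, height, width].
    feature_maps["features.34"] = output.detach()

# Register the hook on the last Conv2d layer of torchvision's VGG19.
hook_handle = model.features[34].register_forward_hook(save_feature_map)
```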

[0046] 3) For an image X to be tested, such as the one shown in figure 2, feed it into the pre-trained model and run a forward pass to obtain ...
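A minimal sketch of this forward-pass step, assuming standard ImageNet preprocessing (the image file name is hypothetical):

```python
import torch
import torchvision.transforms as T
from PIL import Image

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                std=[0.229, 0.224, 0.225]),
])

image = Image.open("test_image.jpg").convert("RGB")  # hypothetical file name
x = preprocess(image).unsqueeze(0)                   # [1, 3, 224, 224]

with torch.no_grad():
    scores = model(x)                    # class scores; the hook captures the layer output
feature_map = feature_maps["features.34"]  # e.g. [1, 512, 14, 14] for VGG19 at 224x224 input
```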



Abstract

The invention discloses a feature visualization method for deep neural networks based on constrained-optimization class activation mapping. A pre-trained image classification model built with a deep neural network is obtained by training or downloading; the pre-trained model performs a forward pass on an image to be tested to obtain a feature map, which is further processed to obtain a final weight vector; the components of the feature map are weighted and summed with the final weight vector to obtain a visualized feature map, which is presented as the final visualization result. The present invention can perform feature visualization on any deep neural network, achieves better interpretability of deep features, and produces results with less noise and stronger class discrimination.
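An illustrative sketch of the final weighting-and-summing step; the weight vector w is assumed to have already been produced by the constrained-optimization procedure, and the ReLU and min-max normalization are common class-activation-mapping conventions rather than details taken from this abstract:

```python
import torch
import torch.nn.functional as F

def weighted_cam(feature_map, w, size=(224, 224)):
    """Weight and sum the feature-map channels into a single visualization map.

    feature_map: [1, C, H, W] activation from the chosen layer.
    w:           [C] final weight vector (assumed given by the constrained optimization).
    """
    cam = (w.view(1, -1, 1, 1) * feature_map).sum(dim=1, keepdim=True)  # [1, 1, H, W]
    cam = F.relu(cam)                                                   # keep positive evidence (CAM convention)
    cam = F.interpolate(cam, size=size, mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)            # normalize to [0, 1]
    return cam.squeeze()

# Example usage with the captured VGG19 feature map and a hypothetical weight vector:
# heatmap = weighted_cam(feature_map, w)  # w: [512]
```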

Description

Technical field

[0001] The invention relates to an image feature visualization method in the field of deep learning interpretability, and in particular to a deep neural network feature visualization method based on constrained-optimization class activation mapping.

Background technique

[0002] Deep learning techniques have achieved remarkable results and superior performance in several fields, especially in computer vision tasks such as image classification. However, because their mathematical principles have not been fully established, the end-to-end black-box nature of deep neural networks makes it impossible for humans to know how they reach their decisions. Research on the interpretability of deep learning has therefore emerged in recent years. One of the most direct ideas is to use visualization techniques to identify the image regions that contribute positively to a prediction, and in particular to visualize the feature representations of the intermediate layers of a deep neural network...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V10/764, G06K9/62, G06V10/40, G06N3/04, G06N3/08
CPC: G06N3/08, G06V10/40, G06N3/045, G06F18/2411
Inventor: 孔祥维, 王鹏达
Owner: ZHEJIANG UNIV