
A Visual Method for Interpreting Convolutional Neural Networks

A technology relating to convolutional neural networks and neurons, applied in the field of machine learning and visualization, addressing the problems that existing interpretation methods are complex, not universally applicable, and limited in their scope of use.

Active Publication Date: 2021-04-23
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, this method is only applicable to shallow neural networks (those with a single hidden layer). Once a deep neural network is involved, the extracted rules become too complex for humans to analyze and understand, which causes the method to fail; for convolutional neural networks, the complexity of the extracted rules increases further.
The third is the visualization of deep learning features proposed by Fu Kun et al. (Patent Publication No. CN106909945A), but it offers only a qualitative analysis of convolutional features. It amounts to verifying that the features learned by a deep learning model progress from low-level to high-level, and it fails to explain the model's decision-making process.
Therefore, the scope of these methods is limited and they are not universally applicable.



Examples


Embodiment Construction

[0047] The processing flow of the method proposed in the present invention includes three main steps: model and data preparation, data preprocessing, and multi-view visualization.

[0048] 1. Model and data preparation

[0049] The model and its training data are the inputs of this visualization method. The model can be AlexNet, the more complex VGG16, or a similar convolutional neural network; the data is the data set used to train that model. Such models and data can be obtained from open-source frameworks such as Caffe and TensorFlow. The model and its training data are then used in the data preprocessing stage.
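As a hedged illustration of this preparation step, the sketch below loads a pretrained VGG16 as the model M and reads a local image folder as a stand-in for the training set S. It assumes TensorFlow/Keras is available; the directory path is a hypothetical placeholder and is not taken from the patent.

```python
# Minimal sketch of model and data preparation (assumptions, not the patented setup).
import tensorflow as tf

# Model M: a pretrained convolutional neural network (VGG16 trained on ImageNet).
model = tf.keras.applications.VGG16(weights="imagenet")

# Training set S: loaded here from a hypothetical image directory,
# resized to the 224x224 input that VGG16 expects.
dataset = tf.keras.utils.image_dataset_from_directory(
    "path/to/training_images",      # placeholder path
    image_size=(224, 224),
    batch_size=32,
)

# Apply the VGG16-specific preprocessing before the images reach the model.
dataset = dataset.map(
    lambda images, labels: (tf.keras.applications.vgg16.preprocess_input(images), labels)
)
```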

[0050] 2. Data preprocessing

[0051] The purpose of preprocessing is to provide data for the visualization. It mainly includes the extraction of judgment conditions, semantic generation, and decision tree generation.

[0052] (1) Judgment condition extraction:

[0053] An appropriate form of judgment condition is chosen according to the complexity of the model. The forms can be roughly divided...
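The source text is truncated at this point. Purely as a hedged sketch of how the judgment conditions of paragraph [0051] and a surrogate decision tree might be realized in practice (an assumption of ours, not the procedure claimed by the patent), the following code binarizes channel-wise neuron activations into candidate judgment conditions and fits a decision tree on them with scikit-learn so that the tree imitates the CNN's own predictions. All function names and thresholds are illustrative.

```python
# Hedged sketch only: binarized neuron activations as judgment conditions,
# then a surrogate decision tree approximating the CNN's decisions.
import numpy as np
import tensorflow as tf
from sklearn.tree import DecisionTreeClassifier

def extract_judgment_conditions(model, layer_name, images, threshold=0.0):
    """Mean-pool one convolutional layer's activations per channel and binarize
    them, treating "channel c is active" as one candidate judgment condition."""
    layer = model.get_layer(layer_name)
    probe = tf.keras.Model(model.input, layer.output)
    activations = probe.predict(images, verbose=0)      # shape (N, H, W, C)
    channel_means = activations.mean(axis=(1, 2))       # shape (N, C)
    return (channel_means > threshold).astype(np.int8)  # binary conditions

def build_surrogate_tree(model, layer_name, images, max_depth=5):
    """Fit a tree that imitates the CNN's predictions from the binary conditions,
    so each root-to-leaf path reads as a chain of judgment conditions."""
    conditions = extract_judgment_conditions(model, layer_name, images)
    cnn_labels = model.predict(images, verbose=0).argmax(axis=1)
    tree = DecisionTreeClassifier(max_depth=max_depth)
    tree.fit(conditions, cnn_labels)
    return tree
```

A root-to-leaf path in such a tree is one concrete example of the kind of condition chain that a decision tree T, as described in the abstract below, would supply.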



Abstract

The invention relates to a visualization method for explaining convolutional neural networks, comprising: preparing a convolutional neural network model M and its training set S; extracting all the judgment conditions used by the model M in its decision-making process; determining the semantics of neurons by matching neurons against the semantics in a human corpus, and generating understandable semantics for all judgment conditions; forming a decision tree T whose decision-making process serves as the decision-making process of the model M; converting the decision tree T into a tree flow graph; producing a neuron semantic view; producing a neuron relationship diagram; producing a decision data flow diagram; and building an interactive visualization system.
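For intuition only, here is a minimal sketch of how a fitted surrogate tree (such as the scikit-learn tree from the earlier sketch) could be flattened into a node-link structure suitable for rendering as an interactive tree flow graph. The function and field names are our assumptions, not the patent's data format.

```python
# Hedged sketch: flatten a fitted scikit-learn decision tree into node and edge
# lists that a web front end could draw as a tree flow graph.
import json
from sklearn.tree import DecisionTreeClassifier  # surrogate tree type assumed here

def tree_to_flow_graph(tree: DecisionTreeClassifier) -> dict:
    t = tree.tree_
    nodes, edges = [], []
    for i in range(t.node_count):
        if t.children_left[i] == -1:                       # leaf: final decision
            label = f"predict class {int(t.value[i].argmax())}"
        else:                                              # internal split on one judgment condition
            label = f"condition[{t.feature[i]}] <= {t.threshold[i]:.2f}"
            edges.append({"from": i, "to": int(t.children_left[i]), "branch": "true"})
            edges.append({"from": i, "to": int(t.children_right[i]), "branch": "false"})
        nodes.append({"id": i, "label": label, "samples": int(t.n_node_samples[i])})
    return {"nodes": nodes, "edges": edges}

# Usage (surrogate_tree is a hypothetical fitted tree from the previous sketch):
# graph_json = json.dumps(tree_to_flow_graph(surrogate_tree))
```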

Description

Technical Field

[0001] The present invention relates to machine learning and visualization techniques, and in particular to visualization methods for explaining deep convolutional neural networks.

Background Technique

[0002] Machine learning has become one of the most effective data analysis tools and has received extensive attention in both industry and academia. Despite their effectiveness, machine learning models are most often criticized for their opacity and lack of interpretability. If machine learning models are ranked by interpretability and learning ability, linear regression has the highest interpretability but the lowest learning ability, whereas neural network models, on the contrary, have the lowest interpretability but the highest learning ability. Meanwhile, in industry, users who rely on neural networks for predictions need to understand how those networks reach their decisions. In academia, researchers also hope to have a...


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06N3/04, G06N3/08, G06K9/62
Inventors: 张加万, 林培文, 贾世超, 孙迪
Owner: TIANJIN UNIV