
Construction method of deep neural network

A construction method for a deep neural network, applicable to neural learning methods, biological neural network models, and neural architectures, with the effect of avoiding convergence interference.

Status: Inactive · Publication Date: 2017-01-04
SUZHOU INST FOR ADVANCED STUDY USTC +1
Cites: 0 · Cited by: 10

AI Technical Summary

Problems solved by technology

Although this model further reduces the error rate by 30% to 40% compared with previous methods, the computation required by the multi-model approach is almost the sum of the individual training costs of each model.
As deep-learning networks grow, their parameter counts and computational demands increase, so the cost of such multi-model combination becomes quite expensive.
Moreover, the quality of the individual models in a multi-column deep neural network may be uneven, which also degrades the accuracy of the final prediction.




Embodiment

[0026] We propose a new deep neural network architecture named the "Fissionable Deep Neural Network". The network comprises multiple branches that share parameters and multiple Softmax classifiers; during training, the structure of the whole network changes dynamically until it splits into multiple models.
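The fission step described above can be sketched as a tree operation: a node with multiple outputs is split into a second node of the same layer type, which inherits the original's parent links while the outputs are divided between the two. This is a minimal illustrative sketch, not the patented implementation; the `Node` class, the even split of children, and the parameter sizes are all assumptions.

```python
import random

class Node:
    """A layer node in the shared-parameter tree (hypothetical sketch)."""
    def __init__(self, kind, params=None):
        self.kind = kind          # e.g. "conv", "fc", "softmax"
        # Fresh small random parameters stand in for a real weight tensor.
        self.params = params or [random.gauss(0.0, 0.01) for _ in range(4)]
        self.parents = []         # shared ancestors toward the input layer
        self.children = []        # branch-specific descendants

def fission(node):
    """Split a multi-output node into two nodes of the same type.

    The new node inherits the original's parent links, its parameters are
    re-initialized so the two branches can learn distinct feature
    detectors, and the original outputs are divided between the two nodes
    (the even split is an assumption of this sketch).
    """
    assert len(node.children) > 1, "only multi-output nodes are fissionable"
    clone = Node(node.kind)                 # same layer type, new parameters
    clone.parents = list(node.parents)      # inherit parent links
    half = len(node.children) // 2
    clone.children = node.children[half:]   # divide the outgoing branches
    node.children = node.children[:half]
    for p in clone.parents:                 # register clone with each parent
        p.children.append(clone)
    return clone
```

Repeatedly applying `fission` to nodes whose branches converge slowly would gradually untangle the shared tree into independent models, which matches the dynamic behaviour the paragraph describes.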

[0027] The fissionable deep neural network is a tree structure with shared parameters, as shown in Figure 1. The whole structure comprises an input layer, convolutional layers, pooling layers, fully connected layers, SoftMax layers, and a voting layer, with adjacent layers connected by data transfer. The root node is the data input layer, all leaf nodes are Softmax layers, and the voting layer is used only during testing. The numbers following each layer name merely distinguish the layers and carry no other meaning. The path from the root node to a certain leaf node is a neural network with a l...
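Since every leaf is a Softmax layer and the voting layer is used only at test time, the voting step can be read as averaging the class distributions produced by all leaf branches. A minimal sketch, assuming the voting layer is a plain unweighted mean (the patent text here does not specify the weighting):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def voting_layer(leaf_logits):
    """Average the class distributions of all leaf Softmax branches.

    leaf_logits: list of (n_classes,) logit arrays, one per leaf node.
    Returns the averaged distribution and the predicted class index.
    """
    probs = np.stack([softmax(z) for z in leaf_logits])
    avg = probs.mean(axis=0)        # unweighted mean is an assumption
    return avg, int(avg.argmax())
```

For example, two branches voting over three classes would each contribute a normalized distribution, and the class with the highest averaged probability wins.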



Abstract

The invention discloses a construction method for a deep neural network. The network is a tree structure with shared parameters, comprising multiple shared-parameter branches and multiple Softmax layers. When the convergence rate of a branch drops, a fissionable node that has multiple outputs undergoes fission: a fission node of the same type as the fissionable node is created, forming a new feature detector that generates different features; the fission node inherits the parent and child nodes of the fissionable node, and its parameters are re-initialized. While retaining the benefits of multi-model combination, the method reduces computational cost, and fission yields multiple high-quality models.
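The fission trigger above is "the convergence rate of a branch drops". One simple way to operationalize that is to watch each branch's recent loss history and flag the branch when its average per-step improvement falls below a threshold. This is only an illustrative heuristic; the window size, the threshold, and the use of a linear improvement rate are assumptions, as the abstract does not specify the exact criterion.

```python
def convergence_slowed(losses, window=5, threshold=0.01):
    """Heuristic fission trigger: has a branch's loss stopped improving?

    losses: recent losses of one branch, oldest first.
    Returns True when the mean improvement per step over the last
    `window` steps falls below `threshold`. Window and threshold are
    illustrative defaults, not values from the patent.
    """
    if len(losses) < window + 1:
        return False                      # not enough history yet
    recent = losses[-(window + 1):]
    improvement = (recent[0] - recent[-1]) / window
    return improvement < threshold
```

A training loop could call this per branch each epoch and invoke the fission operation on the slowest-converging fissionable node when it returns True.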

Description

Technical field

[0001] The invention relates to a deep neural network structure, and in particular to a construction method for a deep neural network.

Background technique

[0002] With the popularity of deep learning across various fields, the scale of neural network parameters is growing ever larger.

[0003] In deep neural network structures, model combination can almost always improve the performance of machine learning methods, and averaging the predictions of multiple models further reduces the error rate. Current research on neural networks focuses on static structures: the network structure is completely designed and fixed before training and does not change thereafter. The multi-column deep neural network is such a combination model; it trains multiple fixed-structure deep networks separately and then averages the predictions of all of them. Although this model further reduces the error rate by ...

Claims


Application Information

IPC(8): G06N3/08, G06N3/04
CPC: G06N3/082, G06N3/04, G06N3/084
Inventor: 吴俊敏, 谭东旭, 郑焕鑫
Owner: SUZHOU INST FOR ADVANCED STUDY USTC