A method for efficiently tensorizing a fully connected neural network

A neural network and fully connected layer technology, applied in the field of efficiently tensorized fully connected neural networks, which can solve the problems of sensitivity to the initialization of network weight parameters and unstable network accuracy, and achieves the effects of improving classification accuracy and reducing the number of parameters.

Inactive Publication Date: 2019-02-15
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] In the literature "Tensorizing neural networks", Novikov et al. proposed to express the weights of the fully connected layer using TT (tensor train) decomposition. This method allows the parameter compression rate of the network to be adjusted manually, but it is sensitive to the initialization of the network weight parameters, and the accuracy of the compressed network is unstable.
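For orientation, the following is a minimal NumPy sketch of the TT (tensor train) idea referenced above: the fully connected weight matrix is reshaped into a higher-order tensor and stored as a chain of small three-way cores, from which the full weight can be rebuilt by successive contractions. All shapes, ranks, and the random initialization are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical TT parameterization of a 64x64 fully connected weight,
# viewed as a 4-way tensor with mode sizes (8, 8, 8, 8).
ranks = [1, 4, 4, 4, 1]            # TT ranks (boundary ranks are 1)
modes = [8, 8, 8, 8]
cores = [np.random.randn(ranks[k], modes[k], ranks[k + 1]) * 0.1
         for k in range(len(modes))]

def tt_to_full(cores):
    """Contract the TT cores back into the full tensor."""
    full = cores[0]                       # shape (1, 8, 4)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)   # contract trailing rank with leading rank
    return full.squeeze()                 # drop the boundary ranks of size 1

W = tt_to_full(cores).reshape(64, 64)     # 320 stored parameters vs. 4096 in the dense matrix
```

Because the full weight is a product of many cores, its scale depends multiplicatively on how the cores are initialized, which is one way to see the initialization sensitivity mentioned above.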


Examples


Embodiment Construction

[0039] The present invention will be further described below in conjunction with a specific embodiment:

[0040] As shown in Figure 1, the method for efficiently tensorizing a fully connected neural network described in this embodiment includes the following steps:

[0041] S1: Express the input vector x′ ∈ R^N of the fully connected layer of the neural network in higher-order tensor form. This preserves the spatial information in the input of the fully connected layer and thereby improves the classification accuracy of the neural network. After the vector is expressed as a tensor, its elements are unchanged, but its shape becomes n_1 × n_2 × … × n_n. For ease of description and visualization, this embodiment represents a tensor by a circle: the number of line segments attached to the circle denotes the order (number of dimensions) of the tensor, and the number written next to each line segment denotes the size of that dimension. The input tensor is drawn as ...
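As a concrete illustration of step S1, the sketch below reshapes an input vector into a higher-order tensor without changing its elements. The vector length N = 1024 and the factorization 4 × 8 × 8 × 4 are assumptions chosen for illustration only.

```python
import numpy as np

# Step S1 (illustrative): express the fully connected layer's input vector
# x in R^N as a higher-order tensor. Only the indexing changes; the element
# values themselves are untouched.
N = 1024
x = np.random.randn(N)              # input vector of the fully connected layer
modes = (4, 8, 8, 4)                # assumed factorization with 4*8*8*4 == N
X = x.reshape(modes)                # higher-order tensor view of the same data
assert np.allclose(X.ravel(), x)    # elements unchanged; shape is now n1 x n2 x n3 x n4
```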



Abstract

The invention relates to a method for efficiently tensorizing a fully connected neural network, comprising the following steps: S1, expressing the input vector of the fully connected layer of the neural network in tensor form; S2, expressing the weights of the fully connected layers of the neural network in tensor ring decomposition form; S3, carrying out the forward propagation of the neural network using a tensor algorithm for the fully connected layers; S4, in the back propagation process of the neural network, updating the cores of the tensor ring decomposition of the fully connected layer weights; S5, judging whether the data set has completed its iterations, and if so, returning the weight tensor parameters of the network; if not, returning to step S3. The method of the invention reduces the number of parameters used by the neural network and the training time of the neural network, reduces the dependence and consumption of large-scale neural networks on hardware resources such as computer memory size and GPU computing power, and reduces the number of neural network parameters without reducing the classification accuracy of the network.
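To make the abstract's steps concrete, here is a minimal sketch, under assumed shapes and ranks, of a fully connected layer whose weight is stored in tensor ring form and used for forward propagation (steps S2 and S3). It rebuilds the full weight explicitly for clarity; an efficient implementation would contract the input tensor with the cores directly, and steps S4 and S5 would be handled by the surrounding training loop.

```python
import numpy as np

def tr_to_full(cores):
    """Rebuild the full tensor from tensor-ring cores, each of shape (r, I_k, r)."""
    full = cores[0]                                   # (r, I_1, r)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)       # chain the cores along the ranks
    # full now has shape (r, I_1, ..., I_d, r); closing the ring takes a trace
    return np.trace(full, axis1=0, axis2=full.ndim - 1)

# Assumed sizes: a 1024x1024 weight viewed as a 4-way tensor of mode size 32,
# stored as four cores with a uniform tensor-ring rank of 3.
rank, modes = 3, [32, 32, 32, 32]                     # 32**4 == 1024 * 1024
cores = [np.random.randn(rank, m, rank) * 0.05 for m in modes]

W = tr_to_full(cores).reshape(1024, 1024)             # rebuilt dense weight
x = np.random.randn(1024)
y = W @ x                                             # S3: forward propagation y = Wx
```

Each core here holds 3 × 32 × 3 = 288 values, so the layer stores 1,152 parameters instead of the 1,048,576 entries of the dense weight matrix.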

Description

technical field

[0001] The invention relates to the technical field of neural network deep learning, and in particular to a method for efficiently tensorizing a fully connected neural network.

Background technique

[0002] Neural networks usually contain fully connected layers. A fully connected layer performs the affine transformation y = Wx on its input vector x to obtain the output y.

[0003] Neural networks have shown excellent performance in image classification and detection. By increasing the number of layers and widening each layer, a network can fit various image data sets and complete various complex classification tasks. However, deepening and widening the network leads to a sharp increase in the number of parameters, which in turn slows down the training of the neural network. In addition, updating a large number of network parameters requires more computer memory and a GPU with more computing power. Nowadays,...
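For scale, the following sketch shows the dense affine map y = Wx from paragraph [0002] with an assumed layer size of 4096 inputs and 4096 outputs, which already requires over sixteen million weight parameters for a single layer.

```python
import numpy as np

# Dense fully connected layer from [0002]: y = Wx (bias omitted, as in the text).
n_in, n_out = 4096, 4096             # assumed layer widths for illustration
W = np.random.randn(n_out, n_in) * 0.01
x = np.random.randn(n_in)
y = W @ x
print(W.size)                        # 16,777,216 parameters in this single layer
```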


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08
CPC: G06N3/082
Inventors: 陈欣琪, 周郭许
Owner: GUANGDONG UNIV OF TECH