
Neural network parameter compression method and related device

A neural network parameter compression technology, applied to biological neural network models and the like, which addresses the problems of low compression effect, low compression performance, and reduced performance utilization that arise when compressing low-order neural network parameters.

Pending Publication Date: 2019-09-24
CHANGSHA UNIVERSITY OF SCIENCE AND TECHNOLOGY
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

However, when the order of the neural network parameters is low, compression performance is poor. That is, when Tucker decomposition is applied to low-order input data (for example, when Tucker decomposition is used to compress first-order parameters), it achieves the lowest compression performance for the same amount of data, and the Tucker computation cannot yield a good compression result. Furthermore, when neural network learning is performed on low-order input data, the low compression effect leaves a large amount of data to be processed, reducing performance utilization.
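To make the compression-performance argument concrete, here is a small sketch (my own illustration, not taken from the patent) that counts the values stored by a rank-(r_1, ..., r_d) Tucker model of the same 4096 parameters held at different orders. For order-1 data the Tucker model can never be smaller than the original, which is the problem the patent identifies.

```python
# Storage cost of a Tucker model: one core tensor of size r_1*...*r_d
# plus one n_k x r_k factor matrix per mode. For order-1 data the cost
# is r + n*r >= n even at rank r = 1, so there is no compression gain.
def tucker_params(shape, ranks):
    core = 1
    for r in ranks:
        core *= r
    factors = sum(n * r for n, r in zip(shape, ranks))
    return core + factors

n = 4096
print(tucker_params((n,), (1,)))                # order 1: 4097 > 4096, no gain
print(tucker_params((64, 64), (8, 8)))          # order 2: 64 + 1024 = 1088
print(tucker_params((16, 16, 16), (4, 4, 4)))   # order 3: 64 + 192 = 256
```

The same 4096 values compress roughly 4x at order 2 and 16x at order 3 under these (illustrative) rank choices, which is why raising the order before decomposing helps.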




Embodiment Construction

[0041] The core of this application is to provide a neural network parameter compression method, a neural network parameter compression device, computer equipment, and a computer-readable storage medium. Low-order input data is first raised to high-order input data; tensor decomposition is then performed on the high-order input data; finally, the tensor decomposition result is expanded into parameters of the same order as the low-order input data. This realizes compression of the low-order input data, reduces the amount of data during neural network processing, and improves performance utilization.
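The three steps above can be sketched end to end. This is a minimal illustration of the general idea under my own assumptions (reshape as the order-raising step, truncated HOSVD as the Tucker decomposition, and illustrative sizes 4096 -> 16x16x16 with rank 8); the patent does not specify these choices.

```python
import numpy as np

def unfold(T, mode):
    # Mode-k unfolding: move axis k to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # Truncated HOSVD: factor U_k = leading left singular vectors of the
    # mode-k unfolding; core = T multiplied by each U_k^T along mode k.
    factors = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
               for k, r in enumerate(ranks)]
    core = T
    for k, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, k, 0), axes=1), 0, k)
    return core, factors

def reconstruct(core, factors):
    # Multiply the core by each factor matrix along its mode.
    T = core
    for k, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, k, 0), axes=1), 0, k)
    return T

rng = np.random.default_rng(0)
w = rng.standard_normal(4096)                    # low-order (1-D) parameter vector
T = w.reshape(16, 16, 16)                        # step 1: raise to order 3
core, factors = hosvd(T, (8, 8, 8))              # step 2: Tucker (HOSVD) decomposition
w_hat = reconstruct(core, factors).reshape(-1)   # step 3: expand back to order 1
```

Here `w_hat` is the compressed approximation of `w` at the original order; the stored model is the core plus three 16x8 factor matrices rather than the full 4096 values.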

[0042] In order to make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely below in conjunction with the drawings in the embodiments. Obviously, the described embodiments are a part of the emb...



Abstract

The invention discloses a neural network parameter compression method, which comprises the following steps: raising the order of low-order input data to obtain high-order input data; performing tensor decomposition on the high-order input data to obtain a high-order output result; and expanding the high-order output result according to the order of the low-order input data to obtain a low-order output result. In this method, the low-order input data is raised to high-order input data, tensor decomposition is then performed on the high-order input data, and finally the tensor decomposition result is expanded into parameters of the same order as the low-order input data, so that compression of the low-order input data is achieved, the data size during neural network processing is reduced, and the performance utilization rate is increased. The invention further discloses a neural network parameter compression device, computer equipment, and a computer-readable storage medium which have the above beneficial effects.

Description

Technical field

[0001] The present application relates to the field of computer technology, and in particular to a neural network parameter compression method, a neural network parameter compression device, computer equipment, and a computer-readable storage medium.

Background technique

[0002] With the continuous development of information technology, a variety of parameter processing methods have emerged in the field of deep learning to improve the performance of the deep learning process.

[0003] In the prior art, tensor decomposition is usually used to compress parameters. Generally, tensor decomposition includes CP decomposition and Tucker decomposition. Among them, Tucker decomposition decomposes a tensor into a core tensor and three factor matrices, so as to decompose the parameters of the neural network. When the parameter order of the neural network is higher, the com...
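In standard notation (not quoted from the patent), the Tucker decomposition of a third-order tensor into a core tensor and three factor matrices, as described above, is:

```latex
\mathcal{X} \approx \mathcal{G} \times_1 A^{(1)} \times_2 A^{(2)} \times_3 A^{(3)},
\qquad
x_{ijk} \approx \sum_{p=1}^{r_1}\sum_{q=1}^{r_2}\sum_{s=1}^{r_3}
g_{pqs}\, a^{(1)}_{ip}\, a^{(2)}_{jq}\, a^{(3)}_{ks},
```

where $\mathcal{X} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$ is the parameter tensor, $\mathcal{G} \in \mathbb{R}^{r_1 \times r_2 \times r_3}$ is the core tensor, and $A^{(k)} \in \mathbb{R}^{I_k \times r_k}$ are the factor matrices. Storage drops from $I_1 I_2 I_3$ values to $r_1 r_2 r_3 + \sum_k I_k r_k$, which is why the savings grow with the order of the tensor.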

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/02
CPC: G06N3/02
Inventors: 何施茗, 李卓宙, 唐杨宁, 王进, 邓玉芳, 陈启民
Owner: CHANGSHA UNIVERSITY OF SCIENCE AND TECHNOLOGY