
Deep neural network calculation acceleration method and device

A pre-computation technology for deep neural networks, applied in the field of deep neural network calculation acceleration and computer-readable storage media. It addresses the problems of limited deep neural network prediction speed, complicated retraining processes, and high matrix-sparsity requirements, with the effect of saving the calculation process.

Active Publication Date: 2018-11-20
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

Matrix-vector multiplication in matrix operations is memory-bound, which limits the prediction speed of deep neural networks during computation. The accuracy loss of binary networks is large, while the pruning algorithm requires a high degree of matrix sparsity and involves a complicated retraining process. Therefore, none of the existing methods achieves good calculation acceleration for neural networks.




Embodiment Construction

[0058] In the following, only some exemplary embodiments are briefly described. As those skilled in the art would realize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. Accordingly, the drawings and descriptions are to be regarded as illustrative in nature and not restrictive.

[0059] An embodiment of the present invention provides a method for accelerating calculation of a deep neural network, as shown in Figure 1, including the following steps:

[0060] S100: Sampling each input vector that needs to be input into the matrix model to obtain a plurality of sampling vectors.

[0061] S200: Perform product quantization on each sampling vector according to a preset quantization parameter to obtain multiple quantization points.

[0062] S300: Divide the matrix model into multiple matrix blocks according to the quantization parameter.

[0063] S400: Calculate each quantization point with each matrix block to obtain a plurality of pre-calculation tables.
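The pipeline in steps S100 through S400 resembles product-quantization-based matrix-vector multiplication. The sketch below is a minimal, hypothetical illustration of that idea, not the patented implementation: the function names, the plain k-means, and all dimensions are assumptions. Codebooks of quantization points are learned from sampled inputs, each matrix block is multiplied against every quantization point once to build pre-calculation tables, and each input vector is then evaluated by table lookup.

```python
import numpy as np

def build_codebooks(samples, n_sub, k, iters=10):
    """S100/S200: per-subspace k-means codebooks (the 'quantization points')."""
    d = samples.shape[1]
    sub = d // n_sub
    codebooks = []
    for s in range(n_sub):
        X = samples[:, s * sub:(s + 1) * sub]
        C = X[:k].copy()  # initialize centroids from the first k samples
        for _ in range(iters):
            # assign each sample to its nearest centroid, then recenter
            idx = np.argmin(((X[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(idx == j):
                    C[j] = X[idx == j].mean(axis=0)
        codebooks.append(C)
    return codebooks

def build_tables(W, codebooks):
    """S300/S400: split W into column blocks and precompute, once per matrix,
    T[s][:, j] = W_block_s @ centroid_j."""
    n_sub = len(codebooks)
    sub = W.shape[1] // n_sub
    return [W[:, s * sub:(s + 1) * sub] @ codebooks[s].T for s in range(n_sub)]

def pq_matvec(x, codebooks, tables):
    """Approximate W @ x by encoding each subspace of x and summing table columns."""
    n_sub = len(codebooks)
    sub = len(x) // n_sub
    y = np.zeros(tables[0].shape[0])
    for s in range(n_sub):
        xs = x[s * sub:(s + 1) * sub]
        j = np.argmin(((codebooks[s] - xs) ** 2).sum(axis=1))
        y += tables[s][:, j]
    return y
```

Note that when an input subvector coincides exactly with a quantization point, the lookup result equals the exact product for that block; for other inputs the result is an approximation whose quality depends on the codebook.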


Abstract

The embodiment of the invention provides a deep neural network calculation acceleration method and device, a terminal, and a computer-readable storage medium. The method includes the following steps: sampling each input vector that needs to be input into a matrix model to obtain multiple sampling vectors; performing product quantization on each sampling vector according to a preset quantization parameter to obtain a plurality of quantization points; dividing the matrix model into a plurality of matrix blocks according to the quantization parameter; calculating each quantization point with each matrix block to obtain a plurality of pre-calculation tables; and calculating each input vector through each pre-calculation table to obtain a calculation result of the matrix model. According to the embodiment of the invention, the pre-calculation table for a given matrix model only needs to be established once, and all input vectors to be calculated by that matrix model can use the pre-calculation table to perform table-lookup calculation. The calculation process for the input vectors and the matrix model is thereby effectively saved, while the original calculation effect of the matrix model is maintained.

Description

technical field

[0001] The present invention relates to the technical field of data processing, in particular to a method, device, terminal, and computer-readable storage medium for accelerating deep neural network calculations.

Background technique

[0002] Methods for speeding up deep neural networks in the prior art include matrix operations, pruning algorithms, and binary networks. Matrix-vector multiplication in matrix operations is memory-bound, which limits the prediction speed of deep neural networks during computation. The accuracy loss of binary networks is relatively large, while the pruning algorithm requires a high degree of matrix sparsity and involves a complicated retraining process. Therefore, none of the existing calculation methods achieves good calculation acceleration for neural networks.

[0003] The above information disclosed in this Background section is only for enhancement of understanding of the b...
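To see why table lookup helps with the memory-bound matrix-vector multiplication described above, a back-of-the-envelope cost comparison can be made. The dimensions below are hypothetical, chosen only to illustrate the per-vector arithmetic saved once the one-time table-building cost is amortized over many input vectors:

```python
# Hypothetical dimensions for illustration (not taken from the patent).
d_in, d_out = 512, 512   # matrix model of shape d_out x d_in
n_sub, k = 64, 16        # subspaces (matrix blocks) and quantization points per subspace

# Naive matrix-vector multiply: one multiply-add per matrix entry, per input vector.
naive_macs_per_vector = d_in * d_out

# One-time pre-calculation: every quantization point times its matrix block.
table_build_macs = n_sub * k * (d_in // n_sub) * d_out

# Per input vector with the tables: one table column per subspace,
# so n_sub additions of length-d_out vectors.
lookup_adds_per_vector = n_sub * d_out

print(naive_macs_per_vector, table_build_macs, lookup_adds_per_vector)
```

Under these assumed dimensions the one-time table build costs about as much as 16 naive matrix-vector products, after which each additional input vector needs roughly 8x fewer arithmetic operations. This is why the abstract stresses that the pre-calculation table only needs to be established once per matrix model.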

Claims


Application Information

IPC(8): G06N 3/063; G06F 9/28
CPC: G06F 9/28; G06N 3/063
Inventor: 朱志凡, 冯仕堃, 陈徐屹, 朱丹翔, 曹宇慧, 何径舟
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD