
A high-speed real-time quantization structure and operation implementation method for deep neural network

A deep neural network technology and implementation method, applied in the fields of neural learning methods, biological neural network models, physical realization, etc., which addresses problems such as insufficient accuracy.

Active Publication Date: 2021-07-06
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

However, on a dataset such as ImageNet, it is still not accurate enough.



Examples


Embodiment 1

[0069] A deep neural network can be applied to image recognition in image processing. The deep neural network consists of multiple layers; here the operation of one layer on an image is taken as an example. The input data are the gray values of the image, as shown in Table 3; Table 3 gives these values in binary form, each corresponding to a gray value of the image. The deep neural network performs convolution and other operations on the image, and recognizes and classifies the image according to the calculation results.
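As a rough illustration of what one layer's operation on the gray values looks like (the image patch and kernel values below are assumed for the sketch, not the contents of Table 3), a minimal 2-D convolution in Python:

```python
import numpy as np

# Assumed 8-bit grayscale patch and an example 3x3 kernel; illustrative
# values only, not the data from Table 3.
image = np.array([[ 12,  40,  55,  60],
                  [ 30,  90, 120,  80],
                  [ 25, 100, 200, 150],
                  [ 10,  60, 130, 170]], dtype=np.uint8)

kernel = np.array([[ 1, 0, -1],
                   [ 2, 0, -2],
                   [ 1, 0, -1]], dtype=np.int32)

def conv2d_valid(x, k):
    """Plain 'valid' 2-D convolution (cross-correlation) over the patch."""
    h, w = x.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.int32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = int(np.sum(x[i:i+kh, j:j+kw].astype(np.int32) * k))
    return out

print(conv2d_valid(image, kernel))  # the layer's raw responses for this patch
```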

[0070] Deep neural network

[0071] As shown in Figure 1, the parameters can be expressed as integer powers of two within a unit (the same layer) where operations are relatively concentrated: as long as the relative relationship among the parameters in the unit is in the form of integer powers of two, a shared weight can be factored out, and the parameters can then be used in integer-power-of-two form. Parameters such a...
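A minimal sketch of this idea (the weight values below are assumed, not the patent's actual parameters): factor a shared coefficient out of the unit's parameters so that what remains per parameter is an integer power of two, after which each multiplication reduces to a bit shift plus one scaling by the shared coefficient.

```python
import numpy as np

# Assumed weights for one unit (layer); illustrative only.
weights = np.array([1.2, 2.4, 4.8, 0.6, 9.6])

shared_coeff = weights.min()        # take the smallest weight as the 2**0 level
exponents = np.round(np.log2(weights / shared_coeff)).astype(int)
print(exponents)                    # -> [1 2 3 0 4], i.e. weights = 0.6 * 2**k

def unit_dot(inputs, exponents, shared_coeff):
    """Dot product in which the power-of-two part of each weight is a left shift."""
    acc = 0
    for x, k in zip(inputs, exponents):
        acc += int(x) << int(k)     # x * 2**k computed as a shift (integer inputs)
    return acc * shared_coeff       # one multiplication by the shared coefficient

print(unit_dot([1, 2, 3, 4, 5], exponents, shared_coeff))  # ~70.8, matches the exact dot product
```

Because only the relative relationship inside the unit needs to be a power of two, the single shared coefficient can remain a general value while every per-parameter multiplication collapses to a shift.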

Embodiment 2

[0085] This embodiment likewise applies to image recognition. The unquantized raw data of the deep network are shown in Table 5.

[0086] The parameters can be expressed as integer powers of two within a unit (the same layer) where operations are relatively concentrated: as long as the relative relationship among the parameters in the unit is in the form of integer powers of two, a shared weight can be factored out and the parameters used in integer-power-of-two form. The parameters shown in the table are quantized by taking the 4th power of 2 as the largest value among the corresponding parameters: the 4th power of 2 corresponds to 6.84, the 3rd power of 2 corresponds to 3.42, the 2nd power of 2 corresponds to 1.71, the 1st power of 2 corresponds to 0.855, and the 0th power of 2 corresponds to 0.4275, with the common coefficient 0.4275 factored out. The quantized results are shown in T...
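A quick arithmetic check of the correspondence listed above, using the embodiment's common coefficient 0.4275:

```python
# 0.4275 is the common (shared) coefficient; each quantization level is 0.4275 * 2**k.
common = 0.4275
for k, value in [(4, 6.84), (3, 3.42), (2, 1.71), (1, 0.855), (0, 0.4275)]:
    assert abs(common * 2**k - value) < 1e-9
    print(f"2^{k} -> {common * 2**k:g}")
```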



Abstract

The invention relates to a high-speed real-time quantization structure and operation realization method for a deep neural network. Data to be processed are passed through a deep neural network to obtain processing results, and some or all nodes of the deep neural network are selected as calculation modules. The node parameters in a calculation module either are, or are quantized to be, parameters in the form of integer powers of two, and the input data of the calculation module are operated on with these integer-power-of-two parameters to obtain the output result. The invention reduces the resources consumed by computation and lowers system requirements.

Description

Technical field

[0001] The invention relates to a deep neural network quantization structure and method. Deep neural networks can be applied to image recognition, speech recognition, big data analysis, and other fields.

Background technique

[0002] A deep neural network is a learning-based method. It abstracts features layer by layer, combining low-level abstractions into high-level feature abstractions to discover the characteristics of data and solve different data representation problems. Its topology and calculation method simulate the nervous system of the human brain, and it has been shown to perceive the characteristics of data accurately. Deep neural networks include CNN, DNN, RNN, and other structures. In recent years, methods based on deep neural networks have achieved good results in target image recognition, speech recognition, and big data analysis.

[0003] In 2006, Professor Hinton of the University of Toronto proposed a fast layer-by-layer unsupervised...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/063; G06N3/08
CPC: G06N3/063; G06N3/08
Inventor: 周广超, 罗海波, 惠斌
Owner: SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI