
Neural network accelerator compiling method and device

A neural network accelerator compiling method and device, applied in the field of neural network accelerator compilation. The technology addresses problems such as difficulty in achieving real-time performance, the lack of optimization for dedicated neural network accelerators, and poor versatility, and achieves rapid deployment while improving parameter loading and module utilization.

Pending Publication Date: 2021-10-26
TSINGHUA UNIV
0 Cites · 3 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] However, technologies such as quantization learning and sparse pruning reduce the size of the neural network during the training and inference stages, and have not been jointly optimized with the dedicated neural network accelerators at the edge. Neural network computation at the edge therefore remains inefficient and lacks versatility: data transmission between layers still consumes substantial redundant resources, and the utilization rate of each module leaves considerable room for improvement. Meanwhile, sinking the accelerator's runtime instruction set from the cloud to the edge requires complex adjustments, and convolutional hardware acceleration alone falls far short of real-time requirements.

Method used



Examples


Embodiment Construction

[0047] In order to make the purpose, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0048] Figure 1 is a schematic flow chart of the neural network accelerator compilation method provided by the present invention. As shown in Figure 1, the present invention provides a neural network accelerator compilation method, comprising:

[0049] Step 101: based on the neural network structure information and preset instruction types, generate dependencies between each preset instruc...
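A minimal sketch of what such dependency generation could look like, per the abstract's description of per-type instruction queues. The `Instr` record, the tensor names, and the read-after-write rule are illustrative assumptions, not the patent's actual data structures:

```python
from collections import defaultdict

# Hypothetical compiler instruction: a preset type tag plus the tensors
# it reads and writes (illustrative only; not from the patent).
class Instr:
    def __init__(self, kind, reads, writes):
        self.kind = kind
        self.reads = set(reads)
        self.writes = set(writes)

def build_queues(instrs):
    """Group instructions into one queue per preset type, in program order."""
    queues = defaultdict(list)
    for ins in instrs:
        queues[ins.kind].append(ins)
    return queues

def build_deps(instrs):
    """A later instruction depends on each earlier instruction that writes
    a tensor it reads (a read-after-write dependency)."""
    deps = defaultdict(set)
    for i, earlier in enumerate(instrs):
        for later in instrs[i + 1:]:
            if earlier.writes & later.reads:
                deps[later].add(earlier)
    return deps
```

For a toy program load → conv → store, `build_deps` would record that the convolution depends on the weight load, and the store depends on the convolution.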



Abstract

The invention provides a neural network accelerator compiling method and device. The method comprises the steps of: generating, based on neural network structure information and preset instruction types, dependency relationships among a plurality of neural network compiler instruction queues, wherein each instruction queue is composed of neural network compiler instructions of the same preset instruction type; determining, according to the dependency relationships, a parallel operation strategy among the instruction queues; and generating acceleration instructions for the neural network accelerator according to the parallel operation strategy. By fusing flexible dynamic-adjustment techniques such as the circular buffer and superscalar execution into the dedicated neural network accelerator, the method effectively addresses the problems of neural network parameter loading and module utilization, so that neural networks can be deployed at the edge more quickly.
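The parallel-operation step in the abstract can be sketched as a greedy wavefront co-scheduler over the per-type queues. The names and the specific policy here (issue at most one instruction per type per cycle, superscalar-style, once its dependencies are done) are assumptions for illustration, not the patent's actual strategy:

```python
# Hypothetical instruction: a preset type tag (e.g. "load", "conv")
# plus the set of earlier instructions it depends on.
class Instr:
    def __init__(self, name, kind, deps=()):
        self.name, self.kind, self.deps = name, kind, set(deps)

def parallel_schedule(instrs):
    """Greedy wavefront co-scheduling of the per-type instruction queues:
    each cycle issues every instruction whose dependencies have completed,
    at most one per instruction type (one functional unit per type)."""
    done, pending, cycles = set(), list(instrs), []
    while pending:
        wave, used = [], set()
        for ins in pending:
            if ins.deps <= done and ins.kind not in used:
                wave.append(ins)
                used.add(ins.kind)
        if not wave:
            raise RuntimeError("dependency cycle in instruction stream")
        for ins in wave:
            pending.remove(ins)
            done.add(ins)
        cycles.append([ins.name for ins in wave])
    return cycles

# Two conv layers: the weight load for layer 2 overlaps the layer-1 conv.
l1 = Instr("load_w1", "load")
c1 = Instr("conv_1", "conv", [l1])
l2 = Instr("load_w2", "load")
c2 = Instr("conv_2", "conv", [l2, c1])
print(parallel_schedule([l1, c1, l2, c2]))
# [['load_w1'], ['conv_1', 'load_w2'], ['conv_2']]
```

The middle cycle is the point of the technique: loading the next layer's parameters proceeds in parallel with the current layer's computation, which is what improves parameter loading and module utilization.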

Description

Technical Field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a method and device for compiling a neural network accelerator.

Background

[0002] At present, artificial intelligence (AI) technology is advancing by leaps and bounds. Deep neural networks suited to complex tasks such as identification, detection, and tracking are used across many industries. To deploy AI algorithms at the edge and achieve end-cloud collaboration, embedded neural network processor technology has developed rapidly.

[0003] Neural network inference is expensive in both computation and memory. To support edge hardware with low power consumption and low computing power, compression techniques such as quantization learning and sparse pruning have been proposed in large numbers, and operators such as convolution are also executed in parallel at the edge. Among them, quantization learning replaces the f...
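The compression idea mentioned in the background can be illustrated with a generic uniform symmetric int8 scheme. This exact formula is an illustrative assumption for context, not the patent's quantization-learning method:

```python
def quantize_int8(weights):
    """Map float weights onto int8 codes with one shared scale factor
    (generic uniform symmetric quantization, shown for illustration)."""
    peak = max((abs(w) for w in weights), default=0.0)
    scale = peak / 127.0 if peak else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return [v * scale for v in q]
```

Each weight then occupies 1 byte instead of 4, at the cost of a bounded rounding error of at most half the scale step, which is the trade-off that makes low-power edge inference feasible.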

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063
CPC: G06N3/063
Inventors: 刘勇攀, 张驰, 石皓冰, 袁竹清, 张璐, 杨华中
Owner TSINGHUA UNIV