Neural network compiling method for storage and calculation integrated platform

A neural network compilation technology applied in the field of storage-computing integration. It addresses problems such as weight mapping, low hardware execution efficiency, and the failure to account for weight updates, so as to reduce the number of weight remappings, balance the computing load, and improve parallel efficiency.

Active Publication Date: 2021-03-09
SHANGHAI JIAO TONG UNIV


Problems solved by technology

[0013] 1. It cannot support multiple neural network programming frameworks, and little has been explored in computation-graph-level optimization;
[0014] 2. There is no flexible operator optimization and scheduling interface, and the same mapping method is applied to all operators; as a result, for a new, specific operator it is difficult for programmers to maximize hardware execution efficiency;
[0015] 3. The weights of the entire network are deployed onto the array at one time, without considering the need to update weights. In practice, because of process limitations and the weight scale of neural network models, it is difficult to map all the network weights onto the Crossbar array at once, as the sketch after this list illustrates.
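To make the third problem concrete, the following is a minimal sketch (not from the patent; all sizes and names are illustrative assumptions) of why one-shot weight deployment fails: when a network's weight matrices occupy more crossbar-sized tiles than the chip has arrays, the weights must be written in several rounds, and each extra round is remapping overhead that the compiling method seeks to reduce.

```python
import math

# Hypothetical crossbar dimensions and array count; real devices vary.
XBAR_ROWS, XBAR_COLS = 128, 128
NUM_ARRAYS = 16

def tiles_needed(out_features: int, in_features: int) -> int:
    """Number of crossbar-sized tiles one layer's weight matrix occupies."""
    return math.ceil(in_features / XBAR_ROWS) * math.ceil(out_features / XBAR_COLS)

def remap_rounds(layer_shapes) -> int:
    """Lower bound on how many times the array set must be (re)written
    to hold all of the network's weights."""
    total_tiles = sum(tiles_needed(o, i) for o, i in layer_shapes)
    return math.ceil(total_tiles / NUM_ARRAYS)

# A small fully connected network whose weights do not fit in 16 arrays at once.
layers = [(1024, 784), (1024, 1024), (10, 1024)]
print(remap_rounds(layers))  # -> 8 rounds of weight mapping instead of 1
```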




Detailed Description of the Embodiments

[0049] The following describes several preferred embodiments of the present invention with reference to the accompanying drawings, so as to make the technical content clearer and easier to understand. The present invention can be embodied in many different forms, and its protection scope is not limited to the embodiments mentioned herein.

[0050] In the drawings, components with the same structure are denoted by the same reference numerals, and components with similar structures or functions are denoted by similar numerals. The size and thickness of each component shown in the drawings are arbitrary, and the present invention does not limit them. To make the illustration clearer, the thickness of some parts is appropriately exaggerated in places.

[0051] Taking the Core hardware structure of the storage-computing integrated accelerator shown in Figure 2 as an example to illustra...



Abstract

The invention discloses a neural network compiling method for a storage-computing integrated platform, relating to the field of storage-computing integration. The method comprises the following steps: parsing the neural network model and mapping it into an intermediate representation described by compute nodes; optimizing the computation graph; converting it into an operator-level intermediate representation; dividing operator tasks and binding them to hardware basic units; and performing operator-level optimization to reduce the number of non-contiguous memory reads and the number of weight mappings. According to the invention, the computation flow graph and the neural network operators are optimized according to the characteristics of storage-computing integrated computation, the overhead of writing back intermediate results between graph-level operators is reduced, and the frequency of weight remapping when storage and computing resources are insufficient is reduced.
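The abstract walks through five compilation stages. Purely as a reading aid, the sketch below arranges those stages into a pass pipeline in Python; every class, function, and parameter name is an assumption made for illustration, since the patent text does not disclose an API, and each pass body is a placeholder.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GraphIR:
    """Node-level intermediate representation of the parsed network."""
    nodes: List[str] = field(default_factory=list)

@dataclass
class OpIR:
    """Operator-level intermediate representation after lowering."""
    ops: List[str] = field(default_factory=list)

def parse_model(model) -> GraphIR:
    # Step 1: map the framework model onto compute-node IR.
    return GraphIR(nodes=list(model))

def optimize_graph(ir: GraphIR) -> GraphIR:
    # Step 2: graph-level optimization, e.g. fusing adjacent nodes so
    # intermediate results need not be written back between them.
    return ir

def lower_to_ops(ir: GraphIR) -> OpIR:
    # Step 3: convert each graph node into operator-level IR.
    return OpIR(ops=[f"op:{n}" for n in ir.nodes])

def bind_to_hardware(ir: OpIR, num_units: int) -> OpIR:
    # Step 4: divide operator tasks and bind them to hardware basic units.
    return ir

def optimize_ops(ir: OpIR) -> OpIR:
    # Step 5: operator-level optimization, cutting non-contiguous memory
    # reads and reusing already mapped weights to cut remapping.
    return ir

def compile_model(model, num_units: int = 4) -> OpIR:
    graph_ir = optimize_graph(parse_model(model))
    return optimize_ops(bind_to_hardware(lower_to_ops(graph_ir), num_units))

print(compile_model(["conv1", "relu1", "fc1"]))
```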

Description

Technical Field

[0001] The invention relates to the field of storage-computing integration, and in particular to a neural network compiling method for a storage-computing integrated platform.

Background Technique

[0002] Deep learning has made many breakthroughs in speech recognition, image recognition and other fields. Existing deep neural networks need to complete their computation in less time and with lower power consumption, which places higher demands on deep learning computing chips. Accelerators that integrate storage and computation by using a non-volatile memory (NVM) such as a memristor as the computing unit have therefore emerged. This kind of accelerator effectively relieves the bandwidth bottleneck, offers low power consumption and high speed, and its research and development opens up a new field for in-memory computing.

[0003] At present, the basic algorithms of artificial intelligence are relatively ...


Application Information

IPC(8): G06N3/04, G06N3/063, G06N3/08
CPC: G06N3/063, G06N3/08, G06N3/045
Inventor: 绳伟光, 师紧想, 蒋剑飞, 景乃锋, 王琴, 毛志刚
Owner: SHANGHAI JIAO TONG UNIV