Multi-core neural network processor and multi-task allocation scheduling method for processor

A neural network scheduling technology, applied in the field of multi-core neural network processors and multi-task allocation and scheduling. It solves the problem of low utilization of system resources and achieves the effect of improving resource utilization and overall performance.

Pending Publication Date: 2022-08-05
中科物栖(北京)科技有限责任公司

AI Technical Summary

Problems solved by technology

[0010] The purpose of the present invention is to solve the problem of low utilization of system resources caused by the fixed, static allocation and parallelization strategies of existing multi-core neural network processors under multi-task workloads.



Examples


Example Embodiment

[0026] Example 1

[0027] The multi-task allocation and scheduling method for the multi-core neural network processor is characterized by comprising the following steps:

[0028] As shown in Figure 2, which presents the overall framework of the invention.

[0029] Step 1: Obtain a list containing multiple neural network tasks;

[0030] Obtain the information of the multiple neural network tasks to be scheduled, including the model information of each task (the size information and computation type of each layer) and the deadline set for each task.
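To make Step 1 concrete, the task information described above can be captured in a small descriptor. The structure below is a hypothetical sketch: the names `LayerInfo` and `NNTask` and their fields are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical sketch of the per-task information gathered in Step 1:
# per-layer size information, the computation type of each layer, and
# the deadline set for the task. All names are illustrative.

@dataclass
class LayerInfo:
    ifmap_shape: tuple   # input feature-map dimensions, e.g. (H, W, C)
    weight_shape: tuple  # weight/filter dimensions
    op_type: str         # computation type, e.g. "conv", "fc", "pool"

@dataclass
class NNTask:
    name: str
    layers: list         # one LayerInfo per network layer
    deadline_ms: float   # deadline set for this task

# A two-task list of the kind Step 1 would produce.
tasks = [
    NNTask("cnn_task", [LayerInfo((224, 224, 3), (7, 7, 3, 64), "conv")], 30.0),
    NNTask("mlp_task", [LayerInfo((1, 1, 512), (512, 10), "fc")], 10.0),
]
```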

[0031] Step 2: Use the offline compilation module to compile each task in the multi-task list in turn, finally obtaining a multi-version compilation result for each task;

[0032] As shown in Figure 3, which presents the flow chart of the offline compilation module.

[0033] Step S302, the offline compilation module evaluates each layer of the neural network in turn by using the analysis and evaluatio...
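The multi-version idea of Step 2 can be sketched as follows. The cost model and the choice of parameterizing versions only by core count are assumptions made for illustration; the abstract's point that not every compiled version needs to be retained is mirrored by keeping only non-dominated versions.

```python
# Illustrative sketch of Step 2: for one layer, generate candidate compiled
# versions (parameterized here only by the number of cores used) and keep
# only the useful ones instead of every candidate.

def estimate_latency(layer_work, num_cores):
    # Toy cost model (an assumption): parallel speedup plus a fixed
    # per-core synchronization overhead.
    return layer_work / num_cores + 0.1 * num_cores

def compile_multi_version(layer_work, core_options=(1, 2, 4, 8)):
    versions = [
        {"cores": c, "latency": estimate_latency(layer_work, c)}
        for c in core_options
    ]
    # Keep a version only if it is faster than every version using fewer
    # cores; dominated versions need not be retained, which reflects the
    # storage-saving idea described in the abstract.
    kept = []
    for v in sorted(versions, key=lambda v: v["cores"]):
        if not kept or v["latency"] < kept[-1]["latency"]:
            kept.append(v)
    return kept

versions = compile_multi_version(1.0)
# With this toy model the 8-core version is slower than the 4-core one,
# so only the 1-, 2-, and 4-core versions are retained.
```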



Abstract

The invention discloses a multi-core neural network processor and a multi-task allocation and scheduling method for the processor. The method comprises the following steps: acquiring a list containing a plurality of neural network tasks; compiling each task in the multi-task list in turn with an offline compilation module to obtain multiple compiled versions of each task; and, according to the system state, having the online scheduling module select the optimal version from the multiple compiled versions of each task for scheduled execution. Through the offline compilation module, the technique obtains several optimal compiled versions suited to different congestion and allocation conditions without retaining every compiled version, which saves storage overhead. Through the online scheduling module, the best-performing compiled version of the layer to be scheduled is dynamically selected for execution, and layers of different tasks are scheduled to run in parallel on the multi-core neural network processor, improving overall system performance.
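The online selection step described in the abstract can be sketched as a small routine: given the current system state (reduced here to the number of idle cores, an assumption for illustration), pick the feasible precompiled version with the best estimated latency. The version record format is hypothetical, not taken from the patent.

```python
# Minimal sketch of the online scheduling module's version selection:
# among the precompiled versions of the layer to be scheduled, choose the
# best-performing one that fits the currently idle cores.

def select_version(versions, idle_cores):
    feasible = [v for v in versions if v["cores"] <= idle_cores]
    if not feasible:
        return None  # the layer must wait until enough cores free up
    return min(feasible, key=lambda v: v["latency"])

versions = [
    {"cores": 1, "latency": 1.10},
    {"cores": 2, "latency": 0.70},
    {"cores": 4, "latency": 0.65},
]
best = select_version(versions, idle_cores=2)  # 4-core version infeasible
```

With two idle cores, the scheduler falls back to the 2-core version even though the 4-core version has the lowest latency overall, which is the kind of state-dependent choice the abstract describes.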

Description

Technical field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a multi-core neural network processor and a multi-task allocation and scheduling method for the multi-core neural network processor.

Background technique

[0002] A neural network processor (neural processing unit, NPU) is designed to efficiently support the execution and computation of neural networks (NN). Its basic components are the weight buffer, the unified buffer, the computation array, and the activation unit (Norm/Activation module). The weight buffer stores the weight information of the NN. The unified buffer mainly stores input feature maps (ifmaps) and output feature maps (ofmaps). The computation array mainly performs operations such as convolution in the NN; it is composed of multiple processing elements (PEs), and each PE contains a multiplier-accumulator to complete the multiplicati...
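The multiply-accumulate behavior of the PEs described above can be illustrated functionally. This sketch shows only the arithmetic, not the hardware dataflow, and the function names are invented.

```python
# Functional illustration of the PE array described in the background:
# each processing element (PE) performs a multiply-accumulate (MAC), and
# a row of PEs reduces an ifmap tile against its weights.

def pe_mac(acc, x, w):
    # One multiplier-accumulator step inside a single PE.
    return acc + x * w

def pe_array_dot(ifmap_tile, weights):
    acc = 0
    for x, w in zip(ifmap_tile, weights):
        acc = pe_mac(acc, x, w)
    return acc

result = pe_array_dot([1, 2, 3], [4, 5, 6])  # 1*4 + 2*5 + 3*6 = 32
```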


Application Information

IPC(8): G06F9/48, G06F9/50, G06N3/063
CPC: G06F9/4881, G06F9/5027, G06N3/063, Y02D10/00
Inventor: 张磊, 高成思
Owner: 中科物栖(北京)科技有限责任公司