
Neural network acceleration coprocessor, processing system and processing method

A coprocessor and neural network technology, applied in the fields of artificial intelligence and chip design, which addresses problems such as low computing efficiency and achieves fast read/write speed, improved scalability, and a simple algorithm.

Pending Publication Date: 2021-01-08
CHINA ELECTRIC POWER RES INST +2

AI Technical Summary

Problems solved by technology

[0007] Embodiments of the present invention provide a neural network acceleration coprocessor, a processing system and a processing method, which solve the problem that convolutional layer computation currently performed in pure software has very low computing efficiency.

Examples

Embodiments

[0053] As shown in figure 2, an embodiment of one aspect of the present invention provides a neural network acceleration coprocessor, including a control module 301, an address generation module 302, a multiply-accumulate module 303 and an output saturation module 304;

[0054] The address generation module 302 is used to match storage addresses for the input data and the corresponding output data;
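
The patent text does not spell out how these addresses are matched; the following is a minimal sketch, assuming a planar channel-major feature-map layout, of how an address generation step could map loop indices to input and output storage addresses. The descriptor fields, function names and layout are illustrative assumptions rather than the patent's design.

```c
#include <stdint.h>

/* Hypothetical layer descriptor (illustrative only, not from the patent). */
typedef struct {
    uint32_t in_base;     /* base address of the input feature maps */
    uint32_t out_base;    /* base address of the output feature map */
    uint32_t in_h, in_w;  /* input feature-map height and width     */
    uint32_t out_w;       /* output feature-map width               */
} conv_desc_t;

/* Address of input element (c, y, x), assuming a planar channel-major layout. */
static uint32_t in_addr(const conv_desc_t *d, uint32_t c, uint32_t y, uint32_t x)
{
    return d->in_base + (c * d->in_h + y) * d->in_w + x;
}

/* Address of output element (y, x) of the output feature map. */
static uint32_t out_addr(const conv_desc_t *d, uint32_t y, uint32_t x)
{
    return d->out_base + y * d->out_w + x;
}
```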

[0055] The multiply-accumulate module 303 is used to perform the neural network convolution operations;

[0056] The output saturation module 304 is used to limit the range of the output data and to output the operation results;

[0057] The control module 301 is used to receive the extended instruction sent by the main processor, control the address generation module to match the input and corresponding output data addresses according to the extended instruction, read data from the memory according to the matched addresses, and control the multiply-accumulate module to read the data and perform convolu...
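
Taking paragraphs [0053] to [0057] together, the dataflow steered by the control module can be pictured, purely as a software sketch, as: generate addresses, fetch operands, multiply-accumulate, saturate, write back. The fixed-point widths, the K x K kernel, the planar layout and every identifier below are illustrative assumptions, not the patent's specification.

```c
#include <stdint.h>

/* Output saturation (module 304): clamp the accumulator to the signed
 * 16-bit output range (the bit widths here are assumptions). */
static int16_t clamp_s16(int32_t acc)
{
    if (acc > INT16_MAX) return INT16_MAX;
    if (acc < INT16_MIN) return INT16_MIN;
    return (int16_t)acc;
}

/* Software sketch of the dataflow controlled by module 301 for one output
 * element: address generation (302), multiply-accumulate (303), output
 * saturation (304), write-back. Planar layout, K x K kernel, stride 1 and
 * no padding are all illustrative assumptions. */
static void conv_one_output(const int16_t *in, const int16_t *kernels,
                            int16_t *out,
                            uint32_t C, uint32_t H, uint32_t W, uint32_t K,
                            uint32_t oy, uint32_t ox)
{
    int32_t acc = 0;

    for (uint32_t c = 0; c < C; c++)            /* over all input feature maps */
        for (uint32_t ky = 0; ky < K; ky++)     /* over the kernel window      */
            for (uint32_t kx = 0; kx < K; kx++)
                acc += (int32_t)in[(c * H + oy + ky) * W + (ox + kx)]
                     * (int32_t)kernels[(c * K + ky) * K + kx];

    out[oy * (W - K + 1) + ox] = clamp_s16(acc);  /* saturate, then write back */
}
```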

Abstract

The invention discloses a neural network acceleration coprocessor, a processing system and a processing method. The system comprises a coprocessor, a main processor and a memory. The main processor is used to send an expansion instruction; the memory is used to store data; the coprocessor is used to receive the expansion instruction sent by the main processor, read the input data from the memory according to the received expansion instruction, perform the neural network calculation on the input data to obtain the output data, and store the output data in the memory. The coprocessor handles the time-consuming operations in the convolutional neural network, and the main processor controls the coprocessor to perform the neural network calculation on the input data through the expansion instruction, which lowers CPU utilization and improves convolution operation efficiency by more than 20 times compared with a pure-software implementation.
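
From the main processor's side, the abstract describes a simple offload sequence: send the expansion instruction, let the coprocessor compute, and pick up the result from memory. The sketch below only shows that control flow; the expansion-instruction encoding is not disclosed, so `issue_expansion_instruction()` and `wait_for_coprocessor_done()` are hypothetical placeholders, not the patent's interface.

```c
#include <stdint.h>

/* Hypothetical placeholders for the undisclosed expansion-instruction
 * interface and the coprocessor's completion signal. */
extern void issue_expansion_instruction(uint32_t in_base, uint32_t out_base);
extern void wait_for_coprocessor_done(void);

/* Host-side offload of one convolutional layer, as described in the abstract. */
void offload_conv_layer(uint32_t in_base, uint32_t out_base)
{
    /* 1. The main processor sends the expansion instruction describing the layer. */
    issue_expansion_instruction(in_base, out_base);

    /* 2. The coprocessor reads the input from memory, computes, and stores the
     *    output back to memory; the main processor simply waits, so its
     *    utilization stays low. */
    wait_for_coprocessor_done();
}
```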

Description

technical field

[0001] The invention relates to the fields of artificial intelligence and chip design, in particular to a neural network acceleration coprocessor, a processing system and a processing method.

Background technique

[0002] The convolutional layer is the core computing module of a convolutional neural network; usually, the convolutional layers account for more than 90% of the computation of the entire network. figure 1 shows the process of convolving one output feature map. Each input feature map corresponds to a convolution kernel, and the dotted boxes of different colors in the input maps correspond to different outputs. Each output is obtained by multiplying and accumulating the different input maps at the same position with their corresponding convolution kernels. Each output is the result of locally processing the input, and it reflects local feature information. The same input feature map uses the same c...
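
As a concrete restatement of the computation described around figure 1 (one output feature map obtained by multiply-accumulating all input feature maps with their kernels), here is a plain floating-point reference convolution; the memory layout, stride of 1 and absence of padding are illustrative assumptions.

```c
#include <stddef.h>

/* Reference computation of one output feature map:
 * out[oy][ox] = sum over c, ky, kx of in[c][oy+ky][ox+kx] * w[c][ky][kx]
 * with C input maps of size H x W and C kernels of size K x K. */
void conv2d_one_output_map(const float *in, const float *w, float *out,
                           size_t C, size_t H, size_t W, size_t K)
{
    size_t out_h = H - K + 1, out_w = W - K + 1;

    for (size_t oy = 0; oy < out_h; oy++)
        for (size_t ox = 0; ox < out_w; ox++) {
            float acc = 0.0f;
            for (size_t c = 0; c < C; c++)          /* all input feature maps */
                for (size_t ky = 0; ky < K; ky++)
                    for (size_t kx = 0; kx < K; kx++)
                        acc += in[(c * H + oy + ky) * W + (ox + kx)]
                             * w[(c * K + ky) * K + kx];
            out[oy * out_w + ox] = acc;             /* local MAC result */
        }
}
```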

Claims

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/06; G06F9/38; G06F9/28
CPC: G06N3/06; G06F9/3877; G06F9/28; G06N3/045
Inventors: 张树华; 仝杰; 张鋆; 赵传奇; 王辰; 张明皓
Owner: CHINA ELECTRIC POWER RES INST