
Hardware accelerator, data processing method, system-on-chip and medium

A hardware accelerator and data processing technology, applied in the fields of hardware accelerators, data processing methods, system-on-chips and media, which can solve problems such as low computing resource utilization, data conflicts and resource conflicts.

AI Technical Summary

Problems solved by technology

[0004] 1. Because existing accelerators must also accommodate CNNs and other computing applications, their utilization of computing resources is not high;
[0005] 2. Existing technical flows involve a large amount of data splicing (concatenation), whereas a method and flow optimized specifically for RNNs does not need to consider data splicing;
[0006] 3. There are data dependencies between the data of preceding and following instructions, and a dedicated module must judge these dependencies to decide whether the pipeline stalls or runs; such a method and flow cannot achieve high utilization of computing resources (a minimal illustration of this kind of hazard check is sketched after this list);
[0007] 4. Some existing methods and architectures rely on software scheduling to balance and reuse hardware resources. Without carefully planned instructions, data conflicts (the required data has not yet been computed, so instructions must be arranged reasonably) and resource conflicts (each instruction requires a complete set of computing resources distributed throughout the pipeline) occur frequently, and operating efficiency is low.
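
To make the pipeline-hazard problem in item 3 concrete, the following is a minimal, hypothetical sketch (not taken from the patent) of how a dependency-checking module might decide between stalling and issuing an instruction; the instruction fields `name`, `dst` and `srcs` are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Instr:
    """Hypothetical instruction: one destination register and several source registers."""
    name: str
    dst: str
    srcs: tuple = ()

def must_stall(instr: Instr, in_flight: list) -> bool:
    """Read-after-write hazard check: stall if any source register of the
    incoming instruction is still being produced by an in-flight instruction."""
    pending = {i.dst for i in in_flight}
    return any(src in pending for src in instr.srcs)

# Example: r2 is still being produced by the multiply, so the add must stall.
in_flight = [Instr("mul", dst="r2", srcs=("r0", "r1"))]
nxt = Instr("add", dst="r3", srcs=("r2", "r1"))
print("stall" if must_stall(nxt, in_flight) else "issue")  # -> stall
```

The point of the patent's critique is that this per-instruction checking, combined with unplanned software scheduling, leaves computing resources idle whenever the pipeline stalls.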



Examples


Detailed Description of the Embodiments

[0072] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0073] Referring to figure 1, an embodiment of the present invention discloses a hardware accelerator 02 for running a recurrent neural network instruction set 01. The hardware accelerator 02 is used to process the instruction set 01, and the instruction set 01 includes:

[0074] The data flow control instruction 11 is used to perform data flow control, controlling the data at the input end and the data at the output end of the recurrent neural network calculation ...
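
The paragraph above is truncated, so purely as an illustration the sketch below shows one way such an instruction set could be represented in software. The enum members mirror the five instruction categories named in the abstract later in this document; the `Instruction` encoding and its field names are assumptions, not the patent's format.

```python
from dataclasses import dataclass
from enum import Enum, auto

class InstrType(Enum):
    """Instruction categories of the RNN instruction set (names follow the abstract)."""
    DATA_FLOW_CONTROL = auto()   # control data at the input and output of the RNN calculation
    CONVENTIONAL_CALC = auto()   # conventional calculations in the recurrent neural network
    SPECIAL_CALC = auto()        # special calculations in the recurrent neural network
    EXPONENTIAL_SHIFT = auto()   # data normalization by shifting an exponent
    DATA_TRANSFER = auto()       # moves between registers, and between registers and memory

@dataclass
class Instruction:
    """Hypothetical encoding of a single instruction; fields are illustrative assumptions."""
    itype: InstrType
    operands: tuple = ()

# Example: a data flow control instruction gating the RNN's input and output streams.
instr = Instruction(InstrType.DATA_FLOW_CONTROL, operands=("input_stream", "output_stream"))
print(instr.itype.name, instr.operands)
```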



Abstract

The invention discloses a hardware accelerator, a data processing method, a system-on-chip and a medium. An instruction set processed by the hardware accelerator comprises a data flow control instruction used for executing data flow control; a conventional type calculation instruction used for executing conventional type calculation so as to complete conventional type calculation in a recurrent neural network; a special type calculation instruction used for executing special type calculation so as to complete the special type calculation in the recurrent neural network; an exponential shift instruction used for executing exponential shift so as to complete data normalization in the recurrent neural network calculation; a data transfer instruction used for executing data transfer so as to complete data transfer operation between different registers and data transfer operation between the registers and a memory during calculation of the recurrent neural network. According to the technical scheme, the calculation resource utilization rate of the hardware accelerator used for operating the recurrent neural network can be effectively improved, and conflicts in the aspects of data and resources can be effectively avoided.
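
The abstract states that the exponential shift instruction completes data normalization in the recurrent neural network calculation. As a hedged illustration of that idea, the sketch below interprets an exponent shift as scaling fixed-point values by a power of two; the function and its parameters are assumptions for clarity, not the patent's definition.

```python
def exp_shift_normalize(values, shift):
    """Normalize fixed-point values by an exponent shift, i.e. scale by 2**(-shift).
    A positive shift scales the data down into a bounded range.
    (Illustrative assumption; the patent does not give this formula.)"""
    return [v >> shift if shift >= 0 else v << -shift for v in values]

# Example: accumulator outputs scaled back into an 8-bit range by shifting 8 bits.
acc = [51200, -12800, 25600]
print(exp_shift_normalize(acc, 8))  # -> [200, -50, 100]
```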

Description

Technical field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a hardware accelerator, a data processing method, a system-on-chip and a medium.

Background technique

[0002] At present, hardware accelerators for neural networks include Google's TPU, NVIDIA's NVDLA, Cambricon's accelerators, and so on. Mainstream neural network hardware accelerators have carried out extensive calculation optimization for CNN (Convolutional Neural Network) and RNN (Recurrent Neural Network) networks, and the convolution operations for convolution kernels of different sizes have been optimized in the hardware calculation process.

[0003] In the prior art, there are no relevant methods and technologies specifically for recurrent neural networks that require high throughput and real-time inference. Moreover, the existing technologies mainly focus on the methods a...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/06G06N3/04G06F9/30G06F9/28
CPCG06N3/06G06F9/28G06F9/30098G06N3/048G06N3/045Y02D10/00G06N3/063G06F9/3877G06F9/30109G06F9/30036G06F9/30065G06F9/30014G06F9/30032G06F9/30043G06F9/3005G06N3/0442G06N3/044
Inventor 王岩黄运新张吉兴李卫军