
Convolutional neural network acceleration engine, convolutional neural network acceleration system and method

A convolutional neural network acceleration engine technology, applied in the field of heterogeneous computing acceleration, which addresses the mismatch between the mapping strategy of existing accelerators and the neural network model and the resulting loss of performance and energy efficiency, with the effect of speeding up computation, reducing data access, and improving performance.

Active Publication Date: 2020-05-19
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] In view of the defects and improvement needs of the prior art, the present invention provides a convolutional neural network acceleration engine, a convolutional neural network acceleration system and a method, the purpose of which is to solve the technical problem that the mismatch between the mapping strategy of existing DCNN accelerators and the neural network model degrades performance and energy efficiency.




Detailed Description of the Embodiments

[0074] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the embodiments of the present invention described below can be combined with each other as long as they do not conflict with each other.

[0075] In the present invention, the terms "first", "second" and the like (if any) in the description and drawings are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence.

[0076] The convolution kernel is a weight matrix, and the output feature image is composed of multiple channels. As shown in Figure 1, each output feature image...
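As a point of reference for the description above, the following is a minimal NumPy sketch (illustrative only, not the patent's hardware mapping) of how one output channel is formed: each output channel has its own set of weight matrices, one per input channel, and the per-channel partial sums are accumulated into a single output channel. The function name conv_output_channel and the array shapes are assumptions made for this example.

```python
# Minimal sketch: one output feature-map channel is produced by convolving
# every input channel with its own weight matrix (the convolution kernel)
# and accumulating the partial sums over all input channels.
import numpy as np

def conv_output_channel(inputs, kernels):
    """inputs:  (C_in, H, W) input feature image with C_in channels
       kernels: (C_in, K, K) weight matrices for one output channel"""
    c_in, h, w = inputs.shape
    _, k, _ = kernels.shape
    out = np.zeros((h - k + 1, w - k + 1))
    for c in range(c_in):                      # accumulate partial sums over channels
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] += np.sum(inputs[c, i:i+k, j:j+k] * kernels[c])
    return out

# Example: 3-channel 8x8 input and 3x3 kernels -> one 6x6 output channel
x = np.random.rand(3, 8, 8).astype(np.float32)
w = np.random.rand(3, 3, 3).astype(np.float32)
y = conv_output_channel(x, w)   # shape (6, 6)
```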



Abstract

The invention discloses a convolutional neural network acceleration engine, a convolutional neural network acceleration system and a convolutional neural network acceleration method, and belongs to the field of heterogeneous computing acceleration. The engine comprises a physical PE matrix, an XY interconnection bus and an adjacent interconnection bus: the physical PE matrix comprises a plurality of physical PE units, and the physical PE units are used for executing row convolution operations and the related partial sum accumulation operations; the XY interconnection bus is used for transmitting the input feature image data, the output feature image data and the convolution kernel parameters from the global cache to the physical PE matrix, or transmitting an operation result generated by the physical PE matrix to the global cache; the adjacent interconnection bus is used for transmitting an intermediate result between physical PE units in the same column. The system comprises a 3D-Memory, and a convolutional neural network acceleration engine is integrated in the memory controller of each Vault unit and used for completing a subset of the convolutional neural network calculation task. The method performs layer-by-layer optimization on the basis of the system. According to the invention, the performance and energy consumption of the convolutional neural network can be improved.
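To make the dataflow in the abstract more concrete, here is a hedged software sketch, assuming a row-stationary-style mapping: each physical PE performs a 1-D row convolution with one kernel row, and the partial sums produced by the PEs in one column are accumulated along that column, which is the role described for the adjacent interconnection bus. The functions pe_row_conv and column_of_pes are illustrative names; the actual engine, bus behaviour and Vault mapping are defined by the patent, not by this code.

```python
# Software sketch of the PE-column dataflow: each "PE" does a 1-D row
# convolution, and partial sums are accumulated down one column of PEs
# (modelling the adjacent interconnection bus) to produce one row of the
# 2-D convolution output.
import numpy as np

def pe_row_conv(in_row, w_row):
    """One physical PE: 1-D convolution of an input row with one kernel row."""
    k = len(w_row)
    return np.array([np.dot(in_row[j:j+k], w_row) for j in range(len(in_row) - k + 1)])

def column_of_pes(in_rows, kernel):
    """K PEs in one column: each holds one kernel row; their partial sums
       are accumulated along the column to form one output row."""
    partial = np.zeros(in_rows.shape[1] - kernel.shape[1] + 1)
    for pe_idx in range(kernel.shape[0]):      # one PE per kernel row
        partial += pe_row_conv(in_rows[pe_idx], kernel[pe_idx])
    return partial                             # one row of the output feature map

# Example: 3x3 kernel over a 5x7 input -> output rows 0 and 1 of a valid conv
x = np.arange(5 * 7, dtype=np.float32).reshape(5, 7)
w = np.ones((3, 3), dtype=np.float32)
out_row0 = column_of_pes(x[0:3], w)
out_row1 = column_of_pes(x[1:4], w)
```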

Description

Technical field
[0001] The invention belongs to the field of heterogeneous computing acceleration, and more specifically relates to a convolutional neural network acceleration engine, a convolutional neural network acceleration system and a method.
Background technique
[0002] With the popularization of intelligent computing, including speech recognition, object detection, scene labeling and automatic driving, the prediction accuracy required of deep neural network models is increasingly high, and the design of deep convolutional neural network (DCNN) models tends to become deeper and larger in scale, for which the computing platform needs to provide sufficient computing power and storage capacity.
[0003] Applications such as deep neural networks bring many challenges to the computing platform: the number of layers and the parameter shapes of different neural network models differ, which demands high hardware flexibility; diff...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N 3/063; G06N 3/04
CPC: G06N 3/063; G06N 3/045
Inventor: 曾令仿, 程倩雅, 张爱乐, 程稳, 方圣卿, 杨霖, 李弘南, 施展, 冯丹
Owner: HUAZHONG UNIV OF SCI & TECH