On-chip architecture, pooling computing accelerator array, unit and control method

A control method and computing-unit technology applied in the field of convolutional neural networks, which can solve problems such as low utilization of on-chip logic resources.

Active Publication Date: 2021-06-04
SHANGHAI WESTWELL INFORMATION & TECH CO LTD


Problems solved by technology

[0011] Furthermore, considering the characteristics of the pooling operator, suppose the architecture is designed to support a window of arbitrary size, such as the N×N window proposed in the cited patent (as disclosed in its paragraph 67: in a specific embodiment, the pooling window has a size of N×N, where N is a positive integer, and the moving step of the pooling window is equal to N). An accelerator on-chip architecture that blindly tries to support every possible size will inevitably consume a large amount of on-chip logic resources, and the logic allocated to rarely used window sizes may sit idle for long periods in practical applications, so its utilization rate is very low.
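As a point of reference, the behavior of such an N×N, stride-N pooling operator (window size equal to step, so windows never overlap) can be sketched in a few lines of NumPy. This only illustrates the operator itself, not any accelerator's on-chip implementation; the function name and the max/average modes are chosen here for the example.

import numpy as np

def pool_nxn(feature_map, n, mode="max"):
    # Pool a 2-D feature map with an N x N window moved with step N,
    # i.e. the non-overlapping window configuration discussed above.
    h, w = feature_map.shape
    out_h, out_w = h // n, w // n
    # Crop to a multiple of N and expose each N x N tile as its own pair of axes.
    tiles = feature_map[:out_h * n, :out_w * n].reshape(out_h, n, out_w, n)
    return tiles.max(axis=(1, 3)) if mode == "max" else tiles.mean(axis=(1, 3))

# Example: an 8 x 8 feature map with a 2 x 2 window and step 2 gives a 4 x 4 output.
fm = np.arange(64, dtype=np.float32).reshape(8, 8)
print(pool_nxn(fm, 2).shape)   # (4, 4)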




Embodiment Construction

[0052] Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

[0053] Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus repeated descriptions thereof will be omitted. Some of the block diagrams shown in the drawings are functional entities and do not necessarily correspond to physically or logically separate entities. These functio...



Abstract

The invention provides an on-chip architecture, a pooling computing accelerator array, a pooling computing accelerator unit, and a control method. The pooling computing accelerator is composed of an input direct memory access module, a pooling computing module, and an output direct memory access module connected in sequence. The control method comprises the following steps: according to the on-chip clock cycle, the input direct memory access module sequentially moves a column of feature data from the off-chip memory and feeds it to the pooling computing module, the number of rows in the moved column being determined by the parallel computing power of the on-chip computing resources; the pooling computing module sequentially performs pooling computation, cycle by cycle, on the columns of feature data received from the input direct memory access module and outputs the pooling results to the output direct memory access module; and the output direct memory access module sequentially moves the pooling results computed by the pooling computing module to the off-chip memory according to the on-chip clock cycle. The invention thereby realizes pooling acceleration.
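To make the dataflow concrete, the following Python/NumPy sketch simulates the abstract's pipeline at a behavioral level. The module roles are indicated in comments; the number of rows moved per cycle (ROWS) and the window size (N) are assumed example values, and nothing here represents the patent's actual hardware design.

import numpy as np

# Behavioral sketch of the dataflow in the abstract, not the patent's hardware:
# per simulated clock cycle the "input DMA" fetches one column of feature data
# (ROWS rows, standing in for the parallel computing power of the on-chip
# resources), the pooling module reduces it once N columns are staged, and the
# result is handed to the "output DMA".
ROWS = 8   # rows moved per cycle, matched to the assumed number of parallel lanes
N = 2      # pooling window size and step (N x N window moved by N)

def run_pipeline(stripe):
    # Processes one ROWS-high stripe of the feature map, column by column.
    assert stripe.shape[0] == ROWS, "this sketch handles a single ROWS-high stripe"
    width = stripe.shape[1]
    out = np.empty((ROWS // N, width // N), dtype=stripe.dtype)
    staged = []                                   # columns staged by the input DMA
    for col in range(width):                      # one column per clock cycle
        staged.append(stripe[:, col])             # input DMA: move ROWS rows on-chip
        if len(staged) == N:                      # enough columns for full windows
            block = np.stack(staged, axis=1)      # ROWS x N slice of the feature map
            # pooling module: max over each non-overlapping N x N window
            out[:, col // N] = block.reshape(ROWS // N, N, N).max(axis=(1, 2))
            staged.clear()                        # output DMA: results moved off-chip
    return out

stripe = np.arange(64, dtype=np.float32).reshape(8, 8)
print(run_pipeline(stripe))   # 4 x 4 max-pooled output for this stripe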

Description

Technical field

[0001] The invention relates to the field of convolutional neural networks, and in particular to an on-chip architecture, a pooling computing accelerator array, a pooling computing accelerator unit, and a control method.

Background

[0002] A Convolutional Neural Network (CNN) is a feedforward neural network whose artificial neurons respond to surrounding units within a limited portion of the input, and it performs particularly well on large-scale image processing. A CNN mainly consists of convolutional layers and pooling layers, and has been widely applied to image classification, object recognition, and object tracking.

[0003] Pooling computation is routinely required in convolutional neural networks, and how to optimize it is a technical problem to be solved in the field of convolutional neural network chips.

[0004] At present, the application with publication number CN110322388A, titled pooling method and device, pooling system, and...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F15/78; G06F15/80; G06N3/04; G06F13/42
CPC: G06F15/7807; G06F15/8076; G06F13/4221; G06N3/045; Y02D10/00
Inventors: 谭黎敏, 桑迟, 宋捷
Owner: SHANGHAI WESTWELL INFORMATION & TECH CO LTD