
CNN accelerator and electronic device

An accelerator and engine technology, applied in the field of convolutional neural networks, that addresses problems such as time cost, wasted memory resources, and inflexibility, and achieves the effects of a reduced inference error rate, good scalability, and improved performance.

Active Publication Date: 2021-08-03
GOWIN SEMICON CORP LTD

AI Technical Summary

Problems solved by technology

[0004] Most existing CNN accelerators are implemented on a single field programmable gate array (FPGA) core alone, and are therefore limited by the processing power (computation and storage), power consumption, cost, and physical size of the FPGA chip itself; the computational and storage complexity of the network has to fit within the capability of that single chip, and usually only the inference stage of the convolutional neural network is implemented. The layer structure of the engine inside the CNN accelerator and the weight distribution between its layers are fixed, so the training process of the network must be completed in advance, which is inflexible. Furthermore, each layer of the engine has to read and write data through the storage unit, which not only wastes memory resources but also incurs a time cost for every read and write.



Detailed Description of the Embodiments

[0031] The technical solutions proposed by the present invention are described in further detail below with reference to the accompanying drawings and specific embodiments. The advantages and features of the present invention will become clearer from the following description. It should be noted that the drawings are all in a greatly simplified form and use imprecise scales; they serve only to conveniently and clearly illustrate the embodiments of the present invention. Herein, "and/or" means either one or both of the connected items.

[0032] Referring to Figure 1 and Figure 2, an embodiment of the present invention provides a CNN accelerator based on a system-on-chip (SoC) 1 with an MCU core 11 and an FPGA core 12. The SoC 1 takes the MCU core 11 as its core and, building on the programmable nature of the FPGA core 12, allows the CNN accelerator to implement both the training phase and the inference phase of a convolutional neural network. Specifically, the CNN accelerator includes a CNN trainer...
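To make the partition in paragraph [0032] concrete, the following is a minimal software sketch, assuming a trainer object standing in for the MCU core 11, an engine object standing in for the FPGA core 12, and a direct weight hand-off between them. All class, method, and parameter names (CnnTrainer, CnnEngine, CnnAcceleratorSoC, retrain_and_deploy) are hypothetical illustrations, not identifiers from the patent, and the NumPy layers only stand in for the real hardware execution units.

```python
# Illustrative software model of the SoC partition in [0032]: a CNN trainer on the
# MCU core, a CNN engine on the FPGA core, and a direct weight hand-off between
# them. All names here are hypothetical, not taken from the patent.
from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class CnnTrainer:
    """Stands in for the training phase running on the MCU core."""
    layer_shapes: List[tuple]

    def train(self, dataset: np.ndarray) -> List[np.ndarray]:
        # Placeholder "training": return freshly initialised weights per layer.
        rng = np.random.default_rng(0)
        return [rng.standard_normal(s).astype(np.float32) for s in self.layer_shapes]


@dataclass
class CnnEngine:
    """Stands in for the inference pipeline running on the FPGA core."""
    weights: List[np.ndarray] = field(default_factory=list)

    def load_weights(self, weights: List[np.ndarray]) -> None:
        # Weights arrive directly from the trainer, so the per-layer weight
        # distribution can be refreshed at any time.
        self.weights = weights

    def infer(self, x: np.ndarray) -> np.ndarray:
        # Toy fully connected pipeline standing in for the convolutional layers.
        for w in self.weights:
            x = np.maximum(x @ w, 0.0)
        return x


@dataclass
class CnnAcceleratorSoC:
    trainer: CnnTrainer
    engine: CnnEngine

    def retrain_and_deploy(self, dataset: np.ndarray) -> None:
        # Direct trainer-to-engine weight hand-off, with no external staging step.
        self.engine.load_weights(self.trainer.train(dataset))


soc = CnnAcceleratorSoC(CnnTrainer([(16, 8), (8, 4)]), CnnEngine())
soc.retrain_and_deploy(dataset=np.zeros((1, 16), dtype=np.float32))
print(soc.engine.infer(np.ones((1, 16), dtype=np.float32)).shape)  # (1, 4)
```

The point of the sketch is the hand-off in retrain_and_deploy: the trainer's output feeds the engine directly, which is what would allow the layer-wise weight distribution to be updated at any time.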



Abstract

The present invention provides a CNN accelerator and an electronic device. The CNN accelerator comprises a CNN trainer arranged in an MCU core, and a CNN engine and a CNN memory arranged in an FPGA core. Combining the data processing capability of the MCU core with the parallel processing capability and programmability of the FPGA core improves the data processing speed and parallelism of the CNN accelerator and makes it compatible with both the training and inference stages of a convolutional neural network. Moreover, the training weights obtained by the CNN trainer are transmitted directly to the CNN engine, so the weight distribution between the layers of the CNN engine can be optimized at any time, reducing the inference error rate. In addition, using the storage resources of the FPGA core as the CNN memory saves read and write time between the execution units of the CNN engine and the CNN memory, and guarantees the data read/write rate for a given storage capacity, thereby speeding up the overall operation of the CNN accelerator.
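The abstract's claim about the CNN memory can be pictured as a toy comparison of two data-flow patterns: one where every execution unit writes its output to a shared external store and reads it back, and one where intermediate feature maps stay in on-chip buffers and are handed directly to the next layer. The function names and the dictionary standing in for the external storage unit below are assumptions made purely for illustration, not the patent's implementation.

```python
# Toy comparison of per-layer data flow: round-tripping through an external store
# versus keeping intermediate feature maps on chip. Names are illustrative only.
import numpy as np


def external_memory_flow(layers, x, store):
    """Baseline: every execution unit round-trips its output through a shared store."""
    for i, layer in enumerate(layers):
        store[i] = layer(x)      # write the layer output to the storage unit
        x = store[i].copy()      # read it back before the next layer can start
    return x


def on_chip_flow(layers, x):
    """Flow described in the abstract: outputs stay in on-chip buffers between layers."""
    for layer in layers:
        x = layer(x)             # direct hand-off, no external read/write per layer
    return x


layers = [lambda t: np.maximum(t, 0.0) for _ in range(3)]
x = np.random.default_rng(1).standard_normal((8, 8)).astype(np.float32)
assert np.allclose(external_memory_flow(layers, x, store={}), on_chip_flow(layers, x))
```

Both flows compute the same result; the difference the abstract emphasizes is the removed read/write traffic between each execution unit and the storage unit.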

Description

technical field

[0001] The invention relates to the technical field of convolutional neural networks, and in particular to a CNN accelerator and an electronic device.

Background technique

[0002] Artificial intelligence (AI) studies the laws of human intelligent activity, builds artificial systems with a certain degree of intelligence, and investigates how to make computers perform tasks that previously required human intelligence; that is, it studies the basic theories, methods, and techniques for simulating certain intelligent human behaviors with computer software and hardware.

[0003] A convolutional neural network (CNN) is very similar to an ordinary neural network: it consists of neurons with learnable weights and bias constants. Each neuron receives some inputs, performs a convolution, and outputs a score for each class. The difference is that a convolutional neural network assumes by default that its input is an image, which allows us to encode s...
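Paragraph [0003] describes a CNN as neurons with learnable weights and bias constants that convolve an image input and output one score per class. The short NumPy sketch below walks through exactly that forward pass; the 28x28 input, 3x3 kernel, ten classes, and helper names are arbitrary assumptions chosen only for illustration.

```python
# Minimal CNN forward pass matching [0003]: a convolution with learnable weights
# and a bias over an image input, followed by one score per class. Illustrative only.
import numpy as np


def conv2d(image, kernel, bias):
    """Valid 2-D convolution of a single-channel image with one kernel, plus ReLU."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w), dtype=np.float32)
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel) + bias
    return np.maximum(out, 0.0)


def class_scores(features, fc_weights, fc_bias):
    """Fully connected layer mapping flattened features to one score per class."""
    return features.reshape(-1) @ fc_weights + fc_bias


rng = np.random.default_rng(0)
image = rng.standard_normal((28, 28)).astype(np.float32)   # e.g. a grayscale input
kernel = rng.standard_normal((3, 3)).astype(np.float32)    # learnable weights
feat = conv2d(image, kernel, bias=0.1)
w_fc = rng.standard_normal((feat.size, 10)).astype(np.float32)
scores = class_scores(feat, w_fc, fc_bias=np.zeros(10, dtype=np.float32))
print(scores.shape)  # (10,) -> one score per class
```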


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/063, G06N3/04, G06N3/08
CPC: G06N3/063, G06N3/08, G06N3/045
Inventor: 刘锴, 宋宁, 范召, 杜金凤
Owner: GOWIN SEMICON CORP LTD