
Convolutional network accelerator, configuration method and computer readable storage medium

A convolutional network accelerator and configuration method, applied to data exchange networks, digital transmission systems, electrical components, etc. It addresses the problems that existing schemes do not make full use of FPGA resources, do not achieve time-division multiplexing, and lack a general-purpose implementation scheme, and it achieves flexible configurability and a better balance between data-loading bandwidth/time and accelerator computation time.

Active Publication Date: 2020-07-14
HUAZHONG UNIV OF SCI & TECH
Cites: 14 | Cited by: 9

AI Technical Summary

Problems solved by technology

[0005] To sum up, the many existing convolutional-network hardware acceleration schemes implemented on FPGA share the following problems: (1) each scheme targets a specific network structure model;
[0006] (2) all network layers of the model are implemented on the chip, a method that can only be used for smaller networks or otherwise leaves FPGA resources under-utilized;
[0007] (3) time-division multiplexing is not achieved, and the degree of parallelism is low.
[0008] Difficulty in solving the above technical problems: among the published solutions for implementing convolutional neural networks on FPGA, there is no good general-purpose implementation scheme, and at the same time resource consumption must be considered while achieving high parallelism.



Examples


Embodiment

[0068] The FPGA-based convolutional network accelerator provided by the embodiment of the present invention is built on the basic structure of a single-layer convolutional network, that is, convolutional layer + pooling layer + activation layer + batch normalization layer. According to which layer of the overall network model is currently being executed, the forward network layer obtains the configuration parameters of the current layer, such as the size of the input and output feature maps (height, width, number of channels), the size of the convolution kernel (height, width, number of channels), and the strides of the convolution and pooling operations, and uses these configuration parameters to load feature maps and weight parameters in batches from DDR (double-data-rate off-chip memory). At the same time, the acceleration kernel of the convolutional layer can also configure its degree of parallelism according to the configuration parameters.
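
The per-layer configuration flow described in [0068] can be illustrated with a small host-side sketch of reusing one layer engine under changing configuration parameters. This is only an illustration: the struct fields, the helper names (load_layer_io, run_layer_core), and the parallelism rule are assumptions for clarity, not the accelerator's actual interface.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Configuration parameters of one forward layer, as listed in [0068]:
// input/output feature-map sizes, convolution-kernel size, and the
// strides of the convolution and pooling operations.
struct LayerConfig {
    uint16_t in_h, in_w, in_c;     // input feature map: height, width, channels
    uint16_t out_h, out_w, out_c;  // output feature map: height, width, channels
    uint8_t  k_h, k_w;             // convolution kernel: height, width
    uint16_t k_c;                  // convolution kernel: channels
    uint8_t  conv_stride;          // convolution stride
    uint8_t  pool_stride;          // pooling stride (0 = no pooling in this layer)
};

// Hypothetical stand-ins for the batched DDR load and the
// conv + pool + batch-norm + activation core.
static void load_layer_io(const LayerConfig &c) {
    std::printf("load %ux%ux%u feature map and %ux%ux%u weights from DDR\n",
                (unsigned)c.in_h, (unsigned)c.in_w, (unsigned)c.in_c,
                (unsigned)c.k_h, (unsigned)c.k_w, (unsigned)c.k_c);
}
static void run_layer_core(const LayerConfig &c, unsigned parallelism) {
    std::printf("run conv+pool+BN+act, output %ux%ux%u, parallelism %u\n",
                (unsigned)c.out_h, (unsigned)c.out_w, (unsigned)c.out_c, parallelism);
}

int main() {
    // Two example layers; in the accelerator these parameters would come
    // from a per-layer configuration table rather than being hard-coded.
    std::vector<LayerConfig> net = {
        {224, 224, 3, 112, 112, 64, 7, 7, 3, 2, 2},
        {112, 112, 64, 112, 112, 64, 3, 3, 64, 1, 0},
    };
    for (const LayerConfig &cfg : net) {
        load_layer_io(cfg);                      // batched loads driven by the config
        unsigned par = cfg.in_c >= 64 ? 32 : 8;  // assumed rule for the parallelism degree
        run_layer_core(cfg, par);
    }
    return 0;
}
```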

[0069] The present inve...



Abstract

The invention belongs to the technical field of hardware acceleration of convolutional networks. It discloses a convolutional network accelerator, a configuration method and a computer-readable storage medium. The method comprises the steps of: judging, by means of a mark, which layer of the whole network model the currently executed forward network layer is; obtaining the configuration parameters of the currently executed forward network layer, and loading the feature map and weight parameters from DDR according to these configuration parameters; meanwhile, the acceleration kernel of the convolution layer configures its degree of parallelism according to the obtained configuration parameters of the executed forward network layer. Because the network layer structure is changed through configuration parameters, only one layer structure is needed when the network is deployed on the FPGA, which achieves flexible configurability while saving and fully utilizing the on-chip resources of the FPGA. A method of splicing a plurality of RAMs into an overall cache region is adopted to improve the bandwidth of data input and output, and ping-pong operation is adopted so that feature-map and weight-parameter loading and accelerator computation work in a pipeline.
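
The ping-pong operation mentioned in the abstract can be sketched as a double-buffer loop: while the accelerator computes on one buffer, the next feature-map/weight tile is loaded into the other. The tile size, tile count and the two helper functions below are illustrative assumptions; in hardware the load and the compute proceed concurrently rather than sequentially as in this host-side sketch.

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

constexpr int TILE_WORDS = 1024;                 // assumed tile size in data words
using Tile = std::array<int16_t, TILE_WORDS>;

static void load_tile(Tile &dst, int tile_idx) { // stands in for a burst read from DDR
    dst.fill(static_cast<int16_t>(tile_idx));
    std::printf("load tile %d into buffer\n", tile_idx);
}
static void compute_tile(const Tile &src, int tile_idx) { // stands in for the conv core
    std::printf("compute tile %d (first word %d)\n", tile_idx, (int)src[0]);
}

int main() {
    const int num_tiles = 4;
    Tile ping{}, pong{};

    load_tile(ping, 0);                          // prefetch the first tile
    for (int t = 0; t < num_tiles; ++t) {
        Tile &cur  = (t % 2 == 0) ? ping : pong; // buffer the core reads this iteration
        Tile &next = (t % 2 == 0) ? pong : ping; // buffer being refilled for next time
        if (t + 1 < num_tiles)
            load_tile(next, t + 1);              // in hardware this overlaps with ...
        compute_tile(cur, t);                    // ... this compute on the other buffer
    }
    return 0;
}
```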

Description

Technical field

[0001] The invention belongs to the technical field of hardware acceleration of convolutional networks, and in particular relates to a convolutional network accelerator, a configuration method and a computer-readable storage medium.

Background technique

[0002] At present, with the development of deep learning technology, convolutional neural networks are more and more widely used in computer vision, for example in target detection and recognition, tracking and semantic segmentation, as well as in speech recognition and natural language processing. Their outstanding data-fitting performance and the versatility of the model have allowed convolutional neural networks to replace traditional modeling methods in various complex application scenarios and become the benchmark in those fields. At the same time, however, this powerful data-fitting ability comes at the cost of a huge amount of data and computation. For example, the model size of AlexNet is 233MB, and the calculation amount is 0....

Claims


Application Information

IPC(8): H04L12/24
CPC: H04L41/082; H04L41/14; H04L41/142; H04L41/145
Inventors: 钟胜, 卢金仪, 颜露新, 王建辉, 徐文辉, 颜章, 唐维伟, 李志敏
Owner: HUAZHONG UNIV OF SCI & TECH