FPGA-based binary neural network acceleration system

A binary neural network acceleration system technology, applied in the field of integrated circuit design, which addresses the problems of the prior art that the calculation speed is easily limited by serial calculation, that the critical calculation path is long, and that resource occupation is large, so as to achieve the effects of reducing the calculation cost, reducing the number of calculations and increasing the calculation speed.

Pending Publication Date: 2020-11-13
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The object of the present invention is to address the above-mentioned defects of the prior art by proposing an FPGA-based binary neural network acceleration system, which is used to solve the technical problems that the calculation speed is easily limited by serial calculation and that the long critical calculation path of the convolution operation occupies more resources.

Embodiment Construction

[0037] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0038] Referring to Figure 1, the present invention comprises a weight data cache module, an input feature data cache module, a configuration data cache module, a weight data conversion module, a convolution module, a pooling module, a fully connected module, a result processing module, a result cache module and a control module, all implemented on an FPGA (an illustrative sketch of this module chain follows the module descriptions below), where:

[0039] The weight data cache module is used to cache the convolutional layer weight data and the fully connected layer weight data of the binary neural network in the DDR memory of the FPGA platform;

[0040] The input feature data cache module is used to cache the input feature data of the binary neural network in the DDR memory of the FPGA platform;

[0041] In this embodiment, the convolutional layer weight data, fully connected layer weight data, and input feature data o...
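
Although the text above only describes the caches and modules at the block level, the dataflow they imply can be pictured with a short software model. The Python sketch below is a minimal illustration under assumed conventions (binary values encoded as {0, 1} bits, cached weights and features packed eight to a byte to stand in for the DDR layout, a single 2-D channel per stage); none of the function names, shapes or data layouts are taken from the patent itself.

```python
# Illustrative Python model of the module chain in [0038]-[0041].
# Assumed conventions only; this is not the patent's RTL or memory layout.
import numpy as np

def cache_to_ddr(bits):
    # weight / input feature data cache modules: pack 1-bit values 8 per byte
    return np.packbits(bits.ravel()), bits.shape

def fetch_from_ddr(packed, shape):
    # read side of the caches: unpack back to individual bits
    return np.unpackbits(packed)[:np.prod(shape)].reshape(shape)

def weight_conversion(w_real):
    # weight data conversion module: real weights -> {0, 1} (1 encodes +1)
    return (w_real > 0).astype(np.uint8)

def convolution(x_bits, w_bits):
    # convolution module: XNOR + popcount replaces multiply-accumulate
    kh, kw = w_bits.shape
    oh, ow = x_bits.shape[0] - kh + 1, x_bits.shape[1] - kw + 1
    out = np.empty((oh, ow), dtype=np.int32)
    for i in range(oh):
        for j in range(ow):
            match = int((~(x_bits[i:i + kh, j:j + kw] ^ w_bits) & 1).sum())
            out[i, j] = 2 * match - kh * kw       # signed sum from popcount
    return out

def pooling(x):
    # pooling module: 2x2 max pooling
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    return x[:h, :w].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def fully_connected(x, w_bits):
    # fully connected module on re-binarized activations
    x_bits = (x > 0).astype(np.uint8).ravel()
    match = (~(x_bits[None, :] ^ w_bits) & 1).sum(axis=1)
    return 2 * match - x_bits.size

def result_processing(scores):
    # result processing module: pick the winning class for the result cache
    return int(np.argmax(scores))

# Control-module-style sequencing over the DDR-backed caches.
feat_packed, feat_shape = cache_to_ddr((np.random.rand(8, 8) > 0.5).astype(np.uint8))
conv_packed, conv_shape = cache_to_ddr(weight_conversion(np.random.randn(3, 3)))
fc_w = weight_conversion(np.random.randn(10, 9))   # 9 = flattened 3x3 after pooling

x = convolution(fetch_from_ddr(feat_packed, feat_shape),
                fetch_from_ddr(conv_packed, conv_shape))
label = result_processing(fully_connected(pooling(x), fc_w))
```

The point of the sketch is only the sequencing the control module imposes: data is staged in the caches, the weight data conversion module produces the packed binary form, and the convolution, pooling, fully connected and result processing stages consume it in turn.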


Abstract

The invention provides an FPGA-based binary neural network acceleration system, belongs to the technical field of integrated circuit design, and is used for solving the technical problems that the calculation speed is easily limited by serial calculation and more resources are occupied due to a long critical calculation path of the convolution operation in the prior art. The acceleration system comprises a weight data caching module, an input characteristic data caching module, a configuration data caching module, a weight data conversion module, a convolution module, a pooling module, a fully connected module, a result processing module, a result caching module and a control module which are realized through an FPGA. The system can be applied to scenarios such as rapid target detection in an embedded environment.
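
The computational saving stated in the abstract rests on the standard property of binarized networks that a dot product over {-1, +1} values can be computed with bitwise XNOR followed by a popcount instead of multiplications and additions. The few lines below are a generic illustration of that identity under an assumed {0, 1} bit encoding; they are not the patent's circuit.

```python
# Generic XNOR + popcount identity for {-1, +1} vectors; not the patent's circuit.
import numpy as np

a = np.array([+1, -1, +1, +1, -1, -1, +1, -1])   # binarized activations
w = np.array([+1, +1, -1, +1, -1, +1, -1, -1])   # binarized weights

dot = int(np.dot(a, w))                           # 8 multiplies + 7 additions

a_bits = (a > 0).astype(np.uint8)                 # encode +1 as 1, -1 as 0
w_bits = (w > 0).astype(np.uint8)
matches = int((~(a_bits ^ w_bits) & 1).sum())     # XNOR, then popcount
assert dot == 2 * matches - a.size                # same value, no multiplies
```

On an FPGA, the XNOR of such a window maps onto LUTs and the popcount onto a small adder tree, which is the usual reason binarized accelerators achieve lower resource usage and a shorter critical calculation path than multiply-accumulate datapaths.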

Description

technical field

[0001] The invention belongs to the technical field of integrated circuit design and relates to an acceleration system for a binary neural network, in particular to an FPGA-based binary neural network acceleration system, which can be applied to scenarios such as rapid target detection in an embedded environment.

Background technique

[0002] With the continuous development of deep learning, its applications in the industrial field are becoming more and more extensive. Deep learning techniques have greatly improved the automation of industrial applications. Among them, the convolutional neural network is widely used in computer vision due to its excellent performance, in scenarios such as image classification, target detection and dynamic tracking.

[0003] When using convolutional neural networks, in order to obtain higher accuracy, researchers usually tend to construct deeper and more complex neural networks, which will require larger netw...


Application Information

IPC(8): G06N3/063, G06N3/04, G06F15/78
CPC: G06N3/063, G06F15/7807, G06F15/7867, G06N3/045
Inventor: 田玉敏, 王泉, 杨鹏飞, 李喜林, 王振翼, 梁瑀
Owner: XIDIAN UNIV