
Quantitative neural network acceleration method based on field programmable array

A quantization and field-programmable-array technology, applied in the field of neural-network-based image processing. It addresses problems such as low energy efficiency and the difficulty of simultaneously meeting the high performance requirements of neural networks and the low power consumption requirements of mobile terminals, achieving the effects of fast inference, low power consumption, and reduced storage requirements.

Pending Publication Date: 2021-04-09
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

Generally speaking, a CPU can perform 10 to 100 GFLOPS, but its energy efficiency is usually below 1 GOP/J, so it is difficult to meet both the high performance requirements of neural networks and the low power consumption requirements of mobile terminals.
In contrast, a GPU can provide peak performance of up to 10 TOP/s, making it an excellent choice for high-performance neural network applications, but it cannot meet the low power consumption requirements of mobile terminals.

Method used



Examples


Embodiment Construction

[0021] To help those skilled in the art understand the technical content of the present invention, the content of the present invention is further explained below in conjunction with the accompanying drawings.

[0022] As shown in figure 1, the method of the present invention comprises:

[0023] Step 1: Neural network weight space approximation. Given a neural network for image processing, the weights play an important role in the final result. Each layer of the neural network is represented as a computation graph: the input and the weights are combined by a convolution (CONV) or fully connected (FC) operation, a bias value is added, and the final output is obtained through an activation function. The original weight space is a continuous, complex real-number space, whereas the quantized weight space is expected to contain only the three values 1, -1 and 0. Therefore, the weight space must be approximated to a sparse discrete space.
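The patent does not publish the exact approximation rule in this excerpt, but the mapping of real-valued weights onto the sparse set {-1, 0, 1} can be sketched with a standard threshold-based ternarization, where weights with small magnitude are zeroed and the rest keep only their sign. The function name `ternarize` and the threshold factor 0.7 (a common heuristic in ternary-weight literature) are illustrative assumptions, not the invention's specified values.

```python
import numpy as np

def ternarize(weights, delta_factor=0.7):
    """Approximate a real-valued weight tensor with values in {-1, 0, 1}.

    delta_factor is a hypothetical heuristic: weights whose magnitude
    falls below delta_factor * mean(|w|) are treated as noise and set
    to 0; the remaining weights keep only their sign.
    """
    delta = delta_factor * np.mean(np.abs(weights))
    quantized = np.zeros_like(weights)
    quantized[weights > delta] = 1.0    # strong positive weights -> +1
    quantized[weights < -delta] = -1.0  # strong negative weights -> -1
    return quantized

# Example: small-magnitude weights collapse to 0, the rest to +/-1.
w = np.array([0.9, -0.05, 0.02, -0.8, 0.3])
print(ternarize(w))  # -> [ 1.  0.  0. -1.  1.]
```

With weights restricted to {-1, 0, 1}, the convolution and fully connected layers of the computation graph reduce to additions, subtractions, and skips, which is what makes the subsequent FPGA accelerator both fast and low-power.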



Abstract

The invention discloses a quantized neural network acceleration method based on a field programmable array, applied to the field of image processing, which aims to solve the problem of low image processing efficiency in the prior art. Each layer of a neural network for image processing is expressed as a computation graph: the input and the weights undergo a convolution or fully connected calculation, a bias value is added, and the final output is obtained through an activation function. The weight space is approximated to a sparse discrete space; numerical quantization is performed on the processed weights to obtain a quantized neural network for image processing; an accelerator matched with the quantized neural network is then designed; and each layer of the quantized network is computed on the corresponding accelerator to obtain the image processing result. With the method of the invention, image processing applications can be deployed in resource-limited embedded systems, with the characteristics of fast inference and low power consumption.

Description

Technical field

[0001] The invention belongs to the field of image processing, and in particular relates to an image processing technology based on a neural network.

Background technique

[0002] Neural networks (NN) have achieved good results in many fields such as object detection and semantic segmentation, but deploying artificial-intelligence (AI) applications of neural networks in practical settings such as autonomous driving and autonomous robots is challenging. This is because the devices in such applications are generally resource-constrained embedded systems, with limited memory and insufficient computing power, whereas neural networks usually have huge numbers of parameters and calculations and therefore require large amounts of storage and computing resources that resource-constrained embedded systems cannot provide. Low-precision quantization of the neural network model can effectively reduce storage requirements. Using field programmable gate arrays a...

Claims


Application Information

IPC(8): G06N3/04 G06N3/063 G06N3/08 G06F9/50
CPC: G06N3/063 G06N3/084 G06F9/5016 G06N3/045 Y02D10/00
Inventor 詹瑾瑜周星志江维孙若旭温翔宇宋子微廖炘可范翥峰
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA