
Neural network inference method based on software and hardware cooperative acceleration

A software-hardware co-design technology for neural networks, applied in the field of computing, that addresses the lack of whole-system software-hardware collaborative acceleration and achieves improved hardware utilization and computing performance.

Active Publication Date: 2020-05-29
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0007] At this stage, most designs for accelerating convolutional neural network inference focus on hardware acceleration on the FPGA side. Few acceleration designs propose software acceleration, let alone collaborative acceleration of the software and hardware system as a whole.

Method used




Embodiment Construction

[0026] The present invention will be further described in detail below in conjunction with specific drawings and embodiments.

[0027] The neural network inference method proposed in the present invention is suitable for convolutional neural networks, and for other neural networks that likewise have alternately connected convolutional and pooling layers and pipelined fully connected layers. The neural network comprises an input layer, N convolutional layers, N pooling layers, K fully connected layers, and an output layer, where N and K are positive integers and N ≥ K. The input signal of the i-th pooling layer is the output signal of the i-th convolutional layer, and its output signal is the input signal of the (i+1)-th convolutional layer; the input signal of the first convolutional layer is the output signal of the input layer, i ∈ [1, N]. The input signal of the j-th fully connected layer is the output signal of the (j-1)-th fully connected layer, j ∈ [2, K]; the in...
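The topology described in [0027] can be sketched in code. This is an illustrative toy model, not the patented implementation: the layer counts (N = K = 2), kernel sizes, and fully connected widths are assumptions chosen only to show the conv(i) → pool(i) → conv(i+1) and fc(j-1) → fc(j) signal flow.

```python
import numpy as np

N, K = 2, 2  # hypothetical layer counts; the patent only requires N >= K

def conv(x, kernel):
    # "valid" 2-D convolution of a single-channel feature map, then ReLU
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)

def pool(x, size=2):
    # 2x2 max pooling; pooling layer i consumes conv layer i's output
    h, w = x.shape[0] // size * size, x.shape[1] // size * size
    x = x[:h, :w].reshape(h // size, size, w // size, size)
    return x.max(axis=(1, 3))

rng = np.random.default_rng(0)
x = rng.standard_normal((12, 12))              # output of the input layer
kernels = [rng.standard_normal((3, 3)) for _ in range(N)]

for i in range(N):                             # conv i -> pool i -> conv i+1
    x = pool(conv(x, kernels[i]))

x = x.ravel()                                  # flatten for the FC stage
for j in range(K):                             # fc j-1 feeds fc j
    W = rng.standard_normal((4 if j < K - 1 else 2, x.size))
    x = np.maximum(W @ x, 0.0)

print(x.shape)                                 # inference result, shape (2,)
```

The loop structure mirrors the index constraints of the paragraph: pooling layer i sits strictly between convolutional layers i and i+1, and each fully connected layer consumes only its predecessor's output.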


Abstract

A neural network inference method based on software and hardware cooperative acceleration comprises the following steps: first, a convolution operation module, a pooling operation module, and a fully connected operation module are built and configured; the configured modules are then used to build a neural network inference system, where the neural network comprises alternately connected convolutional and pooling layers and a pipelined fully connected layer; the trained parameters and input data of the neural network are stored in the neural network inference system; finally, the neural network inference system starts to work. A first input signal is fed into the first convolutional layer in the first small work period of each large work period, and a second input signal is fed into the first convolutional layer in the second small work period of each large work period; in each small work period the neural network completes one layer of operation and passes the result to the next layer. The first and second input signals pass through each layer of the network in sequence, yielding inference results for both signals.
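The large/small work-period scheduling in the abstract is a layer pipeline: the two input signals enter one small work period apart, and every in-flight signal advances exactly one layer per small work period, so each layer module alternates between the two signals and stays busy. A minimal simulation of that schedule (layer names and counts are my own, not from the patent):

```python
LAYERS = ["conv1", "pool1", "conv2", "pool2", "fc1", "fc2"]

def pipeline(signals):
    """Return {signal: small work period in which its result is produced}."""
    in_flight = []          # (signal name, index of layer it occupies)
    results = {}
    pending = list(signals) # signals enter one small work period apart
    t = 0
    while pending or in_flight:
        # every in-flight signal advances one layer per small work period
        in_flight = [(s, i + 1) for s, i in in_flight]
        # one new signal enters the first convolutional layer each period
        if pending:
            in_flight.append((pending.pop(0), 0))
        # signals reaching the last layer produce their inference results
        for s, i in in_flight:
            if i == len(LAYERS) - 1:
                results[s] = t
        in_flight = [(s, i) for s, i in in_flight if i < len(LAYERS) - 1]
        t += 1
    return results

print(pipeline(["signal1", "signal2"]))  # → {'signal1': 5, 'signal2': 6}
```

With six layers, the two signals finish in 7 small work periods instead of the 12 a strictly sequential system would need, which is the utilization and throughput gain the abstract claims.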

Description

Technical field

[0001] The invention belongs to the field of computer technology, and specifically relates to a neural network inference method based on software and hardware cooperative acceleration, which can be applied to general convolutional neural networks.

Background technique

[0002] The convolutional neural network is the most important model in the field of artificial-intelligence deep learning. It is widely used in image classification, recognition, and understanding scenarios, and has achieved high accuracy. In 1998, LeCun et al. proposed a convolutional neural network (LeNet) composed of convolutional layers and downsampling layers, forming the prototype of the modern convolutional neural network. In 2012, in the ImageNet large-scale recognition challenge, Krizhevsky et al. used ReLU as the activation function to build the AlexNet convolutional neural network and achieved an excellent classification accuracy of 84.7%, which became an important turning point ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N5/04, G06N3/063, G06N3/04
CPC: G06N5/041, G06N3/063, G06N3/045
Inventors: 彭析竹, 梅亚军, 李俊燊, 唐鹤
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA