
Compressed convolutional neural network-oriented parallel convolution operation method and apparatus

A convolutional neural network and convolution operation technology, applied in the field of digital signal processing and dedicated hardware accelerators, which addresses problems such as reduced system efficiency, repeated input of data, and input-data bandwidth bottlenecks, and achieves high execution efficiency, good performance, and fewer repeated reads.

Active Publication Date: 2017-07-14
国交金流供应链科技(上海)有限公司 +1

AI Technical Summary

Problems solved by technology

The traditional approach is to use multiple separate convolution operation units (for example, a 3×3 convolution unit and a 1×1 convolution unit) to perform convolution operations in parallel. Each convolution unit must load the same set of input feature-map data, which can create an input-data bandwidth bottleneck or force the data to be input repeatedly, reducing the efficiency of the system.
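As a rough, hypothetical illustration of the bandwidth cost described above (the feature-map dimensions are the editor's example values, not figures from the patent), the following sketch compares the input traffic when a 3×3 unit and a 1×1 unit each fetch the same feature map against a single shared input stream:

    /* Illustrative only: counts input words fetched from memory when two
     * independent convolution units each load the same HxWxC feature map,
     * versus one shared input stream feeding both units.
     * H, W, C are hypothetical example values. */
    #include <stdio.h>

    int main(void) {
        long H = 56, W = 56, C = 64;        /* example feature-map size           */
        long words = H * W * C;             /* words in one input feature map     */
        long separate_units = 2 * words;    /* 3x3 unit and 1x1 unit each load it */
        long shared_stream  = 1 * words;    /* one stream shared by both units    */
        printf("separate units fetch: %ld words\n", separate_units);
        printf("shared input stream : %ld words (%.0f%% of the traffic)\n",
               shared_stream, 100.0 * shared_stream / separate_units);
        return 0;
    }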




Detailed Description of the Embodiments

[0023] The present invention will be described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but they do not limit the present invention in any way. It should be noted that those skilled in the art can make a number of changes and improvements without departing from the concept of the present invention, and all of these fall within the protection scope of the present invention.

[0024] The parallel convolution operation device for a compressed convolutional neural network provided by the present invention comprises a 3×3 convolution calculation module based on a shift register chain; arranged within the 3×3 convolution calculation module are a 3×3 convolution calculation offset register, a 1×1 convolution calculation parameter register, and a 1×1 convolution calculation offset register; the 3×3 convolution calculation offset register, 1×1 convol...
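To make the datapath of paragraph [0024] more concrete, here is a minimal C software model, written by the editor as a sketch rather than taken from the patent: a serial shift-register chain (modelled as two line buffers plus a 3×3 window of registers) produces the 3×3 convolution result, while the very same incoming pixel is also multiplied by a 1×1 parameter and offset, so both results come from a single read of the input stream. All variable names, kernel values and the feature-map width are assumptions made for illustration.

    /* Minimal software model (editor's sketch, not the patent's design) of a
     * shift-register-chain 3x3 convolution datapath that also taps the same
     * input stream for a 1x1 convolution. */
    #include <stdio.h>

    #define W 8                      /* feature-map width (example value)   */

    static float line0[W], line1[W]; /* two line buffers = shift chain      */
    static float win[3][3];          /* 3x3 window registers                */
    static float k3[3][3] = {{0,0,0},{0,1,0},{0,0,0}};  /* 3x3 parameters   */
    static float b3 = 0.0f;          /* 3x3 offset (bias) register          */
    static float k1 = 2.0f;          /* 1x1 parameter register              */
    static float b1 = 0.5f;          /* 1x1 offset (bias) register          */

    /* Push one pixel per "cycle"; both results are produced from the same
     * input word, so the feature map is read from memory only once.        */
    static void push(float px, int col, float *out3, float *out1) {
        /* 1x1 path: multiply the fresh pixel by its parameter, add its bias. */
        *out1 = k1 * px + b1;

        /* Shift the 3x3 window left; fill the rightmost column from the two
         * line buffers plus the fresh pixel.                                 */
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 2; c++)
                win[r][c] = win[r][c + 1];
        win[0][2] = line1[col];
        win[1][2] = line0[col];
        win[2][2] = px;

        /* Advance the line buffers (the serial shift-register chain).        */
        line1[col] = line0[col];
        line0[col] = px;

        /* 3x3 path: dot product of the window with the kernel, plus bias.    */
        float acc = b3;
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++)
                acc += win[r][c] * k3[r][c];
        *out3 = acc;
    }

    int main(void) {
        float o3, o1;
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < W; col++) {
                float px = (float)(row * W + col);   /* synthetic input stream */
                push(px, col, &o3, &o1);
                if (row >= 2 && col >= 2)            /* 3x3 window is valid    */
                    printf("row %d col %d: 3x3=%g 1x1=%g\n", row, col, o3, o1);
            }
        return 0;
    }

In real hardware each of these loops would be a set of registers and multiply-accumulate units updated every clock cycle; the C loops only mimic that cycle-by-cycle behaviour.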



Abstract

The invention provides a parallel convolution operation method and apparatus oriented to compressed convolutional neural networks. The method comprises: determining the operating mode according to input control signals for convolution-data shift-chain length selection, accumulated-offset enabling, and convolution-calculation enabling; and, using two serial shift register chains, inputting the convolution data, convolution parameters and channel offsets, and performing 3×3 and 1×1 convolution operations simultaneously on the same input convolution data stream. Compared with the original serial-shift-register-chain-based 3×3 convolution operation, the method adds only a multiplier, an accumulator, a parameter register and an offset register; the implementation is simple, the execution efficiency is high, and the convolution operations in a compressed neural network algorithm can be effectively accelerated. Through simple expansion and replication of hardware units, the apparatus can output multiple feature maps at the same time, and it has the advantages of low power consumption, high functional-unit utilization and high processing speed.
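The abstract names three control inputs: shift-chain length selection, accumulated-offset enable and convolution-calculation enable. The small sketch below shows one plausible way such a control word could be decoded; the struct fields, mode names and the idea of a "partial-sum pass" are the editor's assumptions, not definitions from the patent.

    /* Hedged sketch of the control interface described in the abstract.
     * Field names and decoded behaviours are illustrative guesses. */
    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        int  shift_chain_len;   /* selects the convolution-data shift-chain length */
        bool acc_offset_en;     /* when set, add the per-channel offset (bias)      */
        bool conv_calc_en;      /* when set, enable the multiply-accumulate path    */
    } conv_ctrl_t;

    /* Decide what the shared 3x3 / 1x1 datapath does for this configuration. */
    static const char *decode_mode(conv_ctrl_t c) {
        if (!c.conv_calc_en) return "idle: only shift input data through the chain";
        if (c.acc_offset_en) return "convolve and accumulate the channel offset";
        return "convolve without offset (e.g. a partial-sum pass)";
    }

    int main(void) {
        conv_ctrl_t c = { .shift_chain_len = 227, .acc_offset_en = true, .conv_calc_en = true };
        printf("%s (chain length %d)\n", decode_mode(c), c.shift_chain_len);
        return 0;
    }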

Description

Technical field

[0001] The present invention relates to the field of digital signal processing and dedicated hardware accelerators, and in particular to a parallel convolution operation method and apparatus for compressed convolutional neural networks.

Background technique

[0002] In recent years, the Convolutional Neural Network (CNN) has made significant progress in deep learning. Most famously, in 2012 Alex Krizhevsky and others proposed the classic CNN computing structure AlexNet, which achieved great success in image classification and recognition. As shown in Figure 1, the input of AlexNet is 3-channel 227×227 image data, and the entire processing flow comprises 8 layers of operations: the first five layers are convolutional layers and the last three are fully connected layers. The first convolutional layer uses convolution kernels of size 3×11×11, with 96 kernels; the second convolutional layer uses a convolution k...
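As a worked example of the first AlexNet convolutional layer described above (227×227×3 input, 96 kernels of 11×11×3), and assuming the standard AlexNet stride of 4, which the excerpt does not state, the output size and multiply-accumulate count come out as follows:

    /* Worked arithmetic for AlexNet layer 1: 227x227x3 input, 96 kernels of
     * 11x11x3, stride 4 (the stride is taken from the standard AlexNet
     * definition, not from the excerpt above). */
    #include <stdio.h>

    int main(void) {
        long in = 227, k = 11, stride = 4, ch = 3, kernels = 96;
        long out = (in - k) / stride + 1;        /* (227-11)/4 + 1 = 55          */
        long outputs = out * out * kernels;      /* 55 * 55 * 96 output values   */
        long macs = outputs * k * k * ch;        /* 11*11*3 MACs per output      */
        printf("output feature map: %ldx%ldx%ld\n", out, out, kernels);
        printf("multiply-accumulates: %ld (~%.0f million)\n", macs, macs / 1e6);
        return 0;
    }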


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06F17/15; G06N3/04
CPC: G06F17/153; G06N3/04
Inventor: 陈锋
Owner: 国交金流供应链科技(上海)有限公司